Need help in understanding how CHEFFIB can be solved using centroid decomposition!

Hello,
The CHEFFIB problem from the recent DEC17 challenge can be solved using centroid decomposition, but I am unable to understand what should be stored at each centroid so that both update and query can be done in log(n) time. Please share some insight into this.
TIA.


I only got log^2(n) per query, by storing a Fenwick tree at each node of the decomposition. Unfortunately, it didn't pass even with C++.
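For context, here is a minimal sketch of the centroid decomposition being referred to, assuming the standard recursive construction (all names and array sizes below are my own, not taken from any submission in this thread):

```cpp
#include <bits/stdc++.h>
using namespace std;

const int MAXN = 100005;
vector<int> adj[MAXN];   // adjacency list of the original tree
int sub[MAXN];           // subtree sizes inside the current component
bool removed[MAXN];      // true once a node has been chosen as a centroid
int cpar[MAXN];          // parent in the centroid decomposition tree

int calcSize(int u, int p) {
    sub[u] = 1;
    for (int v : adj[u])
        if (v != p && !removed[v])
            sub[u] += calcSize(v, u);
    return sub[u];
}

int findCentroid(int u, int p, int compSize) {
    for (int v : adj[u])
        if (v != p && !removed[v] && sub[v] > compSize / 2)
            return findCentroid(v, u, compSize);
    return u;
}

void decompose(int u, int parentCentroid) {
    int c = findCentroid(u, -1, calcSize(u, -1));
    removed[c] = true;
    cpar[c] = parentCentroid;       // link c into the centroid tree
    // (this is also where one would typically record dist(x, c)
    //  for every node x in c's component, for the per-centroid Fenwick trees)
    for (int v : adj[c])
        if (!removed[v])
            decompose(v, c);        // recurse into each remaining component
}
// call decompose(1, -1) after reading the tree
```

Because each component at least halves in size per level, every node has O(log n) centroid ancestors; walking those ancestors and touching one Fenwick tree per ancestor is what gives the O(log^2 n) per operation mentioned above.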

Yeah same case here too.

I also have the same doubt and am waiting for an official or unofficial editorial.

Any idea when the official editorials will show up?

Here's an O(log^2(n))-per-query solution by @spj_29 that passed:

https://www.codechef.com/viewsolution/16467008

At each centroid, a Fenwick tree is built over 1 to the maximum distance (with respect to the original tree) possible among the nodes in that centroid's component of the decomposed tree. At level 0 the maximum distance can be n, at the next level n/2, and so on down to 1, hence the space complexity is O(n log n).

This makes the precomputation quite fast.
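To make the layout above concrete, here is a hedged sketch assuming each centroid c keeps a Fenwick tree indexed by distance from c in the original tree, sized by the largest such distance inside c's component; what value is accumulated per distance depends on the exact CHEFFIB statement, so the update and query below just add and sum generic values:

```cpp
#include <bits/stdc++.h>
using namespace std;

// One Fenwick tree per centroid, indexed by (distance + 1) so it is 1-based.
// bit[c] is resized to maxDist[c] + 2 during decomposition (not shown),
// where maxDist[c] is the largest original-tree distance from centroid c
// to any node in its component -- this is what bounds total memory by O(n log n).
vector<vector<long long>> bit;

void pointUpdate(int c, int dist, long long val) {
    for (int i = dist + 1; i < (int)bit[c].size(); i += i & -i)
        bit[c][i] += val;
}

long long prefixQuery(int c, int dist) {              // sum over distances <= dist
    if (dist < 0) return 0;                            // query radius too small at this centroid
    dist = min(dist, (int)bit[c].size() - 2);          // clamp to distances that actually occur
    long long res = 0;
    for (int i = dist + 1; i > 0; i -= i & -i)
        res += bit[c][i];
    return res;
}
```

Sizing each tree by its component's maximum distance rather than by n is what keeps the total allocation at O(n log n) and the precomputation fast, as noted above; an update or query at node u then visits u's O(log n) centroid ancestors (with the usual second tree per centroid to cancel the over-counted contribution from the child component u came from).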


I tried the same thing as @spj_29 but got TLE. I think O(n log^2 n) requires heavy optimization.

@sdssudhu One thing that I can see is that your LCA method was O(log n); you should have used the O(1) method.
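For reference, one standard way to get O(1) LCA (and hence O(1) original-tree distance) after O(n log n) preprocessing is a sparse table over the Euler tour; a self-contained sketch, not taken from either submission:

```cpp
#include <bits/stdc++.h>
using namespace std;

const int MAXN = 100005, LOG = 18;
vector<int> adj[MAXN];
int firstOcc[MAXN], depth[MAXN];
vector<int> euler;                        // nodes in Euler-tour order
int sparse[2 * MAXN][LOG];                // position of the shallower node in each range

void dfs(int u, int p, int d) {
    depth[u] = d;
    firstOcc[u] = euler.size();
    euler.push_back(u);
    for (int v : adj[u])
        if (v != p) {
            dfs(v, u, d + 1);
            euler.push_back(u);           // re-enter u after each child
        }
}

void buildSparse() {
    int m = euler.size();
    for (int i = 0; i < m; i++) sparse[i][0] = i;
    for (int j = 1; (1 << j) <= m; j++)
        for (int i = 0; i + (1 << j) <= m; i++) {
            int a = sparse[i][j - 1], b = sparse[i + (1 << (j - 1))][j - 1];
            sparse[i][j] = depth[euler[a]] < depth[euler[b]] ? a : b;
        }
}

int lca(int u, int v) {
    int l = firstOcc[u], r = firstOcc[v];
    if (l > r) swap(l, r);
    int j = 31 - __builtin_clz(r - l + 1);
    int a = sparse[l][j], b = sparse[r - (1 << j) + 1][j];
    return depth[euler[a]] < depth[euler[b]] ? euler[a] : euler[b];
}

int dist(int u, int v) {                  // distance in the original tree
    return depth[u] + depth[v] - 2 * depth[lca(u, v)];
}
// usage: dfs(1, 0, 0); buildSparse(); then dist(u, v) is O(1) per call
```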

I tried O(n log^2 n) preprocessing for the LCA too in some of my earlier submissions (I can't find them now, though). But I think there is a better and faster solution to this, maybe n log n * (log(log n))^2.

Can this be solved using HLD?
I was thinking of updating only the chain heads as we move up, but I'm not sure.
Please, if anyone could tell…