KGP13C - Editorial



Editorialist: Jingbo Shang




Dynamic Programming


Given a stair with K steps, merge some adjacent steps so that exactly M steps remain, while minimizing the total volume added.


Only consecutive steps can be merged in this problem, which suggests a dynamic programming approach over prefixes of the stair.

Let cost[L][R] denote the cost of merging steps L..R into a single step. We can build this table by brute force in O(K^3) time directly from the definition, which is affordable since K is at most 500 here.
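As a sketch, the cost table can be filled by extending each range one step at a time (the names below and the exact merge rule, filling a block up to its tallest step, are assumptions, since the problem statement is not quoted here). Note that the incremental form shown is already O(K^2); a literal triple loop over L, R, and the steps in between would be the O(K^3) version.

```cpp
#include <vector>
#include <algorithm>
using namespace std;

// cost[l][r] (1-indexed): volume added when steps l..r are merged into one.
// ASSUMPTION: merging raises every step in the block to the tallest height
// in the block, so the added volume is maxHeight * totalLength - totalArea.
vector<vector<long long>> buildCost(const vector<long long>& h,
                                    const vector<long long>& len) {
    int k = (int)h.size();
    vector<vector<long long>> cost(k + 1, vector<long long>(k + 1, 0));
    for (int l = 1; l <= k; ++l) {
        long long maxH = 0, totalLen = 0, area = 0;
        for (int r = l; r <= k; ++r) {   // extend the block one step at a time
            maxH = max(maxH, h[r - 1]);
            totalLen += len[r - 1];
            area += h[r - 1] * len[r - 1];
            cost[l][r] = maxH * totalLen - area;  // fill the block to the top
        }
    }
    return cost;
}
```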

Let f[i][j] denote the minimum cost of merging the first i steps into j steps. Then the problem can be solved with the following recurrence:

f[i][j] = min { f[k][j - 1] + cost[k + 1][i] }  for 0 <= k < i.

Initially, by definition, we have

f[0][0] = 0, f[0][j] = +infinity for j > 0.

Therefore, the answer f[K][M] can be computed in O(K^3) time.
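Putting the recurrence and the initialization together, the cubic solution might look as follows (a sketch; the heights, lengths, and the merge rule of filling a block up to its tallest step are assumptions about the statement):

```cpp
#include <vector>
#include <algorithm>
using namespace std;

// O(K^3) DP: f[i][j] = minimum cost of merging the first i steps into j steps.
// ASSUMPTION: merging a block raises it to its tallest step's height.
long long solveCubic(const vector<long long>& h,
                     const vector<long long>& len, int m) {
    int K = (int)h.size();
    // cost[l][r]: volume added when merging steps l..r (1-indexed)
    vector<vector<long long>> cost(K + 1, vector<long long>(K + 1, 0));
    for (int l = 1; l <= K; ++l) {
        long long maxH = 0, totalLen = 0, area = 0;
        for (int r = l; r <= K; ++r) {
            maxH = max(maxH, h[r - 1]);
            totalLen += len[r - 1];
            area += h[r - 1] * len[r - 1];
            cost[l][r] = maxH * totalLen - area;
        }
    }
    const long long INF = (long long)4e18;
    vector<vector<long long>> f(K + 1, vector<long long>(m + 1, INF));
    f[0][0] = 0;                          // empty prefix, zero steps, zero cost
    for (int j = 1; j <= m; ++j)
        for (int i = j; i <= K; ++i)      // need at least j original steps
            for (int k = j - 1; k < i; ++k)   // last merged block is k+1..i
                if (f[k][j - 1] < INF)
                    f[i][j] = min(f[i][j], f[k][j - 1] + cost[k + 1][i]);
    return f[K][m];
}
```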

In fact, we can also further optimize this algorithm to O(K^2).

Observe that cost[L][R] can be expressed through a few precomputed values indexed by L and R separately (prefix sums of lengths and areas, plus the step heights). Since the stair is ascending, merging the block [L, R] fills it up to the height of its rightmost step, so

cost[L][R] = Height[R] * (SLeng[R] - SLeng[L - 1]) - (SArea[R] - SArea[L - 1]),

where SLeng and SArea are prefix sums of the step lengths and areas (Area[i] = Height[i] * Leng[i]). This lets us evaluate any cost[L][R] in O(1) after O(K) preprocessing.
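A minimal sketch of O(1) cost queries via prefix sums (names are illustrative; it assumes the heights are ascending, so a merged block [l, r] is filled up to the height of step r):

```cpp
#include <vector>
using namespace std;

// O(1) merge-cost queries after O(K) preprocessing.
// ASSUMPTION: heights are ascending, so a merged block [l, r] is
// filled up to h[r], and cost = h[r] * blockLength - blockArea.
struct StairCost {
    vector<long long> h, SL, SA;   // heights; prefix sums of lengths, areas
    StairCost(const vector<long long>& h_, const vector<long long>& len)
        : h(h_), SL(h_.size() + 1, 0), SA(h_.size() + 1, 0) {
        for (size_t i = 1; i <= h.size(); ++i) {
            SL[i] = SL[i - 1] + len[i - 1];
            SA[i] = SA[i - 1] + h[i - 1] * len[i - 1];
        }
    }
    long long cost(int l, int r) const {   // 1-indexed, l <= r
        return h[r - 1] * (SL[r] - SL[l - 1]) - (SA[r] - SA[l - 1]);
    }
};
```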

For the dynamic programming itself, observe that f[i][j] depends only on f[k][j - 1] with k < i. Substituting the closed form of the cost, cost[k + 1][i] = Height[i] * (SLeng[i] - SLeng[k]) - (SArea[i] - SArea[k]), the recurrence becomes

f[i][j] = Height[i] * SLeng[i] - SArea[i] + min { A_k * Height[i] + B_k },

where A_k = -SLeng[k] and B_k = f[k][j - 1] + SArea[k] depend only on k. That is, f[i][j] is a minimum over lines y = A_k * x + B_k evaluated at x = Height[i]. If we loop over j first and then enumerate i from small to large, the query points Height[i] are increasing while the slopes A_k are decreasing in k. We can therefore maintain the lower envelope of these lines in a double-ended queue: new lines are appended at the back (popping lines that can no longer be optimal), and queries pop obsolete lines from the front. Every k enters and leaves the deque at most once, so each minimum query takes amortized O(1) time, and the total complexity is reduced to O(K^2).
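The deque optimization above can be sketched as follows (a minimal monotone convex-hull-trick implementation; all names, plus the ascending-heights and fill-to-the-tallest-step assumptions, are illustrative rather than taken from the statement):

```cpp
#include <vector>
#include <deque>
#include <algorithm>
using namespace std;

// O(K^2) DP via the monotone convex hull trick.
// ASSUMPTIONS: heights h are ascending, lengths are positive, and merging
// a block [l, r] fills it up to h[r].  For fixed j,
//   f[i][j] = h[i]*SL[i] - SA[i] + min_k { -SL[k]*h[i] + f[k][j-1] + SA[k] },
// a minimum over lines y = a_k*x + b_k queried at increasing x = h[i].
struct Line {
    long long a, b;
    long long at(long long x) const { return a * x + b; }
};

// With slopes l.a > m.a > r.a, m never attains the minimum iff the
// l-m intersection lies at or after the m-r intersection.
static bool useless(const Line& l, const Line& m, const Line& r) {
    return (__int128)(m.b - l.b) * (m.a - r.a) >=
           (__int128)(r.b - m.b) * (l.a - m.a);
}

long long solveFast(const vector<long long>& h,
                    const vector<long long>& len, int m) {
    int K = (int)h.size();
    vector<long long> SL(K + 1, 0), SA(K + 1, 0);   // prefix sums
    for (int i = 1; i <= K; ++i) {
        SL[i] = SL[i - 1] + len[i - 1];
        SA[i] = SA[i - 1] + h[i - 1] * len[i - 1];
    }
    const long long INF = (long long)4e18;
    vector<long long> prev(K + 1, INF), cur(K + 1, INF);
    prev[0] = 0;                                    // f[0][0] = 0
    for (int j = 1; j <= m; ++j) {
        deque<Line> dq;
        fill(cur.begin(), cur.end(), INF);
        for (int i = j; i <= K; ++i) {
            int k = i - 1;                          // newly available state
            if (prev[k] < INF) {                    // add line for f[k][j-1]
                Line ln{-SL[k], prev[k] + SA[k]};   // slopes decrease with k
                while (dq.size() >= 2 &&
                       useless(dq[dq.size() - 2], dq.back(), ln))
                    dq.pop_back();
                dq.push_back(ln);
            }
            long long x = h[i - 1];                 // queries are increasing
            while (dq.size() >= 2 && dq[1].at(x) <= dq[0].at(x))
                dq.pop_front();                     // drop obsolete lines
            if (!dq.empty())
                cur[i] = h[i - 1] * SL[i] - SA[i] + dq.front().at(x);
        }
        swap(prev, cur);
    }
    return prev[K];                                 // f[K][m]
}
```

Rolling the DP table over j keeps memory at O(K), and each level rebuilds its deque from scratch since the lines' intercepts involve f[k][j - 1].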