FSFSFS - Editorial



Author: Jingbo Shang
Tester: Xiaoxu Guo
Editorialist: Pushkar Mishra




Primes, Bitmask DP


Given a number N, in how many ways can we remove some of the numbers 1, 2, \ldots, N from the product N! = 1 \cdot 2 \cdots N so that the resulting product is the maximum possible perfect square?


We need to reduce the problem to something simpler. Let us start with the most natural reduction: think in terms of prime factorizations.

Let us look at some of the key ideas which will give us a hint about our required complexity:

  • Only the prime factorizations of the numbers matter.
  • Each number can be treated as a set of prime factors.
  • No number up to 3000 contains two or more prime factors greater than \sqrt{3000}: their product alone would already exceed 3000.
  • For small prime factors (less than \sqrt{3000}; there are only 16 such primes), we record in a bitmask whether we have covered them or not.
  • For large prime factors (greater than \sqrt{3000}), we just want to select at most one number to cover each of them. This number is either the prime itself or a multiple of it.
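
The two structural claims above (16 small primes, and at most one large prime factor per number) are easy to verify with a short script; this is just a sanity check, and all names are ours:

```python
def primes_up_to(limit):
    """Simple sieve of Eratosthenes."""
    is_comp = [False] * (limit + 1)
    primes = []
    for i in range(2, limit + 1):
        if not is_comp[i]:
            primes.append(i)
            for j in range(i * i, limit + 1, i):
                is_comp[j] = True
    return primes

primes_3000 = primes_up_to(3000)
prime_set = set(primes_3000)

# "Small" primes: those not exceeding sqrt(3000) ~ 54.77.
small = [p for p in primes_3000 if p * p <= 3000]
assert len(small) == 16

# No number up to 3000 has two (or more) prime factors above sqrt(3000):
# after dividing out all small primes, what remains must be 1 or a single
# large prime, because a product of two large primes exceeds 3000.
for n in range(2, 3001):
    m = n
    for p in small:
        while m % p == 0:
            m //= p
    assert m == 1 or m in prime_set
```

The second loop confirms that any number up to 3000 decomposes as (product of small primes) times at most one large prime, which is what makes the bitmask reduction below possible.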

We can now do a bitmask DP over the 16 small primes that we need to deal with. The bitmask DP is explained with comments in the linked solution.


\mathcal{O}(2^{16} \cdot 3000) in the worst case.




I’m sure this contest’s problem codes have a deep meaning :smiley:

My solution is also bitmask DP, but it’s not the one described here. In fact, I don’t know how to accurately estimate its complexity; the exponential part is something like O(3^{16}) here, but that’s a very loose bound for various reasons.

So, first, what will the maximum square be? Let’s find the power of each prime p in N!. If this power is odd, we have to remove a number divisible by p, and what better number to remove than p itself? In other words, the optimal square can be found by removing all primes which have an odd power in N!. Let’s store them in a list of “bad” primes.
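
This step can be sketched directly with Legendre’s formula for the exponent of a prime in N! (a minimal sketch; the function names are ours):

```python
def exponent_in_factorial(p, n):
    """Exponent of the prime p in n!, via Legendre's formula:
    sum of floor(n / p^k) over k >= 1."""
    e, pk = 0, p
    while pk <= n:
        e += n // pk
        pk *= p
    return e

def bad_primes(n):
    """Primes whose exponent in n! is odd; removing exactly these
    (as factors of the removed numbers) leaves the maximum square."""
    sieve = [True] * (n + 1)
    primes = []
    for i in range(2, n + 1):
        if sieve[i]:
            primes.append(i)
            for j in range(i * i, n + 1, i):
                sieve[j] = False
    return [p for p in primes if exponent_in_factorial(p, n) % 2 == 1]
```

For example, `bad_primes(10)` is `[7]`: in 10!, the exponents of 2, 3, 5 are 8, 4, 2 (all even), while 7 appears exactly once.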

What we’re counting is the number of subsets of \lbrace1..N\rbrace (the numbers we remove) such that each removed number is a product of distinct bad primes and each bad prime divides exactly one of them. It looks like counting disjoint set covers.

This isn’t quite the representation we’re looking for - it’s better to look at it as counting the number of ways to partition the bad primes into sets with the product of each set being > 1 and \le N; we set the number 1 aside here, but including or excluding it just multiplies the final answer by 2.
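
On tiny hypothetical inputs, this reformulation can be checked by directly enumerating set partitions. Below is a reference brute force, not the intended solution; the names and the toy inputs are ours:

```python
from math import prod

def set_partitions(items):
    """Yield all partitions of `items` into non-empty blocks."""
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for part in set_partitions(rest):
        # Put `first` into one of the existing blocks...
        for i in range(len(part)):
            yield part[:i] + [[first] + part[i]] + part[i + 1:]
        # ...or into a new block of its own.
        yield [[first]] + part

def count_removals(bad, n):
    """Ways to partition the bad primes into blocks with product <= n,
    times 2 for optionally removing the number 1 as well."""
    valid = sum(1 for part in set_partitions(bad)
                if all(prod(block) <= n for block in part))
    return 2 * valid
```

For the hypothetical bad primes {2, 3, 5, 7} with N = 15, exactly 6 of the 15 set partitions keep every block product at most 15, giving 12 ways in total.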

What’s the bound for the second largest bad prime in any such set? The simplest estimate is 16; by checking all N, we can decrease it to 13. Each of our sets is a subset of those small primes, plus possibly one larger prime.

We’ll move from the largest to the smallest bad prime; let DP[s] be the number of ways in which we’ve already used the subset s of small primes. If the current prime isn’t among the small ones, we can try all subsets s' of small primes with a sufficiently small product (we can precompute those 2^{13} products) together with all s, and generate a new array DP_{new}[]: for each such pair (s', s), we add DP[s] to DP_{new}[s|s']. If the current prime is small, there are minor changes - we do nothing if it was taken before, and must take it now (include it in s') otherwise - but the principle stays the same.
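
A simplified sketch of this DP (all names ours): for brevity the mask here ranges over all bad primes rather than only the small ones, so it is feasible only for toy inputs, but the largest-to-smallest processing and the DP[s] → DP_{new}[s|s'] transition are the same:

```python
def count_covers(bad, n):
    """Partitions of the bad primes into groups, each with product <= n.
    Primes are processed from largest to smallest; a group is opened at
    its largest prime, which picks the smaller primes that join it."""
    primes = sorted(bad, reverse=True)
    k = len(primes)
    dp = {0: 1}                      # mask of already-covered primes -> ways
    for i, p in enumerate(primes):
        ndp = {}
        for s, ways in dp.items():
            if s >> i & 1:           # p was already placed in an earlier group
                ndp[s] = ndp.get(s, 0) + ways
                continue
            # p opens a new group: choose a set s' of smaller, still-free
            # primes to share p's removed number, keeping the product <= n.
            free = [j for j in range(i + 1, k) if not s >> j & 1]
            stack = [(0, s | 1 << i, p)]
            while stack:
                idx, mask, product = stack.pop()
                if idx == len(free):
                    ndp[mask] = ndp.get(mask, 0) + ways
                    continue
                stack.append((idx + 1, mask, product))   # skip this prime
                q = primes[free[idx]]
                if product * q <= n:                     # or add it to the group
                    stack.append((idx + 1, mask | 1 << free[idx], product * q))
        dp = ndp
    return dp.get((1 << k) - 1, 0)
```

For the toy bad primes {2, 3, 5, 7} with N = 15, `count_covers` returns 6; the final answer would then be doubled to account for the number 1.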

Okay, so what’s the complexity? The number of ways to pick (s',s) in each step is O(3^{13}) (it’s possible to precompute the disjoint pairs in O(13^2 \cdot 2^{13})), which is kind of big considering that there are around 300 bad primes at most. However, the number of ways to pick s' is much smaller in practice, because its product must be \le N/(\mathrm{current\ bad\ prime}), which is usually (for all but the small primes) \le \sqrt{N}. If we assume there won’t be more than 10 ways on average to pick s', then we have complexity… O(300\cdot10\cdot2^{13}). Of course, this is a non-asymptotic estimate; it simply contains too many empirical numbers.


Thank you so much xellos. :slight_smile:
I really did not get time to work on this one. University is really demanding at times.

I know what you mean, I had one editorial of my own to write at the time. Well, you could always just publish hints as hints with “further explanation later”.