### PROBLEM LINK:

**Author and Editorialist:** Lalit Kundu

**Tester:** Kamil Debowski

### DIFFICULTY:

Medium-Hard

### PREREQUISITES:

combinatorics, fast-fourier transform, divide and conquer

### PROBLEM:

Given an array of integers A_1, A_2, \ldots, A_N and an integer K, count the number of ways to choose a multiset (*i.e.*, a set that may contain duplicate elements) of size K from the array. Two multisets are different iff the count of some element differs between them.

### QUICK EXPLANATION:


The answer is the coefficient of x^K in the polynomial \dfrac{(1 - x^{f_1 + 1})*(1 - x^{f_2 + 1})*\cdots*(1 - x^{f_k + 1})}{(1-x)^{k}}, where f_1, f_2, \ldots, f_k are the frequencies of the k distinct elements present in the array A. The numerator of this polynomial can be evaluated explicitly using any fast polynomial multiplication algorithm, since its degree is at most N.

### EXPLANATION:


**Theorem 1**:

It’s a well-known result from generating functions that the number of non-negative integral solutions to the equation x_1 + x_2 + \cdots + x_r = N, where 0 \le x_i \le l_i, is the coefficient of x^N in (1 + x + x^2 + \cdots + x^{l_1})*(1 + x + x^2 + \cdots + x^{l_2})*\cdots*(1 + x + x^2 + \cdots + x^{l_r}), which, using the geometric series identity 1 + x + \cdots + x^l = \frac{1 - x^{l+1}}{1 - x}, we can rewrite as \dfrac{(1 - x^{l_1 + 1})*(1 - x^{l_2 + 1})*\cdots*(1 - x^{l_r + 1})}{(1-x)^{r}}.
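As a quick sanity check (not part of the original editorial), the snippet below compares a brute-force count of bounded solutions against the coefficient extracted from the product of truncated geometric series; the limits `(2, 3, 1)` are an arbitrary illustrative choice:

```python
from itertools import product

def bounded_solutions(limits, n):
    """Brute-force count of solutions to x_1 + ... + x_r = n with 0 <= x_i <= l_i."""
    return sum(1 for xs in product(*(range(l + 1) for l in limits)) if sum(xs) == n)

def poly_mul(a, b):
    """Schoolbook multiplication of polynomials given as coefficient lists."""
    res = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            res[i + j] += ai * bj
    return res

def coeff_via_gf(limits, n):
    """Coefficient of x^n in prod_i (1 + x + ... + x^{l_i})."""
    poly = [1]
    for l in limits:
        poly = poly_mul(poly, [1] * (l + 1))   # 1 + x + ... + x^l
    return poly[n] if n < len(poly) else 0

# Both counts agree, e.g. for limits (2, 3, 1) and n = 4:
print(bounded_solutions([2, 3, 1], 4), coeff_via_gf([2, 3, 1], 4))   # 5 5
```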

**Theorem 2**:

The coefficient of x^k in (1-x)^{-N} is C(N + k - 1, k), where C(i, j) is the number of ways to choose j distinct elements from a set of i distinct elements. This can be proved using the Taylor series expansion.
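To make the identity concrete, here is a small verification (my own illustration, not from the editorial): expand (1-x)^{-N} as the N-fold product of the truncated geometric series 1/(1-x) and compare against C(N + k - 1, k):

```python
from math import comb

def inv_one_minus_x_pow(n, terms):
    """Coefficients of (1-x)^{-n} up to x^{terms-1}, obtained by
    multiplying the truncated series 1/(1-x) = 1 + x + x^2 + ... n times."""
    poly = [1]
    for _ in range(n):
        nxt = [0] * terms
        for i, c in enumerate(poly):
            for j in range(terms - i):   # convolve with (1, 1, 1, ...)
                nxt[i + j] += c
        poly = nxt
    return poly

coeffs = inv_one_minus_x_pow(3, 6)                 # (1-x)^{-3}, first 6 coefficients
print(coeffs)                                      # [1, 3, 6, 10, 15, 21]
print([comb(3 + k - 1, k) for k in range(6)])      # matches Theorem 2
```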

We’ll omit the proof here, but you can follow the link above to explore further.

Now, assume we have k distinct elements in the array with frequencies f_1, f_2, \ldots, f_k. Let x_i be the frequency of the i^{th} distinct element in the chosen multiset; then x_1 + x_2 + \cdots + x_k = K, where 0 \le x_i \le f_i. So, from Theorem 1, we need to find the coefficient of x^K in the polynomial \dfrac{(1 - x^{f_1 + 1})*(1 - x^{f_2 + 1})*\cdots*(1 - x^{f_k + 1})}{(1-x)^{k}}. Note that we can evaluate the numerator explicitly, since its degree is O(f_1 + f_2 + \cdots + f_k) = O(N). Suppose we use FFT to multiply two polynomials of degree p in O(p \log p). If we multiply the k numerator terms one by one in the given order, it could still take O(N^2) time in the worst case. Here the idea of divide and conquer comes in: at each step we split the numerator into two halves of k/2 terms each, solve them recursively, and multiply the two resulting polynomials. This approach takes O(N \log^{2} N) time in the worst case. Finally, by Theorem 2, dividing by (1-x)^{k} amounts to convolving the numerator with the coefficients C(k + i - 1, i), after which we read off the coefficient of x^K.
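The whole approach can be sketched as below. This is my own illustrative sketch: the schoolbook `poly_mul` stands in for FFT multiplication (which L41 discusses), and the modular arithmetic the actual problem presumably requires is omitted for clarity:

```python
from collections import Counter
from math import comb

def poly_mul(a, b):
    # Placeholder for FFT-based multiplication; O(p^2) here for readability.
    res = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            res[i + j] += ai * bj
    return res

def numerator(freqs):
    """Divide-and-conquer product of (1 - x^{f_i + 1}) over all frequencies."""
    def solve(lo, hi):
        if lo == hi:
            poly = [0] * (freqs[lo] + 2)
            poly[0], poly[-1] = 1, -1          # 1 - x^{f+1}
            return poly
        mid = (lo + hi) // 2
        return poly_mul(solve(lo, mid), solve(mid + 1, hi))
    return solve(0, len(freqs) - 1)

def count_multisets(arr, K):
    freqs = list(Counter(arr).values())
    k = len(freqs)
    num = numerator(freqs)
    # Dividing by (1-x)^k: by Theorem 2, the coefficient of x^K in the answer is
    # sum over j of num[j] * C(k + (K - j) - 1, K - j).
    return sum(num[j] * comb(k + K - j - 1, K - j)
               for j in range(min(K, len(num) - 1) + 1))

# {1,1}, {1,2}, {1,3}, {2,3} -> 4 multisets of size 2:
print(count_multisets([1, 1, 2, 3], 2))   # 4
```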

For multiplication of two polynomials you can use FFT or even Karatsuba. For further reading on these algorithms you can read this.
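For completeness, here is a minimal sketch (my own, not from the editorial) of polynomial multiplication via a recursive radix-2 FFT over the complex numbers; a contest solution would typically use an iterative FFT or NTT to work under a modulus:

```python
import cmath

def fft(a, invert=False):
    """Recursive radix-2 FFT; len(a) must be a power of two.
    With invert=True, computes the unnormalized inverse transform."""
    n = len(a)
    if n == 1:
        return a[:]
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    sign = -1 if invert else 1
    res = [0] * n
    for j in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * j / n)
        res[j] = even[j] + w * odd[j]
        res[j + n // 2] = even[j] - w * odd[j]
    return res

def fft_mul(a, b):
    """Multiply integer polynomials (coefficient lists) in O(p log p)."""
    n = 1
    while n < len(a) + len(b) - 1:
        n *= 2
    fa = fft(a + [0] * (n - len(a)))
    fb = fft(b + [0] * (n - len(b)))
    prod = fft([x * y for x, y in zip(fa, fb)], invert=True)
    # Divide by n to finish the inverse transform, round away float noise.
    return [round((c / n).real) for c in prod[:len(a) + len(b) - 1]]

# (1 + 2x + 3x^2)(4 + 5x) = 4 + 13x + 22x^2 + 15x^3
print(fft_mul([1, 2, 3], [4, 5]))   # [4, 13, 22, 15]
```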