There are N candidates and M companies. Each candidate has a minimum salary he expects. You are given N*M values, one per (candidate, company) pair: a 0/1 flag telling whether that company accepts that candidate. A company pays a fixed salary to every candidate it employs (the same salary for all of them), and it has an upper bound on the number of candidates it can make offers to.
Each candidate, from 1 to N in this order, calls all companies that accept him and checks which of them have not yet reached their offer limit. Among those, he picks the company that offers the maximum salary, provided it is at least his demanded salary. You have to find the number of candidates that end up with a job, the total amount of salary the candidates will get, and the number of companies that fail to employ even a single candidate.
**No two companies provide the same salary.**
Thanks to the bolded constraint (which makes the problem much simpler), observe that any candidate always has exactly one valid choice, or none at all: since salaries are distinct, the best available company is unique, and a candidate finds no company only when every company that accepts him has already reached its offer limit or pays less than he demands.
So turning the instructions in the problem statement into code, a straightforward simulation, is enough to solve the problem. Process candidates in order; for each one, iterate over all companies and keep those that accept this candidate, provide a salary that satisfies his demand, and, most importantly, still have vacancies. Among these, pick the one with the best salary, assign the candidate to that company, and decrement its remaining vacancies by one. Computing the required values is then easy. Check my implementation for a better understanding.
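The simulation above can be sketched as follows (a minimal sketch; all identifiers such as `accepts`, `demand`, `salary`, and `limit` are illustrative names, not fixed by the problem statement):

```python
def simulate(n, m, accepts, demand, salary, limit):
    """Simulate the hiring process described above.

    accepts[i][j] -- 1 if company j accepts candidate i, else 0
    demand[i]     -- minimum salary candidate i expects
    salary[j]     -- fixed salary company j pays
    limit[j]      -- maximum number of offers company j can make
    """
    hired = [0] * m          # offers already made by each company
    employed = 0             # candidates who got a job
    total_salary = 0         # sum of salaries over all hired candidates

    for i in range(n):       # candidates call in order 1..N
        best = -1
        for j in range(m):
            # Company must accept the candidate, still have a vacancy,
            # and pay at least his demanded salary.
            if (accepts[i][j] == 1 and hired[j] < limit[j]
                    and salary[j] >= demand[i]
                    and (best == -1 or salary[j] > salary[best])):
                best = j
        if best != -1:       # the best satisfying company is unique
            hired[best] += 1
            employed += 1
            total_salary += salary[best]

    idle = sum(1 for j in range(m) if hired[j] == 0)
    return employed, total_salary, idle
```

For example, with two candidates and two companies where the second company pays more, the first candidate takes the higher offer and may exhaust that company's single vacancy before the second candidate calls.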
The solution runs in O(N*M) time.