The INOI 2012 question paper can be accessed by clicking on the link below and scrolling down to the end.
My code’s here
For the first question, this is what I tried. Let's consider the sample input:

```
18 7 6
23 10 27
20 9 14
```
After taking the input, I maintain a vector of pairs called sum, which contains each person's total time across all 3 events along with that person's index (indexed from 0), i.e. {(31, 0), (60, 1), (43, 2)} for the sample above.
Then, I sort this vector in descending order of the total time for all 3 events. The order of indices in this sorted vector is the actual order in which the citizens take part (a greedy strategy). The sorted order is {(60, 1), (43, 2), (31, 0)}.
So 1, 2, 0 is the order in which the citizens take part in the events. Now, I maintain a result array, where each entry is that citizen's finishing time, which consists of: the waiting time for all previous COBOL participants (from the cobol array), plus the citizen's own COBOL time, plus their total time for the 3 events.
Finally, the maximum element in this result array gives me the answer. However, this algorithm gets WA and TLE on the server. I understand why it is time-inefficient: while calculating the result array, the number of iterations is of the order of 10^10. I'll work on that on my own. But why is the algorithm otherwise wrong? The reason behind adopting that greedy strategy was to minimize the maximum element in the result array by adding the most waiting time to the person with the least total time for the 3 events.
Could someone help me disprove this, and also give me a pointer on how to formally prove or disprove ad-hoc greedy algorithms? I want to learn not to always trust intuition.