This article is about the process of (up)solving this Codeforces problem from the last contest in which I participated (and lost 27 rating points). I loved this problem! Not because it is particularly interesting or difficult compared to others I have solved, but because of the feeling of achievement I got when I finally cracked it.
Here is the problem statement, maybe give it a thought before reading the solution.
Statement of the problem
Alice and Bob are playing a game. Initially, there are $n$ cakes, with the $i$-th cake having a tastiness value of $a_i$.
Alice and Bob take turns eating them, with Alice starting first:
- In her turn, Alice chooses and eats any remaining cake whose tastiness is strictly greater than the maximum tastiness of any of the cakes she's eaten before that. Note that on the first turn, she can choose any cake.
- In his turn, Bob chooses any remaining cake and eats it.
The game ends when the current player can't eat a suitable cake. Let $x$ be the number of cakes that Alice ate. Then, Alice wants to maximize $x$, while Bob wants to minimize $x$.
Find out how many cakes Alice will eat if both players play optimally.
Input
Each test contains multiple test cases. The first line of input contains a single integer $t$ — the number of test cases. The description of the test cases follows.
The first line of each test case contains a single integer $n$ — the number of cakes.
The second line of each test case contains $n$ integers $a_1, a_2, \ldots, a_n$ — the tastiness values of the cakes.
It is guaranteed that the sum of $n$ over all test cases does not exceed 5000.
Output
For each test case, output a single integer — the number of cakes Alice will eat if both players play optimally.
First thoughts
First of all, a little bit of context. This problem is the fourth problem (D) of a contest with 9 problems in total and a duration of 3 hours. At the time I encountered it, I had solved the 3 problems before it (and it remained that way; this is my final standing). I was feeling pretty confident since I had solved the first 3 problems with good timing compared to friends with significantly higher ratings. I had even solved problem C, which one of them didn't.
For problem D I had no immediate idea of the solution, so I went to look at problem E, with the same result. So, I stuck with problem D.
Strategies
Back to the problem. Alice's strategy is quite simple: she has to eat the least tasty cake she can. That way, the threshold for the next cake she is allowed to eat stays as low as possible, which maximizes her options.
Bob's strategy is not so simple. In general, he must prevent Alice from eating as many cakes as possible.
Something to notice is that, because Alice can never eat two cakes with the same tastiness, it helps to build a list of the frequencies of the tastiness values of the cakes.
```cpp
// Sketch reconstructed from the description: read each test case and
// build the (sorted) frequency list of tastiness values.
int t; cin >> t;
while (t--) {
    int n; cin >> n;
    map<int, int> freq;  // tastiness value -> number of cakes with it
    for (int i = 0, a; i < n; i++) {
        cin >> a;
        freq[a]++;
    }
    // Rest of the code
}
```
Heuristics
The first thing I thought was: Bob has to eat every cake starting from the most tasty down. But why? It seems naive because it is. This is in fact the worst strategy for Bob, provided he has to play each turn: he removes exactly the cakes Alice would reach last, if ever.
Then, if not starting from the most tasty, maybe starting from the least tasty: Bob should eat the least tasty cakes that Alice has yet to eat. But Bob needs to eat every cake of a particular tastiness level before Alice can get to it; otherwise, even if he ate all but one, nothing changes for Alice, since she can only eat one cake of each tastiness anyway.
This poses a challenge to Bob's choice of dessert to sabotage Alice. Choosing one with a high frequency might prevent him from removing others with lower frequencies. A heuristic for Bob could be to eat first the cakes that are unique. But what after that?
Wrong answer
Maybe Bob should eat the cakes in order of frequency, as he can afford them. With each "move" of Alice, he can eat one cake, i.e., remove one from the frequency count of some tastiness value. So we can go stack by stack, accumulating how many cakes Alice has eaten, and after each one, search for a more tasty stack whose frequency fits within Bob's accumulated moves, so he can eat it entirely and prevent Alice from doing it.
```cpp
// This is the "Rest of the code" in the snippet above (reconstructed sketch).
// f holds the frequencies of the distinct tastiness values, in increasing
// order of tastiness. Bob greedily removes the least frequent stack he can.
int ans = 0;  // cakes Alice eats
int da = 0;   // Bob's spare moves: Alice's cakes minus Bob's cakes
vector<bool> gone(f.size(), false);  // stacks fully eaten by Bob
for (int i = 0; i < (int)f.size(); i++) {
    if (gone[i]) continue;
    ans++;  // Alice eats one cake from stack i...
    da++;   // ...which grants Bob one more move
    int best = -1;  // least frequent stack Bob can still afford
    for (int j = i + 1; j < (int)f.size(); j++)
        if (!gone[j] && f[j] <= da && (best == -1 || f[j] < f[best]))
            best = j;
    if (best != -1) { da -= f[best]; gone[best] = true; }
}
cout << ans << '\n';
```
In this approach, Bob goes greedily trying to eat the cakes from least to most frequent. This was my first attempt, and as you can see it led to a Wrong Answer.
Why?
A simple counter-example: let's say we have a frequency array like this: [1, 2, 2, 50, 50, 50, 50, 1, 1]. With this algorithm, Bob would eat the two 1's at the end, but Alice would get to eat from each of the first 6 stacks. Bob's strategy should be to eat from the third stack first; that way he prevents Alice from eating from it, and he still has time afterward to eat the 1's at the end.
So, should Bob greedily eat the first stack he can? That doesn't work either. Look at the example [1, 2, 2, 2, 1, 1, 2]. In this case, the optimal strategy for Bob is to eat the last 3 stacks; if he eats any other stack before them, Alice will get to the 1's.
Solution
With that last example, I got to a realization. Bob needs to take stacks of cakes such that the sum of their frequencies is less than or equal to the number of cakes Alice has taken. Also, the configuration should be achievable, i.e., Bob should be able to finish every stack he eats before Alice gets to it.
This isn't much of a realization 😅, it's kind of the bare minimum description of what a solution should look like. Nevertheless, it helped me shape my thoughts and eventually come up with the solution.
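To make that condition concrete, here is a small sketch of the feasibility check (my own illustration, not code from my submissions; `freq` and `bobTakes` are made-up names). Given the stacks' frequencies in increasing order of tastiness and the set of stacks Bob intends to eat entirely, it simulates Bob's spare moves:

```cpp
#include <cassert>
#include <vector>

// Check whether Bob can fully eat a given set of stacks. freq[i] is the
// number of cakes with the i-th distinct tastiness (in increasing order);
// bobTakes[i] marks the stacks Bob wants. Every stack Bob skips is eaten
// by Alice, which grants Bob one extra move.
bool feasible(const std::vector<int>& freq, const std::vector<bool>& bobTakes) {
    int budget = 0;  // Alice's cakes eaten so far minus Bob's
    for (std::size_t i = 0; i < freq.size(); i++) {
        if (bobTakes[i]) {
            if (budget < freq[i]) return false;  // Alice reaches the stack first
            budget -= freq[i];  // Bob spends freq[i] of his accumulated moves
        } else {
            budget++;  // Alice eats one cake from this stack
        }
    }
    return true;
}
```

For the counter-example above, taking the third stack plus the two trailing 1's is feasible, while taking the very first stack never is: Alice hasn't eaten anything yet when she reaches it.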
Too slow
The most naive approach that gets to the true solution would be to check every possible combination of Alice and Bob taking each value, checking whether the configuration is achievable. There are up to $n$ unique values of tastiness¹. If each value can be taken by Alice or Bob, that gives $2^n$ possible configurations. Checking all configurations would be impossible.
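As a sanity check, this naive search can be written as a bitmask enumeration over the stacks (again my own illustration, usable only for a handful of stacks, but handy for validating faster solutions against small cases):

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Try every way of assigning whole stacks to Bob; keep the feasible
// assignment that leaves Alice the fewest stacks. Exponential in the
// number of stacks, so only usable for tiny inputs.
int bruteForce(const std::vector<int>& freq) {
    int m = freq.size();
    int best = m;  // Alice eats every stack if Bob eats nothing
    for (int mask = 0; mask < (1 << m); mask++) {
        int budget = 0, aliceStacks = 0;
        bool ok = true;
        for (int i = 0; i < m && ok; i++) {
            if (mask & (1 << i)) {  // Bob tries to eat stack i entirely
                if (budget < freq[i]) ok = false;  // he can't finish in time
                else budget -= freq[i];
            } else {  // Alice eats one cake from stack i
                budget++;
                aliceStacks++;
            }
        }
        if (ok) best = std::min(best, aliceStacks);
    }
    return best;
}
```

On the two counter-examples from the previous section it gives the expected answers: 6 and 4 stacks for Alice, respectively.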
Let's see some numbers
A modern personal computer's CPU can do around $10^9$ elemental operations² per second. The fastest supercomputer in 2023 had a speed of around 1 billion gigaFLOPS³, that is, around $10^{18}$ elemental operations per second. There are around $10^{80}$ atoms in the universe⁴.
So, we have to check up to $2^{5000} \approx 10^{1505}$ possible configurations. If we had a supercomputer on each of the atoms of the universe and used them exclusively for our problem (also assuming each check is an elemental operation, which is not true), we would take around $10^{1407}$ seconds, or about $10^{1400}$ years.
In the context of competitive programming, the online judges run the submissions on (simulated) hardware with about the same computing power as a personal computer. So, for the problem in question, the solution should take less than, or in the same order of, $10^8$ elemental operations.
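The arithmetic behind those estimates (using the rough constants from this section) fits in a couple of helper functions:

```cpp
#include <cassert>
#include <cmath>

// With up to 5000 distinct tastiness values there are 2^5000 subsets;
// log10 of that tells us the number of decimal digits: about 1505.
double log10Configurations() { return 5000 * std::log10(2.0); }

// 10^80 atoms, each hosting a 10^18 ops/s supercomputer, perform
// 10^98 checks per second; subtracting the exponents gives the time
// in seconds as a power of ten: about 10^1407.
double log10Seconds() { return log10Configurations() - (80 + 18); }
```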
Almost fast enough
In my experience, these sorts of problems in which one could take something or not are usually solvable using Dynamic Programming, a technique in which one re-uses the optimal solution to smaller problems to solve the bigger ones.
The process of solving a problem using dynamic programming involves identifying the overlapping sub-problems whose optimal solutions can be used for the bigger problem, then defining the state variables of the subproblem and the recurrence relation to build the solution from the subproblems.
I found it difficult to get there because of the checks we needed to perform to be sure that Bob could eat some cakes, but in Figure 1 you can see the first sketches of the idea.
To check whether Bob can eat a stack of cakes, we need to know how many cakes Alice has eaten, and we should also keep count of the cakes Bob has eaten so far. So, the state could be a pair $(d, b)$, where $d$ is the number of cakes Bob can still eat (which is the number of cakes Alice has eaten minus the number of cakes Bob has eaten), and $b$ is the number of stacks of cakes that Bob has eaten.
The first stack can only be eaten by Alice, so the initial set of possible states is $\{(1, 0)\}$. Then we start traversing the array of frequencies $f$ from the second position, and for each position $i$ we apply the recurrence relation:
- For Bob to be able to eat the stack, we need a previous state $(d, b)$ such that $d \ge f_i$, and then that state becomes $(d - f_i, b + 1)$.
- If Bob decides not to eat that stack, Alice eats it and the state becomes $(d + 1, b)$.
This seems like we are also trying all possible combinations of taking or not taking each stack, but the difference is that we don't care about the exact configuration that led to a state; we only care about how many more cakes Bob can eat and how many stacks he has eaten. This makes the number of possible states $O(n^2)$, because $d$ and $b$ can each take $O(n)$ different values.
Because we need to go through the frequency array of size up to $n$, and for each position check $O(n^2)$ states, the total complexity of the algorithm would be $O(n^3)$. This is fast enough to get the result in a few minutes or seconds, but not fast enough for the contest; we need to make it even faster.
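Before any compression, the recurrence can be sketched like this (my own reconstruction of the idea, keeping the full set of states; the state set grows too much, which is what makes this version slow):

```cpp
#include <algorithm>
#include <cassert>
#include <set>
#include <utility>
#include <vector>

// Full-state DP sketch: a state (d, b) holds the number of spare moves Bob
// has (Alice's cakes minus his) and the number of stacks he has fully eaten.
int alicesCakes(const std::vector<int>& freq) {
    int m = freq.size();
    std::set<std::pair<int, int>> states = {{1, 0}};  // Alice eats the first stack
    for (int i = 1; i < m; i++) {
        std::set<std::pair<int, int>> next;
        for (auto [d, b] : states) {
            next.insert({d + 1, b});  // Alice eats one cake from stack i
            if (d >= freq[i])         // Bob has enough spare moves for stack i
                next.insert({d - freq[i], b + 1});
        }
        states = next;
    }
    int bobStacks = 0;
    for (auto [d, b] : states) bobStacks = std::max(bobStacks, b);
    return m - bobStacks;  // Alice eats one cake from every other stack
}
```

On the two counter-examples above it returns 6 and 4, matching the optimal play we found by hand.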
State compression
Do we really need all $O(n^2)$ states? After passing through the array, the answer to the problem is the number of stacks minus the biggest $b$ among the final states. So, for every set of states with the same $d$, we only care about the one with the biggest $b$, as the other options are suboptimal for our problem. Similarly, we could say that for the same value of $b$, we only want the state with the biggest $d$, as it increases Bob's chances to eat future stacks.
This way we only keep $O(n)$ states, and the complexity of the program becomes $O(n^2)$. Here is my full submission with this approach:
```cpp
// Reconstructed sketch of my submission: the DP keeps, for each number of
// stacks b that Bob has eaten, the biggest spare-move count d, in a map.
#include <bits/stdc++.h>
using namespace std;
using lli = long long int;
using vi = vector<int>;
using ii = pair<int, int>;

int main() {
    int t; cin >> t;
    while (t--) {
        int n; cin >> n;
        vi a(n);
        for (auto &x : a) cin >> x;
        sort(a.begin(), a.end());
        vi f;  // frequencies of the distinct tastiness values, in order
        for (int i = 0; i < n; i++)
            if (i > 0 && a[i] == a[i - 1]) f.back()++;
            else f.push_back(1);
        int m = f.size();
        map<int, int> dp = {{0, 1}};  // Alice eats the first stack: (d, b) = (1, 0)
        for (int i = 1; i < m; i++) {
            map<int, int> ndp;
            for (auto [b, d] : dp) {
                ndp[b] = max(ndp[b], d + 1);  // Alice eats stack i
                if (d >= f[i])                // Bob eats stack i if he can
                    ndp[b + 1] = max(ndp[b + 1], d - f[i]);
            }
            dp = ndp;
        }
        int bob = 0;
        for (auto [b, d] : dp) bob = max(bob, b);
        cout << m - bob << '\n';  // Alice eats every stack Bob didn't
    }
    return 0;
}
```
I got "Time Limit Exceeded" 😐. What happened? I'm using a `map` to save the dynamic programming states, and `map`'s operations, like finding and inserting a key, have logarithmic time complexity. So the actual complexity of my program is $O(n^2 \log n)$, and with $n$ up to 5000, $\log_2 n \approx 12$, which is not a small multiplier.
I changed the `map` to an `unordered_map`, which performs the find and insert operations in amortized constant time, i.e., $O(1)$, and I got my first "Accepted" on that problem.
Definitely fast enough
Although the previous solution passed, it took 1640 ms to run, while other accepted solutions had times around 100 ms. The issue is still the `unordered_map`. It performs insert and find operations in amortized constant time, amortized being the important word here: `unordered_map` hashes each key to place the key-value pairs, and to locate them in constant time after the hashing is performed, but the hashing itself is an expensive operation.
For our problem, the keys are numbers from 1 up to 5000, so we can implement a map using a simple array where each value is stored at the index corresponding to its key. This is my final submission, which ran in 93 ms!
```cpp
// Reconstructed sketch of the final submission: same DP as before, but the
// map from b to the biggest d is replaced by a plain array indexed by b.
#include <bits/stdc++.h>
using namespace std;
using lli = long long int;
using vi = vector<int>;
using ii = pair<int, int>;

int main() {
    int t; cin >> t;
    while (t--) {
        int n; cin >> n;
        vi a(n);
        for (auto &x : a) cin >> x;
        sort(a.begin(), a.end());
        vi f;  // frequencies of the distinct tastiness values, in order
        for (int i = 0; i < n; i++)
            if (i > 0 && a[i] == a[i - 1]) f.back()++;
            else f.push_back(1);
        int m = f.size();
        // dp[b] = biggest spare-move count d with Bob having eaten b stacks,
        // or -1 if that b is unreachable
        vi dp(m + 1, -1), ndp(m + 1, -1);
        dp[0] = 1;  // Alice eats the first stack
        for (int i = 1; i < m; i++) {
            fill(ndp.begin(), ndp.end(), -1);
            for (int b = 0; b < i; b++) {  // Bob ate at most i-1 of the first i stacks
                if (dp[b] < 0) continue;
                ndp[b] = max(ndp[b], dp[b] + 1);  // Alice eats stack i
                if (dp[b] >= f[i])                // Bob eats stack i if he can
                    ndp[b + 1] = max(ndp[b + 1], dp[b] - f[i]);
            }
            swap(dp, ndp);
        }
        int bob = 0;
        for (int b = 0; b <= m; b++)
            if (dp[b] >= 0) bob = b;
        cout << m - bob << '\n';
    }
    return 0;
}
```
Conclusion
Sometimes in the past, I have felt frustrated when I managed to solve a problem after the contest that I wasn't able to solve during it. Frustrated that I couldn't come up with the solution in time. This time I didn't feel that way, and I'm glad I'm improving in that sense.
When I started doing competitive programming contests again last year I was barely able to solve problem B, it took me a long time to figure them out. But as you practice you get better, and upsolving is a huge part of this process. Taking the time to continue the thinking process you started during the contest, even if you need to take a break in between, helps improve your problem-solving skills.
In hindsight, the solution seems pretty obvious. Could I have come up with it during the contest? For some problems in past competitions, I think I could have, but for this one, I'm almost certain the answer is no. I presented here some milestones of my thought process, but going through it entirely was necessary to get to the solution. This doesn't mean I won't be able to solve problems of this difficulty or harder during future contests; precisely because I took all of this time to solve this one, I might have a clearer vision in future events.
1. For simplicity I will use $n$ as the number of unique tastiness values, as it gives the worst case.
2. Elemental operations would be anything like comparing two numbers, addition, subtraction, multiplication, division, etc. Some of those operations take more time than others, but let's consider "elemental operation" to be an average value.
3. Data from Our World in Data. A FLOPS is a FLoating point OPeration per Second, something like the "elemental operations" I mentioned before. Giga means $10^9$.
4. Source: Wikipedia.