Von Neumann's Minimax theorem (quoted from Wikipedia):
For every two-person, zero-sum game with finite strategies, there exists a value V and a mixed strategy for each player, such that (a) Given player 2's strategy, the best payoff possible for player 1 is V, and (b) Given player 1's strategy, the best payoff possible for player 2 is −V.
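In symbols (my notation, not part of the Wikipedia statement): if player 1 has payoff matrix $A \in \mathbb{R}^{m \times n}$ and $\Delta_m, \Delta_n$ denote the sets of mixed strategies, the theorem says

$$\max_{x \in \Delta_m} \; \min_{y \in \Delta_n} \; x^{\mathsf T} A y \;=\; \min_{y \in \Delta_n} \; \max_{x \in \Delta_m} \; x^{\mathsf T} A y \;=\; V.$$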
Nash's theorem:
Every finite game has a mixed strategy equilibrium.
Now, to me, it seems that the Minimax theorem is simply a special case of Nash's theorem for two-player zero-sum games: the moment an equilibrium $(x^*, y^*)$ is established, each equilibrium strategy is a best response to the other, so $x^*$ guarantees player 1 at least the equilibrium payoff and $y^*$ guarantees player 1 at most that payoff, and the statements about the game's value follow immediately.
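To make the claim concrete, here is a numerical sanity check (not a proof) on a specific zero-sum game. Assuming Matching Pennies as the example, it approximates player 1's maxmin and player 2's minmax over mixed strategies by grid search; both come out at the game's value, 0.

```python
# Matching Pennies, payoffs from player 1's perspective:
# player 1 wins on a match, loses on a mismatch.
A = [[1, -1],
     [-1, 1]]

N = 200  # grid resolution for mixing probabilities (0.5 is on the grid)

def expected_payoff(p, q):
    """Expected payoff to player 1 when player 1 plays row 1 with
    probability p and player 2 plays column 1 with probability q."""
    probs1 = [p, 1 - p]
    probs2 = [q, 1 - q]
    return sum(probs1[i] * probs2[j] * A[i][j]
               for i in range(2) for j in range(2))

grid = [k / N for k in range(N + 1)]

# maxmin: player 1 picks p to maximize the worst case over q.
maxmin = max(min(expected_payoff(p, q) for q in grid) for p in grid)

# minmax: player 2 picks q to minimize the best case over p.
minmax = min(max(expected_payoff(p, q) for p in grid) for q in grid)

print(maxmin, minmax)  # both are ~0, attained near p = q = 1/2
```

The equilibrium here is $p = q = 1/2$, and the coincidence of maxmin and minmax is exactly what the Minimax theorem asserts in general.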
But in my Game Theory course, we studied these as two separate theorems, with entirely different proofs. Some exams even asked us to prove both theorems as two separate questions - making it seem that the claim "Minimax follows immediately from Nash's theorem" would be suspect.
Am I misunderstanding some fundamental difference between these two theorems? Or did we simply learn two different proofs of the same thing?