Imagine you have 5 different territories on a game board, and suppose two players each have 100 soldiers. They both independently distribute their 100 soldiers across the 5 territories. The two players then reveal where they placed their soldiers; whoever has the most soldiers in a territory takes control of that territory. Whichever player controls the most territories wins the game. If neither player has more territories than the other, then the result is a stalemate.
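To make the rules concrete, here is roughly how I picture scoring a single battle in Java (assuming each placement is an int[5] of soldier counts; I am also assuming a tied territory goes to neither player, since the rules don't say):

```java
// Scores one battle. Returns 1 if player A wins, -1 if player B wins,
// and 0 for a stalemate. A tied territory counts for neither player
// (my assumption; the rules above don't cover it).
static int battle(int[] a, int[] b) {
    int aTerritories = 0, bTerritories = 0;
    for (int i = 0; i < a.length; i++) {
        if (a[i] > b[i]) aTerritories++;        // A holds this territory
        else if (b[i] > a[i]) bTerritories++;   // B holds this territory
    }
    return Integer.compare(aTerritories, bTerritories);
}
```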
Essentially I want to determine the optimal strategy or strategies. I am at least a semi-decent Java programmer, so I thought I might be able to write a program to test every possibility exhaustively, to see which one results in the highest probability of winning.
However, how would I systematically test every possibility? I am having a hard time conceptualizing how I might do that. Would the number of possibilities be too astronomical to test on my home computer?
Here is a rough algorithm I thought up for solving the problem (a code sketch of this loop follows the list):
1) Record all possible strategies and number them 1 to x (e.g. one of many possibilities: {40,40,10,10,0})
2) Start with strategy 1 and randomize its numbers (i.e. shuffle which territory gets which count)
3) Make another randomization of strategy 1
4) Commence "battle" between the two
5) Record who wins
6) Repeat steps 2 to 5, say, 100 times
7) Move on to strategy 1 vs 2, then 1 vs 3, 1 vs 4, etc., until all are exhausted, using the process stated above
8) Move on to strategy 2 vs 2, 2 vs 3, 2 vs 4, etc., until we reach x vs x.
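Here is a rough sketch in Java of what I mean by that loop, assuming a battle() scorer like the one above and assuming that "randomizing" a strategy means shuffling its counts across the territories:

```java
// Plays every pairing i vs j (steps 7-8), repeating each pairing `repeats`
// times (step 6), and tallies how often strategy i beats strategy j.
static final java.util.Random RNG = new java.util.Random();

static int[][] tournament(java.util.List<int[]> strategies, int repeats) {
    int x = strategies.size();
    int[][] wins = new int[x][x];
    for (int i = 0; i < x; i++) {
        for (int j = i; j < x; j++) {               // 1 vs 1, 1 vs 2, ..., x vs x
            for (int r = 0; r < repeats; r++) {     // say, 100 battles per pairing
                int result = battle(shuffled(strategies.get(i)),
                                    shuffled(strategies.get(j)));
                if (result > 0) wins[i][j]++;
                else if (result < 0) wins[j][i]++;
            }
        }
    }
    return wins;
}

// "Randomize the numbers": randomly permute a strategy's counts
// across the territories (Fisher-Yates shuffle).
static int[] shuffled(int[] strategy) {
    int[] out = strategy.clone();
    for (int i = out.length - 1; i > 0; i--) {
        int j = RNG.nextInt(i + 1);
        int tmp = out[i]; out[i] = out[j]; out[j] = tmp;
    }
    return out;
}
```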
So basically, what I need help with is: 1) How many possibilities are there? Is that too astronomical to test? 2) What is a general mathematical algorithm for generating all of the possibilities?
Thanks!
Edit (for clarification): Suppose strategy 1 = <100,0,0,0,0>, strategy 2 = <99,1,0,0,0>, etc. Oh, I think I may have just answered my own question? If this continued until a in (a,b,c,d,e) equals zero and b = 100, that would be about 100 different ways (or 101, fence-post problem, not sure), and then we continue this, making 400 or 500 ways? Anyone else follow what I am trying to say here? :)
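If it helps, here is the kind of recursive enumeration I am picturing in Java: it generates every <a,b,c,d,e> with a+b+c+d+e = 100 in roughly the order described above. (If I have the stars-and-bars formula right, that should come out to C(104,4) = 4,598,126 strategies, so quite a lot more than 500, though storing them all as int[5] should still fit in memory on a home computer.)

```java
// Generates every way to split `soldiers` across `territories`, starting
// from <100,0,0,0,0>, <99,1,0,0,0>, ... : pick a count for the current
// territory and recurse, with the last territory taking whatever is left.
static java.util.List<int[]> allStrategies(int soldiers, int territories) {
    java.util.List<int[]> result = new java.util.ArrayList<>();
    generate(new int[territories], 0, soldiers, result);
    return result;   // 100 soldiers, 5 territories -> 4,598,126 strategies
}

static void generate(int[] current, int territory, int remaining,
                     java.util.List<int[]> result) {
    if (territory == current.length - 1) {
        current[territory] = remaining;   // last territory takes the remainder
        result.add(current.clone());
        return;
    }
    for (int count = remaining; count >= 0; count--) {
        current[territory] = count;
        generate(current, territory + 1, remaining - count, result);
    }
}
```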
For anyone who wanted to know: if we assume each battle takes about 10 operations, we need to run about 2 quadrillion battles, and our computer can do 3.3 billion operations/sec, then it would take about 10 weeks to complete.
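Here is the quick check behind that estimate, using those same assumed numbers:

```java
// Back-of-the-envelope time estimate from the assumed numbers above.
long battles = 2_000_000_000_000_000L;   // ~2 quadrillion battles
long opsPerBattle = 10;                  // assumed cost of one battle
double opsPerSecond = 3.3e9;             // ~3.3 billion operations per second
double seconds = (double) battles * opsPerBattle / opsPerSecond;
System.out.printf("about %.1f weeks%n", seconds / (60 * 60 * 24 * 7));  // ~10 weeks
```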