Definition:
Let $(N, A, u)$ be a normal-form game, and for any set $X$ let $\Pi(X)$ be the set of all probability distributions over $X$. Then the set of mixed strategies for player $i$ is $S_i=\Pi(A_i)$.
- where $N$ is a finite set of $n$ players, indexed by $i$
- $A=A_1\times\dots\times A_n$, where $A_i$ is a finite set of actions available to player $i$
The part that is unclear to me is this:
"...and for any set $X$ let $\Pi(X)$ be the set of all probability distributions over $X$"
Let's say that $X = \{0, 1\}$. What does "the set of all probability distributions over $X$" mean? Is a single distribution a pair $(a, b)$ of probabilities assigned to $0$ and $1$, where $a$ and $b$ are nonnegative and add up to 1?
But then what does it mean for $S_i$ to be an infinite set containing all such pairs of nonnegative numbers that sum to one? In my example with $X = \{0, 1\}$, it seems that $S_i = \{(0.5,\ 0.5),\ (0.25,\ 0.75),\ (0.6,\ 0.4),\ \dots\}$.
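To make the "all" concrete, here is a minimal sketch in Python (my own illustration, not part of the textbook definition): for $X = \{0, 1\}$, a distribution is just a pair $(a, b)$, and $\Pi(X)$ is exactly the set of pairs that pass the check below. Since $a$ can be any real number in $[0, 1]$ (with $b = 1 - a$ forced), that set is uncountably infinite.

```python
import math

def is_distribution(p):
    """True if the tuple p is a probability distribution:
    every entry nonnegative, and the entries sum to 1."""
    return all(x >= 0 for x in p) and math.isclose(sum(p), 1.0)

# Members of Pi({0, 1}) -- pairs (a, b) with a, b >= 0 and a + b = 1:
assert is_distribution((0.5, 0.5))
assert is_distribution((0.25, 0.75))
assert is_distribution((0.6, 0.4))

# Non-members:
assert not is_distribution((0.6, 0.5))   # sums to 1.1
assert not is_distribution((-0.2, 1.2))  # negative probability
```

The same check works for any finite $X$ with $|X| = k$: a distribution is then a $k$-tuple of nonnegative numbers summing to 1, and $\Pi(X)$ is the whole (infinite) set of such tuples.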
As you can see, the "all" part confuses me.