Information theory is not at all my field of expertise, so maybe my question will be a bit naive.
As the title says, I would like to quantify the information gained from a new piece of information.
For instance, if I have a binary random event:
$P(X=0)=0.9$
$P(X=1)=0.1$
If I then learn that $X=0$ (situation 1), I do not gain much information. But if I learn that $X=1$ (situation 2), the situation changes a lot (I gain a lot of information?).
However, if I compute the difference in entropy between the original situation and the final one, I get the same difference in both cases (once $X$ is known the entropy is zero, so the difference is always $H(X)$).
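Here is a quick Python sketch of that computation (the `entropy` helper is just my own base-2 illustration, not a standard library function):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution given as a list of probabilities."""
    return -sum(q * math.log2(q) for q in p if q > 0)

prior = [0.9, 0.1]      # P(X=0), P(X=1)
after_x0 = [1.0, 0.0]   # distribution once I learn X = 0 (situation 1)
after_x1 = [0.0, 1.0]   # distribution once I learn X = 1 (situation 2)

# Both differences equal H(X) ≈ 0.469 bits, since the final entropy is 0 in both cases.
print(entropy(prior) - entropy(after_x0))
print(entropy(prior) - entropy(after_x1))
```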
If I say that the arrival of this new information is itself a random event $S$ (with probability $0.5$), such that:
Situation 1:
$P(S=0, X=0) = 0.9 \times 0.5$
$P(S=0, X=1) = 0.1 \times 0.5$
$P(S=1, X=0) = 0.5$
$P(S=1, X=1) = 0$
Situation 2:
$P(S=0, X=0) = 0.9 \times 0.5$
$P(S=0, X=1) = 0.1 \times 0.5$
$P(S=1, X=0) = 0$
$P(S=1, X=1) = 0.5$
The mutual information between $S$ and $X$ is larger in situation 2, which confirms the intuition that the information gain is bigger there.
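For completeness, here is a quick Python sketch of how I computed those mutual informations (again just my own helper, nothing standard):

```python
import math

def mutual_information(joint):
    """Mutual information I(S;X) in bits from a joint distribution given as {(s, x): p}."""
    p_s, p_x = {}, {}  # marginals of S and X
    for (s, x), p in joint.items():
        p_s[s] = p_s.get(s, 0.0) + p
        p_x[x] = p_x.get(x, 0.0) + p
    return sum(p * math.log2(p / (p_s[s] * p_x[x]))
               for (s, x), p in joint.items() if p > 0)

situation1 = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.5, (1, 1): 0.0}
situation2 = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.0, (1, 1): 0.5}

print(mutual_information(situation1))  # ≈ 0.05 bits
print(mutual_information(situation2))  # ≈ 0.76 bits -- larger, matching the intuition
```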
Still, it feels a bit odd to call $S$ a random event and to arbitrarily set its probability to $0.5$... (but it is the best I have found so far).
Could you please tell me whether there is a more standard way to deal with this situation and quantify the gain of information?