
I have a Bayesian network like this:

B -> A <- F

And these are the conditional probabilities for A:

P(A=true | B=true, F=false) = 0.01

P(A=true | B=true, F=true) = 0.92

P(A=true | B=false, F=false) = 1.00

P(A=true | B=false, F=true) = 1.00

Now I want to sum out the variable B because I need the values P(A=true | F=true) and P(A=true | F=false). What I thought I had to do was sum the lines where the F values are the same, covering both B=true and B=false: lines 1 and 3 on one side, lines 2 and 4 on the other. My result would be:

P(A=true | F=true) = 1.92

P(A=true | F=false) = 1.01

I was quite sure this was right, but now I wonder why the sums are greater than 1. I thought probabilities have to lie between 0 and 1? Am I doing something wrong, or where is my mistake?

  • sorry, you're right :) 2012-05-29

1 Answer


Don't you need the prior probabilities of $B$ and $F$?

$P(A=t|F=t) = \frac{P(A=t,F=t)}{P(F=t)} = \frac{P(A=t,B=t,F=t)+P(A=t,B=f,F=t)}{P(F=t)}$

$ = \frac{P(A=t|B=t,F=t) P(B=t) P(F=t) +P(A=t|B=f,F=t)P(B=f)P(F=t)} {P(F=t)}$

$ = 0.92\,P(B=t) + 1 \cdot P(B=f)$
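To make the weighted sum concrete, here is a minimal sketch in Python. The prior $P(B=t)$ is not given in the question, so the value used below is a hypothetical assumption; the point is only that the conditionals get weighted by $P(B)$ before being summed, which keeps the result in $[0, 1]$.

```python
# CPT entries from the question, for F=true
p_a_given_bt_ft = 0.92   # P(A=t | B=t, F=t)
p_a_given_bf_ft = 1.00   # P(A=t | B=f, F=t)

def p_a_given_f_true(p_b_true):
    """P(A=t | F=t) = sum over b of P(A=t | B=b, F=t) * P(B=b).

    Valid here because B and F are marginally independent in the
    network B -> A <- F, so P(B | F) = P(B).
    """
    return p_a_given_bt_ft * p_b_true + p_a_given_bf_ft * (1 - p_b_true)

# Hypothetical prior P(B=t) = 0.3 (not stated in the question)
print(p_a_given_f_true(0.3))  # 0.92*0.3 + 1.0*0.7 = 0.976
```

Whatever prior you plug in, the result is a convex combination of 0.92 and 1.00, so it can never exceed 1 the way the unweighted sum did.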

  • @EmreA is right. But we don't even need to call them prior probabilities. If you have P(A|B and F) and P(A|C and F), adding them together has no meaning, but P(A and B and F) = P(A|B and F)P(B and F) = P(A|B and F)P(B|F)P(F). Similarly, P(A and C and F) = P(A|C and F)P(C|F)P(F). You can then sum P(A and B and F) with P(A and C and F) to get P(A and (B or C) and F), and if B and C are mutually exclusive that will equal P(A and B and F) + P(A and C and F). Note that each term you summed needed to be multiplied by two probabilities before it could be summed! 2012-05-29