I am probably misusing Bayes' theorem, but I can't figure out how.
$P(B|A) = 0.4$
$P(B) = 0.1$
$P(A) = 0.3$
$P(A|B) = \dfrac{0.4 \cdot 0.3}{0.1} = 1.2$
Since the probability of anything can't exceed $1$, I must be wrong somewhere. Please help.
It is not possible to have the hypothesized probabilities; your application of Bayes' theorem is fine, but the inputs are inconsistent. Since $A\cap B\subseteq B$, we have $P(A\cap B)\leq P(B)$, and therefore $$P(B|A)=\frac{P(A\cap B)}{P(A)}\leq\frac{P(B)}{P(A)}=\frac{0.1}{0.3}=\frac{1}{3}\lt 0.4.$$ Equivalently, your values would imply $P(A\cap B)=P(B|A)\,P(A)=0.12>0.1=P(B)$, which is impossible.
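As a quick numeric sanity check (a sketch, not part of the argument above), you can verify in a few lines of Python that the three stated values violate the constraint $P(A\cap B)\leq P(B)$:

```python
p_b_given_a = 0.4
p_b = 0.1
p_a = 0.3

# P(A ∩ B) implied by the definition of conditional probability:
# P(B|A) = P(A ∩ B) / P(A), so P(A ∩ B) = P(B|A) * P(A)
p_a_and_b = p_b_given_a * p_a  # 0.12

# A joint probability can never exceed either marginal,
# so p_a_and_b <= p_b must hold. Here it fails: 0.12 > 0.1.
print(p_a_and_b <= p_b)  # False: the three numbers are inconsistent
```

Any time a Bayes'-theorem computation returns a value above $1$, the implied joint probability exceeds one of the marginals, so the inputs themselves are contradictory.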