Given P(A|B) and P(A|C), how can one compute, or at least strategically approach, P(A|(B & C))?
Is there a way to approach this when it is not known whether B and C are dependent? If not, how can one get P(A|(B & C)) assuming B and C are independent, or assuming they are dependent?
Suppose two fair $0/1$ coins $X_1,X_2$ are thrown. Let $B$ be the event $X_1 = 0$, $C$ be the event $X_2 = 0$, $A^1$ be the event $X_1 + X_2 = 0$, and $A^0$ be the event $X_1 + X_2 = 1$. Then $P(A^0|B) = P(A^1|B) = P(A^0|C) = P(A^1|C) = 1/2$, whereas $P(A^0|B \land C) = 0$ and $P(A^1|B\land C) = 1$. So $P(A|B)$ and $P(A|C)$ alone do not determine $P(A|B \land C)$.
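A minimal sketch that verifies this counterexample by brute-force enumeration of the four equally likely outcomes (the event names mirror the ones above):

```python
from itertools import product
from fractions import Fraction

outcomes = list(product([0, 1], repeat=2))   # (x1, x2), each with probability 1/4

def cond_prob(event, given):
    """P(event | given) under the uniform distribution on `outcomes`."""
    given_set = [o for o in outcomes if given(o)]
    hits = [o for o in given_set if event(o)]
    return Fraction(len(hits), len(given_set))

B = lambda o: o[0] == 0            # X1 = 0
C = lambda o: o[1] == 0            # X2 = 0
A1 = lambda o: o[0] + o[1] == 0    # the event called A^1 above
A0 = lambda o: o[0] + o[1] == 1    # the event called A^0 above

print(cond_prob(A0, B), cond_prob(A1, B))             # 1/2 1/2
print(cond_prob(A0, C), cond_prob(A1, C))             # 1/2 1/2
print(cond_prob(A0, lambda o: B(o) and C(o)))         # 0
print(cond_prob(A1, lambda o: B(o) and C(o)))         # 1
```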
I think instead of the (in)dependence of B and C you should consider their conditional (in)dependence given A.
Definition: "B and C are conditionally independent given A" <=> P(B&C|A) = P(B|A)*P(C|A).
This notion is much more useful in practical applications of Bayes' rule than the plain (in)dependence of B and C.
So, if you know from common sense that B and C are conditionally independent given A, then you can compute P(A|(B & C)) = P(B&C|A)*P(A)/P(B&C) = P(B|A)*P(C|A)*P(A)/P(B&C).
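As a concrete sketch of how this plays out (the numbers are made up purely for illustration): the denominator P(B&C) can be expanded with the law of total probability, which additionally needs B and C to be conditionally independent given $\overline{A}$, exactly the assumption used in the P.S. below.

```python
# Hypothetical numbers, chosen only to illustrate the formula.
p_A = 0.3                                  # prior P(A)
p_B_given_A, p_B_given_notA = 0.8, 0.4
p_C_given_A, p_C_given_notA = 0.7, 0.2

# Assuming B and C are conditionally independent given A and given not-A:
#   P(B & C | A)     = P(B|A) * P(C|A)
#   P(B & C | not A) = P(B|not A) * P(C|not A)
num = p_B_given_A * p_C_given_A * p_A
p_BC = num + p_B_given_notA * p_C_given_notA * (1 - p_A)   # law of total probability
p_A_given_BC = num / p_BC
print(p_A_given_BC)                        # 0.75
```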
P.S.
You can further simplify this for practical purposes if you work with the odds of $A$ against $\overline{A}$ (the alternative hypothesis) instead of the probability itself. This lets you get rid of P(B&C):
If B and C are independent given $A$ and also independent given $\overline{A}$, then the odds of $A$ against $\overline{A}$ given B&C are
$$\frac{P(A \mid B \land C)}{P(\overline{A} \mid B \land C)} = \frac{P(B \mid A)}{P(B \mid \overline{A})} \cdot \frac{P(C \mid A)}{P(C \mid \overline{A})} \cdot \frac{P(A)}{P(\overline{A})},$$
i.e. posterior odds of $A$ = (likelihood ratio of B) * (likelihood ratio of C) * (prior odds of A).
Further explanation of Bayes' rule in odds form can be found at arbital.com/p/bayes_rule_odds/
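Here is the same made-up example as in the sketch above, redone in odds form; note that P(B&C) never has to be computed.

```python
# Odds-form version of the same (hypothetical) numbers as above.
p_A = 0.3
p_B_given_A, p_B_given_notA = 0.8, 0.4
p_C_given_A, p_C_given_notA = 0.7, 0.2

prior_odds = p_A / (1 - p_A)                 # P(A) / P(not A)
lr_B = p_B_given_A / p_B_given_notA          # likelihood ratio contributed by B
lr_C = p_C_given_A / p_C_given_notA          # likelihood ratio contributed by C

posterior_odds = lr_B * lr_C * prior_odds
p_A_given_BC = posterior_odds / (1 + posterior_odds)   # odds back to probability
print(p_A_given_BC)                          # 0.75, matching the direct computation
```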
If C is independent of both A and B, then P(C|A) = P(C) = P(C|B). If they are dependent, then just follow Bayes' theorem for the three-event scenario.
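For the dependent case, a minimal sketch (with a purely hypothetical joint table) of what Bayes' theorem for the three-event scenario amounts to, namely reading P(A|B&C) = P(A&B&C)/P(B&C) off the joint distribution:

```python
# General case: compute P(A | B & C) straight from a hypothetical joint
# distribution over (A, B, C), with no independence assumptions.
# Keys are (a, b, c) indicator triples; the values sum to 1.
joint = {
    (1, 1, 1): 0.10, (1, 1, 0): 0.15, (1, 0, 1): 0.05, (1, 0, 0): 0.10,
    (0, 1, 1): 0.05, (0, 1, 0): 0.20, (0, 0, 1): 0.15, (0, 0, 0): 0.20,
}

p_BC = sum(p for (a, b, c), p in joint.items() if b and c)           # P(B & C)
p_ABC = sum(p for (a, b, c), p in joint.items() if a and b and c)    # P(A & B & C)
print(p_ABC / p_BC)   # P(A | B & C) = P(A & B & C) / P(B & C) ≈ 0.667
```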