
In Bayesian estimation, when the linear-Gaussian model and plant-noise assumptions hold, the optimal estimator is the Kalman filter. I am wondering: is there any literature that proves the following Gaussian identity?

$$ N(z; Hx, R)\,N(x; y, P) = N(z; Hy, C)\,N(x; e, E) $$ where $$ C = R + HPH^T, $$ $$ E^{-1} = P^{-1} + H^TR^{-1}H, \qquad E^{-1}e = P^{-1}y + H^TR^{-1}z. $$
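The identity is the factorization of the joint density $p(z,x) = p(z\mid x)\,p(x) = p(z)\,p(x\mid z)$ for a linear-Gaussian model, which is exactly the Kalman measurement update in information form. As a sanity check, here is a small NumPy sketch (not from the question; the dimensions, random seed, and helper `gauss` are illustrative assumptions) that evaluates both sides at random points:

```python
# Numerical check of the identity
#   N(z; Hx, R) N(x; y, P) = N(z; Hy, C) N(x; e, E)
# with C = R + H P H^T, E^{-1} = P^{-1} + H^T R^{-1} H,
#      E^{-1} e = P^{-1} y + H^T R^{-1} z.
import numpy as np

def gauss(v, mean, cov):
    """Multivariate normal density N(v; mean, cov)."""
    k = len(v)
    d = v - mean
    norm = np.sqrt((2 * np.pi) ** k * np.linalg.det(cov))
    return np.exp(-0.5 * d @ np.linalg.solve(cov, d)) / norm

rng = np.random.default_rng(0)
n, m = 3, 2  # dimensions of x and z (arbitrary choices)
H = rng.standard_normal((m, n))          # not necessarily square
A = rng.standard_normal((n, n)); P = A @ A.T + n * np.eye(n)  # SPD
B = rng.standard_normal((m, m)); R = B @ B.T + m * np.eye(m)  # SPD
x = rng.standard_normal(n)
y = rng.standard_normal(n)
z = rng.standard_normal(m)

# Right-hand-side parameters from the stated formulas
C = R + H @ P @ H.T
Einv = np.linalg.inv(P) + H.T @ np.linalg.inv(R) @ H
E = np.linalg.inv(Einv)
e = E @ (np.linalg.inv(P) @ y + H.T @ np.linalg.inv(R) @ z)

lhs = gauss(z, H @ x, R) * gauss(x, y, P)
rhs = gauss(z, H @ y, C) * gauss(x, e, E)
print(np.isclose(lhs, rhs))  # True: both sides are the joint density p(z, x)
```

Note that `e` and `E` are exactly the posterior mean and covariance of $x$ given the observation $z$, so the right-hand side is the marginal of $z$ times the conditional of $x$ given $z$.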

  • Could you explain your notation? To me it looks as if these might amount to standard identities on multivariate normal distributions. I'm guessing you have in mind that $R$, $P$, $C$, and $E$ are square symmetric positive-definite matrices and are variances (or "covariance matrices"), $x$ and $y$ are column vectors, and $H$ is a not necessarily square matrix, so that the vectors $Hx$, $y$, $Hy$, and $e$ are expected values. But I don't know what the semicolon denotes. I had an initial guess, but it doesn't make sense. – 2011-11-09
  • $N$ denotes the standard multivariate normal density; $R$ and $P$ are SPD matrices, and $H$ is not necessarily square. The notation $N(z; Hx, R)$ means the variable $z$ has expected value $Hx$ and covariance $R$, i.e. $$N(z; Hx, R) = \frac{1}{\sqrt{(2\pi)^m \det R}} \exp\left(-\tfrac{1}{2}(z-Hx)^T R^{-1} (z-Hx)\right)$$ – 2011-11-09

1 Answer