
In Bayesian estimation, when the model is linear and the process and measurement noise are Gaussian, the optimal estimator is the Kalman filter. I am wondering: is there any literature that proves the following Gaussian identity?

$N(z; Hx, R)\,N(x; y, P) = N(z; Hy, C)\,N(x; e, E)$, where $C = R + HPH^T$, $E^{-1} = P^{-1} + H^T R^{-1} H$, and $E^{-1} e = P^{-1} y + H^T R^{-1} z$.

  • $N$ denotes the multivariate normal density; $R$ and $P$ are SPD matrices, and $H$ is a matrix, not a scalar. The notation $N(z; Hx, R)$ means the variable $z$ has mean $Hx$ and covariance $R$, i.e. $N(z; Hx, R) = \frac{1}{\sqrt{(2\pi)^d \det R}} \exp\!\left(-\tfrac{1}{2}(z - Hx)^T R^{-1} (z - Hx)\right)$. — 2011-11-09

1 Answer


This identity is proven in Appendix D of Mahler's book "Statistical Multisource-Multitarget Information Fusion". As far as only the exponent is concerned, it is also proven in a technical report, "Tracking in Uncertain Environments", by D. J. Salmond. The proof essentially amounts to completing the square and applying the Woodbury matrix identity. See also Result 4.6 in Johnson & Wichern, "Applied Multivariate Statistical Analysis", or Section 1.4.14 in Bar-Shalom et al., "Estimation with Applications to Tracking and Navigation".
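Besides the references above, the identity is easy to check numerically: it is exactly the factorization $p(z\mid x)\,p(x) = p(z)\,p(x\mid z)$ for a linear-Gaussian model, so both sides must agree for any SPD $R$, $P$ and any $H$, $x$, $y$, $z$. Here is a minimal sketch in NumPy (all matrices and vectors below are arbitrary test data, not from the question):

```python
import numpy as np

def gauss(v, mean, cov):
    """Multivariate normal density N(v; mean, cov)."""
    d = len(v)
    diff = v - mean
    norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(cov))
    return np.exp(-0.5 * diff @ np.linalg.solve(cov, diff)) / norm

rng = np.random.default_rng(0)
n, m = 3, 2                       # state dim n, measurement dim m
H = rng.standard_normal((m, n))   # H need not be square or scalar
A = rng.standard_normal((m, m))
R = A @ A.T + m * np.eye(m)       # SPD measurement covariance
B = rng.standard_normal((n, n))
P = B @ B.T + n * np.eye(n)       # SPD prior covariance
x = rng.standard_normal(n)
y = rng.standard_normal(n)
z = rng.standard_normal(m)

# Quantities from the identity:
# C = R + H P H^T,  E^{-1} = P^{-1} + H^T R^{-1} H,
# E^{-1} e = P^{-1} y + H^T R^{-1} z
C = R + H @ P @ H.T
E = np.linalg.inv(np.linalg.inv(P) + H.T @ np.linalg.solve(R, H))
e = E @ (np.linalg.solve(P, y) + H.T @ np.linalg.solve(R, z))

lhs = gauss(z, H @ x, R) * gauss(x, y, P)
rhs = gauss(z, H @ y, C) * gauss(x, e, E)
print(np.isclose(lhs, rhs))  # True
```

This does not replace the algebraic proof, but it is a quick sanity check that $C$, $E$, and $e$ were transcribed correctly before working through the completing-the-square argument.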