  1. I was wondering how to solve the Kalman filter's recursive equations (see the appendix at the end of this post) for the estimated state $\hat{\textbf{x}}_{n|n}$ at time $n$, over the discrete times $k=1,\dots,n$. By solving the recursive equations, I mean expanding $\hat{\textbf{x}}_{n|n}$, for example, as a linear combination of

    • the true initial state $\textbf{x}_0$,
    • the user-provided initial state $ \hat{\textbf{x}}_{0|0}$,
    • the control inputs up to time $n$, i.e. $ \textbf{u}_k, k=1,\dots,n$,
    • the state noises up to time $n$, i.e. $ \textbf{w}_k, k=1,\dots,n$,
    • the output noises up to time $n$, i.e. $ \textbf{v}_k, k=1,\dots,n$
  2. By substituting each recursion into the next, starting from the initial one, I quickly realized that the expanded form gets more and more complicated, and I failed to spot any pattern common across the discrete times that would let me simplify it. So I wonder: is it always possible to solve a general recursive equation by observing such patterns? When it is possible, what are some ways to find them? (A small symbolic-computation sketch follows this list.)
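One way to look for the pattern in item 2 is to let a computer algebra system carry out the substitutions. Below is a minimal sketch in Python/SymPy, assuming a scalar, time-invariant model (constant $F, B, H, Q, R$) purely to keep the output readable; the variable names (`xhat0`, `P0`, `u1`, `z1`, ...) are mine, not part of the question.

```python
import sympy as sp

# Scalar, time-invariant sketch (an assumption for readability; the model in
# the question is vector-valued and time-varying).
F, B, H, Q, R = sp.symbols('F B H Q R')
xhat = sp.Symbol('xhat0')   # user-provided initial estimate \hat{x}_{0|0}
P = sp.Symbol('P0')         # user-provided initial variance P_{0|0}

n = 2                       # expand the first two iterations
for k in range(1, n + 1):
    u, z = sp.symbols(f'u{k} z{k}')
    # Predict
    x_pred = F * xhat + B * u
    P_pred = F * P * F + Q
    # Update
    S = H * P_pred * H + R
    K = P_pred * H / S
    xhat = sp.simplify(x_pred + K * (z - H * x_pred))
    P = sp.simplify((1 - K * H) * P_pred)

# Group the expansion by the quantities listed in the question:
# the initial estimate, the control inputs, and the outputs.
terms = [sp.Symbol('xhat0'),
         *sp.symbols(f'u1:{n + 1}'),
         *sp.symbols(f'z1:{n + 1}')]
print(sp.collect(sp.expand(xhat), terms))
```

Running this for $n=2,3$ and comparing the coefficients of $\hat{\textbf{x}}_{0|0}$, the $u_k$, and the $z_k$ is one way to notice that the coefficients are built from repeated $(1 - K_j H)F$ factors, which suggests the unrolled form sketched in the appendix below.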

I have searched for a while in books and on the internet, but haven't found a reference that gives such an expanded form for the state estimated by the Kalman filter. I would also appreciate it if someone could provide some pointers.

Thanks and regards!

Appendix:

The Kalman filter tries to estimate the state $\textbf{x}_{k}$ from the outputs $\textbf{z}_{i}, i=1,\dots,k$, in the following state-space model:

$$\textbf{x}_{k} = \textbf{F}_{k} \textbf{x}_{k-1} + \textbf{B}_{k} \textbf{u}_{k} + \textbf{w}_{k}$$

$$\textbf{z}_{k} = \textbf{H}_{k} \textbf{x}_{k} + \textbf{v}_{k}$$

where the state noises and output noises have distributions

$$\textbf{w}_k \sim N(0, \textbf{Q}_k), \qquad \textbf{v}_{k} \sim N(0, \textbf{R}_k).$$

The Kalman filter algorithm gives

  • a recursive equation for the estimated state $\hat{\textbf{x}}_{k|k}$ in terms of the previous one $\hat{\textbf{x}}_{k-1|k-1}$, and
  • a recursive equation for the estimate covariance $\textbf{P}_{k|k}$ in terms of the previous one $\textbf{P}_{k-1|k-1}$, which is needed in the recursive equation for $\hat{\textbf{x}}_{k|k}$.

Initialize $ \hat{\textbf{x}}_{0|0}$ and $\textbf{P}_{0|0}$.

At each iteration $k=1,\dots,n$

Predict

Predicted (a priori) state estimate: $\hat{\textbf{x}}_{k|k-1} = \textbf{F}_{k}\hat{\textbf{x}}_{k-1|k-1} + \textbf{B}_{k} \textbf{u}_{k}$

Predicted (a priori) estimate covariance: $\textbf{P}_{k|k-1} = \textbf{F}_{k} \textbf{P}_{k-1|k-1} \textbf{F}_{k}^{\text{T}} + \textbf{Q}_{k}$

Update

Innovation or measurement residual: $\tilde{\textbf{y}}_k = \textbf{z}_k - \textbf{H}_k\hat{\textbf{x}}_{k|k-1}$

Innovation (or residual) covariance: $\textbf{S}_k = \textbf{H}_k \textbf{P}_{k|k-1} \textbf{H}_k^\text{T} + \textbf{R}_k$

Optimal Kalman gain: $\textbf{K}_k = \textbf{P}_{k|k-1}\textbf{H}_k^\text{T}\textbf{S}_k^{-1}$

Updated (a posteriori) state estimate: $\hat{\textbf{x}}_{k|k} = \hat{\textbf{x}}_{k|k-1} + \textbf{K}_k\tilde{\textbf{y}}_k$

Updated (a posteriori) estimate covariance: $\textbf{P}_{k|k} = (I - \textbf{K}_k \textbf{H}_k) \textbf{P}_{k|k-1}$
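As a sketch toward the expansion asked about in the question (using only the equations listed above): substituting the predict step into the update step collapses each iteration into a single affine recursion,

$$\hat{\textbf{x}}_{k|k} = (\textbf{I} - \textbf{K}_k \textbf{H}_k)\textbf{F}_k \, \hat{\textbf{x}}_{k-1|k-1} + (\textbf{I} - \textbf{K}_k \textbf{H}_k)\textbf{B}_k \textbf{u}_k + \textbf{K}_k \textbf{z}_k.$$

Writing $\textbf{A}_k = (\textbf{I} - \textbf{K}_k \textbf{H}_k)\textbf{F}_k$ and $\textbf{b}_k = (\textbf{I} - \textbf{K}_k \textbf{H}_k)\textbf{B}_k \textbf{u}_k + \textbf{K}_k \textbf{z}_k$ (shorthand introduced here, not standard notation), unrolling the recursion gives

$$\hat{\textbf{x}}_{n|n} = \textbf{A}_n \textbf{A}_{n-1} \cdots \textbf{A}_1 \, \hat{\textbf{x}}_{0|0} + \sum_{k=1}^{n} \textbf{A}_n \cdots \textbf{A}_{k+1} \, \textbf{b}_k,$$

with the convention that the empty product (the $k=n$ term) is the identity. Since the gains $\textbf{K}_k$ are computed from the covariance recursion alone, they do not depend on the data, so $\hat{\textbf{x}}_{n|n}$ is affine in $\hat{\textbf{x}}_{0|0}$, the $\textbf{u}_k$, and the $\textbf{z}_k$.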

  • I was wrong previously. The output $\textbf{z}_k$ is involved in the recursive equation for $\hat{\textbf{x}}_{k|k}$, so $\hat{\textbf{x}}_{k|k}$ also depends on the true initial state $\textbf{x}_0$ through the state-space model. (2012-05-13)
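To make that dependence explicit, the state equation of the model can itself be unrolled (again using only the model in the appendix):

$$\textbf{x}_k = \textbf{F}_k \cdots \textbf{F}_1 \, \textbf{x}_0 + \sum_{j=1}^{k} \textbf{F}_k \cdots \textbf{F}_{j+1} \left( \textbf{B}_j \textbf{u}_j + \textbf{w}_j \right),$$

with the empty product (the $j=k$ term) equal to the identity. Substituting this into $\textbf{z}_k = \textbf{H}_k \textbf{x}_k + \textbf{v}_k$ shows how the true initial state $\textbf{x}_0$, the state noises $\textbf{w}_j$, and the output noise $\textbf{v}_k$ enter each output, and hence the estimate $\hat{\textbf{x}}_{k|k}$.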
