3

Really sorry to be a noob, but I'm a programmer, not a mathematician, and all of my knowledge about statistics comes from the book "Schaum's Outline of Theory and Problems of Probability, Random Variables, and Random Processes".

I'm implementing a UKF for target tracking in C++. Everything went well until I hit an error saying that the state covariance matrix is not positive definite.
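In case it helps, here is roughly how I detect the failure (a minimal sketch using the Eigen library; the matrix and the numbers below are placeholders, not my actual filter state):

```cpp
#include <Eigen/Dense>
#include <iostream>

// Sketch: test whether a covariance matrix is positive definite by
// attempting a Cholesky factorization (the UKF's sigma-point step
// needs a matrix square root of the covariance, which is where the
// error shows up for me).
bool isPositiveDefinite(const Eigen::MatrixXd& P) {
    Eigen::LLT<Eigen::MatrixXd> llt(P);   // Cholesky: P = L * L^T
    return llt.info() == Eigen::Success;  // NumericalIssue if P is not positive definite
}

int main() {
    Eigen::MatrixXd P(2, 2);
    P << 1.0, 3.0,
         3.0, 9.0;  // rank 1: only positive semi-definite
    std::cout << std::boolalpha << isPositiveDefinite(P) << '\n';  // expected: false
}
```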

After a little research, I found this link: Under what circumstance will a covariance matrix be positive semi-definite rather than positive definite? It answers almost everything I need.

Only one thing I don't understand: the answer says "This happens if and only if some linear combination of X is 'fully correlated'". Can anyone explain to me what "fully correlated" means? An example would be great. I have searched Google for its definition, but with no luck at all.

  • 1
    Just so we're clear, this is not standard terminology. Note that the answer there actually says “‘fully correlated’, *to use your phrasing*” (emphasis mine).2012-06-15

2 Answers

0

I think this means that the relationship is exactly linear and the correlation is 1 or -1.
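For a concrete illustration (numbers of my own choosing, not from the question): if $Y$ is exactly a linear function of $X$, say $Y = 2 + 3X$ with $\operatorname{Var}(X) = 1$, then the covariance matrix of $(X, Y)$ is

$$\begin{pmatrix} 1 & 3 \\ 3 & 9 \end{pmatrix},$$

whose determinant is $1 \cdot 9 - 3 \cdot 3 = 0$. It is singular, hence only positive semi-definite, and the correlation between $X$ and $Y$ is $3/(1 \cdot 3) = +1$.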

  • 2
    Correlation of what with what?2012-06-15
  • 0
    I think he meant the correlation coefficient.2012-06-15
  • 0
    I interpreted the statement to mean that X and a variable Y are "fully correlated" because Y is exactly a linear function of X. Remember this is in the context of a covariance matrix being singular. So my interpretation makes perfect sense.2012-06-15
  • 1
    What is Y in the context of a covariance matrix? (Please use @ if you want the people you are talking to to know that you are talking to them.)2012-06-16
  • 0
    @did My response was not directed at anyone in particular. Y is just a symbol for one of two variables whose correlation is + or - 1. Why would anyone think that my explanation doesn't make sense? Inexplicably someone just downvoted it.2012-06-17
  • 2
    Hence the question: what are those *two variables*? Unless I am mistaken, there is only one, which the OP calls *some linear combination of X*. (Note: I am simply repeating my first question here.)2012-06-17
  • 0
    @did Come on! X is one variable; Y, a linear combination of X, namely Y = a + bX, makes two.2012-06-17
  • 2
    Then, this has nothing to do with the covariance matrix being positive semi-definite rather than positive definite, hence the explanatory power of this remark is null.2012-06-17
  • 0
    @did Where do you get that?? X and Y are two elements in the covariance matrix. Y is a linear function of X. Hence the 2x2 covariance matrix of X and Y is singular. Hence the covariance matrix is positive semi-definite rather than positive definite. Why isn't this obvious?2012-06-17
  • 2
    Once again: let X and its covariance matrix C be given; consider assertion A: *some linear combination of X is fully correlated*. I fail to see what A means. You think that A is equivalent to: the correlation between X (or rather, a linear combination of X, since X is vector valued) and a linear combination Y of X is +1. Well... since Y=X always yields correlation +1 (whether C is positive definite or not), A is **always** true? Then **everybody** is fully correlated? Odd, isn't it? (And without further ado, let me silently take my leave...)2012-06-17
  • 0
    @did First of all the statement doesn't say that X is a vector. It could be a vector or a scalar. I don't actually understand what you are trying to say with your sarcastic remarks. I think this is not complicated. A covariance matrix is singular and hence positive semi-definite if one of the variables is an exact linear combination of some of the others. I think that is all that was meant by that rather awkwardly written statement.2012-06-17
  • 0
    I think there is a misunderstanding here. The X and Y that Michael mentioned are components of a random vector, while the X that "did" mentioned is the random vector itself. Anyway, Michael has given a great yet simple explanation and I will accept his answer. Thanks everyone for helping me.2012-06-18
2

(I was the poster of the question to which the OP refers and I am heartily embarrassed by my distinct lack of clarity).

The intention was that the relationship is exactly linear and the correlation is 1 or -1.

That is:

$\rho_{i,j} = \frac{\sigma_{i,j}}{\sigma_i \sigma_j} = \pm 1$

where $\sigma_{i,j}$ is the covariance of elements $i$ and $j$, and ${\sigma_i}^2$ and ${\sigma_j}^2$ are the variances of elements $i$ and $j$ respectively. In all cases $i \ne j$, except for the scalar case, which is covered in the original post.
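To spell out the connection to positive definiteness (a detail I should have made explicit): for the $2 \times 2$ principal block formed by elements $i$ and $j$,

$$\det \begin{pmatrix} \sigma_i^2 & \sigma_{i,j} \\ \sigma_{i,j} & \sigma_j^2 \end{pmatrix} = \sigma_i^2 \sigma_j^2 \left(1 - \rho_{i,j}^2\right),$$

which vanishes exactly when $\rho_{i,j} = \pm 1$; a covariance matrix with a singular principal block cannot be positive definite, only positive semi-definite.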

There is the assumption that the random variable is multivariate normal, which is an assumption of the (Unscented) Kalman-based estimators that are the subject of both the original question and this one.

I hope this doesn't cause past answers to be wrong!

  • 0
    A comment: The assumption of Gaussian is not strictly required to derive an optimality property of the KF (See Kalman's original paper), but that is not the subject of this discussion!2013-10-09