
What are the possible measures of uncertainty for a discrete variable $X = (x_1, x_2, \dots, x_n)$, where the values are drawn from an alphabet, $x_i \in A$, given that the probabilities $p(x_i) = P(X = x_i)$ change over time?

E.g. consider a time interval $T = (t_0, t_n)$ where $t_0 < t_1 < \dots < t_n$:
1) $t_0 \le t < t_1$: $P = \{\, p(x_1), p(x_2), \dots, p(x_n) \,\}$
2) $t_1 \le t < t_2$: $P = \{\, p(x_1)', p(x_2)', \dots, p(x_n)' \,\}$
3) ...

where $p(x_i) \ne p(x_i)'$ and $\Delta t = t_i - t_{i-1} \to 0$.

  • A more careful formulation of the question is required: if $X$ is a random variable, the expression $x \in X$ is mysterious. (2011-07-19)
  • I imagine that the best you can do is to give a measure of uncertainty at a fixed time, e.g. the variance for given $p(x_i)$. I suppose you could integrate these expressions across time, though it's not quite clear what that would mean. (2011-07-19)
  • @Didier I tried to update the question to make it a bit clearer. (2011-07-19)
  • My comment still applies: a random variable is not the set of the values it takes. (2011-07-19)
  • Like @Chris said (but considering also entropy, apart from variance, as a standard measure of uncertainty). (2011-07-19)
  • @Didier, I have confused you with the word *random*, sorry for that; what I meant is a discrete variable that takes values from an alphabet with given probabilities that change over time. (2011-07-19)
  • @Didier, I am not too sure which entropy I can apply; shall I make the entropy a function of time as well? (2011-07-19)
  • @oleksii You seem to have swept the word *random* under the rug (in your context, I do not know what a variable which is not a random variable would be), but the same objection still fully applies: a random variable $X$ **is not** its image set $A$ but a function from a probability space $(\Omega, \mathcal{F}, P)$ to $A$. // The entropy of a discrete distribution $p$ is $H(p) = -\sum_x p(x) \log p(x)$, see http://en.wikipedia.org/wiki/Entropy_(information_theory)#Definition. If the distribution $p(t)$ changes over time $t$, you might consider the integral of $H(p(t))$. (2011-07-20)
  • @Didier I think this answers my question; could you please post it as an answer so I can mark it? (2011-07-20)

1 Answer


Recall that a random variable $X$ is not its image set $A$ but a (measurable) function from a probability space $(\Omega, \mathcal{F}, P)$ to a (measurable) set $A$. In your context, $A = \{x_1, x_2, \dots, x_n\}$ and the distribution of the random variable $X$ is characterized by a collection of nonnegative real numbers $p(x_1), p(x_2), \dots, p(x_n)$ summing to $1$.
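
As a quick illustration (a sketch with names of my own choosing, not part of the answer), such a distribution can be represented as a plain list of probabilities and checked for validity:

```python
def is_distribution(ps, tol=1e-9):
    """Check that p(x_1), ..., p(x_n) are nonnegative and sum to 1
    (up to a floating-point tolerance)."""
    return all(p >= 0 for p in ps) and abs(sum(ps) - 1.0) <= tol

print(is_distribution([0.2, 0.3, 0.5]))  # True
print(is_distribution([0.6, 0.6]))       # False: sums to 1.2
```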

As @Chris said, your first task is to define a measure of uncertainty suitable for your context. In other words, for every distribution $p$ one must define a number $U(p)$.

Two options are to use the variance

$U(p) = M_2(p) - M(p)^2$ with $M_2(p) = \sum_i x_i^2\, p(x_i)$ and $M(p) = \sum_i x_i\, p(x_i)$,

or the entropy

$U(p) = -\sum_i p(x_i) \log p(x_i)$.

Both quantities are nonnegative, and zero only in the degenerate case when $p(x_i) = 1$ for a given $i$ and $p(x_j) = 0$ for every other $j$.
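
As a minimal sketch (the function names and sample distributions are my own, and the variance version assumes the alphabet values $x_i$ are numeric), both measures are straightforward to compute:

```python
import math

def variance_uncertainty(xs, ps):
    """U(p) = M2(p) - M(p)^2 for numeric outcomes xs with probabilities ps."""
    m1 = sum(x * p for x, p in zip(xs, ps))
    m2 = sum(x * x * p for x, p in zip(xs, ps))
    return m2 - m1 * m1

def entropy_uncertainty(ps):
    """U(p) = -sum_i p(x_i) log p(x_i), with the convention 0 log 0 = 0."""
    return -sum(p * math.log(p) for p in ps if p > 0)

xs = [1.0, 2.0, 3.0]
# Degenerate distribution: zero uncertainty under both measures.
print(variance_uncertainty(xs, [1.0, 0.0, 0.0]))  # 0.0
print(entropy_uncertainty([1.0, 0.0, 0.0]))       # 0.0
# Uniform distribution: variance 2/3, entropy log(3).
print(variance_uncertainty(xs, [1/3, 1/3, 1/3]))  # 0.666...
print(entropy_uncertainty([1/3, 1/3, 1/3]))       # 1.0986...
```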

Once the functional $U$ is defined, if the distribution $p$ changes with time $t$, one might consider the integral

$\int_0^T U(p(t))\, dt$

of the functional $U$ applied to $p(t)$ on the time interval of interest.
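
If, as in the question, the distribution is only observed at discrete sample times $t_0 < t_1 < \dots < t_n$, this integral can be approximated numerically, e.g. by the trapezoidal rule. A sketch under that assumption (`integrated_uncertainty` and the drifting two-outcome example are illustrative, not prescribed by the answer):

```python
import math

def entropy(ps):
    """H(p) = -sum_i p(x_i) log p(x_i), with 0 log 0 = 0."""
    return -sum(p * math.log(p) for p in ps if p > 0)

def integrated_uncertainty(ts, dists, U=entropy):
    """Trapezoidal approximation of the integral of U(p(t)) dt
    from samples (ts[k], dists[k]), with ts sorted increasingly."""
    us = [U(p) for p in dists]
    return sum((ts[k + 1] - ts[k]) * (us[k] + us[k + 1]) / 2
               for k in range(len(ts) - 1))

# Hypothetical example: a two-outcome distribution drifting over t in [0, 1].
ts = [k / 100 for k in range(101)]
dists = [[0.5 + 0.4 * t, 0.5 - 0.4 * t] for t in ts]
print(integrated_uncertainty(ts, dists))  # ≈ 0.58
```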