
The random walk on $\mathbb{Z}^2$ is defined as an infinite sequence $\{x_i\}_{i=0}^{\infty}$ with $x_0 = (0,0)$, where each step $x_{i+1} - x_i$ is one of the vectors $\{(-1,0), (0,-1), (1,0), (0,1)\}$.

How can I bound the probability that $x_n$ lies in the box $[-k,k] \times [-k,k]$? For example, how can I calculate $\sigma$ if I want to use Chebyshev's inequality?

1 Answer


I'm assuming those four moves have equal probability. Write the $n$th step as $s_n$, so $x_n = \sum_{j=1}^n s_j$. Then $E[x_n] = 0$ and $E[\|x_n\|^2] = n$. If $x_n = (X_n, Y_n)$, then $X_n$ and $Y_n$ both have mean $0$ and their covariance matrix is $\pmatrix{n/2 & 0\cr 0 & n/2\cr}$. You could try Olkin and Pratt's multivariate version of Chebyshev's inequality (see the Wikipedia article on Chebyshev's inequality), but for your purposes it may be enough to use the more elementary bound: since Chebyshev gives $P(|X_n| > k) \le \operatorname{Var}(X_n)/k^2 = n/(2k^2)$, and likewise for $Y_n$,
$$ P(|X_n| \le k\ \text{and}\ |Y_n| \le k) \ge 1 - P(|X_n| > k) - P(|Y_n| > k) \ge 1 - \frac{n}{2k^2} - \frac{n}{2k^2} = 1 - \frac{n}{k^2}. $$
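If it helps to see the bound numerically, here is a quick Monte Carlo sanity check (a sketch of my own, not from the answer; the function names and the choice of $n = 100$, $k = 20$ are arbitrary). It simulates the walk and compares the empirical probability of ending inside the box with the lower bound $1 - n/k^2$:

```python
import random

def walk_position(n, rng):
    # Simulate n steps of the simple random walk on Z^2 and
    # return the final position (x, y).
    steps = [(-1, 0), (0, -1), (1, 0), (0, 1)]
    x = y = 0
    for _ in range(n):
        dx, dy = rng.choice(steps)
        x += dx
        y += dy
    return x, y

def estimate_in_box(n, k, trials=20000, seed=0):
    # Fraction of walks whose position after n steps lies in
    # the box [-k, k] x [-k, k].
    rng = random.Random(seed)
    hits = sum(1 for _ in range(trials)
               if max(map(abs, walk_position(n, rng))) <= k)
    return hits / trials

n, k = 100, 20
chebyshev_lower_bound = 1 - n / k**2   # = 0.75 here
empirical = estimate_in_box(n, k)
print(f"empirical {empirical:.3f} >= bound {chebyshev_lower_bound:.3f}")
```

As expected, the empirical frequency comfortably exceeds the bound; Chebyshev is quite loose here, since $k = 20$ is almost three standard deviations of $X_n$.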

  • Euclidean norm. It's basically the fact that the variance of a sum of independent random variables is the sum of the variances. (2012-11-16)