I'm looking at a random walk on a square lattice with a bias toward the origin. Any step away from the origin occurs with a probability p, which is less than the unbiased value of 1/4. I'd like to know the average time it takes the walker to reach a distance d from the origin. Does anyone have an idea how to solve this, or references to look at?
Edit: Some elaboration on my question. I consider a process in which it is only possible to move to neighbouring sites. Let's use (x,y) and (x',y') to denote the co-ordinates of two (of the four) sites neighbouring the site (x_0,y_0), and p(x,y) and p(x',y') to denote the probabilities of moving from (x_0,y_0) to (x,y) and (x',y') respectively. Then
p(x,y)/p(x',y') = exp[ (V(x',y') - V(x,y))/T ]
where V(x,y) is a potential of the form
V(x,y) = (x^2+y^2)^(alpha/2).
There is also a probability p_s of staying at (x_0,y_0):
p_s/p(x,y) = exp[ - V(x,y)/T ].
I think this should be enough to specify the process.
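In case a concrete baseline helps, here is a minimal Python sketch of one way to simulate such a process. A caveat: I've assumed each candidate site (the four neighbours, plus the current site for the staying move) gets an unnormalized weight exp[-(V(candidate) - V(current))/T]. This reproduces the stated ratio p(x,y)/p(x',y') between neighbour moves, but the resulting staying weight (exp[0] = 1) is only a guess at the intended p_s.

    import math
    import random

    def V(x, y, alpha):
        # Potential V(x, y) = (x^2 + y^2)^(alpha/2).
        return (x * x + y * y) ** (alpha / 2)

    def step(x, y, alpha, T):
        # Candidate moves: stay put, or step to one of the four neighbours.
        # Assumed rule: unnormalized weight exp[-(V(candidate) - V(current))/T],
        # which gives the stated ratio between neighbour-move probabilities;
        # the staying weight (exp[0] = 1) is a guess.
        candidates = [(x, y), (x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
        v0 = V(x, y, alpha)
        weights = [math.exp(-(V(cx, cy, alpha) - v0) / T) for cx, cy in candidates]
        return random.choices(candidates, weights=weights)[0]

    def hitting_time(d, alpha, T, max_steps=10**7):
        # Number of steps from the origin until distance >= d is first reached.
        x = y = 0
        for t in range(1, max_steps + 1):
            x, y = step(x, y, alpha, T)
            if x * x + y * y >= d * d:
                return t
        return None  # did not reach the boundary within max_steps

Measuring weights relative to the current site keeps exp's argument small, so the weights don't all underflow to zero when V is large far from the origin.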
Ideally I would like an expression for the hitting time to a boundary that lies at a radius d from the origin, in terms of d, alpha and T. It would support the arguments I want to make if the hitting time is O(exp[d]) for all positive alpha and T, so I'm quite happy to study a simpler model if it can be expected to behave in an equivalent way. Can anyone give me some idea of how to approach this, argue that what I want to show is obviously true, or point me toward some resources?
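Failing a closed-form answer, one cheap empirical check of the exponential-in-d conjecture is to estimate the mean hitting time by simulation for a few small d and see whether log(mean time) grows roughly linearly in d. A sketch using hitting_time from the block above (the parameter values are arbitrary illustrative choices):

    def mean_hitting_time(d, alpha, T, trials=200):
        times = [hitting_time(d, alpha, T) for _ in range(trials)]
        times = [t for t in times if t is not None]  # drop truncated runs
        return sum(times) / len(times)

    if __name__ == "__main__":
        alpha, T = 1.0, 1.0  # arbitrary illustrative values
        for d in range(2, 8):
            m = mean_hitting_time(d, alpha, T)
            print(f"d={d}  mean={m:.1f}  log(mean)={math.log(m):.2f}")

If the log(mean) column increases by a roughly constant amount per unit of d, that is at least consistent with exponential growth, though of course it proves nothing.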