I'd like to mention that this is a homework problem, and I am interested in the process of finding the answer, not the answer itself.
The problem is, we have:
A random variable X uniformly distributed on the interval [L-S, L+S]
A random variable Y uniformly distributed on the interval [X-S, X+S]
Where 0 < S < L
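(In case it helps, here's a quick Monte Carlo sketch I put together to sanity-check whatever I end up deriving. The values L = 2.0 and S = 1.0 are arbitrary placeholders I picked just for illustration, subject to 0 < S < L.)

```python
import numpy as np

rng = np.random.default_rng(0)
L, S = 2.0, 1.0          # arbitrary placeholder values with 0 < S < L
n = 1_000_000

# Sample X uniformly on [L-S, L+S], then Y uniformly on [X-S, X+S]
X = rng.uniform(L - S, L + S, size=n)
Y = rng.uniform(X - S, X + S)

print("E[XY] estimate:    ", np.mean(X * Y))
print("E[X]E[Y] estimate: ", np.mean(X) * np.mean(Y))
print("L^2 for comparison:", L**2)
```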
We are looking to find E[XY]. My understanding is that if X and Y were independent, we could have used the property E[XY] = E[X]E[Y], since cov(X, Y) would be zero. (And if I am not mistaken, Y does depend on X in this case.)
Is it mathematically correct to substitute X's minimum and maximum possible values into the interval [X-S, X+S], so as to say that Y is uniformly distributed on the interval [L-2S, L+2S]?
And, from that, that E[XY] = L^2?
If not, how should I approach the problem instead? Is it possible to calculate cov(X, Y) directly?
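(If it's useful, I imagine I could also estimate cov(X, Y) empirically from the same kind of simulation, e.g. with NumPy's np.cov; L and S are again just placeholder values I chose.)

```python
import numpy as np

rng = np.random.default_rng(1)
L, S = 2.0, 1.0          # same arbitrary placeholders as above
n = 1_000_000

X = rng.uniform(L - S, L + S, size=n)
Y = rng.uniform(X - S, X + S)

# np.cov returns the 2x2 sample covariance matrix; the [0, 1] entry is cov(X, Y)
print("cov(X, Y) estimate:", np.cov(X, Y)[0, 1])
```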
Thanks for any help :)