Suppose you have a space $\Omega$ with a probability defined on it: the possible outcomes of tossing a coin, for example. Now, imagine you are gambling... for each possible outcome there is a well-defined amount of money you will get (or lose). This is a random variable!
The key point is that with a function $f: \Omega \to \mathbb{R}$, you can transport the probability defined on $\Omega$ to a probability on $\mathbb{R}$. So, if for instance you get $10$ bucks for heads but lose $5$ for tails, then you can talk about the probability of losing $5$ bucks when you toss the coin...
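To make this concrete (assuming a fair coin, which the example above leaves implicit), the function and the transported probability look like this:

$$f(\text{heads}) = 10, \qquad f(\text{tails}) = -5, \qquad P(f = -5) = P(\{\text{tails}\}) = \tfrac{1}{2}.$$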
Notice that it does not make much sense to ask for the expected "value" of heads or tails. If instead of heads and tails you had a coin with faces coloured green and blue, it would probably be meaningless to say that, on average, the expected colour is cyan... On the other hand, if you are talking about losing and gaining money, it does make sense to talk about the expected amount of money you will gain or lose. And this is the expectation.
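Still under the fair-coin assumption, the expectation of the gamble above is just the probability-weighted average of the amounts:

$$\mathbb{E}[f] = 10 \cdot \tfrac{1}{2} + (-5) \cdot \tfrac{1}{2} = 2.5,$$

so on average you gain $2.5$ bucks per toss.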
It is important to emphasise that a "random variable" is NOT a function that gives randomly different values for the same "input". The amount you get for heads or tails is always the same for a fixed $f$! It is just a way to transport the probability in $\Omega$ to a probability in "amounts" (real numbers), so you can talk about expectation.
Usually one is not interested in the random variable itself... people talk about random variables when they just want to talk about the probability those variables induce on $\mathbb{R}$. This induced probability is called the distribution of the random variable.
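In symbols, the distribution of $f$ is the probability $\mu_f$ on $\mathbb{R}$ given by

$$\mu_f(A) = P(f^{-1}(A)) = P(\{\omega \in \Omega : f(\omega) \in A\}), \qquad A \subset \mathbb{R} \text{ (measurable)},$$

i.e. you measure a set of amounts $A$ by measuring the set of outcomes that $f$ sends into $A$.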
Two random variables $f$ and $g$ are independent, for example, when knowing the outcome of $f$ (in terms of events: whether $f \in A \subset \mathbb{R}$) makes no difference in determining the probability of $g$'s outcome.
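One common way to write this formally: knowing that $f \in A$ does not change the probability that $g \in B$, i.e.

$$P(f \in A \text{ and } g \in B) = P(f \in A)\, P(g \in B) \qquad \text{for all (measurable) } A, B \subset \mathbb{R}.$$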