Here is a very cheap example. Let $X_a$ be the random variable taking the values $a$ and $-a$, each with probability $1/2$, where $a$ is a parameter. Then $E(X_a)=0$ for every $a$, while $\text{Var}(X_a)=a^2$ varies. If you would like the mean to be a fixed quantity $m$ other than $0$, just shift: use $m+X_a$.
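As a quick sanity check, here is a minimal Python sketch (the helper name `x_a_moments` is mine) that estimates the mean and variance of $X_a$ empirically: the mean stays near $0$ while the variance tracks $a^2$.

```python
import random

def x_a_moments(a, trials=100_000, seed=0):
    """Empirical mean and variance of X_a, which is +a or -a with probability 1/2 each."""
    rng = random.Random(seed)
    samples = [a if rng.random() < 0.5 else -a for _ in range(trials)]
    mean = sum(samples) / trials
    var = sum((s - mean) ** 2 for s in samples) / trials
    return mean, var

# The mean stays near 0 for every a, while the variance is near a**2.
for a in (1, 2, 5):
    mean, var = x_a_moments(a)
    print(a, round(mean, 2), round(var, 1))
```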
Using the same idea, we take a random walk, say on the line, moving $1$ step left or $1$ step right, each with probability $1/2$. Let the random variable $W_n$ denote our net displacement (positive or negative) after $n$ steps. Then $E(W_n)=0$, while $\text{Var}(W_n)=n$, since $W_n$ is a sum of $n$ independent steps, each with variance $1$.
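A short simulation makes the same point (the function names are mine): the empirical mean of $W_n$ hovers near $0$ while the empirical variance grows like $n$.

```python
import random

def walk_displacement(n, rng):
    """Net displacement after n unit steps, each +1 or -1 with probability 1/2."""
    return sum(1 if rng.random() < 0.5 else -1 for _ in range(n))

def walk_moments(n, trials=20_000, seed=0):
    """Empirical mean and variance of W_n over many simulated walks."""
    rng = random.Random(seed)
    samples = [walk_displacement(n, rng) for _ in range(trials)]
    mean = sum(samples) / trials
    var = sum((s - mean) ** 2 for s in samples) / trials
    return mean, var

# E(W_n) stays near 0 while Var(W_n) is near n.
for n in (10, 100):
    print(n, walk_moments(n))
```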
We can do the same trick using the difference $U-V$ of two independent identically distributed binomials. Whatever the parameters $p$ and $n$, the mean is $0$ by symmetry, while $\text{Var}(U-V)=2np(1-p)$, so by adjusting the parameters we can obtain arbitrary variance.
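The binomial-difference trick can be checked the same way (a sketch; `diff_moments` is my name): the mean stays near $0$ and the variance comes out near $2np(1-p)$.

```python
import random

def binom(n, p, rng):
    """One Binomial(n, p) draw, obtained by summing n Bernoulli(p) trials."""
    return sum(1 for _ in range(n) if rng.random() < p)

def diff_moments(n, p, trials=20_000, seed=0):
    """Empirical mean and variance of U - V for independent U, V ~ Binomial(n, p)."""
    rng = random.Random(seed)
    samples = [binom(n, p, rng) - binom(n, p, rng) for _ in range(trials)]
    mean = sum(samples) / trials
    var = sum((s - mean) ** 2 for s in samples) / trials
    return mean, var

# Mean near 0; variance near 2*n*p*(1-p) = 10 for n=20, p=0.5.
print(diff_moments(20, 0.5))
```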
The following is a more important class of examples. Repeat an experiment independently $n$ times, with probability of success each time equal to $p$. Let the random variable $Y_{p,n}$ be the sample mean. Then $E(Y_{p,n})=p$ and $\text{Var}(Y_{p,n})=\frac{p(1-p)}{n}$. So we can decrease the variance by increasing the parameter $n$, a very useful fact.
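The $1/n$ decay of $\text{Var}(Y_{p,n})$ is easy to see numerically (a minimal sketch; `sample_mean_var` is my name): quadrupling $n$ cuts the empirical variance of the sample mean to roughly a quarter.

```python
import random

def sample_mean_var(p, n, trials=5_000, seed=0):
    """Empirical variance of the sample mean of n independent Bernoulli(p) trials."""
    rng = random.Random(seed)
    means = [sum(1 for _ in range(n) if rng.random() < p) / n for _ in range(trials)]
    m = sum(means) / trials
    return sum((x - m) ** 2 for x in means) / trials

# Var(Y_{p,n}) = p(1-p)/n, so the printed values shrink like 1/n.
for n in (25, 100, 400):
    print(n, sample_mean_var(0.3, n))
```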
The negative binomial family, especially the generalized version that allows $r$ to be any positive real, can be given the desired property by reparametrizing. For example, use $\mu$ and $p$, where $\mu$ is the mean. Almost by definition, you can vary $p$ while keeping $\mu$ fixed, and thereby change the variance. But admittedly this is more than a little artificial.
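To make the reparametrization concrete, here is a small arithmetic sketch under one common convention (the variable counts failures before the $r$-th success, so the mean is $r(1-p)/p$ and the variance is $r(1-p)/p^2=\mu/p$); the helper `nb_variance` is mine, and other conventions give different formulas.

```python
def nb_variance(mu, p):
    """Variance of a negative binomial with mean mu, under the convention that
    the variable counts failures before the r-th success:
    mean = r(1-p)/p and variance = r(1-p)/p**2 = mean/p.
    Fixing mu and varying p moves the variance anywhere in (mu, infinity)."""
    r = mu * p / (1 - p)          # the (real) r that yields mean mu for this p
    return r * (1 - p) / p ** 2   # equals mu / p

# Same mean mu = 5, different variances as p varies:
for p in (0.9, 0.5, 0.1):
    print(p, nb_variance(5, p))
```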