Let $f$ be a random variable on a probability space $(\Omega, \Sigma,\mu)$ where $\mu f^2 < \infty$.
How would I prove (or disprove) that $\mu(\{|f-\mu f|>K\}) \le \frac{1}{K^2}(\mu f^2 -(\mu f)^2),$ for any $K>0$?
For $K>0$, write \begin{align} K^2\mu(|X-EX|>K)&=K^2\mu((X-EX)^2>K^2)\\ &=\int_{\Omega} K^2\chi_{\{(X-EX)^2>K^2\}}d\mu\\ &\leq \int_{\Omega}(X-EX)^2d\mu\\ &=\int_{\Omega}\left(X^2-2XEX+(EX)^2\right)d\mu\\ &=EX^2-2(EX)^2+(EX)^2\\ &=EX^2-(EX)^2. \end{align}
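As a sanity check, the bound can be verified numerically: since the derivation above holds for any probability measure, it holds in particular for the empirical measure of a sample, with the mean and variance computed from that same sample. A minimal sketch (the sample size, seed, and choice of $K$ values are illustrative, not from the post):

```python
import random

# Monte Carlo sanity check of mu(|f - mu f| > K) <= (mu f^2 - (mu f)^2) / K^2.
# The empirical distribution of `sample` plays the role of mu, so the bound
# should hold exactly for the sample mean/variance, not just in expectation.
random.seed(0)
sample = [random.gauss(0.0, 1.0) for _ in range(100_000)]

n = len(sample)
mean = sum(sample) / n
# Population (biased) variance, i.e. E[f^2] - (E[f])^2 under the empirical measure.
var = sum((x - mean) ** 2 for x in sample) / n

for K in (0.5, 1.0, 2.0, 3.0):
    # Empirical tail probability mu(|f - mean| > K).
    tail = sum(1 for x in sample if abs(x - mean) > K) / n
    assert tail <= var / K ** 2
    print(f"K={K}: tail={tail:.4f}  bound={var / K ** 2:.4f}")
```

Note that for small $K$ the bound $\sigma^2/K^2$ can exceed $1$, in which case it is trivially true.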
The Chebyshev inequality says that $\mu(\{|f-\mu f|>K\sigma\})\le \frac{1}{K^2}$ for any $K>0$, where $\sigma^2=\mu f^2-(\mu f)^2$. This is exactly your expression after a change of variable: substituting $K\mapsto K/\sigma$ (valid when $\sigma>0$) gives $\mu(\{|f-\mu f|>K\})\le \frac{\sigma^2}{K^2}=\frac{1}{K^2}(\mu f^2-(\mu f)^2)$, which is how removing the $\sigma$ from the left-hand side brings a factor of $\sigma^2$ to the right-hand side.