Let $x,y\in \left( 0;1\right).$ I want to prove that
$\left( \dfrac {x} {y}\right) ^{x}\left( \dfrac {1-x} {1-y}\right) ^{1-x}\geqslant 1$
You need to show $y^x(1 -y)^{1 - x} \leq x^x(1 - x)^{1-x}$ for all $0 < y < 1$, given a fixed $0 < x < 1$. So maximize $y^x(1 -y)^{1 - x}$ with respect to $y$ using your favorite maximization technique; logarithmic differentiation works, for example. You'll find that the maximum occurs at $y = x$.
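Here is a minimal sketch of that computation (writing $g(y) = y^x(1-y)^{1-x}$, with $x$ fixed):
$$\ln g(y) = x\ln y + (1-x)\ln(1-y), \qquad \frac{g'(y)}{g(y)} = \frac{x}{y} - \frac{1-x}{1-y} = \frac{x-y}{y(1-y)}.$$
Setting this to zero gives $x(1-y) = (1-x)y$, i.e. $y = x$; since $g'(y)$ is positive for $y < x$ and negative for $y > x$, this critical point is the global maximum.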
In information theory, the Kullback–Leibler divergence (also called Kullback–Leibler distance or relative entropy) between the probability distributions $P$ and $Q$ of Bernoulli random variables with parameters $x$ and $y$ respectively is defined to be
$D(P||Q) = x\ln\left(\frac{x}{y}\right) + (1-x)\ln\left(\frac{1-x}{1-y}\right).$ For any two distributions (not just those of Bernoulli random variables), the Kullback–Leibler divergence is nonnegative, and is zero precisely when $P = Q$. See, for example, Theorem 2.6.3 in Cover and Thomas, Elements of Information Theory, Wiley-Interscience, 1991.
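In the Bernoulli case the nonnegativity can be checked directly from the elementary bound $\ln t \leq t - 1$:
$$-D(P||Q) = x\ln\left(\frac{y}{x}\right) + (1-x)\ln\left(\frac{1-y}{1-x}\right) \leq x\left(\frac{y}{x}-1\right) + (1-x)\left(\frac{1-y}{1-x}-1\right) = (y-x)+(x-y) = 0,$$
with equality exactly when $y/x = 1$ and $(1-y)/(1-x) = 1$, i.e. when $x = y$. Exponentiating $D(P||Q) \geq 0$ gives the inequality in the question.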
For fixed $x$, let $f_x(y) = y^x(1-y)^{1-x}$. Then
$f_x'(y) = y^{x-1}(1-y)^{-x}\left(x(1-y)-(1-x)y\right)$
and (with some more checking) this shows that the maximum value occurs when $x(1-y)-y(1-x)=0$, i.e. when $x=y$.
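The "more checking" amounts to a single simplification:
$$x(1-y)-(1-x)y = x - xy - y + xy = x - y,$$
so $f_x'(y)$ has the same sign as $x - y$; thus $f_x$ is increasing on $(0,x)$ and decreasing on $(x,1)$, and $y = x$ is the global maximum.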
But the statement that $f_x(x) \ge f_x(y)$ for all $y \in (0,1)$ is precisely what you wanted to prove: dividing through by $f_x(y)$ recovers the original inequality.
Take the logarithm of both sides: the claim is equivalent to $ x(\log x - \log y) + (1-x)(\log(1-x) - \log(1-y)) \geq 0 $ for $x,y \in (0,1)$. Since $(0,1)$ is open, if the left-hand side has a minimum in $y$, it occurs at a critical point. Differentiate in $y$ and set the result equal to $0$: $ \frac{-x}{y} + \frac{1-x}{1-y} = 0. $
You will find that $y = x$ is the only possibility. Differentiating in $y$ again we find $ \frac{x}{y^2} + \frac{1-x}{(1-y)^2}, $ which is strictly positive at $y=x$, so this is a local minimum. Thus by plugging in $y=x$ we find that if the left side has a minimum, then it is $0$.
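Concretely, at $y = x$ the second derivative evaluates to
$$\frac{x}{x^2} + \frac{1-x}{(1-x)^2} = \frac{1}{x} + \frac{1}{1-x} = \frac{1}{x(1-x)} > 0,$$
and plugging $y = x$ into the left side gives $x(\log x - \log x) + (1-x)(\log(1-x) - \log(1-x)) = 0$.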
Next we must rule out the possibility that the left side is unbounded below. Since $[\epsilon,1-\epsilon]^2$ is compact for all $\epsilon>0$, we know that the inequality we want holds on it and that $0$ is the minimum of the left side there. If the left side were unbounded below, we would have a point $(x_0,y_0) \in (0,1)^2$ at which it is negative. But this contradicts the previous remark, since $(x_0,y_0)$ has strictly positive distance from the boundary and so lies in some $[\epsilon,1-\epsilon]^2$. Thus the left side is bounded below by $0$, and hence the inequality holds for all $x,y \in (0,1)$.
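For what it's worth, the compactness step can also be bypassed: for fixed $x$, the left side tends to $+\infty$ as $y \to 0^+$ (because of the $-x\log y$ term) and as $y \to 1^-$ (because of the $-(1-x)\log(1-y)$ term). A continuous function on $(0,1)$ that blows up at both endpoints attains its minimum at an interior point, which must be a critical point; the only critical point is $y = x$, where the value is $0$.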