Your inequality is equivalent to $$ \left(\frac{a}{a+b}\right)^a \left(\frac{b}{a+b}\right)^b > \left(\frac{1}{2}\right)^{a+b}. $$
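(If the inequality being asked about is the usual $a^a b^b > \left(\frac{a+b}{2}\right)^{a+b}$ for positive $a \neq b$, which I am assuming here, the equivalence follows by dividing both sides by $(a+b)^{a+b}$.)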
Let $p = a/(a+b)$, and assume (wlog) that $a > b$, so that $1/2 < p < 1$. The inequality becomes $$ p^a (1-p)^b > (1/2)^{a+b}. $$

Suppose for the moment that $a,b$ are positive integers, and define $f(q) = q^a (1-q)^b$, the probability that a coin with bias $q$ comes up heads $a$ times and tails $b$ times (in a fixed order). The inequality $f(p) > f(1/2)$ is a special case of $$ \frac{a}{a+b} = \operatorname*{argmax}_{q \in [0,1]} f(q), $$ together with the fact that the maximizer is unique. In words: the Maximum Likelihood estimate for $q$ is $a/(a+b)$.
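For completeness, here is a quick check of the argmax claim (and of uniqueness, which is what gives the strict inequality). Differentiating the log-likelihood, $$ \frac{d}{dq}\log f(q) = \frac{a}{q} - \frac{b}{1-q}, $$ which is positive for $q < \frac{a}{a+b}$ and negative for $q > \frac{a}{a+b}$. So $f$ is strictly increasing and then strictly decreasing on $(0,1)$, its unique maximum is at $q = \frac{a}{a+b} = p$, and in particular $f(p) > f(1/2)$ whenever $p \neq 1/2$.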
If you don't like the fact that $a,b$ should be interpreted as integers, you can instead consider the equivalent inequality $$ g(p) > g(1/2), \qquad g(q) = q^p (1-q)^{1-p}, $$ obtained by taking the $(a+b)$-th root of both sides. It has the same maximum-likelihood interpretation, with the counts $a$, $b$ replaced by the frequencies $p$, $1-p$. Moreover, the log-likelihood ratio $$\log \frac{g(p)}{g(1/2)}$$ is exactly the Kullback-Leibler divergence between a $p$-biased coin and a fair coin. That divergence is non-negative, and it vanishes only when the two coins coincide, i.e. only when $p = 1/2$; this gives the (strict) inequality.
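Explicitly (this is just the standard computation, writing $D_{\mathrm{KL}}$ for the divergence between the two Bernoulli distributions): $$ \log \frac{g(p)}{g(1/2)} = p\log p + (1-p)\log(1-p) + \log 2 = p\log\frac{p}{1/2} + (1-p)\log\frac{1-p}{1/2} = D_{\mathrm{KL}}\bigl(\mathrm{Ber}(p)\,\big\|\,\mathrm{Ber}(1/2)\bigr), $$ which is strictly positive for $p \neq 1/2$, e.g. by Gibbs' inequality.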