The variational principle says that:
The supremum of the metric entropies over all invariant probability measures is equal to the topological entropy.
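In symbols, writing $h_{\mathrm{top}}(T)$ for the topological entropy and $h_\mu(T)$ for the metric entropy of an invariant probability measure $\mu$, the statement reads

$$h_{\mathrm{top}}(T) \;=\; \sup_{\mu \in \mathcal{M}(X,T)} h_\mu(T),$$

where $\mathcal{M}(X,T)$ denotes the set of $T$-invariant Borel probability measures on $X$ (the notation $\mathcal{M}(X,T)$ is just a common choice, not fixed by the statement above).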
First a minor correction: one should not say that the topological entropy has to do with the "expansion of $T$". The notion is purely topological: it is an invariant of topological conjugacy and in fact depends only on the topology of the space, not on the distance. Instead, we should see topological entropy as a measure of how points get separated under iteration of the dynamics.
From a similar point of view, metric entropy measures how typical points for a given measure get separated under iteration of the dynamics. Indeed, there is an equivalent description of metric entropy in terms of how few dynamical balls (sets of points whose first $n$ iterates stay close to those of a given point) are needed to cover a set of sufficiently large measure.
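One precise version of this description, if I recall it correctly, is Katok's entropy formula: for $T$ a continuous map of a compact metric space and $\mu$ an ergodic invariant probability measure,

$$h_\mu(T) \;=\; \lim_{\varepsilon\to 0}\,\limsup_{n\to\infty}\,\frac{1}{n}\log N_\mu(n,\varepsilon,\delta),$$

where $N_\mu(n,\varepsilon,\delta)$ is the minimal number of $(n,\varepsilon)$-dynamical balls needed to cover a set of $\mu$-measure at least $1-\delta$, and the value does not depend on the choice of $\delta\in(0,1)$.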
Summing up:
The variational principle tells us that the "speed" at which points get separated under iteration of the dynamics is obtained by maximizing over all invariant measures the "speed" at which typical points for an invariant measure get separated under iteration of the dynamics.
I write "speed" because, as I already noted, we do not need a distance, so in general the separation has to be described in terms of covers.
If you prefer, we can instead say that the maximum of information is obtained at the topological level, and that it is the supremum of the information obtained at the measure-theoretical level over all invariant probability measures. Also, if you prefer, we can replace "information" by "uncertainty"; whether one term is better than the other is really a philosophical matter (mathematicians would probably prefer the first, while physicists might be more inclined toward the second).
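As a concrete sanity check (my own toy example, not part of the statement above): for the full shift on two symbols, the topological entropy is $\log 2$, the metric entropy of the Bernoulli$(p,1-p)$ measure is $-p\log p-(1-p)\log(1-p)$, and the supremum over $p$ is attained at $p=1/2$, where it equals $\log 2$, in agreement with the variational principle.

```python
import math

def metric_entropy_bernoulli(p):
    """Metric entropy of the Bernoulli(p, 1-p) measure on the full 2-shift."""
    if p in (0.0, 1.0):
        return 0.0  # atomic-like extremes carry no entropy
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

# Topological entropy of the full shift on two symbols.
h_top = math.log(2)

# Scan Bernoulli measures: the supremum is attained at p = 1/2.
best = max(metric_entropy_bernoulli(p / 100) for p in range(101))
print(best, h_top)  # both are log 2, up to rounding
```

Of course this only probes the Bernoulli measures, but for the full shift the supremum over all invariant measures is already attained within this family.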