I'm reading a paper in which it is stated that
The entropy of an ergodic measure is defined as $\lim_{n \to \infty} -\frac{1}{n} \sum_{|w|=n} \mu[w] \log \mu[w].\tag{1} \label{eq:1}$
Here the underlying space is $(\{0,1\}^{\mathbb{N}}, \mathscr{B}, \mu)$, where $\mathscr{B}$ is the $\sigma$-algebra generated by cylinders. So for a fixed $n$, the sum is over all binary strings of length $n$.
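For concreteness, here is a small numerical sketch (my own, not from the paper) evaluating the sum in (1) for a Bernoulli($p$) product measure, where $\mu[w] = p^{\#\{i:\,w_i=1\}}(1-p)^{\#\{i:\,w_i=0\}}$. In that case the quantity inside the limit equals $-p\log p - (1-p)\log(1-p)$ for every $n$:

```python
import math
from itertools import product

def block_entropy(p, n):
    """Evaluate -(1/n) * sum_{|w|=n} mu[w] * log(mu[w]) for the
    Bernoulli(p) product measure on {0,1}^N, where
    mu[w] = p^(#ones in w) * (1-p)^(#zeros in w)."""
    total = 0.0
    for w in product((0, 1), repeat=n):
        k = sum(w)                       # number of ones in the word w
        mu_w = p**k * (1 - p)**(n - k)   # measure of the cylinder [w]
        total -= mu_w * math.log(mu_w)
    return total / n

p = 0.3
h = -p * math.log(p) - (1 - p) * math.log(1 - p)  # single-symbol entropy
for n in (1, 2, 5, 8):
    # each value agrees with h up to floating-point rounding
    print(n, block_entropy(p, n))
```

Of course this only illustrates the i.i.d. case, where no limit is actually needed; for a general ergodic $\mu$ the sequence in (1) genuinely depends on $n$.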
I've not seen the entropy of a measure defined anywhere, so I want to make sure I understand what it means. Since (1) defines the entropy of an *ergodic* measure, and a measure can only be ergodic with respect to some underlying measure-preserving transformation, the transformation here must be the shift $T$, which is already in play in the paper. Under that assumption, I conclude that the entropy of $\mu$ must be what most books I've seen call the entropy of $T$.
Question 1: Is my interpretation of the definition at (1) correct?
If the answer to question 1 is yes, then the definition of the entropy of $T$ (or $\mu$) differs from the one I've seen, which is that $h(T) = \sup_{\mathcal{A}} \, h(T,\mathcal{A}),\tag{2} \label{eq:2}$ where $\mathcal{A}$ ranges over all finite (measurable) partitions and, for any fixed finite partition $\mathcal{A}$, $h(T,\mathcal{A}) = \limsup_{n\to \infty} \frac{1}{n} H \left( \bigvee_{k=0}^{n-1} T^{-k} \mathcal{A} \right),$ where $\bigvee$ denotes the common refinement (i.e. the join) of the partitions.
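Here is my own attempt to connect the two, which may or may not be the intended argument. If I take $\mathcal{A} = \{[0], [1]\}$, the partition according to the first coordinate, then $\bigvee_{k=0}^{n-1} T^{-k} \mathcal{A}$ is exactly the partition of $\{0,1\}^{\mathbb{N}}$ into cylinders of length $n$, so

$$H \left( \bigvee_{k=0}^{n-1} T^{-k} \mathcal{A} \right) = -\sum_{|w|=n} \mu[w] \log \mu[w],$$

and hence $h(T,\mathcal{A})$ is precisely the limit in (1). What I don't see is why the supremum in (2) should be attained at this particular partition.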
Question 2: Are the definitions at (1) and (2) equivalent in this scenario?