I know that the Pitman-Koopman-Darmois theorem says that, under regularity conditions (in particular that the support of the distribution does not depend on $\theta$), only exponential family distributions have sufficient statistics whose dimension stays constant as the sample size increases.
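To make the fixed-dimension claim concrete, the textbook example (as I understand it) is the Normal family: for an i.i.d. sample $x_1,\dots,x_n \sim \mathcal{N}(\mu,\sigma^2)$,
$$T(x_1,\dots,x_n) = \left(\sum_{i=1}^n x_i,\; \sum_{i=1}^n x_i^2\right)$$
is sufficient for $(\mu,\sigma^2)$ and stays 2-dimensional no matter how large $n$ gets.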
I also know that the Fisher-Neyman factorization theorem says that a statistic $T$ is sufficient for $\theta$ if and only if the density factorizes as $f(x;\theta) = h(x)\cdot g(T(x);\theta).$
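For example, in the Bernoulli case (the standard illustration, if I have it right): for i.i.d. $x_1,\dots,x_n \sim \mathrm{Bernoulli}(\theta)$ and $T(x) = \sum_{i=1}^n x_i$,
$$f(x;\theta) = \underbrace{1}_{h(x)}\cdot\underbrace{\theta^{T(x)}(1-\theta)^{\,n-T(x)}}_{g(T(x);\theta)}.$$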
Trivially, if $T(x)$ is a bijection (e.g. $T(x) = x$, with $h(x) = 1$ and $g(t;\theta) = f(t;\theta)$), then $g$ can just invert $T$ and recover the whole original sample, so $T$ is automatically sufficient. But the way the Pitman-Koopman-Darmois theorem is usually stated raises what seems to me an obvious question, to which I can't find a clear answer:
Are there distributions whose sufficient statistics grow in dimension as the sample size grows, and yet are "lossy" functions of their input data, i.e. not invertible back to the full sample?
In particular, if there is some class of distributions where the dimension of $T$ grows sublinearly with the dataset size, those distributions could be very useful in distributed computational settings, since each machine would only need to communicate a statistic much smaller than its share of the data.
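To show the kind of distributed use I have in mind, here is a minimal Python sketch (my own illustration, not from any particular framework) using the Normal family's fixed-dimension statistic; a family whose $T$ grew sublinearly would follow the same merge pattern, just with $o(n)$ communication per machine instead of $O(1)$:

```python
import numpy as np

# Sketch: distributed estimation of a Normal(mu, sigma^2) model.
# Each machine ships only the fixed-dimension sufficient statistic
# (n, sum of x, sum of x^2) for its shard, never the raw data.

def local_stats(shard):
    """Sufficient statistic of one shard: (count, sum, sum of squares)."""
    x = np.asarray(shard, dtype=float)
    return np.array([x.size, x.sum(), np.square(x).sum()])

def merge(stats_list):
    """Sufficient statistics of disjoint shards are additive."""
    return np.sum(stats_list, axis=0)

def mle(stats):
    """Recover the MLE of (mu, sigma^2) from the merged statistic alone."""
    n, s1, s2 = stats
    mu = s1 / n
    return mu, s2 / n - mu**2

rng = np.random.default_rng(0)
shards = [rng.normal(2.0, 3.0, size=10_000) for _ in range(8)]
print(mle(merge([local_stats(s) for s in shards])))  # ~ (2.0, 9.0)
```

The point is that `merge` and `mle` only ever see the 3-dimensional statistic, never the raw shards.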