The definition of an unbiased estimator of $\theta$ is a statistic $\hat\theta \equiv \hat\theta(X_1,\dots,X_n)$ whose expected value equals $\theta$: $\Bbb E\,\hat\theta = \theta$ for all $\theta \in \Theta$, where $X_i \sim P_\theta$.
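To make the definition concrete, here is a minimal numerical sketch (my own example, not part of the original question): for $X_i \sim N(\theta, 1)$ the sample mean is an unbiased estimator of $\theta$, and a Monte Carlo average of the estimator over many independent samples approximates $\Bbb E\,\hat\theta$.

```python
import random

# Hypothetical check (assumed model: X_i ~ N(theta, 1), estimator =
# sample mean).  We approximate E[theta_hat] by averaging the statistic
# over many independent samples of size n.
random.seed(0)

theta = 2.5        # true parameter (assumed for this sketch)
n = 10             # sample size
trials = 20000     # number of Monte Carlo replications

def theta_hat(sample):
    """The statistic: sample mean of X_1, ..., X_n."""
    return sum(sample) / len(sample)

estimates = [
    theta_hat([random.gauss(theta, 1.0) for _ in range(n)])
    for _ in range(trials)
]
mc_mean = sum(estimates) / trials
print(mc_mean)  # should be close to theta = 2.5
```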
The question is: if we want the expected value of $\hat\theta$, we should regard it as a random variable on some probability space $(\Omega_{\hat\theta}, \Sigma_{\hat\theta}, P_{\hat\theta})$. The expected value is then the integral with respect to the probability measure: $$\int_{\Omega_{\hat\theta}} \hat\theta(\omega) \,P_{\hat\theta}(d\omega)$$
So, what is $P_\hat\theta$?
My guess is that, for every $i$, $X_i$ is a random variable from $(\Omega_\theta, \Sigma_\theta, P_\theta)$ to $\Bbb R$, so $X_{[n]} = (X_1,\dots,X_n)$ is a random vector from $(\underbrace{\Omega_\theta\times\dots\times\Omega_\theta}_{n \text{ times}}, \Sigma_{X_{[n]}}, P_{X_{[n]}})$ to $\Bbb R^n$.
Then $\hat\theta$ is a statistic, i.e. a function of $(X_1,\dots,X_n)$, so it can be considered as a random variable on $(\Bbb R^n, \Sigma_{\hat\theta(X_{[n]})}, P_{\hat\theta(X_{[n]})})$ (or not — please point out if I'm wrong).
But what is $P_{\hat\theta(X_{[n]})}$? Is it equal to $P_\theta$ or not, and how can one prove it?
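To make this last question concrete, here is a hedged numerical sketch (again my own example: $X_i \sim N(\theta, 1)$ with $\hat\theta$ the sample mean, names are mine). It compares the empirical spread of draws from $P_\theta$ itself with the spread of draws of $\hat\theta$; if the two measures were the same, the variances would agree.

```python
import random

# Concrete illustration of the question (assumed model, not from the
# original post): X_i ~ N(theta, 1), theta_hat = sample mean of n draws.
# If P_{theta_hat} were equal to P_theta, both samples below would have
# the same empirical variance.
random.seed(1)

theta, n, trials = 0.0, 25, 20000

def var(xs):
    """Empirical (population) variance of a list of numbers."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Draws from P_theta itself:
draws_P = [random.gauss(theta, 1.0) for _ in range(trials)]

# Draws from the distribution of theta_hat (one sample mean per trial):
draws_hat = [
    sum(random.gauss(theta, 1.0) for _ in range(n)) / n
    for _ in range(trials)
]

print(var(draws_P))    # close to 1
print(var(draws_hat))  # close to 1/n = 0.04
```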