EDIT: new formulation of the question (old version below).
In a paper I found the statement that a certain sum $M_n = Y_1 + \dotsb + Y_n$ is a martingale, where $Y_k = f(X_k, Z_k) - E\bigl(f(X_k, Z_k) \mid X_k\bigr)$. (The new $S_k$ would be $S_k := f(X_k, Z_k)$, but I won't use this in the new formulation of the question.) We have $E\bigl(f(X_k, Z_k) \mid X_k\bigr) = E\bigl(f(X_k, Z_k) \mid \mathcal{F}^X_k\bigr)$ with $\mathcal{F}^X_k = \sigma(X_1, \dots, X_k)$. The $Z_k$ are i.i.d., $X_k$ and $Z_k$ are independent for each $k$, and $f$ is differentiable.
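For what it's worth, the way I read this conditional expectation (using the independence of $X_k$ and $Z_k$ and Doob's lemma, and assuming $E|f(X_k, Z_k)| < \infty$) is
$$E\bigl(f(X_k, Z_k) \mid X_k\bigr) = g(X_k) \qquad \text{with} \qquad g(x) := E\bigl(f(x, Z_k)\bigr),$$
i.e. $Z_k$ is integrated out while $X_k$ is held fixed.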
How can I show that this is a martingale, and with respect to which filtration?
Let me explain what I know so far about the filtrations: In the original question (see bottom) I thought I would have to use the natural filtration, which doesn't seem to fulfill the martingale property. Then @Gortaur suggested using the filtration $\mathcal{F}^X_k$. I wondered whether $M_n$ is measurable with respect to this filtration (question) - it isn't. So for $M_n$ to be a martingale we have to choose at least the filtration $D_k := \mathcal{F}^{(X, Z)}_k = \sigma(X_1, \dots, X_k, Z_1, \dots, Z_k)$, but with this filtration the martingale property doesn't seem to hold either (instead of $E(M_{k+1} \mid D_k) = M_k$ it is sufficient to show $E(Y_{k+1} \mid D_k) = 0$):
$$E(Y_{k+1} \mid D_k) = E\bigl(f(X_{k+1}, Z_{k+1}) \bigm| D_k\bigr) - E\Bigl(E\bigl(f(X_{k+1}, Z_{k+1}) \bigm| \mathcal{F}^X_{k+1}\bigr) \Bigm| D_k\Bigr).$$
If I could pull the first term out of the conditional expectation and apply the tower property to the second (which would require $f(X_{k+1}, Z_{k+1})$ to be $D_k$-measurable and $\mathcal{F}^X_{k+1} \subseteq D_k$, respectively), this would leave $f(X_{k+1}, Z_{k+1}) - E\bigl(f(X_{k+1}, Z_{k+1}) \bigm| \mathcal{F}^X_{k+1}\bigr) = Y_{k+1}$, not $0$.
And for that expression to vanish, the subtrahend would have to equal $f(X_{k+1}, Z_{k+1})$; but then $f(X_{k+1}, Z_{k+1})$, as a version of this conditional expectation, would have to be $\mathcal{F}^X_{k+1}$-measurable, which it is not in general (consider Doob's lemma, for example in this answer to my other question).
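To see whether the increments are at least conditionally centered in a concrete case, I tried a small numerical sketch. Everything in it is my own toy choice, not the paper's setup: a recursion $X_{k+1} = X_k - a\, f(X_k, Z_k)$ (loosely modeled on the paper's), $Z_k \sim N(0,1)$, and $f(x, z) = x z + x^2 z^2$, so that $g(x) = E f(x, Z) = x^2$. Fixing one realization of $(X_1, \dots, X_k, Z_1, \dots, Z_k)$, i.e. one atom of $D_k$, and averaging $Y_{k+1}$ over fresh draws of $Z_{k+1}$ approximates $E(Y_{k+1} \mid D_k)$ at that realization:

```python
import numpy as np

rng = np.random.default_rng(0)
a = 0.1  # assumed step size of the toy recursion

def f(x, z):
    # toy integrand, my own assumption (not from the paper)
    return x * z + x**2 * z**2

def g(x):
    # g(x) = E[f(x, Z)] for Z ~ N(0, 1): E[x Z] = 0, E[x^2 Z^2] = x^2
    return x**2

# fix one realization of (X_1, ..., X_k, Z_1, ..., Z_k):
x = 1.0
for _ in range(5):
    x = x - a * f(x, rng.standard_normal())
# this x plays the role of X_{k+1}; it is D_k-measurable in this toy model

# average Y_{k+1} = f(X_{k+1}, Z_{k+1}) - g(X_{k+1}) over fresh Z_{k+1}:
z_new = rng.standard_normal(10**6)
print(np.mean(f(x, z_new) - g(x)))  # ~ 0 up to Monte Carlo error
```

In this toy case the conditional average does come out $\approx 0$, consistent with the martingale claim, but I don't see how to argue this in the general setting above.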
Link to the paper: here. In the notation there, $X_k := \hat{\theta}_k$, $Z_k := \Delta_k$, and $f$ is the $i$-th component of the gradient estimate, $\hat{g}_{ki}(\hat{\theta}_k) = \frac{L(\hat{\theta}_k + c_k \Delta_k) - L(\hat{\theta}_k - c_k \Delta_k)}{2 c_k \Delta_{ki}}$. (I assume that I can deal with the cancellation of the noise terms via the paper's assumption $E(\epsilon_k^+ - \epsilon_k^- \mid F_k, \Delta_k) = 0$ and put the noise-free $L$ directly into $f$.)
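For concreteness, here is how I picture the estimator numerically. This is only a sketch with a toy quadratic $L$ of my own choosing and $\pm 1$-Bernoulli perturbations (a common choice for $\Delta_k$); none of the concrete values are from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def L(theta):
    # toy loss (my choice): grad L(theta) = theta
    return 0.5 * np.dot(theta, theta)

def g_hat(theta, c):
    # simultaneous-perturbation difference quotient, one entry per component i
    delta = rng.choice([-1.0, 1.0], size=theta.shape)
    num = L(theta + c * delta) - L(theta - c * delta)
    return num / (2.0 * c * delta)

theta = np.array([1.0, -2.0])
print(g_hat(theta, c=0.01))  # noisy estimate of grad L(theta) = (1, -2)
```

In this picture, $Y_{ki} = \hat{g}_{ki}(\hat{\theta}_k) - E\bigl(\hat{g}_{ki}(\hat{\theta}_k) \mid \hat{\theta}_k\bigr)$ (for one fixed component $i$) is the increment whose conditional centering I am after.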
The original question:
I want to show that a sum is a martingale sequence, $M_n = Y_1 + \dots + Y_n$. I know that it is sufficient to show that $E(Y_{n+1} \mid Y_1, \dots, Y_n) = 0$ a.s. Unfortunately, the summands are rather complicated, $Y_k = S_k - E(S_k \mid X_k)$, where $S_k = \frac{f(X_k, Z_k)}{Z_k}$. That means for $k = 1$:
$$E[Y_2 \mid Y_1] = E\left[\frac{f(X_2, Z_2)}{Z_2} - E\!\left(\frac{f(X_2, Z_2)}{Z_2} \,\middle|\, X_2\right) \,\middle|\, \frac{f(X_1, Z_1)}{Z_1} - E\!\left(\frac{f(X_1, Z_1)}{Z_1} \,\middle|\, X_1\right)\right] = 0$$
I think I will have to use that $E(Y_k) = 0$, and I could if I knew that $Y_k$ is independent of $Y_1 + \dots + Y_{k-1}$. The $Z_k$ form an independent sequence, and $X_2$ is more or less $X_1 + f(X_1, Z_1)$; but still, the intuition would be that how far $Y_1$ is away from its expectation is independent of how far $Y_2$ is (see the sanity check below). Q: How can I show that $M_n$ is a martingale sequence? Are my preliminary considerations right? Do you think the martingale property can be shown this way? Do I have to impose additional conditions on $f$?
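As a quick sanity check on this independence intuition I simulated the toy model from above (my own choices again: $f(x, z) = x z + x^2 z^2$ and $Z_k \sim N(0,1)$, for which the division cancels exactly, $S_k = f(X_k, Z_k)/Z_k = X_k + X_k^2 Z_k$ and $E(S_k \mid X_k) = X_k$, together with the recursion $X_2 = X_1 + f(X_1, Z_1)$ mentioned above). The increments at least look uncorrelated:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10**6
z1 = rng.standard_normal(n)
x1 = np.ones(n)

s1 = x1 + x1**2 * z1                  # S_1 = f(X_1, Z_1)/Z_1, the Z_1 cancels
y1 = s1 - x1                          # Y_1 = S_1 - E(S_1 | X_1) = X_1^2 Z_1

x2 = x1 + (x1 * z1 + x1**2 * z1**2)   # assumed recursion X_2 = X_1 + f(X_1, Z_1)
z2 = rng.standard_normal(n)
y2 = (x2 + x2**2 * z2) - x2           # Y_2, same construction at k = 2

print(np.mean(y1 * y2))               # ~ 0: the increments look uncorrelated
```

Of course, uncorrelatedness is weaker than the conditional statement $E(Y_2 \mid Y_1) = 0$ that I actually need.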
I am thankful for every hint.
Note: $f: \mathbb{R}^m \times \mathbb{R} \to \mathbb{R}$, $(X_k(\omega), Z_k(\omega)) \mapsto f(X_k(\omega), Z_k(\omega))$.
EDIT: I had forgotten that in $Y_k$ we have the conditional expectation given $X_k$ instead of the plain expectation.