Any random variable $Z:(\Omega,\mathcal F)\to(\mathbb R^n,\mathcal B(\mathbb R^n))$ such that $\mathrm E(\mathrm e^{\mathrm i\langle u,Z\rangle})=\mathrm e^{-\kappa\|u\|^2}$ for every $u$ in $\mathbb R^n$ and for some positive $\kappa$ is centered normal with variance-covariance matrix $2\kappa I$. A quick way to see this is to note that the function $\varphi_Z:\mathbb R^n\to\mathbb C$, $u\mapsto\mathrm E(\mathrm e^{\mathrm i\langle u,Z\rangle})$, is the Fourier transform of the distribution $\mathrm P_Z$ of $Z$ on $(\mathbb R^n,\mathcal B(\mathbb R^n))$, hence $\varphi_Z$ uniquely determines $\mathrm P_Z$; since $u\mapsto\mathrm e^{-\kappa\|u\|^2}$ is the Fourier transform of the $\mathcal N(0,2\kappa I)$ distribution, $\mathrm P_Z$ must be $\mathcal N(0,2\kappa I)$.
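For completeness, here is the coordinatewise check (a standard fact, spelled out only as a sanity check) that $\mathrm e^{-\kappa\|u\|^2}$ is indeed the Fourier transform of $\mathcal N(0,2\kappa I)$: if $Z\sim\mathcal N(0,2\kappa I)$, its coordinates $Z_1,\dots,Z_n$ are i.i.d. $\mathcal N(0,2\kappa)$, hence
$$\mathrm E(\mathrm e^{\mathrm i\langle u,Z\rangle})=\prod_{k=1}^n\mathrm E(\mathrm e^{\mathrm iu_kZ_k})=\prod_{k=1}^n\mathrm e^{-\tfrac12(2\kappa)u_k^2}=\mathrm e^{-\kappa\|u\|^2},$$
using the one-dimensional identity $\mathrm E(\mathrm e^{\mathrm itN})=\mathrm e^{-\sigma^2t^2/2}$ for $N\sim\mathcal N(0,\sigma^2)$, here with $\sigma^2=2\kappa$.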
Likewise, given any sigma-algebra $\mathcal G\subseteq\mathcal F$, if $\mathrm E(\mathrm e^{\mathrm i\langle u,Z\rangle}\mid\mathcal G)=\mathrm e^{-\kappa\|u\|^2}$ for every $u$ in $\mathbb R^n$, then, for every $\mathcal G$-measurable random variable $Y:(\Omega,\mathcal G)\to(\mathbb R,\mathcal B(\mathbb R))$, every $u$ in $\mathbb R^n$ and every $v$ in $\mathbb R$,
$$\mathrm E(\mathrm e^{\mathrm i\langle u,Z\rangle+\mathrm ivY}\mid\mathcal G)=\mathrm e^{\mathrm ivY}\cdot\mathrm E(\mathrm e^{\mathrm i\langle u,Z\rangle}\mid\mathcal G)=\mathrm e^{\mathrm ivY}\cdot\mathrm e^{-\kappa\|u\|^2},$$
hence, taking expectations (the tower property),
$$\mathrm E(\mathrm e^{\mathrm i\langle u,Z\rangle+\mathrm ivY})=\mathrm E(\mathrm e^{\mathrm ivY})\cdot\mathrm e^{-\kappa\|u\|^2}.$$
Since the case $v=0$ yields $\mathrm E(\mathrm e^{\mathrm i\langle u,Z\rangle})=\mathrm e^{-\kappa\|u\|^2}$, this proves that $Z$ and $Y$ are independent: the Fourier transform of the distribution $\mathrm P_{(Y,Z)}$ coincides with the Fourier transform of the product distribution $\mathrm P_Y\otimes\mathrm P_Z$. Since this holds for every $\mathcal G$-measurable $Y$, in particular for $Y=\mathbf 1_A$ with $A$ in $\mathcal G$, $Z$ is independent of $\mathcal G$.
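To make the last step explicit (this only unpacks the notation; $\widehat\mu(w)=\int\mathrm e^{\mathrm i\langle w,x\rangle}\,\mathrm d\mu(x)$ denotes the Fourier transform of a measure $\mu$), the identity above says that, for every $(v,u)$ in $\mathbb R\times\mathbb R^n$,
$$\widehat{\mathrm P_{(Y,Z)}}(v,u)=\mathrm E(\mathrm e^{\mathrm ivY+\mathrm i\langle u,Z\rangle})=\mathrm E(\mathrm e^{\mathrm ivY})\cdot\mathrm E(\mathrm e^{\mathrm i\langle u,Z\rangle})=\widehat{\mathrm P_Y}(v)\cdot\widehat{\mathrm P_Z}(u)=\widehat{\mathrm P_Y\otimes\mathrm P_Z}(v,u),$$
and the injectivity of the Fourier transform then gives $\mathrm P_{(Y,Z)}=\mathrm P_Y\otimes\mathrm P_Z$, that is, the independence of $Y$ and $Z$.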
Applying this to $Z=X_t-X_s$ and $\mathcal G=\mathcal G_s$ shows that $X_t-X_s$ is independent of $\mathcal G_s$.
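As an illustration only (the exact value of $\kappa$ is not specified above, so this is an assumption about the setting): if the hypothesis holds with a constant $\kappa_{s,t}$ depending on $s$ and $t$, the two parts combine into
$$\mathrm E(\mathrm e^{\mathrm i\langle u,X_t-X_s\rangle}\mid\mathcal G_s)=\mathrm e^{-\kappa_{s,t}\|u\|^2}\ \text{for every }u\quad\Longrightarrow\quad X_t-X_s\ \text{is independent of}\ \mathcal G_s\ \text{and}\ X_t-X_s\sim\mathcal N(0,2\kappa_{s,t}I).$$
If the underlying process is an $n$-dimensional standard Brownian motion, then $\kappa_{s,t}=(t-s)/2$ and the increment is $\mathcal N(0,(t-s)I)$.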