Unfortunately, it didn't work out in general (I'm not saying your claim is false, though). Here is what I was trying to do. Maybe it can help you come up with ideas; otherwise just disregard it.
Let $L^2(X)$ be the set of all predictable processes $H$ such that the process $(\int_0^t H^2_s d\langle X\rangle_s)_{t\geq 0}$ is integrable, where $(\langle X \rangle_t)_{t\geq 0}$ denotes the predictable quadratic variation process.
In the following $\mathcal{H}^2$ (resp. $\mathcal{H}_{\text{loc}}^2$) denotes the set of all square integrable martingales (resp. locally square integrable martingales). Then we have the following theorem from Jacod & Shiryaev:
Theorem 4.40(b). Let $(X_t)_{t\geq 0}\in \mathcal{H}_{\text{loc}}^2$. Then $(\int_0^t H_s d X_s)_{t\geq 0} \in \mathcal{H}^2$ if and only if $H\in L^2(X)$.
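As a quick sanity check of the theorem (this toy case is not part of the original argument), take $X = B$ and $H_s = B_s$, which is adapted and continuous, hence predictable. Then
$$E\left[\int_0^t B_s^2\, ds\right]=\int_0^t E[B_s^2]\, ds=\int_0^t s\, ds=\frac{t^2}{2}<\infty,$$
so $B\in L^2(B)$ and the theorem says that $\int_0^t B_s\, dB_s = \tfrac{1}{2}(B_t^2-t)$ is in $\mathcal{H}^2$, as expected.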
Obviously we are within the scope of this theorem, as $(B_t)_{t\geq 0}\in\mathcal{H}^2_{\text{loc}}$ with predictable quadratic variation $\langle B\rangle_t = t$, $t\geq 0$. Furthermore, if $(X_t)_{t\geq 0}$ is the solution to the SDE of the original post, i.e.
$$X_t=\int_0^t \sigma(X_s)\, dB_s,\quad t\geq 0,$$
then $(X_t)$ is adapted and continuous and hence predictable, and so is $\sigma(X)$ since $\sigma$ is continuous. The theorem now yields that $(X_t)_{t\geq 0}\in \mathcal{H}^2$ if and only if
$$E\left[\int_0^t \sigma(X_s)^2\, d\langle B\rangle_s\right]=E\left[\int_0^t \sigma(X_s)^2\, ds\right]=\int_0^t E\left[\sigma(X_s)^2\right] ds<\infty$$
holds for all $t\geq 0$ (the last equality is Tonelli). The Lipschitz assumption implies linear growth, $|\sigma(x)|\leq|\sigma(0)|+K|x|$, hence $\sigma(x)^2\leq 2\sigma(0)^2+2K^2x^2$, so we get a sufficient condition for $(X_t)$ being a square integrable martingale:
$$\int_0^t E[X_s^2]\, ds<\infty, \quad t\geq 0.$$
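For what it's worth, here is a rough sketch (glossing over the localization needed to justify the first step) of why this sufficient condition should hold automatically under the Lipschitz assumption. Writing $f(t):=E[X_t^2]$ and using the Itô isometry together with the quadratic growth bound $\sigma(x)^2\leq 2\sigma(0)^2+2K^2x^2$,
$$f(t)=E\left[\int_0^t \sigma(X_s)^2\, ds\right]\leq 2\sigma(0)^2 t+2K^2\int_0^t f(s)\, ds,$$
so Grönwall's lemma gives
$$f(t)\leq 2\sigma(0)^2 t\, e^{2K^2 t}<\infty,\quad t\geq 0,$$
and in particular $\int_0^t E[X_s^2]\, ds<\infty$ for all $t\geq 0$.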