It is perhaps easiest to think in terms of frames.
Let $\{e_0, e_1, \ldots, e_n\}$ be a basis (we treat them as explicit column vectors in the standard coordinates of the vector space) of $\mathbb{R}^{1+n}$ such that under the Minkowski metric you have $ m(e_0, e_0) = -1,\qquad m(e_i,e_j) = \delta_{ij},\ i,j \geq 1, \qquad m(e_0,e_j) = 0$ (so that it forms an orthonormal basis for the Minkowski metric). You have that the matrix $ \Lambda = \begin{pmatrix} e_0 & e_1 &\ldots & e_n\end{pmatrix}$ whose columns are the basis vectors is an element of $O(1,n)$. Conversely, from the definition of $O(1,n)$ (that it preserves the Minkowski metric) you see that every element must be of this form: its individual column vectors form an orthonormal basis of Minkowski space.
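If it helps to see this concretely, here is a small numerical sketch in Python/numpy (the helper names are mine and purely illustrative) checking the frame characterisation on a sample boost; in matrix form the condition is $\Lambda^T \eta \Lambda = \eta$, where $\eta = \operatorname{diag}(-1,1,\ldots,1)$ is the matrix of $m$.

```python
# A small numerical sketch (numpy; helper names are mine): the columns of a
# Lorentz matrix form a Minkowski-orthonormal frame, equivalently
# Lambda^T eta Lambda = eta with eta = diag(-1, 1, ..., 1) the matrix of m.
import numpy as np

n = 3                                  # spatial dimension, so we work in R^{1+n}
eta = np.diag([-1.0] + [1.0] * n)      # matrix of the Minkowski metric m

def m(u, v):
    """Minkowski inner product m(u, v) = -u_0 v_0 + sum_i u_i v_i."""
    return u @ eta @ v

def boost(phi):
    """Standard boost of rapidity phi in the (x^0, x^1) plane; an element of O(1, n)."""
    B = np.eye(1 + n)
    B[0, 0] = B[1, 1] = np.cosh(phi)
    B[0, 1] = B[1, 0] = np.sinh(phi)
    return B

Lam = boost(0.7)
e = [Lam[:, k] for k in range(1 + n)]  # the columns e_0, ..., e_n of Lambda

assert np.isclose(m(e[0], e[0]), -1.0)                            # m(e_0, e_0) = -1
assert all(np.isclose(m(e[i], e[j]), float(i == j))
           for i in range(1, 1 + n) for j in range(1, 1 + n))     # m(e_i, e_j) = delta_ij
assert all(np.isclose(m(e[0], e[j]), 0.0) for j in range(1, 1 + n))
assert np.allclose(Lam.T @ eta @ Lam, eta)                        # the isometry condition
```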
Writing $\eta = \operatorname{diag}(-1,1,\ldots,1)$ for the matrix of $m$, the condition that $\Lambda$ preserves the metric reads $\Lambda^T \eta \Lambda = \eta$; taking determinants gives $(\det\Lambda)^2 = 1$. Since the determinant is a continuous function, the subsets of $O(1,n)$ on which $\det = +1$ and $\det = -1$ must be disconnected from each other. Define $SO(1,n)$ to be the subgroup with determinant $+1$ (it is a subgroup because the determinant is multiplicative).
Now, since $m(e_0,e_0) = -1$, we expand $ m(e_0,e_0) = - \left[(e_0)_0\right]^2 + \sum_{i = 1}^n \left[(e_0)_i\right]^2 \geq - \left[(e_0)_0\right]^2, $ where $(e_0)_\mu$ are the components of the column vector $e_0$. This implies that $\left[(e_0)_0\right]^2 = (\Lambda_{00})^2 \geq 1$. Hence $\Lambda_{00}$ is either greater than or equal to $+1$, or less than or equal to $-1$. Since $\Lambda \mapsto \Lambda_{00}$ is a continuous function, the sets $O(1,n) \cap \{\Lambda_{00} > 0\}$ and $O(1,n)\cap \{\Lambda_{00} < 0\}$ must be disconnected from each other. We denote the subset where $\Lambda_{00} > 0$ by $O^+(1,n)$.
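For orientation, here are explicit representatives of the four pieces obtained by combining the two signs of $\det\Lambda$ with the two signs of $\Lambda_{00}$ (again a numpy sketch; the names $P$ and $T$ for the spatial and time reflections are my own labels).

```python
# Illustration (helper names are mine): representatives of the four pieces of
# O(1, n), told apart by the signs of det(Lambda) and Lambda_00.
import numpy as np

n = 3
eta = np.diag([-1.0] + [1.0] * n)
P = np.diag([1.0, -1.0] + [1.0] * (n - 1))   # reflect one spatial axis
T = np.diag([-1.0] + [1.0] * n)              # reverse the time axis

for name, Lam in [("identity", np.eye(1 + n)), ("P", P), ("T", T), ("PT", P @ T)]:
    assert np.allclose(Lam.T @ eta @ Lam, eta)   # all four lie in O(1, n)
    print(f"{name:8s}  det = {np.linalg.det(Lam):+.0f}   Lambda_00 = {Lam[0, 0]:+.0f}")
# identity: det = +1, Lambda_00 = +1   (this is the SO^+(1, n) piece)
# P:        det = -1, Lambda_00 = +1
# T:        det = -1, Lambda_00 = -1
# PT:       det = +1, Lambda_00 = -1
```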
We claim that $O^+(1,n)$ forms a subgroup. We argue geometrically. Let $\Lambda$ be as above.
Claim 1. Let $u,v,w\in \mathbb{R}^{1+n}$ be such that $m(u,u), m(v,v), m(w,w)$ are all negative (so they are all time-like). Then $m(u,v)\cdot m(v,w)$ and $m(u,w)$ have opposite signs.
Proof: rescaling $v$ by a positive constant changes none of the signs, so we may assume $m(v,v) = -1$ and complete $v$ to an orthonormal basis $\{v_0 = v, v_1, v_2, \ldots, v_n\}$. We have that $u = - m(v,u)\, v_0 + \sum_i m(v_i,u)\, v_i$, and similarly for $w$, so $ m(u,w) = - m(v,u)\, m(v,w) + \sum_i m(v_i,u)\, m(v_i,w). $ Since $u$ and $w$ are time-like, $\sum_i m(v_i,u)^2 < m(v,u)^2$ and likewise for $w$, so by the Cauchy–Schwarz inequality the sum is strictly smaller in absolute value than the first term. Hence $m(u,w)$ has the sign of $-m(v,u)\, m(v,w)$, which is the claim.
Claim 2. If $v,w$ are time-like vectors such that $v-w$ is space-like, then $m(v,w) < 0$.
Proof: Observe that $ m(v,w) + m(v, v-w) + m(w,v) + m(w,w-v) = m(v,v) + m(w,w) < 0, $ and that the left hand side rearranges to $ 2 m(v,w) + m(v-w,v-w) \geq 2 m(v,w), $ since $v-w$ is space-like. Hence $m(v,w) < 0$.
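Both claims are easy to test numerically. Here is a quick randomized sanity check (numpy; `random_timelike` is my own ad hoc sampler, nothing canonical):

```python
# A quick randomized sanity check of Claims 1 and 2 (numpy; `random_timelike`
# is my own ad hoc sampler).  Time-like vectors are produced by forcing the
# time component to dominate the spatial part.
import numpy as np

rng = np.random.default_rng(0)
n = 3
eta = np.diag([-1.0] + [1.0] * n)

def m(u, v):
    return u @ eta @ v

def random_timelike():
    x = rng.normal(size=1 + n)
    x[0] = np.sign(rng.normal()) * (np.linalg.norm(x[1:]) + rng.uniform(0.1, 2.0))
    return x                        # now -x_0^2 + |x_spatial|^2 < 0

for _ in range(10_000):
    u, v, w = random_timelike(), random_timelike(), random_timelike()
    # Claim 1: m(u, v) m(v, w) and m(u, w) have opposite signs.
    assert m(u, v) * m(v, w) * m(u, w) < 0
    # Claim 2: if v - w is space-like, then m(v, w) < 0.
    if m(v - w, v - w) > 0:
        assert m(v, w) < 0
```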
Claim 3. If $\Lambda \in O^+(1,n)$, and $v$ is a time-like vector whose $0$ component is positive, then the $0$ component of $\Lambda\cdot v$ is also positive. (That is, $\Lambda$ preserves time orientation.)
Proof: Let $s_0$ be the column vector $(1,0,\ldots,0)$. By the assumption on the $0$ component, we have $m(s_0,v) < 0$. By the isometry property, $\Lambda\cdot v$ is time-like. The vector $v - |m(s_0,v)|\, s_0$ has vanishing $0$ component, so it is space-like (if it vanishes, then $\Lambda\cdot v$ is a positive multiple of $e_0$ and the conclusion is immediate); applying $\Lambda$ and using $\Lambda\cdot s_0 = e_0$, we see that $\Lambda \cdot v - |m(s_0,v)|\, e_0$ is space-like. Since $e_0$ is time-like, Claim 2 (applied to $\Lambda\cdot v$ and $|m(s_0,v)|\, e_0$) gives $m(e_0, \Lambda\cdot v) < 0$. Since $\Lambda \in O^+(1,n)$, we have $m(e_0,s_0) = -\Lambda_{00} < 0$. So by Claim 1 we must also have $m(\Lambda \cdot v, s_0) < 0$, which is exactly the statement that the $0$ component of $\Lambda\cdot v$ is positive.
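Again this is easy to test. The sketch below builds random elements of $O^+(1,n)$ as (rotation)$\cdot$(boost)$\cdot$(rotation) products, a construction I am using only to generate examples, and checks that they send future-directed time-like vectors to future-directed ones.

```python
# A randomized check of Claim 3 (numpy; the helpers and the way I generate
# elements of O^+(1, n) are my own choices, made only to produce examples).
import numpy as np

rng = np.random.default_rng(1)
n = 3
eta = np.diag([-1.0] + [1.0] * n)

def boost(phi):                         # boost of rapidity phi in the (x^0, x^1) plane
    B = np.eye(1 + n)
    B[0, 0] = B[1, 1] = np.cosh(phi)
    B[0, 1] = B[1, 0] = np.sinh(phi)
    return B

def embed(R):                           # block-diagonal embedding of SO(n) into O^+(1, n)
    M = np.eye(1 + n)
    M[1:, 1:] = R
    return M

def random_SOn():                       # random rotation via QR, orientation fixed
    Q, _ = np.linalg.qr(rng.normal(size=(n, n)))
    if np.linalg.det(Q) < 0:
        Q[:, 0] *= -1
    return Q

for _ in range(1_000):
    Lam = embed(random_SOn()) @ boost(rng.normal()) @ embed(random_SOn())
    assert Lam[0, 0] > 0                # Lam is orthochronous
    v = rng.normal(size=1 + n)
    v[0] = np.linalg.norm(v[1:]) + rng.uniform(0.1, 2.0)   # future-directed, time-like
    assert (Lam @ v)[0] > 0             # its image is still future-directed
```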
Claim 4. $O^+(1,n)$ is a subgroup.
Proof: the identity clearly belongs to it. For closure under inverses, note that $\Lambda^T\eta\Lambda = \eta$ gives $\Lambda^{-1} = \eta\Lambda^T\eta$, whose $00$ entry is again $\Lambda_{00} > 0$. It remains to show closure under matrix multiplication. Let $\Lambda,\Lambda'$ be elements of $O^+(1,n)$. The $00$ component of the product $\Lambda\Lambda'$ is $s_0^T\Lambda\Lambda' s_0 = -m(\Lambda\Lambda's_0, s_0)$. That is to say, for $\Lambda\Lambda'$ to have positive $00$ component it suffices that $m(\Lambda\Lambda's_0,s_0) < 0$. But $\Lambda' s_0$ (the first column of $\Lambda'$) is time-like with positive $0$ component, so this follows from Claim 3 applied to $\Lambda$.
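In matrix form both group operations are easy to check on examples (the `boost` helper with an `axis` argument below is mine): the $00$ entry of a product of orthochronous elements stays $\geq 1$, and the inverse $\eta\Lambda^T\eta$ has the same $00$ entry as $\Lambda$.

```python
# A small check (helper names mine) of the two group operations in Claim 4:
# products and inverses of orthochronous Lorentz matrices stay orthochronous.
import numpy as np

n = 3
eta = np.diag([-1.0] + [1.0] * n)

def boost(phi, axis=1):                 # boost of rapidity phi along a spatial axis
    B = np.eye(1 + n)
    B[0, 0] = B[axis, axis] = np.cosh(phi)
    B[0, axis] = B[axis, 0] = np.sinh(phi)
    return B

Lam, Lam2 = boost(0.9, axis=1), boost(-1.4, axis=2)   # two elements of O^+(1, n)
prod = Lam @ Lam2

assert prod[0, 0] >= 1.0                    # closure under products (Claim 4)
inv = eta @ prod.T @ eta                    # Lambda^{-1} = eta Lambda^T eta
assert np.allclose(inv @ prod, np.eye(1 + n))
assert np.isclose(inv[0, 0], prod[0, 0])    # so the inverse is orthochronous as well
```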
The above implies that $O^+(1,n)$ and $SO(1,n)$ are subgroups of $O(1,n)$. Hence their intersection $SO^+(1,n) = O^+(1,n) \cap SO(1,n)$ is a subgroup of $O(1,n)$. Furthermore, since $\det$ and the sign of $\Lambda_{00}$ are constant on connected sets and both equal $+1$ at the identity, the connected component of the identity of $O(1,n)$ is entirely contained in $SO^+(1,n)$. Thus it remains to show that $SO^+(1,n)$ is (path) connected. One can in principle show this by considering the exponential map applied to the corresponding Lie algebra, but a more intuitive way is as follows:
using the above characterisation of $\Lambda$ as a frame, it suffices to find a continuous family of frames connecting the standard basis $\{s_0, s_1, \ldots, s_n\}$ to the basis $\{e_0, e_1, \ldots, e_n\}$. A path can be sketched as follows (see also the numerical sketch below): first apply an $SO(n)$ rotation of the spatial axes to $\{e_0, \ldots, e_n\}$ so that $e_0$ lies in the span of $\{s_0, s_1\}$; this can be done continuously since $SO(n)$ is path connected. Then a hyperbolic translation (Lorentz boost) in the $\{s_0,s_1\}$ plane brings $e_0$ back to $s_0$; this step requires that $\Lambda\in O^+(1,n)$. Now finish with another $SO(n)$ rotation aligning the remaining axes with $\{s_1, \ldots, s_n\}$. This again uses the fact that $SO(n)$ is connected, and it uses $\Lambda\in SO^+(1,n)$, so that once $e_0$ is "brought back" to $s_0$, the remaining axes form a frame with the same orientation as $\{s_1, \ldots, s_n\}$.
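Here is a sketch of that path in code, under my own conventions (numpy and scipy, with $n = 3$; the decomposition routine, the Rodrigues-style helper, and the sample $\Lambda$ are all mine): it factors a given $\Lambda \in SO^+(1,3)$ as (rotation)$\cdot$(boost)$\cdot$(rotation) and then shrinks each factor back to the identity, which is exactly the rotate–boost–rotate path described above.

```python
# A sketch (numpy + scipy; all helper names and the sample Lambda are mine) of
# the rotate-boost-rotate path: factor Lambda = embed(R1) @ boost(phi) @ embed(R2)
# with R1, R2 in SO(n), then run each factor from the identity to its final value.
import numpy as np
from scipy.linalg import expm, logm

n = 3
eta = np.diag([-1.0] + [1.0] * n)

def boost(phi, axis=1):                 # boost of rapidity phi along a spatial axis
    B = np.eye(1 + n)
    B[0, 0] = B[axis, axis] = np.cosh(phi)
    B[0, axis] = B[axis, 0] = np.sinh(phi)
    return B

def embed(R):                           # block-diagonal embedding of SO(n) into SO^+(1, n)
    M = np.eye(1 + n)
    M[1:, 1:] = R
    return M

def rotation_taking_x_to(nhat):
    """Minimal rotation in SO(3) taking (1,0,0) to the unit vector nhat (nhat != -x assumed)."""
    v, c = np.cross([1.0, 0.0, 0.0], nhat), nhat[0]
    V = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0.0]])
    return np.eye(3) + V + V @ V / (1.0 + c)

# A sample element of SO^+(1, 3): two boosts along different axes.
Lam = boost(0.8, axis=1) @ boost(1.1, axis=2)

# Step 1: the first column e_0 of Lambda equals (cosh(phi), sinh(phi) * nhat).
e0 = Lam[:, 0]
phi = np.arccosh(e0[0])                 # possible because Lambda_00 >= 1
nhat = e0[1:] / np.sinh(phi) if np.sinh(phi) > 1e-12 else np.array([1.0, 0.0, 0.0])
R1 = rotation_taking_x_to(nhat)         # embed(R1) @ boost(phi) then also has first column e_0
# Step 2: what is left over fixes s_0, hence is a spatial rotation R2.
R2 = (boost(-phi) @ embed(R1.T) @ Lam)[1:, 1:]
assert np.allclose(embed(R1) @ boost(phi) @ embed(R2), Lam)

# The path: scale the generators of R1, R2 and the rapidity by t in [0, 1].
r1, r2 = np.real(logm(R1)), np.real(logm(R2))      # skew-symmetric generators
def path(t):
    return embed(expm(t * r1)) @ boost(t * phi) @ embed(expm(t * r2))

assert np.allclose(path(0.0), np.eye(1 + n)) and np.allclose(path(1.0), Lam)
for t in np.linspace(0.0, 1.0, 50):                # the path stays inside SO^+(1, n)
    L = path(t)
    assert np.allclose(L.T @ eta @ L, eta)
    assert L[0, 0] >= 1.0 - 1e-9 and np.linalg.det(L) > 0
```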