I have a matrix-valued inhomogeneous linear ODE
$X' = F(t)X + G(t)$, $X(0) = I_{n \times n}$,
$F(t),G(t) \in \mathbb{R}^{n \times n}$,
and the entries of $F$ and $G$ are continuous functions. What assumptions do I need on $F$ and $G$ to ensure that $X(t)$ is invertible on a given interval $[0,T]$? (Without the inhomogeneous term $G$, the fundamental solution is invertible for all $t$, so the question is really about what $G$ can do.) If anyone could provide a reference on the topic, I'd appreciate it.
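To experiment with this numerically, here is a small sketch (not part of the question, and the particular $F$ and $G$ are made up for illustration) that integrates the matrix ODE entrywise and monitors $\det X(t)$ on $[0,T]$:

```python
import numpy as np
from scipy.integrate import solve_ivp

n = 2
T = 1.0

# Hypothetical example coefficients; any continuous F, G would do.
F = lambda t: np.array([[0.0, 1.0], [-1.0, 0.0]])          # skew-symmetric part
G = lambda t: np.array([[np.cos(t), 0.0], [0.0, np.sin(t)]])  # inhomogeneous term

def rhs(t, x):
    # Flatten the n-by-n matrix ODE X' = F(t) X + G(t) into a vector ODE.
    X = x.reshape(n, n)
    return (F(t) @ X + G(t)).ravel()

sol = solve_ivp(rhs, (0.0, T), np.eye(n).ravel(),
                dense_output=True, rtol=1e-9, atol=1e-12)

# Sample det X(t) along [0, T]; X is invertible where the det is nonzero.
ts = np.linspace(0.0, T, 201)
dets = [np.linalg.det(sol.sol(t).reshape(n, n)) for t in ts]
print(min(dets))
```

For this particular choice the determinant stays positive on $[0,1]$, but of course a numerical check on one example proves nothing in general; it just helps hunt for counterexamples to a conjectured condition.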
Edit: In one dimension, it's sufficient that $G$ is positive on $[0,T]$. Could something like this carry over to higher dimensions?
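For what it's worth, the one-dimensional claim follows from variation of constants: with $n=1$ the solution is
$$X(t) = e^{\int_0^t F(s)\,ds}\left(1 + \int_0^t e^{-\int_0^s F(u)\,du}\, G(s)\,ds\right),$$
and if $G \ge 0$ on $[0,T]$ the bracket is at least $1$, so $X(t) > 0$ (hence invertible) there. It's this argument I don't see how to generalize, since in higher dimensions the propagator and $G$ need not commute.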