I don't see any problem, or perhaps I'm missing something in the question. For example, if $0 \le T$ and one has
$$\frac{d}{dt}(z(t)) \le u(t) z(t) + v(t)$$
on $I=[0, T]$ with $u, v$ continuous, then
$$
z(T)\le z(0)e^{\int_0^T u(s) ds}+\int_0^T e^{\int_s^T u(r)dr} v(s) ds
$$
A natural question is: what weaker conditions on $u, v$ suffice for this to hold? I would suggest $u, v \in L^1(I)$.
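As a numerical sanity check of the bound (a sketch, not part of the argument above), one can integrate a concrete differential inequality and compare the endpoint value against the right-hand side. The choices $u(t)=\cos t$, $v(t)=1$, the slack term $0.1$, and $T=3$ below are illustrative assumptions, as is the use of RK4 and the trapezoid rule:

```python
import math

# Example inequality: z' = u z + v - slack <= u z + v, so the hypothesis holds.
u = math.cos
v = lambda t: 1.0
slack = 0.1

def rhs(t, z):
    return u(t) * z + v(t) - slack

def rk4(z0, T, n=2000):
    """Integrate z' = rhs(t, z) from 0 to T with classical Runge-Kutta."""
    h, z, t = T / n, z0, 0.0
    for _ in range(n):
        k1 = rhs(t, z)
        k2 = rhs(t + h / 2, z + h / 2 * k1)
        k3 = rhs(t + h / 2, z + h / 2 * k2)
        k4 = rhs(t + h, z + h * k3)
        z += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return z

def gronwall_bound(z0, T, n=2000):
    """z(0) e^{int_0^T u} + int_0^T e^{int_s^T u} v(s) ds, by trapezoid rule."""
    h = T / n
    U = [0.0]                                   # U[i] ~ int_0^{ih} u(r) dr
    for i in range(n):
        s = i * h
        U.append(U[-1] + h / 2 * (u(s) + u(s + h)))
    f = [math.exp(U[-1] - U[i]) * v(i * h) for i in range(n + 1)]
    integral = h * (sum(f) - 0.5 * (f[0] + f[-1]))
    return z0 * math.exp(U[-1]) + integral

z0, T = 1.0, 3.0
zT = rk4(z0, T)
bound = gronwall_bound(z0, T)
print(zT <= bound)
```

With the slack term the inequality is strict, so the computed $z(T)$ should come out comfortably below the bound.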
To explain where the exponential factors in the bound come from, let us recall the theory of the linear first-order equation
$$\frac{d}{dt}(z(t)) = u(t) z(t) + v(t)$$
If $v$ is identically $0$, the general solution is $z(t) = C z_0(t) = C e^{U(t)}$, where $U$ is an antiderivative of $u$ and $C$ is an arbitrary constant. When $v$ is not identically $0$, variation of constants replaces $C$ by an arbitrary antiderivative of $\frac{v(t)}{z_0(t)} = v(t) e^{-U(t)}$. The solution becomes
$$
z(t) = \left(C + \int_0^t v(s)e^{-U(s)}ds\right) e^{U(t)}
$$
Now choosing $U(t) = \int_0^t u(r) dr$ yields the above formula because
$$
e^{U(t)-U(s)} = e^{\int_0^tu(r)dr - \int_0^s u(r) dr} = e^{\int_s^t u(r) dr}
$$