For a function defined on a finite-dimensional normed vector space (this also extends to Fréchet derivatives on Banach spaces), we can think of the derivative as a linear approximation to the function. Let $(X,\|\cdot\|_X)$ and $(Y,\|\cdot\|_Y)$ be Banach spaces. A function $f:X\to Y$ is said to be differentiable at a point $x_0\in X$ if there is a bounded linear transformation $L_{x_0}:X\to Y$ such that
$ \lim_{X\ni h \to 0} \frac{\| f(x_0 + h) - f(x_0) - L_{x_0}(h) \|_Y}{\|h\|_X} = 0 $
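As a sanity check (not part of your question, but worth noting): for $X = Y = \mathbb{R}$, the candidate map is $L_{x_0}(h) = f'(x_0)h$, and the condition reduces to the usual definition of the derivative, since
$ \lim_{h \to 0} \frac{\left| f(x_0 + h) - f(x_0) - f'(x_0)h \right|}{|h|} = 0 \iff \lim_{h \to 0} \frac{f(x_0 + h) - f(x_0)}{h} = f'(x_0). $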
Now, in your case, you take $X = M_n$, the space of $n\times n$ matrices, and you pick a norm $\|\cdot\|$ on it (any matrix norm will do: since $M_n$ is a finite-dimensional vector space, all norms on it are equivalent). Your function takes values, I guess, in $\mathbb{C}$, so we will just use the usual norm $|\cdot|$ there. Linear maps between finite-dimensional normed spaces are automatically bounded, so we don't have to worry about that adjective.
To be more explicit, let us choose the Frobenius norm on $M_n$. Then the condition for differentiability at $M\in M_n$ is that there exists a linear map $L_{M}: M_n \to \mathbb{C}$ such that
$ \lim_{M_n\ni H \to 0} \frac{ \left| f(M + H) - f(M) - L_M(H) \right|}{\sqrt{\sum_{1\leq i,j\leq n} |h_{ij}|^2}} = 0$
where $(h_{ij})$ are the matrix entries of $H$.
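Since you haven't fixed a particular $f$, here is a numerical sketch of this definition for one concrete choice (my assumption, not from your question): $f(M) = \det M$, whose Fréchet derivative is $L_M(H) = \operatorname{tr}(\operatorname{adj}(M)\,H)$ by Jacobi's formula. The ratio in the limit above should shrink roughly linearly as $H = tH_0 \to 0$:

```python
import numpy as np

# Illustrative choice (not from the question): f(M) = det(M).
# By Jacobi's formula its Frechet derivative at M is
#   L_M(H) = tr(adj(M) H).
def f(M):
    return np.linalg.det(M)

def L(M, H):
    # adjugate of an invertible M: adj(M) = det(M) * M^{-1}
    adjM = np.linalg.det(M) * np.linalg.inv(M)
    return np.trace(adjM @ H)

rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))
H = rng.standard_normal((3, 3))

# |f(M + tH) - f(M) - L_M(tH)| / ||tH||_F should tend to 0 as t -> 0
# (np.linalg.norm on a matrix defaults to the Frobenius norm).
for t in [1e-1, 1e-2, 1e-3, 1e-4]:
    num = abs(f(M + t * H) - f(M) - L(M, t * H))
    den = np.linalg.norm(t * H)
    print(f"t = {t:.0e}, ratio = {num / den:.3e}")
```

The printed ratios decrease by roughly a factor of $10$ per step, consistent with a remainder of order $\|H\|^2$ in the numerator against $\|H\|$ in the denominator.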