EDIT:
So after all the effort I put in a couple of days back to prove this, I found out today that the result was already published earlier this year: Ik-Pyo Kim, "LDU decomposition of an extension matrix of the Pascal matrix," Linear Algebra and Its Applications 434(10), 2187–2196, 2011, which can be found here.
The $LU$ factorization of $M_{n+1}$ is given below. Let $x$ be such that $x^2 = 1 + \frac{c}{ab}$. Then we have that $M_{n+1} = L_{n+1} U_{n+1}$ where $\small L_{n+1} = \begin{bmatrix}1 & 0 & 0 & 0 & 0 & \cdots & 0 & 0\\a & ax & 0 & 0 & 0 & \cdots & 0 & 0\\ a^2 & 2a^2x & a^2x^2 & 0 & 0 & \cdots & 0 & 0\\ a^3 & 3a^3x & 3a^3x^2 & a^3x^3 & 0 & \cdots & 0 & 0\\ a^4 & 4a^4x & 6a^4x^2 & 4a^4x^3 & a^4x^4 & \cdots & 0 & 0\\ \vdots & \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\ \vdots & \vdots & \vdots & \vdots & \vdots & \cdots & \ddots & \vdots \\ a^n & \binom{n}{1}a^nx^1 & \binom{n}{2}a^nx^2 & \binom{n}{3}a^nx^3 & \binom{n}{4}a^nx^4 & \cdots & \binom{n}{n-1}a^nx^{n-1} & a^nx^n \end{bmatrix}$
$\small U_{n+1}^T = \begin{bmatrix}1 & 0 & 0 & 0 & 0 & \cdots & 0 & 0\\b & bx & 0 & 0 & 0 & \cdots & 0 & 0\\ b^2 & 2b^2x & b^2x^2 & 0 & 0 & \cdots & 0 & 0\\ b^3 & 3b^3x & 3b^3x^2 & b^3x^3 & 0 & \cdots & 0 & 0\\ b^4 & 4b^4x & 6b^4x^2 & 4b^4x^3 & b^4x^4 & \cdots & 0 & 0\\ \vdots & \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\ \vdots & \vdots & \vdots & \vdots & \vdots & \cdots & \ddots & \vdots \\ b^n & \binom{n}{1}b^nx^1 & \binom{n}{2}b^nx^2 & \binom{n}{3}b^nx^3 & \binom{n}{4}b^nx^4 & \cdots & \binom{n}{n-1}b^nx^{n-1} & b^nx^n \end{bmatrix}$
This gives us $\det(L_n) = (ax)^{\frac{n(n-1)}{2}}$ and $\det(U_n) = (bx)^{\frac{n(n-1)}{2}}$.
Hence, $\det(M_n) = (ax)^{\frac{n(n-1)}{2}}(bx)^{\frac{n(n-1)}{2}} = (abx^2)^{\frac{n(n-1)}{2}} = (ab+c)^{\frac{n(n-1)}{2}}$.
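As a quick sanity check, the determinant formula can be verified for small $n$. The Python sketch below builds $M_n$ from the recursive definition of $M$ (the recurrence $M(i,j) = aM(i-1,j) + bM(i,j-1) + cM(i-1,j-1)$ used later in the combinatorial proof, with $M(1,1)=1$ and out-of-range terms taken as $0$, consistent with the boundary values $a^{i-1}$, $b^{j-1}$ implied by the factors above) and compares $\det(M_n)$ with $(ab+c)^{n(n-1)/2}$ using exact rational arithmetic:

```python
from fractions import Fraction

def build_M(n, a, b, c):
    """Build the n x n matrix M from the recurrence
    M(i,j) = a*M(i-1,j) + b*M(i,j-1) + c*M(i-1,j-1),
    with M(1,1) = 1 and out-of-range terms taken as 0."""
    M = [[0] * n for _ in range(n)]
    M[0][0] = 1
    for i in range(n):
        for j in range(n):
            if i == 0 and j == 0:
                continue
            up   = M[i-1][j]   if i > 0 else 0
            left = M[i][j-1]   if j > 0 else 0
            diag = M[i-1][j-1] if i > 0 and j > 0 else 0
            M[i][j] = a * up + b * left + c * diag
    return M

def det(A):
    """Exact determinant by Gaussian elimination over the rationals."""
    A = [[Fraction(x) for x in row] for row in A]
    n, d = len(A), Fraction(1)
    for k in range(n):
        piv = next(r for r in range(k, n) if A[r][k] != 0)
        if piv != k:
            A[k], A[piv] = A[piv], A[k]
            d = -d
        d *= A[k][k]
        for r in range(k + 1, n):
            f = A[r][k] / A[k][k]
            for col in range(k, n):
                A[r][col] -= f * A[k][col]
    return d

a, b, c = 2, 3, 5
for n in range(1, 7):
    assert det(build_M(n, a, b, c)) == (a * b + c) ** (n * (n - 1) // 2)
print("determinant formula verified for n = 1..6")
```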
Also note that $L_n = A_n X_n$ and $U_n^T = B_n X_n$ where $A_{n+1} = \begin{bmatrix} 1 & 0 & 0 & 0 & 0 & \cdots & 0 & 0\\ 0 & a & 0 & 0 & 0 & \cdots & 0 & 0\\ 0 & 0 & a^2 & 0 & 0 & \cdots & 0 & 0\\ 0 & 0 & 0 & a^3 & 0 & \cdots & 0 & 0\\ 0 & 0 & 0 & 0 & a^4 & \cdots & 0 & 0\\ \vdots & \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\ \vdots & \vdots & \vdots & \vdots & \vdots & \cdots & \ddots & \vdots\\ 0 & 0 & 0 & 0 & 0& \cdots & 0 & a^n\\ \end{bmatrix}$ $B_{n+1} = \begin{bmatrix} 1 & 0 & 0 & 0 & 0 & \cdots & 0 & 0\\ 0 & b & 0 & 0 & 0 & \cdots & 0 & 0\\ 0 & 0 & b^2 & 0 & 0 & \cdots & 0 & 0\\ 0 & 0 & 0 & b^3 & 0 & \cdots & 0 & 0\\ 0 & 0 & 0 & 0 & b^4 & \cdots & 0 & 0\\ \vdots & \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\ \vdots & \vdots & \vdots & \vdots & \vdots & \cdots & \ddots & \vdots\\ 0 & 0 & 0 & 0 & 0& \cdots & 0 & b^n\\ \end{bmatrix}$ $X_{n+1} = \begin{bmatrix}1 & 0 & 0 & 0 & 0 & \cdots & 0 & 0\\1 & x & 0 & 0 & 0 & \cdots & 0 & 0\\ 1 & 2x & x^2 & 0 & 0 & \cdots & 0 & 0\\ 1 & 3x & 3x^2 & x^3 & 0 & \cdots & 0 & 0\\ 1 & 4x & 6x^2 & 4x^3 & x^4 & \cdots & 0 & 0\\ \vdots & \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\ \vdots & \vdots & \vdots & \vdots & \vdots & \cdots & \ddots & \vdots \\ 1 & \binom{n}{1}x^1 & \binom{n}{2}x^2 & \binom{n}{3}x^3 & \binom{n}{4}x^4 & \cdots & \binom{n}{n-1}x^{n-1} & x^n \end{bmatrix}$ Further $X_n$ can be written as $C_n \tilde{X}_n$ where $C_{n+1} = \begin{bmatrix}1 & 0 & 0 & 0 & 0 & \cdots & 0 & 0\\1 & 1 & 0 & 0 & 0 & \cdots & 0 & 0\\ 1 & 2 & 1 & 0 & 0 & \cdots & 0 & 0\\ 1 & 3 & 3 & 1 & 0 & \cdots & 0 & 0\\ 1 & 4 & 6 & 4 & 1 & \cdots & 0 & 0\\ \vdots & \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\ \vdots & \vdots & \vdots & \vdots & \vdots & \cdots & \ddots & \vdots \\ 1 & \binom{n}{1} & \binom{n}{2} & \binom{n}{3} & \binom{n}{4} & \cdots & \binom{n}{n-1} & 1 \end{bmatrix}$ and $\tilde{X}_{n+1} = \begin{bmatrix} 1 & 0 & 0 & 0 & 0 & \cdots & 0 & 0\\ 0 & x & 0 & 0 & 
0 & \cdots & 0 & 0\\ 0 & 0 & x^2 & 0 & 0 & \cdots & 0 & 0\\ 0 & 0 & 0 & x^3 & 0 & \cdots & 0 & 0\\ 0 & 0 & 0 & 0 & x^4 & \cdots & 0 & 0\\ \vdots & \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\ \vdots & \vdots & \vdots & \vdots & \vdots & \cdots & \ddots & \vdots\\ 0 & 0 & 0 & 0 & 0& \cdots & 0 & x^n\\ \end{bmatrix}$ This gives $M_n = A_n C_n \tilde{X}_n^2 C_n^T B_n^T$. Setting $D_n = \tilde{X}_n^2$ (and noting that $B_n^T = B_n$ since $B_n$ is diagonal), we get a nice decomposition for $M_n$ as $M_n = A_n C_n D_n C_n^T B_n.$ Note that $A_n,B_n,D_n$ are diagonal matrices with $A_n(i,i) = a^{i-1}$, $B_n(i,i) = b^{i-1}$ and $D_n(i,i) = \left( 1+\frac{c}{ab} \right)^{i-1}$.
$C_n$ is the lower triangular matrix of binomial coefficients, with $C_n(i,j) = \left \{ \begin{array}{lr} \binom{i-1}{j-1} & j \leq i\\ 0 & j > i \end{array}\right.$.
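The full decomposition can likewise be checked numerically. A minimal Python sketch (with 0-indexed matrices, so the diagonals read $A(i,i)=a^i$, $B(j,j)=b^j$, $D(k,k)=(1+c/(ab))^k$; exact rationals are used since $D$ has fractional entries, and $M$ is built from its recurrence with $M(1,1)=1$):

```python
from fractions import Fraction
from math import comb

def M_decomp(n, a, b, c):
    """(A C D C^T B)(i,j): since A, B, D are diagonal, the product
    collapses to a single sum over k, with C(i,k) = binom(i,k)."""
    x2 = 1 + Fraction(c, a * b)          # x^2 = 1 + c/(ab)
    return [[a**i * b**j * sum(comb(i, k) * comb(j, k) * x2**k
                               for k in range(min(i, j) + 1))
             for j in range(n)] for i in range(n)]

def M_recurrence(n, a, b, c):
    """M from M(i,j) = a M(i-1,j) + b M(i,j-1) + c M(i-1,j-1)."""
    M = [[0] * n for _ in range(n)]
    M[0][0] = 1
    for i in range(n):
        for j in range(n):
            if (i, j) != (0, 0):
                M[i][j] = (a * (M[i-1][j] if i else 0)
                           + b * (M[i][j-1] if j else 0)
                           + c * (M[i-1][j-1] if i and j else 0))
    return M

a, b, c = 2, 3, 5
assert M_decomp(6, a, b, c) == M_recurrence(6, a, b, c)
print("decomposition matches the recurrence for n = 6")
```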
EDIT:
I computed this decomposition by running the usual $LU$ algorithm. But now that we have it, it should be possible to reverse engineer other proofs, similar to the known proofs for the Pascal matrix.
Proof
Let me first write out the proof based on the algorithm I used to construct these $L$ and $U$ factors.
Consider the elimination matrix $E_{n+1}^y = \begin{pmatrix} 1 & 0 & 0 & \cdots & 0\\ -y & 1 & 0 & \cdots & 0\\ 0 & -y & 1 & \cdots & 0\\ \vdots & & \ddots & \ddots & \vdots\\ 0 & \cdots & 0 & -y & 1 \end{pmatrix}$
Note that $E_{n+1}^a L_{n+1} = \begin{pmatrix}1 & 0\\ 0 & ax L_n \end{pmatrix}$ and $E_{n+1}^b U_{n+1}^T = \begin{pmatrix}1 & 0\\ 0 & bx U_n^T \end{pmatrix}$
Hence, $E_{n+1}^a L_{n+1} U_{n+1} (E_{n+1}^b )^T = \begin{pmatrix}1 & 0\\ 0 & abx^2 M_n \end{pmatrix}$.
We claim that the above matrix equals $E_{n+1}^a M_{n+1} (E_{n+1}^b)^T$.
Consider the $(i,j)^{th}$ entry of $E_{n+1}^a M_{n+1}$ for $i > 1$. This is $M_{n+1}(i,j)-a M_{n+1}(i-1,j)$.
Hence, for $i,j > 1$, the $(i,j)^{th}$ entry of $E_{n+1}^a M_{n+1} (E_{n+1}^b)^T$ is $\begin{align} M_{n+1}(i,j)-a M_{n+1}(i-1,j) -b \left( M_{n+1}(i,j-1)-a M_{n+1}(i-1,j-1) \right)\\ = M_{n+1}(i,j) - a M_{n+1}(i-1,j) -b M_{n+1}(i,j-1) + ab M_{n+1}(i-1,j-1) \\ = \left( M_{n+1}(i,j) - a M_{n+1}(i-1,j) -b M_{n+1}(i,j-1) \right) + ab M_{n+1}(i-1,j-1) \\ = c M_{n+1}(i-1,j-1) + ab M_{n+1}(i-1,j-1) = abx^2 M_{n+1}(i-1,j-1) \end{align}$ where the third equality uses the recurrence $M(i,j) = aM(i-1,j) + bM(i,j-1) + cM(i-1,j-1)$ and the last uses $ab + c = abx^2$.
But $M_{n+1}(i-1,j-1) = M_n(i-1,j-1)$. Hence, we get that $M_{n+1}(i,j)-a M_{n+1}(i-1,j) -b \left( M_{n+1}(i,j-1)-a M_{n+1}(i-1,j-1) \right) = abx^2 M_{n}(i-1,j-1).$
Hence, combining this with induction on $n$, we get the complete proof. (Note that the matrix $E_n^y$ is always invertible, which lets us conclude $M_{n+1} = L_{n+1}U_{n+1}$ from the equality of the transformed matrices.)
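The key identity $E_{n+1}^a M_{n+1} (E_{n+1}^b)^T = \begin{pmatrix}1 & 0\\ 0 & abx^2 M_n \end{pmatrix}$ can also be confirmed numerically. A Python sketch (using $abx^2 = ab + c$, and building $M$ from its recurrence with $M(1,1)=1$):

```python
def build_M(n, a, b, c):
    """M(i,j) = a M(i-1,j) + b M(i,j-1) + c M(i-1,j-1), M(1,1) = 1."""
    M = [[0] * n for _ in range(n)]
    M[0][0] = 1
    for i in range(n):
        for j in range(n):
            if (i, j) != (0, 0):
                M[i][j] = (a * (M[i-1][j] if i else 0)
                           + b * (M[i][j-1] if j else 0)
                           + c * (M[i-1][j-1] if i and j else 0))
    return M

def elim(n, y):
    """E_n^y: identity with -y on the first subdiagonal."""
    return [[1 if i == j else (-y if i == j + 1 else 0)
             for j in range(n)] for i in range(n)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(col) for col in zip(*A)]

a, b, c, n = 2, 3, 5, 6
M1 = build_M(n + 1, a, b, c)
P = matmul(matmul(elim(n + 1, a), M1), transpose(elim(n + 1, b)))
# The (1,1) entry is 1, the rest of the first row and column vanish,
# and the trailing n x n block is (ab + c) * M_n.
assert P[0][0] == 1
assert all(P[0][j] == 0 == P[j][0] for j in range(1, n + 1))
Mn = build_M(n, a, b, c)
assert all(P[i+1][j+1] == (a*b + c) * Mn[i][j]
           for i in range(n) for j in range(n))
print("elimination identity verified")
```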
Proof of the decomposition $M_n = A_n C_n D_n C_n^T B_n$ based on a combinatorial argument.
Let us first fix what we are counting.
Consider the positive integer lattice. Starting at a point $(m,n)$, the allowed moves are one step right to $(m+1,n)$, one step up to $(m,n+1)$, or one diagonal step up and to the right to $(m+1,n+1)$.
The cost of a horizontal move is $a$, the cost of a vertical move is $b$, and the cost of a diagonal move is $c$. The total cost of a path is the product of the costs of its individual moves, i.e. if a path takes $h$ horizontal moves, $v$ vertical moves and $d$ diagonal moves, its total cost is $a^h b^v c^d$. In particular, the cost of the empty path (staying where you are) is $1$.
Claim: $M(i,j)$ gives the sum of the costs over all paths to move from $(1,1)$ to $(i,j)$.
Proof: This immediately follows from the recursive definition of $M(i,j)$. On the last step before reaching $(i,j)$, we are at $(i-1,j)$, $(i,j-1)$, or $(i-1,j-1)$. Hence, the total cost over all paths from $(1,1)$ to $(i,j)$ is $aM(i-1,j) + bM(i,j-1) + cM(i-1,j-1)$, which is exactly $M(i,j)$.
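This claim can be sanity-checked by brute force: enumerate every path from $(1,1)$ to $(i,j)$ move by move, sum the path costs, and compare with the value produced by the recurrence (a Python sketch, taking $M(1,1)=1$):

```python
def all_path_costs(i, j, a, b, c):
    """Enumerate every lattice path from (1,1) to (i,j) explicitly and
    collect each path's cost a^h * b^v * c^d."""
    costs = []
    def walk(p, q, cost):
        if (p, q) == (i, j):
            costs.append(cost)
            return
        if p < i:
            walk(p + 1, q, cost * a)      # horizontal step, cost a
        if q < j:
            walk(p, q + 1, cost * b)      # vertical step, cost b
        if p < i and q < j:
            walk(p + 1, q + 1, cost * c)  # diagonal step, cost c
    walk(1, 1, 1)
    return costs

def M_recurrence(n, a, b, c):
    """M(i,j) = a M(i-1,j) + b M(i,j-1) + c M(i-1,j-1), M(1,1) = 1."""
    M = [[0] * n for _ in range(n)]
    M[0][0] = 1
    for p in range(n):
        for q in range(n):
            if (p, q) != (0, 0):
                M[p][q] = (a * (M[p-1][q] if p else 0)
                           + b * (M[p][q-1] if q else 0)
                           + c * (M[p-1][q-1] if p and q else 0))
    return M

a, b, c = 2, 3, 5
M = M_recurrence(5, a, b, c)
for i in range(1, 6):
    for j in range(1, 6):
        assert sum(all_path_costs(i, j, a, b, c)) == M[i-1][j-1]
print("path-sum interpretation verified up to (5,5)")
```

For example, the three paths from $(1,1)$ to $(2,2)$ have costs $ab$, $ab$ and $c$, recovering $M(2,2) = 2ab + c$.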
So all we need to prove is that $(A_n C_n D_n C_n^T B_n) (i,j)$ also gives the total cost of moving from $(1,1)$ to $(i,j)$. Since $A_n,B_n,D_n$ are diagonal matrices, the four summations in the product collapse to just one, i.e. $ \begin{align} (A_n C_n D_n C_n^T B_n) (i,j) & = \sum_{k_1} \sum_{k_2} \sum_{k_3} \sum_{k_4} A_n(i,k_1) C_n(k_1,k_2) D_n(k_2,k_3) C_n(k_4,k_3) B_n(k_4,j)\\ & = \sum_{k_1} \sum_{k} \sum_{k_4} A_n(i,k_1) C_n(k_1,k) D_n(k,k) C_n(k_4,k) B_n(k_4,j)\\ & = \sum_{k} A_n(i,i) C_n(i,k) D_n(k,k) C_n(j,k) B_n(j,j)\\ & = \sum_{k=1}^{\min(i,j)} a^{i-1} \binom{i-1}{k-1} (x^2)^{k-1} \binom{j-1}{k-1} b^{j-1}\\ & = \sum_{k=1}^{\min(i,j)} a^{i-1} \binom{i-1}{k-1} \left( 1 + \frac{c}{ab} \right)^{k-1} \binom{j-1}{k-1} b^{j-1}\\ & = \sum_{k=1}^{\min(i,j)} a^{i-k} \binom{i-1}{k-1} \left( ab + c \right)^{k-1} \binom{j-1}{k-1} b^{j-k}\\ & = \sum_{k=1}^{\min(i,j)} \sum_{r=0}^{k-1} a^{i-k} \binom{i-1}{k-1} \binom{k-1}{r} (ab)^{k-1-r} c^r \binom{j-1}{k-1} b^{j-k}\\ & = \sum_{r=0}^{\min(i,j)-1} a^{i-1-r} b^{j-1-r} c^r \left( \sum_{k=r+1}^{\min(i,j)} \binom{i-1}{k-1} \binom{k-1}{r} \binom{j-1}{k-1} \right) \end{align} $ All we need to prove now is that $\displaystyle \sum_{k=r+1}^{\min(i,j)} \binom{i-1}{k-1} \binom{k-1}{r} \binom{j-1}{k-1}$ counts the number of paths from $(1,1)$ to $(i,j)$ with exactly $r$ diagonal moves.
Equivalently (substituting $t = k - r - 1$), we need to prove that $\displaystyle \sum_{t=0}^{\min(i,j)-r-1} \binom{i-1}{t+r} \binom{t+r}{r} \binom{j-1}{t+r}$ counts the number of paths from $(1,1)$ to $(i,j)$ with exactly $r$ diagonal moves.
Counting the number of paths from $(1,1)$ to $(i,j)$ with $r$ diagonal moves: if we denote a horizontal move by $h$, a vertical move by $v$ and a diagonal move by $d$, then each such path is a word containing $(i-1-r)$ $h$'s, $(j-1-r)$ $v$'s and $r$ $d$'s, and every arrangement of these letters gives a distinct such path. Hence, the total number of paths from $(1,1)$ to $(i,j)$ with $r$ diagonal moves is $\frac{(i+j-r-2)!}{(i-1-r)!(j-1-r)!r!}.$
Hence all we need to prove is that $\displaystyle \sum_{t=0}^{\min(i,j)-r-1} \binom{i-1}{t+r} \binom{t+r}{r} \binom{j-1}{t+r} = \frac{(i+j-r-2)!}{(i-1-r)!(j-1-r)!r!}$
Claim: $\displaystyle \sum_{t=0}^{\min(i,j)-r-1} \binom{i-1}{t+r} \binom{t+r}{r} \binom{j-1}{t+r} = \frac{(i+j-r-2)!}{(i-1-r)!(j-1-r)!r!}$
First note that $\displaystyle \binom{i-1}{t+r} \binom{t+r}{r} = \binom{i-1}{r} \binom{i-1-r}{t}$.
(Can be proved algebraically or through a combinatorial argument of choosing $3$ sets of things in different orders.)
Hence, we get that $\displaystyle \sum_{t=0}^{\min(i,j)-r-1} \binom{i-1}{t+r} \binom{t+r}{r} \binom{j-1}{t+r} = \binom{i-1}{r} \sum_{t=0}^{\min(i,j)-r-1} \binom{i-1-r}{t} \binom{j-1}{t+r}$
Now $\displaystyle \sum_{t=0}^{\min(i,j)-r-1} \binom{i-1-r}{t} \binom{j-1}{t+r} = \sum_{t=0}^{\min(i,j)-r-1} \binom{i-1-r}{t} \binom{j-1}{j-1-t-r} = \binom{i+j-r-2}{j-r-1}$
The first equality above is just $\binom{n}{r} = \binom{n}{n-r}$, while the second is the Vandermonde identity: split $i+j-r-2$ objects into two sets of sizes $i-1-r$ and $j-1$, and count the ways of choosing $j-r-1$ of them.
Hence, we get that $\displaystyle \sum_{t=0}^{\min(i,j)-r-1} \binom{i-1}{t+r} \binom{t+r}{r} \binom{j-1}{t+r} = \binom{i-1}{r} \binom{i+j-r-2}{j-r-1}$
But $\displaystyle \binom{i-1}{r} \binom{i+j-r-2}{j-r-1} = \frac{(i+j-r-2)!}{(i-1-r)!(j-1-r)!r!}$ which can be seen through direct algebra or again a combinatorial argument which relies on picking things in different order.
Hence, we get that $\displaystyle \sum_{t=0}^{\min(i,j)-r-1} \binom{i-1}{t+r} \binom{t+r}{r} \binom{j-1}{t+r} = \frac{(i+j-r-2)!}{(i-1-r)!(j-1-r)!r!}$.
This completes the combinatorial proof.
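For extra confidence, the binomial identity can also be spot-checked numerically over a range of $i$, $j$, $r$ (a short Python sketch):

```python
from math import comb, factorial

def lhs(i, j, r):
    """Sum over t of C(i-1, t+r) * C(t+r, r) * C(j-1, t+r)."""
    return sum(comb(i - 1, t + r) * comb(t + r, r) * comb(j - 1, t + r)
               for t in range(min(i, j) - r))

def rhs(i, j, r):
    """Multinomial count (i+j-r-2)! / ((i-1-r)! (j-1-r)! r!)."""
    return (factorial(i + j - r - 2)
            // (factorial(i - 1 - r) * factorial(j - 1 - r) * factorial(r)))

for i in range(1, 9):
    for j in range(1, 9):
        for r in range(min(i, j)):
            assert lhs(i, j, r) == rhs(i, j, r)
print("binomial identity verified for i, j < 9")
```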