Here is another way to see this. Define $R_i = A_{ii} - \sum_{j\neq i} |A_{ij}|$. Your condition is that $R_i>0$ for all $i$.
Let $s_{ij} = \frac{A_{ij}}{|A_{ij}|}$ be the sign of $A_{ij}$ (when $A_{ij}=0$, set $s_{ij}=1$; the corresponding term below vanishes anyway). Then you can check algebraically (just match coefficients, using the symmetry $A_{ij}=A_{ji}$) that for all $x\in\mathbb{R}^n$ \[x^T A x = \sum_{i=1}^n R_ix_i^2 + \sum_{i=1}^n \sum_{j>i} |A_{ij}| (x_i + s_{ij} x_j)^2. \] Since squares are nonnegative and the $R_i$ are assumed positive, every summand is nonnegative for all $x\in\mathbb{R}^n$. Furthermore, if $x\neq 0$ then $x_i\neq 0$ for some $i$, so $x^TAx\geq R_ix_i^2>0$. Therefore $A$ is positive definite.
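If you want to sanity-check the identity numerically, here is a minimal NumPy sketch; the way the random strictly diagonally dominant test matrix is built is just one arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random symmetric matrix, then inflate the diagonal so that
# each R_i = A_ii - sum_{j != i} |A_ij| comes out strictly positive.
n = 5
B = rng.standard_normal((n, n))
A = (B + B.T) / 2
np.fill_diagonal(A, np.abs(A).sum(axis=1) - np.abs(np.diag(A)) + 1.0)

R = np.diag(A) - (np.abs(A).sum(axis=1) - np.abs(np.diag(A)))
assert (R > 0).all()  # strict diagonal dominance

x = rng.standard_normal(n)

# Left-hand side: the quadratic form x^T A x.
lhs = x @ A @ x

# Right-hand side: sum of R_i x_i^2 plus the paired-square terms.
rhs = (R * x**2).sum()
for i in range(n):
    for j in range(i + 1, n):
        s = np.sign(A[i, j]) if A[i, j] != 0 else 1.0
        rhs += abs(A[i, j]) * (x[i] + s * x[j]) ** 2

assert np.isclose(lhs, rhs)  # the two sides agree up to rounding
```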
This expression for $x^TAx$ can alternatively be viewed as expressing $A$ as a weighted sum $A = \sum_k c_k v_kv_k^T$, where each $c_k>0$ and each $v_k\in\mathbb{R}^n$ has support (number of nonzero entries) at most two, with every nonzero entry equal to $\pm 1$. But $v_kv_k^T$ is always positive semidefinite for $v_k\in\mathbb{R}^n$, and positive combinations of positive semidefinite matrices are positive semidefinite. Since each $R_i>0$, we can slightly decrease the $c_k$ corresponding to the $v_k$ that are standard unit vectors and instead write $A = cI + \sum_k c_k v_kv_k^T$ with $c>0$: the sum is positive semidefinite, so every eigenvalue of $A$ is at least $c$, which shows that $A$ is in fact positive definite.
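To make the decomposition concrete, here is a short sketch (the helper name `dd_decomposition` is mine, and $A$ is assumed symmetric) that lists the pairs $(c_k, v_k)$ and reassembles $A$ from them.

```python
import numpy as np

def dd_decomposition(A, atol=1e-12):
    """Decompose a symmetric strictly diagonally dominant A as a sum of
    c_k * v_k v_k^T with c_k > 0 and each v_k supported on <= 2 entries.
    Returns a list of (c_k, v_k) pairs."""
    n = A.shape[0]
    terms = []
    for i in range(n):
        R_i = A[i, i] - (np.abs(A[i]).sum() - abs(A[i, i]))
        e_i = np.zeros(n)
        e_i[i] = 1.0
        terms.append((R_i, e_i))                 # R_i * e_i e_i^T
        for j in range(i + 1, n):
            if abs(A[i, j]) > atol:
                v = np.zeros(n)
                v[i], v[j] = 1.0, np.sign(A[i, j])
                terms.append((abs(A[i, j]), v))  # |A_ij| * v v^T
    return terms

# Sanity check: reassemble A from the rank-one pieces.
A = np.array([[ 4.0, -1.0, 2.0],
              [-1.0,  5.0, 1.0],
              [ 2.0,  1.0, 6.0]])
A_rebuilt = sum(c * np.outer(v, v) for c, v in dd_decomposition(A))
print(np.allclose(A, A_rebuilt))  # True
```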
One nice thing about this proof: every positive definite (or semidefinite) matrix can be written as a positive combination of matrices $vv^T$, but this proof shows that for diagonally dominant matrices we can take all the $v$ to have support at most $2$. This gives some intuition for why "most" positive definite matrices are not diagonally dominant. For example, if $v$ is any vector of support size at least three, then for small enough $c>0$ the matrix $cI + vv^T$ is positive definite but not diagonally dominant: at least one row of $vv^T$ strictly violates diagonal dominance, and adding a small multiple of the identity does not repair it.
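A concrete instance, where the choices $v=(1,1,1)$ and $c=0.1$ are arbitrary:

```python
import numpy as np

# v has support three: cI + vv^T is positive definite for any c > 0,
# yet fails diagonal dominance once c is small.
v = np.array([1.0, 1.0, 1.0])
c = 0.1
M = c * np.eye(3) + np.outer(v, v)

print(np.linalg.eigvalsh(M))  # all eigenvalues >= c > 0: positive definite

# Diagonal dominance check: each row's diagonal vs. its off-diagonal sum.
off = np.abs(M).sum(axis=1) - np.abs(np.diag(M))
print(np.diag(M) - off)  # negative entries: not diagonally dominant
```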