If you are looking for a software solution, you may use singular value decomposition (SVD), which has been implemented in a vast number of software libraries; see the "External links" section of the Wikipedia page on SVD.
(As the row-vector convention is used in the question, all vectors in the sequel are row vectors; column vectors are written as transposes.) Let $A=(n_1^T,\,\ldots,\,n_k^T)$, i.e. the $n\times k$ matrix whose columns are your $k$ normal vectors. If $w$ lies in the intersection of the hyperplanes with normals $n_i$, then $w\cdot n_i=0$ for every $i$; in other words,
$$wA=0,$$
where "$0$" denotes the zero $k$-vector $(0,0,\ldots,0)$. Now perform an SVD on $A$:
$$
A = \underbrace{U}_{n\times n}
\underbrace{\begin{pmatrix}
\sigma_1&&&0&\ldots&0\\
&\ddots&&\vdots&&\vdots\\
&&\sigma_r&\vdots&&\vdots\\
0&\ldots&0&0&\ldots&0\\
\vdots&&\vdots&\vdots&&\vdots\\
0&\ldots&0&0&\ldots&0\\
\end{pmatrix}}_{n\times k}
\underbrace{V^T}_{k\times k}
$$
for some $n\times n$ real orthogonal matrix $U$, some $k\times k$ real orthogonal matrix $V$, and some $\sigma_1\ge\ldots\ge\sigma_r>0$, where $r=\operatorname{rank}(A)$ is the number of linearly independent vectors among $n_1, n_2, \ldots, n_k$. (Most computer implementations of SVD will decompose $A$ in this way, but you should still read the relevant documentation carefully; some libraries return only a "thin" SVD by default, which omits the columns of $U$ we need below.)
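For concreteness, here is a minimal NumPy sketch of this step; the two normal vectors and the rank tolerance `tol` are illustrative assumptions, not part of the answer.

```python
import numpy as np

normals = np.array([[1.0, 0.0, 0.0],   # n_1 (row vector)
                    [0.0, 1.0, 0.0]])  # n_2 (row vector)

A = normals.T                           # n x k: normals as columns
U, sigma, Vt = np.linalg.svd(A, full_matrices=True)

# Numerical rank r: count singular values above a small tolerance,
# since floating-point noise keeps them from being exactly zero.
tol = max(A.shape) * np.finfo(A.dtype).eps * sigma[0]
r = int(np.sum(sigma > tol))
```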
Now, let the $n$ columns of $U$ be $u_1^T, \ldots, u_n^T$. Since $wU=(w\cdot u_1,\,\ldots,\,w\cdot u_n)$, the middle factor above has positive entries $\sigma_1,\ldots,\sigma_r$ on its diagonal, and $V^T$ is invertible, the equation $wA=0$ implies that $w\cdot u_i=0$ whenever $i\le r$. As the $u_i$ form an orthonormal basis, this means $w$ lies in the span of $u_{r+1}, u_{r+2}, \ldots, u_n$. So the whole problem reduces to finding the orthogonal projection of a vector $v$ onto the subspace spanned by $u_{r+1}, u_{r+2}, \ldots, u_n$, and the answer is
$$v\leftarrow \sum_{i=r+1}^n (v\cdot u_i)u_i,$$
or equivalently,
$$v\leftarrow v - \sum_{i=1}^r (v\cdot u_i)u_i.$$
Depending on the actual value of $r$, one of these two formulations will be more efficient than the other: the first sums over $n-r$ terms, the second over $r$.
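Continuing the sketch above, one might compute the projection with whichever formulation touches fewer columns of $U$; the input vector `v` is again an illustrative assumption.

```python
v = np.array([1.0, 2.0, 3.0])   # the vector to project (illustrative)
n = A.shape[0]

if n - r <= r:
    # First formulation: sum of (v . u_i) u_i over the n - r
    # trailing columns of U.
    tail = U[:, r:]                     # u_{r+1}^T, ..., u_n^T
    v_proj = tail @ (tail.T @ v)
else:
    # Second formulation: subtract the components of v along
    # the r leading columns u_1, ..., u_r.
    head = U[:, :r]
    v_proj = v - head @ (head.T @ v)
```

With the normals $e_1$ and $e_2$ from the earlier sketch, the intersection of the two hyperplanes is the $z$-axis, and `v_proj` comes out as $(0, 0, 3)$, as expected.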