A square matrix $A$ has eigenvalue $\lambda$ if and only if there exists a nonzero vector $\mathbf{x}$ such that $A\mathbf{x}=\lambda\mathbf{x}$. This is equivalent to the existence of a nonzero vector $\mathbf{x}$ such that $(A-\lambda I)\mathbf{x}=\mathbf{0}$. This is equivalent to the matrix $A-\lambda I$ having nontrivial nullspace, which in turn is equivalent to $A-\lambda I$ being singular (determinant equal to $0$).
In particular, $\lambda=0$ is an eigenvalue if and only if $\det(A)=0$. If the matrix is "nearly singular" but not actually singular, then $\lambda=0$ is not an eigenvalue.
As it happens, expanding the determinant along the first row gives
$$\begin{align*}
\det(A) &= \left|\begin{array}{rr}
-3 & \hphantom{-}4\\
-4 & 5
\end{array}\right| + 2\left|\begin{array}{cc}
2 & 4\\
3 & 5
\end{array}\right| + 3\left|\begin{array}{rr}
\hphantom{-}2 & -3\\
3 & -4
\end{array}\right|\\
&= \Bigl(-15 + 16\Bigr) + 2\Bigl( 10 - 12\Bigr) + 3\Bigl(-8+9\Bigr)\\
&= 1 - 4 + 3 = 0,
\end{align*}$$
so the matrix is not "nearly singular"; it is just plain singular.
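If you want to double-check this numerically, here is a minimal sketch (assuming NumPy is available); the matrix is the one whose determinant is expanded above and which reappears in the system below:

```python
import numpy as np

# The matrix whose determinant was expanded above.
A = np.array([[1.0, -2.0, 3.0],
              [2.0, -3.0, 4.0],
              [3.0, -4.0, 5.0]])

print(np.linalg.det(A))      # ~0, up to floating-point roundoff
print(np.linalg.eigvals(A))  # one of the computed eigenvalues is (numerically) 0
```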
The eigenvectors corresponding to $\lambda$ are found by solving the system $(A-\lambda I)\mathbf{x}=\mathbf{0}$. So, the eigenvectors corresponding to $\lambda=0$ are found by solving the system $(A-0I)\mathbf{x}=A\mathbf{x}=\mathbf{0}$. That is: solve
$$\begin{array}{rcrcrcl}
x & - & 2y & + & 3z & = & 0 \\
2x & - & 3y & + & 4z & = & 0 \\
3x & - & 4y & + & 5z & = & 0.
\end{array}$$
The solutions (other than the trivial solution) are the eigenvectors. A basis for the solution space (the nullspace of $A$) is a basis for the eigenspace $E_{\lambda}$.
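For the record, here is one way the elimination can go: subtracting twice the first equation from the second and three times the first from the third leaves
$$y - 2z = 0, \qquad 2y - 4z = 0,$$
so $y = 2z$; the first equation then gives $x = 2y - 3z = z$. Every solution is therefore a scalar multiple of $(1,2,1)$, so $\{(1,2,1)\}$ is a basis for $E_{0}$.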
Added. If you know a square matrix is singular, then finding eigenvectors corresponding to $0$ is equivalent to solving the corresponding homogeneous system of linear equations. There are plenty of algorithms for doing that: Gaussian elimination, for instance (Wikipedia even has pseudocode for implementing it). If you want numerical stability, you can instead compute the nullspace with an orthogonal factorization such as QR, or with the singular value decomposition.
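For instance, here is a minimal sketch using SymPy (its `rref` carries out Gauss–Jordan elimination exactly over the rationals); the nullspace it returns is precisely the eigenspace for $\lambda = 0$:

```python
from sympy import Matrix

A = Matrix([[1, -2, 3],
            [2, -3, 4],
            [3, -4, 5]])

# Gauss-Jordan elimination over the rationals: rank 2, pivots in columns 0 and 1.
rref, pivot_cols = A.rref()
print(rref)           # Matrix([[1, 0, -1], [0, 1, -2], [0, 0, 0]])

# Basis for the nullspace, i.e. for the eigenspace of the eigenvalue 0.
print(A.nullspace())  # [Matrix([[1], [2], [1]])]
```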