We define an $n$-digit number $f(n)$, each of whose digits is $1$ or $2$, recursively as follows. Let $f(1)=2$. Suppose that we have defined $f(n)$ and that $2^n$ divides $f(n)$. Then define $f(n+1)$ as follows.
If $2^{n+1}$ divides $f(n)$, then $f(n+1)$ is obtained by putting a $2$ in front of the decimal expansion of $f(n)$; in other words, $f(n+1)=2\cdot 10^n+f(n)$. If $2^{n+1}$ does not divide $f(n)$, then $f(n+1)$ is obtained by putting a $1$ in front of the decimal expansion of $f(n)$, that is, $f(n+1)=10^n+f(n)$. In either case $f(n+1)$ is an $(n+1)$-digit number whose digits are all $1$s and $2$s. We now show that $2^{n+1}$ divides $f(n+1)$.
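To make the construction concrete, here is a minimal sketch in Python (the code is my own illustration, not part of the argument; the function name f simply mirrors the notation above). It builds $f(n)$ digit by digit and checks the divisibility invariant for small $n$.

```python
def f(n):
    """Build the n-digit number f(n) from the recursive construction:
    start from f(1) = 2 and prepend a 2 or a 1 at each step."""
    x = 2  # f(1) = 2
    for k in range(1, n):
        if x % 2 ** (k + 1) == 0:
            x += 2 * 10 ** k  # 2^(k+1) divides f(k): prepend a 2
        else:
            x += 10 ** k      # otherwise: prepend a 1
    return x

# Quick check of the invariant 2^n | f(n) for small n.
for n in range(1, 11):
    assert f(n) % 2 ** n == 0
    print(n, f(n))
```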
Suppose first that $2^{n+1}$ divides $f(n)$. Then $f(n+1)=2\cdot 10^n+f(n)$. Since $2\cdot 10^n=2^{n+1}\cdot 5^n$, the first term is divisible by $2^{n+1}$, and therefore so is $f(n+1)$.
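For instance, $f(3)=112$ is divisible by $2^4=16$ (as $112=16\cdot 7$), so we prepend a $2$ to get $f(4)=2\cdot 10^3+112=2112$, and $2\cdot 10^3=2^4\cdot 5^3$ shows at once that $2^4$ divides $f(4)=16\cdot 132$.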
Suppose next that $2^{n+1}$ does not divide $f(n)$. Since $2^n$ divides $f(n)$ but $2^{n+1}$ does not, we have $f(n)\equiv 2^n\pmod{2^{n+1}}$. Also $10^n=2^n\cdot 5^n\equiv 2^n\pmod{2^{n+1}}$, since $5^n$ is odd. It follows that $f(n+1)=10^n+f(n)\equiv 2^n+2^n=2^{n+1}\equiv 0\pmod{2^{n+1}}$.
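For example, $f(1)=2$ is divisible by $2$ but not by $2^2=4$, so we prepend a $1$ to get $f(2)=10+2=12$, and indeed $12\equiv 2+2\equiv 0\pmod 4$. Continuing in this way, the first few values are $2, 12, 112, 2112, 22112, 122112$, with each $f(n)$ divisible by $2^n$.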