Here's a somewhat more systematic argument for the intuitively plausible result that the growth factor of the $F_k$ is bounded away from $2$ for $n\to\infty$ whereas the growth factor of the $G_k$ tends to $2$ for $n\to\infty$. I will use the exponential growth formula (Theorem IV.7 in Flajolet and Sedgewick's Analytic Combinatorics), which states that the radius of convergence of an analytic function at the origin (which for a rational function is the magnitude of the pole nearest to the origin) is the inverse of the growth factor of the coefficients.
First, for $F_k$, we have two cases. The case of even $n$ is easy. The colouring is $010\ldots0102$, and independently of $n$ the possible colour walks are given by arbitrarily long strings of the form $010\ldots010$ with $2$s in between. The generating function to count these has a factor $x$ for each $2$ and a factor $x+x^3+x^5+\dotso=x/(1-x^2)$ for the strings of the form $010\ldots010$, and this can be repeated any number of times, so with $u=x^2/(1-x^2)$ we have
$f(x)=1+u+u^2+\dotso=\frac1{1-u}=\frac1{1-x^2/(1-x^2)}=\frac{1-x^2}{1-2x^2}\;.$
Actually the last string of $0$s and $1$s can also end in a $1$, and there doesn't have to be a $2$ at the beginning, but these "finite-size effects" are just corrections in the numerator that don't affect the zeros of the denominator that determine the growth factor (which makes sense). The only poles of $f$ are at $\pm1/\sqrt2$, so the growth factor in this case is $\sqrt2$.
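As a sanity check, we can expand $f(x)=(1-x^2)/(1-2x^2)$ as a power series using the linear recurrence implied by its denominator and watch the coefficients. This is a minimal Python sketch; the helper `series_coeffs` is ad hoc, not from any library:

```python
def series_coeffs(num, den, n_terms):
    """Taylor coefficients of num(x)/den(x), given as coefficient lists
    with den[0] == 1, via the recurrence sum_j den[j]*c[n-j] = num[n]."""
    c = []
    for n in range(n_terms):
        s = num[n] if n < len(num) else 0
        for j in range(1, min(n, len(den) - 1) + 1):
            s -= den[j] * c[n - j]
        c.append(s)
    return c

# f(x) = (1 - x^2) / (1 - 2 x^2)
c = series_coeffs([1, 0, -1], [1, 0, -2], 30)
print(c[:8])          # [1, 0, 1, 0, 2, 0, 4, 0]
print(c[28] / c[26])  # 2.0
```

The even-index coefficients are $1,1,2,4,8,\dots$, doubling every second step, i.e. per-step growth $\sqrt2$ as claimed.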
In the case of odd $n$, the colouring is $010\ldots1012$, and now it makes a difference which side of the $2$ we go to and whether we return to the same side or the other side. The generating function again has a factor of $x$ for each $2$, but the factors for the strings of $0$s and $1$s are a bit more complicated. There's a factor of $2$ because we can go to either side of the $2$. Then we have the choice of either adding an odd number of alternating $0$s and $1$s and returning to the $2$ from the same side, or of adding an even number of at least $n-1$ $0$s and $1$s and returning to the $2$ from the other side. These correspond to contributions $x(1+x^2+x^4+\dotso)$ and $x^{n-1}(1+x^2+x^4+\dotso)$, respectively, so in this case we get (again neglecting finite-size effects) $u=2x(x+x^{n-1})/(1-x^2)$ and thus
$f(x)=\frac1{1-u}=\frac1{1-2x(x+x^{n-1})/(1-x^2)}=\frac{1-x^2}{1-3x^2-2x^n}\;.$
Here are the growth factors for the first few odd $n$:
$ \begin{array}{c|c} n&1/\rho_n\\ \hline 3&2\\ 5&1.824621\\ 7&1.765400\\ 9&1.743786\\ 11&1.736075\\ 13&1.733412 \end{array} $
The growth factors clearly tend to $\sqrt3$, and I presume this wouldn't be difficult to show rigorously from the simple form of the denominator. This makes sense, since in the limit $n\to\infty$ there is no longer any gain from traversing the vast expanse of repeating $0$s and $1$s, so the $x^n$ term becomes negligible and the denominator is just $1-3x^2$.
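The table values can be reproduced numerically by locating the positive root of the denominator $1-3x^2-2x^n$ by bisection; since $g(x)=1-3x^2-2x^n$ is strictly decreasing for $x>0$, the root in the bracket below is the unique positive one. A quick sketch:

```python
import math

def growth_factor(n, lo=0.4, hi=0.58, iters=100):
    """Inverse of the smallest positive root of g(x) = 1 - 3x^2 - 2x^n.
    g is decreasing for x > 0 and g(lo) > 0 > g(hi) for odd n >= 3."""
    g = lambda x: 1 - 3 * x**2 - 2 * x**n
    for _ in range(iters):
        mid = (lo + hi) / 2
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 1 / lo

for n in (3, 5, 7, 9, 11, 13):
    print(n, growth_factor(n))
print("limit:", math.sqrt(3))
```

The printed values match the table and visibly approach $\sqrt3\approx1.732051$ from above.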
For the $G_k$, the situation is somewhat more complicated. Setting up the complete generating functions would be a bit tedious, though possible, so I'll use a rough bound to show that the growth factor tends to $2$: I'll consider only the colour walks that reach and leave the final, irregular vertex from and to one of the two sides. Then the colour of that vertex and its neighbours becomes irrelevant, and we don't have to distinguish different cases depending on $n\bmod 3$.
Each part of the walk between two visits to the final vertex corresponds to a Dyck path that doesn't go above $n-2$ or below $0$. These are counted in this paper by the generating function
$R_{n-2}(x)=\frac{U^*_{n-2}(x)}{U^*_{n-1}(x)}\;,$
where the $U^*_n$ are defined in terms of the Chebyshev polynomials of the second kind by
$U^*_n(x)=x^nU_n\left(\frac1{2x}\right)\;.$
Again, we have a factor of $x$ for each occurrence of the final vertex, and a factor of $xR_{n-2}(x)$ for each foray into the rest of the cycle ($x$ because it takes one step in addition to the Dyck path to get started), and we can repeat this any number of times, so the generating function (up to finite-size effects) is
$f(x)=\frac1{1-x^2R_{n-2}(x)}=\frac{U_{n-1}(t)}{U_{n-1}(t)-xU_{n-2}(t)}$
with $t=1/2x$. By a lucky coincidence (or perhaps not), the Chebyshev polynomials of the second kind satisfy the recurrence relation $U_n(t)=2tU_{n-1}(t)-U_{n-2}(t)$, which gives us
$f(x)=\frac{2tU_{n-1}(t)}{U_n(t)}\;.$
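The algebra above can be spot-checked numerically by evaluating the $U_n$ through their recurrence at random points and comparing the two forms of $f$. An illustrative sketch (the helper `U` is ad hoc):

```python
import random

def U(n, t):
    """Chebyshev polynomials of the second kind via U_n = 2t U_{n-1} - U_{n-2}."""
    a, b = 1.0, 2.0 * t  # U_0, U_1
    if n == 0:
        return a
    for _ in range(n - 1):
        a, b = b, 2.0 * t * b - a
    return b

random.seed(0)
for n in range(2, 8):
    x = random.uniform(0.05, 0.4)
    t = 1 / (2 * x)
    lhs = U(n - 1, t) / (U(n - 1, t) - x * U(n - 2, t))
    rhs = 2 * t * U(n - 1, t) / U(n, t)
    assert abs(lhs - rhs) < 1e-9 * abs(lhs)
```

Multiplying numerator and denominator of the left-hand side by $2t$ and using $2tx=1$ turns the denominator into $2tU_{n-1}(t)-U_{n-2}(t)=U_n(t)$, which is exactly what the assertion confirms.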
The poles of $f$ are at $x=1/(2t)$ where $U_n(t)=0$, so the growth factor is twice the largest root of $U_n$. The roots of $U_n$ are given by
$t_k=\cos\frac {k\pi}{n+1}$
for $k=1,\dotsc,n$, so the largest one is $\cos\pi/(n+1)$, which tends to $1$ as $n\to\infty$; thus the growth factor tends to $2$. Here are the growth factors for the first few values of $n$:
$ \begin{array}{c|c} n&1/\rho_n\\ \hline 1&0\\ 2&1\\ 3&\sqrt2\\ 4&1.618034\\ 5&\sqrt3\\ 6&1.801938\\ 7&1.847759 \end{array} $
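The table entries are just $2\cos\frac\pi{n+1}$, which also reproduces the closed forms for small $n$ ($\sqrt2$, the golden ratio, $\sqrt3$). A short check in Python:

```python
import math

def g_growth(n):
    # growth-factor lower bound 2*cos(pi/(n+1)) for the G_k walks
    return 2 * math.cos(math.pi / (n + 1))

for n in range(1, 8):
    print(n, g_growth(n))

assert abs(g_growth(3) - math.sqrt(2)) < 1e-12
assert abs(g_growth(4) - (1 + math.sqrt(5)) / 2) < 1e-12  # golden ratio
assert abs(g_growth(5) - math.sqrt(3)) < 1e-12
```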
We can check that the first three values make sense: for $n=1$, there are no walks at all of the kind described; for $n=2$, there are no choices and thus only one walk of each length; and for $n=3$, there is a binary choice to make every second step when we're on the first vertex; thus these three growth factors are correct.
In summary, the growth factor for the $F_k$ for even $n$ is $\sqrt2$; the growth factors for the $F_k$ for odd $n$ tend to $\sqrt3$ from above as $n\to\infty$; and the growth factors for the $G_k$ tend to $2$ from below as $n\to\infty$. The crossover occurs at $n=7$ at the latest (taking into account that the factors for the $G_k$ are only lower bounds, since we didn't count all the walks). Thus the claim in the question is correct.