I think you're confusing a few issues. Firstly, $F$ isn't an 'algorithm' in and of itself; it's a function. You can ask whether $F$ is polynomially bounded, and you can ask whether there's a polynomial-time algorithm to compute $F$, but those are two different questions. I'm assuming that you're after polynomial boundedness - that is, that there's some polynomial $p(n)$ such that $F\in O(p(n))$ - but clarifying the question would be immensely useful.
The first thing to note is that the inner summation is trivial, and in fact the presence of $b$ as a lower summation limit is somewhat misleading; your formula works out to $F(n) = \sum_{i=a}^{f(a,n)}\left(g(a,b,i,n)-b\right)$, and it's easy to absorb the $-b$ term into the definition of $g$. Now, if $g$ is polynomially bounded in all its arguments, then its maximum over all $i$ in some polynomial range is also polynomially bounded as a function of $a$, $b$, and $n$; this is essentially the statement that polynomials are closed under composition - that is, if $p(n)$ and $q(n)$ are polynomials, then $r(n)=p(q(n))$ is also a polynomial (and this extends to the multivariate case). So the sum (as a function of $a$, $b$, and $n$) is bounded by $f(a,n)$ times the maximum value of $g$ over the specified range, and is thus polynomially bounded.
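To spell the bound out (a sketch, assuming $a\ge 1$ and $b\ge 0$, and writing $p_f$ and $p_g$ for polynomial bounds on $f$ and $g$, taken with nonnegative coefficients so that they're monotone in each argument):

$$F(n) \;\le\; f(a,n)\cdot\max_{a\le i\le f(a,n)} g(a,b,i,n) \;\le\; p_f(a,n)\cdot p_g\bigl(a,b,p_f(a,n),n\bigr),$$

and the right-hand side is a product and composition of polynomials, hence again a polynomial in $a$, $b$, and $n$.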
Note that you need polynomial boundedness in all the parameters for this argument to go through, and that 'polynomially computable' doesn't imply 'polynomially bounded'. For instance, ignoring the parameters $a$ and $b$ for a moment, it's easy to compute $g(n,i) = n+2^i$ in time polynomial in $n$ and $i$, and it's even polynomially bounded as a function of $n$ for fixed values of $i$ - but then for something as simple as $f(n) = n$ the resulting sum isn't polynomially bounded (it grows exponentially with $n$).
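Concretely, taking the sum to run from $i=1$ to $n$ (say),

$$\sum_{i=1}^{n}\left(n+2^i\right) \;=\; n^2 + 2^{n+1} - 2,$$

which is exponential in $n$ even though each individual term is quickly computable.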
Added, on the clarification: with the clarified description, yes, it follows almost trivially that $F(n)$ is polynomial-time computable if $f$ and $g$ are polynomially bounded and polynomial-time computable, for essentially the same reasons; seen from an algorithmic perspective, a polynomial-in-$n$ number of steps are executed a polynomial-in-$n$ number of times, and by the composition result mentioned above the total is still polynomially bounded as a function of $n$. That's not necessarily useful, though, because for a problem of this nature what one would ideally like is an algorithm whose running time is sub-polynomial in $n$ - for instance, things like multiplication can easily be done this way (choose $a=1$, $b=0$, $f(n)=n$, $g=c$ to compute $F(n)=cn$), but one isn't usually interested in linear-time algorithms for multiplication!

Ideally what you'd like is a polylogarithmic algorithm - something that can compute these sums in time $O(\log(n)^d)$ for some $d$ - and that's a much harder question. I believe (though I don't have a specific construction) that in general these sums can't be computed in polylogarithmic time; for instance, it shouldn't be hard to construct a reduction from something like the sum-of-square-roots problem to this.
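To make the algorithmic perspective concrete, here's a minimal sketch of the direct evaluation in Python (the callable arguments and the toy instance at the bottom are purely illustrative, not taken from the question):

```python
def F(n, a, b, f, g):
    """Directly evaluate F(n) = sum_{i=a}^{f(a,n)} (g(a,b,i,n) - b).

    If f is polynomially bounded, the loop runs a polynomial-in-n number
    of times; if g is polynomial-time computable, each iteration takes
    polynomial time - so the whole evaluation is polynomial in n.
    """
    total = 0
    for i in range(a, f(a, n) + 1):
        total += g(a, b, i, n) - b
    return total


# Toy instance (illustrative): f(a, n) = n and g(a, b, i, n) = i*i + b,
# so F(n) is the sum of the first n squares.
print(F(10, 1, 0, lambda a, n: n, lambda a, b, i, n: i * i + b))  # 385
```

Note that this is polynomial in $n$ itself, not in the number of digits of $n$ - which is exactly why the polylogarithmic question above is the interesting one.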