This is not an answer, but was too long for a comment.
More like some observations and additional, related questions
on the theme of whether generating functions would help.
My particular choice of generating functions models additive signals
and therefore could serve as upper bounds for nonadditive ones.
My approach below is probably not original.
Good combinatorial interpretations probably already exist.
This is a Galton-Watson stochastic branching process.
The one-dimensional case has generating function
in the formal power series ring with indeterminates $x,x^{-1}$
(or its quotient with the ideal $\left< x^{-1}x-1 \right>$)
given by
$$
\eqalign{
p_t^*(x)
&= p^t \left(x+x^{-1}\right)^t
= p^t \sum_{s=0}^{t} {t \choose s} x^{2s-t}
= p^t \sum_{s=0}^{t} {t \choose s} x^{t-2s}
\cr &
= p^t
\left(
\left\lfloor\tfrac{t}2\right\rfloor-
\left\lfloor\tfrac{t-1}2\right\rfloor
\right)
{t \choose t/2}
+ p^t
\sum_{s=0}^{\left\lfloor\tfrac{t-1}2\right\rfloor}
{t \choose s}
\left(x^{t-2s}+x^{2s-t}\right)
}
$$
for time $t$, i.e. the coefficient of $x^m$
is the probability the signal reaches coordinate
$x=m$ at time $t$. The first term on the second line
is the constant term, representing the probability
of being at the origin at time $t$: it is positive,
with the maximal (central) binomial coefficient,
for $t$ even, but $0$ for $t$ odd.
The differences in parentheses above,
$$
e(t)=
\left\lfloor\tfrac{t}2\right\rfloor-
\left\lfloor\tfrac{t-1}2\right\rfloor
=
(t+1)-2\left\lfloor
\tfrac{t+1}2\right\rfloor
=\frac{1+(-1)^{t}}{2}
=(t+1)~\text{mod}~2
$$
indicates evenness.
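As a quick sanity check (a small Python sketch, not part of the derivation), all four expressions for the evenness indicator agree:

```python
from math import floor

# Check that the expressions for the evenness indicator e(t) agree.
def e(t):
    return floor(t / 2) - floor((t - 1) / 2)

for t in range(20):
    assert e(t) == (t + 1) - 2 * ((t + 1) // 2)  # floor expression
    assert e(t) == (1 + (-1) ** t) // 2          # sign expression
    assert e(t) == (t + 1) % 2                   # mod expression
    assert e(t) == int(t % 2 == 0)               # "indicates evenness"
```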
We could also take the last expression and discard
the (strictly) negative-power terms $x^{2s-t}$,
redefining the coefficient of $x^m$ to be the probability
that a signal starting at the origin at time $0$
is at $x=\pm m$ (one or the other) at time $t$,
and call this the reduced generating function
$$
p_t(x)
= p^t \left[ e(t) {t \choose t/2}
+ \sum_{s=0}^{\left\lfloor\tfrac{t-1}2\right\rfloor}
{t \choose s} x^{t-2s}
\right].
$$
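For concreteness, here is a small Python sketch of these reduced coefficients (the function name is my own); combining the $m=0$ mass once and each $m>0$ mass twice recovers the full binomial total $2^t$:

```python
from math import comb

# Coefficients of the reduced generating function p_t(x), with the common
# factor p^t divided out: coeff[m] is the probability of being at x = +m
# or x = -m at time t, divided by p^t.
def reduced_coeffs(t):
    coeff = {}
    if t % 2 == 0:                      # e(t) C(t, t/2): the constant term
        coeff[0] = comb(t, t // 2)
    for s in range((t - 1) // 2 + 1):   # positive powers x^(t - 2s)
        coeff[t - 2 * s] = comb(t, s)
    return coeff

# Sanity check: counting m = 0 once and each m > 0 twice (for +m and -m)
# recovers the binomial total 2^t.
for t in range(10):
    c = reduced_coeffs(t)
    assert c.get(0, 0) + 2 * sum(v for m, v in c.items() if m > 0) == 2 ** t
```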
Note in particular that $e(t){t\choose t/2}$ is therefore the
number of paths of length $t$ which start and end at the origin.
Summing over $t\ge0$, we get a generating function
$$
p(x)=\sum_{t=0}^{\infty} p_t(x)
$$
for the expected amount of time spent at each coordinate,
i.e. the coefficient of $x^m$ is the sum of probabilities
that the signal is at one of $x=\pm m$ over all times $t\ge 0$.
Of particular interest to us is
the expected total time spent at the origin,
$$
p(0)=\sum_{t=0}^\infty{2t\choose t}p^{2t}.
$$
Combinatorial note: there is no double-counting
(no need for P.I.E.) because each term counts only
the terminus point over all paths of length $t$.
This acts like a "driving force" of a
harmonic oscillator damped by a probability of dying out.
By Stirling's approximation and the root test,
this converges for $p<\frac12$.
Claim $\left(\mathbb{Z}^1\right)$:
For $r=2p<1$, the expected total number of
visits to the origin converges to
$$p(0)=\frac{1}{\sqrt{1-r^2}}.$$
Proof: this follows from the general binomial theorem,
as in e.g. Yuan 2001 proposition 4.3:
$$
\left(-4\right)^n {-\tfrac12 \choose n} =
\left(-4\right)^n \frac{(-1)(-3)\cdots(1-2n)}{2^n\,n!} =
\frac{(2n)!}{(n!)^2}.
$$
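Numerically, the claim is easy to check (a Python sketch; the truncation at 200 terms and the choice $p=0.3$ are arbitrary):

```python
from math import isclose, sqrt

# Partial sum of p(0) = sum_t C(2t,t) p^(2t), built iteratively via the
# ratio C(2t+2,t+1)/C(2t,t) = 2(2t+1)/(t+1), versus 1/sqrt(1 - r^2).
p = 0.3
r = 2 * p
term, total = 1.0, 0.0
for t in range(200):
    total += term
    term *= p * p * 2 * (2 * t + 1) / (t + 1)
assert isclose(total, 1 / sqrt(1 - r ** 2), rel_tol=1e-9)
```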
Now for the two-dimensional case, we can avoid using the
multinomial theorem expansion, if we like, with
$$
\eqalign{
p_t^*(x,y)
&= p^t \left(x+x^{-1}+y+y^{-1}\right)^t
= \sum_{s=0}^{t} {t \choose s} p_s^*(x) \, p_{t-s}^*(y)
\cr
&= p^t
\sum_{s=0}^{ t }
\sum_{u=0}^{ s }
\sum_{v=0}^{t-s}
{ t \choose s}
{ s \choose u}
{t-s\choose v}
x^{2u-s} y^{2v+s-t}
}
$$
or, discarding the negative-power terms,
$$
\eqalign{
p_t(x,y)
&= \sum_{s=0}^{t} {t \choose s} p_s(x) \, p_{t-s}(y)
%%
%% This formula could be simplified analogously to p_t^*(x) above:
%%
%\cr
%&= p^t
% \sum_{s=0}^{ t }
% \sum_{u=0}^{ s }
% \sum_{v=0}^{t-s}
% { t \choose s}
% { s \choose u}
% {t-s\choose v}
% x^{2u-s} y^{2v+s-t}
}
$$
and take the analogous sum $p(x,y)$.
Note that $p_t(0,0)=0$ for odd $t$, and that, at even times $2t$,
$$
p_{2t}(0,0)
=p^{2t}\sum_{s=0}^t{2t\choose 2s}{2s\choose s}{2t-2s\choose t-s}
=p^{2t}{2t\choose t}\sum_{s=0}^t{t\choose s}^2
=p^{2t}{2t\choose t}^2
$$
which is $p^{2t}$ times the square of a
central binomial coefficient; cf. sequence A000984.
(The latter two formulas, with the squared binomial
coefficients, are offered here without proof.)
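The identities behind those formulas can at least be confirmed numerically for small $t$ (a quick Python check):

```python
from math import comb

# Check: sum_s C(2t,2s) C(2s,s) C(2t-2s,t-s) = C(2t,t) sum_s C(t,s)^2
#                                            = C(2t,t)^2  (Vandermonde).
for t in range(15):
    lhs = sum(comb(2 * t, 2 * s) * comb(2 * s, s) * comb(2 * t - 2 * s, t - s)
              for s in range(t + 1))
    mid = comb(2 * t, t) * sum(comb(t, s) ** 2 for s in range(t + 1))
    assert lhs == mid == comb(2 * t, t) ** 2
```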
Hence
$$
p(0,0)=\sum_{t=0}^\infty{2t\choose t}^2p^{2t},
$$
which will converge for $p<\frac14$.
Claim $\left(\mathbb{Z}^2\right)$:
For $r=4p<1$, the expected total number of
visits to the origin converges to
$$p(0,0)={}_2F_1\left(\tfrac12,\tfrac12;1;r^2\right)=\frac1{M(1,\sqrt{1-r^2})},$$
a special case
of the Gaussian hypergeometric series ${}_2F_1$
which is a complete elliptic integral of the first kind and
can be calculated with extremely rapid convergence
(Gauss 1799)
via the arithmetic-geometric mean $M(a,b)$, which satisfies
$$\frac{\pi}{2\,M(a,b)}=\int_0^{\frac\pi2}\frac{d\phi}{\sqrt{(a\cos\phi)^2+(b\sin\phi)^2}}.$$
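The AGM evaluation can be checked against the series directly (a Python sketch; the tolerances and the choice $p=0.2$ are arbitrary):

```python
from math import isclose, sqrt

# Arithmetic-geometric mean M(a,b) by Gauss's iteration.
def agm(a, b):
    while abs(a - b) > 1e-14 * a:
        a, b = (a + b) / 2, sqrt(a * b)
    return a

# Partial sum of p(0,0) = sum_t C(2t,t)^2 p^(2t) versus 1/M(1, sqrt(1-r^2)).
p = 0.2
r = 4 * p
term, series = 1.0, 0.0
for t in range(200):
    series += term
    term *= (p * 2 * (2 * t + 1) / (t + 1)) ** 2
assert isclose(series, 1 / agm(1, sqrt(1 - r * r)), rel_tol=1e-9)
```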
The first OP question is perhaps related to the expected time
$$
c_{mn} =
\mathbb{E} \left[ T_{mn} \right] =
\frac1{m!\,n!}
\left.
\left(
\partial_x^{\,m}
\partial_y^{\,n}
p(x,y)
\right)
\right|_{(0,0)}
\qquad(m,n\ge0)
$$
spent at one of the four points $(\pm m,\pm n)$
which is the sum of the $x^my^n$ terms over all
$p_t(x,y)$ and can be computed deterministically
with no further simplification in $O(m^3n^3)$ time
by adding products of binomial coefficients.
Perhaps there are ways to simplify this
using some combination of binomial identities,
parity and symmetry, or with asymptotics,
to get a reasonable approximation
in a feasible amount of time.
The familiar higher-order-derivative formula
(the Taylor series trick for extracting the desired coefficient)
is valid on the reduced generating function for $m,n\ge 0$.
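A direct truncated computation along these lines might look as follows (`coeff_1d`, `c_mn`, and the truncation depth `T` are my own choices, not from the question):

```python
from math import comb

# [x^m] p_s(x) for m >= 0: p^s C(s, (s-m)/2) when s >= m and s - m is even;
# the m = 0, s even case reduces to the constant term e(s) C(s, s/2).
def coeff_1d(s, m, p):
    if m < 0 or s < m or (s - m) % 2:
        return 0.0
    return p ** s * comb(s, (s - m) // 2)

# Truncated c_mn = sum_{t <= T} [x^m y^n] p_t(x,y), using the convolution
# p_t(x,y) = sum_s C(t,s) p_s(x) p_{t-s}(y).
def c_mn(m, n, p, T=200):
    return sum(comb(t, s) * coeff_1d(s, m, p) * coeff_1d(t - s, n, p)
               for t in range(T + 1) for s in range(t + 1))
```

For instance, `c_mn(0, 0, 0.2)` should agree with the AGM evaluation of $p(0,0)$ above, and `c_mn(m, n, p)` is symmetric in $m$ and $n$.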
As a wild speculation, is the answer
to the (first and) second OP question
then
$$
\frac{c_{mn}}{1-p(0,0)}
$$
or something in a similar vein, analogously to the
gambler's ruin problem or the reasoning of @joriki or @Didier?
Conjectures for higher dimensions:
Conjecture: The expected number of visits to the origin
in $\mathbb{Z}^d$ is
$$ \sum_{n=0}^\infty {2n \choose n}^d p^{2n}. $$
Conjecture: The number of paths in $\mathbb{Z}^d$ of length $2n$
which start and end at the origin is
$$ {2n \choose n}^d. $$
Note: $d=2$ is Richard Stanley's
bijective proof problem #230.
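For what it's worth, the low-dimensional cases of the path-count conjecture can be brute-forced for small $n$ (a Python sketch; the enumeration is exponential, so only small lengths are feasible):

```python
from itertools import product
from math import comb

# Count walks of the given length on Z^d (steps +-e_i) ending at the origin.
def closed_walks(d, length):
    steps = [tuple((sign if j == i else 0) for j in range(d))
             for i in range(d) for sign in (1, -1)]
    return sum(all(sum(step[i] for step in walk) == 0 for i in range(d))
               for walk in product(steps, repeat=length))

# Verify the conjectured count C(2n,n)^d for d = 1, 2 and small n.
for n in range(1, 4):
    assert closed_walks(1, 2 * n) == comb(2 * n, n)
    assert closed_walks(2, 2 * n) == comb(2 * n, n) ** 2
```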