I am looking for a formula to compute the following:
Assume you are given a GPS coordinate $(\alpha, \beta)=(\text{latitude}, \text{longitude})$ in decimal degrees (not in the format involving minutes and seconds), a time $t$ and a speed vector $v=(v_x, v_y)$. Now a point moves from $(\alpha, \beta)$ with speed $v$ for the time $t$. The question is: which GPS coordinate does it end up at (assuming the earth is a sphere with radius $R$)? I am sure there is a formula for this already, but I can't find it. The solution can be an approximation, since I need this for a real-world problem. I know that this problem is not strictly well-defined, since the speed vector is in Cartesian coordinates, so I am assuming that the earth is locally flat. Still, there should be some almost-solution to my problem, since I am dealing with rather short times and small speed vectors, which makes the curvature of the earth almost irrelevant.
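For concreteness, here is the flat-earth approximation I have in mind, sketched in Python. I am assuming $v_x$ points east and $v_y$ points north (both in meters per second), $t$ is in seconds, and $R$ is the earth's radius in meters; the degree of longitude is scaled by $\cos(\text{latitude})$ because meridians converge toward the poles. Please correct me if this is not the right local approximation:

```python
import math

def move(lat, lon, vx, vy, t, R=6_371_000.0):
    """Flat-earth step: lat/lon in decimal degrees, vx east and vy north
    in m/s, t in seconds, R the sphere radius in meters (mean earth radius
    by default). Returns the approximate new (lat, lon)."""
    # A northward displacement of vy*t meters along a meridian subtends
    # an angle of (vy*t)/R radians at the earth's center.
    dlat = math.degrees(vy * t / R)
    # A parallel of latitude has radius R*cos(lat), so the same eastward
    # displacement subtends a larger angle the farther from the equator.
    dlon = math.degrees(vx * t / (R * math.cos(math.radians(lat))))
    return lat + dlat, lon + dlon
```

For the short times and small speeds I have, I expect the error from ignoring curvature (and from evaluating $\cos$ at the start latitude rather than some mean latitude) to be negligible, but I would like confirmation.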
Any help will be appreciated!
Leon