
I am looking for a formula to compute the following:

Assume you are given a GPS coordinate $(\alpha, \beta)=(\text{latitude}, \text{longitude})$ in decimal degrees (not in degrees-minutes-seconds format), a time $t$, and a velocity vector $v=(v_x, v_y)$. A point now moves from $(\alpha, \beta)$ with velocity $v$ for the time $t$. The question is: which GPS coordinate does it end up at, assuming the earth is a ball with radius $R$? I am sure there is already a formula for this, but I can't find it. The solution can be an approximation, since I need this for a real-world problem. I know that this problem is not strictly well-defined, since the velocity vector is in Cartesian coordinates, so I am assuming that the earth is locally flat. Still, there should be some almost-solution to my problem, since I am dealing with rather short times and small velocities, which makes the curvature of the earth almost irrelevant.
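To make the locally-flat assumption concrete (taking $v_x$ to point east and $v_y$ to point north, both in the same length unit as $R$ per unit of $t$, and with $\alpha, \beta$ temporarily converted to radians), I would guess an approximation of the form

$$\alpha' = \alpha + \frac{v_y\,t}{R}, \qquad \beta' = \beta + \frac{v_x\,t}{R\cos\alpha},$$

with the result converted back to decimal degrees, but I am not sure this is correct or how accurate it is.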

Any help will be appreciated!

Leon

  • 1
    Thanks, but I don't see how exactly I can apply this to my original question. – 2017-01-31
  • 0
    I agree that the proposed duplicate cannot be directly applied to answer what is asked here. The main difficulty is in relating the "Cartesian" units of $v = (v_x,v_y)$ to relative rates of change in latitude and longitude. I suspect the OP is aware of this and is looking for guidance on how to make such a connection, e.g. by assuming $v_x$ is East-West miles per minute (no pun intended) and $v_y$ is North-South miles per minute. However it is up to the OP to clarify in what format "speed" (velocity!) $v$ will be given. – 2017-02-01
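Adopting the convention suggested in the comment above ($v_x$ East-West, $v_y$ North-South, here assumed to be in meters per second with $t$ in seconds), a minimal flat-earth sketch of the conversion might look like the following. The function name `move` and the use of the mean spherical Earth radius are assumptions for illustration, not part of the original question:

```python
import math

R = 6371000.0  # assumed mean Earth radius in meters (spherical model)

def move(lat_deg, lon_deg, vx, vy, t):
    """Flat-earth approximation of a short displacement.

    lat_deg, lon_deg: start position in decimal degrees.
    vx: eastward speed in m/s; vy: northward speed in m/s; t: time in s.
    Returns (lat, lon) in decimal degrees. Valid only for small
    displacements away from the poles.
    """
    lat_rad = math.radians(lat_deg)
    dlat = (vy * t) / R                    # latitude change in radians
    dlon = (vx * t) / (R * math.cos(lat_rad))  # longitude change in radians
    return (lat_deg + math.degrees(dlat),
            lon_deg + math.degrees(dlon))
```

For example, starting at latitude 60° (where a degree of longitude is only half as long as at the equator), a purely eastward displacement produces twice the longitude change it would at the equator, which is exactly the $1/\cos\alpha$ factor at work.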

0 Answers