I was thinking about this while flying on a plane that was approaching its destination and slowing down.
Assume an object is approaching a target that is at some initial distance d at time t0.
It starts at a speed that would let it reach the target in exactly one hour (e.g. for d = 100 km, it starts at 100 km/h).
It will slow down continuously so that at every point in time it will be exactly one hour away from the target (after it has travelled 40 km and is 60 km away, it will be travelling at 60 km/h).
Once it reaches a predefined minimum speed (10 km/h), it will keep that speed constant (and will then need exactly one more hour to reach the target).
How long will it take to reach the target?
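I tried to sanity-check this numerically with a rough simulation of my own (the function name `time_to_target`, the step size and the whole discretisation are just my choices, so this is only an approximation): at each small time step the speed is set to the remaining distance in km/h, but never below the minimum speed.

```python
def time_to_target(d0, v_min, dt=1e-4):
    """Approximate travel time in hours for initial distance d0 (km)
    and minimum speed v_min (km/h), using time steps of dt hours."""
    x = d0   # remaining distance in km
    t = 0.0  # elapsed time in hours
    while True:
        v = max(x, v_min)   # speed (km/h) equals remaining distance, floored at v_min
        if v * dt >= x:     # last (partial) step: finish at constant speed v
            return t + x / v
        x -= v * dt         # advance one small time step
        t += dt

print(time_to_target(100, 10))  # I get roughly 3.30 hours, not 2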
I somehow assume the answer should be 2 hours, independently of the initial distance, but that does not fit. If it were 2 hours, the original distance would not matter (which sounds plausible, since on a longer trip we travel faster at the beginning), but any shorter trip is contained in a longer one: once the object on the longer trip is down to the shorter distance, it is in exactly the same state as at the start of the shorter trip, so the remaining time would again have to be 2 hours, and then the longer trip could not also take 2 hours in total (or could it?).
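To check this intuition, I ran the same rough simulation (continuing the `time_to_target` snippet above) for a few different initial distances with the 10 km/h minimum:

```python
for d0 in (20, 50, 100, 200, 1000):
    print(d0, round(time_to_target(d0, 10), 3))
# I get roughly 1.69, 2.61, 3.30, 4.00 and 5.61 hours respectively,
# so the total time does seem to depend on the initial distance.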
I think I am missing the mathematical apparatus needed to solve this (is it differential equations?).
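If it is a differential equation, my best guess at writing down the condition (before the minimum speed kicks in), with x(t) as the remaining distance, would be something like the sketch below, but I am not sure it is right or how to continue from it:

```latex
% tentative and possibly wrong: "always exactly one hour away" should mean
% that the speed in km/h equals the remaining distance x(t) in km
v(t) = -\frac{\mathrm{d}x}{\mathrm{d}t} = x(t), \qquad x(0) = d,
\qquad \text{(valid while } x(t) \ge 10\,\text{km)}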
Can you please advise how to solve this?