I'd like to find a way to detect a significant drop/decrease in a signal. Below is an actual example of what I'd like to accomplish, with the arrow denoting the change that I'd like to detect (only the red curve).
The data is fairly straightforward: the x-values are integers starting from zero and increasing by 1 at each data point, and the y-values are also integers. I know that the dip I'd like to detect always occurs after the minimum value (denoted by the small circle), but I'm not sure of the best way to find this drop.
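Just to make the shape of the problem concrete, here is a minimal sketch (Python/NumPy, with made-up y-values) of a naive idea: locate the minimum, then flag the steepest single-step decrease after it. I suspect this isn't robust for real data, which is why I'm asking:

```python
import numpy as np

# Made-up example values; x is just the index 0, 1, 2, ...
y = np.array([50, 42, 35, 30, 28, 26, 27, 29, 33, 36, 38, 37, 28, 27, 28, 29])

min_idx = int(np.argmin(y))            # the known minimum (the small circle)

# Only look at the part of the curve after the minimum
after = y[min_idx:]

# First differences; the most negative one is the steepest single-step drop
diffs = np.diff(after)
drop_idx = min_idx + int(np.argmin(diffs)) + 1   # x-position where the drop lands

print(f"minimum at x={min_idx}, candidate drop at x={drop_idx} (step of {diffs.min()})")
```

This only catches a drop that happens in one step, so it would probably miss a decline spread over several points.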
What's the best methodology or algorithm for a situation like this?