
I'd like to find a way to detect a significant drop/decrease in a signal. Below is an actual example of what I'd like to accomplish, with the arrow denoting the change that I'd like to detect (only the red curve).

The data is fairly straightforward: the x-values are integers starting from zero and increasing by 1 at each data point. The y-values are also integers. I know that the dip I'd like to detect always occurs after the minimum value (denoted by the small circle). However, I'm not sure of the best way to find this drop.

What's the best methodology or algorithm for a situation like this?

[Figure: the example signal, with an arrow marking the drop to detect]

  • You might want to search for "anomaly detection" algorithms and techniques. (2012-06-06)

3 Answers

2

Given the context (heat in an oven), I think it is reasonable to use a logarithmic regression: $T=k\ln(t)+C$.

You can do the following: as in the picture, take for example five temperatures $(t_0,T_0),(t_1,T_1),(t_2,T_2),(t_3,T_3),(t_4,T_4)$, sampled at regular intervals, and compute the logarithmic regression of that data. In the picture the temperatures and times are read off approximately; since there is no reference scale, I used the grid for units.

If the next measurement $(t_5,T_5)$ does not agree with the regression's prediction, you probably have a drop. If it does agree, discard the oldest measurement $(t_0,T_0)$, add the new one $(t_5,T_5)$ to $(t_1,T_1),(t_2,T_2),(t_3,T_3),(t_4,T_4)$, fit the regression again, and so on.

As you can see, the fit is very good (OpenOffice Calc reports a regression coefficient of 1).
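A minimal sketch of this rolling log-regression check in Python. The window size, the tolerance `tol`, and the synthetic oven-style data are illustrative choices, not from the answer:

```python
import numpy as np

def rolling_logfit_drop(t, T, window=5, tol=3.0):
    """Slide a window of `window` samples along the series; fit
    T = k*ln(t) + C by least squares on that window, predict the next
    sample, and flag a drop when the observed value falls more than
    `tol` below the prediction."""
    for i in range(window, len(t)):
        ts, Ts = t[i - window:i], T[i - window:i]
        k, C = np.polyfit(np.log(ts), Ts, 1)   # linear fit in ln(t)
        pred = k * np.log(t[i]) + C
        if T[i] < pred - tol:
            return i                           # index where the drop appears
    return None                                # no drop detected

# synthetic heating curve with a sudden drop at index 25
t = np.arange(1, 41)
T = 8.0 * np.log(t) + 2.0
T[25:] -= 10.0
drop_at = rolling_logfit_drop(t, T)
```

Discarding the oldest sample and refitting, as the answer describes, is exactly what the sliding window does here.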

2

A similar idea to H. Kabayakawa's is to take the last few points and extrapolate the next point. If the actual point falls far enough below the prediction, you have a drop. This is more local than the global fit, so it may be better or worse for your purpose. A discussion and algorithms are in chapter 3 of Numerical Recipes or any numerical analysis text.
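A sketch of this local-extrapolation variant, here using a straight-line fit to the last few points (a low-order polynomial would work the same way); the window size, tolerance, and test data are my own assumptions:

```python
import numpy as np

def extrapolate_drop(y, window=4, tol=3.0):
    """Fit a line to the last `window` points, extrapolate one step
    ahead, and flag a drop when the observed value is more than `tol`
    below the extrapolated prediction."""
    x = np.arange(window)
    for i in range(window, len(y)):
        slope, intercept = np.polyfit(x, y[i - window:i], 1)
        pred = slope * window + intercept      # one step past the window
        if y[i] < pred - tol:
            return i                           # index of the drop
    return None

# synthetic rising signal with a drop of 8 at index 20
y = 0.5 * np.arange(40)
y[20:] -= 8.0
drop_at = extrapolate_drop(y)
```

Because only the last few points enter each fit, the detector reacts quickly but is also more sensitive to noise than a global fit; `tol` controls that trade-off.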

1

If the function is expected to be smooth everywhere except for a jump point, you could treat this as an inverse problem and use a smoothing regularizer.

Say there is some smooth background function $f$, to which a jump at point $p$ with height $j$ has been added, along with some noise $\xi$, yielding the observed function $g$. The forward model is,

$g=A(f,p,j) + \xi,$

where $A:L^2(0,1) \times \mathbb{R} \times \mathbb{R} \to L^2(0,1)$ is modeled by $A(f,p,j)(x):=\begin{cases} f(x), & x < p, \\ f(x)+j, & x \ge p. \end{cases}$

We seek to find the minimizer $(f^*,p^*,j^*)$ to the following regularized inverse problem, $\inf_{f \in L^2, p\in (0,1),j \in [0,\infty)} ||A(f,p,j)-g||^2 + \alpha||\Delta f||^2 + \gamma |j-j_0|^2.$

  • The first term tries to minimize the misfit $A(f^*,p^*,j^*)-g$.

  • The second term tries to make $f$ smooth. $\Delta$ is the Laplacian and $\alpha$ is a regularization parameter you choose. The larger $\alpha$, the more $f$ is forced to be smooth.

  • The third term tries to make the jump a certain size. $j_0$ is about how big you expect the jump to be, and $\gamma$ is how much you penalize it if the jump is not that size.

Choose some reasonable-ish parameters $\alpha,j_0,\gamma$, replace $L^2(0,1)$ by a space of continuous piecewise linear functions on a fine enough grid and the Laplacian by a second finite difference, and solve the above minimization problem with your favorite optimization method/software.
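One concrete way to carry out this discretization: for a fixed jump location $p$, the minimization over $(f,j)$ is linear least squares, so a grid search over $p$ solves the whole problem. The sketch below is one such solver with made-up parameters $\alpha, \gamma, j_0$ and synthetic data; since the question asks for a drop rather than an added jump, I leave $j$ unconstrained and expect it to come out negative:

```python
import numpy as np

def detect_jump(g, alpha=1.0, gamma=1e-3, j0=0.0):
    """Grid search over the jump location p.  For each fixed p,
    ||f + j*h_p - g||^2 + alpha*||D2 f||^2 + gamma*|j - j0|^2
    is linear least squares in (f, j), solved via a stacked system."""
    n = len(g)
    # second finite-difference operator standing in for the Laplacian
    D2 = np.zeros((n - 2, n))
    for i in range(n - 2):
        D2[i, i:i + 3] = [1.0, -2.0, 1.0]
    best = (np.inf, None, None, None)
    for p in range(1, n - 1):
        h = (np.arange(n) >= p).astype(float)       # unit step at index p
        A = np.zeros((2 * n - 1, n + 1))
        A[:n, :n] = np.eye(n)                       # misfit term: f ...
        A[:n, n] = h                                # ... plus j * h_p
        A[n:2 * n - 2, :n] = np.sqrt(alpha) * D2    # smoothness penalty
        A[-1, n] = np.sqrt(gamma)                   # jump-size prior
        b = np.concatenate([g, np.zeros(n - 2), [np.sqrt(gamma) * j0]])
        z, *_ = np.linalg.lstsq(A, b, rcond=None)
        r = A @ z - b
        obj = float(r @ r)                          # regularized objective
        if obj < best[0]:
            best = (obj, p, z[n], z[:n])
    return best                                     # (objective, p*, j*, f*)

# smooth background with a drop of 5 at index 30
x = np.arange(50)
g = 10.0 * np.log(x + 1.0)
g[30:] -= 5.0
obj, p_star, j_star, f_star = detect_jump(g)
```

Solving the $(f,j)$ subproblem in closed form and scanning $p$ sidesteps the non-convexity in $p$; for long signals, a general-purpose optimizer over all three unknowns, as the answer suggests, would scale better.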