I would like to compute the settling time of a signal y in MATLAB: the amount of time required before the error $|y(t)-y_{ss}|$ becomes smaller than some absolute value x and stays smaller than x for all future times.
I already tried the MATLAB function stepinfo, but it defines the threshold x as "a fraction 2% of their peak value for all future times", i.e. a relative tolerance, and that is not what I want.
Is there a way to determine this analytically, so that I could code it myself in MATLAB?
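For reference, here is a sketch of the direct approach I have in mind, assuming t and y are vectors of sample times and signal values, x is the absolute tolerance, and the last sample is taken as an estimate of $y_{ss}$ (all of these names are my own):

```matlab
% Settling time with an absolute tolerance x (sketch, not tested)
yss = y(end);                  % assume the final sample approximates y_ss
err = abs(y - yss);            % absolute error at every sample
outside = find(err > x);      % indices where |y(t) - y_ss| exceeds x

if isempty(outside)
    ts = t(1);                 % already within tolerance from the start
elseif outside(end) == numel(y)
    ts = NaN;                  % never settles within the recorded data
else
    ts = t(outside(end) + 1);  % first time after the last excursion
end
```

The idea is simply that the settling time is the first sample time after the *last* index at which the error exceeds x; everything after that index stays within the band by construction.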