I would appreciate clarification of a definition.
If a numerical method is "unstable", does that mean that introducing a small random error at one of the steps causes the error to be greatly magnified over the subsequent steps? Is this true of all unstable algorithms, or are there some where the introduced error never becomes significant, say relative to the error of the method itself?
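To make the question concrete, here is a small sketch of the kind of behaviour I have in mind (I'm assuming the textbook forward recurrence I_n = 1 - n*I_{n-1} for the integrals I_n = ∫_0^1 x^n e^(x-1) dx is a fair example of an unstable method):

```python
import math

# Analytically, I_n = integral_0^1 x^n * e^(x-1) dx satisfies
# I_n = 1 - n * I_{n-1}, with I_0 = 1 - 1/e, and 0 < I_n < 1 for all n.
I = 1.0 - 1.0 / math.e          # I_0, already carries a tiny rounding error
for n in range(1, 21):
    I = 1.0 - n * I             # any error in I_{n-1} gets multiplied by n here
    print(f"I_{n:2d} = {I: .6e}")

# By around n = 18-20 the computed values blow up and even change sign,
# even though every true I_n lies strictly between 0 and 1.
```

Here the only "error introduced" is the rounding of I_0, yet it ends up dominating the result completely. Is that amplification what "unstable" always refers to, or can a method be called unstable even when such an error stays negligible compared to the method's own error?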