I have a signal composed of the sum of a set of sine waves of different frequencies. The amplitude of each of these sub-signals can change a certain number of times per second.
I have been told that, if I want to retain the ability to distinguish each of the frequencies, the time-frequency uncertainty principle imposes a limiting relationship between the duration of the time window between amplitude changes and the smallest interval between frequencies.
I found a website which seems to deal with the problem, but as a non-mathematician, I'm not sure how to utilise the formulas it shows. To be honest, I'm not even sure if it's relevant.
My question, then: what is the relationship between the duration of that time window and the smallest distinguishable frequency interval?
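To make my setup concrete, here is a rough Python sketch of the kind of signal I mean. All the numbers (sample rate, window length, frequencies) are placeholders I made up, and the comments reflect my possibly-wrong understanding of how the window length relates to frequency resolution:

```python
import numpy as np

# Placeholder parameters for the kind of signal I'm describing.
fs = 8000.0              # sample rate in Hz
T = 0.05                 # seconds between amplitude changes (the "time window")
f1, f2 = 1000.0, 1020.0  # two component frequencies, 20 Hz apart
t = np.arange(0, T, 1.0 / fs)

# One window's worth of signal: two sines whose amplitudes are constant
# within this window but would change at the next window boundary.
a1, a2 = 1.0, 0.7
x = a1 * np.sin(2 * np.pi * f1 * t) + a2 * np.sin(2 * np.pi * f2 * t)

# Spectrum of just this window: the FFT bin spacing works out to 1/T,
# so (if I understand correctly) frequencies closer together than
# roughly 1/T = 20 Hz start to blur into each other.
spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
print("bin spacing:", freqs[1], "Hz")            # 1/T = 20 Hz here
peak_bins = freqs[np.argsort(spectrum)[-2:]]
print("two largest peaks at:", sorted(peak_bins), "Hz")
```

Is that 1/T spacing the relationship I've been told about, or is the true limit something different?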