As I understand it, a signal may be thought of as a vector in 3D space, with the respective axes being time, amplitude, and frequency. Using a Fourier transform, one can convert a signal from a time-amplitude representation to a frequency-amplitude representation, and then back again using an inverse transform.
My question is: why don't we have a method for converting a signal from a time-amplitude representation to a time-frequency representation?
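A minimal sketch of the round trip described above (my own illustration using NumPy, not anything from the thread): a 5 Hz sine is taken from time-amplitude form to frequency-amplitude form with an FFT, then recovered with the inverse transform.

```python
import numpy as np

fs = 100                       # sample rate in Hz (assumed for this sketch)
t = np.arange(0, 1, 1 / fs)    # one second of time samples
x = np.sin(2 * np.pi * 5 * t)  # 5 Hz sine in time-amplitude form

X = np.fft.fft(x)              # frequency-amplitude representation
freqs = np.fft.fftfreq(len(x), d=1 / fs)

# The dominant bin of the (one-sided) spectrum sits at 5 Hz.
peak = freqs[np.argmax(np.abs(X[:len(X) // 2]))]

x_back = np.fft.ifft(X).real   # inverse transform recovers the signal
round_trip_error = np.max(np.abs(x - x_back))
print(peak)                    # → 5.0
```

Note that the frequency-amplitude view contains no time axis at all: the 5 Hz peak says nothing about *when* that frequency occurs, which is what the question is getting at.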
>>9109415
Basically, time is just inverse frequency (a period T corresponds to a frequency 1/T), so there's no problem converting between the two: the mapping between them is always continuous. If you convert frequency or time to amplitude, there's no guarantee the resulting function will be continuous.
Another way to look at it is that a time-frequency dependence implies the signal behaves differently in some time intervals than in others. We want the representation to treat the signal uniformly across all time.
>>9109440
>>9109415
I don't know the exact math on this, but I have done a lot of signal processing programming, and I think >>9109440 is on the right track.
You can't have a continuous frequency axis over time, because frequency is by definition occurrence over time. You can't attribute a frequency to a single point in time, because a point has no duration and therefore no frequency. You can only attribute frequencies to ranges of time.
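The point above can be demonstrated with a rough sketch (my own illustration; the window length and signal are assumptions). Slicing the signal into windows and taking an FFT of each window yields frequency estimates per time *range*, which is exactly the time-frequency picture the OP asked about (this is the idea behind the short-time Fourier transform / spectrogram):

```python
import numpy as np

fs = 1000  # sample rate in Hz (assumed)
t = np.arange(0, 2, 1 / fs)
# A signal whose frequency changes over time: 50 Hz for the first
# second, then 120 Hz for the second.
x = np.where(t < 1, np.sin(2 * np.pi * 50 * t), np.sin(2 * np.pi * 120 * t))

win = 200  # window length in samples: the "range of time" frequency needs
dominant = []
for start in range(0, len(x) - win + 1, win):
    seg = x[start:start + win]
    spectrum = np.abs(np.fft.rfft(seg))      # spectrum of this window only
    freqs = np.fft.rfftfreq(win, d=1 / fs)
    dominant.append(freqs[np.argmax(spectrum)])

print(dominant)  # → [50.0, 50.0, 50.0, 50.0, 50.0, 120.0, 120.0, 120.0, 120.0, 120.0]
```

Shrinking `win` toward a single sample makes the frequency estimate meaningless, which is the post's point: frequency resolution and time resolution trade off against each other, so you can localize frequency in time only down to a window, never to an instant.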