Hello,
I already know how to normalize absolute power to get relative power in the case of a 1-dimensional, FFT-based frequency transform: one simply takes the area under the power spectral density curve and then normalizes the power at each frequency bin by this integral. However, the case of a 2-dimensional time-frequency representation (TFR) based on a Morlet wavelet transform seems more complicated.
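Just to make sure we are talking about the same thing, here is roughly what I mean in the 1-D case (a minimal Python sketch; the sampling rate, Welch parameters, and random signal are only placeholders):

import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

sfreq = 250.0                              # placeholder sampling rate (Hz)
x = np.random.randn(60 * int(sfreq))       # stand-in for one EEG channel

freqs, psd = welch(x, fs=sfreq, nperseg=int(4 * sfreq))  # absolute PSD
total = trapezoid(psd, freqs)              # area under the PSD curve
rel_psd = psd / total                      # relative power (integrates to ~1)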
I have data from one EEG channel for which I compute the TFR using log-spaced Morlet wavelets (1-45 Hz, 8 wavelets per octave). Note that these are not event-related data, but I still want to resolve power in both the time and frequency dimensions. In the raw data, the overall amplitude of the signal differs between recordings for technical reasons that would be a bit complicated to get into here; the bottom line is that I want to correct for these overall differences using relative power.
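For reference, the TFR computation looks something like the sketch below. I am showing MNE-Python's tfr_array_morlet only as an example of the general approach; the sampling rate, n_cycles, and the random data are placeholder assumptions, not my actual settings:

import numpy as np
from mne.time_frequency import tfr_array_morlet

sfreq = 250.0                                   # placeholder sampling rate (Hz)
x = np.random.randn(60 * int(sfreq))            # one continuous EEG channel

# log-spaced frequencies: 1-45 Hz, 8 wavelets per octave
freqs = 2.0 ** np.arange(0.0, np.log2(45.0), 1.0 / 8.0)

data = x[np.newaxis, np.newaxis, :]             # (n_epochs, n_channels, n_times)
power = tfr_array_morlet(data, sfreq=sfreq, freqs=freqs,
                         n_cycles=7.0,          # placeholder choice
                         output='power')[0, 0]  # -> (n_freqs, n_times)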
I see two possible solutions. Should I A) z-score each signal before doing the wavelet transform (but this assumes normally distributed amplitudes), or B) integrate power with respect to frequency at each time bin of the TFR and then somehow average all of these integrals together to create a normalization factor for the TFR (but then it is unclear how to do the averaging; there are probably several different approaches that could give different results)? To me, A seems to be the most straightforward, especially because I don’t really care if my signals get converted to arbitrary units; the only caveat is that it assumes a normal amplitude distribution.
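In code, the two options would look roughly like this (everything here is illustrative, in particular the plain time average in option B, which is just one of the possible averaging choices I mentioned):

import numpy as np
from scipy.stats import zscore
from scipy.integrate import trapezoid

x = np.random.randn(15000)                       # raw signal (placeholder)
freqs = 2.0 ** np.arange(0.0, np.log2(45.0), 1.0 / 8.0)
power = np.random.rand(len(freqs), x.size)       # stand-in TFR, (n_freqs, n_times)

# Option A: z-score the raw signal, then compute the TFR on x_z instead of x
x_z = zscore(x)

# Option B: integrate power over frequency at each time bin, then collapse
# those per-time integrals into one normalization factor; a plain mean over
# time is shown here, but other averages would give different results
per_time_total = trapezoid(power, freqs, axis=0)  # one integral per time bin
norm_factor = per_time_total.mean()
power_rel = power / norm_factor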
What do you think?
Thank you!
-Joel