I want to render the full spectrum of a song from its complete FFT representation.
Using the BASS audio library, I create a decoding stream, get its length, and then retrieve all the FFT samples. My first prototype seemed to work:
Decoder := BASS_StreamCreateFile(FALSE, PChar(fn), 0, 0, BASS_STREAM_DECODE);
SongLen := BASS_ChannelGetLength(Decoder, BASS_POS_BYTE);
TotRead := 0;
repeat
  NRead := BASS_ChannelGetData(Decoder, @FftBuf, BASS_DATA_FFT1024);
  if NRead > 0 then            // -1 signals an error or end of stream; don't accumulate it
    TotRead := TotRead + NRead;
until NRead <= 0;
But after running it, I realized that TotRead, accumulated from the NRead values returned by BASS_ChannelGetData(), was consistently twice the SongLen returned by BASS_ChannelGetLength() (in every test song I tried, and with different FFT sizes).

I didn't expect this. I expected the value returned by BASS_ChannelGetLength() to match the total accumulated from the BASS_ChannelGetData() calls, since the documentation says: "When requesting FFT data, the number of bytes read from the channel (to perform the FFT) is returned."
I need to understand what is happening here. The reason I ask is that, in order to store the complete spectrum, I must derive from the song length the number of FFT frames I will need to allocate.
Could you explain this discrepancy? Thanks.
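To make the allocation problem concrete, this is the kind of back-of-the-envelope calculation I am trying to do (Python just for the arithmetic; every number below is made up for illustration, and the bytes-consumed-per-call figure is exactly what the observed 2x discrepancy makes uncertain):

```python
import math

# All values are illustrative assumptions, not measurements from BASS.
song_len_bytes = 10_000_000   # what BASS_ChannelGetLength would report
bytes_per_sample = 2          # assuming 16-bit samples (no BASS_SAMPLE_FLOAT)
channels = 2                  # assuming a stereo stream
samples_per_fft = 1024        # window size implied by BASS_DATA_FFT1024

# Assumed bytes consumed from the channel by one BASS_ChannelGetData FFT call.
bytes_per_fft_call = samples_per_fft * channels * bytes_per_sample

# Number of FFT frames I would allocate for the whole song.
n_frames = math.ceil(song_len_bytes / bytes_per_fft_call)
print(n_frames)  # 2442 with the illustrative numbers above
```

If the per-call byte count is really double what I assume here, my allocation would be off by a factor of two, which is why I need to pin down where the extra bytes come from.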