As a warning, I'm not super savvy with DSP, so this may be a silly question. I've got a callback method in an application that takes a float buffer, writes it to a queue, and eventually writes it out to a file. The callback method looks like this:

FMOD_RESULT F_CALLBACK
dspCallback (FMOD_DSP_STATE *dsp_state, float *inbuffer, float *outbuffer,
             unsigned int length, int inchannels, int outchannels)
{
  if (isPlaying)
  {
    // length is in sample frames; I'm assuming 2 interleaved channels here.
    unsigned char *tmp = new unsigned char[length * 2];
    for (unsigned int a = 0; a < length * 2; a++)
    {
      // Map each float sample (nominally -1.0..1.0) into 0..255 for unsigned 8-bit PCM.
      tmp[a] = (unsigned char)((inbuffer[a] * 128) + 127);
    }
    wavData.push(tmp);
  }
  return FMOD_OK;
}

I realize there's maybe some sub-optimal thinking in there, but the big question I'm curious about is this: the code works as expected on OSX 10.6, producing an 8-bit PCM file that I can read in Audacity (for instance) and run through LAME to turn into an MP3. On XP, where the application needs to run, it produces noise that, even when brought back over to OSX and imported into Audacity, is just high-frequency oscillation. Both are 32-bit, little-endian systems, so my thinking was that any buffer of unsigned chars should be able to be written to a binary file and treated as 8-bit PCM data. Any advice, or is there something I really need to clarify? Much appreciated!
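
For context, the file-writing side is nothing fancy; it just drains the queue and dumps the raw bytes to disk. Here's a minimal sketch of that step (the writeQueueToFile name, the std::queue type, and the bufferBytes parameter are placeholders for illustration, not my exact code):

#include <cstdio>
#include <cstddef>
#include <queue>

// Sketch: drain the queued 8-bit sample buffers into a raw (headerless) PCM file.
// bufferBytes stands in for however many bytes each queued block holds
// (length * channels from the callback).
void writeQueueToFile (std::queue<unsigned char*> &wavData, const char *path,
                       size_t bufferBytes)
{
  FILE *out = fopen(path, "wb");   // "wb": binary mode, so Windows does no newline translation
  if (!out) return;

  while (!wavData.empty())
  {
    unsigned char *block = wavData.front();
    wavData.pop();
    fwrite(block, 1, bufferBytes, out);  // raw unsigned 8-bit PCM, no header
    delete[] block;
  }
  fclose(out);
}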