I have a filter graph with a custom source filter that delivers uncompressed RGB24 bitmap data and uncompressed PCM audio through two output pins. The graph compresses the data using arbitrary compression filters and saves it to a file using an arbitrary writer filter. Internally, each frame is generated together with a sound sample that carries the same time stamp. Whenever a frame sample is delivered, the corresponding sound sample is buffered, and vice versa. A frame and its sound sample must be generated together; they cannot be produced independently.
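Roughly, the two pins share a generator and two queues along these lines (this is a simplified sketch, not my actual code: the identifiers are placeholders, and the DirectShow boilerplate such as constructors, GetMediaType and DecideBufferSize is left out):

    // Simplified sketch; assumes the DirectShow base classes from <streams.h>
    // (CSourceStream, CCritSec, CAutoLock). All names are placeholders.
    #include <streams.h>
    #include <deque>
    #include <vector>

    struct GeneratedSample {
        std::vector<BYTE> data;
        REFERENCE_TIME    rtStart, rtStop;
    };

    struct SharedState {
        CCritSec lock;
        std::deque<GeneratedSample> frames;  // frames waiting for the frame pin
        std::deque<GeneratedSample> sounds;  // sound waiting for the sound pin
        void GeneratePair();                 // creates one frame and its sound sample
                                             // together, with equal time stamps, and
                                             // appends each to its queue
    };

    class CFramePin : public CSourceStream {
    public:
        SharedState* m_shared;
        // constructor, GetMediaType, DecideBufferSize omitted

        HRESULT FillBuffer(IMediaSample* pSample) {
            CAutoLock hold(&m_shared->lock);
            if (m_shared->frames.empty())
                m_shared->GeneratePair();    // also queues the paired sound sample
            GeneratedSample s = m_shared->frames.front();
            m_shared->frames.pop_front();
            // copy s.data into pSample and stamp it via pSample->SetTime(...)
            return S_OK;
        }
    };
    // CSoundPin, the audio pin, is declared analogously; its FillBuffer is shown below.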
In certain graph configurations the following behaviour occurs:
When the sound pin's FillBuffer method is called and no sound sample is buffered, the pin is forced to generate a frame/sound pair just to deliver the sound. This happens constantly while the frame pin is blocked, so every frame that gets generated is buffered and memory usage keeps growing.
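With CSourceStream the frame pin's worker thread blocks outside FillBuffer (while waiting in GetDeliveryBuffer or Deliver inside DoBufferProcessingLoop), so the shared lock stays free and the sound pin keeps being serviced. In terms of the simplified sketch above:

    // Continues the placeholder names from the sketch above.
    HRESULT CSoundPin::FillBuffer(IMediaSample* pSample) {
        CAutoLock hold(&m_shared->lock);
        if (m_shared->sounds.empty())
            m_shared->GeneratePair();     // also pushes a frame into m_shared->frames,
                                          // which nothing is draining right now
        GeneratedSample s = m_shared->sounds.front();
        m_shared->sounds.pop_front();
        // copy s.data into pSample ...
        // m_shared->frames grows for as long as the frame pin stays blocked
        return S_OK;
    }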
I have observed this in a graph that compresses the samples with DivX 6.0 (video) and an ISO MP3 codec (audio) and writes the result to an AVI file. With uncompressed audio or Microsoft's ADPCM compressor, the encoding process runs at a constant (low) memory usage.
How can I keep the data flow on both pins of my source filter in step, with the least possible need to buffer samples?