Hi everyone,
I'm working on a project sending serial data to control animation of LED lights, which need to stay in sync with an animation engine. There seems to be a large serial write buffer (OSX (POSIX) + FTDI chipset usb serial device), so without manually throttling calls to write(), the software can get several seconds ahead of the lights.
Currently I'm manually restricting the serial write speed to the baudrate (8N1 = 10 bits on the wire per data byte, so 19200 bps -> 1920 bytes per second max), but the animation drifts out of sync with the lights over time: it starts fine, but after 10 minutes there's a noticeable (100ms+) lag between the animation and the lights.
This is the code that restricts the serial write speed (called once per animation frame; 'elapsed' is the duration of the current frame in seconds, 'baudrate' is the bps (19200)):
void BufferedSerial::update( float elapsed )
{
    baud_timer += elapsed;
    if ( bytes_written > 1024 )
    {
        // maintain baudrate: 10 bits on the wire per byte (8N1)
        float time_should_have_taken = (float(bytes_written)*10)/float(baudrate);
        float time_actually_took = baud_timer;
        // sleep if our write() calls are > 20ms ahead of the serial transmit
        if ( time_should_have_taken-time_actually_took > 0.02f )
        {
            float sleep_time = time_should_have_taken - time_actually_took;
            int sleep_time_us = int( sleep_time*1000.0f*1000.0f );
            //printf("BufferedSerial::update sleeping %i ms\n", sleep_time_us/1000 );
            delayUs( sleep_time_us );
            // forget 128 bytes of history
            bytes_written -= 128;
            // and the time it should have taken to write those 128 bytes
            baud_timer -= (float(128)*10)/float(baudrate);
        }
    }
}
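For comparison, one way to make this kind of throttle immune to cumulative drift is to never reset the counters: track total bytes written and total wall-clock time since start, and derive the sleep from those absolutes each call, so an oversleep (usleep/delayUs sleeps *at least* the requested time) self-corrects on the next call instead of accumulating. A sketch of the idea; the function and parameter names are illustrative, not from the original code:

```cpp
#include <cstdint>

// Given the total bytes handed to write() since t = 0 and the current
// wall-clock time in seconds since t = 0, return how long to sleep (seconds)
// so the writer stays at most lead_s seconds ahead of the wire.
// Assumes 8N1 framing: 10 bits on the wire per byte.
double throttle_sleep_s( uint64_t total_bytes_written,
                         double   now_s,
                         double   baudrate,
                         double   lead_s = 0.02 )
{
    // time the wire needs to transmit everything written so far
    const double wire_time_s = ( double(total_bytes_written) * 10.0 ) / baudrate;
    // how far ahead of the wire our write() calls are
    const double ahead_s = wire_time_s - now_s;
    return ( ahead_s > lead_s ) ? ( ahead_s - lead_s ) : 0.0;
}
```

Because nothing is ever subtracted from the running totals, any individual sleep that runs long simply shows up as a smaller (or zero) sleep next time; float rounding is also less of an issue if the totals are kept in doubles.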
Clearly there's something wrong, somewhere.
A much better approach would be to query the number of bytes currently waiting in the transmit queue and keep that below a fixed threshold, but I can't figure out how to do that on an OSX (POSIX) system.
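One candidate, though I'm not certain the FTDI driver on OSX honours it, is the TIOCOUTQ ioctl, which on many POSIX systems reports the bytes still sitting in the kernel's output queue for a tty. A sketch (driver support is an assumption, hence the fallback):

```cpp
#include <sys/ioctl.h>
#include <fcntl.h>
#include <unistd.h>

// Return the number of bytes still waiting in the kernel's output queue
// for fd, or -1 if the ioctl isn't supported (driver-dependent).
int bytes_pending( int fd )
{
#ifdef TIOCOUTQ
    int pending = 0;
    if ( ioctl( fd, TIOCOUTQ, &pending ) == -1 )
        return -1;   // e.g. ENOTTY: not a tty, or driver doesn't implement it
    return pending;
#else
    (void)fd;
    return -1;
#endif
}
```

Two caveats: this only sees the kernel-side buffer, not whatever the FTDI chip itself has buffered or a USB transfer already in flight; and if you ever need a hard sync point, tcdrain(fd) blocks until the output has actually been transmitted.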
Any advice appreciated.