Greetings,
I have two devices that I would like to connect over a serial interface, but they have incompatible connections. To get around this problem, I connected them both to my PC and I'm working on a C# program that will route traffic on COM port X to COM port Y and vice versa.
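For reference, the two ports and their handlers are wired up roughly like this (the port names and baud rates here are placeholders for illustration, not my actual settings):

using System.IO.Ports;

private SerialPort portX;
private SerialPort portY;

private void OpenPorts()
{
    // Placeholder names and rates; COM X is the faster side in my setup.
    portX = new SerialPort("COM1", 115200);
    portY = new SerialPort("COM2", 9600);

    // Forward traffic in both directions through the same handler.
    portX.DataReceived += (s, e) => HandleDataReceived(portX, portY);
    portY.DataReceived += (s, e) => HandleDataReceived(portY, portX);

    portX.Open();
    portY.Open();
}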
The program connects to both COM ports. In each port's DataReceived event handler, I read the incoming data and write it to the other port. To do this, I have the following code:
private void HandleDataReceived(SerialPort inPort, SerialPort outPort)
{
    byte[] data = new byte[1];
    while (inPort.BytesToRead > 0)
    {
        // Read the data
        data[0] = (byte)inPort.ReadByte();

        // Write the data
        if (outPort.IsOpen)
        {
            outPort.Write(data, 0, 1);
        }
    }
}
That code worked fine as long as the outgoing COM port operated at a higher baud rate than the incoming COM port. If the incoming COM port was faster than the outgoing COM port, I started missing data. I had to correct the code like this:
private void HandleDataReceived(SerialPort inPort, SerialPort outPort)
{
    byte[] data = new byte[1];
    while (inPort.BytesToRead > 0)
    {
        // Read the data
        data[0] = (byte)inPort.ReadByte();

        // Write the data
        if (outPort.IsOpen)
        {
            outPort.Write(data, 0, 1);
            while (outPort.BytesToWrite > 0) ; // <-- Change to fix problem
        }
    }
}
I don't understand why I need that fix. I'm new to C# (this is my first program), so I'm wondering if there is something I'm missing. The SerialPort defaults to a 2048-byte write buffer, and my commands are less than ten bytes; the write buffer should be able to hold the data until it can be drained to the slower COM port.
In summary, I'm receiving data on COM X and writing it to COM Y, where COM X is connected at a faster baud rate than COM Y. Why doesn't the write buffer absorb that difference? Why do I apparently need to wait for the write buffer to drain to avoid losing data?
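For what it's worth, here is a chunked variant I could switch to if the byte-at-a-time loop turns out to be the problem (just a sketch, untested; it reads everything available in one call and forwards it with a single Write):

private void HandleDataReceivedChunked(SerialPort inPort, SerialPort outPort)
{
    int available = inPort.BytesToRead;
    if (available <= 0)
        return;

    // Read whatever has arrived in one call; Read may return fewer
    // bytes than requested, so use its return value.
    byte[] buffer = new byte[available];
    int read = inPort.Read(buffer, 0, available);

    if (outPort.IsOpen && read > 0)
    {
        // A single Write queues the whole chunk into the
        // (default 2048-byte) write buffer.
        outPort.Write(buffer, 0, read);
    }
}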
Thanks!
* Update *
As noted, this code can very easily run into an overflow condition with large and/or fast incoming data transfers. I should have written more about my data stream. I'm expecting < 10 byte commands (with < 10 byte responses) at 10 Hz. In addition, I'm seeing failures on the first command.
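To put rough numbers on it (the baud rates below are assumptions for illustration, since I haven't listed mine; 8N1 framing puts 10 bits on the wire per data byte), even the slower port should move a 10-byte command in about 10 ms, well under the 100 ms between commands at 10 Hz:

// Back-of-the-envelope timing check; 9600 baud is an assumed rate for the slow port.
const int slowBaud = 9600;
const int bitsOnWirePerByte = 10;   // 1 start + 8 data + 1 stop (8N1)
const int commandBytes = 10;

double msPerByte = 1000.0 * bitsOnWirePerByte / slowBaud; // ~1.04 ms
double msPerCommand = msPerByte * commandBytes;           // ~10.4 ms
double msBetweenCommands = 1000.0 / 10;                   // 10 Hz -> 100 ms

Console.WriteLine($"{msPerCommand:F1} ms to transmit vs {msBetweenCommands:F0} ms available");

So raw throughput shouldn't be the issue, which is why the dropped bytes surprise me.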
So while I know this code does not scale and is far from optimal, I don't understand why the 2-4 KB read/write buffers couldn't handle even the first command. Is there a bug in writing a single byte at a time, or something about the event handler that I'm missing? Thanks.
* Update *
Here's an example of the failure:
Let's say my command is four bytes: 0x01 0x02 0x03 0x04. The device on COM X sends the command. I can see the C# program receive all four bytes and forward them to the device on COM Y. Yet the device on COM Y receives only two bytes: 0x01 0x03. I know the device on COM Y is reliable, so I'm wondering how two of the bytes were dropped.
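(For reference, I'm observing the traffic with something like the hypothetical helper below, called from HandleDataReceived; it's simplified, not my exact code:)

// Hypothetical logging helper, for illustration only.
private void LogBytes(string direction, byte[] data, int count)
{
    // BitConverter.ToString renders the bytes as e.g. "01-02-03-04".
    Console.WriteLine($"{direction}: {BitConverter.ToString(data, 0, count)}");
}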
By the way, can someone let me know if it's better to just reply to answers with comments or if I should keep editing the original question? Which is more helpful?