Greetings,

I have two devices that I would like to connect over a serial interface, but they have incompatible connections. To get around this problem, I connected them both to my PC and I'm working on a C# program that will route traffic on COM port X to COM port Y and vice versa.

The program connects to two COM ports. In the data received event handler, I read in incoming data and write it to the other COM port. To do this, I have the following code:

    private void HandleDataReceived(SerialPort inPort, SerialPort outPort)
    {
        byte[] data = new byte[1];

        while (inPort.BytesToRead > 0)
        {
            // Read the data
            data[0] = (byte)inPort.ReadByte();

            // Write the data
            if (outPort.IsOpen)
            {
                outPort.Write(data, 0, 1);
            }
        }
    }
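For context, I attach the handler to both ports along roughly these lines (the port names and baud rates here are just placeholders, not my actual configuration):

```csharp
using System;
using System.IO.Ports;

// Placeholder names/rates; HandleDataReceived is the method above.
SerialPort portX = new SerialPort("COM1", 115200);
SerialPort portY = new SerialPort("COM2", 9600);

// Bridge traffic in both directions.
portX.DataReceived += (s, e) => HandleDataReceived(portX, portY);
portY.DataReceived += (s, e) => HandleDataReceived(portY, portX);

portX.Open();
portY.Open();
```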

That code worked fine as long as the outgoing COM port operated at a higher baud rate than the incoming COM port. If the incoming COM port was faster than the outgoing COM port, I started missing data. I had to correct the code like this:

    private void HandleDataReceived(SerialPort inPort, SerialPort outPort)
    {
        byte[] data = new byte[1];

        while (inPort.BytesToRead > 0)
        {
            // Read the data
            data[0] = (byte)inPort.ReadByte();

            // Write the data
            if (outPort.IsOpen)
            {
                outPort.Write(data, 0, 1);
                while (outPort.BytesToWrite > 0);  // <-- busy-wait until the write buffer drains; fixes the problem
            }
        }
    }

I don't understand why I need that fix. I'm new to C# (this is my first program), so I'm wondering if there is something I'm missing. The SerialPort defaults to a 2048-byte write buffer and my commands are less than ten bytes, so the write buffer should easily hold the data until it can be written out to the slower COM port.

In summary, I'm receiving data on COM X and writing the data to COM Y. COM X is connected at a faster baud rate than COM Y. Why doesn't the buffering in the write buffer handle this difference? Why does it seem that I need to wait for the write buffer to drain to avoid losing data?

Thanks!

* Update *

As noted, this code can very easily run into an overflow condition with large and/or fast incoming data transfers. I should have written more about my data stream. I'm expecting < 10 byte commands (with < 10 byte responses) at 10 Hz. In addition, I'm seeing failures on the first command.

So while I know this code does not scale and is less than optimal, I'm wondering why the 2-4K read/write buffers couldn't even handle the first command. I'm wondering if there is a bug with writing a single byte of data or something with the event handler that I don't understand. Thanks.

* Update *

Here's an example of the failure:

Let's say my command is four bytes: 0x01 0x02 0x03 0x04. The device on COM X sends the command. I can see the C# program receiving four bytes and sending them on to the device on COM Y. The device on COM Y receives only two bytes: 0x01 0x03. I know the device on COM Y is reliable, so I'm wondering how the other two bytes were dropped.

By the way, can someone let me know if it's better to just reply to answers with comments or if I should keep editing the original question? Which is more helpful?

A: 

You should make sure that outPort.WriteBufferSize is bigger than the biggest buffer you expect to send. Also, reading and writing one byte at a time in a loop is generally going to be slow. If you change your handler to something like:

    const int NumBytes = 20;  // or whatever makes sense
    byte[] data = new byte[NumBytes];
    int count;

    while (inPort.BytesToRead > 0)
    {
        // Read as much data as possible at once
        count = inPort.Read(data, 0, Math.Min(NumBytes, inPort.BytesToRead));

        // Write the data
        if (outPort.IsOpen)
        {
            outPort.Write(data, 0, count);
        }
    }

this will reduce the overhead, which ought to help. The Write buffer will then (hopefully) handle the timing as you expect.

mtrw
+2  A: 

What you are trying to do is equivalent to drinking from a fire hose: you are relying on the receive buffer to store the water, and it isn't going to last long when somebody doesn't turn the tap off. With your workaround, you are guaranteeing that the receive buffer will overflow silently; you probably didn't implement the ErrorReceived event.
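A minimal sketch of hooking that event so overflows are no longer silent, assuming `port` is one of your open SerialPort instances:

```csharp
// Log driver-reported serial errors; attach this to both ports of the bridge.
port.ErrorReceived += (sender, e) =>
{
    // SerialError.RXOver  = input buffer overflow in the driver
    // SerialError.Overrun = character-buffer overrun on the UART itself
    Console.WriteLine("Serial error on " + ((SerialPort)sender).PortName
                      + ": " + e.EventType);
};
```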

To make this work, you'll have to tell the input device to stop sending when the buffer is full. Do that by setting the Handshake property. Set it to Handshake.RequestToSend first. Use XOnXOff next. It depends on the device whether it will use the handshake signals properly.
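A rough sketch of configuring this, with a placeholder port name and baud rate:

```csharp
using System.IO.Ports;

// Ask the sending device to pause when our buffers fill up.
SerialPort inPort = new SerialPort("COM1", 115200);
inPort.Handshake = Handshake.RequestToSend;   // hardware RTS/CTS flow control first
// If the device ignores RTS/CTS, try software flow control instead:
// inPort.Handshake = Handshake.XOnXOff;
inPort.Open();
```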

Use the Read() method to make this a bit more efficient.


Okay, not a fire hose. I can think of only one other possibility: a common problem with early UART chip designs was an on-chip receive buffer that could store only one byte, which required the interrupt service routine to read that byte before the next one arrived. If the ISR isn't quick enough, the chip turns on the SerialError.Overrun state and the byte is irretrievably lost.

A workaround for this issue was to artificially put a delay between each transmitted byte, giving the ISR in the device more time to read the byte. That is what your workaround code does, as a side effect.

It is not a great explanation; modern chip designs have a FIFO buffer that's at least 8 bytes deep. If there is any truth to this at all, you should see the problem disappear when you lower the baud rate. Also, using Read() instead of ReadByte() should make the problem worse, since your Write() call can now transmit more than one byte at a time, eliminating the inter-character delay. To be clear, I'm talking about the output device.

Hans Passant
Yes, you are right that this code is like drinking from a fire hose. Unfortunately, I should have written more about the hose connected to this code. I'm sending < 10 byte commands (with < 10 byte responses) at 10 Hz. I would expect 2-4K read/write buffers to be more than sufficient. In addition, I'm seeing failures on the first command. So while I know this code does not scale to larger/faster data transfers, I'm curious why it didn't work with small/slow transfers. Sorry for not being more clear.
GrandAdmiral
Okay, ought to work. Please be explicit about "I'm seeing failures".
Hans Passant
Sure. Let's say my command is four bytes: 0x01 0x02 0x03 0x04. The device on COM X sends the command. I can see the C# program receiving four bytes and sending them on to the device on COM Y. The device on COM Y receives only two bytes: 0x01 0x03. I know the device on COM Y is reliable, so I'm wondering how the two bytes were dropped.
GrandAdmiral
@Grand: I updated my answer.
Hans Passant
I wondered about a problem with the UART chip. It would have to be on the PC side, because both connected devices have 2K buffers. The other oddity is that the missed bytes are VERY consistent: the device receives the first byte and every other byte after that. If I send 0x01 0x02 0x03 0x04 0x05, the device receives 0x01 0x03 0x05. I would expect to see less consistent results with an overrun condition, or something like: 0x02, 0x04, 0x05.
GrandAdmiral
I should mention that, when I recreated the above example, the system worked without the workaround a few times. Then it started failing and continued to fail. So I now have evidence that it works without waiting for the write buffer to drain, but then it starts failing and continues to fail. I have no idea if that helps solve the problem or not. Hmmmm....
GrandAdmiral
Beware that there are two overrun conditions: the buffer on the chip and the buffer in the driver. I'm talking about the first one, so a 2 KB buffer size doesn't matter. You are losing the "right" bytes; ISR timing is usually pretty consistent. The "started failing" angle might be associated with changing the baud rate.
Hans Passant
Ah, maybe that is it then. Strange. I'll admit I was sort of hoping it was something to do with C#. I figure once I learned what was going on, I wouldn't run into the problem again (and I would learn more about the language). :-)
GrandAdmiral