I understand from the MSDN docs that the DataReceived event will not necessarily fire once for each byte received.
But does anyone know exactly what mechanism causes the event to fire?
Does the receipt of each byte restart a timer that has to reach, say, 10 ms of silence before the event fires?
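The only related setting I can find on SerialPort is ReceivedBytesThreshold, which per MSDN is the minimum number of bytes that must be in the buffer before the event fires. It defaults to 1, and it says nothing about timing. Here's a minimal sketch of what I mean (the port name and baud rate are placeholders for my real settings):

```csharp
using System;
using System.IO.Ports;

class Demo
{
    static void Main()
    {
        // "COM1" and 9600 baud are placeholders for my real settings.
        var port = new SerialPort("COM1", 9600);

        // Per MSDN, the event won't fire until at least this many bytes
        // are available. The default is 1, so this is a lower bound on
        // bytes, not the inter-byte timer I'm really asking about.
        port.ReceivedBytesThreshold = 1;

        port.DataReceived += (sender, e) =>
            Console.WriteLine($"{port.BytesToRead} bytes available");

        port.Open();
        Console.ReadLine(); // keep the app alive while data arrives
    }
}
```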
I ask because I'm trying to write an app that reads XML data coming in from a serial port.
Because my laptop has no serial ports, I'm using a virtual serial port emulator. (I know, I know; I can't do anything about that at the moment.)
When I pass data through the emulated port to my app, the event fires once for each XML record (about 1,500 bytes), which is perfect. But when a colleague at another office tries it with two computers connected by an actual cable, DataReceived fires repeatedly, after every 10 or so bytes of XML, which completely throws off the app's parsing.
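In case it matters, here's roughly the shape of the fix I'm considering: accumulate everything into a buffer inside DataReceived and only hand a record to the XML parser once the closing tag has arrived. This is just a sketch; "</Record>" stands in for my actual root element, and the port settings are placeholders again:

```csharp
using System;
using System.IO.Ports;
using System.Text;

class XmlRecordReader
{
    // Stand-in for the real root element's closing tag.
    const string RecordTerminator = "</Record>";

    static readonly StringBuilder buffer = new StringBuilder();

    static void Main()
    {
        var port = new SerialPort("COM1", 9600); // placeholder settings
        port.DataReceived += OnDataReceived;
        port.Open();
        Console.ReadLine();
    }

    static void OnDataReceived(object sender, SerialDataReceivedEventArgs e)
    {
        var port = (SerialPort)sender;

        // Append whatever arrived, whether that's 10 bytes or 1,500.
        buffer.Append(port.ReadExisting());

        // Hand off complete records; keep any trailing partial record buffered.
        int end;
        while ((end = buffer.ToString().IndexOf(RecordTerminator, StringComparison.Ordinal)) >= 0)
        {
            int length = end + RecordTerminator.Length;
            string record = buffer.ToString(0, length);
            buffer.Remove(0, length);
            ProcessRecord(record); // parse the XML here
        }
    }

    static void ProcessRecord(string xml)
    {
        Console.WriteLine($"Got a complete record ({xml.Length} chars)");
    }
}
```

The idea is that it shouldn't matter whether the event fires once per record or once per 10 bytes, since the parser only ever sees complete records. But I'd still like to understand what actually triggers the event, so I know whether this is the right approach.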