I'm trying to implement a protocol over a serial port on a Windows (XP) machine. The problem is that message synchronization in the protocol is done via a gap between messages, i.e., an x-millisecond gap between sent bytes signifies a new message. Now, I don't know if it is even possible to accurately detect this gap.
I'm using the win32/serport.h API to read from the port in one of the many threads of our server. Data from the serial port gets buffered, so if there is enough latency in our software (and there will be enough), I will get multiple messages from the port buffer in one sequence of reads.

Is there a way of reading from the serial port so that I could detect gaps between the times particular bytes were received?

+2  A: 

If you want more control over a Windows serial port, you will have to write your own driver.

The problem I see is that Windows may be executing other tasks or programs (such as virus checking) which will cause timing issues for your application. Your application will not know when it has been swapped out for another application.

If possible, I suggest your program time stamp the end of the last message. When the next message arrives, another time stamp is taken. The difference between time stamps may help in detecting new messages.
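The time-stamping idea above can be sketched as plain C++. This is an illustrative splitter, not the poster's actual code: each chunk read from the port is stamped on arrival, and a new message starts whenever the gap since the previous chunk exceeds a threshold. All names here (`MessageSplitter`, `onBytes`) are made up for the example.

```cpp
#include <chrono>
#include <cstdint>
#include <vector>

using Clock = std::chrono::steady_clock;

// Accumulates incoming bytes into messages, starting a new message
// whenever the inter-chunk gap exceeds the configured threshold.
struct MessageSplitter {
    std::chrono::milliseconds gap;                 // inter-message gap threshold
    Clock::time_point last = Clock::time_point::min();
    std::vector<std::vector<uint8_t>> messages;

    void onBytes(const uint8_t* data, size_t n, Clock::time_point now) {
        if (messages.empty() || now - last > gap)
            messages.emplace_back();               // gap exceeded: new message
        messages.back().insert(messages.back().end(), data, data + n);
        last = now;
    }
};
```

Note the caveat in the answer still applies: the timestamps are taken in user space, so pre-emption by other processes can stretch an intra-message gap past the threshold and split a message spuriously.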

I highly suggest changing the protocol so that timing is not a factor.

Thomas Matthews
+1 changing the protocol, timing based protocols are a THOROUGHLY BAD IDEA in almost every circumstance imaginable
Autopulated
Unfortunately, changing the protocol is out of the question, as I have to interface with another supplier's device. About the driver: I was hoping the serial driver might already be timestamping the incoming bytes, and that there would be a way to get that information from it, but I can't find any references to that.
Marcin K
The default API does not timestamp; it simply writes to a buffer, and you can control that buffer's size, when to flush, et cetera. If you write a driver that overrides Microsoft's driver, you can gain more control and add timestamps to the data. Writing a driver takes a lot of learning and experimenting, though.
Dr. Watson
A: 

Are you saying that the protocol is a 1 ms gap? Nothing else delineates the messages, like STX, SOH, ...?

What is the device? Who is the manufacturer?

dbasnett
A: 

I've had to do something similar in the past. Although the protocol in question did not use any delimiter bytes, it did have a crc and a few fixed value bytes at certain positions so I could speculatively decode the message to determine if it was a complete individual message.

It always amazes me when I encounter these protocols that have no context information in them.

Look for crc fields, length fields, type fields with a corresponding indication of the expected message length or any other fixed offset fields with predictable values that could help you determine when you have a single complete message.
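As a minimal sketch of that speculative decoding, assume a hypothetical frame layout of [length byte][payload][8-bit checksum of the payload]; this layout and the function name `tryDecode` are invented for illustration, so substitute your protocol's real fields:

```cpp
#include <cstddef>
#include <cstdint>

// Returns the frame size if a complete, checksum-valid message starts at
// offset 0 of the buffer; returns 0 if the data is incomplete or invalid
// (in which case you wait for more bytes, or skip a byte and resync).
size_t tryDecode(const uint8_t* buf, size_t avail) {
    if (avail < 2) return 0;                  // need at least length + checksum
    size_t len = buf[0];                      // declared payload length
    size_t frame = 1 + len + 1;               // length byte + payload + checksum
    if (avail < frame) return 0;              // incomplete: wait for more data
    uint8_t sum = 0;
    for (size_t i = 1; i <= len; ++i) sum += buf[i];
    return (sum == buf[1 + len]) ? frame : 0; // checksum must match
}
```

On a valid return you consume `frame` bytes and try again at the new offset; on 0 with enough bytes available, you discard one byte and retry, which is the "speculative" part.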

Another approach might be to use the CreateFile, ReadFile and WriteFile API functions. There are settings you can change using the SetCommTimeouts function that allow you to halt the I/O operation when a certain time gap is encountered.

Doing that along with some speculative decoding could be your best bet.
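A hedged Win32 configuration sketch of the SetCommTimeouts idea (not the poster's actual code; "COM3" and the timeout values are placeholders for your own setup):

```cpp
#include <windows.h>

// ReadIntervalTimeout makes a ReadFile call complete once the line has been
// silent for that many milliseconds between bytes, which approximates the
// inter-message gap described in the question.
HANDLE openPortWithGapTimeout() {
    HANDLE h = CreateFileA("COM3", GENERIC_READ | GENERIC_WRITE,
                           0, NULL, OPEN_EXISTING, 0, NULL);
    if (h == INVALID_HANDLE_VALUE) return h;

    COMMTIMEOUTS to = {0};
    to.ReadIntervalTimeout = 5;         // ms of silence that ends a read
    to.ReadTotalTimeoutConstant = 100;  // cap so a read cannot block forever
    SetCommTimeouts(h, &to);
    return h;
}
```

A ReadFile on the returned handle then tends to stop at message boundaries, though, as discussed in the comments below, scheduling jitter makes this approximate rather than guaranteed.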

markh44
Thanks for the comments about speculative message detection; I thought this would be the only way to go, and if it worked in your situation it might work in mine. Especially, thanks for the hint about SetCommTimeouts; it seems that it might be the way to go to detect gaps between messages.
Marcin K
You can't, and won't be able to (without writing your own driver), detect timing gaps with any precision. You don't need to.
dbasnett
Whether the comm timeouts will help you depends on how the device operates. For example if the device sends a message every second, a time gap will work very well but if it sends a burst of messages a couple of milliseconds apart, you'll have trouble separating them using timeouts. In either case I would also use any information contained in the protocol to attempt to make certain the message is valid.
markh44
Not in Windows using the standard driver, not even at gaps of seconds; eventually it will fail. That is not how it works. See http://social.msdn.microsoft.com/Forums/en-US/vbgeneral/thread/a709d698-5099-4e37-9e10-f66ff22cdd1e
dbasnett
We'll have to disagree on that. The comm timeouts do work fairly well; I have direct experience of this and did quite a lot of testing with it. I would always prefer to use the protocol exclusively, but my point is you CAN use timeouts if you HAVE TO, though I wouldn't rely 100% on them. In fact, you can't rely on any one thing with serial, not even a CRC; you should still check all length fields even if there is a CRC, for example. The article you link to is talking about using serial ports with .NET, which isn't comparable to using the C++ Win32 API, where you have far greater control.
markh44
It doesn't sound like we disagree about timing, but I didn't see what you were doing. BTW, I updated this http://social.msdn.microsoft.com/Forums/en-US/vbgeneral/thread/a709d698-5099-4e37-9e10-f66ff22cdd1e to discuss the "protocol" and your observation of the 1-bit error. I credited you. Not being a C++ programmer, I don't have an opinion about what can and cannot be done in the API. I have done timings with my code and been fairly accurate; I just could not recommend it.
dbasnett
A: 

It sounds odd that there is no data format delineating a "message" from the device. Every serial port device I've worked with has had some form of header describing the data it transmitted.

Just throwing this out there, but could you use the Win32 asynchronous ReadFileEx() and WriteFileEx() system calls? They allow you to attach a callback function, and you might be able to manage a timer within the callback. The timer would only give you a rough estimate, however.

If you need to write your own driver, the Windows Driver Kit has a sample that shows how to write a serial port driver. I can't imagine that you'll be able to override the Windows serial port bus driver (the driver that directly controls the serial port on your Windows machine), but you might be able to write a driver that sits on top of the bus driver.

Dr. Watson
Rough estimation is the problem here: 1 ms is a very short time in terms of the system timer, AFAIK.
Marcin K
Are you using the default thread priority for the thread that captures the data? If you raise your thread priority to higher levels, you *might* get what you need. I've been able to capture interrupts from hardware devices arriving 40 us apart by making the data-capture thread higher priority than most of the threads running on the system.
Dr. Watson
A high-precision timer is available via QueryPerformanceCounter. However, I'm not sure whether this idea would work in practice.
markh44
A: 

I thought so. You all grew up with the web; I didn't, though I was present at the birth. Let me guess: the one byte is 1 (SOH) or 2 (STX)? IMVEO it is enough. You just need to think outside the box.

You receive message_delimiter, followed by 4 (as the length), and then 4 bytes of data. A valid message is not just those 6 bytes.

    message_delimiter - 1 byte
    4 - length - 1 byte
    (4 data bytes) - 4 bytes

A valid message is always bounded by the message_delimiter, so it would look like

    message_delimiter - 1 byte
    4 - length - 1 byte
    (4 data bytes) - 4 bytes
    message_delimiter - 1 byte
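A minimal sketch of parsing that delimiter-bounded layout, assuming a delimiter value of 0x01 as in the comments below (the value and the name `parseFrame` are illustrative, not from the actual protocol):

```cpp
#include <cstddef>
#include <cstdint>

const uint8_t DELIM = 0x01;  // assumed delimiter value for this example

// Returns true and sets msgLen when the buffer starts with a complete,
// well-formed frame: DELIM, length, <length> data bytes, DELIM.
bool parseFrame(const uint8_t* buf, size_t avail, size_t& msgLen) {
    if (avail < 3) return false;             // delimiter + length + delimiter
    if (buf[0] != DELIM) return false;       // frame must start at a delimiter
    size_t total = 2 + buf[1] + 1;           // delim + len byte + data + delim
    if (avail < total) return false;         // wait for the closing delimiter
    if (buf[total - 1] != DELIM) return false;
    msgLen = total;
    return true;
}
```

Note that when the trailing delimiter doubles as the next frame's leading delimiter, the caller should resume parsing at `msgLen - 1`, not `msgLen`.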
dbasnett
I am not a C++ programmer, but I have done serial ports for a long time. I have a standard set of code that I use that is capable of sending/receiving serial data at close to 1 Mbps and displaying it, without a timer.
dbasnett
+1 for IMVEO. And your solution is exactly right too.
mtrw
No, this is too simplistic. The OP says the delimiter only appears at the start of the message, so you have to wait for message 2 before you can process message 1. Also, just because you have a delimiter on either side doesn't mean the message is valid. Example: 01 04 11 11 01 ff, a message of 4 bytes containing the delimiter as part of the data. Imagine a flipped bit means you receive 01 02 11 11 01 ff. Now you think you have a 2-byte message followed by the start of another message of length 255. Timeouts would help you detect and reject such data.
markh44
@mark - this is why most of you have problems with the serial port and I don't. The givens in this case are delimiter, length, data. The example you gave is an error, and even with a proper implementation it would not be detected at that point. The error that would get detected is the next frame, which starts 01 ff when there is not a 01 coming 255 bytes later. Believe this: someone like me did not write this protocol. The simplest I would have done is delimiter, length, checksum. How many times / by how many people does it have to be said that timing can't be determined using the Windows drivers?
dbasnett
@mark - your point that the protocol is bad is valid, sorry if that didn't come through previously.
dbasnett