views:

1788

answers:

5

I've written a simple app in C# 2.0 using the .NET Framework 2.0 SerialPort class to communicate with a controller card via COM1.

A problem occurred recently where the bytes returned by the Read method are incorrect. It returned the right number of bytes, only the values were incorrect. A similar app written in Delphi still returned the correct values, though.

I used Portmon to log the activity on the serial port for both apps and compared the two logs. There were some (apparently) minor differences in the settings, and I tried to imitate the Delphi app as closely as possible, but to no avail.

So, what could affect the byte values returned by the Read method?

Most settings between the two apps are identical.

Here is a list of the lines which differed in the Portmon log:

Delphi App:

IOCTL_SERIAL_SET_CHAR Serial0 SUCCESS EOF:dc ERR:0 BRK:0 EVT:0 XON:11 XOFF:13
IOCTL_SERIAL_SET_HANDFLOW Serial0 SUCCESS Shake:0 Replace:0 XonLimit:256 XoffLimit:256
IOCTL_SERIAL_SET_TIMEOUTS Serial0 SUCCESS RI:-1 RM:100 RC:1000 WM:100 WC:1000
IOCTL_SERIAL_SET_WAIT_MASK Serial0 SUCCESS Mask: RXCHAR RXFLAG TXEMPTY CTS DSR RLSD BRK ERR RING RX80FULL

C# App:

IOCTL_SERIAL_SET_CHAR Serial0 SUCCESS EOF:1a ERR:0 BRK:0 EVT:1a XON:11 XOFF:13
IOCTL_SERIAL_SET_HANDFLOW Serial0 SUCCESS Shake:0 Replace:0 XonLimit:1024 XoffLimit:1024
IOCTL_SERIAL_SET_TIMEOUTS Serial0 SUCCESS RI:-1 RM:-1 RC:1000 WM:0 WC:1000
IOCTL_SERIAL_SET_WAIT_MASK Serial0 SUCCESS Mask: RXCHAR RXFLAG CTS DSR RLSD BRK ERR RING

UPDATE:

The correct bytes returned were: 91, 1, 1, 3, 48, 48, 50, 69, 66, 51, 70, 55, 52, 93 (14 bytes), the last value being a simple checksum.

The incorrect values returned were: 91, 241, 254, 252, 242, 146, 42, 201, 51, 70, 55, 52, 93 (13 bytes).

As you can see, the first byte and the last five bytes correspond.

The ErrorReceived event indicates that a framing error occurred, which could explain the incorrect values. But the question is: why would the SerialPort class encounter a framing error when the Delphi app apparently does not?
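For reference, this is how I detect the framing error. A minimal sketch (the port name is an assumption; the handler just logs the error type):

```csharp
using System;
using System.IO.Ports;

var port = new SerialPort("COM1"); // port name is an assumption

// Subscribe before opening the port; SerialError.Frame is raised when the
// hardware detects a framing error on an incoming byte.
port.ErrorReceived += (sender, e) =>
{
    if (e.EventType == SerialError.Frame)
        Console.WriteLine("Framing error detected");
};
port.Open();
```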

A: 

Have you checked the settings for number of data bits, stop bits and parity?

The parity bit is a kind of error detection mechanism. For instance: If you send using 7 data bits and one parity bit, the eighth bit will be used for detecting bit inversion errors. If the receiver expects 8 data bits and no parity bits, the result will be garbled.
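If those settings are off, the fix is to set them explicitly before opening the port. A minimal sketch (the port name and all values are assumptions; they must match the controller card's configuration):

```csharp
using System.IO.Ports;

var port = new SerialPort("COM1")
{
    BaudRate = 9600,          // must match the device
    DataBits = 8,             // 7 or 8, depending on the device
    Parity   = Parity.None,   // None, Odd, Even, Mark or Space
    StopBits = StopBits.One   // One, OnePointFive or Two
};
port.Open();
```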

norheim.se
A: 

Unfortunately you did not mention exactly what kind of differences you get. Is it an occasional character that is different, or is all your incoming data garbled? Note that characters read through the SerialPort.Read method can be changed by the system due to the setting of the SerialPort.Encoding property. This setting controls how the incoming data is interpreted as text, i.e. as if it were ASCII, Unicode, UTF-8 or any other coding scheme Windows uses for 'raw bytes' to 'readable text' conversion.
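To make that concrete, here is a small sketch with hypothetical byte values (chosen to include one above 0x7F, like the 0x91 in your data) showing how the choice of encoding can destroy raw byte values in a decode/encode round trip:

```csharp
using System;
using System.Text;

// Hypothetical byte values; not taken from any real device.
byte[] raw = { 0x91, 0x01, 0x03 };

// ASCII only covers 0x00-0x7F; anything above is replaced with '?' (0x3F),
// so the original byte value is lost:
byte[] viaAscii = Encoding.ASCII.GetBytes(Encoding.ASCII.GetString(raw));
Console.WriteLine(viaAscii[0]); // 63, i.e. '?', not 0x91

// Latin-1 (ISO-8859-1) maps all 256 byte values one-to-one, so the
// round trip preserves the data:
Encoding latin1 = Encoding.GetEncoding("Latin1");
byte[] viaLatin1 = latin1.GetBytes(latin1.GetString(raw));
Console.WriteLine(viaLatin1[0]); // 145, i.e. 0x91, unchanged
```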

Cees Meijer
I also thought it might be the encoding, since the IOCTL_SERIAL_SET_CHAR settings differed slightly. But since the returned values are bytes, how can the encoding affect them? Could you perhaps give an example?
jakdep
Ah, sorry, I should have checked myself. I see the encoding can affect the number of bytes used to represent a single character.
jakdep
A: 

If you are reading into a byte array (e.g. SerialPort.Read), you should get exactly the bytes you are seeing in Portmon.

If you are converting to characters (SerialPort.ReadLine or SerialPort.ReadChar) then the data will be encoded using the current encoding (SerialPort.Encoding property), which explains the differences you are seeing.

If you want to see characters with the same binary values as the bytes on the wire, a good encoding to use is Latin-1 as described in this post.

Example:

_serialPort.Encoding = Encoding.GetEncoding("Latin1");
Joe
I played around with the encoding, to little avail. I now use ReadByte() to avoid any possible encoding problems. Thanks anyway for the interesting suggestion.
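For completeness, a sketch of what that looks like, reading the 14-byte frame from the question byte by byte (the port name is an assumption):

```csharp
using System.IO.Ports;

var port = new SerialPort("COM1"); // port name is an assumption
port.Open();

// ReadByte returns each raw byte from the wire as an int, with no text
// decoding involved; it blocks until a byte arrives or ReadTimeout elapses.
var frame = new byte[14];
for (int i = 0; i < frame.Length; i++)
    frame[i] = (byte)port.ReadByte();
```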
jakdep
+1  A: 

Well, it seems as if the problem has been resolved (at least for the time being).

Apparently a framing error caused the return of incorrect values. I wrote a VB6 app, using the MSComm control, which worked fine, and compared the log files generated by Portmon.

I picked up the following differences:

VB6 App:

IOCTL_SERIAL_SET_HANDFLOW Serial0 SUCCESS Shake:1 Replace:0 XonLimit:256 XoffLimit:256

C# App:

IOCTL_SERIAL_SET_HANDFLOW Serial0 SUCCESS Shake:0 Replace:0 XonLimit:1024 XoffLimit:1024

Playing around with the settings, I found that if I set _serialPort.DtrEnable = true, the C# app generates the following log entry:

IOCTL_SERIAL_SET_HANDFLOW Serial0 SUCCESS Shake:1 Replace:0 XonLimit:1024 XoffLimit:1024

That seems to prevent the framing error, and the application appears to be working fine.
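For reference, the complete set-up that works for me now looks roughly like this (the baud rate and framing values are placeholders; match them to your device):

```csharp
using System.IO.Ports;

var _serialPort = new SerialPort("COM1", 9600, Parity.None, 8, StopBits.One)
{
    // Asserting DTR is what changed Shake:0 to Shake:1 in the
    // IOCTL_SERIAL_SET_HANDFLOW log entry and stopped the framing errors:
    DtrEnable = true
};
_serialPort.Open();
```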

jakdep
A: 

Hi, I have the same problem, but the software I use for this application is LabVIEW 7.1. I would like to know how I can change those parameters with that software. The settings where I have a problem are:

IOCTL_SERIAL_SET_HANDFLOW Serial0 SUCCESS Shake:1 Replace:43 XonLimit:204 XoffLimit:204

IOCTL_SERIAL_SET_TIMEOUTS Serial0 SUCCESS RI:-1 RM:0 RC:0 WM:0 WC:250

please answer me soon...bye bye

I would suggest that, since you're using LabVIEW and not the .NET Framework's SerialPort class, you would probably be better off posting your query as a separate question.
jakdep