views:

1442

answers:

5

I got an RS232 signal capture device, and it's working great.

I need some help making sense of the data. Basically we bought it because we are dealing with a late-'80s machine controller that uses serial communication. We had little luck despite knowing the port parameters.

From the data I dumped, the machine controller is using the break signal as part of its protocol. I am having trouble duplicating it using VB and the MSComm control. I know how to toggle the break signal on and off, but I am not sure what I am supposed to be doing with it. Am I supposed to leave it on for each byte of data I send, or send a byte of data and then toggle it?

Also, I am confused about how I am supposed to receive any data from the controller. Do I toggle a flag when the break is turned on, and then read the input when it is turned off?

Any insight would be appreciated.

A: 

Not really an SO question, but let me dredge up stuff from my long-past (1980s in fact) days as a comms programmer. You normally send a break by holding all bits low or high (depending on your comms hardware). So to cause a break either send the value 0x00 repeatedly for about half a second, or the value 0xFF.

anon
@Neil - You used to be a comms programmer? Very cool.
sheepsimulator
You've forgotten about start and stop bits. Sending either 0x00 or 0xff won't be the same as a break.
Roddy
Actually, for most 1980s hardware (if I recall correctly), it was. The hardware is detecting a prolonged voltage difference, not the individual bits.
anon
@sheepsimulator Not really - back then everyone was linking their computers by any means possible - understanding comms was part of being a programmer, though I did specialise in it for a couple of years.
anon
Michael Burr
@Michael I'll take your word for it, but I am pretty sure that when I wrote code to perform a break, I didn't change the stop/start/parity bit configuration. But this was 25 years ago...
anon
@ Neil - I dunno, I think that stuff is cool. To each his/her own. Your coolness factor on my front went +1. :)
sheepsimulator
@Neil: one thing is certain about the RS-232 standard - it's like an opinion, everyone had their own. I have no doubt that there were systems that work exactly like you describe. Especially way back when (just curious, did you ever do a bit-banging driver?)
Michael Burr
@Michael By bit-banging I take it you mean writing the individual bits out to the hardware? If so, no. My high point of abstruse comms knowledge was as a systems programmer on an IBM 7171 protocol converter (mainframe <--> serial comms) - a rare expertise that is not part of my current CV, for some reason :-)
anon
@Neil, uarts detect "break" in hardware by the line being permanently in a "space" state for N clocks, without transitions. Sending loads of zero bytes MUST be distinguishable from break, otherwise you can't send loads of zero bytes... Even old-skool uarts like the early 80's 6850 http://www.datasheetcatalog.org/datasheet/motorola/MC6850.pdf have special register bits to enable/disable the BREAK condition.
Roddy
@Roddy I'm not disputing your logic, and my memory regarding setting the stopbits etc. could definitely be at fault - I haven't touched comms programming since about 1985.
anon
@Roddy And I think you are talking about a break being detected by another UART at the other end - I'm talking about something much lower level in the comms hardware that sees a voltage difference and resets the connection. I'm talking about big freezer size cabinet comms gear here, which mediates the connection, not direct PC to PC communications.
anon
@Neil - lucky you - I still have to handle it on almost a daily basis, as ethernet simply isn't as predictable for real-time control applications!
Roddy
@Neil - Maybe it's just a nomenclature issue: I think some protocols refer to a "break character" (which could typically be 00 or FF) to perform the same re-synchronization function that a 'true' break can be used for.
Roddy
A: 

You should be able to see the data that the port is sending. You'll need a null-modem cable, a computer with a serial port (or a serial-USB dongle) and a terminal program (such as HyperTerminal on Windows -- not included in Vista). If you configure your terminal program correctly (the right speed, number of data bits, start/stop bit settings, and the correct port), all the data will be shown on screen. Sometimes you need to hit the Enter key to start seeing the data. You can toggle the terminal program's settings during the test to see if something changes (from "noise" to data).

Edmundo
+3  A: 

A break signal is an invalid character. When the RS-232 line is idle, the voltage is in the 'mark' (or '1') state (which is -12 volts if I remember right). When a character is sent, the protocol toggles the line to the 'space' (or '0') state for one bit time (the start bit) then toggles the signal as appropriate for the data (the data bits) and any parity bits. It then holds the line in an idle/mark (or 1) state for a number of bits defined by the stop bits, which is typically configurable (usually 1 stop bit in my experience).

Since there is always some period of time where the line will be in a mark state between data characters, the start of a character can always be recognised. This also means that the longest period of time that the line can be in a space state is:

1 start bit + however many data bits + a parity bit (if any)

A break signal is defined as holding the line in the space state for longer than that period of time - no valid data byte can do that, so the break 'character' isn't really a character. It's a special signal.
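To put a rough number on that threshold (my own back-of-the-envelope illustration, not part of the answer itself - I'm assuming 9600 baud, 8 data bits and a parity bit in the example):

/* Longest time a single valid character can hold the line in the space state:
   the start bit plus the data bits plus any parity bit, all sent as zeros. */
double max_space_ms(int baud, int data_bits, int parity_bits)
{
    int bits = 1 /* start */ + data_bits + parity_bits;
    return 1000.0 * bits / (double)baud;
}

/* e.g. max_space_ms(9600, 8, 1) is about 1.04 ms, so a space condition held
   noticeably longer than that must be a break. */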

As for when you need to issue a break signal, that depends entirely on the protocol being used.

Michael Burr
Trivia on the number of stop bits - it was typically 2 for 110 bps, 1 for all other rates. Hooray for Teletypes!
Mark Ransom
+4  A: 

Michael Burr's description of the way break works is accurate. Often, "break" signals are sent for significantly longer than one character time.

These days, "Break" is infrequently used in serial comms, but the most common use is as a 'cheap' way of providing packet synchronization. "Break" may be sent before a packet starts, to alert the receiver that a new packet is on the way (and allow it to reset buffers, etc.) or at the end of a packet, to signal that no more data is expected. It's a kind of 'meta-character' in that it allows you to keep the full range of 8 or 7-bit values for packet contents, and not worry about how start or end of packet are delineated.

To send a break, typically you call SetCommBreak, wait an appropriate period (say, around 2 milliseconds at 9600 baud) and then call ClearCommBreak. During this time you can't be sending anything else, of course.

So, assuming that the protocol requires 'break' at the start of the packet, I'd do this (sorry for pseudocode):-

procedure SendPacket(CommPort port, Packet packet)
{
    SetCommBreak(port)
    Sleep(2);  // 2 milliseconds - assuming 9600 baud. Pro-rata for others
    ClearCommBreak(port)

    foreach(char in packet)
        SendChar(port, char)
}
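
For reference, here's the same idea in (untested) Win32 C - my own sketch, assuming the handle came from CreateFile without FILE_FLAG_OVERLAPPED and that the port is already configured with SetCommState:

#include <windows.h>

/* Send a break, then the packet bytes. */
BOOL SendPacketC(HANDLE port, const BYTE *packet, DWORD len)
{
    DWORD written = 0;

    if (!SetCommBreak(port))    /* start holding the line in the space state */
        return FALSE;
    Sleep(2);                   /* ~2 ms: longer than one character at 9600 baud */
    if (!ClearCommBreak(port))  /* release the line back to the mark state */
        return FALSE;

    return WriteFile(port, packet, len, &written, NULL) && written == len;
}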

Pseudocode for a receiver is more difficult, because you have to make a load of assumptions about the incoming packet format and the API calls used to receive breaks. I'll write in C this time, and assume the existence of an imaginary function. WaitCommEvent is probably the key to handling incoming Breaks.

bool ReadCharOrBreak(char *ch); // return TRUE if break, FALSE if ch contains received char

We'll also assume fixed-length 100 byte packets with "break" sent before each packet.

void ReadAndProcessPackets()
{
    char buff[100];
    int count = 0;

    while (true)
    {
        char ch;
        if (ReadCharOrBreak(&ch))
        {
            count = 0;               // break received - start of packet, reset count
        }
        else
        {
            if (count < 100)
            {
                buff[count++] = ch;
                if (count == 100)    // complete fixed-length packet received
                    ProcessPacket(buff);
            }
            else
            {
                Error("too many bytes rx'd without break");
            }
        }
    }
}

WARNING - totally untested, but should give you the idea...
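
As for the imaginary ReadCharOrBreak itself, one plausible (equally untested) Win32 shape for it is below - my own sketch, assuming a file-scope port handle and that SetCommMask(port, EV_BREAK | EV_RXCHAR) was called once after opening the port. This is also how you'd tell that the other end has asserted break:

#include <stdbool.h>
#include <windows.h>

static HANDLE port;   /* assumed: opened with CreateFile (no FILE_FLAG_OVERLAPPED),
                         configured, and passed SetCommMask(port, EV_BREAK | EV_RXCHAR) */

/* Block until either a break is seen (return true) or a character arrives
   (return false, with the character stored in *ch). */
bool ReadCharOrBreak(char *ch)
{
    DWORD events = 0;
    DWORD got = 0;

    for (;;)
    {
        if (!WaitCommEvent(port, &events, NULL))
            continue;                    /* real code should inspect the error */

        if (events & EV_BREAK)
            return true;                 /* break detected on the line */

        if ((events & EV_RXCHAR) &&
            ReadFile(port, ch, 1, &got, NULL) && got == 1)
            return false;                /* *ch holds the received character */
    }
}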

For an example of a protocol using Break, check out the DMX-512 stage lighting protocol.

The start of a packet is signified by a Break followed by a "mark" (a logical one) known as the "Mark After Break" (MAB). The break signals the end of one packet and the start of the next, and it causes the receivers to start reception. After the break, up to 513 slots are sent.
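
Purely as an illustration of that framing, a DMX-style sender might look roughly like this in Win32 C. The port settings and timings are from my memory of the DMX512 spec (250 kbaud, 8 data bits, no parity, 2 stop bits, break of at least 88 microseconds, start code 0), so double-check them, and note that many PC serial ports can't actually do 250 kbaud:

#include <string.h>
#include <windows.h>

/* Rough sketch only: send one DMX-style packet
   (break, Mark After Break, start code, 512 channel slots). */
void SendDmxPacket(HANDLE port, const BYTE levels[512])
{
    DCB dcb;
    DWORD written = 0;
    BYTE frame[513];

    /* In real code you'd configure the port once, not per packet. */
    memset(&dcb, 0, sizeof(dcb));
    dcb.DCBlength = sizeof(dcb);
    GetCommState(port, &dcb);
    dcb.BaudRate = 250000;
    dcb.ByteSize = 8;
    dcb.Parity   = NOPARITY;
    dcb.StopBits = TWOSTOPBITS;
    SetCommState(port, &dcb);

    frame[0] = 0;                   /* start code 0 = ordinary dimmer data */
    memcpy(frame + 1, levels, 512); /* the channel slots                   */

    SetCommBreak(port);             /* break: >= 88 us, so 1 ms is plenty  */
    Sleep(1);
    ClearCommBreak(port);           /* the idle gap before the first start
                                       bit serves as the Mark After Break  */
    WriteFile(port, frame, sizeof(frame), &written, NULL);
}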

Roddy
Thanks for this. I knew the Serial 101 stuff that the other answer had, but you answered part of what I am looking for. A couple of follow-ups: 1) Do you know of any protocol on the web where break is used? 2) When you are receiving a break signal, how do you process that? You gave a send example; how does it work on receive?
RS Conley
This answers the first part of the OP's question, but not the second - how to tell if break is being turned on/off from the DCE.
sheepsimulator
A: 

'Break' was intended for when the line synchronization got totally mixed up.

Am I supposed to leave it on for each byte of data I send, or send a byte of data and then toggle it?

Try sending a nice long 'break' signal (500 ms?) then wait a bit (50 ms?) then send your data.
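
In Win32 C terms that suggestion would look something like the sketch below (same guessed timings; if I remember the MSComm control correctly, setting its Break property true and then false is the VB equivalent of SetCommBreak/ClearCommBreak):

#include <windows.h>

/* A long break, a short pause, then the data.
   The 500 ms / 50 ms figures are the guesses from the answer above. */
BOOL BreakThenSend(HANDLE port, const BYTE *data, DWORD len)
{
    DWORD written = 0;

    SetCommBreak(port);      /* hold the line in the space state */
    Sleep(500);              /* nice long break                  */
    ClearCommBreak(port);    /* back to mark / idle              */
    Sleep(50);               /* give the controller a moment     */

    return WriteFile(port, data, len, &written, NULL) && written == len;
}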

Marsh Ray