Hi,

I have a piece of hardware that sends out one byte of data, representing a voltage signal, at a frequency of 100 Hz over the serial port.

I want to write a program that will read in the data so I can plot it. I know I need to open the serial port and open an InputStream. But this next part is confusing me, and I'm having trouble understanding the process conceptually:

I create a while loop that reads the data from the InputStream one byte at a time. How do I time the while loop so that there is always a byte available whenever it reaches the read-byte line? I'm guessing I can't just put a sleep call inside the loop to try to match it to the hardware sample rate. Is it just a matter of reading the InputStream continuously in the while loop, so that if the loop is too fast it simply finds no new data, and if it's too slow the data accumulates in the InputStream's buffer?

Like I said, I'm only trying to understand this conceptually, so any guidance would be much appreciated! I'm guessing the idea is independent of which programming language I'm using, but if not, assume it's for use in Java.

Thanks!

A: 

You wait until the port has a byte (on Windows there is an API to check whether the RS-232 port has a byte waiting, or you can do a blocking read). Ideally you put your reading code in a separate thread: wait for the bytes and pump them into some meaningful data structure.
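The blocking-read-in-a-thread idea can be sketched like this (a minimal sketch; the class, queue, and method names here are my own invention, and `in` stands for whatever InputStream you get from the opened serial port):

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class SerialReader implements Runnable {
    private final InputStream in;
    // Hand-off point between the reader thread and the plotting code.
    private final BlockingQueue<Integer> samples = new LinkedBlockingQueue<>();

    public SerialReader(InputStream in) {
        this.in = in;
    }

    public BlockingQueue<Integer> getSamples() {
        return samples;
    }

    @Override
    public void run() {
        try {
            int b;
            // read() blocks until a byte arrives, so no timing logic is
            // needed: the loop naturally runs at the hardware's pace.
            while ((b = in.read()) != -1) {
                samples.put(b); // pass the sample to the consumer thread
            }
        } catch (IOException | InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
```

The plotting thread then takes samples off the queue at its own pace, which is exactly the decoupling this answer describes.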

Keith Nicholas
A: 

100 Hz is pretty slow - there would be no issue sleeping for, say, 9 ms...

But as Hamish says, there is likely an event that notifies you when there is data in the buffer. Use that, and bear in mind that if your readings are in ASCII or span multiple bytes, you will need to buffer the bytes you receive until you have a full reading (or a full line of ASCII?) before actually processing it.
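The buffering idea can be sketched as follows (a hypothetical helper; the class and method names are my own, assuming readings arrive as newline-terminated ASCII):

```java
import java.util.ArrayList;
import java.util.List;

public class ReadingAssembler {
    private final StringBuilder pending = new StringBuilder();
    private final List<String> readings = new ArrayList<>();

    // Feed in whatever bytes happened to arrive; only complete
    // newline-terminated lines become finished readings.
    public void onBytes(byte[] chunk, int len) {
        for (int i = 0; i < len; i++) {
            char c = (char) (chunk[i] & 0xFF);
            if (c == '\n') {
                readings.add(pending.toString().trim()); // one full reading
                pending.setLength(0);
            } else {
                pending.append(c); // partial reading: keep accumulating
            }
        }
    }

    public List<String> getReadings() {
        return readings;
    }
}
```

The point is that a single event may deliver half a reading, or several readings at once, so the assembler keeps partial data across calls.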

Hope this helps...

Ken Hughes
Have you tried sleeping for 9 ms on Windows? :) There are issues.
Keith Nicholas
+1  A: 

If you are using the Java communications API then you will not be polling at all. Instead you implement a SerialPortEventListener and receive a callback when there is data available from the port.

public class SerialConnection implements SerialPortEventListener {
    private SerialPort sPort;

    ...

    // Add this object as an event listener for the serial port.
    try {
        sPort.addEventListener(this);
    } catch (TooManyListenersException e) {
        sPort.close();
        throw new SerialConnectionException("too many listeners added");
    }

    ...

    // The listener interface requires this callback; a sketch of it:
    // DATA_AVAILABLE fires when bytes are waiting to be read.
    public void serialEvent(SerialPortEvent event) {
        if (event.getEventType() == SerialPortEvent.DATA_AVAILABLE) {
            // read the waiting bytes from sPort.getInputStream() here
        }
    }

...
Romain Hippeau
A: 

Tutorial: http://devdot.wikispaces.com/Iphone+Serial+Port+Tutorial

Back to the concept.

Bytes come in as a stream via the serial port, so if your program is too slow to pick them up, some bytes are lost. There are some ways to minimize the problem:

  • A buffer (typically 16 bytes or more) in the serial chip. When data comes in, the system interrupts your program to tell you there are new bytes in the buffer, so the program can fetch them in a batch. An interrupt-driven program also avoids keeping the CPU busy looping for bytes.

  • The protocol level. The serial port is only the channel; your program and the program on the other side of the serial port may use a higher-level protocol to streamline traffic and minimize data loss. A typical mechanism is XON/XOFF.
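For illustration, XON/XOFF is built on two ASCII control bytes (DC1 and DC3); this small sketch (the class name and the three-quarters threshold are my own choices, not part of any standard) shows the idea of a receiver telling the sender to pause or resume based on how full its buffer is:

```java
public class FlowControl {
    // Standard ASCII control codes used for software flow control.
    public static final byte XON  = 0x11; // DC1: "resume transmission"
    public static final byte XOFF = 0x13; // DC3: "pause transmission"

    // Decide which signal to send, given current buffer occupancy.
    // The 3/4 threshold is an arbitrary example, not a fixed rule.
    public static byte signalFor(int buffered, int capacity) {
        return buffered >= capacity * 3 / 4 ? XOFF : XON;
    }
}
```

In a real link the receiver transmits XOFF back over the line when it is close to overflowing, and XON once it has drained, so the sender never outruns the receiver's buffer.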

Hope it helps.

ohho
The serial port hardware (UART) will typically have only a 16-byte buffer, but when it fills up it signals the OS, which reads the data into the device driver's much larger buffer, so you are unlikely to miss data.
Martin Beckett
Back in the days when I was talking to the serial port, device drivers were still alien technology ;-) Story here: http://stackoverflow.com/questions/774871/why-did-you-learn-c/2626462#2626462
ohho