views: 361

answers: 4

Hi guys, I've been pulling my hair out lately trying to get an ATmega162 on my STK200 to talk to my computer over RS232. I checked and made sure that the STK200 contains a MAX202CPE chip.

I've configured the chip to use its internal 8MHz clock and divided it by 8.

I've tried to copy the code out of the data sheet (and made changes where the compiler complained), but to no avail.

My code is below; could someone please help me fix the problems I'm having?

I've confirmed that my serial port works on other devices and is not faulty.

Thanks!

#include <avr/io.h>
#include <avr/iom162.h>

#define BAUDRATE 4800

void USART_Init(unsigned int baud)
{
    /* Set the baud rate registers */
    UBRR0H = (unsigned char)(baud >> 8);
    UBRR0L = (unsigned char)baud;

    /* Enable the receiver and transmitter */
    UCSR0B = (1 << RXEN0) | (1 << TXEN0);

    /* Frame format: 8 data bits, 2 stop bits (URSEL must be set to write UCSR0C) */
    UCSR0C = (1 << URSEL0) | (1 << USBS0) | (3 << UCSZ00);
}

void USART_Transmit(unsigned char data)
{
    /* Wait until the transmit buffer is empty */
    while(!(UCSR0A & (1 << UDRE0)));

    UDR0 = data;
}

unsigned char USART_Receive()
{
    /* Wait until a received character is available */
    while(!(UCSR0A & (1 << RXC0)));

    return UDR0;
}

int main()
{

    USART_Init(BAUDRATE);

    unsigned char data;

    // Set all PORTB pins as outputs so the received byte can be shown on them
    DDRB = 0xFF;

    // Echo loop: show each received byte on PORTB, then send it back
    while(1)
    {
        data = USART_Receive();

        PORTB = data;

        USART_Transmit(data);
    }
}
+5  A: 

I don't have reference material handy, but the baud rate register UBRR usually contains a divisor value rather than the desired baud rate itself. A quick Google search indicates that the correct divisor value for 4800 baud may be 239. So try:

divisor = 239;
UBRR0H = (unsigned char)(divisor >> 8);
UBRR0L = (unsigned char)divisor;

If this doesn't work, check the reference documentation for your particular chip for the correct divisor calculation formula.
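For what it's worth, the usual formula on AVRs in normal-speed asynchronous mode is UBRR = F_CPU / (16 * baud) - 1. Here is a minimal sketch of the register setup, assuming the 1 MHz effective clock described in the question (8 MHz internal oscillator divided by 8); the function name is just for illustration, and you should verify the formula against your data sheet:

#include <avr/io.h>

#define F_CPU      1000000UL                        /* assumed: 8 MHz internal RC / 8 */
#define BAUD       4800UL
#define UBRR_VALUE ((F_CPU / (16UL * BAUD)) - 1UL)  /* works out to 12 here */

void USART_InitDivisor(void)
{
    /* Load the divisor, high byte first */
    UBRR0H = (unsigned char)(UBRR_VALUE >> 8);
    UBRR0L = (unsigned char)UBRR_VALUE;
}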

Greg Hewgill
I looked up an old AVR project I worked on, and the most relevant line of code was: divisor = (u16)(SYSTEMCLOCK / (baudrate * 16)) - 1; With my SYSTEMCLOCK frequency this worked out to 191. With yours, maybe 13? But Greg's answer is probably more likely to be spot on. With this sort of situation, the exact flavour of chip and clock generator is all-important.
Bill Forster
+5  A: 

I have commented on Greg's answer, but would like to add one more thing. For this sort of problem, the gold-standard debugging method is to first understand asynchronous serial communications, then get an oscilloscope and see what is actually happening on the line. If characters are being exchanged and it's just a baud rate problem, this is particularly helpful: you can measure the baud rate you are actually getting and then adjust the divisor accordingly.

Here is a super quick primer, no doubt you can find something much more comprehensive on Wikipedia or elsewhere.

Let's assume 8 data bits, no parity, 1 stop bit (the most common setup). Then if the character being transmitted is, say, 0x3f (= ASCII '?'), the line looks like this:

...--+   +---+---+---+---+---+---+       +---+--...
     | S | 1   1   1   1   1   1 | 0   0 | E
     +---+                       +---+---+

The high (1) level is +5V at the chip and -12V after conversion to RS232 levels.

The low (0) level is 0V at the chip and +12V after conversion to RS232 levels.

S is the start bit.

Then we have 8 data bits, least significant first, so here 00111111 = 0x3f = '?'.

E is the stop (e for end) bit.

Time advances from left to right, just like an oscilloscope display. If the baud rate is 4800, each bit spans 1/4800 of a second, or roughly 0.21 milliseconds.

The receiver works by sampling the line and looking for a falling edge (a quiescent line is simply logical '1' all the time). The receiver knows the baud rate and the number of start bits (1), so it measures one half bit time from the falling edge to find the middle of the start bit, then samples the line at 8 successive bit times after that to collect the data bits. It then waits one more bit time (until halfway through the stop bit) and starts looking for the next start bit (i.e. the next falling edge), while the character just read is made available to the rest of the system. The transmitter guarantees that the next falling edge won't begin until the stop bit is complete. The transmitter can be programmed to always wait longer (with additional stop bits), but that is a legacy feature; extra stop bits were only required with very slow hardware and/or software setups.
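To make the sampling scheme concrete, here is a rough bit-banged receive sketch (illustration only; the hardware USART actually oversamples each bit). The pin choice (PD0), the 1 MHz F_CPU, and the busy-wait delays are all assumptions, and a real software UART would normally use a timer interrupt:

#include <avr/io.h>
#define F_CPU 1000000UL               /* assumed: 8 MHz internal RC / 8 */
#include <util/delay.h>

#define BIT_US (1000000UL / 4800UL)   /* one bit time at 4800 baud, ~208 us */

/* Receive one 8N1 character on a hypothetical input pin PD0 */
static unsigned char soft_uart_receive(void)
{
    unsigned char data = 0;

    while (PIND & (1 << PD0))         /* wait for the falling edge of the start bit */
        ;
    _delay_us(BIT_US / 2);            /* move to the middle of the start bit */

    for (unsigned char i = 0; i < 8; i++) {
        _delay_us(BIT_US);            /* advance one bit time */
        if (PIND & (1 << PD0))
            data |= (1 << i);         /* least significant bit arrives first */
    }

    _delay_us(BIT_US);                /* land in the middle of the stop bit */
    return data;
}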

Bill Forster
+2  A: 

After reading the data sheet a little more thoroughly, I realized I was setting the baud rate incorrectly. The ATmega162 data sheet has a table of baud rates against common clock frequencies, along with the resulting error for each combination.

For a 4800 baud rate and a 1 MHz clock frequency, the error was 0.2%, which was acceptable for me. The trick was passing 12 to the USART_Init() function instead of 4800.
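If you are building with avr-gcc and avr-libc, the <util/setbaud.h> helper can do this arithmetic at compile time. A minimal sketch, assuming the 1 MHz effective clock (8 MHz internal oscillator divided by 8) and a hypothetical init function name:

#define F_CPU 1000000UL     /* assumed: 8 MHz internal RC / 8 */
#define BAUD  4800
#include <util/setbaud.h>
#include <avr/io.h>

void USART_InitCalculated(void)
{
    /* UBRRH_VALUE, UBRRL_VALUE and USE_2X come from <util/setbaud.h> */
    UBRR0H = UBRRH_VALUE;
    UBRR0L = UBRRL_VALUE;
#if USE_2X
    UCSR0A |= (1 << U2X0);
#else
    UCSR0A &= ~(1 << U2X0);
#endif
    UCSR0B = (1 << RXEN0) | (1 << TXEN0);
}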

Hope this helps someone else out!

samoz
+2  A: 

For debugging UART communication, there are two useful things to do:

1) Do a loop-back at the connector and make sure you can read back what you write. If you send a character and get the same character back, you know the hardware is wired correctly and that at least the basic UART register configuration is correct.

2) Repeatedly send the character 0x55 ("U") - the binary bit pattern 01010101 makes it easy to see the bit width on the oscilloscope, which lets you verify that the speed setting is correct (see the sketch below).
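A rough sketch of test 2), reusing the USART0 register names from the question's code; the F_CPU value and the inter-character delay are assumptions:

#include <avr/io.h>
#define F_CPU 1000000UL      /* assumed: 8 MHz internal RC / 8 */
#include <util/delay.h>

/* Continuously transmit 0x55 ('U', bit pattern 01010101) so the bit
   width is easy to measure on an oscilloscope. */
void send_test_pattern(void)
{
    for (;;) {
        while (!(UCSR0A & (1 << UDRE0)))   /* wait for an empty transmit buffer */
            ;
        UDR0 = 0x55;
        _delay_ms(1);                      /* small gap between characters */
    }
}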

Toybuilder