It was the Baud rate.
I was trying to send bytes from my home server to the control panel I’m building for it around an Atmega32, and I was obviously getting the wrong values. I don’t think there was a single project I’ve worked on involving serial communication where the bytes I sent were received properly on the first try. When transferring data between different platforms, there’s Big Endian vs. Little Endian, signed vs. unsigned, and of course, different word sizes. So when it became obvious that the Atmega was not interpreting the data coming from the computer correctly, I was sure that was what I was dealing with.
But I was wrong: it was the Baud rate. It took me some time to figure that out, but here’s how it happened:
My first problem was that I didn’t want to set up a JTAG connection for debugging, because I just didn’t have the right connectors. If I had them, I would have put a breakpoint in the Atmega’s program and looked at the serial register. Then I recalled that I could do something similar with the LCD screen the Atmega is driving: I programmed the Atmega to display the binary value of each received byte. On the server, I made a test program that transmits bytes in a loop while displaying their binary value in the terminal.
First, I noticed that when transmitting different bytes, the received values were sometimes the same. That would not happen in a signed-vs.-unsigned situation, or in a Big- vs. Little-Endian case. I also noticed that while the last 5 bits of the transmitted and received bytes were identical, the first 3 could differ. That started to feel like an electrical problem rather than a software problem, and the first thing to try was lowering the Baud rate from 9600 to 4800.
And it worked. The Atmega is currently running on its internal clock at 1MHz, and according to the datasheet, at a Baud rate of 9600 the error is about 7%. That’s a lot. For now, this will remain the solution. In the future, I’ll probably add an external crystal that will allow me to increase the Baud rate while keeping the error low.