Optiboot at 57600 with the Uno's ATmega16U2 as USB-to-serial adapter

Hi, I have some questions about Optiboot and the Uno, if some of you could dig way back in your memories.

I have compiled Optiboot 6.2 for 57600 baud, and it works great with a device like an HC-05 Bluetooth transceiver. But it does not work with the Uno's built-in ATmega16U2 running the Rev 3 firmware. I found this old thread: http://forum.arduino.cc/index.php?topic=28560.0

There are some interesting things brought up in that thread, such as this:

The original Arduino bootloader never tried to use the 2X mode, so it communicated at 58824 baud, not 57600. That's +2.124% error, which is just barely within 8-bit serial tolerance. In other words, all those Duemilanove boards were just barely working when in bootloader mode.

Dean probably tested the 8u2 firmware for Uno with a Duemilanove running the original bootloader at 58824 baud, instead of optiboot. Just as I encountered, if you use the "best" baud rate in 2X mode, which is 57143 (-0.794% error), then you can't communicate with the original Arduino bootloader.
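The arithmetic behind those two numbers can be checked directly. Using the UBRR formulas from the ATmega328P datasheet (normal mode: baud = F_CPU / (16 * (UBRR + 1)); double-speed U2X mode: baud = F_CPU / (8 * (UBRR + 1))), here is a quick sketch, assuming the init code rounds to the nearest integer divisor:

```python
# Achievable baud rates near 57600 on a 16 MHz AVR, per the
# ATmega328P datasheet formulas:
#   normal mode:        baud = F_CPU / (16 * (UBRR + 1))
#   double-speed (U2X): baud = F_CPU / ( 8 * (UBRR + 1))
F_CPU = 16_000_000
TARGET = 57600

def best_ubrr(f_cpu, baud, divisor):
    """Round the ideal UBRR to the nearest integer, as typical init code does."""
    return round(f_cpu / (divisor * baud)) - 1

def actual_baud(f_cpu, ubrr, divisor):
    return f_cpu / (divisor * (ubrr + 1))

for name, div in (("normal (U2X=0)", 16), ("double-speed (U2X=1)", 8)):
    ubrr = best_ubrr(F_CPU, TARGET, div)
    real = actual_baud(F_CPU, ubrr, div)
    err = 100 * (real - TARGET) / TARGET
    print(f"{name}: UBRR={ubrr}, actual={real:.0f} baud, error={err:+.3f}%")
```

Normal mode lands on UBRR = 16 (58824 baud, +2.124%) and double-speed on UBRR = 34 (57143 baud, -0.794%), matching the numbers quoted above.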

And this:

In fact, it looks like the 8u2 firmware has special-case code for 57600 that DOES configure its UART "opposite" the way the 328 is configured. I can't tell whether that is a bug, or something that was done specifically to address 57600 bps issues with an earlier Arduino; there's a comment there about being compatible with the 57600 bps bootloader on the 328, but... the 328 bootloader doesn't RUN at 57600 any more. And if it is talking about the current Arduino code, it looks like the tests are backward (it picks U2X for everything except 57600!?)
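To make the "tests are backward" observation concrete, here is a host-side sketch of what such logic would do. This is NOT the actual 8u2 firmware source; the special-case condition is an assumption based on the description above:

```python
# Sketch of the special-case logic described above (an assumption,
# not the real firmware): pick U2X for every rate EXCEPT 57600, so
# 57600 alone lands on the normal-mode divisor and runs at ~58824 baud.
F_CPU = 16_000_000

def configured_baud(requested):
    use_u2x = requested != 57600        # the "backward-looking" test
    divisor = 8 if use_u2x else 16
    ubrr = round(F_CPU / (divisor * requested)) - 1
    return F_CPU / (divisor * (ubrr + 1))

print(configured_baud(57600))    # ~58823.5: matches the old 58824 bootloader
print(configured_baud(115200))   # ~117647.1: normal U2X divisor
```

Under this logic the 57600 request is the one rate deliberately steered away from the 2X divisor, which is exactly what would make it compatible with the old 58824-baud bootloader and incompatible with an Optiboot running at 57143.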

So, I compiled Optiboot for a 58824 baud rate and made a boards.txt entry for my modified Uno with unob.upload.speed=57600. It works great with both the built-in ATmega16U2 and the HC-05 Bluetooth.

Is this a compromise or workaround which results in the bootloader actually running closer to 57600 baud than it would if it were compiled to run at 57600?

Or is the firmware on the ATmega16U2 off a little and never has been worked on because Uno is rarely used at 57600 anyway?

There is a third possibility. The 16U2 runs from a crystal giving it a fairly accurate clock. The m328 runs from a resonator giving it a terrible clock. 57600 is a high-error baud rate for both processors (2.1%). It is possible the m328's clock is simply too inaccurate in the "wrong" direction to communicate with the 16U2.

If you can use any other baud rate, do. Multiples of 31250 are a great choice (31250, 62500, 125k, 250k, 500k, 1M). Even 38400 is a better choice than 57600.
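The claim is easy to verify at 16 MHz: multiples of 31250 divide 16 MHz / 16 exactly, so they incur 0% error, while 38400 is only about +0.16% off. A quick check, assuming nearest-integer UBRR rounding in normal mode:

```python
# Baud-rate error at 16 MHz (normal mode) for the rates mentioned above.
# Multiples of 31250 divide 16 MHz exactly, so their error is 0%.
F_CPU = 16_000_000

def error_pct(baud):
    ubrr = round(F_CPU / (16 * baud)) - 1
    actual = F_CPU / (16 * (ubrr + 1))
    return 100 * (actual - baud) / baud

for baud in (31250, 62500, 125000, 250000, 38400, 57600):
    print(f"{baud:>6}: {error_pct(baud):+.3f}%")
```

The 31250 family comes out at exactly 0.000%, 38400 at +0.160%, and 57600 at +2.124%, which is why 57600 is the odd one out.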

Reference: http://wormfood.net/avrbaudcalc.php

It is interesting that 115200 is even worse according to the table, and was chosen as the default for Uno and Mega.

Is that a polite way of saying my theory is junk?

No, I like your theory. I just think it is interesting, the choices they made, such as a 16 MHz crystal and then picking bit rates which are not really optimal for the crystal frequency they committed to. Design deficiencies, I think. Coding Badly, I think the stuff you say is golden. :-)

So, when you compile something like Optiboot for a particular bit rate, do I understand correctly that during compilation a bit rate is picked which is close and hopefully compatible?

When I compile for 57600, the actual bit rate it ends up using is different and determined by the frequency of the oscillator, and dependent on some calculations to get it close to the bit rate I want?

If so, this would imply I can't really get it right on the money by making minor adjustments to the requested bit rate to achieve a particular actual bit rate.

dmjlambert: When I compile for 57600, the actual bit rate it ends up using is different and determined by the frequency of the oscillator, and dependent on some calculations to get it close to the bit rate I want?

Yeah, it tries to get it as close as it can to the specified baud rate.

This is a very useful page:

http://wormfood.net/avrbaudcalc.php

How the frequency relates to the baud rate is covered on pages 173-175 of the datasheet (for the 328P).

This is interesting. I can get a real bit rate of 57142, which is within 458 of 57600, or the next notch I can tweak it up to is 58823, which is 1223 away from 57600. The ATmega16U2 USB-to-serial firmware, running from its own 16 MHz clock (and a different kind of clock source), must differ in actual clock frequency, or in its bit-rate calculation and setting, such that the more "incorrect" bit rate on the ATmega328P works while the closer-to-correct rate does not. And for the HC-05, both attempted bit rates are within its tolerance.
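Those two "notches" also let us quantify the chip-to-chip mismatch: if the ATmega328P runs at its double-speed rate and the 16U2 at its normal-mode rate, the two ends disagree by about 2.9% before any clock drift is even considered:

```python
# Mismatch between the two achievable ~57600 rates at 16 MHz.
m328_u2x   = 16_000_000 / (8 * 35)     # UBRR=34, U2X on  -> ~57142.9 baud
m16u2_norm = 16_000_000 / (16 * 17)    # UBRR=16, U2X off -> ~58823.5 baud
mismatch = 100 * (m16u2_norm - m328_u2x) / m328_u2x
print(f"chip-to-chip mismatch: {mismatch:.2f}%")   # ~2.94%
```

A ~2.9% divisor mismatch alone eats most of an async serial link's error budget, so only a little resonator drift in the wrong direction is needed to push it over the edge.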

The m328 runs from a resonator giving it a terrible clock.

Nonsense. A resonator is not as good as a crystal, but it's far from "terrible."

is the firmware on the ATmega16U2 off a little and never has been worked on because Uno is rarely used at 57600 anyway?

The problem is historical. I'm not sure of the exact origins, but... The Arduino core and the 16u2 serial code both contain "special case" logic for 57600bps that cause it to run at 58823bps. They work fine, when used with each other, but they're pretty far off-spec. The optiboot build does NOT have that logic, so it runs at 57142, which is 0.7% out-of-spec in the opposite direction of the 16u2 serial code. Apparently the combined error is too much (which is a little puzzling, since I thought you could have up to 5% total error and still expect things to work, but ... within believability.) 115200 works fine because both chips and both software entities choose the same divisor, so they match "exactly", even though the error is similar.
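The 115200 case can be checked with the same arithmetic: both sides land on the same double-speed divisor (UBRR = 16), so the two chips agree exactly with each other even though both are about 2.1% above the nominal rate:

```python
# At 115200 both chips pick the same double-speed divisor, so they
# agree exactly even though both are ~2.1% off the nominal rate.
F_CPU = 16_000_000
ubrr = round(F_CPU / (8 * 115200)) - 1      # = 16 on both chips
actual = F_CPU / (8 * (ubrr + 1))           # ~117647 baud on both
err_vs_nominal = 100 * (actual - 115200) / 115200
print(ubrr, round(actual), f"{err_vs_nominal:+.2f}%")   # 16 117647 +2.12%
```

The error that matters for the link is between the two UARTs, not against the nominal number, which is why matching divisors rescue 115200 but the mismatched 57600 special case fails.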

I suppose that we're "lucky" that assorted 3rd party devices (BT devices, FTDI, etc) seem more accepting of the errors than the double-AVR configuration.

Thank you so much! I think that old thread combined with what has been discussed in this thread is great information. It is all very clear to me now.