Would it be possible to write code that auto-calibrates the "magic numbers"?

Definitely, but because the tables already existed it was easier to let Excel find them.

In fact you can derive the magic numbers from the protocol. You know that a data bit lasts 1/baudrate seconds between edges. As you can count how much time the machine code takes to read and store a bit, you can derive the magic numbers quite exactly for a given baud rate.

The real problem is that you need to find values that work for all given baud rates, so an auto-calibrating mode needs a number of known patterns to learn from. Best to start with the highest baud rate, as it is the most critical one. If you have the basic formula `rxbit = CLOCKSPEED/(alpha * baudrate) - beta;` you can try all possible combinations of alpha and beta between 1..20, so after 400 bytes you get the ranges of alpha and beta that work. Then you try the next baud rate, and the ranges will shrink, until all baud rates are done. Finally take the middle of the ranges and you're done.

A ~~better~~ faster approach is to first find the optimal alpha, then search for the optimal beta, then alpha again, then beta again, until the same values reappear.

That will bring you to the optimal values within 50 or so bytes (~10x faster).

Instead of a linear search through the ranges you can do a binary search, ...

For me the strange thing is the value SEVEN where I expected EIGHT in the formula: `16000000L/(7 * baudrate) - 3;`

but I did not really investigate `16000000L/(8 * baudrate) + BETA` ... => todo list