What I meant to explain is that timing tables are tuned manually because the same code executes faster or slower on different boards.
Please note that the resolution of micros() is 4 us, which makes it harder to use for timing than delayMicroseconds(), which has a step size of 1 us and is reasonably accurate.
A baud rate of 57600 has a new bit every 17.36 us. With micros() the best I can do is 16 us.
Given that you need to send 10 bits per byte (start bit, 8 data bits, stop bit), this timing error adds up to 13.6 us, almost one whole bit.
With the 1 us step size of delayMicroseconds() the timing error adds up to only 3.6 us (about 1/5 of a bit).
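To make the arithmetic concrete, here is a minimal sketch of the kind of bit-banged transmit loop this error analysis applies to. The pin number and the rounded 17 us bit time are assumptions for illustration; a real library like SoftwareSerial uses tuned timing tables that also compensate for the overhead of digitalWrite() itself.

```cpp
// Minimal bit-banged 8N1 transmit at 57600 baud using delayMicroseconds().
// TX_PIN is a hypothetical pin choice; BIT_TIME_US rounds 17.36 us down,
// which is where the 0.36 us per-bit error in the discussion comes from.

const uint8_t  TX_PIN      = 3;   // hypothetical output pin
const uint16_t BIT_TIME_US = 17;  // 1e6 / 57600 = 17.36 us, rounded

void sendByte(uint8_t b) {
  digitalWrite(TX_PIN, LOW);            // start bit
  delayMicroseconds(BIT_TIME_US);
  for (uint8_t i = 0; i < 8; i++) {     // 8 data bits, LSB first
    digitalWrite(TX_PIN, (b >> i) & 1);
    delayMicroseconds(BIT_TIME_US);
  }
  digitalWrite(TX_PIN, HIGH);           // stop bit
  delayMicroseconds(BIT_TIME_US);
}

void setup() {
  pinMode(TX_PIN, OUTPUT);
  digitalWrite(TX_PIN, HIGH);           // serial line idles high
}

void loop() {
  sendByte('U');                        // 0x55: alternating bit pattern
  delay(1000);
}
```

Over the 10 bits of one frame the rounding error accumulates to the 3.6 us figure above, plus whatever time the digitalWrite() calls themselves take.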
That said, your idea is theoretically sound; you just need a more accurate clock than micros() for high speeds.
And yes, for low baud rates (9600 and below) the idea will work: at 9600 baud a bit lasts 104.17 us, so the nearest micros() step of 104 us leaves an error of only about 1.7 us over a 10-bit frame.
Indeed, I suspect it should be possible to write a sketch that would automatically adjust its timing if it were sent a stream of known bytes against which to test itself.
Yes, that can be done. It is a well-known method for optimizing long communication lines to get the highest baud rate possible. The optimization can even run continuously by adding codes to check reception quality, decreasing the speed when the line gets worse and increasing it again once the line is stable. Not trivial, but possible.
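A sketch of the self-calibration idea, under some assumptions: the sender transmits a known training byte such as 'U' (0x55), which in an 8N1 frame puts alternating bits on the line, so every LOW pulse is exactly one bit wide and its measured width gives the bit time directly. The RX pin, the 'U' training byte, and averaging over 8 pulses are illustration choices, not a fixed recipe; pulseIn()'s own overhead makes the result approximate but close enough to derive a delay value from.

```cpp
// Measure the bit time of an incoming known byte stream instead of
// hard-coding it. Assumes the far end repeatedly sends 'U' (0x55, 8N1),
// so all LOW pulses (start bit and 0-bits) are one bit period wide.

const uint8_t RX_PIN = 2;  // hypothetical input pin

unsigned long measureBitTime() {
  unsigned long sum = 0;
  uint8_t n = 0;
  while (n < 8) {          // blocks until 8 valid pulses have arrived
    // pulseIn() returns the pulse width in microseconds, 0 on timeout
    unsigned long w = pulseIn(RX_PIN, LOW, 100000UL);
    if (w > 0) { sum += w; n++; }
  }
  return sum / 8;          // average several pulses to reduce jitter
}

void setup() {
  pinMode(RX_PIN, INPUT);
  Serial.begin(115200);
  unsigned long bitTime = measureBitTime();
  Serial.print("Measured bit time (us): ");
  Serial.println(bitTime); // ~17 us if the sender runs at 57600 baud
}

void loop() {}
```

The continuous variant you describe would rerun this measurement (or track framing errors) during normal traffic and step the baud rate down or up accordingly.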