And maybe, as an option, a 32 kHz watch crystal on Timer2 for low-power operation (not for an RTC).
Many RTCs (the DS323x for example) have a 32 kHz output; this could be run to a spare interrupt and used as the time base for millis(). That way there are no missing counts and it's dead accurate, and it also frees up a hardware timer.
But all I ever see with C is errors; nothing actually works.
That's true when you start with C (and for quite a long time afterwards as well), and I'm sure it's put off many a potential programmer. But at the end of the day, C is probably the best language for most embedded applications.
Now, Arduino was, I think, originally aimed at artists and the like, so you could validly argue that in that regard C is not a good choice. But when the "artist" decides to do something complex that sends waves of light through 1000 LEDs, an interpreted BASIC won't cut it.
What's the answer? Buggered if I know. Arduino goes half way with its HAL; maybe an even higher level of abstraction is called for. For example, Ian mentioned that you shouldn't need a whole program to flash a single LED. This could be fixed with a little bit of scheduling in the background and a function like
flashLED (pin, onTime, offTime);
So I think for the beginner maybe this stuff could be built in: no hunting for libraries or downloading classes from Git.
This (as the current system does) still allows advanced users to write "native" code that dicks with registers etc., and also lets someone gradually change over from a warm and fuzzy protected environment to working on the bare metal.