How can I simulate the AVR code going to the Arduino?
It would need the gcc C code going to the C compiler to be saved somewhere...
then the compiler options to create debug-enabled output...
then some Windows & Linux software to watch the registers change and step through it in a graphical way...
::) Please help, a tutorial would be great. :'(
This post showed on Lady Ada’s forum as well, and I answered some there, but I guess this is a more appropriate place for the discussion.
The arduino environment does in fact build binaries with the debugging info included, and leaves those files around as well as the raw binary that is downloaded to the arduino HW (which no longer contains any debugging info). Unfortunately, as far as I know, it isn’t possible to put the sort of debugger stub inside the arduino that a debugger like gdb requires, and the chip involved doesn’t support any of the “non-invasive” debugging techniques (e.g. JTAG).
That means that supporting a useful debugging environment would mean SIMULATING the arduino on your local CPU. There are a couple of approaches to this:
 On Windows, you might as well download and use Atmel’s AVR Studio, which includes a simulator. This simulates things at the CPU level, rather than at the board level, but fortunately there isn’t a lot more to an arduino board than just the CPU chip. I haven’t used AVR Studio in a long time, so I’m not sure of the current status of simulation of the peripherals.
 Adapt one of the open-source AVR simulators. There are several, and some even support an interface to gdb. Unfortunately, most of these seem to have been only partially completed, and don’t have any current developers working on them. I think they all have the same “CPU vs Board” issues as AVR Studio.
 Write a “virtual Arduino” environment that runs natively on your host system. In other words, re-write all the arduino libraries and simulate the overall operation, but WITHOUT doing CPU or hardware-level simulation. You’d compile your sketch to x86 code and debug at the C level (using gdb or whatever), and calls to “digitalWrite()” would remember the state of the bit and change the appearance of a cute little picture of an Arduino, or spit out “%% digitalWrite(N, HIGH)” to an “out-of-band” console, all without ever getting down to the level of AVR code, or registers, or memory locations (useless for debugging “out of memory” issues, though!) It would be messy or impossible to support actual AVR hardware, but as long as users kept to the high-level functions, it could work surprisingly well. The functions you write can be arbitrarily complex (“%% digitalWrite(N, HIGH) on pin NOT in output mode”), and you get to debug the logic of YOUR code without the complications of actual hardware (also without the features and accuracy of actual hardware, but… that’s an often useful trade-off; ask the digital designers spending millions on simulating their ASICs at 1/1e6 actual speed…)
re-write all the arduino libraries and simulate the overall operation, but WITHOUT doing CPU or hardware-level simulation.
Sigh. On second thought, this would be harder than I thought, since most desktop compilers would have a hard time being convinced to create and use 16-bit "int" and pointer variables. You COULD do it even with mismatched data sizes, but it would be a significant step downward in usefulness for debugging, I think.
since most desktop compilers would have a hard time being convinced to create and use 16bit "int" and pointer variables.
Not really. A short int is 2 bytes on most systems; just checked a PPC mac, SPARC, and an intel linux box.
A short int is 2 bytes on most systems; just checked a PPC mac, SPARC, and an intel linux box.
But you need "int" to be 2 bytes, and I don't think "#define int short" will cut it. Harder still is that sizeof(char *) on AVR is also 2 bytes.
Yeah, a 2 byte address is going to be a lot harder to arrange.