I'm curious why all of the Arduinos have 16MHz crystals, when they (or most of them) can go up to 20MHz.
Is it because 16MHz crystals are readily available and cheaper, or is there something fundamentally better about running the chips at 16?
I'm looking forward (who isn't) to the Arm-based Arduino (Due?), but I suspect making the compiler work with an Arm core is going to be a serious chore for the design crew. I wonder if there will be licensing issues and costs associated with that, which will be passed on to us.
Arm licensing isn't cheap. We use an Arm compiler where I work, and it is quite expensive.
96MHz is an even multiple of 16, so there must be something pertinent regarding that.
cappy2112:
I'm looking forward (who isn't) to the Arm-based Arduino (Due?), but I suspect making the compiler work with an Arm core is going to be a serious chore for the design crew. I wonder if there will be licensing issues and costs associated with that, which will be passed on to us.
Arm licensing isn't cheap. We use an Arm compiler where I work, and it is quite expensive.
No chore at all. gcc has had ARM support for quite some time.
And just like with AVR there would be no monetary cost to use gcc for ARM.
The task is getting Arduino to work on DUE.
The Maple guys already have Arduino working on their ARM chip using gcc.
The chipKIT Uno32 and Max32, which use a PIC32 processor (a MIPS core), also use gcc,
and they have their boards up and working with it.
In fact, the mpide that chipkit uses is extensible to any platform and currently supports both AVR and PIC32
based boards. http://www.digilentinc.com/Products/Catalog.cfm?NavPath=2,892&Cat=18
bperrybap:
The task is getting Arduino to work on DUE.
That's just an abstraction layer, + java on top. Right?
I'm curious why your company wouldn't use gcc?
I can't give you the exact reason, because I don't write the FW we use. But the Arm compiler generates the FW that goes
into an Arm7 that goes into our products.
I would speculate that the Arm compiler is much more efficient than gcc, but that's not first-hand knowledge.
I will ask one of the FW engineers next week to find out what the difference is.
bperrybap:
The task is getting Arduino to work on DUE.
That's just an abstraction layer, + java on top. Right?
Not really. While it is an abstraction layer, it's not all that clean or ready to
support architectures other than AVR.
There is core-specific code under the hood that has to be updated or created:
code that has to know how to map things onto the specific hardware, and
there is often some processor-specific code, some of which has to be in
assembler to make certain things work.
What makes it really tough in this case is that there is new hardware internal to the micro-controller,
things are going from 8 bits to 32 bits,
and the CPU architecture is different.
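To give a concrete flavour of the core-specific mapping code involved (this is a simplified sketch of the idea, not the actual Arduino core source), the AVR side boils down to tables that tie a pin number to an 8-bit port register and a bit mask; a 32-bit ARM part needs a completely different table, with different register widths and addresses:

    #include <avr/io.h>
    #include <stdint.h>

    // Illustrative only: map an Arduino-style pin number to an AVR output
    // port register and the bit for that pin within it.
    struct PinMap {
      volatile uint8_t *port;   // 8-bit AVR PORTx register
      uint8_t mask;             // bit mask for this pin
    };

    static const PinMap pinTable[] = {
      { &PORTB, _BV(5) },   // e.g. digital pin 13 on an ATmega328P board
      { &PORTD, _BV(2) },   // e.g. digital pin 2
    };

    static void pinHigh(uint8_t pin) {
      *pinTable[pin].port |= pinTable[pin].mask;   // 8-bit read-modify-write
    }

On a Cortex-M part the table entries become 32-bit registers at entirely different addresses, so essentially none of this carries over unchanged.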
There is lots of stuff in the current Arduino IDE that is hard coded in the
IDE itself rather than being in scripts or configuration files.
All that stuff has to change.
The chipKIT guys have already gone through the Arduino IDE and cleaned it up to use files to control
the architecture, so it can support multiple architectures in the same IDE,
but it's not clear how the Arduino team wants to approach this, or whether they
intend to use any of the new MPIDE for DUE, as that means having to relinquish control over the IDE.
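For comparison, some of the per-board information already lives in a configuration file (boards.txt); the entries below are paraphrased from memory of a 0022-era install, so treat the exact keys and values as approximate:

    uno.name=Arduino Uno
    uno.upload.speed=115200
    uno.build.mcu=atmega328p
    uno.build.f_cpu=16000000L
    uno.build.core=arduino

The compiler choice and most of the build logic, on the other hand, sit inside the IDE's Java code, and that is the part that has to be generalised before another architecture can slot in.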
I'm curious why your company wouldn't use gcc?
I can't give you the exact reason, because I don't write the FW we use. But the Arm compiler generates the FW that goes
into an Arm7 that goes into our products.
I would speculate that the Arm compiler is much more efficient than gcc, but that's not first-hand knowledge.
I will ask one of the FW engineers next week to find out what the difference is.
I doubt it.
The gcc compiler and toolset
has evolved over more than 25 years and has added support for just about every CPU architecture out there.
The most common reason given by companies not to use gcc is that since it is "free" there is no support,
or "how can you trust something that is free?"
But typically the code generated by commercial compilers isn't any better, and sometimes it is not as good,
and the need for support is somewhat of a myth.
gcc, particularly on architectures like ARM, is in such widespread use that any bugs that
do show up are fixed very quickly.
AFAIK, there is no equivalent of avr-libc for most of the ARM chips, though. (there's "newlib", I guess, which covers some of the ground, but probably isn't very microcontroller oriented.) Whether a chip vendor provides open-source libc and peripheral libraries, or even include files defining the addresses and bits of the peripherals, is a bit hit-or-miss. Writing code for a microcontroller that has 1000 pages worth of chip documentation, with nothing but a bare C compiler is not a pleasant task.
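For anyone who hasn't done it: "a bare C compiler" in practice means transcribing register addresses out of those 1000 pages yourself, along the lines of the fragment below, where the address and bit position are invented purely for illustration:

    #include <stdint.h>

    // Made-up base address and offset -- the real numbers come out of
    // the chip's reference manual, one register at a time.
    #define FAKE_GPIO_BASE   0x40020000UL
    #define FAKE_GPIO_ODR    (*(volatile uint32_t *)(FAKE_GPIO_BASE + 0x14))

    static void led_on(void)
    {
        FAKE_GPIO_ODR |= (1UL << 5);   // drive one output pin high
    }

Multiply that by every timer, UART, ADC and DMA channel on the chip, and the value of a decent vendor-provided header and peripheral library becomes obvious.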
The problems that ChipKit has run into with proprietary libraries, or the problems that Maple ran into with vendor-provided libraries of questionable usefulness, are examples of some of the issues.
But it shouldn't be impossible.
The gcc compiler and toolset has evolved over more than 25 years and has added support for just about every CPU architecture out there.
The most common reason given by companies not to use gcc is that since it is "free" there is no support.
Many of the "commercial" compilers ARE gcc. gcc+libraries+support+ide+debuggerSupport+deviceDependentStuff. And frequently these days the IDE is something open-source as well (like Eclipse.) That's worth quite a bit; it's far cheaper to buy a copy of Code Red's ARM environment than to have your Engineer Who Is Supposed To Be Working On The Product spend a week putting all the little pieces together.
And gcc isn't perfectly wonderful, either. The gcc developers' priorities may not match the "customers" or the "vendors." One of the big support things WE used to pay for was "could you please make version X of the compiler backward compatible with the feature we used back in version X-3?" For instance, Arduino (and AVR in general) is currently sort of stuck on a rather old version of gcc, because all of the more recent versions are broken on AVR in ways that are more serious than the older version... (Arduino uses 4.3.2. Atmel ships a patched 4.5.x. Last WinAVR/Crosspack distributions are from about 2010. Current gcc version is 4.7, and it intentionally breaks the way that PROGMEM has been used...)
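To make the PROGMEM point concrete (from memory, so the exact version boundaries may be off): newer avr-gcc releases only accept the progmem attribute on const-qualified objects, which breaks a declaration style that older Arduino code used everywhere:

    #include <avr/pgmspace.h>

    // Accepted by the old toolchain, rejected or warned about by newer ones:
    //   char oldMsg[] PROGMEM = "stored in flash";
    // The newer compilers want the const qualifier:
    const char msg[] PROGMEM = "stored in flash";

    char buf[20];

    void copyIt(void) {
        strcpy_P(buf, msg);   // flash reads still go through the pgmspace helpers
    }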
cappy2112:
I'm curious why all of the Arduinos have 16MHz crystals, when they (or most of them) can go up to 20MHz.
Is it because 16MHz crystals are readily available and cheaper, or is there something fundamentally better about running the chips at 16?
A few reasons. Early boards used 16MHz, so a different clock speed would break compatibility with older programs.
Also, I suppose it's best to run the chips a little below their maximum speed for best stability.
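The compatibility point is easy to see in code: the core's timing is all parameterised on F_CPU, so a sketch built for 16MHz but run on a 20MHz board gets every delay and millis() reading wrong. Roughly (paraphrasing the idea rather than quoting the actual core source):

    #define F_CPU 16000000UL                            // what an Uno-class board assumes

    #define CYCLES_PER_MICROSECOND  (F_CPU / 1000000UL) // 16 at 16MHz

    // millis(), micros(), delay() and avr-libc's _delay_ms() are all derived
    // from F_CPU in this fashion; change the crystal without rebuilding
    // everything for the new value and the numbers no longer mean anything.
    static unsigned long cycles_to_microseconds(unsigned long cycles)
    {
        return cycles / CYCLES_PER_MICROSECOND;
    }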
I'm looking forward (who isn't) to the Arm-based Arduino (Due?)
I am not particularly interested in it.
but I suspect making the compiler work with an Arm core is going to be a serious chore for the design crew. I wonder if there will be licensing issues and costs associated with that, which will be passed on to us.
Shouldn't be. GCC already has support for it. I don't think there are any other licensing issues, but I really don't know.
Arm licensing isn't cheap. We use an Arm compiler where I work, and it is quite expensive.
GCC is free open source software and thus does not require licensing, as stated above.
Too bad they didn't choose a chip with more flash.