
Topic: Why are there no 8-bit CPUs with GHz clock-rates?

tim7

Most 8-bit AVR chips have a maximum clock of 20MHz.  For many applications that's enough, but it made me wonder: why do even modern 8-bit CPUs lag so far behind 32-bit CPUs in terms of clock rate? 

I can see why 8-bit CPUs lack sophisticated features to deal with several program instructions simultaneously, but is the difference entirely down to pipelining?  Or can 32-bit chips carry out a complete instruction cycle in less time than an 8-bit chip?  And if so, how?

Please excuse the slightly OT post.  I hope the subject is of some common interest.  I'm sure this is basic stuff for any computer science student, but digging around on the web I didn't turn up any answers.

P18F4550

Hi, the simple answer is they don't need high clock speeds.
8-bit MCUs get used in everything from TV remotes to washing machines, and yes, some higher-end industrial machinery, but basically for control purposes only, and they don't require the computational grunt that, say, a games console would.

tim7

I see your point, but I don't think that can be the whole story.

Fast 32-bit CPUs have been around for a long time now.  They're cheap and widespread.  And you'd expect modern 8-bit CPUs to be built along the same design principles: same transistors and so on, at least for the CPU core.  All other things being equal an 8-bit core is much smaller and simpler than a 32-bit core, so you'd expect it to be at least as fast.

Now it doesn't make sense to design a hugely complicated 8-bit CPU with parallel instruction execution and so on, so overall performance will be less.  But equally it doesn't make sense to artificially limit the clock rate when a minority of applications could use that extra speed (the same principle has driven PC development for decades).

So is the difference due to CPU complexity - do 32-bit CPUs execute 50 instructions in parallel to get their 50x higher clock rate?

Or are 8-bit CPUs built along fundamentally different lines for some good reason?

kg4wsv

I think that pretty much is the whole story.  The applications don't require higher clock rates - if they did, there would be a higher clock rate 8 bit MCU on the market, because as you point out the technology is there, and if there was a buyer then a seller would fill that void.

GHz microprocessor (not microcontroller!) clock rates are achieved by many sophisticated techniques - pipelining, caching, branch prediction, etc.  These techniques require lots of silicon and lots of engineering. They also require more sophisticated compilers to take advantage of the specialized hardware architecture.  Since it's a microprocessor, that means a pile of support components to make a functional computer out of it.  It's a larger, more complex and more expensive system.
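
Just to illustrate one of those techniques, here's a toy desktop-C benchmark (my own sketch, nothing to do with any particular chip): it does identical work on unsorted and then sorted data, and on a branch-predicting CPU the sorted pass is usually several times faster because the "x >= 128" branch becomes predictable.  Timings vary by machine, and an aggressive optimizer may replace the branch with branchless code, which hides the effect.

Code:
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N     65536
#define REPS  2000

static int cmp(const void *a, const void *b)
{
    return *(const unsigned char *)a - *(const unsigned char *)b;
}

/* Count bytes >= 128; the data-dependent branch inside the loop is
 * what the CPU's branch predictor has to guess. */
static long count_big(const unsigned char *v)
{
    long sum = 0;
    for (int r = 0; r < REPS; r++)
        for (int i = 0; i < N; i++)
            if (v[i] >= 128)
                sum++;
    return sum;
}

int main(void)
{
    static unsigned char data[N];
    for (int i = 0; i < N; i++)
        data[i] = (unsigned char)rand();

    clock_t t0 = clock();
    long unsorted_count = count_big(data);   /* branch outcome is random */
    clock_t t1 = clock();

    qsort(data, N, 1, cmp);

    clock_t t2 = clock();
    long sorted_count = count_big(data);     /* branch outcome is predictable */
    clock_t t3 = clock();

    printf("unsorted: %.2f s   sorted: %.2f s   (counts %ld / %ld)\n",
           (double)(t1 - t0) / CLOCKS_PER_SEC,
           (double)(t3 - t2) / CLOCKS_PER_SEC,
           unsorted_count, sorted_count);
    return 0;
}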

Users whose microcontroller applications require more computing horsepower than an 8-bit part can provide are already using 16- and 32-bit microcontrollers (ARM, anyone?) for those applications.

-j

dxw00d

You also need to consider power consumption and heat generation. The ATmegas run happily on very little power and don't waste it heating the environment, so they can run in enclosed spaces without needing heatsinks or fans.

MichaelMeissner

I would imagine it comes down to return on investment.  Designing a modern 32/64-bit chip takes a large staff of hardware design engineers.  To get the highest speeds you need to shrink the process smaller and smaller, which means newer fabs that start out in the billions of dollars.  To achieve those speeds, the chips do not execute instructions sequentially, but instead pipeline the whole process (with decode, execute and write-back stages that may each take several internal cycles), and you typically have many instructions in flight.

The 8-bit chips are typically bought very cheaply compared to the higher-end chips, and most customers do not need the extra speed.  If you are doing something like waiting for a button press, you don't need it; if you do, you would typically go to a higher-priced chip.

In addition, if you look at your average program, the 8-bit chips often spend several instructions doing the equivalent of a single 16- or 32-bit instruction on higher-end platforms.  So if you need speed, you get an extra bump just by going to a 32- or 64-bit platform directly.
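
A rough illustration (a plain-C sketch of my own, not taken from any particular part's documentation - exact instruction counts depend on the compiler, so treat the comments as ballpark):

Code:
#include <stdint.h>

/* One 32-bit addition in plain C.
 * On an 8-bit AVR (avr-gcc) this typically compiles to four byte-wide
 * operations - an ADD followed by three add-with-carry (ADC) steps -
 * plus the register shuffling needed to feed them.
 * On a 32-bit core such as an ARM, the same source is a single ADD. */
uint32_t add32(uint32_t a, uint32_t b)
{
    return a + b;
}

/* 32-bit multiply is worse still: the AVR only has an 8x8 hardware
 * multiplier, so the compiler has to build the result from partial
 * products, while most 32-bit cores do a 32x32 multiply in one
 * instruction. */
uint32_t mul32(uint32_t a, uint32_t b)
{
    return a * b;
}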

Jack Christensen

Interesting question, and of course no one answer. It occurs to me to think about it in multiple dimensions. Such as speed, word size, application complexity, power consumption, interface requirements, memory requirements, storage requirements, various costs, ROI, etc. etc. Think about the "solution space" represented by these multiple dimensions, and the real-world problems that need to be solved. I suspect that things are not evenly distributed, that is, there are sweet spots around which the solutions tend to cluster.
MCP79411/12 RTC ... "One Million Ohms" ATtiny kit ... available at http://www.tindie.com/stores/JChristensen/

James C4S


Quote from: tim7
I see your point, but I don't think that can be the whole story.

Well, then, let's start listing out applications for 8-bit MCUs with very high clock rates that wouldn't be appropriate for 32-bit CPUs.

I can't think of one.
Capacitor Expert By Day, Enginerd by night.  ||  Personal Blog: www.baldengineer.com  || Electronics Tutorials for Beginners:  www.addohms.com

tim7

Re. motivation:  any project with a significant investment in 8-bit architecture (either code or experience) looking to make incremental improvements might benefit from a faster 8-bit CPU.

But I prefer to put the question the other way around: given the prevalence and cheapness of fast CPUs,  why are modern 8-bit CPUs still relatively slow?  I'm sure the designers did not turn their back on the available technology without a good reason.  So what is it about their design which stops them running at the same clock rate as other recent processor designs?

Re. power and heat:  look at mobile phone CPUs, already well into the GHz range.  Power-saving modes mean there's little to gain by using slower processors -- in other words the energy consumed per operation does not vary much.  But you're always free to use a slower clock should you desire.
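
(A minimal sketch of what I mean by "use a slower clock", assuming an ATmega328P-class part and avr-libc - the prescaler and sleep calls are the standard library ones, and some wake-up interrupt source is assumed to be configured elsewhere:)

Code:
#include <avr/power.h>      /* clock_prescale_set()           */
#include <avr/sleep.h>      /* set_sleep_mode(), sleep_mode() */
#include <avr/interrupt.h>  /* sei()                          */

int main(void)
{
    /* Divide the system clock by 8 at run time: a 16 MHz part now runs
     * at 2 MHz with a correspondingly lower current draw.  (Timers,
     * UART baud rates, etc. are slowed down along with it.) */
    clock_prescale_set(clock_div_8);

    sei();  /* interrupts must be enabled so something can wake us */

    for (;;) {
        /* Between events, idle the CPU entirely; a pin-change, timer
         * or UART interrupt (configured elsewhere) wakes it back up. */
        set_sleep_mode(SLEEP_MODE_IDLE);
        sleep_mode();
    }
}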

Re. parallel execution:  I agree there's no sense in pipelining an 8-bit processor.  I'm just wondering how much of the speed difference that accounts for.  Surely not all of it, 50x or more?

Re. cost and fabrication:  The 32-bit Raspberry Pi on my desk cost *less* than the 8-bit Arduino Ethernet sitting next to it.  I don't know the costs of the CPU/MCU chips, but they can't be wildly different.  As for return on investment, well I bet more 8-bit chips are sold than 32-bit.  They're absolutely everywhere, and even used as peripherals to 32-bit processors.

There may be a clue in the voltages the CPUs need: the ATmegas need 5V to run at 20MHz, or 1.8V at 4MHz, whilst the ARM core in the Raspberry Pi runs at 700MHz with 1.2V and the maximum recommended voltage is something like 1.4V.  So it sounds like there is a fundamental difference at the transistor level.  Perhaps reliability has something to do with it.  Anybody know?

James C4S

You are way over-analyzing the situation.

It's as simple as this: what applications require such performance?  Companies do not create products just because they can; they create products that people will buy.

Again, what are some actual applications that an 8-bit processor with much higher clock rate would serve, that 32-bit processors wouldn't?
Capacitor Expert By Day, Enginerd by night.  ||  Personal Blog: www.baldengineer.com  || Electronics Tutorials for Beginners:  www.addohms.com

tim7

James, you misunderstand me.  This is a purely technical question motivated by curiosity about the inner workings of microprocessors.  I'm not saying Atmel should upgrade their 8-bit chips; I'm wondering why a modern chip doesn't "just work" at higher clock rates, irrespective of demand or application.


  • In 1975 when the 6502 was designed, transistors were big and slow.  Result: a processor which ran at 1 or 2 MHz.

  • In 1996 when the first AVR was designed, logic transistors were capable of 200MHz.  But AVRs do not run at this speed.  Why not? 



It could be due to memory speeds, it could be for improved reliability, it could be for compatibility with the numerous on-board peripherals a micro-controller needs, it could be that slow transistors are better in some other way, or it could be something else entirely.  I thought the answer might be interesting, and that somebody here might know.

retrolefty


Quote from: tim7
James, you misunderstand me.  This is a purely technical question motivated by curiosity about the inner workings of microprocessors. [...] I thought the answer might be interesting, and that somebody here might know.


The maximum speed a given chip can operate at is a function of its design, and is one of many trade-offs the manufacturer decides on when drawing up the design specifications for the chip. Speed is also a major factor in how hot the chip runs, and the chip must be designed to dissipate that heat when run at its maximum rated speed.

If a given manufacturer thought they could make a profit designing and building an 8-bit micro that runs at GHz speeds, they would have done so by now, as the technology already exists. Such a profitable market therefore must not exist at this time, if ever. Such is the rule of supply and demand in a free-enterprise market. They cannot be expected to take on the large development costs of such a product in the hope that it will sell in profitable quantities; there must be valid market research done first to see whether such a development risk is worthwhile.

I know, for example, that Motorola designed simple 8-bit microcontrollers and some support chips in the '80s that could operate at up to around 350 degrees F. These were aimed at the oil and gas industry for use in real-time drill-bit monitoring instrumentation. It was a limited market, but they felt they could sell enough at a given price to make a profit, though the chips certainly cost far more than normal commercially rated parts for standard industrial temperature ranges.

Lefty

AWOL

It isn't a technical question, and it doesn't have a technical answer - the answer is simply "because there's no market for them".
"Pete, it's a fool looks for logic in the chambers of the human heart." Ulysses Everett McGill.
Do not send technical questions via personal messaging - they will be ignored.

Chagrin

  • For faster clock speeds the fabrication process has to be smaller (more expensive) and voltages must be lower as well. Just like the insulation on a wire, thicker insulation tolerates higher voltages. If you speed the chip up, you lose the ability to interface with 5V devices directly.

  • AVR chips have A/D converters, and apparently these require the larger/slower fabrication process.

  • Smaller fab processes leak more current. One of the nice features of the AVR chips is that they aren't power hungry. You'd give that away if you sped it up.

  • Cost ;)



I don't see it as an 8-bit vs. 32-bit problem at all. Just try to find any 5V-capable microcontroller over ~50MHz with A/D conversion.

James C4S


Quote from: tim7
James, you misunderstand me.  This is a purely technical question motivated by curiosity about the inner workings of microprocessors.

That's the issue, though: from a technical perspective, there isn't a reason it couldn't be done. In fact there are probably prototypes on a shelf somewhere.

The reason you don't see them is that there isn't a market for them (or at least not a large enough one).
Capacitor Expert By Day, Enginerd by night.  ||  Personal Blog: www.baldengineer.com  || Electronics Tutorials for Beginners:  www.addohms.com
