I think I know some electronics, having taken a multitude of university courses, though those were rather mathematical, with differential equations, matrix algebra, signal processing, and the like.
Back then, and in theory, digital electronics were 1s and 0s, represented by 5V and 0V. Then, when you build a circuit, you need some tolerance because of component/fabrication variability, and the low becomes 0-0.8 V and the high 4.2-5.0 V ...
All of the above is fine, and I understand. Then came the Arduino and Raspberry Pi ... and now we have both 5V and 3.3V (For sure there was a distinction before, but most of us may not have known back then).
My question is:
1. Why are there two different 'High' levels today, i.e., 3.3V vs. 5.0V?
2. Does this have to do with the properties of the components used? e.g., the chemistry of semiconductors
3. Someone thought 3.3V has lower power requirements than 5.0V?
...
I did google this, but the answers were all over the spectrum.
It is easier to make smaller, faster, more power efficient processors that run at lower voltages.
The general pattern is: lower voltage / higher frequency / more sophisticated computational capability / wimpier pin drivers / more abstraction between you and the hardware / lower power consumption (when not sleeping; when sleeping, they're all damned near zero) versus higher voltage / slower clock speed / more rudimentary computational features / beefier pin drivers / closer to the bare metal. I find the latter type of processor way more fun to work with, personally... but depending on what you want to do, your computational needs can make the decision for you.
There are actually lots of different "high" voltages. Some parts are down at like 0.8V... there's a lot in the 1.x V range, and 2.x V is not unheard of either...
How fast digital logic can operate depends in part on how fast a signal can go from low to high or high to low. Take a look at a 5V digital signal and you may find that the transition is not instant, and it takes longer to go from 0 volts to 5 volts than from 0 volts to 3.3 volts; that's the slew rate.
5V is what the Arduino used before all the 3.3V components became available thanks to the cell-phone age.
The cell phone, and all of its components, run on a single LiPo battery that goes from 4.2V fully charged down to 3.3V, or maybe 3.0V discharged.
Those components do not like 5V.
The data sheet for each processor will tell you the voltages for HIGH and LOW signals. On the AVR 8-bit processors used for the Uno and Mega those levels are defined as less than 0.3 * Vcc for LOW and more than 0.6 * Vcc for HIGH. [Moderator edit: with Vcc = 2.4V to 5.5V]
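Just to make that Vcc-relative rule concrete, here is a minimal sketch (plain C++, written for this thread rather than taken from any datasheet) that prints the resulting thresholds for a 5V board and a 3.3V board; the 0.3/0.6 factors are the ones quoted above, and the actual datasheet remains the authority:

```cpp
// Illustration of the Vcc-relative logic thresholds quoted above:
// V_IL(max) = 0.3 * Vcc and V_IH(min) = 0.6 * Vcc (8-bit AVR rule of thumb).
// For illustration only; always check the actual datasheet.
#include <cstdio>

int main() {
    const double supplies[] = {5.0, 3.3};   // typical 5V Uno vs. a 3.3V board
    for (double vcc : supplies) {
        double vil_max = 0.3 * vcc;         // anything below this reads LOW
        double vih_min = 0.6 * vcc;         // anything above this reads HIGH
        std::printf("Vcc = %.1f V: LOW < %.2f V, HIGH > %.2f V, undefined in between\n",
                    vcc, vil_max, vih_min);
    }
    return 0;
}
```

At 5V that works out to roughly LOW below 1.5V and HIGH above 3.0V; at 3.3V, LOW below 0.99V and HIGH above 1.98V.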
The real answer is quite simple. The time it takes to change a semiconductor pin that is currently at 5 volts to the ground voltage of 0 volts is much longer than it takes the same pin to change from 3.3 volts to 0 volts.
The processor in my laptop runs at 1.8 volts for the same reason: speed of the processor.
When an IC gate or other piece of a device changes from 1 to 0 or from 0 to 1, it takes time to charge (or discharge) its inherent capacitance. The smaller the charge, the quicker the transition.
So, all the chemistry and all the power and all the other suggestions are correct to a point, but not the key reason.
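A rough back-of-the-envelope way to see the charging argument above: if you assume (purely for illustration) a fixed load capacitance and a fixed drive current, the time to slew through a voltage swing is just t = C * dV / I, so a 3.3V swing finishes sooner than a 5V one. A minimal sketch with made-up but plausible numbers (20 pF, 20 mA; not measurements of any real chip):

```cpp
// Back-of-the-envelope model of the point above: with a fixed capacitance C
// and a fixed drive current I, the time to swing a pin through dV volts is
// t = C * dV / I.  The 20 pF / 20 mA figures are illustrative assumptions.
#include <cstdio>

int main() {
    const double C = 20e-12;   // assumed load capacitance, 20 pF
    const double I = 20e-3;    // assumed drive current, 20 mA
    const double swings[] = {5.0, 3.3};
    for (double dv : swings) {
        double t = C * dv / I; // seconds to slew through dv
        std::printf("Swing of %.1f V: about %.2f ns\n", dv, t * 1e9);
    }
    return 0;
}
```

With those assumptions the 5V swing takes about 5 ns and the 3.3V swing about 3.3 ns; the numbers are invented, but the ratio tracks the size of the voltage swing.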
Why are there two different 'High' levels today, i.e., 3.3V vs. 5.0V?
There are, and always have been, different voltages that can represent logic levels. For example, the early valve computers used a hundred volts to represent a logic one. Some early transistor logic used 12V for a logic one. I once made a computer that used a hole and a slot in a card, or just a hole with no slot, to represent zero and one. It is just that the two you know are 5V and 3V3; yours is a small world.
Does this have to do with the properties of the components used? e.g., the chemistry of semiconductors
It is more like the technology used.
Someone thought 3.3V has lower power requirements than 5.0V?
Yes, they were right. Using 3V3 lets you use a smaller pattern in the silicon than you need with 5V.
jb63:
3. Someone thought 3.3V has lower power requirements than 5.0V?
Yes, that's exactly it: power depends on the square of the supply voltage times the clock frequency times the capacitance of the circuitry on the IC(*). To get more complex circuits you have to keep the power dissipation down, which means lower voltages (as circuits get denser and faster, the power density would rise to impossible levels unless it was compensated for by lowering the voltage).
Modern full-on processors in fact run at 0.8V or below; only the external circuitry is at 3.3 or 5.0V.
Lowly microcontrollers can be 5V or 3.3V as they are much simpler (10,000 times fewer transistors or so!).
(*) Well, you sum the product of frequency and capacitance per gate - some parts of the IC work harder than others...
OTOH 5V means a better noise margin and easier interfacing with some external devices (LEDs, piezoelectric speakers). Also, some "5V" devices work from "some much smaller voltage" (1.8V for most AVRs) up to 5.5V, which may be advantageous.
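To put hedged numbers on the P ≈ C * V^2 * f relation quoted above, here is a small illustration; the 1 nF of switched capacitance and the 16 MHz clock are invented round figures, chosen only to show that the 5V-to-3.3V power ratio is set by the square of the supply voltage:

```cpp
// Illustration of the dynamic-power relation quoted above: P ≈ C * V^2 * f.
// The capacitance and clock values are made-up round numbers, used only to
// show how the ratio scales with the square of the supply voltage.
#include <cstdio>

int main() {
    const double C = 1e-9;    // assumed total switched capacitance, 1 nF
    const double f = 16e6;    // assumed clock frequency, 16 MHz
    double p5  = C * 5.0 * 5.0 * f;   // dynamic power at 5.0 V
    double p33 = C * 3.3 * 3.3 * f;   // dynamic power at 3.3 V
    std::printf("5.0 V: %.1f mW, 3.3 V: %.1f mW, ratio %.2f\n",
                p5 * 1e3, p33 * 1e3, p5 / p33);
    return 0;
}
```

With those figures it prints roughly 400 mW at 5.0V versus about 174 mW at 3.3V, a factor of about 2.3 from the supply voltage alone.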
vinceherman:
5V is what the Arduino used before all the 3.3V components became available thanks to the cell-phone age.
The cell phone, and all of its components, run on a single LiPo battery that goes from 4.2V fully charged down to 3.3V, or maybe 3.0V discharged.
Those components do not like 5V.
Thank you, gentlemen ... very useful answers for sure, and if anything, the broad scope of answers mirrors what I found from googling.
All in all, I 'think' one of the more valid reasons may have to do with the new chemistry of batteries, as explained above, and likely semiconductors as well. In my first course in Electronic Circuits, many years ago, I recall that transistors (semiconductors) needed specific voltages to operate/switch. Back then, you couldn't run anything on a single 1.5 V battery because the available chemistry did not allow it. Today, the mouse you're using likely runs on a single AA battery ... This was unthinkable 2-3 decades ago.
It seems safe to say, then, that there are some ideas as to why that is the case ... but no one knows for sure ... kinda like Evolution ... ? We got here for multiple reasons/technologies, but the path is not really clear, i.e., take it for what it is, High/Low do not always translate to 5/0 V.
jb63:
take it for what it is, High/Low do not always translate to 5/0 V.
That has been the case in digital electronics since at least the introduction of CMOS devices, where different logic levels were quoted for a range of supply voltages.
And that was some 50 years ago, so nothing new...
jb63:
It seems safe to say, then, that there are some ideas as to why that is the case ... but no one knows for sure ... kinda like Evolution ... ? We got here for multiple reasons/technologies, but the path is not really clear, i.e., take it for what it is, High/Low do not always translate to 5/0 V.
I don't think that's the case. The reason there are standards is that it makes interfacing components easier. The reason that there are multiple standards is that the component families are optimized to different ends within the constraints of the process technology available at the time.
Take for example the Raspberry Pi, cited as a 3.3 V device in the original post. The I/O is 3.3 V so that it can interface with 3.3 V peripherals, but the processor core runs at a nominal 1.8 V to reduce power consumption and the consequent heat dissipation issues. This approach has been ubiquitous across high-performance processors and FPGA devices for the last couple of decades.
The early digital logic ICs, RTL, ran at 3.6 volts. Then TTL was developed to be high speed at 5 volts and much higher current. That was the way to get high speed, back then, while using junction transistors.
Herb