Resistor Size.

Hi,

So I was looking to get 100 Ohm resistors. I went out to Radio Shack and bought them, and I came home to find that they were massive in comparison to my 330 Ohm resistors and my 10k resistors (which came with my starter kit), like so massive that they could without a doubt eat all the other resistors. Turns out that they are 1/2 watt, so that brings me to my question: can I still use them, or are they meant for larger electronics? Or are resistors, resistors?

...can I still use them?

Yes. They are just capable of handling more power; up to 1/2 watt.

They are capable of more power, but does that mean that it works more on a lesser powered electronic?

Teddy1:
They are capable of more power, but does that mean that it works more on a lesser powered electronic?

CB said they can handle more power... if you're not giving them more power (which is the product of voltage and current), it simply means you have a component which is over-specified for the job. It's capable of dissipating more heat than you actually need it to.

No harm done, except in a real project with 1000s of resistors you would have wasted money. And maybe have a problem trying to fit them in....

Neat! Thanks guys!

Just to expand on the replies already given, the power rating of a resistor is a measure of how much power it can handle without being damaged by self-heating. Dissipating power in a resistor produces heat, and if you run a resistor near its power rating it will be hot, sometimes painfully so if you were to touch it. If you develop a circuit that dissipates 1/2 watt in a particular resistor, you just need to be sure to get a resistor that can handle AT LEAST 1/2 watt. If you place a 1/2 watt resistor in the circuit, that resistor will run hot but safely. If you place a 10 watt resistor in the same circuit, the circuit will perform exactly the same, but the same power will cause a much smaller temperature rise in the resistor; you probably would not even sense it if you were to touch it. The advantage of a larger resistor is less heat, with the penalty of occupying more space and costing more. Make sense?

To determine the power a resistor needs to survive, use any one of three equations:

  1. power = voltage (across the resistor) times current (through the resistor, in amps)
    or
  2. power = current (through the resistor, in amps) squared times resistance (in ohms)
    or
  3. power = voltage (across the resistor) squared divided by resistance (in ohms)
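
If it helps to see the arithmetic, here is a rough Python sketch showing that all three formulas give the same answer. The numbers are only assumptions for illustration (5 volts across a 100 ohm resistor, like the ones you bought):

# Three equivalent ways to compute the power dissipated in a resistor.
# Example values are assumptions: 5 V across a 100 ohm resistor.
V = 5.0            # volts across the resistor
R = 100.0          # resistance in ohms
I = V / R          # current through the resistor, in amps (Ohm's law)

p1 = V * I         # 1. P = V * I
p2 = I ** 2 * R    # 2. P = I^2 * R
p3 = V ** 2 / R    # 3. P = V^2 / R

print(round(p1, 3), round(p2, 3), round(p3, 3))   # 0.25 0.25 0.25

At 0.25 watts, that resistor would be sitting right at the limit of a 1/4 watt part, which is exactly the situation where you'd reach for a 1/2 watt one like yours.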

Lyle

That makes a lot of sense. This might be kind of a dumb question, but is there any way that I can figure out how much power my other electronic components can handle? For instance, I think I burnt out the red bulb in my LED bulb (that has the three bulbs), because I was playing a piezo speaker as well. So I think it had too little power. Is there a way to tell how much power is being drawn by the speaker, and how much the bulb needs? I got them in a kit without information.

I guess you have fed the LEDs 5 V with no resistors (or ones that are too small).
The red has the lowest voltage drop and thereby gets the most current - enough to burn it.
Use resistors in the range 100-1000 Ohm (highest value on the red) to match your needs for light intensity.
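
To put rough numbers on that, here is a small Python sketch. The forward voltage drops are only typical assumed values (around 1.8 V for red, around 3.2 V for green and blue), not specs from your kit:

# Why the red LED draws the most current on the same supply and resistor:
# it has the lowest forward drop, so more voltage is left across the resistor.
# All values below are assumptions for illustration.
supply_v = 5.0       # assumed supply voltage
resistor = 100.0     # series resistor in ohms
for color, vf in [("red", 1.8), ("green", 3.2), ("blue", 3.2)]:
    i_ma = (supply_v - vf) / resistor * 1000   # Ohm's law, in milliamps
    print(color, round(i_ma), "mA")            # red ~32 mA, green/blue ~18 mA

With no resistor at all, nothing limits the current except the LED itself, which is how the red one ends up burnt.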

Interesting, I added a 100 Ohm resistor and that helped! Thanks!

Teddy1:
Interesting, I added a 100 Ohm resistor and that helped! Thanks!

You shouldn't just "add" a resistor of a random value; this won't work in the long term.

You need to know the specs of the LED (or other part) in question; basically how much current it pulls (and, in the case of an LED, its forward voltage drop and the voltage you are supplying it with).

You also need to understand Ohm's law (and know how to use it) to figure out what your voltage/current/resistance needs are (depending on what you already know about the parts and such).

[links to Ohm's law tutorials and worked examples went here]

...in fact, you might want to review that entire site, as well as pick up some books or such on basic electronics.

Also - review "LED resistor calculation" for more detail on that topic...

Electronics isn't simple "rule of thumb" engineering - you can get by with that for a while (for instance, for most LEDs you can use between a 330 ohm and a 1K ohm resistor for current limiting - but without knowing and understanding Ohm's law, you'll never know -exactly- how much current the LED is pulling from your I/O pin), but ultimately you need to be able to understand and do the math to know whether the parts you have selected will work in the application you are applying them to (or if you will burn something out).

Electronic design, unfortunately, has more than a bit of math involved, especially as you move into more complex devices like transistors and MOSFETs (where you have to calculate currents, voltages, and more - to properly turn the device on and off, or keep it in the "linear" region if you are doing audio/RF amplification - plus more calcs to determine power dissipation and what the proper heatsink needs to be - heh, heatsinks have spec sheets too!).
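
As a concrete illustration of that Ohm's law check, here is a short Python sketch. The 5 V pin voltage, ~2 V red LED drop and 330 ohm resistor are assumptions for the example, not values from any particular board or kit:

# How much current is the LED pulling from the I/O pin?
# All values here are assumptions for illustration.
pin_v    = 5.0      # assumed I/O pin voltage
led_vf   = 2.0      # assumed forward drop for a red LED
resistor = 330.0    # series resistor in ohms

current_ma = (pin_v - led_vf) / resistor * 1000   # Ohm's law: I = V / R
print(round(current_ma, 1), "mA")                 # about 9.1 mA

Swap in a 1K resistor and the same math gives about 3 mA - still enough to light most indicator LEDs, just dimmer.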

Teddy1:
Hi,

So I was looking to get 100 Ohm resistors. I went out to Radio Shack and bought them, and I came home to find that they were massive in comparison to my 330 Ohm resistors and my 10k resistors (which came with my starter kit), like so massive that they could without a doubt eat all the other resistors. Turns out that they are 1/2 watt, so that brings me to my question: can I still use them, or are they meant for larger electronics? Or are resistors, resistors?

A larger resistor simply can handle more power without becoming "too hot".

Here's how to figure out any resistor you need. Let's use an LED and a 9 volt battery as an example:

Imagine that the LED has a forward voltage drop of 2.5 volts. The data sheet says the LED needs 20 milliamps (0.02 amperes) of current. You have a 9 volt battery.

You know that you will be wiring the LED and resistor in series, then connecting them to the battery. So, to figure out the resistor you need:

R = V / I (resistor = voltage divided by current).

You have 9 volts, and 2.5 volts go across the LED, leaving the rest (9 - 2.5 = 6.5 volts) across the resistor. So:

R = 6.5 / I

Remember that the current in a series circuit is the same at every point, so if you want 20 milliamps through the LED, you also want 20 milliamps through the resistor. So:

R = 6.5 / 0.02
R = 325 ohms.

The closest standard value to that is 330 ohms. Now let's figure out the power rating required for the resistor:

P = V * I (power = voltage times current)

We are using 330 ohms instead of 325, so let's see exactly what the current will be:

I = V / R
I = 6.5 / 330
I = 0.0197 amperes (19.7 milliamps). Close enough. Now for the power:

P = V * I
P = 6.5 * 0.0197
P = 0.128 watts.

A tiny shade more than 1/8 watt (0.125 watts). So a 1/8 watt resistor would be running right at its limit; a 1/4 or 1/2 watt resistor would work equally well, it's just larger and takes up more room.
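
If you want to check this kind of calculation quickly, here is the same worked example as a small Python sketch (the 9 V battery, 2.5 V drop and 20 mA target are just the assumed values from above):

# Worked example: 9 V battery, LED with a 2.5 V drop, 20 mA target current.
supply_v = 9.0
led_vf   = 2.5
target_a = 0.020

v_resistor = supply_v - led_vf       # 6.5 V left across the resistor
exact_r    = v_resistor / target_a   # R = V / I  -> 325 ohms
chosen_r   = 330.0                   # closest standard value

actual_i = v_resistor / chosen_r     # I = V / R  -> about 0.0197 A
power    = v_resistor * actual_i     # P = V * I  -> about 0.128 W
print(exact_r, round(actual_i, 4), round(power, 3))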

As "homework", try this: Imagine you are going to use a 24 volt power supply instead of a 9 volt battery to run the LED. Figure out what resistor you would need and the smallest (power) resistor you can use.

You should come up with:

Closest standard value: 1000 ohms (1K)
Current: 0.0215 amperes (21.5 milliamps)
Power: 0.462 watts (therefore a 1/2 watt minimum is needed)
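
And here is the 24 volt "homework" run through the same way in Python, just to confirm the numbers above:

# Homework check: 24 V supply, same 2.5 V LED drop and 20 mA target.
supply_v, led_vf, target_a = 24.0, 2.5, 0.020

exact_r  = (supply_v - led_vf) / target_a   # 1075 ohms -> pick 1000 (1K)
actual_i = (supply_v - led_vf) / 1000.0     # 0.0215 A with the 1K resistor
power    = (supply_v - led_vf) * actual_i   # about 0.462 W -> 1/2 W minimum
print(exact_r, actual_i, round(power, 3))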

Hope this helps you understand it.

-- Roger

Roger, that actually helped a ton; my physics class gets to that unit after the break, so that makes sense. I'm guessing you were just showing me how to find the current; that's not something you actually need to do, right?

Also, the only info on the LEDs that I have is that it says LED (5mm). Does the 5 mm mean 5 milliamps? It seems like the red bulb is more reactive than the other 2 bulbs though.

5 mm means 5 millimeters.
It is the size of the "bulb" and Google tells me it converts to 0.196850394 inches.

You need to know the maximum current your LEDs can handle, and choose a resistor that will result in a current that is always lower than that maximum.
So yes, you need to know this for the best performance and reliability.

That's what I thought mm meant at first, but that didn't make sense. Haha. So, my kit didn't say its current. It does say to use a 330 Ohm resistor, so I'll just assume that's right.

It is the actual size of the LED, nothing to do with its electrical properties. The common sizes for the traditional basic bullet-shaped round LEDs are 3 mm and 5 mm.

Teddy1:
Roger, that actually helped a ton; my physics class gets to that unit after the break, so that makes sense. I'm guessing you were just showing me how to find the current; that's not something you actually need to do, right?

Also, the only info on the LEDs that I have is that it says LED (5mm). Does the 5 mm mean 5 milliamps? It seems like the red bulb is more reactive than the other 2 bulbs though.

Being able to find the current (or being able to choose the proper resistor to get the current you want) is THE point of using a resistor in the first place!

After getting the resistor value from the calculation (325 ohms), I chose 330 ohms as the closest standard value, then ran it back through just to be sure that I was still getting the LED current I was shooting for.

Of course, I knew that (a) the 330 ohm resistor was fine and (b) the current isn't all that critical, but I ran it back through to show you how to verify your results and be sure you were getting the values you wanted.

As far as a "5mm LED", that is the diameter of the plastic "bulb". 5mm is more or less a "standard" size LED. There are also "jumbo" 10mm LEDs and miniature 3mm parts.

Now, you may think that a "bigger" LED is "brighter", but that's not at all true. The little die (chip) inside the LED is the same size in all of them and the brightness will be more or less the same.

As far as brightness, LEDs have come a long way from what they were even 10 years ago. There used to be relatively little choice in colors. You could get red, yellow and "green", which was actually yellow in a green bulb and produced a yucky yellow-green color.

Now, with new semiconductor materials like gallium nitride, we have REAL green LEDs, beautiful blue ones and even ultraviolet devices.

They are also MUCH brighter than they used to be, as well as more efficient.

Lastly, you may be interested to know that the color of an LED is determined by the chemistry of the little "chip" inside; regardless of the bulb color, an LED only makes one color (the RGB tri-color devices have three separate chips inside).

So, how do they make WHITE LEDs?

They work a lot like a fluorescent lamp. A white LED is actually a blue (or sometimes ultraviolet) LED with a light-emitting phosphor on top of the die. The LED makes bright blue or UV light, and the phosphor fluoresces and converts it to white, just like the coating inside a fluorescent lamp tube does.

Look at any "regular" LED. You will see a little black or blue square chip inside. But a white LED has a big glob of yellowish-white material inside. That's the phosphor, and the blue (or UV) LED die is underneath.

Neat, huh? :slight_smile: