# Noob Resistor Question

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.

#### maLx

Hi, All.

I have a 15V power supply, and I'm trying to wire up an LED so that when the supply is on, the light is on. My friend who built the PSU and is helping me with the electronics said that with the LED I have (MPR3AD Bivar | Mouser), a 2.2K resistor is fine. I happened to have a 1/4 watt 2K metal film resistor, so I soldered it to the LED. It worked, but the resistor started smoking.

My question is: does the resistor have a direction, or does the orientation not matter? I also have 5K 1/4 watt resistors; should I use those instead?

Let me know what I did wrong.

Thanks,
Joey

#### rayma

The diode drops 2V when on, so the drop across the series resistor is 15-2 = 13VDC. Now decide what current to use. Typically for a pilot light, around 10mA is plenty. So 13V/10mA = 1.3k for the resistor.
Power dissipation in the resistor would be 13V x 10mA = 0.13W.

If you used the 2.2k resistor, 13V/2.2k = 5.9mA, and the power is 13V x 5.9mA = 0.076W (less than 1/4W).
So your 2k should have worked fine too (13V/2k = 6.5mA, 0.085W). Make sure it really is 2k and not 220R instead.

If you use the 5k, then 13V/5k = 2.6mA, and power = 13V x 2.6mA = 0.034W.
So this will also work, but it will be dimmer.
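For anyone following along, the arithmetic in this post can be sketched in a few lines of Python (the 2V LED forward drop is the assumed figure from above; check the LED's datasheet for the real value):

```python
# Series-resistor arithmetic for an LED pilot light, assuming a 15 V
# supply and a ~2 V LED forward drop.

def led_resistor(v_supply, v_led, i_led):
    """Resistor value (ohms) and its dissipation (watts) for a target current."""
    v_drop = v_supply - v_led          # voltage left across the resistor
    return v_drop / i_led, v_drop * i_led

def led_current(v_supply, v_led, r):
    """Actual current (amps) and resistor dissipation (watts) for a given R."""
    v_drop = v_supply - v_led
    i = v_drop / r
    return i, v_drop * i

print(led_resistor(15, 2, 0.010))   # about 1.3k dissipating 0.13 W at 10 mA
print(led_current(15, 2, 2200))     # about 5.9 mA and 0.077 W, fine for 1/4 W
print(led_current(15, 2, 5000))     # about 2.6 mA and 0.034 W, dimmer but fine
```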

Resistors don't have any direction. If another resistor burns up, there must be some other kind of problem. Make sure you are using the 15V supply instead of a higher voltage supply.


#### Galu

Could you be connecting the LED/resistor wrongly to the high voltage input instead of correctly to the 15V output?

#### Mark Tillotson

Sounds like the LED is connected directly to the mains. In that case you need a back-to-back diode and a much higher value resistor with significant wattage, or a capacitive dropper.

Alternatively, power the LED from the DC output as suggested above. A through-hole 2k resistor cannot smoke from a 15V supply.

#### maLx

Alright, noob here.

So I was attaching it to the V++ and V-- terminals on my PSU. When I go from V++ to ground instead, it doesn't smoke and seems to work, but the resistor gets really hot. If it's normal for the resistor to get really hot, then I think V++ to ground is the way to go.

#### rayma

If you mean connecting it to the +15V / 0V / -15V supply, the resistor should not get hot.
Measure it with a meter. Is it 2.2k?

#### maLx

Will measure shortly and report back.

#### rayma

If you are using the color code for the resistor value, red-red-brown is 220R, and red-red-red is 2.2k.

#### PRR

Paid Member
> ...I have a 15V Power Supply...My friend ..., said ... a 2.2K resistor is fine. So, I happen to have a 1/4 watt 2k .... the resistor started smoking....

No, you have a +/-15V supply. It is 30V from one end to the other.

Do math.

30V minus 2V in LED is 28V.

Power is voltage squared divided by resistance. 28*28/2000 is 0.39 Watts.

"1/4W" is 0.25W rating.

0.39W of actual heat is greater than 0.25W rating. You are not supposed to do that. You can't complain about the smoke.

If you connect from ZERO to either 15V side, then it is 0.085 Watts and a 1/4W resistor is satisfactory. (The resistor does not have a "direction" but the LED sure DOES; at this current level you can just try it both ways.)
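PRR's power numbers are just V squared over R; as a quick sketch (the 2V LED drop is the same assumption as above):

```python
# Resistor dissipation is V^2 / R: twice the voltage, four times the heat.

def resistor_power(v_drop, r):
    """Watts dissipated in a resistor with v_drop volts across it."""
    return v_drop ** 2 / r

print(resistor_power(30 - 2, 2000))  # about 0.39 W: too much for a 1/4 W part
print(resistor_power(15 - 2, 2000))  # about 0.085 W: fine from one 15 V rail
```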

#### maLx

Thanks, thanks! I have it on the 15 and ZERO, and so far so good. If I got a 1W resistor, would it be better? Like, better long term?

Measured the 2k resistor and it's 2k.

#### Mark Tillotson

> If you mean connecting it to the +15V / 0V / -15V supply, the resistor should not get hot.
> Measure it with a meter. Is it 2.2k?

If you put 30V across the resistor, yes, it will smoke, as it's dissipating about 400mW, which is a lot more than 1/4W.

At 15V it's fine, dissipating under 100mW. So yes, it will feel warm.

Twice the voltage means 4 times the heat generation.

#### wiseoldtech

> If you put 30V across the resistor, yes, it will smoke, as it's dissipating about 400mW, which is a lot more than 1/4W.
>
> At 15V it's fine, dissipating under 100mW. So yes, it will feel warm.
>
> Twice the voltage means 4 times the heat generation.

And the more voltage you push into an LED, the more current it will draw of course.
This soon leads to premature failure and/or reduced illumination of the LED, which is damage.

#### indianajo

(V^2)/R: 13*13/2000 = 0.084 W, which is fine on a 1/4 watt resistor. No need to upgrade. Start worrying at 0.23 W and above.

#### PRR

Paid Member
> ...If I got a 1w resistor will it be better? Like, better long term?...

For 90-day warranty you can run a resistor AT its rating.

As you just proved, 1.5X rating won't do for long.

"Lifetime" really requires 0.5X rating. Figure the dissipation, DOUBLE that, and round up generously.

I computed 0.085 Watts for 15V into an LED. So you need a 0.17-watt-rated resistor. About 1/6 watt. Your 1/4 Watt will probably last a lifetime.
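The derating rule of thumb above can be sketched like this (the list of standard wattage ratings is my assumption for illustration, not from the thread):

```python
# Derating sketch: compute the dissipation, double it, then take the next
# standard wattage rating, so the part runs at or below 50% of its rating.

STANDARD_RATINGS_W = [0.125, 0.25, 0.5, 1.0, 2.0]  # assumed common values

def pick_rating(dissipation_w, derate=0.5):
    """Smallest standard rating that keeps the resistor at <= 50% load."""
    needed = dissipation_w / derate
    return next(r for r in STANDARD_RATINGS_W if r >= needed)

print(pick_rating(0.085))  # 0.25: a 1/4 W part is fine on one 15 V rail
print(pick_rating(0.392))  # 1.0: the accidental 30 V hookup needed about 1 W
```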

#### maLx

> For 90-day warranty you can run a resistor AT its rating.
>
> As you just proved, 1.5X rating won't do for long.
>
> "Lifetime" really requires 0.5X rating. Figure the dissipation, DOUBLE that, and round up generously.
>
> I computed 0.085 Watts for 15V into an LED. So you need a 0.17-watt-rated resistor. About 1/6 watt. Your 1/4 Watt will probably last a lifetime.

Cool, thanks. Not sure why it's getting so hot, but it's been on there for a few days with no issues yet. I'll monitor and go from there. Thanks for all the tips, everyone!

#### nezbleu

Measure the voltage drop across the resistor and calculate the current through it. The voltage-dropping resistor for an LED should never be hot. You should only have a few mA (about 6.5mA with your 2K) through the LED, hence the same through the resistor. 10mA through a 2K resistor would cause about a 20 volt drop across the resistor, which is not possible in your circuit.
The 6.5mA or so you actually have through the 2K dissipates well under 1/8 Watt. You need to measure what is actually happening.
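The sanity check described here, back-calculating the current from a measured drop, looks like this in Python (the 13.0V reading is a hypothetical example, not a real measurement from the thread):

```python
# Back-calculate the LED current from the measured drop across the resistor.
# If the current comes out far above ~10 mA, something else is wrong
# (wrong resistor value or wrong supply terminals).

def current_from_drop(v_measured, r):
    """Amps through the resistor (and the LED in series) given the drop."""
    return v_measured / r

i = current_from_drop(13.0, 2000)   # hypothetical 13.0 V meter reading
print(i * 1000, "mA")               # 6.5 mA: a healthy pilot-light current
print(i * 13.0, "W")                # about 0.085 W: should barely get warm
```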

#### maLx

> Measure the voltage drop across the resistor and calculate the current through it. The voltage-dropping resistor for an LED should never be hot. You should only have a few mA through the LED, hence the same through the resistor. 10mA through a 2K resistor would cause about a 20 volt drop across the resistor, which is not possible in your circuit.
> The 6.5mA or so you actually have through the 2K dissipates well under 1/8 Watt. You need to measure what is actually happening.

ok, will take care of that in the coming days, thanks!

#### schiirrn

> And the more voltage you push into an LED, the more current it will draw of course.
> This soon leads to premature failure and/or reduced illumination of the LED, which is damage.

LEDs operate on current. The voltage drop across the LED doesn't change much when it's operated in its specified (roughly linear) region, and you can't "push voltage into an LED". That's why a resistor in series is needed: to set the current.
The crucial part of all this: supply voltage minus the voltage across the LED is the voltage across the resistor. Given that voltage, you use Ohm's law to determine the resistance that draws the specified current for the LED.
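This recipe in code form (the 2.0V / 20mA figures are typical red-LED datasheet values, assumed here for illustration):

```python
# Pick the series resistor from the LED's specified current, per Ohm's law:
# R = (supply voltage - LED forward voltage) / specified LED current.

def series_resistor(v_supply, v_led_forward, i_led_spec):
    """Ohms needed so the LED draws its specified current."""
    return (v_supply - v_led_forward) / i_led_spec

print(series_resistor(12, 2.0, 0.020))  # about 500 ohms for a red LED on 12 V
print(series_resistor(15, 2.0, 0.010))  # about 1.3k for the 15 V pilot light
```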

#### wiseoldtech

> LEDs operate on current. The voltage drop across the LED doesn't change much when it's operated in its specified (roughly linear) region, and you can't "push voltage into an LED". That's why a resistor in series is needed: to set the current.
> The crucial part of all this: supply voltage minus the voltage across the LED is the voltage across the resistor. Given that voltage, you use Ohm's law to determine the resistance that draws the specified current for the LED.

Perhaps I wasn't being clear enough.

Let's say a red LED designed for 2 volts at 20 mA is powered from a 12 volt supply.
We now need an appropriate resistor to drop that excess voltage and keep the LED from being destroyed, because 12 volts directly across it would cause it to burn out immediately.
It would get hot from too much current, the result of too much voltage.
(High-powered white LEDs even need heatsinking.)
So the red LED would need something like a 470-500 ohm resistor of reasonable wattage (0.5W).

#### nezbleu

> Perhaps I wasn't being clear enough.
>
> Let's say a red LED designed for 2 volts at 20 mA is powered from a 12 volt supply.
> We now need an appropriate resistor to limit that excess voltage and keep the LED from being destroyed.

No, look at the datasheet for said LED. It's not a question of voltage; it's a matter of current. 5V would be enough to destroy the LED if the current were not otherwise limited. At the desired current level (usually 10-20 mA, depending on the LED type) it has a forward voltage drop of about 1.8-2.0V. If your power supply has a low enough output impedance, it will push more than that amount of current through the LED: the diode is not a resistor, and as long as it has more than about 2V across it, it will pass all the current you can supply. The dropping resistor is there to limit the current, not the voltage. The LED defines the voltage across it.
