...Not sure why it's getting so hot...
Resistors "waste" power as heat. So they get hot.
Modern resistors are built of very tough stuff, so they can be safely run at high temperature (usually less than paper-scorch temperature).
If we figure 0.08W in a 0.25W-rated part, it is running at only about a third of its rated dissipation, so it only gets about a third as hot as its rating allows, but that is still enough to warm your finger-skin.
How bright is the LED? If plenty-bright, you can use a higher-Ohms resistor to get lower current and less power and heat. Modern LEDs can be "too bright" at a small fraction of their rated MAXimum current. (In the very old days, a "20 mA" LED at 10 mA was not too bright; LEDs have improved a lot.)
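To make the heat/brightness tradeoff above concrete, here is a quick sketch. The supply voltage, LED forward drop, and resistor values are assumed example numbers, not from the original post:

```python
# Sketch of the tradeoff: a higher-Ohms resistor means lower current,
# hence less power and heat. All values below are assumptions.
V_SUPPLY = 5.0      # volts (assumed supply)
V_LED = 2.0         # typical LED forward drop (assumed)
R_RATING_W = 0.25   # common 1/4 W resistor rating

for r_ohms in (150, 330, 1000):
    i = (V_SUPPLY - V_LED) / r_ohms   # series current (Ohm's law)
    p = i ** 2 * r_ohms               # power burned in the resistor
    print(f"{r_ohms:>5} ohm: {i*1000:5.1f} mA, "
          f"{p*1000:5.1f} mW ({p / R_RATING_W:.0%} of rating)")
```

At 150 ohm the resistor burns 60 mW, about a quarter of its rating; at 1k ohm it is down to 9 mW and barely warm, and a modern LED may still be plenty bright at 3 mA.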
Measure the voltage drop across the resistor, then calculate the current through it. The voltage dropping resistor for an LED should never be hot.
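That measurement-then-calculation step looks like this in code. The meter reading and resistor value are assumed examples:

```python
# From a measured voltage drop, get current and dissipation.
# Both input values below are assumed, not real readings.
v_drop = 3.1   # volts measured across the resistor (assumed)
r = 330.0      # marked resistor value in ohms (assumed)

i = v_drop / r     # Ohm's law: I = V / R
p = v_drop * i     # power dissipated in the resistor: P = V * I
print(f"current = {i*1000:.1f} mA, dissipation = {p*1000:.1f} mW")
```

Around 29 mW in a 1/4 W part is nothing; if the same calculation gives you a result near or above the resistor's rating, something is wrong with the circuit.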
There I go making the same mistake I tried to correct in someone else's post. The resistor is not a "voltage dropping resistor" it is a current limiting resistor. Bad habits die hard.
No, look at the datasheet for said LED. It's not a question of voltage, it is a matter of current. 5V would be enough voltage to destroy the LED if the current was not otherwise limited. At the desired current level (usually 10-20 mA, but it depends on LED type) it has a forward-biased voltage drop of about 1.8-2.0V. If your power supply has a low enough output impedance, it will push far more current than that through the LED, because the diode is not a resistor: as long as it has more than about 2V across it, it will pass all the current you can push into it. The series resistor is there to limit the current, not the voltage. The LED defines the voltage across it.
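The sizing rule implied above, R = (Vsupply - Vf) / I, can be sketched as follows. The forward voltage and target current are assumed example figures, not from any particular datasheet:

```python
# The LED pins its own voltage (Vf); the resistor sets the current.
# R = (Vsupply - Vf) / I_target. Example values are assumptions.
V_SUPPLY = 5.0     # supply voltage
V_F = 1.9          # LED forward voltage at operating current (assumed)
I_TARGET = 0.010   # 10 mA target current

r = (V_SUPPLY - V_F) / I_TARGET    # required series resistance
p = (V_SUPPLY - V_F) * I_TARGET    # power the resistor must dissipate
print(f"use about {r:.0f} ohm (round up to the next standard value), "
      f"dissipating roughly {p*1000:.0f} mW")
```

Rounding up to the next standard value (e.g. 330 ohm here) errs on the side of slightly less current, which is the safe direction for the LED.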
Whatever.
Apples/oranges.
I've spent 45+ years in electronics service, got my schooling, know Ohm's Law, and got my degrees.
You want to debate, fine.
There I go making the same mistake I tried to correct in someone else's post. The resistor is not a "voltage dropping resistor" it is a current limiting resistor. Bad habits die hard.
Duh, it does BOTH.
It's not apples/oranges. Current and voltage are completely different things. Of course the resistor does drop voltage, but that is just a side effect, not its purpose. If you connect that diode without a resistor you effectively short out the power supply. The series resistor is there to limit current.
Maybe have a look at an LED datasheet and Ohm's law.
It's not apples/oranges. Current and voltage are completely different things. Of course the resistor does drop voltage, but that is just a side effect, not its purpose. If you connect that diode without a resistor you effectively short out the power supply. The series resistor is there to limit current.
Maybe have a look at an LED datasheet and Ohm's law.
I fail to understand the continuing argument over this simple, elementary thing about LEDs.
I know all about LEDs and their particular needs; I'm not some "noob" at this.
If you understand LEDs and how to implement them, then stop posting plainly wrong statements and feeding the original poster, who asked for help, that nonsense.
I gave you the benefit of the doubt, but given your handle and your usual postings, I conclude you are just a troll.
I'm not going to argue with you.
Time to put you on IGNORE.
Bye.
- Noob Resistor Question