Basic LED Questions

Ok, sorry to bug everyone again, but I've got a few questions on LEDs that seem pretty basic.

1. If you have a 5 mm LED, can you just glue it into a 5 mm hole in the front panel? I have seen LED clips/mounts etc. designed to assist in through-hole mounting, but I might like the look of a raw or "nude" LED sticking through the panel. Any advice on how to do this?

2. For an LED which is turned on by a switch, which is better when the LED is in the "off" condition: (a) to ground both ends, or (b) to switch it so it's open circuit? (a) might be good because it's a static-sensitive device, and (b) might be good in case a forward or reverse potential develops across the two grounds.

3. If an LED is rated 2.4 V (Vf) and 30 mA (If), will it only draw 30 mA if you give it 2.4 volts? Or do you need a current-limiting device of some sort to protect it? I am aware of the so-called LED equation here:
http://www.diyaudio.com/forums/showthread.php?postid=231227#post231227

So are you covered by just sticking to this equation?
 
Sure, you can just mount the LED through a hole in the panel. I've done that a few times in the past by just gluing it in with bathroom silicone.

Don't worry about static in connection with LEDs, they're hard to kill....I had LEDs as toys when I was like 4 or 5 years old, and they survived that treatment.

No, you don't need any current limiter; the current stated is simply the current the LED draws at the stated voltage (mostly an interesting figure in case you're running the LED from an LED driver or if the circuit is battery powered).

Have fun.....since that's pretty much all LEDs are :)

Magura
 
LED question

Definitely make sure that there is a current-limiting resistor in the LED path, unless you're driving it with a voltage-to-current converter. If the LED is to be powered from the 13.5 V car supply, a 1.0 kohm, 1/4 W resistor will limit the current, If, to just over 10 mA. Do not operate the LED at the full limit of 30 mA; run it at 10 to 15 mA for longer life. Today's LEDs are very bright and don't need as much current as the parts from yesteryear. If the power source is 24 V DC, use a 2.2 kohm, 1/2 W resistor. I hope this helps.
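As a quick sanity check, here is a minimal Python sketch of those two cases, assuming the 2.4 V Vf from the original question (the function name is just for illustration):

```python
def led_resistor_check(v_supply, r, v_f=2.4):
    """Series-resistor LED circuit: LED current and resistor dissipation."""
    i = (v_supply - v_f) / r   # LED current (A)
    p = i ** 2 * r             # power dissipated in the resistor (W)
    return i, p

for v, r in [(13.5, 1000), (24.0, 2200)]:
    i, p = led_resistor_check(v, r)
    print(f"{v} V, {r} ohm: {i * 1000:.1f} mA, {p * 1000:.0f} mW in the resistor")

# 13.5 V, 1000 ohm: 11.1 mA, 123 mW in the resistor
# 24.0 V, 2200 ohm: 9.8 mA, 212 mW in the resistor
```

The dissipation figures also confirm the suggested ratings: 123 mW fits comfortably in a 1/4 W part, and 212 mW in a 1/2 W part.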
 
Magura said:

No, you don't need any current limiter; the current stated is simply the current the LED draws at the stated voltage (mostly an interesting figure in case you're running the LED from an LED driver or if the circuit is battery powered).

Have fun.....since that's pretty much all LEDs are :)

Magura


You absolutely need a current limiter -- anything over 40 mA starts to diminish the LED's life pretty quickly, and over 100 mA for more than a few milliseconds will kill it (unless it's a Lumiled, in which case you can kill it if the device isn't properly heatsinked).
 
Do this:

(supply voltage − Vf) / (required current) = current-limiting resistor

The same formula applies if you want to run a bunch of LEDs in series (subtract the sum of their forward voltages), but the params are never exact. Obviously you are limited in the number of LEDs by how high your supply voltage is (no such thing as negative resistance). If you want to put in a bunch of LEDs, you are better off using a DIP resistor network.

I.e., in your case, say you had 12 V DC to work with:

(12 − 2.4) / 0.030 = 320 ohms minimum

Below 320 ohms you will be drawing too much current and can blow the LED.

Above 320 ohms you may not get maximum brightness.
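Here is a minimal sketch of that calculation extended to a series string, rounding up to the next standard E12 resistor value so the current stays at or below the target (the function name and the choice of the E12 series are just illustrative):

```python
import math

E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def series_resistor(v_supply, v_f, i_target, n_leds=1):
    """Minimum series resistor for n LEDs, rounded up to the next E12 value."""
    v_drop = v_supply - n_leds * v_f    # voltage left over for the resistor
    if v_drop <= 0:
        raise ValueError("supply too low for this many LEDs in series")
    r_min = v_drop / i_target
    decade = 10 ** math.floor(math.log10(r_min))
    for mult in E12 + [10.0]:           # [10.0] rolls over into the next decade
        if mult * decade >= r_min:
            return mult * decade

print(series_resistor(12.0, 2.4, 0.030))            # 330.0 (next E12 above 320)
print(series_resistor(12.0, 2.4, 0.030, n_leds=4))  # 82.0  (exact minimum is 80)
```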

-chris
 
Sorry....I guess I didn't explain myself that well...I meant that you don't need any limiting besides the resistor calculated according to the link in the above post :) It's a rarity to run LEDs without a resistor in series or a driver.

You sure can run the LED at its rated limits without any substantial disadvantage life-expectancy-wise....life expectancy problems in an indoor application are also a thing of yesteryear (still lots of old LEDs around, though).

Magura
 
Hi,
I may be missing something here, but didn't the original question state that the supply was equal to the voltage of the LED?

2.4 volt supply, 2.4 volt LED

I can see how the above equation would provide the proper voltage drop across the resistor for the LED to operate normally.

But what if the supply is the same voltage as the LED? The equation doesn't work:

(2.4 − 2.4) / 0.030 = 0

So given the stated formula, is it correct to say that if the applied voltage is equal to the voltage rating of the LED, a resistor is not required?

Just askin'

joe
 
Yup, you need a resistor. I discovered once to my surprise that if you connect an LED across a 9 V battery without a resistor, the little sucker explodes like a firecracker. (So I do some dumb things some days . . .) Little pieces of plastic flying about were definitely an eye hazard.

Anyway, I find it easiest to use a cheap single-turn trim pot as the resistor. That way I can adjust the brightness. It can just be "air wired" if you don't mind the dorky look. If you want something neater, just remove the trim pot, read the resistance setting on the pot, then replace it with a resistor. You can calculate the resistor value, of course, but if you are using a non-standard LED and don't have a spec sheet, the trim pot is the quickest way to do it.
 
I was always told (as a child) to calculate the series resistor like this:

supply voltage (V) × 50 = R

Which is the same as saying supply voltage (V) / 0.02 (A) = R, because 1/0.02 = 50. So you are just using Ohm's law with a fixed current value of 20 mA, but you are not taking into account the voltage drop across the LED.
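To put a number on that omission, here is a minimal sketch comparing the shortcut against the proper formula, assuming the 2.4 V Vf from the original question and the rule's implied 20 mA target:

```python
V_F = 2.4         # assumed LED forward voltage (V), from the original question
I_TARGET = 0.020  # the 20 mA the rule of thumb aims for

for v in (5.0, 12.0, 24.0):
    r_rule = v * 50                 # the childhood rule: R = V x 50
    r_exact = (v - V_F) / I_TARGET  # the proper formula with Vf included
    i_actual = (v - V_F) / r_rule   # current you actually get with the rule
    print(f"{v} V: rule {r_rule:.0f} ohm vs exact {r_exact:.0f} ohm "
          f"-> {i_actual * 1000:.1f} mA instead of 20 mA")

# 5.0 V: rule 250 ohm vs exact 130 ohm -> 10.4 mA instead of 20 mA
# 12.0 V: rule 600 ohm vs exact 480 ohm -> 16.0 mA instead of 20 mA
# 24.0 V: rule 1200 ohm vs exact 1080 ohm -> 18.0 mA instead of 20 mA
```

So the shortcut errs on the safe (dimmer) side, and its error shrinks as the supply voltage grows relative to Vf.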



joe said:

I may be missing something here, but didn't the original question state that the supply was equal to the voltage of the LED?

2.4 volt supply, 2.4 volt LED

I can see how the above equation would provide the proper voltage drop across the resistor for the LED to operate normally.

But what if the supply is the same voltage as the LED? The equation doesn't work:

(2.4 − 2.4) / 0.030 = 0

So given the stated formula, is it correct to say that if the applied voltage is equal to the voltage rating of the LED, a resistor is not required?
Sure. But what happens if the voltage rises to 2.45 V or higher, or drops to 2.35 V or lower? The V/I curve of an LED is very non-linear, and is almost vertical at the operating point (V = 2.4 V in this case). That means that the slightest change in voltage results in a very large change in current. Or to put it another way, a large change in current results in a small change in voltage, meaning that the forward voltage of the LED is fairly constant over a wide range of operating current. So you must always drive an LED by controlling the current, not by controlling the voltage. This is done by using a voltage source with a current-limiting (actually current-setting) resistor in series.
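Those exact 2.35/2.45 V cases can be put in numbers with a toy exponential diode model. The 50 mV-per-e-fold slope below is an assumed, illustrative value (real LEDs vary and also have internal series resistance), but the shape of the result is representative:

```python
import math

# Toy exponential diode model: I = I_REF * exp((V - V_REF) / V_SLOPE),
# anchored at the rated 2.4 V / 30 mA point. V_SLOPE = 50 mV is assumed.
I_REF, V_REF, V_SLOPE = 0.030, 2.40, 0.050

def led_current(v):
    return I_REF * math.exp((v - V_REF) / V_SLOPE)

for v in (2.35, 2.40, 2.45):
    print(f"{v:.2f} V -> {led_current(v) * 1000:5.1f} mA")

# 2.35 V ->  11.0 mA
# 2.40 V ->  30.0 mA
# 2.45 V ->  81.5 mA
```

A 2% wiggle in voltage roughly triples or thirds the current, which is exactly why the current, not the voltage, has to be the controlled quantity.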
 
All this calculating-resistor stuff is OK....but the easy way in most cases is to use a driver.
An LED driver will cost you 0.2 USD, and you can even run it with current feedback in order to maintain the correct intensity of the LED.

Add an LDR to the circuit, and you've got an LED array that compensates for the ambient light.

Magura
 
Magura said:
All this calculating-resistor stuff is OK....but the easy way in most cases is to use a driver.
An LED driver will cost you 0.2 USD, and you can even run it with current feedback in order to maintain the correct intensity of the LED.

Add an LDR to the circuit, and you've got an LED array that compensates for the ambient light.

Magura


What's the use of an LED driver if a resistor is enough?

BTW, I made the mistake myself of thinking a resistor is *always* needed. I searched for the thread about blue LEDs directly wired to a newly wound winding on a toroid (P.Daniel) but I didn't find it. When the LED is operated in the right spot on its curve (supply voltage just above the turn-on voltage of the LED) a resistor can be omitted, but it is really wiser to use one in all cases for safety reasons. I would measure first whether you have 2.4 V exactly. When it is higher you certainly need the resistor.

When you want to feed the LED from a low-voltage AC winding, such as the suggested extra winding, I would use a series resistor and a series diode like a BATxx type (Vf of about 0.2 V). LEDs don't like reverse voltages much.

edit: found the thread:

http://www.diyaudio.com/forums/showthread.php?s=&threadid=6368&perpage=15&highlight=&pagenumber=4
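For rough sizing in that AC-winding case, the resistor should be chosen for the peak of the waveform. A minimal sketch, where the 6.3 V RMS winding voltage is a hypothetical example and the 0.2 V Schottky drop comes from the post above:

```python
import math

V_RMS = 6.3      # hypothetical low-voltage AC winding (V RMS)
V_F_LED = 2.4    # LED forward voltage from the original question
V_F_DIODE = 0.2  # assumed BATxx-style Schottky series diode drop
I_PEAK = 0.030   # keep even the waveform's peak at the 30 mA limit

v_peak = V_RMS * math.sqrt(2)                    # ~8.9 V peak
r_min = (v_peak - V_F_LED - V_F_DIODE) / I_PEAK  # ~210 ohm
print(f"peak {v_peak:.1f} V -> use at least {r_min:.0f} ohm")
```

Since the LED only conducts on half the cycle, the average brightness will be lower than with the same current DC-driven, so in practice you might size for a somewhat higher peak current or just accept the dimmer glow.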
 
jean-paul said:
I would use a series resistor and a series diode like a BATxx type (Vf of about 0.2 V). LEDs don't like reverse voltages much.

I would use a diode (any diode) in parallel with the LED but in the opposite direction. LEDs usually have a reverse voltage rating of 4-5 V, so your method would also work, as the overwhelming majority of the reverse voltage will be across the diode, not the LED. But I just feel safer knowing my LEDs aren't hugely reverse-biased.
 
jean-paul said:

What's the use of an LED driver if a resistor is enough?
Since an LED doesn't maintain the same intensity over the years, a driver with current feedback will compensate for this. Just take a look at older LED signs; the intensity matching simply sucks.

That's part of the point of using a driver, but especially if you run more than one LED, I'd say a driver is a must. You could buy intensity-matched LEDs, but at a fairly high price, and with a limited time horizon for the matching anyway.

Magura
 