Home-made assembled LDR or which commercial RoHS LDR?

yuhengdu said:
Can LDRs be unstable? An LDR preamp converts an electrical signal into light, which is then transmitted and picked up by another device whose properties depend on a fickle light source. How do you make sure the amount of light being picked up is accurate?
The light does not carry the signal. It merely sets the attenuation of the signal.
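
To put it slightly differently, here is a small illustrative sketch (assuming the common series/shunt divider topology used in these volume controls; the numbers are made up): the light only sets the ratio the signal is scaled by, so a drifting light source can only nudge the volume, not the shape of the signal.

```python
import math

# The audio goes through two resistances (series R1, shunt R2); the LED light
# only sets those resistances. The signal never rides on the optical path.
def attenuator_output(v_in, r_series_ohms, r_shunt_ohms):
    # unloaded divider: output is the input times a fixed ratio
    return v_in * r_shunt_ohms / (r_series_ohms + r_shunt_ohms)

# A 1 kHz test tone keeps its exact shape; only the level changes.
tone = [math.sin(2 * math.pi * 1000 * n / 48_000) for n in range(48)]
out = [attenuator_output(v, 9_960, 40) for v in tone]
print(out[:3])   # same waveform scaled by 40/10000, roughly -48 dB
```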
 
I would happily use an LDR as an attenuator any time of the day; if you want precision or repeatability, look elsewhere.

Although the PIC-based correction looks good, it is no guarantee against degradation over time.

Unless it has a self-calibrate mode or something, which updates the expected parameters regularly, say once a month or so.

Back to the OP's problem: what will that LDR be used for?
Type of signal/voltage/current/minimum acceptable resistance/linearity/etc.

That information is needed for a better answer; otherwise it's like painting with a very broad brush.
 
The photocouplers would be used as an attenuator for volume control, as used on a few projects on here :)

I've ordered a set of 25 NSL-32SR2s, which arrived today. I'm just waiting for some LEDs and LDRs to arrive, hopefully tomorrow, so I can make some home-made photocouplers.

I don't mind having to recalibrate every few months if necessary. It will be interesting to see how much they deviate over time and how much deviation is required before it really becomes noticeable (on the volume level / balance front).
 
I don't mind having to recalibrate every few months if necessary. It will be interesting to see how much they deviate over time and how much deviation is required before it really becomes noticeable (on the volume level / balance front).

If you keep the max LED current below 10 mA, I believe you won't have to recalibrate.

If you look back to earlier posts 617 & 620 in the precision LDR thread, you will see that the beta tester has been using his precalibrated board for eight or nine months without recalibrating. In fact, he never calibrated the system himself; he is still using the calibration that was done when the board was manufactured (and then sent from the USA to Australia for him to test).

We're finding that, so far, for listening purposes, no one has detected an audible change after the first calibration. However, the board is designed so that LED current is limited to 10 mA max at maximum attenuation (which is -48 dB on a 10K pot with R2 at 40 ohms), and with selected R2 LDRs it is possible to keep the max current below 6 mA at -48 dB. Maybe this is why they don't change noticeably over time.

When you test those NSLs, measure the current required to reach 40 ohms and use the lowest-current devices on the R2 side. The R1 side can be 80~90 ohms with virtually no loss of signal at max volume (<0.1 dB attenuation at 80 ohms).
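
For a quick sanity check on those figures, here is a back-of-envelope sketch, assuming the usual series (R1) / shunt (R2) divider on a ~10K total value; these are illustrative arithmetic checks, not measurements:

```python
import math

# Unloaded series/shunt divider: attenuation in dB (negative = quieter).
def atten_db(r_series_ohms, r_shunt_ohms):
    return 20 * math.log10(r_shunt_ohms / (r_series_ohms + r_shunt_ohms))

print(atten_db(10_000, 40))   # about -48 dB: minimum volume, shunt LDR pulled down to 40 ohms
print(atten_db(80, 10_000))   # about -0.07 dB: max volume, series LDR at 80 ohms
```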
 
I would happily use an LDR as an attenuator any time of the day; if you want precision or repeatability, look elsewhere.

Not true any more.

Although the PIC-based correction looks good, it is no guarantee against degradation over time.

Unless it has a self-calibrate mode or something, which updates the expected parameters regularly, say once a month or so.

Which it does have, as often as you like; it takes 10~15 minutes to run the complete calibration cycle.
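
For anyone curious what such a cycle involves, here is a rough sketch of the general idea (not the actual firmware), assuming a controller that can set the LED drive level and measure the resulting LDR resistance; both hardware hooks below are hypothetical stand-ins:

```python
import time

# Hypothetical hardware hooks: on a real board these would write a DAC/PWM
# value and measure the LDR's resistance. The crude model below only exists
# so the sketch runs on its own.
_drive = 0

def set_led_drive(code: int) -> None:
    global _drive
    _drive = code

def read_ldr_ohms() -> float:
    # stand-in response: resistance falls from ~200K (near dark) toward 40 ohms
    return 40 + 200_000 / (1 + _drive)

def calibrate(targets_ohms, max_code=4095, settle_s=0.0):
    """For each target resistance, binary-search the LED drive level and
    store it in a lookup table for the volume control to use later."""
    table = {}
    for target in targets_ohms:
        lo, hi = 0, max_code
        while lo < hi:
            mid = (lo + hi) // 2
            set_led_drive(mid)
            time.sleep(settle_s)           # real LDRs are slow; allow settling
            if read_ldr_ohms() > target:   # more light -> lower resistance
                lo = mid + 1
            else:
                hi = mid
        table[target] = lo
    return table

print(calibrate([100, 1_000, 10_000]))     # {target_ohms: drive_code}
```

With realistic settling times and a few dozen target points per channel, a full pass in the 10~15 minute range seems plausible.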
 
CdS cells were used for decades in camera light meters, and AFAIK drift or loss of calibration was never an issue. I think they could go off a bit at temperature extremes, but that's not usually a problem in the living room. Run your LEDs with a voltage-variable current source and they should be pretty stable too, especially if you derate them a bit.

LEDs are current-dependent devices, so it's much better to drive them with current-controlled sources. I believe that voltage control may be the main reason LDRs have a reputation for being so variable. You can't really control an LED by keeping the voltage constant; you have to keep the current constant, because at a fixed voltage the current will vary with the temperature of the LED (and high currents through the LED will make the temperature vary even more).
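
As a minimal sketch of the arithmetic behind that point, assuming a basic op-amp current sink (LED in series with a sense resistor, op-amp forcing the sense voltage to match a control voltage; component values are arbitrary examples):

```python
# With a current sink like the one described above, the LED current is simply
# I = V_ctrl / R_sense, independent of the LED's forward voltage and its drift
# with temperature. The values here are arbitrary, for illustration only.
def led_current_ma(v_ctrl_volts: float, r_sense_ohms: float) -> float:
    return 1000.0 * v_ctrl_volts / r_sense_ohms

R_SENSE = 100.0                        # ohms, example sense resistor
print(led_current_ma(1.0, R_SENSE))    # 10.0 mA at a 1.0 V control voltage
print(led_current_ma(0.1, R_SENSE))    # 1.0 mA at 0.1 V
```

Compare that with holding a fixed voltage across the LED: the forward voltage shifts by a few millivolts per degree, so the current (and therefore the LDR resistance) wanders as the LED warms up.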
 
LED aging is a matter of luck. As traffic lights and other LED panels become more common, you see some individual LEDs going dim early.

Of course luck plays a part, as it does for every single component, not just LEDs. If an LED turns out to be bad, just replace it (I mount mine on 4-pin DIP sockets for easy replacement). I've tested about 80 devices so far; I've found two or three that didn't meet my minimum 'pass' criteria, and none that I've used extensively have failed prematurely. All of my 'failures' were failures to pass the initial screening.
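
Screening along those lines can be as simple as checking each device against a couple of resistance limits. The sketch below is purely a hypothetical example of such a check; the limits are made up for illustration and are not the actual pass criteria used:

```python
# Hypothetical incoming screening for photocoupler candidates. The limits are
# illustrative only, not anyone's actual pass criteria.
def passes_screening(ohms_at_10ma: float, ohms_dark: float) -> bool:
    reaches_low_end = ohms_at_10ma <= 50       # gets near 40 ohms at <= 10 mA?
    reaches_high_end = ohms_dark >= 100_000    # goes high enough when dark?
    return reaches_low_end and reaches_high_end

# Example measurements: (name, resistance at 10 mA, dark resistance)
for name, at_10ma, dark in [("A", 42, 2_000_000), ("B", 120, 1_500_000), ("C", 38, 60_000)]:
    print(name, "pass" if passes_screening(at_10ma, dark) else "fail")
```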

One advantage of using a 20 mA LED and driving it to a max of 10 mA is that you don't stress it. And 10 mA happens ONLY at maximum attenuation. I play my system between about 25~75 on a 1~99 linear scale. At 25, the current through the shunt LED is about 1 mA, and it goes down from there to drop below 0.1 mA at around 55 on the scale (the display shows the current passing through each shunt device). Amazingly, at a maximum 10K resistance, LDR current is about 4~6 microamps.

Truly, I believe that if you get the design right, the LDRs will be stable for a long time between calibrations, and they will live for a very long time unless you get one of the very rare 'dud' devices.
 