Switching power supplies vs transformer


2010-02-12 4:09 pm
I am powering a small string of LED lights and have a few small power adapters, some with transformers and some switching types.

So I have run into this issue.
The switching ones - put out the rated voltage @ no load. I guess that's understandable, since they're regulated.

The transformer ones - say 12V 1 amp ... I measure one and it reads 14.5V @ no load.
I run 1 string of LEDs - great, it lights up - I am drawing 25-30 mA. I have calculated the series resistor etc. for the 14.5V.

Then the second string ... it lights up ... third ... works great, 4th ... 5th etc. all work.

6th - and nothing lights. I measure and it's at 12.5V. I guess it's drawing 150 mA or so, so the output gets closer to its rated 12V. Now I'd recalculate the resistors for 12V and do it right that way, but I can't power it up until the whole thing is assembled. Is that my solution?
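For what it's worth, that failure mode can be sketched with a simple fixed-Vf model. The numbers here (5 LEDs per string, Vf = 2.0V each, 25 mA target) are assumptions for illustration, not your actual parts:

```python
def series_resistor(v_supply, n_leds, vf=2.0, i_target=0.025):
    """Series resistor (ohms) sized at a given supply voltage.

    Returns None if the supply cannot forward-bias the whole string.
    """
    headroom = v_supply - n_leds * vf
    if headroom <= 0:
        return None
    return headroom / i_target

def string_current(v_supply, r, n_leds, vf=2.0):
    """Approximate string current (A) with a simple fixed-Vf LED model."""
    return max(0.0, (v_supply - n_leds * vf) / r)

r = series_resistor(14.5, 5)        # sized at the unloaded 14.5 V
print(r)                            # 180.0 ohms
print(string_current(12.5, r, 5))   # only ~14 mA once the supply sags
```

With higher-Vf LEDs (say 2.4V each, so 12V just for the diodes) the sagged 12.5V leaves almost no headroom, which would match the "nothing lights" symptom.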



2010-02-12 4:09 pm
I am using a 12VDC adapter with 1 amp capacity, but when I measure it under no load it reads 14-15V. Since I barely use 100-150 mA, I can't decide what voltage to size the resistors for.

It's 12V if I pull close to 1 amp, I guess.
It's 15V if I pull 0 amps. But is there a way to find what the voltage will be @ say 100 mA, 200 mA, etc.?

A universal approach - determine the need, solve for the need, apply the solution.
From what you've written, my guess is the series LED string has a total voltage drop near 12V. With too many strings connected, the voltage from the transformer drops below what is required to maintain LED conduction.
A transformer has a load regulation spec, similar to what you encounter with a full-fledged power supply. Your transformer is spec'd for 12V @ 1A... with a specified primary voltage. As you've seen, with a light load the output may be higher; with a heavy load the voltage may be lower.
"But is there a way to find what the voltage will be @ say 100 ma, 200 ma etc etc."
Yes, probably more than one way.
The easiest and most accurate would be to connect a dummy load equal to the application and measure the voltage across the load.
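A rougher alternative to a dummy load: model the sag as a linear droop between the measured no-load voltage and the rated full-load point (an effective source resistance). Real rectifier-plus-capacitor supplies don't sag perfectly linearly, so treat this as a first-order estimate only, with the thread's numbers plugged in:

```python
def estimate_voltage(i_load, v_no_load=14.5, v_rated=12.0, i_rated=1.0):
    """Estimate output voltage (V) at load current i_load (A), assuming
    a linear droop from the no-load reading to the rated full-load point."""
    r_source = (v_no_load - v_rated) / i_rated  # effective source ohms
    return v_no_load - r_source * i_load

print(estimate_voltage(0.10))  # ~14.25 V at 100 mA
print(estimate_voltage(0.20))  # ~14.0 V at 200 mA
```

A measurement at the actual operating current will always beat this estimate.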
Sofa has assumed you meant 12Vdc 1A.
But what voltage is your transformer? And what is the output from the cable, AC or DC?

But to your design.
You must design to operate at your lowest worst case supply voltage.

Then you should check that the design can operate without damage at your highest worst case supply voltage.

Two quite different stages in the design process.

Finally you may want to check long term operating temperatures for the operating voltage that is likely to occur most of the time.
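Those two stages can be sketched numerically. The supply range and string make-up below are assumed figures for illustration only:

```python
# Assumed worst-case supply range and string make-up.
V_MIN, V_MAX = 12.0, 15.0
VF, N_LEDS, I_NOMINAL = 2.0, 5, 0.025

# Stage 1: design at the lowest worst-case supply voltage.
r = (V_MIN - N_LEDS * VF) / I_NOMINAL
print(f"R = {r:.0f} ohm")                      # 80 ohm

# Stage 2: verify no damage at the highest worst-case supply voltage.
i_max = (V_MAX - N_LEDS * VF) / r
p_res = i_max ** 2 * r
print(f"I_max = {i_max * 1000:.1f} mA")        # 62.5 mA: this may exceed a
                                               # typical LED rating, flagging
                                               # that this supply range needs
                                               # fewer LEDs or regulation
print(f"P_resistor = {p_res * 1000:.1f} mW")   # 312.5 mW
```

Here the stage-2 check fails on purpose: it shows why sizing only at one voltage isn't enough when the supply swings this much.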
"The transformer ones - say 12v 1 amp ... I measure it and it reads 14.5v @ no load."
That's right: a trafo's secondary provides a higher output voltage without load. That is normal if there is a rectifier + capacitor after it. Place a linear voltage regulator after the rectifier and you will get similar behavior to a switching power supply, which implements regulation already.
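To put a number on that: with a bridge rectifier and filter capacitor, the cap charges toward the secondary's peak voltage, not its RMS value. Assuming a 12V RMS secondary and two 0.7V diode drops (both assumed figures):

```python
import math

v_rms = 12.0                      # assumed nominal secondary voltage (RMS)
v_diode = 0.7                     # assumed drop per rectifier diode
v_peak = v_rms * math.sqrt(2)     # ~17.0 V peak of the sine
v_no_load = v_peak - 2 * v_diode  # cap charges to roughly this at no load
print(f"{v_no_load:.1f} V")       # ~15.6 V, right in the 14-15 V ballpark
```

Under load the cap droops between peaks and the transformer's winding resistance adds sag, which is why the reading falls back toward 12V.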

Regarding the LED strings: once you connect too many of them, the increased load makes the trafo's output voltage drop below the value required to make them work (Vf is about 1.5-2.5V per light-emitting diode).