While not directly audio related, I was hoping to get some help here, as my problem is about general design principles rather than anything audio-specific.
I need to wind some low-power transformers (the load will be in the tens of milliwatts) for a project involving nixie tubes and dekatrons. These will convert 110 VAC mains to the voltages needed to feed the tubes (~170 V for the nixie tubes, 350-400 V for the dekatron).
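For reference, here's a rough sketch of the turns ratios I'd expect, assuming an ideal transformer and ignoring losses and regulation at these tiny loads (these are just back-of-the-envelope numbers, not a winding design):

```python
# Ideal-transformer turns ratios for the target output voltages.
# Assumes V_sec / V_pri = N_sec / N_pri (ideal, lossless transformer).

V_PRIMARY = 110.0  # mains RMS voltage

for name, v_secondary in [("nixie (~170 V)", 170.0),
                          ("dekatron (350-400 V)", 400.0)]:
    ratio = v_secondary / V_PRIMARY
    print(f"{name}: secondary-to-primary turns ratio ~ {ratio:.2f}:1")
```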
My question is this: I have read that transformers are designed for some minimum operating frequency. Given a transformer I have wound, how can I determine whether its minimum frequency is low enough for safe use on 50/60 Hz mains?
I've read this article: Transformers Part 1 - Beginners' Guide to Electronics
which does a good job of explaining the problem: at too low a frequency, the flux density becomes more than the core (in my case a smallish ferrite toroid) can handle. It also mentions that more turns will decrease the flux density.
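As I understand the article, the peak flux density follows the standard transformer EMF equation, B_peak = V_rms / (4.44 * f * N * A_e), so lower frequency or fewer turns means higher flux. Here's a quick sketch of that calculation; the turns count and core cross-section below are placeholder guesses, since I don't know my cores' actual parameters:

```python
# Peak flux density estimate for a mains transformer primary, from the
# transformer EMF equation:  V_rms = 4.44 * f * N * A_e * B_peak
# The core numbers below are placeholder guesses for illustration only.

V_RMS = 110.0     # primary RMS voltage (V)
FREQ = 60.0       # mains frequency (Hz); use 50.0 for 50 Hz mains
N_PRIMARY = 2000  # primary turns (guess)
A_E = 50e-6       # effective core cross-section in m^2 (50 mm^2, guess)

b_peak = V_RMS / (4.44 * FREQ * N_PRIMARY * A_E)
print(f"Estimated peak flux density: {b_peak:.2f} T")

# Typical power ferrites saturate somewhere around 0.3-0.4 T, so if the
# estimate comes out well above that, the core needs more turns (or a
# larger cross-section) to stay out of saturation at mains frequency.
```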
If I wind the transformer such that the flux density is too great for the core to handle (too few turns?), will I be able to see this somehow with an oscilloscope? Will there be some obvious change in the waveform? Is this even something I need to worry about with a step-up transformer?
Please note: I don't have any details about the cores I'm using, as they are salvaged, so probably only an experimental method will be useful to me (I don't know the magnetic characteristics of the cores).
Thanks for the help!