I need help with a transformers issue

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
I have two identical transformers; each one steps up voltage from 2.3 V to about 107 V. When I connect one of them to a 2.3 V AC source and then feed its output (which should be 107 V) into the other one's primary coil, I should get a total output of about 5 kV, I think (correct me if I'm wrong). But that's not what happens: I get a total voltage lower than 107 V. Why does this happen? Please help.
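For what it's worth, the ideal-transformer arithmetic does support the ~5 kV expectation. A quick sketch (ideal transformers assumed; the replies explain why real ones fall far short):

```python
# Ideal cascade of two identical 2.3 V -> 107 V transformers.
# Real transformers saturate when overdriven, so this is only the
# "textbook" expectation, not what the bench will show.
ratio = 107 / 2.3            # step-up ratio of each stage, about 46.5
stage1 = 2.3 * ratio         # output of the first transformer: 107 V
stage2 = stage1 * ratio      # that 107 V into the second primary
print(round(ratio, 1), round(stage2))  # -> 46.5 4978
```

So roughly 4.98 kV in the ideal case, which matches the "about 5 kV" guess.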
 
Each transformer winding has a certain self-inductance that limits the current through the winding. When you try to put 107 volts across a nominal 2.3-volt winding, you're demanding roughly 46 times the core flux the designers planned for, so the core saturates, the magnetizing current soars, and the transformer doesn't do much of anything useful.
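The saturation argument follows from the transformer EMF equation, V = 4.44 f N B A: at a fixed frequency, peak core flux density B scales linearly with applied voltage. A sketch with made-up winding values (turns and core area below are illustrative assumptions, not data about the actual transformers):

```python
# Peak flux density from the transformer EMF equation V = 4.44 * f * N * B * A.
# At fixed frequency, B is proportional to applied voltage, so overdriving
# the winding by ~46x demands ~46x the design flux.
def peak_flux_density(v_rms, f_hz, turns, core_area_m2):
    """Peak flux density (tesla) for a sine-driven winding."""
    return v_rms / (4.44 * f_hz * turns * core_area_m2)

# Hypothetical 2.3 V / 50 Hz winding: 100 turns on a 1 cm^2 core.
b_rated = peak_flux_density(2.3, 50, 100, 1e-4)   # ~1.04 T, near typical saturation
b_over  = peak_flux_density(107, 50, 100, 1e-4)   # ~48 T, far beyond any real core
print(b_over / b_rated)  # same ~46.5 factor as the voltage overdrive
```

Silicon steel saturates around 1.5-2 T, so the core gives up almost immediately and the winding behaves like a near short circuit.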

If you have a variable-frequency source, you could raise the input frequency. If the transformers don't run out of bandwidth or suffer insulation breakdown, a higher input frequency might give you better results.
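The reason raising frequency helps: peak core flux goes as V/f, so to apply 107 V to a 2.3 V winding without exceeding its rated flux, the frequency must scale up by the same factor as the voltage. A quick estimate (assuming a 50 Hz design; the rated frequency wasn't stated in the thread):

```python
# Core flux ~ V / f, so holding flux at the design value while overdriving
# the voltage requires scaling frequency by the same ratio.
f_rated, v_rated, v_applied = 50, 2.3, 107   # 50 Hz design is an assumption
f_needed = f_rated * v_applied / v_rated
print(round(f_needed))  # -> 2326  (roughly 2.3 kHz)
```

That's well into the range where an ordinary mains transformer may run out of bandwidth, which is why the caveat above matters.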

But...be careful...if it does work, that's a potentially deadly voltage.
 
Neon sign transformers that are rated to produce 3500 VAC are pretty special. The insulation on the wiring is way thicker than that used in a transformer rated to take 600 V surges. If you need a neon sign transformer (3500 VAC), buy one.
I realize the Tesla coil experiment in the Boy Scout handbook didn't mention using special wire, but then that experimental result didn't last very long, either.
If you are making a hi-pot tester, the resistors used in those are about 3 inches long. A hi-pot tester limits the short-circuit current outside the grounded box to less than 20 mA so it doesn't kill operators. The techs who repair hi-pot testers are firm believers in the one-hand-at-a-time rule, and in not touching the hot side with the power on.
 