A Low-Lethality Dielectric Strength Tester

This tester comes as a natural companion for the thermal ohmmeter:
http://www.diyaudio.com/forums/equipment-and-tools/322079-thermal-ohmmeter.html

To fully characterize insulating interface materials, one needs to know both their thermal and electrical performance.
Of course, this tester can also be used for many other purposes, including totally unrelated ones, for example as a lab bias source for experimental electrostatic loudspeakers.

It is fully automatic, and reaches voltages of up to 10kV, but its most salient feature is its low lethality: unlike many testers, it inflicts minimal or even zero damage on the sample tested, allowing many rounds of testing, reworking, etc. on the same sample.

The low damage feature is achieved thanks to a minimal energy level: the peak current, power and duration of the discharge are very small, leading to a commensurately low energy delivered to the sample.

The first means of energy-limitation is a very large series resistance, totaling 220MΩ.
At the maximum output of 10kV, this results in a current never exceeding 45µA, and the worst-case instantaneous power dissipated within the sample is limited to ~115mW.
The duration of a discharge event is also limited: a detector inhibits the HV converter as soon as an event is detected, and the energy stored in supply capacitors is minimal: the supply is a directly rectified flyback, without intermediate multiplier stages and their capacitors.
The filter capacitor itself is minimal: 560pF.
At 10kV, it can store a maximum of 28mJ, of which at most 14mJ can be delivered to the sample, because of the series resistor.
As the detector's response time is 5~10ms, the HV converter is turned off well before the capacitor has discharged significantly; the theoretical worst-case energy delivered to the sample is therefore always <15mJ, and in practice it will be much lower, a tenth or a hundredth of that or even less, because the voltage will generally not be at its maximum, and because of the realities of breakdown mechanisms.
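
As a quick sanity check of these figures, here is a small Python sketch reproducing the arithmetic (the values are those quoted above; the 50% worst-case split of the stored energy follows the series-resistor argument):

```python
# Sanity check of the energy-limiting figures quoted above.
V_MAX = 10e3        # maximum output voltage [V]
R_SERIES = 220e6    # total series resistance [ohm]
C_FILTER = 560e-12  # filter capacitor [F]

i_max = V_MAX / R_SERIES              # worst-case steady-state current
p_max = V_MAX**2 / (4 * R_SERIES)     # max power in the sample (matched load)
e_stored = 0.5 * C_FILTER * V_MAX**2  # energy stored in the filter cap
e_sample = e_stored / 2               # worst-case share delivered to the sample

print(f"I_max    = {i_max * 1e6:.1f} uA")     # ~45 uA
print(f"P_max    = {p_max * 1e3:.0f} mW")     # ~114 mW (the ~115 mW quoted above)
print(f"E_stored = {e_stored * 1e3:.0f} mJ")  # 28 mJ
print(f"E_sample < {e_sample * 1e3:.0f} mJ")  # 14 mJ
```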

How effective are all these measures?

They work stunningly (no pun intended!) well:
for example, with the tester in "hold" mode at 8kV, if you grab the ground wire with one hand and bring the index finger of the other hand near the "hot" terminal, at some point the detection trips, but the only way to know is to look at the instrument: you feel absolutely nothing, not even a tingle, and no visible spark flies.
More to the point, if a fragile, carbon-rich sample like ordinary paper is tested many times in a row, the detected breakdown voltage always lands in the same region, sometimes higher, sometimes lower, but with no downward trend, meaning there is no significant carbonization or similar damage.
 

Attachments

  • BDT2.png
  • BDT1.jpg

  • Power supply
The supply is slightly unusual because I didn't have the ideally suitable transformer in my stash, and I resorted to a 16V/20VA, the closest match.
With it, I couldn't use a normal, capacitor-only filter: the DC current required is a bit over 1A, and this would have resulted in a 2A rms current in the secondary, clearly excessive.
In addition, the pre-regulator DC voltage would have been ~22V, meaning lots of heat to dissipate.
Fortunately, I also had a 10mH choke, and with it, the rms current became just acceptable, and the full load voltage was 16.5V, ideal for the 15V LDO regulator.
I used a home-brew regulator rather than a monolithic one, because the supply voltage also serves as a reference for the DAC and needs to be stable and accurate.
Other than that, it has nothing special, but with the BD435, a dropout of 150mV @1A is sufficient to let it regulate, which is nice.
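
As a rough illustration of that trade-off, here is a small sketch using the usual textbook rules of thumb (≈√2·Vrms peak and ~2× rms/DC current for a capacitor-input filter, ≈0.9·Vrms and ~1.1× for an ideal choke input); the exact figures depend on the actual choke and load, so treat these as order-of-magnitude numbers:

```python
import math

# Rough comparison of capacitor-input vs choke-input filtering for a 16 V / 1 A load.
V_SEC_RMS = 16.0   # transformer secondary voltage [Vrms]
I_DC = 1.0         # DC load current [A] ("a bit over 1 A" in the text)

# Capacitor-input: high peak charging currents, high DC voltage.
v_dc_cap = math.sqrt(2) * V_SEC_RMS    # ~22.6 V before the regulator
i_rms_cap = 2.0 * I_DC                 # rule-of-thumb rms/DC ratio of ~2

# Choke-input (ideal): near-continuous current, lower DC voltage.
v_dc_choke = 0.9 * V_SEC_RMS           # ~14.4 V
i_rms_choke = 1.1 * I_DC               # rms/DC ratio of ~1.1

print(f"capacitor-input: {v_dc_cap:.1f} V DC, {i_rms_cap:.1f} A rms")
print(f"choke-input    : {v_dc_choke:.1f} V DC, {i_rms_choke:.1f} A rms")
# The measured 16.5 V at full load sits between the two ideal cases,
# as expected with a small (10 mH) choke.
```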

  • EHV generator
This generator is rudimentary and dreadfully inefficient: the HV winding is made on a simple cylindrical core, not a closed magnetic circuit, and the regulation is performed by varying the DC supply of the flyback stage, not by PWM.
The reasons for these choices are availability and simplicity, but also the quality and agility of the regulation: a linear supply is superior to PWM in this respect, and that matters more than efficiency here, given the low power delivered.
The flyback switch, M1, receives a constant squarewave on its gate and chops whatever voltage is presented to L2.
C10 is the classical FB capacitor, like in CRT supplies. It needs to be polypropylene.
The secondary voltage is rectified using a "pencil" selenium stack rectifier.
More modern options like Si diodes are of course usable, and would be more efficient, but I had a few of these available.
The resulting DC is filtered by a home-made 560pF/20kV capacitor.

A divider made of a string of high-voltage 33MΩ resistors samples the voltage, bringing it to the wiper of R26 and to the FB port of the regulator U4.
Its reference port receives the control voltage from the DAC.

The cold side of the EHV supply is not tied directly to ground, but to the breakdown detector U11; thus, when a current is drawn from the test output, R28 pulls a negative current from the (-) input and, as a result, the output goes positive. (BTW, I see that I messed up this part of the drawing; there will be a version 1.2.)

In the meantime, here is already the second part of the circuit, comprising the small control logic, the BCD DAC, and the display.
 

Attachments

  • BDTL.png
  • BDT3.jpg

Here is the corrected schematic, since the "Edit the first post" feature does not seem to work properly.
If a moderator sees this, he can replace the schematic of the first post with this one (and delete this post, as it will become redundant).
 

Attachments

  • BDT2.png

Good on ya Elvee for yet another interesting test instrument.

I don't recall 'gimped' as a winding technique ??

Why do you refer to C10 as a 'feedback' capacitor - I would expect it to have more a dV/dt and peak V snubber type action.

I'm sure you will describe how you made your step-up transformer and 20kV cap - can't wait for that.

I guess somebody else would have been preparing this thread if you hadn't made the current limiting acceptable for human testing!
 
I don't recall 'gimped' as a winding technique ??
Gimped is the translation of a very technical French term, "guipé", which means that the main, central wire is wrapped inside other wires, in this case synthetic silk.
It is useful for reducing the capacitance, because the winding is more aerated, and also for building odd-shaped windings, thanks to the friction.

Here, both features are useful.

Is there perhaps a more commonly used term than "gimped" in English?

Why do you refer to C10 as a 'feedback' capacitor - I would expect it to have more a dV/dt and peak V snubber type action.
FB means flyback: it makes the stage operate in class E or quasi-class E.
When the MOS opens, a positive half-sine arch is generated, and the voltage returns to the ground, where it stays clamped by the body diode.

Here, with the low repetition rate, I am not sure it works in pure class E, but even if the cap voltage returns to Vsupply before the next pulse, it is relatively unimportant since the voltage is limited to a few volts.
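
To illustrate the timing of that half-sine arch, here is a sketch with purely hypothetical values for L2 and C10 (neither is specified here, and the reflected secondary capacitance is ignored):

```python
import math

# Hypothetical values, only to illustrate the timing of the half-sine arch.
L2 = 1e-3    # primary inductance [H] (assumed, not taken from the schematic)
C10 = 10e-9  # flyback capacitor [F] (assumed, not taken from the schematic)

t_half = math.pi * math.sqrt(L2 * C10)            # duration of the half-sine arch
f_ring = 1 / (2 * math.pi * math.sqrt(L2 * C10))  # resonant ring frequency

print(f"ring frequency     ~ {f_ring / 1e3:.0f} kHz")
print(f"half-sine duration ~ {t_half * 1e6:.1f} us")
# For (quasi-)class E operation, the arch should be finished -- with the drain
# clamped near 0 V by the body diode -- before the next gate pulse arrives.
```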

I'm sure you will describe how you made your step-up transformer and 20kV cap - can't wait for that.
I certainly will

I guess somebody else would have been preparing this thread if you hadn't made the current limiting acceptable for human testing!
Even without the limitation, the output power would be sufficient to deliver a good jolt, but not to kill somebody :)

Excellent Elvee.
Thanks *a lot* for your very useful contribution. :)
Thanks
 

Attachments

  • BDTx.png

Most of the Power/HV section is obvious enough not to require detailed explanations, but some details probably look puzzling:

-A 2kV spark-gap is connected between the earth and the ground (0V).

What is the point of such a bizarre arrangement?

First of all, the instrument is designed as a class 2 device, with proper insulation and an insulating case, therefore not requiring earthing, and although the power cord is equipped with an earth wire, it is not permanently connected to a node inside the instrument.

When the tester is connected to a DUT, the DUT in question may be fully floating, but it can also have a path to the earth. Such a path could be explicit, but it can also be more informal, like a leakage of several gigaohms caused, for example, by the surface the DUT is placed on.

If the cold terminal of the tester is connected to the earth-related side of the DUT, no particular problem arises, but if the connection is reversed, the 0V of the tester will in fact assume a potential of -Vtest wrt. the earth.

Since the average potential of the mains is that of the earth, the transformer will see this potential between its secondary and primary windings, which could initiate a discharge and damage the insulation.
The spark-gap limits the voltage to 2kV, which in practice is safe enough for a split-bobbin transformer.

Wouldn't it be simpler to permanently connect the earth and the 0V?

That is in fact the normal, preferred configuration: the earth and 0V terminals sit side by side on the front panel and are normally strapped together. But I wanted to keep the option of a floating instrument open: it can be useful when testing parts of a large stationary equipment where earth is present, but not explicitly connected to either side of the section under test.
For example, imagine you want to test a terminal block inside a machine: none of the terminals is connected to the earth, but the block itself is attached to an earthed chassis.
If you test the terminal-to-terminal breakdown voltage with the tester earthed, you will in fact test the terminal-to-earth insulation, which might be weaker than the actual terminal-to-terminal insulation.


-The ground connection has a series resistor of 6.8K and a ferrite bead, and both the ground and earth connections pass through a common ferrite ring.

Compared to the 220MΩ limiting resistor, the added impedances look ridiculously small, so what is their point?

The 220MΩ resistance does statically limit the current to <45µA, but if the DUT is capacitive, that is not necessarily the case: obviously, if you push a capacitor to its breakdown voltage, the main discharge current will be supplied by the capacitor itself and could reach many amperes, but that current is completely internal and does not return through the tester.
There is however one case where the current does return through the tester: when the DUT has a significant capacitance with respect to the ambient space. If a large, isolated conductive object is charged through the 220MΩ, the discharge current returns to the earth through the earth/ground connection, the transformer's capacitance, and the spark-gap if the earth connection is not used.
The very steep pulses of this discharge current did funny things to the digital section, which is why I had to add damping/limiting components.
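
As an order-of-magnitude illustration of why the 6.8k and the ferrites are there, here is a sketch with a purely hypothetical value for the DUT's capacitance to its surroundings:

```python
# Hypothetical example: a large, isolated DUT with some capacitance to its
# surroundings, charged to the full test voltage, then flashing over.
V_TEST = 10e3      # test voltage [V]
C_STRAY = 50e-12   # assumed DUT capacitance to ambient space [F]
R_GND = 6.8e3      # series resistor in the ground connection [ohm]

i_peak = V_TEST / R_GND              # peak return current, set by R_GND alone
tau = R_GND * C_STRAY                # time constant of the return pulse
e_pulse = 0.5 * C_STRAY * V_TEST**2  # energy carried by the pulse

print(f"peak current  ~ {i_peak:.2f} A")          # ~1.5 A
print(f"time constant ~ {tau * 1e9:.0f} ns")      # ~340 ns: very steep edges
print(f"pulse energy  ~ {e_pulse * 1e3:.1f} mJ")  # ~2.5 mJ: harmless, but upsetting for logic
```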

-Finally, and I see that I will have to prepare a version 1.3 of the schematic, what is the purpose of R29/C15?

In fact, R29 connects to the output of the opamp.
When the DAC output returns to zero after having reached the maximum, U4 injects a transient into R26 through the loop-compensation components, and the kick-back of this transient generates a false detection event in U11/U3.
This is a minor nuisance, because it cannot be confused with a real event, but it forces you to clear a nonexistent breakdown detection.
The "good" solution would have been to buffer the wiper of R26, but I had no spare opamp left and no room to add one, so I added this compensation network, which is not perfect but sufficient.
The output of U4 is not ideal, because it regulates the HV and contains a little ripple compensation, but it was the only low-impedance node available for the purpose.
The consequence is a detection floor limited to 100~200nA because of the added noise, but that is at the resolution limit of the potentiometer anyway, so little harm is done.
 
  • Digital section
The core of the digital part is the 4-digit counter.
The outputs are used for the DAC, a variation on the R2R theme, and for the display.
The 7-segment displays are driven directly by the 4543s: I don't bother with limiting resistors, as it is much easier to rely on the self-limiting characteristic of the outputs.
Here, with a 15V supply, the current and hence the dissipation would have been uncomfortably high, which is why I used an 8.2V zener, D10, in series with the display common.
The method might look brutal, but it works very well provided all the 4543s have the same origin.

The main clock is generated by oscillator U3; two basic rates are possible: 1kV/minute is for "serious" tests requiring repeatability and accuracy, while 10kV/min is for exploratory purposes, because 1kV/min is slow, especially if the breakdown voltage is high.
It is also possible to fine-tune the voltage manually, at ~1/3 of the nominal rate.
The ramp can also be halted completely; the implementation is not very orthodox: the input of U3 is simply left floating, but this does not seem to pose a problem, and even if there were an increment every several minutes, it wouldn't matter (nothing of the kind has been observed anyway).
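
Assuming 1 count = 1V (4 digits, 10kV full scale), the two ramp rates translate into the following step timings, sketched below:

```python
# Step timing for the two ramp rates, assuming 1 count = 1 V.
RATES_V_PER_MIN = {"serious": 1000, "exploratory": 10000}

for name, rate in RATES_V_PER_MIN.items():
    step_period_ms = 60.0 / rate * 1000    # time per 1 V increment
    time_to_full_scale_min = 10000 / rate  # minutes to reach 10 kV
    print(f"{name}: {rate} V/min -> {step_period_ms:.0f} ms per step, "
          f"{time_to_full_scale_min:.0f} min to 10 kV")
# serious: 60 ms per step, 10 min to full scale;
# exploratory: 6 ms per step, 1 min to full scale.
```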

The top voltage can also be limited thanks to a crude address decoding.

Two flip-flops, U4 and U5, register the run/stop and breakdown conditions.
When a breakdown is detected by the analog section, it freezes the count, blanks the DAC output through an analog switch, and makes the display blink rapidly via U6.

Here is a more detailed view of the EHV section and the resistor strings:

Here is the 2kV spark-gap:

This is the converter controller:

 

Attachments

  • BDT7.jpg
  • BDT6.jpg
  • BDT5.jpg
  • BDT4.jpg

Here is a convenient accessory for the tester: a sample holder.

It allows testing under controlled and reasonably repeatable conditions: same pressure, same geometry of the test electrode, etc.



And here is a short video (YouTube) showing the tester in action.
 

Attachments

  • BDT8.jpg
  • BDT9.jpg

Construction notes
  • High-voltage transformer
Here are the accessories I used to wind the transformer:


The primary and secondary are wound separately. The primary is not particularly problematic: the 46t are hand-wound directly on the insulated ferrite core in three layers, and the coil is immobilized using cyanoacrylate glue.

I wound the secondary on a stationary drill, using the home-made flanges shown in the pic.
Their inner side is covered in polypropylene tape, to prevent adherence.
The core of the winding is a 10mm diameter silicone ring, cut 1mm longer than the 5mm thickness of the finished coil: that way, when the screw is tightened, the core is compressed and bulges slightly, and when the screw is released it shrinks back, facilitating its removal once the coil is finished.

The coil is "scramble" wound: I made no attempt to achieve neat and orderly layers, I just guided the wire by hand, to try to achieve a reasonable regularity.
After the core was covered by the first layers, I immobilized them using cyanoacrylate to avoid their disintegration at the removal stage.
During the course of the winding, I also poured glue drops from time to time, to reinforce the structure.
The turn number is approximate: my counter was broken and I just filled the flanges to the top (37mm dia.).

I stopped the winding using one more drop of glue, and I lightly impregnated it with modelling wax: I heated the coil by Joule effect and applied the wax against the outer layer.
I then let the wax penetrate for half an hour, and removed the heating power.
When everything had cooled down, I dismantled the winding props, and attached the teflon output cable using sewing thread and glue.
I then made an anti-corona outer insulation using a ring of heatshrink tube punched in the middle.


I secured the cable and plugged the hole with acrylic resin, fitted the sleeve of the inner termination, glued it, and sprayed the whole winding with polyurethane varnish.

When everything was dry, I assembled the primary and secondary: note that the winding directions have to be opposed: the outer side of the primary is the cold side, as is the inner side of the secondary.

This is necessary to avoid injecting noise in the breakdown detection circuitry.
 

Attachments

  • BDT10.jpg
  • BDT11.jpg

  • The HV capacitor
I normally don't make my components myself, but I made this one because I had nothing suitable in my stock, and I didn't want to place an order just for that.

The base material is a polyethylene tape, 50mm wide and 0.14mm thick.

To build one electrode, I start with 200mm of tape, and I stick a strip of aluminum foil, 40mm wide, just in the middle, for the whole length.
I prepare the termination: a teflon cable stripped for 20mm, with its individual strands arranged as a fan.
I then fold the tape assembly for the whole length, with the termination inside the fold, in contact with two sides of the aluminum.

I make the other electrode in exactly the same way, and carefully laminate them flat.

The two electrodes are then wound on an insulating tube or rod, approx. 10mm dia.:
A first electrode is taped to the cylinder, then the other, about 20mm further down.
Laterally, the two 20mm metal strips must be perfectly overlaid.
The winding is then completed, as tight as possible and immobilized with a bit of tape.
The whole thing is covered with a heat-shrink tube and retracted.

It is then dipped extremely slowly (1/2 hour) into a bath of liquid wax.
It is dipped vertically, following the axis of the cylinder.
The purpose of this operation is to drive as much air as possible out of the capacitor.
The bath is then left to cool, and just before it solidifies, the capacitor is pulled out.

The result is a capacitor having two sheets of high quality polyethylene as a dielectric.
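
Out of curiosity, a rough parallel-plate estimate of the resulting capacitance, from the dimensions given above (assuming εr ≈ 2.3 for polyethylene and that both faces of each foil are active once the stack is wound); it lands in the region of the 560pF mentioned earlier:

```python
# Rough parallel-plate estimate of the wound capacitor described above.
EPS0 = 8.854e-12    # vacuum permittivity [F/m]
EPS_R = 2.3         # assumed relative permittivity of polyethylene

strip_width = 20e-3        # 40 mm foil folded in half lengthwise [m]
strip_length = 200e-3      # length of the tape/foil strip [m]
sheet_thickness = 0.14e-3  # one polyethylene tape thickness [m]

area = 2 * strip_width * strip_length  # both faces of each foil are active once wound
gap = 2 * sheet_thickness              # two sheets of PE between facing foils

c_estimate = EPS0 * EPS_R * area / gap
print(f"estimated capacitance ~ {c_estimate * 1e12:.0f} pF")  # ~580 pF
```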
 

Attachments

  • BDT12.jpg

Chicken or the egg - HV tester first, or diy HV capacitor first !
I had prior experience with this particular plastic film...

  • High-voltage precautions
The 10 kilovolt level reached in this tester is not huge, but it is higher than 99% of DIY projects, and deserves due respect.

Initially, I had cleared away all the rows of copper pads surrounding the HV sections, but to my surprise, there were flashovers at voltages as low as 6~7kV.

The clearances and creepage distances should have been ample enough, but a lousy perfboard with its copper milled off behaves extremely poorly in this respect.
I then maniacally chased any copper I could still eliminate, but there were still breakdowns at 7~8kV.
In the end, I resorted to cutting a kerf in the PCB in the most critical area, supplemented by an insulating shield and a polyurethane coating.
With a good quality, well-cleaned epoxy PCB, clearances alone should be enough.

All of that eventually cured the problem, and allowed safe operation at at least ~14kV: with a 10kV output, the voltage at the output of U4 is 7V, and even when the regulation is lost, the zener D2 limits the voltage to ~14kV.
Without the zener, the voltage could reach 20kV, which is clearly unsafe.

14kV might still look excessive, but one has to take into account temperature variations, and more importantly, the additional loading when a HV probe is connected to measure the voltage: the burden of the 100MΩ probe then becomes dominant, compared to the 132MΩ of the feedback divider, and the minuscule 220MΩ of the output resistor.
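
To put these loads in perspective, a quick sketch of the current each one draws at 10kV (the divider's bottom resistor is neglected):

```python
# Current drawn from the EHV rail at 10 kV by each of the loads mentioned above.
V = 10e3
loads_ohm = {
    "100 Mohm HV probe": 100e6,
    "132 Mohm feedback divider": 132e6,
    "220 Mohm output resistor": 220e6,
}
for name, r in loads_ohm.items():
    print(f"{name}: {V / r * 1e6:.1f} uA")
# ~100 uA, ~76 uA and ~45 uA respectively: the probe, when connected,
# becomes the heaviest load, hence the need for generous headroom.
```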

A jumper allows the connection of such a probe without tripping the detection circuit.
Conversely, another jumper inhibits the HV generation, to be able to debug the circuit comfortably, without the fear of HV being present.

Note that the power generated and the energy stored in the 560pF capacitor present no danger, but any contact with anything other than the normal, protected output could be somewhat unpleasant, to say the least.


I have built the circuit into an insulating case, and even the front panel is insulating because it makes things so much simpler, but it would also be possible to use metalwork: drastic insulation measures would be required, though.
 
  • Component selection

Most of the components used in this project are commodities. There are two exceptions though: the HV parts, and the resistors of the BCD DAC.
The HV cap has been covered (the best solution is to find an off-the-shelf model), and the resistors also need to be specific HV types: using the Philips taxonomy as a guide, suitable types are the VRxx series.
For example, the four 33MΩ of the feedback divider are VR37, withstanding 3.5kV each.

For the 10x 22 MΩ, I used Yageo types, similar to the VR25, withstanding 1.5kV.

It would be possible to use ordinary CF or MF types rated at 250V, but the number required would be non-negligible....
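
A quick check of the voltage stress per resistor in the two strings (a sketch, assuming the voltage divides equally across equal-valued resistors):

```python
# Voltage across each resistor of the two HV strings at full output.
V_MAX = 10e3
strings = {
    "feedback divider, 4 x 33 Mohm (VR37, 3.5 kV rated)": (4, 3500),
    "output resistor, 10 x 22 Mohm (VR25-class, 1.5 kV rated)": (10, 1500),
}
for name, (count, rating_v) in strings.items():
    v_each = V_MAX / count   # equal values share the total voltage equally
    print(f"{name}: {v_each:.0f} V per resistor (rating {rating_v} V)")
# 2500 V vs 3500 V and 1000 V vs 1500 V; even at the ~14 kV clamp level
# mentioned earlier, the divider resistors just reach their 3.5 kV rating.
```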

DAC resistors
The values indicated on the schematic are theoretical ideals, supposing the CD4518's have zero output resistance, which is obviously not the case.

In reality, they have around 250Ω of output resistance, and this has to be factored into the selection.
Here is the way I did it, starting from ordinary 1% (good quality) types:

I picked eighteen 240kΩ resistors from the drawer and ranked them for accuracy, including the additional 250Ω.
The four most accurate were not within 25ppm of the target value, so I tested some more samples to reach the target.
I then picked 120kΩ resistors until I found four whose values were exactly half of the 240kΩ ones (including the correction).

Finally, I populated the DAC according to the measured accuracy of the resistors: the most accurate for the heaviest weights.

The few other odd-valued resistors were made up from series combinations.
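
Here is a sketch of the ranking step in Python, with purely hypothetical measured values; the idea is simply to add the ~250Ω driver output resistance to each candidate and rank by deviation from the ideal target:

```python
# Sketch of the ranking step: add the driver's output resistance to each
# measured value, then rank by deviation from the ideal target.
R_OUT = 250.0   # assumed CD4518 output resistance [ohm]
TARGET = 240e3  # ideal DAC resistor value [ohm]

# Hypothetical measured values of the candidates pulled from the drawer [ohm]
measured = [239.82e3, 240.15e3, 239.95e3, 240.31e3, 239.74e3, 240.02e3]

def deviation_ppm(r_measured: float) -> float:
    """Deviation from TARGET, in ppm, once R_OUT is included."""
    return (r_measured + R_OUT - TARGET) / TARGET * 1e6

for r in sorted(measured, key=lambda r: abs(deviation_ppm(r))):
    print(f"{r / 1e3:8.3f} kohm -> {deviation_ppm(r):+6.0f} ppm")
# The best candidates (smallest |ppm|) go to the heaviest bit weights;
# the 120 kohm parts are then selected to be exactly half of these.
```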

That way, I achieved a 1-in-10,000 resolution, but in reality that's a luxury: the 4 digits are required to avoid range switching, but an error of 2 or 3V on an 8kV breakdown voltage is in fact perfectly acceptable, and at lower voltages the error becomes unnoticeable.
I made the thing completely accurate simply because I could, but it is certainly not necessary to do so.
 
I calibrated the output at 5kV and I then checked the real output against the displayed value for a number of voltages, higher and lower than 5kV.
The values were in good agreement, except that there is a small residual offset error, because the ~1mV offset of the regulator opamp has not been trimmed or compensated.
Of course, this couldn't catch isolated non-monotonicity errors caused by an incorrect resistor matching, but if the matching job has been carried out properly, such errors should not be present.
To make sure I made no silly mistake though, I also did a sanity check: I clocked the DAC at a rate high enough to allow comfortable oscilloscope observation, and examined the resulting ramp at a high magnification, concentrating particularly on the switching of the large bit weights, to make sure there was no obvious anomaly.

The tempco of the VR37 resistors is quoted as ±200ppm/°C, which would be a limiting factor, but the actual tempco is clearly much lower, comparable to good MF resistors.
Note that not all high voltage resistors are that good: for example, the Yageo ones are less stable.
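
For scale, here is what the quoted worst-case tempco would imply, sketched with an assumed 10°C temperature swing and a typical 50ppm/°C figure for a good MF resistor:

```python
# Drift implied by the quoted worst-case tempco, versus a typical MF part.
V_FULL_SCALE = 10e3  # volts
DELTA_T = 10.0       # assumed temperature swing [degC]

for label, tempco_ppm_per_degc in [("VR37 quoted worst case", 200),
                                   ("typical good MF resistor", 50)]:
    drift_v = V_FULL_SCALE * tempco_ppm_per_degc * 1e-6 * DELTA_T
    print(f"{label}: ~{drift_v:.0f} V drift at 10 kV over {DELTA_T:.0f} degC")
# ~20 V vs ~5 V: the measured behaviour is much closer to the MF case.
```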

Of course, both the measurement and the regulation are made upstream of the 220MΩ output resistor, so in actual use, if there is the slightest leakage current before the breakdown occurs, it will cause a deviation from the displayed value; but that is the principle of this instrument.
 
You have a 5kV reference?

Could you please go through the DUT breakdown detection process again wrt to U11, and what level of current through the DUT is allowed for parasitic capacitance charging (which would depend on generator voltage ramp up rate) and what rate of DUT breakdown voltage (or current ramp up rate) is needed to trigger a breakdown detection, given C12 filtering across R19.

I have been doing and setting up for some valve and ss diodes PIV testing. I'd anticipate your tester may be fine for such an application, whether for valve or ss diodes, as the diode loading capacitance would typically be very small, and the DUT conduction current suitably low, up until vacuum related or avalanche related breakdown processes started - and the DUT current would be limited to well below damage level (eg. for ss diodes). I have done a few quick checks of ss diodes with an insulation resistance meter at 1kVDC - even modern 1N4004's didn't show conduction more than 1uA - and was setting up a cheap battery powered dc/dc to provide additional circa 500VDC to go in series with the insulation meter's 1kV.
 
You have a 5kV reference?
No, I don't: I use a high-resolution multimeter (Datron 1071) and a high-voltage probe that I can calibrate at a safe voltage (100V) whilst retaining sufficient resolution.
I set the voltage to ~5kV in pause mode, and I measure the actual output (upstream of the 220MΩ, of course, and with the detection-inhibition jumper in place).

Could you please go through the DUT breakdown detection process again wrt to U11, and what level of current through the DUT is allowed for parasitic capacitance charging (which would depend on generator voltage ramp up rate) and what rate of DUT breakdown voltage (or current ramp up rate) is needed to trigger a breakdown detection, given C12 filtering across R19.
U11 is wired essentially as a transimpedance amplifier measuring the DC return current of the HV supply.
However, it is somewhat degenerated, because I felt the need to include R28 to further isolate the opamp from HV events, even though D8, D9 and C12 already provide a certain level of protection: when you have some experience with HV, you tend to become paranoid about these details.
The degeneration doesn't fundamentally change the way the TIA operates, it just makes the calculations of gain a bit more complicated.
Because the TLC274 has an output capable of reaching 0V, the voltage on C13 under quiescent conditions is effectively zero (even a LM324 almost works... but not quite).
The time constant imparted by C12 is ~1.3ms, smaller than the 2.7ms of C14, and reasonably shorter than the fastest possible clock period (6ms).
Going much lower would not really make sense, because of the 220meg resistor and the 560pF supply filter cap.

At the lowest practical detection threshold of 100nA (which could easily be lowered if necessary) and the maximum ramp rate of 10kV/min, the maximum capacitance in parallel with the DUT is C = I/(dV/dt) = (100nA × 60s)/10kV = 600pF.
At the 1kV/min rate, this becomes 6nF, and in manual mode, it can increase to ~18nF
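
The same calculation for the three ramp rates, as a small sketch (the manual rate is taken as ~1/3 of the 1kV/min rate, consistent with the ~18nF figure):

```python
# Maximum capacitance across the DUT that stays below the detection threshold
# while the voltage ramps: C = I_threshold / (dV/dt).
I_THRESHOLD = 100e-9  # lowest practical detection threshold [A]

ramp_rates_v_per_min = {
    "exploratory (10 kV/min)": 10000,
    "serious (1 kV/min)": 1000,
    "manual (~1/3 of 1 kV/min)": 333,  # assumed, per the digital-section description
}
for name, rate in ramp_rates_v_per_min.items():
    dv_dt = rate / 60.0          # V/s
    c_max = I_THRESHOLD / dv_dt  # F
    print(f"{name}: C_max ~ {c_max * 1e9:.1f} nF")
# ~0.6 nF, ~6 nF and ~18 nF respectively, matching the figures above.
```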


I have been doing and setting up for some valve and ss diodes PIV testing. I'd anticipate your tester may be fine for such an application, whether for valve or ss diodes, as the diode loading capacitance would typically be very small, and the DUT conduction current suitably low, up until vacuum related or avalanche related breakdown processes started - and the DUT current would be limited to well below damage level (eg. for ss diodes). I have done a few quick checks of ss diodes with an insulation resistance meter at 1kVDC - even modern 1N4004's didn't show conduction more than 1uA - and was setting up a cheap battery powered dc/dc to provide additional circa 500VDC to go in series with the insulation meter's 1kV.
The tester will work well when the DUT breaks down in a brutal way, i.e. when it suddenly becomes a negative resistance: this is often the case for HV devices, but for components exhibiting a progressive increase of the leakage current, or a zener-like characteristic like controlled-avalanche rectifiers, the 220MΩ will affect accuracy.
If you intend to use the tester for such applications, it would probably be possible to correct the displayed voltage according to the threshold current setting, but this would add some complications.
The other option would be to measure the actual output voltage, downstream of the 220 meg, but I don't think it would be simpler
 