I use a battery-operated millivoltmeter with an input resistance of 10 MΩ as a picoammeter, and I connect its COM terminal to the lowest-impedance side of the loop I break.
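A quick sketch of the conversion this implies (assuming the meter's 10 MΩ input resistance is the only shunt in the broken loop; the helper name is mine, for illustration only):

```python
# Convert a millivoltmeter reading to loop current when the meter's
# 10 Mohm input resistance itself serves as the shunt (I = V / R).
R_IN = 10e6  # meter input resistance, ohms

def reading_to_current_pA(v_volts: float) -> float:
    """Return the loop current in pA for a given voltage reading."""
    return v_volts / R_IN * 1e12

# A 1 mV reading corresponds to roughly 100 pA:
print(f"{reading_to_current_pA(1e-3):.1f} pA")  # prints "100.0 pA"
```

So the meter's resolution in millivolts directly sets the current resolution: 0.1 mV of display resolution is 10 pA.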
The oscilloscope remains connected to the opamp output, and does not show anything abnormal, except an increased level of 50Hz hum. When the hum level is too large, I place GND-connected screens around the setup.
U1 remains in the range except with the 900M resistance connected: its output then goes to the +12V rail because it loses control.
On the previous ranges, the voltage is within the linear range, but quite low: for the 1nA range, the voltage is comparable to a BJT CCS.
I agree that something strange is going on, but I cannot pinpoint it. I have tried various methods, like a cold spray, but the thermal shock sends the voltages all over the place and no conclusion can be drawn.
I have also tried to explore the PCB with the tip of a high-impedance millivoltmeter, to detect stray conduction paths, but triboelectric effects swamp the measurement.
I use a battery-operated millivoltmeter with an input resistance of 10 MΩ as a picoammeter, and I connect its COM terminal to the lowest-impedance side of the loop I break.
Makes sense.
The oscilloscope remains connected to the opamp output, and does not show anything abnormal, except an increased level of 50Hz hum. When the hum level is too large, I place GND-connected screens around the setup.
There are two op-amp outputs, do you monitor both? What is your criterion for too large? Any difference in DC level with and without screening?
The last time I tried to measure really small currents (CMOS inverter input DC currents) at home using some circuit I built on a perfboard, I originally got completely erratic results because I had no shielding and I was really measuring rectified mains hum. With a grounded shield all around, the results made perfect sense.
U1 remains in the range except with the 900M resistance connected: its output then goes to the +12V rail because it loses control.
I presume you mean the +15 V rail, minus the voltage drop across the output stage.
On the previous ranges, the voltage is within the linear range, but quite low: for the 1nA range, the voltage is comparable to a BJT CCS.
Your MOSFET is working far in weak inversion (a.k.a. subthreshold region), so its source-gate voltage is well below the threshold voltage. That's to be expected with a 200 mA MOSFET conducting 1 nA.
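To put rough numbers on that, the usual exponential weak-inversion model I_D ≈ I_0·exp(V_SG/(n·V_T)) shows how much gate drive each decade of current costs; the slope factor n and the decade count here are illustrative assumptions, not values from M1's datasheet:

```python
import math

V_T = 0.026  # thermal voltage at ~300 K, volts
n   = 1.5    # subthreshold slope factor (assumed, device-dependent)

# Gate-drive change per decade of drain current in weak inversion:
mv_per_decade = n * V_T * math.log(10) * 1e3
print(f"{mv_per_decade:.0f} mV per decade")  # prints "90 mV per decade"

# Dropping from ~1 mA (near-threshold conduction) to 1 nA is 6 decades,
# so the source-gate voltage sits several hundred mV below threshold:
decades = math.log10(1e-3 / 1e-9)
print(f"about {decades * mv_per_decade:.0f} mV below threshold")
```

Hence a very low V_SG at 1 nA is normal behaviour, not a fault.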
I agree that something strange is going on, but I cannot pinpoint it. I have tried various methods, like a cold spray, but the thermal shock sends the voltages all over the place and no conclusion can be drawn.
I have also tried to explore the PCB with the tip of a high-impedance millivoltmeter, to detect stray conduction paths, but triboelectric effects swamp the measurement.
There are two op-amp outputs, do you monitor both? What is your criterion for too large? Any difference in DC level with and without screening?
I only monitor the CCS. The schematic is heavily simplified, but the output of U2 is fitted with an AC detector that lights a red LED when the peak-to-peak voltage exceeds 1 V (10% of full scale). For the other, I also use the 10% criterion, relative to the difference between the output and the +12 V rail.
The screening causes a momentary shift of the DC level, but it slowly returns to its initial value.
I presume you mean the +15 V rail, minus the voltage drop across the output stage.
Yes, of course.
Your MOSFET is working far in weak inversion (a.k.a. subthreshold region), so its source-gate voltage is well below the threshold voltage. That's to be expected with a 200 mA MOSFET conducting 1 nA.
I wonder why in the AoE they traced the subthreshold current/V_GS graphs starting at 1 nA. Besides the power supply and common-mode noise rejection being a problem even with a battery, the guys there were working with astrophysics-grade calibrated equipment, so if they couldn't do it, who or what could?
If you plot V_GS vs. I_D in ultra-weak inversion, the line is no longer a straight logarithm but tapers off; it could be gate-oxide contamination, as that far end of the curve is seen as a measure of clean processing / lack of contamination.
Those circuits are often built using Teflon soldering posts; old Keithley pA meters were full of them. It also helps to find long resistors with proper glazing.
If I can find a P-channel JFET having sufficiently low leakage, I'll try replacing the PMOS, just to see if it changes something.
I'm probably stating the obvious now, but you will need a much larger difference between the high and the low supply then.
I have managed to test a JFET by slightly shifting the gate voltage, and the result was exactly the same.
More than that, I tested a BJT, a cheap Chinese 2SA1015, and it also worked in exactly the same way: modern Chinese BJTs are exceptionally good regarding ultra-low-current operation compared to legacy US and European types. Negligible leakage, no gain droop, etc.
Thus, the problem is elsewhere.
The Keithley Low Measurements Handbook was mentioned earlier in this thread. In that Handbook, it states that when performing the FIMV approach, the current source has to have an output impedance much larger than the device under test. Have you characterized the output impedance of your current source?
No, it can only be inferred because it is many teraohms. Anyway, this is not the current problem: the output current is too low, whether the output is shorted to GND or connected to any voltage between 0 and 10V.
The problem lies in the upper part, and there the current is also too low.
Solving that problem is my first priority. Once it is done, I can look at other potential issues, like parasitic shunt resistances.
The output of U2 is fitted with an AC detector that lights a red LED when the peak-to-peak voltage exceeds 1 V (10% of full scale). For the other, I also use the 10% criterion, relative to the difference between the output and the +12 V rail.
I'm not sure if this is relevant, but assume you have a 100 mV peak, 50 Hz sine wave across 5 pF of capacitance. The peak current is then 50π pA ≈ 157 pA > 100 pA.
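That figure is just the capacitor current I_peak = 2πfC·V_peak; a quick check of the arithmetic with the same assumed numbers (100 mV, 50 Hz, 5 pF):

```python
import math

f = 50        # mains frequency, Hz
C = 5e-12     # assumed stray capacitance, F
V_peak = 0.1  # assumed hum amplitude at the node, V

# Peak displacement current through the capacitance:
I_peak = 2 * math.pi * f * C * V_peak
print(f"{I_peak * 1e12:.0f} pA")  # prints "157", i.e. 50*pi pA
```

So even a few pF of stray coupling can carry an AC current larger than the full-scale DC current being measured.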
The screening causes a momentary shift of the DC level, but it slowly returns to its initial value.
That seems to refute the hum hypothesis.
I'm not sure if this is relevant, but assume you have a 100 mV peak, 50 Hz sine wave across 5 pF of capacitance. The peak current is then 50π pA ≈ 157 pA > 100 pA.
The current or capacitance is not very important: what matters is to keep the measurement node in its linear region. The AC average value will be zero, and if the voltmeter has sufficient AC rejection (it has), the displayed value will be correct.
If the capacitance is large, it will improve the averaging, but it will slow the measurement.
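To put rough numbers on that trade-off (a sketch assuming the 900 MΩ range resistor dominates the node impedance, so the time constant is simply τ = R·C; the capacitance values are illustrative):

```python
# Settling time vs. averaging capacitance on the 900 Mohm range.
R = 900e6  # range resistor, ohms (assumed to dominate the node impedance)

for C in (5e-12, 100e-12, 1e-9):
    tau = R * C          # node time constant, seconds
    t_settle = 7 * tau   # settling to ~0.1% takes about 7 time constants
    print(f"C = {C * 1e12:.0f} pF: tau = {tau:.3g} s, settle ~ {t_settle:.3g} s")
```

With 1 nF the time constant is already 0.9 s, so each reading needs several seconds to settle to 0.1%.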
Perhaps it's worth having a look at the operating principle and circuit diagram of the Radiometer IM6 Megohmmeter. It measures up to 10^9 Mohm.
http://www.peel.dk/Radiometer/IM6.html
The current or capacitance is not very important: what matters is to keep the measurement node in its linear region. The AC average value will be zero, and if the voltmeter has sufficient AC rejection (it has), the displayed value will be correct.
I was wondering what would happen if such an AC current were superimposed on the source current of M1, but I can't think of any way that could reduce, rather than increase, the average currents.
I made some progress: I notched the PCB deeper and further, and managed to scrape (literally!) a few pA, but the game-changer was the modification of the guarding scheme: initially, it was according to the outline of the first post.
I reasoned that the sensing node, seeing impedances of 100 GΩ, was the priority. Therefore, I surrounded the drain of the MOS with the guard connected to the output.
I was a bit worried that the rest of the MOS was unguarded, but with only 1G on its source and a very small area, I didn't take any precaution.
Of course, a deeper analysis immediately shows that the source node is as sensitive as the drain, but with the SMD package and a perfboard, managing two separate guards seemed over the top.
But I realized it could be a problem, since everything else failed.
At first, I simply disconnected the guard under the MOS, and it did improve matters.
Then, I removed the conductor completely and milled a notch under the MOS.
It greatly improved matters: now, with the 900M resistor, the current is 50pA. Not yet the 100pA goal, but a great improvement compared to zero.
The next step will be the squeezing of the two guards under the MOS. Progress is slow and laborious, but it does happen.
Not unexpected for a gigaohmmeter on the cheap
Do you see any chance of putting a guard around the source and connecting it to the positive input of U1? Or to replace M1 with something in a TO-92 package?