I am currently building a cheap-and-dirty gigaohmmeter.
The principle is simple and well known: a precision constant-current source (CCS) feeding the resistor under test (R.U.T.), followed by a unity-gain buffer.
No rocket science involved, but for 100G full scale with a 10V output, a current of 0.1nA is required as a stimulus.
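To make the range arithmetic explicit, the stimulus current is just Ohm's law applied to the full-scale output. A quick sketch (the 1G and 10G ranges are my own illustrative assumptions; only the 100G/0.1nA figure comes from the build):

```python
# Back-of-envelope: stimulus current needed for a 10 V full-scale
# output at each resistance range, from I = V / R.
V_FULL_SCALE = 10.0  # volts at full scale

ranges_ohms = [1e9, 10e9, 100e9]  # 1G and 10G are assumed example ranges

for r in ranges_ohms:
    i = V_FULL_SCALE / r
    print(f"{r / 1e9:>5.0f} G-ohm range -> {i * 1e9:.3f} nA stimulus")
# The 100 G-ohm line comes out to 0.100 nA, matching the figure above.
```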
Everything works as expected for lower ranges, but for 100G, the CCS fails.
Obviously, there are a number of potential issues regarding leakage, etc., and I did my homework accordingly: I carefully selected and tested all the components involved. The BSP92 was chosen for its sub-pA leakage between all electrodes, the TLC272 sample was selected for the same reason, and so on.
I used guarding for the output, but it doesn't seem to be an issue: the main problem is the CCS. It should deliver 0.1nA, but it delivers nothing.
I measured the current through Rx and R2; they were identical, and both zero. When R2 is reduced to 250Meg, a current of 0.2nA flows through it and Rx, and the indication is consistent.
With the full 1G reference resistance, no current flows, and the output of the TLC272 goes high, trying to reduce it even further.
At 1nA, its output sits ~0.7V below the +12V rail. The 3V difference between the 15V and 12V rails gives the opamp comfortable headroom.
There must be a leakage unaccounted for somewhere, but I am unable to locate it.
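To put a number on how small that unaccounted leakage could be and still matter: at 0.1nA, even a parasitic path in the 100G class can divert the entire stimulus. A rough estimate, assuming a worst-case ~12V potential difference across the hypothetical leakage path (my assumption, not a measured value):

```python
# Leakage resistance that would fully divert the 0.1 nA stimulus,
# assuming ~12 V across the parasitic path (assumed worst case).
I_STIMULUS = 0.1e-9   # amps, 100 G range stimulus
V_LEAK = 12.0         # volts across the hypothetical leakage path

r_leak = V_LEAK / I_STIMULUS
print(f"R_leak = {r_leak / 1e9:.0f} G-ohm")  # prints "R_leak = 120 G-ohm"
```

In other words, any path better insulated than about 120G is invisible on the lower ranges yet fatal here, which is consistent with the 250Meg case working.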
I have measured the actual gate current of the BSP92, and it's below 1pA, as is the -input current of the TLC272.
The reed relay has been screened too, and in any case, in the 100G condition all of its electrodes are positively referenced, meaning any leakage could only increase the current, not diminish it.
The problem is clearly the CCS, but I don't see where it stems from.
It could be the board: it's epoxy perfboard, but it doesn't seem to have issues; nothing measurable, at least. I milled slots to isolate the critical nodes, and I lifted the -input of the opamp to connect it directly to the source of the BSP.