Distortion analyzer recommendations?

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
Administrator
Joined 2004
Paid Member
Hi Jack,
The internal oscillator is good for about 0.002% in the 339.
Yes, that is the limit for this oscillator. The average 339A will measure 0.0018% when testing its own oscillator. I am looking into ways to improve the oscillator a little. Today's semiconductors are a little better than what they had to work with back then.

Hi TubeMack,
The THD section is coupled to the oscillator frequency setting. This eliminates having to tune the oscillator and the THD notch filter independently. Believe me, this is a wonderful change from what I have been doing for many, many years. Making THD measurements with this instrument is also easier simply because the level, within a 10 dB range, is set automatically. The 339A represents a huge time savings by eliminating two tedious adjustments for each measurement. The fact that it has a lower residual distortion than most other analyzers on the market is pure icing on the cake. I'm not embarrassed to say that I have an affection for this distortion test set. It has made my life a great deal easier, and I get better numbers in the deal as well. By better numbers, I don't mean lower either. In fact, the distortion indication on a 339A is normally higher than on many newer instruments manufactured by Kenwood and Leader (or whoever makes them), simply because it responds to much higher frequencies and because it uses an AD536 RMS converter. This allows it to respond properly to non-sinusoidal waveforms, as a peak- or average-responding meter cannot. A truer number, if you will.
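The difference between a true-RMS reading and an average-responding meter can be sketched numerically. This is an illustration only (the waveform and 10% third-harmonic level are invented, and this is not a model of the 339A itself): an average-responding meter is typically calibrated to indicate RMS for a pure sine by scaling the rectified average by the sine form factor, so it mis-reads harmonic-rich signals, while a true-RMS converter does not.

```python
import math

# Illustration (invented waveform, not from the 339A manual): compare a true
# RMS reading with an average-responding meter that is calibrated to indicate
# RMS for a pure sine. Such a meter scales mean(|v|) by the sine form factor
# pi / (2 * sqrt(2)) ~= 1.1107.

N = 100_000

def sample(t):
    # Test waveform: fundamental plus 10% third harmonic (arbitrary choice)
    return math.sin(t) + 0.10 * math.sin(3 * t)

ts = [2 * math.pi * i / N for i in range(N)]
vs = [sample(t) for t in ts]

true_rms = math.sqrt(sum(v * v for v in vs) / N)
avg_reading = (sum(abs(v) for v in vs) / N) * math.pi / (2 * math.sqrt(2))

print(f"true RMS:                 {true_rms:.5f}")  # analytically sqrt(0.505) ~= 0.7106
print(f"average-responding meter: {avg_reading:.5f}")  # differs; error depends on harmonic phase
```

The true RMS depends only on the power in each harmonic, so it is independent of harmonic phase; the average-responding figure shifts when the third harmonic's phase changes, which is one reason two instruments can legitimately disagree on the same signal.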

I just performed the test you were asking about. At maximum sensitivity for input and maximum sensitivity on the THD section, I am reading 0.00105% on the meter with the input shorted. It might actually be lower since I do not have a shielded dual banana plug as they specify in the manual.
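A common rule of thumb (not an HP procedure, just the usual assumption that uncorrelated contributions add in root-sum-square) shows how little a ~0.001% residual matters once the device under test sits well above it:

```python
import math

# Rule-of-thumb sketch: if the DUT's distortion and the analyzer's own
# residual are uncorrelated, they combine in root-sum-square, so the DUT
# contribution can be estimated from a measured reading.

def dut_estimate(measured_pct, residual_pct):
    """Estimate DUT distortion given the analyzer's residual (both in %)."""
    return math.sqrt(max(measured_pct**2 - residual_pct**2, 0.0))

# With a 0.00105% residual, a 0.0018% reading implies roughly:
print(round(dut_estimate(0.0018, 0.00105), 5))  # ~0.00146 (%)
```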

I've been thinking of making a few out of single banana plugs, Plexiglas and sheet copper. Either way you look at it, a pain to create. May as well find some copper banana plugs so that I can make a few true shorts in order to set my meter zeros. The normal plated things sold will create thermocouple-type errors when you start getting into the mV region with high-resolution meters like the Keithley or HP/Agilent DVMs. Anyone working with thermocouples will know this issue well.

-Chris

Edit: At those low voltages with extremely high gain, expect the switching noise from the contacts and even some microphonics to create some bouncing as you adjust the controls. So - yes, this would be normal and expected under the circumstances.
 
I did the next test, looping the OSC to the DA.

I need to disclose that my test lead shipment has not arrived, so I have no shielded jumpers with banana plugs. I'm using a 1 foot jumper of lamp cord with bananas.

The unit first read 0.006%, but just moving my hand or standing near the jumper would swing the meter. Not an ideal situation.

Next I engaged the high-pass filter to help remove whatever the jumper may be picking up. The reading dropped to 0.002%.

Next I coiled the jumper to reduce its length, and the meter dropped to 0.0016%.

I'm thinking this 339A is pretty close?

I'm going to use it for tube amp testing, and to kill the fundamental going into the FFT of my new Rigol DSO. Hopefully it is close enough to dead on.

I have learned that at these levels DAs seem extremely sensitive. It's hard to believe an accurate reading could be had running into a 100 W amp and then pulling a signal off a dummy load. It felt like I could throw things off just by breathing too hard.
 
Hi TubeMack,
The unit first read 0.006%, but just moving my hand or standing near the jumper would swing the meter. Not an ideal situation.
Normal, and expected. Do you have any idea how much gain you have there?

I'm using a 1 foot jumper of lamp cord with bananas.
Now, that is not ideal!
I use BNC cables for just about everything, and good adapters (Pomona or equiv.) to go from BNC to RCA or "F". I would recommend that you do something similar, rather than go for more specialized leads.

I'm thinking this 339A is pretty close?
Yes.

Hopefully it is close enough to dead on.
Yes, you will enjoy this.

Hard to believe an accurate reading could be had running into a 100W amp then pulling a signal off a dummy load.
This is no problem; I've had much larger amps running and I only have 250 watt Dale dummy loads. A Carver PM-1.5 or M 4.0t can toast those - no problem. Keep measurement times short!

It felt like I could throw things off just breathing too hard.
Not really, but consider how far down you are measuring. Try -80 dB F.S., and another 20 dB down near the bottom of the scale. Right about now you should be impressed. BTW, this is also why real speakers will not give you valid readings. That's aside from the fact that external noises will cause the needle to bounce around. Yes, the driven speaker is also a mic.
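To put those levels in perspective, THD percentages convert directly to dB relative to the fundamental; a sketch of the arithmetic (the specific percentages are just the figures discussed in this thread):

```python
import math

def thd_pct_to_db(pct):
    """Convert a THD percentage to dB relative to the fundamental."""
    return 20 * math.log10(pct / 100.0)

print(thd_pct_to_db(0.01))     # about -80 dB
print(thd_pct_to_db(0.001))    # about -100 dB
print(thd_pct_to_db(0.00105))  # about -99.6 dB, the shorted-input residual earlier
```

So a 0.001% reading sits a full 100 dB below the fundamental, which is why stray pickup, contact noise and even acoustic coupling into a driven speaker are visible on the meter.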

It may take some time to get your head wrapped around what levels you are dealing with here. Once you figure that out, you will respect the conditions that really low distortion circuits need to have for measurements to occur.

-Chris
 
HP 3478A

Hi frags,
Cool, but you don't have a front end, a signal amplifier (so no low levels), filters or protection. The 339A responds to very high frequencies (the harmonics) and your solution does not. The best place for your solution is hanging off the analyzer outputs of something like the 339A.

There is more going on with a distortion analyzer than first appearances. I have tried, and do not use my sound card for this. Plus, the computer ground is a noisy thing.

Hi TubeMack,
Well, try the electronics jobbers or on-line stores. This is a General Cement (GC Chemicals) product, a liquid in a bottle. Check with Newark or Digikey. In Canada, you should find it at Sayal possibly, Electrosonic or Active Components.

Caig Laboratories may have a similar product as well. Just think how "green" this type of product is: less used and no propellant. I only use a canned version if I can't reach the area. It's hard to deliver a short shot; you always get far too much out.

Hi Damon,
Great price!!! Let's just say that my 3478A meters cost me about 4X more than yours. You might want to replace the lithium battery before you lose your calibration constants. It takes a 2/3 AA, non-rechargeable, with a single lead on each end. I used the tab versions. You need one extra and a 10 K resistor to keep the memory alive as you transfer the batteries. That's my trick and it's saved me lots of $$$. :)

-Chris

Hey Anatech,
I am about to switch out the lithium battery on my HP 3478A. Above, you said that you use an extra battery and a 10K resistor to keep the memory alive. Why do you need a 10K resistor? What are you using that for? I was just going to temporarily tack 3 AA batteries (in series) across the + and ground terminals while I make the switch.
Thanks,
Steve
 
Hi Steve,
You don't want any current flowing. The 10 K resistor limits any current flow to very low levels. Don't forget that you're working directly on a power supply that isn't expected to suffer any surges and that goes into sensitive areas of the meter. Also, the voltage is 3 VDC, 3.3 VDC for a fresh battery. 4.5 VDC (or more) is clearly well out of limit. The connections I am recommending are solid, soldered connections. Your solution may have involved loose batteries in a holder. Bong! Batteries popped out.

Also, you don't want the old battery to discharge the "keep alive" cell either. It would be nice to install it into something else later on. As for the 1.5 volt cells, even a momentary drop to 0 volts may be enough to lose your calibration constants. That's ugly. Mind you, a fresh, proper calibration isn't all bad either - with data (the more expensive option). I would use either Agilent or Fluke for the cal.

The value of the 10K resistor is not critical at all. It was the magnitude I was looking for, and it works.

-Chris
 
Hey Chris,
Initially, I was thinking about using 2 AA batteries for 3 volts instead of the 3 AAs for 4.5 volts. But, from the manual, when the unit is on, 5 volts are supplied to the memory. So, I figured that somewhere in between would be OK.
I do see what you are saying about the resistor though. I think I will use one as well.

Thanks for the tip,
Steve
 
Hi Steve,
Well, the actual goal is to supply the memory retention voltage and avoid any sudden shifts in that potential. So the resistor will form an LPF with any capacitance that may exist from that line to ground. Nice and smooth.
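The figures behind the 10 K trick are easy to estimate. The 0.1 uF bypass capacitance below is an assumed value for illustration only; the actual capacitance on the 3478A's memory rail will differ.

```python
import math

# Back-of-envelope figures for the battery-swap trick. The bypass capacitance
# is an assumed value for illustration; the real board value will differ.
R = 10_000   # ohms, the suggested 10 K series resistor
V = 3.3      # volts, a fresh lithium cell
C = 0.1e-6   # farads (assumed bypass capacitance on the memory rail)

worst_case_current = V / R          # current even if the line were shorted
cutoff = 1 / (2 * math.pi * R * C)  # low-pass corner formed with C

print(f"worst-case current: {worst_case_current * 1000:.2f} mA")  # 0.33 mA
print(f"LPF corner: {cutoff:.0f} Hz")                             # ~159 Hz
```

Even dead-shorted, the resistor holds the current to a fraction of a milliamp, and with any reasonable bypass capacitance it smooths out fast transients on the retention line.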

I would highly recommend you use some sort of backup supply that has no contacts or leads involved at all. My lithium cell was soldered in before removing the old battery. The next steps are to clean up the mounting area, especially if the original battery leaked a little, followed by the installation of the new battery. You did measure the new battery first before starting this - right?

While you are moving things around, it is very easy to bump a battery holder and lose your calibration constants. A lithium battery is small, light and soldered right on. Tough to accidentally bump off. Play it safe. While you are considering the cost of another battery, flash a thought to how much it costs to have the meter calibrated.

One of my 3478As had a dead battery. I installed a replacement battery and calibrated the 3478A by transferring the accuracy of my 3457A and 34401A meters. At some point, it will either be sold or calibrated properly.

-Chris
 
Just a question, Chris, as you know HP meters quite well: does the 34401A suffer from the same problem? I mean, discharging the backup battery and bye-bye to the calibration constants.
If so, how long does the battery last? Mine is 10 yrs old now.
Thank you!
 
Hi massimo,
Your question really concerns the technology of "closed case calibration" for all brands of equipment. There was a time before EEPROM was implemented for memory backup of calibration constants. The technology then was static RAM, the best there really was at the time. Therefore, I'd have to answer your question in the positive, with the qualifier that I don't know the exact technology used in your instrument.

For all,
To the best of my knowledge, the 34401A probably does use the same information storage method as the earlier 3478A. The lithium batteries used in these have a very long lifetime, it seems. Note that these are not rechargeable like Ni-cad technology; just as well, because a Ni-cad would be dead by now. Rechargeable lithium batteries are available, but they are not the right type. The size used in the 3478A is 2/3 AA as I recall, with one terminal at each end. My 34401A is still running fine, and still in tolerance after over 15 years of service. I think I bought one just after the introduction of that model.

I absolutely love my 34401A. I would have bought these instead of the 3478A if it weren't for their extremely high resale value. If I had the available funds and some reasonably priced 34401As became available, I'd buy one (or two). For the prices that used 34401As go for, I'd buy a newer 34410A or 34411A instead. They are both out of my price range.

Now, I'm guessing you bought your meter new. If that is the case, you should have the service manual that came with it. Have a look at the schematic; if you can't find it there, try the parts list as a last resort. I always seem to have more trouble locating things like this in the parts list. If you find the part, could you please reply and post the HP # here? I don't know if they will give you a useful description in the manual. Any manufacturer's real part number would be gold.

Now, about the other ways of storing calibration constants. They could store them in the micro-controller itself these days, if they use one with enough EEPROM. Other things that use EEPROMs are televisions and DVD players. I hear of a lot of EEPROM failures in televisions, so something there is not reliable. Agilent would certainly not use an untrustworthy device for this. Also, many "e-pots" have their own non-volatile memory these days. I don't know if Agilent uses these, but other equipment manufacturers might. Older EPROMs are also beginning to fail; the 3456A is an example of a product with this issue. I bought one "working perfectly" off Eeekbay. Nope. Doesn't even boot. So much for vendor honesty. I mention this only because a number of HP meters are becoming available, some at affordable prices. If anyone does buy one of these, open it up, find the replacement battery and order one (or two) so you can retain the calibration. A normal calibration will often run over $100, so it's worth your time to safeguard the data.

I see that new lithium batteries intended for battery backup service are now warrantied for 25 years (in one ad I saw today). To be honest, this would seem to be more than long enough, even though Agilent / HP equipment lasts longer than that. I have been thinking of going into my 34401A to replace this battery, and also into a 3457A that was recently calibrated, before its battery dies. It does seem to be about time for both 3457A and 3478A meters to lose their batteries.
 
Hey Chris,
I assume that you install the resistor on the positive side of the temporary battery (not the ground side) -- correct?

Thanks,
Steve
 
Hi Steve,
... how did you transfer the accuracy of one meter to the 3478A?
That's really just a term used in the calibration industry. What it mostly means is that you can use an uncalibrated signal source to calibrate another device (a DVM in this case) while monitoring that source with yet another device that ideally has an accuracy 4X that of the device you are calibrating, or better ("optimizing" or "adjusting" in cal-speak). Strictly speaking, applying a test stimulus to an instrument, observing the indication, and writing down or recording each check point value somewhere constitutes a "calibration". Of course, correcting a value by adjusting it to agree with the acceptable range of indications can also be part of a calibration. What I am saying is that an adjustment does not need to be performed in order to call the procedure a verification that a device is performing within its stated accuracy from the manufacturer. It's also possible to calibrate the same device to tighter or looser specifications than the manufacturer stipulates. This is called a "Special Calibration", and the label stating that the device is within the changed tolerance looks physically different from one for a calibration to the manufacturer's own tolerances. That way, the fact that the acceptance deviates from the device model standards is easily noted. We used a white label for standard certifications and a yellow one to indicate a "Special Calibration" situation.

Now, why do you want your "calibration standard" to be 4 times or more accurate than the device you are certifying to be in calibration? The "calibration standard" also has an accepted tolerance, so its indicated value may actually be off from the true value by so many percent. This is called an "uncertainty". The device to be calibrated likewise has an error band of acceptable indication, its own "uncertainty". The TUR is merely a comparison of the two uncertainties. A minimum of 4X the accuracy guarantees that the calibrated device will in fact be within its own uncertainty (or tolerance, or acceptable deviation from the real value). The 4:1 ratio represents a 4:1 TUR, or Test Uncertainty Ratio. This ratio will very likely change with range and function, depending on the settings on the device: for example, an AC reading on a 1.9999 V range versus a DC reading on the 20.000 V range. Here, both the type of measurement (AC and DC, or even resistance) and the range of the measured value differ (a 2 V range versus a 20 V range). The point is that the worst (closest) ratio should be no less than 4 to 1. If the ratio does fall below 4 to 1 (the exact threshold is an industry convention), then the calibration certificate becomes a "Special Calibration". A "Special Calibration" for this reason is something a calibration lab does not want to get involved in. To prove the accuracy of the lab standards, there may have been a large amount of statistical analysis involved. Factors like time (since previous calibration), temperature and vibration may all contribute to measurement errors for any device.
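The 4:1 rule described above can be sketched in a few lines. The numbers are invented for illustration; a real lab computes uncertainties far more carefully.

```python
# Sketch of the 4:1 TUR rule described above. Tolerance figures are invented.

def tur(uut_tolerance, standard_uncertainty):
    """TUR: the UUT's allowed tolerance vs. the standard's uncertainty."""
    return uut_tolerance / standard_uncertainty

def label_colour(ratio, minimum=4.0):
    """White label for a standard cert, yellow for a "Special Calibration"."""
    return "white" if ratio >= minimum else "yellow"

# UUT allowed +/-0.02% of reading, standard good to +/-0.004%: TUR = 5:1, OK.
print(label_colour(tur(0.02, 0.004)))  # white
# Standard only good to +/-0.01%: TUR = 2:1, below the 4:1 floor.
print(label_colour(tur(0.02, 0.01)))   # yellow
```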

Sorry for talking your ear off there, but I wanted to give you a clear picture of the calibration process. Still, what I've posted is simplified.

-Chris
 
calibration

Hey Chris,

Interesting -- thanks for the info. I thought maybe there was a way to insert calibration constants (obtained from a known calibrated meter) into an uncalibrated 3478A. But what you're talking about is creating a table of actual readings from one or more calibrated meters and comparing them to the readings from an uncalibrated meter. Then, when you use the uncalibrated meter, you would use the table to extract the actual value from the one read on the uncalibrated meter -- correct?

You said that you would record a reading for each check point. Typically, how many check points would you use for each function (i.e., how many check points would you use for voltage on the 3478A)? I mean, could you take as few as 2 points (a high and a low reading) and then interpolate a linear relationship between the 2 points for the rest of the readings? Or are DMMs not linear in this respect?

Thanks,
Steve
 
Hi Steve,
Well, no. What is happening is that you are measuring your unknown (but stable) signal source with a known, good meter that is within its calibration period. You take the value of your signal source as indicated by the calibrated meter and correct the reading on the uncalibrated meter to be as close as possible to the readings on the calibrated meter. You are transferring the accuracy of the known meter, through the signal source, to the unknown meter. For example, you have an HP 3457A in calibration and some HP 3478A meters that are out of calibration, or have lost their calibration constants due to a dead lithium battery. You also have a Data Precision 8200 voltage / current adjustable reference (out of calibration) and an old RFL calibrator (well out of tolerance) to supply AC voltages. No one in their right mind would consider using these devices for daily calibration duties, but a hobbyist or experimenter using this stuff at home could swing it. For one, the RFL doesn't have the accuracy to calibrate an HP 3478A; the TUR is a fraction in this case. The RFL was suitable for calibrating inexpensive 3 1/2 digit multimeters, but not the high quality Fluke types, or good bench meters!

So what you are now looking for is the ability to set a voltage or current to a stable value. If you can do that, you're all set for the next phase. Connect the HP 3457A to the leads at the unknown (or out-of-cal) meter's input jacks. Now you can sense what voltage actually exists at the input to your 3478A. Adjust the signal source to a value close to the one requested in the calibration instructions for the 3478A, as indicated on the display of the 3457A, then perform the reading on the 3478A, or the adjustment, as required. Once you have gone through all the steps, you will have transferred the accuracy of the HP 3457A to the HP 3478A. If tested on a proper calibration standard (such as a Fluke 5500A calibrator), it should pass. To follow the chain to the NIST standards: the accuracy of the NIST standards was transferred to whatever standards the calibration facility had for the purpose of calibrating the 5500A (for example). The accuracy of the 5500A was transferred to the HP 3457A at some lab, which you used to transfer that value to the 3478A through an uncalibrated source. It's important to understand that with each transfer, the accuracy decreases (or the error budget increases). The important thing here is that you can trace the accuracy from the 3457A to the 5500A calibrator, through to some higher standard, and eventually to the standards held at NIST (or NRC in Canada). You have an unbroken audit trail back to the primary standards. What that really means is that you have a certain confidence that a reading taken from the calibrated HP 3478A is within so many percent of the true value at the test point. That is the entire idea behind calibration standards and why they are so very important.
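The growing error budget in that chain can be illustrated numerically. The uncertainty figures below (in ppm of reading) are invented for the sketch, and the root-sum-square combination assumes the errors at each link are independent:

```python
import math

# Illustration only: invented uncertainty figures (ppm of reading) for each
# link of a traceability chain like the one described above.
chain = {
    "NIST primary standard":          5,
    "lab standard (e.g. 5500A)":     10,
    "HP 3457A transfer":             20,
    "HP 3478A as finally calibrated": 50,
}

# Assuming independent errors, the combined uncertainty is the root-sum-square
# of the individual contributions: the error budget grows at every transfer.
combined = math.sqrt(sum(u * u for u in chain.values()))
print(f"combined uncertainty: {combined:.1f} ppm")  # 55.0 ppm
```

Note that the last link dominates: the whole point of the 4:1 rule is to keep each standard's contribution small compared with the device below it in the chain.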

There is another Agilent / HP DVM that often serves as a transfer standard: the awesome 3458A with the voltage reference upgrade (by Fluke at one time; Agilent may have caught up by now). That is pretty much the last word in digital voltmeter accuracy standards. Keithley has amazing products in the nV and uA ranges, and beyond that you start getting into physics experiments for higher accuracy still. Note that time is also a factor: not as in "what time is it?", but rather the exact frequency of the clock used in a meter to quantify a reading.

The field of metrology can get amazingly hairy, depending on how far you are trying to push down the levels of uncertainty. I wonder if they have figured out a theoretical limit for this. After all, there is a baseline of ideal background noise in every physical component, and statistics will only get you so far (pretty far, really). Fluke Electronics published an excellent book on metrology and calibration (metrology being the science of measurement and calibration procedures). If you are interested, try to borrow a copy; it's an expensive book to buy. On-line forums specializing in this are another excellent way to learn; try the Agilent forum, for example. I'm sure Fluke and others have their own as well.

Another interesting fact is that low-uncertainty calibration puts you firmly in the world of physics and the basic physical properties of the world around us. I've often said that the study of electronics is simply a subset of the physics world. This should drive that point home. It should also illustrate why I don't believe in "magic" parts or circuits.

Now to address your comment ...
I thought maybe there was a way to insert calibration constants (obtained from a known calibrated meter) into an uncalibrated 3478A.
Yes, that process is called "calibration", or "optimizing the calibration". You apply a known quantity to the input of the 3478A and tell the meter what that value is exactly (within its acceptable range for that value), then tell it to set itself to read a value close to this. The process normally runs like this: you set the zero on the range, then the span on that range. Repeat for each range of each basic function, then lock the calibration in. Some meters use a software lock (like the 34401A) and some use a physical switch (the 3478A), and it may even be on the front panel for all to view! Thank goodness there are extra steps to follow, so gross errors are difficult to cause by mistake.
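The zero-then-span step can be sketched as a two-point linear correction. This is an illustration of the idea only (the readings are hypothetical, and it assumes the meter is linear between the points; real DMM calibration uses more points per range):

```python
# Sketch of the zero/span idea: a two-point linear correction, assuming the
# meter is linear in between. All readings below are hypothetical.

def make_correction(raw_zero, raw_span, true_span):
    """Return a function mapping raw readings to corrected values."""
    gain = true_span / (raw_span - raw_zero)
    return lambda raw: (raw - raw_zero) * gain

# Hypothetical range: with 0.000 V applied the meter reads 0.012 V, and with
# 10.000 V applied it reads 10.052 V.
correct = make_correction(0.012, 10.052, 10.000)
print(round(correct(0.012), 3))   # 0.0
print(round(correct(10.052), 3))  # 10.0
print(round(correct(5.032), 3))   # 5.0  (midpoint, assuming linearity)
```

Whether two points suffice is exactly the linearity question raised earlier: the extra check points in a certification exist to catch a meter that is not linear between zero and span.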

Now, a simple certification (you only receive out-of-tolerance before and after readings) normally looks at three values per range, per function: near the zero point, the mid point and near full range (or span). A more comprehensive certification would test about 6 points per range, per function, and you get the full before and after values. These values will be the same unless an "optimization" was performed. That normally happens if a value reads at more than 80% of the allowable error spec. If the reading is really out of tolerance (a bad thing!), the facility using that instrument may have to do a recall on the work performed with the out-of-tolerance instrument. It's a QC issue, and a serious one. Where instruments are depended on, it's very common to perform a "cross check" with other instruments every week, or even every single day. A cross check uses equipment that may be of similar accuracy and is a confidence check only. An instrument that fails a self-test or cross check is caught early, before the costs get out of hand. Self tests may be performed (if the instrument has this function) at the beginning of each shift. As you can see, calibration and certainty are serious business. That is how we keep industrial processes under control and verify the proper operation of various pieces of equipment. Potential losses due to out-of-tolerance instrumentation can be extremely large and even deadly.
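The acceptance logic described above (adjust at 80% of spec, fail past 100%) can be sketched as follows. The thresholds follow the post; the check-point values are invented:

```python
# Sketch of the acceptance check described above: flag a point for
# "optimization" past 80% of the allowable spec, and as out of tolerance past
# 100%. The point values are invented.

def classify(reading, nominal, tolerance):
    """tolerance is the allowable +/- error at this check point."""
    frac = abs(reading - nominal) / tolerance
    if frac > 1.0:
        return "OUT OF TOLERANCE"
    if frac > 0.8:
        return "optimize (adjust)"
    return "pass"

print(classify(10.001, 10.000, 0.005))   # pass (20% of spec)
print(classify(10.0045, 10.000, 0.005))  # optimize (adjust) (90% of spec)
print(classify(10.006, 10.000, 0.005))   # OUT OF TOLERANCE (120% of spec)
```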

-Chris :)
 
Hi Steve,
I didn't give the name of the book; I can't remember it off-hand. They may have it available at Fluke if they have a book store; the title would be listed. It isn't a trivial read either. Lots of great information in there. I have a copy from my stint as a calibration technician that I still refer to. I also have the CRC Chemistry book that I look things up in when I need to. It's a big, black book.

The Metrology text is pretty dry reading for most people. That book is more difficult to get through, but the concepts are explained well.

Sticker shock on the 3458A by chance? Look it up at Agilent as a new product, it's a current model too. The performance is excellent by anyone's standards. I guess you've maybe looked it over. I did use one, they are a pleasure.

The 34401A meter is the best choice for bench service work, the 3458A is a true asset better suited to a laboratory than a service bench.

-Chris
 