testing speakers and voltmeters..

Having found that my voltmeter wasn't sensitive enough to measure small changes in resistance (grrr..), I'm about to get a new one.

The new meter has a 2 V scale that my old one didn't, which is good, but now I'm unsure how to calibrate it to do impedance runs.

I'm following the test setup by Weems, which recommends calibrating with a 10 ohm 1% resistor to 10% of a 1 V scale. Does this mean that, as I'll be using a 2 V scale, I calibrate it to 0.2 V and then read each 0.2 V change as a 1 ohm change? Or am I completely wrong?
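Here's my attempt to sanity-check the scaling in Python; a rough sketch only, assuming the Weems constant-current setup where the meter voltage is proportional to the driver's impedance:

```python
# Sanity check of the calibration arithmetic, assuming a constant-current
# setup where the voltage read across the driver scales with impedance.
# The calibration values are the ones from Weems' procedure.

R_CAL = 10.0  # calibration resistor, ohms (1%)

def sensitivity(v_cal, r_cal=R_CAL):
    """Volts per ohm when the drive is set so r_cal reads v_cal."""
    return v_cal / r_cal

for scale, v_cal in [("1 V scale", 0.1), ("2 V scale", 0.2)]:
    mv_per_ohm = sensitivity(v_cal) * 1000
    print(f"{scale}: calibrated to {v_cal} V -> {mv_per_ohm:.0f} mV per ohm")

# Output:
# 1 V scale: calibrated to 0.1 V -> 10 mV per ohm
# 2 V scale: calibrated to 0.2 V -> 20 mV per ohm
```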
 
You will get far better accuracy using your DMM as a voltmeter and measuring the current via the voltage drop across an accurate reference resistor. Then measure the voltage drop across your DUT resistor. With care you can approach an accuracy of 0.2% and a sensitivity better than 0.1%.
This particularly applies as the resistance falls below 1k0, and it is the only way we poorly equipped amateurs can measure below 10R0.
Try measuring 0R22 to better than 1% with the DMM on its 200 mV DC range, and then try the same with its 200 ohm range.
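As a rough sketch of the arithmetic (the reference value and meter readings below are illustrative, not measured):

```python
# Ratiometric resistance measurement: the same current flows through the
# reference resistor and the DUT in series, so the ratio of the two
# voltage drops gives the DUT resistance directly.

R_REF = 10.0  # reference resistor, ohms (the tighter the tolerance, the better)

def dut_resistance(v_ref, v_dut, r_ref=R_REF):
    """R_dut = r_ref * v_dut / v_ref, since the series current is shared."""
    i = v_ref / r_ref   # current through the series pair
    return v_dut / i    # equivalent to r_ref * v_dut / v_ref

# Hypothetical readings: 150.0 mV across the reference,
# 3.3 mV across a nominally 0R22 DUT.
print(dut_resistance(v_ref=0.1500, v_dut=0.0033))  # -> 0.22 ohms
```

The accuracy now rests on the reference resistor's tolerance and the DMM's linearity, not on its absolute calibration, which is why it beats the meter's own ohms range at low resistances.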
 