Oscilloscope results differ with volt/div settings

Hi

Don't think this has been covered in another post.

I am new to using scopes and have just got hold of one (Philips PM3217)

I have been testing a filter circuit, trying to get results for the corner frequency.

I found that I got different frequency results with differing volts/div settings.

With a higher volts/div setting I got about 4.4 kHz, which matched my SPICE result. With a lower volts/div setting I got 3.8 kHz.

I know there's not enough info here for a full answer, but what I'm wondering is whether there is a generally accepted issue with readings taken at different volts/div settings.

My gut instinct is that the lower the v/div setting (and therefore the larger the signal image on the screen) the more accurate the reading.

Am I on the right lines with this? (If so, I need to alter some cap values!)

Also, I should say I am using a x10 probe, and I found that the lower I set the volts/div, the less clear the trace on the screen - it got a bit fuzzy... so I'm wondering whether this has some bearing on it too...

Any help would be great.

Thanks.
 
As your scope is rather old, I'd guess the original calibration may have drifted over the years, and not every setting or channel is necessarily affected equally.
In the input attenuator there should be adjustable trimmer capacitors that affect the signal response for the corresponding position of the V/div range switch; you may have to readjust them with reference to your scope's manual. There may also be timebase inaccuracies, which would affect your frequency readings on every voltage range.
 
I agree that the waveform should almost fill the screen if you want to estimate the amplitude visually. If the x10 amplifier is being switched in and out, I can well believe that the gain changes by a couple of percent. The vertical accuracy of the PM3217 is specified as +/- 3%, or +/- 5% with the x10 amplifier (a long time ago).
I guess that you haven't got an AC voltmeter with reasonable accuracy to use instead.
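To put a rough number on it, here is a little sketch (my own illustration, assuming the filter is a simple first-order low-pass and that the only error is a gain mismatch between the two V/div ranges - neither of which we actually know from the post) of how far a few percent of gain error can move the apparent -3 dB frequency:

```python
# Sketch only: assume the filter is first-order and the only error is a gain
# mismatch between the V/div range used for the passband reference and the
# range used while hunting for the corner. Neither assumption comes from the
# original post.
import math

def apparent_fc_shift(gain_error):
    """gain_error = 0.03 means the second range reads 3% high relative to
    the range used for the passband reference."""
    # You think you've hit -3 dB when the displayed level is 1/sqrt(2) of the
    # passband reading, i.e. when the true response satisfies
    # (1 + gain_error) * |H(f)| = 1/sqrt(2).
    h = (1 / math.sqrt(2)) / (1 + gain_error)
    # First-order low-pass: |H(f)| = 1/sqrt(1 + (f/fc)^2)  ->  solve for f/fc.
    return math.sqrt(1 / h ** 2 - 1)

for err in (0.01, 0.03, 0.05):
    print(f"{err:.0%} gain mismatch -> apparent corner at {apparent_fc_shift(err):.3f} x true fc")
```

So a few percent of channel-gain mismatch already translates into several percent of apparent corner-frequency shift, before you even add the error of reading the graticule.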
 

Can I ask a question: how do you decide from the display what the filter cut-off frequency is?

jan
 
frequency flatness or response from a scope

I do this rather differently from most users.

I set the attenuation of a stepped attenuator to exactly equal the passband gain of the amplifier. Let's suppose the amp is +30.1 dB; then the attenuator is set to -30.1 dB. The input to the attenuator will then be exactly the same as the output from the amplifier, and I can compare these two voltages with a not-very-accurate DMM.

I set the input signal to the attenuator (which feeds the amplifier under test) to a level appropriate for the output I require, say 2.83 Vac. This will show as 8 Vpp on the scope.
I can sweep the test signal frequency and see from the 8 Vpp trace when the output signal starts to drop. I can go up a further two or three octaves and watch the scope trace fall below 4 Vpp (more than 6 dB down).
I now decrease the attenuation of the stepped attenuator from -30.1 dB to -27.1 dB and adjust the test frequency until the trace reads 8 Vpp again.

What I have now is the same output voltage as the input voltage. I can use a DMM set to its 20 Vac range to measure the input and output signals. There will be a slight error, so I fine-tune the frequency until the input and the output read identically.
The amplifier is now amplifying by 27.1 dB, i.e. -3 dB relative to the passband.

I can now measure the frequency using a frequency counter.
I have the F-3dB frequency.

This method uses "compare" to get fairly accurate results even though the signal generator and the DMM are not accurate.
If I want a bit better resolution I reduce the input signal to 1.95 Vac and use the 2 Vac DMM range. I might get a 0.3% difference between input and output voltage because the smallest step in my attenuator is 0.05 dB; I would aim for the same input-to-output difference when fine-tuning the test frequency. The 2000-count reading on the 1.95 Vac signal allows very good repeatability, if that is needed.
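If anyone wants to see the arithmetic, here is a small Python sketch of the idea. The amplifier is entirely made up (+30.1 dB passband gain with a single-pole roll-off at 4.4 kHz, loosely matching the original poster's SPICE figure); the point is only that re-matching input and output after a 3 dB attenuator step lands you on the -3 dB frequency regardless of the absolute accuracy of the generator or the DMM:

```python
# A made-up amplifier: +30.1 dB passband gain, single-pole roll-off at 4.4 kHz.
# (These numbers are assumptions for illustration, not measurements.)
import math

GAIN_DB = 30.1           # passband gain of the amplifier under test
FC = 4400.0              # true corner frequency in Hz (where |H| = 1/sqrt(2))

def amp_output(vin, freq):
    """Flat gain followed by a first-order low-pass roll-off."""
    gain = 10 ** (GAIN_DB / 20) / math.sqrt(1 + (freq / FC) ** 2)
    return vin * gain

def attenuate(v, atten_db):
    return v * 10 ** (-atten_db / 20)

vin = 2.83  # Vac at the attenuator input, as in the description above

# Step 1: attenuator set equal to the passband gain -> output matches input.
print(f"passband check: {attenuate(amp_output(vin, 100.0), 30.1):.3f} V vs {vin} V in")

# Step 2: take 3 dB out of the attenuator (-30.1 dB -> -27.1 dB) and search for
# the frequency where the attenuated output equals the input again. A simple
# bisection stands in for "fine-tune the frequency until both DMM readings match".
lo, hi = 100.0, 100e3
for _ in range(60):
    mid = math.sqrt(lo * hi)
    if attenuate(amp_output(vin, mid), 27.1) > vin:
        lo = mid
    else:
        hi = mid
# Lands within a fraction of a percent of FC; the tiny offset is because a
# "3 dB" step is a ratio of 10**0.15, not exactly sqrt(2).
print(f"input == output at {lo:.0f} Hz (true corner: {FC:.0f} Hz)")
```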
 
A simple home-brew audio-band noise generator and a PC spectrum analyzer make for quick filter-function checking, even with sound cards and systems that can't handle duplex operation. You just have to be sure that your noise generator rolls off shortly above the audio band, because input filtering on cheap cards can be questionable and you might see some alias products in the display. You can get a baseline, of course, by feeding the noise directly into the card input before testing filters. It can also be done with a frequency sweep from an oscillator, but that's a lot more work.
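Here is a minimal sketch of the analysis side in Python with numpy/scipy. The "recordings" are synthesised (white noise through an assumed 4.4 kHz first-order low-pass) so the script runs without any sound card; in practice the two arrays would simply be your baseline capture and your through-the-filter capture:

```python
# Compare the averaged spectrum of the noise taken directly (the baseline)
# with the spectrum of the same noise taken through the filter, and read the
# filter response as the dB difference. Both captures are synthesised here;
# the 4.4 kHz first-order low-pass is an assumed example filter.
import numpy as np
from scipy import signal

fs = 48000                       # sample rate, Hz
t = 10                           # seconds of "recording"
noise = np.random.randn(fs * t)  # stand-in for the home-brew noise source

# Stand-in for the filter under test: first-order low-pass at ~4.4 kHz.
b, a = signal.butter(1, 4400, btype="low", fs=fs)
filtered = signal.lfilter(b, a, noise)

# Averaged spectra (Welch) of the baseline and filtered captures.
f, p_base = signal.welch(noise, fs, nperseg=4096)
_, p_filt = signal.welch(filtered, fs, nperseg=4096)

response_db = 10 * np.log10(p_filt / p_base)    # filter magnitude response
f3db = f[np.argmin(np.abs(response_db + 3.0))]  # frequency bin nearest -3 dB
print(f"estimated -3 dB point: {f3db:.0f} Hz")
```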
 
You won't find a DMM that can guarantee a flat response from 1 Hz to 500 kHz unless you go for a very good rms-reading DMM.
Even then there are tolerances involved that make for errors you can't check without other instrumentation.

But if you rely on "compare" between virtually identical sinewaves, then just about any DMM is good enough to surpass the accuracy of an rms reading meter.
 
Here is my simple method:
Turn the time/div dial to the left so that you see roughly the envelope of the signal. Adjust the V/div and the variable (uncal) control so that you get a full-screen 8 div (+/- 4 div) in the passband. Now adjust the input frequency until you get +/- 2.8 div; that gives the -3 dB frequency.
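For anyone wondering where the 2.8 div comes from, it is just the -3 dB voltage ratio applied to the 8 div full-screen trace:

```python
# -3 dB is a voltage ratio of 10**(-3/20) ~= 0.708, so the 8 div passband
# trace shrinks to about 5.66 div, i.e. roughly +/- 2.83 div about centre.
print(8 * 10 ** (-3 / 20) / 2)   # -> 2.83
```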
BTW, the -3 dB corner frequency shouldn't be treated as something mystical in any design; it is just informative data. At least, I am not interested in it to better than +/- 20% accuracy.
 
Thanks to everyone for their contributions. I will definitely check out the ARTA software and will look into decent voltmeters.

I was also happy to see oshifis' reply re. accuracy. Although I assume accuracy is dependent to some extent upon the tightness of component tolerances...

Furthermore, I set the bandwidths of the filters according to somewhat arbitrary figures. I need to get it into real life to find out just how useful the design is!

Thanks again guys.
 