Measuring phono stage RIAA accuracy with a computer

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.

PRR

Member
Joined 2003
Paid Member
Most of this thread is Off-Topic.

helma is "..modifying a cheap commercial 3-transistor phono stage as a little hobby..", not building a super reference preamp.

_THE_ phono playback standard is a dynamic pickup into a specified load, then to an amp with a specified (RIAA/IEC) response curve. Cartridge response is the *cartridge maker's* problem, slightly aided by recommendation of load C (and R for non-47K interface). Dynamic correction of mechanical nonlinearity is totally outside the amplifier's job; that is too big for 3 transistors or even a small pile of generic chips.

I'm baffled by using hiss to measure frequency response. That's like using pebbles to measure a distance, and assuming average pebble size is one inch. Frequency response is measured with sine tone. Yes, with heaps of averaging the two methods approach the same result. Hiss is massively valuable in grossly nonlinear systems such as speakers, rooms, and some RF mixers. Sine is far cleaner for a linear system.

Trying to measure over a 40dB (100:1) range with flat tools is dubious precision. You at least need precision attenuators (gain set), something no longer found in most workshops. I have done it with a Boonton meter, but you *really* need to compensate on the input, not after the amplifier under test. Yes, the high-end signal generators have somewhat trustworthy calibrated attenuators, but are no longer common hobby tools.

THE way to do it is with a Reverse RIAA network. We can easily buy 1% parts now, giving better than 0.1dB precision. Unless all your several parts are 1% off the wrong way, the interactions of a 2R2C network averages the error to typically <0.05dB. While a pile of 1% parts costs more than a buck, it is not a Heavy Investment, and can be used on all future phono projects.
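A rough sanity check on that tolerance claim (a sketch only: it treats each RIAA time constant as the product of two 1% parts, so up to about 2% off, and looks at the resulting dB error of the ideal playback curve):

```python
import math
import random

T_NOM = (3180e-6, 318e-6, 75e-6)  # RIAA playback time constants, seconds

def riaa_db(f, T):
    """Ideal RIAA playback magnitude in dB at f Hz for time constants T."""
    T1, T2, T3 = T
    w = 2 * math.pi * f
    mag = math.hypot(1, w * T2) / (math.hypot(1, w * T1) * math.hypot(1, w * T3))
    return 20 * math.log10(mag)

def worst_deviation(tol=0.02, trials=500, seed=1):
    """Worst dB error (20Hz-20KHz, re 1KHz) over random tolerance draws."""
    rng = random.Random(seed)
    freqs = [20 * 10 ** (i / 20) for i in range(61)]  # 20Hz..20KHz, log spaced
    worst = 0.0
    for _ in range(trials):
        # perturb each time constant by up to +/-tol
        T = tuple(t * (1 + rng.uniform(-tol, tol)) for t in T_NOM)
        ref = riaa_db(1000, T) - riaa_db(1000, T_NOM)  # renormalize at 1KHz
        dev = max(abs(riaa_db(f, T) - riaa_db(f, T_NOM) - ref) for f in freqs)
        worst = max(worst, dev)
    return worst

print(f"worst-case deviation over {500} draws: {worst_deviation():.3f} dB")
```

Random draws, rather than worst-case stacking, are why typical builds land well inside the naive tolerance sum; that is the averaging effect described above.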

With r-RIAA network into preamp, the problem is to measure a near-unity gain over the audio band. This is FAR simpler and more precise than gaining-up/down 40dB. Even the $10 needle multimeter of 1962 will do fine 50Hz-10KHz. Most VTVMs can be trusted down near needle-flutter and up past 50KHz. We have gone backward since the VTVM: some DMMs fall off above 400Hz and this must be checked.

You do not need a dense sweep to check RIAA response. Spot-checks suffice. Absent some major amplifier failing, you mostly check 150Hz, 1KHz, and 6KHz. The big resistor dominates below 50Hz. The big cap has most effect at 150Hz. The ratio of the two caps sets the 150Hz:6KHz balance. The small resistor trims the response at 1KHz. Errors above 10KHz can be many things: poor gain-bandwidth, series resistors "needed" for stability, etc. Errors around 50Hz are tough because IMHO a good phono preamp should fall by 20Hz, and some plotting may be needed.
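For reference, the ideal RIAA playback curve at those spot frequencies (relative to 1KHz) is easy to tabulate from the standard 3180/318/75 microsecond time constants; a minimal sketch:

```python
import math

T1, T2, T3 = 3180e-6, 318e-6, 75e-6  # RIAA playback time constants, seconds

def riaa_db(f, ref=1000.0):
    """Ideal RIAA playback response in dB, relative to the 1KHz reference."""
    def mag(fx):
        w = 2 * math.pi * fx
        return math.hypot(1, w * T2) / (math.hypot(1, w * T1) * math.hypot(1, w * T3))
    return 20 * math.log10(mag(f) / mag(ref))

for f in (20, 50, 150, 1000, 6000, 20000):
    print(f"{f:>6} Hz: {riaa_db(f):+7.2f} dB")
```

The 20Hz and 20KHz endpoints come out at about +19.3dB and -19.6dB, matching the published RIAA tables, so the spot-check targets can be read straight off this.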

You don't put a cartridge in front for basic amplifier tests. This is the *cartridge maker's* problem, in conjunction with loading suggestions. You _do_ want to check that your input Z is 47K||C down to tonearm resonance (a few Hz) up past 20KHz. For 47K loading the simple thing is to do a test with r-RIAA network plus 47K (minus r-RIAA Rout) and see that the response falls to half, and a bit more at 20KHz with high C.
 
Thanks for clearly stating the role of the cartridge in this kind of testing!

Agreed with all of the above, and that's exactly how I would (and did) do it about 20 years ago.

Today we have computers, and they all have sound cards, some even pretty good ones. We have free software capable of better measurements than any instrument in the above, given calibration has been done, which in some cases at least, is an automatic routine. I guess if I were going into production on a phono pre I'd build up an r-RIAA block, but for the one-off, or even 20-off, the software solution is fast, cheap, and highly accurate. The RIAA (not r-RIAA) is entered as data (a "target curve"), the result of single frequency or log swept measurements is normalized against that, and in literally seconds you have a full picture of the exact deviation from the ideal with fractional-dB precision. If your sound card will do 16/44 you have plenty of dynamic range for an RIAA eq test. If the sound card will do 24/192 you'll exceed the capabilities of most vintage analog test gear by quite a respectable bit.
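That normalize-against-a-target step is simple enough to show in a few lines. A sketch (not REW itself; the DUT here is a hypothetical preamp whose 75us network is 3% high):

```python
import math

def curve_db(f, T):
    """Magnitude in dB of an RIAA-shaped curve with time constants T at f Hz."""
    T1, T2, T3 = T
    w = 2 * math.pi * f
    return 20 * math.log10(
        math.hypot(1, w * T2) / (math.hypot(1, w * T1) * math.hypot(1, w * T3)))

T_IDEAL = (3180e-6, 318e-6, 75e-6)       # the RIAA target curve
T_DUT = (3180e-6, 318e-6, 75e-6 * 1.03)  # hypothetical DUT: 75us network 3% high

spots = (50, 150, 1000, 6000, 20000)
# "measured" sweep and target, both normalized to 0 dB at 1KHz
measured = {f: curve_db(f, T_DUT) - curve_db(1000, T_DUT) for f in spots}
target = {f: curve_db(f, T_IDEAL) - curve_db(1000, T_IDEAL) for f in spots}

# normalizing against the target leaves just the RIAA deviation
deviation = {f: measured[f] - target[f] for f in spots}
for f in spots:
    print(f"{f:>6} Hz: {deviation[f]:+.3f} dB")
```

The 3% time-constant error shows up as roughly -0.2dB at 20KHz, comfortably within the fractional-dB resolution claimed above.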

And, like I said, if you already own the PC/Mac computer, REW is free, and others are free demos or very low cost. I happen to like and use REW, but also own several other apps that could do this too. At the most you need a passive attenuator to get sound card line level down to phono cart level. If there's any cut-and-try design involved, you want to get your answers accurately and quickly. Free is just the bonus.

One comment: using calibrated noise to measure FR is surprisingly accurate, and has several advantages, particularly in systems that change dynamically with frequency, because noise stimulates the entire spectrum at once, where pure tones do not. It's not a big deal for RIAA curve verification, but it works fine for that too. Smoothing is just because a raw FFT is impossible to look at without it, but the real key is temporal integration, which you do anyway with any analysis. Setting an RTA for infinite averaging takes care of this, and precision is as good as any other method.
 
Member
Joined 2014
Paid Member
Most of this thread is Off-Topic.

helma is "..modifying a cheap commercial 3-transistor phono stage as a little hobby..", not building a super reference preamp.
Agreed, but threads on ways to measure this are always useful, even if they just tease out misunderstandings of Scott's articles!
_THE_ phono playback standard is a dynamic pickup into a specified load, then to an amp with a specified (RIAA/IEC) response curve. Cartridge response is the *cartridge maker's* problem, slightly aided by recommendation of load C (and R for non-47K interface). Dynamic correction of mechanical nonlinearity is totally outside the amplifier's job; that is too big for 3 transistors or even a small pile of generic chips.

This I agree with less. Many cartridges have a +3/-1dB FR specification into the specified load. Pointless as it may be, I might want better, but to get better you need to understand what is going on. Outside the scope of this thread, yes, but important in terms of driving home the point that 0.1dB RIAA accuracy is only of use if your cartridge can deliver to the same order of magnitude. A point I think we are all in violent agreement on.
 
Member
Joined 2014
Paid Member
Sorry. Unsubstantiated myth is my hot button.
There is not enough information about that graph to conclude anything. Assumption about vertical scale...well, it's probably ok but who knows? Test conditions are undefined, and it seems to indicate a significant change in the cart for each of the traces. Not enough info.

I'm not on board with the "harmless" part if mythology is perpetuated. If it's fact, I'll be 100% behind it.

What Myth? To be a myth it has to have been around for years and mentioned all over the place. I suspect you first read it here a couple of days ago...

Who knows? Well, I do, as I have other Ortofon graphs from doing a little research, so I can get the scale from those where it is marked. I never said it was enough information; it's just a starting point to show that magnetic non-linearities do exist and cartridge manufacturers know about them. And Ortofon still make solid pole cartridges.

I think those graphs show something that is interesting and worth some investigation as a hobbyist. You don't. However you have not provided a counter claim of any sort. So you are also in the 'unsubstantiated myth' group by your own reasoning.

So as to not pollute this thread any more as it has useful info and a still good SNR I will open a new thread to discuss cartridge non-linearities where data can be presented and ideas discussed.
 
What Myth? To be a myth it has to have been around for years and mentioned all over the place.
Well, that's not true. There's no minimum time of existence for a myth to be qualified as such.
I suspect you first read it here a couple of days ago...
Yup. So what? I hear new and disturbing myths all the time, and for some odd reason they seem particularly prevalent in audio. Always wondered why. The "green CD marker" myth was only a few weeks old when I first heard it, and it more than qualified as a myth even then.
Who knows? Well, I do, as I have other Ortofon graphs from doing a little research, so I can get the scale from those where it is marked. I never said it was enough information; it's just a starting point to show that magnetic non-linearities do exist and cartridge manufacturers know about them. And Ortofon still make solid pole cartridges.

I think those graphs show something that is interesting and worth some investigation as a hobbyist. You don't.
Sorry if you got that impression. I do find graphs interesting, as long as they are actual graphs with legends, defined graduations and all. Unscaled graphs are a tease at best. Graphs without test conditions and scales are works of art, but that's about all they're good for. If it's worth graphing, it's worth scaling the data...or don't bother. Any science student that submitted an unscaled graph of undefined conditions would receive a failing grade. Not sure why this should be a passing grade.
However you have not provided a counter claim of any sort. So you are also in the 'unsubstantiate myth' group by your own reasoning.
Lack of negative proof is always a problem. I cited a personal test, but of course, that's meaningless to anyone but me and others that experienced it. However, it's my reason for doubting the myth. If a myth can't be proven true, do we also have to prove it's false?

My reasons for doubt are:

1. Claims made without support of
a. test data
b. technical paper or research
c. scaling the magnitude claim
2. My career in audio has never intersected this myth despite
a. testing many carts/preamps using test gear and test records
b. My personally conducted tests mastering/pressing vinyl and comparing to the master while having the entire chain under control
3. My questions and objections still go without scientific response


So as to not pollute this thread any more as it has useful info and a still good SNR I will open a new thread to discuss cartridge non-linearities where data can be presented and ideas discussed.
I look forward to actual data.
 

PRR

Member
Joined 2003
Paid Member
...Many cartridges have a +3/-1dB FR specification into the specified load. ...

So? Is that the preamp's problem? Do we have a different preamp for every cartridge? Why stick the chore on the preamp? Perhaps an EQ in the line amp. Or pick speakers with complementary flaws?

Yes, all of this has been done for all-in-one players. Screechy needles with dull speakers and vice versa.

But the hi-fi tradition is modular, each part has its job, and may be interchanged with other brands/models (supporting the trade-in racket at the hi-fi dealer).

.....0.1dB RIAA accuracy is only of use if you cartridge can deliver to the same order of magnitude. ....

The cartridge maker does have(*) his own problems, for sure. However the wise designer, falling short of perfection, picks his flaws cunningly for "best sound". Since no two people agree on "best", this also allows many models to sell OK. Since the same person's "best" (or budget) may change over time, this allows "upgrades". Tired of 2dB suckage @7KHz? Try 1dB hot @13KHz!! Only $400 more!!

(*)Apologies to any female cartridge designers.
 
So as to not pollute this thread any more as it has useful info and a still good SNR I will open a new thread to discuss cartridge non-linearities where data can be presented and ideas discussed.
I think that's a good idea, Bill.

At any given playback level, a cartridge's overall f-response is a composite of several separate parts: an electromagnetic generator (typically hf loss/roll-off), a mechanical 'tip' resonant system (centred 8kHz-30kHz) that sometimes props up the generator hf roll-off, an electrical resonant LCR impedance system (10-20kHz) that also sometimes props up generator roll-off in MM/MI carts, and a lf spring-mass elastomer system. There's also a contribution from geometric THD and mistracing distortion that can influence f-response somewhat. And other contributing elements besides.

When combined, the well-known f-response curve results: lf peak, mid-dip, resonant peak and hf roll-off. Nominally +3dB/-1dB at normal test levels. However, each of the component parts has losses and non-linearities which are level sensitive, and the losses of many of them are slew-rate sensitive. It's slew-rate, rather than 'frequency' per se, which is mostly at issue.

I first encountered significant level-sensitive parameters about 6 years ago in collaboration with David Laloum, trying to 'perfectly' terminate MM carts, where cartridge inductive behaviour, and hence the generator (not least LCR resonant f and Q), apparently changed significantly dependent upon test level for some cartridges. Especially for very small signals. I still have the charts from that work, and stand by it: it is very probably real and a part of vinyl sound.

Exploration of theoretical loss mechanisms in each of the contributing elements to f-response reveals many and various level-sensitive non-linearities. Some are slew-rate specific, or displacement specific. Some increase with level, some decrease.

One might surmise the overall result is probably a level-dependent series of contours of f-response, typically within a bracket of +/- 3dB at the mid-dip, over the decades of normal playback level range where the most audible effects happen. AFAIK there is no single test record to confirm.

IMO the audible result is probably then a big part of vinyl sound - certainly something one doesn't get from digital sources, and only to an extent from tape.

LD
 
I first encountered significant level-sensitive parameters about 6 years ago in collaboration with David Laloum, trying to 'perfectly' terminate MM carts, where cartridge inductive behaviour, and hence the generator (not least LCR resonant f and Q), apparently changed significantly dependent upon test level for some cartridges. Especially for very small signals. I still have the charts from that work, and stand by it: it is very probably real and a part of vinyl sound.

LCR equalizers can do the same thing. One wonders how many engineers put that into their bag of tricks?
 
Many cartridges have a +3/-1dB FR specification into the specified load. Pointless as it may be, I might want better, but to get better you need to understand what is going on. Outside the scope of this thread yes, but important in terms of driving home the point that 0.1dB RIAA accuracy is only of use if you cartridge can deliver to the same order of magnitude. A point I think we are all in violent agreement on.

Well, I don't agree with that. As it is relatively easy to make an amplifier with good RIAA accuracy and apparently much more difficult to make cartridges with a flat response, I would want the amplifier's inaccuracy to be negligibly small compared to the cartridge's response errors, so the accuracy of the combination is set by the cartridge. 0.1 dB seems like a perfectly reasonable target to me when the cartridge does +3 dB/-1 dB.

Similarly, I like the distortion of my power amplifier to be at least a decade below the distortion of the loudspeakers, so not of the same order of magnitude.
 

PRR

Member
Joined 2003
Paid Member
I think that's a good idea, Bill. <snip!>

And yet this thread *continues* to be flooded with hobby-horses which avoid the root question "Measuring phono stage RIAA accuracy".

(I omit "using a computer" because I submit you use whatever tool works best, which is NOT always a computer, and any calibrated-stick numbers can always be entered into a (computer?) spreadsheet for graphing.)
 
My method is in post #38. It requires the measuring device to be able to accurately handle +/-20 dB level differences. If that's a problem, you could replace the fixed 40 dB attenuator with one that has a few settings. The attenuator is just a home-made resistive voltage divider; 0.1 % accurate resistors are readily available nowadays.
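A sketch of that divider arithmetic (the component values here are illustrative round numbers, not a recommendation; a real build would combine standard E96 values):

```python
import math

def divider_db(r_top, r_bottom):
    """Attenuation in dB of an unloaded resistive voltage divider."""
    return 20 * math.log10(r_bottom / (r_top + r_bottom))

# 9.9K over 100R is exactly 100:1 in voltage, i.e. -40 dB
nominal = divider_db(9900.0, 100.0)
print(f"nominal: {nominal:.4f} dB")

# worst case with 0.1% resistors: the two errors pull in opposite directions
worst = divider_db(9900.0 * 1.001, 100.0 * 0.999)
print(f"worst-case 0.1% error: {worst - nominal:+.4f} dB")
```

A low-value bottom resistor also keeps the divider's output impedance low, so it does not disturb the 47K input loading of the stage under test.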
 