Question re: Optimal Gain for RIAA preamp?

This came up in discussion elsewhere and I'm wondering what the general consensus is here (if there is a consensus, that is)...

It seems to me that when you're playing your LPs, you'll want the output from your phono preamp to match the output from your digital sources (2V rms at 0dBFS).

I've read comments that anywhere around 44dB gain is 'standard' for a phono preamp meant for use with MM cartridges.

I've also read comments that a gain of about 50dB is more optimal because it will yield an output level from the phono preamp that will more closely match the output from a typical CD player or DAC.

Let's say a hypothetical MM cart puts out (the standard?) 5mV rms into a 47k ohm load at a stylus velocity of 5cm/second.

Question: What level from this hypothetical MM cartridge would correspond to 0dBFS from our standard digital source? Is it possible to come up with a number that would help us decide on an optimal gain from our phono preamp?

I've read that the typical maximum peak levels from an LP are about +10dB (3.16x) above that 'average' level. I think that means our hypothetical 5mV rms @ 5cm/sec cartridge will put out about 16mV rms on those +10dB peaks.
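Just to make that arithmetic explicit, here's a minimal sketch (the 5mV nominal output and the +10dB peak figure are the assumptions from above):

```python
# Cartridge output on +10dB peaks, given 5mV RMS nominal at 5cm/s.
nominal_mv = 5.0                       # mV RMS at 5cm/s (assumed above)
peak_headroom_db = 10.0                # assumed peak level over nominal

ratio = 10 ** (peak_headroom_db / 20)  # +10dB as a voltage ratio, ~3.16x
print(f"+10dB = {ratio:.2f}x, peaks = {nominal_mv * ratio:.1f}mV RMS")
# -> +10dB = 3.16x, peaks = 15.8mV RMS (call it 16mV)
```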

I've found for myself that when I'm using my Denon DL110 HOMC (rated at 1.6mV output by the mfg, but with no other qualifier for that spec), and using that with my Schiit Mani preamp set to 48dB gain, the perceived output from my records matches the output from CDs pretty closely.

If I use a Shure M35X cartridge, which is rated at 6mV rms output at 5cm/second, then setting the gain on the Mani preamp to 42dB yields just slightly louder perceived levels from my system than when playing CDs at the same setting of my volume control.

Now, with those observations in mind...

The Schiit Mani specs say that with its gain set to 48dB (for HOMC carts):
Sensitivity = 1.3mV for 300mV output

With its gain set to 40dB (for standard MM carts):
Sensitivity = 2.3mV for 300mV output

(Worth noting: 300mV / 2.3mV is about 130x, which is roughly 42dB at 1kHz, so the '40dB' setting is really closer to 42dB.)

If recorded peaks are going to be +10dB (3.16x) above the 5mV nominal level, then the peak output from this 'standard MM cart' would be about 16mV.

16mV amplified by 42dB (126x) = 2.016V

If that's all correct, then a cartridge rated at 5mV output at 5cm/second (1kHz) run with a preamp providing +42dB gain will yield our hoped-for 2V peak to match the output at 0dBFS from CD/DAC.
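Here's that chain as a quick sketch, deriving the actual 1kHz gain from the Mani's published sensitivity spec rather than its nominal '40dB' label:

```python
# Derive the Mani's real 1kHz gain from its sensitivity spec, then
# check the peak output for the 5mV cartridge on +10dB peaks.
import math

sens_in_mv, sens_out_mv = 2.3, 300.0   # the '40dB' MM setting, per the spec
gain_x = sens_out_mv / sens_in_mv      # ~130x
print(f"actual gain: {gain_x:.0f}x = {20 * math.log10(gain_x):.1f}dB")  # ~42.3dB

gain_42db = 10 ** (42 / 20)            # ~126x
print(f"16mV peaks x {gain_42db:.0f} = {0.016 * gain_42db:.3f}V")
# -> ~2.01V, essentially the 2V we're after
```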

(I see that the Grado Prestige cartridges all have an output voltage spec of 5mV/5cm/second)

The oft-cited 'standard' 44dB of gain from a typical MM preamp would yield a peak output of about 2.5V, which is only slightly 'louder' (about +2dB) than 2V peak. That should be fine.

If I try that logic working from the 1.6mV spec for the Denon DL110 HOMC:

1.6mV × 251 (48dB gain) = 0.40V average level
0.40V average × 3.16 (+10dB peaks) = 1.27V peaks

That's about 4dB below our CD/DAC output, but I think close enough.

50dB of gain from the phono preamp would make the max output 1.60V, or about 2dB below the target 2V (not a readily noticeable difference).

It would take 52dB gain (x400) to make that 1.6mV/5cm/sec rated HOMC yield 2V output on +10dB peaks.
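The same arithmetic for the 1.6mV cartridge at the candidate gains, as a sketch (2V RMS at 0dBFS is the target from above):

```python
# Peak output vs. the 2V CD/DAC reference for a 1.6mV @ 5cm/s cartridge.
import math

peak_in_v = 0.0016 * 10 ** (10 / 20)   # +10dB peaks: ~5.06mV
for gain_db in (44, 48, 50, 52):
    out_v = peak_in_v * 10 ** (gain_db / 20)
    print(f"{gain_db}dB -> {out_v:.2f}V peaks "
          f"({20 * math.log10(out_v / 2.0):+.1f}dB vs 2V)")
# 44dB -> 0.80V (-7.9dB), 48dB -> 1.27V (-3.9dB),
# 50dB -> 1.60V (-1.9dB), 52dB -> 2.01V (+0.1dB)
```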

Am I on the right track here?

If so, since I really like the Denon DL110, I think I'd design my phono preamp to yield 48 to 52dB of gain.
--
 
That's exactly what I've done

http://www.diyaudio.com/forums/analog-line-level/306765-tgmc-modular-control-pre-amplifier.html

I'm setting the gain of the phono amp so that it provides a higher level of output than was traditional many years ago. I don't think I've tweaked the gain to final values but my simulations are pointing to around 50dB too.

What you have to take care of is adequate headroom; you don't want a hotter new cartridge or nasty clicks and pops saturating your amp.

You know this already, but I had to remind myself that the gain of an RIAA stage is not uniform with frequency, so I pick 1kHz as the benchmark point; most texts about phono amps quote their gain at that frequency.
 
Right, I should have mentioned that I was talking about gain at 1kHz, not at other frequencies.
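For anyone following along, the reason the frequency has to be stated is that the RIAA playback curve swings roughly ±20dB across the audio band. A sketch from the standard 3180µs/318µs/75µs time constants:

```python
# RIAA playback (de-emphasis) response relative to 1kHz, computed from
# the standard time constants: 3180us, 318us, 75us.
import math

T1, T2, T3 = 3180e-6, 318e-6, 75e-6

def riaa_db(f):
    w = 2 * math.pi * f
    mag = math.hypot(1, w * T2) / (math.hypot(1, w * T1) * math.hypot(1, w * T3))
    return 20 * math.log10(mag)

ref = riaa_db(1000)                    # normalize: 1kHz = 0dB
for f in (20, 100, 1000, 10000, 20000):
    print(f"{f:>5}Hz: {riaa_db(f) - ref:+6.1f}dB")
# -> about +19.3dB at 20Hz and -19.6dB at 20kHz relative to 1kHz
```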

Agreed that 50dB gain at 1kHz seems to be a kind of sweet spot. The inexpensive Mani bears this out, since I'm getting good results from the DL110 with it set to 48dB gain.

I've been trying out ideas using vacuum tubes, which will give me more voltage headroom than FETs or op-amps. One of the trade-offs with high-gain triodes is high input capacitance, thanks to the Miller effect. Fortunately, if I stick to HOMC cartridges like the DL110, their much lower coil inductance won't make any nasty response peaks even running into a 12AX7 or similar (which can present upwards of 300pF of input capacitance).
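To illustrate why the low coil inductance helps, here's a sketch of the electrical resonance formed by cartridge inductance against input capacitance; the L values here are illustrative assumptions, not measured specs for the DL110 or any particular MM cartridge:

```python
# Electrical resonance of cartridge inductance L against preamp input
# capacitance C: f = 1 / (2*pi*sqrt(L*C)). Values are illustrative only.
import math

C_in = 300e-12                         # assumed triode input capacitance, 300pF

for name, L in (("MM, assuming ~500mH", 0.5), ("HOMC, assuming ~100uH", 100e-6)):
    f_res = 1 / (2 * math.pi * math.sqrt(L * C_in))
    print(f"{name}: resonance near {f_res / 1000:.0f}kHz")
# The MM example resonates near 13kHz, right in the audio band;
# the HOMC example lands near 1MHz, far above it.
```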

I need to do more soldering and less planning!
--
 
That's what cascodes are for.

Yup, and there are a couple of circuits that would be fun to try.

One promising cascode idea is in Merlin Blencowe's high-fidelity preamps book, using an LSK170 N-channel JFET as the bottom device with a frumpy, unloved 12AU7 (ECC82) as the top device.

A nice looking finished design is kevinkr's Muscovite Mini III, using a 6DJ8 cascode for the first stage and a 6S3P triode mu-follower as the output stage.

The crazy thing is that in simulation, I get the best looking results from an updated RCA-style circuit: 12AX7, 12AX7, then a 12AU7 cathode follower. Since the whole circuit's plate supply needs only 300V at 10mA, and the total heater supply only 12VDC at 450mA, all the supplies would be very cheap and easy to regulate.

--
 
Before the CD player changed the output voltage of sources, most preamps expected roughly 750mV from a line-level source (the input sensitivity specification) to produce rated output. So you can use that as the basis for calculating how much more gain gets you closer to the CD standard.

The exception would be some UK and/or European gear, where 450~500mV or so was good for full output (e.g. the QUAD 33 preamp). But that's rarely encountered.

A fair number of disc players and DACs actually measure above the 2V level, so you may still have a level change when switching inputs, or maybe you could add just a smidgen of gain beyond what's needed for 2V.
 
Before the CD player changed the output voltage of sources, most preamps expected roughly 750mV from a line-level source (the input sensitivity specification) to produce rated output. So you can use that as the basis for calculating how much more gain gets you closer to the CD standard.

Hey, thank you for this! That's a good point. But I have a question...

Does your statement above translate to 'most preamps would expect roughly 750mV [RMS?] from a line level source to produce full output from the power amp'?

Would that be with the stylus moving at 5cm/second? If so, that's the 'nominal output' from the cartridge.

If that is 750mV RMS, and I want 2V RMS to produce full output (clipping) from the amplifier, then I need 2.67 times the gain of a more traditional phono preamp.

If I take a standard phono preamp's gain to be 44dB input-to-output, and since 2.67x the voltage gain is roughly +8.5dB, the gain I'm shooting for is a whopping 52.5dB. That seems really high. I suppose that means a phono preamp with +49.5dB should be close enough. Maybe +48dB would be enough.
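That step is easy to check with a couple of lines (a sketch of the same arithmetic):

```python
# Extra gain needed to move the reference level from 750mV to 2V.
import math

ratio = 2.0 / 0.75                      # ~2.67x
extra_db = 20 * math.log10(ratio)       # ~+8.5dB
print(f"{ratio:.2f}x = {extra_db:+.1f}dB, so 44dB becomes {44 + extra_db:.1f}dB")
# -> 2.67x = +8.5dB, so 44dB becomes 52.5dB
```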

Since the output voltage from various cartridges varies so widely, even among MM and HOMC types, there's no use in trying to get things to match really tightly. Just 'in the ballpark' is plenty close enough.
--
 
Thanks, I forgot about that one. KAB has lots of good stuff on his site. According to that calculator, if the output is 1.6mV rms @ 5cm/second, gain of 46dB will yield 325mV rms out from the preamp. Why is 325mV an 'optimal' output? Well...

"The optimum gain is based on achieving 325mV rms output at 5 cm/s. For the current crop of CD recorders, 300mV is required for 0dB recording level with the recorder's level control set at max. Aiming for 325mV gives a little margin."

What if I don't care about a CD recorder? What if what I really care about is subjective level matching between the output from a CD player and the output from my phono preamp?

It looks like there's a relationship between the compliance of the stylus/cantilever/suspension and the maximum stylus deflection that can be achieved, so unfortunately I may be asking a question that's too complex for me to figure out by myself.

So far, the KAB calculator recommends 46dB of gain, while about 49dB looks like it will get the output into the ballpark of a CD player. That agrees with my experience pairing the Denon DL110 (1.6mV rms @ 5cm/s) with the Schiit Mani set to 48dB of gain; the output from that setup is close enough to the same recording on CD.
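Putting numbers on that comparison (a sketch, using the same assumptions as the calculations earlier in the thread):

```python
# Compare the KAB recommendation against the CD-matching guess for a
# 1.6mV @ 5cm/s cartridge, with +10dB peaks against the 2V reference.
import math

nominal_v = 0.0016
for gain_db, label in ((46, "KAB recommendation"), (49, "CD-matching guess")):
    nom_out = nominal_v * 10 ** (gain_db / 20)
    peak_out = nom_out * 10 ** (10 / 20)
    print(f"{gain_db}dB ({label}): {nom_out * 1000:.0f}mV nominal, "
          f"{peak_out:.2f}V peaks ({20 * math.log10(peak_out / 2):+.1f}dB vs 2V)")
# -> 46dB: ~319mV nominal, 1.01V peaks (-5.9dB vs 2V);
#    49dB: ~451mV nominal, 1.43V peaks (-2.9dB vs 2V)
```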
--
 