I have been working on the amplifier mentioned in some other posts (http://www.astro.uu.se/~marcus/private/m250.html) and I am running into a few questions. Basically I just need to know what the standard signal levels for RIAA are, i.e., what would come out of the line out of my home stereo.
Out from the RIAA amp, you mean? 300-1000 mV. Line level is a floating value, I have noticed. It also depends on the pickup.
My monster RIAA amp has a gain of 1848 = 65 dB at 50 Hz. This level is OK for my pickup (Ortofon FF15) and preamp.
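(As a quick sanity check on that figure, voltage gain in dB is 20·log10(ratio); the snippet below is just a sketch using the number quoted above.)

```python
import math

gain = 1848                            # voltage gain quoted above for the RIAA stage at 50 Hz
gain_db = 20 * math.log10(gain)        # voltage ratio -> decibels
print(f"{gain}x = {gain_db:.1f} dB")   # ~65.3 dB, i.e. the ~65 dB figure
```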
There aren't any RIAA standards for levels; it's all down to how much level the cutting engineer is prepared to put on disc. Higher levels reduce recording time and risk mistracking on subsequent playback. Conversely, lower levels reduce signal-to-noise ratio, but allow you to cram Beethoven's 9th on one LP. Typical (equalised) peaks reach 12 dB over 5 cm/s.
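To put a number on that last figure, here is a rough sketch of what 12 dB over a 5 cm/s reference works out to as a peak groove velocity (assuming the dB figure is a straight velocity ratio):

```python
ref_velocity_cm_s = 5.0     # reference groove velocity
peak_headroom_db = 12.0     # typical equalised peak level quoted above

# dB -> linear ratio for an amplitude-like quantity: ratio = 10^(dB/20)
peak_velocity_cm_s = ref_velocity_cm_s * 10 ** (peak_headroom_db / 20)
print(f"Peak groove velocity is roughly {peak_velocity_cm_s:.1f} cm/s")   # ~19.9 cm/s
```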
Clarification
OK, so either I'm not getting the picture or I didn't make my question clear enough... I have built the amp that I mentioned in my previous post. It takes a 4.4 Vp-p input signal to make the output of the amplifier start clipping. I have learned that my CD player doesn't even come close to that; it's more in the range of 500 mVp-p. Is my CD player just odd, or am I going to need a preamp to get the full potential out of this amp?
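(For scale, a back-of-envelope sketch of the extra gain a preamp would need to bridge those two numbers; the 4.4 Vp-p and 500 mVp-p values are simply the ones measured above.)

```python
import math

amp_clip_vpp = 4.4   # input level (Vp-p) at which the power amp starts clipping
source_vpp = 0.5     # approximate CD player output (Vp-p) measured above

gain_needed = amp_clip_vpp / source_vpp
gain_needed_db = 20 * math.log10(gain_needed)
print(f"A preamp with about {gain_needed:.1f}x ({gain_needed_db:.1f} dB) of gain would be needed")
```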
Standards? What Standards?
Hi,
From what I read, I gather you're a bit at a loss as to why your phono stage delivers more signal than your CDP?
Cheers,😉
EDIT:
Is my CD player just odd or am I going to need a pre amp to get the full potential out of this amp?
Question is, what's the input sensitivity of your amp?
Hmmm, so I guess from what has been posted that every device, from a home theater receiver to a CD player to a computer, has a different line-out peak voltage. Am I so far off that even GPS can't find me, or am I finally making sense? So if there is no standard, does that mean I have to add an AGC (automatic gain control) to my amp so that I can get the full potential from it?
😕 😕 😕 😕 😕 😕 😕

The thing you need is called a "volume control". Maybe also a preamp.

CD audio is supposedly standardized so that "full scale" output is fixed (2 V RMS, I believe, or thereabouts; maybe it's 2 V peak). It's part of the nature of representing your signal in digital form that there is a maximum full-scale "level" represented by the bits.

Analog systems, be they magnetic tape, phono, broadcast, etc., have no hard full-scale limit -- it's somewhat "soft" and dependent on many factors. (Ultimately, of course, the circuitry can't put out a bigger signal than the power supplies allow.)

To use a somewhat defined standard, nominal "0 VU" on professional tape machines is set to 180 nWb/m or 250 nWb/m, or even other reference fluxes depending on tape type, etc. But even given that your machine is set to a particular one, you can easily put peak signals onto the tape that are 10 dB or more above this nominal operating level.

In analog, there are various reference levels, such as the above-mentioned tape levels, a phonograph disc level of 1 cm/s, or the broadcast standards of 100% modulation (AM) or x kHz deviation (FM), but these are just nominal (and somewhat arbitrary) levels chosen for each particular medium to use as a reference for technical performance measurements. They do not represent the maximum signal levels that can be put onto the medium.

There are some fancy preamps, both DIY and commercial, that give you a gain/attenuation adjustment for each input so that you can have roughly equal volumes between sources when switching from one to another.
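(Side note on the CD figure above: if full scale really is 2 V RMS for a sine wave, it converts to peak and peak-to-peak values as sketched below, which is handy when comparing against amplifier input sensitivities quoted in Vp-p.)

```python
import math

cd_full_scale_rms = 2.0                     # commonly quoted CD player full-scale output, V RMS

v_peak = cd_full_scale_rms * math.sqrt(2)   # RMS -> peak for a sine wave
v_pp = 2 * v_peak                           # peak -> peak-to-peak
print(f"2 V RMS sine = {v_peak:.2f} V peak = {v_pp:.2f} V peak-to-peak")  # ~2.83 V / ~5.66 V
```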
Thanks
OK, now I get it. I think all the EMI from the labs I have been working in has finally messed with my head. I have determined through testing that nothing other than a function generator seems to put out the voltage I'm looking for, which means a preamp is in order.

I don't know if the situation is different in the USA, but down here in the Netherlands there are rather strict limits on the frequency deviation of FM transmitters. In the good old days +/-75 kHz was the limit. Nowadays there is a spectral mask defining how strongly you can modulate your transmitter without losing your license, but for practical music signals, this spectral mask still corresponds to roughly 75 kHz peak deviation.
On the other hand, for commercial reasons, many stations want to sound as loud as possible. That is why they use terrible multiband compressors and clippers to increase the perceived loudness while still complying with the spectral requirements. The pop music stations sound particularly dreadful.
Regarding AM: some transmitters cheat by increasing their carrier power somewhat in loud parts of the signal, but otherwise you cannot exceed 100% modulation without turning the signal into a kind of DSB instead of AM, which can no longer be properly demodulated with a precision rectifier.
However, there is no real standard for the demodulator constant of AM and FM tuners, so you can still get about any level out of your tuner.
MarcelvdG said: I don't know if the situation is different in the USA, but down here in the Netherlands there are rather strict limits on the frequency deviation of FM transmitters.
OK... um, I never said that I was transmitting signals using FM or AM. I simply have an audio amplifier for my home speakers... people seem to be over-analyzing my question. Not that I am not grateful for the help people try to provide, but that's way off topic.
Maybe this is what you wanted.....
I usually shoot for 0.5 V RMS, using the standard 5 cm/s, 1 kHz test tone.
Jocko
Jocko said: I usually shoot for 0.5 V RMS, using the standard 5 cm/s, 1 kHz test tone.
Sounds reasonable...
or 0 dBU = 776 mV rms (into 600 Ohms, for the purists).
If we're going to be pure about it...
0 dBu = 775 mV RMS; it is the voltage that would dissipate 1 mW in a 600R load, but it does not specify an impedance.
0 dBm = 1 mW into 600R. But this is virtually impossible to measure directly (most meters calibrated in dBm actually measured dBu).
Just to mess everything up..
And there is also 0 dBV = 1 V, as used by a lot of studio equipment today... this one also being impedance-independent.
The 600 ohm nominal load seems to be mostly for measurement and calibration purposes.
Most units today will be rated around 1 V sensitivity or, conversely, output level. This is of course not the same as the clipping level, which is design-dependent.
Older units, like cassette decks of European origin, often had 100-200 mV as the nominal output level, whereas the Japanese were among the first to somewhat standardise on 1 V, or thereabouts...
"Circa" or "thereabouts" is probably the closest we can get...
CDs vary quite a lot in terms of max output, or full-scale level, and this applies to the actual discs themselves as much as to the players we use. Personally, I cannot recall ever having seen a calibration level for CDs...??? Anyone else??
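(Pulling the voltage references from the last few posts together, here is a small sketch of the dBu and dBV conversions; the +4 dBu and -10 dBV example levels are just common nominal figures used for illustration.)

```python
def dbu_to_volts(level_dbu: float) -> float:
    """0 dBu = 0.775 V RMS (the voltage giving 1 mW in 600 ohms), no impedance implied."""
    return 0.775 * 10 ** (level_dbu / 20)

def dbv_to_volts(level_dbv: float) -> float:
    """0 dBV = 1.0 V RMS, also impedance-independent."""
    return 1.0 * 10 ** (level_dbv / 20)

print(f"+4 dBu  = {dbu_to_volts(4.0):.3f} V RMS")    # ~1.23 V, a common professional nominal level
print(f"-10 dBV = {dbv_to_volts(-10.0):.3f} V RMS")  # ~0.316 V, a common consumer nominal level
```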
600R is historical, and comes from the days when wires strung on telegraph poles had a measured characteristic impedance of 600R. Before anyone starts wittering about transmission lines and audio, I must emphasise that for transmission line effects to become noticeable, the cable length has to be a reasonable proportion of one wavelength. In audio terms, that means miles, not metres.
The specification for CD players is that a maximum amplitude sine wave (using all possible bits) should produce 2V RMS. Early CDs didn't use all the bits because of metering problems during mastering.
Modern CDs are recorded at greater resolution, then tweaked so that they just avoid 16 bit digital clipping. Also, the CTF control is turned right up. (Compressed To F^*&)
And for complete confusion...
Thank goodness we hardly use dBs for video. There, we have 1V pk-pk into 75R.
0 dBm is 1 mW into a 50 ohm load... 0.224 V RMS.
This is not a correct definition of dBm. 0 dBm means a power level in dB relative to a reference power of 1 mW, regardless of impedance.
So for 50 ohms, 0 dBm is 1 mW = 0.224 V RMS measured across the load,
and for 600 ohms, 0 dBm is still 1 mW, but 0.775 V RMS measured across the load.
Regards Hans
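(A quick sketch of the arithmetic behind those two numbers, using V = sqrt(P x R) for the RMS voltage a given dBm level produces across a stated load.)

```python
import math

def dbm_to_volts(level_dbm: float, load_ohms: float) -> float:
    """Convert a dBm power level to the RMS voltage across a given load (0 dBm = 1 mW)."""
    power_watts = 1e-3 * 10 ** (level_dbm / 10)
    return math.sqrt(power_watts * load_ohms)   # P = V^2 / R  =>  V = sqrt(P * R)

print(f"0 dBm into 50 ohms:  {dbm_to_volts(0.0, 50.0):.3f} V RMS")   # ~0.224 V
print(f"0 dBm into 600 ohms: {dbm_to_volts(0.0, 600.0):.3f} V RMS")  # ~0.775 V
```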