Why do most amps still use input sensitivities of 400 mV or less?

For many years now (since the CD was introduced), 2 Veff has been the standard output voltage of sources. Amplifiers still use 400 mV input sensitivity to cater for older tuners etc., but since DAB/DAB+ those adhere to the 2 Veff standard as well. Some recent amplifiers have 2 V input sensitivity, as sources tend to be digital in many homes.

Why this 2 V/0.4 V difference? For most consumers it is incomprehensible to have to turn the volume control past 12 o'clock for a normal listening level :)

Standards serve a good purpose: interconnectivity.
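
Just to put a number on the mismatch, here is a quick back-of-envelope sketch (the 2 V and 400 mV figures are the nominal ones above, nothing amp-specific):

```python
import math

# 2 Veff source driving an input that reaches full output at 400 mVeff:
# the mismatch is a fixed gain ratio, easiest to read in dB.
source_level_V = 2.0        # typical CD/DAC output, Veff
input_sensitivity_V = 0.4   # full-power input level of many amps, Veff

excess_gain_dB = 20 * math.log10(source_level_V / input_sensitivity_V)
print(f"Excess gain: {excess_gain_dB:.1f} dB")  # ~14.0 dB

# Those ~14 dB have to be thrown away at the volume control, which is why
# normal listening sits so low on the dial with a 400 mV amp, and
# correspondingly higher up the dial if the sensitivity were 2 V.
```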
 

The point is that the first CD players were targeted at both domestic and professional use, I think. From then on, I suspect no one wanted to compromise their performance specs with a lower output level.


I think a lot of amps have different input sensitivities for CD and aux/tuner for this reason too. For most uses line levels are only nominal; it's only for broadcast and recording purposes that the precise level matters.
 

In many of the older integrated and preamps I've serviced there is quite often a voltage divider at the CD inputs, less often at the other inputs.
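
A minimal sketch of that kind of input pad, assuming illustrative resistor values (the 40k/10k pair below is not from any particular schematic):

```python
def divider_out(v_in: float, r_top: float, r_bottom: float) -> float:
    """Unloaded two-resistor divider: Vout = Vin * Rbottom / (Rtop + Rbottom)."""
    return v_in * r_bottom / (r_top + r_bottom)

# Pad a 2 Veff CD output down to the ~400 mV aux/tuner level.
v_cd = 2.0
print(f"{divider_out(v_cd, r_top=40e3, r_bottom=10e3):.2f} Veff")  # 0.40 Veff
```

The divider also sets the input impedance the source sees, so real designs pick the values with loading (and the following stage) in mind.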
 
Adjust the volume control for a comfortable listening level. Problem with an overly sensitive power amp input is that you'll hear more preamp noise.

Exactly, same for me!

Everything that makes you turn down the volume is just wasted gain, which brings extra noise, extra hum, extra THD, a lower stability margin, etc.

I understand we need a trade-off between usability and noise/hum/distortion/stability.

To me, 400 mV/500 mV sensitivity is more than enough to satisfy current standards.

For sure the Sugden Signature A21's 170 mV is overkill... just to mention one...
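
To put rough numbers on the wasted-gain point, here is a sketch with assumed figures (a 100 W/8 ohm amp, a fixed input-referred noise, volume pot right at the input), not measurements of any real amplifier:

```python
INPUT_NOISE_uV = 2.0   # assumed input-referred noise, uV RMS
RATED_OUT_V = 28.3     # ~100 W into 8 ohm, Veff

def output_noise_uV(sensitivity_V: float) -> float:
    """Residual output noise if the amp reaches rated output at the given
    input sensitivity and the volume pot sits ahead of all the gain."""
    gain = RATED_OUT_V / sensitivity_V
    return INPUT_NOISE_uV * gain

for sens_V in (0.4, 2.0):
    print(f"{sens_V} V sensitivity -> {output_noise_uV(sens_V):.0f} uV at the output")

# The 400 mV version carries ~14 dB more gain and, with the same
# input-referred noise, roughly 5x the residual hiss, regardless of
# where the volume knob is set.
```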
 