Why do nearly all integrated amps still use a 400 mV (or less) input rating when all digital sources run at 2 V output max?
Is there any ideal input sensitivity for a DIY amplifier if you know that you will use only digital sources?
It's very common for people to feed mobile-device earphone outputs into amps, and those are mostly below 400 mV anyway.
Choose professional line level perhaps:
Line level - Wikipedia
Why do nearly all integrated amps still use a 400 mV (or less) input rating when all digital sources run at 2 V output max?
Ever heard of ... ummmm ... "headroom"?
That you *can* put out up to 2 V RMS does not mean you need to do it all the time.
If your car is capable of 180 km/h ... do you always run it at that speed?
Adjust the volume control for a comfortable listening level. Problem with an overly sensitive power amp input is that you'll hear more preamp noise.
For many years now (since CD was introduced), 2 Veff has been the standard output voltage of sources. Amplifiers still use 400 mV input sensitivity to cater for older tuners etc., but since DAB/DAB+ those adhere to the 2 Veff standard as well. Some recent amplifiers have 2 V input sensitivity, as sources tend to be digital in many homes.
Why this 2 V / 0.4 V difference? For most consumers it is incomprehensible to have to turn the volume control past 12 o'clock for a normal listening level 🙂
Standards serve a good purpose: interconnectivity.
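To put numbers on the difference: the sensitivity spec fixes how much voltage gain an amplifier needs to reach full power. A minimal sketch of the arithmetic; the 100 W / 8 Ω amplifier is a made-up example, not any particular product:

```python
# How input sensitivity translates into required voltage gain.
import math

def required_gain_db(power_w: float, load_ohms: float, sensitivity_v: float) -> float:
    """Gain (in dB) needed to reach full rated power from the rated sensitivity."""
    v_full = math.sqrt(power_w * load_ohms)   # RMS volts across the load at full power
    return 20 * math.log10(v_full / sensitivity_v)

# Hypothetical 100 W / 8 ohm amplifier:
print(required_gain_db(100, 8, 0.4))   # ~37 dB for a 400 mV sensitivity
print(required_gain_db(100, 8, 2.0))   # ~23 dB if sized for a 2 V digital source
```

With a 2 V source feeding the 400 mV input, that extra ~14 dB of gain can only ever be thrown away at the volume control.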
Is there any ideal input sensitivity for a DIY amplifier if you know that you will use only digital sources?
There is a standard that defines both Consumer Line Level and Professional Line Level. You can study it HERE
By making an amplifier slightly more sensitive (300 mV instead of 400 mV), manufacturers can guarantee that it will reach full output even if the source runs a little below nominal.
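In dB terms that margin is small but real; a quick check using just the figures quoted above:

```python
import math

nominal = 0.400      # V RMS, nominal level the source is expected to deliver
sensitivity = 0.300  # V RMS, rated input for full output

margin_db = 20 * math.log10(nominal / sensitivity)
print(f"{margin_db:.1f} dB of margin")  # ~2.5 dB: full output is still reached
                                        # if the source runs a little low
```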
Is there any ideal input sensitivity for a DIY amplifier if you know that you will use only digital sources?
This article will help you work out what it is. What is Gain Structure? - diyAudio
Tony.
Why this 2 V / 0.4 V difference? For most consumers it is incomprehensible to have to turn the volume control past 12 o'clock for a normal listening level 🙂
Standards serve a good purpose: interconnectivity.
The point is that the first CD players were targeted at both domestic and professional use, I think. From then on, I suspect no one wanted to compromise their performance specs with a lower output level.
I think a lot of amps have different sensitivity inputs for CD and aux/tuner for this reason too: for most uses line levels are very nominal; it's only for broadcast and recording purposes that precise level matters.
In many of the older integrated and pre-amps I've serviced there is quite often a voltage divider at the CD inputs, less often at the other inputs.
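Those dividers are just two resistors per channel; a sketch of the arithmetic, with values picked purely for illustration (real amps vary):

```python
# Two-resistor divider knocking a 2 V RMS CD output down toward 0.4 V line level.
r_series = 39_000  # ohms, in series with the signal (illustrative value)
r_shunt = 10_000   # ohms, from the input node to ground (illustrative value)

ratio = r_shunt / (r_series + r_shunt)
print(f"2.0 V in -> {2.0 * ratio:.2f} V out")  # ~0.41 V, close to a 400 mV input rating
```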
Adjust the volume control for a comfortable listening level. Problem with an overly sensitive power amp input is that you'll hear more preamp noise.
Exactly, same for me!
Everything that makes you turn down the volume is just wasted gain, which brings extra noise, extra hum, extra THD, a lower stability margin, etc.
I understand we need a trade-off between usability and noise/hum/distortion/stability.
To me, 400 mV/500 mV sensitivity is more than enough to satisfy the current standards.
The Sugden Signature A21's 170 mV is certainly overkill... just to mention one...
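One way to put numbers on the wasted-gain point: noise entering after the volume control sees the amplifier's full gain no matter where the knob sits, so gain you routinely dial back costs signal-to-noise directly. A toy model with assumed figures (the noise voltage and amplifier ratings are illustrative, not measurements of any product):

```python
import math

def snr_db(source_v: float, pot_gain: float, amp_gain: float, noise_v: float) -> float:
    """SNR at the output, for noise that enters after the volume pot."""
    signal = source_v * pot_gain * amp_gain
    noise = noise_v * amp_gain              # sees full gain regardless of the pot
    return 20 * math.log10(signal / noise)

source = 2.0    # V RMS from a digital source
noise = 20e-6   # V RMS of assumed noise injected after the pot
v_full = 28.28  # V RMS at full output (the 100 W / 8 ohm example again)

# Amp with 2 V sensitivity: pot wide open for full output.
print(snr_db(source, 1.0, v_full / 2.0, noise))          # 100 dB

# Amp with 170 mV sensitivity: the same output level needs ~21 dB of attenuation.
print(snr_db(source, 0.17 / 2.0, v_full / 0.17, noise))  # ~79 dB: SNR drops by
                                                         # exactly the attenuation
```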