Is bigger better?

Here’s a question for the educated masses. Assuming a digital source, for the same power output, do people feel there is any difference in sound between 2 otherwise identical amps of different power ratings?

I ask this question because many people seem to want to build larger versions of amplifiers, yet probably wouldn’t get anywhere near clipping on smaller versions.

The reason I stipulated a digital source is that its output is capped at a fixed maximum level, and it is impossible to achieve greater output regardless of the material being played.

Just to get the discussion rolling, my feeling is that provided the power supply is capable of providing the required current, any additional “headroom” voltage above the maximum required is immaterial.

Other thoughts?


If I am not mistaken, you mean: for a given input signal of, say, 1 volt from a CD player, is there a difference in sound quality between two amplifiers sharing the same design but with different supply voltage ratings, and thus different output power ratings?
Correct me if I misunderstood the question.
It actually depends on whether the amplifier is 100% scalable. Sometimes by changing the power supply voltage you get different biases or whatever else, and the character of the amp changes. Most class B or A/B amplifier designs I know can be scaled up or down because they have circuitry that adapts to the supply voltage while keeping the same bias (for instance, the current sources provide the same bias currents at other voltages, and so on). For class A, though, I think you do get a change in sound. As Nelson Pass often states in his projects, an amp sounds different at 2 A bias than at 3 A bias: with bigger current, better sound. So it gets complicated, and it depends on what you do with the circuit when you change it.
For class B amps, my opinion is that more headroom gives you better sound quality when it comes to dynamics. Music has a great dynamic range, which means in practice that in the same song you can have quiet passages and very loud passages, those signals differing by many dB, maybe 40-50 dB or more. This means the amp output can range from a few millivolts to 50-100 volts. When you have a lot of headroom the large transients don't get clipped and the sound is clearer and more real.
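As a sanity check on claims like this, the dB-to-voltage arithmetic is easy to sketch (my own illustration, not from the thread; decibels for voltage use a factor of 20):

```python
import math

def db_to_voltage_ratio(db: float) -> float:
    """Voltage (amplitude) ratio corresponding to a level difference in dB."""
    return 10 ** (db / 20)

# A 50 dB dynamic range corresponds to a voltage ratio of about 316:1.
print(round(db_to_voltage_ratio(50)))   # 316
```

Note that against a 100 V peak, a 50 dB span puts the quietest passages around 0.3 V; millivolt-level quiet passages would actually imply a span closer to 90-100 dB.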
This is a nice subject and I hope to hear what other people have to say too.


Two amplifiers with different power outputs but otherwise identical, connected to the same digital source, will definitely show a difference in sound in terms of loudness as perceived by human ears.

The intensity of sound varies inversely with the square of the distance between the source and the listener.

For example, a listener standing 1 metre from a source perceiving 100 W of power will perceive only 25 W after moving to 2 metres away; hence the large amps.
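The example above can be expressed as a short sketch (my own illustration; it assumes a point source in free field, where intensity falls with 1/d² and SPL drops 6 dB per doubling of distance):

```python
import math

def intensity_at(power_at_ref: float, d_ref_m: float, d_m: float) -> float:
    """Perceived intensity scales with 1/d^2 for a point source in free field."""
    return power_at_ref * (d_ref_m / d_m) ** 2

def spl_drop_db(d_ref_m: float, d_m: float) -> float:
    """Equivalent SPL drop between two distances: 20*log10(d/d_ref)."""
    return 20 * math.log10(d_m / d_ref_m)

print(intensity_at(100, 1, 2))        # 25.0 -- matches the example above
print(round(spl_drop_db(1, 2), 1))    # 6.0 (dB per doubling of distance)
```

In a real listening room, reflections and room gain make the falloff shallower than the free-field ideal.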

- XL
Xavier, this is not what I was referring to. While you are correct in what you say, Promitheus was right in understanding that I was referring to the same amplifier scaled by varying the power supply voltages, but otherwise identical, and compared at the same output power level.

Promitheus, I understand your very valid point regarding bias currents and agree my question is perhaps a bit theoretical. However, you raised a point that I suggest (with all due respect) reflects a fundamental misunderstanding in this field. You said "Music has a great dynamic range, which means in practice that in the same song you can have quiet passages and very loud passages, those signals differing by many dB, maybe 40-50 dB or more. This means the amp output can range from a few millivolts to 50-100 volts. When you have a lot of headroom the large transients don't get clipped and the sound is clearer and more real."

This is a comment I hear often, but one I have difficulty agreeing with, PROVIDED the source is digital. Unlike analogue (that's analog in Americanish :p) sources, the maximum output of a CD player can be precisely determined, and it is impossible to exceed this level. Sadly many poorly mastered disks do try, and the result is particularly offensive to the ears by way of digital clipping.

Given that the CD player's output maxes out at 0 dB, or 2.3 V, and the gain of the two amplifiers is the same, the output voltage must, ipso facto, also be identical between the two amplifiers. It will not, nay cannot, be exceeded under any condition, dynamic or otherwise. Indeed, when deciding to build an amplifier it should be possible, with not too much maths, to determine precisely the size of amplifier to build for a desired SPL.
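That "not too much maths" can be sketched as follows (my own illustration; the sensitivity and distance figures are hypothetical examples, and the calculation assumes a single speaker in free field, ignoring room gain and the second channel of a stereo pair):

```python
import math

def required_power_w(target_spl_db: float,
                     sensitivity_db_1w_1m: float,
                     distance_m: float) -> float:
    """Amplifier power needed for a target peak SPL at the listening position,
    given speaker sensitivity in dB SPL at 1 W / 1 m (free field assumed)."""
    # SPL delivered by 1 W at the listening distance
    spl_per_watt = sensitivity_db_1w_1m - 20 * math.log10(distance_m)
    # Power in watts scales as 10^(dB shortfall / 10)
    return 10 ** ((target_spl_db - spl_per_watt) / 10)

# e.g. a 90 dB/W/m speaker at 3 m, aiming for 100 dB peaks:
print(round(required_power_w(100, 90, 3)))   # 90
```

Run the numbers for typical speakers and rooms and the result supports the point: an efficient speaker at domestic distances rarely demands more than a few tens of watts, even on peaks.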

For the more powerful amplifier to produce higher voltage transients compared to the lower powered version it would need to act as a dynamic range expander; amplifying in a non-linear fashion. I suggest the amplifier does not, or at least should not, do this, however I throw the debate open in this area.

I suggest we are being duped into believing "more is better". This may hold for commercial amplifiers, where commercial compromise (manufacturers' jargon for crap design) can result in power supplies incapable of providing the required current at high output levels; the result in "low" powered amplifiers is collapsing rails and dynamic range compression. Not facing the constraints of the (sometimes) mighty dollar or yen, we are able to build power supplies that are rock solid at all output levels, thereby negating this constraint.


I've heard larger amps that sounded better than their little brothers (e.g. Conrad Johnson Premier One vs. MV-75), and little ones that sounded better than the bigger ones (some of the Levinson pieces, for instance). The problem is--and I'm agreeing with promitheus, here--how many changes were made in the circuit? In the case of the Premier One and the MV-75, the topology was almost identical, it's just that the P1 had more of it (along with a much larger power supply). But that word 'almost' is a potent word. If we can hear the difference in caps or whatever, then just how much of a change in topology is audible?
Nelson has said elsewhere that he prefers the sound of the X-600 over the other models in his current lineup; one is larger, and several smaller. As far as I know, he's using pretty much the same thing up and down the line with as few variations as he can get away with. He also mentioned that he preferred the sound of one of the smallest Thresholds, though.
So let me toss out this idea: For every topology, there will be an optimum power (I'm tempted to write "bias" but will let the word "power" stand) range where it sounds best. For some it will be as a low-powered amp. For others, medium, and for some their destiny is to sound best at high power.

G’day Grey,

My question is purely an exercise in theory, and I agree that in practice it may not align with what we commonly find in the real world. The point I wanted to propose is that the idea of greater dynamic range from a larger amp is a misconception, albeit a common one. If my proposal is in fact correct, it makes me wonder why many would want to build 100 or even 200 W amplifiers. Either the speakers they are using are absurdly inefficient, their living area is somewhat larger than mine, or they like to listen to rock music at live concert levels.

Having said that, my present (commercial) amplifier is rated at 100 W and it was indeed the best sounding of the range. However, in normal use I doubt it's driven beyond a few watts, and it has almost certainly never put out more than 10-15 W.




A nice can of worms opened here. The McIntosh 502, 2120, and 2200 use the same driver board. Exactly. And the same B+. Exactly. There are one pair, two pairs, and three pairs of outputs for 50 W, 120 W, and 200 W output. The 2120 sounds the best. The load was a pair of 104 dB/W Klipschorns, and the power level never came close to clipping. The diff pair and second voltage amps are run from a regulated supply on all three amps. Go fish.