Bridged vs Conventional Amps

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
Thanks for all the comments. Some non-obvious ways a bridged amp might sound better have been presented. And there's no question, as megajocke and others pointed out, bridging has some serious benefits for high powered pro amps. But back to high-end home audio with more modest power levels...

FAVORING BRIDGED: When it comes to sound quality, we have in a bridged amp's favor...

Better PSRR - but PSRR is not typically a problem in conventional amps, especially with the well-filtered, stiff power supplies typically found in high-end amps.

Better Slew Rate - but, as above, slew rate is rarely a limiting factor in all but extremely high powered amps.

Lower 2nd harmonic distortion - but, as above, the 2nd harmonic is rarely a problem and also the least objectionable of all harmonic distortion.

Better common mode noise performance - Most of this benefit can be built into the gain stages of a conventional amp, and I can't see how it helps an output stage much.

Possibly less distortion from ground currents using a floating power supply - This may have some real benefit but mostly in very high powered amps and only applies to floating designs like Crown's grounded bridge.

Less voltage on output devices for higher SOA - This is certainly true, but it's not clear how much it helps the actual sound quality or measured performance--especially given the current through the output devices is doubled. If the amp can really use fewer output devices for the same performance there might be gains from lower Cbe or gate capacitance on each bank of output devices.

Higher headroom - Because a bridged amp draws power from each rail on both phases of the signal, and has twice the current requirement, it's likely to create more power supply sag, giving it higher measured (dynamic) headroom. This is controversial, as some think "soft" power supplies hurt sound quality.
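To make that last point concrete, here's how supply sag turns into a bigger dynamic-headroom number. This is a toy calculation with made-up voltages, not measurements of any real amp:

```python
import math

# Toy dynamic-headroom figure: a "soft" supply sags under sustained load,
# so short bursts get more power than a continuous tone. All voltages are
# made up for illustration.
V_idle = 50.0     # rail-limited swing with caps fully charged (V peak)
V_loaded = 42.0   # swing under sustained full load, after sag (V peak)
R_load = 8.0

p_burst = V_idle**2 / (2 * R_load)      # short-term (dynamic) power
p_cont = V_loaded**2 / (2 * R_load)     # continuous power
headroom_db = 10 * math.log10(p_burst / p_cont)
print(f"dynamic headroom: {headroom_db:.2f} dB")
# A bridged amp's doubled supply current means more sag (lower V_loaded),
# which inflates this measured headroom figure -- the effect described above.
```

A "stiff" supply (V_loaded close to V_idle) would show almost no dynamic headroom, which is why the spec is controversial as a measure of quality.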

AGAINST BRIDGING: These issues likely work against a bridged amp's sound quality and measured performance...

Higher distortion from driving half the effective impedance due to beta droop, etc.

Higher distortion from mismatches between the sides of the bridge.

Higher distortion from additive distortions in the mirrored bridge--i.e. phase and thermal distortions.

Higher distortion from whatever circuitry is added to invert the signal.

Twice the current requirement, which creates more power supply sag and more I²R losses, may require more output transistors, and possibly increases EMI-related distortion.

More difficulty driving low impedance loads due to the load impedance being cut in half. A conventional amp of 200 watts or less, with the same number of output devices as a bridged amp, likely has a significant advantage into low impedance loads.
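The load-splitting arithmetic behind several of the points above can be sketched numerically. This is a toy comparison with an assumed rail voltage and ideal, lossless output stages (real amps sag):

```python
# Toy comparison of one conventional amp vs. a bridged pair driving the
# same speaker. Assumes ideal, lossless output stages and an arbitrary
# rail voltage -- real amps sag and lose headroom.
R_load = 8.0    # speaker impedance (ohms)
V_pk = 40.0     # peak output voltage one amplifier half can swing

# Conventional: the full load hangs between one output and ground.
P_conv = V_pk**2 / (2 * R_load)    # average sine power = Vpk^2 / 2R
I_conv = V_pk / R_load             # peak output current

# Bridged: the halves swing in antiphase, so the speaker sees 2*V_pk,
# but each half effectively drives only R_load / 2.
P_bridged = (2 * V_pk)**2 / (2 * R_load)
I_bridged = 2 * V_pk / R_load      # peak current through EACH half

print(f"conventional: {P_conv:.0f} W at {I_conv:.1f} A peak")
print(f"bridged:      {P_bridged:.0f} W at {I_bridged:.1f} A peak per side")
# 4x the power, but twice the current into half the effective impedance --
# the source of the beta-droop and low-impedance concerns listed above.
```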

Given the above I'm still not convinced an ultimate high-end amp should be bridged--at least if it's around 200 watts or less. Clearly bridging has some strong advantages but most of them mainly benefit very high power amps and/or lower cost manufacturing.

I think there is a perception among many that "fully balanced" (and sometimes "fully symmetric") from input to output, all other things being equal, is better. Many companies certainly market their products that way. But speakers are 2 wire devices that don't care if they're floating or grounded. So there's no mismatch between a conventional grounded amplifier and a speaker. The issue seems to come down to mostly the points above.

So perhaps at 600 watts, for the Bryston, bridging makes sense? But I think Outlaw would be hard pressed to justify how bridging really makes their 200w/ch amp sound better--especially with difficult-to-drive speakers.
 
Hi Nelson. Correct as usual.

RocketScientist;

It seems you have your mind made up. Have you ever listened to both types of high end amps rated at the same power?

You seem to want this to be about the manufacturers increasing their profit. Some manufacturers think one way to increase profits is to sell what the customer wants. It was the demand for very high powered amps that created the market for them.

All of your points about low impedance loads are addressed in the design stage. Even amplifier designers understand low impedance loads and beta droop. There is also no need for the added inverter for one channel as you suggested.

I think there is a perception among many that "fully balanced" (and sometimes "fully symmetric") from input to output, all other things being equal, is better. Many companies certainly market their products that way. But speakers are 2 wire devices that don't care if they're floating or grounded. So there's no mismatch between a conventional grounded amplifier and a speaker. The issue seems to come down to mostly the points above.

The difference is common-mode rejection as Nelson pointed out.
 
Diamond Differential Balanced Bridge

Below is the "diamond differential" balanced bridge amplifier published by Takahashi and Tanaka for Sansui in 1984. It's fairly elegant and has some things in common with the Hadley and Pass designs mentioned in this thread.

The authors claim it achieves extremely high CMRR, PSRR, and slew rate while being inherently immune to T.I.M. (which was a major worry back in 1984) without needing complex constant current sources for the "diamond" stage. The lack of current sources also allows higher current drive.
 

Attachments

  • sansui-balanced-bridge.gif (16.4 KB)
Steve Dunlap said:

RocketScientist;

It seems you have your mind made up. Have you ever listened to both types of high end amps rated at the same power?

You seem to want this to be about the manufacturers increasing their profit. Some manufacturers think one way to increase profits is to sell what the customer wants. It was the demand for very high powered amps that created the market for them.

All of your points about low impedance loads are addressed in the design stage. Even amplifier designers understand low impedance loads and beta droop. There is also no need for the added inverter for one channel as you suggested.

The difference is common-mode rejection as Nelson pointed out.

My mind is anything but made up except that bridging has some obvious advantages for very high powered amps. But for audiophile amps, it's less clear to me.

Like tiefbassuebertr posted in this thread, my listening experiences in comparing bridged designs have consisted of using stereo amps in their bridged mode as mono amps. And, for me, they sounded better when not bridged. But I also realize those designs were likely not optimized for bridged use. It's difficult to do an apples-to-apples listening test of bridged vs non-bridged. And even if I could, I might prefer one amp and you might prefer the other. I'd much rather use more objective data to compare them.

I have no objection against manufacturers making money. But I do have a problem when they pitch a given technology as superior, when the real reason for using it is to save money. If Outlaw marketed bridging in their 200w amp as a way to offer more watts per dollar, I'd be fine with that. But to spin it as a superior topology, there should be some objective facts to back that up. Sure it might have a faster slew rate, but that doesn't help anything.

I agree about the extra inverter, but there is usually some extra phase splitting circuitry associated with a fully bridged design somewhere in the circuit. The bottom line is the signal goes through more components--with their associated nonlinearities-- before getting to the speaker in a bridged amp. That might not always be a bad thing, but it's true.

And I still don't get the common mode benefit for an output stage. I completely understand it for the gain stages but you can get those benefits with a conventional amp. But I might be missing something in how it benefits a bridged output stage beyond PSRR, and the ground issues, that I acknowledged? What other problematic common mode signals are rejected by a balanced output stage?
 
It's difficult to do an apples-to-apples listening test of bridged vs non-bridged. And even if I could, I might prefer one amp and you might prefer the other.

That is true.

I'd much rather use more objective data to compare them.

Do you think what Nelson said was only his opinion, or just spin?

If Outlaw marketed bridging in their 200w amp as a way to offer more watts per dollar, I'd be fine with that. But to spin it as superior topology, there should be some objective facts to back that up.

I know nothing about Outlaw, but this is what they say about their bridged amp.

Okay, so you want big power but you need greater flexibility. Well, the Model 2200's are just that, powerful AND flexible! The Model 2200 ($350) is a single channel 200-watt amplifier. You simply buy one for each speaker. Like our other 200 watt amplifiers, this amp will drive virtually any speaker or (passive) sub you throw at it. So what about space you ask? Each Model 2200 sits just under two inches tall so you won't need to buy a new equipment rack! The Model 2200's advanced circuitry and large pancake style torroidal transformer enable us to deliver a cool yet potent performance. In addition to its sleek design we have provided a special signal sensing circuit that triggers the amplifier to power on whenever an audio signal is present. The Model 2200: small, sleek and powerful-need we say more?

Where is the spin here?

I agree about the extra inverter, but there is usually some extra phase splitting circuitry associated with a fully bridged design somewhere in the circuit. The bottom line is the signal goes through more components--with their associated nonlinearities-- before getting to the speaker in a bridged amp. That might not always be a bad thing, but it's true.

It very much depends on how the amp is designed. There is no reason that the signal must pass through any more components.

And I still don't get the common mode benefit for an output stage. I completely understand it for the gain stages but you can get those benefits with a conventional amp. But I might be missing something in how it benefits a bridged output stage beyond PSRR, and the ground issues, that I acknowledged? What other problematic common mode signals are rejected by a balanced output stage?

Here I will quote Nelson again.

Balanced amplifiers improve performance by differentially rejecting distortion and noise. To the extent that distortion and noise are identical, they vanish at the output, typically by a factor of 10 or so for matched single-ended Class A circuits.

I'm sure he measured that. It may be on his web site. I will not attempt to dig out my own work along these lines, but I measured improvements in noise and distortion also.
 
I'd love to see some measurement data to back up the statements. I have a lot of respect for Nelson Pass and especially for all he's given to the DIY community.

Balanced amplifiers improve performance by differentially rejecting distortion and noise. To the extent that distortion and noise are identical, they vanish at the output, typically by a factor of 10 or so for matched single-ended Class A circuits.

I'm not sure if "single-ended Class A" means "non-bridged" or single-ended as in a single transistor, i.e. not push-pull? Unfortunately people (me included) use "single ended" both ways. If it's the non-push-pull variety, that's a whole different world. And, regardless, the only documented distortion that's canceled is the 2nd harmonic. And nobody has explained what sort of problematic common-mode noise gets canceled by a bridged output stage.
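The even-order cancellation in question can be shown with a toy model of my own (not from this thread), assuming perfectly matched bridge halves with a small square-law nonlinearity--an idealization real halves never achieve:

```python
import numpy as np

# Toy model: each bridge half has an identical small square-law (2nd-harmonic)
# nonlinearity. The speaker sits between the two antiphase outputs, so it
# sees their difference -- and the identical even-order terms cancel.
fs, f0, n = 48000, 1000, 4800
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t)

def half(sig, k2=0.01):
    """One bridge half: unity gain plus 1% square-law distortion."""
    return sig + k2 * sig**2

out_pos = half(x)             # non-inverting half amplifies +x
out_neg = half(-x)            # inverting half amplifies -x
bridged = out_pos - out_neg   # differential voltage across the speaker

residual = bridged - 2 * x    # whatever distortion survives the bridge
print(np.max(np.abs(residual)))   # ~0 (float rounding): 2nd harmonic cancels
# Note: an odd-order term (e.g. k3*sig**3) would NOT cancel -- it adds,
# which is why only the even harmonics benefit, as noted above.
```

Any mismatch between the two k2 values leaves a proportional residue, which is the matching problem raised earlier in the thread.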

The outlaw "spin" I'm talking about is this:

The 7500/7700 differential design virtually eliminates cross talk, through a technology known as “common mode rejection”. Compared to single-ended designs, the 7500/7700 requires half the rail voltage for a given power allowing for increased transient performance. In addition, with a slew rate nearly double that of a comparable single-ended amp, the utmost control over transients is maintained ensuring distortion free performance. The result: an amplifier that remains unequivocally true to the source material.

Their (or any other well designed) non-balanced amps can have the same real world "control over transients". And they're also likely *more* "distortion free" and *more* "true to the source" than their balanced amps. So, IMHO, it's a bit misleading.

Steve Dunlap said:

It very much depends on how the amp is designed. There is no reason that the signal must pass through any more components.

I don't agree if you want the same or (if possible) better performance than a conventional amp. AFAIK, if you follow the signal path through an amp with and without a bridged output stage, it's a longer total path on the bridged amp--there are more transistor junctions involved. Or put another way, for any bridged design, the signal path should be simplified by "un-bridging" it.
 
The outlaw "spin" I'm talking about is this:

As I said, I'm not familiar with Outlaw. This looks like standard marketing BS. It is certainly no more spin than car companies, soft drink companies, soap makers, etc. put on their advertising.

I don't agree if you want the same or (if possible) better performance than a conventional amp. AFAIK, if you follow the signal path through an amp with and without a bridged output stage, it's a longer total path on the bridged amp--there are more transistor junctions involved. Or put another way, for any bridged design, the signal path should be simplified by "un-bridging" it.

Let's take your unbridged amp and drive both channels with a balanced input (not uncommon in pro sound). One channel gets the non inverted input, the other gets the inverted input. Connect the load between the hot outputs and you have a bridged amp. Is the signal path now longer?

Now let's take the same amp and drive the channels with the same signal as above. Now you need a balanced-to-unbalanced conversion stage added to the signal path. If you use only one leg of the balanced input to avoid this stage, you lose your common-mode noise rejection.

I can see benefits to both balanced and unbalanced amps and I used to build both. For amps in the 400 to 1600W range (at 8 ohms), balanced was the way I found worked best. These amps all were used with 4 ohm or less loads with no problems.
 
Steve Dunlap said:

Let's take your unbridged amp and drive both channels with a balanced input (not uncommon in pro sound). One channel gets the non inverted input, the other gets the inverted input. Connect the load between the hot outputs and you have a bridged amp. Is the signal path now longer?

OK. I'll give you that one as far as the input circuitry goes. :) But my other point still stands. In the conventional amp there are fewer transistor junctions in the signal path because one side of the speaker is connected to the power supply ground instead of to another set of transistors (that likely have at least one set of drivers upstream). But you still make a good point. And even some high-end home gear has balanced outputs (although fully balanced signal paths--especially wherever analog gain is adjusted--are extremely rare AFAIK).

Steve Dunlap said:
I can see benefits to both balanced and unbalanced amps and I used to build both. For amps in the 400 to 1600W range (at 8 ohms), balanced was the way I found worked best. These amps all were used with 4 ohm or less loads with no problems.

I agree 400 watts seems like a reasonable crossover point. And at 1600 watts it would be no contest in favor of bridged.
 
RocketScientist said:
Like tiefbassuebertr posted in this thread, my listening experiences in comparing bridged designs has consisted of using stereo amps in their bridged mode as mono amps.
This might well explain why your experiences are not that good.

Many stereo amps are clearly not optimized for bridge operation... sometimes the inverting channel's input signal is simply tapped off, and padded down in level, from the non-inverting channel's output... a true recipe for disaster. Even when there is a proper phase splitter, you may still have the problem of high power supply currents modulating the GND (which itself carries no direct signal current, but supply cross-current is still present).

Maybe you should check out a true bridge design from the ground up like, say, the PassLabs XA.5 series...

- Klaus
 
Zero D said:
In theory you can get 4 x the power into the same load, that's if the power supply is beefy enough of course.
no, theoretically one gets twice the power into twice the load impedance.
It is theoretically impossible to get four times the power into the same load impedance with conventional amplifiers, no matter how good the PSU is.
 
Good point Klaus. This is why I prefer to use a bridge VAS. There is no difference between the + in and – in of it or the input stage, so assuming you have a quality phase splitting circuit, the outputs will be exactly equal and opposite.

RocketScientist said:
I don't agree if you want the same or (if possible) better performance than a conventional amp. AFAIK, if you follow the signal path through an amp with and without a bridged output stage, it's a longer total path on the bridged amp--there are more transistor junctions involved. Or put another way, for any bridged design, the signal path should be simplified by "un-bridging" it.

IMHO, it matters not how many transistor junctions the signal goes through per se, but more about the feedback design choice and the operating load line each transistor has to follow (cascode is good:yes: ). For a global feedback circuit, too many amplifying transistor stages will increase phase shift from output to input and require more compensation to stabilize, reducing bandwidth. However, with a local feedback design, each stage has its own loop. But in that case, you almost have to use a common-mode control loop (DC servo(s)) to control the output offset of each output node. There is a significant advantage in eliminating common-mode errors with balanced designs. This is not easily done using two single-ended amplifiers in bridged configuration.:)

:2c:
 
longthrow said:

Thanks longthrow. There is some good information in that thread.


KSTR said:
This might well explain why your experiences are not that good.

Many stereo amps are clearly not optimized for bridge operation...

Yes, I agree. One reason I started this thread is to decide if I should perhaps attempt a DIY bridge optimized amp. It would seem to me that might only make sense for 400 watts or more.

And Steve Dunlap raised a good point that it might also make sense if you are feeding the amp with a true balanced source.

Otherwise, with an unbalanced source and power under say 200 watts, I still think a conventional amp can overall outperform a bridged amp playing real music into typical real loads.
 
Zero D said:
AndrewT

Hi, sure twice the power into twice the load, But i stated 4 x the power into the same load.

For eg. Mono amplifier = 100 Volts ~ 1kHz into 8 Ohms = 100 x 100 ÷ 8 = 1250 Watts

2 x Mono amplifiers bridged = 200 Volts ~ 1kHz into 8 Ohms = 200 x 200 ÷ 8 = 5000 Watts

Reference

http://en.wikipedia.org/wiki/Bridged_and_paralleled_amplifiers


That is why AndrewT said theoretically. In the real world it is impossible because of losses.
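Adding an assumed series loss resistance to Zero D's numbers shows why the 4x figure is theoretical only. R_s below is an illustrative lumped value for supply sag plus output-stage losses, not a measurement:

```python
# Zero D's ideal arithmetic vs. the same amps with a little series loss.
# R_s is an assumed lumped resistance for supply sag plus output-stage
# losses -- an illustrative number, not a measurement.
R_load = 8.0
V = 100.0        # single amp: 100 V into 8 ohms (Zero D's example)
R_s = 0.5        # assumed loss resistance per amplifier channel (ohms)

def power(v_src, r_src, r_load):
    """Power delivered to r_load by source v_src behind series r_src."""
    i = v_src / (r_src + r_load)
    return i**2 * r_load

p_single = power(V, R_s, R_load)
# Bridged: the source voltages add, but the two loss resistances are
# now in series with the load as well.
p_bridged = power(2 * V, 2 * R_s, R_load)

print(f"single:  {p_single:7.1f} W")
print(f"bridged: {p_bridged:7.1f} W  (ratio {p_bridged / p_single:.2f}x, not 4x)")
```

Even a modest half-ohm of loss per channel pulls the ratio noticeably below 4x, and in a real amp the loss itself grows with current.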
 
Steve Dunlap said:

That is why AndrewT said theoretically. In the real world it is impossible because of losses.

Yes, although it's worth mentioning that sometimes manufacturers' specs do show 4 times the power into a bridged load. For example, I think Carver has published such numbers for some of his creations.

But even when the spec says 4 times the power, the actual measurements won't. Such amps will usually outperform their 8 ohm unbridged spec and barely meet (or not meet) the bridged spec. The same is true for non-bridged amps that show a perfect doubling of power into lower impedances.

The only way to get a perfect doubling, 4X, etc. is to artificially limit the output voltage of the amp to well below clipping into the easier loads. But it would be foolish to do so for anything but marketing reasons.
 
RocketScientist said:
The only way to get a perfect doubling, 4X, etc. is to artificially limit the output voltage of the amp to well below clipping into the easier loads. But it would be foolish to do so for anything but marketing reasons.
I suspect there are many ways to achieve perfect doubling of output power into halved load impedance.
Positive feedback would be one method.
Variable gain with voltage monitoring on the output would be another method.
Neither would make terribly good Audio amplifiers (I have no proof for this statement - it just sounds good).
What other candidates are there for bad audio amplifiers that can achieve perfect voltage source drive into a range of loads?

Could negative output impedance be used to get close to that goal?
 
Let's suppose a manufacturer specifies an amplifier as
50W into 8ohms
100W into 4ohms
200W into 2ohms
and further on states that this applies for all frequencies from 20Hz to 20kHz
and all at <=0.1% distortion
and that this still applies for all main supply voltages within the specification range (in the UK: 216Vac to 254Vac).

When measured in real life on a fixed supply voltage of 240Vac and @ 1kHz, one may find that the amp achieves
80W into 8ohms
150W into 4ohms
230W into 2ohms.
Are we going to complain?
 
I just wanted to point out that specs are not always to be believed and can't be used as proof of the doubling, quadrupling, etc.

And the only way to achieve such perfect theoretical performance is by means that would likely degrade the real world performance of the amplifier by restricting its output into easier loads. Or it might degrade it in other ways as would be the case with a negative output impedance (which would generally cause frequency response variations with real world loads).

AndrewT is correct there are various ways to force the desired behavior. I should have been more general in my statement. ;) But this is a perfect example of where designing an amplifier to achieve theoretical perfection in one area comes at a significant price in other areas.

For good reason, we want amps to ideally have an output impedance as close to zero as possible. So creating an amp with a negative or dynamic output impedance/gain will introduce other problems. The cleanest way I know of to achieve what's being discussed is to simply work backwards from the max power at the most difficult load specified and clip the signal in the input stages to that voltage level.

With such a scheme the amp will exhibit a perfect doubling of power each time the load impedance is cut in half--or a 4X increase into a bridged load. It might even amaze some guy at Audioholics bench testing it. But the "price" is you're artificially limiting the output of the amp into easier loads. Which, from a practical point of view, makes no sense.
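The working-backwards scheme just described can be sketched as follows. All numbers are hypothetical: the amp could really swing 50 V peak, but the input stages clip so the output never exceeds the voltage the 2-ohm rating allows:

```python
import math

# Sketch of the "work backwards from the hardest load" trick described above.
# All numbers are hypothetical. The amp could really swing 50 V peak, but we
# clip the input so the output never exceeds the 2-ohm rating's voltage.
V_max_real = 50.0    # what the output stage could actually swing (V peak)
P_rated = 230.0      # chosen max-power spec into the hardest load
R_min = 2.0          # the hardest rated load (ohms)
V_cap = math.sqrt(2 * P_rated * R_min)   # peak voltage for 230 W into 2 ohms

powers = {}
for r in (8.0, 4.0, 2.0):
    v = min(V_max_real, V_cap)    # artificial clip in the input stages
    powers[r] = v**2 / (2 * r)    # average sine power into r
    print(f"{r:>3.0f} ohms: {powers[r]:6.1f} W")
# Power now "perfectly doubles" each time the impedance halves -- at the
# cost of throwing away swing the amp actually has into 8 and 4 ohms.
```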
 
I've seen it done - the Citronic PPX1200 amplifier clamps its drive node to +-90V. Typical idle rails are +-100V. I guess it's done to lower voltage stress on output devices if the mains voltage is high, the output transistors are only 200V devices.

There is one advantage I can see with voltage limiting. Let's see what happens if output power increases much into lighter loads and an amplifier is run heavily into clipping or clip limiting. (a DJ at the controls...)

Let's say there are 4 loudspeakers in parallel and that their power rating is marginal. If one speaker blows, the amplifier output voltage will increase - increasing power into the remaining 3. The extra stress will make the next one blow even more easily, and so on.

Now the DJ has blown up all 4 loudspeakers :bawling:
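The cascade described above can be modeled with a "soft" amp: an ideal voltage source behind a series resistance standing in for supply sag. The values below are illustrative only:

```python
# megajocke's cascade, modeled with a "soft" amp: an ideal voltage source
# behind a series resistance standing in for supply sag (values are
# illustrative only, not from any real amplifier).
V_open = 60.0     # unloaded output voltage at full drive (V peak)
R_sag = 1.0       # effective source resistance from supply sag (ohms)
R_spk = 8.0       # each speaker's impedance (ohms)

p_per_speaker = []
for n in (4, 3, 2, 1):                      # speakers still working
    r_load = R_spk / n                      # parallel combination
    v = V_open * r_load / (R_sag + r_load)  # voltage rises as load lightens
    p_per_speaker.append(v**2 / (2 * R_spk))  # sine power in each speaker
    print(f"{n} speakers left: {p_per_speaker[-1]:6.1f} W each")
# Each failure raises the output voltage, so every survivor dissipates
# MORE power than before -- the runaway that finishes off all four.
```

A harder voltage-clamp (as in the Citronic example) caps v regardless of load, breaking the runaway.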
 