Headphones on Speaker outputs

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
I recently acquired a GAS Son of Ampzilla, and the manual says I can use the speaker terminals to power headphones. I've asked about this on other forums and confirmed that this was, and still is, a semi-common practice. However, the manual suggests using a 50 Ohm to 100 Ohm, 5 Watt resistor in each channel.

Is there some math or rule of thumb for determining the proper value resistor, based on the impedance of the headphones? Or is this perhaps a "try it and see" scenario?

My headphones are MDR-V700's with at least 1 Watt of power handling and 24 Ohm impedance.

Preliminary notes:

1) No, I will not buy different headphones, so please do not suggest it; I enjoy them very much on my current listening rig.
2) Yes, I understand that this will likely require converting my headphones to balanced.
 
There are a number of models in that range, varying between 102 dB/mW and 107 dB/mW, so a full watt into them is going to hurt your hearing. The series resistor forms a potential divider with the headphones, and the ratio you go for depends on a lot, not least whether you can remember to turn the volume down before plugging them in! I would err on the larger side for the resistor, but it does depend on the amplifier. Stick 100 Ohms in and see what you get.
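For a quick sanity check of the "err larger" advice, here is a sketch of the divider arithmetic (assuming, for illustration, 24 Ohm phones rated 107 dB/mW and an amp doing 80 W into 8 Ohms at full output; both figures appear elsewhere in this thread):

```python
import math

# Assumed figures (illustrative, not measured):
# 80 W into 8 ohms at full output, 24 ohm headphones, 107 dB SPL per mW.
v_amp = math.sqrt(80 * 8)        # ~25.3 V RMS at full output
r_series = 100.0                 # series resistor under discussion
z_phones = 24.0

# Voltage divider: series resistor against the headphone impedance
v_phones = v_amp * z_phones / (r_series + z_phones)
p_phones_mw = v_phones ** 2 / z_phones * 1000   # power into the phones, mW
spl = 107 + 10 * math.log10(p_phones_mw)        # dB SPL at that power

# Even with 100 ohms in series, a full-output amp delivers roughly
# 1 W to the phones, i.e. around 137 dB SPL: ear damage territory.
```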
 
Far better to use a step-down transformer than a series resistor (otherwise known as a resistive pad). One reason is that headphones are designed to be driven by a voltage source, meaning one with an output impedance significantly lower than that of the phones being driven.
 
If the headphone sound is important to you, you need a dedicated headphone amplifier with low output impedance and high output current capacity.

If not, any padding resistor value from 50-100 Ohms will work for you. The resistor's function is to limit the excessive current from the power amp that could otherwise damage the headphones. You do not need to worry about damping factor for proper transient response of the phones.
 
A simple way to get the best of both worlds?

Assume that the headphones have 24 Ohm impedance and a sensitivity of 107 dB SPL for 1 milliwatt of input. One milliwatt into 24 Ohms corresponds to 155 mV, so 155 mV gives 107 dB.

Assume you want to generate 120 dB SPL on peaks. That is 13 dB above the 107 dB reference, which would need 0.692 volts.

Your amp is rated at 80 Watts into 8 Ohms, corresponding to an output of 25.3 volts RMS. The attenuation would then be a factor of 25.3/0.692 = 36.6 to 1.

If we make a voltage divider from a 100 Ohm and a 3.3 Ohm resistor, then we'd have an attenuation factor of 35.5 into a 24 Ohm load, pretty close to the objective.

The additional benefit is that the headphones would see a source impedance of about 3.2 Ohms (100 Ohms in parallel with 3.3 Ohms), which is a whole lot better than you'd see with just a series resistor. The 100 Ohm resistor would have to dissipate about 6 Watts in the worst case, but that rests on a really unrealistic assumption: that the amp would continually be driven to full output. 5 Watts is plenty!
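The divider arithmetic above can be checked numerically; this sketch uses the same assumed figures (80 W / 8 Ohm amp, 100 Ohm series, 3.3 Ohm shunt, 24 Ohm headphones):

```python
import math

# L-pad from the post above: 100 ohm series, 3.3 ohm shunt, 24 ohm load.
r_series, r_shunt, z_load = 100.0, 3.3, 24.0
v_amp = math.sqrt(80 * 8)                    # ~25.3 V RMS (80 W into 8 ohms)

z_bottom = r_shunt * z_load / (r_shunt + z_load)       # 3.3 || 24 ohms
attenuation = (r_series + z_bottom) / z_bottom         # ~35.5 : 1
z_source = r_series * r_shunt / (r_series + r_shunt)   # ~3.2 ohms seen by phones

# Worst-case dissipation in the series resistor (amp at full output)
i_series = v_amp / (r_series + z_bottom)
p_series = i_series ** 2 * r_series                    # about 6 W
```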
 

Attachments: HeadphoneAttenuator.png
Nironiro,

You're right...many receivers use something like 330 Ohms in series with the headphones. It's a pretty good answer, and can use a lower power dissipation resistor.

Recently, some people have found that a lower driving point impedance helps the sound of the headphones. Here's one reference I found:

NwAvGuy: Headphone & Amp Impedance

By using the admittedly lower 100 Ohms and the 3.3 Ohms to ground, we make the source impedance about 3.2 Ohms, roughly 100 times lower than the 330 Ohms. 3.2 Ohms comfortably satisfies the criterion that NwAvGuy cites for a low output impedance drive for headphones.

Dan
 
With an impedance response for the headphones in question at hand, it is also possible to determine a sensible upper bound for output impedance using the inverse calculation (input something like 1 dB for FR deviation). Since closed cans don't tend to have any major impedance peaks, I'd assume the 3.2 ohms above would be plenty though - even the picky HD598 would be perfectly happy with about 5 ohms.
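The inverse calculation could be sketched like this, with a made-up impedance curve (24 Ohm minimum, 60 Ohm peak; both numbers hypothetical, not measured from any real headphone):

```python
import math

def fr_deviation_db(z_out, z_min, z_max):
    """Frequency-response deviation (dB) between the impedance peak and
    the impedance minimum when driven from a source impedance z_out."""
    return 20 * math.log10((z_max / (z_max + z_out)) /
                           (z_min / (z_min + z_out)))

# Hypothetical cans: 24 ohms at the impedance minimum, 60 at the peak
z_min, z_max = 24.0, 60.0

# Coarse sweep for the largest z_out that keeps deviation under 1 dB
z_out = 0.0
while fr_deviation_db(z_out + 0.1, z_min, z_max) < 1.0:
    z_out += 0.1
# For this made-up curve, roughly 5 ohms of source impedance is the limit.
```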

AFAICT from the schematic, the Son of Ampzilla has a regular unbalanced (not BTL) output.

Using a full-grown speaker power amp to drive headphones is like driving to the grocery store in a Hummer though. It's doable, but certainly not the last word in efficiency. There are only a handful of (very insensitive) headphones where it makes any kind of sense - AKG K1000 (only about 83 dB / 1 Vrms), Hifiman HE-6 (~90 dB / 1 Vrms) or maybe an old K340 or K240DF come to mind.
 
Makes me wonder if I can just stuff a 10 Ohm resistor in parallel into the plug to make BA earphones less susceptible to source impedance.
You could, but not every source would be happy driving a sub-10 Ohm load, so some sort of adapter might be the better bet. Actually, 10 Ohms may still be too much for really picky ones, which in some cases prefer decidedly less than 1 Ohm.
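To put rough numbers on that trade-off (all values hypothetical: a 5 Ohm source, a 10 Ohm parallel resistor, a 16 Ohm BA earphone):

```python
# Effect of a parallel ("swamping") resistor at the earphone plug.
def parallel(a, b):
    """Impedance of two resistances in parallel."""
    return a * b / (a + b)

# Hypothetical figures: 5 ohm source, 10 ohm resistor, 16 ohm BA earphone
z_src, r_par, z_ear = 5.0, 10.0, 16.0

z_seen_by_ear = parallel(z_src, r_par)  # ~3.3 ohms: lower, as intended
z_seen_by_src = parallel(r_par, z_ear)  # ~6.2 ohms: a much heavier load
```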
 