Subwoofer circuit: Going from speaker level to line level

Greetings,
Most modern sub-woofers have both speaker level (high) and line level (low) inputs.

My sub does not have the speaker level inputs. This was not a problem while I was using a pre-amp in my system. But now that I've gone direct from the source, I need a way to include my sub in the system.

I was thinking about creating a circuit that would allow my sub to get its signal from the main amp.

But before I go any further, I'm wondering what the opinion here is on how complicated this might be and how difficult it would be to implement.

All I need is a circuit that takes me from the output level of my amp back down to a workable line level that my sub amp can handle.
Seems simple enough.
Any ideas?

Thanks!!
 
I'm by no means an expert but wouldn't it be more logical to take the signal from the amp input instead of the output?

Perhaps.
I am using a shunt attenuator connected at the back of the amp.
So in order for it to work, the signal would have to branch off between the shunt circuit and the amp.

I'm not certain how this would affect the various impedances.

As it is, my source output impedance is 5 ohms. The amp input impedance is 100 kohm. And the shunt circuit has a 2 kohm series resistor (technically, 1 k per balanced leg).
I haven't looked yet to see what the input impedance of the sub amp would be.

I'm just assuming that the input impedance of the sub amp would be a lot more benign in parallel with the speaker load. It also keeps as much out of the primary signal circuit as possible (though I may be naive in this thinking).
 
If your amp is not bridged (so that black = ground), you just need two resistors in series across the speaker output, with the output to the sub taken from the middle. This will divide the voltage from speaker level down to line level when the two resistors are chosen right. What is right is not critical when your sub has a volume control, so a ballpark figure to start with could be 10 kohm and 1 kohm, with the large value connected to speaker + and the small one to speaker ground. This will give you about 20 dB of attenuation. If you can't get your sub to play loud enough, move from 1 kohm to 2 kohm.

Take 1 watt resistors; that will keep you safe unless your amp delivers more than 1750 watts.
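
A quick sanity check on those numbers (just a sketch; the 8 ohm speaker impedance and 200 W amplifier power below are assumptions, not figures from this thread):

Code:
import math

r_top, r_bottom = 10_000.0, 1_000.0   # series and shunt legs of the divider
r_speaker = 8.0                        # assumed speaker impedance
p_amp = 200.0                          # assumed amplifier power into 8 ohms

# Divider attenuation, assuming the sub's input impedance is much larger than 1 k
ratio = r_bottom / (r_top + r_bottom)
print(f"attenuation: {20 * math.log10(ratio):.1f} dB")   # about -20.8 dB

# Worst-case dissipation in the 10 k resistor at full amplifier power
v_speaker = math.sqrt(p_amp * r_speaker)                  # RMS voltage at the speaker terminals
i_divider = v_speaker / (r_top + r_bottom)
print(f"series resistor dissipation: {i_divider**2 * r_top * 1000:.0f} mW")   # about 130 mW

At that power level the dissipation is tiny; the 1 W rating is simply headroom for much larger amplifiers.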

vac
 
This may help... it comes from Linkwitz.
 

Attachment: attenuation speaker to RCA.gif
But how does this affect the speaker that is still connected to the amp?
Are you intending to drive your main speakers and this circuit at the same time?

This does not affect the speaker still connected to the amp, because the speaker has a much lower resistance than the voltage-dividing network. For that reason, the resistors in this network should not be chosen too low. I feel Linkwitz is on the low side with his 1000/50 ohms.
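
To put a number on that (assuming an 8 ohm speaker, which isn't specified in the thread):

Code:
# Load seen by the amp: the speaker in parallel with the 11 k divider string
r_speaker, r_divider = 8.0, 11_000.0
r_load = 1 / (1 / r_speaker + 1 / r_divider)
print(f"{r_load:.3f} ohms")   # 7.994 ohms; essentially unchanged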

vac
 

PRR

If your line-level source is really as low as 5 ohms, there's a 24 cent solution:

Attachment: sP26h.gif


The downside is that if the source is 500 ohms, crosstalk between the main channels becomes -26 dB, which is pretty lame. (At a 5 ohm source you have -66 dB.)

This also assumes your subwoofer amp's input impedance is not small compared to ~5K.
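
Roughly where those crosstalk figures come from (a sketch; it assumes the attached circuit is just one 4.7 k resistor from each channel's output to a common node feeding the sub, and it ignores the extra loading of the sub input itself):

Code:
import math

def crosstalk_db(r_source, r_mix=4_700.0):
    # Signal from one channel reaches the other channel's output terminal
    # through both mixing resistors and is divided against that channel's
    # source impedance.
    ratio = r_source / (r_source + 2 * r_mix)
    return 20 * math.log10(ratio)

print(f"{crosstalk_db(5):.1f} dB")     # about -65.5 dB with a 5 ohm source
print(f"{crosstalk_db(500):.1f} dB")   # about -25.9 dB with a 500 ohm source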
 
This does not affect the speaker still connected to the amp, because the speaker has a much lower resistance than the voltage-dividing network. For that reason, the resistors in this network should not be chosen too low. I feel Linkwitz is on the low side with his 1000/50 ohms.

vac

So adding this circuit to the amplifier, as well as the existing speaker, will not affect the sound?
 
No, but now that I look at PRR's post, there might be a complication I overlooked: do you also need to combine the two stereo channels into one?

P.S. I am afraid that PRR's solution would not attenuate the signal by much; it would be dependent on the input impedance of the subwoofer input, which is likely to be >10 kohm. You would need to tie the output of the two 4.7 k resistors to ground with something like a 470 ohm resistor in order to create a voltage divider that would work in this situation.
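
For a rough idea of what that shunt does (a sketch, assuming roughly 5 ohm source impedances and a sub input impedance high enough to ignore):

Code:
import math

r_mix, r_shunt, r_source = 4_700.0, 470.0, 5.0

# With one channel driven, the other channel's path (4.7 k plus its source
# impedance) also hangs on the summing node and acts as part of the shunt.
shunt = 1 / (1 / r_shunt + 1 / (r_mix + r_source))
gain = shunt / (r_mix + r_source + shunt)
print(f"{20 * math.log10(gain):.1f} dB per channel")   # about -21.6 dB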

vac
 
The ones that are for sale have a mini transformer (or two), a resistor, and a pot; not that many parts.

I used one in an AV booth for a vintage Super 8 projector: all it had for sound was a ¼” phono jack for a speaker (in the cover), and I needed to send that to the mixer/main speakers in the theater at line level.

I've also used them in cars/trucks, connected to the rear speakers and then to the amp/subwoofer, e.g. when keeping the factory radio.

I'm on board for building things; just to see how they work.
 
I'm not sure I'd want to connect anything to my amp output other than my speakers. In any case, wouldn't it be better to find a point to take a pre-out from, rather than amplify and then attenuate?

It shouldn't be too hard to create additional pre-outs after the volume control. I did this with a Yamaha AV amp recently for the front outs into my main system. The principle is the same.
 
Thanks for all the help, everyone.

@vacuphile... Yes, they actually do combine into one signal inside the subwoofer amp. Is this a problem?

I like the idea of taking the signal from the output of the amp because it simplifies the overall volume control of the system. The only other place where I could use a single volume control would be between the shunt attenuation circuit I am currently using for overall volume control and the main amp. But that might be a significant problem for two reasons: 1) adding the sub amp into the equation affects the impedances in a way I am unable to ascertain, given that I do not know the input impedance of the sub amp; 2) more importantly, the sub amp's input sensitivity is much lower than that of my main amp. I ran into this problem when I was still using a pre-amp.

So taking the signal from the output of the main amp leaves me plenty of room to work given that it will be the highest point of both voltage and current.

I figured that a simple L-pad circuit would be the solution. I was a little foggy on the best wattage rating and relative values for the resistors.
Using the 10k series and 1k parallel values gives me -20 dB. Since my main amp is +26 dB, this should give me the necessary room to accommodate the lower sensitivity of the sub amp.
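
Putting the gain structure together (a sketch using the +26 dB amp gain mentioned above and the 10k/1k divider from earlier in the thread):

Code:
import math

amp_gain_db = 26.0                                   # main amp voltage gain
pad_db = 20 * math.log10(1_000 / (10_000 + 1_000))   # 10k/1k divider, about -20.8 dB
print(f"level at the sub input: {amp_gain_db + pad_db:+.1f} dB re. line level")   # about +5.2 dB

So the sub amp would see a few dB more than straight line level, which is the extra drive needed for its lower sensitivity.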

So, before I actually employ this idea, I'm guessing that the circuit in post #14 from Original Burnedfingers is the most simple and workable solution.

Any other thoughts, concerns, or stop signs before I give it a go??
 
I don't follow this logic.

Signal - preamp with volume control - fixed-gain power amp - main speakers and voltage divider - power amplifier - subwoofer speaker.

Why would you run through a power amp and then attempt to convert back to line level, in parallel with your main speakers, to run through another amp on the sub?

Why not look at splitting a buffered output from the preamp stage after the volume control???
 
Thanks for all the help, everyone.

@vacuphile... Yes, they actually do combine into one signal inside the subwoofer amp. Is this a problem?

I like the idea of taking the signal from the output of the amp because it simplifies the overall volume control of the system. The only other place where I could use a single volume control would be between the shunt attenuation circuit I am currently using for overall volume control and the main amp. But that might be a significant problem for two reasons: 1) adding the sub amp into the equation affects the impedances in a way I am unable to ascertain, given that I do not know the input impedance of the sub amp; 2) more importantly, the sub amp's input sensitivity is much lower than that of my main amp. I ran into this problem when I was still using a pre-amp.

So taking the signal from the output of the main amp leaves me plenty of room to work given that it will be the highest point of both voltage and current.

I figured that a simple L-pad circuit would be the solution. I was a little foggy on the best wattage rating and relative values for the resistors.
Using the 10k series and 1k parallel values gives me -20 dB. Since my main amp is +26 dB, this should give me the necessary room to accommodate the lower sensitivity of the sub amp.

So, before I actually employ this idea, I'm guessing that the circuit in post #14 from Original Burnedfingers is the most simple and workable solution.

Any other thoughts, concerns, or stop signs before I give it a go??

Yes, it is the same as I suggested in my earlier post, but neatly drawn out. Just don't use 1/4 W resistors; use 1 W, just to be on the safe side.

vac
 