Converting pro audio signal (1.23v) to consumer line level (0.31v)?

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
I need to convert the 1.23volt output of my DBX Drive Rack PA (Digital EQ and crossover) to 0.31 volts line level that my amps require.

Can I do this with just a simple voltage divider circuit?
As per this schematic http://en.wikipedia.org/wiki/Image:Resistive_divider.png

Vout = R2 / (R1 + R2) * Vin

I want a Vout/Vin ratio of 0.31/1.23 = 0.25

So making R2 = 100 ohms gives a calculated value of R1 = 300 ohms.

Since it's the ratio that matters, can I select 1K ohm and 3K ohm resistors instead and get the same result?
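If it helps to sanity-check the arithmetic, here's the divider math in a few lines of Python (resistor names follow the Wikipedia schematic linked above):

```python
def divider_ratio(r1, r2):
    """Unloaded voltage divider: Vout = Vin * R2 / (R1 + R2)."""
    return r2 / (r1 + r2)

# 300 ohm series / 100 ohm shunt gives the 0.25 ratio...
print(divider_ratio(300, 100))    # 0.25
# ...and scaling both values by 10x keeps the same ratio
print(divider_ratio(3000, 1000))  # 0.25
```

One caveat: the amp's input impedance sits in parallel with R2, so the loaded ratio is slightly lower than the unloaded one. With a typical 10K+ amp input, 1K/3K values are barely affected; very large resistor values would be.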
 
No comments?

I understand that the above circuit will lower the line voltage by the correct amount. What I don't understand is how adding the resistors affects the input impedance of the power amps.

My understanding is that it'll be just fine, but I don't want to risk damaging my xover or amps. Sorry if this is a dumb question, but it's been a long, long time since I studied electronics. :)
 
No, it's a bit more complicated than that.

1st: it needs to be balanced, so a pi or T structure is needed. I like the double pi as it uses one less resistor. :rolleyes:

2nd: it needs to present the correct impedance each way. I'm guessing you're going from 110 ohms to 75 ohms.

3rd: it needs attenuation

So there's a bunch of Thevenin equivalents and parallel combinations to be computed.
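For the simplest matched case (same impedance on both sides, which is an assumption on my part, not the asymmetric 110-to-75 ohm situation above), the standard symmetric T-pad formulas can be sketched like this:

```python
import math

def t_pad(z0, atten_db):
    """Symmetric T attenuator matched to impedance z0 on both sides.
    Returns (series_r, series_r, shunt_r) in ohms."""
    k = 10 ** (atten_db / 20)           # voltage ratio of the attenuation
    r_series = z0 * (k - 1) / (k + 1)   # each series arm of the T
    r_shunt = 2 * z0 * k / (k * k - 1)  # shunt leg to ground
    return r_series, r_series, r_shunt

# e.g. a 12 dB pad in a 600 ohm line
print(t_pad(600, 12))
```

The unequal-impedance version adds an extra term under a square root; for line-level audio between a low-impedance output and a high-impedance amp input, strict impedance matching usually isn't necessary anyway.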
 
I am real curious as to why you feel you need to pad down the signal to your amps. Don't you have the ability to lower the output from the other devices ahead of the amplifier?

Most pro amplifiers require 0.775 volts to achieve full power. There are of course exceptions that need more or less. Most home gear falls in the 1-1.5 volt range.

Most amplifiers of any quality will have an input gain adjustment for each channel. Proper setup would be to adjust the gain so that every piece in the chain would clip at the same time.



Quote from your DBX manual.

Once you have found the clip point of your amplifiers, you can mark this position and turn the
amplifiers back up to the point where they are clipping. You can now use the output limiters
in the DriveRack PA to protect the amplifier from clipping no matter what you do at the console.

Please list your Power amplifiers so that I can check their specifications. There is absolutely no need for a pad of any type because the DBX has ample adjustment to compensate.
 
Thanks for the response guys!

Ian, I'm not sure what a pi or T structure is... I'll do some reading but what is the name of the circuit that performs the voltage reduction?

Burnedfinger:

Amps are as follows:

Lows (80-500 Hz): Crown XLS 402D driving Altec 515G's in a DIY horn
Mids (500 Hz-7 kHz): Trends Audio TA-10.1 Class D amp driving Altec 288 CD in Mantaray horns
Highs (7-18 kHz): Crown XLS 402D driving B&C DE-10/ME10 (IIRC) horn

Now I know what you're thinking... what a weird combination of amps... and why would anyone match up a 300 watt/ch amp with high-sensitivity, low-power-handling speakers... well, let's just say that I'm working with what I have (amp-wise) and, believe it or not, it works pretty well. :)

There is absolutely no need for a pad of any type because the DBX has ample adjustment to compensate.

The DBX has enough gain on the input side, but I'm having to internally (read: digitally?) reduce the gain of the hi/mid/low outputs in order to have an acceptable range of volume adjustment, which means I'm not using much of the 24-bit word length on the output D/A's, right? And yes, the gain controls on the Crown amps are set at their minimum.

BTW to be clear, this is a home theater system and not a PA application.

 
Lower the signal to the DBX. You can then raise the gain on the amps for a match. How are you time aligning the drivers?

I believe the input sens is 1.25V in for max power on the Crown according the their information. So this tells me the solution is to either lower the output of the DBX or lower the signal level on the input.
 
burnedfingers said:
Lower the signal to the DBX. You can then raise the gain on the amps for a match. How are you time aligning the drivers?

I believe the input sens is 1.25V in for max power on the Crown according the their information. So this tells me the solution is to either lower the output of the DBX or lower the signal level on the input.

I can lower the signal to the DBX via the volume control on the preamp, but I want both the input signal and subsequent A/D conversion to maximize the 24-bit word length. Same on the output of the DBX. At the moment the output LEDs barely light up the -30dBu light. Doesn't it make more sense to attenuate the line level signal after the D/A conversion?

For time alignment I'm using the DBX for the B&C HF horn which is now sitting in the mouth of the Altec midrange horn. The Tapped Horn sub isn't going through DBX but it's time aligned (it has a 15ft internal path length) via my AV preamp.
 
How hard are you hitting the input on the DBX?
Quote:
I can lower the signal to the DBX via the volume control on the preamp, but I want both the input signal and subsequent A/D conversion to maximize the 24-bit word length. Same on the output of the DBX. At the moment the output LEDs barely light up the -30dBu light. Doesn't it make more sense to attenuate the line level signal after the D/A conversion?

Lower the input to the DBX. Try to hit it at 0 db or maybe a little less. Raise the output level of the DBX enough so that you can level match your speakers.

No, it doesn't make any sense to attenuate the signal out of the DBX with a pad.

I have done a number of these in commercial applications in the past. I fully understand the sensitivity of the Altec drivers because I have worked with them in 100's of applications and I too have used overpowered amplifiers in a pinch.
 
burnedfingers said:
How hard are you hitting the input on the DBX?

Everything is OK on the input side of the DBX as it has a +4/-10dbu switch on the back. I'm hitting +10 to +15dbu on peaks.

burnedfingers said:
Lower the input to the DBX. Try to hit it at 0 db or maybe a little less. Raise the output level of the DBX enough so that you can level match your speakers.

Maybe I'm wrong, but the issue as far as I understand it is with the analogue-to-digital conversion and subsequent D/A conversion. The DBX needs to see enough voltage at the input, otherwise I'm just reducing it to an 8-bit processor (it's a 24-bit/48 kHz unit). The same goes for the output. The only gain control the DBX has is when configuring the individual crossover levels. So if the DBX is digitally reducing the gain, you'll lose a bunch of resolution during the final digital-to-analogue process, correct?

To be clear, I'm not having any problems setting the gain overall. The speakers are working correctly and are level matched. But the gain structure IMHO is wrong. I'm attenuating the signal at the output stage of the crossover, which is a no-no with digital because you effectively turn your 24-bit processor into an 8-bit processor. If the DBX were an analogue unit, this obviously wouldn't be a problem.

burnedfingers said:
Lower the input to the DBX. Try to hit it at 0 db or maybe a little less. Raise the output level of the DBX enough so that you can level match your speakers.

No, it doesn't make any sense to attenuate the signal out of the DBX with a pad.

If I lower the input to the DBX then I'm once again compromising the analog-to-digital process - I've tried it and it sounds grainy. Sure, if the DBX were an analogue x-over it wouldn't matter, but I don't understand this approach when using a digital unit. Am I missing something?

Once again, thanks a bunch for helping me out here. To summarize, it seems you're saying that I don't need another gain stage because the DBX has one built in. Sure, that'll work, but what I'm asking is, "won't that compromise the quality of the D/A process?"
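For what it's worth, the "8-bit" worry can be put in rough numbers: each bit of a converter is worth about 6.02 dB of dynamic range, so peaking 30 dB below full scale on a 24-bit output leaves roughly 19 effective bits. This is back-of-envelope arithmetic, not a real ENOB measurement:

```python
def effective_bits(total_bits, db_below_full_scale):
    """Rough effective resolution when the signal peaks db_below_full_scale
    under 0 dBFS: each bit is worth ~6.02 dB (20*log10(2))."""
    return total_bits - db_below_full_scale / 6.02

print(round(effective_bits(24, 30), 1))  # ~19 bits left
```

So it's nowhere near 8 bits, but those 30 dB of headroom are still resolution you paid for and aren't using.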
 
burnedfingers said:
Quote:

I'm hitting +10 to +15dbu on peak

I cannot help you if you are unwilling to listen.

BF, I did listen to your suggestion of turning down the gain on my pre-amp (so I'm hitting 0dBu) and increasing the gain on the DBX's outputs (which is done in the digital domain!) but it's a step backward. And yes, I have tried it and it sounded grainy (to be expected when you're only using 8 bits of your 24-bit A/D converter!), not to mention it did nothing to improve the output levels of the D/A converter since the overall gain is the same. They were still only hitting -30dBu.
 
Thanks for the suggestion Coloradosound, but they are a little out of my budget since I'd need three.

Is there any reason something like this wouldn't work? Obviously I'd need a balanced version with XLR connectors and different level of attenuation... so that's why I need to DIY.

http://www.parts-express.com/pe/showdetl.cfm?Partnumber=266-244
 
Follow imix500's advice. Build "U"-style pads (as per the page he linked).

1K, 1K, 1K will give -9.5dB

2.2K, 1K, 2.2K will give -14.6dB

2.2K, 470, 2.2K will give -20dB

2.2K, 220, 2.2K will give -26.3dB

Etc...

I used to do that with the Behringer DCX2496 in order to get better usage of the output dynamic range. The upper 20dB are seldom used with most amplifiers otherwise (and the limiters can't be set below -24dB). The resistors will fit easily inside XLR connectors if you have decent soldering skills (the ones from Neutrik already include a plastic sleeve to keep anything from touching the metal case).
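Double-checking those values (I'm assuming the classic balanced U-pad topology here: one series resistor per leg and a single shunt across the output pair, unloaded):

```python
import math

def u_pad_db(r_series, r_shunt):
    """Unloaded attenuation of a balanced U-pad: one series resistor in
    each leg and one shunt resistor across the output pair."""
    ratio = r_shunt / (2 * r_series + r_shunt)
    return 20 * math.log10(ratio)

for rs, rsh in [(1000, 1000), (2200, 1000), (2200, 470), (2200, 220)]:
    print(f"{rs} / {rsh} / {rs}: {u_pad_db(rs, rsh):.1f} dB")
```

This lands on -9.5, -14.6, -20.3 and -26.4 dB, i.e. close to the figures above; the small differences for the last two are presumably rounding or a load-impedance assumption. The amp's input impedance in parallel with the shunt resistor will add a little extra attenuation in practice.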
 
Thanks a bunch guys (and gal), that's exactly the information I was looking for :)

burnedfingers said:
Please post the 8 bit conversion information when not slamming the input of the device.

:confused: The DBX doesn't clip until it hits +20dBu according to the input LEDs. Unless I'm missing something, +10 to +15dBu on peaks is ideal, no? Everything I've read on the subject states that you want to set up your gain so you maximize the resolution of your A/D & D/A converters (without clipping). Most feel that you're even better off using a 6-channel volume control after the digital x-over's D/A conversion.

BTW, the reference to 8 bits was tongue in cheek. I'm sure it's more like 18 bits, but you get the idea. :)

Anyway, once again thanks for all the help, guys. Can't wait to see my output LEDs get past the -30dBu mark.
 
If you're going out of a -10 dBm unbalanced output on your pre into +4 dBm balanced inputs on the DBX, do you have enough input gain on the DBX to make up the 14 dB? This may be where the graininess is coming from. What kind of meters are on the DBX? Full scale (where the max meter reading is zero; this is where you run out of bits and headroom)? Then you need to know the operating level: what meter reading do you get when you input a +4 dBm (1.23 V??) signal? It should be around -20 dB full scale. This is also your headroom. When you feed the DBX from your pre, the signal on the meters should be bouncing around this level.
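On the level question, the dBu-to-volts conversion is easy to sanity-check (0 dBu is defined as 0.775 V RMS; strictly, dBm means power into 600 ohms, but in audio it's commonly used interchangeably with dBu):

```python
import math

DBU_REF = 0.775  # volts RMS at 0 dBu

def dbu_to_volts(dbu):
    return DBU_REF * 10 ** (dbu / 20)

def volts_to_dbu(volts):
    return 20 * math.log10(volts / DBU_REF)

print(round(dbu_to_volts(4), 2))     # +4 dBu ~ 1.23 V, the pro level in the title
print(round(dbu_to_volts(-10), 2))   # -10 dBu ~ 0.25 V
print(round(volts_to_dbu(0.31), 1))  # 0.31 V ~ -8 dBu
```

Note that consumer "-10" gear is usually specified in dBV (referenced to 1 V), which works out to about 0.316 V, matching the 0.31 V figure in the original question.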
 