Good Line Preamplifier, 600 Ohm load?

Looking at many line preamplifiers, they are specified to drive '600 Ohm'. But is this really necessary? I would say it is enough to be able to drive something like 2k Ohm. Why should a preamplifier be able to do 600 Ohm? What are your thoughts on this? What would you say about preamplifier load? What should a good preamplifier be able to do?

(I have a project going to make a good line preamplifier.)
 
I don't think driving 600ohm is necessary. It's more a statement that the output stage is more than man enough for the job. Overkill tends to be desirable in audio.

Nowadays there are a few class-D amps with lowish input impedances (2k-3k or so), so I'd want to be sure I could drive those. If the drive voltage is just 1 V RMS (a fairly typical power-amp input sensitivity) then even 600 ohms needs no more than 2.5 mA peak, not too hard to provide.
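
As a rough sanity check (assuming a sine wave into a purely resistive load; the function name and values below are just for illustration):

Code:
import math

def peak_current_ma(v_rms, r_load_ohm):
    """Peak current (mA) for a sine of v_rms volts RMS into a resistive load (ohms)."""
    return v_rms * math.sqrt(2) / r_load_ohm * 1e3

print(round(peak_current_ma(1.0, 600), 2))   # ~2.36 mA -- the "no more than 2.5 mA" figure
print(round(peak_current_ma(2.0, 600), 2))   # ~4.71 mA for a hotter 2 V RMS source
print(round(peak_current_ma(1.0, 2000), 2))  # ~0.71 mA into a 2k class-D input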
 
But is this really necessary?


The lower the current swing in the output stage, the lower the distortion. Otherwise you need to provide excessive OLG (open-loop gain) around the output stage.

I have a project going to make a good line preamplifier.


IMG_6511.JPG
Or just buffer the output of a high-GBW op-amp with a simple EF (emitter follower) having an appropriate current source at the load.

With a 2k input resistance it is easy to provide at least ~80 dB of attenuation range with a 10k linear pot in front of it:
IMG_5322.JPG
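
As a quick check of that figure (using the 10k pot and 2k input values from above; the helper name is just illustrative), the loaded divider behaves like this, with the grounded end stop supplying the rest of the range:

Code:
import math

def pot_output_db(pos, r_pot=10e3, r_in=2e3):
    """Attenuation (dB) of a linear pot at wiper position pos (0..1, bottom..top), loaded by r_in. pos must be > 0."""
    lower = pos * r_pot                       # pot section below the wiper
    upper = (1.0 - pos) * r_pot               # pot section above the wiper
    shunt = lower * r_in / (lower + r_in)     # lower section in parallel with the 2k input
    return 20 * math.log10(shunt / (upper + shunt))

for pos in (1.0, 0.5, 0.01, 0.001):
    print(pos, round(pot_output_db(pos), 1))  # 0.0, -13.1, -40.4, -60.0 dB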
 
The '600 Ohm' standard exists for historical reasons from telecoms and the dBm definition etc. Not really relevant now unless you have a particular requirement. I would generally want to be able to go into the front end of a mic preamp. That might have an input Z around 2K or so - I'd take a design value of 1K5 or so.
 

PRR

Member
Joined 2003
Paid Member
> want to be able to go into the front end of a mic preamp.

But never straight from a line-level?!

A 20dB pad can be devised to show over 5K to the source.
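
For illustration (these pad values are my own pick, not anything standard): a 5.6k series resistor with a 620R shunt gives roughly 20 dB and presents well over 5K to the source, even into the ~2k mic-pre input mentioned above.

Code:
import math

r_series, r_shunt = 5.6e3, 620.0                      # illustrative pad values
r_load = 2.0e3                                        # a mic-pre input around 2k

shunt_loaded = r_shunt * r_load / (r_shunt + r_load)  # shunt leg in parallel with the load
atten_db = 20 * math.log10(shunt_loaded / (r_series + shunt_loaded))
z_seen_by_source = r_series + shunt_loaded

print(round(atten_db, 1), round(z_seen_by_source))    # about -22.2 dB, about 6073 ohms

Into a high-impedance load the same pad sits right at -20 dB; the 2k input pulls it a couple of dB further down.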

600 Ohms is still an option for some test gear.

The '5532 chip and friends will drive <600r, so a preamp-maker may as well use that in the specs.

Many modern power amps and other interfaces have input impedance as low as 10K. A monster system might drive a dozen power amps in parallel (unlikely, but it could happen). The resulting 800r load is still OK for a 600r output.

Transformers suck energy beyond their nominal impedances, especially in the bass. While 10K:10K would be fine for most hi-fi interfaces, 600:600 lumps are more common. Cheap "600:600" iron may be well below 600r below 50Hz (and is not the best choice for fidelity).

> a good line preamplifier

Stated JUST like that, you intend to drive "lines", miles of wire, which does imply some grunt. However, in hi-fi we rarely face more than 30' (10m) of "line". While this may strain a small plate, it is easy for most ample outputs. Line inputs are mostly >=10K, and you might drive a few, or just one. Your math seems fine to me.
 
It makes a difference who the preamplifier is made for. If the preamplifier is made for the market, it is good to have 600 Ohm capability. If I build an amplifier for myself, I know the need and can design for that only.

It is also a question of output voltage.
It is harder to drive 4 Volt RMS across 600 Ohm than 2 Volt RMS.
What results is a current, and some opamps have limited current output.
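
Rough numbers, assuming sine drive: 2 V RMS into 600 Ohm is about 3.3 mA RMS (roughly 4.7 mA peak), while 4 V RMS into 600 Ohm is about 6.7 mA RMS (roughly 9.4 mA peak), which starts to eat into the output current rating of some opamps.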
 
Founder of XSA-Labs
Joined 2012
Paid Member
600ohms is needed to match impedance of balanced lines that are needed in professional audio where mics are on stage and mixer engineer is hundreds of feet back where audience is. Those cables and gear are 600ohms to keep noise low over long runs. Nowadays they use remote wireless mics. However, there is still a need to connect long XLR patch cables between gear, and all those are designed to 600ohms.

Dedicated 600ohm balanced line driver chips are ideal for applications where you really need to push long cables.
 
600 ohms is a historical artifact of the early telecom days. Back then, there was no such thing as an amplifier. The energy to drive the speaker at the receiving end came entirely from the microphone. This meant the maximum power transfer theorem came into play, and it became important to match the impedances at each end. But what impedance to use? Also back then, there was no telecom network, so early telecom traffic utilized the comparatively extensive telegraph network. In order to control transmission line reflections caused by long connections, it was necessary to match impedances to the characteristic impedance of a telegraph line. Such lines were typically 6 gauge (I think) wire strung vertically 12 inches apart on a wooden pole, and the characteristic impedance of such is right about 600 ohms. That’s where it all began, and the audio and telecom industries have been saddled with it ever since.
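
For what it's worth, the standard approximation for an open-wire pair, Z0 ≈ 276·log10(2D/d), lands right on that figure if you take those numbers at face value: with 12-inch spacing and 6-gauge wire (d ≈ 0.162 in), Z0 ≈ 276·log10(24/0.162) ≈ 276·2.17 ≈ 600 ohms.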

These days, signal transfer has become more important than power transfer, and a low impedance transmitter working into a high impedance receiver has become standard for just about everything except for runs long enough to create reflections.

That said, 600 ohms is still frequently used as an indication of drive capability, and has become a de facto measuring stick.
 
I’m working from memory here, but a 20 kHz signal has a wavelength of about 20,000 feet. The rule of thumb I was taught is that transmission line effects can be ignored until the run length exceeds 10% of the wavelength, or about 2,000 ft. Even most pro audio runs are less than that, so I’m not sure even the pro audio world cares much about 600 ohms, except as an arcane figure of merit.
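
Roughly, assuming a velocity factor of around 0.4 for an old telephone pair: wavelength ≈ 0.4 × (3×10^8 m/s) / 20 kHz ≈ 6 km ≈ 20,000 ft, so the 10% rule puts transmission-line territory at runs of about 2,000 ft and up.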
 
Yep! I was just checking up on VF for common TLs ...
Velocity factor - Wikipedia
So 20K ft is probably not far off for telecoms twisted pair, assuming it's like poor-quality network cable.

In amateur radio, we use the 10% rule all the time. Anything longer than that and we start using matching networks (ATUs) and/or open ladder line, mainly to avoid power losses and blowing up transmitters!
 

PRR

Member
Joined 2003
Paid Member
"600" is indeed an artifact. It solidified around the VU Meter spec, ~~1938.

Before that, meter signals were generally transformed to 500 Ohms and "0" was 6mW, not 1mW.

While the first transcontinental line ran near 600 Ohms, with the rise of amplifiers and rise of demand the telco switched to twisted-pair cable. As you know from ethernet, these always run near 100 Ohms over long runs. Indeed much work was done at nominal 150 Ohms, easily transformed to 600 with standard coils.

Yes, reflections are a problem out at many miles. Especially for duplex (talk both ways on one pair). However the line impedance is distinctly non-resistive. When interfacing such lines you have a complex termination, adjusted on site (and sometimes auto-adjusted with temperature).

Modern practice over "Professional Audio" lengths is almost always to drive with ~60 Ohms and load high-Z. Terminating only at the source end kills the first reflection. Many hi-Z loads may be Y-ed onto one source if needed. Level is about the same if a true-600 load gets onto the line. Look inside most semi-pro boxes: the source is a couple of opamps and a couple of 47r resistors (60, 94, what's the difference?). Sadly the hi-Z loading is often violated with a 1-opamp diff-amp full of 10K resistors, presenting different impedance in different modes.

Also to my mind, "600 Ohms" implies "+18dBm". Yes, in-console mike amps were often lower, and clearly specced, program amps higher. But generally peak levels were +8dBm for live speech and +4dBm for recording work. Taking 10dB lead for ephemeral speech and 14dB lead for re-recording, +18dBm is a minimum (>+24dBm is common). An old Dynaco could "drive 600r", but at -24dBm with resistor pad, -3dBm with transformer; these are not "pro" levels.
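
For reference, using standard dBm arithmetic (0 dBm = 1 mW): into 600 Ohms, +4 dBm is about 1.23 V RMS, +18 dBm about 6.2 V RMS, and +24 dBm about 12.3 V RMS, while -24 dBm is only about 49 mV RMS, nowhere near "pro" level.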
 
600ohms is needed to match impedance of balanced lines that are needed in professional audio where mics are on stage and mixer engineer is hundreds of feet back where audience is. Those cables and gear are 600ohms to keep noise low over long runs.

Impedance Matching is not an issue. There is nothing '600 Ohms' about the multicore audio cabling typically used for live analogue audio.
 
> want to be able to go into the front end of a mic preamp.

But never straight from a line-level?!

A 20dB pad can be devised to show over 5K to the source.

600 Ohms is still an option for some test gear.

The '5532 chip and friends will drive <600r, so a preamp-maker may as well use that in the specs.

Many modern power amps and other interfaces have input impedance as low as 10K. A monster system might drive a dozen power amps in parallel (unlikely, but it could happen). The resulting 800r load is still OK for a 600r output.

Well I guess it depends on what the OP means by 'Line Level'. I wasn't thinking that it would necessarily be the typical standard of +4dBu nominal / +21dBu peak or so. But yes - unlikely, but I'd want a design to be able to go into a mic pre without falling flat on its face.
As you say, a 5534/2 (or similar) can handle 600R, so it can be spec'd at that. Although it might be 'pushing it' if the same op amp stage is also driving lowish-impedance feedback network resistors.
My only real concern with '600 Ohm' specs is that it sometimes leads to confusion wrt the need for impedance matching.
600R test gear - yes indeed - although I'd class that as a 'particular requirement'.
Cheers.
 
Founder of XSA-Labs
Joined 2012
Paid Member
Impedance Matching is not an issue. There is nothing '600 Ohms' about the multicore audio cabling typically used for live analogue audio.

Cables for professional applications with runs of hundreds of feet are indeed 600 ohms. It makes a difference for long runs if you want the best propagation and minimal reflections and losses.

The value of 600 ohms is not magical; it just needs to be consistent. RF cables are often 50 ohms, and a 51 ohm resistor is used as a terminator to prevent reflections.
 