Simple. It's "current." What, are you stupid or something? 😀
se
Must be. 😱
Perhaps. But there is something interesting about it: it works. If you don't want to pay for a really expensive cable that offers high performance, going low impedance balanced will do all of what the expensive cables can do without the cost... as long as the preamp supports the standard.
No, it's not.
The 600 ohm "standard" is an irrelevant, antiquated throwback to telegraphy which found its way into telephony when early telephone lines used old telegraph lines, and from there into early broadcast and recording systems.
And the professional world began casting it off decades ago; it isn't taken seriously by anyone serious anymore.
se
Hello. I was not being deliberately terse; I simply didn't think it needed a longer answer, because your method seems to make sense. It looks as if you have far too much gain in the system, and your method seems a reasonable way to deal with that.
One other possible way would be to balance the output of the DAC against the level setting of the pre, so that the pre is not used at the low end of the volume control (potentiometers are notoriously poorly matched at the low end; not normally an issue with stepped attenuators).
Also, if you end up using the preamp volume control (of an active pre with the conventional volume-control-first design) at the low end of its range, you still get the full background noise of the circuit. This is because the incoming signal is attenuated and the subsequent amp noise is not. In practice, however, you may never notice it.
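To put numbers on that, here is a minimal sketch (Python; the signal level, amp input-referred noise, and gain are assumed illustrative values, not measurements of any particular pre) of why attenuation ahead of the gain stage costs signal-to-noise dB-for-dB, while attenuation after it does not:

```python
import math

def db(ratio: float) -> float:
    """Convert a voltage ratio to decibels."""
    return 20 * math.log10(ratio)

v_signal = 2.0            # source level, V rms (assumed)
v_noise = 5e-6            # amp input-referred noise, V rms (assumed)
gain = 10.0               # active stage voltage gain (assumed)
atten = 10 ** (-40 / 20)  # a -40 dB volume setting

# Volume-control-first: the signal is attenuated, the amp noise is not.
snr_first = db((v_signal * atten * gain) / (v_noise * gain))

# Attenuator after the gain stage: signal and noise are cut together.
snr_last = db((v_signal * gain * atten) / (v_noise * gain * atten))

print(f"volume-first SNR:    {snr_first:.0f} dB")  # ~72 dB
print(f"attenuator-last SNR: {snr_last:.0f} dB")   # ~112 dB
```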
Perhaps it seems as though we are off-topic, but we're not.
The original post concerns pre amps which eventually connect to power amps - in traditional terms.
Even so, in many cases we have pre amp stages connected to pre amp stages, as in CD out to pre amp box, mic pre out to console fader in, etc.
It is important to discuss power amps and interface details.
Obtaining excellent sonic performance between pre and power amps, even when all interface details are optimized, still requires proper gain structure/staging, right on through to the speaker.
Regarding the pre I will use:
Gain is 0, +3, +6, or +9.5 dB. At 0 gain and no attenuation it's +/- 0 dB through; at +3 gain and no attenuation it's +3 dB through, and so on.
It uses a precision stepped-resistor attenuator, and the attenuator comes after the gain stage: the opposite of the volume-control-first arrangement you described above.
So I don't have too much gain; I only have too much if I want it, because at 0 dB through I still have the attenuation control, and at any gain setting. This is a good arrangement.
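As a sanity check on the arithmetic of that arrangement, a minimal sketch (Python; the gain steps are the ones listed above, the input level is an assumed example):

```python
GAIN_STEPS_DB = (0, 3, 6, 9.5)  # the preamp's gain settings, from above

def output_level_db(input_db: float, gain_db: float, atten_db: float) -> float:
    """Net level through the pre: gain stage first, stepped attenuator after."""
    assert gain_db in GAIN_STEPS_DB
    return input_db + gain_db - atten_db

# 0 dB gain, no attenuation: unity through.
print(output_level_db(-10.0, 0, 0))      # -10.0
# +3 dB gain, no attenuation: +3 through.
print(output_level_db(-10.0, 3, 0))      # -7.0
# +9.5 dB gain can still be trimmed back down by the attenuator.
print(output_level_db(-10.0, 9.5, 9.5))  # -10.0
```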
So at what level, and with what kind of music signal, should a DAC's output be set? Whatever it is, there must be an optimum if the DAC has properly designed variable outs.
That optimum level should cover a fairly wide range, though, so it can be adjusted to accommodate the optimum input levels of the pre amp, which in turn should have a wide range of optimum output levels for the amp, and so on to the monitors.
Without proper metering on the face of the devices, one would have to have electrical performance data for all the equipment and test from there.
In pro audio, it's normal to have this kind of gain knowledge, although with some devices the implementation is not very good.
Perhaps. But there is something interesting about it: it works.
Depends how you define "works."
Photographs in your freezer "works."
So quite literally anything that anyone cares to lay claim to "works."
And did it escape your notice that going low impedance ultimately puts you even closer to the situation that exists between an amplifier and loudspeaker?
se
Can anyone build a variable quality control knob(s)?
Several years ago, I had the opportunity to discuss the DSP implementation of such an idea.
Some smart folks they were, having built the Sony Oxford mixing console, digital reverberators and the like.
In very brief terms, a musicological control interface.
I just want to submit this idea here, as it relates to the possibility of doing something new that musically improves standard audio devices and interfaces, analog or digital.
I think it's best that I move this to a new topic.
So which is better, SS or tubes .........?
This is a useless question. A good design engineer, who knows what he is doing, can build a very good SS (pre)amp as well as a very good tube-based (pre)amp. The marketplace abounds.
Someone who doesn't know his stuff can build rotten SS (pre)amps and rotten tube-based (pre)amps. The marketplace abounds.
😉
jd
This is a useless question.
No it isn't.
The answer is simply... "yes." 😀
se
Depends how you define "works."
'Works' in this case: take two balanced line interconnects, 25 feet long. The 'control' is the cheapest cable you can find, Horizon or similar, common studio cable. The other can be a choice of high end balanced cables: Audioquest, Purist, Tara Labs Zero or similar.
1st test: balanced line passive control, 50K resistance. At most volume settings, audible differences between an expensive cable and the control can be heard and measured; the expensive cables will easily win out.
2nd test: balanced line stage that supports the 600 ohm standard, with 600 ohm termination at the amp end. No audible differences between cables will be heard and no measurable differences.
'Measurable' = frequency response test with calibrated microphone from the listening chair.
And did it escape your notice that going low impedance ultimately puts you even closer to the situation that exists between an amplifier and loudspeaker?
Yes, I did. It takes power to drive 600 ohms, so the line stage I use, like any other tube device that supports the 600 ohm standard (Ampex 351 electronics, for example), is a scaled-down power amp with line-stage-style voltage gain.
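For a sense of scale, a minimal sketch (Python; the +4 dBu nominal level and 20 dB of headroom are assumed illustrative figures) of the power a 600 ohm termination demands compared to a 10k bridging input:

```python
def dbu_to_vrms(dbu: float) -> float:
    """0 dBu is 0.7746 V rms (the voltage giving 1 mW into 600 ohms)."""
    return 0.7746 * 10 ** (dbu / 20)

v_nominal = dbu_to_vrms(4)    # +4 dBu nominal pro line level (assumed)
v_peak = dbu_to_vrms(4 + 20)  # with 20 dB of headroom (assumed)

for load_ohms in (600, 10_000):
    p_nominal_mw = v_nominal ** 2 / load_ohms * 1e3
    p_peak_mw = v_peak ** 2 / load_ohms * 1e3
    print(f"{load_ohms:>6} ohm load: {p_nominal_mw:.2f} mW nominal, "
          f"{p_peak_mw:.0f} mW on peaks")

# 600 ohms: ~2.5 mW nominal and ~250 mW on peaks, real power for a line
# stage; the 10k bridging input asks for roughly 17 times less.
```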
First, please describe how you kept the listening tests double blind.
Second, why on earth are you comparing to a 50k ohm potentiometer (i.e. "passive" preamp) driving 25 feet of cable? This was about a preamp driving a cable into a 600 ohm load versus a much higher load such as 10k or more.
Third, why on earth are you measuring frequency response using a microphone measuring from the speaker output? Why don't you measure frequency response at the load?
Fourth, wouldn't a more appropriate test be to keep everything else the same EXCEPT for the load impedance you're driving through the cable?
se
First, please describe how you kept the listening tests double blind.
There is no need with objective tests, but I have yet to be convinced that double blind tests are worth anything, due to the interconnectivity issues required, and this **is** about interconnectivity. Having proved that the 600 ohm termination works, though, it should be possible to design a rig that can overcome the switching and impedance issues introduced by blind tests, but that's for another day.
Second, why on earth are you comparing to a 50k ohm potentiometer (i.e. "passive" preamp) driving 25 feet of cable? This was about a preamp driving a cable into a 600 ohm load versus a much higher load such as 10k or more.
This test suits that nicely: it's about a preamp driving a cable into a 600 ohm load versus a much higher load such as 10k or more. We are using the same source, a phono preamp, so if the passive guys were right then the passive system should have beaten out the line stage easily. It didn't. Of course we can argue that the phono can't drive cables that long, blah blah, but that is rather the point of it: a line stage should control the cable, IOW make the artifacts of the cable inaudible. It's just like the long speaker cable test I mentioned earlier: do it in a way that you can get easily repeatable results.
Third, why on earth are you measuring frequency response using a microphone measuring from the speaker output? Why don't you measure frequency response at the load?
Since we can hear it, can we measure it? Well, yes, apparently we can 😀
Fourth, wouldn't a more appropriate test be to keep everything else the same EXCEPT for the load impedance you're driving through the cable?
No. The point is to demonstrate whether or not the 600 ohm standard is worth a hoot with a line stage that supports the standard, as opposed to keeping things as simple as possible (passive) and no standard. This is what audiophiles do all the time, and you can see it on this thread. Turns out, the passive deal does not work; it does not stand a chance against a line stage that gets rid of cable coloration, for no reason other than that all cables will exhibit coloration. As I have mentioned previously, **this is why the balanced line system was created**. This is why Mercury could park their recording truck behind Northrop Auditorium in Minneapolis, run their mic cables 250 feet (the recorders were permanently mounted in the truck) and get recordings that are highly respected to this day. This is why all LPs made in the 50s and 60s can rival the best of what is made today: the cables in their systems had no artifact, despite being in many cases well over 100 feet.
I have produced a fair number of recordings myself, and several of them have required very long interconnects. In one case they were about 250 feet. In another case (in the same hall) I was faced with either running long speaker cables or running long interconnects for a PA. I tried both, and the difference in vocal intelligibility was profound when we went to the balanced lines instead of long speaker cables. But I have also done that test at home, many times. It's easy to prove.
If the driving equipment is not up to the task, you cannot conduct the test; you need a line stage with that ability. So the test is only interested in audible differences between cables that are easily heard and measured, and it does that. At the same time it shows that an active line stage, if built right, can beat the pants off a passive, while the differences between the cheapest and most expensive cables completely disappear. For real. This is not something that cable manufacturers like to hear. They think it puts them out of a job.
This is something that anyone who does this test can say with conviction. You should try it.
There is no need with objective tests, but I have yet to be convinced that double blind tests are worth anything, [snip].
Well, yes, but your whole argument hinges on the fact that there is an audible difference. Which still hasn't been established (except your statement that you can clearly hear it).
And 'measuring' cable differences with a mic in front of the speaker, are you kidding?
jd
This test suits that nicely: it's about a preamp driving a cable into a 600 ohm load versus a much higher load such as 10k or more.
No, it's about driving a cable from a low impedance source into a high impedance load versus driving a cable from a 600 ohm source into a 600 ohm load.
The output impedance of a 50k potentiometer will vary from approximately the output impedance of the source driving it to upwards of 12.5k ohms.
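A minimal sketch of the arithmetic behind both points (Python; the 30 pF/ft cable capacitance is an assumed typical figure, and the source feeding the pot is assumed to be negligibly low impedance):

```python
import math

R_POT = 50e3  # 50k potentiometer

def pot_output_impedance(position: float, r_source: float = 0.0) -> float:
    """Thevenin output impedance of a pot at wiper position 0..1.
    The upper arm (plus the source impedance) parallels the lower arm."""
    r_top = (1 - position) * R_POT + r_source
    r_bottom = position * R_POT
    return r_top * r_bottom / (r_top + r_bottom)

# Worst case is mid-travel: R_POT / 4 = 12.5k.
z_out = pot_output_impedance(0.5)

# That impedance against the cable capacitance forms a first-order low-pass.
c_cable = 25 * 30e-12  # 25 ft at an assumed 30 pF/ft = 750 pF
f_corner = 1 / (2 * math.pi * z_out * c_cable)
print(f"Zout at mid-travel: {z_out/1e3:.1f} k, corner: {f_corner/1e3:.1f} kHz")
# ~12.5 k and ~17 kHz: the treble rolloff lands inside the audio band.
```

With a 100 ohm source driving the same cable, the corner moves above 2 MHz; that is the voltage-matching argument in numbers.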
We are using the same source, a phono preamp, so if the passive guys were right then the passive system should have beaten out the line stage easily.
None of the "passive guys" would EVER recommend a passive for driving 25 feet of cable. It's absolutely ludicrous to use a 50k passive to make your point about 600 ohm loads.
No. The point is to demonstrate whether or not the 600 ohm standard is worth a hoot with a line stage that supports the standard, as opposed to keeping things as simple as possible (passive) and no standard.
Again, it's an absolutely ludicrous comparison.
And with respect to OUR discussion, it had to do with using a passive attenuator as the LOAD, i.e. mounted inside the amplifier.
This is what audiophiles do all the time and you can see it on this thread. Turns out, the passive deal does not work, does not stand a chance against a line stage that gets rid of cable coloration, and for no reason other than all cables will exhibit coloration.
No, audiophiles are NOT driving 25 feet of cable with passive preamps.
As I have mentioned previously, **this is why the balanced line system was created**.
No. The balanced line system was created for NOISE REJECTION.
This is why Mercury could park their recording truck behind Northrop Auditorium in Minneapolis, run their mic cables 250 feet (the recorders were permanently mounted in the truck) and get recordings that are highly respected to this day. This is why all LPs made in the 50s and 60s can rival the best of what is made today: the cables in their systems had no artifact, despite being in many cases well over 100 feet.
The reason why the professional world went from 600 ohm matched impedances to low source/high load impedance was to MINIMIZE the effects of cables.
And just so you know, the "600 ohm standard" WASN'T just the load impedance. It was also the SOURCE impedance. The 600 ohm system was an IMPEDANCE MATCHING system.
But in audio, the goal isn't to transfer maximum POWER as it was for telegraphy and telephony. The goal is to transfer maximum SIGNAL. Which in this case is VOLTAGE. So the professional audio world eventually ditched the IMPEDANCE MATCHING system in favor of a VOLTAGE MATCHING system, which is characterized by low source impedance and high load impedance.
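A minimal sketch of that distinction (Python; the 100 ohm source and 10k load are assumed typical voltage-matching values):

```python
import math

def v_transfer_db(r_source: float, r_load: float) -> float:
    """Voltage divider loss (dB) from source resistance into load resistance."""
    return 20 * math.log10(r_load / (r_source + r_load))

def p_load_watts(r_source: float, r_load: float, v_source: float = 1.0) -> float:
    """Power delivered to the load for a given open-circuit source voltage."""
    v_load = v_source * r_load / (r_source + r_load)
    return v_load ** 2 / r_load

# Impedance matching (600 -> 600) maximizes POWER into the load...
print(v_transfer_db(600, 600), p_load_watts(600, 600))        # -6.02 dB, 417 uW
# ...but bridging the same source loses power while gaining VOLTAGE:
print(v_transfer_db(600, 10_000), p_load_watts(600, 10_000))  # -0.51 dB, 89 uW
# Voltage matching (100 -> 10k) delivers nearly all the signal voltage:
print(v_transfer_db(100, 10_000))                             # -0.086 dB
```

Power into the load really does peak at Rload = Rsource, which was exactly the telephony goal and is exactly not the audio goal.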
By dropping the load impedance down to 600 ohms, you make the cables even MORE of an issue than they would be otherwise.
se
But in audio, the goal isn't to transfer maximum POWER as it was for telegraphy and telephony. The goal is to transfer maximum SIGNAL. Which in this case is VOLTAGE. So the professional audio world eventually ditched the IMPEDANCE MATCHING system in favor of a VOLTAGE MATCHING system, which is characterized by low source impedance and high load impedance.
No, no, and no.
The professional audio world had to abandon power matching because the output and input resistances of semiconductor devices are more non-linear than those of vacuum tubes. That means power losses were accepted as a trade-off to minimize the non-linear distortion caused by the new active devices. But the professional audio world still uses balanced ins/outs, and even transformers in high end studio gear.
Also, when professional audio equipment went mass-market, transformers were eliminated to cut costs. But the top end of audio gear still uses them.
But balanced connections are done to cancel interference & hum and anything else that is equal on both lines. It can be done with high or low source impedance and high or low receiver impedance or any combination. The only requirement for cancellation (which is why you go to balanced in the first place) is that both lines 'see' equal impedances.
jd
I am sorry, but it seems that you are missing my points intentionally. My experiments were not conducted to satisfy your marketing issues; they were designed to answer questions that I had about the 600 ohm standard.
You are a cable guy, and this is your website.
I am sure that it galls to find out that there is a way around using exotic cables that really works. FWIW, I still recommend exotic cables to people for use with RCA connections, tonearm wiring and of course speaker cables, as I am all too aware of the fact that cables do otherwise make a difference in high impedance applications. As I have pointed out earlier, the longer cables are used to make the work easy, but we could do it with a 1 meter cable and the results are identical: you can't tell the difference between the cheapest cable made and the most expensive, and no one can change that, as it is not a matter of system resolution. The system works.
But balanced connections are done to cancel interference & hum and anything else that is equal on both lines. It can be done with high or low source impedance and high or low receiver impedance or any combination. The only requirement for cancellation (which is why you go to balanced in the first place) is that both lines 'see' equal impedances.
Right. The question is: what values should those equal impedances have? To be equal, they have to be measurable.
Cheap mass-production semi-professional gear has "electronically balanced" ins/outs. Expensive gear still has transformer-balanced ins/outs. If transformers are not properly loaded, they ring.
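To illustrate the ringing point, a minimal sketch (Python; the leakage inductance and winding capacitance are assumed illustrative parasitics, not the data of any particular transformer):

```python
import math

# Assumed illustrative transformer parasitics (not from any specific part):
l_leakage = 5e-3     # leakage inductance, H
c_winding = 500e-12  # effective winding/stray capacitance, F

# Unloaded, the parasitics form a resonant tank that rings when excited.
f_ring = 1 / (2 * math.pi * math.sqrt(l_leakage * c_winding))
z_char = math.sqrt(l_leakage / c_winding)

print(f"ring frequency: {f_ring/1e3:.0f} kHz")         # ~101 kHz
print(f"characteristic impedance: {z_char:.0f} ohms")  # ~3162 ohms
# A resistive termination on the order of z_char damps the resonance:
# for a parallel load, Q = R_load / z_char, so R_load ~ z_char gives Q ~ 1.
```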
Why 600 Ohm?
Calculate the characteristic impedance of the long lines of that era: 1 foot between wires, 6 to 8 gauge wire.
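Running that calculation, a minimal sketch (Python; standard AWG bare-wire diameters, and the usual parallel-wire-line-in-air approximation):

```python
import math

def open_wire_z0(spacing_in: float, diameter_in: float) -> float:
    """Characteristic impedance of a parallel-wire line in air:
    Z0 ~ 276 * log10(2D/d), valid for D >> d."""
    return 276 * math.log10(2 * spacing_in / diameter_in)

AWG_DIAMETER_IN = {6: 0.162, 8: 0.1285}  # standard bare-copper diameters

for awg, d in AWG_DIAMETER_IN.items():
    z0 = open_wire_z0(12.0, d)  # 1 foot between wires, per the post
    print(f"AWG {awg}: Z0 = {z0:.0f} ohms")
# AWG 6: ~599 ohms; AWG 8: ~627 ohms -- right around 600.
```

Which is presumably where the number came from.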
The professional audio world had to abandon power matching because the output and input resistances of semiconductor devices are more non-linear than those of vacuum tubes. That means power losses were accepted as a trade-off to minimize the non-linear distortion caused by the new active devices.
Nonsense.
Power losses were not any sort of trade-off because your goal isn't the maximum transfer of power, it's the maximum transfer of signal. Impedance matching was only critical for telegraphy and telephony where they were transmitting over many MILES.
But the professional audio world still uses balanced ins/outs, and even transformers in high end studio gear.
Balanced ins and outs are not the same thing as impedance matching.
Balanced refers to equal impedances of each line with respect to ground. Not a balance of impedances between source and load.
Also, when professional audio equipment went mass-market, transformers were eliminated to cut costs. But the top end of audio gear still uses them.
Yes. And for line inputs, they most commonly use 10k, not 600 ohms.
Again, the "600 ohm standard" is an irrelevant, antiquated throwback to telegraphy and telephony where information was being transmitted over wires that were miles long and which behaved like transmission lines.
That's not the case here.
se
Right. The question is: what values should those equal impedances have? To be equal, they have to be measurable.
Again, you're confusing impedance matching with impedance balancing.
In impedance balancing, it doesn't matter what the equal impedances are so much as that they're equal and hence balanced. That balance is required in order for a differential input to most effectively reject common-mode noise.
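To put a number on why the balance matters more than the value, a minimal sketch (Python; the leg impedances, the mismatch, and the common-mode level are all assumed for illustration):

```python
def converted_differential(v_cm: float, r_leg_hot: float, r_leg_cold: float,
                           r_in_per_leg: float) -> float:
    """Common-mode voltage that leaks into the differential signal when the
    two legs' source impedances are unequal: each leg forms a divider with
    the receiver's per-leg input impedance, and the difference between the
    two divided voltages is what a differential input cannot reject."""
    v_hot = v_cm * r_in_per_leg / (r_in_per_leg + r_leg_hot)
    v_cold = v_cm * r_in_per_leg / (r_in_per_leg + r_leg_cold)
    return v_hot - v_cold

v_cm = 1.0  # 1 V of common-mode hum picked up on the line (assumed)

# Perfectly balanced 600 ohm legs: nothing converts, regardless of value.
print(converted_differential(v_cm, 600, 600, 10e3))  # 0.0
# A 10 ohm mismatch between the legs: ~0.9 mV of hum becomes signal.
print(converted_differential(v_cm, 600, 610, 10e3))  # ~8.9e-4
```

That 10 ohm mismatch limits rejection to about 61 dB; it's the equality of the impedances, not their particular value, that makes the rejection work.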
And typical input transformers present about a 10k load to the source driving them. See for example Jensen's JT-11P-1 or CineMag's CMLI-15/15B, both examples of line input transformers.
se