Mumbo-Jumbo and power supply caps

Sorry that I have not read EVERY page of this thread...so maybe this has already been ruled out, but...

Pretty much gut feel, but it seems to me that you might be shunting a difference in the output transformer primaries (between the channels).
I think I recall that this power supply cap directly feeds both output primaries, right?

It could be that there is a small difference in parasitics in the primary windings between the two channels. The difference is unaffected (or not totally normalized) by the large electro with its ESR but is being shunted by the small cap with low ESR. Since it is a difference that is being normalized between the channels, it might be noticed as a change in imaging.

You might be able to measure this by comparing phase distortion on each channel, with and without the bypass cap. It might be pretty subtle though.

Something to think about anyway.

That would explain why the bypass cap had little impact on my amp as I have only one primary winding in my power trafo feeding both channels.
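
For anyone who wants numbers behind the "low-ESR bypass shunting the electro" idea, here is a minimal sketch with purely hypothetical component values (and it ignores the electrolytic's ESL, which in practice makes the bypass matter sooner). It just prints the impedance of the bulk cap, the bypass cap, and their parallel combination versus frequency:

```python
# Minimal sketch: impedance of a large electrolytic (modelled as C in series
# with its ESR) in parallel with a small low-ESR film bypass cap.
# Component values are hypothetical, chosen only for illustration.
import numpy as np

def cap_impedance(C, esr, f):
    """Complex impedance of a capacitor modelled as ESR in series with C."""
    return esr + 1.0 / (1j * 2 * np.pi * f * C)

C_bulk, esr_bulk = 100e-6, 0.5    # assumed electrolytic: 100 uF, 0.5 ohm ESR
C_byp,  esr_byp  = 0.1e-6, 0.02   # assumed film bypass: 0.1 uF, 20 mohm ESR

for f in [100, 1e3, 10e3, 100e3, 1e6, 10e6]:
    z_bulk = cap_impedance(C_bulk, esr_bulk, f)
    z_byp  = cap_impedance(C_byp,  esr_byp,  f)
    z_par  = z_bulk * z_byp / (z_bulk + z_byp)   # parallel combination
    print(f"{f:>10.0f} Hz  |Z_bulk| {abs(z_bulk):8.3f}  "
          f"|Z_bypass| {abs(z_byp):10.3f}  |Z_parallel| {abs(z_par):8.3f} ohm")
```

With these assumed values the small cap only starts to dominate the supply impedance well above the audio band.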
 
Hello,
I believe DF96 has a good idea. Assuming that the distortion can be measured, the contributing factors can be identified with a carefully selected set of methods and tools.
The first chunk I would chip off the amplifier is the power supply: insert the best available shunt regulator and remeasure. Does the measured distortion improve? If it does, we have made a large step towards a solution of the puzzle and the enjoyment of the amplifier.
DT
All just for fun!
 
Don't we expect phase to change with frequency in a bandlimited amplifier? How does that become distortion?

Yes. Of course. I expect all amplifiers to have some distortion. But it is still distortion. It doesn't go away just because we expect it. But, yes, it does fall out of being band limited, just like all amplifiers. No ideal transfer function found on my planet :) But good to point out nonetheless.

But if it is different for the channels (which it will be to a certain extent), and if that difference can be traced to differing parasitics in the OPT primaries, and if the bypass cap has an effect on that, then it could be why he hears improvement with imaging. Enough caveats there?
 
Hello,
... Does the measured distortion improve? If it does, we have made a large step towards a solution of the puzzle and the enjoyment of the amplifier.
All just for fun!

Yes, indeed, just for fun and interest, since he is already enjoying his amplifier, regardless of what caused the improvement!

Actually if the root cause were uncovered, we would have to rename this thread.
 
Don't we expect phase to change with frequency in a bandlimited amplifier? How does that become distortion?

Yes. Of course. I expect all amplifiers to have some distortion. But it is still distortion. It doesn't go away just because we expect it. But, yes, it does fall out of being band limited, just like all amplifiers. No ideal transfer function found on my planet :) But good to point out nonetheless.

But if it is different for the channels (which it will be to a certain extent), and if that difference can be traced to differing parasitics in the OPT primaries, and if the bypass cap has an effect on that, then it could be why he hears improvement with imaging. Enough caveats there?

I think you are missing my point. How did simple phase shift get defined as distortion?
 
I think you are missing my point. How did simple phase shift get defined as distortion?

Oh, sorry Michael. Phase distortion (at least in some circles) is defined as phase shift that is non-linear over the target frequency band. This is more typically applied to filters, but then an amp is always a filter. We deal with this in instrument (industrial, T&M) amplifiers.
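
As a rough illustration of that definition, here is a minimal sketch. It assumes single-pole corners (20 Hz and 50 kHz) standing in for a generic band-limited amplifier, not any particular OPT: phase that is exactly proportional to frequency is just a uniform time shift, whereas here the equivalent "delay" swings wildly across the band, which is what "phase distortion" refers to in the filter sense.

```python
# Minimal sketch of non-linear (frequency-dependent) phase in a band-limited
# system: one high-pass pole at 20 Hz and one low-pass pole at 50 kHz,
# assumed values standing in for a generic amplifier.
import numpy as np

f_hp, f_lp = 20.0, 50e3
f = np.array([30, 100, 1e3, 10e3, 20e3])

H = (1j * f / f_hp) / (1 + 1j * f / f_hp) / (1 + 1j * f / f_lp)
phase = np.angle(H)                     # radians
delay = -phase / (2 * np.pi * f)        # equivalent delay at each frequency

for fi, ph, d in zip(f, phase, delay):
    print(f"{fi:>7.0f} Hz   phase {np.degrees(ph):7.2f} deg   delay {d*1e6:9.1f} us")
# A phase response proportional to frequency would give a constant delay
# column (a pure time shift). Here it is anything but constant.
```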
 
Hello Michael,
I am guilty of going south with the phase thing. A little phase shift is just a tick or two on the atomic clock, in terms of time, hardly important to the perception of the ear and brain. The trace on the O-scope appears different but sounds exactly the same.
DT
All just for fun!
 
Oh, sorry Michael. Phase distortion (at least in some circles) is defined as phase shift that is non-linear over the target frequency band. This is more typically applied to filters, but then an amp is always a filter. We deal with this in instrument (industrial, T&M) amplifiers.

I can certainly believe that unequal phase shift between the 2 channels can cause delay differences and a perception of stereo image instability. Not sure how a common B+ causes this though. I recall that both timing and loudness are stereo image cues. Not sure about distortion. So there's the clue to my line of reasoning...

But phase shift is by definition not linear with frequency in any bandlimited amplifier (and all amplifiers are bandlimited to some extent), but this is the first time I've heard phase shift called distortion. I.e. if you don't add any spectral information, how can it be "distortion"?

If you're trying to imply that phase shift changes the information content of a signal, I'll need some references or something so I can get educated :confused: Please point me to the circles that understand phase shift as a form of distortion, or at least address the question in some rigorous way.

Thanks,

Michael

PS here's a question: If I modify the frequency response of an amplifier using a tone control that introduces no new spectral components, would you define the effect as "distortion"?
 
If a signal path has phase shift as a function of frequency, would that not distort the ability to precisely reproduce a signal? For example, a square wave is made up of an infinite series of odd harmonics; the higher the frequency of the harmonic, the more it is shifted by the phase delay, and the less accurate the reproduction becomes. If the reproduction is not accurate, would that not be distorted?
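
That intuition is easy to check numerically. Here is a minimal sketch that builds a 100 Hz "square wave" from its first 15 odd harmonics and then applies an arbitrary frequency-dependent phase shift to each harmonic (the phase of an assumed first-order high-pass with its corner near the fundamental, applied to phase only, amplitudes untouched):

```python
# Minimal sketch: same harmonic amplitudes, different harmonic phases.
import numpy as np

fs, f0, N = 48000, 100, 15              # sample rate, fundamental, number of odd harmonics
t = np.arange(fs) / fs                  # one second of time

def square_from_harmonics(phase_fn):
    x = np.zeros_like(t)
    for k in range(1, 2 * N, 2):        # odd harmonics 1, 3, 5, ...
        f = k * f0
        x += (4 / np.pi) * np.sin(2 * np.pi * f * t + phase_fn(f)) / k
    return x

clean   = square_from_harmonics(lambda f: 0.0)
shifted = square_from_harmonics(lambda f: np.arctan(100.0 / f))  # assumed 1st-order HP phase

print("RMS   clean %.4f   shifted %.4f" % (clean.std(), shifted.std()))
print("Peak  clean %.4f   shifted %.4f" % (abs(clean).max(), abs(shifted).max()))
# Identical harmonic amplitudes (same RMS), but the waveform shape and peak differ.
```

Same spectrum, same RMS, different waveform: that is the "less accurate reproduction" being described, even though no new spectral components were added.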
 
We hear sound as pressure; if that pressure is different, can we perceive it?
A speaker cone can only react to the magnetic field of a magnet, so it cannot go forward and back at the same time!
There is more to image than volume "loudness".
Doppler effect comes to mind.


Regards
M. Gregg
 
We hear sound as pressure; if that pressure is different, can we perceive it?
A speaker cone can only react to the magnetic field of a magnet, so it cannot go forward and back at the same time!
There is more to image than volume "loudness".
Doppler effect comes to mind.


Regards
M. Gregg

:snowman2: Hello M Gregg,
Are you speaking of “Doppler distortion”, where the speaker cone is moving at a “low frequency” and there is a higher frequency superimposed? Like the whistle of a passing train.
Pick a point, any point, and track the velocity of the speaker cone. Even with a positive (or negative) motion, that motion will continue with varying velocity due to the superimposed higher frequency. The higher frequency is still there and yes, you can hear it. The cone need not change direction.
DT
All just for fun!
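
The superposition argument is easy to sketch numerically; amplitudes and frequencies below are arbitrary, chosen only for illustration. The cone's velocity can keep one sign for many milliseconds while the higher-frequency tone remains fully present in the motion:

```python
# Minimal sketch: a small 3 kHz tone riding on a large 30 Hz excursion.
import numpy as np

fs = 192000
t = np.arange(int(fs * 0.1)) / fs                  # 100 ms of motion
x = 2.0 * np.sin(2 * np.pi * 30 * t) + 0.005 * np.sin(2 * np.pi * 3000 * t)  # mm

v = np.gradient(x, 1 / fs)                         # cone velocity, mm/s

# Longest stretch of samples where the velocity never changes sign:
signs = np.sign(v)
edges = np.flatnonzero(np.r_[True, signs[1:] != signs[:-1], True])
print("longest single-direction stretch: %.1f ms" % (np.diff(edges).max() / fs * 1e3))

# ...and the 3 kHz tone is still present in the displacement spectrum:
spec = 2 * np.abs(np.fft.rfft(x)) / len(x)         # approximate amplitude spectrum
freqs = np.fft.rfftfreq(len(x), 1 / fs)
print("amplitude near 3 kHz: %.4f mm" % spec[np.argmin(abs(freqs - 3000))])
```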
 
:snowman2: Hello,
If a tree falls in the woods and no one is there to hear it, does it make a sound?
Timbre: this is well studied. Phase does not affect timbre. You cannot hear phase.
DT
All just for fun!

If the left and right channels have a different phase from each other, your stereo image goes down the drain! Timbre is far less affected, but imagine this:
a piano note hits. It has an impulse (attack) and a sustain.
What if the phase shift is such that you hear the harmonics before the attack?
 
:xmas: Just a quick question.

What happens to the image if you reverse the left or right speaker connections?


Regards
M. Gregg

My first response is try it and see. It takes about 30 seconds.

But a direct answer is that the center of the stereo image moves to an apparent position behind the listener in most rooms. If you know what to listen for, it's obvious immediately. But I've been to some audio shows where it sure sounded like they had the polarity reversed between speakers. I always wonder whether to point it out...
 
If the left and right channels have a different phase from each other, your stereo image goes down the drain! Timbre is far less affected, but imagine this:
a piano note hits. It has an impulse (attack) and a sustain.
What if the phase shift is such that you hear the harmonics before the attack?

:snowman2: Hello,
Your cognitive processor (mine too) is easily distracted by what-ifs like "what is the sound of one hand clapping"; hopefully we do not lose any sleep. How small a time difference can we differentiate? We walk into a room with the grandfather clock ticking. Surely we can tell the tick from the tock. Before long it is tick-tock, not tock-tick. A trained cardiologist can differentiate the timing of a heart's rhythm. The split in timing is equivalent to the hooves of a horse striking the ground; different abnormal heart beats are characterized as different types of gallops. The violin A string is 440 Hz, or about 0.002 seconds per cycle; half a cycle is about 0.001 seconds. The second harmonic is half that. If the second harmonic of the reproduction of the violin A string is a few degrees off and comes before the fundamental, that is on the order of 0.0001 seconds or less and is not detectable, even between stereo channels, in the tick-tock of the brain clock. Try not to lose any sleep.
DT
All just for fun!
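
For anyone who wants the arithmetic spelled out, here is a quick sketch of the degrees-to-microseconds conversion for the 880 Hz second harmonic of violin A:

```python
# Convert a phase error at 880 Hz (second harmonic of 440 Hz) into a time offset.
f2 = 2 * 440.0                       # second harmonic, Hz
for degrees in (3, 10, 30):
    dt = degrees / 360.0 / f2        # time offset = (phase / 360) * period
    print(f"{degrees:>2} degrees at {f2:.0f} Hz  ->  {dt * 1e6:6.1f} microseconds")
# A few degrees is on the order of ten microseconds; even a few tens of degrees
# stays around a tenth of a millisecond.
```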
 
I can certainly believe that unequal phase shift between the 2 channels can cause delay differences and a perception of stereo image instability.

Yes definitely.

Not sure how a common B+ causes this though.

Not by itself, but (and here is the heart of the speculation) if the bypass cap changed the effect of parasitics in the OPTs and those were different enough between the channels, then a change in imaging could occur. But the same thing could happen with split supplies so I don't attribute any significance to the shared supply.

I recall that both timing and loudness are stereo image cues. Not sure about distortion. So there's the clue to my line of reasoning...

Ditto. I don't know either. Just speculation. Seems like differences in phase behavior would, though.

But phase shift is by definition not linear with frequency in any bandlimited amplifier (and all amplifiers are bandlimited to some extent), but this is the first time I've heard phase shift called distortion. I.e. if you don't add any spectral information, how can it be "distortion"?

We've probably hijacked this thread enough already. But what I am saying is that the parasitics in the OPTs form a filter. Phase distortion is part of filter analysis. I know we look at it in video signals; in fact, I think someone asked about video measurements.

If you're trying to imply that phase shift changes the information content of a signal, I'll need some references or something so I can get educated :confused: Please point me to the circles that understand phase shift as a form of distortion, or at least address the question in some rigorous way.

In some fields, like test & measurement systems, yes, definitely!
OK, here is a reference; see page 37.
http://www.tek.com/Measurement/App_Notes/25_7075/eng/25W_7075_3.pdf
and an audio oriented one:
Phase Distortion article
and a general note
Phase distortion - Wikipedia, the free encyclopedia
 
Well, if you add the notion of a "linear distortion", then my filter example above causes distortion :p

The Tek appnote television signal example illustrates that the information content isn't altered by the "linear distortions", only the decodability of the signal.

The Wiki definition sounds sort of like the Tek "linear distortion" concept, with no references at all.

The "Music 108" school paper asks some interesting questions but there is nothing there or in a brief review of the cited sources that can't be explained by the effect of phase shift on peak signal amplitude and how that interacts with the nonlinear processes in some amplifiers, loudspeakers, and human hearing. Many of the sources describe said effect. This paper makes some questionable assertions (e.g. states that phase linear source material and a phase linear system is necessary to prove or demonstrate phase distortion) and does the usual "ABX is flawed"disclaimer... so I wouldn't cite this as the only academic source.

The interesting questions raised are about individual sensitivity because to me that makes the nonlinearity of human hearing an interesting explanation for phase sensitivity. But it's still a leap from there to a general notion of phase distortion or a particular phase arrangement of partials as being "right" or "wrong". As someone pointed out earlier, move your head an inch and all the phase changes again.

Cheers,

Michael
 