Mumbo-Jumbo and power supply caps

The interesting questions raised are about individual sensitivity because to me that makes the nonlinearity of human hearing an interesting explanation for phase sensitivity. But it's still a leap from there to a general notion of phase distortion or a particular phase arrangement of partials as being "right" or "wrong". As someone pointed out earlier, move your head an inch and all the phase changes again.
Michael

Yes, but could it change perception of imaging in a stereo context?

The sources were just from a quick web search; I am sure there are better ones out there. I deal with some phase-modulated systems, so for me the distinction between content and decodability is blurred. But that's not germane to the discussion.
 
The interesting questions raised are about individual sensitivity because to me that makes the nonlinearity of human hearing an interesting explanation for phase sensitivity. But it's still a leap from there to a general notion of phase distortion or a particular phase arrangement of partials as being "right" or "wrong". As someone pointed out earlier, move your head an inch and all the phase changes again.

Cheers,

Michael
Hi!

Well yes, but after you stop moving your head, the phase shift stays the same.
But if the amplitude of the signal modulates the phase shift, so that it changes constantly... then we have a fuzzy stereo image.

My theory is this (I don't think it can be proved):

Our brain has some kind of calibrating mechanism. In nature, if we hear a sound, we determine its location. Then our brain sets volume, volume difference between left and right ear, tone, timbre and phase as default parameters for that sound.
If the sound source moves, our brain compares it to the default parameters to determine whether it's the same sound source. Back to our hi-fi set.
If the phase shift between channels isn't constant, we can't calibrate.
If the constant phase shift between both channels is too large, i.e. when you don't sit in the sweet spot but far to the right or left, we can't calibrate, because it's unnatural for a sound source to have that much phase shift. That is, we perceive it almost as two different sound sources.
Let's remember that by moving a metre left or right, the phase shift for the higher frequencies is far more than 180 degrees. Thus the impression of a sound coming from behind only arises from an exact 180-degree phase shift at all frequencies.
Maybe this also explains why the sweet spot can have a certain size:
if the phase shift becomes more than 360 degrees or so for the fundamental of the note an instrument plays, it becomes hard to localize...
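
A quick check of that "far more than 180 degrees" figure: an extra path length converts to phase as phi = 360 · f · d / c. A minimal sketch in Python, assuming a simple free-field path difference and 343 m/s for the speed of sound:

```python
# Phase shift from an extra acoustic path length: phi = 360 * f * d / c.
# Assumes a simple free-field path difference and a speed of sound of 343 m/s.
SPEED_OF_SOUND = 343.0  # m/s

def phase_shift_deg(freq_hz, path_diff_m):
    """Phase shift in degrees for a given frequency and path-length difference."""
    return 360.0 * freq_hz * path_diff_m / SPEED_OF_SOUND

for f in (100, 500, 1000, 5000):
    print(f"{f:5d} Hz, 1.0 m extra path: {phase_shift_deg(f, 1.0):7.0f} degrees")
# 1 kHz already works out to roughly 1050 degrees, i.e. almost three full cycles.
```

Only below roughly 170 Hz does a metre of extra path stay under 180 degrees, so an exact 180-degree shift across all frequencies is indeed not something head movement alone produces.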

Theory, theory... And maybe rubbish.
But in practice, all the amplifiers that took extra care to preserve phase info, with schematics kept as simple as possible, shone in stereo reproduction...
 
If phase changes with amplitude, then by definition it's phase intermodulation distortion. It produces spectral components that were not there in the original signal.

If phase changes with frequency, it's some kind of "linear" phase distortion. No spectral components are added to the signal. To me this is simply phase shift, but I don't mind calling it distortion as long as it's clear that no components, i.e. no "information", are added.
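
To make the distinction concrete, here is a minimal numerical sketch (numpy only, purely illustrative, not a model of any real amplifier): an amplitude-modulated tone gets (a) a fixed phase offset and (b) a phase offset that tracks its envelope, and we count the spectral lines in each case.

```python
# A toy comparison (numpy only, not a model of a real amplifier):
# an amplitude-modulated tone, then
#   (a) a constant phase offset                  -> no new spectral lines
#   (b) a phase offset that tracks the envelope  -> new sidebands appear
import numpy as np

fs = 48_000                       # sample rate, Hz
t = np.arange(fs) / fs            # one second of signal
f_carrier, f_env = 1_000, 40      # tone frequency and envelope rate, Hz

env = 1.0 + 0.5 * np.sin(2 * np.pi * f_env * t)       # slowly varying amplitude
original     = env * np.sin(2 * np.pi * f_carrier * t)
fixed_phase  = env * np.sin(2 * np.pi * f_carrier * t + 0.5)        # constant shift
amp_tracking = env * np.sin(2 * np.pi * f_carrier * t + 0.5 * env)  # shift follows amplitude

def line_count(x, threshold_db=-60.0):
    """Count spectral lines above a threshold relative to the strongest line."""
    spec = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    spec_db = 20 * np.log10(spec / spec.max() + 1e-12)
    return int(np.sum(spec_db > threshold_db))

for name, x in [("original", original), ("fixed phase shift", fixed_phase),
                ("amplitude-dependent phase", amp_tracking)]:
    print(f"{name:26s}: {line_count(x)} spectral lines above -60 dB")
```

The fixed shift leaves the line count unchanged; the amplitude-tracking shift adds sidebands around the carrier, which is exactly the "added information" meant above.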

This is, I believe, an important distinction, but the discussion seems to have degenerated into semantics. I find the difference to be important in my work.

Back to the original question, I can see how the mechanism of phase intermodulation distortion, phase change with amplitude, could potentially cause an unstable stereo image. I can also see how the common B+ connection could cause gain change with amplitude resulting in similar effects.

It would have been interesting to see some physical measurement to confirm the theory...

Cheers,

Michael
 
If a signal path has phase shift as a function of frequency, would that not degrade the ability to precisely reproduce a signal? For example, a square wave is made up of an infinite series of odd harmonics; the higher the frequency of the harmonic, the more it is shifted by phase delay, and the less accurate the reproduction becomes. If the reproduction is not accurate, is that not distortion?

Waking the dead here, as it seems like the issues raised in this thread were not resolved.

With a square wave, if the phase shift rises with the harmonic so that it is tripled at three times the frequency, there's actually no phase distortion. Phase that increases linearly with frequency is what we get from a simple uniform time delay. In that case a square wave still looks square, but if a non-delayed copy were shown on the same scope, the two would be offset horizontally. The term "phase linear" comes to mind (no relation to the amps of that name with their poor slewing). With analog video, the phase shift that actually is a problem is called group delay distortion; it usually shows up as various kinds of horizontal smearing on a scan line. Non-uniform delay adds phase shift that can severely distort a square wave, which makes the square wave an excellent signal for simple tests. Gear like the HP 3575A Gain and Phase meter can be used to compare two channels, or inputs versus outputs, and gear like the HP 4800A Vector impedance meter is particularly useful for checking output transformers. Both are helpful in getting the gain/phase response needed for stable feedback design.
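
The "phase proportional to frequency is just delay" point is easy to demonstrate: build a square wave from its odd harmonics and shift each harmonic's phase. A short sketch (numpy, illustrative values only):

```python
# Build a square wave from odd harmonics, then apply two kinds of phase shift:
#   phase proportional to frequency (a pure time delay) -> still square, just later
#   the same 90 degrees on every harmonic               -> same amplitude spectrum, ruined shape
import numpy as np

fs, f0, n_harm = 48_000, 100, 25          # sample rate, fundamental, highest odd harmonic
t = np.arange(fs // f0 * 4) / fs          # exactly four cycles
delay = 0.5e-3                            # 0.5 ms delay for the linear-phase case

def synth(phase_of_harmonic):
    """Sum of odd harmonics 1..n_harm, each with a phase given by phase_of_harmonic(k)."""
    x = np.zeros_like(t)
    for k in range(1, n_harm + 1, 2):
        x += np.sin(2 * np.pi * k * f0 * t + phase_of_harmonic(k)) / k
    return x

square      = synth(lambda k: 0.0)
delayed     = synth(lambda k: -2 * np.pi * k * f0 * delay)   # linear with frequency
const_shift = synth(lambda k: np.pi / 2)                     # same shift for all harmonics

# Crude shape check: a flat-topped wave has a peak close to its mean absolute value,
# a spiky wave does not, even though both have identical amplitude spectra.
for name, x in [("square", square), ("delayed (linear phase)", delayed),
                ("90 deg on all harmonics", const_shift)]:
    print(f"{name:24s} peak / mean(|x|) = {np.max(np.abs(x)) / np.mean(np.abs(x)):.2f}")
```

The delayed version has the same shape statistics as the original (it has only slid over in time), while the constant 90-degree shift, with an identical amplitude spectrum, turns the flat top into spikes.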

In single-channel audio, outdoors, I think phase distortion is most audible at low frequencies in the sound and feel of the whack of a snare drum. Although there is strong low-frequency energy, the abruptness of the transient carries some energy higher up in the spectrum (as compared to disco boom, boom, boom). Roll-off in the low end that isn't severe enough to affect the sound of disco much can still have enough phase shift to affect the sound of the snare drum. It's also clearly visible as tilt on a low-frequency square wave.
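
That tilt is easy to reproduce numerically: pass a low-frequency square wave through a gentle first-order high-pass. A sketch with scipy, with the corner and frequencies as example values only:

```python
# Gentle low-end roll-off showing up as tilt on a low-frequency square wave.
# First-order high-pass with a corner well below the fundamental; the amplitude
# at the fundamental barely changes, but the flat top droops. Example values only.
import numpy as np
from scipy import signal

fs = 48_000
f_square = 40.0        # square-wave fundamental, Hz
f_corner = 10.0        # high-pass corner, Hz

t = np.arange(int(fs / f_square) * 4) / fs            # four cycles
x = signal.square(2 * np.pi * f_square * t)           # +/-1 square wave

b, a = signal.butter(1, f_corner, btype="highpass", fs=fs)
y = signal.lfilter(b, a, x)

half = int(fs / f_square / 2)                         # samples per half cycle
top = y[4 * half : 5 * half]                          # positive half of the third cycle
droop_pct = 100 * (top[5] - top[-5]) / 2.0            # relative to the 2.0 p-p input swing
gain_at_f0 = abs(signal.freqz(b, a, worN=[f_square], fs=fs)[1][0])

print(f"droop across the flat top: {droop_pct:.1f} % of peak-to-peak")
print(f"amplitude change at {f_square:.0f} Hz: {20 * np.log10(gain_at_f0):.2f} dB")
```

The amplitude loss at the fundamental is a fraction of a dB, yet the flat top sags visibly: that is the tilt described above.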

In the days of vinyl, there was always the issue of rumble: not so much the vibration from an inferior turntable, we got rid of those, but the thumps and bumps from a warp in a record, and the slower waddle from a hole slightly off center (which also caused wow). Figuring out how to preserve the low-end phase while dealing with the warp bump was the issue. I designed a preamp that did so by doing something unusual. But around that time the AES decided to add a 50 Hz LF roll-off breakpoint to the original RIAA curve. Well, that made things uniform between preamps, but it degraded the phase response of ALL compliant preamps made since, because no inverse characteristic was ever put into recordings (that would have caused other problems, like groove jump). So all modern phono preamps following the updated RIAA/AES response have low-end phase distortion. Beyond a very few non-retail preamps for a small group, I moved on, figuring the world had written off vinyl and that it had become irrelevant, but maybe I still should have told people what I did. The arrival of the digital era did leave many ideas sitting on the shelf.
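
To put rough numbers on what one extra breakpoint costs at the bottom end, here is a minimal calculation that takes the 50 Hz figure above at face value and models it as a single first-order roll-off (which may not match any particular preamp):

```python
# Extra phase from a single first-order roll-off breakpoint, taking the 50 Hz
# figure mentioned above at face value. For a first-order high-pass the added
# phase lead is atan(fc / f), so it lingers well above the corner even where
# the amplitude loss is already small.
import math

fc = 50.0  # breakpoint, Hz (as stated in the post, modelled as one simple pole)

for f in (20, 50, 100, 200, 500, 1000):
    phase_deg = math.degrees(math.atan(fc / f))
    loss_db = 20 * math.log10((f / fc) / math.sqrt(1 + (f / fc) ** 2))
    print(f"{f:5d} Hz: {phase_deg:5.1f} degrees added phase lead, {loss_db:6.2f} dB amplitude")
```

The added phase is still tens of degrees an octave or two above the breakpoint, where the amplitude penalty is already well under 1 dB.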

At high frequencies, which I think pertains more to the things heard that led to this thread, time delay is important to our perception. In mono we can still perceive depth. It involves the direct sound, the delays of reflections, the spectrum of the reflections, and the delay across the spectrum. There's also the matter of decay of energy, and absorption with frequency. Ringing due to loop stability issues, or to how unbalanced or leakage inductance interacts with the power supply and transformer/circuit capacitance, might have some audible effect. In a frequency range where undesired LC issues are present along with a supply influence, perhaps there's some coupling between channels. The behavior may be a bigger issue driving the crossover seen in normal use, but not significant with a resistive load under testing. If it isn't a channel-to-channel phase issue, perhaps the power supply grunge, which resembles filtered full-wave-rectified audio in a push-pull amp, is giving us false and dirty depth cues from the even harmonics in that grunge.

This isn't like trying to find distortion amidst a complex signal. Anything strange on the supply line should be easy to view, provided the signal and load conditions are the same as when listening. If it really isn't visible at the caps, then it may be showing up through a ground loop. If it is cross-channel, it should be audible from the second channel when that channel has its input terminated. Maybe, with clamp diodes for protection from the charging spike, a small cap could be used to couple the power supply into another amp, just to listen to it. Ears make great test gear.

Actual incidental phase modulation at high frequencies is most likely to be seen from junction-capacitance modulation in current sources and the like. Some would be seen when there's large-amplitude non-linearity, where the signal swing modulates gain, which in turn modulates the Miller effect. That's analogous to the voltage-variable emulated inductor circuit, the reactance tube, found in vintage color TV. Non-linear gain-bandwidth modulation often shows up as overshoot behavior on the top of a large-signal square wave that differs from what's seen on the bottom.
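
A toy illustration of that last symptom (explicitly not a model of any real amplifier): a second-order low-pass whose capacitance depends on the output voltage, as a crude stand-in for junction or Miller capacitance modulation. Driven with a large square wave, the overshoot measured on the top differs from the bottom. All values are made up for the demonstration:

```python
# Toy demonstration of "different overshoot on top versus bottom".
# Series R-L-C low-pass whose capacitance depends on the output voltage, as a
# crude stand-in for junction/Miller capacitance modulation. Not a model of any
# real amplifier; semi-implicit Euler integration, made-up component values.
import numpy as np

fs = 10_000_000                        # simulation rate, Hz
dt = 1.0 / fs
R, L, C0 = 20.0, 10e-6, 10e-9          # underdamped second-order low-pass
k = 0.03                               # capacitance sensitivity, per volt (assumed)

t = np.arange(int(fs * 1200e-6)) / fs                           # 1.2 ms, three periods
vin = np.where((t * 5_000).astype(int) % 2 == 0, 5.0, -5.0)     # 2.5 kHz, +/-5 V square

v = np.zeros_like(t)                   # output (capacitor) voltage
i = 0.0                                # inductor current
for n in range(1, len(t)):
    c = C0 * (1.0 + k * v[n - 1])      # capacitance tracks the output voltage
    i += (vin[n - 1] - v[n - 1] - R * i) / L * dt
    v[n] = v[n - 1] + i / c * dt

last_period = v[-int(fs * 400e-6):]    # look at the final, settled period
print(f"overshoot above +5 V: {last_period.max() - 5.0:.2f} V")
print(f"overshoot below -5 V: {-5.0 - last_period.min():.2f} V")
```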

I don't claim to have ever heard it, but I have also theorized about low-frequency phase intermodulation, where a large signal swing modulates the cathode current, changing the impedance seen looking back into the cathode and thus modulating the break frequency of the bypass capacitor.
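
Rough numbers for that idea, assuming the impedance looking back into the cathode is about 1/gm, that the bypass corner is therefore roughly gm/(2·pi·Ck), and that triode gm scales with the cube root of cathode current per the 3/2-power law; the component values are placeholders:

```python
# Rough numbers for the cathode-bypass idea above. Assumes the impedance looking
# back into the cathode is about 1/gm, so the bypass corner is roughly
# gm / (2*pi*Ck), and that triode gm scales with the cube root of cathode current
# (3/2-power law). Component values are placeholders, not from any real design.
import math

Ck = 100e-6             # cathode bypass capacitor, farads
gm_quiescent = 2.0e-3   # transconductance at the quiescent point, siemens
Ia_quiescent = 1.0e-3   # quiescent cathode current, amps

def bypass_corner_hz(ia):
    gm = gm_quiescent * (ia / Ia_quiescent) ** (1.0 / 3.0)
    return gm / (2 * math.pi * Ck)

for ia_ma in (0.5, 1.0, 2.0):
    print(f"cathode current {ia_ma:.1f} mA -> bypass corner about {bypass_corner_hz(ia_ma * 1e-3):.1f} Hz")
```

If the signal swing moves the cathode current over, say, a 4:1 range, the corner wanders by tens of percent, which would be the low-frequency phase intermodulation described here.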
 