The NAD S200's rail voltage is 72-76 V, depending on the export market. And that is a very good level; it is not 50 V.
In addition, 4 Panasonic 25 A transistors are used in each arm, which is also quite a lot. Although some manufacturers go even further, for example 12 Sanken transistors rated at up to 15 A each.
NAD S200 - Manual - Stereo Power Amplifier - HiFi Engine
https://img.stereo.ru/news/2020/1/9d6848d7a850106459d5e91e171ce2ba.jpg
75 volt rails produce 50 VAC RMS.
No more, no less? Quite a party we have gathered here 😀😀😀
I use real, technical arguments. What is unserious about that? A "good" amplifier should protect itself against any abnormal condition, including an output short circuit. We do not need to make "welding machines" to drive competently designed speakers. Electrostatic speakers are also connected via inductive cables and driven by an audio signal. If you find such arguments unserious, I give up on this debate.
So good amplifiers are only suitable for welding? You might as well say they are only for a night club, or that it is impossible to listen to music loudly (as was already said) because the neighbors will call the police, and the cats and other animals will get scared, and so on. 😀😀😀
Multiply the cable resistance by a current of 40 A and you get a voltage drop of 0.224 V. Recall that this current has passed through a nonlinear load and is accordingly very nonlinear, so, per Ohm's law, this 0.224 V drop is just as nonlinear. Given that the total harmonic distortion at the amplifier's terminals is, for example, 0.03% for the NAD S200, the distortion at the speaker terminals increases by about 7.46 times due to the cable resistance alone.
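A short sketch of the arithmetic above, for anyone who wants to check it. The cable resistance is inferred from the stated 40 A and 0.224 V; the 8 Ω load is my own assumption for scale, not a figure from the post:

```python
# Ohm's-law check of the cable-drop numbers quoted above.
# The cable resistance is inferred from 0.224 V / 40 A;
# the 8-ohm load is an assumption for illustration only.

I_peak = 40.0             # A, peak current quoted in the post
V_drop = 0.224            # V, quoted drop across the cable
R_cable = V_drop / I_peak
print(f"implied cable resistance: {R_cable * 1000:.1f} mOhm")  # 5.6 mOhm

# Size of the drop relative to the voltage across an assumed 8-ohm load:
R_load = 8.0
V_load = I_peak * R_load
print(f"drop relative to load voltage: {V_drop / V_load * 100:.3f} %")
```

Note the drop itself is only about 0.07% of the load voltage under these assumptions; how much of it ends up as added distortion depends on how nonlinear the load current is.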
This analysis is quite flawed. The distortion measured at the output of the amplifier remains 0.03% and the distortion in the acoustical output is trivially changed (if at all).
They really have already talked about that. Apparently good sound has long been banned. 🙂 A lack of arguments, followed by cheap "humor"? Amuse yourself... in your "night club".
The only person who saw the mistake. 😉 Although it is fundamentally insignificant; the numbers are just different. The order of magnitude is still the same, and it all depends on the particular amplifier and the length of the cable. Distortion at the speaker terminals increases significantly.
You can also independently run the analysis for a harmonic distortion of 0.005%, as in the Onkyo M-510 Grand Integra.
If you are driving a transducer (speaker or shaker table) at that level (40 A?), distortion in the single percentages at the output would be exceptional. Even if the frequency is low enough for the cable to temperature-cycle at the signal rate, you would be really challenged to see it in the output.
Years ago John Atkinson posted measurements of speaker acoustical response with different cables. It was measurable but not substantial (sub 1 dB I believe). If it is an issue add remote sensing (not for 300' cables). In the wayback times some amps had a control that could make a slightly negative output impedance or a higher positive output impedance. You could have a lot of fun with that. Accuracy be damned.
Thanks. Good because the other one I was linked to wasn't that impressive in any way. Reaching for Tidal...
//
Here is a different effect: the conductors have linear resistance, but the load is nonlinear. So even across the (presumably) linear resistance of the cable, a nonlinear voltage drop will occur due to the nonlinear current. In general, everything is perfectly audible to my ears. But the right question would be this: which cable out of the three sounds better? I would say that, of the three, two sound good.
Van Den Hul, AudioQuest, ACROLINK.
Ah, Google failed me; I could only find one crowdsourced album.
I believe you are saying that the nonlinear current through the driver will appear as a distortion in the voltage drop across the wire to the transducer. The wire is not creating additional distortion; it is just a vehicle for the distortion generated by the load, no different from adding a large series resistor. If the amp is a pure voltage source, all the distortion is in the current; if a current source, then all the distortion is in the voltage. You can see this same effect on AC power distribution, where you get 3% THD on the voltage and 80% THD on the current waveform.
I think the relevant question is whether the distortion radiated from a driver at a given level across the terminals increases if the source goes from 0 Ω to a significant impedance. An earlier discussion of this suggested that it may be reduced (with other penalties), but I don't know if there is experimental evidence supporting that thought.
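The "wire as a vehicle, not a source" point above can be demonstrated numerically. This is a minimal sketch under assumed values: an ideal voltage-source amp, an exaggerated 0.1 Ω cable, and a toy square-law load nonlinearity; none of these figures come from the thread.

```python
import numpy as np

# A wire adds no nonlinearity of its own, but a nonlinear load current
# produces a distorted voltage drop across it, so distortion appears at
# the speaker terminals even though the amp output is a pure sine.
# Load model and all values are illustrative assumptions.

fs, f0, N = 48_000, 1_000, 4800
t = np.arange(N) / fs
v_amp = 20.0 * np.sin(2 * np.pi * f0 * t)   # ideal amp output (pure sine)

R_cable = 0.1                                # ohms, exaggerated for clarity
# Nonlinear load: i = v/8 + 0.002*v^2 (8-ohm resistor plus a square-law term)
i_load = v_amp / 8.0 + 0.002 * v_amp**2
v_speaker = v_amp - i_load * R_cable         # voltage at the speaker terminals

def thd(x):
    """THD via FFT: harmonics 2..10 relative to the fundamental."""
    X = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    k = f0 * N // fs                         # fundamental lands exactly on this bin
    harm = np.sqrt(sum(X[n * k] ** 2 for n in range(2, 11)))
    return harm / X[k]

print(f"THD at amp terminals:     {thd(v_amp) * 100:.4f} %")      # essentially zero
print(f"THD at speaker terminals: {thd(v_speaker) * 100:.4f} %")  # small but nonzero
```

With a realistic milliohm-scale cable the added terminal distortion shrinks proportionally, which is consistent with both sides of the argument: the effect exists, and it is small.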
My copy strangely had the bar code and copyright info covered with a black sticker on the outside of the shrink wrap to obscure them. It's otherwise labeled Fantasy 31760-02.
All good fortune,
Chris
Two things I'd like to add to the discussion:
1) The cable resistance, and the amp's output resistance for that matter, are in series with the speaker's impedance. Because of that, the cable impedance plays a very subordinate role.
If you are searching for an explanation of why LS cables sound different, this is a dead-end street.
2) Whether 40 A, 100 A, or whatever peak current may flow, IMO nobody can hear any difference between cables at these eardrum-breaking levels.
Hans
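The series-impedance point (1) is easy to put numbers on. A quick sketch; the cable, amp, and speaker values below are my own illustrative assumptions:

```python
import math

# Cable resistance forms a voltage divider with the speaker impedance,
# so its effect on level is tiny. All values are assumptions:
R_cable = 0.01   # ohms, roughly 1 m of reasonably thick speaker cable
R_amp   = 0.02   # ohms, amp output impedance (damping factor 400 into 8 ohms)
Z_spk   = 8.0    # ohms, nominal speaker impedance

loss = Z_spk / (Z_spk + R_cable + R_amp)
print(f"level error: {20 * math.log10(loss):.4f} dB")  # about -0.03 dB
```

A few hundredths of a dB of flat attenuation, far below audibility on its own; frequency-dependent interaction with the speaker's impedance curve is the only place anything interesting can happen.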
This is complete nonsense. You have no reference data.
You are right, but only partly; that would hold if the speaker were simply a resistor. I did not say that only the wire creates distortion. I have always said that distortion arises mainly from its resistance, and also from its capacitance and inductance, which can themselves be nonlinear due to the materials and construction of the whole audio cable. But the speaker's moving system stores part of the energy that neither radiated into space as sound vibrations nor dissipated as heat in the voice coil. It returns this stored energy through the cable back to the amplifier, unfortunately completely out of sync with the sound vibrations and not in time with the music. The speaker gives the accumulated energy back in time with its own electromechanical vibrations, governed by its resonances, nonlinearities, and other behavior. This return current, and the voltage drop in the wire it causes, is therefore not only nonlinear but also completely out of sync with the voltage loss in the wire that occurs synchronously with the input music signal. When these oscillations, some antiphase and some in phase but none synchronous and all of different amplitudes, superimpose, a voltage arises from the losses in the audio cable, and these distortions appear at the speaker terminals.
Because these signals coincide neither in shape, nor in amplitude, nor in phase, they produce intermodulation interactions, or distortions, which are very noticeable to listeners.
Something JR said he was planning to do; I've not heard anything since.
Some on this site have actually made measurements. I did some by building an amp with variable Zout, feeding it into drivers of different makes, and acoustically measuring the output. The results vary from driver to driver: some are better driven from a high Zout, others from a low one, and it also changes with frequency. So one driver may be better current-driven on the low end and voltage-driven on the high end. I also made an amp to match this behavior, for example low Zout at low frequencies and high Zout at higher frequencies. I think I posted the schematics at the time, but not to encourage others to build the same: the advantage of finding an optimum Zout for a particular driver is not very large in terms of distortion. In my opinion, not worth the bother.
Pavel (where are you?) also did similar experiments, but he looked at IM distortion. His conclusions were more optimistic about the gains to be had.
One important conclusion in all of this is that amps with a high damping factor will not significantly impact driver performance distortion-wise. They may even create a bit more distortion than an amp with a somewhat higher Zout would.
The only reason to want an amp with a high damping factor is to not impact bass alignment. All speakers have an impedance peak on the low end which will cause an amp with a low damping factor to create a voltage peak at the same location. This is very audible, but has nothing to do with harmonic distortion.
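The bass-alignment effect just described can be quantified with a short sketch. The 40 Ω resonance peak and the two output impedances below are assumed example values, not measurements from the thread:

```python
import math

# With nonzero amplifier output impedance, the speaker's low-frequency
# impedance peak turns into a voltage (and hence response) peak.
# The 40-ohm resonance value and both Zout values are assumptions.
Z_nominal = 8.0    # ohms, speaker impedance away from resonance
Z_peak    = 40.0   # ohms, impedance at the bass resonance

def level_db(z_out, z_load):
    """Terminal voltage relative to the amp's open-circuit voltage, in dB."""
    return 20 * math.log10(z_load / (z_load + z_out))

for z_out in (0.02, 2.0):   # high vs low damping factor
    bump = level_db(z_out, Z_peak) - level_db(z_out, Z_nominal)
    print(f"Zout = {z_out:>4} ohm: response bump at resonance = {bump:.2f} dB")
```

Under these assumptions the high-damping amp produces a bump of a few hundredths of a dB, while the 2 Ω source produces roughly 1.5 dB at the resonance, which is indeed very audible.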