Soundcraftsmen MA5002 Bias

The manual says to adjust at 1/4W output for 0.1% THD. I do have an analyser, but that seems a flaky way to set bias.

Anyone here got some experience with these? Perhaps a suggestion on an actual bias current...?

(Tempted to set it for about 25-30mA and assume it's in the ballpark.)
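
Purely as a ballpark sanity check (a sketch, not the manual's procedure), the Ohm's-law conversion from a guessed idle current to the expected DC drop across an output device's emitter resistor looks like this; the 0.27-ohm value is only an assumption borrowed from R31 mentioned later in the thread:

    # A ballpark sanity check only, not the factory procedure: Ohm's law turns
    # a guessed idle current per output device into the DC drop you would
    # expect across its emitter resistor.  The 0.27 ohm value is an assumption
    # taken from R31 mentioned later in this thread; check the schematic.
    R_EMITTER_OHMS = 0.27
    for target_ma in (25, 30):
        drop_mv = target_ma * R_EMITTER_OHMS  # mA * ohms gives mV directly
        print(f"{target_ma} mA per device -> about {drop_mv:.2f} mV across the emitter resistor")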
 
EchoWars said:
The manual says to adjust at 1/4W output for 0.1% THD. I do have an analyser, but that seems a flaky way to set bias.
That’s about the same procedure as Phase Linear used to describe to us:
Connect a load, feed in a very small signal, watch the scope, and bias up until the crossover distortion disappears. I did many of them that way with good results.
One explanation could be that, as Enzo said, you in fact need nothing more or less than to eliminate that distortion.
I agree it's sometimes easier to measure the voltage across the emitter resistors, but then again these old resistors can vary by more than 10%, which also results in inaccurate adjustments.
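
To put a rough number on that resistor-drift point, here is a small sketch (illustrative values, nothing from the MA5002 manual) of how a drifted emitter resistor skews the bias current you infer from a millivolt reading:

    # If the emitter resistor has drifted but you still divide by its nominal
    # value, the inferred bias current is off by the same percentage.
    NOMINAL_R = 0.27          # ohms, value printed on the resistor
    ACTUAL_R = 0.27 * 1.10    # ohms, a resistor that has drifted 10% high
    true_bias_ma = 30.0                   # current actually flowing
    reading_mv = true_bias_ma * ACTUAL_R  # what the meter shows
    inferred_ma = reading_mv / NOMINAL_R  # what you conclude from the reading
    print(f"meter reads {reading_mv:.2f} mV -> you infer {inferred_ma:.1f} mA, "
          f"but the real current is {true_bias_ma:.0f} mA")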

/Hugo :)
 
As an authorized QSC repair center (QSC is a maker of nicer pro audio gear), I am supposed to run the amp through an exhaustive procedure to set all the distortion levels, protection circuits, and what all. It asks me to max the amp out just shy of limiting and use my distortion analyzer to adjust for minimum distortion.

Parenthetically, it ends with a brief note informing me that if I just can't make a distortion analyzer available, or it's an emergency field repair or something, an approximate adjustment can be made by measuring the voltage across the driver emitter resistors.

If a strange amp comes through here blown up and there are no specs, we figure there is a window of "good" bias between the point where crossover disappears and the point where the draw from the mains starts to climb. Somewhere in the middle of that is close enough.

But really, I can't think of any reason to bias any hotter than the end of crossover, other than to ensure that one adjustment spec for all the amps would fall in that window I described. So all the amps will be 5-50% hotter than they need to be, but none will remain in crossover. That is how I look at a current-spec adjustment. While it is easier to adjust an amp to exactly so many milliamps, I think it is more accurate to adjust each amp to the crossover point, thus allowing for the variations between units.

That is of course my opinion, not gospel.
 
The SWR bias procedure says to connect the amp to a 2-ohm load, inject a signal and turn it up to clipping, then adjust the bias to remove the crossover notch! Never seen it done that way before!

75mA seems low to me??? I thought my PCR-800s ran closer to 400mA bias... I will have to dig out my schematics again, maybe I'm confused. But I'm pretty sure the calculation was 100mA of bias for each of the 4 MOSFET output devices, or something like that...
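
If that half-remembered figure is right, the per-device and total numbers reconcile as below (a sketch using only the figures quoted above, not checked against a PCR-800 schematic):

    # If "100 mA" is a per-device spec for each of the 4 MOSFET outputs, the
    # total idle current through the output stage is four times that, which
    # is where a figure of roughly 400 mA would come from.  A 75 mA number
    # could instead be a per-device value.
    per_device_ma = 100
    n_devices = 4
    print(f"{per_device_ma} mA x {n_devices} devices = {per_device_ma * n_devices} mA total idle current")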


ZC
 
The SWR bias procedure says to connect the amp to a 2-ohm load, inject a signal and turn it up to clipping, then adjust the bias to remove the crossover notch! Never seen it done that way before!


A speculation: that procedure may have been chosen to ensure the distortion was easy to see. Since most amps have their lowest distortion at about 1kHz into an 8-ohm load at a middle power level, testing there might give results that are near the lower limits of many analysers and thus hard to read accurately.

The procedure quoted above would show a lot more distortion even at optimum, especially if you are using the technique of feeding the residual to an oscilloscope. Presumably optimal bias is virtually the same at all frequencies and at all outputs below clipping (after allowing the amp to reach a new thermal equilibrium). I HOPE that is a reasonable presumption; if not, optimum bias becomes a squishy concept.
 
Why not just specify a current or voltage?

Another speculation: there is enough random individual parameter variation between in-spec active devices that the optimal value would be a little different for each amplifier. Hence, while such a procedure as you suggest may give an easily implemented "good enough" setting, it would not be the very best except for a few random units.
 
Hi Guys,

I repaired an MA5002 a while ago (in a different thread here), but when I asked about biasing I didn't get any answers. I'm hoping some of you might help me out here.

I put my meter across R31 (a 0.27-ohm 5W emitter resistor; I tried the other 0.27-ohm resistors as well, same reading) and read 0.6mV on one channel and 0.2mV on the other, no matter the position of the variable resistor. The pot itself is good (reads cleanly from 500 down to 5 ohms) and the bias transistor (Q6) tests fine. When turning the pot I can tell the bias increases (the voltage across Q6, the VBE multiplier, rises from 2.99V to about 3.4V with the pot halfway, the buzzing of the transformer dims down, and the voltage across R28, 10 ohms, increases).
Am I doing something wrong here? Could you tell me where to measure and what to set it to?

Thank you
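
For what it's worth, the arithmetic on those readings (just Ohm's law across the 0.27-ohm emitter resistors described above) suggests the output stage is sitting at essentially zero bias:

    # Converting the reported emitter-resistor drops back into idle current.
    R_EMITTER_OHMS = 0.27
    for channel, drop_mv in (("channel 1", 0.6), ("channel 2", 0.2)):
        bias_ma = drop_mv / R_EMITTER_OHMS  # mV / ohms gives mA directly
        print(f"{channel}: {drop_mv} mV -> roughly {bias_ma:.1f} mA per device")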
 
OK, so I got up to 620mV across R28, then measured across R31, and hello, 15mV :D
When I first got the amp the bias pot was a tad before halfway, so I was afraid to move it much further than 3/4. I guess changing the outputs might have influenced the VBE multiplier a bit.
Now the question is: what voltage should I aim for?
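
For reference (my own arithmetic, not a factory figure), the same Ohm's-law conversion on the new reading:

    # The new reading converted back to idle current per output device.
    R_EMITTER_OHMS = 0.27
    drop_mv = 15.0
    print(f"{drop_mv} mV / {R_EMITTER_OHMS} ohm = about {drop_mv / R_EMITTER_OHMS:.0f} mA per device")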