ADCOM GFA 535 - Bias Adjustments

Dear All,

I recently sourced an ADCOM GFA 535 MK II and am looking at checking and adjusting the bias to see that it is spot on.
I have read that the input jacks should be shorted when adjusting the bias, as per the service manual.

I am quite a noob on this topic and am confused by the terminology used for shorting the input jacks.

By shorting, does that mean the RCA jack of each channel has to be shorted, i.e. the inner pin to the outer ring, on both the left and right channels? Or is it just a cable that needs to be run from the left RCA input to the right RCA input?

I am looking forward to some input before I even start, and will be really grateful for anything provided.

Thanks in advance.
 
Welcome! If you post the bias procedure, we can help you out more with the finer details. Shorting the input means the outer ring and the center pin are connected. What I did was take an old RCA plug, cut the cable, and twist the outer ground shield and inner wire together.

Everyone has a starting point, but you need to appreciate and respect what you are dealing with. Amps can have a deadly potential when you go poking around. Plus it is very easy to screw things up when mucking around inside (ask me how I know).

Also, you will need a multimeter, if you didn't know that already.
 
I do have a multimeter, I have managed to find the pinout diagrams on the PCB, and I have started to measure the DC offset voltage.
Please find the instructions for bias adjustment in the attached image.

I have also attached my multimeter reading for you to verify.
Thanks again for helping me out!
Have a good day!
 

Attachments

  • Bias Adjustment.jpg (121.4 KB)
  • Multimeter Reading.jpg (338.1 KB)
Looks like you have it all squared away.

DC offset should technically be zero, but it might be tough to get exactly that. Anything less than 25 mV is more than fine, and you'll probably be nowhere near even that.
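The 25 mV rule of thumb above can be written as a trivial check (the threshold comes from this post's advice, not from an Adcom spec):

```python
# Sanity check on a DC offset reading, measured in volts.
# The 25 mV limit is a rule of thumb from the thread, not a factory spec.
def offset_ok(reading_v, limit_v=0.025):
    """Return True if the measured DC offset is within the limit."""
    return abs(reading_v) <= limit_v

print(offset_ok(0.008))   # an 8 mV reading -> True
print(offset_ok(-0.040))  # a 40 mV reading -> False
```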

I have read that the DC offset should be measured at the speaker output terminals; please correct me if I am wrong.
Another strange thing I noticed when adjusting the bias at TP1, TP2, TP3, and TP4 is that the maximum reading across TP1 and TP2 is higher than the maximum across TP3 and TP4.

Let's say, with the potentiometers turned fully up, the maximum measured across TP1 and TP2 is 30 mV while the maximum across TP3 and TP4 is only 15 mV. Is there something in the circuit I should check, or is this fine?

I have adjusted them to 7 mV, as per the instructions in the service manual.
I suspect this could be attributed to dirty potentiometers, as they are open and prone to accumulating dust; I am going to clean them and check again when I get home.

Just wondering if there are other components in the circuit that would need checking too!
Thanks for your expertise in helping me out.
 
You are correct that DC offset is measured across the speaker terminals for each channel. Doesn't matter which is positive or negative as you are shooting for zero.

I wouldn't worry too much about the "max" of the pot adjustments. Could just be the differences in the components as they were probably not matched too extensively. Hence the reason for the adjustment. As long as they both adjust to the correct value and hold there, I would call it a day.

You may find the bias needs to be adjusted more when you adjust the offset. Also the amp should be warmed up first.
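One way to see why the exact "max" of each pot matters less than the final setting: the millivolt target across the test points is really setting a bias current through the output stage's emitter resistors, via I = V/R. A small sketch of that conversion — note the resistor value here is a placeholder for illustration, not a confirmed GFA-535 MkII part; read the real value off your schematic:

```python
# Bias current implied by a voltage across an emitter resistor: I = V / R.
# R_EMITTER below is an ASSUMED value for illustration only -- check the
# actual schematic; it is not confirmed for the GFA-535 MkII.
R_EMITTER = 0.33  # ohms (placeholder)

def bias_current_ma(tp_voltage_mv, r_ohms=R_EMITTER):
    """Convert a test-point voltage in mV to bias current in mA."""
    return tp_voltage_mv / r_ohms  # mV / ohms gives mA directly

print(round(bias_current_ma(7.0), 1))  # 7 mV across 0.33 ohm -> ~21.2 mA
```

So two channels reaching the same 7 mV are running the same bias current even if their pots have different headroom at full rotation.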
 
The official bias adjustment instructions for the Adcom 535 MkII have you warm up the amp by playing 20 W into it for 20 minutes, with the cover on. Then you remove the cover and adjust the bias for 7 mV across the relevant test points as soon as the bias stabilizes. But what counts as stabilizing???

At the outset, my amp reads about 8 mV quiescent after a couple of minutes of warmup and basically stays there with the cover off. But after I run the 20-minute warmup (cover turned upside down, with the sides blocked by large aluminum dummy loads) and then take the cover off, the bias measures 3.66 mV and rising. In 2 minutes it has risen to 4.94 mV, then in 5 minutes to 6.03 mV. Somewhere around 20 minutes, it returns pretty much to the cover-off quiescent level of 8 mV.

So is "stabilizing" what you measure as quickly as you can get the test leads on it, or is it waiting 20 minutes for everything to cool back down to the quiescent level? If the latter, the warm-up procedure seems to make little sense.

BTW, I was very concerned about the bias level because the amp shows increasing distortion below 1 watt output. At 1 V output into 8 ohms (0.125 W) it measures 0.23% distortion (THD+N), ten times higher than at 10 V output (12.5 W), and this doesn't change with the 80 kHz filter (strangely, though, it halves with the 400 Hz filter, possibly just because 1 kHz is close to the filter cutoff). It appears to be some sort of hashy-looking distortion at the negative wave peaks. The amp has a low noise level of 22 µV A-weighted. In comparison, IIRC my Dynaco 410 has low distortion well below 1 volt, and 44 µV A-weighted noise.
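For anyone following along, the power figures quoted above come straight from P = V²/R into the 8 Ω load:

```python
# Output power into a resistive load: P = V^2 / R.
# Reproduces the figures quoted above for an 8 ohm load.
def power_w(v_rms, r_ohms=8.0):
    """Power in watts delivered by v_rms volts into r_ohms."""
    return v_rms ** 2 / r_ohms

print(power_w(1.0))   # 1 V  -> 0.125 W
print(power_w(10.0))  # 10 V -> 12.5 W
```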

Both channels were the same in distortion, though one channel actually measured around 7 mV quiescent. I assumed that was wrong and cranked it up to the 8 mV quiescent of the other channel. That didn't change the distortion significantly. I cranked it up to 20 mV quiescent bias and it still didn't help.

But if I heated the amp up to the specified level, where the bias now measures 3.66 mV, and set it to 7 mV at that heat level, that would be an even higher bias and closer to what was intended.

Or maybe the intention was to make a higher-powered amp by reserving some of the heat capacity for high-power operation, sacrificing low-level linearity. If my intention were instead to have something like a 30 W amp with the lowest possible distortion, would it be OK simply to crank the bias up until the heat sinks reach whatever level seems OK with the cover on?

I've essentially done that with other amps, like my Aragon 8008 BB. It also shows rather high distortion (which I call anything above 0.1%) at the Klipsch-recommended settings (Mondial never published a bias spec). I instead cranked the amp up so the heat sinks measure 130°F after a few hours, and distortion went down to 0.007% or better (at medium-low power; I didn't think to measure low) at about 3 times the Klipsch-spec voltages. Of course, a lot depends on how high the feet under the amp are; at the factory setting it's very sensitive to the slightest variation in airflow. At my setting it's fine with the tall feet I use.
 
In general you want the amplifier biased so that the sinks are nicely warm to the touch (the 10 second rule).

This was how it was designed.
Thank you. I decided to try cranking it all the way up, which reads 18 mV at first but falls; it's now at 16 mV. The heatsinks are not yet too warm to hold for over 10 seconds. (They get that way quickly with 20 watts output.)

But it appears my low-level THD+N is power-supply related. The glitching appears at what look like 180 Hz intervals regardless of input frequency. So I'll guess it needs new power supply caps. (I had trouble reading my new Rigol scope at first, or I would have seen that right off.)
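The 180 Hz spacing is the giveaway: mains-related artifacts land at multiples of the 60 Hz line frequency, with full-wave rectifier ripple at 120 Hz and the third mains harmonic at 180 Hz, independent of the signal. A quick period calculation (assuming 60 Hz mains) shows what that looks like on a 2 ms/div scope:

```python
# Mains-related artifacts appear at multiples of the line frequency.
# Full-wave rectifier ripple sits at 120 Hz; 180 Hz is the third
# mains harmonic. At 2 ms/div, a 180 Hz glitch repeats every ~5.6 ms,
# i.e. just under three divisions.
MAINS_HZ = 60.0

def harmonic_period_ms(n):
    """Period in milliseconds of the n-th mains harmonic."""
    return 1000.0 / (n * MAINS_HZ)

print(round(harmonic_period_ms(2), 2))  # 120 Hz -> 8.33 ms
print(round(harmonic_period_ms(3), 2))  # 180 Hz -> 5.56 ms
```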

The little frequency related harmonics in the analyzer may be quite small, just as at 10V output.
 
After about 90 minutes with my simulated "cover on" (cover upside down with the sides blocked), the bias cranked all the way up has fallen to 14 mV at the test point, the heatsinks can be touched for no more than 10 seconds, and my flaky IR thermometer measures 130°F. My analyzer is overheated for today; anyway, this is pointless until I fix the 120-180 Hz modulation. The attached photo shows the distortion output (I can't remember the signal frequency, probably 1 kHz, but the scale is 2 ms/div; I think that's what Rigol means by "2 ms"). It was taken earlier, when my analyzer was still sane, at roughly the 8.5 mV bias level and 1 V amplifier output. Can't retest now.
IMG_0721.jpg
 