Aleph 3 bias question

I have built a number of Aleph 3 channels, using the component values set forth in the schematics. Upon completion, I end up with 0.75 A of bias current through each IRF244 output device, while the manual calls for 1 A of current through each device. The source resistors R120, R121, R122, and R123 each show a voltage drop of around 0.36 V; the schematic calls for a 0.5 V drop. The rails are roughly +25 V and -25 V.
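For what it's worth, a quick Ohm's-law check (a sketch in Python, using only the figures above; the implied resistor value is calculated, not a part number I've verified) shows the two sets of numbers point at the same source-resistor value, i.e. the measurements look self-consistent:

```python
# Ohm's law: R = V / I. The measured drop/current and the schematic's
# target drop/current should imply roughly the same source resistance.
measured_r = 0.36 / 0.75   # measured drop over measured bias
target_r   = 0.5 / 1.0     # schematic drop over target bias
print(f"implied source resistance (measured): {measured_r:.2f} ohm")
print(f"implied source resistance (target):   {target_r:.2f} ohm")
```

Both work out to about half an ohm, so the low drop seems to come from low bias current rather than from wrong source-resistor values.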

To get the 1 A of bias current through the MOSFETs, I changed R113 from 47k to 43.1k (the manual says that R113 trims the DC current value).

My question is: should I be changing the values of R120 through R123 to get to the 0.5 V drop, or is adjusting R113 sufficient? Or am I approaching this incorrectly?
There are a number of things that affect the final bias value. One is the temperature of the output devices. When I built my Aleph 2s, I began by testing one channel. The bias was only about half of what it should have been. I scowled and fumed and wrote Nelson, and diddled the resistor values, and got the bias about where I wanted it.
Then I went and hooked up the other channel.
The bias overshot, and I was in a mad scramble to pull it back down. Why did it overshoot? Because I'm using the oddball water-cooled heatsink and the things were barely warming up compared to Nelson's egg-frying (yes, I'm exaggerating) production models. Hooking up the second channel significantly warmed the first channel, changing the bias.
The choice of output devices matters also, but it doesn't look like that's the case with your circuit(s).
Certainly, adjusting the bias via R113 is the easiest way to go about it, as long as you have sufficient range to get where you want to go. But you're saying that you increased the bias by <i>lowering</i> the value of R113?