Power supply load testing

I've recently acquired a vintage valve-based bench power supply that has a variable 0 to 300V DC output, a fixed -150V DC output and a fixed 6.3V AC output. I'm hoping to use it for developing and testing valve circuits, which I'm guessing is what it was intended for when it was made in the 1960s.

The unit has four selenium rectifiers which I want to replace, and I also want to make sure that the outputs produce the voltages they're supposed to. So I'm thinking of putting some kind of dummy load on each of the DC outputs while I measure the voltages.

The -150V output should be fairly simple as it's a fixed voltage rated at 7mA. So I'm guessing that I can just use Ohm's law to calculate that a 22K resistor that can safely dissipate 1W will suffice? Is that correct?
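My rough working for that, based on the quoted 150 V and 7 mA ratings:

\[
R = \frac{V}{I} = \frac{150\ \text{V}}{7\ \text{mA}} \approx 21.4\ \text{k}\Omega \;\;(\text{22 k being the nearest standard value}), \qquad
P = VI = 150\ \text{V} \times 7\ \text{mA} \approx 1.05\ \text{W}.
\]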

The 0 to 300V output is a bit more tricky as it's only rated at 60mA. I'm guessing it won't be able to supply 60mA at every voltage between 1 and 300V, so I'm not sure whether I should assume it's 60mA at 300V and use a 5.1K (very high wattage) resistor, or something else.
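The same working, if the 60 mA rating is assumed to apply at the full 300 V (which is only a guess on my part):

\[
R = \frac{V}{I} = \frac{300\ \text{V}}{60\ \text{mA}} = 5\ \text{k}\Omega, \qquad
P = VI = 300\ \text{V} \times 60\ \text{mA} = 18\ \text{W},
\]

so the load resistor would have to handle something like 18 W continuously, which is why I'm thinking "very high wattage" (or several resistors in series sharing the dissipation).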

What would be the best approach?

Thanks.