Oh dear: shouldn't have touched that 'calibration' screw...

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
Like an idiot, I changed a calibration screw on my Circuit Specialist 1A solder station without making a note of its original position. In fact, I am an idiot. I'm now convinced that it is underheating: even at 800F it seems slow to melt solder.

However, maybe it's just me and it's actually quite close to the right temp; I just don't know, as I don't have enough experience to tell. In the meantime, I'm a bit OCD about these things: having decently calibrated equipment (I have lots of measurement tools and a decent magnifier) removes a lot of thrashing around in the dark. I would like to follow soldering advice that recommends a certain temperature to avoid component damage, but right now I can't.

When I emailed Circuit Specialists, they said it 'can't be calibrated'. Oh, great!

Any suggestions from an ingenious mind as to how I might go about this?

So far I've tried an IR thermometer, but its measurement spot is far too large for a soldering tip, and IR thermometers are in any case very inaccurate.

Is there in fact a standard method for calibrating these things? Or, given the cost of calibration, would I be better off just getting another station?
 
All you need is patience and a piece of plumber's 40/60 solder. Wrap a piece around the tip and set the iron to 230 degrees C. If it is calibrated too low, the solder won't melt!

If you don't have 40/60, then try a 95/x/x; that should melt around 224 C!

Thanks for the help. Is this an established technique? I only query it because 230 C isn't near the melting temp of 40/60.
 
The thermocouple that came with my multimeter is good up to 250 degrees C and with high accuracy too. You could easily use something like this to calibrate the unit at a lower temperature.

I tried it with mine, and while at one point I got a steady temperature in the region I had set the station to (about 40 C less), on attempting to repeat it I got wildly different, much lower, results.

So I'm wondering whether there's a better, or standard, method for doing this.
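Before blaming the screw, it's worth separating measurement scatter from a genuine offset. A minimal sketch (the readings below are hypothetical placeholders; substitute your own thermocouple measurements in degrees C):

```python
# Quick repeatability check for tip-temperature readings, stdlib only.
from statistics import mean, stdev

setpoint_c = 300.0                          # what the station dial claims
readings_c = [262.0, 241.0, 218.0, 255.0]   # hypothetical repeated readings

avg = mean(readings_c)
spread = stdev(readings_c)
offset = setpoint_c - avg

print(f"average tip temp: {avg:.1f} C")
print(f"spread (1 sigma): {spread:.1f} C")
print(f"apparent offset:  {offset:.1f} C")
```

If the spread is large compared with the apparent offset, the measurement technique (tip contact, thermal mass of the probe) is the problem rather than the calibration screw.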
 
The thermocouple that came with my meter isn't an IR device, and it's accurate. The only problem is that it's prone to error if whatever it is measuring is at any kind of electrical potential, or if there is high-speed switching going on near the target.

If you've got a standard metal thermocouple, then what I do is get the target up to temperature, attach the thermocouple, then turn whatever it is off. If the target is interfering with the measurement in some way, this usually does a good job of removing it from the picture.

What simon is suggesting is also a valid way of doing this. Eutectic solder melts at a single temperature: 63/37 standard lead solder melts at exactly 183 degrees C. Wrap a ring of it around the tip, set the calibration screw so the iron runs cold, set the dial to 180, and then slowly ramp the screw up until the solder melts. At that point you know your 180 setting corresponds to roughly 183 degrees.
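The arithmetic behind this melt-point method is just a single-point offset estimate. A minimal sketch (the 183 C figure for 63/37 is the standard eutectic point; the 210 C dial reading is a made-up example):

```python
# Single-point offset estimate from the eutectic-melt method.
# 63/37 tin/lead solder melts sharply at 183 C (its eutectic point).
EUTECTIC_MELT_C = 183.0

def dial_offset(dial_reading_at_melt: float) -> float:
    """How far the dial reads high (+) or low (-) at the melt point.

    dial_reading_at_melt: the dial setting at the moment the 63/37
    ring just melts, found by slowly ramping the setpoint upward.
    """
    return dial_reading_at_melt - EUTECTIC_MELT_C

# Hypothetical example: the solder melted when the dial read 210 C,
# so the dial reads about 27 C high near that temperature.
print(dial_offset(210.0))
```

This gives you one trustworthy point; it says nothing about how the error grows or shrinks at other settings.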
 
My temp-controlled iron seems to use a thermocouple to measure temperature - I can use a spare element as a thermometer with the temperature option on my meter, instead of the standard unit.

If your unit is anything like mine, the display reads the current element temperature when switched on, so it may be possible to calibrate by checking the displayed temp against the actual temp with the heater element disconnected. If it's a ceramic-enclosed element, it could be put in a pot of hot water along with a thermometer. :scratch2:
 
Not a bad idea. I might do that if I get one of the digital ones one day. Currently I've just got a dial.
 
I think the solder melt idea is great. Have you tried it yet?
That sounds like it has the potential for good accuracy, but may be a little finicky to implement in practice.

My wife has a couple of cooking thermometers for candy making and deep-fat frying with graduations up to 200C/400F or a bit more. If you trust the calibration of such a thing . . . they could be used to measure an oil bath at several different temperatures, and compare to the readout on your soldering station with the iron's tip immersed in the oil. (Two or three calibration points will give you a feeling for the interaction of gain and offset errors, even though only one of these parameters has an associated adjustment.) I recall being told once that synthetic automotive engine oil boils above 450F, so it might be a better candidate for the oil bath than common cooking oils.

Dale
 
I would approach this idea with caution, as thermometers can explode, sometimes violently, if their maximum temperature is exceeded. The other issue is that soldering irons aren't especially powerful, and if you ask one to heat something inside a container you will end up with quite a different result than you might expect. I once tried heating a small amount of water this way and was surprised at how long it took; measuring the water's temperature wouldn't have helped me a jot! But don't let this stop you from trying.
 