BA-3 Amplifier illustrated build guide

Temperature Measurements

So before I received a laser thermometer for Christmas, I was using the "touch" method for judging heat. Somewhere I read that 55C is about 10 seconds of touch before the pain becomes intolerable. Well, I have been able to hold my hands on the amp indefinitely, so I thought I was good. My new handy-dandy thermometer says my heatsinks are about 68C at the hottest spots on the outside, which I believe is directly on the other side of the MOSFETs, in between the fins. My question is: is this the correct spot to be taking thermal readings?
 
If you're holding your hands on the outside of the fins and measuring with the laser in between the fins, it is not surprising that there is a temperature difference. Following ZM's earlier advice (on another amp), I measured the hottest part of the heatsink vs. the legs of the MOSFETs. As long as the difference is not too great, they are well coupled (good heat transfer).



I think what ZM is saying is "don't worry", but I could be wrong.
 
Thermal images can be deceiving; use a contact thermistor.

I get goofy readings from my IR thermometer, especially looking down heatsink fins. Try pulling it farther back from the amp and see if it lines up with what you feel with your hand. My hands are pretty heat tolerant, but 68C would make you jump.

The temperature varies greatly from the base metal of the heatsink, where the thermistor is attached, to the edge of a heatsink fin, and a non-contact infrared thermometer averages the temperature of everything within its sensing area. For example, at 6 inches the sensor's reading area might be the size of a dime; pull it back another foot and the reading area may be the size of a quarter. You get where I'm going with this. The really cheap $9 or $20 eBay thermal imagers are unpredictable and not very reliable. For my work, this is the cheapest unit that gives reliable, accurate temperature readings that we've used in our industry:

FLIR TG167 Spot Thermal Camera for Electrical
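The spot-size growth described above can be sketched numerically. Here is a minimal Python example, assuming a purely illustrative 12:1 distance-to-spot ratio (check your own thermometer's datasheet for the real figure):

```python
def spot_diameter(distance, ds_ratio=12.0):
    """Approximate measurement-spot diameter for an IR thermometer.

    distance and the returned diameter share the same unit.
    ds_ratio is the distance-to-spot (D:S) ratio from the datasheet;
    the 12:1 default here is just an assumption for illustration.
    """
    return distance / ds_ratio

# The farther back you stand, the larger the area the reading averages over.
for d in (6, 12, 18):
    print(f"{d} in away -> spot about {spot_diameter(d):.2f} in wide")
```

So at 6 inches the spot is about half an inch wide, but at 18 inches it has tripled, which is why readings taken from different distances can disagree.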


For a very accurate pinpoint temperature, you want to use a surface-contact type probe like this.

Testo 0602 0693 High Temperature Surface Probe

You can literally touch the metal at the backside of the heatsink right adjacent to the MOSFET, or the JFET, or anything you want to take the temperature of directly.

You can put a very small dab of silver-bearing thermal grease on the end of the sensor and touch it directly onto the transistor's metal tab, right where it meets the heatsink, and take a very exact temperature of the transistor.
As heat conducts from the transistor into the heatsink, it spreads out in all directions, so the temperature falls off rapidly with every millimeter traveled. It's much like sound pressure level: move from 1 m to 2 m away from a speaker and the decibel level drops sharply.

I hope I explained that in an easily understandable way, with a good analogy.
 
I would like to point out some anomalies you will likely experience using an IR thermometer. Due to emissivity, surface texture, and the surrounding environment, IR thermometers will give errors. The worst I have seen is with shiny, reflective metal surfaces. While the IR sensor is focused on the area of interest, it may see more than just the emitted radiation of the D.U.T. In addition, certain materials emit more or less than the 100% emissivity figure you are after. You can easily pick up reflections from very smooth metal surfaces, returning highly inaccurate readings. To combat this, I use "white out" type correction fluid. I coat the device with a couple of thin layers until no bare or reflective metal is showing. I am usually not concerned with a good paint job, just covering the shiny metal. :Pawprint:
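The emissivity error described above can be sketched with the Stefan-Boltzmann law. This is a minimal, idealized Python model; the emissivity values are illustrative assumptions, not measured figures for any particular finish:

```python
def apparent_temp_c(true_c, background_c, emissivity):
    """Blackbody-equivalent temperature an idealized IR sensor
    (with its own emissivity setting left at 1.0) would report.

    Radiance reaching the sensor = emitted part + reflected background:
        L ~ e * T_true^4 + (1 - e) * T_bg^4   (Stefan-Boltzmann, T in kelvin)
    """
    t_true = true_c + 273.15
    t_bg = background_c + 273.15
    t4 = emissivity * t_true**4 + (1 - emissivity) * t_bg**4
    return t4 ** 0.25 - 273.15

# Bare, shiny aluminium (assumed emissivity ~0.05) at 68 C in a 22 C room
# reads close to room temperature; a matte coating (assumed ~0.95) brings
# the reading close to the true 68 C.
print(round(apparent_temp_c(68, 22, 0.05), 1))
print(round(apparent_temp_c(68, 22, 0.95), 1))
```

The model ignores lens losses and atmospheric absorption, but it shows why a low-emissivity surface mostly reflects the room back at the sensor.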

Actually, flat black works best, like flat-black barbecue paint. That white is closer to reflective shiny aluminum.
If you have purchased an IR thermometer one step above the absolute cheapest ones on the market, there will usually be a tiny switch, located somewhere in the battery compartment or under a lid in the back of the handle. One direction is for normal flat, dark colors; switching it the other direction is for reflective or light surfaces like flat white, shiny aluminum, or copper.

Fig mentioned the reflectivity; here is an example.
A shiny piece of metal acts just like a mirror. Say there is a hot item in the room, such as a radiator, a portable electric heater, even a coffee percolator or a stove with a big pot of boiling water, or any large body that radiates heat. While you are using the IR thermometer, if you are at the right angle, that heat will strike the shiny metal and reflect back into the IR thermometer's sensor, and it will give you a false reading.
Another false-reading scenario: take your IR thermometer and point it into the cold air stream from your car's dash air-conditioning vents. You will start to get a distorted, cold-lensing effect; the cold air rapidly distorts the glass or plastic lens in the IR thermometer, and it progressively gives you colder and colder readings, much colder than the actual air temperature.
This is exactly why, for true and accurate readings, only contact-type thermometers are used in scientific research. IR thermometers are only good for a close approximation, though if the surfaces are flat-finished and dark in color they can be fairly accurate.
 
Hi, a few questions about biasing and dissipation:

6L6 said:
The simple rules of thumb are -

TRANSFORMER = The amplifier’s total bias (in Watts) should be no more than 1/2 the VA of your transformer.

My transformer will be 300VA 2x18Vac (I already have one). It means 150W max.
6L6 said:
How do you determine watts? Simple - measure your bias current, the voltage drop across the 3W source resistors. Let’s say you measure .3V . Divide that by the value of the source resistor. If using 1.0 ohm, your answer is then .3A (.3 / 1 = .3) — but if using 0.47ohm resistors, a voltage drop of .3 is now .64A (.3 / .47 = .637) Back to the example using the 1.0 ohm resistors, multiply by your rail voltage, 32V, so each device will have a bias of 9.6W (.3 * 32 = 9.6) then multiply by the number of devices - 12 in this amp, (12 * 9.6 = 115W)

Power supply should be +/-24Vdc, source resistors 1 ohm. So I could have a bias of 0.5A on each device (0.5 * 24 * 12 = 144W)
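The arithmetic in the quoted rule of thumb is easy to script. A minimal sketch of the bias-power calculation, using only the numbers already given in these posts:

```python
def bias_power(v_drop, r_source, v_rail, n_devices):
    """Total dissipation from the per-device source-resistor drop.

    v_drop:    voltage measured across one source resistor (V)
    r_source:  source resistor value (ohms)
    v_rail:    rail voltage seen by each device (V)
    n_devices: number of output devices
    """
    i_bias = v_drop / r_source          # current per device (A)
    return i_bias * v_rail * n_devices  # total dissipation (W)

# 6L6's example: 0.3 V across 1 ohm, 32 V rails, 12 devices
print(round(bias_power(0.3, 1.0, 32, 12), 1))  # 115.2
# Damien's plan: 0.5 V across 1 ohm, 24 V rails, 12 devices
print(round(bias_power(0.5, 1.0, 24, 12), 1))  # 144.0
```

Both results stay under half the VA rating of a 300VA transformer, matching the transformer rule of thumb above.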

6L6 said:
HEAT = No more than 55C heatsink and 65C Mosfet. (The best place to measure Mosfet temp is pin 2)

The chassis will be a 4U 300mm. So 144W means 72W per channel. If we take 0.4°C/W, that's a 28.8°C rise; with 25°C in the room, it means a little less than 55°C on each heatsink.

Am I right ?

Thanks

Damien
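Damien's heatsink estimate above can be checked with a one-line calculation (assuming the 0.4 °C/W per-heatsink figure he quotes):

```python
def heatsink_temp_c(power_w, rth_c_per_w, ambient_c):
    """Steady-state heatsink temperature: ambient + power * thermal resistance."""
    return ambient_c + power_w * rth_c_per_w

# 144 W total -> 72 W per channel; 0.4 C/W per heatsink, 25 C room.
t = heatsink_temp_c(72, 0.4, 25)
print(round(t, 1))  # 53.8, just under the 55 C rule of thumb
```

Note the result is sensitive to the thermal-resistance figure: at 0.45 °C/W the same dissipation would put the sink over 57 °C.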
 
General question: what makes the difference between using a plain PSU vs. a regulated PSU? As an example, this build guide uses the diyaudiostore PSU, and the BA-3 as a preamp uses the Peter Daniels regulated power supply. I understand the difference between the two (in general), but I don't know the why. Thanks.