On Matthias Wagner's blog (https://debugginglab.wordpress.com/2014/10/30/soldering-station/), I saw a DIY soldering station project with a nice user interface and clever control of Weller's RT-series soldering tips. It was so interesting that I decided to make some changes to the hardware and, of course, rewrite the firmware.
It is based on an ATmega644 controller, has a 2.4″ TFT and a rotary encoder for the user interface, and is designed to support two kinds of soldering irons: the classic RT tips (or other devices with a built-in heat sensor) and simple irons with only a heating element.
The small 2.4″ TFT sits on an Arduino MEGA1280 for developing the initial code and fine-tuning the on-screen details.
The splash screen can be any 320×240-pixel .bmp file and can be prepared in any image editing program, such as MS Paint.
My original thought was to use a simple, low-power, cheap soldering iron for soldering SMT parts up to the 1206 size, such as chip resistors, capacitors, and small-package ICs. The problem is that since such an iron has no temperature sensor (RTD or thermocouple), it is very difficult to determine the tip temperature. However, physics gives the relationship between the resistance and temperature of a resistor (i.e. the heating element itself), so with some effort I can derive feedback for an automated temperature controller.
Since the electrical resistance of a conductor such as a copper wire depends on collisional processes within the wire, the resistance can be expected to increase with temperature, as there will be more collisions. An intuitive approach to the temperature dependence leads one to expect a fractional change in resistance proportional to the temperature change:

ΔR = α · R₀ · ΔT

where ΔR is the difference between the initial resistance R₀ and the final resistance R, ΔT is the corresponding temperature change, and α is the temperature coefficient of the resistor material (in my case the heating element is a nickel-chrome alloy, very common in soldering iron heaters).
So the above formula can be solved for the unknown temperature. At any given moment, the heating element's resistance can be found from the current flowing through the element (R = V/I), giving:

T = T₀ + (R − R₀) / (α · R₀)
On the very useful page http://hyperphysics.phy-astr.gsu.edu/hbase/electric/restmp.html, one can calculate a resistor's temperature by filling in all the other values. Very useful, indeed!
The main problem now is to make something that can measure very slight changes in the iron's resistance, from about 26 to 30 Ohms. These 4 Ohms have to be divided into about 375 parts, since the tip temperature is expected to reach 400 degrees Celsius. This means that if the nominal operating voltage of the iron is 12 volts, the current to be measured varies from 12/26 = 0,4615 down to 12/30 = 0,4000 Amperes. The ADC must then resolve no less than about 0,000164 A per degree over the range from 25 (ambient) to 400 degrees.
To be honest, I would like to express my reservations about this approach, but then again, without them the magic of the unknown would make no sense…
UPDATE (22 FEB 2016):
After some research, I realized that the above approach had very little chance of succeeding. So I revised my design to fit the soldering iron types available on the market. The updated design offers three options for tip temperature measurement, depending on the soldering iron's thermal sensor type. PTC, NTC, and K-type thermocouple (K-TC) sensors have been implemented and can be easily selected via a quadruple DIP switch. The signal amplifiers are separate, one for each type, so by defining the soldering tip's sensor in the code, the program executes accordingly.
The core of the design is now based on a very interesting ATmega2560 board I found:
This one has plenty of flash memory, so I can do many things with graphics on the TFT.
This part of the design shows the individual signal-amplifying stages ahead of the 10-bit ADC measurement.