Temperature is not something you can take chances with, particularly when you’re shipping or storing food or pharmaceuticals. It’s vital to understand all the factors that can influence the accuracy of your sensor designs. (Read on to find out more, or download the full article here.)
Semiconductor temperature sensors commonly use a bandgap element, which determines temperature from variations in the forward voltage of a diode. To achieve reasonable accuracy, these sensors are calibrated at a single temperature point, typically 25 °C. Accuracy is therefore highest at the calibration point and deteriorates at higher or lower temperatures. For higher accuracy across a wide temperature range, additional calibration points or advanced signal-processing techniques can be employed.
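The bandgap principle above can be sketched numerically. Biasing a diode (or BJT base-emitter junction) at two current densities with ratio N produces a forward-voltage difference ΔVBE = (kT/q)·ln(N), so absolute temperature can be recovered from that difference alone. This is a minimal illustration of the physics, not the signal chain of any specific sensor; the measured ΔVBE value below is illustrative.

```python
# Minimal sketch of the PTAT (proportional-to-absolute-temperature)
# relation used by bandgap temperature sensors:
#   delta_Vbe = (k*T/q) * ln(N)  =>  T = q * delta_Vbe / (k * ln(N))
from math import log

K_BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K
Q_ELECTRON = 1.602177e-19   # elementary charge, C

def temperature_from_delta_vbe(delta_vbe: float, current_ratio: float) -> float:
    """Return absolute temperature in kelvin from the forward-voltage
    difference (volts) of a junction biased at two currents with the
    given ratio."""
    return delta_vbe * Q_ELECTRON / (K_BOLTZMANN * log(current_ratio))

# Illustrative: a current ratio of 8 and a measured difference of
# about 53.4 mV correspond to roughly room temperature.
t_kelvin = temperature_from_delta_vbe(53.4e-3, 8.0)
print(f"{t_kelvin - 273.15:.1f} °C")
```

Because ΔVBE is linear in absolute temperature, a single gain/offset calibration at 25 °C pins the curve at that point, which is why error grows as the operating temperature moves away from it.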
Manufacturers of semiconductor temperature sensors specify typical and maximum temperature accuracy within certain temperature ranges. While typical values give some idea of the accuracy of a few devices under ideal conditions, customers should rely on the maximum values for a true indication of accuracy across multiple devices and under a variety of conditions.
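The practical consequence of the typical-versus-maximum distinction is design margin: a design validated only against the typical figure can still fail in production, because individual units are only guaranteed to the maximum. A minimal sketch, with entirely illustrative numbers (not from any datasheet):

```python
# Hedged sketch: checking an accuracy requirement against typical vs
# maximum datasheet figures. All values are illustrative.
typical_accuracy_c = 0.5   # "typical" error over the range, degC
maximum_accuracy_c = 1.0   # guaranteed maximum over the range, degC
required_margin_c = 0.8    # what the application can tolerate, degC

# Checking only the typical value looks fine...
print("typical meets requirement:", typical_accuracy_c <= required_margin_c)
# ...but the guarantee that applies to every shipped unit does not.
print("guaranteed meets requirement:", maximum_accuracy_c <= required_margin_c)
```

The design decision should hinge on the second check, since only the maximum value bounds every device across the full range of conditions.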
Power supply voltage can also affect the temperature accuracy of a semiconductor sensor. Devices with weaker internal voltage regulation exhibit larger accuracy reductions when the power supply deviates from its nominal voltage. Most manufacturers include this effect in their datasheet specifications, with maximum values typically in the range of ±0.2 °C/V to ±0.3 °C/V.
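The supply-sensitivity coefficient folds into a worst-case error budget as an additional term proportional to the supply deviation. A minimal sketch, using an illustrative coefficient within the ±0.2 to ±0.3 °C/V range quoted above and illustrative voltages, not figures from a specific part:

```python
# Hedged sketch: worst-case error budget combining the datasheet maximum
# accuracy with the supply-deviation contribution. Values are illustrative.
def worst_case_error_c(base_accuracy_c: float,
                       supply_coeff_c_per_v: float,
                       v_supply: float,
                       v_nominal: float) -> float:
    """Worst-case temperature error in degC: the guaranteed maximum
    accuracy plus the supply-sensitivity term scaled by the deviation
    from nominal supply voltage."""
    return base_accuracy_c + supply_coeff_c_per_v * abs(v_supply - v_nominal)

# A +/-0.5 degC (max) sensor with a +/-0.3 degC/V coefficient, nominally
# powered at 3.3 V but running from a rail that has sagged to 3.0 V:
err = worst_case_error_c(0.5, 0.3, 3.0, 3.3)
print(f"worst-case error: ±{err:.2f} °C")
```

Summing terms this way is conservative (it assumes both effects push in the same direction), which is the appropriate stance for a guaranteed error bound.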
In higher-accuracy devices with <±0.5 °C error, secondary effects begin to emerge that also play a role in overall accuracy. These are often specified separately from the overall accuracy specification in manufacturers’ datasheets and should therefore be added to the total error budget.
To find out more, download the full article here.