A resistor's Temperature Coefficient of Resistance (TCR) tells you how much its value changes as its temperature changes. It is usually expressed in units of ppm/°C (parts per million per degree Centigrade). What does that really mean?
Let's use an example: a 50 ohm 100 Series precision resistor has a (standard) TCR of 20 ppm/°C. That means its resistance will not change by more than 0.000020 ohms (20/1,000,000) per ohm per degree Centigrade of temperature change (within the rated temperature range of -55°C to +145°C, measured from a 25°C room-temperature baseline).
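Expressed as a formula (a restatement of the definition above, with the temperature change measured from 25°C):

$$
\Delta R_{\max} = R \times \frac{\mathrm{TCR}}{10^{6}} \times \Delta T
$$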
Assume our resistor is in a product that heats up from room temperature to 50°C. To find our 50 ohm resistor's maximum change caused by that 25°C rise, multiply 0.000020 times 50 (the resistor value) times 25 (the temperature change). The resistor's value would change by no more than 0.025 ohms (0.000020 × 50 × 25 = 0.025 ohms).
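As a quick sanity check, here is a minimal Python sketch of that calculation. The function name `max_resistance_change` is just an illustrative choice, not part of any library:

```python
def max_resistance_change(nominal_ohms: float, tcr_ppm_per_c: float,
                          delta_t_c: float) -> float:
    """Worst-case resistance change: R * (TCR / 1e6) * delta-T."""
    return nominal_ohms * (tcr_ppm_per_c / 1_000_000) * delta_t_c

# The example from the text: a 50 ohm resistor with a 20 ppm/degC TCR,
# heating from 25 degC room temperature to 50 degC (a 25 degC rise).
delta_r = max_resistance_change(50.0, 20.0, 25.0)
print(f"Maximum change: {delta_r} ohms")  # prints approximately 0.025 ohms
```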
The actual change may be much smaller, depending on the specific characteristics of that resistor. If you must guarantee a smaller resistance change in your application, the manufacturer can provide a nonstandard TCR as low as 1 ppm/°C.