Temperature is a measure of how hot or cold an object is. Hot objects radiate heat energy to their surroundings, while cold objects absorb it. Being able to measure temperature on a consistent scale is extremely important, as temperature affects how chemical and biological processes happen. For example, most chemical reactions happen faster at higher temperatures, yet some chemical structures, including enzymes, are destroyed (denatured) by excessive heat.
Over the course of history, people have come up with different ways to measure temperature, giving us a better understanding of temperature and its effects on the world around us. From the classic mercury-in-glass thermometer to the high-tech infrared sensors used today, these tools are highly useful in a range of industries. In this article, we'll look at the main methods for measuring temperature and their advantages and disadvantages.
Liquid in Glass Thermometer
This is the classic and one of the original methods used to measure temperature. The first sealed glass thermometer was created in 1646, though this early version used a mixture of alcohol and water. The mixture was sealed within a glass tube and would expand when exposed to heat. Over time, many variations on the design were created, allowing for greater accuracy and better results.
One of the most effective forms of glass thermometer was the design that used mercury. Despite being a highly toxic substance, mercury proved to be the best material to use inside thermometers. This is because the metal is liquid at room temperature and expands noticeably in response to small changes in temperature. As a result, the mercury glass thermometer remained the standard design for centuries after it was invented in 1714.
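The principle is simple thermal expansion: a heated liquid grows in volume, and forcing that extra volume through a narrow capillary turns a tiny volume change into a visible column movement. As a rough sketch (the bulb volume and capillary radius below are illustrative values, not taken from any particular instrument; mercury's volumetric expansion coefficient is about 1.82 × 10⁻⁴ per kelvin):

```python
import math

def column_rise_mm(delta_t_k, bulb_volume_mm3=200.0,
                   capillary_radius_mm=0.1, beta_per_k=1.82e-4):
    """Estimate how far the liquid column rises when the bulb is heated
    by delta_t_k kelvin, using linear volumetric expansion:
    dV = beta * V0 * dT, spread along a tube of cross-section pi * r^2.
    Bulb volume and capillary radius are hypothetical example values."""
    delta_v = beta_per_k * bulb_volume_mm3 * delta_t_k
    return delta_v / (math.pi * capillary_radius_mm ** 2)

# With these dimensions, a 1 K change moves the column by roughly 1 mm,
# which is why a narrow capillary makes small changes easy to read.
rise = column_rise_mm(1.0)
```

The design trade-off is visible in the formula: a bigger bulb or a thinner capillary both magnify the movement per degree, at the cost of a bulkier or more fragile instrument.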
The downside of glass thermometers is that they're fairly fragile. In addition, the mercury or alcohol inside can only withstand a certain range of temperatures. At higher temperatures, the device may break, potentially creating a hazard. Therefore, these thermometers are better suited to small temperature ranges, such as body temperature or outdoor temperature.
Electrical Resistance Thermometer
One of the key developments in measuring temperature came in 1821, when Humphry Davy demonstrated that the electrical resistance of a metal changes with temperature. Eventually, this idea was used to create fully electrical thermometers, which have become the gold standard for temperature measurement today.
Electrical resistance thermometers use a temperature-sensitive resistor, either a thermistor (typically a ceramic or metal-oxide element) or a resistance temperature detector (RTD) made of a pure metal such as platinum. The sensing material can be chosen to suit the temperature range being recorded. As the sensor absorbs or loses heat energy, its electrical resistance changes; this resistance is measured and used to calculate the exact temperature.
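For a thermistor, one common way to turn a measured resistance into a temperature is the beta-parameter equation, 1/T = 1/T₀ + (1/β) · ln(R/R₀). A minimal sketch, assuming a typical NTC thermistor with a nominal resistance of 10 kΩ at 25 °C and a manufacturer-supplied β of 3950 K (all example values, not tied to a specific part):

```python
import math

def thermistor_temperature(resistance_ohms, r0=10_000.0,
                           t0=298.15, beta=3950.0):
    """Convert an NTC thermistor resistance (ohms) to temperature (kelvin)
    using the beta-parameter equation:
        1/T = 1/T0 + (1/beta) * ln(R / R0)
    r0 is the nominal resistance at reference temperature t0;
    beta is the thermistor's material constant from its datasheet."""
    inv_t = 1.0 / t0 + math.log(resistance_ohms / r0) / beta
    return 1.0 / inv_t

# At the nominal resistance the equation returns the reference
# temperature exactly: 298.15 K, i.e. 25 degrees Celsius.
t = thermistor_temperature(10_000.0)
```

Note the inverse relationship: for an NTC device, a lower resistance means a higher temperature. Higher-precision work typically uses the three-coefficient Steinhart–Hart equation instead.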
These devices are excellent for providing fast and accurate results. The speed at which they can display the temperature is important in industrial and engineering settings where even a slight change in temperature can affect results. They can also be used to calibrate traditional thermometers due to the high level of accuracy.
Infrared Radiation Thermometer
Every object above absolute zero emits infrared radiation. While this radiation is invisible, it can be detected as heat energy. Infrared radiation thermometers focus the radiation released from the surface of an object and measure it, giving a precise temperature reading that is updated in real time.
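The physics behind this is the Stefan–Boltzmann law, P = εσT⁴, which relates the power an object radiates per unit area to its absolute temperature. An infrared thermometer effectively inverts this relationship. A simplified sketch (real instruments work over a limited wavelength band, but the total-power inversion shows the idea; the emissivity value is an assumed example):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)

def brightness_temperature(radiated_power_w_per_m2, emissivity=0.95):
    """Invert the Stefan-Boltzmann law, P = emissivity * SIGMA * T^4,
    to estimate a surface temperature (kelvin) from the total power it
    radiates per unit area. Emissivity (0-1) is a surface property;
    0.95 is a typical assumption for many non-metallic surfaces and
    must be set per target for accurate readings."""
    return (radiated_power_w_per_m2 / (emissivity * SIGMA)) ** 0.25

# A surface radiating ~418 W/m^2 at emissivity 0.95 comes out near
# room temperature (around 297 K).
t = brightness_temperature(418.0)
```

The fourth-power dependence is why the method works well even at a distance: a hot object radiates disproportionately more power, so small temperature differences produce measurable changes in the detected signal.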
The advantage of this device is that it can take accurate temperature readings without having to come into contact with the object. This is particularly useful in hazardous conditions or when temperatures are too high to safely take readings.