The ability to measure temperature accurately was a major scientific advancement, putting absolute numbers on an observable phenomenon.
How hot was it last summer? Will it be cold enough for skiing next week? Each of these questions forces us to put a number on a routine experience. Whether we are talking about the weather, cooking food, or conducting a scientific experiment, we need to know how hot or how cold something is, and knowing requires an accurate number for the concept. Although there is an entire field of study devoted to measuring temperature (thermometry), the focus of this section is on the fundamental measurement of temperature.
Monthly mean temperature: Temperature enables us to accurately measure and compare climates in different parts of the world.
The History of Measuring Temperature
To people in the 21st century, measuring temperature is a quick and easy thing to do. Thousands of years ago, however, things were different. Temperature-related phenomena were always being observed. Snow fell and collected in cold weather, and melted into liquid water when the air warmed in spring. Liquid water fell as rain when the air was warm. Ice melted when placed near a source of heat, and water completely boiled out of a pot on a hot stove. However, these are all qualitative observations. They do not generate a number: they do not tell us that water freezes at 0 °C, or that it boils at 100 °C. All we learn from observation is that heat and cold do something to water, or that water behaves differently when it is heated or cooled.
In the 16th and 17th centuries, scientists refined the observations and experiments of ancient Greek thinkers to produce rudimentary devices that indicated the amount of “hotness” or “coldness” in the air. The devices they built were called thermoscopes. These basic measuring tools relied on the expansion and contraction of air and water when heated and cooled.
The concept was remarkable, but thermoscopes had no numeric scale. A thermoscope could not answer the question “How hot is it today?” with a number; it could only give a relative indication. The thermoscope was often a simple tube of gas over liquid. Because the liquid level responded to both pressure and temperature, thermoscopes also served as barometers (which measure pressure), and this dual sensitivity made them difficult to use as thermometers. Even when early thermometers did have a numeric scale, the scales were not standardized.
The dawn of the 18th century saw great change in thermometers, thanks to the work of Isaac Newton, Anders Celsius, and Daniel Fahrenheit.
- Isaac Newton proposed a thermometer with a scale of 12 degrees between the freezing point of water and human body temperature.
- Daniel Fahrenheit worked with tubes filled with mercury, which has a very high coefficient of thermal expansion. This, combined with the quality and precision of his work, gave his thermometers much greater sensitivity. His scale, standardized against a brine solution, was widely adopted and named the Fahrenheit scale in his honor.
- Anders Celsius proposed a 100-degree scale between the freezing and boiling points of water; after a few minor adjustments, the Celsius, or centigrade, scale was also widely adopted.
Thermometer calibrated with the Celsius scale: Celsius is a temperature scale on which 0 °C is the freezing point of water. Our ability to measure temperature accurately enables us to record the weather, cook food properly, and conduct scientific experiments.
Further advances led to faster-acting thermometers, which were useful in medicine and chemistry. Early thermometers did not record or hold the temperature they were measuring: if you removed the thermometer from the substance being measured, its reading would change. Scientists invented new thermometers that would maintain their reading, at least for a limited period of time, to reduce measurement errors and make it easier to record the temperature. Dial thermometers using bimetallic strips were also developed. A bimetallic strip is made from two dissimilar metals bonded together, each with a different coefficient of thermal expansion. Upon heating or cooling, the two metals expand or contract at different rates, causing the strip to bend. This bending serves as a transducer for the temperature reading: it can drive a simple dial thermometer or switch a thermostat circuit.
Through the development of temperature measurement, however, one question remained unanswered: “How cold can it really get? How cold is absolute 0?”
The trivial answer is “0 degrees,” but what exactly does that mean? Temperature is a measure of the average kinetic energy of the particles in a substance. That kinetic energy arises from the motion of atoms and molecules, and it is postulated that at absolute zero there is no motion and therefore no kinetic energy: absolute zero is the coldest temperature physically possible.
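For the simplest case, a monatomic ideal gas, this link between temperature and motion can be made quantitative: the average translational kinetic energy per particle is (3/2)·k·T, where k is the Boltzmann constant and T is the temperature in kelvins. A minimal sketch of this textbook relation (the gas-specific formula is not part of the passage itself):

```python
BOLTZMANN = 1.380649e-23  # Boltzmann constant k, in J/K (exact SI value)

def average_kinetic_energy(temperature_k: float) -> float:
    """Average translational kinetic energy (in joules) per particle
    of a monatomic ideal gas at the given temperature in kelvins."""
    if temperature_k < 0:
        raise ValueError("temperature cannot be below absolute zero (0 K)")
    return 1.5 * BOLTZMANN * temperature_k

# At absolute zero the average kinetic energy vanishes:
print(average_kinetic_energy(0.0))    # 0.0
# At roughly room temperature (300 K) it is on the order of 6.2e-21 J:
print(average_kinetic_energy(300.0))
```

Note how the formula makes the idea of absolute zero concrete: the average kinetic energy goes to zero exactly when T = 0 K, and a negative temperature on this scale has no physical meaning.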
The question remains: how much colder is absolute 0 than 0 °C?
In 1848, Lord Kelvin (William Thomson) wrote a paper entitled “On an Absolute Thermometric Scale” about the need for a thermodynamic zero of temperature. Using the Celsius degree as his unit, Lord Kelvin calculated this ultimate cold temperature to be -273 °C; today that point is referred to as 0 K on the Kelvin thermodynamic temperature scale. Modern measurements have refined the value to -273.15 °C.
Types of Temperature Scales
Temperature can be measured and represented in many different ways. The fundamental requirements of the practice are accuracy, a standard, linearity, and reproducibility. The SI unit, chosen for its simplicity and its relationship to thermodynamics, is the kelvin, named in honor of Lord Kelvin. Its increments are equal in size to those of the Celsius scale, but a temperature in kelvins is a true representation of kinetic energy in the thermodynamic sense. Chemistry and physics require many calculations involving temperature; those calculations are always made in kelvins.
Comparison of temperature scales: Temperatures of some common events and substances in different units.
The comparison of temperature scales table illustrates a variety of temperature scales, some of which are no longer used. It is interesting to see the temperatures of commonly occurring events on these scales, and to imagine the great hurdles that were overcome in developing modern thermometry.
Conversion to and from kelvin: Use the equations in this table to calculate temperatures using the kelvin measurement system.
Although in most cases scientists have some sort of electronic calculator at hand, there are times when a conversion from one scale to another is required. The equations in the conversion table can be used to convert a measurement on any temperature scale, such as Celsius or Fahrenheit, to or from kelvins.
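As a concrete illustration, the standard conversion formulas (K = °C + 273.15 and °F = °C × 9/5 + 32) can be written as a few small helper functions. This is a sketch using the well-known formulas; the function names are illustrative, not from the text:

```python
ABSOLUTE_ZERO_C = -273.15  # absolute zero expressed on the Celsius scale

def celsius_to_kelvin(c: float) -> float:
    """K = °C + 273.15; the kelvin and the Celsius degree are the same size."""
    return c - ABSOLUTE_ZERO_C

def kelvin_to_celsius(k: float) -> float:
    """°C = K - 273.15."""
    return k + ABSOLUTE_ZERO_C

def celsius_to_fahrenheit(c: float) -> float:
    """°F = °C * 9/5 + 32; a Fahrenheit degree is 5/9 the size of a Celsius degree."""
    return c * 9.0 / 5.0 + 32.0

def fahrenheit_to_celsius(f: float) -> float:
    """°C = (°F - 32) * 5/9."""
    return (f - 32.0) * 5.0 / 9.0

# Water freezes at 0 °C = 273.15 K = 32 °F and boils at 100 °C = 373.15 K = 212 °F:
print(celsius_to_kelvin(0.0))        # 273.15
print(celsius_to_fahrenheit(100.0))  # 212.0
print(kelvin_to_celsius(0.0))        # -273.15, absolute zero
```

Because the kelvin and Celsius scales share the same degree size, converting between them is a simple offset; only the Fahrenheit conversions require rescaling as well.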