Hypothermia

History

Humans have known for millennia that exposure to cold can kill and that fatigue or exhaustion makes it worse. Actually defining and recognizing hypothermia, however, required a thermometer small enough to be used routinely on patients. Such a thermometer was invented in 1866 but was not widely available for medical use until decades later, and even after thermometers became common it took a long time to establish how warm the body should be.

Establishing what "normal" was required taking and recording the temperatures of a great many people, and all of those temperatures had to be taken the same way, a standardization that didn't exist for many years. The first major study of human temperatures was published in 1868 and discussed temperatures from more than 25,000 subjects with various diseases. Most were taken under the arm (axillary), a notoriously inaccurate method.

Even in the early years of using temperature as a diagnostic tool, doctors knew that abnormally low body temperatures were dangerous, but the condition didn't have a proper name. The term "hypothermia" didn't appear in print until about 1880, and even then it was used to mean different things, from having cold hands to not being "tolerant" of the cold. It wasn't clearly defined as doctors understand it today until the 20th century.

It was well known that hypothermia (even before it had a name) could be caused by exposure to cold, and the role of alcohol intoxication was recognized early on. The recognition that hypothermia can also occur during surgery, by contrast, is relatively modern.
