Temperature measurement using modern scientific thermometers and temperature scales goes back at least as far as the early 18th century, when Gabriel Fahrenheit adapted a thermometer (switching to mercury) and a scale, both originally developed by Ole Christensen Rømer. Fahrenheit's scale is still in use, alongside the Celsius and Kelvin scales.
Many methods have been developed for measuring temperature. Most of these rely on measuring some physical property of a working material that varies with temperature. One of the most common devices for measuring temperature is the glass thermometer. This consists of a glass tube filled with mercury or some other liquid, which acts as the working fluid. Temperature increases cause the fluid to expand, so the temperature can be determined by measuring the volume of the fluid. Such thermometers are usually calibrated so that the temperature can be read simply by observing the level of the fluid in the tube. Another type of thermometer that is rarely used in practice, but is important from a theoretical standpoint, is the gas thermometer.
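The volume-to-temperature step can be sketched numerically. The following is a minimal illustration (not from the article) assuming a simple linear volumetric expansion model V(T) = V₀(1 + β·ΔT); the function name and the reference volume are illustrative, and the expansion coefficient is an approximate textbook value for mercury.

```python
# Illustrative sketch: recovering temperature from the expansion of a
# liquid, assuming the linear model V(T) = V0 * (1 + beta * (T - T0)).

BETA_MERCURY = 1.8e-4  # approximate volumetric expansion coefficient of mercury, per °C

def temperature_from_volume(v, v0, t0=0.0, beta=BETA_MERCURY):
    """Invert V = V0 * (1 + beta * (T - T0)) to recover the temperature T."""
    return t0 + (v / v0 - 1.0) / beta

# A 1.000 cm^3 mercury volume that has expanded to 1.0018 cm^3
# corresponds to about 10 °C above the reference temperature.
t = temperature_from_volume(1.0018, 1.000)
```

In a real thermometer the tube's bore is uniform, so the fluid's height is proportional to its volume and the same linear inversion applies to the column height directly.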
Other important devices for measuring temperature include:
- Resistance Temperature Detector (RTD)
- Langmuir probes (for electron temperature of a plasma)
- Other thermometers
One must be careful when measuring temperature to ensure that the measuring instrument (thermometer, thermocouple, etc.) is really at the same temperature as the material being measured. Under some conditions, heat flow between the measuring instrument and the system can create a temperature gradient, so the measured temperature differs from the actual temperature of the system. In such a case the measured temperature varies not only with the temperature of the system, but also with its heat transfer properties. An extreme case of this effect gives rise to the wind chill factor, where the weather feels colder under windy conditions than calm conditions even though the air temperature is the same: the wind increases the rate of heat transfer from the body, producing a larger drop in skin temperature for the same ambient temperature.
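The approach to thermal equilibrium between instrument and system can be sketched with Newton's law of cooling. This is an illustrative model (not from the article): the coupling constant `k` here lumps together the heat-transfer properties, and all numerical values are made up for the example.

```python
# Illustrative sketch: a thermometer reading T relaxes toward the system
# temperature T_sys according to Newton's law of cooling,
#   dT/dt = -k * (T - T_sys),
# whose closed-form solution is used below. A larger k (e.g. better
# thermal contact, or more airflow) means faster equilibration.

import math

def reading(t, t0, t_sys, k):
    """Thermometer reading after time t, starting from t0."""
    return t_sys + (t0 - t_sys) * math.exp(-k * t)

# A thermometer starting at 20 °C in a 37 °C bath with k = 0.1 per second
# still reads noticeably low after 10 s, so one must wait for equilibrium.
r = reading(10.0, 20.0, 37.0, 0.1)  # about 30.7 °C
```

The same model shows why heat-transfer properties matter: doubling `k` halves the time constant, so two systems at the same temperature can produce different readings if the instrument is read before equilibrium.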
The theoretical basis for thermometers is the zeroth law of thermodynamics, which postulates that if three bodies A, B and C are such that A and B are at the same temperature, and B and C are at the same temperature, then A and C are at the same temperature. B, of course, is the thermometer.
The practical basis of thermometry is the existence of triple point cells. A triple point is a combination of pressure, volume and temperature at which three phases of matter are simultaneously present, for example solid, vapor and liquid. For a single component there are no degrees of freedom at a triple point, and any change in the three variables results in one or more of the phases vanishing from the cell. Therefore, triple point cells can be used as universal references for temperature and pressure. (See Gibbs' phase rule.)
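The degrees-of-freedom count comes directly from Gibbs' phase rule, F = C − P + 2. A minimal sketch:

```python
# Gibbs' phase rule: F = C - P + 2, where C is the number of components,
# P the number of coexisting phases, and F the degrees of freedom.

def degrees_of_freedom(components, phases):
    return components - phases + 2

# At the triple point of a pure substance (one component, three phases)
# there are no degrees of freedom: F = 1 - 3 + 2 = 0. This is why
# triple-point cells make good fixed reference points.
f = degrees_of_freedom(1, 3)  # 0
```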
Under some conditions it becomes possible to measure temperature by direct use of Planck's law of black-body radiation. For example, the cosmic microwave background temperature has been measured from the spectrum of photons observed by satellites such as WMAP. In the study of the quark-gluon plasma through heavy-ion collisions, single-particle spectra sometimes serve as a thermometer.
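A simplified stand-in for a full Planck-law fit is Wien's displacement law, λ_peak · T = b, which recovers a black-body temperature from the peak wavelength alone. This sketch is illustrative and not the actual WMAP analysis procedure.

```python
# Illustrative sketch: estimating a black-body temperature from the peak
# of its spectrum via Wien's displacement law, lambda_peak * T = b.

WIEN_B = 2.898e-3  # Wien's displacement constant, in m·K

def blackbody_temperature(peak_wavelength_m):
    """Temperature of a black body whose spectrum peaks at the given wavelength."""
    return WIEN_B / peak_wavelength_m

# The cosmic microwave background intensity peaks at a wavelength of
# roughly 1.06 mm, which corresponds to a temperature of about 2.7 K.
t_cmb = blackbody_temperature(1.06e-3)
```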
Comparison of temperature scales
| | Kelvin | Celsius | Fahrenheit | Rankine | Delisle | Newton | Réaumur | Rømer |
|---|---|---|---|---|---|---|---|---|
| Fahrenheit's ice/salt mixture | 255.37 | −17.78 | 0 | 459.67 | 176.67 | −5.87 | −14.22 | −1.83 |
| Water freezes (at standard pressure) | 273.15 | 0 | 32 | 491.67 | 150 | 0 | 0 | 7.5 |
| Average human body temperature ¹ | 310.0 | 36.6 | 98.2 | 557.9 | 94.5 | 12.21 | 29.6 | 26.925 |
| Water boils (at standard pressure) | 373.15 | 100 | 212 | 671.67 | 0 | 33 | 80 | 60 |
| The surface of the Sun | 5800 | 5526 | 9980 | 10440 | −8140 | 1823 | 4421 | 2909 |
¹ Normal human body temperature is 36.6 °C ±0.7 °C, or 98.2 °F ±1.3 °F. The commonly given value 98.6 °F is simply the exact conversion of 37 °C, and therefore has excess (invalid) precision.
Some numbers in this table have been rounded off.
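The entries in the table above are related by standard linear conversion formulas. A minimal sketch converting from Celsius to each of the other scales (the function name and dictionary keys are illustrative):

```python
# Converting a Celsius temperature into the other scales listed in the
# comparison table, using the standard linear conversion formulas.

def from_celsius(c):
    return {
        "kelvin": c + 273.15,
        "fahrenheit": c * 9 / 5 + 32,
        "rankine": (c + 273.15) * 9 / 5,
        "delisle": (100 - c) * 3 / 2,   # Delisle runs backwards: larger = colder
        "newton": c * 33 / 100,
        "reaumur": c * 4 / 5,
        "romer": c * 21 / 40 + 7.5,
    }

# The freezing point of water (0 °C) reproduces the table's row:
# 273.15 K, 32 °F, 491.67 °R, 150 °De, 0 °N, 0 °Ré, 7.5 °Rø.
freezing = from_celsius(0)
```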
- See main article: Negative temperature.
For some systems and specific definitions of temperature, it is possible to obtain a negative temperature. A system with a negative temperature is not colder than absolute zero; rather, it is, in a sense, hotter than infinite temperature.
In 1701, Ole Rømer (1644-1710) made one of the first practical thermometers. As a temperature indicator it used red wine. On his temperature scale, 0 represented the temperature of a salt and ice mixture (about 259 K).
In 1731, René-Antoine Ferchault de Réaumur (1683-1757) made a simpler temperature scale. On this scale 0 represented the freezing point of water (273.15 K) and 80 represented the boiling point (373.15 K).
In 1742, Anders Celsius (1701-1744) invented the centigrade or Celsius temperature scale in which 100° represented the boiling point of water (373.15 K) and 0° represented the freezing point (273.15 K).
The kelvin (symbol: K) is the SI unit of temperature. It is defined by two facts: zero kelvin is absolute zero (the point of minimum molecular motion), and one kelvin is the fraction 1/273.16 of the thermodynamic temperature of the triple point of water. The kelvin is named after the British physicist and engineer William Thomson, 1st Baron Kelvin.
- ^ The temperature of the air near the surface of the Earth is usually determined by a thermometer in a Stevenson screen. The thermometers should be between 1.25 m (4 ft 1 in) and 2 m (6 ft 7 in) above the ground, as specified by the World Meteorological Organization (WMO). The true daily mean, obtained from a thermograph, is approximated by the mean of 24 hourly readings and may differ by up to 1.0 °C from the average based on minimum and maximum readings.
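The difference between the two daily-mean estimates described in the note can be shown with a short calculation. The hourly readings below are made-up illustrative data, not measurements from any source.

```python
# Illustrative sketch: comparing the true daily mean (average of 24 hourly
# readings) with the commonly used average of the daily minimum and maximum.
# The data below are invented for the example.

hourly = [4, 3, 3, 2, 2, 2, 3, 5, 7, 9, 11, 13,
          14, 15, 15, 14, 13, 11, 9, 8, 7, 6, 5, 4]  # °C, one reading per hour

true_daily_mean = sum(hourly) / len(hourly)
min_max_mean = (min(hourly) + max(hourly)) / 2

# Because the daily temperature curve is not symmetric, the min/max
# average can differ from the true mean by on the order of 1 °C.
difference = min_max_mean - true_daily_mean
```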
- Timeline of temperature and pressure measurement technology
- color temperature
- Planck temperature
This page uses Creative Commons Licensed content from Wikipedia (view authors).