Relative humidity describes how far the air is from saturation. It is a useful way to express the amount of water vapor when discussing the amount and rate of evaporation. Relative humidity is commonly reported in weather forecasts because it is an important indicator of the rate of moisture and heat loss from plants and animals. One way to bring air to saturation, a relative humidity of 100%, is to cool it. It is therefore useful to know how much the air must be cooled to reach saturation.
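In quantitative terms, relative humidity compares the water vapor actually present with the amount required for saturation at the current temperature. Writing the actual vapor pressure as e and the saturation vapor pressure at the air temperature as e_s (symbols introduced here for convenience), the standard expression is

RH = (e / e_s) × 100%.

When e equals e_s, the air is saturated.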
The dew point is the temperature to which air must be cooled, at constant pressure, to become saturated. The pressure must be held fixed because changing it alters the vapor pressure and therefore the temperature at which saturation occurs; a change in pressure shifts the dew point slightly. The dew point is useful in forecasting minimum temperatures, the formation of dew and frost, and the development of fog.
When the dew point equals the air temperature, the air is saturated and the relative humidity is 100%. Although the dew point is a good measure of how much water vapor is actually in the air, by itself it does not tell us how close the air is to a relative humidity of 100%. To know how close the air is to saturation, we need both the dew point and the air temperature: the closer the dew point is to the air temperature, the closer the air is to saturation. The temperature, the dew point temperature, and the relative humidity are therefore related to one another.
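Using the same notation, this connection can be made explicit. Cooling moist air at constant pressure does not change its vapor pressure, so the actual vapor pressure equals the saturation vapor pressure evaluated at the dew point, e = e_s(T_d). Substituting this into the definition of relative humidity gives

RH = e_s(T_d) / e_s(T) × 100%,

which equals 100% exactly when T_d = T and decreases as the gap between the air temperature T and the dew point T_d widens.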
While it is important to learn the definitions of temperature, dew point temperature, and relative humidity, it is even more important to understand how these ways of describing the amount of water in the atmosphere are related to one another and how changing one affects the other two. To build that understanding, we invite you to explore the relationships, starting with the sketch below.
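One way to explore the relationship is with a short calculation. The sketch below is illustrative rather than part of the text above: it estimates the saturation vapor pressure with a Magnus-type approximation (using Bolton's 1980 constants), and the function names are our own. Other approximations give slightly different numbers.

```python
import math

def saturation_vapor_pressure(temp_c: float) -> float:
    """Approximate saturation vapor pressure (hPa) over liquid water,
    using a Magnus-type formula with Bolton (1980) constants."""
    return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))

def relative_humidity(temp_c: float, dew_point_c: float) -> float:
    """Relative humidity (%) from air temperature and dew point (deg C).

    The actual vapor pressure equals the saturation vapor pressure at the
    dew point, so RH = 100 * e_s(Td) / e_s(T).
    """
    return 100.0 * (saturation_vapor_pressure(dew_point_c)
                    / saturation_vapor_pressure(temp_c))

# When the dew point equals the air temperature, the air is saturated:
print(round(relative_humidity(20.0, 20.0)))   # 100
# Widening the gap between temperature and dew point lowers the relative humidity:
print(round(relative_humidity(30.0, 20.0)))   # about 55
print(round(relative_humidity(30.0, 10.0)))   # about 29
```

Changing the inputs makes the qualitative rules above concrete: raising the dew point toward the air temperature pushes the result toward 100%, while warming the air without adding moisture lowers it.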