Why Is Relative Humidity Important in Laboratory Corrosion Testing?
Last Updated 2020
Corrosion occurs when a metal is in contact with water and an electrolyte, such as a salt. In this corrosive environment, metals react to form metal oxides. Except for noble metals such as gold and platinum, metals are found in nature as oxides, so corrosion is effectively nature's way of returning refined metals to their natural state.
Although this concept is simple, simulating outdoor corrosion in the laboratory is very difficult in practice. Multiple oxides can form through complex, multi-step reactions that depend on specific environmental conditions, and the environmental cycling of temperature and moisture is the main reason outdoor corrosion mechanisms are so complex.

In weathering, we often talk about dew (condensation) and rain as sources of moisture. In corrosion, there is another moisture-related term: deliquescence. This is the phenomenon in which a salt absorbs moisture from the air and dissolves into a liquid solution once the ambient relative humidity exceeds a threshold. This threshold is known as the deliquescence relative humidity (DRH) and varies for different salts, as shown in the table below.
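As a rough illustration of the DRH concept, the sketch below checks whether a salt deposit would deliquesce at a given relative humidity. The DRH values are approximate room-temperature figures from the literature, and the function and dictionary names are ours for illustration, not part of any test standard.

```python
# Sketch: will a salt deposit deliquesce at a given relative humidity?
# DRH values are approximate literature values near 25 °C, for illustration only.
APPROX_DRH = {
    "sodium chloride": 75.0,
    "magnesium chloride": 33.0,
    "calcium chloride": 31.0,
}

def is_deliquescent(salt: str, relative_humidity: float) -> bool:
    """Return True if the ambient RH (%) exceeds the salt's deliquescence RH."""
    return relative_humidity > APPROX_DRH[salt]

# Example: at 50 % RH a sodium chloride deposit stays dry,
# but a magnesium chloride deposit forms a corrosive liquid film.
print(is_deliquescent("sodium chloride", 50.0))     # False
print(is_deliquescent("magnesium chloride", 50.0))  # True
```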
Deliquescence of salts can strongly affect the time of wetness of materials, which plays a major role in the corrosion experienced by specimens. To address this, the temperature and humidity transitions specified in modern corrosion test cycles are usually controlled so that the time spent above the DRH during a transition is consistent, regardless of which tester is used to run the cycle. Without controlled transitions, repeatability and reproducibility drop considerably.
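One way to see why the time above the DRH matters is to estimate time of wetness directly from a logged humidity trace. The sketch below is our own illustration, not part of any corrosion standard; the log values are made up, and it assumes each RH reading holds until the next sample.

```python
# Sketch: estimate time of wetness from a logged RH trace.
# Each sample is (minutes since start, relative humidity in %); values are illustrative.
rh_log = [(0, 40), (30, 60), (60, 80), (90, 95), (120, 70), (150, 50)]
DRH = 75.0  # approximate DRH of sodium chloride, %

def time_above_drh(log, drh):
    """Sum the minutes during which RH exceeded the DRH, assuming each reading
    holds until the next sample."""
    total = 0
    for (t0, rh), (t1, _) in zip(log, log[1:]):
        if rh > drh:
            total += t1 - t0
    return total

print(time_above_drh(rh_log, DRH), "minutes of wetness")  # 60 minutes of wetness
```

Two test chambers running nominally the same cycle but with different transition speeds would produce different totals from this kind of calculation, which is exactly the repeatability problem that controlled transitions are meant to avoid.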
Reproducing and controlling relative humidity is a major factor in achieving accurate simulation of outdoor corrosion in the laboratory.