Frequently Asked Questions

Q-FOG Cyclic Corrosion Testers

Q: What is Cyclic Corrosion Testing?

Cyclic corrosion testing is intended to be a more realistic way to perform salt spray tests than traditional, steady state exposures. Because actual atmospheric exposures usually include both wet and dry conditions, it makes sense to pattern accelerated laboratory tests after these natural cyclic conditions. Research indicates that, with cyclic corrosion tests, the relative corrosion rates, structure and morphology are more similar to those seen outdoors. Consequently, cyclic tests usually give better correlation to outdoors than conventional salt spray tests. They are effective for evaluating a variety of corrosion mechanisms, including general, galvanic, and crevice corrosion.

Cyclic corrosion testing is intended to produce failures representative of the type found in outdoor corrosive environments. CCT tests expose specimens to a series of different environments in a repetitive cycle. Simple exposures like Prohesion may consist of cycling between salt fog and dry conditions. More sophisticated automotive methods call for multi-step cycles that may incorporate immersion, humidity, and condensation steps along with salt fog and dry-off. Originally, these automotive test procedures were designed to be performed by hand: laboratory personnel manually moved samples from salt spray chambers to humidity chambers to drying racks, and so on. More recently, microprocessor-controlled chambers have been used to automate these exposures and reduce variability.
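As an illustration, a simple two-step fog/dry cycle of the kind described above can be sketched as a data structure that a chamber controller would step through. The step names, durations, and temperatures below are illustrative example values patterned on a Prohesion-style cycle, not taken from any standard; always consult the actual test method.

```python
# Illustrative sketch: a cyclic corrosion test expressed as a list of steps.
# Values are examples only -- check the governing standard before use.
from dataclasses import dataclass

@dataclass
class Step:
    name: str        # e.g. "FOG", "DRY", "HUMID"
    minutes: int     # duration of the step
    temp_c: float    # chamber air temperature setpoint

# One repeat of a simple fog/dry cycle (example values)
PROHESION_CYCLE = [
    Step("FOG", 60, 25.0),   # salt fog at near-ambient temperature
    Step("DRY", 60, 35.0),   # forced-air dry-off at elevated temperature
]

def total_cycle_hours(cycle):
    """Length of one repeat of the cycle, in hours."""
    return sum(s.minutes for s in cycle) / 60.0

def steps_for_test(cycle, test_hours):
    """Expand the repeating cycle to cover a test of the given length."""
    repeats = int(test_hours / total_cycle_hours(cycle))
    return cycle * repeats
```

Expressing the cycle as data rather than manual steps is essentially what a microprocessor-controlled chamber does to reduce variability.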

See Technical Bulletin LF-8144 for more information.

Q: What Water Purity is Required for Q-Lab Testers?

Based on internal research testing and consultation with water purification experts, Q-Lab has harmonized the water purity requirements for all QUV, Q-SUN, and Q-FOG tester models, with only two exceptions, as shown below: non-spray QUV models (QUV/basic, se, and cw) can use ordinary tap water, while Q-SUN spray and Q-FOG CRH models require higher-purity water. These guidelines are intended to prevent water spotting on test specimens, buildup of mineral scale deposits, and corrosion of test chambers and plumbing systems.

Tester Model              | Resistivity (Ω-cm) | Conductivity (µS/cm) | Total Dissolved Solids (ppm) | Colloidal Silica (ppm) | pH
--------------------------|--------------------|----------------------|------------------------------|------------------------|----
QUV/basic, se and cw      | ordinary tap water is OK
Q-SUN spray and Q-FOG CRH | > 5 M              | < 0.2                | < 0.1                        | < 0.1                  | 6-8
All other tester models   | > 200 k            | < 5                  | < 2.5                        | not specified          | 6-8

Additionally, Q-Lab strongly recommends the use of water purification systems that use reverse osmosis in addition to deionization (RO/DI systems). These systems are not expensive or difficult to operate, and are effective at removing colloidal silica that can cause water spotting on some test specimens. An RO/DI system can be more effective and economical over time than a two-stage deionization system with a Type 1 anion resin, which Q-Lab previously recommended.

In addition to the water purity specifications presented above, Q-Lab also recommends that customers use a water recirculation system wherever possible, to avoid standing water.

For more information, please see Q-Lab’s Technical Bulletin LW-6049 “The Importance of Water Purification for Weathering and Corrosion Testers.” This document is available on q-lab.com and includes detailed information on our research on water purity.

Q: What is Dew Point?

The dew point is the temperature at which dew (condensation) forms and is a measure of atmospheric moisture: it is the temperature to which air must be cooled, at constant pressure and water content, to reach saturation. Higher dew points correspond to higher moisture content, also known as absolute humidity.

The dew point represents the lowest temperature to which air having specific values of temperature and relative humidity (RH) can be cooled. At the dew point, air has a relative humidity of 100%, and additional cooling produces condensation rather than lowering the air temperature.

The figure below shows a constant dewpoint line of 12 °C, illustrating that a dew point can be represented by many combinations of RH and temperature. This dew point corresponds to a well-controlled lab of 23 °C and 50% RH (yellow star), an environment of 12 °C and 100% RH (dew point equals temperature by definition when RH is 100%), and all other points on the line. A tester can only meet conditions with higher dew point (hotter, more humid) than ambient lab conditions. Lower dew point conditions (cooler, drier) can only be met with conditioned air.
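This relationship can be checked numerically. The short sketch below uses the Magnus approximation for dew point; the coefficient values are a commonly published parameterization (an assumption of this example, not Q-Lab data), and it confirms that 23 °C at 50% RH sits on the 12 °C dew point line.

```python
import math

# Magnus approximation coefficients (a commonly published parameterization)
A, B = 17.625, 243.04  # B is in °C

def dew_point_c(temp_c, rh_percent):
    """Dew point (°C) from air temperature (°C) and relative humidity (%)."""
    gamma = math.log(rh_percent / 100.0) + A * temp_c / (B + temp_c)
    return B * gamma / (A - gamma)

# A well-controlled lab at 23 °C / 50% RH lies on the 12 °C dew point line
print(round(dew_point_c(23.0, 50.0), 1))   # ≈ 12.0
# At 100% RH the dew point equals the air temperature by definition
print(round(dew_point_c(12.0, 100.0), 1))  # 12.0
```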

[Figure: Dew Point Graph]

Q: Why is an air preconditioner used with the Q-FOG CRH Cyclic Corrosion Tester?

Q-Lab includes an air preconditioner as a standard accessory with every Q-FOG CRH tester. Not all manufacturers do this, so we are often asked why it is necessary. In short, the Q-FOG CRH air preconditioner ensures reliable, stable, repeatable chamber conditions and precise control of transitions, which are necessary elements for achieving reliable corrosion test results.

There are three major benefits of the air preconditioner:

  1. Cooling and dehumidifying laboratory air ensures consistent compliance with “ambient” dry-off conditions in standards such as VW PV1210 and GMW 14872.
  2. The air preconditioner reduces the dew point of air entering the chamber by drying the air. This allows the tester to comply with the Renault ECC1 cycle and other cycles with low dew points.
  3. Control of the air entering the chamber enables very precise linear transitions between conditions, which promotes test repeatability.

The graph below is an example of how the drying and cooling action shifts the range of available tester conditions. This example is for a well-controlled lab environment but a similar improvement is realized in hot, humid laboratories. The preconditioner shifts the incoming air dew point from the dotted black line to the solid blue line, making available the region shaded in green. This region includes several key “ambient” setpoints from major corrosion standards.

The air preconditioner eliminates a source of repeatability problems in corrosion testing: because the air it supplies to the tester has consistent temperature and relative humidity, the Q-FOG CRH can control test conditions and linear ramps precisely and consistently from test to test.
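The constraint described in this section can be sketched as a quick feasibility check: without conditioned air, a setpoint is reachable only if its dew point is at or above the dew point of the incoming lab air. The sketch below reuses a Magnus dew point approximation (the coefficients and the helper function are assumptions of this example, not Q-Lab's control algorithm).

```python
import math

# Magnus approximation coefficients (commonly published values)
A, B = 17.625, 243.04

def dew_point_c(temp_c, rh_percent):
    """Dew point (°C) from temperature (°C) and relative humidity (%)."""
    gamma = math.log(rh_percent / 100.0) + A * temp_c / (B + temp_c)
    return B * gamma / (A - gamma)

def reachable_without_conditioning(set_t, set_rh, lab_t, lab_rh):
    """True if the setpoint's dew point is not below the incoming air's."""
    return dew_point_c(set_t, set_rh) >= dew_point_c(lab_t, lab_rh)

# From a 23 °C / 50% RH lab (dew point ~12 °C):
print(reachable_without_conditioning(50.0, 95.0, 23.0, 50.0))  # True: hot and humid
print(reachable_without_conditioning(25.0, 30.0, 23.0, 50.0))  # False: needs dried air
```

Drying the incoming air lowers its dew point, which is exactly how the preconditioner makes cooler, drier "ambient" setpoints available.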

[Figure: Preconditioner Graph]


Q: Why Is Relative Humidity Important in Laboratory Corrosion Testing?

Corrosion is caused when a metal is in contact with water and an electrolyte, such as a salt. In this corrosive environment, metals react to form metal oxides. Except for noble metals such as gold, silver, and platinum, all metals exist as oxides in the environment. Corrosion is effectively nature’s way of returning refined metals back into their natural state.

Although this concept is simple, the practice of simulating outdoor corrosion in the laboratory is very difficult. Multiple oxides can form as a result of complex multi-step reactions that are dependent on specific environmental conditions. Environmental cycling of temperature and moisture is the main reason that outdoor corrosion mechanisms are so complex. In weathering, we often talk about dew (condensation) and rain as they relate to moisture. In corrosion, there is another term related to moisture, called deliquescence: the phenomenon in which a salt absorbs moisture from the air and forms a liquid solution once the relative humidity exceeds a threshold. This threshold, known as the deliquescence relative humidity (DRH), varies for different salts as shown in the table below.

[Table: Deliquescence relative humidity (DRH) of common salts]

Deliquescence of salts can strongly affect the time of wetness of materials, which plays a major role in the corrosion experienced by specimens. To address this, temperature and humidity transitions specified in modern corrosion test cycles are usually controlled to ensure that the time above the DRH during a transition is consistent, regardless of which tester is used to run the cycle. Without controlled transitions, repeatability and reproducibility drop considerably.
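The DRH concept lends itself to a simple wetness check, sketched below. The threshold values are approximate, rounded literature figures for 25 °C used here for illustration; they are not taken from Q-Lab's table.

```python
# Approximate deliquescence relative humidity (DRH) at 25 °C for a few
# common salts; rounded literature values, for illustration only.
DRH_PERCENT = {
    "MgCl2": 33.0,
    "NaCl": 75.3,
    "KCl": 84.3,
}

def surface_is_wet(salt, rh_percent):
    """A salt-contaminated surface holds a liquid film above the salt's DRH."""
    return rh_percent > DRH_PERCENT[salt]

# At 60% RH, an NaCl-contaminated surface stays dry while an MgCl2 one is wet
print(surface_is_wet("NaCl", 60.0))   # False
print(surface_is_wet("MgCl2", 60.0))  # True
```

This is why the time a cycle spends above the DRH, not just the nominal RH setpoint, determines the time of wetness a specimen experiences.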

Reproducing and controlling relative humidity is a major factor in achieving accurate simulation of outdoor corrosion in the laboratory.

Q: What is “Relative Humidity” and How Is It Measured in Q-FOG Testers?

Humidity is a general term describing the amount of water vapor in the air. It is a critical element of the outdoor environment and contributes to material degradation in both weathering and corrosion. Humidity can be expressed as either absolute humidity or relative humidity (RH). Absolute humidity is the mass of water vapor in a given volume of air, expressed in g/m³. Relative humidity is the amount of water vapor in the air compared to the amount the air would contain if fully saturated, expressed as a percentage. Relative humidity is the more common measure, both for describing human comfort and for characterizing natural and accelerated weathering.
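To make the distinction concrete, the sketch below converts a temperature/RH pair into an absolute humidity using the ideal gas law for water vapor and a Magnus saturation vapor pressure. The Magnus coefficients are commonly published values assumed for this example, not Q-Lab data.

```python
import math

def saturation_vapor_pressure_pa(temp_c):
    """Saturation vapor pressure in Pa, via the Magnus approximation
    (coefficients are commonly published values, assumed here)."""
    return 611.2 * math.exp(17.625 * temp_c / (243.04 + temp_c))

def absolute_humidity_g_per_m3(temp_c, rh_percent):
    """Mass of water vapor per volume of air, via the ideal gas law."""
    R_V = 461.5  # specific gas constant of water vapor, J/(kg·K)
    e = (rh_percent / 100.0) * saturation_vapor_pressure_pa(temp_c)
    return 1000.0 * e / (R_V * (temp_c + 273.15))

# Lab air at 23 °C / 50% RH holds roughly 10 g of water vapor per m³
print(round(absolute_humidity_g_per_m3(23.0, 50.0), 1))
```

Note that the same 50% RH corresponds to a much larger absolute humidity at higher temperatures, which is why RH alone does not pin down the moisture content of the air.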

Relative humidity can be measured in a number of ways. Q-Lab uses two methods in our testers: electronic measurement with a digital hygrometer in Q-SUN xenon arc testers, and mechanical measurement with a wet bulb/dry bulb hygrometer in Q-FOG corrosion test chambers.

Digital hygrometers are relatively common in everyday life. A digital hygrometer doesn’t require significant air flow, which makes it ideal for use in the Q-SUN tester and for ambient lab measurements. Digital hygrometers are readily available and simple to package.

A wet/dry bulb hygrometer uses thermometers, which makes it relatively easy to calibrate compared to a digital hygrometer. The wet/dry bulb method requires substantial air flow, which is not a problem in the Q-FOG tester’s blower module, and the hygrometer is also easy to keep free from corrosion. Salt fog would degrade and eventually destroy a digital hygrometer if it were used in a Q-FOG chamber.
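The wet/dry bulb principle can be sketched numerically with the standard psychrometric equation for a well-ventilated (aspirated) hygrometer. The psychrometer coefficient and Magnus constants below are common published values assumed for this illustration, not Q-FOG calibration data.

```python
import math

def saturation_vapor_pressure_hpa(temp_c):
    """Saturation vapor pressure in hPa, via the Magnus approximation
    (coefficients are commonly published values, assumed here)."""
    return 6.112 * math.exp(17.625 * temp_c / (243.04 + temp_c))

def rh_from_wet_dry_bulb(dry_c, wet_c, pressure_hpa=1013.25):
    """Relative humidity (%) from dry- and wet-bulb temperatures.

    Uses the psychrometric equation e = es(Tw) - A*P*(Td - Tw) with
    A = 6.6e-4 per °C, a typical coefficient for an aspirated sensor.
    """
    A = 6.6e-4
    e = saturation_vapor_pressure_hpa(wet_c) - A * pressure_hpa * (dry_c - wet_c)
    return 100.0 * e / saturation_vapor_pressure_hpa(dry_c)

# A 5 °C wet-bulb depression at 25 °C corresponds to roughly 63% RH;
# zero depression (wet bulb = dry bulb) means saturated air
print(round(rh_from_wet_dry_bulb(25.0, 20.0)))
print(round(rh_from_wet_dry_bulb(25.0, 25.0)))  # 100
```

The larger the wet-bulb depression, the drier the air; two robust thermometers and steady air flow are all the measurement requires, which is why the method suits a salt fog chamber.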

Q: Is the “Wet Bottom” Condition Required for the Q-FOG CRH to meet Corrosion Test Methods?

The term “wet bottom” is used in two common standards for corrosion testing - ASTM G85 and SAE J2334. ASTM G85 calls for a few centimeters of water retained in the bottom of the chamber; SAE J2334 calls for either a reservoir of immersion-heated water or standing water at the bottom of the chamber with compressed air bubbled through it.

The purpose of the wet bottom in both standards is twofold, with both purposes stemming from the lack of relative humidity (RH) control in most corrosion testers:

  1. to prolong the time it takes for specimens to dry off when specimen drying is desired, and
  2. to increase the humidity to promote continuous specimen wetness during certain phases of a test cycle.

The Q-FOG CRH tester’s precise control of relative humidity makes the wet bottom condition completely unnecessary. Controlling RH is a more precise way to accomplish the goals outlined above.

  • In the case of SAE J2334, the conditions specified are a chamber temperature of 50 °C and 100% RH. The Q-FOG CRH can meet this easily without the use of a wet bottom condition.
  • ASTM G85 Annex 2 and Annex 3 include the wet bottom condition as a means to produce a gradual increase of humidity from 65 to 95% (Annex 2) and to avoid dry conditions inside the test chamber (Annex 3). In both cases, a wet bottom isn’t necessary in the Q-FOG CRH. The tester can precisely control the relative humidity during transitions from one RH level to another, and can hold high-RH conditions during steps that require specimens to remain wet. In fact, even the Q-FOG CCT can meet this requirement using the HUMID function.

The problem with using the wet bottom in corrosion test methods is its poor repeatability and reproducibility. If you are concerned about running a repeatable and reproducible test that requires a wet bottom condition, we recommend the Q-FOG CRH with its standard RH control system, in place of the inferior wet bottom technique, for precise, repeatable wet/dry transitions.

Q: Is Bubble Tower Temperature Calibration in Corrosion Testing Really Necessary?

The short answer to this question is no, for two main reasons:

1) No international corrosion standard contains a mandatory requirement for bubble tower water temperature.
2) Bubble tower temperature is not critical to test results, which is why international corrosion test standards do not require bubble tower thermometer calibration.

The salt fog test standard ASTM B117, published in 1939, called for “saturation towers,” now known as “bubble towers,” to promote repeatability. However, by 1954 scientists and engineers understood that bubble towers were not a critical part of the test, and the language in ASTM B117 and its analog ISO 9227 has changed over time to clarify that bubble tower temperatures and compressed air pressures are non-mandatory. The test setup only requires that salt spray collections meet all specifications for rate, pH, and concentration. In fact, the latest version of ISO 9227 removed the requirement to have a bubble tower to run the test at all.

Bubble tower temperature is best understood as a tool to help meet the collection rates required in corrosion test standards. As an additional benefit, a bubble tower cleans incoming compressed air. However, bubble tower temperature does not require calibration or precise temperature control, since the software in the Q-FOG accelerated corrosion tester independently achieves excellent temperature control inside the chamber and accurate control of salt spray collection rates.

Although bubble tower temperatures are not mandated in standards, they can be both calibrated (compared to a reference value) and adjusted. However, Q-Lab strongly advocates keeping tests as simple as possible. Since Q-Lab and corrosion experts agree that other technical features and parameters in accelerated corrosion testing are far more important, Q-Lab’s view is that calibrating bubble tower temperature or atomizing air pressure is an unnecessary use of resources.
