RF fields can change radical concentrations and cancer cell growth rates (by Frank Barnes and Ben Greenebaum, IEEE Power Electronics Magazine)
Concerns have been raised about the possible biological effects of nonionizing radiation since at least the late 1950s with respect to radar, other radio, and microwave sources. More recent concerns have arisen about the potential effects of low-intensity fields, including low-frequency fields from the electric power generating, transmission, and distribution system and the devices it energizes, as well as intermediate, radio-frequency (RF), and higher-frequency radiation from devices such as cellphones, broadcast antennas, Wi-Fi, security monitors, and so forth. These are concerns about the direct effects of radiation on humans or other organisms. They are distinct from the electromagnetic compatibility issues that concern interference by the fields from one device with the function of another, though human health can be indirectly affected by electromagnetic interference with the function of medical devices, including hospital equipment or pacemakers.
Because of the difficulties in establishing the direct biological effects of long-term, low-level exposures, the lack of an understood mechanism, and difficulties in obtaining reproducible results, the guidelines for exposure limits have been set on the basis of relatively short-term exposures (minutes) that show clear-cut damage, with a substantial safety factor added. The current guidelines from the U.S. Federal Communications Commission (FCC) for limiting exposures of the general public in free space over the frequency range 100 kHz–100 GHz are given in Table 1. These guidelines are based on American National Standards Institute (ANSI) and IEEE recommendations. For cell phones, the specific absorption rate (SAR) is limited to 1.6 W/kg averaged over 1 g of tissue. These limits provide a significant safety factor over exposure levels known to cause damage, where the primary damaging mechanism is heating and the resulting rise in temperature. At low frequencies, the limits are based on induced current densities that would excite nerve firing, and the permissible exposures recommended by IEEE C95.6 are shown in Table 2. At 50 Hz, the International Commission on Non-Ionizing Radiation Protection (ICNIRP) sets the electric field exposure limit at 5 kV/m and the magnetic flux density limit at 100 µT. It also sets guidelines for general public exposures in the frequency range 3 kHz–10 MHz of E = 83 V/m and B = 27 µT, with a whole-body SAR of 0.08 W/kg and 1.6 W/kg averaged over 1 g.
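To make the SAR limit concrete, the sketch below uses the standard relation SAR = σ|E|²/ρ, where σ is the tissue conductivity, E is the rms electric field inside the tissue, and ρ is the tissue mass density, and compares the result with the FCC limit of 1.6 W/kg averaged over 1 g. The tissue parameters and internal field value are assumed, representative numbers chosen only for illustration; they are not taken from the article or from Table 1.

```python
# Illustrative sketch (not from the article): local SAR from the internal
# rms electric field, checked against the FCC 1.6 W/kg (1-g average) limit.
# All numerical values below are assumed, representative figures.

def sar_w_per_kg(sigma_s_per_m: float, e_rms_v_per_m: float, rho_kg_per_m3: float) -> float:
    """Local SAR = sigma * |E|^2 / rho, with E the rms field inside the tissue."""
    return sigma_s_per_m * e_rms_v_per_m ** 2 / rho_kg_per_m3

# Assumed example values for muscle-like tissue at cell phone frequencies.
sigma = 1.4        # tissue conductivity, S/m (assumed)
rho = 1040.0       # tissue mass density, kg/m^3 (assumed)
e_internal = 30.0  # rms electric field inside the tissue, V/m (assumed)

sar = sar_w_per_kg(sigma, e_internal, rho)
limit = 1.6  # FCC limit, W/kg averaged over 1 g of tissue
print(f"SAR ≈ {sar:.2f} W/kg; "
      f"{'within' if sar <= limit else 'exceeds'} the {limit} W/kg limit")
```

With these assumed values the local SAR comes out near 1.2 W/kg, below the 1.6 W/kg limit. Actual compliance testing averages SAR over 1 g of tissue and over time, which this single-point estimate does not capture.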
In general, environmental exposures at any frequency do not exceed these guidelines, especially for the general public; occupational exposures that approach or exceed the guidelines are less uncommon [1]. However, the time constants for cell growth cycles and many other growth phenomena are often hours or days. The most favored proposed mechanism for effects from low-level, long-term exposures involves radicals, molecules with unpaired electron spins that make them highly reactive, such as superoxide (O2•−), NOx, and H2O2, which is readily converted into the hydroxyl radical (•OH). These species act both as signaling molecules and as agents that can damage important biological molecules such as lipids and DNA. Conditions such as aging, cancer, and Alzheimer’s disease are associated with radical concentrations that remain elevated for extended periods of time [2]. In this article, we present possible theoretical mechanisms and experimental data showing that long-term exposures to relatively weak static, low-frequency, and RF magnetic fields can change radical concentrations. As a consequence, long-term exposure to fields below the guideline levels may affect biological systems and modify cell growth rates, although an organism’s built-in mechanisms may compensate for these changes.