This study examines the nonuniform exposure of the cornea to incident millimeter waves at 94-100 GHz. Two previous studies measured temperature increases in the rhesus cornea exposed to brief (1-6 s) pulses of high-fluence millimeter waves (94 GHz); one of these also estimated thresholds for corneal damage (reported as ED50, the dose producing a visible lesion 50% of the time). Both studies noted large variations in the temperature increase across the corneal surface due to wave interference effects. The present work investigates this variability using high-resolution simulations of mm-wave absorption and temperature increase in the human cornea exposed to plane-wave energy at 100 GHz; the calculations build on an earlier study. The simulations show that the peak temperature increase in the cornea from short (up to 10 s) exposures to high-intensity mm-wave pulses is 1.7-2.8 times the median increase, depending on the polarization of the incident energy. A simple one-dimensional "baseline" model provides a good estimate of the median temperature increase in the cornea. Two estimates are presented for the thresholds for producing thermal lesions, expressed as the minimum fluence of incident 100 GHz pulses: the first is based on thresholds for thermal damage from pulsed infrared energy, and the second on a thermal damage model. The mm-wave pulses considered here far exceed current IEEE and ICNIRP exposure limits but may be produced by some nonlethal weapons systems. Interference effects from wave reflections off structures in and near the eye produce highly localized variations in the energy absorbed in the cornea and surrounding facial tissues, and these are important to consider in a hazard analysis for exposures to intense pulsed millimeter waves.
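To make the idea of a one-dimensional "baseline" estimate concrete, the sketch below solves the 1-D heat equation for a semi-infinite tissue slab heated by an exponentially decaying volumetric source (Beer-Lambert absorption). This is an illustrative toy model, not the paper's simulation: the irradiance, penetration depth, and thermal properties are assumed placeholder values, and the real study used high-resolution electromagnetic and thermal simulations rather than this simplification.

```python
# Illustrative 1-D "baseline" temperature-rise sketch (NOT the paper's model).
# Assumed, hypothetical parameters: constant absorbed irradiance I0 during the
# pulse, Beer-Lambert deposition with energy penetration depth d (mm-wave
# penetration in cornea is sub-millimeter at 100 GHz), water-like thermal
# properties, insulated (adiabatic) front surface.
import numpy as np

def surface_temp_rise(I0=1.0e4,      # absorbed irradiance, W/m^2 (assumed)
                      t_pulse=3.0,   # pulse duration, s
                      d=0.4e-3,      # energy penetration depth, m (assumed)
                      k=0.58,        # thermal conductivity, W/(m K)
                      rho_c=4.0e6,   # volumetric heat capacity, J/(m^3 K)
                      L=5e-3,        # slab depth modeled, m
                      nz=500):       # number of grid cells
    """Explicit finite-difference solution of 1-D conduction with an
    exponentially decaying volumetric heat source; returns the surface
    temperature rise (K) at the end of the pulse."""
    dz = L / nz
    alpha = k / rho_c                      # thermal diffusivity, m^2/s
    dt = 0.4 * dz**2 / alpha               # stable explicit time step
    z = (np.arange(nz) + 0.5) * dz         # cell-center depths
    q = I0 * np.exp(-z / d) / d            # volumetric source, W/m^3
    T = np.zeros(nz)
    t = 0.0
    while t < t_pulse:
        lap = np.empty(nz)
        lap[1:-1] = T[2:] - 2 * T[1:-1] + T[:-2]
        lap[0] = T[1] - T[0]               # zero-flux (insulated) surface
        lap[-1] = T[-2] - T[-1]            # far boundary, effectively undisturbed
        T += dt * (alpha * lap / dz**2 + q / rho_c)
        t += dt
    return T[0]
```

Because conduction carries heat away from the surface during the pulse, the computed surface rise stays below the adiabatic bound `I0 * t_pulse / (d * rho_c)`; a baseline of this kind tracks the median corneal temperature rise but, by construction, cannot reproduce the localized interference hot spots the full simulations reveal.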
- radiation damage
- safety standards
- radiofrequency radiation (RFR)