From Office of Naval Research
Sensing tanks, as well as cancer
(Double-Blind Test Result): Unsupervised classification images of the left breast. The color scheme indicates the probability that each pixel belongs to the normal versus the abnormal heat-source class.
What does remote sensing for camouflaged enemy ground vehicles have to do with breast cancer diagnosis? By next year, perhaps plenty. Both find threats hidden in innocent clutter.
The Office of Naval Research's newly developed 200-channel hyperspectral remote sensing capability -- modeled on the human visual/brain "unsupervised" learning system, which can distinguish what's important from what's not -- is now being tested on both the LANDSAT satellite and the F-18 fighter jet as a passive electro-optical, infrared surveillance system. A range of sensors surveys the scene in question, and the incoming data is compared and contrasted using algorithms based on learning neural-network processing.
This algorithm, invented by ONR scientists Dr. Harold Szu and James Buss to increase the effectiveness of military surveillance systems, can dig out the nuggets of information needed even when they're buried in 'noise' and clutter. Hyperspectral sensors sweep up enormous quantities of data, but their usefulness has been limited by the difficulty of pulling the important information out of that clutter; the algorithm that processes the data is the critical factor.
Systems that do this sort of analysis are called ATR systems -- automatic target recognition. Last year the Under Secretary of Defense for Science and Technology asked ONR to look at the potential of ATR sensing to improve breast cancer diagnosis. The results of an initial test have been astounding. By demanding greater nutrition through an increased blood supply, abnormally reproducing cells generate higher concentrations of heat -- minute though it may be. Thermal breast scanning has been popular for a number of years, but its use has been limited to a single band, captured by a single camera. ONR's algorithm is based on the premise that a smart sensing algorithm requires a brain-like pair of eyes to reveal the composition of the information contained in a single pixel. And it doesn't presume to know what is being looked for. Applying their algorithm, Szu and Buss were able to classify the infrared heat distribution given off by abnormally reproducing cells. A pair of cameras -- operating at different infrared wavelengths* -- transcribed the thermal diffusion process into two images, which were then filtered for shared signals while disagreement noise was minimized. Last February Szu, Buss, and their team detected early-stage ductal carcinoma in situ (DCIS) in a test patient using a double-blind procedure (images: http://www.onr.navy.mil/onr/media/download.htm).
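ONR's actual algorithm is not described in detail here, but the two-camera idea -- keep the signal the bands agree on, discard the component where they disagree, then split pixels into two classes without labels -- can be sketched in a few lines. Everything below (the synthetic images, the eigen-decomposition fusion, the one-sigma threshold) is an illustrative assumption, not the published method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic co-registered 32x32 "mid-IR" and "long-IR" images:
# a faint warm spot (abnormal heat source) plus sensor noise.
h = w = 32
yy, xx = np.mgrid[0:h, 0:w]
hotspot = np.exp(-((yy - 20) ** 2 + (xx - 12) ** 2) / 18.0)
mid_ir = hotspot + 0.1 * rng.standard_normal((h, w))
long_ir = 0.9 * hotspot + 0.1 * rng.standard_normal((h, w))

# Treat each pixel as a 2-vector (mid, long) and diagonalize the 2x2
# covariance: the leading eigenvector carries the shared signal, the
# trailing one the band disagreement ("noise"), which is discarded.
pixels = np.stack([mid_ir.ravel(), long_ir.ravel()], axis=1)
pixels = pixels - pixels.mean(axis=0)
_, vecs = np.linalg.eigh(np.cov(pixels.T))  # eigenvalues ascending
lead = vecs[:, -1]
lead = lead * np.sign(lead.sum())           # fix arbitrary eigenvector sign
shared = pixels @ lead                      # shared-signal projection

# Unsupervised two-class split of the shared signal, standing in for
# the normal vs. abnormal heat-source classification.
labels = (shared > shared.mean() + shared.std()).reshape(h, w)
print(labels.sum(), "pixels flagged as abnormal heat source")
```

The point of fusing two bands before classifying is that noise uncorrelated across the cameras lands mostly in the discarded component, so a weak but consistent thermal feature survives that would be ambiguous in either band alone.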
"This multispectral, sub-pixel super-resolution is potentially more accurate by an order of magnitude," says Dr. Szu. "It is a passive, non-intrusive means of screening pre-cancer patients without radiation hazard, and may potentially be used to detect other dermal carcinomas." A provisional patent application has been filed. Follow-on research and clinical studies are being planned under Cooperative Research and Development Agreements (CRADAs).
*Mid-IR: InSb (Cincinnati Electronics), 3-5 micrometer wavelength; long-IR: platinum silicide (ICC), 8-12 micrometer wavelength.
To interview Dr. Szu and his team, please contact Gail Cleere.