Near-infrared light enables handheld diagnostic and screening instruments that could add considerable power to the arsenal of primary care and specialty physicians.
The handheld instruments that fill a physician's toolbox range from simple items, such as tongue depressors and reflex hammers, to more sophisticated devices, like ophthalmoscopes or stethoscopes. But none of them reflect technology advances in such areas as photonics-enabled medical imaging. Researchers hope to change that, however, with handheld devices based on near-infrared (NIR) optics.
NIR light hits the biological sweet spot because tissues absorb little of it, so it can penetrate deeply. To show the breadth of work in this area, this article explores three exciting developments in instrumentation: a device from the University of Illinois at Urbana-Champaign that uses optical coherence tomography (OCT) to screen primary care patients; a unit from MELA Sciences (Irvington, NY) that provides early detection of skin cancers by analyzing reflections from a collection of wavelengths; and a device being developed at Florida International University in Miami that detects breast cancer.
Screening over diagnostics
"Over half of the office visits by patients are internal medicine, pediatrics, and family medicine, where handheld instruments are ubiquitous," says Stephen A. Boppart, Bliss Professor of Engineering at the University of Illinois at Urbana-Champaign. Boppart wants to bring modern optics and analysis to these tools. In addition, he says, "I want to transform these tools from diagnostic technologies—looking at sick people to determine why they are sick—to screening technologies, or surveying the general population that is usually healthy, to look for something abnormal."
To pursue these goals, Boppart selected OCT. Using NIR light, the technique detects interference patterns that reveal subsurface structures in three dimensions. With this basic imaging technology, Boppart hopes to provide a screening tool that primary care physicians can use to examine the eyes, ears, nose, mouth, and skin in general.
The ears make a great place to start. As Boppart explains, more than 90 percent of infants get an ear infection by the time they are three years old, and half of infants suffer from chronic ear infections that require repeated courses of antibiotics or even the placement of ear tubes. "There's a clear link between ear infections and bacterial biofilms that grow in the middle ear," Boppart says. "With OCT, we can see these biofilms noninvasively, and we think that this will change the approach to treating this disease, because you should treat the patient until the biofilm is gone."
To turn OCT into an instrument, Boppart and his colleagues developed a handheld device that scans a beam of 840 nm light over tissue. The device includes interchangeable tips for imaging ears, eyes, or skin, and it collects a video image alongside the OCT data. "The 3D OCT data are the optical analogue of ultrasound," Boppart says. "Instead of using sound waves and collecting the reflections, we use light waves and collect their reflections." From those data, Boppart's device can generate two- or three-dimensional images. For skin, this device can image about 1–2 mm deep (see Fig. 1).
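The depth resolution of an OCT system like this one follows from its light source: for a Gaussian spectrum, axial resolution scales with the square of the center wavelength divided by the bandwidth. A minimal sketch of that textbook relation, using the article's 840 nm center wavelength and an assumed 50 nm bandwidth (the actual bandwidth of Boppart's source is not given):

```python
import math

def oct_axial_resolution_um(center_wavelength_nm: float, bandwidth_nm: float) -> float:
    """Axial (depth) resolution of OCT for a Gaussian source spectrum, in air.

    Standard relation: dz = (2 ln 2 / pi) * lambda0^2 / dlambda
    """
    dz_nm = (2 * math.log(2) / math.pi) * center_wavelength_nm**2 / bandwidth_nm
    return dz_nm / 1000.0  # convert nm -> micrometers

# 840 nm source as in Boppart's device; the 50 nm bandwidth is an assumed value.
print(round(oct_axial_resolution_um(840, 50), 1))  # prints 6.2 (micrometers)
```

The micrometer-scale depth resolution this gives is what lets the device resolve thin structures such as biofilms behind the tympanic membrane within its 1–2 mm imaging depth.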
|FIGURE 1. This handheld device uses optical coherence tomography (OCT) to reveal bacterial biofilms—behind the tympanic membrane (TM)—that cause ear infections. (Image courtesy of Stephen A. Boppart)|
In the next few years, funded by a U.S. National Institutes of Health Bioengineering Research Partnership, Boppart hopes to have a commercial prototype that is ready for clinical trials.
At Florida International University, Anuradha Godavarty, associate professor of biomedical engineering, and her colleagues develop handheld devices that measure the absorption of NIR light in tissue. "This images functionality instead of structures," Godavarty explains. For example, it can show differences in the levels of oxygenated and deoxygenated hemoglobin in blood. Because the rapid cell growth in breast cancer demands a rich supply of oxygenated blood, this technology can detect it.
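The oxygenation measurement rests on the Beer-Lambert law: at each NIR wavelength, the measured optical density is a weighted sum of oxy- and deoxy-hemoglobin concentrations, so measuring at two wavelengths yields a solvable two-equation system. A minimal sketch of that unmixing step; the extinction coefficients and wavelengths below are illustrative placeholders, not calibrated values from Godavarty's device:

```python
# Illustrative extinction coefficients [1/(mM*cm)] at two NIR wavelengths (assumed).
E = {
    690: {"HbO2": 0.35, "Hb": 2.10},
    830: {"HbO2": 1.05, "Hb": 0.78},
}

def unmix(od_690: float, od_830: float, path_cm: float = 1.0):
    """Solve the 2x2 Beer-Lambert system for (C_HbO2, C_Hb) in mM.

    od(lambda) = [E_HbO2(lambda)*C_HbO2 + E_Hb(lambda)*C_Hb] * path_cm
    """
    a, b = E[690]["HbO2"] * path_cm, E[690]["Hb"] * path_cm
    c, d = E[830]["HbO2"] * path_cm, E[830]["Hb"] * path_cm
    det = a * d - b * c  # Cramer's rule for the 2x2 system
    c_hbo2 = (od_690 * d - b * od_830) / det
    c_hb = (a * od_830 - od_690 * c) / det
    return c_hbo2, c_hb

# Round trip: synthesize optical densities from known concentrations, then recover them.
od1 = 0.35 * 0.06 + 2.10 * 0.02
od2 = 1.05 * 0.06 + 0.78 * 0.02
print(unmix(od1, od2))  # recovers approximately (0.06, 0.02)
```

The ratio of the recovered concentrations gives tissue oxygen saturation, the functional quantity that distinguishes a well-perfused tumor from surrounding tissue.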
To further advance this technology, two of Godavarty's students—Jean Gonzalez and Manuela Roman—recently developed a version of this technology that images in two ways. It can record absorption from reflectance, like ultrasound, or transillumination, like an x-ray. Their research showed that the reflectance mode could detect a 0.45 cm3 target as much as 2.5 cm from the surface, and the transillumination could detect it as deep as 5 cm. The imaging modes might be combined to further analyze tissues.
Godavarty adds that these devices use a local GPS to keep track of measurements at specific spots on the breast. "That makes this device operator-independent," she says. "Also, since it's handheld there's no compression." So far, this technology remains in development (see Fig. 2).
|FIGURE 2. At Florida International University, Anuradha Godavarty and her colleagues work as a team to develop a handheld device to detect breast cancer. (Image courtesy of Anuradha Godavarty)|
MELA Sciences already has a handheld device on the market. The MelaFind uses LED-based illumination to create 10 wavelengths, from 440 to 940 nm, and a CMOS sensor to capture images that can be used for the early detection of skin cancer. "It uses light to take images of the skin, so it's noninvasive," says Steve Wicksman, associate director of mechanical engineering at MELA Sciences.
The developers of this device employed 10 wavelengths because, as Wicksman explains, "each wavelength reacts with different layers of your skin." Consequently, the CMOS camera takes images—as .jpg files—of 10 layers of the skin. That camera uses a proprietary lens that must focus 10 wavelengths of light on the same plane. Researchers at MELA Sciences designed the lens, which is now manufactured by Carl Zeiss (Oberkochen, Germany). "The software can assemble sort of a 3D picture of a skin cancer lesion," Wicksman says.
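Conceptually, the 10 single-wavelength images form a multispectral cube, with each pixel carrying a 10-value spectrum across the skin's layers. A minimal sketch of that data structure, assuming hypothetical evenly spaced bands between the article's 440 and 940 nm endpoints (MelaFind's actual band placement and processing are proprietary):

```python
# Assumed band centers; only the 440 and 940 nm endpoints come from the article.
WAVELENGTHS_NM = [440, 500, 560, 620, 680, 740, 800, 860, 900, 940]

def build_cube(frames):
    """Order per-wavelength grayscale frames {wavelength_nm: 2D list} into a cube."""
    return [frames[w] for w in WAVELENGTHS_NM]

def pixel_spectrum(cube, row, col):
    """Read out one pixel's reflectance across all 10 bands (shallow to deep)."""
    return [band[row][col] for band in cube]

# Toy 2x2 frames where each pixel value is just the band index.
frames = {w: [[i, i], [i, i]] for i, w in enumerate(WAVELENGTHS_NM)}
cube = build_cube(frames)
print(pixel_spectrum(cube, 0, 0))  # prints [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```

Stacking the bands this way is what lets downstream software treat a lesion as "sort of a 3D picture" and score its structure across depth.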
The algorithms behind this device were originally designed for analyzing satellite images for the defense industry. It turns out that an algorithm that distinguishes enemy tanks from friendly ones can also detect cancerous tissue in healthy surroundings.
"The software looks at a database of thousands of lesions and scores it high or low for disorganization," Wicksman explains. "When skin is disorganized, that means it's cancerous." The output of the software indicates whether a dermatologist should take a biopsy of the lesion for further confirmation. "It's a tool to help dermatologists decide on lesions that are in that gray area where they're not sure," Wicksman says.
One day, handheld devices like these could change primary care. Consequently, tomorrow's routine medical examinations could turn far more quantitative and objective than today's.