By MIKE MAY
When trying to focus on a biological image at high magnification, just a few minutes of twiddling with adjustments on a microscope makes you beg for more depth of field. If only the scope could show more of the z-plane, making it easier to see structural relationships in space. Fortunately, both hardware and software solutions now give biologists more to see in z.
Often, a biologist wants to see more depth to understand the relationship between objects. "A lot of research is now based on understanding dynamic interactions between multiple molecules," says Stephen Ross, Ph.D., a senior scientist at Nikon (Melville, NY). "You want to know if two molecules are interacting. How close are they? Maybe you want to know the kinetics of the interaction." Optics do that better laterally than in depth. "A standard microscope can get down to about 200 nm laterally," says Ross, "but axially, we can only get down to 800 or 850 nm. So, the accuracy is about four times worse in z."
As explained by Kevin Ryan, senior project manager in the engineering department at Media Cybernetics (Bethesda, MD): "With extended depth of field, you can get detailed information in depth and keep the resolution in x and y." The question is: What technology gives a researcher more of the third dimension?
|FIGURE 1. The Olympus Fluoview FV1000-MPE multiphoton system with a 60x objective can capture cross-sectional images down to 700 μm from the surface, as shown in these three-dimensionally constructed images of neurons expressing EYFP in the cerebral neocortex of a mouse under anesthesia. (Image courtesy of Hiroaki Waki, Tomomi Nemoto and Junichi Nabekura, National Institute for Physiological Sciences, National Institutes of Natural Sciences, Japan.)|
Further with focus
For one thing, a researcher can see more in the z-plane with a stereomicroscope, but such scopes offer less magnification. So, the Olympus MVX10 provides the wider field of view and extended depth of field of a stereomicroscope, but can also switch to compound microscope-like performance. "If you want to zoom in to see an individual cell, you can, but then you sacrifice the depth of field," explains Edward A. Lachica, Ph.D., director of product marketing at the Olympus Corporation of the Americas (Center Valley, PA).
Other techniques can also be used to determine the z-plane relationship among, say, proteins. For example, Nikon uses one super-resolution technology, STORM (stochastic optical reconstruction microscopy), to locate objects in the z-plane with an accuracy of 50 nm. As Ross explains: "We induce an astigmatism in imaging molecules with STORM, and spots, images of molecules after a Gaussian fit, are circular if they are in the plane of focus. The spots get stretched horizontally if they are higher and stretched vertically if they are lower." The amount of the stretch can be used to determine the separation of objects in z.
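Ross's description translates directly into the arithmetic used to recover z: the ellipticity of the fitted spot encodes the axial offset. A minimal sketch, assuming a simple linear calibration (the function name and the slope constant here are hypothetical; real systems calibrate the width-versus-z curves with fluorescent beads on a piezo stage):

```python
import numpy as np

def z_from_astigmatism(w_x, w_y, slope_nm=400.0):
    """Estimate an axial (z) offset from the ellipticity of a fitted spot.

    In astigmatism-based 3D STORM, a cylindrical lens makes an emitter's
    image stretch along x on one side of focus and along y on the other.
    Here the log-ratio of the fitted Gaussian widths is mapped to a z
    offset through one linear constant (slope_nm is an invented value,
    not a calibrated one).
    """
    return slope_nm * np.log(w_x / w_y)

# A circular spot (w_x == w_y) sits in the plane of focus:
print(z_from_astigmatism(300.0, 300.0))   # 0.0
# Stretched horizontally (w_x > w_y): displaced to one side of focus
print(z_from_astigmatism(360.0, 300.0))
```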
Making technology like STORM work, however, also depends on the surrounding conditions. For instance, the temperature must be carefully controlled to limit expansion and contraction of the microscope. "For our super-resolution systems, we specify that the temperature should be kept within ±0.5°C to get optimum results," Ross says.
Focusing with more photons
To capture high-magnification images that show more in the z-plane, researchers can also turn to multiphoton technology. With a multiphoton system, a biologist can vertically section a sample. "The laser scans across a specimen and creates an image, like a page of a book," Lachica says. "Then, the focus drops down in z-space and scans another page. After doing this 300 or 400 times, you create an image of a complete story of where a cell is in Cartesian space." Using what Lachica describes as packets of energy (that is, photons), such scopes can image about 1 mm in z-space. "This was previously limited to 100 μm," says Lachica.
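The page-by-page acquisition Lachica describes maps naturally onto a simple data structure: each scan is a 2D array, and stacking the pages yields a 3D volume. A rough sketch (the dimensions, step size, and `scan_page` stand-in are illustrative, not from any particular instrument):

```python
import numpy as np

# Illustrative acquisition parameters (not from any particular instrument):
pages = 300             # optical sections -- the "pages of the book"
height, width = 512, 512
z_step_um = 3.3         # focus drop between pages; 300 pages * 3.3 um ~ 1 mm

def scan_page(z_index):
    """Stand-in for one laser scan at a given focal depth (returns a blank page)."""
    return np.zeros((height, width), dtype=np.uint8)

# Stack the pages along a new first axis: volume[z, y, x] then holds the
# complete story of where a structure sits in Cartesian space.
volume = np.stack([scan_page(i) for i in range(pages)], axis=0)

print(volume.shape)              # (300, 512, 512)
print(round(pages * z_step_um))  # ~990 um of depth covered
```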
"Imaging deeply with a multiphoton is no simple task because the excitation energy required to visualize fluorescent probes is 'attenuated' by the same optical elements required to image fluorescing samples," Lachica explains. "This effect, called group-velocity dispersion or GVD, reduces significantly the achievable imaging depth. Working with [Newport's] Spectra-Physics (Santa Clara, CA), Olympus offers a multiphoton microscope, the FV1000-MPE, with dispersion compensating optics." (See Fig. 1.) Lachica adds: "Dispersion compensating optics makes sure that we can deliver the package cleanly, so that we can see brightly our cell that may be a millimeter from the surface."
|FIGURE 2. Researchers can't always see as much depth as desired in some objects, such as insects (a-e), but more of the z-plane appears when using the Live EDF (extended depth of field) plug-in with Media Cybernetics' Image-Pro Plus software (f). (Images courtesy of Media Cybernetics.)|
This is like using a flashlight in a cave, as Lachica explains it. "The beam is very efficient at detecting objects nearby; objects further away are difficult to see because particulates in the air 'disperse' the signal coming from the flashlight," he says. "Dispersion compensating optics ensures that the packets of photons delivered by the ultra-fast laser reach their targets effectively or efficiently. That is, they correct the effects of GVD. As a result, we're able to see a distally displaced specimen, and we're able to see it brightly without having to use a lot of laser power."
To further increase the fidelity of information, Olympus puts the multiphoton detector as close to a scope's objective lens as possible. "This ensures that as many photons of light as possible get collected," Lachica says. So far, examples exist of imaging 700 μm below the surface of a sample with the FV1000-MPE.
A software solution
Software can also help a biologist collect information from various z-planes and simultaneously view them all. For example, Media Cybernetics' Image-Pro Plus software can be enhanced with the company's Live EDF (extended depth of field) plug-in. When asked how much this plug-in can extend depth of field, Ryan says, "As much as you are willing to take time to collect." So instead of a fraction of a micron in depth, a scientist can image structures across tens of microns or more. Simply put, this software grabs images as a researcher focuses deeper and deeper into a sample, and it builds a composite image as it goes. Imagine being able to see all the way through a plant cell, for example, with all of its structures represented as if in one plane. This plug-in also provides live tiling, so a researcher could build a composite image that covers more area and depth. To help the scientist keep track, the software shows the current image under the microscope, as well as the current composite image.
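The compositing step described above is essentially focus stacking: at each pixel, keep the value from whichever focal plane is locally sharpest. A simplified sketch, using gradient magnitude as the sharpness measure (Live EDF's actual criterion is not specified here, so this is an assumption about the general technique, not the plug-in's implementation):

```python
import numpy as np

def extended_depth_of_field(stack):
    """Fuse a focus stack into one all-in-focus composite.

    stack: array of shape (n_planes, h, w), one image per focal depth.
    For each pixel, keep the value from the plane where the image is
    locally sharpest -- here, largest gradient magnitude, one common
    sharpness measure among several used for focus stacking.
    """
    # Per-plane sharpness: gradient magnitude at every pixel.
    gy, gx = np.gradient(stack.astype(np.float64), axis=(1, 2))
    sharpness = np.hypot(gx, gy)
    # Index of the sharpest plane at each (y, x) position.
    best = np.argmax(sharpness, axis=0)
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols]

# Toy stack: each plane carries detail (is "in focus") in a different half.
stack = np.zeros((2, 4, 4))
stack[0, :2] = [[0, 9, 0, 9], [9, 0, 9, 0]]   # top half sharp in plane 0
stack[1, 2:] = [[0, 9, 0, 9], [9, 0, 9, 0]]   # bottom half sharp in plane 1
composite = extended_depth_of_field(stack)     # detail from both planes at once
```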
To use this software, all you need is a microscope and a camera supported by Image-Pro software (there's a list on the Media Cybernetics Web site at support.mediacy.com/drivers.asp). With an automated z-motor, researchers could even build three-dimensional images of a sample. When asked where this plug-in comes in handy in biology, Kathy Hrach, Image-Pro Plus product manager, says, "Any sample with some sort of depth," and that's just about anything. So, it could be used on a stereomicroscope to study insects or cells (see Fig. 2). It can even be used in molecular biology. For example, in an in situ hybridization experiment, the labeled spots on a genome can all be seen in one image by combining the z-planes across the DNA.
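For point-like labels such as hybridization spots, combining the z-planes can be as simple as a per-pixel maximum projection. A rough illustration (the stack dimensions and intensity values are invented):

```python
import numpy as np

# Toy z-stack of fluorescence images: three planes, each with one
# labeled spot in focus at a different depth (values are illustrative).
stack = np.zeros((3, 5, 5))
stack[0, 1, 1] = 200   # spot visible in the top plane
stack[1, 2, 3] = 180   # spot visible mid-stack
stack[2, 4, 0] = 220   # spot visible in the deepest plane

# A per-pixel maximum across z collapses the stack so that every
# labeled spot appears in a single image:
projection = stack.max(axis=0)
print(projection[1, 1], projection[2, 3], projection[4, 0])  # 200.0 180.0 220.0
```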
So, from insects under a stereomicroscope to DNA experiments, imaging technology reveals deeper information than ever. To boot, devices in development promise to unveil even more biological structures and experimental results.