Recent advances in lasers, optics and information technology will make biomedical imaging faster and cheaper, and will also reduce the need for surgery, explains Paul French
The invention of the silicon microchip and the resulting boom in information technology have shaped the 20th century to a far greater degree than other physics-based developments such as spacecraft and nuclear power stations. In the same way, new techniques aimed at improving the way that biological and medical information is collected and analysed will probably have a far greater impact on medicine in the 21st century than big breakthroughs such as organ transplantation or bionic implants.
Many important discoveries in physics have been rapidly exploited by the medical community for diagnosing and treating a variety of illnesses. Among the best-known examples are ultrasound, X-rays and magnetic resonance imaging.
Recent research efforts have focused on making medical diagnosis cheaper, faster and more effective by developing new physics-based imaging techniques and by using powerful computers to process and analyse biomedical information. New technological developments are being increasingly driven by the needs of the medical profession, and optics is playing an increasingly important role.
Medical diagnostics
According to some in the medical profession, the “Holy Grail” of medical data acquisition is akin to the Star Trek tricorder: a portable device that delivers a complete diagnosis when simply pointed at a patient. This is some way off, and today’s physicians have to collate information obtained using a variety of different techniques, ranging from X-ray and ultrasound imaging to laboratory analysis of samples of tissue removed from patients. Conventional imaging techniques can be used to establish physical damage or abnormality in the structure of various types of tissue, but unfortunately they cannot show biological processes within the body or detect the onset of many diseases.
Detecting physiological changes, such as the oxygenation of blood, often relies on distinguishing the different chemical behaviour of the organs or tissue under study.
Optical spectroscopy is a standard tool in laboratory analysis that is used to detect and monitor the concentration of substances, such as oxygen, by measuring the absorption, fluorescence or Raman spectra. But performing spectroscopy in biological tissue is very different from making measurements on a solution in a test tube.
A tissue sample is often made up of many different components – it is said to be heterogeneous. If one simply irradiates a sample with a light beam and observes the optical “signature”, there will typically be many contributions from the different types of tissue that the beam has interacted with, and these confuse the picture. Furthermore, quantitative optical measurements in biological tissue are extremely difficult because tissue scatters light strongly, typically within 100 µm. These considerations have prevented the widespread application of optics in clinical medicine.
A lot of recent research work has instead focused on magnetic resonance imaging (MRI), a technique that detects the signals generated by the nuclear spins of protons present in water and fat, for instance. However, MRI has limitations. It requires a very high magnetic field, which makes the equipment very expensive. There are also many molecules that it cannot detect with sufficiently useful resolution, and imaging at the cellular level remains a formidable challenge.
If the problems associated with scattering can be overcome, optical imaging could provide a cheaper technology that would allow patients to be screened for diseases such as skin and breast cancer, osteoarthritis and diabetes, as well as revealing much more about the working processes of organs, including the brain.
Fluorescence imaging
One of the most widespread biomedical research tools is the optical microscope, which allows researchers to study biological processes at the cellular level. In recent years microscopy has advanced greatly in terms of resolution and the ability to highlight specific working processes in a living organism. In a spectroscopic technique known as fluorescence microscopy, a sample absorbs incident photons and emits light at different wavelengths depending on the molecules present in the sample. A colour image of the fluorescence radiation therefore corresponds to a map of the distribution of the different fluorescent molecules, known as fluorophores, that are present.
Although strong scattering and the heterogeneous nature of samples make fluorescence imaging in tissue difficult, it can be achieved using a technique known as “wavelength-ratiometric imaging”, in which the tissue sample is typically doped with fluorophores whose spectra change in a predictable way. By comparing fluorescence-intensity images at two or more wavelengths, it is often possible to produce a map of the distribution of particular fluorophores, even though there may be a strong background from other naturally fluorescent molecules in the tissue.
The technique can be adapted to image the local chemical or physical environment in a tissue sample, such as the pH or calcium-ion concentration. The sample is stained instead with a fluorescent dye that has an emission spectrum that changes in a predictable way according to the strength of the local environmental perturbation. This method of imaging is used in many areas of science, especially biology, although it is restricted by the availability of dyes that are suitable for a particular application.
A complementary approach to fluorescence imaging exploits the lifetime, τ, rather than the wavelength dependence of the emission. Typically the fluorescence signal will decay as I = I0exp(-t/τ), where τ is a characteristic property of the fluorescent molecule. Fluorescence-lifetime imaging (FLIM) can be used to distinguish between different fluorophores in a field of view, or to image changes in the local fluorophore environment that modify τ. It is able to make use of a wide range of fluorescent dyes and can often be applied to naturally occurring fluorophores in biological samples.
However, useful changes in the fluorescence lifetimes of biological molecules can be as short as a few picoseconds and many fluorescent dyes have lifetimes in the nanosecond range or shorter. Until very recently most biomedical researchers did not have access to the technology that is required to image on these timescales. Advances in ultrafast lasers and in high-speed imaging detectors, such as CCD cameras and multichannel-plate optical intensifiers, mean that medical researchers are beginning to use FLIM to image living organisms.
Our group at Imperial College in London has developed a system that can image fluorescence-lifetime differences of less than 10 ps. It works by periodically exciting the sample with a series of 10 ps optical pulses from an ultrafast laser and essentially “photographing” the fluorescence-intensity image using a gated optical intensifier and CCD camera with an effective “exposure time” of 90 ps (figure 1). A sequence of fluorescence images is recorded at various time delays after each excitation. For each pixel in the field of view, the fluorescence decay constant, τ, corresponding to that particular part of the sample can then be calculated. The results are plotted in a fluorescence-lifetime image that can serve as a map of the distribution of different fluorophores or as a map of the changes in the local environment (see figures 2 and 3).
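The per-pixel lifetime calculation can be sketched numerically. The following is an illustrative single-exponential fit on synthetic time-gated images, not the Imperial College instrument's actual software; the gate delays and lifetimes are invented for the example.

```python
import numpy as np

# Illustrative sketch: recover a fluorescence lifetime tau at each pixel
# from a stack of time-gated intensity images, assuming a single-exponential
# decay I = I0*exp(-t/tau). All numbers are made-up demonstration values.

delays_ps = np.array([100.0, 300.0, 500.0, 700.0, 900.0])  # gate delays after excitation

# Synthetic 2x2-pixel "sample": two regions with different lifetimes (ps)
true_tau = np.array([[400.0, 400.0], [2000.0, 2000.0]])
I0 = 1000.0
stack = I0 * np.exp(-delays_ps[:, None, None] / true_tau[None, :, :])

# Log-linear least-squares fit per pixel: ln I = ln I0 - t/tau
logI = np.log(stack.reshape(len(delays_ps), -1))        # shape (gates, pixels)
A = np.vstack([np.ones_like(delays_ps), -delays_ps]).T  # design matrix
coeffs, *_ = np.linalg.lstsq(A, logI, rcond=None)
tau_map = (1.0 / coeffs[1]).reshape(2, 2)               # lifetime image (ps)

print(np.round(tau_map))  # recovers the 400 ps and 2000 ps regions
```

With noisy real data one would fit only gates with adequate signal, but the principle is the same: the slope of ln I against delay gives -1/τ at each pixel.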
This technique also makes it straightforward to analyse samples containing a mixture of fluorophores that have different exponential-decay profiles. In DNA sequencing, for instance, the different bases may be “labelled” with two different fluorophore molecules. By fitting the observed fluorescence-decay profiles to I = I1exp(-t/τ1) + I2exp(-t/τ2), one can obtain the ratio I1/I2, which is proportional to the concentration ratio.
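If the two lifetimes are known in advance (as they would be for calibrated labels), the two-component model above is linear in the amplitudes, so I1/I2 can be recovered by ordinary least squares. The sketch below uses invented lifetimes and amplitudes purely for illustration.

```python
import numpy as np

# Sketch of recovering the amplitude ratio I1/I2 from a mixed decay
# I(t) = I1*exp(-t/tau1) + I2*exp(-t/tau2). With the lifetimes known,
# the model is linear in I1 and I2 and a least-squares fit suffices.
# All numbers here are illustrative, not from the article.

tau1, tau2 = 500.0, 3000.0          # lifetimes of the two labels (ps)
I1_true, I2_true = 300.0, 900.0     # underlying amplitudes

t = np.linspace(0.0, 8000.0, 200)   # sample times (ps)
decay = I1_true * np.exp(-t / tau1) + I2_true * np.exp(-t / tau2)

# Design matrix with one basis decay per fluorophore
A = np.column_stack([np.exp(-t / tau1), np.exp(-t / tau2)])
(I1_fit, I2_fit), *_ = np.linalg.lstsq(A, decay, rcond=None)

print(I1_fit / I2_fit)  # amplitude ratio, proportional to the concentration ratio
```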
Optical imaging in 3-D
In 1873 Ernst Abbe recognized that conventional far-field microscopes have a limited resolution in both the transverse (xy) and vertical (z) planes. The resolution in the xy plane varies directly with the wavelength of light, λ, and inversely with the numerical aperture, which is a measure of the amount of light gathered by the microscope lens. In addition, conventional microscopes provide a degree of optical “sectioning” in the vertical direction via the depth of focus. The practical resolution limit should be around 0.2 µm for the transverse direction and around 0.7 µm in depth, assuming the maximum practical numerical aperture is 1.4 and the wavelength of the light is 500 nm. Unfortunately, this precision cannot be achieved in practice since the conventional microscope also collects light from outside the depth of focus and this blurs the depth-resolved image, in particular.
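The quoted figures can be checked with the standard textbook approximations: lateral resolution ~0.61λ/NA (the Rayleigh criterion) and axial resolution ~2nλ/NA², where the immersion refractive index n ≈ 1.5 is an assumption of this sketch rather than a number from the article.

```python
# Back-of-envelope check of the resolution figures quoted above, using
# standard approximations: lateral ~ 0.61*lambda/NA (Rayleigh criterion)
# and axial ~ 2*n*lambda/NA**2. The immersion index n is assumed here.

wavelength_nm = 500.0
NA = 1.4
n = 1.52  # typical oil-immersion refractive index (assumption)

lateral_nm = 0.61 * wavelength_nm / NA
axial_nm = 2.0 * n * wavelength_nm / NA**2

print(round(lateral_nm), round(axial_nm))  # ~218 nm and ~776 nm
```

Both numbers agree with the ~0.2 µm transverse and ~0.7 µm depth limits given in the text.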
In 1955 Marvin Minsky of Harvard University in the US invented the scanning confocal microscope, which overcomes this problem by using a point source of light to interrogate the object pixel-by-pixel and a confocal pinhole at the detector to ensure that only light from a particular pixel is collected. A complete 2-D image is obtained by scanning the illumination across all the pixels. And because out-of-focus light is rejected by the detector pinhole, it is possible to view the object at various depths and build up a high-resolution 3-D image.
Indeed, confocal scanning can be used in combination with fluorescence imaging to acquire 3-D spectroscopic images of biological samples, a technique that is becoming a standard clinical research tool. One drawback, however, is that it takes a long time to interrogate all the pixels sequentially and build up the image. Also, it is necessary to use a laser to provide sufficiently bright illumination through the pinholes, which can be both costly and rather limited in terms of the wavelength range. Recently, however, low-cost and short-wavelength solid-state lasers have become commercially available and ever more powerful computing capabilities reduce the time it takes to render an image.
But there is a more serious drawback in using confocal fluorescence microscopy to image biological samples. While the confocal microscope only collects light from the sectioned image plane, the out-of-focus sample in the beam is nevertheless irradiated. This extended exposure to laser radiation can damage or destroy the sample. However, it is possible to avoid this problem by using an ultrafast laser that delivers optical pulses to excite fluorescence in the sample.
Even a laser with a relatively low average power can deliver high-intensity light pulses that last for a fraction of a picosecond. The intensity is sufficiently high for the molecule to absorb two photons simultaneously and produce the same excitation as a single photon with twice the energy. As the rate of two-photon absorption is proportional to the square of the light intensity, it is relatively straightforward to focus the beam to ensure that the excitation only occurs at one point.
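The localization argument can be made concrete for an idealized focused Gaussian beam: the power crossing each plane is constant, so one-photon excitation integrated over a plane does not depend on defocus, whereas the plane-integrated two-photon signal (proportional to intensity squared) falls off away from the focus. The beam parameters below are arbitrary illustrative values.

```python
import numpy as np

# Why two-photon excitation is confined to the focus: compare plane-integrated
# one-photon (proportional to I) and two-photon (proportional to I**2) signals
# for an idealized focused Gaussian beam, at the focus and at 3 Rayleigh ranges.

w0 = 1.0   # focal spot radius (arbitrary units)
zR = 1.0   # Rayleigh range
P = 1.0    # beam power

def signals(z, r_max=50.0, n=200000):
    """Plane-integrated 1-photon and 2-photon signals at defocus z."""
    w = w0 * np.sqrt(1.0 + (z / zR) ** 2)             # beam radius at z
    r = np.linspace(0.0, r_max, n)
    dr = r[1] - r[0]
    I = (2.0 * P / (np.pi * w**2)) * np.exp(-2.0 * r**2 / w**2)
    one = np.sum(I * 2.0 * np.pi * r) * dr            # total power: constant
    two = np.sum(I**2 * 2.0 * np.pi * r) * dr         # two-photon rate per plane
    return one, two

one_0, two_0 = signals(0.0)
one_3, two_3 = signals(3.0)
print(one_3 / one_0)   # ~1: one-photon excitation is not confined in depth
print(two_0 / two_3)   # ~10 = 1 + (z/zR)**2: two-photon signal is localized
```

This is why a two-photon microscope provides inherent optical sectioning without a detector pinhole: essentially no fluorescence is generated outside the focal region.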
Fluorophores with high-energy ultraviolet absorption bands can therefore be excited by much lower energy, and hence less damaging, visible or infrared radiation. In other words two-photon microscopy can permit 3-D imaging of many types of biological tissue for the first time without killing the cells under investigation. This will have a profound impact on biomedical research.
A further advantage of two-photon microscopy is that it permits quantitative fluorescence imaging at greater depths than conventional confocal microscopy. This feature was exploited by Winfried Denk at AT&T Bell Laboratories in the US, who used the technique to image the concentration of calcium ions in order to study the signal transmission between nerve cells.
There have been many other important advances in optical microscopy and the field continues to evolve rapidly. Stefan Hell’s group at the Max Planck Institute for Biophysical Chemistry in Göttingen, Germany, has developed a two-photon microscope that provides a confocally sectioned image in real time. And Tony Wilson’s group at Oxford University in the UK has developed a whole-field white-light microscope that has the sectioning capability of confocal microscopy but the speed and much lower cost associated with conventional microscopes.
Currently, the best resolution obtained using a confocal microscope at 543 nm is ~ 150 nm in the xy direction and ~ 420 nm in the z direction. To achieve the resolution needed to probe intracellular structure, Hell and co-workers have combined a novel optical microscope using two objective lenses and sophisticated image processing to yield optical resolutions of less than 100 nm. Optical microscopy is now beginning to approach the resolution that can be achieved with scanning probe microscopy.
Imaging through tissue
The most reliable way to detect disease is to look for characteristic changes in tissue samples that have been taken from a patient during a biopsy. But there is an increasing demand for low-cost medical diagnostic tools that are capable of imaging parts of the body without the need for surgery. “Optical biopsy” appears to be a promising tool for diagnosing and monitoring diseases such as cancer, but the difficulties associated with the scattering of the light need to be resolved. Optical techniques will therefore find their first applications at the surface of tissues – either externally, in the skin, eyes and mouth, for example, or internally via an endoscope. But endoscopy would be much more powerful if it were possible to image below the surface of tissues.
The absorption and scattering of light in biological tissue can be illustrated by shining a torch at one’s hand. It is possible to see a reddish glow, but not the outline of the bones that are in the path of the beam. The bones are not visible because the light is multiply scattered in the tissue. The reddish glow is readily understood from the absorption profile of the most common constituents of biological tissue. There is an absorption minimum in the near infrared around 830 nm that will preferentially transmit the red components of the beam rather than the shorter visible wavelengths. The relatively high-transmission spectral region between 650 and 1300 nm is often described as the “optical window” of biological tissue. Semiconductor lasers and recently developed solid-state laser technology can now conveniently provide high-power, tuneable optical radiation in the near infrared and this has enabled many researchers to tackle the challenge of imaging through biological tissue.
When imaging thin (< 2 mm thick) tissue samples, it is possible to use photons that have not been scattered: these are called ballistic photons and they can be used to form high-resolution images. The number of ballistic photons, however, decreases exponentially with propagation distance. So the ballistic signal is usually swamped by multiply scattered photons that obscure any image and saturate most detectors (figure 4). For relatively shallow tissue depths it is possible to use various filtering techniques to block the multiply scattered photons. Indeed, a lot of current research is aimed at extending optical imaging and biopsy to depths of a few millimetres this way. Confocal microscopy, for example, rejects much of the scattered light through its spatial filtering action and has been shown to form useful images at tissue depths of up to 0.3-0.5 mm.
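The exponential loss of ballistic photons can be made concrete with a Beer-Lambert estimate, taking the ~100 µm scattering length mentioned earlier as the mean free path (an order-of-magnitude assumption for soft tissue).

```python
import math

# Rough estimate of why ballistic imaging is limited to thin samples:
# the unscattered fraction falls as exp(-d/l_s), where l_s is the
# scattering mean free path (~100 µm in soft tissue, as noted above).

l_s_mm = 0.1  # scattering mean free path, ~100 µm (assumption)

for d_mm in (0.5, 1.0, 2.0):
    ballistic_fraction = math.exp(-d_mm / l_s_mm)
    print(f"{d_mm} mm: {ballistic_fraction:.1e}")
```

At 1 mm the ballistic fraction is already below one part in 10⁴, and at 2 mm below one part in 10⁸, which is why the ballistic signal is so easily swamped by multiply scattered light.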
However, imaging to tissue depths beyond ~0.5 mm does not appear to be practical using optical microscopy and spatial filtering alone. Therefore, more sophisticated techniques have been developed to favour the detection of the ballistic photons. These techniques either exploit the fact that the ballistic-signal photons retain their coherence with the incident light, or that the ballistic photons arrive at the detector earlier than any diffuse photons that have been scattered back into their original path.
James Fujimoto and co-workers from the Massachusetts Institute of Technology in the US have successfully developed a ballistic-light imaging technique, known as optical coherence tomography (OCT), which combines confocal microscopy with heterodyne detection using low-coherence light (figure 5). The weak ballistic-light signal reflected from various layers within the tissue sample is combined with a powerful, coherent reference beam. The resulting interference or “beat” signal is detected with high sensitivity using a lock-in amplifier. Using low-coherence-length radiation means that interference will only occur when the reference signal and ballistic-light signal have travelled the same optical distance (to within the coherence length). By adjusting the pathlength of the reference beam, one can therefore detect the ballistic signal from specific depths in the sample and so build up a depth-resolved image. This has proved clinically useful in acquiring depth-resolved images in the eye and many groups are now developing the technique for imaging in strongly scattering biological tissue.
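The coherence gating at the heart of OCT can be demonstrated with a small numerical sketch: summing the interference term over a Gaussian spectrum of wavenumbers shows that fringes survive only when the sample and reference path lengths match to within the coherence length. The spectral parameters below are illustrative, not those of any particular OCT system.

```python
import numpy as np

# Numerical sketch of coherence gating: for a broadband (low-coherence)
# source, the spectrally averaged interference term is large only when the
# sample and reference paths are matched; at large path mismatch the fringes
# from different wavenumbers wash out. Parameters are illustrative.

k0 = 2.0 * np.pi / 0.83     # centre wavenumber for an 830 nm source (1/µm)
dk = 0.05 * k0              # spectral width (broadband source)
k = np.linspace(k0 - 4 * dk, k0 + 4 * dk, 4001)
S = np.exp(-((k - k0) / dk) ** 2)   # Gaussian source spectrum

def fringe_envelope(delta_um):
    """Magnitude of the spectrally averaged interference term at path mismatch delta."""
    return abs(np.sum(S * np.cos(k * delta_um)) / np.sum(S))

matched = fringe_envelope(0.0)       # paths matched: strong beat signal
mismatched = fringe_envelope(50.0)   # 50 µm mismatch: fringes wash out
print(matched, mismatched)
```

Scanning the reference pathlength therefore selects reflections from one depth at a time, which is exactly how the depth-resolved image is built up.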
Although the image acquisition time is relatively slow (because the image must be built up pixel by pixel), the use of ultrafast lasers to provide high average power, low-coherence radiation permits OCT systems to provide real-time depth-resolved images. This technique can readily be used in conjunction with an arthroscope – a type of endoscope specially designed to examine joints. Mark Brezinski at Harvard Medical School in the US has worked with Fujimoto’s group to apply OCT to detect changes in the orientation of the tissue surrounding joints that can signal the advent of osteoarthritis.
Deep imaging
The techniques that rely on ballistic-light detection can only image tissue to depths of a few millimetres. But how do we image through many centimetres of tissue when the ballistic-light signal is not detectable? Can we extract useful information from the multiply scattered photons?
For moderate tissue depths we can exploit the fact that biological tissue tends to scatter light in the forward direction, meaning that most photons will only deviate slightly from their original direction. After a few centimetres of tissue, there can be a significant number of photons that have followed a reasonably well defined “snake-like” path about the original direction through the tissue (figure 4c). The transmitted light therefore comprises three components: ballistic, “snake” and diffuse photons. The “snake light” will arrive at the detector after the ballistic light but before the fully diffuse photons, and can still retain some coherence with the incident light. Images formed using the least-scattered snake light have a poorer resolution compared with the ballistic-light images, but sub-millimetre spatial resolution is still possible. Humio Inaba from Tohoku Institute of Technology in Japan has exploited the technique to image through thicker biological tissue samples, such as human fingers and teeth. And Robert Alfano’s group at the City University of New York has demonstrated improved imaging using polarization to preferentially select the “snake light”.
Another approach is to divide the light arriving at the detector into time windows using ultrafast lasers and a variety of high-speed cameras or photon-counting systems that provide picosecond time gates ranging from ~ 1-100 ps. This permits the earliest arriving, least scattered, light to be selected and provides information about the tissue structures that produced the scattered-light distribution.
Inverse problem images deeper
For many important biomedical applications, such as mammography or functional imaging of the brain, it is necessary to penetrate through several centimetres of tissue, after which almost all the detected signal is diffuse and the ballistic-light signal is negligible. But it is possible to get round this by tackling the “inverse problem”, which entails measuring the scattered-light signal as comprehensively as possible and calculating what distribution of material would have produced this measured signal. The calculations exploit statistical models of photon transport and can have varying degrees of accuracy.
By considering the most probable paths that the photons will take from a given source to a detector, one can probe a volume of tissue and quantify changes in the optical properties within that volume. In general, there will be least uncertainty in the paths of the photons that are detected earliest and so the volume through which they might have travelled can be more precisely defined. This approach provides a means to quantify the average optical properties of biological tissue and to form relatively low-resolution tomographic images.
Optical fibres are often used to conveniently deliver light from a laser source to the area of tissue under investigation and also to collect the scattered light (figure 6a). The arrival time of the scattered photons can be measured in various time windows using high-speed detectors in conjunction with ultrafast laser sources (the time-domain approach) or from the phase delay of the collected light with respect to an incoming sinusoidally modulated beam (the frequency domain). In theory the two approaches are equivalent, but in practice the time-domain approach has superior resolution compared with the frequency-domain approach because more complex (and expensive) apparatus has been used.
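The relationship between the two approaches is simple: a mean photon delay t imposes a phase lag φ = 2πft on light modulated at frequency f, so t = φ/(2πf), and the mean optical pathlength follows as L = (c/n)t. The modulation frequency, measured phase and refractive index below are example values, not data from any instrument.

```python
import math

# How a frequency-domain measurement maps to mean time-of-flight:
# a photon delay t imposes a phase lag phi = 2*pi*f*t on light modulated
# at frequency f, so t = phi/(2*pi*f), and the mean optical pathlength
# is L = (c/n)*t. All numbers are example values.

c = 3.0e8            # speed of light (m/s)
n = 1.4              # assumed refractive index of tissue
f = 100e6            # modulation frequency (Hz)
phi = 0.2 * math.pi  # measured phase lag (rad), example value

t = phi / (2.0 * math.pi * f)   # mean time-of-flight (s)
L = (c / n) * t                 # mean optical pathlength in tissue (m)

print(t * 1e9, L * 100)  # a 1 ns delay corresponds to ~21 cm of scattered path
```

The long pathlength for such a short delay reflects how far multiply scattered photons actually travel inside the tissue between source and detector.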
David Delpy and co-workers at University College, London, have developed a clinical instrument that uses the frequency-domain approach to measure the degree of oxygenation of the blood in the brains of premature babies. The instrument comprises a single detector and tunable laser source to determine the mean time-of-flight of photons between source and detector, and therefore the average optical pathlength. Measurements of the amplitude of the detected signal can then quantify changes in absorption. If this measurement is repeated at a number of different wavelengths, one can obtain useful medical data.
By scanning the source and detector across a sample, one could, in principle, build up a 3-D image, but this would be prohibitively slow. A more sophisticated approach is to use an array of sources and detectors simultaneously to acquire as much information as possible about the scattered light (figure 6b).
At present the computation required to perform the iterative inverse-scattering calculations for 3-D imaging can take hours and the achievable image resolution is of the order of 0.5-1 cm for mammography, which is rather poor compared with other imaging methods such as MRI. However, the power of computer processors is increasing month by month and the resolution will improve with the sophistication of the data acquisition and the optical-image-reconstruction algorithms.
The current state-of-the-art in thick-tissue imaging is probably an instrument being developed by Jeremy Hebden and co-workers at University College, London, which boasts 32 time-gated detector channels with 50 ps resolution and 32 sources derived from a tunable ultrafast laser. This is able to rapidly characterize the scattered-light signal with unprecedented precision and provides some indication of what may be achieved in the next decade.
Although the spatial resolution is not likely to improve much beyond ~0.5 mm for breast or brain imaging, this does not matter too much – it is the accuracy and speed that are critical. The goal of thick-tissue optical imaging is usually to obtain spectroscopic information with a view to detecting the presence of a specific type of tissue, such as a tumour, rather than to provide a detailed map of tissue structure. The latter may be better left to MRI, with optical techniques providing complementary information, such as monitoring changes or detecting abnormalities in tissue properties. One can envisage using expensive MRI technology to provide an initial high-resolution map of tissue distribution and applying this information to aid the computational reconstruction of the optical image. A low-cost optical-imaging instrument could then be left in place on the patient to monitor changes in critical parameters, or to facilitate ongoing diagnosis and research.
Functional imaging may well provide the ability to detect tissue abnormalities that are smaller than the resolution limit and this will be an important tool for screening against disease or diagnosing complications following injury. Possibly the most exciting prospect for thick-tissue imaging, however, is the real-time imaging of brain activity.
As advances in information processing and optoelectronics make the technology cheaper and more powerful, progress in biomedicine will grow exponentially. The advances that have been seen in the 20th century may seem incremental and predictable in comparison with the advances that will be made in the next century.
Advances in microscopy will lead to breakthroughs in microbiology and genetic engineering, and new optical techniques will permit “optical biopsies” without the need for surgery. This in turn may increase the reliability of screening programmes and of treatments. The superior understanding of disease and its effects on tissue will allow new therapies and surgical procedures to be developed that can be tuned to the specific needs of the patient. Finally, thick-tissue imaging will lead to breathtaking insights into the working mechanisms of organs. In particular, imaging brain activity will be fascinating and it may even become possible to watch people thinking. Let us hope there is something good to think about.