Optics and photonics

23 Oct 2018
Taken from the October 2018 issue of Physics World, which celebrates 30 years of the world’s best physics magazine. Members of the Institute of Physics can enjoy the full issue via the Physics World app.

Since Physics World started, optics-based technologies have revolutionized the way we communicate. Jeff Hecht looks at how these technologies have spread since their early days in the lab

Optical communications and information processing were hot topics when Physics World made its debut 30 years ago. Fibre-optic transmission lines were becoming the backbone of terrestrial telecommunication networks. The first transatlantic fibre cable, TAT-8, which was turned on at the end of 1988, could carry 40,000 voice telephone conversations simultaneously. Crystal-clear digital submarine links were replacing noisy analogue satellite calls. Digital optical discs were consigning analogue phonograph records and cassette tapes to history. And optical data storage was a hot trend for personal computers – a breakthrough in information technology for those who remembered the punched cards and paper tapes of the digital mesolithic.

Physics World’s coverage in its first year reflected key issues for these emerging technologies. “As marvellous as the present systems are…they still use but a fraction of the many-terahertz bandwidth of optical fibres,” wrote Linn Mollenauer of Bell Labs in the US, in the September 1989 issue. He pointed to two key needs: optical gain to overcome the inevitable attenuation of even the clearest glass fibres, and a way to stop pulses from stretching as they passed through the fibre. As a solution, he sent a powerful pump beam through fibre along with a weak signal beam. By carefully manipulating the light and selecting the right fibre properties, Mollenauer was able to produce pulses called solitons, or solitary waves, that could be amplified by energy from the pump beam and did not stretch even after passing through 6000 km of fibre – enough to cross the Atlantic.

In the February 1989 issue, meanwhile, Elizabeth Giacobino of the Université Pierre et Marie Curie in France and colleagues warned that lab-based optical systems were approaching the point where their performance would be limited by the quantum effect of “shot noise”. According to Giacobino, the solution would be to squeeze light to produce a quantum state in which the uncertainty in the value of one property, such as momentum, could be reduced below the normal limit by increasing uncertainty in another property such as position.

Yet even as such research groups were reporting promising results, two new breakthroughs were emerging that would eventually reshape fibre-optic communications altogether. In late 1986 David Payne at the University of Southampton in the UK had shown that optical fibres doped with erbium could amplify weak signals by up to 26 decibels at wavelengths near 1550 nm – the part of the spectrum where glass fibres are most transparent. Exploiting the discovery initially looked like a long shot because it required an expensive laser to power the erbium amplifier. However, diode laser pumping was demonstrated in 1989, and the following year engineers at Japan’s KDD Laboratories used a single erbium amplifier to boost signals at four separate wavelengths without any interference. By the turn of the century, developers had perfected optics that could split the band near 1550 nm amplified by erbium into a hundred narrow slices. Known as wavelength-division multiplexing, this technique could send a hundred 10 Gbit/s signals at closely spaced wavelengths through a series of erbium-fibre amplifiers with little crosstalk.
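The figures above are easy to combine into a sense of scale. This short sketch simply multiplies out the numbers quoted in the text (the arithmetic and variable names are mine, not the article's):

```python
# Combining the figures quoted above for the erbium-amplifier era.

channels = 100                # narrow wavelength slices in the 1550 nm band
rate_per_channel_gbit = 10    # Gbit/s carried on each wavelength

aggregate_gbit = channels * rate_per_channel_gbit
print(f"Aggregate capacity per fibre: {aggregate_gbit} Gbit/s "
      f"({aggregate_gbit / 1000:.0f} Tbit/s)")

# Payne's 26 dB of erbium gain, expressed as a linear power ratio:
gain_linear = 10 ** (26 / 10)
print(f"26 dB gain = roughly x{gain_linear:.0f} in optical power")
```

A hundred wavelengths at 10 Gbit/s each gives a full terabit per second down a single strand of glass, with each amplifier boosting every channel by a factor of about 400 at once.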


Optics and the web

This dramatic advancement was perfectly in phase with the explosive growth of the World Wide Web. Hi-tech firms saw share prices peak in 2000 as the dot-com boom hit crazy heights. The likes of solitons and squeezed states got lost along the way, but other optical approaches to information technology got plenty of attention as developers worked on “all-optical networks” with optical logic and switching.

One of those was Andy Walker of Heriot-Watt University in the UK. “Following the successful replacement of electrical cables in telecommunications networks by fibre-optic links,” he wrote in the April 1989 issue of Physics World, “more and more people are wondering whether it may be possible to extend such optical techniques into the realm of information processing.” Optics offered the possibility of ultrafast serial processing of femtosecond pulses, or massive parallel processing by huge arrays of optical devices. Early research focused on nonlinear optics, but in late 1999 Lucent Technologies in the US made headlines with its Lambda Router – an array of 256 microscopic optical mirrors on an inch-square chip that could serve as an optical cross-connect, routing signals direct from fibre to fibre.

Both the Lambda Router and Lucent Technologies were, however, casualties of the collapse of the IT bubble in the early 2000s, which was to take a heavy toll on the optics industry. Fibre-optic communications suffered a decade-long hangover from the overbuilding of telecommunication network capacity during this time. However, network traffic continued its rapid growth, with increasing volumes of video and cloud computing, and parts of the network required more capacity. That led to a new generation of fibre-optic systems, which began emerging in around 2010.

Instead of detecting signal amplitudes, like earlier fibre-optic systems, the new set-ups detected the phase of a signal using coherent detection. Coherent transmission had been proposed in the 1980s, but optical technology at that time could not meet system requirements. By 2010 digital signal processing electronics had been developed that could detect and decode the phase of incoming optical signals. That allowed receivers to compensate for the dispersion of light signals that pass through long optical fibres, thereby increasing data rates for each single-wavelength signal from 10 Gbit/s to 100 Gbit/s using the same fibres installed in the 1990s.
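Part of the reason phase detection raises data rates is that a coherent receiver can distinguish several symbols by phase, so each symbol carries more than one bit. The sketch below uses QPSK, a representative four-phase format (the article does not name a specific modulation scheme), to show the idea: two bits per symbol, recovered by finding the nearest constellation point.

```python
import cmath

# QPSK constellation: four phases, each standing for a pair of bits.
# This is an illustrative format; real coherent systems layer on
# polarization multiplexing and denser constellations as well.
QPSK = {
    (0, 0): cmath.exp(1j * cmath.pi / 4),
    (0, 1): cmath.exp(3j * cmath.pi / 4),
    (1, 1): cmath.exp(5j * cmath.pi / 4),
    (1, 0): cmath.exp(7j * cmath.pi / 4),
}

def modulate(bits):
    """Map an even-length bit sequence to QPSK symbols (2 bits each)."""
    return [QPSK[pair] for pair in zip(bits[::2], bits[1::2])]

def demodulate(symbols):
    """Recover bits by picking the nearest constellation point."""
    bits = []
    for s in symbols:
        pair = min(QPSK, key=lambda p: abs(QPSK[p] - s))
        bits.extend(pair)
    return bits

data = [1, 0, 0, 1, 1, 1, 0, 0]
symbols = modulate(data)
assert demodulate(symbols) == data   # round trip recovers the bits
```

Eight bits travel as only four symbols, so for the same symbol rate the link carries twice the data; combining such tricks with faster electronics is what took single-wavelength rates from 10 to 100 Gbit/s.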

Further advances in coherent transmission mean we are now approaching the fundamental limits of how fast data can be sent with a single wavelength down a single fibre. The cutting edge has therefore moved to a new approach called “spatial-division multiplexing”, the goal of which is to develop new optical fibres that offer multiple routes for optical signals. One way of doing this is to create optical fibres containing many parallel light-guiding cores. Another is to couple light signals into separate modes so they can travel independently through a single fibre core. Laboratory tests have demonstrated both approaches, and even shown that each core in a multi-core cable can transmit signals on multiple modes. The question now is whether this can be done practically, or whether it would be cheaper and easier to lay more fibres rather than squeezing so many separate optical paths into each.


From CDs to lighting

The 1980s also saw the first mass-market consumer product based on laser technology – the audio compact disc (CD). In the January 1989 edition of Physics World, Anders Rehnberg from Philips and Du Pont Optical in the UK described what looked to be an important new advance in optical data storage. These were the first erasable optical discs, based on magneto-optic recording, in which lasers heated the disc surface to about 200 °C in tens of nanoseconds, allowing an external magnet to record data on the heated region. Plans were also under way for a new generation of optical storage that would use light from recently introduced red-diode lasers to store a full two-hour movie on a CD-sized disc. That technology would become the DVD, playing standard-definition video.

Meanwhile, a surprise was in the works. A friend collared me at the fall 1991 Materials Research Society meeting and said I had to see what a quiet Japanese gentleman had in his pocket. Isamu Akasaki, then at Nagoya University, pulled out what looked like a diode-laser pointer and showed me the first bright blue light-emitting diode (LED). That soon led to the blue-diode laser, which was eagerly sought after by the consumer electronics industry for use in high-definition video players. In the end, that gold rush fizzled. By the time high-definition TV reached the market and the industry had settled on the Blu-ray format, consumers were more interested in getting their video over the Internet than on discs. Ironically, it was the fibre-optic revolution that had made video downloads and streaming possible.

However, the blue LED found a much bigger application in solid-state lighting. When coupled with yellow phosphors, blue LEDs could convert electricity into visible light far more efficiently than an incandescent bulb. That came at an opportune time for energy efficiency, and Akasaki, his colleague Hiroshi Amano, and blue-diode-laser inventor Shuji Nakamura shared the 2014 Nobel Prize for Physics for their work.

Like any new technology, however, solid-state lighting has a few bugs to be resolved. LED street lights were rushed to market with little thought of how the light they emit might affect the night-time environment, so more work is needed on reducing sky brightness, improving colour and cutting light intrusion into homes. That’s going to take some time. But in the long run, LED lighting – like most new optical technologies – should be a big improvement on what went before.

All in all, it’s been a very successful 30 years for optics-based technologies. It’s a field that really demonstrates how research can massively benefit everyday life, and it will be interesting to see where the work takes us next.

Copyright © 2023 by IOP Publishing Ltd and individual contributors