
Flexible LEDs bring light to fingertips

Light-emitting diodes – or LEDs – have already found their way into traffic lights, TV screens and highly efficient light bulbs. And it may not be long before they are also deployed in medical technologies such as proximity-sensing gloves, surgical threads and intravenous flow-rate monitors, thanks to the efforts of an international team headed by John Rogers of the University of Illinois at Urbana-Champaign.

To target these applications, Rogers and colleagues have exploited a new printing technique for forming arrays of incredibly small red LEDs on flexible substrates.

Like conventional LED manufacture, device fabrication begins by depositing a stack of compound semiconductor layers onto a substrate. The lower and upper layers have an excess and a deficiency of electrons, respectively, and sandwiched between them is a layer just a few nanometres thick, known as a quantum well. When a voltage is applied across the entire structure, electrons and their positive counterparts, known as holes, are driven into the well where they combine to emit light.
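The colour of the emitted light is set by the energy each electron–hole pair gives up in the well: a photon of energy E has wavelength λ = hc/E. As a rough illustrative check – the ~1.9 eV transition energy used here is an assumed, typical figure for a red LED, not a number quoted in the article:

```python
# Photon wavelength from the energy released in the quantum well.
# The 1.9 eV value is an assumed, typical figure for a red LED,
# not a number taken from the article.
H = 6.626e-34   # Planck's constant (J s)
C = 2.998e8     # speed of light (m/s)
EV = 1.602e-19  # joules per electronvolt

def emission_wavelength_nm(energy_ev):
    """Wavelength in nanometres of a photon carrying the given energy."""
    return H * C / (energy_ev * EV) * 1e9

print(round(emission_wavelength_nm(1.9)))  # ~650 nm: red light
```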

LEDs are normally made by sawing wafers into many thousands of square chips. To prevent these chips from becoming too fragile, their edges are at least 300 µm long – far too large for forming flexible arrays on plastic sheets.

Small enough for flexibility

To produce LEDs with sides of just 50 µm, Rogers and his co-workers define the chip area with photolithography and etching. A printing technique then transfers arrays of these LEDs to alternative substrates, where they are given electrical contacts and wired in series.

“Printing is a key part of the process,” says Rogers. “We have developed that technique to a very high level of sophistication, and we now have yields of over 99% and [placement] accuracies of about one micron.”

While light-emitting organic materials could have substantially simplified the process of making LEDs on a flexible substrate, organics have other shortcomings. “Their brightness can’t compete with inorganic LEDs, and encapsulating them to prevent exposure to minute levels of moisture and oxygen is extremely difficult,” explains Rogers.

Stretching, flexing, twisting and bending

To test the robustness of its LED arrays, the team deformed them and monitored changes in performance. “We can accommodate nearly any type of deformation mode – even to extreme values of stretching, flexing, twisting and bending – up to 100,000 cycles or more.” Another strength of the printing process is that it can form LED arrays on a vast range of substrates, including plastic, rubber, aluminium foil, paper and even tree leaves.

To demonstrate potential uses for tiny LED arrays, the researchers attached them to a tube to provide a light source for a medical device that measures the concentration of glucose delivered intravenously.

Another application for these arrays is surgical stitching. It is not possible to print the devices onto a thread, but they can be attached by rolling this material over a glass carrier that is populated with red LEDs. Stitches incorporating these tiny light emitters have been inserted into an anaesthetized mouse. According to the team, adding LEDs to surgical thread delivers multiple benefits: accelerated healing; illumination of deep tissue; and the opportunity to monitor blood oxygenation.

Light at your fingertips

LED arrays can also be deployed in the fingertips of gloves to create proximity sensors for aiding robotic systems or surgical procedures. To do this, the researchers have integrated tiny LEDs with similar-sized photodetectors. This allows the distance to the nearby object to be determined through measurements of the intensity of backscattered light.
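One simple way such a sensor could turn intensity into distance is to invert an assumed falloff law against a calibration reading. This is a hypothetical sketch, not the team's actual algorithm – the article does not describe it, and a real device would use an empirically measured calibration curve:

```python
# Hypothetical sketch of intensity-based ranging; not the team's actual method.
# Assumes backscattered intensity falls off as 1/d**2 relative to a
# calibration point; a real sensor would be calibrated empirically.
def estimate_distance_mm(intensity, ref_intensity, ref_distance_mm):
    """Invert an assumed inverse-square falloff to estimate distance."""
    if intensity <= 0:
        raise ValueError("no backscattered signal detected")
    return ref_distance_mm * (ref_intensity / intensity) ** 0.5

# Calibration: a reading of 1.0 (arbitrary units) at 10 mm.
# A reading of 0.25 then implies the object is about twice as far away.
print(estimate_distance_mm(0.25, 1.0, 10.0))  # -> 20.0
```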

Rogers says that a start-up company, MC10, is targeting commercialization of some of the team’s technology.

“From the science and materials side, we are working to implement related ideas with blue and ultraviolet LEDs, to expand the functionality.”

The researchers report this work in the latest edition of Nature Materials.

Algae cause a stir in the local environment

The swishing actions of tiny swimming organisms play a key role in distributing heat and nutrients throughout the world’s oceans and lakes, but these mixing effects are more complicated than we first thought. That is according to two separate research groups, based in the US and the UK, that have examined the fluid disturbances that occur in the immediate vicinity of swimming algae.

Many microorganisms have evolved to be able to move through liquids for various biological processes, including foraging for food and reproduction, and this motion acts to stir the fluids. While scientists have examined these processes at relatively large scales, there is still a lack of quantitative data on the fluid dynamics behind these processes at the microscale.

Now, a group of physicists based at the University of Cambridge, UK, led by Knut Drescher, has succeeded in taking a closer look through an experiment involving two types of common algae. The first organism, Chlamydomonas reinhardtii, is a small alga that swims by paddling a pair of whip-like flagella. The second was Volvox carteri, a larger, spherical alga that propels itself with thousands of flagella covering its surface.

Flapping flagella

By suspending fluorescent polystyrene microspheres in the water surrounding the algae, Drescher’s team was able to trace the time-averaged water flows using a tracking microscope. The experiments revealed that Volvox carteri interact with water in a similar way to sedimenting particles acted on by gravity. “People assumed that the effects of gravity would be minimal,” says Ray Goldstein, a member of the Cambridge team. “The flow field arising from the gravitational force on the organism falls off very slowly with distance, so the mutual interaction of such organisms is much stronger than without gravity,” he explained.

The researchers found that Chlamydomonas reinhardtii, on the other hand, trigger more complicated flow fields in the vicinity, set up by the combined action of the cell body and its two flagella.

In a separate study, based at Haverford College in the US, a group led by Jeffrey Guasto created a different vantage point by confining Chlamydomonas reinhardtii in a thin film of water. The researchers focused on the two-dimensional motion of a single stroke of the algae’s flagella, using tracer particles and a high-speed camera – a set-up that enabled them to study the impact of individual flagella movements in more detail.

While its time-averaged results agree with those of the Cambridge team, Guasto’s group discovered that flow fields vary significantly over the course of one complete “breaststroke”. This finding suggests that the shape and scale of the fluid flows triggered by this alga may be yet more complicated. “Researchers in this field have been content with the widely accepted – but simplified – hydrodynamic models of single swimming cells that ignore near-field and time-dependent effects of swimming,” says Guasto.

Impact of the individual

The next stage in this research is to explore the dynamics across a range of scales to build a more complete picture of how individual swimmers can influence the impact of large groups of swimming organisms. “The locomotion of swimming organisms contributes to the distribution of nutrients, pollutants and heat. Only recently have researchers devoted considerable attention to these effects at the scale of unicellular microorganisms,” says Guasto.

One avenue the Cambridge team intends to pursue is to consider how these latest findings fit in with their previous research that explored the interactions between individual Volvox algae, as reported on physicsworld.com last year.

Howard Stone, a fluid mechanics researcher at Princeton University, is impressed by the work of both groups. “I am confident that [both sets of] measurements will become standard references in the field,” he says.

Both groups have recently published their findings in Physical Review Letters.

On the trail of thorium

By Matthew Chalmers

Flowers presentation
Credit: BARC

 

If you’re wondering who that is second from right, holding a bunch of flowers while desperately trying to smile naturally in front of a camera, right in the hub of India’s nuclear power programme, it’s me. I was in the subcontinent after being sent by Physics World magazine to write about India’s audacious “three-stage” nuclear programme, which seeks to exploit the country’s vast reserves of thorium as an alternative nuclear fuel to uranium. (You can read my final article “Enter the thorium tiger” in the October issue of the magazine, which can be downloaded free of charge via this link.)

The bouquet, along with a large leather wallet, was presented to me as a gift from directors of the Bhabha Atomic Research Centre (BARC) near Mumbai. My fellow flower holders – all from the British High Commission in Delhi – were there to build links between UK and Indian nuclear scientists, while I was present to unearth what I could about India’s nuclear plans. The flowers came from BARC’s extensive flowerbeds, which were laid at the request of the late physicist Homi Bhabha.

BARC
BARC, near Mumbai. Credit: BARC

There’s a certain romanticism to the way Bhabha, who established India’s nuclear programme 60 years ago, is revered among Indian nuclear physicists. He not only provided a vision of energy security that thrives 44 years after his untimely death in an air crash above the Alps, but used his connections to set in place an infrastructure that ensured his vision became reality.

Initially perplexed at why other countries weren’t exploiting thorium – a fuel that has many benefits over uranium – I asked one senior BARC physicist why the UK doesn’t have a nuclear roadmap like India’s. “Ah!” he said, waving a finger at me, “it’s because you don’t have a Bhabha!”

Indian nuclear physicists take great pride in having developed most of their technology indigenously – a consequence of India being a nuclear-armed nation outside the non-proliferation treaty (NPT). But writing my article for Physics World was not without its challenges.

Professional hierarchy is more apparent than in, say, a UK physics laboratory, and at times the atmosphere while I was at BARC was hugely formal, particularly when the new lab director was present. Plans to meet a few students and postdocs working at BARC were soon dashed, and recording equipment in India’s heavily guarded government labs is none too popular either.

Access to India’s nuclear programme would have been difficult were it not for the diplomatic context of my visit – and even then there were issues when it came to dealing with India’s top nuclear officials.

Changing geopolitical relations, particularly since 2008, when the US and India signed an agreement that led to India being brought into the nuclear fold, have led several countries to line up to co-operate with India on civil nuclear trade and technology. The UK is one of them.

During my trip the new UK prime minister was also visiting India, along with a trade delegation. Shortly afterwards, a bunch of joint research grants between physicists in the UK and India were funded – selected from a dozen fully costed proposals drawn up in just two days in the basement of a central London hotel back in March amid a flurry of sticky notes and chirpy facilitators from the Engineering and Physical Sciences Research Council (EPSRC). It was an impressive feat to witness, although not without a few bemused faces. Most of the nine Indian and 20 UK delegates had never met nor had much idea about each other’s research interests.

Mumbai
The Mumbai streets. Credit: M Chalmers

One thing that most surprised me in India is how few people on the street, so to speak, seem to know anything about India’s nuclear programme. Those who did know about thorium (whom I found while sipping cold beer in Chennai’s Madras Club, having visited India’s other big nuclear lab – the Indira Gandhi Centre for Atomic Research (IGCAR), on the opposite side of the country from Mumbai) all thought the programme was nowhere near on track, which is not what the physicists involved will tell you. Most people I got chatting to also assumed that I was interested in their views on weapons rather than on civil nuclear power, with one or two asserting India’s right to develop them.

There is a degree of sensitivity to civil-nuclear collaboration between India and countries that are signatories of the NPT – which includes pretty much every other country. When I had lunch at BARC with the lab’s new director, he made no mention of India’s weapons research as he listed the many basic-science and other non-nuclear research activities taking place there.

Prototype Fast Breeder Reactor. Credit: IGCAR

Yet, gazing out of the window as we enjoyed a local interpretation of fish and chips, I could see – against a background of jungle and well-tended gardens leading out to the Arabian Sea – two large ageing nuclear reactors, one of which is to be shut at the end of this year as part of India’s commitment to separate its strategic and civilian nuclear programmes (a requirement of the US–India deal). I couldn’t help thinking how apt was the phrase “the elephant in the room”, as one UK nuclear physicist described the military dimension of nuclear technology to me.

But the thing that struck me overall while touring BARC and IGCAR was the sheer amount of effort involved to harness a new nuclear fuel cycle – an effort most deem too great at this time given the availability of and experience with uranium. I left IGCAR after being hurried past a blur of laboratories each piecing together a tiny aspect of Bhabha’s plan, from advanced welding joints to material irradiation tests.

In the back seat of the car bound for Chennai airport, I tore open some gift wrap to find that I was the proud new owner of a blue velvet box containing an ornament in the form of a large gold-coloured metal leaf. Lovely.

To read more, check out “Enter the thorium tiger” in the October issue of Physics World magazine, which can be downloaded free of charge via this link.

An interview with Anton Zeilinger

By James Dacey


It’s a rare thing to meet a scientist who appears to have a truly open mind about the future of their field. But when I recently caught up with Anton Zeilinger, the Vienna-based quantum information scientist, his enthusiasm for an overthrow of accepted quantum theory left me in no doubt that Zeilinger is a researcher relishing the future.

Zeilinger firmly believes that human emotions play a key role in the progress of science and researchers are often unwilling to accept brave new ideas, perhaps due to a fear of the unknown. “We can see that too often scientists are conservative and sometimes even emotionally against what they perceive as speculation,” he told me.

The physicist also believes that children should be exposed to the concepts of quantum mechanics from an early age, perhaps by incorporating the laws of quantum mechanics into computer games. “It could be a game that works according to the rules of quantum mechanics, not according to the rules of classical mechanics. And we could see if the children are able to play with it, not knowing what is behind it,” he explained.

You can now read my full interview with Anton Zeilinger, which has just been published here on physicsworld.com.

Organic solar cells receive a boost

Physicists in the US have shown that organic semiconductors may be just as promising as their inorganic counterparts in solar cells. The researchers’ discovery – that bound electron–hole pairs can travel a thousand times farther in organic semiconductors than previously observed – suggests that organic solar cells could one day be made efficient, cheap and in high volume.

Today, commercial solar cells are made of inorganic semiconductors such as silicon. When a photon in the visible or near-infrared part of the spectrum strikes the surface of the cell, it generates an electron–hole pair, which quickly dissociates. It is the separation of these electrons and holes in the semiconductor that creates a voltage, allowing a current to flow.

But inorganic solar cells are expensive to produce, and for this reason researchers have long considered organic alternatives. Organic semiconductors can often be made inexpensively in solution, and indeed are already used commercially to make light-emitting diodes (LEDs) and flat-screen displays. They also tend to have a lower impact on the environment than their inorganic counterparts.

Holes in the plan

The problem with organic semiconductors, however, is the separation of electrons and holes. When an electron and a hole are created, they are bound strongly into a quasiparticle known as an exciton, and the only way they can separate is to reach an artificial “heterojunction” and split on either side of it. Unfortunately, in the organic semiconductors tested in the past, excitons have only been able to travel a few nanometres, which means the cells must be made very thin and, ultimately, inefficient.

Vitaly Podzorov and colleagues at Rutgers University, New Jersey, have now discovered an organic material in which excitons can travel roughly a thousand times farther, of the order of several micrometres. “The further excitons can migrate in a material, the better our chances of collecting many of them at heterojunctions, where they contribute to the generation of electricity by splitting into free electrons and holes,” says Podzorov.

The material in question is rubrene, a highly ordered organic semiconductor that is already used in organic LEDs. Excitons are electrically neutral and are therefore difficult to measure in the bulk, so the Rutgers researchers developed their own method, in which they measured the current of dissociated excitons leaving the material’s underside. By probing the material with a light source and varying the wavelength and polarization – which alters the penetration depth of the light – the team was able to build a clear picture of the excitons moving through it.

Scaling up

Bernard Kippelen, an electrical engineer at the Georgia Institute of Technology in the US, points out a couple of drawbacks with rubrene as an organic semiconductor. He believes it does not absorb as much of the light spectrum as many inorganic semiconductors, and researchers do not yet have a means of fabricating it on large scales. Nonetheless, he says it does highlight the potential for organic solar cells.

“For a long time, researchers in the community thought excitons could not really diffuse more than 10 nanometres, so that the only way forward was to make heterojunctions in the bulk,” he explains. “This work shows that the exciton-diffusion bottleneck, as people refer to the problem, is not unsolvable.”

The Rutgers team believes the reason for rubrene’s superior performance lies in both its highly ordered structure and in the nature of the excitons generated. Rather than staying as single excitons, Podzorov and colleagues think the electron–hole pairs “split” into lower-energy triplet excitons, which have longer lifetimes and thus can travel farther.

According to David Beljonne, an expert in organic semiconductors at the University of Mons in Belgium, this also has its drawbacks. Since triplet excitons have a lower energy, they create a smaller voltage in a solar cell and therefore generate less power. “It clearly emphasizes that in the business of photovoltaics, it is difficult to optimize simultaneously all intervening electronic processes,” Beljonne says. “In other words, one might win here and lose there. But it does open up new avenues as triplets had previously been thought of as sinks for charge generation.”

This research is described in a paper in Nature Materials.

Phonons tunnel across the vacuum

Heat can be conducted across a nanometre-sized vacuum gap – something that was deemed impossible until now. So say researchers at the Air Force Research Laboratory in Ohio who have found that the heat is transferred via an effect called “phonon tunnelling” in which quantized molecular vibrations, called phonons, appear to traverse the forbidden zone. The finding could be important for improving thermoelectric devices and for future nanoscale electronic circuits.

Heat flow between two objects via conduction can normally only occur when the objects are in contact with each other. This process occurs when phonons – quanta of vibrational energy – are transferred from the hotter object to the cooler one. Until now, such transfer was thought to be impossible between non-touching objects in a vacuum because the vacuum is a forbidden zone for phonons, explains team member Igor Altfeder.

The US team has now turned this idea on its head by directly measuring the heat flow between the nanosized platinum–iridium tip of a scanning tunnelling microscope and a cold gold surface. The two objects were separated in a vacuum by a gap just 0.3 nm wide. The tip was held at room temperature while the gold surface was cooled to 90, 150 or 210 K.

Tunnelling, not radiating

The researchers found that the thermal energy transmitted through the tiny gap exceeds the value predicted by Planck’s theory of blackbody radiation by a factor of c²/v² ≈ 10¹⁰ (where c is the speed of light and v the speed of sound). According to their measurements, this means that the last atom at the nanosized tip dissipates heat an astonishing 10¹⁰ times faster than normal by generating phonons inside the gold. And, contrary to earlier hypotheses, the heat transfer is not due to the tip emitting radiation into the vacuum.
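With rough numbers the quoted factor checks out; the speed of sound used here is an assumed, typical value for gold, since the article quotes only the final ratio:

```python
# Back-of-the-envelope check of the quoted enhancement factor c**2 / v**2.
# The speed-of-sound figure is an assumed, typical value for gold;
# the article itself quotes only the final ratio.
C = 3.0e8   # speed of light in vacuum (m/s)
V = 3.2e3   # speed of sound in gold, approx. (m/s)

enhancement = (C / V) ** 2
print(f"{enhancement:.1e}")  # prints 8.8e+09 -- of order 1e10, as quoted
```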

Altfeder and colleagues obtained their results by applying a range of voltages between the “hot” microscope tip and the cold surface. The researchers then recorded the current produced by electrons travelling across the gap. Because these electrons are directly affected by thermal vibrations in both materials, a measure of this current can be used to calculate the temperature of the tip apex. “The fact that this temperature nearly coincides with the temperature of the sample tells us that the energy escapes from the tip apex at an enormous rate,” Altfeder told physicsworld.com.

According to the AFRL team, the phonon tunnelling is driven by electric fields between the two objects. These electric fields, which exist because the work functions (the minimum energy needed to remove an electron) of the tip and sample materials are different, cause the microscope tip and its “image charge” inside the sample to vibrate in unison. In other words, the electric fields at the tip apex cause electrons in the top layer of the gold surface to vibrate at the same rate.

The result could be important for improving thermoelectric devices, interfacial transport at the micro scale and designing future molecular circuits, says Altfeder.

The work, which was reported in Physical Review Letters, backs up recent theoretical studies by Mika Prunnila and Johanna Meltaus of the VTT Technical Research Centre of Finland that predicted phonon tunnelling between piezoelectric materials. In an American Physical Society press release, Prunnila was quoted as saying that the new research might find applications in nanoelectronics, and in devices that harvest energy from temperature gradients.

Geothermal project targets radioactive granite

To most people “geothermal electricity” conjures up images of a steam-enshrouded facility in a place like Iceland where heat from the depths of the Earth comes to the surface.

But some geologists – including UK-based Ryan Law – are championing a technology that could generate large amounts of geothermal electricity in places that are thousands of kilometres away from such geological features. The scheme for generating electricity from the heat of radioactive rocks deep underground goes by several names, including “deep geothermal”, “hot dry rocks” and “engineered geothermal systems” (EGS).

Law is managing director of UK-based Geothermal Engineering, which next year plans to drill about 5 km down near the town of Redruth in Cornwall. The company intends to pump water to the rocks, where it will heat up before returning to the surface at around 200 °C, and under pressure. The hot water will be run through a heat exchanger to drive a steam turbine, providing 10 MW of electricity to the grid. The water will then be returned underground for reheating in a closed loop.

According to Geothermal Engineering, that’s enough electricity to supply about 20,000 homes, and the project will also provide heat to the local community. It will deliver continuous carbon-free energy, unlike other renewable sources such as wind and solar, which are intermittent. And unlike conventional geothermal technologies, EGS can be deployed “almost anywhere”, says Law, adding: “You can access a much greater resource.” For Law, the resource is hot granite that sits underneath Cornwall and neighbouring Devon – two counties in the south-west corner of England.
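A quick consistency check on those figures (the framing in terms of average continuous demand is mine, not the company's):

```python
# 10 MW shared across the ~20,000 homes quoted by Geothermal Engineering.
PLANT_OUTPUT_W = 10e6   # 10 MW of electrical output
HOMES = 20_000

per_home_w = PLANT_OUTPUT_W / HOMES
print(per_home_w)  # 500.0 -> 500 W of continuous supply per home
```

Around 500 W of round-the-clock supply per household is broadly in line with average UK domestic electricity demand, so the quoted figure looks reasonable.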

Hot granite

The Redruth granite is hot because it is full of radioactive uranium, potassium and thorium. “Cornish granites are known for having a high concentration of these elements, therefore they have a measurably higher heat flow compared to other granites,” says Law. Drawing on the UK government’s 1976–1990 Hot Dry Rocks research project in nearby Rosemanowes, Law says that the heat flux in the rock is 135 milliwatts per square metre – high for granite – and the rock is at a temperature of about 200 °C.

In August the firm won planning permission for the project, which it hopes to operate commercially by 2014. It also has plans for another 25 plants across Devon and Cornwall. Another UK company, EGS Energy, is seeking planning permission to build a 4 MW facility at Cornwall’s Eden Project.

Success in Cornwall could boost similar projects elsewhere in the world where granite or other heat sources – different types of rocks or pools of water – exist. A small 3 MW plant already operates in Landau, Germany. The EU is testing another in Soultz, France. An Australian company, Geodynamics, plans up to 90 EGS sites in Australia.

Lack of investment

EGS is not new but its development has suffered from a lack of private investment. “I’ve always been enthusiastic about it as an energy resource,” says Allan Hoffman, a senior analyst with the US Department of Energy’s Office of Energy Efficiency and Renewable Energy. Hoffman once halted a hot dry rocks project at Los Alamos National Laboratory in New Mexico in the 1990s. “Not because it didn’t work,” he says. “But I wanted the private sector to come up with some of the money.” It didn’t.

With world interest growing in renewable energy, funding might pick up. A 2006 report by the Massachusetts Institute of Technology noted that the US has enough geothermal heat to theoretically meet 2000 times its primary energy needs, and that EGS could realistically supply 10% of its electricity by 2050.

“The good news is you don’t need new technologies,” says Jefferson Tester, the report’s lead author and now the associate director for energy at Cornell University’s Center for a Sustainable Future. “But it isn’t cheap to drill.”

Law hopes to raise £12.5 million to start his first phase, and will need another £30 million for the second. Like any such project, he can’t be sure that he’ll find his hot granite until he gets to it. Geothermal firm AltaRock Energy abandoned a job in California last year after finding what it called “geological anomalies.”

Thousands of tiny tremors

Law is confident he’ll avoid a similar fate because he’s going through loose rock. That will ease drilling and make it easier to pump water through, as will the high heat. That, in turn, reduces the threat of “earthquakes”. A project in Switzerland closed in 2006 after a small tremor.

Law acknowledges that drilling can trigger thousands of “micro seismic events,” but says that’s little different from oil and gas drilling. And he notes that they help decipher underground structures. “Most of these events are smaller than a horse walking next to you. You never feel them. But there are thousands of them, picked up by seismometers. We use the data to map a 3D picture.”

Tester concurs. “They give you just the information you need,” he says. “We like to call them acoustic emissions.”

APS responds to climate-change accusations

The American Physical Society (APS) has issued a strongly worded statement in response to a published resignation letter from a prominent member of the society. The letter, written by Harold Lewis, emeritus professor of physics at the University of California, Santa Barbara, accused the society of benefiting financially from climate-change funding. Addressed to the APS president, Curtis Callan, the letter calls global warming a “scam” and says that “the (literally) trillions of dollars driving it…has carried APS before it like a rogue wave”.

Lewis, 87, who has been an APS member for 67 years, has had a distinguished career that includes serving on the US defence science board, the advisory committee on reactor safeguards and the nuclear safety oversight committee. Lewis writes that climate change is “the greatest and most successful pseudoscientific fraud I have seen in my long life as a physicist,” and that the APS has “accepted the corruption as the norm, and gone along with it.” He adds that Princeton University physics department, of which Callan is chair, “would lose millions a year if the global warming bubble burst.”

Callan strongly denies that charge. “Do any members of the Princeton physics department perform research on subjects even remotely related to climate science? No,” Callan told physicsworld.com. “Would a hypothetical physicist engaged in such work be likely to shade the results of his or her work to hew to some ‘party line’ demanded by a funding agency? That would be contrary to the ethical code subscribed to by all scientists I know.”

Lewis is also one of the 160 physicists who last year failed to persuade the society to modify its “appallingly tendentious” formal statement on climate change, which it had released in November 2007, to reflect their own doubts about the human contribution to global warming. “Everything that has been done in the last year has been designed to silence debate,” Lewis writes in his letter to Callan. “APS management has gamed the problem from the beginning, to suppress serious conversation about the merits of the climate change claims.” In resigning, Lewis says the APS “no longer represents me”.

Strong response

In response to Lewis’s letter, the APS took the unusual step of issuing a public statement on Tuesday. The society says there is “no truth to Dr Lewis’s assertion that APS policy statements are driven by financial gain,” adding that the “specific charge that APS as an organization is benefiting financially from climate-change funding is equally false”.

“The APS adheres to rigorous ethical standards in developing its statements,” the statement says. “Neither the operating officers nor the elected leaders of the society have a monetary stake in [climate-change] funding.” The statement adds that, because relatively few APS members conduct climate-change research, the vast majority of the society’s members “derive no personal benefit from such research support”. APS press secretary Tawanda Johnson told physicsworld.com that the society released the statement to defend its reputation in the face of the accusations.

Gavin Schmidt, a climate physicist at NASA’s Goddard Institute for Space Studies, rejects Lewis’s claim that climate-change research is driven by financial gain. “People don’t get paid to get results,” he says. “Funding pays for postdocs, graduate students and equipment.” Schmidt adds that the issue raised by Lewis is “a manufactured story” intended to make people believe there is discontent in the profession.

Lewis does, however, have some support among physicists. “[Lewis] is on target with the big picture,” says Princeton physicist Will Happer, a leader of last year’s effort to change the APS statement on climate change.

The APS says in response to the “widespread interest expressed by its members” that it will now organize a “topical group” to encourage exchange of information on the physics of climate.

Anton Zeilinger: a quantum pioneer

Within the quantum community, you’re viewed as someone who always looks to push the limits and to test things that other people wouldn’t. Would you agree with that?

Even in the most basic sciences you cannot work without making bold attempts and taking risks. You have to be open, you have to be challenging – this is the interesting stuff. I wouldn’t like to look at science as just one more step here or one more step there.

But is it always possible for scientists to take this view when they have other things to worry about, such as getting their next research grant?

Well, some of that is true. But much of it is self-imposed. We can see that too often scientists are conservative and sometimes even emotionally against what they perceive as speculation. The really new in science cannot be a logical consequence of what we know already… that would just be the next logical step. I don’t want to belittle the more down-to-earth approach in the sciences, but I find it more interesting to ask bold new questions.

Do you take inspiration from certain scientists?


Einstein is very inspiring – especially in the way he was stubborn. He had his opinion about quantum mechanics, which – in its consequences – turned out to be wrong. There’s no question about that. But he stood up for it because he believed in it. This kind of stubbornness is important in science because sometimes it is the others who are wrong. And sometimes even wrong positions can lead to something important. Einstein’s criticism of quantum physics inspired foundational research, which opened up the road toward quantum information and quantum computation.

What do you predict for the future of quantum mechanics as a theory?

We will find that quantum mechanics underpins much more than we realize today. To take the next steps will be a big challenge: it’s very difficult, and it cannot be found by looking at the theory. The theory may be so strong, be absolutely correct, be fantastically beautiful [pauses] but there must be something beyond – the question is where?

So how do you approach that question? Is it purely through thought experiments or is there a more practical way?

Well, one thing is to make sure that all the puzzles and paradoxes that people predict are really there in the laboratory and that people stop developing concepts that might allow them to avoid these things.

The theory will not break down in directions where people want quantum mechanics to break down. For instance, it is not in the direction of large macroscopic bodies. But it could be in the direction of quantum gravity. People have tried to quantize gravity now for 80 years, since the 1930s. Some of the brightest minds of our civilization have tried it unsuccessfully. That shows me that maybe we are somehow fundamentally not asking the right questions.

How about quantum computing – when do you think we’ll see machines that can perform useful calculations?

Well, of course, it is very difficult to predict. But it may well provide a path going beyond the limitations of Moore’s law [which describes how the number of transistors that can be placed on an integrated circuit has doubled approximately every two years]. For quantum computing, it will depend on the number of qubits we can handle in experiments. Most people say we need about 40–50 to have an interesting quantum computer. Now, we are at the level of about 10, with ion quantum computing, so it should take 15–20 years. It’s not so bad; we’re not so far away.

Can you tell me about your own research in quantum computing?

We are working on optical quantum computing, which works with photons only. The problem with photons is that they are so fast. They only appear in the apparatus for a very short time – a few nanoseconds – and so you have to be extremely fast in handling them. And at present we don’t have the detectors with a high enough efficiency to handle the photons. But in principle, optical quantum computing can be universal.

You’ve said before that it excites you to think of a mobile phone with an in-built quantum computer. What other potential applications do you want to see?


I said this partly tongue-in-cheek, in the hope of encouraging young people to have courageous minds. You have to set goals that are wildly ambitious. The point is that completely new technologies will emerge that we never predicted. Look at the laser. When it was invented [50 years ago] nobody predicted the two most common applications we see today – the CD player and the supermarket scanner. Nobody predicted this – that’s the way it always works.

But you enjoy new gadgets?

In my lifetime I have seen things that are mind-boggling. I still remember the day we saw our first scientific calculator at our institute – it must have been 25–30 years ago, or something like that. A group of us sat there all afternoon punching in calculations, and we were completely excited by this.

I love to have the most recent gimmicks. I have already put in an order for the iPhone 4 [Zeilinger also owns an iPad]. I just like to play with these things, even when I know I will use just a small subsection of all the possibilities.

You’ve also said that you would like children to be exposed to quantum mechanics from a young age. How?

I’ve kept saying this for many years and I think I should just do it. I want to meet people who are good at this kind of thing who would help me to expose children very early to quantum physics. You clearly cannot tell them about quantum states and Hilbert space, but a possible way to do it would be to have quantum phenomena simulated on a computer. It could be a game that works according to the rules of quantum mechanics, not according to the rules of classical mechanics. And we could see if the children are able to play with it, not knowing what is behind it.

Or I was thinking of simply showing them phenomena on the computer, like a mock-up of very simple experiments. Maybe if the children play with this they can develop a different kind of intuition.

Further reading

“A quantum renaissance” by Markus Aspelmeyer and Anton Zeilinger

“Probing the limits of the quantum world” by Markus Arndt, Anton Zeilinger and Klaus Hornberger

Hybrid qubits closer to reality

Three independent groups of physicists have made important progress towards exchanging quantum information between microwave circuits and ensembles of spins. The breakthroughs could lead to hybrid quantum bits that could make it much easier to develop practical quantum computers. Such devices, which are based on quantum-mechanical concepts such as entanglement, could, in principle, outperform conventional computers on certain tasks.

The ideal quantum bit, or qubit, would possess two important properties. One is that its value can be easily read and written; the other is that its quantum nature endures long enough for a calculation to be performed. The most robust qubits tend to be those that are well isolated from their surroundings, because this prevents their quantum nature from deteriorating via contact with the noisy outside world. But if a qubit is too isolated, it becomes very difficult to read from and write to, rendering it useless in a practical quantum computer.

Best of both worlds

One way around this dilemma is to build qubits out of two physical entities – one that is robust and another that is easy to interact with. The challenge in creating such hybrid qubits is knowing how to transfer quantum information between the two components. That problem is the subject of new research, which looks at how to couple the spins of either electrons or nuclei (which can retain quantum information for relatively long periods) with superconducting qubits (which are easy to control using electrical and microwave pulses).

One team led by Robert Schoelkopf at Yale University in the US has coupled a superconducting microwave cavity to electron spins that are isolated within two crystals – ruby and diamond – with small concentrations of magnetic impurities. A second team – led by Daniel Esteve at France’s Atomic Energy Commission (CEA) in Saclay, near Paris – did the same using a diamond with a different type of impurity. Both studies involved coupling to a relatively large number of electron spins (about 10¹² at Yale) to boost the interaction strength between spins and microwaves.

Flipping back and forth

The teams achieved the coupling by placing the crystals next to tiny superconducting cavities that resonate at a specific microwave frequency. In each case, a magnetic field was applied to the sample, which aligns the spins and sets the energy required to flip the direction of a spin. When this energy matches the energy of the microwaves in the cavity, the spins can flip back and forth in what is called a Rabi oscillation.

This process involves the exchange of photons between spins and cavity, which could provide a mechanism for exchanging quantum information between an ensemble of spins and a superconducting qubit attached to a microwave line. According to Yale’s David Schuster, a number of qubits could be stored in such an ensemble of spins by encoding information in quantum states with different phases.
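The resonant exchange described above can be sketched in a simple toy model. This is my own illustrative Tavis–Cummings-style sketch, not taken from the experiments: the single-spin coupling `g` is a made-up number, and only the ensemble size reflects the order of magnitude quoted for the Yale work. On resonance, a single microwave photon swings between the cavity and the ensemble at a rate enhanced by the square root of the number of spins – which is why the groups used such large ensembles.

```python
import numpy as np

# Toy model: N spins tuned to the cavity frequency exchange a single
# microwave photon at a collectively enhanced rate g_N = g * sqrt(N).
g = 2 * np.pi * 12.0        # hypothetical single-spin coupling (rad/s)
N = 1e12                    # order-of-magnitude ensemble size quoted for Yale
g_N = g * np.sqrt(N)        # collective coupling: a millionfold enhancement

t = np.linspace(0.0, np.pi / g_N, 201)   # half a Rabi period
p_spins = np.sin(g_N * t) ** 2           # probability the photon is in the spins

# The excitation swings fully from cavity to ensemble and back – a Rabi oscillation
print(f"enhancement factor sqrt(N) = {np.sqrt(N):.0f}")
print(f"peak transfer probability  = {p_spins.max():.3f}")
```

The sqrt(N) enhancement is the key design choice: a single spin couples to the cavity far too weakly for the photon to be exchanged before it is lost, but a large ensemble brings the collective rate into a usable regime.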

Nuclear spins are even better

Meanwhile, at the University of Oxford in the UK, Andrew Briggs and an international team of colleagues (including Yale’s Schuster and Schoelkopf) have shown that a number of different microwave modes can be stored simultaneously in a spin ensemble. This team has also come up with a way of transferring quantum information from an ensemble of electron spins to an ensemble of nuclear spins. The big advantage of nuclear spins is that they are even more isolated than electron spins and so could store quantum information for several seconds before being transferred to superconducting qubits.

The Oxford group looked at the electron spins of nitrogen atoms in a fullerene cage of carbon atoms and also in silicon crystals doped with phosphorus atoms. The team first encoded up to 100 different multiple-photon excitations in their electron spin ensemble by applying short magnetic pulses to the spins. They were also able to retrieve the excitations. Although the Oxford group did not store or retrieve quantum information from single microwave photons – which would be needed in a quantum computer – the work suggests that this should indeed be possible.

Hyperfine interactions

Briggs and colleagues were able to transfer the encoded excitations from electron spins to the material’s nuclear spins and then back again. This is done by exploiting the “hyperfine” interaction between the nuclear spins and the spin of the electrons.

While the three groups have made important progress towards creating hybrid qubits, physicists must still work out how to store information from single photons in a spin ensemble, and how to connect the microwave-spin systems to a superconducting qubit. “Getting all of these things to happen at the same time is one of the goals of this research area moving forward,” explains Schuster.

Copyright © 2026 by IOP Publishing Ltd and individual contributors