Do you think that quantum computing is theoretically possible?

By Hamish Johnston

As I mentioned yesterday, I’m on my way to Vancouver for the annual meeting of the American Association for the Advancement of Science (AAAS), where the future of quantum computers is on the agenda.

I’m looking forward to catching up with Scott Aaronson of the Massachusetts Institute of Technology, who has been in the news lately because of his $100,000 challenge. The theoretical computer scientist is offering this princely sum to anyone who can convince him that scalable quantum computers are impossible. This might seem like easy money – after all, physicists have struggled for years to build even the most primitive quantum processors, and scaling these up to make a working quantum computer seems a tall order.

But Aaronson isn’t talking about hardware; instead, he wants you to disprove the underlying quantum physics that would make a quantum computer tick. “This is a bet on the validity of quantum mechanics as it’s currently understood,” he told me recently. Can he raise the money? Yes, and he even thinks it would be well spent, because disproving some or all of quantum mechanics would lead to a revolution in physics. Has he received any serious entries so far? No, but there is no time limit on the challenge, so get your ideas to Aaronson.

In this week’s Facebook poll we are asking whether you think Aaronson will hang on to his hard-earned cash.

Do you think that quantum computing is theoretically possible?

Yes, for sure
No way
I’m caught in a superposition of yes and no

Last week we asked who is the most inspiring of the current communicators out of a list of six famous physicists.

The winner with 34% of the vote is Brian Greene of Columbia University. Runner-up is Oldham’s own Brian Cox, with 18%. And you don’t have to be called Brian to be on the podium, because third place goes to Michio Kaku of the City University of New York with slightly less than 18%.

It’s interesting to note that Greene, Cox and Kaku have all had their own TV shows recently, so that could explain their popularity.

Rounding off the results, in fourth, fifth and sixth places, respectively, are Stephen Hawking, Neil deGrasse Tyson and Lisa Randall.

Other suggestions from readers included Neil Turok, who is director of the Perimeter Institute in Canada, and Jim Al-Khalili of the University of Surrey – who famously declared that he would eat his underpants if neutrinos can travel faster than the speed of light. Other suggestions were Lee Smolin and Lawrence Krauss. If you would like to hear Krauss in action, he will be giving a live lecture on physicsworld.com on 6 March.

Measure for measure

Regular readers of Physics World will be familiar with Robert Crease through his columns and occasional features for the magazine, which frequently touch on the subject of measurement. Those of us who find Crease’s writings entertaining and informative will have the same reaction to his new book World in the Balance: the Historic Quest for a Universal System of Measurement.

The publication of Crease’s book is timely because the “historic quest” of the title is now within striking distance of being achieved. In October 2011, during the 24th General Conference on Weights and Measures (CGPM), the member states of the Metre Convention unanimously adopted a resolution entitled “On the possible future revision of the International System of units, the SI”. Although the diplomatic and cautious language of the resolution somewhat obscures its importance, its contents represent a sea change in the science of measurement. At its heart is a proposal to redefine five of the seven SI base units in terms of fundamental constants or invariants of nature.

Most notably, the kilogram will be defined by fixing the numerical value of the Planck constant, thus consigning to history the last remaining physical artefact defining a base unit: the international prototype of the kilogram currently kept at the Bureau International des Poids et Mesures (BIPM) in Sèvres, France. The ampere, kelvin and mole will be redefined in terms of the elementary charge, the Boltzmann constant and the Avogadro constant, respectively. As for the candela, it will be defined by fixing the numerical value for the luminous efficacy of monochromatic radiation of a specified frequency. Taken together with the existing definitions of the second and the metre, which are pinned to a certain hyperfine transition of caesium and the speed of light, the result will be an absolute and universal system of measurement.
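The logic of defining units by fixing constants can be sketched in a few lines of Python. Note an assumption: the numerical values below are the exact ones ultimately fixed when the revision was adopted (in 2019, after this review appeared); the 2011 resolution agreed only the principle. The illustrative calculation shows how fixing the Planck constant ties the kilogram to a frequency via E = mc² = hf.

```python
# The seven defining constants of the revised SI (exact values as
# eventually fixed; the 2011 CGPM resolution did not set final digits).
DEFINING_CONSTANTS = {
    "delta_nu_Cs": 9_192_631_770,      # Hz, caesium hyperfine transition
    "c":           299_792_458,        # m/s, speed of light in vacuum
    "h":           6.626_070_15e-34,   # J s, Planck constant
    "e":           1.602_176_634e-19,  # C, elementary charge
    "k_B":         1.380_649e-23,      # J/K, Boltzmann constant
    "N_A":         6.022_140_76e23,    # 1/mol, Avogadro constant
    "K_cd":        683,                # lm/W, luminous efficacy at 540 THz
}

def kilogram_equivalent_frequency():
    """Frequency of a photon whose energy equals E = (1 kg)*c^2,
    illustrating how fixing h ties the kilogram to the second."""
    c = DEFINING_CONSTANTS["c"]
    h = DEFINING_CONSTANTS["h"]
    return c**2 / h   # f = m*c^2/h with m = 1 kg

print(f"1 kg corresponds to f = {kilogram_equivalent_frequency():.3e} Hz")
```

In practice the link is made through a Kibble balance rather than a single photon, but the principle is the same: once h is fixed, mass becomes traceable to time and frequency measurements.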

Of course, a proposal such as this does not come out of the blue. Rather, it is the result of many discussions, arguments and scientific papers, to say nothing of advances in science. But as Crease explains in his book, it is also the culmination of society’s quest for a solid basis of measurement, something we have striven for since our civilization’s very beginning. In the hands of a lesser writer, the story of this quest could easily have been a dull tale of a succession of bars and weights, each bearing the names of kings or emperors. Crease, however, offers a broad view on the importance of measurement, while also taking the reader on many fascinating excursions.

One such excursion concerns the history of measurement outside the Western world. In China, for example, he meets Guangming Qiu, the last of a team of historians set up 35 years ago to document Chinese metrology. Born in 1936, Qiu has lived through great transformations in China and now knows as much about her subject as anyone alive. Crease also tells the story of West African gold weights through an interview with Tom Phillips, a UK painter and sculptor who has become an expert in these artefacts and their use.

In one particularly amusing side-trip, Crease goes to the Museum of Modern Art in New York to look at Marcel Duchamp’s work 3 stoppages étalon (see Physics World December 2009 pp28–33). Duchamp made this work by taking three metre-long threads, dropping them from a height of 1 m and preserving the way they fell onto a board as curved lines. Art historians have expended much ink in developing explanations of this piece, and Crease recounts some of their analyses. It is known, for example, that Duchamp was interested in science in his youth and had visited the metrology museum of the Conservatoire National des Arts et Métiers in Paris. When he made 3 stoppages étalon, he called it “a joke on the metre”, but he later added that it was “a humorous application of Riemann’s post-Euclidian geometry, which was devoid of straight lines”.

Duchamp’s second comment is particularly interesting in light of the modern definition of the metre in terms of the speed of light. Although SI units are all “proper units” – in other words, their definitions apply only in a small spatial domain that shares the motion of the standards in question – as soon as one wishes to measure a vertical distance from the surface of the Earth, general relativity becomes relevant. For example, the frequency of a clock near the surface of the Earth shifts by about 1 part in 10^16 per metre of altitude. Thus, measuring a vertical distance by means of timing the passage of light requires not only high-level apparatus but also knowledge of general relativity. Maybe the curving of a metre-long thread as it falls through space does indeed have a deeper significance!
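The quoted figure follows from the weak-field gravitational-redshift formula Δf/f ≈ gh/c², as a quick calculation confirms:

```python
# Fractional frequency shift of a clock raised by a height h near the
# Earth's surface: df/f ≈ g*h/c^2 (weak-field general relativity).
g = 9.81          # m/s^2, local gravitational acceleration (nominal)
c = 299_792_458   # m/s, speed of light

def clock_shift_per_metre(h=1.0):
    """Fractional frequency shift per h metres of altitude."""
    return g * h / c**2

print(f"df/f per metre of altitude: {clock_shift_per_metre():.2e}")
# about 1 part in 10^16, as quoted above
```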

Central to any history of metrology is the creation of the metric system at the time of the French Revolution. Crease describes this well, and his discussion proceeds naturally into an account of how the new system was hesitantly adopted in France, and how attempts were made to introduce it to Britain and the US. By the 1860s, flaws in this early system led geodesists – mapmakers and surveyors – to call for a new, improved metre and the creation of a Bureau of Weights and Measures. One problem they noted was that the official “Metre of the Archives” constructed in 1799 was an “end standard”, a simple rectangular bar the length of which from end to end defined the metre. This type of standard is easily damaged, whereas a “line standard” – in which length is defined as the distance between two fine lines engraved on a bar’s surface – is more robust. Another problem was that the metre was actually kept in the French Archives and was thus not directly accessible.

The geodesists’ proposals did not at first please the French, who felt that their metre and kilogram provided all that was necessary. But in the end, the French gave way, and the Metre Convention was signed in Paris in 1875, creating the BIPM and setting in train the construction of new standards of the metre and the kilogram. Crease’s description of these events incorporates an account of his own recent visit to the French Archives, during which he was shown the actual 1799 metre and kilogram.

The first serious move towards a standard of length based on the wavelength of light, rather than a physical object, was made by the American scientist Charles Peirce in the 1870s. Unfortunately, Peirce combined scientific brilliance with a flawed character: he scarcely began one ambitious project before starting others (and rarely finished any), was prone to extreme mood swings, indulged in irrational financial dealings and carried on numerous extramarital affairs. Rude and aggressive to friends and enemies alike, he found it impossible to maintain good relations with his fellow scientists. Since little of his work was ever finished, let alone published, the first measurements of the metre in terms of the wavelength of light are almost always credited to Albert Abraham Michelson, who came to the BIPM in 1892 and performed the measurements with the bureau’s director, René Benoît.

Crease has already written about many of these people, both in Physics World and elsewhere, but here he goes into their work in much greater detail. Among those who feature is Charles-Édouard Guillaume, Benoît’s successor as BIPM director, who discovered an alloy of iron and about 36% nickel called invar that has almost zero thermal expansion at room temperature. Invar very quickly found numerous applications in metrology for such things as standard-length bars and gauges, geodesic tapes and wires for surveying. For its discovery, Guillaume received the 1920 Nobel Prize for Physics.

The latter part of Crease’s book traces out the rest of the path towards last October’s resolution at the 24th CGPM, through successive definitions of the metre and the ampere to the creation of SI units in 1960. He does this in a clear and entertaining way, bringing people who have been or are engaged in the effort into his narrative as often as possible. He ends with an account of the new work that has made it possible to consider defining the kilogram in terms of the Planck constant – the key advance that has at last opened the way to an absolute system of units.

Early on, Crease remarks that, in his experience, metrologists like to pass themselves off as colourless people who lead dry careers in a field outside mainstream science. He says that his book exposes this as a false image, adding that the closer he looked into metrology, the more he found tales as wild – and personalities as outsized and creative – as those found in politics, music and the arts. Regardless of whether this is true – and as a metrologist myself, I am not impartial – Crease’s excellent book captures the spirit of metrology and brings to life a subject that the reader will, I am sure, find fascinating and compelling.

Take part in our photo challenge

Photograph through sunglasses


By James Dacey

Physics is without doubt an incredibly visual subject. From the distant stars and galaxies observed by telescopes to the technicolour bursts captured by particle detectors, images play an inspirational role in our understanding of the physical universe at all scales.

At the heart of scientific imagery is light. Indeed, in recognition of the vital role light plays in science and engineering, a proposal has been made for an International Year of Light in 2015 to promote improved public and political understanding of the central role of light in the modern world.

We want to celebrate the connections between light and the physical world by asking you to share your photos. Take part in the Physics World photo challenge by submitting photos to our Flickr group on the theme of “Light In Physics”. Please add your photos by Wednesday 29 February and then after this date we will choose a selection of our favourite images to be showcased on physicsworld.com.

Nature is teeming with photo opportunities. It might be the dramatic light shows produced by aurora, pearl-like water droplets glistening in a spider’s web, or the shimmering structural colours paraded by animals such as peacocks and butterflies. Or you may choose to capture an image indoors, maybe a laboratory demonstration of a basic optics principle or perhaps a fascinating array of laser light. Be as creative as you like.

Please also feel free to write a caption to share the story behind the image. Your photos may show an interesting physical phenomenon, or may have required some inventive and time-consuming photography. Expensive equipment is not necessarily required, however. People prove every day that you can capture an inspiring snapshot using the most basic of cameras, even the one on your mobile phone.

We look forward to your photos – happy snapping!

Axions could solve lithium problem

For more than a decade, scientists have been aware that the theory used to explain how the lightest elements are created overestimates the overall amount of lithium-7 in the universe. Now, physicists in the US think the answer to this so-called lithium problem might lie in a hypothetical particle known as the axion – although many are not convinced.

The theory is called Big Bang nucleosynthesis and describes a stage early in the universe’s evolution when, at temperatures of billions of degrees, protons and neutrons began to assemble into atomic nuclei and form the first light elements: deuterium, along with isotopes of helium and lithium. As temperatures dropped, nucleosynthesis drew to a close, and eventually electrons began to add themselves to the nuclei during a period called recombination. At this time, photons stopped scattering off charged particles and the universe became transparent.

Cosmologists know this because they can detect the cosmic microwave background (CMB), a haze of radiation throughout the universe whose temperature derives from that of the last photon scattering. From fluctuations in the CMB, cosmologists can calculate the ratio of baryons to photons. Baryons include the protons and neutrons that make up everyday matter. It is this baryon-to-photon ratio that predicts the abundances of the first light elements. But for lithium-7, the prediction appears to be some three times higher than the amount observed.

What happened to the lithium?

Several theories have been put forward to explain this lack of lithium, but none has won widespread acceptance. Now, particle physicists Pierre Sikivie and colleagues at the University of Florida in Gainesville think they have a straightforward solution. “What’s nice about our proposal is that we don’t have to assume anything new,” says Sikivie. “We just take the axion, which has long been discussed, and point out some properties that have been overlooked.”

Axions were first proposed in the late 1970s to solve a puzzle in particle physics known as the strong-CP problem, although more recently they have been proposed as candidates for dark matter, which is the mysterious substance thought to make up nearly a quarter of the mass/energy of the universe. If they exist, axions would be very light and interact very weakly with matter – properties that make them difficult to find. Indeed, no experiment on Earth has yet discovered any evidence of axions.

Sikivie and colleagues point out that axions can form a Bose–Einstein condensate (BEC). Such condensates contain particles that have all fallen into their lowest energy state, and are best known to occur in low-density gases at temperatures close to absolute zero. But since the critical temperature for transition to a BEC depends on density, say the Florida researchers, particles can form BECs at higher temperatures as long as they are dense enough. Even in the primordial heat of the Big Bang, the researchers say, axions would easily be dense enough to form a BEC.
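For an ideal Bose gas, the dependence of the condensation temperature on density is explicit in the standard textbook formula T_c = (2πħ²/mk_B)(n/ζ(3/2))^(2/3). The sketch below, using illustrative rubidium numbers rather than axion ones, shows the n^(2/3) scaling the argument relies on: denser gases condense at higher temperatures.

```python
import math

# Ideal-gas BEC critical temperature:
#   T_c = (2*pi*hbar^2 / (m*k_B)) * (n / zeta(3/2))^(2/3)
hbar = 1.054_571_817e-34   # J s, reduced Planck constant
k_B  = 1.380_649e-23       # J/K, Boltzmann constant
ZETA_3_2 = 2.612           # Riemann zeta(3/2), to four significant figures

def bec_critical_temperature(n, m):
    """Critical temperature (K) for an ideal Bose gas of number
    density n (1/m^3) and particle mass m (kg)."""
    return (2 * math.pi * hbar**2 / (m * k_B)) * (n / ZETA_3_2) ** (2 / 3)

# Illustrative numbers for rubidium-87 at a typical trap density:
m_rb = 87 * 1.660_539e-27                       # kg
print(f"T_c ~ {bec_critical_temperature(1e20, m_rb):.1e} K")
# The n^(2/3) scaling: 8x the density gives 4x the critical temperature.
```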

Transferring heat

An axion condensate would have a marked effect on Big Bang nucleosynthesis. Passing photons would excite waves in the condensate, transferring heat to it and ultimately becoming depleted in number. This means that the baryon-to-photon ratio would increase towards the time of recombination, giving cosmologists today a falsely high impression of the amount of lithium that should have been created.

At least, that is what Sikivie and colleagues think – others are not so sure. Kenneth Nollett of the Argonne National Laboratory in the US points out that, in alleviating the lithium problem, the Florida group’s theory overestimates the amount of deuterium. What is more, the theory requires the effective number of neutrinos – an important value in cosmology – to increase from what has been calculated from the CMB. Whereas observations generally suggest the neutrino number to be between 3 and 4, Sikivie and colleagues expect it to be about 6.8.

“I guess the bottom line for me is that it is important that many possible explanations of the lithium problem are being pursued, but I am sceptical about the [Florida group’s] proposal,” says Nollett.

Sikivie admits the deuterium and neutrino overestimates are potential problems for the theory. Still, he is waiting for results from the European Space Agency’s Planck space observatory, which will provide the most accurate measurement of the effective neutrino number in the next year. “Time will tell,” he says.

The research is published in Physical Review Letters.

How do we ditch fossil fuels?

Environmental concerns about the impact of carbon-dioxide emissions are continuing to build, and politicians are becoming increasingly aware that fossil-fuel reserves will not last forever. Part of a technological solution to these problems is to invest in cleaner and more sustainable energy technologies. But, of course, there are plenty of technical and political hurdles that need to be overcome first.

Some of these issues are addressed in this exclusive interview with Dan Kammen, founding director of the Renewable and Appropriate Energy Laboratory (RAEL) at the University of California, Berkeley in the US. Kammen is confident that we can transform the global energy system to be supplied entirely by renewable energies, and he believes the main challenge is to seek ways of driving down the costs at all scales.

“We have lots of industry and vested interest thinking about the current system and seeing no benefit, but lots of risks, in making the transition. And we need to bring those risks down through financing mechanisms, through popular community voter support,” he says.

Kammen also voices his opinion on the role that nuclear energy should play. “Nuclear is certainly clean in terms of the CO2 emissions, but we have a very mixed record around the world of managing nuclear,” he says. In explaining this point, Kammen refers to both the patchy safety record of the industry and the fact that many high-profile projects have ended up costing far more than their initial budgets.

In addition to his work at Berkeley, Kammen also works with the World Bank by advising it on technical and policy issues relating to energy projects around the globe. It is a role that has seen Kammen spend time in many locations around the world, including Sudan, Central America and various island states of the South Pacific. Kammen says he sees the rapid industrialization around the world as much more of an opportunity than a problem.

“They’re building energy infrastructure very quickly and they see many of the pitfalls of a fossil-fuel addiction,” he says. “The more they can make use of local clean energy resources and energy efficiency, it allows them to chart a path…and be less dependent than many people thought on the fossil-fuel economy that everyone sees as one we need to eventually phase out.”

Teleporting to Vancouver

By Hamish Johnston

Tomorrow I will be winging my way to Vancouver to attend the annual meeting of the American Association for the Advancement of Science – the AAAS. I have lots on for the next few days, including a trip to the TRIUMF accelerator lab to find out how physicists there are planning to make medical isotopes using an accelerator rather than having to rely on ageing nuclear reactors.

Geordie Rose

I will also be spending a lot of time talking to people about quantum computing (QC). Indeed, a good chunk of the programme at the AAAS is devoted to QC, a field in which Canadian physicists have excelled.

One such physicist is Geordie Rose (right), founder of the quantum-computer maker D-Wave Systems, which is based in Vancouver. I plan to visit D-Wave on Friday, when I hope to find out what is in that giant box that often appears behind Geordie.

What I’m not looking forward to is the 10-hour flight – if only quantum teleportation worked for macroscopic objects.

Wake up, little SUSY

By Hamish Johnston

I should have known better than to listen to rumours. But some of the most reliable gossips in the particle-physics blogosphere had been saying to expect news of evidence for a supersymmetry particle – or sparticle – to come from the Large Hadron Collider today.

Ximo Poveda

So, earlier today I donned my headphones and pointed my browser at a talk given by Ximo Poveda of the ATLAS experiment (right), who was tipped to be the bringer of good news. He went through four or five searches for supersymmetric partners of various quarks and leptons – squarks and sleptons known as the stop, stau and sbottom – and each time the conclusion was “we see nothing beyond the Standard Model”.

Supersymmetry (or SUSY) is an attractive route beyond the Standard Model because it offers a solution to the “hierarchy problem” of particle physics, provides a way of unifying the strong and electroweak forces, and even contains a dark-matter particle. Many physicists hope the LHC will confirm SUSY’s central prediction – that for each of the Standard Model particles there exists a heavier sparticle sibling. But so far, nothing beyond the Standard Model has emerged.

Oh, well. On a brighter note, CERN has announced that in 2012 it will be running the LHC at the higher collision energy of 8 TeV, instead of the 7 TeV collisions that took place in 2011. As well as boosting the chances that sparticles will be spotted, the move to a higher energy should also make it clearer whether the Higgs boson has been found with a mass of about 125 GeV – as suggested late last year.

The downside is that higher-energy collisions require greater current flowing through the LHC’s superconducting magnets – and there is a problem with the electrical connectors between magnets. In 2008 the LHC failed spectacularly when one of these connectors overheated; let’s hope that doesn’t happen again.

Butterfly lights the way to better thermal imaging

Studying the properties of iridescent butterfly wings could help engineers develop temperature sensors that are smaller and faster, according to researchers in the US. They also say that such sensors could work without expensive and cumbersome cooling equipment, which could have implications for imaging applications such as thermal night vision and medical diagnostics.

There are many ways to detect heat, such as measuring the change in electrical resistance when certain substances change temperature. But to remain sensitive to incoming radiation, these devices must be constantly cooled, otherwise they would continue to register the presence of a heat source for some time after it has been moved away. As a result, the most sensitive thermal imagers require liquid-helium refrigeration. Since the heat sinks required are relatively large and power-hungry, this limits the minimum size and efficiency of the sensors. These requirements pose severe challenges for those designing portable equipment, such as thermal-imaging goggles. Indeed, goggles pose a particular problem because an ideal pair would be transparent to visible light, which is difficult to achieve with heat sinks in the way.

Now, Radislav Potyrailo and colleagues at the General Electric Global Research Centre and the University of Albany in New York have shown that the goals of high sensitivity and convenience could both be satisfied with a little help from nature. They have created a material inspired by the wings of the Morpho butterfly, which are covered with scales that reflect light at some wavelengths and absorb it at others. While this process is not completely understood by scientists, it is known that as the wing heats up, the intensity of the different wavelengths of visible radiation reflected changes slightly, which alters the colour of the wing.

Easy to chill

Potyrailo’s team decided to investigate this effect to find out whether the same principle could be used to construct a synthetic temperature sensor. If it could, it would have a significant advantage over current temperature sensors, because the wings of Morpho butterflies are made of chitin, which is a natural polymer with a much lower heat capacity than the metals and semiconductors used in today’s sensors. This means that a sensor using this technology could cool down quickly without heat sinks.
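The advantage can be seen in a lumped thermal model: a detector element of heat capacity C (J/K), coupled to its surroundings with thermal conductance G (W/K), relaxes back to ambient with a time constant τ = C/G. The numbers below are purely illustrative, not measured values for chitin or any real sensor.

```python
# Lumped thermal model for why heat capacity matters: the element's
# recovery time after a heat source is removed is tau = C/G.
def thermal_time_constant(heat_capacity, conductance):
    """Relaxation time (s) of a lumped thermal element:
    heat_capacity in J/K, conductance in W/K."""
    return heat_capacity / conductance

# Halving the element's heat capacity halves its recovery time
# (illustrative values only):
tau_a = thermal_time_constant(2e-9, 1e-7)   # C = 2 nJ/K, G = 0.1 uW/K
tau_b = thermal_time_constant(1e-9, 1e-7)   # C = 1 nJ/K, same coupling
print(f"{tau_a * 1e3:.0f} ms vs {tau_b * 1e3:.0f} ms")
```

This is why a low-heat-capacity material such as chitin can shed a heat signature quickly without a bulky heat sink.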

Potyrailo’s team tested the infrared absorption of Morpho scales and found that when the scales were heated from one side by infrared radiation, thermal expansion caused the ridges to move slightly further apart, thus changing the wavelengths reflected and absorbed when white light hit the scales from the other side. The effect was accompanied by a slight reduction in the refractive index of the chitin.
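If the reflected peak simply tracks the ridge spacing, as in a diffraction grating (mλ = d·sinθ), the fractional wavelength shift is roughly the thermal strain αΔT. A minimal sketch, using a hypothetical expansion coefficient for chitin and neglecting the accompanying refractive-index change:

```python
# If the reflected peak wavelength scales with the ridge spacing d,
# thermal expansion shifts it by d(lambda)/lambda ~ alpha * dT.
ALPHA_CHITIN = 5e-5   # 1/K, hypothetical linear expansion coefficient

def wavelength_shift(lambda_0, delta_T, alpha=ALPHA_CHITIN):
    """Shift (same units as lambda_0) of a structural-colour peak for
    a temperature rise delta_T, assuming the peak tracks the spacing."""
    return lambda_0 * alpha * delta_T

# A 480 nm Morpho-blue peak warmed by 10 K:
print(f"shift ~ {wavelength_shift(480, 10):.2f} nm")
```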

Decorating with nanotubes

Building on previous work by other researchers that revealed that decorating a material surface with carbon nanotubes enhances its ability to absorb infrared radiation, the team showed that the wings absorbed infrared better if carbon nanotubes were added to the exposed surface. As a bonus, because carbon nanotubes have excellent thermal conductivity, the decoration helped to diffuse heat through the chitin away from the site of irradiation, thus providing a molecular heat sink.

The research is still at an early stage, and the researchers will need to find a way to produce the nanostructured chitin – or a similar material – synthetically before they can produce a viable sensor. But the researchers believe the work could one day lead to relatively cheap, multicolour thermal-image sensors, in which mid-infrared would appear as one colour and far-infrared as another. “If we have a very small pixel size, then in the same or neighbouring pixels we can create these ‘Christmas trees’ that are responding to different regions of the infrared spectrum and they will be responding with different colours. These days it’s either several chips that are combined together or very sophisticated, very complicated design of heat sensor,” says Potyrailo.

Clever use of natural structures

“I find the [work] to be a very clever use of the natural photonic structures found on the wing scales of tropical Morpho butterflies,” says Mohan Srinivasarao of Georgia Institute of Technology, an expert on butterfly wings. “In my opinion, the more we look carefully at these structures, the more we are going to find more ways of using them; and, of course, the butterflies may not have created these structures for the uses we may find as we look more closely.”

The research is published in Nature Photonics.

Physics of the fringe

Hair data


The shape of a real ponytail at various lengths is shown in (a); calculated shapes are shown in (b) and (c), where (c) includes a term for frizziness. (Courtesy: American Physical Society)

By Hamish Johnston

Here is some research that is truly on the fringe – or a “big bangs theory” for our readers in North America.

Physicists in the UK have published a paper in Physical Review Letters entitled “Shape of a ponytail and the statistical physics of hair fiber bundles”.

Written by Raymond Goldstein of the University of Cambridge, Robin Ball of the University of Warwick and Patrick Warren of the shampoo-maker Unilever, the article offers an “equation of state for the human ponytail”. Amazingly, the physicists are not the first to calculate an equation of state for hair – that was done back in 1946 by C F van Wyk, who was interested in the compressibility of wool.

According to the UK-based trio, the shape of a ponytail is defined by the competing effects of the elasticity of individual hairs, gravity and mutual interactions between hairs in a bundle. And because a ponytail can contain as many as 100,000 hairs, the problem is best addressed using statistical physics.

The researchers derived a relatively simple equation of state that includes the “Rapunzel number”, which they describe as a dimensionless measure of ponytail length. The team then used the equation to predict the shape of a ponytail as a function of length and compared the results with the shapes of ponytails made with real human hair.
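A rough sketch of the Rapunzel number, assuming it takes the form Ra = L/l, where the length scale l = (A/λg)^(1/3) balances the single-fibre bending modulus A against the weight per unit length λg; the single-hair values below are illustrative, not the paper's:

```python
# Rapunzel number Ra = L / l, with l = (A / (lam*g))^(1/3) the length
# at which gravity starts to bend a single fibre appreciably.
g = 9.81        # m/s^2, gravitational acceleration
A = 8e-9        # N m^2, bending modulus of one hair (assumed)
lam = 6.5e-6    # kg/m, mass per unit length of one hair (assumed)

def rapunzel_number(L):
    """Dimensionless ponytail length for a ponytail of length L (m)."""
    l = (A / (lam * g)) ** (1 / 3)   # ~5 cm for these values
    return L / l

print(f"Ra for a 25 cm ponytail ~ {rapunzel_number(0.25):.1f}")
```

A short ponytail (Ra well below 1) hangs stiffly; a long one (Ra much greater than 1) drapes under its own weight.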

The derivation that best describes a real ponytail also includes a term that reflects the observation that hair becomes “frizzier” as it grows out.

What’s next for the trio? The researchers want to apply their newfound equation of state to study the dynamics of ponytails – how they swish back and forth when the wearer is running.

You can read the paper here, but a subscription may be needed.

Nanoshells could boost photovoltaics

Researchers in the US have reported on a new way to increase the amount of light absorbed by thin-film solar-cell materials. The new technique relies on “whispering gallery” modes in which light becomes trapped inside tiny shells made of silicon. The result could lead to more efficient photovoltaics, claims the team.

Nanocrystalline silicon could be ideal for making photovoltaic devices because it is an excellent conductor of electricity and can withstand harsh sunlight without suffering any damage. However, there is a problem: silicon does not absorb light very efficiently. Layers of the material have to be built up to increase the amount of light absorbed – a process that is both time-consuming and expensive.

Now, Yi Cui and colleagues at Stanford University have shown that nanoshells made of silicon could offer a quicker and cheaper route to solar-cell fabrication. The cavity inside such a structure confines light in a “whispering gallery” mode, whereby the light orbits around the edge of the cavity at precise resonant frequencies as a result of total internal reflection. “Light effectively gets trapped in these hollow shells,” explains Cui. “It circulates round and round rather than just passing through, and this is very desirable for solar applications because the longer the light is kept in the material, the better its absorption will be.”
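In the simplest ray picture, the resonance condition is that an integer number m of wavelengths fits around the circumference of the cavity: mλ = 2πRn, where n is the refractive index. A sketch with assumed values (a rough index for silicon and an illustrative shell size, not the team's exact geometry):

```python
import math

# Whispering-gallery resonance (simple ray picture): light circulating
# around a shell of radius R resonates when m*lambda = 2*pi*R*n.
N_SI = 3.5   # approximate refractive index of silicon (assumed)

def resonant_wavelengths(radius_nm, m_values, n=N_SI):
    """Vacuum wavelengths (nm) of whispering-gallery modes of order m."""
    return [2 * math.pi * radius_nm * n / m for m in m_values]

# For a 100 nm-radius shell, the first few mode orders:
for m, lam in zip(range(3, 6), resonant_wavelengths(100, range(3, 6))):
    print(f"m = {m}: lambda ~ {lam:.0f} nm")
```

At resonance the light makes many round trips before leaking out, which is what lengthens its dwell time in the absorbing silicon.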

Silica balls

The researchers created their nanoshells by first fabricating balls of silica just 50 nm in size and coating these with a layer of silicon. Next, they etched away the glass centre of the shells using hydrofluoric acid. The acid does not attack the surrounding silicon layer and so the technique produces a light-sensitive silicon shell.

The nanoshells can be made in a matter of minutes. In contrast, a micrometre-thick flat film of solid nanocrystalline silicon with equivalent light-absorbing properties would take a few hours to deposit. The nanoshells also absorb light over a broader spectrum than the flat layer of silicon.

And that is not all: a significantly smaller amount of material is required to make a nanoshell compared with a flat silicon slab – roughly 5%, according to Cui and co-workers. This is something that could obviously bring down processing costs. “Looking down the road, the fact that much less material is required to make these nanoshells might come in useful when manufacturing many other types of thin-film cells, such as those that use rarer, more expensive materials like tellurium and indium,” he told physicsworld.com.
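The scale of the saving can be illustrated with simple geometry: a thin shell of wall thickness t contains a fraction of roughly 3t/r of the material in a solid sphere of the same outer radius r. This is only an illustration of why hollow structures are so economical, not the paper's actual shell-versus-flat-film comparison:

```python
# Material used by a hollow shell versus a solid sphere of the same
# outer radius r: V_shell / V_solid = 1 - ((r - t)/r)^3 ~ 3t/r for t << r.
def shell_fraction(r, t):
    """Fraction of a solid sphere's volume used by a shell of wall t
    (r and t in the same units)."""
    return 1 - ((r - t) / r) ** 3

# Illustrative numbers: a 50 nm-radius shell with a 1 nm wall uses only
# a few per cent of the solid sphere's material:
print(f"{shell_fraction(50, 1):.1%}")
```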

New applications

The nanoshells are also fairly indifferent to the angle of incoming sunlight hitting them and the layers are able to bend and twist without becoming damaged. “All these factors might open up an array of new applications in situations where optimal exposure to the angle of the Sun is not always possible,” adds Cui. “Imagine solar sails on the high seas or photovoltaic clothing for mountain climbing, for example.”

Having performed detailed theoretical calculations on the nanoshells, the researchers are now busy making real cells from silicon. “We are also exploring the structures to see if they can be used in other types of applications, such as solar fuels and photodetectors as well,” reveals Cui.

The work is described in Nature Communications.

Copyright © 2026 by IOP Publishing Ltd and individual contributors