
Extracting entropy information from quantum dots

Researchers have succeeded in measuring how energy dissipates in quantum dots by quantifying the entropy they produce. The work, by a team at Stanford University in the US, could help in the optimization of real-world nanoscale devices used in applications such as quantum memories and information processing.

Technologies like memory storage devices and information processors are intrinsically dissipative, explains materials scientist and engineer Aaron Lindenberg, who led this new study. Energy is lost as heat in many ways but at a fundamental level, this arises from the Landauer principle, which defines a lower limit for these energy costs. “When physical or computational processes evolve non-quasi-statically – for example, over a finite amount of time and out of equilibrium – the energy costs increase. Despite its fundamental and practical importance, directly measuring this dissipation remains extremely challenging, particularly as modern devices continue to shrink in size.”

In the new work, Lindenberg and colleagues wanted to measure energy dissipation directly in real materials in contrast to previous experiments that measured entropy production in very clean systems, such as defect centres in diamond. “Previously studied materials behaved like simple two-state ‘Markov’ systems, where the probability of moving to the next step is determined only by the current state,” explains Lindenberg, “but real materials often have memory effects and hidden internal states.”

Good test systems

Quantum dots, which are tiny semiconductor crystals that emit fluorescent light when excited with ultraviolet light, are good test systems in this context, he says. When they emit light, charge carriers (electrons and holes) can tunnel into nearby defect states, temporarily stopping the emission, causing the quantum dot to “blink” between bright and dark states. This non-Markovian and stochastic blinking follows statistical patterns (a power law waiting-time distribution) that hint at memory effects and hidden states.

In their experiment, Lindenberg and colleagues in the School of Engineering and Photon Science at the SLAC National Accelerator Laboratory kept the ultraviolet excitation on continuously and switched an additional strong laser field on and off. This process changed the blinking statistics and drove the system out-of-equilibrium. The researchers then recorded the fluorescence blinking traces and used machine learning to optimize a physics-based “hidden Markov model”. “This allowed us to reconstruct the hidden state trajectories that are Markovian and then compute entropy production from them,” says Yuejun Shen, the first author of the study, which is detailed in Nature Physics.

Entropy production of quantum dots is a quantity that describes how reversible a microscopic process is. It encodes information about memory, information loss (as the distribution of charge carriers in the dots evolves) and energy dissipation. Such measurements therefore enable new possibilities for determining the ultimate efficiency limits of a device, he explains.
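The underlying idea can be sketched with a toy model. For a Markov trajectory, entropy production can be estimated as the trajectory average of ln[P(i→j)/P(j→i)] over observed transitions: a system obeying detailed balance gives zero, while a driven cycle gives a positive value. This is a minimal illustration of the concept, not the team’s actual analysis; the three-state chain and its transition probabilities below are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical three-state Markov chain with a driven cycle
# 0 -> 1 -> 2 -> 0 (rates are illustrative, not from the study).
P = np.array([[0.80, 0.15, 0.05],
              [0.05, 0.80, 0.15],
              [0.15, 0.05, 0.80]])

# Simulate a state trajectory
n_steps = 50_000
traj = np.empty(n_steps, dtype=int)
traj[0] = 0
for t in range(1, n_steps):
    traj[t] = rng.choice(3, p=P[traj[t - 1]])

# Entropy production per step (in units of k_B): the trajectory
# average of ln[P(i->j) / P(j->i)] over observed transitions.
log_ratio = np.log(P / P.T)
sigma = log_ratio[traj[:-1], traj[1:]].mean()
print(f"entropy production per step: {sigma:.3f} k_B")
```

For this biased cycle the estimate comes out positive, as expected for a driven, irreversible process; replacing P with a detailed-balanced matrix drives it to zero.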

Measuring entropy production

Lindenberg adds that the new work provides a general method to measure entropy production in complex, stochastic and non-equilibrium systems in which we cannot observe all internal states directly.

While practical applications may still be a way off, the approach could eventually help measure and reduce dissipation in nanoscale devices, he tells Physics World. “This is especially important as device sizes continue to shrink and stochastic fluctuations become unavoidable.”

As to future work, the Stanford researchers say they would now like to measure energy dissipation in other material systems and implement optimization algorithms to minimize this dissipation. “This new type of calorimetry could have applications in many other types of information storage devices and technologies,” says Lindenberg.

Rechargeable liquid solar battery stores sunlight in molecules

Being able to store renewable energy, such as that produced by sunlight, so that it can be used at night or on cloudy days remains a major challenge. In recent years, researchers have been looking into molecular solar thermal (MOST) energy storage systems that harness the energy from photons and release it when needed. Now, a joint US team from Grace Han’s lab at the University of California, Santa Barbara, and Kendall Houk’s lab at the University of California, Los Angeles, has published details of a bio-inspired pyrimidone-based molecule that, when highly strained, stores a record amount of photon energy in its chemical bonds. The energy released when the molecule is allowed to relax is enough to boil water, the researchers say.

The structure of pyrimidone looks very much like that of a component found in DNA, which, when exposed to ultraviolet light, can reversibly form “Dewar lesions”. “These lesions naturally contain significant ring strain, something that immediately stood out to us as a promising feature for energy storage,” says first author Han Nguyen.

The researchers set out to engineer a synthetic version of this structure, the Dewar isomer of pyrimidone, which they also designed to be highly strained. They did this by combining a de-aromatization strategy with a compounded strain effect from fusing two already strained rings contained within the molecule.

Better than previous MOST energy storage systems and Li-ion batteries

“As a result, each molecule can store a large amount of energy, reaching 228 kJ/mol,” says Nguyen. “This translates to a gravimetric energy density of 1.6 MJ/kg, a value that is at least 1.6 times higher than previous MOST energy storage systems and nearly double the energy density of a standard lithium-ion battery (around 0.9 MJ/kg).”
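The two quoted figures can be cross-checked with back-of-envelope arithmetic: dividing the molar storage energy by the gravimetric energy density implies a molar mass of roughly 140 g/mol (the article does not state the molar mass, so this is an inference, not a reported value).

```python
# Back-of-envelope check of the quoted numbers. Dividing the molar
# storage energy by the gravimetric energy density gives the implied
# molar mass (not stated in the article).
E_molar = 228e3          # stored energy, J/mol (quoted)
rho_grav = 1.6e6         # gravimetric energy density, J/kg (quoted)

molar_mass = E_molar / rho_grav   # kg/mol
print(f"implied molar mass: {molar_mass * 1e3:.1f} g/mol")

# Comparison with a standard Li-ion cell (~0.9 MJ/kg, as quoted)
print(f"ratio vs Li-ion: {rho_grav / 0.9e6:.2f}x")
```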

The system can be described as a mechanical spring, she says. “When hit with sunlight, it twists into a strained, high-energy shape. It stays locked in that shape until a trigger – such as a small amount of heat or a catalyst – snaps it back to its relaxed state, releasing the stored energy as heat. It can be thought of as a liquid solar battery that stores sunlight and can be recharged.”

The result, Nguyen explains, addresses a long-standing limitation of MOST materials: insufficient energy density for practical use. Until now, the stored heat could only be released and used in well-insulated environments to minimize losses. “In contrast, our system releases enough heat to operate under ambient conditions and in our demonstrations, the heat output is strong enough to boil a sample of 0.5 mL of water in under one second.”

According to the researchers, the study “marks an important step toward real-world applications and shows that MOST systems can now move beyond controlled laboratory settings and function robustly in practical environments”.

A simplified structure that is soluble in water

This pyrimidone system had not been explored before as a candidate for MOST materials, so the researchers first had to design a simplified structure based on the Dewar lesion. The challenge, remembers Nguyen, was to strip away parts that were not relevant to the application at hand while retaining the features responsible for efficient storage and release of energy. “Through iterative design and testing, we arrived at a structure that is both efficient and practical,” she says.

Another challenge was improving functionality. “Our early designs still required solvents, which limit practical use,” she explains. “To solve this, we engineered the system into a liquid-state molecule that can operate without solvent.”

The fact that the material is soluble in water means that it could be pumped through roof-mounted solar collectors to charge during the day and stored in tanks to provide heat at night, adds team member Benjamin Baker. “With solar panels, you need an additional battery system to store the energy, but with molecular solar thermal energy storage, the material itself is able to store that energy from sunlight.”

The UCSB researchers hope their work, which they detail in Science, will encourage further research in the field, so that pyrimidone and other heterocycles like it can be further improved and optimized. “We would like to design and develop molecules that absorb in a broader range of solar radiation,” says Houk. “We also want to maintain high energy density, thermal stability and energy release upon thermal ring opening and will use quantum mechanical calculations to make these predictions.”

To this end, he adds, the team plans to screen hundreds to thousands of molecules, perhaps with AI assistance, to open up new avenues for experimental research.

Nguyen tells Physics World that the goal of her laboratory is “to make heat more affordable and accessible, especially in situations where people need it most. For example, our materials could be useful in emergency or disaster settings where access to power and fuel is limited”.

Looking further ahead, she says that the technology could be integrated into real-world systems, such as heating for houses and buildings, helping to provide more reliable and accessible heat in everyday life.

The coming hurricane: early-career physicists and the crisis in American science

With several dozen talks taking place at any one time, figuring out which session to attend at the American Physical Society’s Global Physics Summit is a challenge. On Wednesday, though, my choice was particularly stark. Should I depress myself by attending the session on “The Crisis in American Science”? Or restore my faith in humanity by finding out “How Early-Career Physicists Are Solving Society’s Greatest Challenges”?

A quirk of scheduling – or an APS organizer with a dark sense of humour – meant that the two sessions were practically next door to each other in Denver’s Colorado Convention Center. So, after a bit of dithering, I decided to oscillate between the two, hoping to strike a balance between cheer and gloom.

Reasons to be cheerful

The first speaker in the early-career physicists session was Rosimar Rios-Berrios, a physicist-turned-atmospheric-scientist at the US National Center for Atmospheric Research (NCAR). Located up the road from Denver in Boulder, Colorado, NCAR’s iconic, adobe-style Mesa Laboratory was designed by the architect I M Pei, and it supports the work of several hundred scientists who study weather and climate.

Rios-Berrios is originally from the US island territory of Puerto Rico, and her APS talk focused on a weather phenomenon that’s hugely important for anyone living on tropical islands or coasts: hurricanes. Rios-Berrios is trying to understand how these storms develop. In particular, she wants to know what happens in the atmosphere to cause gentle Caribbean breezes and fluffy white clouds to morph into devastating hurricanes like 2017’s Hurricane Maria, which killed nearly 3000 people in Puerto Rico alone, or 2025’s Hurricane Melissa, which devastated parts of Jamaica.

In previous studies, a specific type of atmospheric wave called a convectively coupled Kelvin wave had emerged as a possible hallmark of incipient tropical cyclones (a generic term for hurricanes and their Pacific Ocean equivalents, typhoons). These waves appear in the region near the equator where northeast and southeast trade winds meet, and they move eastward at around 15-20 metres per second, with wavelengths of 7000 km.

As they travel, the Kelvin waves perturb local concentrations of moisture, and their passing heralds a change in the direction of the prevailing winds. The resulting “zonal wind anomalies” are thought to play a role in spawning early-stage tropical cyclones. Observational data support this idea: two days after these waves pass through an area, there’s a noticeable uptick in tropical cyclones.

Rios-Berrios decided to investigate this correlation by developing an idealized model that captures the convective dynamics of both Kelvin waves and tropical cyclones. To simplify things, she based her model on an Earth-like “aquaplanet” that lacks land and sea ice, and where temperatures vary only with latitude, and never with longitude or the passage of the seasons.

Despite these simplifications, Rios-Berrios found that this watery model world nevertheless produces hurricane-like swirls of wind and cloud. And after observing the full life cycle of more than 100 simulated tropical cyclones, she concluded that Kelvin waves do, in fact, influence their formation, with a two-day time lag that corresponds well to the data. “These results could help forecast active tropical cyclone periods several weeks in advance,” Rios-Berrios told the APS audience.

A gathering storm

Fortified by this promise, I picked up my laptop and walked the short distance to the “science crisis” session down the hall. I missed the session’s first talk, which aimed to draw comparisons between the Trump administration’s cuts to the National Science Foundation and the 1990s round of belt-tightening that doomed the Superconducting Super Collider project. However, I arrived in time for the second talk, in which a science historian, Zuoyue Wang, was scheduled to discuss parallels between the current situation and a succession of Cold War-era science budget crises.

To my surprise, Wang’s first slide featured a photo of NCAR’s iconic Mesa Lab – but not because of the great work being done there by Rios-Berrios and her colleagues. The fact that NCAR studies climate as well as weather means it has fallen foul of MAGA Republicans’ denial of climate change. In December 2025, the Trump White House issued a memo specifically targeting the centre and claiming that “NCAR’s activities veer far from strong or useful science”.

As a consequence of this disfavour, Wang told the audience, “NCAR is being dismantled as we speak.” Though some members of the US Congress have pushed back against the administration’s cuts, Wang described the attacks on science as “never-ending”.

By the end of Wang’s talk, all the optimism I’d built up in hearing about Rios-Berrios’ efforts to protect people from tropical cyclones was blasted from its foundations. If this is the kind of science the current US regime deems “useless”, I wondered, what on Earth is going to happen to the rest of the work on show at events like the APS Summit?

From the classroom to the committee room: Dave Robertson MP on politics and physics

This episode of the Physics World Weekly podcast features a conversation with Dave Robertson, who was elected member of the UK parliament for Lichfield in 2024. Robertson spent eight years teaching physics after studying the subject at the University of Liverpool. He then worked for a teachers’ union, which inspired him to become a candidate for the Labour Party.

He chats with Physics World’s Matin Durrani about his transition from the classroom to the committee room and how parliament “is a truly bonkers and truly bizarre workplace”.

Robertson has already sponsored three physics-related events at the Palace of Westminster and he talks about his membership of various cross-party parliamentary groups – including those on nuclear energy and space.

Robertson has not forgotten his roots in education and is adamant that the UK must address its nationwide shortage of physics teachers. He also urges physicists to speak out about how they can help address many of the world’s problems, notably climate change.

Geometry induces chirality in nickel – and magnons flow

The ability to control the direction in which a signal travels – without external switching, without added circuitry – is a longstanding goal in the design of compact magnetic devices. Magnetochiral anisotropy offers exactly that: a material-level asymmetry in which magnetic waves (known as magnons) travelling in opposite directions are physically inequivalent, opening a route to magnetic logic operations and memory that retains data without a continuous power supply.

The effect has been understood in principle for decades, but always felt like a phenomenon that nature deliberately made inconvenient. Accessing magnetochiral anisotropy required materials that are chiral at the crystalline level – compounds like Cu2OSeO3, where the Dzyaloshinskii-Moriya interaction (DMI, a quantum mechanical force that pushes neighbouring magnetic moments to twist relative to each other) emerges only from a non-centrosymmetric crystal lattice that takes considerable effort to synthesize.

And even after synthesis, the device still needs cooling to cryogenic temperatures and application of an external magnetic field just to function. As a result, a phenomenon with genuine technological promise has spent most of its life confined to fundamental research, perpetually interesting and perpetually out of reach.

A research team headed up at École Polytechnique Fédérale de Lausanne (EPFL) has come up with a way to move this technology closer to real-world application. The idea is deceptively simple: stop asking which material can provide chirality, and start asking whether the shape of the structure itself can do the job instead. It turns out it can – as the team demonstrated, not just in a theoretical prediction, but as an actual measurement, at room temperature and with zero applied magnetic field.

Dirk Grundler, head of EPFL’s Laboratory of Nanoscale Magnetic Materials and Magnonics, and collaborators showed that a structural twist, in the form of a helical surface relief printed onto an otherwise ordinary polycrystalline nickel nanotube, is sufficient to induce chirality. The torsion and curvature of the twist generate a shape anisotropy and force the magnetic ground state into a spiralling spin texture. Its toroidal moment does exactly what the DMI does in a natural chiral crystal – it breaks the symmetry of the magnon dispersion relation, which describes how the energy of a magnetic wave depends on its direction of travel.

These twisted structures, termed artificial chiral magnets (ACMs), satisfy three conditions that no natural chiral magnet has jointly met: room temperature stability, zero field operation and realization in polycrystalline nickel – a material that is naturally achiral.

ACM outperforms a natural chiral crystal

The researchers used two-photon lithography to write the twisted polymeric scaffold and atomic layer deposition to coat it with a conformal 30-nm-thick nickel shell. The handedness of the design is directly inherited by the magnetic ground state – as confirmed by X-ray magnetic circular dichroism microscopy – and can be flipped by field history, producing opposite helicity states in the same structural device on demand.

The team also performed microfocused Brillouin light scattering spectroscopy to resolve the magnon dynamics. At zero field and room temperature, the intensity non-reciprocity parameter (the difference in signal strength between waves travelling in opposite directions) reached 35.7%, switching reproducibly between two stable configurations (spin texture spiralling clockwise or anticlockwise) under field cycling without drift or degradation. At ±250 mT, the frequency non-reciprocity parameter (how much the frequency of a magnetic wave changes depending on which way it travels) peaked at 5.4 × 10⁻², nearly three times the value reported for the bulk chiral material Cu2OSeO3 at cryogenic temperature.
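A common way to quantify intensity non-reciprocity is the normalized difference between the counter-propagating signals; the sketch below uses that definition, which may differ from the paper’s, with illustrative intensities chosen to reproduce the quoted 35.7%.

```python
# Normalized-difference measure of non-reciprocity (the paper's exact
# definition may differ). The intensities are illustrative values
# chosen so the result matches the quoted figure.
I_plus, I_minus = 1.000, 0.474   # counter-propagating signals, arb. units

eta = (I_plus - I_minus) / (I_plus + I_minus)
print(f"intensity non-reciprocity: {eta:.1%}")
```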

Overall, the geometrically engineered nickel tube at room temperature outperformed a natural chiral crystal at low temperature. Using micromagnetic simulations and analytical modelling, the team traced the origin of this non-reciprocity to two cooperating mechanisms, both of which are tied to the geometry of the tube rather than the chemistry of the material, and both of which scale with decreasing tube radius. This implies that the numbers reported in this study are not a ceiling – they are a starting point.

A scalable blueprint for chirality-engineered magnonics

The most important factor in this result is not the nickel itself: the principle is not tied to any one material. Because the chirality here is geometric – written into the shape of the structure rather than the chemistry of the lattice – it is transferable to any ferromagnet that can be deposited conformally over a three-dimensional scaffold.

Analytical calculations predict that permalloy, a nickel–iron magnetic alloy with higher saturation magnetization and exchange stiffness than nickel, should produce stronger non-reciprocity in an identical geometry. And since non-reciprocity scales with decreasing tube radius, sub-100-nm geometries accessible through next-generation two-photon lithography represent a direct route to significantly amplified effects.

Moreover, this ACM structure is multifunctional by design. Spin waves travelling through the helical magnetic structure behave differently depending on their characteristics. Some move quickly and directionally, making them suitable for carrying signals. Others are nearly stationary and strongly asymmetric – they travel in one direction and are blocked in the other, which is the defining behaviour of a diode.

The twist of the magnetic texture (clockwise or anticlockwise) can also be set by a magnetic field pulse and held indefinitely without requiring any power, functioning as a memory that stores information in the handedness of the spin arrangement rather than in a voltage or a charge. Because this directional asymmetry of magnetochiral anisotropy is a property of the geometry and not just of the spin waves, electrical current passing through the same structure is expected to experience the same effect – flowing more easily in one direction than the other.

In other words, a single nanoscale helix could simultaneously route signals, switch them, remember them and rectify them. One structure, four functions, no exotic material – the chirality was never in the crystal, it was in the geometry.

The findings are reported in Nature Nanotechnology.

Fluid flow: how heat can move from cooler to warmer regions

Vortex-induced heat backflow

We are all familiar with the fact that heat flows from warmer regions to colder ones. A team of researchers at the EPFL in Switzerland has now shown that the reverse can happen in many highly ordered materials, without violating the laws of thermodynamics. Besides the fundamental scientific appeal of this finding, the researchers say their work could help in the design of electronic devices in which heat flow could be guided, potentially minimizing heat losses.

Scientists have always thought of heat as following Fourier’s law of diffusion, explains physicist Nicola Marzari, who led this new study. Fourier built his law on earlier work by Newton and established that heat transfer through a solid depends on an intrinsic material property (how easily the material conducts heat) and on the temperature difference (more precisely, the gradient) between a hot and a cold region. “The minus sign between the heat current and the temperature gradient in his equations captures the fact that heat always flows from hot to cold regions,” Marzari explains.
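Fourier’s law in one dimension reads q = −k dT/dx: the flux is proportional to the gradient, and the minus sign sends heat “downhill” from hot to cold. The short sketch below evaluates it numerically for a linear temperature profile; the conductivity and geometry are illustrative values, not taken from the study.

```python
import numpy as np

# Fourier's law in 1D: q = -k dT/dx. The minus sign means heat flows
# from hot to cold. Illustrative values, not from the study.
k = 2000.0                       # thermal conductivity, W/(m K)
x = np.linspace(0.0, 0.01, 101)  # a 1 cm bar
T = 350.0 - 5000.0 * x           # linear profile: 350 K down to 300 K

dTdx = np.gradient(T, x)         # temperature gradient, K/m
q = -k * dTdx                    # heat flux, W/m^2

print(f"temperature gradient: {dTdx[0]:.0f} K/m")
print(f"heat flux: {q[0]:.3e} W/m^2 (positive: toward the cold end)")
```

A negative gradient yields a positive flux everywhere, which is exactly the hot-to-cold behaviour that the hydrodynamic backflow described below Fourier’s regime manages to locally reverse.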

In the 1960s, however, theory argued – and experiments in solid helium revealed – that heat could also propagate like a wave; this behaviour was called “second sound”, in analogy with an acoustic wave travelling through a solid. “At the time, the phenomenon was considered fairly exotic and could only occur at very low temperatures – just a few Kelvin above absolute zero,” says Marzari. “But, in 2015, we showed that this behaviour could be found in many materials – from two-dimensional monolayers to graphite and diamond – and at much higher temperatures. Indeed, experiments on graphite in 2019 and 2022 confirmed these predictions at 100 K and 200 K.”

“Directing heat as we want”

This mode of heat propagation, which could also occur at room temperature, is known as phonon hydrodynamics because heat is now considered as moving like a fluid, Marzari adds. Phonons are the extended atomic vibrations that transport heat. Normally, interference or collisions between these cause heat to dissipate slowly, following Fourier’s law. “However, the emergence of fluid-like behaviour means that vortices can appear and that obstacles can send fluid backwards – from a cold region to a hot one,” he explains. “This is very counterintuitive, but doesn’t break thermodynamics. It also means we can start thinking of guiding and directing heat as we want, or maybe build thermal diodes in which heat can only flow in one direction.”

Back in 2020, the researchers derived a unified theory of heat propagation that encompassed Fourier diffusion and the hydrodynamic regime. This was a feat in itself, says Marzari: Fourier’s law dates from 1822 and the microscopic theory of heat (the Peierls-Boltzmann transport equation) from 1929. “Our new ‘viscous heat equations’ from 2020 are both accurate and reasonably simple to solve, so there was a lot of excitement at being able to try and look at what they would predict.”

“Simple” here is relative, he admits. “But the community has learnt a lot from the study of hydrodynamics phenomena in real fluids – for example, how ships, airplanes and even bumblebees stay afloat – and we used some of that knowledge in our new work.”

Temperature profile of a hydrodynamic system contains two contributions

Reporting their work in Physical Review Letters, the EPFL researchers showed how they could maximize hydrodynamic flow in a strip of graphite using a Fourier-space framework. They knew that the temperature profile of a hydrodynamic system contains two contributions: vorticity (how heat flow swirls) and compressibility (how heat flow is squeezed). They were therefore able to show how compressibility plays a critical role in phonon fluids. This, says team member Enrico Di Lucente, provides an explanation for why heat backflow is at a maximum when compressibility is minimized: incompressible heat flow cannot be squeezed when it encounters resistance, but is instead redirected backwards.

In such hydrodynamic heat backflow, heat flows from cooler regions to warmer ones, leading to a negative temperature difference and overall negative thermal resistance across the device. While the effect observed is very small, Di Lucente says that he and his colleagues could now design experiments to maximize it, “potentially changing how we think about energy loss in electronic systems”. For example, “you could imagine a smartphone with a hydrodynamic shield to direct thermal energy away from the battery, so it doesn’t overheat”.

Looking ahead, the researchers are now working with experimental colleagues who are able to carve very precise microscopic structures that could confirm the predicted phenomena. “We will also explore novel geometries and architectures, to make the effects we have observed larger and larger,” says Marzari. “These come with fancy names – such as Christmas trees and Tesla valves – so, stay tuned.”

Di Lucente has now moved to Columbia University to work in the team of Michele Simoncelli, who was involved in the earlier studies that led to the viscous heat equations. And Marzari is moving to the Cavendish Laboratory at the University of Cambridge, where he has been elected as the new Cavendish Professor of Physics.

Surface contamination holds the key to a static electricity mystery

When Scott Waitukaitis set out to understand a puzzling aspect of static electricity, he didn’t expect to find the answer in a substance colloquially known as “schmutz”. But after a painstaking series of experiments, Waitukaitis and colleagues at the Institute of Science and Technology Austria (ISTA) have strong evidence that carbon-based surface contaminants – in other words, schmutz – are, in fact, the determining factor in how electric charge flows when certain types of insulating materials come into contact. By clearing up this mystery, the ISTA scientists say that their findings, which are published today in Nature, could shed fresh light on phenomena such as lightning and protoplanetary disk formation where static electricity plays a significant role.

If you’ve ever felt a shock after rubbing your hair with a balloon or shuffling across a carpet, you’ll know that static electricity can be a real pain. But for the scientists who study it, the pain runs much deeper. “Experimentally, it’s really hard,” Waitukaitis says. “There’s just a tonne of problems with this topic.”

During his talk at the APS Global Physics Summit in Denver, Colorado this week, Waitukaitis listed a few of these problems. Measuring an object’s charge is tricky. You can’t tell whether surface charge is coming from electrons or ions. And whenever you touch the object, you change its charge in unpredictable ways. Because of these complications, Waitukaitis says that even the most careful experiments are plagued by systematic effects.

Acoustic levitation to the (partial) rescue

To avoid the worst of these effects, an experimental team led by Galien Grosjean built an apparatus that uses sound waves to suspend a tiny sphere of silicon dioxide above a plate made from the same material. Turning this acoustic potential off and on again enabled the team to drop the initially neutral sphere onto the plate and “catch” it on the rebound without actually touching it. By applying a varying electric field to the sphere and measuring how it oscillates during its “catch” phase, they could also measure how much charge the sphere gained or lost via contact with the plate to within 500 electrons.

It wasn’t easy, though. “Bouncing this tiny sphere on a plate and catching it again is tricky enough to achieve once, but to understand the charging behaviour, we need to repeat this hundreds or even thousands of times in a row, without ever losing the particle,” Grosjean tells Physics World.

Preparing the samples was also challenging, he continues. “We were looking for what could possibly cause same-material samples to charge differently, so it was absolutely crucial for the samples to be prepared in exactly the same conditions.”

After these careful preparations, Grosjean, Waitukaitis and colleagues observed a curious pattern. Although every individual sphere charged in a consistent way with every individual plate, some spheres became more positively charged with each bounce, while others became more negatively charged. In effect, the spheres behaved as if they were made of completely different materials.

Spectroscopic investigations

Suspecting that surface contamination could be the culprit, the ISTA scientists tried cleaning the spheres and plates with plasma and baking them at 200 °C. The results were stark: the “clean” spheres all became more negatively charged with each bounce, regardless of how they’d responded before. Then, over the next day or so, the previous pattern reemerged. Whatever they’d removed, it was obviously coming back.

At this point, the ISTA scientists started working with spectroscopists to identify what, exactly, was on the surfaces of their spheres. The answer? Carbon, in the form of carbon dioxide, methane and various longer-chain carbon-rich molecules. “We never get the same cocktail of carbon on the surface twice, but the fact that it’s there really matters,” Waitukaitis says. By adding or removing carbon to their spheres, he adds, “we can make everything that was charging positively charge negatively and vice versa.”

What they can’t do – at least, not yet – is explain exactly how this carbon schmutz changes the spheres’ charging behaviour. “Everybody starts on this topic thinking, ‘I’m an awesome physicist, I can kick its butt in less than two years,’” Waitukaitis says. “That’s not the case.”

In a future experiment, the ISTA scientists plan to deliberately dope their spheres with specific functional groups of carbon, in effect creating their own, tailored versions of schmutz to study. “Schmutz is a pain, but for us, it’s the thing that matters,” Waitukaitis concludes.

Quantum physicists Charles Bennett and Gilles Brassard win $1m Turing Award

The quantum physicists Charles Bennett and Gilles Brassard have been awarded the 2025 ACM Turing Award “for their essential role in establishing the foundations of quantum information science and transforming secure communication and computing”.

The Turing Award, often referred to as the “Nobel Prize in Computing,” is awarded by the Association for Computing Machinery (ACM) and carries a $1m prize. The award is named after Alan Turing, the British mathematician who formulated the mathematical basis of computing.

For over 40 years, Bennett and Brassard have played a crucial role in the foundations of quantum information science, in particular establishing a quantum cryptography protocol in the 1980s.

Classical cryptography today is a vital part of computer and communication networks, protecting everything from business e-mails to bank transactions.

Information is kept secret using an encryption algorithm together with a secret “key” that the sender uses to scramble a message into a form that cannot be understood by an eavesdropper. The recipient then uses the same key with a decryption algorithm to read the message.
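The symmetric scheme described above can be illustrated with a toy XOR cipher, in the spirit of a one-time pad: the same key both scrambles and unscrambles the message. This is a minimal sketch for illustration only (the `xor_cipher` helper and the hard-coded key are invented, not any real cryptographic library):

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """One-time-pad-style XOR: applying the same key twice recovers the message."""
    return bytes(d ^ k for d, k in zip(data, key))

# A secret key shared (somehow!) by sender and recipient
key = bytes([42, 7, 99, 180, 11, 250, 3, 88, 129, 17, 64, 200, 5])

message = b"meet at noon"
ciphertext = xor_cipher(message, key)        # unreadable to an eavesdropper
recovered = xor_cipher(ciphertext, key)      # same key decrypts
assert recovered == message
```

The "somehow!" in the comment is exactly the key-distribution problem the next paragraph raises, and the one that QKD solves.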

The issue with standard encryption is that the key must be known to both parties, which raises the problem of how to distribute the key securely.

Quantum cryptography, or quantum key distribution (QKD), however, provides an automated method for distributing secret keys using standard communication fibres. Based on the principles of quantum mechanics, QKD is inherently secure and allows the key to be changed frequently, reducing the threat of key theft.

The first method for distributing secret keys encoded in quantum states was proposed in 1984 by Bennett, working at IBM Research, and Gilles Brassard at the University of Montreal.

In their “BB84” protocol, a bit of information is represented by the polarization state of a single photon – “0” by horizontal and “1” by vertical, for example. The sender transmits a string of polarized single photons to the receiver and, by carrying out a series of measurements, they are able to establish a shared key and to test whether an eavesdropper has intercepted any information en route.
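The sifting and eavesdropper test at the heart of BB84 can be sketched in a short simulation. This is a toy model, not the authors' protocol implementation: the function name, the two-character basis labels and the intercept-resend attack are illustrative assumptions. It shows why eavesdropping betrays itself – measuring in the wrong basis randomizes roughly a quarter of the sifted bits:

```python
import random

def bb84(n, eavesdrop=False, seed=1):
    """Toy BB84: random bits sent in random bases; 'sifting' keeps only the
    rounds where sender and receiver happen to choose the same basis."""
    rng = random.Random(seed)
    key_a, key_b = [], []
    for _ in range(n):
        bit = rng.randint(0, 1)          # sender's random bit
        basis = rng.choice('+x')         # sender's random polarization basis
        photon_bit, photon_basis = bit, basis
        if eavesdrop:
            # Intercept-resend attack: measure in a random basis, then re-send.
            # A wrong basis choice randomizes the bit.
            e_basis = rng.choice('+x')
            if e_basis != photon_basis:
                photon_bit = rng.randint(0, 1)
            photon_basis = e_basis
        r_basis = rng.choice('+x')       # receiver's random measurement basis
        measured = photon_bit if r_basis == photon_basis else rng.randint(0, 1)
        if r_basis == basis:             # public basis comparison ("sifting")
            key_a.append(bit)
            key_b.append(measured)
    errors = sum(a != b for a, b in zip(key_a, key_b))
    return len(key_a), errors

print(bb84(10_000))                  # no eavesdropper: zero sifted-key errors
print(bb84(10_000, eavesdrop=True))  # intercept-resend: ~25% sifted-key errors
```

Comparing a random subset of the sifted key over the public channel reveals that elevated error rate, which is the eavesdropping test described above.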

The BB84 protocol not only tests for eavesdropping, but also guarantees that sender and receiver can establish a secret key even if an eavesdropper has determined some of the bits in their shared binary sequence, using a technique called “privacy amplification”.

In 1993 Bennett and Brassard, along with other collaborators, introduced the concept of quantum teleportation, demonstrating how an arbitrary quantum state could be transmitted between distant parties using quantum entanglement and classical communication.

Subsequent work by the duo also led to the development of scalable quantum communication, an effort that continues today. “Bennett and Brassard fundamentally changed our understanding of information itself,” notes ACM president Yannis Ioannidis. “Their insights expanded the boundaries of computing and set in motion decades of discovery across disciplines. The global momentum behind quantum technologies today underscores the enduring importance of their contributions.”

A life in science

Bennett was born on 7 April 1943 in New York, US. After earning his Bachelor’s degree from Brandeis University in 1964 and his PhD from Harvard University in 1971, he moved to Argonne National Laboratory. In 1972, Bennett joined IBM Research, where he has remained since.

Brassard was born on 20 April 1955 in Montreal, Canada. He earned his Bachelor’s and Master’s degrees from the University of Montreal and his PhD in theoretical computer science from Cornell University in 1979. He joined the faculty of the University of Montreal shortly thereafter and has remained there since.

As well as the Turing Award, both Brassard and Bennett have been awarded the Wolf Prize in Physics, the Micius Quantum Prize and the Breakthrough Prize in Fundamental Physics.

Superfluid plasmon appears in a two-dimensional superconductor

A collective mode of electrons predicted to exist in high-temperature superconductors, but difficult to observe in experiments, has been identified by physicists at the Massachusetts Institute of Technology (MIT). The finding could advance our understanding of these materials, they say.

According to the Bardeen–Cooper–Schrieffer (BCS) theory, superconductivity occurs when electrons in a material overcome their mutual electrical repulsion to form electron pairs. These Cooper pairs, as they are known, can then travel unhindered through the material as a supercurrent without scattering off phonons (quasiparticles arising from vibrations of the material’s crystal lattice) or other impurities.

Cooper pairing is characterized by a tell-tale energy gap near the Fermi level, which is the highest energy level that electrons can occupy in a solid at absolute zero temperature. This gap is equivalent to the minimum energy required to break up a Cooper pair, and identifying it is regarded as unequivocal proof of a material’s superconducting nature.

In high-temperature cuprate superconductors, which are layered materials, the Cooper pairs are confined to two-dimensional copper–oxygen (CuO2) planes that are only weakly coupled to each other. Using terahertz (THz) spectroscopy at millielectronvolt energies, which are lower than the superconducting gap of the material, researchers can study the collective oscillations of these conduction electrons, known as plasmons, that travel perpendicular to the superconducting layers. They can do this because the plasmons interact strongly with light.

However, doing the same for the electrons within the CuO2 layers themselves is not so easy. This is because the collective electron behaviour occurs at energies that are much higher than the superconducting gap.

A 2D superfluid plasmon

Now, a team of physicists led by Nuh Gedik say they have succeeded in identifying a 2D superfluid plasmon in the layered superconductor bismuth strontium calcium copper oxide (or “BSCCO”) using a new THz microscope they developed in their laboratory. This plasmon has energies that are lower than the superconducting gap of the material.

THz radiation spans wavelengths from 30 μm (10 THz) in the mid-infrared part of the electromagnetic spectrum to 1–3 mm (0.1–0.3 THz) in the microwave domain. This is much larger than the size of atoms and molecules, so THz light cannot be used to resolve microscale structures. To overcome this fundamental diffraction limit, which restricts spatial resolution to roughly half of the wavelength of the light being used, Gedik and colleagues used spintronic emitters, which are devices that produce sharp pulses of THz light.
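The wavelengths quoted above follow directly from λ = c/f, and the roughly half-wavelength diffraction limit follows with them. A quick back-of-envelope check (my own arithmetic, not the team's analysis):

```python
# Wavelength and rough diffraction-limited resolution across the THz band
# quoted in the article (0.1-10 THz), using lambda = c / f.
C = 2.998e8  # speed of light in m/s

for f_thz in (0.1, 0.3, 1.0, 10.0):
    lam_um = C / (f_thz * 1e12) * 1e6      # wavelength in micrometres
    print(f"{f_thz:5.1f} THz -> wavelength {lam_um:7.1f} um, "
          f"~half-wavelength limit {lam_um / 2:7.1f} um")
```

At 10 THz the wavelength is about 30 μm and at 0.1 THz about 3 mm, matching the range in the article; even the best-case limit of ~15 μm is far too coarse for microscale features, hence the near-field approach described next.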

The researchers explain that when they shone this light on the multi-layered BSCCO, it triggered a cascade of effects in the electrons within each layer. By placing their sample, held at ultracold temperatures so that it became a superconductor, close to (in the near field of) the spintronic emitter, they were able to trap the THz light before it had time to spread, thereby “squeezing” it into a space much smaller than its wavelength. In this regime, the light can bypass the diffraction limit and resolve features previously too small to observe – in this case, collective THz oscillations of superconducting electrons within the material. Such a “jiggling” superfluid, as the researchers have dubbed it, was predicted to exist but never directly visualized until now.

“The new mode of electrons we have seen will provide a novel way of studying high temperature superconductivity in these systems,” says Gedik, “and we will now be looking into how this collective mode changes as a function of temperature, doping and sample geometry.”

“The tool that we built could also be used to study the properties of 2D materials other than high temperature superconductors – for example, the optical behaviour in the THz regime for many small samples and heterostructures,” he tells Physics World.

The present work is detailed in Nature.

Will the demise of the US penny damage science education?

Let us mourn the demise of the American penny. With each of the one-cent coins costing about three cents to make, it was “wasteful” to keep producing them, pronounced President Trump. US pennies won’t vanish soon. While the last was minted in November 2025, about 250 billion will remain in circulation for a time despite the rising number of cash-free transactions.

The US penny has been around since 1793. Lamenting its passing is faintly obscene compared to other things that the US government has done lately, such as terminating science agencies, cutting jobs, and slashing budgets, environmental regulations and vaccine research. But I can’t stop thinking about what the penny meant to my own science education.

Science collaborators

Pennies, which until the early 1980s were 95% copper, taught me about corrosion. I learned, for instance, that the Statue of Liberty’s green colour is due to oxidized copper. At school, we were taught how to make pennies a light shade of green by immersing them in salt and vinegar; a plant food such as Miracle-Gro works even better as it contains ammonia. We were then instructed to figure out how to clean off the green, discovering that an acid like lemon juice did the trick.

When I placed a drop of water on the surface of a penny, the dome-like shape it adopted – caused simply by surface tension – was an impressive sight. My first lessons on ions, meanwhile, involved placing pennies and steel nails in a bath of salt and vinegar: the nails got electroplated with copper; the pennies with zinc.

We also had to determine the density of pennies, which are 19 mm in diameter and 1.52 mm thick, by submerging them in a graduated cylinder to find their volume and then weighing them to determine their mass. From 1983 – years after my high-school career – this exercise turned more interesting still because pennies became 97.5% zinc and were only plated with copper, so you had to be eagle-eyed to tell old and new apart.
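The classroom calculation above can be sketched from the quoted numbers. One caveat and one assumption: treating the penny as a plain cylinder overestimates its volume (the 1.52 mm thickness includes the raised rim), and the pre-1982 mass of 3.11 g is my own figure, not one given in the article:

```python
import math

# Penny dimensions quoted in the article: 19 mm diameter, 1.52 mm thick.
diameter_mm, thickness_mm = 19.0, 1.52
volume_cm3 = math.pi * (diameter_mm / 20) ** 2 * (thickness_mm / 10)

# Post-1983 mass (2.50 g) is from the article; pre-1982 mass is assumed.
for label, mass_g in [("zinc-core, post-1983", 2.50),
                      ("copper, pre-1982 (assumed 3.11 g)", 3.11)]:
    print(f"{label}: {mass_g / volume_cm3:.2f} g/cm^3")
```

Both estimates come out below the bulk densities of zinc (7.14 g/cm³) and copper (8.96 g/cm³) because of the overestimated volume, but the gap between old and new pennies is still easy to spot, which is what made the exercise work.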

Pennies were indispensable lab props too. All the kilogram weights for mechanics experiments were bags of 400 pennies (the 1983+ penny weighing exactly 2.50 g). They were great for coin-tossing in statistics classes too, although I assume other coins gave the same result, even if that was an experiment we never tried.

The humble penny wasn’t some piece of lab equipment manufactured by an educational company but a familiar part of our world

The humble penny was effective for all these uses because it wasn’t some piece of lab equipment manufactured by an educational company but a familiar part of our world. The coins were cheap and available, and nobody cared if you lost them or took a few home.

You could stick pennies under a leg to prop up a wobbly table. They made makeshift washers if you punched a hole in them and inserted nails or screws. If you were bored by the hand-cranked penny-squishing machines at tourist sites and amusement parks, whose results are fully predictable, a more exciting way to deface currency was to lay pennies on railroad tracks and hunt for the results in the stones after the train had passed (though never do this, because it’s dangerous).

Unit value

Pennies taught me something indirectly. After a breakup, my ex left behind some clothing, a cat and a large bowl containing literally thousands of pennies. The clothes I could throw out and I had to learn to love the cat. But the bowl?

I tried to put it at the bottom of my closet, but the damned thing continued to haunt me. Should I toss the bowl and its contents in the garbage? Wasteful, un-environmental and avoidant. Stuff the pennies into 50-cent coin-roll holders or take them to a coin-counting kiosk in a bank, and then present them to a teller? Psychologically unsatisfying.

No, I had to deal with the pennies by doing with them what they were meant for: I had to spend them. At a bar one night I tried to pay the tab using all pennies. They were legal tender, right? The bouncer was summoned. One night a taxi driver furiously threw my pennies back at me, accusing me of treating him like a waiter. I was astonished that he thought I was disrespecting him rather than engaged in post-breakup, self-absorbed infantile behaviour.

I managed to befriend a sympathetic newsstand worker who, a few times a week, was willing to let me buy the New York Times all in pennies

I could only use the fundamental unit of US currency in anomalous circumstances that I had to generate myself. In those days the New York Times cost 35 cents and I managed to befriend a sympathetic newsstand worker who, a few times a week, was willing to let me buy it all in pennies. He’d cheerfully greet me with “Here come my pennies!” and claimed I was becoming a better person now I was greeting vendors with smiles, not scowls.

I learned to work the monetary system methodically. For things that cost a little over 25 cents, I used a quarter and then pennies for what was left; for things that cost a little over a dollar I handed over a bill and the rest in pennies. I’d choreograph my purchases in advance so that I could use the appropriate lesser unit of currency plus pennies.

I often exploited the fact that sales tax in the US is only added on at the till, which means that something priced $2 with a sales tax of 6.5% will be $2.13 when you pay the cashier. So I’d hunt around in my pocket for a moment, and then in feigned chagrin say that I only had pennies, and hand the cashier the 13 of them that I had carefully calculated beforehand would be needed.
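The till arithmetic above is easy to check. A small sketch (the helper name is made up for illustration):

```python
def total_and_pennies(price_dollars, tax_rate):
    """Total at the till (US sales tax is added at the point of sale) and
    the pennies needed once the whole dollars are covered by bills."""
    total_cents = round(price_dollars * 100 * (1 + tax_rate))
    return total_cents / 100, total_cents % 100

print(total_and_pennies(2.00, 0.065))  # (2.13, 13), as in the article
```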

Soon, by keeping careful track of my purchases, I managed to spend an average of about 200 pennies a week. From day to day and even week to week the pile in the bowl barely dwindled. But, finally, after a little less than a year only a handful remained. I was thrilled – it was better than seeing a therapist.

The critical point

You might think that the moral I’m about to draw is the need for faith in incremental change – that, penny by penny, you can move mountains. That’s certainly the lesson teachers urge on you if you’re learning a foreign language or playing a musical instrument.

No, I was instead moved by the humbler experience of valuing an entire system of units moored to a stable, familiar, simple but all-important base unit that you can literally count on.

I still value that lesson, though it’s less concrete than what I learned from corroded, nailed or squished pennies.

Copyright © 2026 by IOP Publishing Ltd and individual contributors