
Top 10 Breakthroughs of the Year in physics for 2024 revealed

Physics World is delighted to announce its Top 10 Breakthroughs of the Year for 2024, which includes research in nuclear and medical physics, quantum computing, lasers, antimatter and more. The Top Ten is the shortlist for the Physics World Breakthrough of the Year, which will be revealed on Thursday 19 December.

Our editorial team has looked back at all the scientific discoveries we have reported on since 1 January and has picked 10 that we think are the most important. In addition to being reported in Physics World in 2024, the breakthroughs must meet the following criteria: 

  • Significant advance in knowledge or understanding 
  • Importance of work for scientific progress and/or development of real-world applications 
  • Of general interest to Physics World readers 

Here, then, are the Physics World Top 10 Breakthroughs for 2024, listed in no particular order. You can listen to Physics World editors make the case for each of our nominees in the Physics World Weekly podcast. And, come back next week to discover who has bagged the 2024 Breakthrough of the Year. 

Light-absorbing dye turns skin of live mouse transparent

Zihao Ou holds a vial of the common yellow food dye tartrazine in solution

To a team of researchers at Stanford University in the US for developing a method to make the skin of live mice temporarily transparent. One of the challenges of imaging biological tissue using optical techniques is that tissue scatters light, which makes it opaque. The team, led by Zihao Ou (now at The University of Texas at Dallas), Mark Brongersma and Guosong Hong, found that the common yellow food dye tartrazine strongly absorbs near-ultraviolet and blue light and can help make biological tissue transparent: the absorption raises the refractive index of the watery parts of the tissue at longer wavelengths, reducing the index mismatch – and hence the light scattering – between the aqueous and fatty components. Applying the dye onto the abdomen, scalp and hindlimbs of live mice enabled the researchers to see internal organs, such as the liver, small intestine and bladder, through the skin without requiring any surgery. They could also visualize blood flow in the rodents’ brains and the fine structure of muscle sarcomere fibres in their hind limbs. The effect can be reversed by simply rinsing off the dye. This “optical clearing” technique has so far only been demonstrated in animals, but if extended to humans it could help make some types of invasive biopsy a thing of the past.

Laser cooling positronium 

To the AEgIS collaboration at CERN, and Kosuke Yoshioka and colleagues at the University of Tokyo, for independently demonstrating laser cooling of positronium. Positronium, an atom-like bound state of an electron and a positron, is created in the lab to allow physicists to study antimatter. Currently, it is created in “warm” clouds in which the atoms have a large distribution of velocities, making precision spectroscopy difficult. Cooling positronium to low temperatures could open up novel ways to study the properties of antimatter. It also enables researchers to produce one to two orders of magnitude more antihydrogen – an antiatom comprising a positron and an antiproton that’s of great interest to physicists. The research also paves the way to use positronium to test current aspects of the Standard Model of particle physics, such as quantum electrodynamics, which predicts specific spectral lines, and to probe the effects of gravity on antimatter. 

Modelling lung cells to personalize radiotherapy

To Roman Bauer at the University of Surrey, UK, Marco Durante from the GSI Helmholtz Centre for Heavy Ion Research, Germany, and Nicolò Cogno from GSI and Massachusetts General Hospital/Harvard Medical School, US, for creating a computational model that could improve radiotherapy outcomes for patients with lung cancer. Radiotherapy is an effective treatment for lung cancer but can harm healthy tissue. To minimize radiation damage and help personalize treatment, the team combined a model of lung tissue with a Monte Carlo simulator to model the irradiation of alveoli (the tiny air sacs within the lungs) at microscopic and nanoscopic scales. Based on the radiation dose delivered to each cell and its distribution, the model predicts whether each cell will live or die, and determines the severity of radiation damage hours, days, months or even years after treatment. Importantly, the researchers found that their model delivered results that matched experimental observations from various labs and hospitals, suggesting that it could, in principle, be used within a clinical setting.
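The published model is far more detailed than this, coupling a simulated lung-tissue architecture to a Monte Carlo dose engine, but the basic step of turning a per-cell dose into a live-or-die prediction can be illustrated with the standard linear-quadratic survival model. The sketch below is purely illustrative: the alpha and beta values and the dose distribution are placeholders, not parameters from the study.

```python
import numpy as np

# Illustrative sketch only: the published model couples an agent-based lung-tissue
# simulation with a Monte Carlo dose engine. Here the standard linear-quadratic (LQ)
# cell-survival model shows, in principle, how a per-cell dose can be converted into
# a live/die prediction. The alpha/beta values are generic placeholders.
rng = np.random.default_rng(seed=1)

alpha = 0.15   # Gy^-1, hypothetical
beta = 0.05    # Gy^-2, hypothetical

# Hypothetical per-cell doses (Gy) for 1000 alveolar cells, e.g. from a dose map
doses = rng.normal(loc=2.0, scale=0.5, size=1000).clip(min=0)

# LQ model: probability that a cell survives a dose D
survival_prob = np.exp(-(alpha * doses + beta * doses**2))

# Sample each cell's fate from its survival probability
alive = rng.random(doses.size) < survival_prob

print(f"Predicted surviving fraction: {alive.mean():.2%}")
```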

A semiconductor and a novel switch made from graphene 


To Walter de Heer, Lei Ma and colleagues at Tianjin University and the Georgia Institute of Technology, and independently to Marcelo Lozada-Hidalgo of the University of Manchester and a multinational team of colleagues, for creating a functional semiconductor made from graphene, and for using graphene to make a switch that supports both memory and logic functions, respectively. The Manchester-led team’s achievement was to harness graphene’s ability to conduct both protons and electrons in a device that performs logic operations with a proton current while simultaneously encoding a bit of memory with an electron current. These functions are normally performed by separate circuit elements, which increases data transfer times and power consumption. Conversely, de Heer, Ma and colleagues engineered a form of graphene that does not conduct as easily. Their new “epigraphene” has a bandgap that, like silicon, could allow it to be made into a transistor, but with favourable properties that silicon lacks, such as high thermal conductivity. 

Detecting the decay of individual nuclei 

To David Moore, Jiaxiang Wang and colleagues at Yale University, US, for detecting the nuclear decay of individual helium nuclei by embedding radioactive lead-212 atoms in a micron-sized silica sphere and measuring the sphere’s recoil as nuclei escape from it. Their technique relies on the conservation of momentum, and it can gauge forces as small as 10⁻²⁰ N and accelerations as tiny as 10⁻⁷ g, where g is the local acceleration due to the Earth’s gravitational pull. The researchers hope that a similar technique may one day be used to detect neutrinos, which are much less massive than helium nuclei but are likewise emitted as decay products in certain nuclear reactions.
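A back-of-envelope estimate shows why momentum conservation makes this feasible. The numbers below are illustrative assumptions (an alpha particle of roughly 6 MeV, typical of the lead-212 decay chain, and a 1 μm silica sphere), not the experiment’s exact parameters.

```python
import numpy as np

# Back-of-envelope recoil estimate (illustrative values, not the experiment's exact numbers)
MeV = 1.602e-13          # J
m_alpha = 6.64e-27       # kg, helium-4 nucleus
E_alpha = 6.0 * MeV      # ~6 MeV alpha from the lead-212 decay chain (assumed)

rho_silica = 2200.0      # kg/m^3
radius = 0.5e-6          # m, micron-sized sphere (assumed)
M_sphere = rho_silica * (4/3) * np.pi * radius**3

# Momentum of the escaping alpha particle (non-relativistic): p = sqrt(2 m E)
p = np.sqrt(2 * m_alpha * E_alpha)

# Momentum conservation: the sphere recoils with equal and opposite momentum
v_recoil = p / M_sphere
print(f"Sphere mass: {M_sphere:.2e} kg")
print(f"Alpha momentum: {p:.2e} kg m/s")
print(f"Recoil velocity of sphere: {v_recoil:.2e} m/s")
```

With these assumed values the sphere recoils at roughly 10⁻⁴ m/s – tiny, but within reach of the force and acceleration sensitivities quoted above.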

Two distinct descriptions of nuclei unified for the first time 

To Andrew Denniston at the Massachusetts Institute of Technology in the US, Tomáš Ježo at Germany’s University of Münster and an international team for being the first to unify two distinct descriptions of atomic nuclei. They have combined the particle physics perspective – where nuclei comprise quarks and gluons – with the traditional nuclear physics view that treats nuclei as collections of interacting nucleons (protons and neutrons). The team has provided fresh insights into short-range correlated nucleon pairs – fleeting interactions in which two nucleons come exceptionally close and engage in strong interactions for mere femtoseconds. The model was tested and refined using experimental data from scattering experiments on 19 nuclei with widely differing masses (from helium-3 to lead-208). The work represents a major step forward in our understanding of nuclear structure and strong interactions.

New titanium:sapphire laser is tiny, low-cost and tuneable 

To Jelena Vučković, Joshua Yang, Kasper Van Gasse, Daniil Lukin, and colleagues at Stanford University in the US for developing a compact, integrated titanium:sapphire laser that needs only a simple green LED as a pump source. They have reduced the cost and footprint of a titanium:sapphire laser by three orders of magnitude and the power consumption by two. Traditional titanium:sapphire lasers have to be pumped with high-powered lasers – and therefore cost in excess of $100,000. In contrast, the team was able to pump its device using a $37 green laser diode. The researchers also achieved two things that had not been possible before with a titanium:sapphire laser. They were able to adjust the wavelength of the laser light and they were able to create a titanium:sapphire laser amplifier. Their device represents a key step towards the democratization of a laser type that plays important roles in scientific research and industry. 

Quantum error correction with 48 logical qubits; and independently, below the surface code threshold   

The Google Quantum AI Willow chip

To Mikhail Lukin, Dolev Bluvstein and colleagues at Harvard University, the Massachusetts Institute of Technology and QuEra Computing, and independently to Hartmut Neven and colleagues at Google Quantum AI and their collaborators, for demonstrating quantum error correction on an atomic processor with 48 logical qubits, and for implementing quantum error correction below the surface code threshold in a superconducting chip, respectively. Errors caused by interactions with the environment – noise – are the Achilles heel of every quantum computer, and correcting them has been called a “defining challenge” for the technology. These two teams, working with very different quantum systems, took significant steps towards overcoming this challenge. In doing so, they made it far more likely that quantum computers will become practical problem-solving machines, not just noisy, intermediate-scale tools for scientific research. 

Entangled photons conceal and enhance images 

To two related teams for their clever use of entangled photons in imaging. Both groups include Chloé Vernière and Hugo Defienne of Sorbonne University in France, who as a duo used quantum entanglement to encode an image into a beam of light. The impressive thing is that the image is only visible to an observer using a single-photon-sensitive camera – otherwise it remains hidden from view. The technique could be used to create optical systems with reduced sensitivity to scattering, which could be useful for imaging biological tissues and long-range optical communications. In separate work, Vernière and Defienne teamed up with Patrick Cameron at the UK’s University of Glasgow and others to use entangled photons to enhance adaptive optical imaging. The team showed that the technique can produce higher-resolution images than conventional bright-field microscopy. Looking to the future, this adaptive optics technique could play a major role in the development of quantum microscopes.

First samples returned from the Moon’s far side  

To the China National Space Administration for the first-ever retrieval of material from the Moon’s far side, confirming China as one of the world’s leading space nations. Landing on the lunar far side – which always faces away from Earth – is difficult due to its distance and its terrain of giant craters with few flat surfaces. At the same time, scientists are interested in the unexplored far side and why it looks so different from the near side. The Chang’e-6 mission, launched on 3 May, consisted of four parts: an ascender, a lander, a returner and an orbiter. The ascender and lander successfully touched down on 1 June in the Apollo basin, which lies on the north-eastern side of the South Pole–Aitken basin. The lander used its robotic scoop and drill to obtain about 1.9 kg of material within 48 hours. The ascender then lifted off from the top of the lander and docked with the returner–orbiter before the returner headed back to Earth, landing in Inner Mongolia on 25 June. In November, scientists released the first results from the mission, finding that fragments of basalt – a type of volcanic rock – date back 2.8 billion years, indicating that the lunar far side was volcanically active at that time. Further scientific discoveries can be expected in the months and years ahead as scientists analyze more fragments.

 

Physics World‘s coverage of the Breakthrough of the Year is supported by Reports on Progress in Physics, which offers unparalleled visibility for your ground-breaking research.

Exploring this year’s best physics research in our Top 10 Breakthroughs of 2024

This episode of the Physics World Weekly podcast features a lively discussion about our Top 10 Breakthroughs of 2024, which include important research in nuclear physics, quantum computing, medical physics, lasers and more. Physics World editors explain why we have made our selections and look at the broader implications of this impressive body of research.

The top 10 serves as the shortlist for the Physics World Breakthrough of the Year award, the winner of which will be announced on 19 December.

Links to all the nominees, more about their research and the selection criteria can be found here.

 


Automated checks build confidence in treatment verification


Busy radiation therapy clinics need smart solutions that streamline processes while also enhancing the quality of patient care. That’s the premise behind ChartCheck, a tool developed by Radformation to facilitate the weekly checks that medical physicists perform for each patient who is undergoing a course of radiotherapy. By introducing automation into what is often a manual and repetitive process, ChartCheck can save time and effort while also enabling medical physicists to identify and investigate potential risks as the treatment progresses.

“To ensure that a patient is receiving the proper treatment, a qualified medical physicist must check a patient’s chart after every five fractions of radiation have been delivered,” explains Ryan Manger, lead medical physicist at the Encinitas Treatment Center, one of four clinics operated by UC San Diego in the US. “The current best practice is to check 36 separate items for each patient, which can take a lot of time when each physicist needs to verify 30 or 40 charts every week.”


Before introducing ChartCheck into the workflow at UC San Diego, Manger says that around 70% of the checks had to be done manually. “The weekly checks are really important for patient safety, but they become a big time sink when each task takes five or ten minutes,” he says. “It’s easy to get fatigued when you’re looking at the same things over and over again, and we have found that introducing automation into the process can have a positive impact on everything else we do in the clinic.”

ChartCheck monitors the progress of ongoing treatments by automatically performing a comprehensive suite of clinical checks, raising an alert if any issue is detected. As an example, after each treatment the tool verifies that the delivered dose matches the parameters defined in the clinical plan, while it also monitors real-time changes such as any movement of the couch during treatment. It also collates all the necessary safety documentation, allows comments or notes to be added, and highlights any scheduling changes – when a patient decides to take a treatment break, for instance, or the physician adds a boost to the clinical plan.
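ChartCheck’s internal logic is proprietary, but the kind of first-pass tolerance check described here can be sketched in a few lines. Everything below – field names, tolerance values and numbers – is an illustrative assumption, not Radformation’s implementation.

```python
# Hypothetical sketch of an automated weekly chart-check rule, in the spirit of the
# tolerance checks described above. It is not ChartCheck code; field names and
# tolerance values are illustrative assumptions.

PLANNED = {"monitor_units": 250.0, "couch_lat_cm": 2.1}   # from the treatment plan
TOLERANCE = {"monitor_units": 0.02, "couch_lat_cm": 0.3}  # 2% MU, 3 mm couch shift

def check_fraction(delivered: dict) -> list[str]:
    """Return a list of flags for any delivered value outside tolerance."""
    flags = []
    mu_dev = abs(delivered["monitor_units"] - PLANNED["monitor_units"]) / PLANNED["monitor_units"]
    if mu_dev > TOLERANCE["monitor_units"]:
        flags.append(f"MU deviation {mu_dev:.1%} exceeds {TOLERANCE['monitor_units']:.0%}")
    couch_shift = abs(delivered["couch_lat_cm"] - PLANNED["couch_lat_cm"])
    if couch_shift > TOLERANCE["couch_lat_cm"]:
        flags.append(f"Couch moved {couch_shift:.1f} cm during/between fields")
    return flags

# Example: one delivered fraction pulled from the record-and-verify system (made-up numbers)
flags = check_fraction({"monitor_units": 249.1, "couch_lat_cm": 2.6})
print(flags or "No issues flagged - chart passes first-pass filter")
```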

As well as consolidating all the information on a single platform, ChartCheck allows physicists to analyse the treatment data to identify and understand any underlying issues that might affect patient safety. “It has given us a lot more vision of what’s happening across all our treatments, which is typically around 300 per week,” says Manger. “Within just three months it has illuminated areas that we were unaware of before, but that might have carried some risk.”

What’s more, the physicists at UC San Diego have found that automating many of the routine tasks has enabled them to focus their attention where it is needed most. “We have implemented the tool as a first-pass filter to flag any charts that might need further attention, which is typically around 10–15% of the total,” says Manger. “We can then use our expertise to investigate those charts in more detail and to understand what the risk factors might be. The result is that we do a better check where it’s needed, rather than just looking at the same things over and over.”


Jennifer Scharff, lead physicist at the John Stoddard Cancer Center in Des Moines, Iowa, also values the extra insights that ChartCheck offers. One major advantage, she says, is how easy it is to check whether the couch might have moved between treatment fields. “It’s not ideal when the couch moves, but sometimes it happens if a patient coughs or sneezes during the treatment and the therapist needs to adjust the position slightly when they get back into their breath hold,” she says. “In ChartCheck it’s really easy to see those positional shifts on a daily basis, and to identify any trends or issues that we might need to address.”

ChartCheck offers full integration with ARIA, the oncology information system from Varian, making it easy to implement and operate within existing clinical workflows. Although ARIA already offers a tool for treatment verification, Scharff says that ChartCheck offers a more comprehensive and efficient solution. “It checks more than ARIA does, and it’s much faster and more efficient to do a weekly physics check,” she says. “As an example, it’s really easy to see the journal notes that our therapists make when something isn’t quite right, and it helps us to identify patients who need a final chart check when they want to pause or stop their treatment.”

The automated tool also guarantees consistency between the chart checks undertaken by different physicists, with Scharff finding the standardized approach particularly useful when locums are brought into the team. “It’s easy for them to see all the information we can see, we can be sure that they are making the same checks as we do, and the same documents are always sent for approval,” she says. “The system makes it really easy to catch things, and it calls out the same thing for everyone.”

With the medical physicists at UC San Diego working across four different treatment centres, Manger has also been impressed by the ability of ChartCheck to improve consistency between physicists working in different locations. “The human factor always introduces some variations, even between physicists who are fully trained,” he says. “Minimizing the impact of those variations has been a huge benefit that I hadn’t considered when we first decided to introduce the software, but it has allowed us to ensure that all the correct policies and procedures are being followed across all of our treatment centres.”

Overall, the experience of physicists like Manger and Scharff is that ChartCheck can streamline processes while also providing them with the reassurance that their patients are always being treated correctly and safely. “It has had a huge positive impact for us,” says Scharff. “It saves a lot of time and gives us more confidence that everything is being done as it should be.”

Patient-specific quality assurance (PSQA) based on independent 3D dose calculation


 

In this webinar, we will discuss patient-specific quality assurance (PSQA), an essential component of the radiation treatment process. This control allows us to ensure that the planned dose will be delivered to the patient. The increasing number of patients with indications for modulated treatments requiring PSQA has significantly increased the workload of medical physics departments, creating a need for more efficient ways to perform it.

Measurement systems have evolved considerably in recent years, but the experimental process involved imposes a limit on the time savings that can be achieved. Independent 3D dose calculation systems offer a solution to this problem, reducing the time needed to start treatments.

The use of 3D dose calculation systems, as stated in international recommendations (TG-219), requires a process of commissioning and adjustment of dose calculation parameters.

This presentation will show the implementation of PSQA based on independent 3D dose calculation for VMAT treatments in breast cancer using DICOM information from the plan and LOG files. Comparative results with measurement-based PSQA systems will also be presented.
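Comparisons between calculation-based and measurement-based PSQA typically rely on metrics such as the gamma index, which combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. The 1D sketch below is a generic illustration with assumed 3%/3 mm criteria and made-up dose profiles; it is not the presenter’s software.

```python
import numpy as np

# Minimal 1D gamma-index sketch for comparing two dose distributions, a standard
# PSQA comparison metric. Criteria (3% global dose difference, 3 mm DTA) and the
# example profiles are illustrative assumptions, not data from the webinar.

def gamma_1d(x, dose_ref, dose_eval, dose_crit=0.03, dta_mm=3.0):
    """Return the gamma value at each reference point (global normalization)."""
    d_max = dose_ref.max()
    gamma = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dist_term = ((x - xi) / dta_mm) ** 2
        dose_term = ((dose_eval - di) / (dose_crit * d_max)) ** 2
        gamma[i] = np.sqrt(np.min(dist_term + dose_term))
    return gamma

x = np.linspace(0, 100, 201)                      # positions in mm
dose_ref = 2.0 * np.exp(-((x - 50) / 20) ** 2)    # "measured" profile (made up)
dose_eval = 1.02 * dose_ref                       # independent calculation, 2% high

g = gamma_1d(x, dose_ref, dose_eval)
print(f"Gamma passing rate (gamma <= 1): {(g <= 1).mean():.1%}")
```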

An interactive Q&A session follows the presentation.


Dr Daniel Venencia is the chief of the medical physics department at Instituto Zunino – Fundación Marie Curie in Cordoba, Argentina. He holds a BSc in physics and a PhD from the Universidad Nacional de Córdoba (UNC), and has completed postgraduate studies in radiotherapy and nuclear medicine. With extensive experience in the field, Daniel has directed more than 20 MSc and BSc theses and three doctoral theses. He has delivered more than 400 presentations at national and international congresses. He has published in prestigious journals, including the Journal of Applied Clinical Medical Physics and the International Journal of Radiation Oncology, Biology and Physics. His work continues to make significant contributions to the advancement of medical physics.

Carlos Bohorquez, MS, DABR, is the product manager for RadCalc at LifeLine Software Inc., a part of the LAP Group. An experienced board-certified clinical physicist with a proven history of working in the clinic and the medical device industry, Carlos demonstrates his passion for clinical quality assurance through the ongoing research and development of RadCalc.

 

Scientists braced for Donald Trump’s second term as US president

Before Donald Trump takes the oath of office for a second term as US president on 20 January, the US scientific community is preparing for what the next four years may look like. Many already have a sense of trepidation given his track record from his first term in office. There are concerns, for example, about his nominations for cabinet and other key positions. Others are worried about the role that SpaceX boss Elon Musk will play as the head of a new “department of government efficiency”.

Neal Lane, a senior fellow in science and technology at Rice University’s Baker Institute and science adviser to former president Bill Clinton, told Physics World that he doesn’t see “any good news for science, especially any fields or studies that seem to be offensive to important segments of Trump’s supporter base”. Lane says that includes research related to “climate change, reproduction, gender and any other aspects of diversity, environmental protection and justice, biodiversity, public health, vaccinations, most fields of the social sciences, and many others”.

John Holdren, who was science adviser to Barack Obama and is a member of Harvard University’s Kennedy School and the Woodwell Climate Research Center, is equally pessimistic. “The stated intentions of president-elect Trump and his acolytes concerning energy and climate policies are deeply dismaying,” he says. “If history is any guide, Trump will also try to put a large crimp in federal research on climate science and advanced clean energy.”


During his first term in office between 2017 and 2021, Trump tried to ban immigration from Muslim-majority countries and created the China Initiative that led to charges against some US scientists for collaborations with colleagues in Chinese universities. He also famously used a Sharpie pen to change the apparent course of hurricane Dorian on a National Weather Service map, resulting in consternation from researchers.

When COVID-19 emerged, he suggested ineffective and possibly dangerous treatments for it and had a fraught relationship with Anthony Fauci, who was then in charge of the country’s response to the pandemic. Lane says that the administration is likely to continue “to downplay evidence-based science in setting policies and allow misinformation” to be published on agency websites. “That would result not only in damage to the integrity of US science, but to the trust the American public places in science,” Lane adds. “Ultimately, it could affect people’s lives and livelihoods.”

On the other hand, under Trump’s stewardship, the COVID-19 vaccine was developed at record-breaking speed, and while it took 18 months in office before he nominated a science adviser, his pick of meteorologist Kelvin Droegemeier was generally applauded by the scientific community. Funding for science also increased during Trump’s first term. “We saw budgets for science agencies go up due to a variety of factors, so that’s something we hope for again,” says Jennifer Grodsky, Boston University’s vice-president for federal relations.

And the nominees are…

In Trump’s first term, various members of his presidential staff and cabinet managed to dissuade him from pursuing some more unorthodox ideas related to science and medicine. And when they failed to do so, Congress acted as a hard brake. The Senate has a constitutional responsibility to advise the president on (and consent by a simple majority to) presidential nominations for cabinet positions, ambassadorships and other high offices. Since the new Senate will take office on 3 January with a Republican majority of 53 to 47 Democrats, many Trump nominees will likely be ready to take office when he becomes president on 20 January.

Most nominees for posts, however, are fully behind Trump’s desire to “drain the swamp” of Washington’s “politics as usual” and have some non-mainstream views on science. Stanford University health economist Jay Bhattacharya, for example, who has been picked to lead the National Institutes of Health, was a vocal critic of the US response to the COVID-19 pandemic who stated that lockdowns caused irreparable harm. Vaccine sceptic Robert F Kennedy Jr, an environmental lawyer, has been chosen to head the Department of Health and Human Services, while Marty Makary, a Johns Hopkins University surgeon and cancer specialist who shares many of Kennedy’s attitudes about health, has been tapped to lead the Food and Drug Administration.

While the nominees to head environmental and energy agencies are more mainstream candidates, they could – if approved – implement significant changes in policy from the Biden administration. Trump wants, for example, to open protected areas to drilling and mining. He also aims to take the US out of the Paris Accord on climate change for a second time – after Biden reversed the first withdrawal.

As a sign of things to come, Trump has already nominated Lee Zeldin as administrator of the Environmental Protection Agency (EPA). A former Republican Congressman from New York and a critic of much environmental legislation, Zeldin says that the EPA will “restore US energy dominance” while “protecting access to clean air and water”. But his focus on pro-business deregulation is set to dismay environmentalists who applauded the Biden administration’s EPA ban on several toxic substances and limitation on the amounts of “forever” chemicals in water.


When it comes to energy, Trump has nominated Chris Wright, founder and chief executive officer of the Denver-based fracking company Liberty Energy, to head the Department of Energy. While Wright accepts that fossil fuels contribute to global warming, he has also referenced scientific studies that support his claim that climate change “alarmists” are wrong about the impact of a warmer world. If approved, Wright will participate in a new National Energy Council that Department of the Interior nominee Doug Burgum will chair. A software company billionaire and current governor of North Dakota, Burgum has mirrored Wright in accusing the “radical left” of engaging in a war against US energy to reduce climate change.

Lane predicts that the Trump administration will even try to privatize agencies within government departments, potentially including the National Oceanic and Atmospheric Administration, which is part of the US Department of Commerce. “That could result in forcing people to pay to get timely weather reports,” he says. “To find out, for example, where a hurricane is headed or to receive better tornado warnings.”

Space Force

A different threat to science comes from Musk’s department of government efficiency, which he will run together with the biotechnology billionaire Vivek Ramaswamy. As the owner of SpaceX, Starlink and Tesla, Musk – currently the world’s richest person – asserts that the department can cut $2 trillion from the roughly $6.5 trillion annual US government budget. While some are sceptical of that pledge, scientists fear the effort could target science-related agencies. The Department of Education, for example, could be shut down – several prominent Republicans, including Trump, have already called for its elimination.

Another possible target for budget cuts is NASA, which is already in financial trouble, having been forced to postpone the next lunar Artemis mission to April 2026 and the planned crewed Moon landing to mid-2027. Trump has nominated Jared Isaacman – a billionaire associate of Musk – as the agency’s administrator. Co-founder of the aerospace firm Draken International, Isaacman developed and financed September’s Polaris Dawn mission, in which he and three other private astronauts were taken into orbit by Musk’s SpaceX rockets. If confirmed in office, Isaacman is expected to expand existing links between NASA and the commercial space sector.

Trump’s second term could also affect collaborations between US and foreign scientists. A return of the China Initiative, which Biden rescinded, seems possible, and Trump has promised to continue the hard line against immigrants that marked his first term in office. Some university leaders have already warned overseas students not to travel home during the winter break in case they are not allowed back into the US. “New executive orders that may impact travel may be implemented,” a statement by the leadership of the Massachusetts Institute of Technology noted. “Any processing delays could impact students’ ability to return to the US as planned.”

As the Biden administration departs and Trump is sworn into office on 20 January, many scientists will be hopeful, but unconvinced, that science is heading in the right direction. For Holdren, the next four years simply promises to be a rocky time. “Swimming against the tide of a hostile White House will not be easy,” he adds. “Let us hope all who understand the challenge will rise to it.”

Unclear nature: anthropological study of CERN is a missed opportunity to bridge physics and social sciences

When I was asked to review Unfinished Nature: Particle Physics at CERN, a new ethnography of CERN by Arpita Roy, an anthropologist at the University of California Berkeley, US, I was excited. Having recently completed a PhD in science communication where I studied the researchers at CERN – albeit from a very different, quantitative-heavy perspective – the subject is close to my heart.

Roy spent two and a half years doing fieldwork at CERN, around the time of the discovery of the Higgs boson in 2012. The book examines this event through an anthropological lens, asking questions such as how are scientific advances made and how do scientists understand their work? Unfortunately, although I read many books and papers of a similar nature for my doctoral studies, I struggled with Unfinished Nature.

A good book makes you pause to reflect. You may find yourself enlightened by the author’s perspectives or disagree with their arguments, but comprehension is key in either case. A book that has you stumbling through the pages without clarity, re-reading sentences over and over again in an effort to make sense of them, is frustrating. I may lack the expertise to appreciate the finer points of the subject, but I struggled despite repeated, earnest attempts to read the book with the care and attention the topic deserves.

Take the following snippet from the first page of the introduction, which sets the tone for what is to come: “But what has been lost to sight is the elucidation of how a science like particle physics may incorporate elements into its domain beyond what its epistemic assumption would lead us to expect, which deepens the mystery of what logic of classification it obeys. It is far from easy, however, to explicate the notion of classification, if only for the reason that it engenders notions of system, category, or context whose lucidity is hard to pinpoint in the scientific realm.” While I eventually understood (or at least think I did) what Roy is trying to say, the phrasing is unnecessarily convoluted.

None of this is criticism of Roy as a researcher but reflects the seemingly intentionally confusing language that academics – and my fellow social scientists in particular – are expected to use, despite increased calls to make research more accessible to those without specialist knowledge.

The ideas and stories Roy covers are no doubt interesting, even if the book itself isn’t an easy read. Unfinished Nature is more suited to the invested social scientist familiar with the particular flavour of academic prose adopted by anthropologists than physicists or physics enthusiasts indulging a more superficial interest in the lives of researchers at CERN.

  • 2024 Columbia University Press 296pp £30.00

Quantum processor enters unprecedented territory for error correction

Researchers at Google Quantum AI and collaborators have developed a quantum processor with error rates that get progressively smaller as the number of quantum bits (qubits) grows larger. This achievement is a milestone for quantum error correction, as it could, in principle, lead to an unlimited increase in qubit quality, and ultimately to an unlimited increase in the length and complexity of the algorithms that quantum computers can run.

Noise is an inherent feature of all physical systems, including computers. The bits in classical computers are protected from this noise by redundancy: some of the data is held in more than one place, so if an error occurs, it is easily identified and remedied. However, the no-cloning theorem of quantum mechanics dictates that once a quantum state is measured – a first step towards copying it – it is destroyed. “For a little bit, people were surprised that quantum error correction could exist at all,” observes Michael Newman, a staff research scientist at Google Quantum AI.

Beginning in the mid-1990s, however, information theorists showed that this barrier is not insurmountable, and several codes for correcting qubit errors were developed. The principle underlying all of them is that multiple physical qubits (such as individual atomic energy levels or states in superconducting circuits) can be networked to create a single logical qubit that collectively holds the quantum information. It is then possible to use “measure” qubits to determine whether an error occurred on one of the “data” qubits without affecting the state of the latter.

“In quantum error correction, we basically track the state,” Newman explains. “We say ‘Okay, what errors are happening?’ We figure that out on the fly, and then when we do a measurement of the logical information – which gives us our answer – we can reinterpret our measurement according to our understanding of what errors have happened.”
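The Google device uses the two-dimensional surface code, but the syndrome-tracking idea Newman describes can be illustrated with the simplest example: the three-bit repetition (bit-flip) code, simulated classically below. Parity checks between neighbouring bits – the role played by the measure qubits – reveal which bit flipped without reading out the encoded value itself. This is a toy sketch for illustration only.

```python
import random

# Toy illustration of syndrome-based error correction using the three-bit repetition
# code (the surface code used by Google generalizes this idea to two dimensions).
# A logical 0 or 1 is stored as 000 or 111; two parity "syndrome" checks reveal
# which single bit flipped without revealing the encoded value itself.

def encode(logical_bit):
    return [logical_bit] * 3

def apply_noise(data, p_flip=0.05):
    return [bit ^ (random.random() < p_flip) for bit in data]

def syndrome(data):
    # Parity of neighbouring pairs - analogous to measure-qubit readouts
    return (data[0] ^ data[1], data[1] ^ data[2])

def correct(data):
    s = syndrome(data)
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(s)  # which bit to flip, if any
    if flip is not None:
        data[flip] ^= 1
    return data

def decode(data):
    return int(sum(data) >= 2)  # majority vote

random.seed(0)
trials = 50_000
failures = sum(decode(correct(apply_noise(encode(1)))) != 1 for _ in range(trials))
print(f"Logical error rate: {failures / trials:.4f}  (physical flip rate 0.05)")
```

With a 5% physical flip rate, the encoded bit fails only when two or more bits flip, so the logical error rate drops to well under 1% – the same suppression principle that the surface code scales up.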

Keeping error rates low

In principle, this procedure makes it possible for infinitely stable qubits to perform indefinitely long calculations – but only if error rates remain low enough. The problem is that each additional physical qubit introduces a fresh source of error. Increasing the number of physical qubits in each logical qubit is therefore a double-edged sword, and the logical qubit’s continued stability depends on several factors. These include the ability of the quantum processor’s (classical) software to detect and interpret errors; the specific error-correction code used; and, importantly, the fidelity of the physical qubits themselves.

In 2023, Newman and colleagues at Google Quantum AI showed that an error-correction code called the surface code (which Newman describes as having “one of the highest error-suppression factors of any quantum code”) made it just about possible to “win” at error correction by adding more physical qubits to the system. Specifically, they showed that a distance-5 array logical qubit made from 49 superconducting transmon qubits had a slightly lower error rate than a distance-3 array qubit made from 17 such qubits. But the margin was slim. “We knew that…this wouldn’t persist,” Newman says.

“Convincing, exponential error suppression”

In the latest work, which is published in Nature, a Google Quantum AI team led by Hartmut Neven unveil a new superconducting processor called Willow with several improvements over the previous Sycamore chip. These include gates (the building blocks of logical operations) that retain their “quantumness” five times longer and a Google DeepMind-developed machine learning algorithm that interprets errors in real time. When the team used this new technology to create nine surface code distance-3 arrays, four distance-5 arrays and one 101-qubit distance-7 array on their 105-qubit processor, the error rate was suppressed by a factor of 2.4 as additional qubits were added.

Surface-code arrays of distance 3, 5 and 7, in which the gold data qubits are surrounded by measure qubits – 17, 49 and 97 qubits in total, respectively.

“This is the first time we have seen convincing, exponential error suppression in the logical qubits as we increase the number of physical qubits,” says Newman. “That’s something people have been trying to do for about 30 years.”
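That exponential suppression can be summarized by a simple scaling relation: each time the code distance d increases by 2, the logical error rate per cycle is expected to fall by the suppression factor Λ. The sketch below assumes the reported Λ ≈ 2.4 and a hypothetical distance-3 error rate, and uses the standard rotated-surface-code count of 2d² − 1 physical qubits (practical devices may devote a few extra qubits to other functions).

```python
# Sketch of the exponential error suppression described above. For a surface code,
# the logical error rate per cycle is expected to fall by a factor Lambda each time
# the code distance d increases by 2. Lambda ~ 2.4 is taken from the article; the
# distance-3 starting error rate below is a hypothetical placeholder.

LAMBDA = 2.4          # suppression factor per step in distance (d -> d + 2)
eps_d3 = 3e-3         # assumed logical error rate per cycle at distance 3

for d in (3, 5, 7, 9, 11):
    steps = (d - 3) // 2
    eps = eps_d3 / LAMBDA**steps
    n_qubits = 2 * d**2 - 1   # data + measure qubits in a rotated surface code layout
    print(f"distance {d:2d}: ~{n_qubits:3d} physical qubits, logical error ~ {eps:.1e} per cycle")
```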

With gates that remain stable for hours on end, quantum computers should be able to run the large, complex algorithms people have always hoped for. “We still have a long way to go, we still need to do this at scale,” Newman acknowledges. “But the first time we pushed the button on this Willow chip and I saw the lattice getting larger and larger and the error rate going down and down, I thought ‘Wow! Quantum error correction is really going to work…Quantum computing is really going to work!’”

Mikhail Lukin, a physicist at Harvard University in the US who also works on quantum error correction, calls the Google Quantum AI result “a very important step forward in the field”. While Lukin’s own group previously demonstrated improved quantum logic operations between multiple error-corrected atomic qubits, he notes that the present work showed better logical qubit performance after multiple cycles of error correction. “In practice, you’d like to see both of these things come together to enable deep, complex quantum circuits,” he says. “It’s very early, there are a lot of challenges remaining, but it’s clear that – in different platforms and moving in different directions – the fundamental principles of error correction have now been demonstrated.  It’s very exciting.”

Squishy silicone rings shine a spotlight on fluid-solid transition

People working in industry, biology and geology are all keen to understand when particles will switch from flowing like fluids to jamming like solids. With rigid particles, and even for foams and emulsions, scientists know what determines this crunch point: it’s related to the number of contact points between particles. But for squishy particles – those that deform by more than 10% of their size – that’s not necessarily the case.

“You can have a particle that’s completely trapped between only two particles,” explains Samuel Poincloux, who studies the statistical and mechanical response of soft assemblies at Aoyama Gakuin University, Japan.

Factoring that level of deformability into existing theories would be fiendishly difficult. But with real-world scenarios – particularly in mechanobiology – coming to light that hinge on the flow or jamming of highly deformable particles, the lack of explanation was beginning to hurt. Poincloux and his University of Tokyo colleague Kazumasa Takeuchi therefore tried a different approach. Their “easy-to-do experiment” sheds fresh light on how squishy particles respond to external forces, leading to a new model that explains how such particles flow – and at what point they don’t.

Pinning down the differences

To demonstrate how things can change when particles can deform a lot, Takeuchi holds up a case containing hundreds of rigid photoelastic rings. When these rings are under stress, the polarization of light passing through them changes. “This shows how the force is propagating,” he says.

As he presses on the rings with a flat-ended rod, a pattern of radial lines centred at the bottom of the rod lights up. With rigid particles, he explains, chains of forces transmitted by these contact points conspire to fix the particles in place. The fewer the contact points, the fewer the chains of forces keeping them from moving. However, when particles can deform a lot, the contact areas are no longer points. Instead, they extend over a larger region of the ring’s surface. “We can already expect that something will be very different then,” he says.

The main ingredient in Takeuchi and Poincloux’s experimental study of these differences was a layer of deformable silicone rings 10 mm high, 1.5 mm thick and with a radius of 3.3 mm, laid out between two parallel surfaces. The choice of ring material and dimensions was key to ensuring the model reproduced relevant aspects of behaviour while remaining easy to manipulate and observe. To that end, they added an acrylic plate on top to stop the rings popping out under compression. “There’s a lot of elastic energy inside them,” says Poincloux, nodding wryly. “They go everywhere.”

By pressing on one of the parallel surfaces, the researchers compressed the rings (thereby adjusting their density) and added an oscillating shear force. To monitor the rings’ response, they used image analysis to note the position, shape, neighbours and contact lengths for each ring. As they reduced the shear force amplitude or increased the density, they observed a transition to solid-like behaviour in which the rings’ displacement under the shear force became reversible. This transition was also reflected in collective properties such as calculated loss and storage moduli.
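The storage and loss moduli mentioned above are standard measures of a material’s elastic and viscous response to oscillatory shear. The sketch below shows how they are commonly extracted from the stress response to a sinusoidal strain; all numbers are illustrative, and this is not the authors’ analysis code.

```python
import numpy as np

# Minimal sketch (not the authors' analysis code): extracting storage and loss moduli
# from an oscillatory shear test by projecting the stress onto the in-phase and
# out-of-phase parts of the applied strain. All numbers below are illustrative.

omega = 2 * np.pi * 0.5        # rad/s, drive frequency (assumed)
gamma0 = 0.05                  # strain amplitude (assumed)
t = np.linspace(0, 4 * 2 * np.pi / omega, 4000, endpoint=False)  # 4 full cycles

strain = gamma0 * np.sin(omega * t)

# Synthetic stress response with amplitude sigma0 and phase lag delta
sigma0, delta = 120.0, 0.35     # Pa and rad, stand-ins for measured values
stress = sigma0 * np.sin(omega * t + delta)

# Fourier projection over an integer number of cycles
G_storage = 2 / (gamma0 * len(t)) * np.sum(stress * np.sin(omega * t))
G_loss    = 2 / (gamma0 * len(t)) * np.sum(stress * np.cos(omega * t))

print(f"G'  (storage) ~ {G_storage:.0f} Pa   expected {sigma0*np.cos(delta)/gamma0:.0f} Pa")
print(f"G'' (loss)    ~ {G_loss:.0f} Pa   expected {sigma0*np.sin(delta)/gamma0:.0f} Pa")
```

A dominant storage modulus signals solid-like (reversible) behaviour, while a dominant loss modulus signals fluid-like flow – which is how the transition shows up in the collective properties the team calculated.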

Unexpectedly simple

Perhaps counterintuitively, regular patterns – crystallinity – emerged in the arrangement of the rings while the system was in a fluid phase but not in the solid phase. This and other surprising behaviours make the system hard to model analytically. However, Takeuchi emphasises that the theoretical criterion for switching between solid-like and fluid-like behaviour turned out to be quite simple. “This is something we really didn’t expect,” he says.

  • The top row in the video depicts the fluid-like behaviour of the rings at low density. The bottom row depicts the solid-like behaviour of the rings at a higher density. (Courtesy: Poincloux and Takeuchi 2024)

The researchers’ experiments showed that for squishy particles, the number of contacts no longer matters much. Instead, it’s the size of the contact that’s important. “If you have very extended contact, then [squishy particles] can basically remain solid via the extension of contact, and that is possible only because of friction,” says Poincloux. “Without friction, they will almost always rearrange and lose their rigidity.”

Jonathan Bares, who studies granular matter at CNRS in the Université de Montpellier, France, but was not involved in this work, describes the model experiment as “remarkably elegant”. This kind of jamming state is, he says, “challenging to analyse both analytically and numerically, as it requires accounting for the intricate properties of the materials that make up the particles.” It is, he adds, “encouraging to see squishy grains gaining increasing attention in the study of granular materials”.

As for the likely impact of the result, biophysicist Christopher Chen, whose work at Boston University in the US focuses on adhesive, mechanical and biochemical contributions in tissue microfabrication, says the study “provides more evidence that the way in which soft particles interact may dominate how biological tissues control transitions in rigidity”.  These transitions, he adds, “are important for many shape-changing processes during tissue assembly and formation”.

Full details of the experiment are reported in PNAS.

The heart of the matter: how advances in medical physics impact cardiology

Medical physics techniques play a key role in all areas of cardiac medicine – from the use of advanced imaging methods and computational modelling to visualize and understand heart disease, to the development and introduction of novel pacing technologies.  At a recent meeting organised by the Institute of Physics’ Medical Physics Group, experts in the field discussed some of the latest developments in cardiac imaging and therapeutics, with a focus on transitioning technologies from the benchtop to the clinic.

Monitoring metabolism

The first speaker, Damian Tyler from the University of Oxford, described how hyperpolarized MRI can provide “a new window on the reactions of life”. He discussed how MRI – most commonly employed to look at the heart’s structure and function – can also be used to characterize cardiac metabolism, with metabolic MR studies helping us understand cardiovascular disease, assess drug mechanisms and guide therapeutic interventions.

In particular, Tyler is studying pyruvate, a compound that plays a central role in the body’s metabolism of glucose. He explained that 13C MR spectroscopy is ideal for studying pyruvate metabolism, but its inherent low signal-to-noise ratio makes it unsuitable for rapid in vivo imaging. To overcome this limitation, Tyler uses hyperpolarized MR, which increases the sensitivity to 13C-enriched tracers by more than 10,000 times and enables real-time visualization of normal and abnormal metabolism.

As an example, Tyler described a study using hyperpolarized 13C MR spectroscopy to examine cardiac metabolism in diabetes, which is associated with an increased risk of heart disease. Tyler and his team examined the downstream metabolites of 13C-pyruvate (such as 13C-bicarbonate and 13C-lactate) in subjects with and without type 2 diabetes. They found reduced bicarbonate levels in diabetes and increased lactate, noting that the bicarbonate to lactate ratio could provide a diagnostic marker.

Among other potential clinical applications, hyperpolarized MR could be used to detect inflammation following a heart attack, elucidate the mechanism of drugs and accelerate new drug discovery, and provide an indication of whether a patient is likely to develop cardiotoxicity from chemotherapy. It can also be employed to guide therapeutic interventions by imaging ischaemia in tissue and to assess cardiac perfusion after a heart attack.

“Hyperpolarized MRI offers a safe and non-invasive way to assess cardiac metabolism,” Tyler concluded. “There are a raft of potential clinical applications for this emerging technology.”

Changing the pace

Alongside the introduction of new and improved diagnostic approaches, researchers are also developing and refining treatments for cardiac disorders. One goal is to create an effective treatment for heart failure, an incurable progressive condition in which the heart can’t pump enough blood to meet the body’s needs. Current therapies can manage symptoms, but cannot treat the underlying disease or prevent progression. Ashok Chauhan from Ceryx Medical told delegates how the company’s bio-inspired pacemaker aims to address this shortfall.

In healthy hearts, Chauhan explained, the heart rate changes in response to breathing, in a mechanism called respiratory sinus arrhythmia (RSA). This natural synchronization is frequently lost in patients with heart failure. Ceryx has developed a pacing technology that aims to treat heart failure by resynchronizing the heart and lungs and restoring RSA.


The device works by monitoring the cardiorespiratory system and using RSA inputs to generate stimulation signals in real time. Early trials in large animals demonstrated that RSA pacing increased cardiac output and ejection fraction compared with monotonic (constant) pacing. Last month, Ceryx began the first in-human trials of its pacing technology, using an external pacemaker to assess the safety of the device.

Eliminating sex bias

Later in the day, Hannah Smith from the University of Oxford presented a fascinating talk entitled “Women’s hearts are superior and it’s killing them”.

Smith told a disturbing tale of an elderly man with chest pain, who calls an ambulance and undergoes electrocardiography (ECG) that shows he is having a heart attack. He is rushed to hospital to unblock his artery and restore cardiac function. His elderly wife also feels unwell, but her ECG only shows slight abnormality. She is sent for blood tests that eventually reveal she was also having a severe heart attack – but the delay in diagnosis led to permanent cardiac damage.

The fact is that women having heart attacks are more likely to be misdiagnosed and receive less aggressive treatment than men, Smith explained. This is due to variations in the size of the heart and differences in the distances and angles between the heart and the torso surface, which affect the ECG readings used to diagnose heart attack.

To understand the problem in more depth, Smith developed a computational tool that automatically reconstructs torso ventricular anatomy from standard clinical MR images. Her goal was to identify anatomical differences between males and females, and examine their impact on ECG measurements.

Using clinical data from the UK Biobank (around 1000 healthy men and women, plus 84 women and 341 men post-heart attack), Smith modelled anatomies and correlated these with the respective ECG data. She found that the QRS complex (the signal for the heart to start contracting) was about 6 ms longer in healthy males than in healthy females, attributed to the smaller heart volume in females. This is significant because it implies that the mean QRS duration would have to increase by a larger percentage in women than in men to be classified as prolonged.

She also studied the ST segment in the ECG trace, elevation of which is a key feature used to diagnose heart attack. The ST amplitude was lower in healthy females than in healthy males, due to their smaller ventricles and the more superior position of the heart. The calculations revealed that overweight women would need a 63% larger increase in ST amplitude than normal-weight men for it to be classified as elevated.
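The percentage argument follows from simple arithmetic: if the diagnostic criterion involves a fixed absolute ST amplitude, a lower starting amplitude demands a larger rise – both absolute and relative – to cross it. The numbers below are illustrative only, not values from Smith’s study.

```python
# Simple arithmetic behind the percentage argument (hypothetical numbers, not the
# study's values): with a fixed absolute ST threshold, a smaller baseline ST
# amplitude means a larger relative increase is needed to cross it.

threshold_uV = 100.0                       # fixed diagnostic ST threshold (assumed)
baselines_uV = {"larger heart / higher ST": 80.0, "smaller heart / lower ST": 50.0}

for label, baseline in baselines_uV.items():
    pct_needed = (threshold_uV - baseline) / baseline * 100
    print(f"{label}: needs a {pct_needed:.0f}% rise in ST amplitude to reach the threshold")
```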

Smith concluded that heart attacks are harder to see on a woman’s ECG than on a man’s, with differences in ventricular size, position and orientation affecting the ECG before, during and after heart attacks. Importantly, if these relationships can be elucidated and corrected for in diagnostic tools, these sex biases can be reduced, paving the way towards personalized ECG interpretation.

Prize presentations

The meeting also included a presentation from the winner of the 2023 Medical Physics Group PhD prize: Joshua Astley from the University of Sheffield, for his thesis “The role of deep learning in structural and functional lung imaging”.


Shifting the focus from the heart to the lungs, Astley discussed how hyperpolarized gas MRI, using inhaled contrast agents such as 3He and 129Xe, can visualize regional lung ventilation. To improve the accuracy and speed of such lung MRI studies, he designed a deep learning system that rapidly performs MRI segmentation and automates the calculation of ventilation defect percentage via lung cavity estimates. He noted that the tool is already being used to improve workflow in clinical hyperpolarized gas MRI scans.
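The thesis code itself is not reproduced here, but once a deep-learning model has segmented the ventilated lung and the whole lung cavity, the ventilation defect percentage reduces to a simple voxel count. A minimal sketch, with random masks standing in for real segmentations:

```python
import numpy as np

# Minimal sketch (not the thesis code): once a deep-learning model has segmented the
# ventilated lung and the whole lung cavity, the ventilation defect percentage (VDP)
# is the unventilated fraction of the cavity. Masks below are random stand-ins for
# real segmentations.

rng = np.random.default_rng(42)
shape = (64, 64, 32)                                   # hypothetical image grid

lung_cavity = rng.random(shape) < 0.40                 # voxels inside the lung cavity
ventilated = lung_cavity & (rng.random(shape) < 0.90)  # ventilated subset of the cavity

defect_voxels = lung_cavity & ~ventilated
vdp = 100 * defect_voxels.sum() / lung_cavity.sum()
print(f"Ventilation defect percentage: {vdp:.1f}%")
```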

Astley also described the use of CT ventilation imaging as a potentially lower-cost approach to visualize lung ventilation. Combining the benefits of computational modelling with deep learning, Astley and colleagues have developed a hybrid framework that generates synthetic ventilation scans from non-contrast CT images.

Quoting some “lessons learnt from my thesis”, Astley concluded that artificial intelligence (AI)-based workflows enable faster computation of clinical biomarkers and better integration of functional lung MRI, and that non-contrast functional lung surrogates can reduce the cost and expand the use of functional lung imaging. He also emphasized that quantifying the uncertainty in AI approaches can improve clinicians’ trust in using such algorithms, and that making code open and available is key to increasing its impact.

The day rounded off with awards for the meeting’s best talk in the submitted abstracts section and the best poster presentation. The former was won by Sam Barnes from Lancaster University for his presentation on the use of electroencephalography (EEG) for diagnosis of autism spectrum disorder. The poster prize was awarded to Suchit Kumar from University College London, for his work on a graphene-based electrophysiology probe for concurrent EEG and functional MRI.

Conditioning prepares aluminium-ion batteries for real-world use

Imagine a smartphone that charges faster, lasts longer and is more eco-friendly – all at a lower cost. Aluminium-ion batteries (AIBs) could make this dream a reality, and scientists are working to unlock their potential as a more abundant, affordable and sustainable alternative to the lithium-ion batteries currently used in mobile devices, electric cars and large-scale energy storage. As part of this effort, Dmitrii A Rakov and colleagues at the University of Queensland, Australia recently overcame a technical hurdle with an AIB component called the solid-electrolyte interphase. Their insights could help AIBs match, or even surpass, the performance of their lithium-ion counterparts.

Like lithium-ion batteries, AIBs contain an anode, a cathode and an electrolyte. This electrolyte carries aluminium ions, which flow between the negatively charged anode and the positively charged cathode. During discharge, these ions move from the anode to the cathode, generating energy. Charging the battery reverses the process, with ions returning to the anode to store energy.

The promise and the problem

Sounds simple, right? But when it comes to making AIBs work effectively, this process is far from straightforward.

Aluminium is a promising anode material – it is lightweight and stores a lot of energy for its size, giving it a high energy density. The problem is that AIBs are prone to instabilities as they cycle between charging and discharging. During this cycling, aluminium can deposit unevenly on the anode, forming tree-like structures called dendrites that cause short circuits, leading to battery failure or even safety risks.

Researchers have been tackling these issues for years, trying to figure out how to get aluminium to deposit more evenly and stop dendrites from forming. An emerging focus of this work is something called the solid-electrolyte interphase (SEI).  This thin layer of organic and inorganic components forms on the anode as the battery charges, and like the protective seal on a jar of jam, it keeps everything inside fresh and functioning well.

In AIBs, though, the SEI sometimes forms unevenly or breaks, like a seal on a jar that doesn’t close properly. When that happens, the aluminium inside can misbehave, leading to performance issues. To complicate things further, the type of “jam” in the jar – different electrolytes, like chloroaluminate ionic liquids – affects how well this seal forms. Some electrolytes help create a better seal, while others make it harder to keep the aluminium deposits stable.

Cracking the code of aluminium deposition

In their study, which is published in ACS Nano, the Queensland scientists, together with colleagues at the University of Southern Queensland and Oak Ridge National Laboratory in the US, focused on how the aluminium anode interacts with the liquid electrolyte.  They found that the formation of the SEI layer is highly dependent on the current running through the battery and the type of counter electrode (the “partner” to the aluminium anode). Some currents and conditions allow the battery to work well for more cycles. But under other conditions, aluminium can build up in uneven, dendritic structures that ultimately cause the battery to fail.


To understand how this happens, the researchers investigated how different electrolytes and cycling conditions affect the SEI layer. They discovered that in some cases, when the SEI isn’t forming evenly, aluminium oxide (Al2O3) – which is normally a protective layer – can actually aggravate the problem by causing the aluminium to deposit unevenly. They also found that low currents can deplete some materials in the electrolyte, leading to parasitic reactions that further reduce the battery’s efficiency.

To solve these issues, the scientists recommend exploring different aluminium-alloy chemistries. They also suggest that specific conditioning protocols could smooth out the SEI layer and improve the cycling performance. One example of such a conditioning protocol is pre-cycling, which is a process where the battery is charged and discharged in a controlled way before regular use to condition it for better long-term performance.
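The paper does not prescribe a universal recipe, but a conditioning protocol of the kind suggested can be sketched schematically. All values below – currents, cut-off voltages and cycle counts – are illustrative placeholders, not parameters from the study.

```python
# Schematic sketch of a pre-cycling (conditioning) schedule of the kind discussed
# above: a few gentle charge/discharge cycles to form a stable SEI before the cell
# is cycled at its working rate. Currents, cut-offs and cycle counts are illustrative
# placeholders, not values from the paper.

from dataclasses import dataclass

@dataclass
class CycleStep:
    label: str
    current_mA_per_g: float   # specific current applied to the cell
    n_cycles: int
    voltage_window: tuple     # (lower, upper) cut-off voltages in V

protocol = [
    CycleStep("formation / pre-cycling", current_mA_per_g=10, n_cycles=5, voltage_window=(0.1, 2.3)),
    CycleStep("regular cycling",         current_mA_per_g=100, n_cycles=500, voltage_window=(0.1, 2.3)),
]

for step in protocol:
    print(f"{step.label}: {step.n_cycles} cycles at {step.current_mA_per_g} mA/g, "
          f"window {step.voltage_window[0]}-{step.voltage_window[1]} V")
```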

“Our research demonstrates that, like in lithium-ion batteries, aluminium-ion batteries also need pre-cycling to maximize their lifetime,” Rakov tells Physics World. “This is important knowledge for aluminium-ion battery developers, who are rapidly emerging as start-ups around the world.”

By understanding the unique pre-cycling needs of aluminium-ion batteries, developers can work to design batteries that last longer and perform more reliably, bringing them closer to real-world applications.

How far are we from having an aluminium-ion battery in our mobile phones?

As for when those applications might become a reality, Rakov highlights that AIBs are still in the early stages of development, and many studies test them under conditions that aren’t realistic for everyday use. Often, these tests use very small amounts of active materials and extra electrolyte, which can make the batteries seem more durable than they might be in real life.

In this study, Rakov and colleagues focused on understanding how aluminium-ion batteries might degrade when handling higher energy loads and stronger currents, similar to what they would face in practical use. “We found that different types of positive electrode materials lead to different types of battery failure, but by using special pre-cycling steps, we were able to reduce these issues,” Rakov says.
