The International Year of Quantum Science and Technology (IYQ) has officially closed following a two-day event in Accra, Ghana. The year has seen hundreds of events worldwide celebrating the science and applications of quantum physics.
Officially launched in February at the headquarters of the UN Educational, Scientific and Cultural Organization (UNESCO) in Paris, IYQ has involved hundreds of organizations – including the Institute of Physics, which publishes Physics World.
The year 2025 was chosen for an international year dedicated to quantum physics as it marks the centenary of the initial development of quantum mechanics by Werner Heisenberg. A range of international and national events have been held touching on quantum in everything from communications and computing to medicine and the arts.
One of the highlights of the year was a workshop on 9–14 June 2025 in Helgoland – the island off the coast of Germany where Heisenberg made his breakthrough exactly 100 years earlier. It was attended by more than 300 top quantum physicists, including four Nobel prize-winners, who gathered for talks, poster sessions and debates.
The closing event in Ghana, held on 10–11 February, was attended by government officials, UNESCO directors, physicists and representatives from international scientific societies, including the IOP. Attendees discussed UNESCO’s official 2025 IYQ report, heard a reading of the winning entry in the IYQ 2025 poetry contest, and attended an exhibition with displays from IYQ sponsors.
Organizers behind the IYQ hope its impact will be felt for many years to come. “The entire 2025 year was filled with impactful events happening all over the world. It has been a wonderful experience working alongside such dedicated and distinguished colleagues,” notes Duke University physicist Emily Edwards, who is a member of the IYQ steering committee. “We are thrilled to see the enthusiasm continue through to 2026 with the closing ceremony and are proud that a strong foundation has been laid for the years ahead.”
The UN has declared “international years” since 1959, to draw attention to topics deemed to be of worldwide importance. In recent years, there have been a number of successful science-based themes, including physics (2005), astronomy (2009), chemistry (2011), crystallography (2014) and light and light-based technologies (2015).
Read our two free-to-read quantum briefings, published in May and October, which feature articles on the history, mystery and industry of quantum mechanics.
Rewatch our Physics World Live: Quantum held in June, which included a discussion of how technological developments have created a whole new ecosystem of “quantum 2.0” businesses.
Science fiction became science fact in 2022 when NASA’s DART mission took the first steps towards creating a planetary defence system that could someday protect Earth from a catastrophic asteroid collision. However, much more work on asteroid deflection is needed from the latest generation of researchers – including Rahil Makadia, who has just completed a PhD in aerospace engineering at the University of Illinois at Urbana-Champaign.
In this episode of the Physics World Weekly podcast, Makadia talks about his work on how we could deflect asteroids away from Earth. We also chat about the potential threats posed by near-Earth asteroids – from shattered windows to global destruction.
Makadia stresses the importance of getting a deflection right the first time, because his calculations reveal that a poorly deflected asteroid could someday return to Earth. In November he published a paper that explored how a bad deflection could send an asteroid into a “keyhole” that guarantees its return.
But it is not all gloom and doom: Makadia points out that our current understanding of near-Earth asteroids suggests that no major collision will occur for at least 100 years. So even if there is a threat on the horizon, we have plenty of time to develop deflection strategies and technologies.
Flowing fluids that act like the interlocking teeth of mechanical gears offer a possible route to novel machines that suffer less wear-and-tear than traditional devices. This is the finding of researchers at New York University (NYU) in the US, who have been studying how fluids transmit motion and force between two spinning solid objects. Their work sheds new light on how one such object, or rotor, causes another object to rotate in the liquid that surrounds it – sometimes with counterintuitive results.
“The surprising part in our work is that the direction of motion may not be what you expect,” says NYU mathematician Leif Ristroph, who led the study together with mathematical physicist Jun Zhang. “Depending on the exact conditions, one rotor can cause a nearby rotor to spin in the opposite direction, like a pair of gears pressed together. For other cases, the rotors spin in the same direction, as if they are two pulleys connected by a belt that loops around them.”
Making gear teeth using fluids
Gears have been around for thousands of years, with the first records dating back to 3000 BC. While they have advanced over time, their teeth are still made from rigid materials and are prone to wearing out and breaking.
Ristroph says that he and Zhang began their project with a simple question: might it be possible to avoid this problem by making gears that don’t have teeth, and in fact don’t even touch, but are instead linked together by a fluid? The idea, he points out, is not unprecedented. Flowing air and water are commonly used to rotate structures such as turbines, so developing fluid gears to facilitate that rotation is in some ways a logical next step.
To test their idea, the researchers carried out a series of measurements aimed at determining how parameters like the spin rate and the distance between spinning objects affect the motion produced. In these measurements, they immersed the rotors – solid cylinders – in an aqueous glycerol solution with a controllable viscosity and density. They began by rotating one cylinder while allowing the other one to spin in response. Then they placed the cylinders at varying distances from each other and rotated the active cylinder at different speeds.
“The active cylinder should generate fluid flows and could therefore in principle cause rotation of the passive one,” says Ristroph, “and this is exactly what we observed.”
When the cylinders were very close to each other, the NYU team found that the fluid flows functioned like gear teeth – in effect, they “gripped” the passive rotor and caused it to spin in the opposite direction as the active one. However, when the cylinders were spaced farther apart and the active cylinder spun faster, the flows looped around the outside of the passive cylinder like a belt around a pulley, producing rotation in the same direction as the active cylinder.
A model involving gear-like and belt-like modes
Ristroph says the team’s main difficulty was figuring out how to perform such measurements with the necessary precision. “Once we got into the project, an early challenge was to make sure we could make very precise measurements of the rotations, which required a special way to hold the rotors using air bearings,” he explains. Team member Jesse Smith, a PhD student and first author of a paper in Physical Review Letters about the research, was “brilliant in figuring out every step in this process”, Ristroph adds.
Another challenge the researchers faced was figuring out how to interpret their findings. This led them to develop a model involving “gear-like” and “belt-like” modes of induced rotations. Using this model, they showed that, at least in principle, a fluid gear could replace regular gears and pulley-and-belt systems in any system – though Ristroph suggests that transmitting rotations in a machine, or keeping time via a mechanical device, might be especially well-suited applications.
In general, Ristroph says that fluid gears offer many advantages over mechanical ones. Notably, they cannot become jammed or wear out due to grinding. But that isn’t all: “There has been a lot of recent interest in designing new types of so-called active materials that are composed of many particles, and one class of these involves spinning particles in a fluid,” he explains. “Our results could help to understand how these materials behave based on the interactions between the particles and the flows they generate.”
The NYU researchers say their next step will be to study more complex fluids. “For example, a slurry of corn starch is an everyday example of a shear-thickening fluid and it would be interesting to see if this helps the rotors better ‘grip’ one another and therefore transmit the motions/forces more effectively,” Ristroph says. “We are also numerically simulating the processes, which should allow us to investigate things like non-circular shapes of the rotors or more than just two rotors,” he tells Physics World.
A new class of biomolecules called magneto-sensitive fluorescent proteins, or MFPs, could improve imaging of biological processes inside living cells and potentially underpin innovative therapies.
The fluorescent proteins commonly used in biological studies respond solely to light being shone at them. But because that light gets scattered by tissues, there are inaccuracies in determining exactly where the resulting fluorescence originates. By contrast, the MFPs created by a team led by Harrison Steel, head of the Engineered Biotechnology Research Group at the University of Oxford in the UK, fluoresce partly in response to highly predictable magnetic fields and radio waves that pass through biological tissues without deflection.
Sensor schematic: An MFP excited by blue light emits green fluorescence, the intensity of which can be modulated by applying appropriate magnetic or radiofrequency fields. (Courtesy: Gabriel Abrahams)
To detect where MFPs are located within living cells, the researchers apply both a static magnetic field with a precisely known gradient and a radiofrequency (RF) signal, which modulate the fluorescence triggered via excitation by a light-emitting diode (LED).
The emitted fluorescence is brightest whenever the RF is in resonance with a transition energy of the entangled electron system present within the MFP. Since the resonance frequency depends on the surrounding magnetic field strength, the brightness reveals the protein’s location.
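As a rough illustration of this position-encoding principle – not code or values from the study, and assuming a simple linear field-to-frequency relation – a known field gradient lets a measured resonance frequency be inverted to a position:

```python
# Illustrative sketch of gradient-based localization: the resonance frequency
# shifts linearly with field strength, and the field varies linearly in space,
# so a measured resonance frequency maps back to a position. All numbers and
# the linear model are assumptions for illustration only.

GAMMA = 28.0e9   # Hz per tesla, approximate electron gyromagnetic ratio
B0 = 1.0e-3      # tesla, field at x = 0 (assumed)
GRADIENT = 0.5   # tesla per metre, applied gradient (assumed)
F0 = 2.0e9       # Hz, zero-field transition frequency (assumed)

def field_at(x):
    """Static field with a precisely known gradient along x."""
    return B0 + GRADIENT * x

def resonance_frequency(x):
    """RF frequency at which fluorescence peaks for a protein at position x."""
    return F0 + GAMMA * field_at(x)

def position_from_resonance(f):
    """Invert the mapping: measured resonance frequency -> position."""
    return ((f - F0) / GAMMA - B0) / GRADIENT

f_meas = resonance_frequency(0.002)     # protein sitting at x = 2 mm
print(position_from_resonance(f_meas))  # recovers x ≈ 0.002 m
```

In practice the fluorescence response of an MFP is more complicated than a single sharp resonance, but the inversion step – brightness peak identifies a frequency, frequency identifies a field strength, gradient identifies a position – is the same.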
As detailed in their recent Nature paper, the researchers engineered the MFPs by “directed evolution”: starting with a DNA sequence, making two to three thousand variants of it, and selecting the variants with the best fluorescence response to magnetic fields before repeating the entire process multiple times. The resulting proteins were tested via ODMR (optically detected magnetic resonance) and MFE (magnetic-field effect) experiments, revealing that they could be detected in single living cells and sense their local microenvironment.
Importantly, these MFPs can be made in research labs using a straightforward biological technique. “This is a totally different way of coming up with new quantum materials compared to other engineering efforts for quantum sensors like nitrogen vacancies [in diamonds] which need to be manufactured in highly specialized facilities,” explains first author Gabriel Abrahams, a doctoral student in Steel’s research group. Abrahams helped develop quantum diamond microscopes during his master’s in physics at the Quantum Nano Sensing Lab in Melbourne, Australia before moving on to the Oxford Interdisciplinary Bioscience Doctoral Training Programme.
The MFPs were inspired by the work of study co-authors Maria Ingaramo and Andy York, both then working for Calico Life Sciences. They had observed a small change in fluorescence when a magnet interacted with a quantum-enabled protein, explains Abrahams. “That was really cool! I hadn’t seen anything like that, and there were clearly potential applications if it could be made better,” he says.
Steel tells Physics World that “a lot of the past work in quantum biology was with fragile proteins, often at cryogenic temperatures. Surprisingly you could easily measure these MFPs in single living cells every few minutes as they can work for a long time at room temperature”. Furthermore, using MFPs only requires adding a magnet to existing fluorescence microscopy equipment, allowing new data to be cost-effectively obtained.
“For instance, you might use three or four fluorescent proteins to tag natural processes in a mammalian cell in a petri dish to see when they are being used and where they go. We could instead tag with 10 or 15 MFPs, allowing you to measure extra targets by just applying a magnetic field,” Steel explains.
Quantum engineer Peter Maurer from the University of Chicago in the US, who was not involved in the study, is enthusiastic about these new MFPs. “By combining magnetic fields and fluorescence, this work establishes an exciting new imaging modality with broad potential for future evolution. Notably, similar approaches could be directly applicable to qubits [quantum bits], such as the fluorescent protein qubits our team published in Nature last year,” he says.
Next, Steel intends to improve their instrumentation for using MFPs – much of which was adopted from researchers investigating how birds navigate via the Earth’s magnetic field. Future MFP applications could include microbiome studies sensing where bacteria travel in our bodies, and the development of highly controllable actuators for drug delivery. “If you would like to turn on the protein’s ability to bind to a cancer cell, for example, you could simply put a magnet on the outside of a person in the right location,” he concludes.
Royal approval (Clockwise from top left) The Duke of Edinburgh with IOP group chief executive Tom Grinyer; talking to Selina Ambrose from Promethean Particles; the exhibition he toured; and speaking after the panel debate. (Courtesy: Carmen Valino)
The Duke of Edinburgh visited the headquarters of the Institute of Physics (IOP) in central London on 5 February to learn about the role that physics plays in supporting the green economy.
The event was attended by about 100 business leaders, policy chiefs, senior physicists, and IOP and IOP Publishing staff. It highlighted how physics research is helping to deliver clean energy solutions and support economic growth.
A total of 12 companies took part in an exhibition that was visited by the duke. They included two carbon-capture firms – Nellie Technologies and Promethean Particles – as well as the fusion firm Tokamak Energy and Sunamp, which makes non-flammable “thermal batteries”.
The event included a panel debate chaired by Tara Shears, the IOP’s vice-president for science and innovation.
It featured ex-BP boss John Browne, who now works in green energy, along with Sizewell C energy-strategy director David Cole, Nellie Technologies founder Stephen Millburn, solar-cell physicist Jenny Nelson from Imperial College, and Emily Nurse from the UK’s Climate Change Committee.
After the debate, the duke said the event had showcased “some of the brilliant ideas that are trying to solve some really challenging issues through creativity and imagination”. He expressed particular delight that people are central to that mission.
“Our ability to evolve the right skills for the future has been well demonstrated here,” he said. “It comes down to creating the right climate to allow these ideas to flourish and come to market. We simply cannot drop this issue.”
Tom Grinyer, group chief executive of the IOP, reminded delegates that physics is fundamental to the UK economy. “We’re seeing how research is translating into real-world solutions that matter today, from clean power and climate intelligence, to advanced materials and future technologies,” he said.
But he warned that long-term investment in young people will be vital to create the physicists and business leaders who can tackle those challenges.
Re-entry of space debris. Courtesy: S Economon and B Fernando
When chunks of space debris make their fiery descent through the Earth’s atmosphere, they leave a trail of shock waves in their wake. Geophysicists have now found a way to exploit this phenomenon, using open-source seismic data from a network of earthquake sensors to monitor the waves produced by China’s Shenzhou-15 module as it fell to Earth in April 2024. The method is valuable, they say, because it makes it possible to follow debris – which can be hazardous to humans and animals – in near-real time as it travels towards the surface.
“We’re at the situation today where more and more spacecraft are re-entering the Earth’s atmosphere on a daily basis,” says team member Benjamin Fernando, a postdoctoral researcher at Johns Hopkins University in the US. “The problem is that we don’t necessarily know what happens to the fragments this space debris produces – whether they all break up in the atmosphere or if some of them reach the ground.”
Piggybacking on a network of earthquake sensors
As the Shenzhou-15 module re-entered the atmosphere, it began to disintegrate, producing debris that travelled at supersonic speeds (between Mach 25 and 30) over the US cities of Santa Barbara, California and Las Vegas, Nevada. The resulting sonic booms produced vibrations strong enough to be picked up by a network of 125 seismic stations spread over Nevada and Southern California.
Fernando and his colleague Constantinos Charalambous at Imperial College London in the UK used freely available data from these stations to measure the arrival times of the largest sonic boom signals. Based on these data, they produced a contour map of the path the debris took and the direction in which it propagated. They also determined the altitude of the module as it travelled by using ratios of the speed of sound to the apparent speed of the incident wavefront its supersonic flight generated as it passed over the seismic stations. Finally, they used a best-fit seismic inversion model to estimate where remnants of the module may have landed and the speed at which they travelled over the ground.
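The altitude step can be sketched with simple ray geometry. This is an illustrative reconstruction rather than the authors’ code: a plane wavefront arriving at incidence angle i from the vertical sweeps the ground at an apparent (trace) speed v = c/sin(i), so the measured trace speed fixes the angle, and a station’s horizontal offset from the ground track then gives a rough source altitude – ignoring winds and temperature structure, as the rapid method does:

```python
import math

C_SOUND = 330.0  # m/s, assumed constant speed of sound (real profiles vary)

def incidence_angle(trace_speed):
    """Wavefront incidence angle from vertical, inferred from the apparent
    (trace) speed of the boom across the seismic network: sin(i) = c / v."""
    return math.asin(C_SOUND / trace_speed)

def altitude_estimate(trace_speed, horizontal_offset):
    """Rough source altitude for a station at a given horizontal offset from
    the point directly below the trajectory (straight-ray assumption)."""
    return horizontal_offset / math.tan(incidence_angle(trace_speed))

# A boom sweeping the ground at twice the sound speed arrives 30 degrees from
# vertical; a station 10 km from the ground track then implies ~17 km altitude
print(altitude_estimate(660.0, 10000.0))
```

The trade-off described in the article is visible here: the straight-ray assumption makes the estimate fast enough for near-real-time use, at the cost of the atmospheric corrections a full propagation simulation would provide.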
The analyses revealed that the module travelled roughly 20-30 kilometres south of the trajectory that US Space Command had predicted based on measurements of the module’s orbit alone. The seismic data also showed that the module gradually disintegrated into smaller pieces rather than undergoing a single explosive disassembly.
Advantages of accurate tracking
To obtain an estimate of the object’s trajectory within seconds or minutes, the researchers had to simplify their calculations by ignoring the effects of wind and temperature variations in the lower troposphere (the lowest layer of the Earth’s atmosphere). This simplification also did away with the need to simulate the path of wave signals through the atmosphere, which was essential for previous techniques that relied on radar data to follow objects decaying in low Earth orbit. These older techniques, Fernando adds, produced predictions of the objects’ landing sites that could, in the worst cases, be out by thousands of kilometres.
The availability of accurate, near-real time debris tracking could be particularly helpful in cases where the debris is potentially harmful. As an example, Fernando cites an incident in 1996, when debris from the Russian Mars 96 spacecraft fell out of orbit. “People thought it burned up and [that] its radioactive power source landed intact in the ocean,” he says. “They tried to track it at the time, but its location was never confirmed. More recently, a group of scientists found artificial plutonium in a glacier in Chile that they believe is evidence the power source burst open during the descent and contaminated the area.”
Though Fernando emphasizes that it’s rare for debris to contain radioactive material, he argues “we’d benefit from having additional tracking tools” when it does.
Towards an automated algorithm for trajectory reconstruction
Fernando had previously used seismometers to track natural meteoroids, comets and asteroids on both Earth and Mars. In the latter case, he used data from InSight, a NASA Mars mission equipped with a seismometer.
“The meteoroids hitting the Red Planet were a really good seismic source for us,” he explains. “We detected the sonic booms from them breaking up and, occasionally, would actually detect the impact of them hitting the ground. We realized that we could actually apply those same techniques to studying space debris on Earth.
“This is an excellent example of a technique that we really perfected the expertise for a planetary science kind of pure science application. And then we were able to apply it to a really relevant, challenging problem here on Earth,” he tells Physics World.
The scientists say that in the longer term, they hope to develop an algorithm that automatically reconstructs the trajectory of an object. “At the moment, we’re having to find the sonic booms and analyse the data ‘by hand’,” Fernando says. “That’s obviously very slow, even though we’re getting better.”
A better solution, Fernando continues, would be to develop a machine learning tool that can find sonic booms in the data when a re-entry is expected, and then use those data to reconstruct the trajectory of an object. They are currently applying for funding to explore this option in a follow-up study.
Beyond that, there’s also the question of what to do with the data once they have it. “Who would we send the data to?” Fernando asks rhetorically. “Who needs to know about these events? If there’s a plane crash, hurricane, or similar, there are already good international frameworks in place for dealing with these events. It’s not clear to me, however, that such a framework for dealing with space debris has caught up with reality – either in terms of regulations or the response when such an event does happen.”
A proposed industrial-scale green hydrogen and ammonia project in Chile that astronomers warned could cause “irreparable damage” to the clearest skies in the world has been cancelled. The decision by AES Andes, a subsidiary of the US power company AES Corporation, to shelve plans for the INNA complex has been welcomed by the European Southern Observatory (ESO).
AES Andes submitted an Environmental Impact Assessment for the green hydrogen project in December 2024. Expected to cover more than 3000 hectares, it would have been located just a few kilometres from ESO’s Paranal Observatory in Chile’s Atacama Desert, which is one of the world’s most important astronomical research sites due to its stable atmosphere and lack of light pollution.
That same month, ESO conducted its own impact assessment, concluding that INNA would increase light pollution above Paranal’s Very Large Telescope by at least 35% and by more than 50% above the southern site of the Cherenkov Telescope Array Observatory (CTAO).
Once built, the CTAO will be the world’s most powerful ground-based observatory for very-high-energy gamma-ray astronomy.
ESO director general Xavier Barcons had warned that the hydrogen project would have posed a major threat to “the performance of the most advanced astronomical facilities anywhere in the world”.
On 23 January, however, AES Andes announced that it would discontinue plans to develop the INNA complex. The firm stated that after a review of its project portfolio it had chosen to focus instead on renewable energy and energy storage. On 6 February AES Andes sent a letter to Chile’s Environmental Assessment Service requesting that INNA not be evaluated, formally confirming the end of the project.
Barcons says that ESO is “relieved” about the decision, adding that the case highlights the urgent need to establish clear protection measures in the areas around astronomical observatories.
Barcons notes that green-energy projects as well as other industrial projects can be “fully compatible” with astronomical observatories as long as the facilities are located at sufficient distances away.
Romano Corradi, director of the Gran Telescopio Canarias, which is located at the Roque de los Muchachos Observatory, La Palma, Spain, told Physics World that he was “delighted” with the decision.
Corradi adds that while it is unclear if preserving the night-sky darkness of the region was a relevant factor for the decision to cancel the project, he hopes that global pressure to defend the dark skies played a role.
High-energy collisions of heavy nuclei, conducted at particle colliders such as CERN’s Large Hadron Collider (LHC) and BNL’s Relativistic Heavy Ion Collider (RHIC), are able to produce a state of matter called a quark-gluon plasma (QGP).
A QGP is believed to have existed just after the Big Bang. The building blocks of protons and neutrons – quarks and gluons – were not confined inside particles as usual but instead formed a hot, dense, strongly interacting soup.
Studying this state of matter helps us understand the strong nuclear force, the early universe, and how matter evolved into the forms we see today.
To understand a QGP created in a particle collider, you need to know the initial conditions – in this case, the shape and structure of the heavy nuclei that collided.
A major complicating factor here is that most atomic nuclei are deformed. They are not spherical but rather squashed and ellipsoidal or even pear-shaped.
Collisions of deformed nuclei with different orientations bring in a large amount of randomness and therefore hinder our ability to describe the initial conditions of the QGP.
A new method called imaging-by-smashing was developed by the STAR experiment at RHIC, where atomic nuclei are smashed together at extremely high speeds. By studying the patterns in the debris from these collisions, researchers can infer the original shape of the nuclei.
In this latest study, they compared collisions between two types of nuclei: uranium-238, which has a strongly deformed shape, and gold-197, which is nearly spherical.
The differences between uranium and gold helped isolate the effects of uranium’s deformation. Their results matched predictions from advanced hydrodynamic simulations and earlier low-energy experiments.
Most interestingly, they found hints that uranium might possess a pear-like (octupole) shape, in addition to its dominant football-like (quadrupole) shape. This feature had not previously been observed in high-energy collisions.
This method is still new, but in the future it could give us key insights into nuclear structure throughout the periodic table. These measurements probe nuclei at energy scales orders of magnitude higher than traditional methods, potentially revealing how nuclear structure evolves across very different energy regimes.
In quantum mechanics, a quantum state is a complete description of a system’s physical properties.
If the system changes slowly and returns to its original physical configuration, then its quantum state also returns to its original form except for a phase factor.
In pioneering work in 1984, physicist Michael Berry discovered that this factor can be separated into two parts: the dynamic phase and the geometric phase.
The usual dynamic phase depends on energy and time and was already well understood. The new part, the geometric phase (or Berry phase, after its discoverer), arises purely from the geometry of the path that the state takes through parameter space.
The Berry phase has profound implications across physics, appearing in phenomena like the quantum Hall effect, molecular dynamics, and polarised light. It reveals deep connections between geometry, topology, and physical observables.
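The textbook example is a spin-1/2 particle whose magnetic field is swept slowly around a cone: the state returns to itself but picks up a geometric phase equal to minus half the solid angle the field direction encloses. A short numerical sketch using the standard discrete Berry-phase formula (a generic illustration, not tied to any particular paper):

```python
import cmath, math

def spin_up(theta, phi):
    """Spin-1/2 eigenstate aligned with a field direction (theta, phi)."""
    return (math.cos(theta / 2), cmath.exp(1j * phi) * math.sin(theta / 2))

def berry_phase(theta, steps=2000):
    """Discrete Berry phase for a field sweeping a full cone at polar angle
    theta: minus the accumulated phase of the nearest-neighbour overlaps
    <psi_k|psi_{k+1}> around the closed loop (result modulo 2*pi)."""
    total = 1.0 + 0.0j
    prev = spin_up(theta, 0.0)
    for k in range(1, steps + 1):
        cur = spin_up(theta, 2 * math.pi * k / steps)
        overlap = prev[0].conjugate() * cur[0] + prev[1].conjugate() * cur[1]
        total *= overlap / abs(overlap)  # keep only the phase of each overlap
        prev = cur
    return -cmath.phase(total)

# Analytic result: gamma = -pi * (1 - cos(theta)), minus half the solid angle
theta = math.pi / 3
print(berry_phase(theta), -math.pi * (1 - math.cos(theta)))  # both ≈ -pi/2
```

Note that the result depends only on the cone angle – the geometry of the loop in parameter space – and not on how fast the loop is traversed, which is exactly what distinguishes the geometric phase from the dynamic one.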
In a recent paper, this concept was extended from wave evolution to certain wave scattering events, where waves bounce off or pass through materials and their properties shift.
In order to do this, the authors used a mathematical tool called a scattering matrix. The matrix encodes all the possible outcomes of a scattering process – reflection, transmission or deflection – based on the system’s properties.
They showed that these wave shifts can also be split into dynamic and geometric parts. Importantly, this splitting can be done in a way that doesn’t depend on arbitrary choices (i.e. it is gauge-invariant).
The team demonstrated their idea with known examples like light passing through a changing waveplate, beams reflecting off surfaces, and time delays in 1D systems.
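For the 1D time-delay example, the standard Wigner construction reads the delay off the frequency derivative of the scattering phase. A minimal sketch, assuming a lossless 1×1 “scattering matrix” that is pure propagation over an assumed length L (this is an illustrative case, not one of the paper’s worked examples):

```python
import cmath

C = 3.0e8  # m/s, speed of light
L = 1.5    # m, assumed propagation length

def s_matrix(omega):
    """1x1 scattering matrix for lossless transmission over length L:
    a pure phase exp(i * omega * L / c)."""
    return cmath.exp(1j * omega * L / C)

def wigner_delay(omega, d_omega=1.0):
    """Wigner time delay tau = d(arg S)/d(omega), via a central difference.
    Taking the phase of the ratio avoids 2*pi branch-cut jumps."""
    ratio = s_matrix(omega + d_omega) / s_matrix(omega - d_omega)
    return cmath.phase(ratio) / (2 * d_omega)

print(wigner_delay(1.0e9))  # ≈ L / c = 5e-9 s, as expected for free flight
```

For this trivially dispersionless case the delay is entirely “dynamic”; the interest of the new work lies in systems where part of such a shift instead has a geometric, gauge-invariant origin.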
Their approach not only describes known phenomena but also reveals new physical features, provides new insights and uncovers previously unnoticed connections.
Going forward, identifying the geometric and dynamic origins of various scattering-induced shifts offers new ways to control wave-scattering phenomena.
This could have applications in photonics, imaging, quantum computing, and micromanipulation.
Radiation therapy is usually delivered by prescribing the same radiation dose for each particular type of tumour. But this “one-size-fits-all” approach does not account for a tumour’s intrinsic radiosensitivity and heterogeneity and can lead to recurrence and treatment failure. Researchers in Sweden and Germany are now investigating whether biologically individualized radiotherapy plans, created using PET images of a patient’s tumour biology, can improve treatment outcomes.
The research team – headed up by Marta Lazzeroni from Stockholm University – studied 28 patients with advanced head-and-neck squamous cell carcinoma (HNSCC). All patients underwent two pre-treatment PET/CT scans, using 18F-fluoromisonidazole (FMISO) and 18F-FDG as tracers to respectively quantify radioresistance and tumour cellularity (the percentage of clonogenic cells) – both critical factors that influence treatment response.
“FMISO provides information on hypoxia-related radioresistance, but tumour control also strongly depends on the number of clonogenic cells, which is not captured by hypoxia imaging alone,” Lazzeroni explains. “To our knowledge, this is the first study to combine FMISO and FDG PET within a unified radiobiological framework to guide biologically individualized dose escalation.”
For each patient, the researchers used FMISO uptake to derive voxel-level maps of oxygen partial pressure (pO2) in the tumour and define a hypoxic target volume (HTV). The FDG scans were used to estimate spatial variations in clonogenic tumour cell density, which directly influence the dose required to realize a given tumour control probability (TCP).
Based on individual tumour profiles, the team used automated planning to create volumetric-modulated arc therapy plans comprising 35 fractions with an integrated boost. The plans delivered escalated dose to radioresistant subvolumes (the HTV), while maintaining clinically acceptable sparing of organs-at-risk. The PET datasets were used to calculate the prescribed dose required to achieve a TCP of 95%.
Meeting clinical feasibility
The automated planning pipeline achieved high-quality treatment plans for all patients without manual intervention. The average EQD2 (the dose delivered in 2 Gy fractions that’s biologically equivalent to the total dose) to the HTV was boosted to 81±3.2 Gy, and all 28 plans met the clinical constraints for protecting the brainstem, spinal cord and mandible. Parotid glands were spared in 75% of cases, with the remainder being glands that overlapped the target.
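For readers unfamiliar with EQD2, the conversion follows from the linear-quadratic model: EQD2 = D(d + α/β)/(2 Gy + α/β), where D is the total dose, d the dose per fraction and α/β a tissue-dependent parameter. A quick sketch – α/β = 10 Gy is a common assumption for tumours, not a value quoted in the study:

```python
def eqd2(total_dose, dose_per_fraction, alpha_beta=10.0):
    """Equivalent dose in 2 Gy fractions under the linear-quadratic model.
    alpha_beta (in Gy) is tissue-dependent; ~10 Gy is typical for tumours."""
    return total_dose * (dose_per_fraction + alpha_beta) / (2.0 + alpha_beta)

# A 2 Gy-per-fraction course is its own EQD2 by construction...
print(eqd2(70.0, 2.0))        # 70.0
# ...while a hypothetical 35 x 2.35 Gy boost is biologically "hotter"
print(eqd2(35 * 2.35, 2.35))  # ≈ 84.6 Gy
```

This is why boosted schedules are compared in EQD2 rather than raw physical dose: a modest increase in dose per fraction raises the biologically equivalent dose by more than the physical total suggests.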
Lazzeroni and colleagues suggest that these results confirm the overall clinical feasibility of their personalized dose-escalation strategy and demonstrate how biology-guided prescriptions could be integrated into existing treatment planning workflows.
The researchers also performed a radiobiologic evaluation of the treatment plans to see whether the optimized dose distribution achieved the desired target control. For this, they calculated the TCP based on the planned dose distribution, the PET-derived radioresistance data and clonogenic cell density maps. For all patients, the plans achieved model-predicted TCP values exceeding 90%, a notable improvement on tumour control rates reported in the clinical literature for HNSCC, which are typically around 60%.
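A TCP calculation of this kind is typically built on a Poisson model: each voxel contributes an expected number of surviving clonogens, and TCP is the probability that none survive. A hedged sketch with assumed, uniform radiosensitivity values – the study’s actual model uses PET-derived, patient-specific radioresistance and cell-density maps:

```python
import math

ALPHA = 0.3        # 1/Gy, assumed radiosensitivity (illustrative only)
ALPHA_BETA = 10.0  # Gy, assumed alpha/beta ratio for tumour

def surviving_fraction(total_dose, dose_per_fraction):
    """Linear-quadratic clonogen survival after fractionated irradiation."""
    n_fractions = total_dose / dose_per_fraction
    per_fraction = (ALPHA * dose_per_fraction
                    + (ALPHA / ALPHA_BETA) * dose_per_fraction ** 2)
    return math.exp(-n_fractions * per_fraction)

def tcp(voxel_doses, voxel_clonogens, dose_per_fraction=2.0):
    """Poisson tumour control probability: the chance that no clonogen in
    any voxel survives the planned dose distribution."""
    expected_survivors = sum(
        n * surviving_fraction(d, dose_per_fraction)
        for d, n in zip(voxel_doses, voxel_clonogens))
    return math.exp(-expected_survivors)

# One 'voxel' with 1e7 clonogens: escalating dose pushes TCP towards 1
print(tcp([60.0], [1.0e7]), tcp([70.0], [1.0e7]))
```

The voxel-by-voxel sum is the reason spatial maps of clonogen density matter: a small radioresistant or cell-dense subvolume can dominate the expected number of survivors and drag down the whole-tumour TCP.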
The proposed strategy is based on pre-treatment PET images, but biological changes during treatment – including temporal and spatial variations in tumour hypoxia – could impact its effectiveness. In future, the researchers suggest that longitudinal imaging, such as PET/CT scans at weeks 3 and 5, could be used to monitor evolving tumour biology and inform adaptive replanning. This is particularly relevant in HNSCC, where tumour shrinkage and reoxygenation are common, and where updated imaging is required to determine whether dose escalation or de-escalation is appropriate to maintain tumour control and optimize normal tissue sparing.
The researchers point out that as the biology-guided dose prescriptions were planned but not delivered, prospective trials will be required to assess whether the observed dosimetric and biologic gains translate to improved patient outcomes.
“This study was designed as a feasibility and modelling investigation, and the next step is prospective clinical validation,” Lazzeroni tells Physics World. “Based on the promising results of this approach, prospective clinical trials are currently in the planning phase within the group led by Anca-L Grosu in Germany. These trials will focus on integrating longitudinal PET imaging during treatment to enable biologically adaptive radiotherapy.”