Proton arc therapy: do we need it; can we deliver it?

Radiotherapy plays an essential role in the management of cancer, with roughly half of all cancer patients receiving radiation as part of their treatment. The majority of such treatments are delivered using external beams of X-rays, targeted at the tumour to damage or kill cancerous cells. Another approach is particle therapy, in which tumours are irradiated with beams of protons or carbon ions. While less prevalent than photon-based radiotherapy, the number of proton-therapy centres worldwide has increased at pace in recent years.

The technologies used to deliver photon-based treatments have progressed a great deal over the last few decades, evolving from 3D conformal radiotherapy to intensity-modulated radiotherapy (IMRT) to volumetric modulated arc therapy (VMAT), in which the linear accelerator delivering the therapeutic X-rays rotates around the patient during treatment (see box below). Each of these steps helped to improve dose coverage of the tumour being targeted, while reducing unwanted radiation being delivered to normal tissues within the body.

The last few decades have also seen proton therapy transition from research laboratories into the clinical setting. During this time, the dose delivery technique has progressed from passive scattering to intensity-modulated proton therapy (IMPT) using scanned proton pencil beams. So will protons follow the lead of photons and introduce rotational delivery – and is proton arc therapy (PAT) the next logical step?

That was the question under consideration at the recent ESTRO 2021 congress, where a dedicated conference session examined the latest developments in PAT and the technique’s potential to improve proton therapy for cancer patients.

Radiation-therapy techniques

Three-dimensional conformal radiotherapy, 3D CRT

3D CRT is a photon-based treatment that uses 3D medical images to precisely define the tumour target. The radiation dose is then shaped to match the shape of the tumour by delivering X-ray beams from many directions.

Intensity-modulated radiotherapy, IMRT

IMRT is a type of conformal radiotherapy in which not only the shape, but also the intensity profile, of each treatment beam is varied to precisely target the tumour.

Volumetric modulated arc therapy, VMAT

With VMAT, the linear accelerator that delivers the radiation beam rotates around the patient during treatment. The shape and intensity of the X-ray beam are continuously controlled as it moves around the body.

Stereotactic body radiotherapy, SBRT

Stereotactic radiotherapy uses high radiation doses to treat tumours in the brain and central nervous system in one or just a few treatments. SBRT is similar but refers to treatment of tumours elsewhere in the body.

Intensity-modulated proton therapy, IMPT

IMPT is a proton therapy technique in which scanned proton pencil beams of variable energy and intensity are used to precisely paint the radiation dose onto a tumour.

Proton arc therapy, PAT

With PAT, the proton beams are delivered continuously as the gantry rotates around the patient. During this rotation, the beam energy and intensity are adjusted to match the dose to the target volume.

The promise of PAT

With PAT, protons are delivered continuously as the gantry – the large circular structure containing the equipment that delivers the protons to the patient – rotates around the patient. PAT was first described in the literature back in 1997, with the idea that it could improve dose distribution, increase treatment robustness and reduce delivery time compared with IMPT. At that time, however, only 16 proton-therapy centres were in operation, and only one had a gantry. PAT appeared too technically challenging to pursue and was soon dropped.

Now, however, there are more than 100 proton-therapy centres worldwide, most with a gantry and using pencil-beam scanning. “The potential for integrating PAT into the clinic is huge right now,” said Laura Toussaint from the Danish Centre for Particle Therapy. Toussaint shared the results of several published treatment-planning studies comparing PAT with other proton- and photon-based treatments. The first example was a whole-brain treatment that aimed to avoid irradiating the hippocampus, in order to spare the patient’s memory function (Acta Oncol. 58 483). While VMAT, IMPT and PAT all delivered equivalent target dose distributions, PAT reduced the dose to the hippocampus, as well as to the cochlea.

Toussaint also presented a study of head-and-neck treatments, where robustness can be a particular challenge (Radiat. Oncol. 15 21). Here, PAT offered improved target dose conformity and better sparing of nearby vital organs, known as organs-at-risk (OARs), compared with IMPT. The PAT plan was also more robust to changes in patient set-up and anatomy.

She also described a comparison of lung stereotactic body radiotherapy (SBRT) delivered via PAT, IMPT and VMAT. Again, PAT delivered the lowest dose to OARs, and reduced the integral dose. PAT could also mitigate interplay effects better than IMPT. Studies of brain, breast, prostate and spine cases all reported similar findings.

“Overall, from these treatment planning studies, when comparing proton arc to IMPT, we saw no strong evidence of improved target dose conformity or homogeneity,” Toussaint explained. “However, there was a large reduction in the body integral dose and consistently better OAR sparing. There was also potential for improved plan robustness by mitigating range uncertainty and interplay effects.”

PAT could also reduce treatment delivery time, Toussaint explained, citing a publication showing that PAT was 50% faster than VMAT for a spine SBRT case. She noted that the field is moving quickly, and has already seen a paper published on spot-scanning hadron arc therapy using carbon and helium ions (Adv. Radiat. Oncol. 6 100661).

“Are the demonstrated gains in physical dose and the estimated clinical benefits enough to motivate the technological and engineering efforts needed for implementation of proton arc in clinical practice?” Toussaint asked. “This is up for debate.”

Creating a PAT prototype

Xuanfeng Ding from William Beaumont Hospital in Michigan certainly thinks that PAT offers plenty of clinical potential, and is part of a team working on an ongoing project to create a clinical PAT system.

Currently, there are three main challenges when it comes to proton therapy: range and set-up uncertainties; large spot size; and prolonged treatment times. These can be addressed using approaches such as robust plan optimization, dynamic beam collimation and increased dose delivery efficiency, respectively. “But PAT can address these three together,” said Ding.

PAT delivery is complex, however, with treatment plans comprising numerous energy layers (as the energy of a proton beam defines the precise depth at which it deposits a dose within the target). Ten years ago, the energy-layer switching time of first-generation pencil-beam scanning systems was up to 6 s, making PAT delivery significantly longer than IMPT and limiting its clinical application.
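The scale of the problem can be sketched with a back-of-envelope calculation. The layer count, spot numbers and timings below are illustrative assumptions, not specifications of any particular machine:

```python
# Rough estimate of how the energy-layer switching time dominates PAT delivery.
# All numbers here are illustrative assumptions, not machine specifications.

def delivery_time(n_layers, switch_time_s, spot_time_s=0.01, spots_per_layer=50):
    """Total beam-on time: per-layer switching plus spot irradiation."""
    return n_layers * (switch_time_s + spots_per_layer * spot_time_s)

# A hypothetical arc with one energy layer per 2-degree control point:
old = delivery_time(n_layers=180, switch_time_s=6.0)  # first-generation system
new = delivery_time(n_layers=180, switch_time_s=0.5)  # new-generation cyclotron

print(f"old: {old / 60:.1f} min, new: {new / 60:.1f} min")
```

With a 6 s switching time the arc is dominated by dead time between layers; at 0.5 s the same arc becomes an order of magnitude quicker, which is the shift Ding describes below.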

“But things change. The majority of new-generation cyclotron-based scanning systems can reach an energy-layer switching time of 0.5 s, which makes proton arc comparable or even faster than two-field IMPT,” said Ding. “This is good timing to look seriously at how to implement this in a clinical setting.”

Ding described the Beaumont project, which is developing spot-scanning PAT, also known as SPArc. It involves moving the patient couch and gantry, while continuously delivering the proton beam; as well as switching energy layers and scanning the beam during the gantry rotation.

“The goal is to make proton therapy more efficient, more robust and also with better dose conformality. That’s the final clinical endpoints that we’re looking at that will benefit society,” said Ding. “The challenge is it’s a huge gantry, so how can we rotate this gantry precisely while delivering the proton beam with millimetre accuracy?”

The team, now working with Belgian medical technology company IBA, has already demonstrated SPArc delivery on a clinical ProteusOne system (see photo at top of article). The researchers confirmed that during gantry rotation, the central spot maintains a constant size and 1 mm position accuracy, at energies from 70 to 227.7 MeV (Radiother. Oncol. 137 130). The collaboration has also developed an iterative treatment planning optimization algorithm, which incorporates machine-specific dose delivery sequences and gantry motion. Indeed, their work established the first dynamic proton arc model that combines mechanical and irradiation sequence models.

Ding also demonstrated that SPArc can be more efficient than IMPT. He compared a SPArc plan with a three-field IMPT plan for a brain tumour case. Both exhibited the same plan quality, but the SPArc treatment took just 4 minutes, while IMPT required 11 minutes. The gain is disease-site dependent, he noted, and treatment delivery time is just one part of the process. But Ding estimates that, when patient set-up time is included, SPArc could provide an average 20% increase in patient throughput for a single-room proton system, and potentially more for a multi-room proton-therapy centre.
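Ding’s roughly 20% throughput figure can be reproduced in outline. Only the 4- and 11-minute delivery times come from the talk; the set-up time below is an illustrative assumption:

```python
# Back-of-envelope throughput comparison for a single-room proton system.
# The set-up time is an illustrative assumption; delivery times are from the
# brain-tumour example quoted in the talk.

def patients_per_hour(setup_min, delivery_min):
    """Patients treated per hour in a single room, one patient at a time."""
    return 60.0 / (setup_min + delivery_min)

impt  = patients_per_hour(setup_min=30, delivery_min=11)  # three-field IMPT
sparc = patients_per_hour(setup_min=30, delivery_min=4)   # SPArc

gain = sparc / impt - 1
print(f"throughput gain: {gain:.0%}")  # lands near Ding's ~20% estimate
```

The gain depends strongly on the assumed set-up time: the shorter the set-up, the larger the relative benefit of faster delivery.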

Looking ahead, Ding’s group is collaborating with Université catholique de Louvain to develop a linear energy transfer (LET)-optimized SPArc algorithm. The team also plans to investigate hypofractionated treatments, in which the prescribed dose is delivered in one or just a few fractions, as well as investigate ways to mitigate interplay effects between the motion of a tumour and motion of the proton beam.

“I think we’re on the right track to proton arc therapy, eventually clinical users will adopt this technology for a variety of disease sites,” Ding concluded. “More and more vendors and clinical users will play a key effect because different people will invent different innovations for this technology, moving together to implement proton arc into the clinical setting to benefit our patients.”

Biological optimization

In the final talk of the session, Alejandro Carabe from Hampton University Proton Therapy Institute, Virginia, took a look at the third piece of the puzzle. With PAT seemingly technically feasible and dosimetrically advantageous, could it also allow us to control the biological effectiveness of a proton beam and increase the therapeutic index?

The goal of any radiation treatment is to maximize the probability of destroying tumour cells (tumour control probability, TCP), while minimizing the risk of damage to non-cancerous tissue (normal tissue complication probability, NTCP). The therapeutic index represents the ratio between TCP and NTCP, and the higher its value, the more beneficial a radiation treatment.
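As a toy illustration of this trade-off, TCP and NTCP are often modelled as sigmoid functions of dose; the parameters below are invented purely for illustration:

```python
import math

# Toy sigmoid dose-response curves (purely illustrative parameters) showing
# how the therapeutic index TCP/NTCP grows as the dose to normal tissue falls.

def sigmoid(dose_gy, d50_gy, slope):
    """Logistic dose-response: probability rises steeply around d50_gy."""
    return 1.0 / (1.0 + math.exp(-slope * (dose_gy - d50_gy)))

def therapeutic_index(tumour_dose_gy, normal_dose_gy):
    tcp  = sigmoid(tumour_dose_gy, d50_gy=50, slope=0.3)  # tumour control
    ntcp = sigmoid(normal_dose_gy, d50_gy=60, slope=0.3)  # complication risk
    return tcp / ntcp

# Same tumour dose; sparing the normal tissue raises the index:
for normal_dose in (50, 40, 30):
    print(normal_dose, therapeutic_index(60, normal_dose))
```

The point of the sketch is simply that, for a fixed tumour dose, every gray kept out of the normal tissue multiplies the therapeutic index.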

Looking first at a single photon beam, the high entrance dose will inevitably damage normal tissue more. To reduce NTCP, and thus increase the therapeutic index, rotational therapy can be used to deliver many fields and conform the dose to the target volume. With a single proton beam, however, the entrance dose is already inherently low, which should result in a decreased NTCP with respect to TCP, and a higher therapeutic index than that of the photon beam.

Carabe described a case study comparing VMAT with two- and three-field IMPT plans for oesophageal cancer treatments. While the mean dose to the target was not significantly different for the three approaches, the IMPT plans clearly delivered less mean dose to the OARs.

The study demonstrated that – in addition to a single proton field offering a higher therapeutic index than a single photon field – a low number of proton fields can outperform VMAT, the most conformal photon therapy. With this in mind, is there even any need to incorporate rotation into proton therapy to increase the therapeutic index?

A review of published studies comparing physical dose distributions between PAT and IMPT for lung, brain, head-and-neck and prostate cancer treatments showed no evidence that PAT improved conformity or uniformity compared with IMPT – although it did reduce normal tissue integral dose. “Therefore, we cannot categorically say that PAT significantly improves therapeutic index over IMPT, at least when just looking at dose distribution,” said Carabe.

But, he explained, there’s more to consider than purely dosimetric parameters. Even when a uniform dose is delivered to the target, the linear energy transfer increases towards the distal edge of the beam, making that dose more biologically damaging. If proton plans are optimized only for physical dose, not LET, they often place high-LET dose in the normal tissue next to the target. PAT could provide the control needed to deposit the high-LET dose within the target volume instead.

Carabe and colleagues performed an in vitro experiment to examine whether PAT could be used to control the LET distribution in cultured cells (Phys. Med. Biol. 65 165002). They found that PAT plans generated almost double the LET of three-field IMPT plans, and killed more cells than IMPT at the same dose. They also analysed a prostate cancer case and found that using PAT to concentrate high LET in the target volume enabled a reduction of the delivered dose without reducing the clinical effect.

figure 1

In another piece of research, Carabe compared IMPT with monoenergetic PAT (figure 1) to treat a clinical brain tumour case (Phys. Med. Biol. 65 165006). “With monoenergetic PAT, we were able to avoid critical areas such as the brain stem and produce very conformal plans compared to IMPT plans,” he said, adding that the technique “could concentrate the LET in the area that we wanted and take it away from the areas where it could be very damaging”.

“PAT allows LET painting, which can be utilized to either decrease the risk of radiation-induced toxicity, or increase the biological effectiveness of the treatment, or even reduce the prescribed dose or number of fractions,” said Carabe. “All of this will allow us to have much better control of the effects of a treatment and increase the therapeutic index.”

He emphasized that introduction of this new modality should not just rely on a pure physics argument, but on its biological impact. “The most important thing to remember is that delivery of PAT should not be justified based on increased conformity,” Carabe concluded. “It should be justified based on enhanced biological impact.”

Magnetoelastic material sustainably powers health monitors using body movement

The future of bioelectronics – including wearables, implantable devices and smart technologies – hinges on the ability to sustainably power devices. A number of approaches for converting biomechanical energy into electricity have been introduced, including piezoelectrics and triboelectrics, which function by deriving charge from compressing or contacting materials. Unfortunately, these techniques’ suboptimal electronic properties and vulnerability to ambient humidity limit their effectiveness.

The answer could lie in magnetoelasticity, in which a material’s magnetic properties change under mechanical stress. This effect is usually observed in rigid metal alloys, which have mechanical moduli significantly higher than those of human tissue (they are very stiff). As a result, such materials are unsuitable for biomechanical energy generation. Researchers led by Jun Chen at the University of California, Los Angeles’ Samueli School of Engineering have overcome this difficulty by formulating a new soft magnetoelastic polymer blend. They share their results in Nature Materials.

Micromagnets dispersed in a polymer

The team created the porous, soft magnetoelastic material by dispersing micromagnets in a silicone polymer matrix. These micromagnets give the system its inherent magnetic behaviour, such that when the elastomer is compressed, the magnetic field changes. The authors hypothesize that this change in magnetic flux density is caused by a realignment of “chains” of micromagnets under mechanical deformation. They were also able to model this effect theoretically.

In order to harness the energy induced by the magnetoelastic effect, the researchers positioned a magnetic inductor on top of the silicone layer. This inductor converts changes in magnetic fields into an electrical current. Therefore, magnetic deformations can be turned into electrical energy to be used according to the desired application. The authors call this electrically responsive system a “soft magnetoelastic generator”.
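The pickup stage works by electromagnetic induction. A minimal Faraday’s-law sketch, with a hypothetical coil and flux change (none of these numbers come from the paper):

```python
# Faraday's-law sketch of the pickup stage: pressing the magnetoelastic layer
# changes the flux through the inductor, inducing a voltage. The coil turns,
# flux change and press duration are all illustrative assumptions.

def induced_emf_v(n_turns, delta_flux_wb, delta_t_s):
    """|EMF| = N * dPhi/dt for an n_turns coil and a linear flux ramp."""
    return n_turns * delta_flux_wb / delta_t_s

# A hypothetical 0.2 s finger press changing the flux by 1 microweber
# through a 200-turn coil:
emf = induced_emf_v(n_turns=200, delta_flux_wb=1e-6, delta_t_s=0.2)
print(f"{emf * 1e3:.1f} mV")
```

Faster or larger deformations (bigger dPhi/dt) and more coil turns both raise the output voltage, which is why tapping the device works well as an excitation.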

Monitoring human health

Many applications for wearable power generation could exploit the electrical coupling provided by a magnetic inductor. For example, with gentle hand tapping, the authors were able to charge capacitors of varying size within seconds. Another magnetoelastic generator was able to drive a commercial wearable thermometer to monitor body temperature.

In addition to powering wearables, the generator could also be used to sustainably power implanted bioelectronic devices, which remains difficult with current technology. Acoustic waves and ultrasound can be utilized to transfer energy to medical implants through tissue. To demonstrate this effect, the researchers implanted a magnetoelastic generator in porcine tissue and excited the tissue using ultrasound. The generator was able to output a power of over 30 µW, a value comparable to that used by bioelectronics such as pacemakers and neurostimulators.

Magnetoelastic generator applications

Lastly, the researchers tested the device’s capacity to act as a cardiovascular monitor. They found that a generator worn on the wrist could detect a human pulse, even when wet. In this scenario, the natural arterial pulse deformed the magnetoelastic generator, inducing a current in the inductor. This sensor was even able to function underwater and through sweat, as the material is inherently waterproof.

Moving forward, Chen’s team aims to further augment the electrical output of the generators by optimizing the device’s design. This work opens a new avenue for practical human-body-centric energy, sensing and therapeutic applications.

NASA hit by resignation over its handling of investigation into telescope renaming

A member of NASA’s Astrophysics Advisory Committee has resigned over the agency’s handling of an investigation into whether the James Webb Space Telescope (JWST) should be renamed. The probe was instigated in the wake of concerns that Webb – a former NASA administrator – had been involved in mistreating gay and lesbian people in the 1950s and 1960s. NASA announced in September, however, that it would not be changing the name of the JWST, revealing the news via a single-sentence statement that was released only to certain media outlets.

That decision angered some astronomers, particularly because the agency had said it would be fully transparent in releasing the results of the investigation. In response, Lucianne Walkowicz from the Adler Planetarium in Chicago, who is also a co-founder of the JustSpace Alliance, has now announced they are resigning with immediate effect from the committee over the agency’s handling of the affair. The JWST is due to be launched in December.

Regarded as a successor to NASA’s Hubble telescope, the JWST was originally known as the Next Generation Space Telescope. In 2002 the then NASA boss Sean O’Keefe renamed the telescope in honour of Webb, who had served as NASA administrator during the Apollo era. A bureaucrat rather than a scientist, Webb had served in various US government roles since the 1940s.

Earlier this year, however, more than 1200 people signed an open letter calling on NASA to rename the JWST, claiming that Webb was involved in anti-LGBT+ activities before taking up the role at NASA. The letter was initiated by Walkowicz, together with Chanda Prescod-Weinstein from the University of New Hampshire, Brian Nord from Fermilab and the University of Chicago, and Sarah Tuttle from the University of Washington.

“Webb served as the undersecretary of state during the purge of queer people from government service known as the ‘Lavender Scare’,” the authors stated. They added that archival evidence “clearly indicated that Webb was in high-level conversations regarding the creation of this policy and resulting actions”.

The authors also noted that Webb was in charge of NASA when Clifford Norton – a budget analyst at the agency – was sacked in 1963 on suspicion of homosexuality. “We, the future users of NASA’s next-generation space telescope and those who will inherit its legacy, demand that this telescope be given a name worthy of its remarkable discoveries, a name that stands for a future in which we are all free,” the authors wrote.

Lack of transparency

In June, NASA said it would begin an internal investigation, which would examine historical documents and interview historians who had studied Webb. While officials at NASA said the agency would be “transparent” with the decision, on 27 September NASA administrator Bill Nelson issued a single-sentence statement to selected journalists stating: “We have found no evidence at this time that warrants changing the name of the James Webb Space Telescope.”

The news angered astronomers. “What I hear as a queer scientist and a member of multiple NASA collaborations is, ‘The homophobic terror that Clifford L Norton was subjected to doesn’t matter’,” noted Prescod-Weinstein on Twitter. “I find NASA’s single sentence statement about the evidence to be gaslighting, constituted by the sin of omission, and most troublingly, unsupported and thus unscientific. They do not make the case for their claim in light of the publicly available evidence.”

The news also surprised many who sit on NASA advisory committees, who said they only learned of the news from press reports. In an open letter announcing their resignation from the 12-strong committee, Walkowicz criticized NASA’s lack of transparency and called NASA’s response “flippant” and “pathetic”.

“After the past year and a half we’ve had with not only the pandemic, but also national grappling with issues of racism and human rights, it boggles the mind that NASA has so little insight into its own participation in systematic oppression,” Walkowicz writes. “I’m not the first and won’t be the last driven out of a NASA space, where evidently straight people’s opinions are valued and taken more seriously than queer people’s experiences.”

Prescod-Weinstein adds that the logic behind naming the telescope after Webb is that he is responsible for NASA’s successes during the Apollo era. “At the same time, NASA says he is not responsible for the homophobia that occurred at NASA,” says Prescod-Weinstein. “How is he responsible for all of NASA’s successes during his time as administrator but none of its failures? Real people were harmed by those failures. That matters.”

Cold atmospheric plasma eradicates residual cancer cells

Chemotherapy and radiotherapy are standard treatments used after cancer surgery to destroy any residual tumour cells within the surgical cavity or circulating in the body. Such therapies, however, can be associated with adverse effects. Cold atmospheric plasma could provide an alternative anti-cancer tool and is under investigation as a potential postsurgical treatment.

A team at the University of California, Los Angeles (UCLA), working with researchers from China and Canada, has developed a portable, air-fed cold atmospheric plasma (aCAP) device for such applications. In a proof-of-concept study, described in Science Advances, the aCAP device inhibited tumour growth and improved survival in mice following cancer surgery.

One major advantage of the team’s aCAP device is that it uses ambient air as the source gas to generate the cold plasma discharge, in contrast to conventional CAP systems that require bulky pressurized gas supplies. It can also be powered by batteries. This greatly reduces cost and complexity, and increases the feasibility of use in the operating suite as well as remote locations throughout the world.

Low-energy lightning

Plasma, an ionized gas, is the primary state of matter in stars, and comprises over 99% of the visible universe. Plasma made from air consists of many reactive species, radicals, electrons and photons. Lightning, a visible form of plasma, is a naturally occurring electrostatic discharge between two electrically charged regions that produces a giant arc of electricity, with gigajoules of energy, in the ambient air.

Cold plasma

Led by co-principal investigators Richard Wirz, director of the UCLA Plasma and Space Propulsion Laboratory, and Zhen Gu of the Zhejiang Laboratory of Systems and Precision Medicine at Zhejiang University Medical Center, the researchers designed the portable aCAP device around the concept of reducing the energy regime of lightning from gigajoules to joules. They achieved this by adjusting the voltages and distances between the device’s electrodes. The small arcs between these electrodes ionize ambient air that is fed through the device, resulting in a near-room-temperature jet of cold atmospheric plasma.

The team hypothesized that the local application of aCAP on residual tumour cells in a surgical cavity would induce a high level of reactive oxygen species (ROS) and reactive nitrogen species (RNS) in the tumour microenvironment. ROS and RNS are known to induce cancer immunogenic cell death and release tumour-associated antigens in situ, evoking effective anti-tumour immunity.

Cancer cell kill

The researchers first tested their prototype aCAP device in vitro with breast cancer and melanoma cells. They detected increased concentrations of ROS and RNS in the cells and the culture media, which caused potent tumour-killing effects.

Next, they used the device to treat the surgical cavity following resection of 400 mm3 breast tumours in mice. After surgery and application of aCAP (for 1, 2, 3 or 4 min) to residual tumour cells, they detected increased levels of calreticulin, an indicator of immunogenic cell death. The researchers used a thermal camera to monitor temperature in the aCAP-treated areas, reporting that no significant temperature changes occurred in and adjacent to the treated tissue.

To mimic residual microtumours, the researchers deliberately left 2–5% residual tumour after surgery, and monitored tumour regrowth. The mice that had surgery plus aCAP treatment showed significantly improved control of tumour regrowth compared with those that only underwent surgical excision.

Longer aCAP treatment correlated with better outcomes: over 40% of the mice survived for at least 60 days when treated with 4 min of aCAP. Co-lead authors Guojun Chen and Zhitong Chen suggest that extending the duration of aCAP treatment or performing repeated treatments could further enhance its therapeutic efficacy.

“Our portable aCAP device for postsurgical cancer treatment simplifies CAP equipment configurations and more broadly facilitates its applications in medicine. We anticipate that this treatment approach is also applicable to other types of solid cancer,” note the authors. They add that aCAP treatment could possibly be combined with cancer immunotherapies, such as immune checkpoint blockade, to further improve therapeutic outcomes.

“We are planning to improve the form, function and ease-of-use of the device, while continuing with further studies to determine modes of operation that are most effective with mouse models and then larger animals,” says Wirz. “This will be a joint collaboration by our team at UCLA, Zhejiang University, McGill University, and the National Innovation Center for Advanced Medical Devices in Shenzhen. If successful, we will want to advance to human studies.”

Laser beams become visible in vacuum

Laser beams are normally invisible when they pass through a vacuum, but physicists at the University of Bonn, Germany, have found a way to make them reveal themselves. This feat, which they accomplished using a technique called Ramsey imaging, should make it easier to align lasers with the precision needed to trap and manipulate individual atoms – a crucial step for atom-based quantum computing and other quantum technologies.

Optical traps use highly focused (often criss-crossed) laser beams to generate one or more dips, or “pockets”, in potential energy where individual particles can be held in place. Experimenters can move these pockets back and forth at will, thereby transporting the particles to specific locations in space.

As the number of particles in the same location increases, they start to interact with each other. “To control this process, all the pockets must have the same shape and depth,” explains Gautam Ramola, a PhD student at Bonn and the lead author of a study on the new technique. For that to happen, he adds, the trapping laser beams must overlap with micrometre precision.
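The sensitivity of pocket depth to beam overlap can be seen in a one-dimensional toy model. The beam waists and offsets below are invented for illustration, and a red-detuned trap is assumed (potential proportional to minus the intensity):

```python
import math

# Illustrative sketch (not the Bonn set-up): two crossed Gaussian beams form an
# intensity maximum, i.e. a potential-energy "pocket" for a red-detuned trap.
# The depth of that pocket depends sharply on how well the beams overlap.

def gaussian_beam(x_um, centre_um, waist_um):
    """1D Gaussian intensity profile of a single beam."""
    return math.exp(-2 * (x_um - centre_um) ** 2 / waist_um ** 2)

def trap_depth(x_um, offset_um):
    """Potential (arbitrary units) from two beams separated by offset_um."""
    intensity = (gaussian_beam(x_um, -offset_um / 2, 5.0)
                 + gaussian_beam(x_um, offset_um / 2, 5.0))
    return -intensity  # red-detuned: potential ~ minus the intensity

# Bringing the beams into overlap deepens the pocket at the crossing point:
print(trap_depth(0.0, offset_um=8.0), trap_depth(0.0, offset_um=2.0))
```

A few microns of misalignment visibly shallows the pocket in this model, which is why micrometre-level overlap matters for keeping all pockets the same shape and depth.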

Highly homogeneous optical traps are especially important for atom-based technologies such as optical lattice clocks, trapped atom interferometers, quantum computing and quantum simulators. However, because these technologies operate under vacuum to preserve the atoms’ delicate quantum states, few other particles are present to scatter or reflect the laser light and thus reveal information about the beams’ intensity profile.

Ramsey phase tracking

Ramola, team leader Andrea Alberti and colleagues overcame this problem by using the atoms themselves to detect how the beams propagated. This technique, dubbed Ramsey phase tracking, works by probing the atoms’ hyperfine splitting – that is, the shift in an atom’s energy levels that occurs due to interactions between the magnetic moment of its nucleus and the orbital motion of its electrons. The Ramsey signal measures how this hyperfine splitting changes in the presence of elliptically polarized laser beams.

“Each atom effectively acts as a small sensor that records the intensity of the beam,” Alberti explains. “By examining thousands of atoms at different locations, we can determine the location of the beam to within a thousandth of a millimetre.”
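The idea of using atoms as an array of intensity sensors can be sketched as follows. The lattice geometry, beam parameters and simple centroid estimator are illustrative assumptions, not the team’s actual analysis:

```python
import math

# Sketch of the beam-mapping idea: each atom samples the local beam intensity,
# and the beam centre is recovered from the ensemble of per-atom signals.
# Positions, widths and the centroid estimator are illustrative assumptions.

def beam_intensity(x_um, centre_um, waist_um):
    """Gaussian beam profile along one axis (arbitrary units)."""
    return math.exp(-2 * (x_um - centre_um) ** 2 / waist_um ** 2)

def estimate_centre(positions_um, signals):
    """Intensity-weighted centroid of the per-atom signals."""
    return sum(x * s for x, s in zip(positions_um, signals)) / sum(signals)

# Atoms pinned at known lattice sites, each reporting the local intensity:
sites = [i * 0.5 for i in range(-40, 41)]  # microns
signal = [beam_intensity(x, centre_um=3.2, waist_um=10.0) for x in sites]
print(f"recovered centre: {estimate_centre(sites, signal):.2f} um")
```

With many atoms contributing, the averaged estimate pins down the beam position far more finely than the spacing between individual sensors.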

The technique, which the researchers describe in Physical Review Applied, allowed the team to adjust four laser beams so that they interacted at exactly the position required. “Such a manoeuvre would normally take several weeks using conventional techniques, with no real guarantee of the final result,” Alberti says. “We only needed about one day to achieve this.”

In their work, the team made measurements on optical traps for caesium atoms, but the technique will also work with other alkali atoms, as well as atoms from certain other groups in the periodic table, such as the magnetic lanthanides. It could also be applied to a range of optical trap geometries, including “flat” and “hollow” traps.

Mussels mix proteins and metals to create sticky threads

Mussels are famous for their ability to stick to a multitude of surfaces, and now researchers in Canada and Germany have identified the molecular mechanisms that mussels use to produce robust adhesive threads. Using a range of imaging and spectroscopy techniques, a team led by Tobias Priemel at McGill University in Montreal found that the molluscs release fluid proteins into a network of microchannels in their feet, in coordination with separately stored metal ions.

To anchor themselves to their seashore habitats, mussels produce adhesive threads, called byssus. Once cured, the protein fibres of these structures become integrated with metal ions via mechanically stable cross-links – which are coordinated by an amino acid called DOPA.

By creating similar artificial structures, researchers aim to develop a new generation of bio-inspired polymers, with applications including self-healing materials, advanced coatings and underwater adhesives. Currently, however, the mechanisms employed by mussels to construct load-bearing cross-links within their byssus threads are largely unknown.

Microscopy and spectroscopy

To learn more, Priemel’s team first investigated the byssus formation process using a combination of optical, electron, and X-ray fluorescence microscopy; then examined the chemical composition of the fibres using Raman spectroscopy.

They discovered that instead of drawing in metal ions from surrounding seawater, as had been previously thought, mussels concentrate and store ions of iron and vanadium within specialized storage particles. These particles are themselves contained within cells and held together by an as-yet unknown biomolecule. Within a separate stockpile of vesicles, mussels also carry concentrated, DOPA-rich proteins in a fluid form.

The team showed that, to produce their byssus threads, mussels secrete proteins from these vesicles into a complex network of tiny, interconnected microchannels in their feet. In coordination with this process, metal storage particles are also delivered into the microchannel network, where they release their metal ions – possibly through a pH-driven process.

As they diffuse and spread through the dense protein fluid, the ions are then coordinated by DOPA molecules to form cross-links. In the process, the mixture coalesces to form mechanically stable threads, featuring strong protein-metal bonds.

This mechanism allowed Priemel’s team to explain, for the first time, how mussels can adhere strongly to almost any solid surface, even in seawater. With a deeper knowledge of the process, researchers could find it far easier to replicate byssus filaments that are just as strong as those found in nature. Through future studies, the team also hopes to shed new light on why mussels use vanadium ions in particular, which are exceptionally rare in nature.

The study is described in Science.

Why nuclear energy must be part of ‘net zero’ climate targets

illustration of Earth run by nuclear power

According to a poll carried out in 2020 by the Institution of Mechanical Engineers (IMechE), only a quarter of people aged between 18 and 24 in the UK are aware that nuclear is a low-carbon source of energy. Three-quarters of young people, in contrast, believe that wind and solar are low carbon, and only 61% of the oldest age group polled – 65–74 year olds – know that nuclear falls into the low-carbon category too. Those findings might surprise physicists, who will be aware that the energy density of nuclear fission is so high that a fingertip-sized amount of uranium holds the energy equivalent of some 5000 barrels of oil.
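The oil comparison is easy to sanity-check. Below is a back-of-envelope sketch assuming complete fission of U-235 at roughly 200 MeV per nucleus and about 6.1 GJ per barrel of oil – standard textbook figures, not numbers taken from the article:

```python
# Back-of-envelope check of the uranium-versus-oil comparison.
# Assumed figures: ~200 MeV released per U-235 fission, ~6.1 GJ per barrel of oil.

MEV_TO_J = 1.602e-13                  # joules per MeV
AVOGADRO = 6.022e23                   # atoms per mole
U235_MOLAR_MASS = 0.235               # kg per mole
ENERGY_PER_FISSION = 200 * MEV_TO_J   # ~3.2e-11 J per fission
BARREL_OF_OIL = 6.1e9                 # J (approximate)

# Energy released per kilogram of U-235, assuming every nucleus fissions
energy_per_kg = ENERGY_PER_FISSION * AVOGADRO / U235_MOLAR_MASS  # ~8.2e13 J/kg

# Mass of U-235 carrying the same energy as 5000 barrels of oil
mass_kg = 5000 * BARREL_OF_OIL / energy_per_kg
print(f"{mass_kg:.2f} kg")  # → 0.37 kg
```

The answer comes out at a few hundred grams of fissile material, so the comparison is of the right order of magnitude.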

Despite these benefits, nuclear power tends to suffer from a relatively poor public image, and the widespread failure to recognize it as low carbon may stem from a lack of education. Indeed, it is understandable that people might fear radiation given that it cannot be seen – yet the same can be said of the air that we breathe. And if we assess the whole impact of energy sources per kWh, the energy “deathprint” – the number of people killed per kWh produced – of nuclear is significantly lower than that of most other energy sources.

If you care about the environment and giving more land and resources back to nature – as many young people do – then nuclear is an important option in achieving “net zero”. This refers to the balance between the amount of greenhouse gases produced and the amount removed from the atmosphere – with net zero meaning that there are no net greenhouse emissions from the entire energy system. The UK government has stated that it wants to be net zero by 2050 and says it will support secure, reliable, low-carbon nuclear energy as a commercially deployable technology that can enable rapid decarbonization of heat, power and transport.

Nuclear needs an image makeover so that it is viewed on a par with other energy sources that are widely considered clean and sustainable

The problem for nuclear – as highlighted by the IMechE poll – is that it must reinvent itself for the modern world and re-evaluate its position within a sustainable-energy mix. Nuclear needs an image makeover so that it is viewed on a par with other energy sources that are widely considered clean and sustainable. That will require the nuclear industry to engage openly with society to build better awareness and understanding of the advantages and disadvantages of nuclear energy compared with other energy sources. Only then will it get the support to plug the gap in the decarbonization of our energy consumption. As the International Nuclear Agency points out, that goal cannot be reached using renewables alone, which are intermittent and cannot be easily stored.

Young support

As physicists we are well equipped to drive the conversation on climate change from the problem to solutions – one that nuclear needs to be a part of. We want to ensure a clean, sustainable and abundant future for us and the next generation. The COP26 climate talks that will be held on 1–12 November in Glasgow offer a perfect opportunity to do so. The UK Nuclear Institute will have an unparalleled reach at COP26 through collaboration with the European Nuclear Society – a non-governmental organization – and aims to get people excited about nuclear and ensure it is at the table and considered alongside renewables.

As part of that initiative, in February 2021 the Nuclear Institute’s Young Generation Network (YGN) launched the #NetZeroNeedsNuclear campaign to promote, support and raise awareness of nuclear as a low-carbon energy source. The initiative also sought to influence policy makers involved in COP26, take a scientific approach to energy policy and financing, and foster a sustainable collaboration between nuclear and renewables.

The YGN’s “position paper” called Nuclear for Climate, which summarizes the importance of nuclear, was supported by more than 100 nuclear associations worldwide and has so far been translated into 17 languages. In it, we emphasize that nuclear is not only a low-carbon source of energy, but is also widely available, scalable and deployable. We therefore need to build new nuclear plants – alongside increased renewable-energy capacity – if we are to deliver efficient and affordable clean-energy systems and achieve our net-zero targets.

Nuclear is also capable of supporting the decarbonization of other sectors, such as heating and transport. Indeed, nuclear offers the opportunity to decarbonize all energy, not just electricity. This can be done directly with the output of future “generation IV” reactors or through the steady production of hydrogen as a clean fuel for transport. Nuclear also supports global development by delivering socioeconomic benefits and is strongly aligned with the UN sustainable development goals.

Collaboration between different parts of the energy sector – and beyond – has been a key part of #NetZeroNeedsNuclear, and we must work together to save our planet. We hope that the move to nuclear will be the wildcard of COP26 and, as early-career physicists, we urge our fellow physicists to stand with us and support nuclear – alongside other clean energy sources – as a key part of our journey towards a new sustainable future.

‘Superbubble’ region of star formation was created by supernovae, study suggests

The highest-resolution 3D map of nearby molecular gas clouds in the Milky Way has revealed a structure that is creating new regions of star formation.

Called the Perseus-Taurus Shell, or Per-Tau Shell for short, the region is a “superbubble” in the interstellar medium, blown by the blast waves of multiple supernovae dating back 22 million years. The blast waves have ploughed into interstellar gas, piling it up at the edge of the superbubble where it has formed the well-known Perseus and Taurus molecular clouds, which today are active star-forming regions.

The discovery has been made by a team led by Catherine Zucker and Shmuel Bialy of the Harvard Smithsonian Center for Astrophysics. They used a 3D map of interstellar dust with unprecedented resolution down to 1 parsec (3.26 light-years). This was produced by team member Reimar Leike of the Max Planck Institute for Astrophysics using data from the European Space Agency’s Gaia astrometric mission. The map charts molecular clouds out to a distance of 400 parsecs (1300 light-years) from the Sun.

Their new 3D map shows the general structure of giant molecular clouds, with lower-density outer envelopes and higher-density inner layers. The team hypothesizes that the boundary between these two regions within a cloud represents the transition between atomic neutral gas in the outer envelope, and the cold molecular gas required to form stars in the inner zone.

The Radcliffe Wave

While the map charts a dozen molecular clouds in the Sun’s neighbourhood, it is the Per-Tau Shell that has proved the most intriguing. It has an almost spherical structure that is 508 light-years in diameter. The Taurus Molecular Cloud is located on the side of the shell that is nearest to the Sun (400 light-years away), and the Perseus Molecular Cloud is on the far side of the shell. The Per-Tau Shell is part of an even bigger structure, discovered in 2020 by a team led by Zucker and João Alves of the University of Vienna and the Radcliffe Institute for Advanced Study at Harvard University. This large structure is called the “Radcliffe Wave”.

Spanning 8800 light-years, from the Taurus Molecular Cloud all the way to the Cygnus X star-forming region located 5000 light-years from us, the Radcliffe Wave contains about three million solar masses worth of gas and dust. It is so large that it is undulating in time with the sinusoidal perturbations of the Milky Way’s spiral disc, an effect that could be caused by interactions with dwarf galaxy satellites or large clumps of dark matter.

The Radcliffe Wave “is a dense feature of the Local Arm of the Milky Way,” says Zucker, and its discovery has overturned the prevailing theory that the dense star-forming nebulae we see in the night sky were part of a ring-like structure called the Gould Belt.

Low-resolution illusion

The new results “lend further credence to the idea that the Gould Belt is an illusion caused by previously low-resolution data,” Zucker tells Physics World.

Instead, astronomers can now see that many of the local molecular clouds are part of the Radcliffe Wave, and the supernovae that gave birth to the Per-Tau Shell would originally have formed in an older star-forming nebula within the Radcliffe Wave.

“We think it’s no coincidence that the Per-Tau Shell formed inside the Radcliffe Wave,” says Zucker. “It speaks to the idea that star formation is mediated by physical processes that occur on vastly different scales.”

Supernovae triggers

The idea that supernovae can trigger the formation of molecular clouds had been hypothesized previously, but this is the first time that the process has been seen occurring in 3D. As such, this 3D map of the local molecular gas clouds will allow scientists to compare clouds generated in computer simulations to the real thing. Such comparisons will tell us how molecular clouds form and also provide insights into how stars themselves form, and why some molecular clouds are more adept than others at forming massive stars.

“Understanding the density structure of molecular clouds in 3D will help us to place constraints on the large-scale dynamical processes of the gas within clouds that will form the seeds of star formation,” says Harvard’s Michael Foley, who is on Zucker and Bialy’s team. Making comparisons between the results from this 3D map and simulations or theoretical predictions could allow astronomers to see how certain structures within star-forming nebulae, such as clumps and filaments, form, and how these feed into the process of star formation.

The findings are published in The Astrophysical Journal and The Astrophysical Journal Letters.

Heterogeneous anthropomorphic phantoms: reimagining SBRT QA for small lung tumours

Medical physicists at the Dutch radiation oncology clinic Maastro are on a mission to fast-track continuous improvement and best practice in the planning, management and delivery of stereotactic body radiotherapy (SBRT) for the treatment of very small (less than 1 cm diameter) early-stage lung tumours. Working with industry partner CIRS, a US manufacturer of tissue-equivalent phantoms and simulators for medical imaging, radiation therapy and procedural training, Michel Öllers, Ans Swinnen and colleagues set out last year to demonstrate the inadequacy of existing dose verification methods for a specific category of lung SBRT treatments (Med. Phys. 47 5829). Their focus: how the latest so-called “type c” dose calculation algorithms like Acuros, an integral part of Varian’s Eclipse treatment planning system (TPS), deviate beyond a relative uncertainty of 5% for small lung tumours.

“Modern dose calculation algorithms perform much better in heterogeneous tissue like the lung than they did, say, 15 years ago,” explains Frank Verhaegen, head of physics research at Maastro and professor of medical physics at Maastricht University. “Nevertheless,” he adds, “accurate dose verification and QA for treatment plans of small lung tumours – and therefore small treatment fields – remains a complex proposition for the medical physics team.”

Phantom physics

Part of the problem is that while water-equivalent phantoms are well suited to the QA of radiotherapy treatments for abdominal or brain tumours (where most of the tissue is near-water-equivalent), the use of those same water-equivalent phantoms for heterogeneous tissue (such as the microenvironment of the lung) will provide incorrect dosimetric results. “Our aim was to find a phantom suited for accurate dose verification of the very small lung tumours (below 1 cm³) that are increasingly treated nowadays using SBRT,” notes Öllers. “Those cases, in turn, represent a non-trivial physics challenge for the developers of any dose calculation algorithm.”

With this in mind, the Maastro researchers teamed up with product engineers at CIRS to co-develop a series of proprietary tumour-equivalent inserts that mimic lung-tissue lesions as small as 5 mm diameter when integrated within CIRS’s existing Dynamic Thorax Phantom (Model 008A). Their new heterogeneous set-up was subsequently put through its paces at Maastro to test the premise, as Verhaegen puts it, that “accurate plan verification of small lung tumours can only be performed in an anthropomorphic phantom that mimics the clinical situation as closely as possible”.

Put another way: the traditional SBRT QA consensus holds that type c algorithms can reproduce the actual physical dose distribution in heterogeneous media in a small-field setting. Maastro’s working hypothesis was just the opposite – an assertion that Acuros falls outside a clinically acceptable accuracy of 5% below a certain threshold in tumour diameter (approximately 0.75 cm).

Quantifying the problem

To test their hypothesis, the Maastro physicists carried out a series of comparative dose measurements using the homogeneous PTW Octavius 4D phantom (including the Octavius 1000 SRS detector) and the heterogeneous Dynamic Thorax Phantom from CIRS. The latter contained different lung-equivalent, film-holding cylindrical phantom inserts with water-equivalent spherical targets (diameters of 0.5, 0.75, 1, 2 and 3 cm).

Using Acuros (version 15.5.11), the team calculated plans for 6 and 10 MV for each spherical target in the CIRS phantom – 14 treatment plans in all – with those plans subsequently delivered to both the PTW and CIRS phantoms to compare measured dose. In addition, the treatment plans of seven clinical lung cancer patients, all of them with tumours less than approximately 1 cm³ in volume, were irradiated in the heterogeneous CIRS phantom. The actual tumour size within each clinical treatment plan determined the choice of spherical target size, thereby ensuring that measurement geometry and clinical target volume matched as closely as possible.

The results are unequivocal: while measurement discrepancies in the homogeneous Octavius 4D phantom for the 14 calculated treatment plans were within 1.5%, dose discrepancies between measurement and TPS for the heterogeneous CIRS phantom increased for both 6 and 10 MV plans with decreasing target diameters – to 23.7±1.0% for 6 MV and 8.8±1.1% for 10 MV using the smallest target of 0.5 cm diameter (with a 2 mm margin for planning target volume versus clinical target volume). For the seven clinical plans, meanwhile, this trend of increasing dose difference with decreasing tumour size is less pronounced, although the smallest tumours exhibit the largest differences between measurement and TPS (up to 16.6±0.9%).
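Discrepancies of this kind are simply the percentage deviation of the measured dose from the TPS-calculated dose. A minimal sketch of the calculation – the example doses below are illustrative, not Maastro’s data:

```python
# Relative dose difference between a phantom measurement and the TPS
# calculation, expressed as a percentage of the calculated dose.
# The example numbers are invented for illustration.

def dose_difference_percent(measured_gy: float, calculated_gy: float) -> float:
    """Percent deviation of measured dose from the TPS-calculated dose."""
    return 100.0 * (measured_gy - calculated_gy) / calculated_gy

# e.g. a TPS prediction of 54.0 Gy against a measured 49.2 Gy
diff = dose_difference_percent(49.2, 54.0)
print(f"{diff:.1f}%")  # → -8.9%, well outside a 5% tolerance
```

In clinical QA practice the point-by-point comparison is usually wrapped in a gamma analysis that also allows for small spatial offsets, but the underlying quantity is this simple ratio.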

Disseminate and educate

So what’s changed 12 months on from the publication of Maastro’s findings? Verhaegen, for his part, is hopeful that the wider radiotherapy community is now at least more aware of the limitations of even the most advanced dose calculation algorithms for challenging SBRT indications like small lung tumours – and of the need for dedicated heterogeneous phantoms that can mimic these clinical situations to provide state-of-the-art dose verification. “Equally important,” he explains, “those heterogeneous phantoms are essential for the testing, calibration and validation of next-generation dose calculation models for stereotactic lung treatments.”

Meanwhile, the Maastro medical physics team has actioned the results with the clinic’s radiation oncologists, who are now aware of the limitations and inaccuracies of SBRT treatment plans for lung tumour volumes of 1 cm³ or smaller. “Depending on the clinical status of the patient, we might accept dose inaccuracies up to 10% or make a decision not to treat,” says Öllers. “It’s worth noting, though, that our decision to treat or not to treat at Maastro is weighted against several factors. We consider not only the challenges of small lung tumours for the dose calculation algorithm, but also tumour movement and the visibility of the tumour targets on the cone-beam CT image.”

As for the bigger picture, Verhaegen reckons Varian’s close interest in the results is good news for the longer-term development and enhancement of lung SBRT QA protocols. He concludes: “We have been in contact with Varian since the publication of the study and aim to work with the vendor’s Eclipse development team to improve the Acuros algorithm for small lung lesions. That sort of continuous improvement will yield transferable upsides for the radiotherapy community as a whole.”

Phantom dynamics

The CIRS Dynamic Thorax Phantom (Model 008A) is designed for comprehensive analysis and QA of image acquisition, treatment planning and dose delivery in image-guided radiation therapy of lung lesions.

CIRS Dynamic Thorax Phantom

The phantom body represents an average human thorax in shape, proportion and composition. A lung-equivalent rod (containing a spherical target and/or various detectors) is inserted into the lung-equivalent lobe of the phantom.

The body is connected to a motion actuator box that induces 3D target motion (sub-mm accuracy and reproducibility) through linear translation and rotation of the lung-equivalent rod. The motion of the rod itself is radiographically invisible (owing to its matching density with the surrounding material), while the motion of the target can be resolved thanks to its density difference.

CIRS Motion Control Software provides independent control of target and surrogate, replicating the complex 3D tumour motion within the lung. The phantom is tissue-equivalent from 50 keV to 125 MeV, while the surrogate breathing platform accommodates numerous gating devices.

Green jobs for physics graduates: finance and economics

Rustam Majainah, senior pricing analyst, OVO

Green energy might appear to be all about feats of engineering, but integrating those breakthroughs into society involves many other challenges too, not least from a financial and economic point of view. This is another area where physicists can play a key role. Just ask Rustam Majainah, a physics graduate who now works as a pricing analyst at OVO, the UK’s largest independent energy supplier.

Rustam Majainah

After studying physics at Royal Holloway, Majainah did a Master’s in renewable energy and sustainability at the University of Reading, UK. In the summer between his BSc and Master’s programme, he did a placement at the Chippenham-based renewable-energy company Good Energy, which he found through the South East Physics Network (SEPnet).

In his job at OVO, Majainah uses numerical models to determine the cost of energy. This involves considering many factors, including the cost of generation, the use of cables that bring the energy to people’s homes, and social levies such as the warm home discount, which supports vulnerable customers. “I think energy supply is often a forgotten part of the green transition,” says Majainah. “You’ve got the energy generators and networks on one side, and everyday people on the other, and energy suppliers sit in the middle and try to match them up.”

Majainah points out that it’s a time of change in the industry. “With the innovation of smart meters, we’re moving from a system where you give your supplier one reading per quarter to one where we can get that data at a half-hourly level,” he says. In moving to more granular charges, OVO can use that data to pass on savings to their customers if, for example, the wind is blowing and turbines are generating energy, or if electricity is cheap at certain times. More granular charges can also be used to “flatten” energy usage peaks by charging customers less for energy at quieter times, and part of Majainah’s role is looking at the wider policy around that.
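Under half-hourly settlement, a time-of-use bill is just the consumption in each period weighted by that period’s unit rate. A toy sketch – the tariff and usage figures here are invented for illustration, not OVO’s actual rates:

```python
# Toy time-of-use billing: total cost is the sum over the 48 half-hourly
# settlement periods of consumption times that period's unit rate.
# Tariff and usage figures are invented for illustration.

def daily_bill_pence(usage_kwh: list[float], rate_p_per_kwh: list[float]) -> float:
    """Total cost in pence for one day of half-hourly readings."""
    assert len(usage_kwh) == len(rate_p_per_kwh) == 48
    return sum(u * r for u, r in zip(usage_kwh, rate_p_per_kwh))

# A flat 0.3 kWh per half-hour, with a cheap overnight rate (periods 0-11)
usage = [0.3] * 48
rates = [10.0] * 12 + [25.0] * 36
print(f"{daily_bill_pence(usage, rates) / 100:.2f} GBP")  # → 3.06 GBP
```

Shifting consumption into the cheap periods lowers the total – which is exactly the behaviour that granular charging is designed to reward.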

“It’s a prickly point,” he explains, “because if energy is cheaper at some times and more expensive at others, how do you encourage customers to change their consumption patterns without unfairly impacting people who don’t have the flexibility to do that?” OVO also looks at kitting people’s homes out with electric vehicle chargers, using vehicle-to-grid technology that allows cars to export energy back into the grid when local demand increases.

All of these challenges require people with numerical and analytical skills, which a physics degree gives you a strong grounding in. Additionally, Majainah says that skills in data-analysis programming languages such as Python, which many physics degrees teach, are highly sought-after in the energy sector. “We’re going through big systems transitions,” he says, “so there are plenty of opportunities at OVO and in the industry in general.”

Flora Biggins, PhD student, University of Sheffield, UK

Since wind and solar energy depend on the weather, and are not necessarily generated at the times when consumers are using the most electricity, energy storage is a key component of embedding them in our networks. But developing this capacity requires financial investment.

Flora Biggins

Flora Biggins, a PhD student at the University of Sheffield, is working on incentivizing companies to make these investments. After graduating with a physics degree from Imperial College London, she decided she wanted to do research relating to sustainability. “I wanted to use my problem-solving skills to work on solutions to climate change, which is the biggest challenge we face,” she says, “and energy storage is really important for decarbonizing electricity.”

Biggins’ research involves creating computational models that use machine learning to predict how prices of energy-storage technologies such as batteries and green hydrogen will evolve over time. She can then use these predictions to advise companies on how to invest in order to maximize their profits, for example by buying the right kind of batteries, or by using batteries to store energy and then sell it on when prices are higher.
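The storage-arbitrage idea – buy energy when prices are low, sell when they are high, and account for round-trip losses – can be sketched in a few lines. The prices, capacity and efficiency below are assumptions for illustration, not outputs of Biggins’ models:

```python
# Toy battery-arbitrage calculation: energy is bought at a low price,
# stored, and sold back at a higher price, with round-trip losses
# reducing the energy available to sell. All figures are assumed.

def arbitrage_profit_pence(buy_p_per_kwh: float, sell_p_per_kwh: float,
                           capacity_kwh: float, round_trip_eff: float) -> float:
    """Profit in pence from one full charge/discharge cycle."""
    cost = buy_p_per_kwh * capacity_kwh                       # energy bought cheap
    revenue = sell_p_per_kwh * capacity_kwh * round_trip_eff  # losses cut the resale
    return revenue - cost

# 1 MWh battery, 90% round-trip efficiency, buy at 5 p/kWh, sell at 15 p/kWh
profit = arbitrage_profit_pence(5.0, 15.0, 1000, 0.9)
print(f"{profit / 100:.0f} GBP per cycle")  # → 85 GBP per cycle
```

A real model would forecast the price spread and cycle count over the battery’s lifetime and weigh them against capital cost and degradation – this is the single-cycle kernel of that calculation.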

I wanted to use my problem-solving skills to work on solutions to climate change, which is the biggest challenge we face

Flora Biggins

In addition to advising companies, Biggins’ work also informs policy. “If I find that energy storage is not very profitable, then it’s important for government organizations to know that,” she explains. “They might respond by introducing subsidies to encourage investment until prices drop to an affordable level.” Predicting future prices is very difficult, as there are numerous, constantly fluctuating factors to consider. Future prices of green hydrogen are especially tricky to forecast: the technology is relatively new, so there is little historical data to use as a starting point.

To tackle these challenges, the mathematical and computational skills Biggins developed during her physics degree are essential. Besides these technical skills, she says resilience is also necessary to keep going when things don’t go as planned, and she finds that having a positive solutions-focused project helps to motivate her. “It feels good to be working on something that is going to benefit society.”

Lewis Ashworth, programme manager, Institutional Investors Group on Climate Change

Many physics graduates go into careers in finance, which offers another way of influencing how money is invested – and there are green options within this sector too. Lewis Ashworth, for example, is a programme manager at the Institutional Investors Group on Climate Change (IIGCC) – a membership body that supports shareholders to drive forward sustainability in the companies they invest in.

As part of his physics degree at the University of Sheffield, Ashworth did a year abroad at Monash University in Melbourne, Australia, during which he took courses in climate dynamics of the atmosphere and oceans alongside pure physics. “That opened my eyes to climate change,” he says, “so when I graduated I decided to do a sustainability-focused Master’s degree.”

Lewis Ashworth

Ashworth did an MSc course on environmental technology and energy policy at Imperial College London before starting his role at IIGCC. He now works on several projects, including an initiative called Climate Action 100+, which seeks to ensure that the 167 largest greenhouse-gas-emitting companies in the world reduce their emissions to be in line with the goals of the Paris Agreement.

Other programmes that Ashworth works on include educating shareholders on how they can influence the companies they invest in, for example by filing shareholder resolutions or voting against directors. He is also helping to develop a benchmarking process to assess companies’ progress towards the goals of the Paris Agreement. This uses various indicators, such as whether the companies have set net-zero targets.

Ashworth regularly does presentations to colleagues and investors as part of his job, so communication skills are essential, alongside an understanding of the statistics and data that he is presenting. He finds his physics background gives him confidence in understanding the various topics he speaks about, from electric vehicles to using hydrogen to decarbonize the steel industry. “As a physicist, these are not alien concepts,” he says, “so it’s nice to feel confident in your ability to decipher what’s going on.”

One common challenge facing people like Ashworth who work in sustainability is that they have high aspirations for making change, but often face barriers and find progress to be slower than they would like. “But when something big happens,” he says, “like a company announcing that they are going to commit to a target that you have been pushing for, and you know you were part of it, that’s when you know you are really making a difference.”

Copyright © 2026 by IOP Publishing Ltd and individual contributors