
Brightness boost for Berkeley Lab Laser Accelerator upgrade

A second beamline area at the Berkeley Lab Laser Accelerator (BELLA) Center will soon open to users, allowing researchers to study extremely hot plasmas, investigate cancer therapies and discover new materials for quantum science. Dubbed Interaction Point 2 (iP2), the new beamline will be able to produce a laser beam about a thousand times brighter than is currently possible at the lab.

Installation of iP2 began in 2020 at BELLA, which is operated by the Lawrence Berkeley National Laboratory (LBNL). The facility was completed on schedule despite a shutdown in March 2020 caused by the COVID-19 pandemic. The first commissioning runs at iP2 began in September 2022, with engineers successfully delivering petawatt, picosecond laser pulses.

Those initial experiments applied about half the maximum pulse energy to low-density foam targets as well as thin metal or plastic ones. The foam targets allow the laser to penetrate further than the other materials, generating very strong, vortex-like magnetic fields inside the foam that boost particles to higher energies over shorter distances. That ability promises new, and possibly cheaper, means of exploring fundamental physics.


“By compressing the laser energy into this short pulse and tiny focus, we can produce these very exotic small focal spots in the targets,” says LBNL physicist Lieselotte Obst-Huebl, who led the installation of iP2.

High intensity 

The facility’s initial experiments will include studies of the FLASH effect, where radiation delivered by protons in short, intense bursts is used to kill cancer cells but not the healthy tissue nearby. BELLA researchers want to extend previous studies of the effect to thicker skin and tumour tissue.

In one early experiment, Obst-Huebl told Physics World, a collaboration with Lawrence Berkeley biologists will irradiate anaesthetized live mice to study whether the higher power produces “the beneficial effects we’ve seen in cell samples” in iP1 investigations. 

Other planned studies for iP2 include methods of improving qubits and high-temperature superconductors. “We’re ushering in a new era of high-intensity laser experiments,” adds Cameron Geddes, director of the LBNL’s accelerator technology and applied physics division. 

Quasars, exoplanets and the atmospheres of distant worlds: more on the first results from the JWST

It was an active final day at the First Science Results from JWST conference at the Space Telescope Science Institute in Baltimore, US, where discussion turned to some incredible observations of quasars above redshift 6, showing them as they existed more than 12.7 billion years ago.

Quasars – the compact cores of galaxies hosting extremely active supermassive black holes – can shine many times brighter than their host galaxies. In his presentation, John Silverman of the University of Tokyo described how data from the JWST’s CEERS (Cosmic Evolution Early Release Science) survey is following up on a dozen high-redshift quasars originally identified by the Subaru Telescope on Mauna Kea.

Throughout the conference, astronomers have joked that high redshift no longer means what it used to mean. Before the JWST came along, high redshift for the Hubble Space Telescope meant resolving the host galaxies of quasars out to about redshift 2, or roughly 10 billion years in the past. Now, the JWST is resolving the structures of host galaxies around quasars at redshift 6 (almost 12.7 billion years ago).
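Those lookback times depend on the assumed cosmology. The conversion from redshift to lookback time is a straightforward numerical integral; the sketch below assumes a flat ΛCDM model with H0 = 70 km/s/Mpc and Ωm = 0.3 – illustrative values, not necessarily those used by the speakers.

```python
from math import sqrt

# Lookback time in a flat Lambda-CDM universe (assumed parameters):
# t_L(z) = (1/H0) * integral_0^z dz' / [(1+z') * E(z')],
# where E(z) = sqrt(Omega_m * (1+z)^3 + Omega_Lambda).
H0_PER_GYR = 70 / 3.0857e19 * 3.156e16  # 70 km/s/Mpc expressed in 1/Gyr
OMEGA_M, OMEGA_L = 0.3, 0.7

def lookback_time_gyr(z, steps=100_000):
    """Midpoint-rule integration of the lookback-time integral."""
    dz = z / steps
    total = 0.0
    for i in range(steps):
        zi = (i + 0.5) * dz
        e = sqrt(OMEGA_M * (1 + zi) ** 3 + OMEGA_L)
        total += dz / ((1 + zi) * e)
    return total / H0_PER_GYR

print(round(lookback_time_gyr(2), 1))  # ~10.2 Gyr
print(round(lookback_time_gyr(6), 1))  # ~12.5 Gyr
```

With these assumed parameters the two calls give roughly 10.2 and 12.5 Gyr; slightly different parameter choices push the redshift-6 figure towards the 12.7 billion years quoted above.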

Much happened in the universe between redshifts 2 and 6, and astronomers are keen to see if the ratio of the mass of a supermassive black hole at the centre of a galaxy relative to the mass of its host galaxy (or more specifically the stellar mass of the galaxy’s bulge) still holds at the highest redshifts. The answer will tell us about the conditions under which supermassive black holes and galaxies formed, and how they affected each other’s growth.

The mass ratio between a supermassive black hole and the bulge of the galaxy around it is about 1:200, with this value believed to be connected to feedback from the black hole in the form of outflows of radiation spewing out as it accretes matter. The relationship was first quantified by observations with the Hubble Space Telescope in the 1990s, with Silverman calling it “fundamental.”

It turns out that high-redshift galaxies do indeed stick to this relation. Silverman said that astronomers have targeted redshift 6 because it’s at this redshift that simulations of galaxies tend to differ the most. What astronomers really need is some hard and fast data to input into the simulations, and the JWST has been happy to oblige.

The typical galaxy hosting a quasar at this redshift is just 8% as luminous as the quasar. However, it’s possible to remove the quasar’s glare from the image – since the quasar itself appears point-like, it manifests as diffraction spikes that can be subtracted by modelling the telescope’s point spread function.

The JWST finds the galaxies to be fairly compact and disc-shaped, with surprisingly well-defined spiral arms and central bars just a billion years after the Big Bang. In her talk, Madeline Marshall, of NRC Herzberg in Victoria, Canada, discussed the first high-redshift quasar results from the JWST’s Near-Infrared Spectrograph (NIRSpec), finding these black holes to weigh billions of solar masses and the mass of their host galaxies to be in the region of hundreds of billions of solar masses, thereby seeming to maintain the mass ratio observed at lower redshift.
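Those masses can be sanity-checked against the 1:200 relation mentioned earlier. The figures below are illustrative round numbers on the scales quoted, not Marshall’s measurements:

```python
# Illustrative round numbers on the scales quoted (not measured values):
M_BH = 2e9     # black-hole mass, solar masses ("billions")
M_HOST = 4e11  # host stellar mass, solar masses ("hundreds of billions")

ratio = M_BH / M_HOST
print(f"black hole : host = 1:{M_HOST / M_BH:.0f}")  # → 1:200
```

A black hole of a few billion solar masses in a host of a few hundred billion lands right on the low-redshift relation.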

How exactly black holes grew to be so massive so early in the universe is still under debate, but hopefully the JWST will start to provide some answers. Just to give an indication of the telescope’s power, the JWST’s resolution is so fine that some of the quasar images show companion galaxies merging or interacting with the main galaxy, sporting tidal tails and bursts of star-formation at a rate of 30–50 solar masses per year.

Exoplanets and protoplanetary discs

Earlier in the day, exoplanets and protoplanetary discs came under the spotlight. Olivier Berné of the Institut de Recherche en Astrophysique et Planétologie in Toulouse revealed a solution to how planets can form in the ultraviolet-radiation-rich environments of large star clusters.

These star clusters produce their fair share of hot, young, massive stars that emit lots of ultraviolet radiation that ought, in principle, to erode protoplanetary discs around neighbouring lower-mass stars. Berné reported how JWST astronomers, working with colleagues from the Atacama Large Millimeter/submillimeter Array (ALMA), have observed the chemistry of these vulnerable discs and discovered a warm envelope of molecular gas surrounding them.

The envelopes are rich in polycyclic aromatic hydrocarbons, which have a strong infrared spectral signature that stands out to the JWST. They also have a high ultraviolet opacity, so they are able to block a lot of the harmful ultraviolet from outside a disc, protecting the early stages of planet formation.

Inside a planet-forming disc

One protoplanetary disc where planet-formation has proceeded quite far is PDS 70. It hit the news in 2018 and 2021 when astronomers using ALMA were able to image rings in PDS 70’s disc that appear to have been carved out by two young planets.

Giulia Perotti of the Max Planck Institute for Astronomy in Heidelberg revealed how the JWST can now measure chemistry within the inner region of PDS 70’s protoplanetary disc. It appears to be enriched with small dust grains that have been thermally processed, possibly by outbursts from the young star. The inner disc, meanwhile, is warped, possibly from the influence of another, unseen planet. Chemically, water and oxygen have also been detected in the disc. PDS 70 continues to be our best-studied example of planets forming within a disc of gas and dust.

WASP atmospheres

The transmission spectrum of an exoplanet taken by JWST.

Meanwhile, Kevin Stevenson of Johns Hopkins Applied Physics Laboratory updated delegates on the JWST’s observations of the atmospheres of older exoplanets. First, he recounted the space telescope’s observations of WASP-39b – a “hot Jupiter” 700 light-years away.

These observations were made as WASP-39b was transiting its star, with some of the star’s light being absorbed by atoms and molecules in the planet’s atmosphere as it passed through. Using this “transmission spectroscopy”, the JWST detected carbon monoxide, potassium, sodium and water in the atmosphere of WASP-39b, as well as sulphur dioxide, which is a product of photochemistry.
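The strength of a transmission-spectroscopy signal can be estimated from the transit depth and the atmosphere’s scale height. In the sketch below every input is an assumed, illustrative value for a hot Jupiter broadly like WASP-39b, not a measured property:

```python
# All planet/star parameters below are assumptions for illustration only.
G, K_B, M_H = 6.674e-11, 1.381e-23, 1.673e-27  # SI: gravity, Boltzmann, H mass

R_p = 1.28 * 7.149e7   # planet radius, m (in Jupiter radii, assumed)
M_p = 0.28 * 1.898e27  # planet mass, kg (in Jupiter masses, assumed)
R_s = 0.93 * 6.957e8   # stellar radius, m (in solar radii, assumed)
T = 1100.0             # equilibrium temperature, K (assumed)
mu = 2.3               # mean molecular weight of an H2-dominated atmosphere

depth = (R_p / R_s) ** 2           # fraction of starlight blocked in transit
g = G * M_p / R_p ** 2             # surface gravity, m/s^2
H = K_B * T / (mu * M_H * g)       # atmospheric scale height, m
signal = 2 * R_p * H / R_s ** 2    # extra transit depth per scale height

print(f"depth ~{depth:.1%}, atmospheric signal ~{signal * 1e6:.0f} ppm")
```

With these assumptions the transit blocks about 2% of the starlight, and each scale height of atmosphere adds a few hundred parts per million of extra absorption – a signal comfortably within the JWST’s reach.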

It’s the first time photochemical processes, in which radiation from the star alters molecules, have been detected on any exoplanet. The absence of a strong methane line at 3.3 microns is also evidence that photochemistry is transforming methane into other molecular species.

Stevenson then went on to preview results from another hot Jupiter – the planet WASP-43b, which lies 284 light-years away. When the JWST’s predecessor, the Spitzer Space Telescope, observed WASP-43b it could not detect any thermal emission from the planet’s night-side, meaning the night-side must be too cold for Spitzer to see.

Stevenson revealed that the JWST had now detected this faint thermal emission, and – although he couldn’t give details – he described how making this measurement and measuring the temperature of the night-side would allow scientists to better constrain the properties of the tidally locked planet’s atmosphere.

Tantalizing TRAPPIST-1

We also heard new findings from the TRAPPIST-1 planetary system, which consists of seven planets in orbit around a red dwarf star 40 light-years away. Björn Benneke of the University of Montreal revealed that the JWST had performed reconnaissance of the atmospheres of some of the worlds of TRAPPIST-1.

While he wasn’t able to say anything yet about what the JWST had positively detected in their atmospheres, he did reveal that the planet TRAPPIST-1g probably does not have a thick atmosphere rich in hydrogen. This would seemingly rule it out as a so-called ‘Hycean’ world, consisting of an ocean kept warm by a thick swathe of hydrogen. Since planet ‘g’ lies at the very outer edge of TRAPPIST-1’s habitable zone, the lack of a thick insulating atmosphere could mean it is too cold for life as we know it.

The three-day conference was an exciting preview of how the JWST is beginning to transform astronomical research and allow us to detect things that were completely beyond astronomers’ reach until now. Sometimes the conference presentations were frustratingly light on details – many speakers said they’d have more to say next year, particularly at the 241st meeting of the American Astronomical Society (AAS) on 8–12 January in Seattle.

We have to remember, though, that the JWST has only been collecting data for barely six months. Given the complexity of both the telescope and the information it is collecting, astronomers are making sure to take care with their findings. If the preliminary results from this first JWST science conference are any indication, then the next few years could be some of the most exciting times ever for astrophysicists, cosmologists and planetary scientists.

Construction begins on the €1.3bn Square Kilometre Array

On-site construction for the €1.3bn Square Kilometre Array (SKA) began in Australia and South Africa on 5 December for what will be the world’s largest radio-astronomy infrastructure when complete in 2028. Work began 18 months after the Square Kilometre Array Observatory (SKAO) Council gave the green light for the facility. 

First conceptualized 30 years ago, the SKA project underwent several years of design and engineering work. The SKAO – an intergovernmental organization with 16 partner countries, eight of which are full members – will manage the construction and operation of the telescope from its headquarters at Jodrell Bank in the UK.

South Africa will host 133 SKA dishes during this phase, which will join the existing 64 dishes of the SKA-precursor telescope MeerKAT to form a mid-frequency instrument. Australia will host a low-frequency array of 131,072 antennas, extending the range of radio frequencies covered by the two telescopes.

The first two antenna stations are due to be complete by May 2023, while the first dish is set to be installed in April 2024, followed by three to four dishes each month.

On 5 December an event was held at the site of the SKA-Low telescope in Western Australia, attended by Philip Diamond, SKAO director-general. Meanwhile, SKAO council chair Catherine Cesarsky appeared at a ceremony in South Africa’s Northern Cape province where the SKA-Mid telescope will be built.

So far about €500m has been allocated towards construction, with more than 40 contracts worth over €150m awarded over the past 18 months.

During the ceremony to mark the start of construction, Australia’s minister of science and industry, Ed Husic, and South Africa’s minister of science and innovation, Blade Nzimande, jointly announced more than €200m for Australian and South African companies to deliver some of the extensive infrastructure required for the telescopes.

Metallic snowflakes and a new spin on the curveball

The spin imparted to a baseball by a pitcher plays a crucial role in the ball’s trajectory – and in how easy it is for the batter to hit the ball. If the ball has lots of spin, the Magnus effect will cause it to take a curved path towards the batter, with the strength of the curve depending on the spin of the ball. So, if a pitcher can vary the spin between pitches, they can confuse the batter and make them strike out.
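The size of the Magnus-induced break can be estimated with a simple lift-force model. Every number in the sketch below (pitch speed, lift coefficient, distance) is an assumed, illustrative value, and drag is ignored:

```python
from math import pi

# Order-of-magnitude Magnus break on a curveball (assumed values, no drag).
RHO = 1.2              # air density, kg/m^3
M, R = 0.145, 0.0366   # baseball mass (kg) and radius (m)
A = pi * R ** 2        # cross-sectional area, m^2

v = 40.0    # pitch speed, m/s (~90 mph, assumed)
C_L = 0.2   # assumed lift coefficient for a heavily spun pitch
d = 18.4    # pitching rubber to home plate, m

F = 0.5 * RHO * v ** 2 * A * C_L  # Magnus (lift) force, N
a = F / M                         # sideways acceleration, m/s^2
t = d / v                         # time of flight, s
deflection = 0.5 * a * t ** 2     # lateral break, m

print(f"break ~{deflection:.2f} m")
```

With these assumptions the ball breaks by roughly half a metre in flight; halving the spin (and hence the lift coefficient) roughly halves the break, which is exactly the variation a pitcher exploits.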

The opposite strategy is the knuckleball, whereby little or no spin is imparted to the ball. This results in the ball following an erratic trajectory – making it difficult to hit. However, this can be a very risky strategy because the pitcher has little control over the trajectory, and the ball could end up outside of the strike zone.

Some professional pitchers are better than others when it comes to controlling the spin of a baseball. As a result, some players may be tempted to put a foreign sticky substance on their hand to get a better grip on the ball – something that is banned in Major League Baseball, with the exception of rosin.

Level the playing field

Now researchers at Tohoku University in Japan have looked at how sticky substances such as rosin affect friction between fingers and baseball leather. Not surprisingly, Takeshi Yamaguchi, Daiki Nasu and Kei Masani found that the substances increased friction. However, they also discovered that rosin – which can be used by pitchers – increases the friction in a consistent way when different people are tested. As a result, its use tends to level the playing field.

The trio also found that baseballs used in Japan produced more friction than those used in the US. Increasing the friction of US balls, they suggest, could make American pitchers less tempted to cheat by using sticky substances.

The research is described in Communications Materials.

This is the last Red Folder of the year before we break for the festive season. So I am going to end with news that researchers in Australia and New Zealand have created tiny metallic snowflakes (see figure). The Aussie part of the collaboration grew the crystals by dissolving a number of different metals in gallium – a metal that is liquid at just above room temperature. Then, their Kiwi partners ran computer simulations to investigate why different metals formed differently shaped snowflakes.

They report their findings in Science.

From the secrets of supernovae to the oldest planets in the universe: the first results from the JWST

The life and death of stars; setting sights on minor asteroidal and cometary bodies in our solar system; the oldest planets in the universe; and the secrets of supernovae – a rollercoaster ride of new discoveries was revealed on day two of the “First Science Results from JWST” conference, held at the Space Telescope Science Institute in Baltimore, US.

Stellar nurseries on Cosmic Cliffs

One of the first images to be released from the JWST in July this year was that of the “Cosmic Cliffs” – a section of the Carina star-forming nebula located about 8500 light-years away. Previously imaged by the Hubble Space Telescope, the so-called cliffs are made of molecular gas encircling a giant bubble blown in the nebula by the stellar winds and ultraviolet light of five luminous O-type stars, the hottest and most massive type of star. Embedded within the wall of gas are nascent stars. As astronomer Megan Reiter of Rice University explained, the JWST has now identified 24 new outflows from young stars that are still growing by accreting matter from their surroundings. By tracing these outflows, Reiter and her colleagues are able to locate the sites of star formation within the nebula.

The aim of their observations is to better understand what is triggering star formation in the nebula, said Reiter. Do the winds of the five O-type stars compress the gas in the Cosmic Cliffs on the edge of the bubble and trigger star formation that way, or are the knobbly “cliff tops”, where some young stars are forming, simply made from denser regions of gas that have survived the erosional onslaught of the ultraviolet radiation? For now, more observations are needed, but the JWST is better equipped than Hubble to answer these questions in the future.

Icy building blocks

Young stars are surrounded by protoplanetary discs, where all kinds of complex carbon chemistry can occur, depending on the distance from the star (and hence the temperature), as well as the density of the gas. Yao-Lun Yang of RIKEN in Japan and the University of Virginia, US, showed how JWST observations of a young protostar catalogued as IRAS 15398-3359 revealed the spectral signature of ices containing complex organic molecules in the disc around the star.

The JWST’s mid-infrared spectrum of the star contains unprecedented detail, showing molecules such as ethanol, methanol, methane and dimethyl embedded within ices in the disc. These molecules are potentially the building blocks for all kinds of carbon chemistry, including life. “The crazy thing to us is that there is so much detail [in the spectrum],” said Yang.

Delayed detonations

Things got explosive with Chris Ashall’s presentation on a type Ia supernova, SN 2021aefx, which exploded in 2021 in the galaxy NGC 1566, located about 33 million light-years away. Type Ia supernovae involve the death of a white dwarf star, but there are several permutations that could give rise to such a supernova – from the destruction of a single white dwarf to a binary merger. There are also questions about whether the explosion detonates in the core of the white dwarf, or whether it is sparked in its outer shell before travelling into the core.

Ashall, of Virginia Tech, talked about how the key to figuring out the dynamics of the SN 2021aefx explosion was in the JWST identifying the location of certain critical ionized elements in the supernova remnant. The JWST was able to detect emissions from doubly ionized cobalt that had decayed from quantities of nickel-56 that had formed in the violence of the supernova explosion. The location of the cobalt, and hence the nickel, was not at the centre of the explosion, but offset from the centre in the outer layers, whereas the JWST did see an abundance of argon at the centre.

There were also spectral lines of nickel-58, which forms in high-density regions, suggesting that the white dwarf that exploded had a mass of at least 1.2 times the mass of the Sun – nearly as massive as a white dwarf can get (the Chandrasekhar limit of 1.44 solar masses). Comparing the location of the cobalt, argon and nickel lines, Ashall and his colleagues saw that it matched simulations of a delayed-detonation supernova, wherein a single white dwarf accreting matter from a companion star experiences a wave of intense internal heating that reverberates around the star before it explodes.

Water in the main belt

Minor bodies in our own solar system were also under the spotlight, particularly the weird objects that astronomers call “main belt comets”. These are objects that behave like icy comets, with tails and comae – but which leisurely orbit around the Sun, in the main asteroid belt between Mars and Jupiter, along with myriad inert rocky asteroids. Their origins and the mechanisms that drive their pseudo-cometary behaviour are still a mystery, one that the JWST has now shed a little light on.

Michael Kelley of the University of Maryland reported on the JWST’s observations of the main belt comet 238P/Read, detecting the spectral signature of outflows of water vapour, but no carbon dioxide, the absence of which is considered unusual. The lack of carbon dioxide is a big clue as to the origin of this particular main belt comet. Either the carbon dioxide was baked out of the comet after it arrived in the asteroid belt, or it never had carbon dioxide in the first place.

“We think it’s more likely that carbon dioxide was never accreted,” said Kelley. This suggests that 238P/Read represents a new class of cometary body that has become trapped in the asteroid belt.

The oldest ones

The solar system formed about 4.5 billion years ago, but the universe is 13.8 billion years old. How soon after the Big Bang could planets form around stars? One place to try and find the answer is globular clusters, which are ancient, having formed 12–13 billion years ago. Matteo Correnti of STScI spoke about sifting through the stars of one globular cluster, known as 47 Tucanae, in search of white dwarfs with anomalous infrared excesses. A white dwarf is the remains of a Sun-like star that has expired, puffing off its outer layers into deep space to leave an inert, hot core. The fluctuating gravitational tides that occur during this slow process of star death can disrupt orbiting planetary systems, smashing them apart and resulting in planetary debris falling onto the surface of the white dwarf. This has been observed on younger white dwarfs in the Milky Way, and the planetary debris has a signature at the infrared wavelengths that the JWST observes.

The stars of 47 Tucanae are calculated to have formed 13.06 billion years ago, so any planets that once existed around one of these stars that became a white dwarf will also be 13.06 billion years old. Correnti revealed that so far they’ve discovered one possible candidate white dwarf with signs of planetary debris. If confirmed, the discovery would be far-reaching, proof that rocky planets could form in the universe less than a billion years after the Big Bang. If there were planets, is it therefore plausible that there could have also been life at this early time?

  • Keith Cooper’s next blog post will report on the third day of the conference, covering the JWST’s observations of quasars – some of the brightest objects in the universe – as well as numerous exoplanetary updates.

Compact radiation detector could expedite the use of dynamic PET

Dynamic positron emission tomography (PET) is a medical imaging technique that tracks both the spatial and temporal pattern of radiotracer uptake. The approach provides information that can’t be gained by conventional static PET, such as perfusion and diffusion information and pharmacokinetic parameters that help scientists understand how long a tracer stays somewhere in the body, where the tracer came from, and where it goes afterward.

Dynamic PET has been used in cardiology, treatment response assessments, theranostics applications, drug-discovery research, diagnosis and research on neurodegenerative diseases, and more. It can generate multiple types of PET images with complementary information in a single imaging session, such as influx rate, distribution volume and conventional standardized uptake value (SUV)-equivalent images.

Yet dynamic PET has so far been largely confined to academic research centres.

That’s in part because to create a dynamic PET image series, scientists need to obtain an image-derived arterial input function – the radiotracer concentration in the patient’s arterial blood plasma as a function of time. The current gold standard for arterial blood sampling is an invasive and expensive procedure that requires many clinical resources, including anaesthesiologists and surgical space.

Several research groups, including Shirin Abbasinejad Enger’s team at the Jewish General Hospital, McGill University, are working to change that.

“We’re trying to simplify this clinical workflow by removing the invasive arterial blood sampling step,” says Liam Carroll, a PhD student in Enger’s group. “We can measure this radiation by placing a radiation detector on a patient’s wrist, which gives you the same information you would normally collect with arterial blood sampling.”

Carroll and Enger have developed a cost-effective radiation detector that acquires the arterial input function non-invasively. Patients who have been injected with a radiotracer place their wrist on the detector, which measures the radiation escaping from the patient’s wrist. Then, an algorithm is used to derive the arterial input function from the detector data.

The researchers introduced a detector prototype in 2020. Since then, they’ve been working to optimize detector size and geometry and develop a clinical data analysis chain to accurately separate arterial and venous components of arterial input functions. A recent study reported in Medical Physics presents their latest results.

“We would like to provide any centre that already has a PET scanner with an affordable detector that they can use for dynamic PET scanning, if needed,” says Enger. “So one of the first decisions we made was that this detector needs to be affordable such that clinics with less resources can also acquire dynamic PET images, because any PET scanner can be used for dynamic imaging.”

Detectors designed by other research groups, Enger says, are larger and less cost-effective. Several approaches use block crystal scintillators, while others arrange detector elements in rings, similar to a miniature PET detector. While these detectors may have higher detection efficiencies, they also may cause artefacts when placed in the field-of-view of the PET scanner, she explains. Placing detectors elsewhere on the body, such as the leg, still may not be suitable for whole-body dynamic PET scanning.

Carroll and Enger’s detector couples silicon photomultipliers to low-density scintillating fibres surrounded by a 3D-printed plastic shell. The detector has two layers of narrow scintillating fibres, which maximizes the total and intrinsic efficiency of the detector while keeping manufacturing costs and electronic complexity relatively low. Detector efficiency in the first row of scintillators ranges from approximately 53% to 84% for positrons and 51% to 52% for annihilation photons (total efficiency ranges from approximately 62% to 74%) across four radioisotopes used in a variety of dynamic PET applications (fluorine-18, carbon-11, oxygen-15 and gallium-68).

The clinical data-analysis pipeline uses an expectation-maximization maximum-likelihood algorithm. It includes counts from 511 keV annihilation photons, Carroll says, because positrons emitted from low-energy positron emitters do not reach the detector. The research team is planning to incorporate radial artery and wrist diameters obtained during ultrasound imaging to improve their data-analysis pipeline.
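The published pipeline is not reproduced here, but the multiplicative update at the heart of an expectation-maximization maximum-likelihood (MLEM) separation can be sketched on a toy two-channel system. All of the matrix entries and component values below are made up for illustration:

```python
# Toy MLEM separation: two detector channels (rows) observe a mix of two
# components (columns), loosely analogous to arterial and venous signals.
A = [[0.8, 0.3],
     [0.2, 0.7]]       # assumed, made-up sensitivity matrix
x_true = [5.0, 2.0]    # "true" component intensities (for the demo only)

# Noiseless simulated measurements y = A x.
y = [sum(A[i][j] * x_true[j] for j in range(2)) for i in range(2)]

x = [1.0, 1.0]  # flat, non-negative initial estimate
for _ in range(200):
    # Forward-project the current estimate, then apply the MLEM update:
    # x_j <- x_j * [sum_i A_ij * y_i / (Ax)_i] / [sum_i A_ij]
    proj = [sum(A[i][j] * x[j] for j in range(2)) for i in range(2)]
    for j in range(2):
        num = sum(A[i][j] * y[i] / proj[i] for i in range(2))
        den = sum(A[i][j] for i in range(2))
        x[j] *= num / den

print([round(v, 3) for v in x])  # → [5.0, 2.0] for this noiseless toy data
```

The update is multiplicative, so a non-negative starting estimate stays non-negative – one reason this family of algorithms suits count data from radiation detectors.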

Carroll and Enger have already built and tested a prototype detector and have founded a start-up, BetaScint Imaging, to commercialize it. They are currently preparing to validate the detector’s performance in vivo in patient clinical trials, which will begin early next year.

“Dynamic PET studies right now are benefiting a very small percentage of patients who need it,” Enger says. “Our hope is that more patients will benefit from this imaging technique in the future.”

‘World’s smallest photon’ confined in dielectric nanocavity

Researchers have confined light to dimensions smaller than the diffraction limit in a nanosized dielectric cavity for the first time. The work, which confirms a theoretical prediction made in 2006, could promote the development of new optical chip architectures that consume less energy than their electrical counterparts.

Classical optics theory states that light cannot be focused into a volume smaller than a cube with a side length of half its wavelength. This is the diffraction limit, and it restricts the resolution of optical microscopes. In recent years, however, researchers have used metal nanoparticles to compress, rather than focus, light. This compressed light is more intense and interacts more strongly with matter.
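In numbers, the diffraction-limited cube has a side of half a wavelength in vacuum, or λ/2n inside a medium of refractive index n. The figures below, for telecom-band light in silicon, are illustrative assumptions rather than parameters of any particular device:

```python
# Side length and volume of the diffraction-limited cube, lambda / (2n).
wavelength_nm = 1550.0  # telecom-band wavelength (assumed for illustration)
n_si = 3.48             # refractive index of silicon near 1550 nm

side_nm = wavelength_nm / (2 * n_si)
volume_um3 = (side_nm * 1e-3) ** 3

print(f"side ~{side_nm:.0f} nm, volume ~{volume_um3:.3f} um^3")
```

A high-index medium like silicon shrinks the classical limit to a cube roughly 220 nm on a side; squeezing light below even that scale is what the nanocavity described here achieves.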

The problem with metal nanoparticles, however, is that they absorb light as well as compressing it, leading to optical losses. Particles made from dielectric materials ought to be better, since they do not absorb light as strongly, and in 2006 a team led by Michal Lipson at Columbia University in the US showed that substituting dielectrics for metals should, in theory, work.

Topology optimization

In the latest study, researchers in the NanoPhoton Center at the Technical University of Denmark (DTU) fabricated their nanoscale optical cavity from silicon, the dielectric workhorse material of modern information technology. Like other such cavities, the new nanostructure is designed to retain light by reflecting it back and forth – as if between two mirrors – so that it does not propagate as it usually would.

To design the cavity, the researchers used a technique called topology optimization that was pioneered by team member Ole Sigmund, who initially used it to design bridges and aircraft wings. “Rather than starting with a design concept and then perhaps adding some elements of numerical optimization around this starting point, we let a computer find the optimum design – that is, the one that compresses light most intensely in the optical cavity,” explains team leader Søren Stobbe.

The resulting computer-generated cavity design features a bowtie-like structure at its centre that spatially confines the light. A ring-like structure surrounding the bowtie helps boost the cavity’s quality factor – an intrinsic property of resonators that relates to the strength of loss mechanisms.

Nanofabrication challenges

Fabricating this design was difficult, Stobbe says. To construct it, they had to build an 8 nm silicon bridge in the centre of the bowtie structure, which in turn had to be fully etched into the 220 nm silicon device layer with near-vertical sidewalls. This would be a demanding nanofabrication task on its own, but the researchers also had to address an even more important challenge: contrary to conventional nanocavities based on, for example, photonic crystals or micropillars, the critical dimension plays a key role for bowtie cavities.

“Indeed, the mode volume of the cavity depends on how small the features a given fabrication process enables,” Stobbe tells Physics World, “but changing the process also changes the optimum design. We solved this by measuring the fabrication constraints and including these in the topology optimization. This approach, which is a first in any field of research or engineering, ensures that we get the smallest possible mode volume given by our fabrication process.”

The work could make it possible to develop energy-saving optical chip architectures for components in data centres, computers and telephones, the researchers say. They are now exploring several new directions, including implanting light emitters inside the silicon. “This would allow [us] to directly measure the enhancement of light–matter interactions over the large bandwidths enabled by our cavities,” Stobbe explains.

Another avenue under investigation is to push the critical dimension of the cavities even further, although it is already close to the smallest size possible. This will require entirely new methods for silicon nanofabrication based on self-assembly, which appear extremely promising, Stobbe reveals.

The present work is detailed in Nature Communications.

How to deflect an asteroid: DART’s Andrew Cheng on the Physics World Breakthrough of the Year

This episode of the Physics World Weekly podcast features an interview with Andrew Cheng, who is a lead scientist on the Double Asteroid Redirection Test (DART) space mission. In September 2022 the DART spacecraft smashed into an asteroid and successfully changed the orbit of that near-Earth object.

DART was conceived and executed by NASA and an international team led by the Johns Hopkins Applied Physics Laboratory – and they are the winners of the Physics World 2022 Breakthrough of the Year Award.

Cheng is based at Johns Hopkins and he recalls the final moments in mission control before the impact, which he describes as “one of the greatest moments of my life”. He explains how the DART mission came together and talks about how we could defend Earth from asteroid impacts in the future.

This is the final Weekly podcast of 2022. Thanks for listening and we will be back on 5 January with the first episode of 2023.

Satellites observe highest volcano plume ever

The eruption of the Hunga Tonga-Hunga Ha’apai volcano in 2022 was the tallest ever recorded, with a volcanic plume that reached almost 58 km in height. According to physicists at Oxford University and RAL Space in the UK, who measured the plume using data from a trio of geostationary satellites, it was also the first to pass through the Earth’s stratosphere and enter the lower mesosphere. The measurements, which are described in Science, shed new light on how volcanic eruptions affect the climate and reveal fresh information about a layer of Earth’s atmosphere that remains poorly understood.

Hunga Tonga-Hunga Ha’apai is situated in the Tongan archipelago in the southern Pacific Ocean. On 15 January 2022, it violently erupted, ejecting an extremely tall column of ash, gases and water into the atmosphere. Until now, however, no one knew exactly how tall it was.

The parallax effect

To measure the plume’s height, physicists led by Simon Proud used parallax, which is the same mechanism our two eyes use to perceive depth. The three weather satellites observed the volcano from different positions, imaging it from multiple angles. These images, which the satellites recorded every 10 minutes during the eruption, enabled the physicists to build a three-dimensional picture of the top of the plume.

“Usually, plume heights are calculated by comparing the plume temperature (measured by a satellite) to the atmospheric temperature (from weather models),” Proud explains. “This eruption went so high that the temperature technique is too inaccurate. Since the parallax method needs nothing more than simple geometry, it is well suited to measuring these types of extreme eruption where other approaches fail.”

Proud adds that parallax-based estimates require good satellite coverage, and so would not have been possible a decade ago.

An altitude of nearly 58 kilometres

The team’s measurements showed that the plume reached an altitude of nearly 58 kilometres at its highest extent, breaking records set by the 1991 eruption of Mount Pinatubo in the Philippines (40 km) and the 1982 El Chichón eruption in Mexico (31 km). It is also the first plume ever recorded as reaching the mesosphere, which starts at roughly 50 km above the Earth’s surface.

“The mesosphere aspect is important since this part of the atmosphere is usually very dry,” Proud says. “The Hunga-Tonga eruption put water, ash and sulphur dioxide there.”

The amount of water ejected into the mesosphere could have implications for the climate as it might lead to surface warming, he adds. “Because we don’t have much knowledge of the upper atmosphere, the eruption also serves a bit like a natural laboratory, to help us better understand the processes up there.”

Proud and colleagues say they would now like to use their parallax method to study other eruptions. Their goal is to develop a dataset of plume heights that volcanologists and atmospheric scientists can use to model how volcanic ash disperses in the atmosphere. “The technique could be used to measure eruptions (and even severe thunderstorms) as they happen,” Proud tells Physics World.

Photons, protons or electrons: which will bring FLASH radiotherapy to the clinic?

With the first clinical trial of FLASH radiotherapy reported earlier this year, using protons to deliver high-dose rate radiation to bone metastases, does proton-based FLASH offer the greatest potential for clinical translation? Or will electron-based FLASH – as used for most preclinical studies to date – and the emerging technique of very high-energy electrons (VHEEs) lead the way? Or perhaps the development of advanced linear accelerator technologies could bring photon-based FLASH into the clinic?

In a balloon debate at the recent FLASH Radiotherapy and Particle Therapy Conference (FRPT 2022), experts in each field stated the case for their favoured option, and the audience got to decide which has the best long-term potential for clinical application of FLASH.

Pushing photons

The first speaker was Billy Loo from Stanford University, who argued the case for photon-based FLASH. Loo is part of a multidisciplinary research group that’s developing a range of novel linear accelerator technologies including very high-current photon linear accelerators, as well as compact very high-energy electron linacs and proton linacs.

“We are working on all of these technologies, yet which one are we pushing forward into the clinic as our main emphasis area? It’s really the photon technology,” he said.

So why does Loo believe that photons offer the most promising path forward? He explained that in the last few decades, radiotherapy has made huge advances in dose conformity, using intensity-modulated radiotherapy and stereotactic techniques. “Dose conformity is key,” he stated. “We don’t want to give up decades of huge gains in therapeutic index and conformity to adopt FLASH; we really need both.”

One way to achieve highly conformal FLASH therapy could lie in a power-efficient linear accelerator technology that Loo and collaborators are currently developing, which can produce much higher beam current (30 times that of current clinical linacs) to enable high-dose rate X-ray therapy. To deliver beams rapidly in multiple directions and achieve the desired conformity, the prototype PHASER system incorporates 16 of these linacs, resulting in around 500 times the beam current of existing systems.

Loo cited various requirements for performing conformal FLASH, many of which are met by all three particle types. But critically, he added, you need multiple beam angles. “I’ve shown the strategy for that with photons; whether that can be done practically for electrons and protons is much less clear.”

Finally, Loo pointed out that the clinical impact of a treatment technology also relies on its accessibility, size and cost, where photon-based systems could have a clear advantage. He noted that while proton therapy is a mature technology, around 3–4 million patients per year are treated with photons, compared with about 22,000 for protons and, for VHEE, none.

Promoting protons

Next up, John Perentesis from Cincinnati Children’s Hospital described why protons have the edge when it comes to delivering FLASH.

“In terms of the physics, and how that eventually impacts the biology, protons really do have favourable spatial characteristics,” he explained. “They have deeper tissue penetration and much tighter penumbra than electrons and less propensity for hotspots. And with the advent of Bragg peak FLASH, there’s the opportunity for no exit dose, in contrast to X-rays, and a lower integral dose while maintaining conformality.”

Another key argument is that proton FLASH is already here. Perentesis cited the FAST-01 and FAST-02 trials of proton-based treatment of bone metastases using the FLASH-enabled ProBeam system. FAST-01 is complete and demonstrated the successful and reliable use of the pencil-beam scanning gantry in transmission mode. “We can treat deep-seated tumours with FLASH today with currently available proton accelerator technology,” he said, noting that Bragg peak FLASH should be achieved in the near future – it’s just an engineering challenge to implement.

Perentesis finished his presentation with the argument of practicality. Conferring with the audience revealed that most believe there will be some sort of FLASH treatment in the next five years, but nobody thinks FLASH will completely replace traditional radiation oncology.

“Protons are the Swiss army knife of radiation oncology,” he said, quoting physicist Anthony Mascia. “They can switch back and forth between ultrahigh dose rate and standard dose rate, can do deep-seated and superficial tumours, they are highly conformal, and at the end of the day they are here and ready.”

Endorsing electrons

Finally, Jean Bourhis from Lausanne University Hospital (CHUV) proposed that VHEE and electrons are the way forward for the long term. He described two straightforward indications for electron-based FLASH: cutaneous tumours and intra-operative radiotherapy (IORT), noting that for both of these, the technology is ready, there is strong biological evidence and clinical trials are ongoing.

Current electron FLASH trials include a dose escalation study, IMPulse, treating resistant melanoma, and a randomized study comparing FLASH and conventional dose rates in skin cancer. Bourhis pointed out that both of these use a curative radiation dose. In addition, IORT trials and two further randomized trials for skin cancers are planned. “No doubt, if we are successful here, electron FLASH will be part of the long-term standard treatment,” he said.

Another scenario – and one with the greatest unmet clinical need – is to use electron FLASH to treat large and deep-seated tumours. Achieving this requires a treatment field of at least 15×15 cm at 15 cm depth, while maintaining FLASH parameters of 10–20 Gy delivered in less than 100 ms.
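Those parameters imply a mean dose rate of at least 100–200 Gy/s, comfortably above the roughly 40 Gy/s lower bound commonly cited for triggering the FLASH effect (the threshold figure is general FLASH literature, not a value quoted in this debate):

```python
# Commonly cited lower bound for the FLASH effect (an assumption here,
# not a figure given by the debate speakers).
FLASH_THRESHOLD_GY_PER_S = 40.0

def mean_dose_rate(dose_gy, duration_s):
    """Mean dose rate in Gy/s for a given dose and delivery time."""
    return dose_gy / duration_s

# 10-20 Gy delivered in under 100 ms, as quoted above.
for dose in (10.0, 20.0):
    rate = mean_dose_rate(dose, 0.1)
    status = "above" if rate >= FLASH_THRESHOLD_GY_PER_S else "below"
    print(f"{dose:.0f} Gy in 100 ms -> {rate:.0f} Gy/s ({status} threshold)")
```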

This is where VHEE comes in. Even without FLASH, VHEE offers potential advantages, including reduced sensitivity to heterogeneities, ease of beam acceleration and scanning, and potentially superior dosimetric properties to those of X-ray VMAT. Adding FLASH could make VHEE a very interesting tool, said Bourhis. “With VHEE, we can already implement parameters which are known to produce a FLASH effect.”

Bourhis also highlighted the rapid advances in accelerator technology that could enable more compact and lower-cost devices, citing the VHEE-based FLASH system being developed by CERN, CHUV and THERYQ, which will offer treatment at 100–140 MeV from several directions in less than 100 ms.

“We already have possible long-term adoption of electron FLASH, depending on the outcome of the ongoing clinical trials,” he concluded. “And for VHEE FLASH, we have the potential now to have high-performance, compact, low-cost, competitive technology. It’s likely the greatest potential for the long term.”

Let the people decide

Following the presentations, the audience were tasked with selecting their favoured particle, with the top two continuing into the second part of the debate. Protons went into an early lead, with photons and electrons continually switching between second and third place. When the voting closed, however, electrons had edged ahead and the final results were 37% for protons, 33% for electrons, and 30% for photons. As such, the photon balloon was popped, and Perentesis and Bourhis embarked upon their final arguments.

“The fact that proton technology is here, it’s tractable and it’s generating similar biological results to electrons – those are strong points,” said Perentesis, noting that there’s also an opportunity for parallel companion trials, for example with protons and electrons, to learn from the similarities and differences between the two.

“We know the advantages of protons are clear, but we know that the uptake of protons after 40 years is only 1–2% of the patients in our country that we treat with radiotherapy. It’s very unlikely protons will change the game for FLASH, not for the long term,” countered Bourhis. “We have this VHEE opportunity that we shouldn’t miss. We’ve not studied this before in the clinic, but now is the right time to do this because of all the advantages that already exist with VHEE compared with protons.”

Bourhis apparently made a highly persuasive argument, with the final vote swaying the audience towards electrons, which came out as the winner with 53% of the vote. The proton balloon was popped, leaving the electron balloon intact. As for whether the balloons foretell the future, for now it’s likely worth keeping a close eye on the development and implementation of all three FLASH technologies.

Copyright © 2025 by IOP Publishing Ltd and individual contributors