
Deep learning enables safer heart scans with lower radiotracer dose


Positron emission tomography (PET) with the radiotracer 18F-FDG provides an important tool for assessing the health of the heart muscle in patients with ischemic heart disease, in which narrowed coronary arteries reduce the heart’s blood supply. Such PET scans help identify the level of damage to the heart muscle and play an important role in clinical decision making.

Current guidelines recommend injecting a 200–350 MBq dose of 18F-FDG. But lowering this tracer dose will decrease the patient’s radiation exposure – an essential goal of any diagnostic procedure – as well as reducing imaging costs and potentially opening up new applications. The downside, however, is that a lower tracer dose may lead to poorer quality images, thereby reducing diagnostic accuracy.

One approach proposed to address this problem is to employ artificial intelligence algorithms to restore image quality. Researchers from Rigshospitalet in Denmark have now investigated the use of deep learning to reduce noise in low-dose PET images. They validated the diagnostic accuracy of this approach using 18F-FDG images of patients with ischemic heart disease, detailing their findings in Physics in Medicine & Biology.

First author Claes Nøhr Ladefoged and colleagues retrospectively examined 168 patients referred for cardiac viability testing using 18F-FDG-PET/CT. Patients received approximately 300 MBq of 18F-FDG and one hour later underwent a low-dose CT scan followed by a thoracic PET scan.

The researchers reconstructed both static and ECG-gated (with eight gates) PET images. They also simulated dose-reduced images with 1% and 10% of the total counts, corresponding to tracer doses of 3 and 30 MBq, respectively. They then trained U-Net, a 3D convolutional neural network developed for biomedical image segmentation, to de-noise the four sets of dose-reduced PET images (static and gated data with the two dose reduction thresholds).
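The count-level dose simulation can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: it assumes binomial thinning of a toy sinogram with NumPy (the study resampled the acquired counts, whereas the array size and mean counts here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

def thin_counts(sinogram, fraction):
    """Simulate a dose-reduced acquisition by keeping each detected
    count independently with probability `fraction` (binomial thinning)."""
    return rng.binomial(sinogram.astype(np.int64), fraction)

# Toy full-dose sinogram: Poisson counts around a mean of 400 per bin
full = rng.poisson(400, size=(64, 64))

low_1pct = thin_counts(full, 0.01)   # ~3 MBq equivalent
low_10pct = thin_counts(full, 0.10)  # ~30 MBq equivalent

# Relative noise grows roughly as 1/sqrt(retained counts),
# which is what the de-noising network must compensate for
print(full.mean(), low_1pct.mean(), low_10pct.mean())
```

Thinning preserves the Poisson statistics of the acquisition, which is why it is a standard way to emulate a lower injected dose from a full-dose scan.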

Clinical metrics

Diagnosis of patients with ischemic heart disease is based on several factors, including estimates of end-diastolic volume (EDV), end-systolic volume (ESV), left ventricular ejection fraction (LVEF) and FDG defect extent (deviation from inter-subject normal perfusion). Patients with normal myocardial perfusion usually have low extent scores and high LVEF, although there’s no specific threshold.

The researchers compared full-dose, dose-reduced and de-noised dose-reduced PET images from 105 patients. Using Corridor4DM software, which automatically segments the left ventricle, they extracted values of EDV, ESV and LVEF from the gated images, and FDG defect extent from the static images.

For EDV and ESV measurements, the full-dose and 1% dose-reduced PET images matched well, with a correlation coefficient of above 0.93, which increased to above 0.98 with de-noising. Significantly, for LVEF, de-noising increased this correlation from 0.73 to 0.89. In the 10% dose-reduced images, the team saw excellent correlation across all metrics with only minor improvements after de-noising. They note that none of the de-noised images were significantly different from the full-dose images.

The accuracy of diagnosis, based on European Society of Cardiology guidelines that define normal LVEF as 50% or above, improved after de-noising the dose-reduced images. When using the 1% dose-reduced images, 13 patients had a different diagnosis to that suggested by the full-dose measurement. De-noising improved this to just two patients. For the 10% dose-reduced images, five patients had discordant diagnosis before de-noising and all diagnoses agreed after de-noising.
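The concordance analysis can be illustrated with a short sketch. The LVEF values below are invented for illustration, not patient data; the correlation coefficient and the 50% threshold mirror the metrics discussed above:

```python
import numpy as np

# Illustrative (invented) LVEF values, in percent
full_dose = np.array([34.0, 55.0, 48.0, 62.0, 41.0, 51.0])
denoised_low = np.array([36.0, 54.0, 49.5, 60.0, 43.0, 48.5])

# Pearson correlation between the two measurement sets
r = np.corrcoef(full_dose, denoised_low)[0, 1]

# ESC guideline threshold: LVEF >= 50% counts as normal
normal_full = full_dose >= 50
normal_low = denoised_low >= 50

# Patients whose diagnosis flips between the two image sets
discordant = int(np.sum(normal_full != normal_low))
print(f"r = {r:.3f}, discordant diagnoses = {discordant}")
```

A high correlation alone is not enough: what matters clinically is how often a measurement crosses the 50% threshold and flips the diagnosis, which is the discordance count above.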

The researchers note that the FDG defect extent score was, on average, only moderately affected by the dose reduction, with even the 1% dose-reduced images providing similar scores to the full-dose images. This is likely because this metric is measured from static PET images, in which all true coincidence events are used. In contrast, ESV and EDV measurements are taken from gated PET images, which only include one eighth of the counts in each gate, resulting in greater noise.

The reduced-dose images also exhibited a marked improvement in image quality after de-noising. Comparing standardized uptake value (SUV) measurements for the 1% dose-reduced and the full-dose static images showed considerable bias in the dose-reduced images. After de-noising, however, they exhibited near-identical SUV. SUV in the 10% dose-reduced images largely resembled those of the full-dose images, but were further improved using the de-noising model.
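SUV itself is a simple normalization. As a minimal sketch, assuming the common body-weight formulation (tissue activity concentration divided by injected dose per gram of body weight):

```python
def suv(activity_bq_per_ml, injected_dose_bq, body_weight_g):
    """Body-weight-normalized SUV: tissue activity concentration
    divided by the injected dose spread over the patient's mass."""
    return activity_bq_per_ml / (injected_dose_bq / body_weight_g)

# Hypothetical example: 300 MBq injected into an 80 kg patient,
# with a measured myocardial uptake of 15 kBq/mL
val = suv(15_000.0, 300e6, 80_000.0)
print(round(val, 2))  # 15_000 / 3750 = 4.0
```

Because the injected dose appears in the denominator, any count-level bias introduced by dose reduction propagates directly into the SUV, which is why the de-noising step matters for quantitation.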

The researchers conclude that their deep-learning noise-reduction model enables significant 18F-FDG dose reduction in cardiac PET imaging without losing diagnostic accuracy. “A reduction to one hundredth of the dose is possible with quantitative clinical metrics comparable to that obtained with a full dose,” they write. “This dose reduction is important for patients, staff, general radiation protection and healthcare economy.”

Plasmonic metasurface gives high-speed optical WiFi a boost

Physicists and engineers at Duke University in the US have developed a new metamaterial that could substantially increase the speed of wireless optical communications. The material, which consists of an array of “nanoantennas” made from cubes of silver just 60 nm wide, can capture light within a 120-degree field of view and relay it into a narrow angle with a record-high efficiency of around 30%.

Although light in the visible and infrared part of the electromagnetic spectrum carries more information per unit time than the radio waves used in wireless technologies such as Bluetooth and WiFi, data transmission at visible and infrared wavelengths is currently restricted to fibre-optic cables. One reason for this is that wireless receivers must be able to capture light from different directions simultaneously. The simplest way to do this is to make the receivers physically bigger, but that reduces the speed at which they can transmit information onwards, thus lessening any advantage.

In 2016, researchers at the Connectivity Lab (a subsidiary of Facebook) developed a new type of receiver that could, in principle, be used for wireless communications at optical frequencies. Their device consisted of a spherical bundle of fluorescent fibres that captured blue light and re-emitted green light that could then be funnelled into a small receiver. However, it could only transmit two gigabits of information per second, which compares poorly with standard fibre-optic providers (which typically offer around 10 Gb/s) as well as high-end systems offering thousands of Gb/s.

Speeding up

A team led by Maiken Mikkelsen has now used the physics of surface plasmons to speed up the Connectivity Lab’s design. Plasmons are quasiparticles that arise when light interacts strongly with electrons in a nanostructured metal, causing the electrons to oscillate collectively. By adjusting the shape, size and arrangement of the nanoscale structures, the metallic material can be tailored to capture light at specific frequencies, increasing the device’s light-absorbing speed and light-emitting efficiency by a factor of more than 1000.

Mikkelsen and her colleagues made their plasmonic metasurface by depositing an array of silver nanocubes, spaced 200 nm apart, atop a thin (75 nm) silver substrate coated with a polymer containing four layers of fluorescent dye. The researchers report that the interaction of the nanocubes with the electrons in the substrate 7 nm below enhances the overall fluorescence of the dye by 910 times and its light emission rate by 133 times. Such values were previously only possible for isolated, highly optimized single nanostructures, not for whole arrays. “While we haven’t yet integrated a fast photodetector like the Connectivity Lab did in their original work, we have solved the major bottleneck in the design,” Mikkelsen says.

Centimetre-sized sample

The researchers also observe that the metasurface can collect fast-modulated light with a 3 dB bandwidth exceeding 14 GHz from a 120-degree field of view and relay it into a narrow angle with an overall efficiency of around 30%. This value, they say, is a record high “to the best of our knowledge”.

The researchers, who describe their experiments in Optica, say they can fabricate their metasurface over areas as large as centimetres using a simple technique known as liquid deposition without any loss in efficiency. They now plan to assemble several plasmonic devices together to cover a 360° field of view.

Light-induced lattice vibrations could speed up data recording


Intense laser pulses can turn an antiferromagnetic material into a ferromagnetic one within just a few picoseconds (10⁻¹² s) – a time scale that matches the fundamental limit for magnetization switching and vastly exceeds the recording speeds of today’s computer hard drives. The technique, which works by optically “shaking” the crystal lattice of dysprosium orthoferrite (DyFeO3), could form the basis of a fast and energy-efficient new way of processing data.

Modern hard disk drives encode data by using magnetic field pulses to flip the spins of electrons (representing binary zeros and ones) in ferromagnetic materials within the disk. Because these magnetic pulses require a substantial electrical current, the data-writing process dissipates significant amounts of energy. It is also relatively slow, with a complete spin flip taking tens of nanoseconds (1 ns = 10⁻⁹ s).

Antiferromagnets like DyFeO3 are considered promising candidates for future high-density memory applications because their spins flip much faster, with characteristic frequencies in the terahertz range. These rapid spin flips are possible because the electron spins in DyFeO3 are aligned antiparallel to each other – meaning that the material (unlike ferromagnets, which have parallel electron spins) lacks a net magnetization. The spins in antiferromagnets are also robust to external magnetic perturbations, making them a stable platform for data storage.

Controlling the exchange interaction

Researchers led by Andrea Caviglia of the Delft University of Technology in the Netherlands have now put these properties to work by showing that intense (> 10 MV cm⁻¹) mid-infrared laser pulses just 250 femtoseconds (1 fs = 10⁻¹⁵ s) long can switch the spins in DyFeO3 in less than 5 picoseconds. The mechanism for this switch lies in the interaction between an electron’s spin (roughly, its rotation on its own axis) and its orbital momentum, which stems from the electron’s movement around the atomic nucleus and is related to the shape of the material’s electronic orbital.

In DyFeO3, the spin of the transition-metal (Fe) ion and the orbital momentum of the rare-earth (Dy) ion are strongly coupled via a mechanism known as an exchange interaction. This quantum interaction occurs between pairs of identical fermions (such as electrons), and it tends to prevent the spin magnetic moments of neighbouring fermions from pointing in the same direction.

Caviglia and colleagues, however, found that the intense laser pulses essentially “shook up” the lattice of DyFeO3, producing ultrafast and long-lasting changes in the exchange interaction. These changes made it possible for the material to undergo a phase transition, switching from an antiferromagnet to a ferromagnet.

Ultrafast lattice control

The researchers, who report their work in Nature Materials, say that it was previously thought that phonons (that is, lattice vibrations) could only change a material’s magnetism on a timescale of nanoseconds. “We have reduced the magnetic switching time by a factor of 1000, which is a major milestone in itself,” says team member Rostislav Mikhaylovskiy of Lancaster University in the UK.

The researchers hope that their findings will encourage further research into the exact mechanisms governing ultrafast lattice control of magnetic states. They now plan to optically stimulate other phonon modes in DyFeO3. “These modes often feature a symmetry that is different to the one we have already addressed and thus might have a fundamentally distinctive impact on the magnetic state of the antiferromagnet,” study lead author Dmytro Afanasiev tells Physics World. “Who knows what kind of novel scenarios for light-driven magnetic recording they may provide.”

Dress inspired by Perseverance Rover’s parachute, augmented reality Sun brightens your living room

“Dare mighty things” was the message surreptitiously encoded into the parachute of NASA’s Perseverance Rover, which landed on Mars last week. Now, designers at Svaha Apparel have created a dress based on the pattern of red, white and black segments that made up the parachute. The company is currently running a pre-order campaign and says that it will produce the dress if it gets at least $10,000 worth of orders. According to the Svaha website, the goal has already been met – so the dress may soon be available for $69.99. The company is running a similar campaign for a parachute-inspired t-shirt, which retails for $24.99.

In 2003, the Danish-Icelandic artist Olafur Eliasson lit up the Turbine Hall of London’s Tate Modern with an artificial Sun. Now you can brighten up a room in your house on a dull winter’s day with a virtual version of that artwork – at least when you look at the room through the camera of your smartphone.

Eliasson has teamed up with Daniel Birnbaum, director of the augmented reality platform Acute Art, to create a Pokémon Go-like version of his Sun that appears on your screen. And if that gets too bright for you, a rain cloud is available – perhaps followed by a virtual rainbow.

Shapley, Curtis and the ‘island universes’ controversy

“The Shapley–Curtis debate makes interesting reading, even today. It is important not only as a historical document but also as a glimpse into the reasoning processes of eminent scientists engaged in a great controversy for which the evidence on both sides is fragmentary and partly faulty. This debate illustrates forcefully how tricky it is to pick one’s way through the treacherous ground that characterizes research at the frontiers of science.” – Frank Shu

In 1919 George Hale, head of Mount Wilson Observatory, called for the US National Academy of Sciences to host a debate about either Einstein’s theory of relativity or “island universes” – galaxies outside of our own. The home secretary of the academy, C G Abbot, was sanguine that either would be worth pursuing on a public stage. He wrote to Hale saying: “You mentioned the possibility of a sort of debate. From the way the English are rushing relativity in Nature and elsewhere, it looks as if the subject would be done to death long before the meeting of the academy, and perhaps your first proposal to discuss the island universe would be more interesting. I have a sort of fear, however, that people care so little about island universes that unless the speakers took pains to make the subject very engaging, the thing would fall flat.”

And so it followed that at a meeting of the academy on 26 April 1920, scientists Harlow Shapley and Heber Curtis presented contrasting arguments that collectively came to be known as “the Great Debate”, though at the time it was officially titled “The Scale of the Universe”.

Shapley argued that there was nothing more to the universe than our Milky Way galaxy, and there were no other “island universes”. He could not swallow the notion that there was more to reality than our own galaxy, as that would imply that Andromeda was about 10⁸ light-years away from us. Furthermore, he claimed these observed “spiral nebulae” were merely nearby gas clouds within the Milky Way, rather than distinct galaxies in their own right. He also argued that our Sun was far from the centre of the Milky Way, yet another point of disagreement between Curtis and him.

The Great Debate was not only about the nature of the universe. The young, hungry Shapley had an agenda – he had hoped that by defeating the older Curtis, he would earn the directorship of Harvard College Observatory. It would be a tall order – Curtis had been known to be a skilled and precise orator. And his confidence was on full display even before the debate. In a letter to Shapley he wrote that “A good friendly scrap is an excellent thing, once in a while…sort of clears up the atmosphere.”


It is likely that Shapley knew he would be swimming upstream. He opened his argument by saying that “To Ptolemy and his school, the universe was geocentric; but since the time of Copernicus the Sun, as the dominating body of the solar system, has been considered to be at or near the centre of the stellar realm. With the origin of each of these successive conceptions, the system of stars has ever appeared larger than was thought before. Thus the significance of man and Earth in the sidereal scheme has dwindled with advancing knowledge of the physical world, and our conception of the dimensions of the discernible stellar universe has progressively changed. Is not further evolution of our ideas probable?”

Contrary to Shapley, Curtis argued that Andromeda and other spiral nebulae could, in fact, be other galaxies. In support of his hypothesis he appealed to the fact that Andromeda seemed to possess more novae than the Milky Way. Why would this be if Andromeda was merely a part of the Milky Way? A better explanation was that Andromeda was a distinct galaxy that simply possessed a different rate of nova occurrences than the Milky Way.

Shapley and Curtis were both given 40 minutes to make their case to an audience of academics, possibly including Albert Einstein. Shapley presented ideas he had written in one of his papers, emphasizing the scale of the Milky Way. Curtis, meanwhile, offered a slideshow to express his explanation that spiral nebulae were actually “island universes” – better known to us today as galaxies. The speakers were not really addressing each other’s core arguments, with Shapley focusing on the Milky Way’s size and Curtis on the possibility of island universes. Curtis’s eloquence and stage presence dwarfed Shapley’s, which may ultimately have contributed to Curtis’s victory in the eyes of the audience.

Fortunately for Shapley, he still earned the directorship of the Harvard College Observatory, while Curtis went on to run the Allegheny Observatory. In 1923 astronomer Edwin Hubble measured the changing brightness of what are called Cepheid variable stars. He demonstrated that they were so distant from us as to be outside of the Milky Way. With that, the Great Debate was settled, and Curtis’s apparent victory upgraded to a definitive one.

Shapley’s position that the Milky Way is the entirety of our universe might seem laughable to our contemporary minds. No modern scientist would admit to ever rejecting a good explanation of observations merely because it violates our intuitions about what reality should be like. But we should hesitate before judging Shapley. Today, there are debates about the existence of the multiverse, and future scientists may one day laugh at our unwillingness to accept that, just as we are tempted to judge Shapley’s resistance to accept the true size of the universe.

  • This article was published in Lateral Thoughts, Physics World’s regular column of humorous and offbeat essays, puzzles, crosswords, quizzes and comics, which appears on the back page of the print edition. You can submit your own Lateral Thoughts. Articles should be 900–950 words, and can be e-mailed to pwld@ioppublishing.org

COVID-19 leads to major overhaul for radiotherapy

© AuntMinnieEurope.com

Greater use of hypofractionated dosing regimens has helped radiation oncology sites deliver radiation therapy to cancer patients in England during the COVID-19 pandemic, offsetting a massive decline in treatment sessions overall, according to a study published in Lancet Oncology.

The study from the University of Leeds with Public Health England and the Royal College of Radiologists evaluated how radiation therapy practice changed on a number of fronts during the first wave of the outbreak.

Compared with the same periods in 2019, in 2020 the number of radiotherapy courses dropped by 19.9% in April, 6.2% in May and 11.6% in June, reported Katie Spencer, a fellow in clinical oncology at Leeds Teaching Hospitals National Health Service (NHS) Trust, and colleagues. The data reflect guidelines during the pandemic and more appropriate use of radiation therapy in some key areas, such as short courses with high doses (hypofractionated) where appropriate, delays for nonurgent care, and increased use of radiation therapy as an alternative to surgery.

“Although radiotherapy activity decreased during the first wave of the pandemic, our data suggest that the overall impact of this decline is likely to be modest,” said the group. “In addition, radiotherapy appears to have mitigated against some of the indirect harms of the pandemic by maintaining curative treatment options despite the challenges facing surgical services.”

Search for value in cancer care

Spencer is a specialist on health economics and value in cancer care, particularly the use and cost-effectiveness of radiotherapy, and an experienced user of routine NHS healthcare data to understand variation and improve value and outcomes. Her team’s study was designed to evaluate the impact of the COVID-19 pandemic on radiotherapy services across the NHS in England and guidelines issued in response to the outbreak. About one-third of cancer patients typically undergo radiation therapy.


“Alongside surgery and systemic anticancer therapy, radiotherapy plays a major part both as a curative treatment and in the palliation of [localized] symptoms from advanced disease,” the authors wrote. “At the outset of the pandemic, all three treatment modalities were affected by constraints on COVID-19 testing and staff shortages.”

Researchers evaluated the use of radiation therapy delivered by all 52 NHS radiation therapy providers in England pre-pandemic and through the end of June 2020. During a lockdown period – from 23 March to 28 June – the total number of radiation treatment courses dropped by 3263 across the NHS in England, while the number of treatment appointments was down by 119,050, the authors reported.

The declines were in line with national and international guidelines for prioritizing/triaging patients and managing care during the pandemic.

“The disproportionately greater fall in treatment attendances largely reflects a rapid increase in the use of ultra-hypofractionated treatment regimens across several [tumour] sites,” the authors noted.

For example, in April 2020, researchers reported dramatically more use of a treatment regimen that specified 26 Gy in five fractions for the adjuvant treatment of breast cancer, as opposed to the 40 Gy in 15 fractions that was more common pre-pandemic.
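The rationale for swapping regimens can be made concrete with the linear-quadratic model. This is an illustrative calculation, not part of the study: it assumes the standard biologically effective dose (BED) formula and an α/β ratio of 3.5 Gy, a value often quoted for breast tissue:

```python
def bed(n_fractions, dose_per_fraction_gy, alpha_beta_gy):
    """Biologically effective dose under the linear-quadratic model:
    BED = n * d * (1 + d / (alpha/beta))."""
    d = dose_per_fraction_gy
    return n_fractions * d * (1 + d / alpha_beta_gy)

AB = 3.5  # assumed alpha/beta ratio (Gy); values in the literature vary

conventional = bed(15, 40 / 15, AB)  # 40 Gy in 15 fractions
ultra_hypo = bed(5, 26 / 5, AB)      # 26 Gy in 5 fractions
print(round(conventional, 1), round(ultra_hypo, 1))
```

With this assumed α/β the two schedules give broadly comparable BED values, which is why the five-fraction course can substitute for the fifteen-fraction one while cutting hospital attendances by two-thirds.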

“A marked increase in the use of ultra-hypofractionation in the neoadjuvant treatment of rectal cancer was also observed, with a reduction in the use of less than 2 Gy per fraction regimens,” they said.

Other key findings

The researchers also reported that, compared with other age groups, treatment declined more for people over the age of 70. This could reflect these patients’ higher risk due to age and comorbidities, as well as the ability to defer care for certain conditions such as prostate cancer and nonmelanoma skin cancer.

In tumour types that typically require more immediate treatment, such as cancers of the rectum, bladder and oesophagus, the researchers observed an increase in the number of treatment courses. Spencer and colleagues suggested that this could signal the use of radiation therapy as an alternative to surgery. For example, the number of curative courses for bladder cancer rose by 143.3% in May 2020 relative to May 2019.

Areas of concern, however, include a decline in the number of palliative treatment courses and a persistent decline in radiation therapy overall in June 2020 relative to June 2019. NHS data show that the number of referrals for possible symptomatic cancer was 21% lower in June 2020 compared with the same period in 2019, the authors noted.

“New diagnoses were suppressed by 26%, which is probably a key contributor to the ongoing suppression in radiotherapy activity up to June 2020,” Spencer and colleagues wrote.

It’s possible that this trend will have an effect on outcomes as the data are followed over time.

“As COVID-19 cases again rise, these data are crucial for modelling indirect harms of the pandemic and establish a new baseline for radiotherapy treatments from which to plan for the ongoing delivery of care throughout subsequent pandemic waves and into the recovery beyond,” the authors wrote. “They also reinforce the need to address any persisting delays in cancer diagnostic pathways.”

  • This article was originally published on AuntMinnieEurope.com ©2021 by AuntMinnieEurope.com. Any copying, republication or redistribution of AuntMinnieEurope.com content is expressly prohibited without the prior written consent of AuntMinnieEurope.com.

How gravitational waves could reveal flaws in a black-hole theorem, satellites lead an agricultural revolution

The famous no-hair theorem says that black holes can only be defined in terms of three properties: mass, charge and spin. It has held up pretty well for about 50 years, but now some physicists are hoping to find evidence of violations of the theorem in gravitational waves from merging black holes. In this episode of the Physics World Weekly podcast, Jamie Bamber and Katy Clough of the UK’s University of Oxford discuss the possibility of finding deviations from the theorem, and what they could tell us about physics beyond the Standard Model.

Earth imaging satellites provide a wide range of information about agriculture that is used by everyone from farmers to those who set food policies. In this episode Catherine Nakalembe of the University of Maryland in the US explains how satellite monitoring can help mitigate the effects of drought and other climate and land use variations.

Multiplying light signals could give optical computers a boost

Researchers in Russia and the UK have proposed a new and simple way to produce binary output signals in the logic gates of optical computers. Developed by Nikita Stroev at the Skolkovo Institute of Science and Technology in Russia and Natalia Berloff at the UK’s University of Cambridge, the technique involves multiplying the input signals to the gates, instead of adding them linearly. With further improvements, their approach could drastically reduce the number of light signals required for optical computers to operate, improving their potential for complex problem solving.

Optical computers are an emerging solution to the limitations of conventional electronic devices. Not only could they enable information to travel much faster through their component circuits; they should also have a far lower energy consumption and allow information to be processed in new ways that would be much more efficient at solving certain problems.

Instead of electrical signals, optical computers use the continuous phase of photons to encode and distribute binary information. A big challenge in building optical computers is that photons normally do not interact with each other, making it difficult to create logic gates. One way of making photons interact is to use materials with nonlinear refractive indices. When two or more input optical signals are combined in such materials, they interact with each other via the material’s electrons. By carefully engineering these interactions, researchers could build devices in which the two signals add together to deliver the desired output – at least in principle.

Unpredictable phase

In practice, however, nonlinear materials can have unpredictable effects on the phases of output signals. To deal with this problem an external resonant excitation is introduced to set the phases of output photons to clear binary states, which is not an ideal solution.

Stroev and Berloff explored a more robust approach in their study. Instead of adding the input signals linearly, they combined them by multiplying their wave functions. Under appropriate conditions, the researchers calculate, the phases of the coupled signals change to reach a minimum-energy configuration. This produces an output signal with a phase clearly associated with either a 0 or a 1, without any need for additional signals.
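The phase arithmetic behind multiplication-based gates can be illustrated with a toy sketch. This is not the authors' polariton model – it only shows how multiplying two unit-amplitude signals with binary phases (0 or π) yields an output phase that behaves like an XOR of the input bits:

```python
import cmath

def bit_to_phase(b):
    """Encode a bit in the phase of a unit-amplitude signal:
    0 -> phase 0, 1 -> phase pi."""
    return cmath.exp(1j * cmath.pi * b)

def read_bit(signal):
    """Decode by checking which binary phase the signal is closer to."""
    return 0 if signal.real > 0 else 1

def multiply_gate(a, b):
    """Multiplying two phase-encoded signals adds their phases, so the
    product's phase encodes the XOR of the input bits."""
    return read_bit(bit_to_phase(a) * bit_to_phase(b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, multiply_gate(a, b))
```

In the toy version the output phase is always exactly 0 or π by construction; the physical challenge the researchers address is getting a real nonlinear system to relax onto one of those two phases on its own.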

The duo’s model system used polaritons: quasiparticles that form through a strong coupling between light and matter, giving them hybrid physical properties. In their design, they multiplied the wavefunctions of coherent, superfast polariton pulses, guiding them towards the correct output phase by temporarily altering their coupling strengths.

The inherent noise in the signals means that further improvements will be needed before the system can be integrated into the large-scale production of optical computers. However, the early success of Stroev and Berloff’s approach reveals a promising new route towards superfast, real-world problem solving, in areas too complex for conventional computers to handle.

The research is described in Physical Review Letters.

Iridium in undersea crater confirms asteroid wiped out the dinosaurs

Strong evidence that the dinosaurs were killed off 66 million years ago by an asteroid hitting Earth has been found in Chicxulub crater under the Gulf of Mexico. An international team has measured an abundance of the rare element iridium in the crater and similarly high concentrations of the element are known to occur in sediments laid down at the time of the Cretaceous–Paleogene boundary (K–Pg) extinction event, which saw many species on Earth vanish.

Measuring 200 km across, the Chicxulub crater is believed to have been created by an 11 km-wide asteroid crashing into Earth. The impact would have sent vast amounts of vaporized rock into the atmosphere, blocking out the Sun and creating a winter that could have lasted several decades. The result, scientists believe, was the mass extinction of 75% of species on Earth including the non-flying dinosaurs.

The crater was discovered in the 1990s, but the idea that the K–Pg extinction was caused by an asteroid impact was proposed a decade earlier by a team that included the physics Nobel laureate Luis Alvarez. They found an unusually high amount of iridium in sedimentary rocks laid down at the K–Pg boundary. Iridium is rare in the Earth’s crust because it is a siderophile, which means that it dissolves in iron and therefore tends to sink into the Earth’s core. Iridium is much more abundant in asteroids, leading Alvarez and colleagues to conclude that the vaporization of an asteroid released large amounts of iridium into the atmosphere, which then fell to the ground as dust as the dinosaurs disappeared.

Huge tsunamis

As well as the subsequent discovery of the Chicxulub crater, the impact extinction theory is backed up by evidence that huge tsunamis occurred in the Gulf of Mexico and Caribbean regions at the time. However, the evidence linking the Chicxulub impact to the K–Pg extinction is not conclusive. The iridium could have been put into the atmosphere by another asteroid impact or impacts; and some scientists have suggested that increased volcanic activity, rather than an asteroid, could have caused the extinction.

In 2016, Sean Gulick at the University of Texas at Austin and Joanna Morgan of Imperial College London led an international team of scientists on the International Ocean Discovery Program on an expedition to the Chicxulub crater. They took about 900 m of rock core samples and found a similar spike in iridium content in sediment laid down just after the crater was formed. Indeed, the sedimentary rock containing iridium is so thick that they were able to date the dust to about two decades after the impact.

Similar abundances

Studies of the cores also revealed high levels of several other elements that are associated with asteroids and that have been found in similar abundances in K–Pg sediments at 52 sites around the world.

“We are now at the level of coincidence that geologically doesn’t happen without causation,” says Gulick. “It puts to bed any doubts that the iridium anomaly [in the geologic layer] is not related to the Chicxulub crater.”

A separate study of the cores, carried out in 2019, revealed that the crater rock is depleted in sulphur compared with the surrounding limestone. This suggests that the impact blew large amounts of sulphur into the air, where it would have contributed to the cooling before falling as acid rain – making the situation on the ground even worse.

The research is described in Science Advances.

Quantum computer captures physics of high-energy particles

Quantum theory is often portrayed as a disruptive force, complicating everything that classical physics seemed to have figured out. Now, however, physicists at Lawrence Berkeley National Laboratory (LBNL) in the US have demonstrated that the two can work side-by-side, in a proof-of-principle study that shows how a quantum computer can complement a classical method of modelling high-energy particle collisions.

Machines such as CERN’s Large Hadron Collider (LHC) smash protons together at energies of more than 1 TeV, producing showers of thousands of particles. Physicists use computer models to predict what happens to those particles by the time they reach a detector. In one such modelling technique, known as a parton shower, the assumption is that the particles that make it to the detector are the last step in a long cascade of particles and radiation that converted into one another after the initial collision.
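The cascade idea behind a parton shower can be illustrated with a deliberately simple toy: a single high-energy particle repeatedly splits into two daughters that share its energy, until every fragment falls below some cutoff. The sketch below is purely illustrative – the splitting probability, energy-sharing rule, and cutoff are invented for this example and bear no relation to a real QCD shower or to the LBNL model.

```python
import random

def toy_parton_shower(e0, e_cut=1.0, split_prob=0.6, rng=None):
    """Toy 1D cascade: each particle above the cutoff energy may split
    into two daughters that share its energy; the cascade continues
    until all fragments are final. Returns the final-state energies
    that would reach the 'detector'. Illustrative only."""
    rng = rng or random.Random()
    active, final = [e0], []
    while active:
        e = active.pop()
        if e > e_cut and rng.random() < split_prob:
            z = rng.uniform(0.1, 0.9)        # energy-sharing fraction
            active.extend([z * e, (1 - z) * e])
        else:
            final.append(e)                  # too soft, or chose not to split
    return final

shower = toy_parton_shower(1000.0, rng=random.Random(42))
print(len(shower), sum(shower))              # many fragments, energy conserved
```

Each call samples one particular cascade – one "event" with one set of intermediate splittings – which mirrors the one-history-at-a-time character of a classical shower that Bauer describes below.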

For this parton shower to include quantum features of particle interactions, however, the model needs to simultaneously consider all possible intermediate particles that could form between the initial and final particles – something that cannot be done by a classical computer algorithm, says Christian Bauer, a theoretical physicist at LBNL and co-author of the paper. “What a classical [parton] shower does is that it sort of goes through and produces a particular event, one at a time, with a particular intermediate particle,” Bauer explains. “The quantum version of the shower sort of does all possibilities in one shot.”
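The contrast Bauer draws can be sketched in a few lines: a classical Monte Carlo run commits to one branching history per sample, whereas a quantum register holds an amplitude for every possible history at once. The toy below (not the paper's algorithm – the two-step yes/no "branching" decisions and the uniform superposition are invented for illustration) tracks those amplitudes explicitly as a statevector over all histories.

```python
import itertools
import math
import random

# Classical approach: each run samples a single branching history.
def sample_history(steps, p=0.5, rng=None):
    rng = rng or random.Random()
    return tuple(rng.random() < p for _ in range(steps))

# Quantum-style bookkeeping: one amplitude per history, all held at once
# (equivalent to a Hadamard gate on each of `steps` qubits).
def uniform_superposition(steps):
    amp = 1 / math.sqrt(2 ** steps)
    return {h: amp for h in itertools.product([False, True], repeat=steps)}

one_run = sample_history(2, rng=random.Random(0))
state = uniform_superposition(2)
print(len(one_run))                    # a single 2-step history
print(len(state))                      # 4: all histories in one shot
print(sum(a * a for a in state.values()))  # amplitudes normalised: 1.0
```

For two decision steps the classical sampler must be rerun to explore other histories, while the statevector already contains all four – the "all possibilities in one shot" Bauer refers to, here with 2 toy qubits rather than the 5 used on the IBM chip.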

In their study, which appears in Physical Review Letters, Bauer and colleagues created a quantum algorithm for the parton shower. To do this, they developed a simplified version of the Standard Model of particle physics that shares some of the full model’s features but is simple enough for present-day quantum computers to execute. They then used the IBM Q Johannesburg chip to calculate details of particle processes that can occur within this simplified model. This IBM chip has 20 superconducting quantum bits (qubits), and the LBNL scientists used cloud access to program it to run their quantum parton shower algorithm on 5 qubits, using 48 quantum gate operations. When they compared the real chip’s output to a prediction made by IBM’s quantum computer simulator, they found excellent agreement – indicating that the computer fully captured quantum effects in their particle model.

A quantum problem for a quantum machine

The idea that quantum effects are hard or impossible to model on non-quantum devices is an old one, dating back to lectures given by the physicist Richard Feynman in the early 1980s. Features of parton showers that are formulated in the language of quantum mechanics from the get-go certainly fall into this category, says Jesse Thaler, a physicist at the Massachusetts Institute of Technology who was not involved in the study. “While some aspects of particle scattering can be described in a classical language, nature is fundamentally quantum mechanical,” Thaler says. The current study, he suggests, could be a stepping-stone towards a future in which theorists use the outputs of both classical and quantum computers to piece together more complex models of what happens inside particle colliders.

The speed at which that future arrives, however, will depend on overcoming certain hardware challenges within quantum computing. “While I would be surprised if these challenges could be overcome before the end of the LHC era, it is plausible that these kinds of hybrid classical and quantum algorithms could be useful for interpreting data from a future collider,” Thaler predicts.

Two-way collaboration

Even though quantum computers are not yet advanced enough to outperform classical machines completely, Bauer thinks there are benefits in making them work together. While classical computational techniques produce excellent results in some areas of particle physics, he and his colleagues aim to concentrate on inherently quantum effects that classical machines could never properly handle. “We should only ask quantum computers to do the things that are hard to do on classical computers,” he says.

Benjamin Nachman, a physicist at LBNL and lead author of the study, adds that collaboration between high-energy physicists and quantum information scientists is a two-way street. “There are techniques for how to do [high energy] physics that we could apply to improve error mitigation on quantum computers,” he says. He and his collaborators have begun exploring some of these techniques, aided by a US Department of Energy programme that provides funding for this type of interdisciplinary partnership.

In the meantime, the LBNL team is focusing on making their current “toy” version of Standard Model physics, and the accompanying quantum algorithm, more sophisticated. “If a better [quantum] computer comes tomorrow, we can run the model that we developed here with more precision,” Bauer says. “But in order to really go to the Standard Model, there’s more theoretical work needed as well – which is not unsurmountable at all, it just needs to be done.”

Copyright © 2026 by IOP Publishing Ltd and individual contributors