
Single-cell nanobiopsy explores how brain cancer cells adapt to resist treatment

Infographic of a double-barrel nanopipette

Glioblastoma (GBM) is the deadliest and most aggressive form of brain cancer. Almost all tumours recur after treatment, as surviving cells transform into more resilient forms over time to resist further therapies. To address this challenge, scientists at the University of Leeds have designed a novel double-barrel nanopipette and used it to investigate the trajectories of individual living GBM cells as they change in response to treatment.

The nanopipette consists of two nanoscopic needles that can simultaneously inject exogenous molecules into and extract cytoplasm samples from a cell. The nanopipette is integrated into a scanning ion conductance microscope (SICM) to perform nanobiopsies of living cells in culture. Unlike existing techniques for studying single cells, which usually destroy the cell, the nanopipette can take repeated biopsies of a living cell without killing it, enabling longitudinal studies of an individual cell’s behaviour over time.

Writing in Science Advances, the researchers explain that SICM works by measuring the ion current between an electrode inserted in a glass nanopipette and a reference electrode immersed in an electrolytic solution containing the cells. Nanobiopsy is performed when an ion current flows through the nanopore at the tip of the nanopipette after applying a voltage between the two electrodes. In their double-barrel nanopipette, one barrel acts as an electrochemical syringe to perform cytoplasmic extractions; the second contains an aqueous electrolyte solution that provides a stable ion current for precise positioning and nanoinjection prior to nanobiopsy.

The semi-automated platform enables extraction of femtolitre volumes of cytoplasm and simultaneous injection into individual cells. The platform provides automated positioning of the nanopipette using feedback control (the ion current drops when the nanopipette approaches the sample), while detection of particular current signatures indicates successful membrane penetration of a single cell.
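In practice, that feedback control amounts to a simple rule: step the pipette towards the cell and stop as soon as the ion current dips by a set fraction of its value far from the surface. The Python sketch below illustrates the idea only; the current model, step size, threshold and hardware functions are hypothetical stand-ins, not the Leeds group's control software.

```python
# Toy illustration of SICM-style approach control: descend until the ion
# current drops below a set fraction of its value far from the surface.
# The sensor model and all parameters are hypothetical stand-ins.

def read_ion_current(z_nm: float) -> float:
    """Fake sensor: current decreases as the tip nears the surface (z -> 0)."""
    bulk_current = 1.0          # normalized bulk ion current
    return bulk_current * (1.0 - 0.5 / (1.0 + z_nm / 10.0))  # arbitrary model

def approach_surface(start_z_nm=500.0, step_nm=5.0, drop_fraction=0.01):
    """Step the pipette down until the current falls by `drop_fraction`."""
    bulk = read_ion_current(start_z_nm)
    z = start_z_nm
    while z > 0:
        current = read_ion_current(z)
        if current < bulk * (1.0 - drop_fraction):
            return z                # stop: close enough to the cell surface
        z -= step_nm                # otherwise keep descending
    return 0.0

if __name__ == "__main__":
    print(f"Stopped {approach_surface():.0f} nm above the sample (toy model)")
```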

Longitudinal studies

As a proof of concept, the researchers conducted longitudinal nanobiopsy of a GBM cell (and its progeny), profiling gene expression changes over 72 h. They performed nanobiopsies prior to therapy, during treatment with radiotherapy and chemotherapy, and post treatment.

“Our method is robust and reproducible, allowing membrane penetration and nanoinjection across different cell types with distinct mechanical properties,” write co-principal investigators Lucy Stead and Paolo Actis. “The average success rate of nanoinjection is 0.89 ± 0.07. Intracellular mRNA is then extracted.”

The researchers investigated the response of GBM cells to the standard treatment of 2 Gy of radiation and 30 µM of temozolomide. They visually tracked individual cells and their progeny over 72 h, with 98% remaining in the microscope’s field-of-view during this time frame – an important factor when aiming to perform longitudinal analysis.

Fluorescence images of brain cancer cells

On day 1, the researchers biopsied each cell, injected it with a fluorescent dye and imaged it. On day 2, half of the cells received irradiation and chemotherapy, while the others served as controls. All cells were imaged on days 2 and 3, and biopsied and injected again on day 4.

In cells that underwent day-1 nanobiopsies, survival was similar between treated and untreated cells, and cell division rates were comparable in the two groups. After 72 h, 63% of untreated control (not biopsied) cells survived, compared with 25% of the treated, biopsied cells. There was no difference in subsequent death rates between the cell subtypes identified at day 1, irrespective of treatment. However, a much larger proportion of untreated cells switched subtype over time, or produced progeny with a different subtype, than the treated cells.

“This suggests that untreated cells are significantly more plastic over the three-day time course than treated cells,” the researchers write. “The cell phenotype scores of paired day 1 and longitudinal samples revealed that treated cells tend to maintain the same phenotype during therapy, while untreated cells are more likely to switch transcriptional state over 72 h, suggesting that treatment either induces or selects for high transcriptional stability in this established GBM cell line.”

“This is a significant breakthrough,” says Stead. “It is the first time that we have a technology where we can actually monitor the changes taking place after treatment, rather than just assume them. This type of technology is going to provide a layer of understanding that we simply never had before. And that new understanding and insight will lead to new weapons in our armoury against all types of cancer.”

The team is convinced that the ability of these versatile nanoprobes to access the intracellular environment with minimal disruption holds potential to “revolutionize molecular diagnostics, gene and cell therapies”.

“Our future work will focus on increasing the throughput of the technology so that more cells can be analysed,” Actis tells Physics World. “We are working to improve the protocols for analysing the RNA extracted from cells so that more biological information can be gathered. We are also very keen to study more advanced biological models of brain cancer based on patient-derived cells and organoids.”

Keith Burnett: ‘I have this absolute commitment that the broader we are, the more powerful physics will be’

Founded in 1920, the Institute of Physics has had some high-flying presidents over the years. Early luminaries included Ernest Rutherford, J J Thomson and Lawrence Bragg, while more recently the presidency has been held by the likes of Jocelyn Bell Burnell, Julia Higgins and Sheila Rowan. The current incumbent is Keith Burnett, an atomic physicist who spent more than a decade as vice-chancellor of the University of Sheffield in the UK.

He studied at the University of Oxford and worked at the University of Colorado at Boulder and Imperial College London, before returning to Oxford, where he was head of physics in the mid-2000s. But despite a career spent almost entirely at top universities, Burnett is not a distant, elite figure. He grew up in the valleys of South Wales and revels in the fact that his cousin Richie Burnett was World Darts Champion in 1995.

Physics World caught up with Burnett to find out more about his career and vision for physics.

What originally sparked your life-long interest in physics?

I grew up in a mining valley in South Wales, which was a wonderful place with a really cohesive community. It was at the time of the Apollo space programme – oh my god, the excitement. You could see the possibilities and I was fascinated by the idea of space. But one thing I did have was a wonderful teacher in school – Mr Cook. Also, my father worked for a small engineering company that made ceramics. So I just loved the idea of science from the very beginning.

You went on to study at Oxford, where you did a PhD in atomic physics. What attracted you to that field?

I had absolutely wonderful undergraduate lecturers and teachers – one being another Welshman, Claude Hurst. There was also Colin Webb, who later started Oxford Lasers. He was an amazing undergraduate teacher at Jesus College and he really inspired me. In fact, he then passed me on to one of his buddies, Derek Stacey. The group had been founded by Heini [Heinrich] Kuhn, who was an émigré scholar from Germany, and had a wonderful tradition in precision atomic physics.

Did the commercial side of physics ever appeal in terms of your own career?

Not so much, but I did really admire what Colin was doing because he was very early in terms of commercialization. People wanted the type of excimer lasers he was making in the lab. In fact I just got an e-mail from him. He’s retired but very pleased that Oxford Lasers has won a good contract for doing semiconductor work. So I very much admire the applications of lasers and optics.

You were around in the 1990s at the time Bose–Einstein condensation was first observed in the lab. It was a peak period for atomic physics wasn’t it?

I was actually on the search committee that hired Carl Wieman to [the University of Colorado at] Boulder, where I was an assistant professor at the time. Carl joined the faculty and worked with Eric Cornell to make a Bose–Einstein condensate. I was tracking that very closely. It was an absolutely wonderful time because it went from “No-one thinks you can make it” to “Maybe they’ve made it” and then “Wow, it’s really big and juicy and we can do great stuff with it.”

Would you say Eric Cornell and Carl Wieman were worthy winners of the Nobel Prize for Physics in 2001?

Yes. They won it with Wolfgang Ketterle. It was a remarkable story with twists and turns because the person who developed the ideas behind [laser] cooling was Dan Kleppner at MIT. He was the first to develop hydrogen cooling with Tom Greytak. But what is really important is that the people at MIT taught other people elsewhere how to do it. Because of that, they progressed much faster and were able to learn from one another. It shows that if you don’t have trust and the ability to exchange ideas, everything slows down.

My cousin Richie was World Darts Champion in 1995. He’s the really well-known Burnett in the valley. Not me!

Keith Burnett

After spells at Imperial College and then back at Oxford, you became vice-chancellor at the University of Sheffield. How did that come about?

I was about 49 when they said “Will you be head of physics at Oxford?” And I thought “Yeah, that’ll be amazing!” So I did that and it was very perplexing but wonderful – an amazing department. I did that for a year. But the person who inspired me [to move to Sheffield] was actually an ex-president of the IOP – and the previous vice-chancellor of Sheffield – Gareth Roberts [who died in 2007]. He’s another Welshman, though from north Wales, which is very different from south Wales – they play soccer, not rugby – but still Welsh. I was very poor at rugby. But my cousin Richie was World Darts Champion in 1995. He’s the really well-known Burnett in the valley. Not me!

So what did Gareth Roberts say to you?

Well, I’d worked with Gareth at Oxford and he said “You should really think about it.” Sheffield is a city bathed in the traditions of making steel and metallurgy. So I thought I would love being part of the civic life of the city. I also felt this was a university that does wonderful things for its citizens and students. The other thing is that my daughter had gone to Sheffield before me – she’s an architect there so I always say I follow in my daughter’s footsteps.

As vice-chancellor at Sheffield, you were firmly opposed to the principle of student tuition fees. Why was that?

Higher education is not just for the individual. It has consequences for society and for business too. If you say “No, it’s just an individual choice whether someone goes to university and pays a fee”, well that can work to a certain extent. But you cannot then be sure you’ll have enough scientists to work in, say, industry or defence. As a country, we used to roughly balance the system in terms of where people went. But now it’s a free-for-all in terms of choice, which is bad if we need more people in science and engineering. Tuition fees also fundamentally change the relationship with students. I disagreed with fees when they came in and I still disagree with them now.

The UK university sector has expanded hugely over the last 20 years thanks to a huge rise in student numbers and the trebling of tuition fees in 2012. Has that been good or bad?

The big thing that happened during my time at Sheffield was the increase in student tuition fees [to £9000]. I was very much against the increase, which wasn’t a popular [position to hold] among many of my vice-chancellor colleagues. In fact, I remember being pressured by Number 10 to sign a letter with other Russell Group universities to support the rise. I knew it was going to be a major burden on households and we’re now in a situation where the UK has to write off £12bn [from students who never earn enough to pay their loans back]. We’ve got a very bad investment portfolio and the students have got debt. It’s been a disaster.

Large rectangular building with a glass facade divided into different-sized diamond shapes

Tuition fees haven’t risen for more than a decade now and many universities have come to rely on the much higher fees paid by international students. How has the growth in foreign students affected the higher-education sector?

International student fees used to be a top-up. When I was at Sheffield, we used them to build a new engineering teaching lab, known as the Diamond. But nowadays the income from international students is pretty much built into the fabric – in other words, without their fees you can’t run a university. We have some amazing physics departments in this country, but the tap that feeds them is actually undergraduate physicists, cross-subsidized by international students, especially from business schools, international relations and engineering. As a country, we need physics properly funded and less reliant on foreign students.

If you look at a place like Sheffield, students bring enormous benefits – vitality, money, inward investment

Keith Burnett

The rise in international students has also played a role in increasing immigration to the UK. Where do you stand in that debate?

If you look at a place like Sheffield, students bring enormous benefits – vitality, money, inward investment. Others may say “No, we don’t like students taking accommodation” and things of that sort. If you talk to experts in immigration, it’s far more neutral than people think. But the whole topic is inflammatory and it’s difficult to get a balanced discussion of the advantages and disadvantages. There are, though, some incredible physics departments in the UK – look at the number of companies working with the University of Bristol in its quantum tech. This is a big potential business long term.

After Sheffield, you became involved in the Schmidt Science Fellows scheme – what’s that all about?

It was an idea of [the US computer scientist] Stu Feldman, a long-term confidant of Eric and Wendy Schmidt – Eric being a former chief executive and chair of Google. Stu said “There ought to be a way in which people, once they’ve done their PhDs, can think more broadly rather than just carrying on in a particular thing.” How, in other words, can we identify people across the world who’ve got fantastic ideas and then give them some freedom to move? So we – our team at Rhodes House in Oxford – select people with exciting ideas and help them choose where they might go in the world.

What’s your role in the scheme?

My job is to mentor researchers in making this transition. Initially, I did all of the mentoring but now I have some colleagues. It can be all the way from handling financial issues to dealing with principal investigators to writing faculty applications. Over the last six years we’ve helped about 120 people across the world in different institutions. Some are now in national labs, while others have set up their own businesses. For me, it’s the most wonderful job because I get to hear the issues that early-career scientists have, such as using machine learning in all sorts of things – imaging biomolecules, precision drugs, everything.

What are the main challenges facing early-career researchers?

First and foremost, salaries. I think we’re in grave danger of underpaying our early-career scientists. We also need to do more to help people with their work–life balance. The Schmidt programme does have generous parental leave. There’s also the question of supporting and promoting people who work in interdisciplinary areas.

Three photos: a teacher and pupils with a robot; quantum computing abstract; pedestrians walking towards the UK Houses of Parliament

In October 2023 you started your two-year stint as IOP president. What are your priorities during your term in office?

The IOP has just launched its new five-year strategy and the big focus is the skills base of teachers and researchers. First, are we helping teachers enough – the people who help people get into physics? We need a strong pipeline of talent because physicists don’t just stay in academia, they move into finance, industry, policy.

Second, we are very interested in influencing science – especially the green economy. We have to explain that it’s physicists – working with engineers and chemists – who are at the core of efforts to tackle climate change.

We’re also thinking more about how to make membership of the IOP more useful and accessible. It’s not arrogance to think that someone with an awareness of physics is just that much better prepared for lots of things going on in the modern world.

How can members of the IOP get involved with helping put that strategy into practice?

Start by looking at the strategy, if you haven’t already. If you’re a member of a particular group or branch, then feed your ideas back to your representatives. Our influence as an institute is much more powerful if we’re the convenors and co-ordinators of a more general effort. We can’t do all the things, but our membership is big and strong. If you can’t find somebody, contact me.

You feel strongly about the need for the physics community to be more diverse. How do you see physics evolving over the next few decades?

There’s a wonderful book, After Nativism, that just came out by Ash Amin, who’s a trustee of the Nuffield Foundation, which I chair. He argues that many of the things needed to make a just, equitable and diverse society are not being advocated, with many parts of society backing away from these issues. But the younger generation is utterly committed to a future that’s more just, equitable and diverse. They’ve grown up freer of prejudice but also used to discussing these things more openly. They’re not interested in many of the divisions that people would see in terms of labels of any sort. Any labelling of people due to race, ethnicity, sexual proclivity – anything at all – is an anathema and I personally find that inspirational. I really find that inspirational.

As a profession, we are a long way off equity and have great deficits in terms of inclusion

Keith Burnett

How can the IOP help with such issues?

One of the things that the IOP can do is say “Well, what are the advantages of a society of that sort?” Some people may accuse us of being a bunch of “woke liberals”. We’re not. We’re just people who believe in justice and equity in society. But we’re going to have to work for it because, as a profession, we are a long way off equity and have great deficits in terms of inclusion. Going forward, we will have a younger generation who will care much less about these issues because they won’t see them. In fact, they’ll find it very strange that there was a time when the IOP didn’t represent society as a whole.

What are the benefits of a more equitable and inclusive physics community?

The advantages are huge. You know, if you exclude groups of people because of the labels you attribute to them, you’re “deleting” people who could be powerful, influential and helpful for physics. You’re just wasting people. I have this absolute commitment that the broader we are in terms of our people, the better, the more just and the more powerful we will be. I think our community wants that. Some won’t; some people might have a more traditional view of what society is. But it’s our duty and our incentive to say why we want a more just society – after all, it’s smarter, more powerful, more fun.

New photovoltaic 2D material breaks quantum efficiency record

Conventional solar cells have a maximum external quantum efficiency (EQE) of 100%: for every photon incident on the cell, they generate at most one photoexcited electron. In recent years, scientists have sought to improve on this by developing materials that “free up” more than one electron for every photon they absorb. A team led by physicist Chinedu Ekuma of Lehigh University in the US has now achieved this goal, producing a material with an EQE of up to 190% – nearly double that of silicon solar cells.
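Because EQE is simply the number of collected charge carriers per incident photon (expressed as a percentage), the headline figure translates directly into carrier counts. The short sketch below makes that arithmetic explicit; the photon count is an arbitrary illustrative number.

```python
# EQE = photoexcited carriers collected per incident photon (x100 for percent).
incident_photons = 1_000_000            # arbitrary illustrative photon count

for label, eqe_percent in [("conventional cell (one-per-photon limit)", 100),
                           ("CuxGeSe/SnS (reported)", 190)]:
    carriers = incident_photons * eqe_percent / 100
    print(f"{label}: {carriers:,.0f} carriers from {incident_photons:,} photons")
```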

The team made the new compound by inserting copper atoms between atomically thin layers of germanium selenide (GeSe) and tin sulfide (SnS). The resulting material has the chemical formula CuₓGeSe/SnS, and the researchers developed it by taking advantage of so-called van der Waals gaps. These atomically small gaps exist between layers of two-dimensional materials, and they form “pockets” into which other elements can be inserted (or “intercalated”) to tune the material’s properties.

Intermediate bandgap states

The Lehigh researchers attribute the material’s increased EQE to the presence of intermediate bandgap states. These distinct electronic energy levels arise within the material’s electronic structure in a way that enables them to absorb light very efficiently over a broad spectrum of solar radiation wavelengths. In the new material, these energy levels exist at around 0.78 and 1.26 electron volts (eV), which lie within the range over which the material can efficiently absorb sunlight.
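Converting those energies to wavelengths with λ = hc/E shows where in the solar spectrum the intermediate levels sit. The sketch below does the conversion for the two values quoted above; it is a simple illustration, not part of the Lehigh analysis.

```python
# Convert the reported intermediate-band energies to photon wavelengths
# using lambda = h*c / E. 1240 eV*nm is the usual shorthand for h*c.
HC_EV_NM = 1239.84  # Planck constant x speed of light, in eV*nm

for energy_ev in (0.78, 1.26):
    wavelength_nm = HC_EV_NM / energy_ev
    print(f"{energy_ev:.2f} eV  ->  {wavelength_nm:.0f} nm")
# 0.78 eV ~ 1590 nm (infrared); 1.26 eV ~ 984 nm (near-infrared edge of the visible)
```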

The material works particularly well in the infrared and visible regions of the electromagnetic spectrum, producing, on average, nearly two photoexcited charge carriers (electrons and holes bound in quasiparticles known as excitons) for every incident photon. According to Ekuma, such “multiple exciton generation” materials can serve as the active layer within solar cell devices, where their performance is fundamentally governed by exciton physics. “This active layer is crucial for enhancing the solar cell’s efficiency by facilitating the generation and transport of excitons in the material,” Ekuma explains.

Further research needed for practical devices

The researchers used advanced computational models to optimize the thickness of the photoactive layer in the material. They calculated that the EQE can be enhanced by keeping this layer thin (in the so-called quasi-2D limit), which preserves quantum confinement and limits losses from nonradiative recombination – a process in which electrons and holes have time to recombine instead of being whisked apart to produce useful current, and a key factor affecting efficient exciton generation and transport, Ekuma explains. “By maintaining quantum confinement, we preserve the material’s ability to effectively convert absorbed sunlight into electrical energy and operate at peak efficiency,” he says.

While the new material is a promising candidate for the development of next-generation, highly efficient solar cells, the researchers acknowledge that further research will be needed before it can be integrated into existing solar energy systems. “We are now further exploring this family of intercalated materials and optimizing their efficiency via various materials engineering processes to this end,” Ekuma tells Physics World.

The study is detailed in Science Advances.

Nanofluidic memristors compute in brain-inspired logic circuits

A memristor that uses changes in ion concentrations and mechanical deformations to store information has been developed by researchers at EPFL in Lausanne, Switzerland. By connecting two of these devices, the researchers created the first logic circuit based on nanofluidic components. The new memristor could prove useful for neuromorphic computing, which tries to mimic the brain using electronic components.

In living organisms, neural architectures rely on flows of ions passing through tiny channels to regulate the transmission of information across the synapses that connect one neuron to another. This ionic approach is unlike the best artificial neural systems, which use electron currents to mimic these synapses. Building artificial nanofluidic neural networks could provide a closer analogy to real neural systems, and could also be more energy-efficient.

A memristor is a circuit element with a resistance (and conductance) that depends on the current that has previously passed through it – meaning that the device can store information. The memristor was first proposed in 1971, and since then researchers have had limited success in creating practical devices. Memristors are of great importance in neuromorphic computing because they can mimic the ability of biological synapses to store information.

In this latest research, EPFL’s Théo Emmerich, Aleksandra Radenovic and their colleagues made their nanofluidic memristors using a liquid blister that expands or contracts as currents of solvated ions flow into or out of it, changing its conductance.

Iconic and ionic

In 2023, researchers took a significant step toward ion-based neuromorphic computing when they discovered memory effects in two nanofluidic devices that regulated ion transport across nanoscale channels. When subjected to a time-varying voltage, these devices displayed a lagging change in current and conductance. This is a memristor’s characteristic “pinched” hysteresis loop. However, the systems had weak memory performance, and were delicate to fabricate. Furthermore, the mechanism responsible for the memory effect was unclear.
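The pinched hysteresis loop mentioned above is straightforward to reproduce with a generic toy model: give the device a conductance state that lags the applied voltage, and the current-voltage curve sweeps out two branches that always meet at zero. The sketch below is exactly that kind of cartoon, with arbitrary parameters; it is not a model of the EPFL blister device.

```python
import math

# Toy memristor: conductance G relaxes toward a voltage-dependent target,
# so the I-V curve traced by a slow sine-wave drive is a pinched loop.
# All parameters are arbitrary illustrative values (arbitrary units).
G_MIN, G_MAX, RATE = 0.1, 1.0, 3.0    # conductance bounds and update rate
dt, f = 1e-3, 1.0                     # time step and drive frequency

G = G_MIN
for step in range(int(1.0 / dt)):     # one full drive period
    t = step * dt
    v = math.sin(2 * math.pi * f * t)             # applied voltage
    target = G_MAX if v > 0 else G_MIN            # state grows (+V) or shrinks (-V)
    G += RATE * (target - G) * dt                 # conductance lags the drive
    i = G * v                                     # current through the device
    if step % 100 == 0:
        print(f"t={t:.2f}  V={v:+.2f}  G={G:.2f}  I={i:+.2f}")
# At V = 0 the current is zero whatever G is, which "pinches" the loop.
```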

But this has not deterred the EPFL team, as Emmerich explains: “We wanted to show how this nascent field could be complementary to nanoelectronics and could lead to real-world computing applications in the future”.

To create their device, the EPFL researchers fabricated a 20 micron-by-20 micron silicon nitride membrane atop a silicon chip, with a 100 nm-diameter pore at its centre. On this chip, they deposited 10-nm-diameter palladium islands around which fluid could flow, by using evaporative deposition techniques. Finally, they added a 50–150 nm thick graphite layer, to create channels that led to the pore.

Tiny blister

Upon dipping the device into an electrolyte solution and applying a positive voltage (0.4–1.0 V), the researchers observed the formation of a micron-scale blister between the silicon nitride and the graphite above the central pore. They concluded that ions travelled through channels and converged at the centre, increasing pressure there and leading to blister formation. This blister acted as a resistive “short circuit” that increased the device’s conductance, placing it in the “on” state. Upon applying a negative voltage of the same magnitude, the blister deflated and the conductance decreased, placing the device in the “off” state.

Because the blister took time to deflate following the voltage shut-off, the device remembered its previous state. “Our optical observation showed the mechano-ionic origin of the memory,” says EPFL’s Nathan Ronceray.

Measurements of the current flowing through the device before and after the voltage reset showed that the device operated with a conductance ratio up to 60 on a timescale of 1–2 s, indicating a memory effect two orders of magnitude greater than previous designs. Emmerich adds, “This is the first time that we observe such a strong memristive behaviour in a nanofluidic device, which also has a scalable fabrication process”.

To create a logic circuit, the team connected two of their devices in parallel to a variable electronic resistor. Both devices thus communicated together through this resistor to achieve a logic operation. In particular, the switching of one device was driven by the conductance state of the other.
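One way to picture that coupling is as a voltage divider: when one device is in its high-conductance state it starves the other of the voltage it would need to switch. The toy model below captures only that qualitative behaviour, with made-up component values, and is not the circuit analysis published by the EPFL team.

```python
# Toy coupling of two ON/OFF memristive switches sharing a series resistor.
# A voltage-divider cartoon of "one device's state gating the other";
# all values are arbitrary and purely illustrative.
V_SUPPLY = 1.0            # applied voltage
R_SHARED = 1.0            # shared series resistor
R_ON, R_OFF = 0.2, 20.0   # memristor resistance in the on/off states
V_SWITCH = 0.6            # voltage a device must see across it to switch on

def voltage_across_B(state_A_on: bool) -> float:
    """Voltage left for device B after the divider formed with A and R_SHARED."""
    r_A = R_ON if state_A_on else R_OFF
    # A and B sit in parallel; approximate B as off (R_OFF) before it switches.
    r_parallel = 1.0 / (1.0 / r_A + 1.0 / R_OFF)
    return V_SUPPLY * r_parallel / (r_parallel + R_SHARED)

for state_A in (True, False):
    v_B = voltage_across_B(state_A)
    print(f"A on={state_A}: B sees {v_B:.2f} V -> "
          f"{'switches' if v_B >= V_SWITCH else 'stays off'}")
```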

Logical communication

Until now, Emmerich says, nanofluidic devices have been operated and measured independently from each other. He adds that the new devices “can now communicate to realize logic computations.”

Iris Agresti, who is developing quantum memristors at the University of Vienna, says that while this is not the first implementation of a nanofluidic memristor, the novelty is showing how multiple devices can be connected to perform controlled operations. “This implies that the behaviour of one of the devices depends on the other,” she says.

The next step, the EPFL researchers say, is to build nanofluidic neural networks in which memristive units are wired together with water channels, with the goal of creating circuits that can perform simple computing tasks such as pattern recognition or matrix multiplication. “We dream of building electrolytic computers able to compute with their electronic counterparts,” says Radenovic.

That’s a long-term and ambitious goal. But such an approach presents two key advantages over electronics. First, the systems would avoid the overheating typically associated with electrical wires, because they would use water as both the wires and the coolant. Second, they could benefit from using different ions to execute complete tasks on par with living organisms. Moreover, Agresti says, artificial neural networks with nanofluidic components promise lower energy consumption.

Yanbo Xie, a nanofluidics expert at Northwestern Polytechnical University in China, points out that the memristor is a critical component for a neuromorphic computer chip and plays a similar role to a transistor in a CPU. The EPFL logic circuit could be “a fundamental building block for future aqueous computing machines,” he says. Juan Bisquert, an applied physicist at Universitat Jaume I in Castelló, Spain, agrees. The devices “show a robust response,” he says, and combining them to implement a Boolean logic operation “paves the way for neuromorphic systems based on fully liquid circuits.”

The work is described in Nature Electronics.

Why we still need a CERN for climate change

It was a scorcher last year. Land and sea temperatures were up to 0.2 °C higher every single month in the second half of 2023, with these warm anomalies continuing into 2024. We know the world is warming, but the sudden heat spike had not been predicted. As NASA climate scientist Gavin Schmidt wrote in Nature recently: “It’s humbling and a bit worrying to admit that no year has confounded climate scientists’ predictive capabilities more than 2023 has.”

As Schmidt went on to explain, a spell of record-breaking warmth had been deemed “unlikely” despite 2023 being an El Niño year, where the relatively cool waters in the central and eastern equatorial Pacific Ocean are replaced with warmer waters. Trouble is, the complex interactions between atmospheric deep convection and equatorial modes of ocean variability, which lie behind El Niño, are poorly resolved in conventional climate models.

Our inability to simulate El Niño properly with current climate models (J. Climate 10.1175/JCLI-D-21-0648.1) is symptomatic of a much bigger problem. In 2011 I argued that contemporary climate models were not good enough to simulate the changing nature of weather extremes such as droughts, heat waves and floods (see “A CERN for climate change” March 2011 p13). With grid-point spacings typically around 100 km, these models provide a blurred, distorted vision of the future climate. For variables like rainfall, the systematic errors associated with such low spatial resolution are larger than the climate-change signals that the models attempt to predict.

Reliable climate models are vital if societies are to adapt to climate change, assess the urgency of reaching net-zero or implement geoengineering solutions if things get really bad. Yet how is it possible to adapt if we don’t know whether droughts, heat waves, storms or floods cause the greater threat? How do we assess the urgency of net-zero if models cannot simulate “tipping” points? How is it possible to agree on potential geoengineering solutions if we cannot reliably assess whether spraying aerosols in the stratosphere will weaken the monsoons or reduce the moisture supply to the tropical rainforests? Climate modellers have to take the issue of model inadequacy much more seriously if they wish to provide society with reliable, actionable information about climate change.

I concluded in 2011 that we needed to develop global climate models with spatial resolution of around 1 km (with compatible temporal resolution) and the only way to achieve this is to pool human and computer resources to create one or more internationally federated institutes. In other words, we need a “CERN for climate change” – an effort inspired by the particle-physics facility near Geneva, which has become an emblem for international collaboration and progress.
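A rough rule of thumb conveys the scale of that ambition: refining the horizontal grid by a factor N multiplies the computational cost by roughly N³ (two horizontal dimensions plus a correspondingly shorter time step), before any increase in vertical levels is even considered. The arithmetic below uses that assumed scaling purely for illustration.

```python
# Rough cost scaling for refining a climate model's horizontal grid.
# Assumes cost ~ (refinement factor)^3: two horizontal dimensions plus a
# time step that must shrink with the grid spacing (illustrative rule of thumb).
coarse_km, target_km = 100, 1
refinement = coarse_km / target_km            # 100x finer grid
cost_factor = refinement ** 3                  # ~1,000,000x more computation
print(f"{coarse_km} km -> {target_km} km grid: ~{cost_factor:,.0f}x the cost")
```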

That was 13 years ago and since then nature has spoken with a vengeance. We have seen unprecedented heat waves, storms and floods, so much so that the World Economic Forum rated “extreme weather” as the most likely global event to trigger an economic crisis in the coming years. As prominent climate scientist Michael Mann noted in 2021 following a devastating flood in Northern Europe: “The climate-change signal is emerging from the noise faster than the models predicted.” That view was backed by a briefing note from the Royal Society for the COP26 climate-change meeting held in Glasgow in 2021, which stated that the inability to simulate physical processes in fine detail accounts for “the most significant uncertainties in future climate, especially at the regional and local levels”.

Yet modelling improvements have not kept pace with the changing nature of these real-world extremes. While many national climate modelling centres have finally started work on high-resolution models, on current trends it will take until the second half of the century to reach kilometre-scale resolution. This will be too late to be useful in tackling climate change (see figure below) and urgency is needed more than ever.

A climate EVE

Pooling human and computing resources internationally is a solution that seems obvious. In a review of UK science in 2023, the Nobel-prize winner Paul Nurse commented that “there are research areas of global strategic importance where new multi-nationally funded institutes or international research infrastructures could be contemplated, an obvious example being an institute of climate change built on the EMBL [European Molecular Biology Laboratory] model”. He added that “such institutes are powerful tools for multinational collaboration and bring great benefit not only internationally but also for the host nation”.

So, why hasn’t it happened? Some say that we don’t need more science and instead must spend money helping those that are already suffering from climate change. That is true, but computer models have helped vulnerable societies massively over the years. Before the 1980s, poorly forecast tropical cyclones could kill hundreds of thousands of people in vulnerable societies. Now, with improved model resolution, excellent week-ahead predictions (and the ability to communicate the forecasts) can be made and it is rare for more than a few tens of people to be killed by extreme weather.

Graph showing the spatial resolution of climate models decreasing over time

High-resolution climate models will help target billions of investment dollars to allow vulnerable societies to become resilient to regionally specific types of future extreme weather. Without this information, governments could squander vast amounts of money on maladaptation. Indeed, scientists from the global south already complain that they don’t have actionable information from contemporary models to make informed decisions.

Others say that different models are necessary so that when they all agree, we can be confident in their predictions. However, the current generation of climate models is not diverse at all. They all assume that critically important sub-grid climatic processes like deep convection, flow over orography and ocean mixing by mesoscale eddies can be parametrized by simple formulae. This assumption is false and is the origin of common systematic errors in contemporary models. It is better to represent model uncertainty with more scientifically sound methodologies.

A shift, however, could be on the horizon. Last year a climate-modelling summit was held in Berlin to kick-start the international project Earth Visualisation Engines (EVE). It aims to not only create high-resolution models but also foster collaboration between scientists from the global north and south to work together to obtain accurate, reliable and actionable climate information.

Like the EMBL, it is planned that EVE will comprise a series of highly interconnected nodes, each with dedicated exascale computing capability, serving all of global society. The funding for each node – about $300m per year – is small compared to the trillions of dollars of loss and damage that climate change will cause.

Hopefully, in another 13 years’ time EVE or something similar will be producing the reliable climate predictions that societies around the globe now desperately need. If not, then I fear it will be too late.

Looking for dark matter differently

Dark matter makes up about 85 percent of the universe’s total matter, and cosmologists believe it played a major role in the formation of galaxies. We know the location of this so-called galactic dark matter thanks to astronomical surveys that map how light from distant galaxies bends as it travels towards us. But so far, efforts to detect dark matter trapped within the Earth’s gravitational field have come up empty-handed, even though this type of dark matter – known as thermalized dark matter – should be present in greater quantities.

The problem is that thermalized dark matter travels much more slowly than galactic dark matter, meaning its energy may be too low for conventional instruments to detect. Physicists at the SLAC National Laboratory in the US have now proposed an alternative that involves searching for thermalized dark matter in an entirely new way, using quantum sensors made from superconducting quantum bits (qubits).

An entirely new approach

The idea for the new method came from SLAC’s Noah Kurinsky, who was working on re-designing transmon qubits as active sensors for photons and phonons. Transmon qubits need to be cooled to temperatures near absolute zero (-273 °C) before they become stable enough to store information, but even at these extremely low temperatures, energy often re-enters the system and disrupts the qubits’ quantum states. The unwanted energy is typically blamed on imperfect cooling apparatus or some source of heat in the environment, but it occurred to Kurinsky that it could have a much more interesting origin: “What if we actually have a perfectly cold system, and the reason we can’t cool it down effectively is because it’s constantly being bombarded by dark matter?”

While Kurinsky was pondering this novel possibility, his SLAC colleague Rebecca Leane was developing a new framework for calculating the expected density of dark matter inside Earth. According to these new calculations, which Leane performed with Anirban Das (now a postdoctoral researcher at Seoul National University, Korea), this local dark-matter density could be extremely high at the Earth’s surface – much higher than previously thought.

“Das and I had been discussing what possible low threshold devices could probe this high predicted dark matter density, but with little previous experience in this area, we turned to Kurinsky for vital input,” Leane explains. “Das then performed scattering calculations using new tools that allow the dark matter scattering rate to be calculated using the phonon (lattice vibration) structure of a given material.”

Low energy threshold

The researchers calculated that a quantum dark-matter sensor would activate at extremely low energies of just one thousandth of an electronvolt (1 meV). This threshold is much lower than that of any comparable dark matter detector, and it implies that a quantum dark-matter sensor could detect low-energy galactic dark matter as well as thermalized dark matter particles trapped around the Earth.
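A back-of-the-envelope comparison shows why such a low threshold matters. A dark-matter particle fully thermalized at room temperature carries an average kinetic energy of only (3/2)k_BT, about 39 meV, versus hundreds of electronvolts for the same particle moving at galactic speeds. The sketch below runs those numbers for an assumed 1 GeV/c² particle; the mass and speed are arbitrary illustrative choices, not values from the SLAC study.

```python
# Back-of-the-envelope kinetic energies for an assumed 1 GeV/c^2 dark-matter
# particle (illustrative choice only): galactic speed vs fully thermalized
# at room temperature, where <E> = (3/2) k_B T regardless of mass.
K_B = 1.380649e-23          # Boltzmann constant, J/K
EV = 1.602176634e-19        # joules per electronvolt
MASS_KG = 1e9 * EV / (2.99792458e8) ** 2    # 1 GeV/c^2 in kilograms

galactic = 0.5 * MASS_KG * (230e3) ** 2 / EV   # ~230 km/s, result in eV
thermal = 1.5 * K_B * 300 / EV                 # thermalized at 300 K, in eV

print(f"galactic:    ~{galactic:.0f} eV")
print(f"thermalized: ~{thermal * 1e3:.0f} meV")
# Only a fraction of the thermal energy can be transferred in a single
# scattering event, which is why a ~1 meV detection threshold matters.
```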

The researchers acknowledge that much work remains before such a detector ever sees the light of day. For one, they will have to identify the best material for making it. “We were looking at aluminium to start with, and that’s just because that’s probably the best characterized material that’s been used for detectors so far,” Leane says. “But it could turn out that for the sort of mass range we’re looking at, and the sort of detector we want to use, maybe there’s a better material.”

The researchers now aim to extend their results to a broader class of dark matter models. “On the experimental side, Kurinsky’s lab is testing the first round of purpose-built sensors that aim to build better models of quasiparticle generation, recombination and detection and study the thermalization dynamics of quasiparticles in qubits, something that is little understood,” Leane tells Physics World. “Quasiparticles in a superconductor seem to cool much less efficiently than previously thought, but as these dynamics are calibrated and modelled better, the results will become less uncertain and we may understand how to make more sensitive devices.”

The study is detailed in Physical Review Letters.

How the global gaming community is helping to solve biomedical challenges

Complex scientific problems require large-scale resources to solve. So why not look for help from the billions of gamers around the world who spend so much time on their computers?

That’s the idea behind a new kind of citizen science, in which members of the public contribute to research projects by playing video games designed to perform specific scientific tasks.

A research team headed up at McGill University in Canada is now using this approach to understand more about the human microbiome – the tens of trillions of microbes that colonize our bodies, some of which play a vital role in our health.

Unlike most previous citizen science video games – designed for users with a specific interest in science, likely limiting their accessibility to the wider gaming community – this latest citizen science activity is integrated into a commercial video game that’s played by tens of millions of gamers.

The story began on 7 April 2020, when the team released a tile-matching mini game called Borderlands Science as a free download for the role-playing shooter–looter game Borderlands 3. Since then, as the researchers report in Nature Biotechnology, over four million players have solved more than 135 million science puzzles.

“We didn’t know whether the players of a popular game like Borderlands 3 would be interested or whether the results would be good enough to improve on what was already known about microbial evolution. But we’ve been amazed by the results,” says senior author Jérôme Waldispühl in a press statement. “In half a day, the Borderlands Science players collected five times more data about microbial DNA sequences than our earlier game, Phylo, had collected over a 10-year period.”

The Borderlands Science gameplay

The mini game requires players to align rows of tiles representing the genetic building blocks of different microbes. The gamers’ efforts have helped trace the evolutionary relationships of over a million different types of bacteria in the human gut, improving upon results produced by existing computer programs. The researchers hope to use this information to understand how microbial communities are affected by diet and medications, and to relate specific types of microbes to diseases such as inflammatory bowel disease and Alzheimer’s.
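Under the hood, aligning rows of tiles is a sequence-alignment problem of the kind bioinformatics normally solves with dynamic programming. The sketch below shows the textbook pairwise version (Needleman-Wunsch scoring) purely to illustrate the underlying task; the actual Borderlands Science puzzles, tile alphabet and scoring rules differ.

```python
# Textbook pairwise sequence alignment (Needleman-Wunsch) as a minimal
# illustration of the kind of puzzle the game encodes. The scoring values
# and example sequences are arbitrary.
MATCH, MISMATCH, GAP = 1, -1, -1

def align_score(a: str, b: str) -> int:
    """Return the optimal global alignment score of two DNA strings."""
    rows, cols = len(a) + 1, len(b) + 1
    score = [[0] * cols for _ in range(rows)]
    for i in range(rows):
        score[i][0] = i * GAP
    for j in range(cols):
        score[0][j] = j * GAP
    for i in range(1, rows):
        for j in range(1, cols):
            diag = score[i - 1][j - 1] + (MATCH if a[i - 1] == b[j - 1] else MISMATCH)
            score[i][j] = max(diag, score[i - 1][j] + GAP, score[i][j - 1] + GAP)
    return score[-1][-1]

print(align_score("GATTACA", "GCATGCA"))   # two short example fragments
```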

“Because evolution is a great guide to function, having a better tree relating our microbes to one another gives us a more precise view of what they are doing within and around us,” explains Rob Knight from UC San Diego.

McGill’s Attila Szantner, who co-founded the Swiss IT company Massively Multiplayer Online Science (MMOS) and came up with the idea of integrating DNA analysis into a commercial video game, points out that the Borderlands Science project demonstrates the vast potential of teaming up with the gaming industry and its communities to tackle major scientific challenges.

“As almost half of the world population is playing with video games, it is of utmost importance that we find new creative ways to extract value from all this time and brainpower that we spend gaming,” Szantner says.

NASA demands new designs for cost-hit Mars Sample Return mission

NASA is seeking alternative designs for its Mars Sample Return (MSR) mission, which is meant to bring back soil and rocks gathered by the agency’s Perseverance rover. But with the MSR beset by cost hikes and delays, NASA concedes that the current design is “too expensive” and that its aim of returning material by 2040 is “unacceptably too long”.

A partnership between NASA and the European Space Agency (ESA), the MSR is designed to return samples collected by Perseverance since 2021 at the Jezero crater on Mars. The material, once back on Earth, will boost our understanding of the red planet’s geological history and the evolution of its climate. It could also help with plans for future human explorers on Mars.

The MSR was given the highest scientific priority by the National Academies of Sciences Decadal Survey of Planetary Science in 2022. It consists of three parts: a Sample Retrieval Lander that will pick up samples deposited by Perseverance and put them in a container; a Mars Ascent Vehicle that will launch the container into Martian orbit; and ESA’s Earth Return Orbiter to ferry the samples to Earth.

Despite its importance, the MSR has fallen significantly behind schedule and gone vastly over budget. Originally planned to cost $4bn, that figure had ballooned to $5.3bn by 2022. A scathing report into the mission by NASA’s independent review board in September 2023 noted that NASA had “unrealistic” ideas about the MSR’s cost and schedule.

“There is no credible, congruent technical, nor properly margined schedule, cost and technical baseline that can be accomplished with the likely available funding,” the report concluded. It said there was a “near zero probability” that ESA and NASA could launch the mission by 2030 and warned that the MSR would cost $6–11bn – roughly the same as the James Webb Space Telescope. Samples would not reach Earth before 2040.

Lowering risk

After the report was released, NASA promised to set up a review panel to respond to its conclusions and “assess alternative architectures” for the mission. In a document released on 15 April, the four-person panel – led by deputy administrator for science Sandra Connelly – concludes that NASA needs to improve the accountability, authority, communication and co-ordination of the mission.

The panel calls on NASA to solicit ideas from industry and NASA institutions, recommending a detailed process to explore “out-of-the-box architecture and mission element options by releasing a competitive industry study solicitation as soon as possible”. Such options, the panel asserts, could make the mission cheaper, less complex and less risky, while returning the samples faster. In particular, the studies should include alternative designs for the Mars ascent vehicle that will lift the samples off the red planet’s surface.

NASA administrator Bill Nelson admits that the 2040 date “is too far” away and hopes that the new plan should speed the mission up and make it cheaper. Nicola Fox – associate administrator of NASA’s Science Mission Directorate – adds that “it is imperative to return these valuable samples to Earth to be studied in state-of-the-art laboratories”. A successful sample-return effort will, she says, enable scientists to address key questions about Mars and “inspire future generations to pursue further investigations into questions not yet known”.

  • Meanwhile, NASA has announced that the Dragonfly rotorcraft mission to Titan, a moon of Saturn that is rich in organic materials, will move towards its final design. The mission has a launch date of July 2028 with NASA expecting the craft to fly on Titan in 2034.

Local twist angles in graphene come into view

Stacking layers of two-dimensional materials on top of each other and varying the twist angle between them massively alters their electronic properties. The trick is to get the twist angle just right, and to know when you’ve done so. Researchers in China have now developed a technique that helps with the second part of this challenge. By allowing scientists to directly visualize variations in local twist angles, the new technique sheds light on the electronic structure of twisted materials and could accelerate the development of devices that exploit their properties.

Graphene (a 2D form of carbon just one atom thick) does not have an electronic band gap. Neither does a pair of graphene layers stacked on top of each other. However, if you add another 2D material called hexagonal boron nitride (hBN) to the stack, a band gap emerges. This is because the lattice constant of hBN – a measure of how its atoms are arranged – is nearly the same as that of graphene, but not exactly. The slightly mismatched layers of graphene and hBN form a larger structure known as a moiré superlattice, and the interactions between nearby atoms in this superlattice allow a gap to form. If the layers are then twisted so that they are further misaligned, the lattice interactions weaken, and the band gap disappears.
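The size of that moiré superlattice is set by the lattice mismatch δ and the twist angle θ through the standard relation λ = (1 + δ)a / √(2(1 + δ)(1 − cos θ) + δ²). The sketch below evaluates it for graphene on hBN using the commonly quoted mismatch of roughly 1.8%; the numbers are illustrative.

```python
import math

# Moire period of two mismatched, twisted hexagonal lattices (standard formula):
# lam = (1+delta)*a / sqrt(2*(1+delta)*(1-cos(theta)) + delta**2)
A_GRAPHENE_NM = 0.246   # graphene lattice constant
DELTA = 0.018           # ~1.8% graphene/hBN lattice mismatch (approximate)

def moire_period_nm(theta_deg: float) -> float:
    th = math.radians(theta_deg)
    return (1 + DELTA) * A_GRAPHENE_NM / math.sqrt(
        2 * (1 + DELTA) * (1 - math.cos(th)) + DELTA ** 2)

for theta in (0.0, 0.5, 1.0, 2.0):
    print(f"twist {theta:.1f} deg -> moire period {moire_period_nm(theta):.1f} nm")
# Aligned layers give ~14 nm; twisting shrinks the superlattice and weakens
# the interlayer interaction that opens the band gap, as described above.
```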

Achieving such changes in conventional materials usually requires scientists to alter the materials’ chemical composition. Varying the twist angle between layers of a 2D material is an entirely different approach, and the associated possibilities kickstarted a new field of device engineering known as twistronics. The problem is that twist angles are hard to control, and if different areas of a sample contain unevenly distributed twist angles, the sample’s electronic properties will vary from location to location. This is far from ideal for high-performance devices, so researchers have been exploring ways to visualize such inhomogeneities more precisely.

A new method based on sMIM

In the new work, a team led by Hong-Jun Gao and Shiyu Zhu of the Institute of Physics, Chinese Academy of Sciences, adapted a method called scanning microwave impedance microscopy (sMIM) that was recently developed by Zhixun Shen and colleagues at Stanford University in the US. The adapted method involves applying a range of gate voltages to the sample and analysing conductivity fluctuations in the sMIM data at different positions in it. “This process provides the gate voltages corresponding to moiré band gaps, which are indicative of fully filled electronic bands, directly unveiling details about the moiré superlattice and local twist angles,” Zhu explains.
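For twisted bilayer graphene, the carrier density needed to completely fill a moiré band pins down the local twist angle: the moiré period is λ = a/(2 sin(θ/2)) and full filling corresponds to four electrons per moiré cell, so n_s = 8/(√3 λ²). The sketch below inverts that relation to turn a full-filling density into a twist angle; it illustrates the general principle rather than the team's actual calibration against gate voltage.

```python
import math

# Local twist angle of twisted bilayer graphene from the carrier density n_s
# needed to fill a moire band (4 electrons per moire cell):
#   n_s = 8 / (sqrt(3) * lam**2),   lam = a / (2 * sin(theta/2))
A_NM = 0.246   # graphene lattice constant in nm

def twist_angle_deg(n_s_per_cm2: float) -> float:
    n_s_per_nm2 = n_s_per_cm2 * 1e-14          # 1 cm^2 = 1e14 nm^2
    lam_nm = math.sqrt(8.0 / (math.sqrt(3.0) * n_s_per_nm2))
    return math.degrees(2.0 * math.asin(A_NM / (2.0 * lam_nm)))

# Example: a full-filling density of ~2.8e12 cm^-2 corresponds to ~1.1 degrees.
for n_s in (2.0e12, 2.8e12, 4.0e12):
    print(f"n_s = {n_s:.1e} cm^-2  ->  theta ~ {twist_angle_deg(n_s):.2f} deg")
```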

When the researchers tested this method on high-quality samples of twisted bilayer graphene fabricated by their colleagues Qianying Hu, Yang Xu and Jiawei Hu, they were able to detect variations of twist angles directly. They also gleaned information on the conductivity of localized areas, and they characterized other electronic states such as quantum Hall states and Chern insulators by applying out-of-plane magnetic fields. “We made these measurements concurrently,” Zhu notes. “This allowed us to directly obtain quantum state information under different local twist angle conditions.”

The new technique revealed pronounced variations in the local twist angles of around 0.3° over distances of several microns, he adds. It also enabled the team to measure local conductivity, which is not possible with alternative methods that use single-electron transistors to measure compressibility or nanoSQUIDs to measure magnetic fields. What is more, for samples of twisted bilayer graphene covered by an insulating BN layer, the new method has a significant advantage over conventional scanning tunnelling microscopy, as it can penetrate the insulating layer.

Exploring novel quantum states

“Our work has revealed the local twist angle variation within and between domains of a twisted two-dimensional material,” Zhu tells Physics World. “This has deepened our understanding of the microscopic state of the sample, allowing us to explain many experimental phenomena previously observed in ‘bulk-averaging’ measurements. It also provides a way to explore novel quantum states that are difficult to observe macroscopically, offering insights from a microscopic perspective.”

Thanks to these measurements, the unevenness of local twist angles in twisted two-dimensional materials should no longer be a hindrance to the study of novel quantum states, he adds. “Instead, thanks to the rich distribution of local twist angles we have observed, it should now be possible to simultaneously compare various quantum states under multiple local twist angle conditions and band structure conditions in a single sample.”

The researchers now aim to extend their technique to a wider range of twisted systems and heterostructure moiré systems – for example, in materials like twisted bilayer MoTe2 and WSe2/WS2. They would also like to conduct bulk-averaging measurements and compare these results with local measurements using their new method.

Quantum Barkhausen noise detected for the first time

Researchers in the US and Canada have detected an effect known as quantum Barkhausen noise for the first time. The effect, which comes about thanks to the cooperative quantum tunnelling of a huge number of magnetic spins, may be the largest macroscopic quantum phenomenon yet observed in the laboratory.

In the presence of a magnetic field, electron spins (or magnetic moments) in a ferromagnetic material all line up in the same direction – but not all at once. Instead, alignment occurs piecemeal, with different regions, or domains, falling into line at different times. These domains influence each other in a way that can be likened to an avalanche. Just as one clump of snow pushes on neighbouring clumps until the entire mass comes tumbling down, so does alignment spread through the domains until all spins point in the same direction.

One way of detecting this alignment process is to listen to it. In 1919, the physicist Heinrich Barkhausen did just that. By wrapping a coil around a magnetic material and attaching a loudspeaker to it, Barkhausen transformed changes in the magnetism of the domains into an audible crackling. Known today as Barkhausen noise, this crackling can be understood in purely classical terms as being caused by the thermal motion of the domain walls. Analogous noise phenomena and dynamics also exist in other systems, including earthquakes and photomultiplier tubes as well as avalanches.
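That avalanche picture can be caricatured with a toy model in which domains flip in bursts of random size and each burst induces a coil-voltage spike proportional to the jump in magnetization. The sketch below is such a cartoon, with an arbitrary heavy-tailed size distribution; it is not the analysis used in the new study.

```python
import random

# Toy Barkhausen "crackling": domains flip in avalanches of random size, and
# each avalanche produces a pickup-coil voltage spike proportional to the jump
# in magnetization. Purely illustrative; parameters are arbitrary.
random.seed(1)
N_DOMAINS = 100_000
flipped = 0
while flipped < N_DOMAINS:
    # Heavy-tailed avalanche sizes (Pareto-like), capped by what is left.
    avalanche = min(int(random.paretovariate(1.5)) + 1, N_DOMAINS - flipped)
    flipped += avalanche
    voltage_spike = avalanche / N_DOMAINS     # proportional to the jump in M
    if avalanche > 200:                       # only print the big "crackles"
        print(f"avalanche of {avalanche:5d} domains -> spike {voltage_spike:.5f}")
```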

Quantum Barkhausen noise

In principle, quantum mechanical effects can also produce Barkhausen noise. In this quantum version of Barkhausen noise, the spin flips occur as the particles tunnel through an energy barrier – a process known as quantum tunnelling – rather than by gaining enough energy to jump over it.

In the new work, which is detailed in PNAS, researchers led by Thomas Rosenbaum of the California Institute of Technology (Caltech) and Philip Stamp at the University of British Columbia (UBC) observed quantum Barkhausen noise in a crystalline quantum magnet cooled to temperatures near absolute zero (-273 °C). Like Barkhausen in 1919, their detection relied on wrapping a coil around their sample. But instead of hooking the coil up to a loudspeaker, they measured jumps in its voltage as the electron spins flipped orientations. When groups of spins in different domains flipped, Barkhausen noise appeared as a series of voltage spikes.

The Caltech/UBC researchers attribute these spikes to quantum effects because they are not affected by a 600% increase in temperature. “If they were, then we would be in the classical, thermally activated regime,” Stamp says.

Rosenbaum adds that applying a magnetic field transverse to the axis of the spins has “profound effects” on the response, with the field acting like a quantum “knob” for the material. This, he says, is further evidence for the novel quantum nature of the Barkhausen noise. “Classical Barkhausen noise in magnetic systems has been known for over 100 years, but quantum Barkhausen noise, where domain walls tunnel through barriers rather than being thermally activated over them, has not, to the best of our knowledge, been seen before,” he says.

Co-tunnelling effects

Intriguingly, the researchers observed spin flips being driven by groups of tunnelling electrons interacting with each other. The mechanism for this “fascinating” co-tunnelling, they say, involves sections of domain walls known as plaquettes interacting with each other through long-range dipolar forces. These interactions produce correlations between different segments of the same wall, and they also nucleate avalanches on different domain walls simultaneously. The result is a mass cooperative tunnelling event that Stamp and Rosenbaum liken to a crowd of people behaving as a single unit.

“While dipolar forces have been observed to affect the dynamics of the motion of a single wall and drive self-organized criticality, in LiHoₓY₁₋ₓF₄, long-range interactions cause correlations not just between different segments of the same wall, but actually nucleate avalanches on different domain walls simultaneously,” Rosenbaum says.

The result can only be explained as a cooperative macroscopic quantum tunnelling phenomenon, Stamp says. “This is the first example ever seen in nature of a very large-scale cooperative quantum phenomenon, on the scale of 10¹⁵ spins (that is, a thousand trillion),” he tells Physics World. “This is huge and is by far the largest macroscopic quantum phenomenon ever seen in the lab.”

Advanced detection skills

Even with billions of spins cascading at once, the researchers say the voltage signals they observed are very small. Indeed, it took them some time to develop the detection ability necessary to accumulate statistically significant data. On the theory side, they had to develop a new approach to investigate magnetic avalanches that had not been formulated previously.

They now hope to apply their technique to systems other than magnetic materials to find out whether such cooperative macroscopic quantum phenomena exist elsewhere.
