EU must double its science budget to remain competitive, warns report

The European Union should more than double its budget for research and innovation in its next spending round, dubbed Framework Programme 10 (FP10). That’s the view of a report by an expert group, which says a dramatic increase to €220bn is needed for European science to be globally competitive once again. Its recommendations are expected to have a big influence over the European Commission’s proposals for FP10, due in mid-2025.

The EU’s current Horizon Europe programme, which runs from 2021 to 2027, has a budget of €95.5bn. In December 2023, the Commission picked 15 experts from research and industry – led by former Portuguese science minister Manuel Heitor – to advise on FP10, which is set to run from 2028 to 2034. According to their report, Europe is lagging behind in investment and impact in science, technology and innovation.

It says Europe’s share of global scientific publications, most-cited publications and patent applications has dropped over the last 20 years. Europe’s technology base, it claims, is more diverse than those of other major economies, but also more focused on less complex technologies. China and the US, in contrast, lead in areas expected to drive future growth, such as semiconductors, optics, digital communications and audio-visual technologies.

The experts also say the “disruptive, paradigm shifting research and innovation” that Europe needs to boost its economy is “unlikely to be fostered by conventional procedures and programmes in the EU today”. They want the EU to set up an experimental unit to test and launch disruptive innovation programmes with “fast funding” options. It should develop programmes like those of the US advanced research projects agencies and explore how generative AI could be used in science.

Based on an analysis of previously unfunded proposals, the report claims that FP10’s budget should be more than doubled to €220bn to “guarantee funding of all high-quality proposals”. It also says that funding applications need to be simplified and streamlined, with funding handed out more quickly, and calls for better international collaborations, including with China, as well as disruptive innovation programmes, such as on military–civilian “dual-use” innovation.

Launching the report, Heitor said there was a need “to put research technology and innovation in the centre of European economies”, adding that the expert group was calling for “radical simplification and innovation” for the next programme. Europe needs to pursue a “transformative agenda” in FP10 around four interlinked areas: competitive excellence in science and innovation; industrial competitiveness; societal challenges; and a strong European research and innovation ecosystem.

Space travel: the health effects of space radiation and building a lunar GPS

We are entering a second golden age of space travel – with human missions to the Moon and Mars planned for the near future. In this episode of the Physics World Weekly podcast we explore two very different challenges facing the next generation of cosmic explorers.

First up, the radiation oncologist James Welsh chats with Physics World’s Tami Freeman about his new ebook on the biological effects of space radiation on astronauts. They talk about the types and origins of space radiation and how they impact human health. Despite the real dangers, Welsh, who is based at Loyola Medicine in the US, explains that the human body appears to be more resilient to radiation than the microelectronics used on spacecraft – and why damage to computers, rather than the health of astronauts, could be the limiting factor for space exploration.

Later in the episode I am in conversation with two physicists who have written a paper about how we could implement a universal time standard for the Moon. Based at the US’s National Institute of Standards and Technology (NIST), Biju Patla and Neil Ashby explain how atomic clocks could be used to create a time system that would make coordinating lunar activities easier – and could operate as a GPS-like system to facilitate navigation. They also say that such a lunar system could be a prototype for a more ambitious system on Mars.
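Why does the Moon even need its own time standard? One reason is relativity: a clock on the lunar surface sits in a much shallower gravitational potential well than one on Earth, so it ticks slightly faster. The back-of-envelope sketch below estimates just the gravitational part of that rate difference; it is a rough illustration only, ignoring the velocity and tidal terms that a full analysis like NIST’s would include.

```python
# Back-of-envelope estimate of how much faster a clock on the lunar surface
# runs compared with one on Earth's surface, keeping only the gravitational
# time-dilation term -U/c^2. Velocity and tidal contributions are omitted,
# so this is an order-of-magnitude sketch, not the published result.

C = 299_792_458.0          # speed of light, m/s
GM_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
GM_MOON = 4.9048e12        # Moon's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6          # mean Earth radius, m
R_MOON = 1.7374e6          # mean Moon radius, m

def fractional_slowdown(gm: float, radius: float) -> float:
    """Fractional clock slowdown GM/(R c^2) at a body's surface."""
    return gm / (radius * C**2)

# The lunar clock sits in the shallower well, so it gains time at this rate:
rate_diff = fractional_slowdown(GM_EARTH, R_EARTH) - fractional_slowdown(GM_MOON, R_MOON)
microseconds_per_day = rate_diff * 86_400 * 1e6
print(f"lunar clock gains roughly {microseconds_per_day:.0f} microseconds per day")
```

The answer comes out at a few tens of microseconds per day – tiny, but far too large to ignore for navigation, where nanoseconds of clock error translate into metres of position error.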

Hybrid irradiation could facilitate clinical translation of FLASH radiotherapy

Dosimetric comparisons of prostate cancer treatment plans

FLASH radiotherapy is an emerging cancer treatment that delivers radiation at extremely high dose rates within a fraction of a second. This innovative radiation delivery technique, dramatically faster than conventional radiotherapy, reduces radiation injury to surrounding healthy tissues while effectively targeting malignant tumour cells.

Preclinical studies in laboratory animals have demonstrated that FLASH radiotherapy is at least equivalent to conventional radiotherapy, and may produce better anti-tumour effects in some types of cancer. The biological “FLASH effect”, which is observed for ultrahigh-dose rate (UHDR) irradiations, spares normal tissue compared with conventional dose rate (CDR) irradiations, while retaining toxicity to the tumour.

With FLASH radiotherapy opening up the therapeutic window, it has potential to benefit patients requiring radiotherapy. As such, efforts are underway worldwide to overcome the clinical challenges for safe adoption of FLASH into clinical practice. As the FLASH effect has been mostly investigated using broad UHDR electron beams, which have limited range and are best suited for treating superficial lesions, one important challenge is to find a way to effectively treat deep-seated tumours.

In a proof-of-concept treatment planning study, researchers in Switzerland demonstrated that a hybrid approach combining UHDR electron and CDR photon radiotherapy may achieve equivalent dosimetric effectiveness and quality to conventional radiotherapy, for the treatment of glioblastoma, pancreatic cancer and localized prostate cancer. The team, at Lausanne University Hospital and the University of Lausanne, report the findings in Radiotherapy and Oncology.

Combined device

This hybrid treatment could be facilitated using a linear accelerator (linac) with the capability to generate both UHDR electron beams and CDR photon beams. Such a radiotherapy device could eliminate concerns relating to the purchase, operational and maintenance costs of other proposed FLASH treatment devices. It would also overcome the logistical hurdles of needing to move patients between two separate radiotherapy treatment rooms and immobilize them identically twice.

For their study, the Lausanne team presumed that such a dual-use clinically approved linac exists. This linac would deliver a bulk radiation dose by a UHDR electron beam in a less conformal manner to achieve the FLASH effect, and then deliver conventional intensity-modulated radiation therapy (IMRT) or volumetric-modulated arc therapy (VMAT) to enhance dosimetric target coverage and conformity.

Principal investigator Till Böhlen and colleagues created a machine model that simulates 3D-conformal broad electron beams with a homogeneous parallel fluence. They developed treatments that deliver a single broad UHDR electron beam with case-dependent energy of between 20 and 250 MeV for every treatment fraction, together with a CDR VMAT to produce a conformal dose delivery to the planning target volume (PTV).

The tumours for each of the three cancer cases required simple, mostly round PTVs that could be covered by a single electron beam. Each plan’s goal was to deliver the majority of the dose per treatment with the UHDR electron beam, while achieving acceptable PTV coverage, homogeneity and sparing of critical organs-at-risk.

Plan comparisons

The researchers assessed the plan quality based on absorbed dose distribution, dose–volume histograms and dose metric comparisons with the CDR reference plans used for clinical treatments. In all cases, the hybrid plans exhibited comparable dosimetric quality to the clinical plans. They also evaluated dose metrics for the parts of the doses delivered by the UHDR electron beam and by the CDR VMAT, observing that the hybrid plans delivered the majority of the PTV dose, and large parts of doses to surrounding tissues, at UHDR.

“This study demonstrates that hybrid treatments combining an UHDR electron field with a CDR VMAT may provide dosimetrically conformal treatments for tumours with simple target shapes in various body sites and depths in the patient, while delivering the majority of the prescribed dose per fraction at UHDR without delivery pauses,” the researchers write.

In another part of the study, the researchers estimated the potential FLASH sparing effect achievable with their hybrid technique, using the glioblastoma case as an example. They assumed a FLASH normal tissue sparing scenario with an onset of FLASH sparing at a threshold dose of 11 Gy/fraction, and a more favourable scenario with sparing onset at 3 Gy/fraction. The treatment comprised a single-fraction 15 Gy UHDR electron boost, supplemented with 26 fractions of CDR VMAT. The two scenarios showed a FLASH sparing magnitude of 10% for the first and a more substantial 32% sparing of brain tissue for the second.
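To see why a lower sparing-onset threshold yields a larger overall sparing, consider a toy threshold model: normal tissue receiving a UHDR fraction dose at or above the threshold is credited with a fixed sparing factor, while dose below the threshold (or delivered at CDR) counts in full. This is an illustrative sketch only – the per-voxel doses and the 30% sparing factor below are made up, and the model is not the Lausanne team’s actual formalism; only the 11 and 3 Gy/fraction thresholds come from the article.

```python
# Toy threshold-based FLASH sparing model (illustrative, not the study's own):
# dose delivered at UHDR above the onset threshold is reduced by a fixed
# sparing factor; everything else counts in full.

def effective_dose(uhdr_dose: float, threshold: float, sparing: float = 0.3) -> float:
    """Effective normal-tissue dose (Gy) for one voxel under the toy model."""
    return uhdr_dose * (1.0 - sparing) if uhdr_dose >= threshold else uhdr_dose

# Hypothetical per-voxel UHDR doses to surrounding normal tissue, in Gy:
voxel_doses = [15.0, 8.0, 4.0, 2.0]

for threshold in (11.0, 3.0):
    physical = sum(voxel_doses)
    effective = sum(effective_dose(d, threshold) for d in voxel_doses)
    sparing_pct = 100.0 * (physical - effective) / physical
    print(f"sparing onset at {threshold:>4} Gy/fraction -> overall sparing {sparing_pct:.0f}%")
```

With the 11 Gy onset only the hottest voxel qualifies for sparing, while the 3 Gy onset spares most of the tissue – qualitatively mirroring the 10% versus 32% figures reported for the glioblastoma case.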

“Following up on this pilot study focusing on feasibility, the team is currently working on improving the joint optimization of the UHDR and CDR dose components to further enhance plan quality, flexibility and UHDR proportion of the delivered dose using the [hybrid] treatment approach,” Böhlen tells Physics World. “Additional work focuses on quantifying its biological benefits and advancing its technical realization.”

Trailblazer: astronaut Eileen Collins reflects on space, adventure, and the power of lifelong learning

In this episode of Physics World Stories, astronaut Eileen Collins shares her extraordinary journey as the first woman to pilot and command a spacecraft. Collins broke barriers in space exploration, inspiring generations with her courage and commitment to discovery. Reflecting on her career, she discusses not only her time in space but also her lifelong sense of adventure and her recent passion for reading history books. Today, Collins frequently shares her experiences with audiences around the world, encouraging curiosity and inspiring others to pursue their dreams.

Joining the conversation is Hannah Berryman, director of the new documentary SPACEWOMAN, which is based on Collins’ memoir Through the Glass Ceiling to the Stars, co-written with Jonathan H Ward. The British filmmaker describes what attracted her to Collins’ story and the universal messages it reveals. Hosted by science communicator Andrew Glester, this episode offers a glimpse into the life of a true explorer – one whose spirit of adventure knows no bounds.

SPACEWOMAN has its world premiere on 16 November 2024 at DOC NYC. Keep an eye on the documentary’s website for details of how you can watch the film wherever you are.

Venkat Srinivasan: ‘Batteries are largely bipartisan’

Which battery technologies are you focusing on at Argonne?

We work on everything. We work on lead-acid batteries, a technology that’s 100 years old, because the research community is saying, “If only we could solve this problem with cycle life in lead-acid batteries, we could use them for energy storage to add resilience to the electrical grid.” That’s an attractive prospect because lead-acid batteries are extremely cheap, and you can recycle them easily.

We work a lot on lithium-ion batteries, which is what you find in your electric car and your cell phone. The big challenge there is that lithium-ion batteries use nickel and cobalt, and while you can get nickel from a few places, most of the cobalt comes from the Democratic Republic of Congo, where there are safety and environmental concerns about exactly how that cobalt is being mined, and who is doing the mining. Then there’s lithium itself. The supply chain for lithium is concentrated in China, and we saw during COVID the problems that can cause. You have one disruption somewhere and the whole supply chain collapses.

We’re also looking at technologies beyond lithium-ion batteries. If you want to start using batteries for aviation, you need batteries with a long range, and for that you have to increase energy density. So we work on things like solid-state batteries.

Finally, we are working on what I would consider really “out there” technologies, where it might be 20 years before we see them used. Examples might be lithium-oxygen or lithium-sulphur batteries, but there’s also a move to go beyond lithium because of the supply chain issues I mentioned. One alternative might be to switch to sodium-based batteries. There’s a big supply of soda ash in the US, which is the raw material for sodium, and sodium batteries would allow us to eliminate cobalt while using very little nickel. If we can do that, the US can be completely reliant on its own domestic minerals and materials for batteries.

What are the challenges associated with these different technologies?

Frankly, every chemistry has its challenges, but I can give you an example.

If you look at the periodic table, the most electropositive element is lithium, while the most electronegative is fluorine. So you might think the ultimate battery would be lithium-fluorine. But in practice, nobody should be using fluorine – it’s super dangerous. The next best option is lithium-oxygen, which is nice because you can get oxygen from the air, although you have to purify it first. The energy density of a lithium-oxygen battery is comparable to that of gasoline, and that is why people have been trying to make solid-state lithium-metal batteries since before I was born.

Photo of Arturo Gutierrez and Venkat Srinivasan. Gutierrez is wearing safety glasses and a white lab coat and has his arms inside a glovebox while Srinivasan looks on

The problem is that when you charge a battery with a lithium metal anode, lithium deposits on the metal surface, and unfortunately it doesn’t create a thin, planar layer. Instead, it forms needle-like structures called dendrites that can grow through the battery’s separator and short-circuit the cell. Battery shorting is never a good thing.

Now, if you put a mechanically hard material next to the lithium metal, you can stop the dendrites from growing through. It’s like putting in a concrete wall next to the roots of a tree to stop the roots growing into the other side. But if you have a crack in your concrete wall, the roots will find a way – they will actually crack the concrete – and exactly the same thing happens with dendrites.

So the question becomes, “Can we make a defect-free electrolyte that will stop the dendrites?” Companies have taken a shot at this, and on the small scale, things look great: if you’re making one or two devices, you can have incredible control. But in a large-format manufacturing setup where you’re trying to make hundreds of devices per second, even a single defect can come back to bite you. Going from the lab scale to the manufacturing scale is such a challenge.
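Srinivasan’s point about scale can be made concrete with a standard Poisson yield model, of the kind long used in semiconductor manufacturing: if defects occur independently at some density per unit area, the probability that a sheet is completely defect-free falls off exponentially with its area. The defect density and areas below are invented for illustration, not measured values for any real electrolyte.

```python
import math

# Poisson yield model (borrowed from chip manufacturing, used here purely as
# an illustration): with defect density d per cm^2, a sheet of area A cm^2 is
# defect-free with probability exp(-d * A).

def defect_free_yield(defect_density: float, area_cm2: float) -> float:
    """Probability that a sheet of the given area contains zero defects."""
    return math.exp(-defect_density * area_cm2)

d = 0.01  # hypothetical defects per cm^2
print(f"lab coupon, 10 cm^2:    {defect_free_yield(d, 10):.1%} defect-free")
print(f"full cell,  2000 cm^2:  {defect_free_yield(d, 2000):.2e} defect-free")
```

At a defect density that barely registers on a small lab coupon, a large-format electrolyte sheet is almost guaranteed to contain at least one flaw – which is exactly why going from one or two devices to hundreds per second is so hard.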

What are the major goals in battery research right now?

It depends on the application. For electric cars, we still have to get the cost down, and my sense is that we’ll ultimately need batteries that charge in five minutes because that’s how long it takes to refuel a gasoline-powered car. I worry about safety, too, and of course there’s the supply-chain issue I mentioned.

But if you forget about supply chains for a second, I think if we can get fast charging with incredibly safe batteries while reducing the cost by a factor of two, we are golden. We’ll be able to do all sorts of things.

A researcher holding a plug kneels next to an electric car. The car has a sign on the front door that reads "Argonne research vehicle"

For aviation, it’s a different story. We think the targets are anywhere from increasing energy density by a factor of two for the air taxi market, all the way to a factor of six if you want an electric 737 that can fly from Chicago to Washington, DC with 75 passengers. That’s kind of hard. It may be impossible. You can go for a hybrid design, in which case you will not need as much energy density, but you need a lot of power density because even when you’re landing, you still have to defy gravity. That means you need power even when the vehicle is in its lowest state of charge.

The political landscape in the US is shifting as the Biden administration, which has been very focused on clean energy, makes way for a second presidential term for Donald Trump, who is not interested in reducing carbon emissions. How do you see that impacting battery research?

If you look at this question historically, ReCell, which is Argonne’s R&D centre for battery recycling, got established during the first Trump administration. Around the same time, we got the Federal Consortium for Advanced Batteries, which brought together the Department of Energy, the Department of Defense, the intelligence community, the State Department and the Department of Commerce. The reason all those groups were interested in batteries is that there’s a growing feeling that we need to have energy independence in the US when it comes to supply chains for batteries. It’s an important technology, there’s lots of innovations, and we need to find a way to move them to market.

So that came about during the Trump administration, and then the Biden administration doubled down on it. What that tells me is that batteries are largely bipartisan, and I think that’s at least partly because you can have different motivations for buying them. Many of my neighbours aren’t particularly thinking about carbon emissions when they buy an electric vehicle (EV). They just want to go from zero to 60 in three seconds. They love the experience. Similarly, people love to be off-grid, because they feel like they’re controlling their own stuff. I suspect that because of this, there will continue to be largely bipartisan support for EVs. I remain hopeful that that’s what will happen.

  • Venkat Srinivasan will appear alongside William Mustain and Martin Freer at a Physics World Live panel discussion on battery technologies on 21 November 2024.

UK plans £22bn splurge on carbon capture and storage

Further details have emerged over the UK government’s pledge to spend almost £22bn on carbon capture and storage (CCS) in the next 25 years. While some climate scientists feel the money is vital to decarbonise heavy industry, others have raised concerns about the technology itself, including its feasibility at scale and potential to extend fossil fuel use rather than expanding renewable energy and other low-carbon technologies.

In 2023 the UK emitted about 380 million tonnes of carbon dioxide equivalent and the government claims that CCS could remove more than 8.5 million tonnes each year as part of its effort to be net-zero by 2050. Although there are currently no commercial CCS facilities in the UK, last year the previous Conservative government announced funding for two industrial clusters: HyNet in Merseyside and the East Coast Cluster in Teesside.

Projects at both clusters will capture carbon dioxide from various industrial sites, including hydrogen plants, a waste incinerator, a gas-fired power station and a cement works. The gas will then be transported down pipes to offshore storage sites, such as depleted oil and gas fields. According to the new Labour government, the plans will create 4000 jobs, with the wider CCS industry potentially supporting 50,000 roles.

Government ministers claim the strategy will make the UK a global leader in CCS and hydrogen production and is expected to attract £8bn in private investment. Rachel Reeves, the chancellor, said in September that CCS is a “game-changing technology” that will “ignite growth”. The Conservatives’ strategy also included plans to set up two other clusters, but no progress has been made on these yet.

The new investment in CCS comes after advice from the independent Climate Change Committee, which said it is necessary for decarbonising the UK’s heavy industry and for the UK to reach its net-zero target. The International Energy Agency (IEA) and the Intergovernmental Panel on Climate Change have also endorsed CCS as critical for decarbonisation, particularly in heavy industry.

“The world is going to generate more carbon dioxide from burning fossil fuels than we can afford to dump into the atmosphere,” says Myles Allen, a climatologist at the University of Oxford. “It is utterly unrealistic to pretend otherwise. So, we need to scale up a massive global carbon dioxide disposal industry.” Allen adds, however, that discussions are needed about how CCS is funded. “It doesn’t make sense for private companies to make massive profits selling fossil fuels while taxpayers pay to clean up the mess.”

Out of options

Globally there are around 45 commercial facilities that capture about 50 million tonnes of carbon annually, roughly 0.14% of global emissions. According to the IEA, up to 435 million tonnes of carbon could be captured every year by 2030, depending on the progress of more than 700 announced CCS projects.

One key part of the UK government’s plans is to use CCS to produce so-called “blue” hydrogen. Most hydrogen is currently made by heating methane from natural gas with a catalyst, producing carbon monoxide and carbon dioxide as by-products. Blue hydrogen involves capturing and storing those by-products, thereby cutting carbon emissions.

But critics warn that blue hydrogen continues our reliance on fossil fuels and risks leaks along the natural gas supply chain. There are also concerns about its commercial feasibility. The Norwegian energy firm Equinor, which is set to build several UK-based hydrogen plants, has recently abandoned plans to pipe blue hydrogen to Germany, citing cost and lack of demand.

“The hydrogen pipeline hasn’t proved to be viable,” Equinor spokesperson Magnus Frantzen Eidsvold told Reuters, adding that its plans to produce hydrogen had been “put aside”. Shell has also scrapped plans for a blue hydrogen plant in Norway, saying that the market for the fuel had failed to materialise.

To meet our climate targets, we do face difficult choices. There is no easy way to get there

Jessica Jewell

According to the Institute for Energy Economics and Financial Analysis (IEEFA), CCS “is costly, complex and risky with a history of underperformance and delays”. It believes that money earmarked for CCS would be better spent on proven decarbonisation technologies such as buildings insulation, renewable power, heat pumps and electric vehicles. It says the UK’s plans will make it “more reliant on fossil gas imports” and send “the wrong signal internationally about the need to stop expanding fossil fuel infrastructure”.

After delays to several CCS projects in the EU, there are also questions around progress on its target to store 50 million tonnes of carbon by 2030. Press reports have recently revealed, for example, that a pipeline connecting Germany’s Rhine-Ruhr industrial heartland to a Dutch undersea carbon storage project will not come online until at least 2032.

Jessica Jewell, an energy expert at Chalmers University in Sweden, and colleagues have also found that CCS plants have a failure rate of about 90% largely because of poor investment prospects (Nature Climate Change 14 1047). “If we want CCS to expand and be taken more seriously, we have to make projects more profitable and make the financial picture work for investors,” Jewell told Physics World.

Subsidies like the UK plan could do so, she says, pointing out that wind power, for example, initially benefited from government support to bring costs down. Jewell’s research suggests that by cutting failure rates and enabling CCS to grow at the pace wind power did in the 2000s, it could capture a “not insignificant” 600 gigatonnes of carbon dioxide by 2100, which could help decarbonise heavy industry.

That view is echoed by Marcelle McManus, director of the Centre for Sustainable Energy Systems at the University of Bath, who says that decarbonising major industries such as cement, steel and chemicals is challenging and will benefit from CCS. “We are in a crisis and need all of the options available,” she says. “We don’t currently have enough renewable electricity to meet our needs, and some industrial processes are very hard to electrify.”

Although McManus admits we need “some storage of carbon”, she says it is vital to “create the pathways and technologies for a defossilised future”. CCS alone is not the answer, and that, says Jewell, means rapidly expanding low-carbon technologies like wind, solar and electric vehicles. “To meet our climate targets, we do face difficult choices. There is no easy way to get there.”

From melanoma to malaria: photoacoustic device detects disease without taking a single drop of blood

Malaria remains a serious health concern, with annual deaths rising every year since 2019 and almost half of the world’s population at risk of infection. Existing diagnostic tests are less than optimal, and all rely on obtaining an invasive blood sample. Now, a research collaboration from the US and Cameroon has demonstrated a device that can non-invasively detect this potentially deadly infection without requiring a single drop of blood.

Currently, malaria is diagnosed using optical microscopy or antigen-based rapid diagnostic tests, but both methods have low sensitivity. Polymerase chain reaction (PCR) tests are more sensitive, but still require blood sampling. The new platform – Cytophone – uses photoacoustic flow cytometry (PAFC) to rapidly identify malaria-infected red blood cells via a small probe placed on the back of the hand.

PAFC works by delivering low-energy laser pulses through the skin into a blood vessel and recording the thermoacoustic signals generated by absorbers in circulating blood. Cytophone, invented by Vladimir Zharov from the University of Arkansas for Medical Science, was originally developed as a universal diagnostic platform and first tested clinically for detection of cancerous melanoma cells.

“We selected melanoma because of the possibility of performing label-free detection of circulating cells using melanin as an endogenous biomarker,” explains Zharov. “This avoids the need for in vivo labelling by injecting contrast agents into blood.” For malaria diagnosis, Cytophone detects haemozoin, an iron crystal that accumulates in red blood cells infected with malaria parasites. These haemozoin biocrystals have unique magnetic and optical properties, making them a potential diagnostic target.

Photoacoustic detection

“The similarity between melanin and haemozoin biomarkers, especially the high photoacoustic contrast above the blood background, motivated us to bring a label-free malaria test with no blood drawing to malaria-endemic areas,” Zharov tells Physics World. “To build a clinical prototype for the Cameroon study we used a similar platform and just selected a smaller laser to make the device more portable.”

The Cytophone prototype uses a 1064 nm laser with a linear beam shape and a high pulse rate to interrogate fast moving blood cells within blood vessels. Haemozoin nanocrystals in infected red blood cells absorb this light (more strongly than haemoglobin in normal red blood cells), heat up and expand, generating acoustic waves. These signals are detected by an array of 16 tiny ultrasound transducers in acoustic contact with the skin. The transducers have focal volumes oriented in a line across the vessel, which increases sensitivity and resolution, and simplifies probe navigation.

In vivo testing

Zharov and collaborators – also from Yale School of Public Health and the University of Yaoundé I – tested the Cytophone in 30 Cameroonian adults diagnosed with uncomplicated malaria. They used data from 10 patients to optimize device performance and assess safety. They then performed a longitudinal study in the other 20 patients, who each attended four or five follow-up visits over up to 37 days after antimalarial therapy, contributing 94 visits in total.

Photoacoustic waveforms and traces from infected blood cells have a particular shape and duration, and a different time delay to that of background skin signals. The team used these features to optimize signal processing algorithms with appropriate averaging, filtration and gating to identify true signals arising from infected red blood cells. As the study subjects all had dark skin with high melanin content, this time-resolved detection also helped to avoid interference from skin melanin.
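The time-resolved gating described above exploits a simple physical fact: a photoacoustic signal’s arrival delay at the transducer is set by the depth of its source (delay = depth divided by the speed of sound in tissue), so peaks from skin melanin near the surface arrive much earlier than peaks from a vessel a couple of millimetres down. The sketch below illustrates the idea; the depths, amplitudes and gate window are hypothetical, not the study’s actual parameters.

```python
# Minimal sketch of time-resolved gating: keep only photoacoustic peaks whose
# arrival delay matches the depth of the target blood vessel. All numbers are
# illustrative, not taken from the Cytophone study.

SPEED_OF_SOUND = 1540.0  # m/s, a typical value for soft tissue

def delay_us(depth_mm: float) -> float:
    """Acoustic arrival delay in microseconds for a source at depth_mm."""
    return depth_mm * 1e-3 / SPEED_OF_SOUND * 1e6

# Hypothetical detected peaks: (arrival delay in microseconds, amplitude)
peaks = [
    (delay_us(0.1), 0.9),  # skin melanin, near-surface -> arrives early
    (delay_us(2.0), 0.4),  # vessel at ~2 mm depth
    (delay_us(2.1), 0.5),  # vessel
]

# Accept only arrivals consistent with a source 1.5-2.5 mm deep:
gate = (delay_us(1.5), delay_us(2.5))
vessel_peaks = [p for p in peaks if gate[0] <= p[0] <= gate[1]]
print(f"{len(vessel_peaks)} of {len(peaks)} peaks fall inside the vessel gate")
```

Gating out the early-arriving surface signal is what lets the device work even in subjects whose skin has a high melanin content.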

On visit 1 (the day of diagnosis), 19/20 patients had detectable photoacoustic signals. Following treatment, these signals consistently decreased with each visit. Cytophone-positive samples exhibited median photoacoustic peak rates of 1.73, 1.63, 1.18 and 0.74 peaks/min on visits 1–4, respectively. One participant had a positive signal on visit 5 (day 30). The results confirm that Cytophone is sensitive enough to detect low levels of parasites in infected blood.

The researchers note that Cytophone detected the most common and deadliest species of malaria parasite, as well as one infection by a less common species and two mixed infections. “That was a really exciting proof-of-concept with the first generation of this platform,” says co-lead author Sunil Parikh in a press statement. “I think one key part of the next phase is going to involve demonstrating whether or not the device can detect and distinguish between species.”

Performance comparison

Compared with invasive microscopy-based detection, Cytophone demonstrated 95% sensitivity at the first visit and 90% sensitivity during the follow-up period, with 69% specificity and an area under the ROC curve of 0.84, suggesting excellent diagnostic performance. Cytophone also approached the diagnostic performance of standard PCR tests, with scope for further improvement.
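For readers less familiar with these metrics, sensitivity is the fraction of truly infected patients the test flags, and specificity is the fraction of uninfected patients it correctly clears. The reported 95% first-visit sensitivity follows directly from the 19-of-20 detection figure above; the brief sketch below shows only the standard formulas, not any counts beyond those in the article.

```python
# Standard diagnostic-test definitions, applied to the visit-1 figure
# reported in the study (19 of 20 infected patients tested positive):
#   sensitivity = TP / (TP + FN)  -- infected patients correctly flagged
#   specificity = TN / (TN + FP)  -- uninfected patients correctly cleared

def sensitivity(tp: int, fn: int) -> float:
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    return tn / (tn + fp)

# 19 true positives, 1 false negative on the day of diagnosis:
print(f"visit-1 sensitivity: {sensitivity(tp=19, fn=1):.0%}")
```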

Staff required just 4–6 h of training to operate Cytophone, plus a few days’ experience to achieve optimal probe placement. And with minimal consumables required and the increasing affordability of lasers, the researchers estimate that the cost per malaria diagnosis will be low. The study also confirmed the safety of the Cytophone device. “Cytophone has the potential to be a breakthrough device allowing for non-invasive, rapid, label-free and safe in vivo diagnosis of malaria,” they conclude.

The researchers are now performing further malaria-related clinical studies focusing on asymptomatic individuals and children (for whom the needle-free aspect is particularly important). Simultaneously, they are continuing melanoma trials to detect early-stage disease and investigating the use of Cytophone to detect circulating blood clots in stroke patients.

“We are integrating multiple innovations to further enhance Cytophone’s sensitivity and specificity,” says Zharov. “We are also developing a cost-effective wearable Cytophone for continuous monitoring of disease progression and early warning of the risk of deadly disease.”

The study is described in Nature Communications.

Quantized vortices seen in a supersolid for the first time

Quantized vortices – one of the defining features of superfluidity – have been seen in a supersolid for the first time. Observed by researchers in Austria, these vortices provide further confirmation that supersolids can be modelled as superfluids with a crystalline structure. This model could have a variety of other applications in quantum many-body physics, and the Austrian team is now using it to study pulsars, which are rotating, magnetized neutron stars.

A superfluid is a curious state of matter that can flow without any friction. Superfluid systems that have been studied in the lab include helium-4; type-II superconductors; and Bose–Einstein condensates (BECs) – all of which exist at very low temperatures.

More than five decades ago, physicists suggested that some systems could exhibit crystalline order and superfluidity simultaneously in a unique state of matter called a supersolid. In such a state, the atoms would all be described by the same wavefunction and would therefore be delocalized across the entire crystal lattice. The order of the supersolid would then be defined by the nodes and antinodes of this wavefunction.

In 2004, Moses Chan of the Pennsylvania State University in the US and his PhD student Eun-Seong Kim reported observing a supersolid phase in superfluid helium-4. However, Chan and others have not been able to reproduce this result. Subsequently, researchers including Giovanni Modugno at Italy’s University of Pisa and Francesca Ferlaino at the University of Innsbruck in Austria have demonstrated evidence of supersolidity in BECs of magnetic atoms.

Irrotational behaviour

But until now, no-one had observed an important aspect of superfluidity in a supersolid: a superfluid never carries bulk angular momentum. If a superfluid is placed in a container that rotates at moderate angular velocity, the fluid simply stays still, slipping without friction against the walls. As the container's rotation rate increases, however, it becomes energetically costly to maintain this decoupling between container and superfluid. "Still, globally, the system is irrotational," says Ferlaino. "So there's really a necessity for the superfluid to heal itself from rotation."

In a normal superfluid, this “healing” occurs by the formation of small, quantized vortices that dissipate the angular momentum, allowing the system to remain globally irrotational. “In an ordinary superfluid that’s not modulated in space [the vortices] form a kind of triangular structure called an Abrikosov lattice, because that’s the structure that minimizes their energy,” explains Ferlaino. It was unclear how the vortices might sit inside a supersolid lattice.
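The quantization behind these vortices is the standard superfluid result (not spelled out in the article): the circulation of the superfluid velocity around any closed loop is restricted to integer multiples of Planck's constant divided by the atomic mass,

```latex
\oint \mathbf{v}_s \cdot d\boldsymbol{\ell} = n\,\frac{h}{m}, \qquad n \in \mathbb{Z},
```

where $m$ is the mass of a single atom (here dysprosium-164). A rotating superfluid therefore takes up angular momentum in discrete units, one vortex at a time, rather than spinning up as a rigid body.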

In the new work, Ferlaino and colleagues at the University of Innsbruck utilized a technique called magnetostirring to rotate a BEC of magnetic dysprosium-164 atoms. They caused the atoms to rotate simply by rotating the magnetic field. “That’s the beauty: it’s so simple but nobody had thought about this before,” says Ferlaino.

As the group increased the field's rotation rate, they observed vortices forming in the condensate and migrating to the density minima. "Vortices are zeroes of density, so there it costs less energy to drill a hole than in a density peak," says Ferlaino. "The order that the vortices assume is largely imparted by the crystalline structure – although their distance is dependent on the repulsion between vortices."

Unexpected applications

The researchers believe the findings could be applicable in some unexpected areas of physics. Ferlaino tells of hearing a talk about the interior composition of neutron stars by the theoretical astrophysicist Massimo Mannarelli of Gran Sasso Laboratory in Italy. “During the coffee break I went to speak to him and we’ve started to work together.”

“A large part of the astrophysical community is convinced that the core of a neutron star is a superfluid,” Ferlaino says. “The crust is a solid, the core is a superfluid, and a layer called the inner crust has both properties together.” Pulsars are neutron stars that emit radiation in a narrow beam, giving them a well-defined pulse rate that depends on their rotation. As they lose energy through radiation emission, they gradually slow down.

Occasionally, however, their rotation rates suddenly speed up again in events called glitches. The researchers’ theoretical models suggest that the glitches could be caused by vortices unpinning from the supersolid and crashing into the solid exterior, imparting extra angular momentum. “When we impose a rotation on our supersolid that slows down, then at some point the vortices unpin and we see the glitches in the rotational frequency,” Ferlaino says. “This is a new direction – I don’t know where it will bring us, but for sure experimentally observing vortices was the first step.”

Theorist Blair Blakie of the University of Otago in New Zealand is excited by the research. “Vortices in supersolids were a bit of a curiosity in early theories, and sometimes you’re not sure whether theorists are just being a bit crazy considering things, but now they’re here,” he says. “It opens this new landscape for studying things from non-equilibrium dynamics to turbulence – all sorts of things where you’ve got this exotic material with topological defects in it. It’s very hard to predict what the killer application will be, but in these fields people love new systems with new properties.”

The research is described in Nature.

Sceptical space settlers, Einstein in England, trials of the JWST, tackling quantum fundamentals: micro reviews of the best recent books

A City on Mars: Can We Settle Space, Should We Settle Space, and Have We Really Thought This Through?
By Kelly and Zach Weinersmith

Husband-and-wife writing team Kelly and Zach Weinersmith were excited about human settlements in space when they started research for their new book A City on Mars. But the more they learned, the more sceptical they became. From technology, practicalities and ethics, to politics and the legal framework, they uncovered profound problems at every step. With humorous panache and plenty of small cartoons by Zach, who also does the webcomic Saturday Morning Breakfast Cereal, the book is a highly entertaining guide that will dent the enthusiasm of most proponents of settling space. Kate Gardner

  • 2024 Particular Books

Einstein in Oxford
By Andrew Robinson

“England has always produced the best physicists,” Albert Einstein once said in Berlin in 1925. His high regard for British physics led him to pay three visits to the University of Oxford in the early 1930s, which are described by Andrew Robinson in his charming short book Einstein in Oxford. Sadly, the visits were not hugely productive for Einstein, who disliked the formality of Oxford life. His time there is best remembered for the famous blackboard – saved for posterity – on which he’d written while giving a public lecture. Matin Durrani

  • 2024 Bodleian Library Publishing

Pillars of Creation: How the James Webb Telescope Unlocked the Secrets of the Cosmos
By Richard Panek

The history of science is “a combination of two tales” says Richard Panek in his new book charting the story of the James Webb Space Telescope (JWST). “One is a tale of curiosity. The other is a tale of tools.” He has chosen an excellent case study for this statement. Pillars of Creation combines the story of the technological and political hurdles that nearly sank the JWST before it launched with a detailed account of its key scientific contributions. Panek’s style is also multi-faceted, mixing technical explanations with the personal stories of scientists fighting to push the frontiers of astronomy. Katherine Skipper

  • 2024 Little, Brown

Quanta and Fields: the Biggest Ideas in the Universe
By Sean Carroll

With 2025 being the International Year of Quantum Science and Technology, the second book in prolific science writer Sean Carroll’s “Biggest Ideas” trilogy – Quanta and Fields – might make for a timely read. Following the first volume on “space, time and motion”, it tackles the key scientific principles that govern quantum mechanics, from wave functions to effective field theory. But beware: this book is packed with equations, formulae and technical concepts. It’s essentially a popular-science textbook, in which Carroll does things like examine each term in the Schrödinger equation and delve into the framework for group theory. Great for physicists but not, perhaps, for the more casual reader. Tushna Commissariat

  • 2024 Penguin Random House

Four-wave mixing could boost optical communications in space

A new and practical approach to the low-noise amplification of weakened optical signals has been unveiled by researchers in Sweden. Drawing on the principles of four-wave mixing, Rasmus Larsson and colleagues at Chalmers University of Technology believe their approach could have promising implications for laser-based communication systems in space.

Until recently, space-based communication systems have largely relied on radio waves to transmit signals. Increasingly, however, these systems are being replaced with optical laser beams. The shorter wavelengths of these signals offer numerous advantages over radio waves. These include higher data transmission rates; lower power requirements; and lower risks of interception.

However, when transmitted across the vast distances of space, even a tightly focused laser beam spreads out significantly by the time its light reaches its destination, severely weakening the signal.

To deal with this loss, receivers must be extremely sensitive to incoming signals. This involves the preamplification of the signal above the level of electronic noise in the receiver. But conventional optical amplifiers are far too noisy to achieve practical space-based communications.

Phase-sensitive amplification

In a 2021 study, Larsson’s team showed how these weak signals can, in theory, be amplified with zero noise using a phase-sensitive optical parametric amplifier (PSA). However, this approach did not solve the problem entirely.

“The PSA should be the ideal preamplifier for optical receivers,” Larsson explains. “However, we don’t see them in practice due to their complex implementation requirements, where several synchronized optical waves of different frequencies are needed to facilitate the amplification.” These cumbersome requirements place significant demands on both transmitter and receiver, which limits their use in space-based communications.

To simplify preamplification, Larsson’s team used four-wave mixing. Here, the interaction between light at three different wavelengths within a nonlinear medium produces light at a fourth wavelength.
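In the dual-pump configuration described here, energy conservation fixes the frequency of the fourth ("idler") wave – a textbook four-wave-mixing relation, not spelled out in the article:

```latex
\omega_{\mathrm{i}} = \omega_{p1} + \omega_{p2} - \omega_{\mathrm{s}}
```

When the signal sits midway between the two pumps, $\omega_{\mathrm{s}} = (\omega_{p1} + \omega_{p2})/2$, the idler coincides with the signal itself, which is what makes the amplification phase-sensitive.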

In this case, a weakened transmitted signal is mixed with two strong “pump” waves that are generated within the receiver. When the phases of the signal and pump are synchronized inside a doped optical fibre, light at the fourth wavelength interferes constructively with the signal. This boosts the amplitude of the signal without sacrificing low-noise performance.

Auxiliary waves

“This allows us to generate all required auxiliary waves in the receiver, with the transmitter only having to generate the signal wave,” says Larsson. “This is contrary to the case before, where most, if not all, waves were generated in the transmitter. The synchronization of the waves further uses the same specific lossless approach we demonstrated in 2021.”

The team says that this new approach offers a practical route to noiseless amplification within an optical receiver. “After optimizing the system, we were able to demonstrate the low-noise performance and a receiver sensitivity of 0.9 photons per bit,” Larsson explains. This amount of light is the minimum needed to reliably decode each bit of data. “This is the best sensitivity achieved to date for any coherent modulation format,” he adds.
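To put 0.9 photons per bit in perspective, a back-of-the-envelope conversion gives the corresponding received optical power. The wavelength and data rate below are assumed for illustration and are not taken from the study:

```python
# Convert a photons-per-bit receiver sensitivity into received optical power.
# Assumed values (not from the study): 1550 nm wavelength, 10 Gbit/s data rate.

H = 6.626e-34   # Planck's constant, J*s
C = 2.998e8     # speed of light, m/s

def power_watts(photons_per_bit: float, wavelength_m: float, bitrate_hz: float) -> float:
    photon_energy = H * C / wavelength_m           # energy of one photon, J
    return photons_per_bit * photon_energy * bitrate_hz

p = power_watts(0.9, 1550e-9, 10e9)
print(f"{p * 1e9:.2f} nW")   # prints "1.15 nW"
```

In other words, at these assumed parameters the receiver decodes data from only about a nanowatt of incident light.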

This unprecedented sensitivity enabled the team to establish optical communication links between a PSA-amplified receiver and a conventional, single-wave transmitter. With a clear route to noiseless preamplification through some further improvements, the researchers are now hopeful that their approach could open up new possibilities across a wide array of applications – especially for laser-based communications in space.

“In this rapidly emerging topic, the PSA we have demonstrated can facilitate much higher data rates than the bandwidth-limited single-photon detection technology currently considered,” says Larsson.

This ability would make the team’s PSA ideally suited for communication links between space-based transmitters and ground-based receivers. In turn, it could help astronomers break the notorious “science return bottleneck”, removing many current restrictions on the speed and quantity of data that can be transmitted by satellites, probes and telescopes scattered across the solar system.

The research is described in Optica.

Copyright © 2025 by IOP Publishing Ltd and individual contributors