
‘Proxima’ review: bonds beyond the bounds of Earth

The bond between parent and child is undoubtedly a strong one, no matter the complexities involved. Proxima, the latest film by French screenwriter and director Alice Winocour, follows the final weeks on Earth for Sarah (Eva Green), an astronaut preparing for a 12-month stay on the International Space Station. Stunningly shot in part on location at Star City – the cosmonaut training centre in Russia – and made in collaboration with the European Space Agency (ESA), the film turns out to be far more about life on Earth than about space.

Proxima explores the various strains, stresses and successes of the relationship between Sarah and her daughter Stella, as Sarah prepares to leave on her long and dangerous journey. This is a film about a mother who is an astronaut, and an astronaut who is a mother. Green’s performance is exceptional, as her character treads the difficult line so many people must walk: being a good parent while also excelling in a career.

It is only fair to compare and contrast Proxima with two recent blockbuster sci-fi films that explore complicated parent–child relationships – Interstellar (2014) and Ad Astra (2019), which feature a father–daughter and a father–son bond, respectively, as key plot points. In both of those films, however, the family story is set against a much larger, more complex plot that philosophizes about other “big questions” in the universe.

Proxima, on the other hand, is a fictional story set in a scientific field – and so in a sense could be thought of as science fiction – but it focuses mainly on Sarah’s struggle to get through the gruelling astronaut training and be the best astronaut she can, while also trying to support her daughter and maintain their very close relationship. There are no warp drives or time travel to worry about, but for a space nerd like me there is a real joy in watching scenes shot at iconic and historic sites such as the Baikonur Cosmodrome. The direction and cinematography in Proxima provide a beautiful aesthetic that is worth catching on the big screen, if you can find one showing it and feel comfortable with the COVID-19 protocols in place.

Another key theme in the film is that of separation. Sarah is separated from Thomas, the father of her child, who is himself an ESA astrophysicist, played impeccably by Lars Eidinger. As Sarah leaves for her training, Stella goes to live with her father, moving countries, joining a new school and, most of all, missing her mother. It is clear from early in the film that Thomas is an inattentive father, and Sarah has misgivings about leaving Stella with him. Her struggle with this separation, and her attempts to keep her strong connection with her daughter as she prepares to leave the planet, are at the heart of the film.

Alice Winocour

Proxima is indisputably a feminist film. We are left with no doubt that life as an astronaut is made harder for a woman than it is for a man, especially if the woman is a parent. Sarah must decide whether to suppress her menstruation while in space – a scene reminiscent of the real-world story of NASA engineers asking astronaut Sally Ride, the first US woman in space, whether 100 tampons would suffice for a week-long flight – and she often has to grapple with equipment and spacesuits that are primarily designed for men. If that were not bad enough, she is faced with boorish misogyny all too often, not least at the hands of the mission commander, Mike Shannon, perfectly portrayed by Matt Dillon.

On first viewing, I was left with some misgivings about what exactly the film was trying to say, especially in the finale. It’s difficult to discuss without straying far into spoiler territory (if that does not worry you, see the final question in my conversation with Winocour below), but suffice it to say that I watched Proxima with two other feminists, both of whom were frustrated by the difficult choices Sarah is forced to make throughout the film, and by the lack of support she receives. This film does not champion the ability to be a mother and an astronaut with ease. Instead it realistically depicts the many hurdles along the way. Indeed, Proxima portrays the struggles of a highly skilled woman trying to excel while battling the pressures of a broken family and the rigours of preparing to go into space.

At the time of filming, the realities of a quarantine (which all astronauts must go into before a mission, to avoid taking germs to the ISS) were alien to most. Since then, much of the global population has experienced some sort of lockdown. Knowing what we now do, I wonder how many of us would be happy to quit the planet and spend a year with just a handful of others in space. Leaving our family and loved ones behind is surely one of the most difficult things to imagine. Proxima puts that idea centre stage and the result is a fascinating film that left me pondering our lives on Earth, more than our place in the universe.

It also left me with many questions, some of which I put to director and co-writer Winocour. Be warned: there are spoilers in my final question.

What is the film about for you?

I don’t think all films need to have a message – but for me Proxima is about the relationship between a mother and daughter, and the idea of separation. Because this mother is also an astronaut, she has to separate from her daughter and she also has to separate from the Earth. To me there was a poetic parallel between those two things.

When you were initially writing the script, did you come at it more from the astronaut angle, or from the point of view of a mother?

You never really know in the process of writing, but for me, the more I have to talk about something intimate, the more it has to be set in another world. In film, particularly in French cinema, people tend to tell autobiographical stories, but I find I have to project myself into a distant, unknown world that I want to discover. My first film was set in a psychiatric hospital in the 19th century, and my second was about soldiers returning from Afghanistan – so the settings can be very different. I am drawn to that. I don’t know why.

When it comes to Proxima, I’ve always had a poetic fascination with space, but I didn’t know anything about that world. So I decided to go to ESA and meet some people. I told them we could do a film about a female astronaut, which was the first core idea to emerge. Then I realized that the thing I wanted to talk about was this complex relationship between a mother and a daughter. You never see superheroines with children in films, but in real life, all the astronauts I’ve met do have children. I wanted to change the perspective and to tell the story from this point of view.

There’s a realism to the film, as much of it was filmed on location in Star City and Baikonur. Was that something of a coup?

I really wanted to have an emotional story and a cinematic story. I was not obsessed with the realistic look of things, but I wanted to film in those places, which are kind of unknown. No popular films had been shot there, yet when I went to Kazakhstan it was the only place from which crewed rockets were leaving Earth. Now there is SpaceX flying from the US, but at that time it was only Baikonur. That made it a very emotional place, and shooting there was inspiring for the actors too.

There was something exhilarating about the whole adventure of making the movie. We were like the astronauts, working with people from different countries. We had a German crew, a Russian crew, a French crew and a Kazakh crew. I was really thankful to have ESA’s support. I told them: “There are so many movies that show the work of NASA. We need one to show your work, and that of the Russians. When people think about space, thanks to movies, they only think of America. In reality, there are all these other space agencies.”

Who did you work with at ESA?

Everyone from the director of communications to astronauts Samantha Cristoforetti and Thomas Pesquet – who trained Eva Green. I met Tim Peake, who was training in Poland, several times. I spent two years travelling between Star City, Cologne and Kazakhstan while I was writing the script.

Green and I met a lot of astronauts together, including Claudie Haigneré, who was a kind of godmother for the film, as well as Canadian astronauts like Julie Payette – all those pictures of women you see at the end of the film. Green was trained by the Russian trainers and they were very hard on her.

Something that surprised me was the misogynistic way in which the mission commander Mike treats Sarah. I’d like to think that that sort of behaviour would be beneath an astronaut in this day and age?

I’m sorry to say that it is worse than you see in the film. As women we are used to that kind of macho way of behaving. Female astronauts experience the same, and there’s a kind of self-censorship that these women have to perform in order to succeed in their careers. Only 10% of astronauts in the world are women. Most women don’t dare to dream that they could do that job, or they think that, if they want children, they have to choose between the two.

Proxima shows how hard it is for women. You have to do the same tasks as men, and then more besides, to prove that you are capable and that you are credible. But these women show they can do it. Ultimately, for me, it is a story of liberation.

This is a major spoiler, but when Sarah breaks out of the quarantine facility the night before they are meant to take off, to keep a promise to Stella, it is really shocking. What was your thinking behind writing that ending?

Quarantine for astronauts is not like a prison. It’s more like something to keep journalists away. To escape is…well, it’s very hard to not want to see your family. While the scenes in the film were something I dreamt up, one of the astronauts I met, Anna Fisher, the first mother in space, told me that she had escaped quarantine in Houston to go trick-or-treating at Halloween with her daughter. She wanted to spend more time with her. Another astronaut who left quarantine was Jean-François Clervoy, who went to see his son, who was very ill with cancer at the time – Clervoy didn’t know if he would see him when he came back. There is a risk of not coming back from a mission. The day of launch is really scary, even for experienced astronauts.

  • 2020 Dharamsala/Darius Films 107 min

The power of authority: why we need to rely on experts

Eons ago, we are told, the Flood nearly wiped out the human species, sparing only the tiny handful of people who had prepared themselves. To my mind, COVID-19 is the 21st-century equivalent of the Flood. It’s a global disaster that has killed hundreds of thousands of people, yet also a lesson about the need to prepare for future threats.

These days we face a wider variety of existential threats than in biblical times, including air and water pollution, climate change and rising seas. We have more technologically advanced means to cope with them, such as medicines, vaccines and new energy sources. We also generally assume the responsibility, for both practical and moral reasons, to employ these means to protect not just our immediate families but the entire human species as well.

Still, 21st-century humans face new and staggering challenges when it comes to using those means. Noah had a direct communication channel with God and an unquestioned patriarchal authority, which let him convey the impending danger to his family without meeting disbelief. That divine link also let him marshal and apply the resources needed to build the boat that allowed him and his companions, literally, to weather the storm.

To cope with existential threats, we have to rely on the authority of people with special training – “experts”, we call them

The 21st century has no Noah, and no God warning us of looming disasters. To cope with existential threats, we have to rely on the authority of people with special training to identify, develop and apply the right tools – “experts”, we call them. And when we talk about the “authority of science”, we mean the will to defer to those experts about technical matters that we don’t understand in situations where we are vulnerable.

Flood control

As I argued in my recent book The Workshop and the World, the authority of science is not something that comes naturally to humans, but has to be generated and maintained. Such authority is fragile and arises only in an atmosphere that has been carefully nurtured by three main things: political leadership, institutional consistency and communal trust. The silver lining of COVID-19 is that the reaction to it illustrates, by negative example, why these three factors are important.

Let’s start with political leadership. Fostering the authority of science requires leaders who demonstrate their commitment to defer to experts. Donald Trump is a prime negative example. The 45th US president has called the COVID-19 pandemic a hoax. He’s said it will soon go away. He’s looked for scapegoats, left counter-measures to others, and dismantled preparations that his predecessors had put in place. He’s even advised using unsupported remedies and taken unproven ones himself – actions that may not only harm his own health but also undermine the authority of the expert advice that others need to stay healthy.

The same is true of leaders who flout their own lockdown rules or don’t wear facemasks when required or advised. Trump once retweeted a comment that wearing masks is a “symbolic” act, which dissolves what might be viewed as an act of commitment into mere theatrics. He has been still more proactive in sabotaging scientific authority by insulting and even firing scientists, and by appointing people without scientific credentials to oversee scientific activity.

Another basic element for maintaining scientific authority is to have reliable, consistent and transparent scientific institutions. The credibility of the scientific literature was hurt by retractions in The Lancet and the New England Journal of Medicine of papers reporting the results of coronavirus-related research. The World Health Organization had to retract a claim about the incidence of asymptomatic transmission of coronavirus. The US Centers for Disease Control and Prevention botched coronavirus test kits, flip-flopped in its evaluation of mask-wearing, and mixed up the results of certain tests. In the UK, the government’s Scientific Advisory Group for Emergencies (SAGE) was criticized for lacking transparency.

True, the ability to change one’s conclusions based on new evidence or reanalysis is critical to the strength and reliability of science. Yet doing so because of mistaken procedures or without transparency encourages suspicions that statements from scientific institutions are motivated by politics or incompetence. Poor communication makes institutions appear opaque and their operations mysterious, making it easier for politicians to dismiss their advice either by saying “I’m not a scientist” or by blithely claiming “I’m following ‘the science’” while doing nothing of the sort. While the latter is somewhat more commendable, it also undermines scientific authority.

Finally, scientific authority can only flourish in communities that value health and welfare rather than, say, glory, wealth and self-advancement. Such communities must also be willing to make rather than ignore decisions about how to justly allocate limited resources.

It is tempting to blame the lack of scientific authority on shameless politicians, bad institutions or selfish people

It is tempting to blame the lack of scientific authority on shameless politicians, bad institutions or selfish people – or to think that such authority can be restored by fixing any one of these. None of these three pillars by itself will generate scientific authority; it is magical thinking to hope that, say, the next election will save us. Each pillar affects the others. Politicians who insult institutions or scientists damage the authority of both and encourage community scepticism. Institutions whose advice is inconsistent encourage individuals and politicians to discount that advice. And a desire for short-term gain rather than long-term security gives politicians and institutions an incentive to do just that.

The critical point

I know this sounds religious, but think metaphorically of the pandemic as God’s lesson: “Let me teach 21st-century humans by throwing a pandemic onto Earth and installing the people least able to handle it as our leaders. Will humans get it?” The moral is that the inhabitants of a globalized and scientifically and technologically dependent world must cultivate an atmosphere in which scientific authority can exist or they will perish. No more social distancing, extinction next time.

Water polo’s eggbeater kick defies biomechanics, supercomputer says disposable masks are best

Water polo is a gruelling sport and even staying in one place requires the continuous effort of treading water. To extend their reach for the ball while stationary, players use a kick called the “eggbeater” to raise their upper bodies above the surface of the water. The kick is so called because the legs make large circles in the water.

Now Hideki Takagi of the University of Tsukuba’s swimming laboratory and colleagues have discovered that the efficacy of the eggbeater exceeds the predictions of conventional biomechanical theories based on Newton’s laws and hydrodynamics.

“Our study hints that water polo players are actually taking advantage of complex physics, including unstable vortices, to achieve this increased efficiency,” explains Takagi. The Japan-based team describes their surprising finding in Sports Biomechanics.

Mask calculations

What sort of mask is best for curbing the spread of COVID-19? Researchers in Japan have used the Fugaku supercomputer to compare the efficacy of disposable medical-style masks with reusable masks made from woven fabrics – specifically cotton and polyester.

The calculations suggest that the disposable masks, which are made from a non-woven polypropylene fabric, stop more cough droplets than woven masks do, and that they are particularly good at stopping smaller droplets, which the woven masks are not.

Fugaku has recently earned the title of the world’s most powerful supercomputer and was also used to model the flow of droplets in an office and on a railway carriage. You can read more about the study in The Guardian.

Diamond defects reveal viscous currents in graphene

A team led by researchers from Harvard University and the University of Maryland in the US has used defects in diamond to map the magnetic field generated by electrical currents in graphene. Their experiments reveal that currents in this atomically thin form of carbon flow like a viscous fluid – a result that could provide fresh insights into the collective behaviour of electrons in strongly interacting quantum systems.

Graphene has many exceptional electrical properties. Among them is the fact that, at the point where its conduction and valence bands just touch each other (the Dirac point), it can support currents composed of electrons and an equal number of positively charged holes, rather than electrons alone. In the present work, Ronald Walsworth, Amir Yacoby and colleagues set out to establish whether these electron–hole plasmas (or Dirac fluids, as they are also known) flow smoothly, like electrons travelling through a metallic wire, or like a viscous fluid, such as water running through a pipe.

Mapping the local magnetic field

Since two-dimensional systems like graphene exhibit a one-to-one relationship between current density and magnetic field, Walsworth, Yacoby and colleagues knew they could extract the current flow pattern by mapping the magnetic field near their graphene sample. To do this, they turned to nitrogen-vacancy (NV) centres in diamond.

NV centres are defects that arise when two neighbouring carbon atoms in diamond’s crystalline lattice are replaced by a nitrogen atom and an empty lattice site. They possess numerous quantum properties, including spin and the ability to emit single photons. Their energy levels are also highly sensitive to magnetic fields – meaning that when an NV centre is excited with a laser pulse, the intensity of the fluorescence it emits will change in response to fluctuations in the local magnetic field.

The team exploited these properties of NV centres in two ways. First, they ran a nanostructured diamond tip containing a single NV centre across a narrow (1–1.5 µm wide) conducting channel in graphene. This enabled them to characterize the magnetic field along a narrow line across the graphene device, revealing changes in the pattern of current flow with a spatial resolution of around 50 nm.

While this approach gave them a detailed view of current flow, it was also time-consuming, and therefore impractical for use in mapping the flow across the experiment’s entire 2D field of view. To get a wider perspective, the team placed a graphene sample on a diamond sheet that contained many NV centres near its surface. They then used a camera to image the magnetic field distribution in the sample, as recorded by the fluorescence of the NV centres.

This technique produced a complete, 2D snapshot of the pattern of current flowing through the graphene at a single moment in time. While the snapshot has a lower spatial resolution (around 400 nm) than the image obtained from a single NV centre, it gives an overview of how the current is flowing in the entire sample, rather than in a single narrow channel, at a given moment.

Dirac fluid current density develops parabolic profile

“Our two methods have now produced images revealing that the Dirac fluid current density develops a parabolic profile in which flow is fastest at the centre of the graphene sample and decreases at the edges – just as it is for water flowing in a pipe,” say study lead authors Mark Ku and Tony Zhou. While previous studies had shown that electrons can flow like water in graphene, they add, these other studies failed to confirm the presence of viscous flow in currents composed of Dirac fluids, which are characterized by strong interactions between the component particles. The researchers also note that they observed the viscous behaviour at room temperature, while previous experiments were restricted to colder temperatures.
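For readers who want the shape being described, the Poiseuille-like profile has the familiar parabolic form. Below is a minimal sketch, with the peak current density J0 and channel width W as illustrative symbols rather than values from the study:

```latex
% Minimal sketch of a Poiseuille-like (parabolic) current profile across a channel
% of width W centred on y = 0. J0 and W are illustrative symbols, not values from
% the paper.
\[
  J_x(y) \;=\; J_0\!\left[\,1 - \left(\frac{2y}{W}\right)^{2}\right],
  \qquad -\frac{W}{2} \le y \le \frac{W}{2}.
\]
```

Flow peaks at the centre of the channel and falls towards the edges, in contrast to the essentially flat profile expected for non-interacting (ohmic) transport.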

“The study of viscous hydrodynamic flow in highly pure electronic systems (like graphene) is important in condensed-matter physics because such behaviour is thought to play a role in strongly interacting quantum matter in general,” Ku tells Physics World. “Our work opens up the exciting possibility of exploring phenomena like (hitherto unobserved) electronic turbulence, which is thought to play a role in the unusual electronic properties of high-temperature superconductors, for example.”

The study, which is detailed in Nature, also shows that magnetic field imaging with NV centres can reveal novel transport phenomena. This means that NV defects could be used to study other high-quality materials that may display electron hydrodynamics, as well as other types of exotic electronic transport behaviour (such as topological superconductivity). Understanding viscous electronic flow may even allow researchers to exploit this behaviour in new microelectronics devices made from graphene and related materials.

Standing on the outside looking in: X-rays through glass


As a medical physicist at a large public trauma hospital in Melbourne, Australia, I’m not a frontline worker, but many of my colleagues in radiology are face-to-face with COVID-19 positive or suspected patients. As staff get more fatigued and confront the many challenges of the pandemic, I wanted to support them by making their work safer and easier.

We heard about a clever technique for taking chest X-rays through glass. In theory, this allows the X-ray unit to remain outside an isolated patient’s room – conserving personal protective equipment (PPE), reducing radiographer risk and speeding up the process.

X-ray through glass

We first saw details about the technique on Twitter. The authors of a 2016 article had also described the imaging and infection-control aspects of managing patients with Ebola virus (Spectrum 2016 23 18), and these seemed equally applicable to COVID-19.

The first step was to convince our staff this was a safe and reasonable way to take chest X-rays, but it was not easy to find publications that explained the radiation-safety or image-quality aspects of the technique. This motivated us to answer questions about the magnitude of X-ray transmission through glass, the safety implications of scattered radiation, and the technique parameters that would be most useful.


We’ve published our findings recently in Physical and Engineering Sciences in Medicine. We found that 110 kV and 5.5 mAs were the most commonly used factors to account for the glass and additional distance to the patient (this could be up to 3 m between the X-ray tube and detector). The glass was typically equivalent to about one half-value-layer (i.e. it reduced the intensity of the X-ray beam by about 50% and also increased the beam quality or average energy of the X-ray beam).
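To get a feel for those numbers, here is a minimal back-of-the-envelope sketch (illustrative only, not the analysis from our paper). The “one half-value layer” figure and the 3 m tube-to-detector distance come from the findings above; the 1.8 m reference distance is an assumed conventional chest-technique value, and beam hardening is ignored.

```python
# Rough sketch (not the published analysis): how much the glass and the extra
# tube-to-detector distance reduce the X-ray intensity reaching the detector.
# One half-value layer halves the intensity; the inverse-square law accounts
# for the longer distance. Beam hardening (higher average energy) is ignored.

def transmission_through_glass(half_value_layers: float = 1.0) -> float:
    """Fraction of beam intensity transmitted through a given number of half-value layers."""
    return 0.5 ** half_value_layers

def inverse_square_factor(d_new_m: float, d_ref_m: float) -> float:
    """Relative intensity when the tube-to-detector distance grows from d_ref to d_new."""
    return (d_ref_m / d_new_m) ** 2

glass = transmission_through_glass(1.0)      # ~0.5, as reported above
distance = inverse_square_factor(3.0, 1.8)   # ~0.36 for 3 m versus an assumed 1.8 m
print(f"Glass alone transmits ~{glass:.0%} of the beam")
print(f"Glass plus the extra distance leaves ~{glass * distance:.0%} of the reference intensity")
```

A reduction of roughly a factor of five like this is why the exposure factors have to be increased relative to a standard in-room chest X-ray.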

Our radiologists were satisfied with the image quality, with 90% of images taken through glass considered diagnostic. With our parameters, the typical radiation dose to patients is the same whether the X-ray is taken through glass or not.

The technique has been established for use in the current COVID-19 pandemic, and we assessed the radiation safety accordingly. The radiographers and/or nurses do not need to wear a lead apron. This was important because of the amount of PPE they were already wearing and the infection-control risks associated with sharing lead aprons.

The radiation dose to staff standing 1 m away while a chest X-ray is taken through glass is equivalent to approximately three hours of natural background radiation in Australia (less than 0.5 µSv) and is even lower further away. We encourage staff to maximize their distance from the patient being imaged and also from the backscatter off the glass outside the room.
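That equivalence is easy to check. Assuming an average Australian background of roughly 1.5 mSv per year (an illustrative figure, not one from our study):

```latex
\[
  \frac{1.5\ \mathrm{mSv/yr}}{8760\ \mathrm{h/yr}} \;\approx\; 0.17\ \mu\mathrm{Sv/h},
  \qquad
  3\ \mathrm{h} \times 0.17\ \mu\mathrm{Sv/h} \;\approx\; 0.5\ \mu\mathrm{Sv}.
\]
```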

We have now performed several hundred chest X-rays using this technique. In intensive care unit (ICU) rounds, one radiographer and a nurse tend to stay in the room, while the second radiographer operates the X-ray unit from outside the room. In the emergency department, where the rooms are not as large, staff leave the room while the X-ray is performed. This technique should never be used through lead glass.

Staff need to consider whether the technique is appropriate for each individual patient and for the clinical question at hand. For example, it is not going to be useful for confirming the position of a wide-bore nasogastric tube in a large patient who is semi-erect. In such cases, a supine in-room technique will answer the clinical question more quickly and with less chance of needing to repeat the X-ray.

Zoe Brady

One of the highlights of implementing this technique was the teamwork and collaborative spirit displayed. Radiologists, radiographers, physicists, infection-prevention nurses, ICU nurses and researchers all worked together. Adaptability is key in our response to the pandemic.

This technique strengthens our defence against COVID‑19. All of us have stretched resources coping with the pandemic and we hope our “how to” guide provides enough information for others to put the “chest X-ray through glass” technique into practice.

By the way, in case you are curious about the origin of the first part of the headline for this article, it comes from Cold Chisel’s song “Standing on the Outside”, which appears on the Australian band’s 1980 album East.

  • This article was originally published on AuntMinnieEurope.com © 2020 by AuntMinnieEurope.com. Any copying, republication or redistribution of AuntMinnieEurope.com content is expressly prohibited without the prior written consent of AuntMinnieEurope.com.

Our perilous planet


“Comfort’s in Heaven, and we are on the Earth,” says the Duke of York in William Shakespeare’s Richard II, a dire bit of philosophy that many writers have touched upon. Ellen Prager, marine scientist, writer and science adviser to Celebrity Cruises in the Galapagos, gives her own take on the subject in her latest book Dangerous Earth: What We Wish We Knew about Volcanoes, Hurricanes, Climate Change, Earthquakes, and More. It’s a dangerous world out there, she writes, and there is still much about our planet that we do not understand.

Most of us will die of old age or of common causes such as heart disease, stroke and dementia. Every so often, though, it is the Earth itself that poses a significant danger to life and limb. Natural disasters – including earthquakes, tsunamis and hurricanes – garner the world’s attention and can even mark the passage of history: the volcanic eruption of Mount Vesuvius; the 1906 San Francisco earthquake; the Boxing Day tsunami of 2004, which killed almost a quarter of a million people.

Beyond their enormous destructive power, these natural phenomena are of significant interest to scientists. They seem as though they ought to be predictable, but mostly are not; and they are massive in scale, dwarfing the human realm. A mature hurricane releases energy at a rate of about 200 times the total electrical generating capacity of civilization, according to NASA. In the open ocean a tsunami wave can travel at more than 800 km/h. Anthropogenic warming is heating the ocean with the energy equivalent of more than five Hiroshima bombs every second.

Scientific, policy and rescue communities all over the world learn from each such event, as Prager explains. Yet there is much we do not know. She describes what is known about these various cataclysms, usually with specific examples from recent decades, then covers the “wish-we-knews” – what scientists and others would like to know. Indeed, science still cannot predict when a volcano will erupt, the exact path a hurricane will take, or the value of climate sensitivity (the average amount of surface warming the Earth will experience if carbon-dioxide levels in the atmosphere double). The latter may be one of the most important numbers in human history, and pinning it down requires a much better understanding of, for example, the dynamics of the planet’s melting ice sheets, the rate at which permafrost will thaw, and how the resulting climate change will affect extreme weather, extreme rainfall, corals and marine life, algal blooms and ocean dead zones.

Prager is a strong writer with an excellent narrative sense. As an Oregonian I know a lot about the 1980 eruption of the Mount St Helens volcano in the Pacific Northwest of the US, and I was captivated by her telling of the story of the 1991 eruption of Mount Pinatubo in the Philippines’ Luzon Volcanic Arc. Although more than 840 people died in the climactic 15 June eruption – the largest of the eruptions during those months and the second-largest volcanic eruption of the 20th century – many more were saved by the work of geologists who were closely monitoring the mountain for seismic activity and magma movement. Some of the best work was done educating the local population about past massive eruptions and pushing local officials to make thorough evacuation plans.

Then, Prager writes, pulling the reader steadily along, “a defining moment occurs”. A geologist named Rick Hoblitt accompanies an air force general on an aerial survey of the mountain. Hoblitt points out the current hazards and notes how the imminent pyroclastic flow will lead right into the nearby Clark Air Base. “The general turns to his colonel and orders an evacuation,” and the next day 25,000 people leave; soon a total of 200,000 are evacuated. Within days the main eruption bursts forth and the base is obliterated, but many lives have been saved.

Climate change appropriately features heavily in Prager’s book, being the mother of all disasters now and for the next few centuries. I found the chapter on volcanoes the best in Dangerous Earth, for both science and narration. The last chapter – on rogue waves, landslides, rip currents and sinkholes – is refreshingly novel, but the final subsection on sharks seems shoehorned in, perhaps at the suggestion of the publicity department. The midsection of the book contains an attractive collection of colour photographs and figures. Prager’s paragraph on coral bleaching – about which, for some reason, I’ve always had somewhat of a mental block – is the most succinct, useful explanation I’ve ever come across.

Despite its many positives, the book does have issues. Tornadoes only receive two pages, and then only in the context of hurricanes. In fact, in the US more people are killed annually by tornadoes, on average, than hurricanes. In several places in the book, a presumably asymmetric 3D object is specified by only two numbers.

A more serious problem, in my view, is the lack of an index or even a useful notes section. Each chapter has its own bibliography, but almost no effort was made to link claims, quotations and topics in that chapter with specific sources in the chapter’s bibliography. The reader is left to guess which reference might apply based solely on the paper’s, site’s or book’s title. It makes the book less useful and turns it into something to be read instead of studied. This book deserves more.

  • 2020 University of Chicago Press 272pp £20hb

A grasshopper’s quantum leap, mentoring medical physicists, extremely brilliant synchrotron

A grasshopper lands at a random location on a patch of lawn and then jumps a fixed distance in a random direction. What shape of lawn has the highest probability of retaining the grasshopper after its jump?
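If you fancy experimenting before you listen, here is a minimal Monte Carlo sketch (illustrative only, not code from the researchers) that estimates the retention probability for one simple baseline lawn – a disc of unit area:

```python
# Illustrative Monte Carlo estimate (not the researchers' code) of the probability
# that a grasshopper stays on a disc-shaped lawn of unit area after one jump of
# fixed length d in a uniformly random direction, from a uniformly random start.
import math
import random

def retention_probability_disc(d: float, trials: int = 200_000) -> float:
    """Estimate P(grasshopper remains on a unit-area disc) after a jump of length d."""
    radius = math.sqrt(1.0 / math.pi)  # disc of area 1
    stayed = 0
    for _ in range(trials):
        # uniform random landing point inside the disc
        r = radius * math.sqrt(random.random())
        theta = 2.0 * math.pi * random.random()
        x, y = r * math.cos(theta), r * math.sin(theta)
        # jump of fixed length d in a random direction
        phi = 2.0 * math.pi * random.random()
        x2, y2 = x + d * math.cos(phi), y + d * math.sin(phi)
        if math.hypot(x2, y2) <= radius:
            stayed += 1
    return stayed / trials

if __name__ == "__main__":
    for d in (0.1, 0.3, 0.6):
        print(f"jump length {d}: retention probability ≈ {retention_probability_disc(d):.3f}")
```

Comparing this figure across different unit-area shapes, for a given jump length, is exactly the optimization at the heart of the grasshopper problem.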

In this episode of the Physics World Weekly podcast, we talk to a quartet of physicists and mathematicians who have made a remarkable connection between this “grasshopper problem” and Bell’s theorem in quantum physics.

We also chat about an international programme for mentoring medical physicists and the recent €150 million upgrade to the European Synchrotron Radiation Facility.

This week’s guests are quantum grasshopper experts Olga Goulko, Adrian Kent, Dmitry Chistikov and Mike Paterson.

Quantum error correction achieved using oscillator grid states

A practical implementation of a quantum error-correction protocol first proposed back in 2001 has been achieved by physicists in the US and France. The protocol increases the coherence time of a quantum memory and, although the work is still preliminary, it could potentially allow for much more economical use of quantum bits (qubits) in quantum computers.

Noise is inevitable in any system, and conventional computers use error correction to stop noise from corrupting calculations. In quantum computers, noise-related errors are a much more significant problem because of the delicate nature of the quantum states used to create qubits.

Errors in a quantum computer come from fundamentally classical sources, explains Michel Devoret of Yale University in Connecticut: “The computation is derailed by parasitic noise or thermal noise.” However, fixing errors is not a simple matter. “When you are repairing errors, you may introduce new errors because measurement is invasive in quantum mechanics,” says Devoret.

Non-local solution

The key to overcoming this problem lies in the fact that noise is local – it does not affect different parts of a system in a correlated way. If quantum information is stored non-locally, therefore, it should be possible to recover it. Most quantum error-correction algorithms do this by using multiple qubits to encode the same information. This is not a panacea, however, because increasing the number of qubits in a quantum computer is not easy. For example, Google’s Sycamore machine, with which the company claimed quantum advantage in 2019, contained just 54 qubits.

Back in 2001, theoretical physicists Daniel Gottesman, Alexei Kitaev and John Preskill proposed storing information non-locally in exotic quantum states of an oscillator that became known as GKP states. “Everybody knows that the uncertainty principle says you can’t measure with arbitrary precision both the position and the momentum of a harmonic oscillator,” explains Preskill, who is at Caltech. “But it turns out that you can prepare a state of a particle or an oscillator, then someone – while your back is turned – can come along and shift it a little in position and momentum, and you can measure both shifts to arbitrary accuracy if you promise that they are small. If you look at one of these states in position space it looks like a grid, and if you look at it in momentum space it looks the same way; so if you shift the comb by less than half the distance between the teeth, you can measure the distance by which it’s been shifted. So you can encode information in these states and the noise in the lab can be measured and corrected.”
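To make the grid picture concrete, here is a minimal sketch of the idealized square-lattice GKP codewords (a summary of the standard construction, not notation taken from the new paper), written in units where the position and momentum quadratures satisfy [q, p] = i:

```latex
% Idealized GKP codewords: combs of position eigenstates with teeth spaced
% 2*sqrt(pi) apart. These are the textbook, infinite-energy states; real devices
% use normalizable, finite-energy approximations.
\[
  |\bar{0}\rangle \;\propto\; \sum_{n\in\mathbb{Z}} \bigl|\, q = 2n\sqrt{\pi} \,\bigr\rangle ,
  \qquad
  |\bar{1}\rangle \;\propto\; \sum_{n\in\mathbb{Z}} \bigl|\, q = (2n+1)\sqrt{\pi} \,\bigr\rangle .
\]
```

The peaks of the two codewords interleave, √π apart, so any displacement smaller than √π/2 in either quadrature can be diagnosed and undone without revealing which logical state was stored.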

“Visionary theoretical discovery”

When the trio first published their results, Devoret says the paper received only limited attention. “I find this paper to be a wonderful example of a visionary theoretical discovery happening a little too early,” he says. “I remember reading this paper and thinking ‘Well this looks very clever but not very practical’.”

Since 2001, however, researchers have become more adept at creating exotic quantum states in various platforms. Moreover, they have realized that, as well as correcting for small displacements in position and momentum, GKP states can also compensate more effectively than any other states for the much more significant noise arising from photon loss from qubit states.

Last year, researchers at ETH Zurich in Switzerland created GKP states in qubits made of trapped ions, which are a leading contender for practical quantum computers. Now, Devoret and colleagues at Yale and the Inria Paris Research Centre have demonstrated quantum error correction of an encoded qubit in the other leading quantum-computing technology – qubits made from superconducting circuits.

Double the lifetime

The researchers initialized qubits in specific states and measured their lifetimes with and without applying the quantum error-correction algorithm. They found that the error-correction algorithm doubled the lifetime of all the quantum states to around 250 μs. “In our experiment we’re just protecting memory,” says Devoret. “The next step is to show fault-tolerant computation: you want to protect an operation and error correct while you are computing.”

Preskill is impressed. “Until now they’ve only demonstrated a fairly modest improvement over what they can do without using these special error-resistant states,” he says, “but it’s an important step.”

As John Martinis of the University of California, Santa Barbara, adds: “I think it’s a really interesting [achievement] and a step forward for the field because they made this error-protected logical qubit using ideas that have been around for a long time. They showed it worked.”

Martinis, who until recently worked at Google’s AI lab developing quantum computers, cautions, however, that radical improvements may be necessary to compete with the brute-force approach. “If you look at the technology of computers, we know that, over time, people learn how to make lots of bits – so we can make lots of qubits. The advantage here is that you do this error correction internally using one qubit with a complicated state, so you don’t necessarily have to scale up, but unless you extend the lifetime thousands or millions of times, how useful is it going to be?” he says.

The research is described in Nature.  

Molecular Trojan Horse breaches blood–brain barrier

Nanoparticles doped with molecules derived from a neurotransmitter can smuggle chemical cargoes across the blood–brain barrier (BBB). A team at Tufts University in the US created such nanoparticles and used them to deliver a diverse range of therapeutic substances into the brains of mice. The technique could one day be used to treat neurological conditions such as infections and neurodegenerative disorders, while avoiding the side effects that accompany other methods for penetrating the BBB.

The BBB comprises a layer of endothelial cells that line the brain’s blood vessels. This layer of cells is selectively permeable and protects the brain from toxins and pathogens that might be circulating in the blood. Unfortunately, it also keeps out most drugs and other therapeutics, making certain brain diseases difficult to treat by conventional methods of drug delivery.

Finding ways to breach or circumvent the BBB is the subject of active research. Although some techniques have been shown to deliver drugs with a degree of success, methods that disrupt the barrier by chemical or physical means can simultaneously let other – potentially harmful – substances sneak across alongside the intended molecules. Writing in Science Advances, Feihe Ma, Liu Yang and colleagues report an approach that is more selective.


To develop their technique, the team experimented with three chemicals that are naturally present in the brain: tryptamine, phenethylamine and phenylethanolamine. These neurotransmitters are members of the special class of molecules that can cross the BBB in a process that is thought to occur via active transport (rather than passive diffusion) across the endothelial cells’ membranes.

Ma, Yang and colleagues coupled the functional amines in the neurotransmitters to lipid chains. This created composite molecules (NT–lipidoids) that are hydrophobic at one end and hydrophilic at the other. In an aqueous solution, such amphiphilic molecules bunch together to form spherical particles called micelles that can encapsulate other chemical species within their cores. By using NT–lipidoids in this process, the researchers created nanoparticles with surfaces studded with the neurotransmitter-derived amines.

When they injected NT–lipidoid nanoparticles carrying fluorescent dye into live mice, the researchers found that the tryptamine-derived formulation (NT1) successfully breached the BBB and accumulated in the animals’ brains. The presence of tryptamine on the surfaces of the nanoparticles apparently triggered the transport process that lets the neurotransmitter pass into and through endothelial cells. A mystery that the researchers are currently investigating is why the phenethylamine- and phenylethanolamine-derived formulations failed to achieve the same result.

To test the technique with a more therapeutically relevant cargo, the team next used NT1 nanoparticles to encapsulate amphotericin B, an antifungal drug to which the BBB is usually impermeable. Like the fluorescent dye, this substance was carried across the BBB successfully. However, the researchers found that the concentration delivered to the animals’ brain tissue was even higher when they added lipidoids containing phenylboronic acid to the mixture. This component makes the lipidoid more soluble in water, decreasing the size of the nanoparticles from 800 nm to 100 nm.

For some treatments, merely getting the drug across the BBB is not enough, however. Antisense therapy, for example, could be used to limit production of the tau-protein tangles associated with Alzheimer’s disease, but the antisense oligonucleotides (ASOs) that produce the beneficial effect must penetrate into the neurons themselves. To achieve this, the researchers incorporated into the nanoparticles yet another lipidoid, 306-O12B-3, which they had previously shown could deliver ASOs into cells in the liver. They found that nanoparticles that combined NT1 and 306-O12B-3 lipidoids transported the tau ASOs across both the BBB and the membranes of the neurons, yielding a reduction in tau mRNA.

Although the results are promising, the technique is still some way from the clinic. “More studies and clinical trials, including pharmacokinetics, pharmacodynamics, toxicity etc, will be needed to determine the utility and safety of the technology in humans,” explains corresponding author Qiaobing Xu.

There are also still fundamental questions about the phenomenon at the heart of the work. The ineffectiveness of lipidoids derived from phenethylamine and phenylethanolamine is unexplained, but even the way in which the tryptamine-doped nanoparticles work is unclear. “We think they cross the BBB by way of a transporter-mediated process,” says Xu, “but we don’t know the mechanism exactly.”

Particle and atomic physics more likely to win Nobels

Research into particle and atomic physics is more likely to be awarded a Nobel prize than work in other fields. That is according to new analysis from scientists at Stanford University, who find, however, that research that does achieve such recognition may actually have far less impact than other discoveries made at the same time.

John Ioannidis and colleagues from Stanford mapped the key Nobel-prize-related publication of each laureate who was awarded a prize in either physics, chemistry or physiology and medicine between 1995 and 2017. They also analysed 63 million other articles from the Scopus database that were published in the same timeframe. The work revealed that, out of 114 scientific fields, particle and atomic physics alone account for a quarter of Nobel prizes awarded during the period covered by the study. When including cell biology, neuroscience and molecular chemistry, these five fields accounted for more than half of the awards. Furthermore, almost all Nobel-prize-related papers have been cited less extensively than top papers published around the same time.

Many candidates deserve a prize, but only a few receive it

Lutz Bornmann

While the Stanford group was unable to determine whether this clustering is driven by the nomination of candidates or by the selection of the awardees, or both, they are concerned that it might create a culture where some scientists are considered less worthy simply because of the field in which they work. It might even influence the fields in which researchers choose to undertake research. “Scientists do care about rewards and incentives. Therefore, they may be attracted to work in fields or subdisciplines that are attracting more recognition, funding and have better odds of being seen as important science,” Ioannidis told Physics World. “Diversity and richness of disciplines and approaches is a strength of science, and too much clustering of recognition may be detrimental in this regard.”

Ground-breaking research

Lutz Bornmann, a science sociologist at the headquarters of the Max Planck Society in Munich, Germany, is “surprised” by the team’s findings. “I had expected that the Nobel prizes would be evenly distributed across fields,” he says. He wonders whether, in addition to the potential for bias in the nomination and selection processes, the clustering might indicate that certain fields of science are more amenable to the type of ground-breaking research likely to achieve Nobel recognition.

Bornmann is keen, however, for scientists to continue their research efforts in whichever field captures their interest. “Although the study revealed these clustering processes, researchers should not adjust their careers to the results of the study,” he says. After all, he explains, even researchers active in fields seemingly favoured by the prize stand only a very small chance of being awarded a Nobel, however worthy their achievements. “Many candidates deserve a prize,” he says, “but only a few receive it.”

The research is published in PLOS One.
