
Subradiance stores light in dense atomic clouds

Subradiance, whereby excited atoms decay more slowly than usual, has been spotted in a dense atomic cloud for the first time. Giovanni Ferioli and colleagues at the University of Paris-Saclay prepared longer-lived “subradiant” atomic states using optical tweezers and laser pulses. With further improvements their technique could have applications in optics and quantum computing.

An atom in an excited state will decay back into its ground state by spontaneously emitting a photon. In 1954, the American physicist Robert Dicke showed that this decay process can be enhanced within dense ensembles of atoms where the average atomic separation is smaller than the wavelength of the emitted photon. Through this effect of superradiance, photons emitted by the atoms constructively interfere with each other, resulting in a burst of light that is shorter and more intense than would occur if the decays occurred independently.

More recently, physicists have discovered that the opposite effect, called subradiance, can also occur. Here, emitted photons destructively interfere with each other, inhibiting the decay of excited states. While subradiance has been seen in dilute atomic ensembles and ordered atomic arrays, it had not been observed before in dense clouds of atoms.

In a pinch

Now, the Paris team has seen subradiance within dense, disordered clouds of cold rubidium atoms, which were confined within an optical-tweezer trap. After being excited by a laser pulse, the rubidium atoms decay rapidly at first because the laser mostly couples to superradiant states. Over time, however, subradiance emerges through a combination of two mechanisms. First, the weakly excited subradiant states of the atoms far outlive their superradiant states, and eventually come to dominate the cloud’s overall emission. Second, a certain fraction of the superradiant states will leak into subradiant states.

Together, these factors lead to a long tail of emission that persists after the excitation has occurred. The team also observed that the lifetime of the subradiance increases as more atoms are added to the cloud.

By increasing the intensity of their laser pulse, Ferioli’s team could put the atoms into a quantum superposition of excited states. While initially decaying via superradiance, these multiple states can eventually become trapped in a “dark” state, with destructive interference preventing further decays. The researchers showed that firing a subsequent laser pulse at the atoms can cause the cloud to emerge from its dark state and release a sudden burst of light.

In the future, Ferioli and colleagues hope to gain more control over the lifetimes of their excited atoms. If achieved, this could enable researchers to prepare long-lived and highly reliable networks of entangled atoms. This could lead to applications in areas including optics, metrology and quantum computing.

The research is described in Physical Review X.

MRI reveals deterioration of brain’s reward circuitry in younger-onset dementia


Frontotemporal dementia (FTD) is a brain disorder that most commonly affects those under the age of 60. Due to an overlap of clinical symptoms (for example, loss of enthusiasm, empathy or motivation) with other neurological disorders, as well as the gradual nature of its onset, it is not uncommon for patients with FTD to be misdiagnosed with late-life depression. Therefore, it is critical to find a distinct symptom that can be used to diagnose FTD.

Muireann Irish and her team at The University of Sydney are the first to identify a distinct symptom in patients with FTD: anhedonia, the inability to enjoy pleasant experiences. Their research, published in Brain, also gives insight into the neurobiological workings of anhedonia, distinguishing it from other neuropsychiatric conditions such as apathy and depression.

“We are very excited by these findings as they reveal a symptom that has not previously been documented in frontotemporal dementia, as well as the neural bases of this symptom, opening the door to potential treatments,” explains Irish.

A unique symptom of frontotemporal dementia

In a study of FTD patients, patients with Alzheimer’s disease and healthy controls, the researchers strove to characterize this multi-faceted condition through two diagnostic approaches: cognitive and neuroimaging assessments. Behavioural tests revealed a high prevalence of anhedonia in FTD syndromes as opposed to other forms of dementia. Specifically, FTD patients with anhedonia would experience a lack of interest in rewarding experiences and enjoyable hobbies, such as eating a favourite meal or spending time with friends.

Grey matter degeneration

The researchers then performed voxel-based morphometry analysis of participants’ whole-brain MR images to examine voxel-by-voxel changes in grey matter signal intensities. They discovered that the neural circuitry of anhedonia differs from that of apathy and depression. Specifically, FTD patients diagnosed with anhedonia show deterioration mostly within a frontostriatal grey matter network responsible for experiencing pleasure.

The researchers note that participants with Alzheimer’s disease, who did not show clinically significant anhedonia, exhibited different patterns of grey matter atrophy to the FTD patients.

Shedding light on anhedonia

While this work provides insight on potential treatment areas that could improve the quality of life for these patients, more research is needed to gain a comprehensive understanding of anhedonia in FTD. Future investigations will focus on the relationship between the deterioration of the brain’s reward circuit and the manifestation of anhedonia in the patient’s everyday life.

“Our findings are also important for understanding the subjective experience of the person living with dementia, for the delivery of personalized care, as well as revealing broader insights into a fundamental aspect of the human condition,” says Irish. “If we pause to consider what it might be like to lose our capacity to experience pleasure, we can appreciate the immense need for future work in this field to restore some of the simple pleasures in life to those affected by these cruel disorders.”

Superfluidity seen in a 2D Fermi gas

Physicists in Germany say they have found definitive evidence for the existence of superfluidity in an extremely cold 2D gas of fermions. Their experiment involved confining a few thousand lithium atoms inside a specially designed trap, and they say that the finding could help shed light on the role of reduced dimensionality in high-temperature superconductors.

Understanding the mechanisms that allow electrical current to flow without resistance inside cuprate materials at ambient pressure and at temperatures of up to 133 K is one of the biggest outstanding challenges in condensed-matter physics. Although scientists can explain the process behind more conventional, lower-temperature superconductivity, they are still trying to work out how the phenomenon can take place at high temperatures in what are essentially 2D materials (cuprates being made up of layers of copper oxide). Such low-dimensional materials are prone to fluctuations that prevent the long-range coherence thought to be essential for superconductivity.

2D Fermi gases can serve as model systems to help clear up this mystery, since the strong and tuneable correlations between their constituent fermions can mimic interactions in superconductors. Macroscopic quantum phenomena such as Bose–Einstein condensation involve large numbers of bosons – particles with integer spin – co-existing in a single quantum state. Fermions, in contrast, have half-integer spin and are subject to the Pauli exclusion principle – which precludes multiple particles sharing quantum states. But fermions can get around this restriction by pairing up and combining their spins.

Superfluidity too relies on a macroscopic quantum state of bosons, occurring at very low temperatures and causing the fluid in question to flow without viscosity. While 3D Fermi gases have previously been shown to exhibit superfluidity, only indirect evidence had been gleaned for the phenomenon in such gases restricted to 2D. But Lennart Sobirey and colleagues at the University of Hamburg have now observed superfluidity in a 2D Fermi gas thanks in part to a special kind of trap they initially demonstrated in 2017.

Box potential

The researchers carried out their experiment using a Fermi gas of about 6000 lithium-6 atoms. As is the case with 3D demonstrations, they first used a series of optical and magnetic techniques to cool the atoms down to a fraction of a degree above absolute zero and hold them in place. The difference this time was that they created what is known as a box potential, by loading the gas into an optical lattice created by two blue lasers. This tightly confined the gas to a very thin layer, with the energy needed to move atoms in the vertical direction exceeding the thermal energy and chemical potential of the gas.

To establish whether their gas was a superfluid, the team turned to the Landau criterion. This stipulates that excitations can only occur within a superfluid when there is movement above a certain minimum velocity. In other words, there will be no friction between the superfluid and any impurity that moves through it more slowly than this.
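The Landau criterion can be sketched numerically. In the back-of-the-envelope Python snippet below (dimensionless units and a generic Bogoliubov-like dispersion, chosen purely for illustration rather than taken from the Hamburg team’s analysis), the critical velocity is the minimum of ε(p)/p over all excitation momenta:

```python
import numpy as np

# A generic Bogoliubov-like dispersion in dimensionless units (mass m = 1,
# sound speed c = 1): epsilon(p) = sqrt((c*p)**2 + (p**2 / 2)**2)
c = 1.0
p = np.linspace(1e-4, 5.0, 100_000)   # avoid p = 0 so eps/p stays finite
eps = np.sqrt((c * p)**2 + (p**2 / 2.0)**2)

# Landau criterion: the slowest impurity speed that can create an excitation
v_c = np.min(eps / p)
print(f"critical velocity = {v_c:.4f} (sound speed c = {c})")
```

For a phonon-like dispersion of this kind, ε/p is smallest as p approaches zero, so the critical velocity equals the sound speed; an impurity moving more slowly than this can create no excitations and so feels no friction.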

The impurity created by the researchers came in the form of a moving lattice of light. They directed two red laser beams at the centre of their trap, so creating an interference pattern with a sinusoidal potential. By offsetting the frequency of the two beams very slightly, they were able to move the lattice through the gas with a certain speed.

Temperature changes

They did this for a range of different velocities and in each case measured the system’s change in temperature – a measure of how many excitations were created. They found that as they increased the velocity the movement remained frictionless, until they reached a certain critical value. At this point the movement started to generate heat.

The researchers carried out this procedure both when the fermions interacted strongly to create a Bose–Einstein condensate and when they formed more weakly bound Cooper pairs. Although the precise response in the two states was different, in both cases they observed a critical velocity below which movement was frictionless. This, they say, “constitutes conclusive evidence of superfluidity”.

What’s more, the researchers found that this superfluidity was temperature dependent. By preparing the gas at different temperatures and in each case moving the lattice through the gas at a variety of speeds, they observed that the critical velocity came into play at low temperatures but not at higher ones. In other words, the gas went through a phase transition from a superfluid to a normal fluid at a certain critical temperature. They determined that temperature to be 35 nK, which, they say, agrees very well with theoretical predictions.

John Thomas of North Carolina State University in the US, who was not involved with the research, points out that the experiments show a clear peak in the critical speed at the crossover from a Bose–Einstein condensate to a Fermi gas – where the gas is most strongly correlated. This, he says, “already provides new insights into the nature of superfluidity in 2D systems”. He adds that the work paves the way for studying the effects of dimensionality, which, he points out, can be smoothly tuned from 2D to 3D by reducing the strength of the confining potential.

The research is described in Science.

The story of science fiction and society

Rachel Carson’s Silent Spring – the famous 1962 book on the devastating impact of pesticides like DDT – is credited with revitalizing the environmental movement. It also helped to accelerate the transition of science journalism from its former stance as largely uncritical cheerleader to something more like a “watchdog” of the fourth estate, as well as paving the way for the founding of the US Environmental Protection Agency. Interestingly, Carson’s book did not begin with history or scientific fact, but instead with a form of science fiction, in which ecosystem disruption caused by unregulated pesticide use has killed off all life in Anytown, USA.

With its use of fiction next to fact and its demonstrable impact on society, Silent Spring is an excellent case study and just one of the examples analysed in Science Fiction, an elegantly written new book by media and cultural studies researcher Sherryl Vint. At the heart of this captivating book is the long and complicated relationship between science and sci-fi, in which fiction serves to reflect and reflect upon science, the production of which has in turn been influenced by science fiction. The work is the latest in the MIT Press “Essential Knowledge” series, which, as the publisher puts it, offers “accessible, concise, beautifully produced pocket-size books” delivering “expert overviews” on topics ranging from neuroplasticity and quantum entanglement to fMRI and nihilism.

Largely eschewing a potentially lengthy discourse on both the origins and history of science fiction, as well as the fool’s game of trying to define its boundaries, Vint instead concentrates on the core of the so-called genre: “a vision of the world made otherwise and what possibilities might flow from such otherness”. Science Fiction begins by briefly exploring sci-fi’s early grounding in utopian writings – and its more recent shift towards the dystopian tradition. The first half of the book also covers sci-fi’s popular associations with futurology and speculative design, as well as a relationship to colonialism that has long been baked into the texture of the genre.

The second half of the work, meanwhile, focuses on science fiction as a medium through which the reader can contemplate our relationship with both existing and potential scientific and technological advances in robotics and genomics, as well as with forces that are shaping our society, like climate change and the increasing dominance of finance in the economy.

Throughout, Vint is engaging and critical, and demonstrates a formidable command of the science fiction canon, from which she draws examples to show the breadth of topics onto which the genre can provide a lens. My primary regret, however, is that the book is not longer – and does not embrace wider themes. At the outset (and even on the back cover blurb), Vint makes it quite clear that her goal is to explore the engagement between science fiction and current research in science and technology, a place where visions of future technological changes might be imagined and interrogated. Such a frame likely aligns well with the interests of the Physics World readership.

Nevertheless, as Vint does note, “social as well as technological change is at stake in SF”. As a reader who has some background in the liberal arts and also in science, I feel that Science Fiction lacks important chapters for what is supposed to be an introductory text. I was anticipating, for example, some discussion of science fiction’s intersection with gender, race, sexuality and class. This is not to say that these topics are not touched upon – Afrofuturism, for example, gets a few mentions, with reference to the iconic works of individuals like Octavia Butler, Sun Ra and Janelle Monáe. However, these examinations are fleeting for a cultural-studies text that acknowledges that, as well as exploring plausible scientific extrapolation, science fiction equally serves “as a literature of social change, often using futuristic technologies to establish that its stories take place in different worlds, but remaining more interested in social than scientific change”.


My other criticism concerns a peculiar design choice that I suspect might be endemic to the “Essential Knowledge” series: the off-putting decision to punctuate every 10-or-so pages with full-page, inverted-colour pull quotes. I’m not against pull quotes in principle, but the value they add should exceed their capacity to disrupt the flow of one’s reading (which their scale does here in a way that, say, a picture would not have). I can’t help but feel their utility lies in attracting the interest of prospective buyers leafing through the work in a bookstore – but to the detriment of the ultimate readership.

These issues aside, Science Fiction nevertheless left me very open to more – both of Vint’s insights into science-fiction studies and also to some of the other, less familiar, offerings in the “Essential Knowledge” series. Overall, this is well worth a read.

  • 2021 MIT Press $15.95pb 224pp

3D printing technique keeps brittle tungsten crack-free

Tungsten has many excellent properties. It resists corrosion, and its melting point of 3422 °C is the highest of all metals, making it an ideal material for components that operate at extreme temperatures. There is a problem, though: it is highly brittle at room temperature, which means it is hard to process using conventional techniques.

Researchers at the Karlsruhe Institute of Technology (KIT) in Germany have now addressed this problem by adapting an additive manufacturing technique called electron beam melting (EBM) for use in tungsten processing. The resulting crack-free metal could be used in high-temperature components such as rocket nozzles, heating elements for furnaces, or parts for fusion reactors and medical imaging systems.

Additive manufacture

The KIT researchers, led by Steffen Antusch from the Institute for Applied Materials – Materials Science and Engineering (IAM-WK), have studied several ways of using additive manufacturing (also known as 3D printing) to make tungsten components that require little to no post-production processing. In their latest work, they used EBM to reduce strain in tungsten during processing and thus produce a soft material with no cracks that is easier to handle.

The EBM technique uses electrons accelerated in a vacuum to melt metal powder. By moving the electron beam, it is possible to produce a 3D component from the metal in an additive way – that is, layer-by-layer. The technique was originally developed for titanium alloys and materials requiring high processing temperatures.

Pre-heating reduces deformation and inherent stress

To create 3D-printed parts from tungsten instead, Antusch and colleagues used the electron beam in the EBM machine to pre-heat the tungsten metal powder before melting it. The researchers explain that this pre-heating procedure reduces deformation and inherent stress in the metal, making it possible to process materials that break easily at room temperature but can be deformed when hot.

Compared to other techniques, such as laser printing, the new approach is much better at producing crack-free tungsten, Antusch tells Physics World. And unlike powder injection moulding – another widely employed advanced manufacturing technology for fabricating complex, high-volume net-shape components – Antusch notes that with the new method “you don’t need expensive tools and are free to design the printed parts”.

The IAM-WK researchers are involved in work for the Helmholtz Association and the European Fusion Programme (EUROfusion), with the long-term goal of developing materials and processes for high-temperature applications in fusion energy and medical engineering (such as making parts for CT scanners). They now plan to characterize and test the mechanical properties of their printed tungsten materials for use in such applications.

Meet NIST’s pandemic poet and other characters from 1918; quasicrystal created by nuclear bomb

The US National Institute of Standards and Technology has been around for over a century, and in 1918 the metrology institute first started to photograph its employees owing to security concerns related to the First World War.

NIST librarian and archivist Keith Martin has been looking into the lives of some of the people photographed and describes some of the most interesting characters in this lovely article.

One of the photos is of a young Frank Davenport Calmore, who was a clerk in the NIST instrument shops while attending the Howard University School of Law – a historically Black university in Washington DC. Despite earning his law degree, he spent decades working as a train steward in Texas. Although denied a legal career, he used his legal training on several occasions to advocate for fair employment for Black Americans in the railroad industry.

Elizabeth Yung-Kwai is one of the earliest known Asian-American employees at NIST, which was founded as the National Bureau of Standards in 1901. A student at Wellesley College, she researched self-luminous materials and co-authored the paper “Studies of radium luminous materials,” which was presented at the April 1919 meeting of the American Physical Society. After raising a family she served her country again in the Second World War as a member of the Women’s Army Corps.

Cleon Throckmorton was a lab assistant at NIST while studying at George Washington University. A year after his photo was taken, he opened the Krazy Kat Klub in Washington DC, which catered to a gay clientele. He later moved north to New York City where he became a successful set designer for theatre and television.

But perhaps the most interesting find by Martin is a poem written by the chemist Campbell Waters about how to avoid contracting influenza. It was recorded for posterity by office clerk Lois Crump, who sent it to her husband overseas. It begins “Oh, shun the common drinking cup, void the kiss and hug…”. Words that resonate today.

Quasicrystal quest

For more than a decade Luca Bindi and Paul Steinhardt have been on a quest to discover quasicrystals. The duo went on an expedition to the wilds of the Kamchatka Peninsula in the far east of Russia to successfully show that quasicrystals can occur in nature – they had previously only been synthesized in the lab.

Now, the duo and colleagues have found a previously unknown icosahedral quasicrystal in a sample of red trinitite. This material was formed during the Trinity test – the first detonation of a nuclear bomb, which occurred in New Mexico in July 1945.

Trinitite was created when the intense heat of the bomb fused sand with copper wiring from recording equipment. The team used several techniques to study the structure and composition of metallic blobs within a trinitite sample and found a structure with fivefold, threefold and twofold rotational symmetries. Fivefold rotational symmetry violates the symmetry rules of periodic crystals, so the structure must be a quasicrystal.
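Why fivefold symmetry is forbidden in ordinary crystals follows from the crystallographic restriction theorem: a rotation by 2π/n can map a periodic lattice onto itself only if the trace of its rotation matrix, 2cos(2π/n), is an integer. The quick check below is an illustrative aside, not part of the study itself:

```python
import math

def lattice_compatible(n):
    """A rotation by 2*pi/n preserves some periodic lattice only if
    its matrix trace, 2*cos(2*pi/n), is an integer."""
    trace = 2 * math.cos(2 * math.pi / n)
    return abs(trace - round(trace)) < 1e-9

# Only 1-, 2-, 3-, 4- and 6-fold rotations survive the test
allowed = [n for n in range(1, 13) if lattice_compatible(n)]
print(allowed)                 # -> [1, 2, 3, 4, 6]
print(lattice_compatible(5))   # -> False: fivefold symmetry needs a quasicrystal
```

A structure with fivefold axes therefore cannot be periodic, which is exactly what makes the trinitite blobs quasicrystalline.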

The quasicrystal that Bindi and Steinhardt found in Kamchatka probably fell to Earth in a meteorite. This latest work suggests that quasicrystals could be forming in violent events here on Earth, including lightning strikes, meteor impacts, or other nuclear detonations.

  • Paul Steinhardt has written a highly entertaining book about his quest for quasicrystals. You can read my review here.

CMOS controller for quantum computer operates at 3 K

An electronic device that operates at cryogenic temperatures while controlling spin quantum bits (qubits) has been unveiled by researchers in the Netherlands. The controller could help alleviate the “wiring bottleneck” that threatens the development of quantum computers that integrate large numbers of qubits.

Researchers are developing quantum computers using several different technologies and it is not yet clear which current technology – if any – will lead to the creation of low-cost, scalable devices. Most of the competing designs pose significant challenges related to device temperature. That is because quantum computing devices typically operate at cryogenic temperatures, making them much colder than the wires and other conventional electronics used to connect the quantum devices to the outside world. This radical temperature mismatch can thwart the design and operation of quantum devices.

Now, scientists at the Delft University of Technology (TU Delft) have, in collaboration with Intel and the Netherlands Organization for Applied Scientific Research (TNO), circumvented this issue by showing that a quantum chip can be controlled by an electronic device held at cryogenic temperatures. Writing in Nature, they report that their cryo-controller, called Horse Ridge, directed operations on a silicon-based quantum chip as successfully as more standard, off-the-shelf room-temperature electronics.

“The way we build quantum chips now involves a single wire going from every qubit [quantum bit] to instrumentation at room temperature,” says Lieven Vandersypen, a physicist at TU Delft who was involved with the study. This approach, he explains, will not be practical in the future as physicists try to connect thousands or millions of qubits at a time to perform complex computations. “At that point, the solution of having a single wire going to every qubit is no longer feasible,” he says. His team has been working on bringing electronics closer to the qubits, both in terms of location and temperature, in order to avoid this problem.

Connecting millions of qubits

“The problem with integrating qubits and electronics is a thermal problem, a cryostat problem really,” agrees Sorin Voinigescu, an engineer at the University of Toronto who was not involved with the study. Microwave controllers such as Horse Ridge work like a mobile phone transceiver, he notes, with the added difficulty of having to operate at low temperatures. In the future, Voinigescu says, to control millions of qubits scientists will have to pack tens of thousands of such transceivers in a tiny package within a cryostat right alongside them. The new study could be a step towards meeting this challenge.

In their experiment, the team used Horse Ridge to control a pair of spin qubits confined within a nanoscale silicon system known as a double quantum dot. The cryo-controller applied pre-programmed short bursts of microwave radiation in order to manipulate the spins or, equivalently, perform calculations with the qubits that they make up. Conventionally, such microwave bursts come from room-temperature electronics connected to the qubit chip through a coaxial transmission line. Horse Ridge, on the other hand, delivered them while operating at a cool 3 K, without sacrificing any quality.

Vandersypen notes that in their setup, researchers could switch between directing the qubits with room-temperature electronics or with the cryogenic controller. They found essentially the same control fidelity, a promising 99.7%, in both cases. In other words, Horse Ridge’s control of the qubits was nearly 100% reliable.

Long-established technology

The cryo-controller is based on long-established complementary metal-oxide-semiconductor (CMOS) technology, notes TU Delft team member Edoardo Charbon. “Putting together 60 years of experience of CMOS with qubits was a winning team,” he says.

In addition to smooth operation at temperatures of a few kelvin, CMOS components can be relatively easily miniaturized and integrated with quantum chips, according to Charbon. Further, Horse Ridge can be programmed to execute different sets of manipulations of the qubits. “Versatility usually comes with a price in complexity and size or bulkiness of the device,” Charbon says. “In our case it’s all integrated, it’s already there, all you have to do is program it differently.” Voinigescu also points out Horse Ridge’s complexity, as well as its capacity to run more algorithms than cryo-controllers tested in past experiments.

Vandersypen hopes that the combination of versatility, high reliability and relatively straightforward integration of CMOS technologies such as Horse Ridge can help silicon-based qubits like those in his team’s experiment move up in the quantum computer design race. At the same time, though Horse Ridge is much colder than any room-temperature wire, it is still not as cold as the qubits that it controls, which are held at 20 mK. Charbon says that its temperature can be lowered a bit further, but to fully master putting the controller and the ever-growing number of qubits within the same device, his collaborators will also have to engineer warmer quantum chips. “The two temperatures, one day, will be maybe the same,” he says, setting an ambitious goal for their future work.

The research is described in Nature.


Entangling measurement promises more efficient quantum networks


A key resource in future quantum communication networks is entanglement: a quantum correlation that can be developed between, for example, distant nodes of the network. Special methods of measuring the nodes’ state can create the entanglement or protect an already existing entanglement against destructive environmental effects. Scientists led by Gerhard Rempe at the Max Planck Institute for Quantum Optics (MPQ) in Germany have implemented such a measurement on two distant atomic qubits.  The final state of the qubits has 67 percent “fidelity” to an ideally entangled state.

Two entangled qubits in a Bell state

Two classical bits can exist in one of four states: 00, 01, 10 or 11. Two quantum bits, or qubits, can be prepared in one of these states too, but their wave-like character means they can also adopt a phase factor and then be combined. Four such combinations are known to be especially interesting: in-phase and out-of-phase combinations of 00 and 11 (labelled φ+ and φ−) and in-phase and out-of-phase combinations of 01 and 10 (labelled ψ+ and ψ−). If the two qubits are prepared in one of these four states, which are known as Bell states, they have maximum entanglement. This means, for example, if one experimenter, Alice, measures the state of the first qubit to see if it shows 0 or 1, and her colleague Bob measures the second qubit, Bob always knows the result of Alice’s measurement (based on the random outcome of his own measurement) without asking her.
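This perfect correlation can be read straight off the state vectors. The short Python sketch below (an illustrative aside, using the standard computational basis 00, 01, 10, 11 rather than anything specific to the MPQ experiment) writes down the four Bell states and checks two facts: each qubit alone looks perfectly random, yet for φ+ the two outcomes always agree; and the four states are mutually orthogonal, which is what lets a joint Bell-state measurement tell them apart:

```python
import numpy as np

s = 1 / np.sqrt(2)
# The four Bell states as vectors over the basis (00, 01, 10, 11)
phi_plus  = s * np.array([1, 0, 0,  1])   # (00 + 11)/sqrt(2)
phi_minus = s * np.array([1, 0, 0, -1])   # (00 - 11)/sqrt(2)
psi_plus  = s * np.array([0, 1,  1, 0])   # (01 + 10)/sqrt(2)
psi_minus = s * np.array([0, 1, -1, 0])   # (01 - 10)/sqrt(2)

probs = np.abs(phi_plus)**2               # outcome probabilities for phi+
# Alice's qubit alone is random: P(Alice reads 0) = 1/2 ...
p_alice_0 = probs[0] + probs[1]
# ... yet the two outcomes always agree: only 00 and 11 ever occur
p_agree = probs[0] + probs[3]
print(round(p_alice_0, 10), round(p_agree, 10))   # -> 0.5 1.0

# The four states are orthonormal, so a joint ("Bell-state") measurement
# can distinguish all of them, while separate measurements cannot
bell = np.array([phi_plus, phi_minus, psi_plus, psi_minus])
gram = bell @ bell.T                      # matrix of pairwise overlaps
print(np.allclose(gram, np.eye(4)))       # -> True
```

Measuring either qubit separately gives a 50/50 coin flip whichever Bell state was prepared, which is why Bob learns nothing about Alice’s choice until he measures the pair jointly.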

Alternatively, instead of measuring the qubits separately, they can be measured together. For example, Alice might randomly choose one of the Bell states, prepare the two qubits in this chosen state, and then ask Bob to discover her choice just by measuring the qubits. If Bob measures them separately, he will not be able to say anything about Alice’s choice. Instead, he must measure them together, ideally with a Bell-state measurement, before he can answer Alice’s question.

A proper Bell-state measurement can distinguish between all four states and is termed “complete”. If the measurement not only determines Alice’s choice, but also preserves the entanglement between the two qubits, then it is termed “nondestructive”. Although Bell-state measurement is critical to many quantum processing tasks, it does not always need to be nondestructive (or even complete). Nonetheless, a measurement that has these characteristics will be a very useful tool for generating entanglement. Such measurements can also be used to save an already existing entanglement from environmental noise if implemented very efficiently and repeated quickly enough – a phenomenon known as the quantum Zeno effect.

A proper Bell-state measurement on distant atomic qubits

In their new study, which is published in Nature Photonics, Stephan Welte and his MPQ colleagues describe how they implement a complete non-destructive Bell-state measurement on two distant qubits. They encode each qubit in a rubidium atom trapped in a high-quality optical resonator. Then they prepare two very weak optical pulses with a specific polarization. Each pulse interacts with both qubits successively and after the interactions, its polarization is detected. The first pulse determines if the result of the Bell-state measurement on the two qubits is a φ-type Bell state or a ψ-type one and the second pulse determines if it is plus-type or minus-type.

Starting from the first qubit in this setup, each optical pulse travels through a 60-m optical fibre to reach the second qubit. While a similar measurement protocol had previously been implemented on qubits located very close to each other within IBM’s five-qubit chip, implementation on distant qubits is important for quantum networks where qubit memories sit on distant nodes of the network. “Conceptually, the work done on the IBM chip is similar to ours,” Emanuele Distante, a co-author of the MPQ study, explains. “The main difference is that we measure distant qubits connected via fibre in a network.”

Outlook for improvements and developments

In this work, the state of the two qubits after the measurement is not exactly the Bell state it should be. Instead, it has an average “fidelity” of about 67 percent – fidelity being a standard measure of the similarity between quantum states. While this value is not ideal, it is above the threshold guaranteeing that, although the two qubits are not maximally entangled, they are still entangled. Furthermore, a simulation of the system’s imperfections suggests the fidelity could be boosted to about 90 percent.
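For a pure target state such as |φ+⟩, fidelity reduces to F = ⟨φ+|ρ|φ+⟩. Here is a minimal NumPy sketch with a purely hypothetical white-noise model, chosen only so the number lands near the reported value – the experiment's actual error budget is different:

```python
import numpy as np

s2 = 1 / np.sqrt(2)
e = np.eye(4)
phi_plus = s2 * (e[0] + e[3])          # target Bell state |phi+>
target = np.outer(phi_plus, phi_plus)  # its density matrix

def fidelity(rho):
    # For a pure target state, F = <phi+| rho |phi+>
    return float(phi_plus @ rho @ phi_plus)

# Hypothetical noise model (illustration only): mix the ideal state
# with white noise; p = 0.56 gives F = 0.56 + 0.44/4 = 0.67.
p = 0.56
rho_noisy = p * target + (1 - p) * np.eye(4) / 4
assert np.isclose(fidelity(rho_noisy), 0.67)
```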

Another practical issue is that the optical pulses are sometimes not detected at the end of the experiment – an event that heralds the total failure of the measurement. This makes the scheme inefficient, but the researchers say the efficiency could be improved by reducing optical losses and making other technical improvements.

Alongside these improvements, the group say that measuring and entangling three or more qubits would be “a fascinating avenue” for future research. “Our measurement could be scaled up to a larger number of qubits,” Distante says. “Many nodes can in principle be connected and photons can fly by into optical fibre interacting with the atoms at the nodes. In this way, it could be possible in future to measure the entangled state of more connected qubits.”

Three steps to safer stereotactic radiotherapy

Want to learn more on this subject?

In recent years, stereotactic radiotherapy (SRT) has evolved into standard practice in radiation oncology. SRT of small targets using high dose per fraction with steep dose fall-off requires a comprehensive quality assurance programme to ensure that the prescribed dose is accurately delivered. However, dosimetry is still one of the major challenges faced by many clinical physicists when embarking on SRT.

In this educational webinar, Hui Khee Looe will touch on three important aspects of SRT:

  • Beam commissioning
  • Patient-specific plan QA
  • End-to-end tests

He will address the most frequently asked questions on each of these aspects and provide practical tips and step-by-step guides. Special focus will be placed on common pitfalls and how to avoid them. These will be illustrated by using real clinical cases to help understanding.

“Now is the time to understand more, so that we may fear less.” (Marie Curie)


Hui Khee Looe received his Master’s degree and PhD in physics from the University of Oldenburg. He is currently deputy head of the Department of Medical Physics at Pius Hospital, where he is responsible for clinical duties in the Clinics for Radiotherapy, Nuclear Medicine and Radiology. In addition, he is a university lecturer and leader of the research group Computational Methods in Modern Dosimetry at Carl von Ossietzky University. Hui Khee has published more than 40 peer-reviewed papers. His research activities focus on dosimetry under non-equilibrium conditions, the development of mathematical models for modern dosimetry, dosimetry in magnetic fields, and multi-dimensional dose measurements.

Icequakes and rogue waves: geoscientists and musicians interpret the sounds of the sea

This episode of the Physics World Weekly podcast looks at how geoscientists and musicians interpret the soundscapes of the oceans in terms of both science and art.

Our first guest is the geophysicist Rob Abbott of Sandia National Laboratories in the US. Earlier this year he led an expedition to the Arctic coast of Alaska’s North Slope, where his team used an undersea optical-fibre cable to listen to rumblings under the sea ice. He talks about detecting icequakes and possibly the ice-breaking activities of a whale, as well as the challenges of working in temperatures below –40 °C.

Next up is the geoscientist Rónadh Cox and the percussionist Cormac Byrne, who share a love of Ireland’s rugged west coast and the bodhrán – a handheld drum associated with Irish folk music. Cox, who is at Williams College in Massachusetts, describes how huge Atlantic waves shape the Irish coastline – often shifting giant boulders. Byrne explains how he teamed up with Cox and the musician Rónán Ó Snodaigh to create music inspired by ocean waves – which he performs for us on the bodhrán.

Copyright © 2026 by IOP Publishing Ltd and individual contributors