
Liquid metal antenna matches extreme curvature and deformation of moving organs

Biomedical sensors that actively monitor physiological signals from moving organs undergo large and cyclic deformations. To mechanically conform to the curvature of the organs and match their motion, such sensors typically feature thin designs to ease flexibility, and elastomer encapsulations to enhance stretchability.

To boost the mechanical performance of tissue-interfaced wireless biomedical devices, a research team at Singapore University of Technology and Design (SUTD) replaced the conventional metallic conductors in the antenna with Galinstan – a low-toxicity gallium alloy that is liquid at room temperature.

The 3D-printed microfluidic antenna, which the researchers describe in Advanced Materials, retains a high wireless power efficiency under extreme deformations and conformally adheres to fragile dynamic tissues, providing a thin, wireless sensing platform for health monitoring applications. “This technology will open up new opportunities for biological sensing, communication and therapeutics,” explains senior author Michinao Hashimoto.

Additively manufactured antennas

In contrast to existing liquid metal antennas, which are large, solid and monolithic, Hashimoto’s team envisioned a soft, non-monolithic structure. To achieve this, the researchers employed a pneumatic silicone sealant dispenser to print the outline of a microchannel on a 7 µm-thick Ecoflex substrate, referred to as an Ecoflex microsheet, which has similar elastic properties to biological tissues. After embedding a set of light-emitting diodes (LEDs) and a jumper wire into the device, they added a free-standing Ecoflex microsheet to seal the outline, resulting in an empty microfluidic channel.

To aid in the Galinstan injection process, the team used a rigid sacrificial layer of poly(vinyl alcohol) – a water-soluble polymer – as a removable mechanical support to enable smooth liquid metal flow in the microchannels, thereby closing the electrical circuit and powering the LEDs. The resulting fluidic antenna operates close to the standard near-field communication frequency (13.56 MHz) with low energy loss.

Resilient electromagnetics

For stretchable electronics that include solid antennas, wavy and serpentine conductor patterns help to increase the overall stretchability. Even so, the range is limited by the yield strain of the metallic conductor. Liquid metal, by contrast, has no yield strain to exceed, making it an enticing choice for sensors that will experience large deformations.

The researchers report that the Galinstan antenna can experience up to 200% tensile strain, match a 3 mm radius of curvature and withstand a 180° twisting angle while maintaining a low energy loss. Repetitive tensile strain tests revealed no degradation in the antenna’s energy efficiency or meaningful shift in its operating frequency, highlighting the stability of the design. They note that pressing the antenna, however, could cause irreversible microchannel collapse if the pressure exceeds 90 kPa.


To secure the Galinstan antenna onto soft tissues, the team coated the tissue–antenna interface with a 650 nm-thick layer of the bio-adhesive polydopamine. This increased the adhesion energy, removing the need for sutures that could injure the tissue. Experiments on a porcine small intestine and heart, and a chicken leg revealed that the surrounding environment detuned the antenna and lowered its efficiency. However, it maintained sufficient performance to power the device’s LEDs.

“We believe this work covers the need for minimally invasive implantation of biodevices on fragile tissues such as the brain, liver and kidney,” says first author Kento Yamagishi, who plans to use this technology for future in vivo implantable studies in small animals.

Smartphones could create distributed space weather observatory

The magnetometers found in some smartphones could allow devices to be used to create a distributed space weather observatory. That is the conclusion of NASA’s Sten Odenwald, who has studied the ability of four popular models of smartphone to detect small disturbances to Earth’s magnetic field.

Following the advent of the iPhone 3GS in 2009, which contains a 3-axis Hall-effect magnetometer chip, such sensors have become commonplace in smartphones and some other consumer devices. They are used by a range of smartphone apps from compasses to ferrous metal detectors, and there is growing interest in using magnetometers for navigation when GPS signals are not available – such as when the user is in a shopping mall.

While smartphone magnetometers are in widespread use, it had not been clear how such apps would be affected by disruptions to Earth’s magnetic field caused by solar storms – which are related to violent processes in the Sun such as coronal mass ejections.

Significant errors

Now, Sten Odenwald of the NASA Space Science Education Consortium has shown that significant errors in smartphone magnetometer readings can be caused by solar-storm level disruptions. While these errors have a negative effect on compass and positioning applications, Odenwald believes that this sensitivity could be harnessed to create a citizen science space-weather observatory that covers large swathes of the Earth.

Odenwald tested four popular smartphone models: the iPhone 6s, and Samsung’s Galaxy S9, Galaxy S8 and Galaxy Note 5. Each phone’s magnetometer was assessed in two ways. First, Odenwald created synthetic magnetometer data for two historical geomagnetic storm events by combining actual magnetic observatory data with each sensor’s typical noise profile. Second, each smartphone model was subjected to real magnetic fields generated by a Helmholtz coil to determine its absolute responsiveness to disturbances of about 1 µT. This is a per cent or two of Earth’s magnetic field strength, and is on a par both with the changes caused by solar storms and with the sensitivity of magnetometer apps.

Of the four smartphones tested, the most sensitive proved to be the iPhone 6s and the Galaxy Note 5, which detected the stronger storms at signal-to-noise ratios as high as 8.
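Odenwald’s exact pipeline is not reproduced in this article, but the synthetic-data idea is easy to sketch. The Python snippet below – with all numbers invented for illustration, not taken from the study – buries a smooth, storm-like ~1 µT excursion in white sensor noise and shows how simple time-averaging pulls it back out of the noise:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative numbers only -- not the values from Odenwald's study.
t = np.arange(6 * 3600)     # six hours of 1 Hz samples, s
noise_rms = 1.0             # assumed phone-sensor noise, µT
storm_amp = 1.0             # assumed storm disturbance, µT

# Synthetic storm: a smooth ~1 µT excursion peaking mid-record, standing
# in for the real magnetic-observatory data used in the study.
storm = storm_amp * np.exp(-0.5 * ((t - t.mean()) / 3600) ** 2)

# Synthetic smartphone record: storm signal plus white sensor noise.
reading = storm + rng.normal(0.0, noise_rms, t.size)

# Averaging over a 60 s window suppresses the white noise by ~sqrt(60),
# which is what makes the slow storm signature detectable.
win = 60
smoothed = np.convolve(reading, np.ones(win) / win, mode="same")

snr = smoothed.max() / (noise_rms / np.sqrt(win))
print(f"approximate signal-to-noise ratio: {snr:.1f}")
```

A real analysis would use each phone’s measured noise spectrum rather than white noise, but the principle – slow geomagnetic signals survive averaging while sensor noise does not – is the same.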

Auroral zone

Odenwald points out that the magnetometer chips used in each of the tested devices are of 2012–13 vintage, meaning that newer phone models may perform better. The simulated storm data showed that smartphone magnetometers are most susceptible at mid-to-high latitudes. This corresponds to the so-called auroral zone, where storms produce the largest geomagnetic changes. At latitudes lower than 38°, where geomagnetic changes are not as large, the effect would not be as pronounced.

“Under certain circumstances, geomagnetic storms could be a significant source of error in compass and other positioning applications,” Odenwald reports. “The changes can amount to several degrees of deviation in magnetic compass applications, and this is problematic for orienteering applications in which a stable field over many hours is needed for precise guiding.”

Large storms are relatively rare, however, especially now that we are in a quiet period of the 11-year solar cycle. Even during a solar maximum, large storms strike only once every few months. Indeed, the current lull meant that Odenwald had to simulate historical storms, rather than simply measure live events.

“Unique opportunity”

Beyond the disruption to smartphone apps, Odenwald says that the magnetometers’ sensitivity “presents a unique opportunity for scientific studies of global changes in the geomagnetic field during significant solar storm events”.

Geophysicist Ciarán Beggan of the British Geological Survey agrees but points out that ambient noise is a significant barrier. “It is potentially possible to use mobile phones to detect a geomagnetic storm – and there are projects like this already using phone magnetometers (like CrowdMag)”. He adds, “but the real issue is that it is not easy to distinguish natural signals from manmade noise or interference in particular, and that the phone magnetometers are really noisy in general, as shown [in Odenwald’s work]”.

“Magnetic cleanliness”

Beggan says, “For geomagnetism it is far better to have a few really good magnetometers with excellent signal to noise and good magnetic cleanliness that can detect subtle storms, than millions of poorer records, which are very noisy and prone to unknown interference. At the moment, I think phone magnetometers are just not sensitive enough for large storms.”

Responding to this, Odenwald told Physics World that he envisages a smartphone array being used for “educational purposes rather than scientific research”, allowing high school physics teachers to discuss space weather when they are teaching magnetism.

To this end, Odenwald is now looking to develop software that will combine hundreds of smartphone measurements taken across the US by physics teachers in 2023 to create a movie of the progress of a series of targeted geomagnetic storms across the country.

The study is described in Space Weather.

Was biggest black-hole merger more lopsided than previously thought?

A compelling alternative explanation for what astrophysicists believe is the largest black hole merger measured to date has been put forth by two astronomers in Germany. Alexander Nitz and Collin Capano at the Max Planck Institute for Gravitational Physics argue that the gravitational-wave signal GW190521 could instead have been produced by a stellar-mass black hole spiralling into a far larger body. If correct, their results could resolve a mystery surrounding the masses of both bodies in the merger and may lead to a better understanding of intermediate-mass black holes.

In May 2019, the LIGO and Virgo observatories detected gravitational waves originating from a black hole merger that created a new body weighing in at about 140 solar masses – making this the largest black hole merger seen so far. Subsequent calculations showed that the two black holes responsible for the GW190521 signal measured roughly 85 and 66 solar masses. Yet these values seem to be at odds with our current understanding of stellar evolution.

When giant stars of about 65–120 solar masses explode as supernovae, current theories predict that the explosions completely overwhelm the gravity holding the star together. As a result, no material is left behind after the explosion – not even a black hole. This should create a gap in the mass range of black holes formed from exploding stars; however, both objects involved in the GW190521 merger appear to fall inside this gap.

Fine tuning

To extract information about black hole masses from gravitational waves, astronomers must compare their observations to theoretical models, which incorporate prior assumptions about the physics. By selecting different priors and constraints, researchers can fine-tune their models to better fit their observations. In their study, Nitz and Capano used a different set of priors to assess the GW190521 signal than those originally used by LIGO–Virgo scientists.

The duo’s work yielded two possible solutions. In the first, both masses were similar to those originally calculated by LIGO–Virgo. In the second, however, they fell well outside the forbidden mass gap, to either side of it: measuring roughly 166 and 16 solar masses. The heavier object would be an intermediate-mass black hole, a class of object thought to form not in supernovae, but through black holes devouring matter or even other black holes. These currently elusive bodies lie between the black holes formed by collapsed stars and the supermassive black holes at the centres of galaxies – which can be millions or billions of times more massive than the Sun.
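The role of the priors can be illustrated with a deliberately crude toy model – not the duo’s actual analysis, which uses full waveform models. Here a bimodal likelihood over the mass ratio q = m2/m1 has one peak near comparable masses and one near a lopsided ratio, and switching between two hypothetical priors changes which peak dominates the posterior:

```python
import numpy as np

# Toy bimodal likelihood over mass ratio q = m2/m1, with one mode near
# q ~ 0.77 (comparable masses, like 85 + 66 Msun) and one near q ~ 0.10
# (a lopsided merger, like 166 + 16 Msun). All numbers are illustrative.
q = np.linspace(0.01, 1.0, 2000)

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

likelihood = gauss(q, 0.77, 0.08) + 1.2 * gauss(q, 0.10, 0.03)

# Two hypothetical prior choices over q:
priors = {
    "prior favouring comparable masses": q,         # weight grows with q
    "prior uniform in q": np.ones_like(q),          # extreme ratios stay in play
}

for name, prior in priors.items():
    posterior = likelihood * prior
    posterior /= posterior.sum()
    print(f"{name}: posterior peaks at q = {q[np.argmax(posterior)]:.2f}")
```

The first prior pushes the posterior towards the comparable-mass solution; the uniform prior lets the lopsided solution win – a cartoon of how prior choice can tip a multimodal inference one way or the other.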

If Nitz and Capano’s interpretation is correct, it could not only resolve the mass gap mystery; it would also make the GW190521 merger the first detected example of a stellar-mass body spiralling into an intermediate-mass black hole. The duo’s results may therefore offer compelling new clues about the properties of intermediate-mass objects and could enable astronomers to describe a greater variety of black hole mergers from future detections of gravitational waves.

The research is described in The Astrophysical Journal Letters.

Efficient optical rectenna could generate power from waste heat

Devices known as optical rectennas show considerable promise for renewable energy because they can harvest energy from heat and convert it into electricity. Their chief drawback is their low efficiency, which makes them impractical for large-scale use. Researchers at the University of Colorado, Boulder, US, have now found a way to boost this efficiency, paving the way for optical rectennas that can generate useful amounts of electrical power from waste heat.

Rectennas (short for rectifying antennas) consist of two parts: an antenna, which absorbs electromagnetic radiation, and a diode, which converts the absorbed energy into direct current. Optical rectennas, which convert electromagnetic fields at optical frequencies into electrical current, work most efficiently when made at micron or sub-micron scales – much smaller than devices that convert longer-wavelength radiation such as radio waves. The problem is that as these devices shrink, their electrical resistance grows, drastically limiting their power output.

Sidestepping the obstacle

Faced with this no-win situation, a Boulder team led by Amina Belkadi, Garret Moddel and Ayendra Weerakkody decided to explore whether they could sidestep the obstacle entirely. In a conventional rectenna, electrons must pass through an insulator, which adds resistance to the device, reducing the amount of electricity it can produce. In the new work, Belkadi and her colleagues decided, somewhat counterintuitively, to build a rectenna that uses two insulators instead. They did this because having a space between the insulators makes it possible for a structure known as a quantum well to form.

This quantum well contains discrete quasi-bound electronic states, and electrons that hit it with energies matching the states’ energy levels can then propagate through the structure without much resistance. This propagation process is termed resonant tunnelling, and it enables tunnelling electrons to produce a higher current compared to non-resonant electrons.
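Resonant tunnelling itself is textbook quantum mechanics, and a one-dimensional caricature is easy to compute. The transfer-matrix sketch below (in natural units ħ = m = 1, with a generic symmetric double barrier rather than a model of the team’s actual Ni/NiO/Al2O3/Cr/Au stack) shows transmission spiking towards unity at the quasi-bound-state energies while staying tiny elsewhere:

```python
import numpy as np

def interface(k1, k2, x):
    """Transfer matrix taking plane-wave amplitudes (A, B) across a
    potential step at position x (hbar = m = 1)."""
    p, m = 1 + k1 / k2, 1 - k1 / k2
    return 0.5 * np.array([
        [p * np.exp(1j * (k1 - k2) * x), m * np.exp(-1j * (k1 + k2) * x)],
        [m * np.exp(1j * (k1 + k2) * x), p * np.exp(-1j * (k1 - k2) * x)],
    ])

def transmission(E, regions):
    """Transmission through piecewise-constant potentials.
    regions: list of (V, left_edge); the first left_edge is unused."""
    ks = [np.sqrt(2 * (E - V) + 0j) for V, _ in regions]
    M = np.eye(2, dtype=complex)
    for j in range(len(regions) - 1):
        M = interface(ks[j], ks[j + 1], regions[j + 1][1]) @ M
    return 1.0 / abs(M[1, 1]) ** 2      # wave incident from the left

# Symmetric double barrier (height 1, width 2) enclosing a well of width 5;
# the well supports a few quasi-bound states below the barrier top.
regions = [(0.0, None), (1.0, 0.0), (0.0, 2.0), (1.0, 7.0), (0.0, 9.0)]

Es = np.linspace(0.01, 0.99, 8000)
Ts = np.array([transmission(E, regions) for E in Es])
peaks = [Es[i] for i in range(1, len(Ts) - 1)
         if Ts[i] > Ts[i - 1] and Ts[i] >= Ts[i + 1] and Ts[i] > 0.5]
print("resonances (units of barrier height):", [f"{E:.3f}" for E in peaks])
print(f"off-resonance, E = 0.40: T = {transmission(0.40, regions):.2e}")
```

Electrons arriving at a resonance energy sail through; all others are blocked – which is why a quantum well between two insulators can lower the effective resistance of the diode.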

Low resistance and high responsivity

While such behaviour had been predicted in theory, it had not previously been observed experimentally. To produce it, the researchers had to choose the materials for their rectenna carefully and then fabricate them with the correct thickness.

In their work, which they detail in Nature Communications, Belkadi, Moddel and Weerakkody studied rectennas made from Ni/NiO/Al2O3/Cr/Au metal-double-insulator-metal (MI2M) wafers. The quantum wells formed between the two oxides NiO and Al2O3. The researchers began by dividing each wafer into four Ni/NiO/Al2O3/Cr/Au diode batches. They then varied the thicknesses of the NiO layers (from 3 to 6 nm) to modify the depth and width of the MI2M quantum wells, while keeping the thickness of the Al2O3 at 1.3 nm.

They found that the devices responded better to light as the thickness of the NiO layer increased from 3 to 5 nm, yielding responsivities of 0.43 A/W at 3 nm, 0.52 A/W at 4 nm, and 0.59 A/W at 5 nm. The device’s resistance, meanwhile, dropped from 10 kΩ for the 3 nm structure to a minimum of 4 kΩ for the 4 nm structure before increasing to 50 kΩ at 5 nm. After balancing these trade-offs, and thanks to the electron resonant tunnelling effect, the researchers report that they were able to make a 0.035 μm²-sized device with a resistance of just 13 kΩ and a responsivity of 0.5 A/W.

Deeper quantum wells

To test their rectennas, the researchers placed a network of about 250,000 of them on a hotplate. They then increased the hotplate’s temperature and measured how much heat the devices were able to capture.

While the researchers found that devices in this configuration captured less than 1% of the hotplate’s radiation, they believe they can increase this figure by making the quantum wells in their devices deeper, which would allow more electrons to pass through. They would achieve this by using different metals and insulators. An optimized design could, they say, be used to harvest waste heat from sources ranging from power plants to industrial ovens.

More ambitiously, the researchers say that it might, in principle, be possible to capture the energy radiating from Earth itself using rectennas mounted on airships. Moddel, in particular, looks forward to the day when rectennas sit on top of everything, from solar panels on the ground to lighter-than-air vehicles in the air. “If you can capture heat radiating into deep space, then you can get power anytime, anywhere,” he says.

Members of the team now plan to find out whether the effect they observed can be replicated in other materials, as well as in Ni/NiO/Al2O3/Cr/Au wafers with different thicknesses. “The goal is to find materials with a quasi-bound state near to the metal Fermi level,” Belkadi explains. “The closer the better since this will allow for lower resistance and higher diode efficiency.” Finding materials with multiple quasi-bound states would also be desirable, she tells Physics World, as it might push the technology forward enough to make it commercially viable.

Solar mission propels tip/tilt systems into commercial applications

Today’s space probes represent a modern miracle of precision engineering. Take ESA’s Solar Orbiter, which in 2020 embarked on a 10-year mission to study the dynamics of the Sun by flying closer to the star than any previous spacecraft. Onboard the 1.7 tonne satellite are 10 separate instruments that have been carefully optimized to capture the most accurate scientific data while also surviving the extreme temperatures the satellite will be exposed to during its journey around the Sun.

One of those instruments, the Polarimetric and Helioseismic Imager (PHI), will measure the Sun’s magnetic field to investigate important dynamical features of both the solar surface and its interior, such as how energy is transported between different layers. “The PHI instrument measures the polarization of the incoming light,” explains Reiner Volkmer of the Leibniz Institute for Solar Physics, a member of the project team. “Two images are taken in succession in different polarization states, and one is subtracted from the other. This is done for different polarization directions to calculate the magnetic field strength and direction.”

For accurate results, each of the images must come from exactly the same position on the solar surface, which means that the mirror of the PHI’s high-resolution telescope must always be pointing in the same direction – despite the constant movement and vibrations of the spacecraft. “A movable optical element is needed in the light path of the telescope to correct for the jitter,” comments Volkmer, whose team was responsible for designing an image stabilization system for the PHI. “By correlating two sequential images we can calculate the deviation in position from one to the other, and the mirror is then moved to compensate for the deviation.”
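The correlation step is a generic image-registration technique, and a minimal version is easy to write down. This Python sketch – not the PHI flight code – estimates the whole-pixel shift between two frames from the peak of their FFT-based cross-correlation; a real stabilization loop would add sub-pixel interpolation and feed the result to the tip/tilt controller:

```python
import numpy as np

def estimate_shift(frame_a, frame_b):
    """Estimate how far frame_b is displaced relative to frame_a, in
    whole pixels, from the peak of their cross-correlation (via FFT)."""
    F = np.fft.fft2(frame_a)
    G = np.fft.fft2(frame_b)
    xcorr = np.fft.ifft2(np.conj(F) * G).real
    peak = np.array(np.unravel_index(np.argmax(xcorr), xcorr.shape))
    dims = np.array(xcorr.shape)
    peak[peak > dims // 2] -= dims[peak > dims // 2]   # unwrap FFT indices
    return peak

# Demo: a random "solar scene" jittered by 3 pixels down and 5 to the left.
rng = np.random.default_rng(0)
scene = rng.random((128, 128))
jittered = np.roll(scene, (3, -5), axis=(0, 1))
print(estimate_shift(scene, jittered))   # -> [ 3 -5]
```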

Piezoelectric control

The mirror must be moved quickly and precisely enough to correct for small vibrations in real time, and the solution chosen by the project team was a tip/tilt system that exploits piezoelectric actuators to dynamically adjust the angular position of the mirror in two orthogonal directions. Designed and developed by Physik Instrumente (PI), a German company that specializes in precision positioning technologies, the piezoelectric tip/tilt unit is light and compact enough to be integrated into the PHI instrument, and robust enough to endure both extreme temperatures and the mechanical shocks that occur during launch.

Inside the tip/tilt system are four piezo actuators arranged in pairs. All of the actuators control the movable platform at the same time, which ensures a common pivot point and delivers the same precision and dynamic properties in all directions of motion. This design scheme, known as parallel kinematics, is particularly important for achieving faster response times and for moving heavier loads – such as the relatively large mirror needed for the PHI’s telescope. “With a stacked solution, a serial kinematic, we would have different dynamic properties and angular errors in the x- and y-axes,” explains Lukas Rau, PI’s product marketing manager for piezo systems. “A piezo tip/tilt unit is also more compact because only one device is needed to control angular motion in the two axes.”

One of the big advantages of using piezoelectric actuators for space applications is that there is no relative movement of mechanical parts inside the system, which means there is no friction, no wear, and no need for lubrication or maintenance. The piezoelectric materials used for the actuators expand and contract in response to an applied voltage, while flexible joints provide a friction-free mechanism to guide the movement of the platform. A stictionless preload design integrated into the tip/tilt unit adds stiffness that protects the actuators against tensile forces when they move at high dynamic frequencies or through large angular deflections.

Under pressure

The stiffness of the piezo-based design also imbues the system with a high resonant frequency, which enables more dynamic movements and more stable control. “There is always a trade-off between operating frequency and the amount of angular travel,” explains Rau. “Devices with a larger angular motion range typically deliver lower dynamic frequencies, while faster systems need a stiffer design that allows only a smaller movement.”

PI now offers a wide range of commercial off-the-shelf mirror mounts, thanks to five decades of experience in supporting beam-steering and stabilization applications in aerospace, astronomy and space exploration. These include piezo-driven systems that can be optimized to deliver the dynamic properties and angular motion needed for a specific application, as well as magnetic voice-coil tip/tilt units that typically enable larger movements at slightly slower speeds.

For the PHI instrument, meanwhile, the piezo tip/tilt unit had to be specially designed to meet the exacting demands of the image stabilization system. “We searched the market for suitable devices, and selected one that was close to our requirements,” says Volkmer. “We then contacted PI to see if they could modify the component to meet our specifications.”

Custom design

The starting point was a commercial device, the S-340, but significant changes were needed to enable the tip/tilt system to move the PHI’s mirror at high enough frequencies to support image acquisition rates of up to 1200 frames per second. While the standard component offers an angular displacement of 2 mrad and a resonant frequency of 1 kHz for a load of 63 g, the device designed for the PHI had to move a mirror weighing 71 g through a displacement of ±295 µrad with a resonant frequency of 1.3 kHz. “Increasing the required resonant frequency even by just a small amount needs a different design,” comments Rau.

Further modifications were needed to ensure that the system could operate effectively in space conditions, particularly the extreme temperatures the Solar Orbiter will face as it approaches the Sun. “The cabling had to be adapted to cope with a wide temperature range, and we had to introduce new soldering techniques to meet the required NASA/ESA standards,” says Rau. During the eight years it took to design and build the image stabilization system, PI engineers worked closely with the Leibniz team to adapt the component design, ensure easy integration with the PHI’s optical system, and conduct the extensive testing needed to qualify the tip/tilt unit for use in space flight and launch. “It was a really good co-operation with PI,” comments Volkmer. “Development was needed on both sides, and they put in the effort to make the necessary changes.”

We are experts in motion and positioning, and we partner with our customers to find the best solution for their application.

Lukas Rau, PI’s product marketing manager for piezo systems

PI has now developed several bespoke tip/tilt systems for such one-off space projects, and Rau believes that the same technologies could soon find mass-market appeal for a new generation of space-based optical communications systems that are being planned to provide fast Internet connections anywhere in the world. “The Solar Orbiter is an extreme project that allows us to demonstrate our capabilities, but the business case for other space applications is becoming more attractive,” he says.

Rau points out that companies such as Amazon and SpaceX have started to deploy thousands of satellites into low-Earth orbit, creating constellations of spacecraft that will exploit laser beams to communicate with each other – and in some cases, the ground. One of the big challenges for these free-space optical (FSO) systems is sustaining the laser links between adjacent satellites as the constellation moves through space and the relative positions of the satellites keep changing. 

“The whole system is moving, there is vibration in each satellite, and there are also aberrations that arise from changes in humidity, temperature and turbulence in the atmosphere,” says Rau. “It’s a very dynamic system, and the position of the mirror needs to be corrected quickly enough to maintain the optical connections.” 

PI’s tip/tilt systems can achieve the fast dynamics needed to stabilize these optical data links, and are also small, light and cost-effective enough to be integrated into the much smaller satellites that will be deployed in these low-Earth orbit constellations. To improve precision, integrated sensors measure the motion of the platform to calibrate the device and eliminate the effects of hysteresis.

Industrial options

Other commercial applications exploit the highly dynamic angular motion that can be achieved with a tip/tilt unit to steer a laser beam across a surface. This scanning capability is widely used, for example, to inspect semiconductor wafers for defects or contaminants, or to move a light beam across a sample in an optical microscope. Another emerging application lies in laser materials processing, where a tip/tilt system can be used to oscillate a high-power laser beam on the workpiece, allowing faster cutting speeds while avoiding the build-up of heat that can sacrifice quality.

“The basic requirements for these different applications, whether in space or in industry, are often the same,” comments Rau. “They only allow a small space for integration, they require precise and highly dynamic beam steering in two axes, and they need friction-free operation for extra long uptime without the need for maintenance. There are not so many other solutions that you can use.”

Rau explains that PI works closely with its clients to design and supply tip/tilt systems that are optimized for their needs, whether for a bespoke project like the Solar Orbiter or for commercial applications that require volume production of large numbers of devices. “Many aspects need to be thought about to find the right product,” says Rau. “We are experts in motion and positioning, and we partner with our customers to find the best solution for their application.”

It’s topology, naturally

Ask a solid-state physicist to name the biggest discovery in the field over the past 50 years, and chances are the answer – if not high-temperature superconductivity – will be topological materials. These are materials in which bulk properties determine special behaviour along the surface or an edge, and they have profoundly changed the study of electrons in solids – as recognized by three Nobel prizes. Their earliest incarnations in physics have created a new standard for electrical resistance, and provided an independent means to determine the fine-structure constant, α, in quantum electrodynamics. These materials are also expected to bring great advances to information processing, resulting from topological electron behaviour in graphene, or the harnessing of topological qubits in quantum computing. Even exotic entities such as magnetic monopoles and Majorana particles, once a province of particle physics, have become the subject of solid-state topological study.

But now topology is going beyond the solid state. In the past few years, physicists have begun to realize that it could have a major role in fluid dynamics – the Earth’s oceans, for example, or plasmas, or the biological cells in fluid-like “active” matter. The research promises to provide a clear window through which to study what can otherwise be very murky areas of science. More importantly, it brings an entirely new area of mathematics to our understanding of the natural world; as well as to complex artificial systems of huge practical importance, namely fusion reactors. “Topology gives you a quick, direct way to see whether waves [in certain systems] exist,” says theorist Brad Marston of Brown University in Rhode Island. “Once you work through the logic, answers can come very simply.”

Topological research promises to provide a clear window through which to study what can otherwise be very murky areas of science

At the edge

Topology as a mathematical concept goes back a long time, at least to the 19th century. It describes properties of objects that do not change during continuous deformations. For example, a Möbius strip will always have one continuous edge and one continuous surface, no matter how much it is stretched or twisted. Similarly, a mug and a doughnut are topologically identical, if defined by the existence of a single hole. Topology is also what properly differentiates a line from a surface and, more broadly, what constitutes a dimension.

Physics, however, has not always been quick to see where topology matters. Indeed, the first topological property in the solid state was initially not even recognized as such. This is the quantum Hall effect, which concerns electron behaviour in a very thin (and therefore essentially 2D) semiconductor that is sandwiched between two other semiconductors. When a magnetic field is applied, the electrons in the 2D layer experience a perpendicular Lorentz force and begin to travel in tiny circles, similar to the orbits of bound electrons around atoms.

This is possible for all the electrons apart from those near the semiconductor’s edges, where there is not enough space for complete circles. Here, the electron trajectories are chopped off into semicircles – whenever each electron gets to the end of its semicircle, it zips straight along the edge itself. Thanks to the particular behaviour of these edge electrons, the edge itself becomes electrically conducting – unlike the interior of the semiconductor, which remains non-conductive.

The importance of the quantum Hall effect was immediately obvious after its discovery by the German physicist Klaus von Klitzing in 1980. As implied by the name, it stems from the conductance being perfectly quantized – a fact that has been exploited for the new international standard for electrical resistance (the ohm), and for precise determinations of α. Two years after von Klitzing’s discovery, the topological nature of the effect began to become apparent, thanks to work by the British theorist David Thouless. Both von Klitzing and Thouless won the Nobel Prize for Physics for their contributions, in 1985 and 2016, respectively.

Topological properties often involve behaviour in a certain dimension manifesting as something else – something much less trivial – in a lower dimension. As such, there is an intuitive sense in the quantum Hall effect being topological in origin: the straight, 1D trajectories the electrons take at the edges of the semiconductor can be viewed as projections of the 2D circles they take within. Topological properties sometimes also involve the breaking of one or more fundamental symmetries. In the case of the quantum Hall effect, the external magnetic field breaks time-reversal symmetry, such that solutions to the Schrödinger equation describing electron behaviour are no longer identical when time, t, is replaced by –t.
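As a textbook aside (not spelled out in the article itself), the broken symmetry can be made explicit. Writing the Schrödinger equation for a charge q in a magnetic vector potential A, and then applying t → –t together with complex conjugation, gives an equation of the same form but with A reversed:

```latex
i\hbar\,\partial_t \psi = \frac{\left(-i\hbar\nabla - q\mathbf{A}\right)^2}{2m}\,\psi
\quad\longrightarrow\quad
i\hbar\,\partial_t \bar\psi = \frac{\left(-i\hbar\nabla + q\mathbf{A}\right)^2}{2m}\,\bar\psi ,
\qquad \bar\psi(\mathbf{r},t) \equiv \psi^*(\mathbf{r},-t).
```

Since A → –A flips the magnetic field, the time-reversed wavefunction solves the problem with B reversed rather than the original one: a fixed external field leaves no time-reversal symmetry to protect counter-propagating states.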

Robust effects

But it is not only the conductivity, and the quantization of that conductivity, that makes the quantum Hall effect special. Another topological hallmark is the robustness of the conductivity in the face of defects, temperature changes and other forms of disruption: the phenomenon is “topologically protected”. It is for this reason that the quantum Hall effect is earmarked for applications in computing, including quantum computing using qubits, which are notoriously hard to sustain against environmental decoherence.

Despite Thouless’s insight, it was not until the 2000s that interest in topology in the solid state really took off. This was after the prediction, and subsequent discovery, of edge conductance in thin insulators in the absence of an external magnetic field. Electrical insulators in the bulk, topological insulators have surface states that conduct electrons extremely well. In 2D topological insulators, an effective magnetic field is generated by every electron as its spin interacts with its orbital motion. The result is a quantum spin Hall effect, wherein spin-up and spin-down electrons travel in opposite directions along an edge. Shortly after the discovery of 2D topological insulators, physicists found examples of topological insulators operating in 3D: for these, it is electron behaviour in a 3D volume that manifests as spin conductance over a 2D surface.

Beyond the solid state

Today, topology is a huge topic in physics. But all this topological research has had one feature in common. It is physics of the solid state, or at least physics involving a discrete, periodic lattice (if topological photonics, another emerging topic, is to be included). That all changed in 2017 when Brown University’s Marston, together with Pierre Delplace and Antoine Venaille at the University of Lyon in France, studied the fluids that make up the Earth’s atmosphere and oceans from a topological perspective.

They were particularly interested in certain, unusual waves in the atmosphere and oceans at the equator, known as Kelvin and Yanai modes. Geophysicists had first identified these waves in the 1960s, describing them as classical solutions to the wave equation, but there were still numerous unknowns about them. They are confined to the Earth’s equator; they are persistent; and they only travel eastwards, never westwards. A condensed-matter physicist by training, Marston had already spent a decade statistically simulating fluid aspects of the climate system at mid latitudes before he came across these strange equatorial waves. “I could have made this connection 10 years earlier if I had been paying closer attention,” he says now. “This is what happens when scientific fields fragment. Geophysicists don’t go to condensed-matter seminars, and vice-versa. They aren’t exposed to topological ideas.”

figure 1

Helped by Marston’s background, the three researchers immediately noticed that these equatorial waves had topological characteristics. First of all, the slim layers of the Earth’s atmosphere and oceans can be approximated as 2D. Second, the constituent air and water molecules are, like electrons in the quantum Hall effect, encouraged into circular motion by a force acting perpendicularly to that motion – in this case the Coriolis force (which results from the Earth’s rotation). Like the magnetic field in the solid-state phenomenon, the Coriolis force breaks time-reversal symmetry. It also forms a natural boundary, or edge, at the equator, deflecting air and water currents clockwise in the southern hemisphere, and counter-clockwise in the northern hemisphere. With such ingredients, says Marston, it is possible to ascertain fairly quickly via topology that persistent waves should exist at the equator, regardless of the finer details of the Earth’s climate and ocean systems (figure 1).

But how many waves? All topological properties have a characteristic integer value, the first Chern number (named after the Sino-American mathematician who identified it in the 1940s), which is one way to find how many unique states the property should give rise to. In their study, Marston and his colleagues calculated a first Chern number equal to ±2, corresponding to two edge modes. Using topology, and without having to solve any classical wave equations with ungainly polynomials, the researchers had proved that the equator should harbour two unidirectional equatorial waves – the Kelvin and Yanai modes (Science 358 1075). Both of these waves are known to influence the El Niño Southern Oscillation in the Earth’s ocean–atmosphere system, which in turn has a significant climate impact, causing incidents of drought or flooding in equatorial countries. According to Marston, topological insights could help researchers model El Niño and other oscillations, which have hitherto been sources of high uncertainty in climate predictions.
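The calculation behind that ±2 can even be sketched numerically. Assuming the linearized f-plane shallow-water matrix used by Delplace, Marston and Venaille for the variables (u, v, η), the Python sketch below sums the Berry flux of one wave band over a large patch of the k-plane using the gauge-invariant Fukui–Hatsuda–Suzuki lattice method. The magnitude of the result, 2, counts the two edge modes; its sign depends on orientation conventions:

```python
import numpy as np

def H(kx, ky, f=1.0, c=1.0):
    """Linearized rotating shallow-water matrix for (u, v, eta), after
    Delplace, Marston & Venaille (Science 358 1075)."""
    return np.array([[0.0,     1j * f, c * kx],
                     [-1j * f, 0.0,    c * ky],
                     [c * kx,  c * ky, 0.0   ]])

def chern(band, K=30.0, N=300):
    """Berry flux of one band summed over [-K, K]^2 on an N x N grid,
    using the Fukui-Hatsuda-Suzuki plaquette method; the sum converges
    to the Chern number as the patch grows."""
    ks = np.linspace(-K, K, N)
    vecs = np.empty((N, N, 3), dtype=complex)
    for i, kx in enumerate(ks):
        for j, ky in enumerate(ks):
            vecs[i, j] = np.linalg.eigh(H(kx, ky))[1][:, band]
    flux = 0.0
    for i in range(N - 1):
        for j in range(N - 1):
            loop = (np.vdot(vecs[i, j], vecs[i + 1, j])
                    * np.vdot(vecs[i + 1, j], vecs[i + 1, j + 1])
                    * np.vdot(vecs[i + 1, j + 1], vecs[i, j + 1])
                    * np.vdot(vecs[i, j + 1], vecs[i, j]))
            flux += np.angle(loop)
    return flux / (2 * np.pi)

# Bands 0, 1, 2 are the negative-, zero- and positive-frequency waves.
print("Chern number of positive-frequency band:", round(chern(band=2)))
# |C| = 2: two unidirectional equatorial edge modes (Kelvin and Yanai).
```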

Active flow

Inspired by a talk Marston gave on the results, a group led by Cristina Marchetti at Syracuse University, New York, swiftly explored topology in a very different type of fluid system: active matter. Such matter consists of any collection of units that are able to propel themselves by some means. There are numerous examples in the natural world, from flocks of birds to bacterial suspensions and migrating cells. Active matter often flows in thin layers, for instance through skin or the lining of the gut. And, like the Earth’s atmosphere, these layers often have a curvature – in the case of the eye, even an Earth-like spherical geometry. The difference with the Earth’s atmosphere is that biological active matter is not subject in any significant way to a perpendicular external force such as the Coriolis force. But the fact that the matter is active and propels itself breaks time-reversal symmetry.

In their analytical computation, Marchetti and colleagues found that the combination of active flow and curved geometries supports topological waves around the equator of a sphere, or the neck of an hourglass-shaped object known as a catenoid (figure 2). In a very general sense, the researchers say, these topological waves provide highways for information transfer within a flock, which are robust against disorder and backscattering (Phys. Rev. X 7 031039).

figure 2

Marchetti’s group was not considering any particular type of active matter, and Suraj Shankar, a former member who is now based at Harvard University, US, is at pains to say that biology is too complex to speculate about the importance of the effect for particular systems. However, he does point out that some types of active matter are already known to harbour waves. For instance, the cells involved in wound healing transmit stress waves, which scientists believe may be involved in guiding the tissue’s growth and maintaining its integrity until the wound is fully healed. Given that wounds often occur on curved parts of the body, says Shankar, “it is not inconceivable that you end up with situations where these waves have a topological character”. Although experiments with active matter are difficult, and those on curved surfaces doubly so, he adds that other groups are already trying to observe the tell-tale signs of topology in active-matter systems.

Fluid topology

More and more research is turning to topology in fluids. A group led by Anton Souslov at the University of Bath in the UK, for example, has shown that in an active-matter system comprising chiral units, it is possible to design an interface that topological waves can flow into but not out of – a phenomenon that could be exploited to build perfect absorbers for soundproofing (arXiv:2010.07342). Meanwhile, Delplace, Venaille and Manolis Perrot at the University of Lyon have used topology to predict new types of trapped acoustic-gravity waves, known as Lamb waves. Observations of such Lamb waves could give us insight into the otherwise-invisible details of the stratification of distant planetary atmospheres, they say (Nature Phys. 15 781).

But perhaps the most intriguing area of this new fluid topology, from an application point of view, is plasmas. For these insights, we can thank a chance meeting of Marston with an old friend, Jeff Parker, a theoretical plasma physicist at Lawrence Livermore National Laboratory in California. The duo crossed paths at a fluid-dynamics workshop at the Aspen Center for Physics, Colorado, in 2017. By his own admission, Parker at the time knew very little about topology and the quantum Hall effect, but on hearing about Marston’s early work on geophysical waves he immediately saw similarities to models used in plasma physics. “I thought if it applies to fluids, it applies to plasmas,” he says. “It was intriguing to me – I was excited to be one of the first to look into it in plasmas.”

I thought if it applies to fluids, it applies to plasmas. I was excited to be one of the first to look into it in plasmas

Jeff Parker

Plasma waves

Marston and Parker began to work together, along with Ziyan Zhu at Harvard University and Steven Tobias at the University of Leeds in the UK. In a first foray into the topology of plasma physics, the researchers considered a system in which a magnetic field confines a plasma to a long cylindrical geometry. They found that, according to the topology of the system, a surface wave of electrons – what they term a “gaseous plasmon polariton” – should manifest along the outer boundary, where the plasma abruptly peters out into a vacuum (Phys. Rev. Lett. 124 195001). They have arranged to test their prediction at the Basic Plasma Science Facility at the University of California Los Angeles once COVID-19 restrictions have eased.

Having tested the waters of plasma topology, the same researchers, without Zhu but including Joshua Burby at Los Alamos National Laboratory in New Mexico, considered a more exciting system: the tokamak. In any radial cross-section of this doughnut-shaped geometry, contour lines can be drawn to represent the magnetic shear (where the magnetic field changes direction) in the ions making up the plasma. Parker, Marston and colleagues found that a magnetic shear varying from negative to positive from one slice to the next indicates the presence of a boundary with a Chern number of +1 on one side and –1 on the other. The ions at this boundary will therefore harbour a topological wave (Phys. Rev. Research 2 033425) (figure 3).

figure 3

In fact, experimental plasma physicists are already aware of such waves. They are known as reversed-shear Alfvén eigenmodes (RSAEs), and are something of a double-edged sword. On one hand, they can resonate with the products of a fusion reaction – alpha particles – causing them and their associated energy to be ejected from the confinement before they can be harnessed for fusion power. On the other hand, their characteristics can betray details about the magnetic-field structure of the plasma in the middle of a tokamak, where extreme conditions make it difficult to perform conventional diagnostics. Retrospectively predicting the existence of RSAEs, as Parker, Marston and colleagues have done, does not alter any of this physics. As in the case of equatorial waves in the Earth’s atmosphere and ocean systems, however, the result provides the tantalizing prospect of more topological insights into tokamaks, which can currently only confine plasmas from minutes to hours before they fizzle out.

Might it be possible that, with more study, physicists could find the tokamak conditions that deliver a topologically protected state of fusion – in other words, one that can be sustained indefinitely? “I think it’s a little way off, but that’s the dream,” says Parker. He believes the goal is analogous to exploiting the quantum Hall effect for stable qubits in quantum computing. That is, a goal that is speculative, but reasonable. Certainly, the physics of topological fluids is only just getting started. With applications already ranging from the Earth’s climate to active matter to plasmas, who knows what it will illuminate next?

Researchers analyse carbon footprint of planned neutrino experiment

The annual greenhouse-gas emissions of a planned neutrino experiment could be on a par with the production of 1000 cars. That is according to a new analysis, which finds that the main sources of emissions of the Giant Array for Neutrino Detection (GRAND) project differ between the construction and operation phases of the project. The authors say their work is the first published assessment of the greenhouse-gas emissions of a large-scale physics experiment and provides a methodology that could be used for future facilities (Astroparticle Physics 131 102587).

First mooted in 2015, the GRAND project aims to detect ultrahigh-energy neutrinos originating from deep space using 200,000 antennas spread across mountainous regions around the world. A small-scale prototype of the project started last year but its operation was severely impacted by the coronavirus pandemic. A mid-scale stage of the experiment is planned for 2025 before the full-scale experiment comes online in the 2030s.

Large amounts of data can actually lead to a huge carbon footprint

Kumiko Kotera

The researchers examined the global greenhouse-gas emissions across the three stages of the experiment, focusing on three sources: travel; digital technologies, such as computers, numerical simulations and data storage; and hardware equipment, including the manufacture and transport of the radio antennas.

They find that the prototype stage is expected to produce 482 tonnes of CO2 equivalent (CO2e) annually, with emissions during the mid-scale stage more than doubling to 1061 tonnes of CO2e per year. The full-scale experiment is estimated to have annual emissions of 13,385 tonnes of CO2e – a 12-fold jump. The researchers estimate that such an output is akin to the emissions from almost 8000 return flights from France to Western China or the manufacturing of 1000 cars.

The study estimates that in the prototype phase, digital technologies and travel will dominate emissions, accounting for 69% and 27%, respectively. Once the main project starts, however, emissions from travel will drop to just 7%. The bulk of emissions will then be shared between hardware equipment (48%) and digital technologies (45%). In the mid-scale phase, the three sources are expected to contribute almost equally. The collaboration says that it will now be producing a green policy, which members will be encouraged to follow.

Environmental impact

Physicist Kumiko Kotera from Sorbonne University in Paris, who co-founded the GRAND project and co-authored the report, told Physics World that they were surprised by the impact of digital technologies. The report even found that sending hard drives by plane could produce lower emissions than online data transfer. “We believe that people in general are more aware of the emissions due to travel and hardware equipment production but tend to forget that large amounts of data can actually lead to a huge carbon footprint,” she explains. Kotera adds that these emissions could also be hard to tackle, but technological advances and changes in electricity sources are making progress in the right direction.

Kotera expects similar studies to become commonplace in the future, adding that they have been approached by other scientists who are planning to assess their own experiments. “Large-scale physics and astrophysics experiments gather a large fraction of the scientific staff and absorb a significant volume of the science budget,” says Kotera. “As such it seems essential to assess their environmental impact.”

Crystalline supermirrors cut optical losses

Low-loss, highly reflective “supermirrors” – that is, those that scatter and absorb very few photons – are a key technology for many research fields and are used to make optical resonators for a variety of optics and photonics applications. Until now, however, it has been challenging to make such mirrors for the technologically important mid-infrared (mid-IR) range of wavelengths between 3 and 8 microns.

Researchers at the University of Vienna, Austria; its spin-off Thorlabs Crystalline Solutions; and the US National Institute of Standards and Technology (NIST) have used a new optical coating technology to construct such a mirror from single-crystal gallium arsenide/aluminium gallium arsenide. The mirror has a record-low excess optical loss below 10 parts per million and might find use in applications such as cancer detection, environmental sensors and high-resolution spectroscopy.

Less than 10 in a million photons

The supermirror made by Georg Winkler, Lukas Perner and colleagues is extremely smooth and contains low levels of contaminants. It absorbs and scatters fewer than 10 in a million photons from its surface. In comparison, the researchers note that an ordinary bathroom mirror loses around 10,000 times more photons, while even high-quality mirrors used in top-class research lose 10 to 100 times more.

Rather than directly depositing the mirror components onto an optical substrate, as is usually the case when fabricating such structures, the researchers made crystalline coatings using a new epitaxial layer transfer process. In their experiments, they deposited alternating multilayers of high-refractive-index gallium arsenide (GaAs) and low-refractive-index ternary aluminium gallium arsenide (AlxGa1-xAs) alloys. They then transferred these monocrystalline multilayers onto curved silicon optical substrates to make the mirrors. This crystalline coating was developed by Thorlabs Crystalline Solutions.

A new loss mechanism

To determine the optical losses of their mirrors, Winkler, Perner and colleagues directed a beam of weak laser light onto them and measured how much light they absorbed. At low levels of light absorption, they made use of a battery of optical measurements, including cavity ring-down and transmittance spectroscopy, as well as direct absorption tests.
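Cavity ring-down itself reduces to a simple relation: light leaking out of a two-mirror cavity decays exponentially, and the decay time τ fixes the total loss per round trip, 2L/(cτ). The sketch below uses invented example numbers – not the parameters of the Vienna–NIST cavity – to show how a measured τ and an independently known mirror transmission yield the excess scatter-plus-absorption loss:

```python
import numpy as np

C = 299_792_458.0           # speed of light, m/s

# Assumed example numbers, not those of the actual supermirror cavity:
L_cav = 0.30                # cavity length, m
tau = 100e-6                # measured ring-down time, s
T_mirror = 5e-6             # independently measured mirror transmission

round_trip_time = 2 * L_cav / C
total_loss = round_trip_time / tau          # fractional loss per round trip
per_mirror = total_loss / 2                 # two mirrors share the loss
excess = per_mirror - T_mirror              # scatter + absorption only

print(f"loss per round trip : {total_loss * 1e6:.1f} ppm")
print(f"loss per mirror     : {per_mirror * 1e6:.1f} ppm")
print(f"excess (scatter+abs): {excess * 1e6:.1f} ppm")
```

With these example numbers the excess loss comes out at 5 ppm per mirror – the sort of single-digit ppm figure the team reports, which illustrates why ring-down times must be measured so precisely.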

During these measurements, they also uncovered a previously unobserved loss mechanism: a very distinct and unexpected change in absorption loss that depends on the polarization of the incident laser light. According to Hartwin Peelaers, a condensed-matter physicist at the University of Kansas, US, who also worked on the mirror, this effect could come from the anisotropic elasticity of the crystals when they are strained.

The researchers, who report their work in Optica, say they are now developing ways to expand the optical bandwidth of their mirrors and improve their reflectivity even further so that they can be used in different applications. One promising area is high-resolution molecular spectroscopy, which is used to detect minute amounts of substances in gas mixtures – including extremely small concentrations of biomarker molecules in the exhaled breath of patients with early-stage cancer. Other possibilities include precisely detecting leaks of greenhouse gases such as methane from large-scale natural gas production plants and measuring light-matter interactions in fundamental physics research.

Why peregrine falcons wear eyeliner, golden eagles could accelerate using turbulence

This edition of the Red Folder has gone to the birds – or more precisely, to the raptors.

First up is the news that peregrine falcons have evolved the natural equivalent of eyeliner to help them hunt.

Fans of American football know that players will smear dark makeup below their eyes to reduce glare when they are trying to catch fast-moving balls. Peregrine falcons have similar patterns of dark feathers below their eyes – called malar stripes – and it had long been thought that they perform a similar function. However, definitive proof had been lacking.

Now, Michelle Vrettos, Chevonne Reynolds and Arjun Amar of the University of Cape Town have found good evidence that these stripes did indeed evolve to reduce glare. The trio studied over 2000 photographs of peregrines taken in 94 locations around the world by citizen scientists. They found a strong correlation between the strength of sunlight in a location and the size and darkness of malar stripes of birds at that location.

“The solar glare hypothesis has become ingrained in popular literature, but has never been tested empirically before,” says Vrettos. Amar points out that the falcons were ideal subjects for such a study because they are widespread throughout the world and therefore have evolved in a wide range of sunlight conditions.

They describe their research in Biology Letters.

Fly like an eagle

The combination of lightweight electronics like GPS and ubiquitous mobile phone networks has been a boon to researchers studying the behaviour of animals in the wild. In a recent study, scientists in the US used such tagging technology to closely monitor the flight of a golden eagle as it travelled along the Appalachian Mountains from Alabama to New York. As well as logging the bird’s location, the instruments measured its altitude, groundspeed and triaxial acceleration.

After taking local wind conditions into consideration, Kasey Laurent and Gregory Bewley at Cornell University and colleagues found that the eagle’s inflight accelerations have a highly irregular, fluctuating pattern that resembles particle trajectories in turbulent flows. The team then used a simple linear model to explain the relationship between the eagle’s acceleration and the intensity of the turbulence it experienced.
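The paper’s actual model relates the measured accelerations to turbulence statistics along the flight path; as a purely illustrative stand-in, the sketch below generates synthetic flight segments in which rms acceleration grows linearly with a turbulence-intensity proxy and recovers the slope by least squares:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stand-in data (illustrative only -- not the PNAS measurements):
# a turbulence-intensity proxy for each flight segment, and an rms
# acceleration that grows linearly with it plus measurement scatter.
turbulence = rng.uniform(0.2, 2.0, 200)             # arbitrary units
accel_rms = 0.8 * turbulence + 0.3 + rng.normal(0, 0.1, 200)

# Least-squares fit of the simple linear model accel = a*turbulence + b.
A = np.vstack([turbulence, np.ones_like(turbulence)]).T
(a, b), *_ = np.linalg.lstsq(A, accel_rms, rcond=None)
print(f"fitted slope a = {a:.2f}, intercept b = {b:.2f}")
```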

Instead of being a hindrance to efficient flight, the team believes that eagles could use turbulence as a source of energy while flying. They also say that the performance of some aircraft in turbulence could be improved by studying the flight of birds such as the golden eagle.

The study is described in PNAS.

Reinforcing the case for independent QA in the radiation oncology clinic

In radiation therapy, patient safety starts and ends with independent QA, a comprehensive programme of machine, patient-specific and workflow checks for identifying – and mitigating – human and system errors in an expanding universe of complex treatment variables. By putting a forensic focus on the quality and accuracy of treatment planning, delivery and management, independent QA gives the radiation oncology team confidence that treatment is being delivered to the tumour site as intended while minimizing collateral damage to healthy tissues and organs at risk (OARs).

Here, Physics World talks to Jeff Kapatoes, senior director for regulatory and research at Sun Nuclear Corporation, a US-based manufacturer of QA solutions for radiotherapy and diagnostic imaging providers, about the enduring – indeed growing – importance of independent QA for patient safety, continuous improvement and clinical innovation in the radiation oncology workflow.

Why is independent QA fundamental to the successful delivery of radiotherapy treatments?

Independent QA provides an essential audit of the evolving radiotherapy delivery system, complementing the integrated “self-checks” on the treatment machine. There will always be residual risk from unforeseen failure modes in today’s complex treatment systems – a risk that is best addressed through independent QA to avoid any conflict of interest. Put simply: QA that is not independent is a self-check, and self-checking is inherently biased and driven by familiarity contamination or “group think” – both in design and risk assessment.

What’s more, as radiotherapy systems become more interoperable – knitting together hardware and software subsystems from multiple vendors – the likelihood of testing and verifying every configuration as part of a comprehensive self-check becomes ever-more remote. The bottom line: independent QA not only maintains the desired quality of treatment delivery, it drives continuous improvement in patient safety by rooting out systematic errors and simultaneously highlighting opportunities for machine/workflow optimization.

So independent QA and continuous improvement go hand-in-hand?

Correct – though that improvement proceeds along a couple of distinct coordinates. Upstream patient safety, for example, is shaped by the quality of the core radiotherapy technology.


Progress has been rapid in this regard over the past decade, with wide-scale deployment of advanced modalities – including MR-guided radiotherapy (MR/RT), stereotactic radiosurgery/stereotactic body radiotherapy (SRS/SBRT) and volumetric modulated-arc therapy (VMAT) – all of which enhance treatment delivery and dose distribution accuracy when implemented correctly.

Meanwhile, independent QA (or downstream patient safety) is shaped by our ability to monitor and improve the quality of radiotherapy treatments. In other words: to ensure that the treatment system is delivering radiation to the patient as intended. Here too, capabilities are progressing dramatically, with once abstract concepts such as anatomical dose verification and in vivo monitoring becoming routinely available, improving our ability to detect myriad intra- and interfraction patient and machine errors.

In what ways do independent QA vendors like Sun Nuclear and its peers benefit the medical physics team?

Sun Nuclear’s QA solutions are used in many ways to deliver enhanced treatment outcomes and improved operational efficiencies in the radiation oncology clinic. That could mean commissioning a new treatment machine or clinical workflow; daily, weekly or monthly machine QA checks; as well as all aspects of patient-specific QA. In short, we give medical physicists the QA tools they need to do their job better as the independent auditors of radiation treatment and patient safety.

As such, Sun Nuclear is part of the collective QA conversation with the medical physics community, whether that’s requirements-gathering at scale for our product development roadmap or collaborating with clinical scientists directly on advanced QA technologies. It’s also worth noting how a dynamic and innovative QA ecosystem supports the commercial objectives of the radiotherapy equipment manufacturers, providing them with independent insurance and security regarding the safety of their treatment systems in the clinic.

What role does independent QA play in early-stage R&D and clinical translation of next-generation radiotherapy technologies?

QA tools from Sun Nuclear and other product vendors are fundamental for successful technology innovation and clinical translation of advanced radiotherapy modalities. The commercial introduction of the Elekta Unity MR-linac is a case in point. Back in 2012, Sun Nuclear supported the early-stage clinical evaluation of this pioneering MR/RT breakthrough, providing an MR-compatible ArcCHECK (a 4D diode array for patient QA) to Elekta’s university research partner UMC Utrecht in the Netherlands.

Easing adoption

Right now, we have around a dozen academic partners and university research hospitals using our products for preclinical research on novel treatment delivery systems. At the same time, established radiotherapy system manufacturers, as well as new-entrant technology start-ups, commonly deploy a suite of our QA solutions across the organization to support product development, manufacturing and their installation and service teams.

Do you see any threats to the culture of best practice and continuous improvement regarding patient safety in radiotherapy?

As one of several companies focused exclusively on independent QA, we see various structural “issues arising” that give cause for concern – and pause for thought. For starters, there’s a concerted move by radiotherapy equipment makers to integrate more and more downstream patient safety checks into their delivery systems. While there are operational benefits for sure, self-checking on its own is not enough and should not be a replacement for a rigorous, independent QA programme. Today’s radiotherapy systems are so complex that it is simply not possible to mitigate every potential risk internally – hence the need for independent evaluation and verification as a default setting.

There are also concerns around data – especially open access to the data generated by the treatment planning system, the linac (machine log files) and electronic portal imaging device (EPID) during treatment. While there have already been attempts to restrict access and monetize this data, it’s worth restating that the data is “owned” by the clinic and must remain freely accessible to enable independent analysis of radiotherapy fulfilment. A research study presented at the ESTRO 2020 Meeting by Iridium Kankernetwerk, Belgium, is instructive in this regard, with independent downstream QA – predicated on full data access at a single clinic over the course of two years – identifying 4000 actionable errors and opportunities for improvement within 56,000 delivered fractions. Just one example among many of the role that independent QA plays in the detection of random and systemic errors, the prevention of future errors, and the identification of opportunities for improved treatment delivery.

Where next for independent QA?

The independent QA community is in robust health. In fact, two of our more recent innovations – the SunCHECK platform (for integrated machine and patient QA) and SRS MapCHECK (a high-density diode array for SRS patient QA) – have seen tremendous clinical uptake because of the efficiency gains they provide when embedded in the treatment workflow. Proof-positive of the enduring demand for independent QA solutions – even with the blurring of boundaries that follows from more integrated self-check features being rolled out by the radiotherapy equipment vendors.

The competition is welcome in any case and will only accelerate product innovation and the introduction of enhanced QA solutions for medical physicists and the cross-disciplinary radiation oncology team. Right now, for example, there are exciting opportunities taking shape around automation, integration and machine learning within current treatment modalities. Longer term, innovative QA products and services will also be needed to support the clinical translation of next-generation treatment technologies such as FLASH radiotherapy and biological IGRT.

 
