Global audience tunes into online APS March Meeting

What a difference a year makes. In 2020 the APS March Meeting was one of the first casualties of the worsening COVID-19 pandemic, with the event cancelled just hours before it was due to get under way. This year, like all other major scientific conferences, the March Meeting will be convened online – enabling physicists in all parts of the world to explore the latest research breakthroughs and technical innovations.

The APS is expecting more than 11,000 scientists and students to log on for the online event. The main draw will be the scientific programme, with parallel sessions running from Monday 15 March to Friday 19 March, which will be complemented by pre-meeting tutorials and short courses, a series of Industry Days, and events designed specifically for undergraduate students to present their research, learn about career options, and connect with the scientific community.

A virtual exhibit will run from Monday through to Thursday, with company representatives available to discuss their products from 12.00 p.m. to 3.00 p.m. (Central Time). A few of the new product innovations that will be presented at the exhibit are highlighted below.

Superconducting magnet system offers more efficient cooling

The DryMag superconducting magnet system from Lake Shore Cryotronics

The Janis DryMag 1.5 K superconducting magnet system from Lake Shore Cryotronics offers a more cost-effective solution for low-temperature materials research. Providing a continuous temperature range of 1.5 K to 420 K – even with the magnet operating at full field – the cryogen-free system now enables an easy shift between in-plane and out-of-plane measurements by providing both a 2D vector-field magnet configuration and 0–90° precision sample rotation.

With an initial cooldown time of less than 24 hours (with the 9 T magnet), the system offers accurate and simultaneous temperature control of the sample mount and the surrounding helium exchange gas. The sample is located in static helium exchange gas for efficient cooling, regardless of sample material or shape.

In addition, the system is available in optical and non-optical geometries, and with a sample-in-vacuum configuration. An optional electrical transport measurement package is available, which includes Lake Shore’s M91 FastHall measurement controller for much faster, more precise Hall measurements. This controller offers measurement times up to 100 times faster than typical Hall systems, particularly when measuring low-mobility materials.

For more information about the DryMag system, visit lakeshore.com/DryMag.

Integrated unit simplifies the generation of complex waveforms

Tabor P90612B

The new Proteus RF arbitrary waveform generators/transceivers from Tabor Electronics allow complex pulse shapes and phases to be easily created up to 8 GHz, eliminating the need for cumbersome IQ modulator/oscillator set-ups. With an optional high-speed digitizer, the Proteus can be converted into an arbitrary waveform transceiver to provide closed-loop measurement capability within a single unit and programming environment.

A variety of interpolators, IQ modulators and numerically controlled oscillators are integrated into each channel, allowing complex RF signals to be generated directly from the Proteus instrument. This integrated approach eliminates the limitations of external IQ modulators and mixers, such as IQ mismatch and in-band carrier feed-through. Each channel is coherent and fully independent, making it possible to produce multichannel time/phase aligned signals, or signals of different frequencies and characteristics, from a single unit.

The Proteus RF AWG exploits a high-speed PCI Express interface that provides data transfer speeds of up to 64 Gb/s. It offers 16 GS of memory, allowing even long and complex waveform sequences to be downloaded quickly, while reducing the waveform size through interpolation by up to a factor of eight saves even more experimental set-up time. For applications requiring more memory or real-time changes to the waveform, the Proteus arbitrary waveform transceiver allows waveforms to be streamed from disk to instrument at speeds of up to 6 GS/s.
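
As a rough back-of-the-envelope illustration of what the interpolation buys – assuming 16-bit samples, a figure not quoted above – filling the full waveform memory with and without ×8 interpolation compares as follows:

\[
t_{\mathrm{no\ interp}} \approx \frac{16\,\mathrm{GS} \times 16\,\mathrm{bit}}{64\,\mathrm{Gb/s}} = \frac{256\,\mathrm{Gb}}{64\,\mathrm{Gb/s}} = 4\,\mathrm{s},
\qquad
t_{\times 8} \approx \frac{2\,\mathrm{GS} \times 16\,\mathrm{bit}}{64\,\mathrm{Gb/s}} = 0.5\,\mathrm{s}
\]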

Three form-factor versions are available to suit the needs of the experiment: the modular PXIe format offers unlimited time-aligned channels with the fastest data transfer rates, while the desk and bench versions offer up to 12 fully independent channels.

Integrated platform delivers quantum control

Quantum Machines hardware

New from Quantum Machines (QM) is the Quantum Orchestration Platform (QOP), a complete hardware and software solution that allows even the most complex quantum algorithms to be run on any quantum processor. Unlike classical computation, where the computer logic is embedded within the processor itself, quantum processors have no built-in logic. Logic operations are instead performed by sending high-frequency pulses to the quantum processor from tailor-made classical hardware.

To address this challenge, QM has developed an integrated solution for controlling quantum systems. At the core of the solution is the OPX, the hardware element of the QOP, which incorporates multiple waveform generators, digitizers and processing units that are all integrated on a single FPGA with a unique and scalable design. Inside the OPX is a dedicated pulse processor that allows for advanced multi-qubit manipulation, quantum error correction, and full system scaling.

The OPX is designed to be easily programmed using QUA, a powerful yet intuitive programming language, which enables the most complex experiments and algorithms to be run quickly and easily. With seamless compatibility and powerful capabilities such as ultra-low feedback latency and general control flow, the platform delivers real-time processing to speed up experiments by as much as an order of magnitude.

Founded by leading quantum researchers, Quantum Machines partners with development teams at the forefront of quantum computing, including multinational corporations, start-ups, government laboratories, and academic institutions, to help advance the future of quantum computing.

Novel AFM mode enables electrochemical mapping for battery research

Park Systems

The NX range of atomic force microscopes (AFMs) from Park Systems now supports scanning electrochemical cell microscopy (SECCM), a new pipette-based nanoelectrochemical scanning probe technique for investigating the local electrochemical properties of electrode surfaces. For the first time this technique is allowing scientists studying electrocatalysis and energy storage to correlate electrochemical activity with the nanostructure of electrochemical interfaces.

SECCM works by inserting a quasi-reference counter electrode (QRCE) into a nanopipette filled with an electroactive species. The Z-scanner of the AFM is used to lower the nanopipette onto the contact surface, creating a meniscus that allows a tiny droplet, or nanoelectrochemical cell, to form. The electroactive species in the droplet reacts when a bias is applied between the QRCE and an electrode placed on the XY scanner, making it possible to map the electrochemical current at various positions across the sample surface.
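
To make the mapping sequence concrete, the point-by-point measurement described above can be sketched in a few lines of Python. This is a schematic illustration only – the grid, bias value and the `measure_current` response model are invented for the example – not Park Systems’ control software:

```python
import numpy as np

def measure_current(x, y, bias):
    """Stand-in for the real experiment: return the current (in pA) flowing
    between the QRCE in the pipette and the sample electrode when the meniscus
    wets the surface at (x, y) under the applied bias (in V)."""
    activity = 1.0 + 0.5 * np.sin(x) * np.cos(y)   # fake spatial variation in electroactivity
    return activity * bias * 100.0                 # fake linear current response

# Raster the pipette over a grid; at each point, land the meniscus,
# apply the bias and record the local electrochemical current.
xs = np.linspace(0, 10, 50)    # scan positions (microns)
ys = np.linspace(0, 10, 50)
bias = 0.5                     # volts between QRCE and sample electrode
current_map = np.array([[measure_current(x, y, bias) for x in xs] for y in ys])

print(current_map.shape)       # (50, 50) map of local electroactivity
```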

SECCM allows researchers to perform thousands of confined nanoelectrochemical measurements on a single surface, with droplet sizes ranging from a few hundred nanometres to a few microns. Researchers can easily change the chemical system by swapping in a new pipette containing a different electroactive species, and samples need little special preparation – making the technique both simple and cost-effective.

In one recent study, the SECCM mode of the Park NX12 system was used to study an electrochemically reversible redox process at the surface of highly ordered pyrolytic graphite (HOPG). Localized nanoscopic cyclic voltammetry measurements were taken each time the meniscus touched the surface, providing a spatially resolved map of HOPG surface electroactivity at the micro- and nanoscale.

The results indicate that the measurements are robust and reproducible, with a current detection limit as low as a few picoamperes. According to application scientists at Park Systems, this capability could also facilitate the rational design of functional electromaterials for use in energy storage studies and for corrosion research.

Cryostat supports quantum computing scale-up

The ProteoxLX from Oxford Instruments

Oxford Instruments NanoScience will be showcasing its latest innovation in cryogen-free dilution refrigerator technology for quantum computing scale-up, the ProteoxLX, at the APS March Meeting. The ProteoxLX is part of Oxford Instruments’ family of next-generation dilution refrigerators, which all share the same modular layout to provide cross-compatibility and added flexibility for cryogenic installations.

Optimized for scaling up quantum computing systems, the LX system supports maximum qubit counts, with a large sample space and ample coaxial wiring capacity. Low vibration features reduce noise and support long qubit coherence times, while the system provides full integration of signal conditioning components.

The LX also offers two fully customizable secondary inserts for an optimized layout of cold electronics, as well as high-capacity input and output lines that are fully compatible and interchangeable across the Proteox family.

Photonic crystal lasers deliver optimal performance for lidar sensing and laser processing

PCSEL

High-performance lasers are central to both lidar sensing and laser processing – technologies that enable a range of smart mobility and smart manufacturing applications. Lidar, for example, plays an essential role within autonomous vehicles, construction machines and automated factory robots, while laser processing is used for fabrication of electronics, automobiles and solar cells.

Currently, however, the lasers used for these applications – semiconductor lasers, CO2 lasers and fibre lasers – all come with limitations. Speaking at the recent Photonics West LASE conference, Susumu Noda from Kyoto University described the problem.

To generate high power from a conventional semiconductor laser, such as a broad-area Fabry–Pérot laser, the cavity width must be increased, which results in a low-quality, divergent output beam. “For lidar sensing, a complicated lens system and fine adjustments are required to reform the beam shape,” Noda explained. “And for laser processing, the output beam cannot be focused sufficiently and thus cannot be used directly for manufacturing.”

Laser processing is also performed using CO2 and fibre lasers. But CO2 lasers are extremely large and have low efficiency. Fibre lasers, meanwhile, contain hundreds of laser diodes that are combined into an amplification fibre. As such, fibre lasers suffer from a complex configuration and a substantial size, as well as limited efficiency.

What’s needed is a totally different laser technology that addresses these issues. And according to Noda, this is the photonic crystal surface-emitting laser, or PCSEL.

PCSELs are a new type of semiconductor laser that contain a photonic crystal integrated on top of the active layer. Photonic crystals are nanostructured materials in which a periodic variation of the dielectric constant (formed, for example, by a lattice of holes) creates a photonic band-gap. The resulting PCSELs emit a high-quality, symmetric beam with narrow divergence.

Susumu Noda

“For lidar applications, lens-free, adjustment-free operation is possible. And for laser processing, due to their high brightness, we can use PCSELs directly for materials processing,” said Noda. “Therefore, photonic crystal lasers are expected to contribute to the development of smart mobility and smart manufacturing.”

Noda and his research team have been working on PCSELs since 1999, when they first established 2D surface-emitting coherent oscillation. They went on to demonstrate control of polarization and beam shape by tailoring the photonic crystal structure, expansion into blue–violet wavelengths and 1D beam steering. In 2013, 0.2 W PCSELs became commercially available, and the team has since demonstrated watt-class operation and above.

Lidar applications

To generate the high-brightness operation required for lidar applications, Noda’s team created a double-lattice PCSEL, containing two sets of square lattice structures with different air hole depths and sizes. The 500 µm-diameter device exhibited a slope efficiency of 0.8 W/A and extremely narrow beam divergence of 0.1°.

The PCSEL demonstrated superior temperature performance to conventional semiconductor lasers, operating between –40 and 100 °C, with temperature dependences of the output power and lasing wavelength of –0.36%/K and 0.08 nm/K, respectively. Examining lens-free light propagation from these PCSELs revealed that they could create small beam spots (5 cm) at distances of up to 30 m. Conventional Fabry–Pérot lasers, in contrast, could not even deliver a beam at distances of 10 m or more.
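
The quoted 5 cm spot at 30 m is consistent with the 0.1° divergence via simple small-angle geometry – treating 0.1° as the full divergence angle, an assumption made here for the estimate:

\[
d \approx L\,\theta = 30\,\mathrm{m} \times \left(0.1^{\circ} \times \frac{\pi}{180^{\circ}}\right)
\approx 30\,\mathrm{m} \times 1.7\times10^{-3}\,\mathrm{rad} \approx 5\,\mathrm{cm}
\]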

PCSEL-based lidar

The team used this double-lattice PCSEL to construct a lidar system. Noda pointed out that while conventional lasers used in lidar require complicated lens systems and mechanical mirrors to steer the beam, PCSELs offer lens-free, adjustment-free beam control.

Noda shared a video showing the ability of PCSEL-based lidar to measure the distance of objects – in this case, two researchers. As they walked towards the lidar system, it accurately detected their movement in real time. It could also track smaller changes in distance, such as swinging of arms or hand movements, demonstrating successful high-resolution lidar operation.

The team has also recently demonstrated 2D electronic beam scanning, based on a chip design with dually modulated PCSELs integrated in a 10 × 10 array.

Laser processing

For laser processing applications, the PCSELs need to offer miniaturization and high efficiency, to address the limitations of CO2 and fibre lasers. Using ultracompact PCSELs, “the realization of handheld laser processing systems can be possible,” said Noda.

Three steps are required to achieve this, he explained: creating 1 mm, 10 W PCSELs with high beam quality; enlarging the devices to 3 mm to reach 100 W output power; and developing associated packaging and cooling technologies.

To create 10 W PCSELs, the team optimized the double-lattice structures by adjusting the size and shape of the air holes. The fabricated device successfully achieved 10 W continuous-wave operation, with very narrow beam divergence. Irradiating a metal surface with the focused PCSEL beam formed extremely fine holes. “We believe this is the first successful processing of a metal surface by a single-chip semiconductor laser,” said Noda.

Next, the team fabricated and packaged a 3 mm-diameter PCSEL. Initial tests on this device under pulsed conditions showed that 150 W output was obtained at only six times the threshold current. Noda predicts that by further increasing the device size to 1 cm, kilowatt-class operation could be achieved.

“For the future, key devices for lidar sensing and laser processing can be made using PCSELs,” Noda concluded.

Quantum gravity could soon be tested using ultracold atoms

Quantum gravity might soon be tested in the lab, thanks to a new analysis from physicists in the UK, France and Hong Kong. Drawing on advances in quantum information science, the researchers have found that if gravity is fundamentally quantum rather than classical it must generate a signature known as non-Gaussianity. To look for that signature, they propose probing an ultra-cold gas of several billion caesium atoms existing in a state known as a Bose-Einstein condensate (BEC).

Attempts to unify general relativity and quantum mechanics usually involve quantizing gravity to create a theory of quantum gravity such as string theory or loop quantum gravity. However, with little or no empirical data to support such theories, some physicists have developed alternative unifying theories in which matter is quantized but gravity itself remains a fundamentally classical variable.

Previously, most scientists thought that distinguishing between these two types of theory in the laboratory would be impossible given the scale at which space-time should become quantized. That “Planck length” – a mere 1.6 × 10⁻³⁵ m – could only be probed directly by colliding particles using an accelerator about the size of the Milky Way.

Manageable Planck mass

However, insights from quantum information science suggest that tests of quantum gravity could be done at the far more manageable scale of the “Planck mass” – which is about 22 µg. The challenge is creating a real system that can remain in a coherent quantum state on this macroscopic scale and the solution could rely on techniques developed for the construction of quantum computers and other quantum technologies.
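
For reference, both scales follow directly from the same fundamental constants, and the contrast between them is what makes mass-based tests look feasible:

\[
\ell_{\mathrm{P}} = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.6\times10^{-35}\,\mathrm{m},
\qquad
m_{\mathrm{P}} = \sqrt{\frac{\hbar c}{G}} \approx 2.2\times10^{-8}\,\mathrm{kg} \approx 22\,\mu\mathrm{g}
\]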

One such proposal, put forward by Vlatko Vedral at the University of Oxford and other researchers in 2017, involves the observation of quantum entanglement between two microspheres, each of which is placed into a superposition of two spatial locations. By blocking all other possible interactions between the spheres, any entanglement must occur via a gravitational interaction. But as Richard Howl, until recently at the University of Nottingham in the UK, and Vedral point out in their new research, classical non-local effects could conceivably lead to such entanglement.

To carry out a more unambiguous test of quantum gravity, the new work instead relies on non-Gaussianity. This property – which is also a requirement for universal quantum computation – characterizes a quantum system whose time-evolution operator is not just a linear or quadratic function of quantum variables. A matter particle emitting a graviton, for example, would be non-Gaussian because the Feynman diagram representing the interaction would involve three quantum operators (as opposed to two in the classical case).

Billions of atoms

Howl and colleagues have shown theoretically that if a system displays non-Gaussianity then its gravitational interaction must be quantum mechanical. What is more, they have identified a quantum system that could be scrutinized for this characteristic, and which could be set up using existing technology. That system is a BEC, a state of matter in which all atoms are cooled to such a low temperature that they end up sharing the same quantum state. More specifically, the researchers suggest a 0.2 mm-diameter condensate of 4 billion atoms of caesium-133 held in a spherical optical trap for around 2 s.

There are several ways that this system could be scrutinized. One option involves releasing the condensate from its trap and then sending it through a matter-wave beam-splitter. By measuring the number of atoms in the two outgoing beams and repeating the process many times over, the difference between those numbers should follow a non-Gaussian distribution if gravity is indeed quantum mechanical.
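
As a toy illustration of the statistical idea – not the authors’ actual analysis – one can simulate repeated runs of such an interferometer and ask whether the distribution of atom-number differences carries a non-zero skewness or excess kurtosis, the kind of non-Gaussian signature the proposal looks for. The noise model below is invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_runs = 100_000

# Gaussian case: number differences drawn from a normal distribution
gaussian_diffs = rng.normal(loc=0.0, scale=1.0, size=n_runs)

# Toy "non-Gaussian" case: add a small cubic distortion to the same noise
# (a stand-in for whatever non-Gaussianity a quantum interaction would imprint)
x = rng.normal(loc=0.0, scale=1.0, size=n_runs)
non_gaussian_diffs = x + 0.1 * x**3

def skew_and_excess_kurtosis(samples):
    """Third and fourth standardized moments; both vanish for a Gaussian."""
    s = (samples - samples.mean()) / samples.std()
    return (s**3).mean(), (s**4).mean() - 3.0

print(skew_and_excess_kurtosis(gaussian_diffs))      # both close to zero
print(skew_and_excess_kurtosis(non_gaussian_diffs))  # excess kurtosis clearly non-zero
```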

The physicists maintain that this set-up has several advantages over microspheres. It involves just a single quantum system in a single location, for example, and the team argues that a BEC lends itself naturally to eliminating the electromagnetic interactions that would also display non-Gaussianity and therefore potentially generate a false positive signal.

Feshbach resonances

As they point out, the microsphere experiment would minimize electromagnetic forces simply by placing the spheres far enough apart – at a distance where the objects’ mutual gravity is stronger than the van der Waals force. Doing so, however, also reduces the gravitational interaction. With the condensate, however, Feshbach resonances allow the overall strength of electromagnetic interactions between the constituent atoms to go to zero when the system is exposed to a suitable magnetic field or laser beam.

Carrying out the experiment in the lab will involve overcoming a number of technical hurdles, including how to put the atoms in their initial quantum state. BECs have previously been placed into massive non-classical states but not into the kind of state needed here. That is a macroscopic squeezed state, which exploits Heisenberg’s uncertainty principle to reduce noise in the measured variable – either the BEC’s position or momentum. This comes at the expense of increasing noise in the other variable, which does not affect the measurement.

Generating this state will be tricky since it will require transforming squeezed states of spin and atom number – which has never been done for such a large BEC. Alternatively, says Howl, it might be possible to use a much smaller squeezed state. But that in turn would mean increasing both the number of atoms in the condensate and the number of experimental trials by a couple of orders of magnitude.

Gerard Milburn at the University of Queensland in Australia is enthusiastic about the principle behind the new work – describing the switch to non-Gaussian signatures as “a very good idea”. But he cautions that putting that idea into practice will not be easy, given quantum noise arising from non-Gaussian dynamics in the condensate itself. “My guess is that these effects will be at least as large as the non-Gaussian effects coming from quantum gravity,” he says.

The research is reported in PRX Quantum.

DNA’s remarkable physical properties

DNA is not just genetic material. It is also an advanced polymer that is inspiring a new field of research that treats DNA as a soft material. As well as developing our fundamental understanding of life processes, this research could also lead to applications such as smart drug carriers or new methods for regenerating tissues.

Find out more in the article ‘Make or break: building soft materials with DNA’, by physicist Davide Michieletto.

Accuracy, feasibility and reliability of linac-based VMAT technique for total body irradiation (TBI)

Want to learn more on this subject?

Radiation therapy in the form of total body irradiation (TBI) continues to be an important part of conditioning regimens for patients with haematological malignancies undergoing bone-marrow transplantation.

During this webinar, Assoc. Prof. Bora Tas will speak about the feasibility, accuracy and reliability of volumetric modulated arc therapy (VMAT)-based TBI treatment in patients.

Assoc. Prof. Bora Tas is director and head of MR-Linac (Unity) & Linac Business Lines at Elekta in Istanbul, Turkey. He has more than 10 years of clinical experience managing the physics and dosimetry for a facility with a high-volume, hospital-based radiation oncology department. He is skilled in X-rays, electrons, MRI, particle therapy, medical devices, dosimetry and oncology management, and has extensive experience with VMAT, IMRT, SRS and SBRT techniques. He is an editorial board member and reviewer of scientific journals.

Make or break: building soft materials with DNA

Call me naive, but until a few years ago I had never realized you can actually buy DNA. As a physicist, I’d been familiar with DNA as the “molecule of life” – something that carries genetic information and allows complex organisms, such as you and me, to be created. But I was surprised to find that biotech firms purify DNA from viruses and will ship concentrated solutions in the post. In fact, you can just go online and order DNA, which is exactly what I did. Only there was another surprise in store.

When the DNA solution arrived at my lab in Edinburgh, it came in a tube with about half a milligram of DNA per cubic centimetre of water. Keen to experiment on it, I tried to pipette some of the solution out, but it didn’t run freely into my plastic tube. Instead, it was all gloopy and resisted the suction of my pipette. I rushed over to a colleague in my lab, eagerly announcing my amazing “discovery”. They just looked at me like I was an idiot. Of course, solutions of DNA are gloopy.

I should have known better. It’s easy to idealize DNA as some kind of magic material, but it’s essentially just a long-chain double-helical polymer consisting of four different types of monomers – the nucleotides A, T, C and G, which stack together into base pairs. And like all polymers at high concentrations, the DNA chains can get entangled. In fact, they get so tied up that a single human cell can have up to 2 m of DNA crammed into an object just 10 μm in size. Scaled up, it’s like storing 20 km of hair-thin wire in a box no bigger than your mobile phone.
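
The “hair-thin wire in a phone-sized box” comparison is the same packing problem scaled up by a factor of ten thousand – taking the box to be roughly 10 cm across:

\[
\frac{10\,\mathrm{cm}}{10\,\mu\mathrm{m}} = 10^{4},
\qquad
2\,\mathrm{m} \times 10^{4} = 20\,\mathrm{km}
\]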

But if DNA molecules stayed horribly entangled, then nature would have a big problem. In particular, it would be impossible for chromosomes – long pieces of DNA containing millions of base pairs – to be constantly read and copied. And if that didn’t happen, then cells would be unable to make proteins and multiply. Thanks to the wonders of evolution, nature has got round this problem by “engineering” special proteins that can change DNA’s shape, or “topology”, to get rid of the entanglements.

Left to its own devices, a typical human chromosome would take about 500 years to undo or “relax” its entanglements. But these clever proteins can speed up the process by, for example, allowing a DNA molecule to temporarily split up and then reform. These proteins are vital to the operation of biological cells – that’s why the DNA I bought online was so gloopy: it was a pure form that had no proteins to undo the entanglements.

Unfortunately, there can be an over-abundance of these proteins in certain cancer cells, which therefore multiply incredibly fast as the proteins remove the entanglements so efficiently. Indeed, some of the first and most effective anti-cancer drugs were those that could stop so-called “type 2 topoisomerase” proteins from getting rid of entanglements. These drugs have some nasty side effects as topoisomerase proteins also play a vital role in ordinary, healthy cells.

By combining our knowledge of polymer physics and molecular biology, we can exploit DNA’s soap-like behaviour to craft DNA-based soft materials that change shape over time

But would you believe me if I said that DNA’s ability to morph its architecture means that it behaves a bit like soap? The link between DNA and soap is certainly surprising. But by combining our knowledge of polymer physics and molecular biology, we can exploit this soapy feature to craft DNA-based soft materials that change topology over time. And by tweaking their topology, we can control their physical properties in unusual ways.

A wormy tale

To understand the link between DNA and soap, I should point out that soaps and shampoos consist of “amphiphilic” molecules, one part of which loves water and another part that hates it. These molecules don’t exist in isolation but group together to form larger structures, known as “micelles”. At low concentrations, they’re usually spherical, but at higher concentrations, the molecules can gang together to form long, worm-like micelles, with the water-hating parts of the molecules facing inside (figure 1a).

Ranging in size from nanometres to microns, these elongated, multi-molecule objects do strange things at high concentrations. In particular, just like DNA, they get entangled, increasing the fluid’s friction and making it harder to deform. In fact, the entanglements between worm-like micelles are what give your soap, shampoo, face cream or hair gel that pleasant, smooth hand-feel, which is something to think about next time you’re taking a bath or shower.

figure 1

Just like polymers, it turns out that worm-like micelles can also disentangle themselves by sliding apart (figure 1b). But they have other options too. That’s because worm-like micelles are continuously morphing: they break up, fuse or reconnect with their neighbours – no micelle is the same at any two points in time (figure 1c). This ever-changing feature wonderfully embodies the Greek philosopher Heraclitus’s concept of “panta rhei”, or “everything flows” (from which the term “rheology” for the study of flow is derived). Indeed, micelles almost seem like quasi-living objects, thanks to their ability to morph their architecture and, sometimes, even their topology.

This interplay between dynamic architecture and conventional relaxation can lead to some highly unusual flow properties, such as the viscosity of soaps dropping drastically when sheared. Indeed, this sudden loss of stickiness explains why hand lotions, shampoos and creams, which are viscous when left alone, can be easily squeezed out of a tube with a narrow nozzle.

Breaking and reconnecting

So just like worm-like micelles in soaps, DNA molecules are constantly getting broken up and glued back together again with a new topology (figure 2). But there’s one big difference: the DNA needs to preserve its genetic sequence, otherwise cells might die or diseases could be triggered. In soap, there’s no precise sequence of monomers in micelles, so they can be put back together in any order. Nature, however, requires proteins to perform topological operations on DNA while keeping the original information (the DNA sequence) intact.

This has a fundamental impact on how topological operations are performed on DNA. Unlike worm-like micelles – where the operations can occur at random anywhere along the micelle and at any time – the topological changes on DNA have to happen at the right place and the right time (they have to be “regulated” as biologists love to say). It’s a mind-blowing concept – and one that I’ll be spending the next five years trying to artificially reproduce, to create a new generation of materials.

figure 2

To break DNA, for example, you need “restriction enzymes”, which cut the chain only where a certain DNA sequence is recognized. Topoisomerase proteins, meanwhile, have to be precisely positioned at certain locations on chromosomes where entanglements and mechanical stress often accumulate. Similarly, when two pieces of DNA reconnect and recombine – for example when parental genetic material is shuffled in gametes (the precursor of egg and sperm cells) – the process is tightly regulated in space and time to avoid aberrant chromosomes in cells. It’s almost as if DNA (thanks to proteins) is a smart worm-like micelle.
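
The sequence specificity of restriction enzymes is easy to mimic in silico. The snippet below – a toy example rather than a bioinformatics tool – finds every occurrence of the well-known EcoRI recognition site GAATTC in a DNA string and “cuts” the chain there:

```python
import re

def digest(dna, site="GAATTC"):
    """Return the fragments produced by cutting at every occurrence of `site`.
    (Real EcoRI cuts between the G and the first A, leaving sticky ends;
    here we simply split at the start of the site for illustration.)"""
    positions = [m.start() for m in re.finditer(site, dna)]
    cuts = [0] + positions + [len(dna)]
    return [dna[a:b] for a, b in zip(cuts, cuts[1:]) if a != b]

sequence = "ATCGGAATTCGGCTAGAATTCCTAG"
print(digest(sequence))   # ['ATCG', 'GAATTCGGCTA', 'GAATTCCTAG']
```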

While all this may sound rather esoteric, it turns out that when the US microbiologist Hamilton Smith discovered the first restriction enzyme in the 1970s, he didn’t use any fancy biological techniques – but simply carried out accurate viscosity measurements. Having extracted DNA from a virus and mixed it with the insides of a bacterium, he saw that the viscosity of the DNA solution fell with time; the runnier liquid meant that the DNA must have been cut by an enzyme in the bacterium. Smith won the 1978 Nobel Prize for Physiology or Medicine for his efforts and it’s humbling to think it was all done with a simple viscosity experiment that has its roots in physics.

DNA and nanotechnology

I’m definitely not the only person to see the potential of DNA as an advanced polymer, rather than just as genetic material. Over the last two decades, researchers have developed lots of new, DNA-based materials, such as hydrogels and nano-scaffolds, that could, for example, grow bones, tissues, skin and cells, using the unique properties of DNA to encode information. Recently, there’s also been lots of work on “DNA origami”, in which the information along the DNA chain is now stored in 3D shapes (figure 3a). Indeed, we could even see nano-robots or nano-machines made from DNA.

What excites me about this line of research is that solutions of DNA, functionalized by the presence of proteins that can change DNA’s topology in time, may yield novel “topologically active” complex fluids that respond to external stimuli. These fluids and nanomaterials would exploit the information-storing abilities of DNA to form complex 3D shapes or hybrid scaffolding with the responsiveness, plasticity and precision endowed by specialized proteins (figure 3b). For example, adding restriction enzymes that can cut the DNA at specific sequences could allow stiff and robust DNA-based scaffolds to be degraded as soon as they are no longer needed. That could be useful if you’re using a scaffold to, say, regenerate a bone in a patient’s body: once the scaffold is not needed any more, you can get rid of it.

figure 3

At the same time, adding topoisomerase to an ensemble of DNA plasmids (circular DNA) can create a gel, in which the rings of DNA are joined together like the rings on the logo of the modern-day Olympic Games (figure 3c). These “Olympic gels” have proved impossible to synthesize in the lab despite decades of trying, yet nature has been doing so for millions of years.

In fact, I find it amazing that a type of unicellular organism called trypanosomes base their very existence on this Olympic gel. In particular, part of their genome takes the form of a giant network in which each DNA minicircle is linked to about three others nearby to form an architecture that looks a bit like medieval chain mail. What’s even more fascinating is that this topological structure is continually splitting up and reassembling correctly at each cell division.

Interdisciplinary research from the bottom up

Apart from their intrinsic scientific interest, studying such biological structures will also help us design a new generation of self-assembled topological materials. These complex, DNA-based materials hold great technological promise, but to make progress we need multidisciplinary teams of physicists, chemists and biologists working together. What’s more, they will have to work from the bottom up, exploring basic principles for curiosity’s sake, and not only trying to solve specific technological problems that industry faces.

One notable success story in this regard, at least here in the UK, has been the creation of the Physics of Life network, led by the physicist Tom McLeish, which has seen the country’s research councils invest in this area. Now bearing fruit, I hope it’s the start of a stable, long-term, interdisciplinary programme of support. The Biological Physics Group of the Institute of Physics, which publishes Physics World, is also playing a key role in encouraging more groups to embrace this multidisciplinary approach at the interface between soft matter and biological physics.

However, we still need more top-quality journals that recognize high-value interdisciplinary research of this kind, while research centres that cut across traditional academic disciplines will be vital too. It is an exhilarating field to be in, where everyone – no matter where they are in their career – learns something new every day. My hope is that in 10 or 20 years’ time, scientists who are starting out in their careers will no longer feel obliged to explore only one specific discipline or to choose between theoretical and experimental work. Instead, it would be great if they could simply satisfy their scientific curiosity no matter what background they are from. For if they do that, who knows what we might find next?

How do surgical face masks affect functional MRI measurements?

As the COVID-19 pandemic continues across the world, the wearing of face masks indoors has become a requirement to help reduce virus transmission. Facial coverings are also worn in MRI scanners during data acquisition to keep participants safe. However, it was unclear what impact this could have on measured brain signals. Now, researchers at Stanford University have investigated the effect of wearing a face mask on functional MRI (fMRI) signals during scanning. They describe their work in NeuroImage.

Functional MRI measures the blood-oxygen-level-dependent (BOLD) response of the brain due to changes in blood flow during activation. The BOLD response is sensitive to the concentrations of oxygen and carbon dioxide (CO2) in the blood. Wearing a facial covering mixes the expired and inspired air streams. As you breathe out CO2, this mixing increases the amount of inspired CO2, resulting in mild hypercapnia (elevated blood CO2 levels). This hypercapnia increases cerebral blood flow and therefore elevates the measured BOLD signal, resulting in greater contrast in the fMRI compared with that seen in the absence of hypercapnia.

The impact on BOLD signals

The Stanford team performed task-based neuroimaging studies using a “block design”, in which an activity is performed or a stimulation is given during an “on-window”. This is followed by an “off-window”, where the action or stimulation ceases.

In this study, the researchers ran two block designs simultaneously. The first involved a short 15-s sensory-motor task, which stimulated the brain’s auditory, visual and sensorimotor regions concurrently. Throughout the task, they delivered fresh air to the participant through a nasal cannula in on–off cycles of 90 s. They conducted the experiment twice with each participant, once with a surgical face mask and once without. The cannula manipulated the gas content of the inspired air during the mask-on state, preventing CO2 build-up; it had a minimal effect on CO2 levels when the mask was off.

The team also recorded end-tidal CO2 (ETCO2) levels in a separate session outside of the MRI scanner, to measure the effect of the mask on hypercapnia for each participant.

The researchers analysed data from eight healthy participants using a general linear model with two variables: one describing the sensory-motor task, and the other the nasal cannula air supply. The resulting group activation maps from the sensory-motor task, which indicate the areas of the brain that are active during the task on-window, showed no significant differences between the mask-on and mask-off states. These results demonstrate that task-activation can be reliably detected while the participant wears a mask.
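
Conceptually, this amounts to fitting each voxel’s time series with two boxcar regressors – one for the task blocks and one for the cannula air cycles – plus a baseline. The sketch below illustrates that idea on synthetic data using ordinary least squares; it is not the Stanford group’s actual pipeline, and the timings and noise model are simplified for the example:

```python
import numpy as np

TR = 1.0                                   # repetition time in seconds (illustrative)
n_vols = 540
t = np.arange(n_vols) * TR

task = ((t % 30) < 15).astype(float)       # 15 s on / 15 s off sensory-motor task
cannula = ((t % 180) < 90).astype(float)   # 90 s on / 90 s off fresh-air supply

# Synthetic voxel time series: baseline + task response + cannula-related shift + noise
rng = np.random.default_rng(0)
signal = 100.0 + 2.0 * task + 1.0 * cannula + rng.normal(0, 0.5, n_vols)

# General linear model: design matrix with intercept, task and cannula regressors
X = np.column_stack([np.ones(n_vols), task, cannula])
beta, *_ = np.linalg.lstsq(X, signal, rcond=None)
print(beta)   # ~ [100, 2, 1]: estimated baseline, task and cannula effects
```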

The baseline BOLD signal, which the team analysed using the nasal cannula air cycles, showed a significant difference between the mask-on and mask-off states. The results demonstrated that the face mask induced an average baseline signal shift of 30.0%, with the grey matter across the brain showing an evident deactivation (observed via an increase in signal without the air supply) in the group activation maps. The measured ETCO2 showed an average increase of 7.4%, confirming the predicted rise in inspired CO2 concentration with mask use.

This topical study provides some clarity to the neuroimaging community regarding the impact of face masks on data collected throughout the pandemic. The insignificant difference measured between the task-activated signal with masks on or off supports the safe continuation of task-based fMRI studies in clinical and research settings while following mask regulations.

Motorized water phantom underpins ‘gold-standard’ QA for MR/RT systems

It’s still relatively early days in terms of wide-scale clinical adoption, but the benefits of the new generation of MR-guided radiotherapy (MR/RT) systems are already clear. Think real-time image-guided adaptation of radiation delivery – more effectively treating the tumour target while sparing healthy tissue and minimizing damage to adjacent organs at risk and critical structures. On the flip side, the operational challenges of a hybrid MR-Linac treatment machine are also evident – not least when it comes to the implementation of efficient and streamlined protocols for quality assurance (QA) with an MRI scanner integrated into the radiotherapy workflow.

Fundamental physics complicates matters further, with the MRI scanner’s magnetic field having a non-trivial impact on dose deposition and distribution in the irradiated volume – most importantly within the patient, but also the radiation dosimeters used for QA purposes. The QA challenge doesn’t end there. Conventional water phantoms, essential for the commissioning and annual verification of radiotherapy systems, are not suited to the unique MR-Linac environment – chiefly because, for safety reasons, the use of ferromagnetic materials is prohibited within the strong MRI magnetic field.

To address this troublesome bottleneck, laser and radiotherapy QA specialist LAP has developed a 3D and MR-compatible motorized water phantom tailored specifically for the commissioning and QA of MR-Linacs. The newly launched THALES 3D MR SCANNER is MR-conditional – i.e. all system components are made from non-ferromagnetic materials certified for use within the MRI scanner’s magnetic field – while the automated set-up (which takes under 15 minutes to prepare) and predefined measurement sequences are intended to help the medical physics team save time and simplify their test routines during system commissioning and annual or biannual QA.

“The THALES 3D MR SCANNER provides a gold-standard dose accuracy check for MR/RT users,” claims Thierry Mertens, a physicist and LAP’s business development manager for healthcare. In the radiation oncology clinic, the phantom will be used alongside a portfolio of QA tools – some providing daily, weekly and monthly QA checks, with the THALES 3D MR SCANNER reserved for system commissioning and ongoing verification of dose delivery after any major upgrades to the MR-Linac. “In this way,” adds Mertens, “the water phantom will give the medical physicist peace of mind, ensuring that their MR/RT system is accurately calibrated and supporting accurate verification of delivered dose to the patient.”

Collaborative development

Significantly, the THALES 3D MR SCANNER is now cleared for full commercial release in the US after receiving 510(k) approval from the US Food and Drug Administration (FDA), while the product’s CE mark provides a green light for roll-out to clinical customers in the European Economic Area. In both regions, the phantom comes with a yearly maintenance visit, software and hardware updates, and a configurable multiyear warranty.

Thierry Mertens

These commercial milestones are the culmination of a five-year product development effort that began with LAP’s acquisition of Euromechanics Medical GmbH in summer 2016 – a purchase that, in large part, was driven by the latter’s active collaboration with MR/RT pioneer ViewRay to develop an MR-compatible phantom for the then-prototype MRIdian treatment system. That product collaboration accelerated post-acquisition, with ViewRay keen to promote the development of an independent QA ecosystem around its MRIdian machine. “Not surprisingly,” says Mertens, “the THALES 3D MR SCANNER is a perfect fit when commissioning the beam model of the MRIdian system, supporting end-users with an efficient process for accurate dosimetry measurements.”

In parallel, Mertens and his colleagues at LAP broadened the product development effort on their water phantom to gather insights from clinical early-adopters of the ViewRay MRIdian system – most notably the University Medical Centre (UMC) in Amsterdam (Netherlands), University Clinic Heidelberg (Germany) and the Henry Ford Cancer Institute in Detroit (US). “The voice of the clinical customer was fundamental to our requirements-gathering and optimization of the phantom design, usability and functionality,” says Mertens.

Put another way: as a QA vendor, it was incumbent on LAP to understand what Mertens calls “the A to Z of the clinical workflow”, thereby ensuring that the hardware, software, electronics and components of the THALES 3D MR SCANNER are all optimized versus the ViewRay MR-Linac design. “This continuous-improvement mindset is key,” adds Mertens. “The phantom has been shaped by clinical physicists at the sharp-end of treatment delivery and we continue to incorporate their feedback from the field to inform and iterate our product design.”

The view from the sharp end

While commercial roll-out of the THALES 3D MR SCANNER is now the priority, it’s also worth noting that further innovation is in the works. A custom version of the phantom to support Varian’s Halcyon image-guided radiotherapy system and ETHOS, the vendor’s new AI-enabled adaptive radiotherapy machine, is expected later this year. Down the line, a modified water phantom is also planned to provide compatibility with Elekta’s Unity MR-Linac machine.

Out in the clinic, meanwhile, Mertens predicts a variety of use-cases for the THALES 3D MR SCANNER through 2021. For starters, there are new ViewRay customers who need a suite of QA tools to support the commissioning and acceptance of their MRIdian systems. “Medical physicists are ultimately accountable,” he notes, “and they want independent QA and verification tools to confirm that what they’re getting from the radiotherapy OEM is operating as per the specification.”

Given the relative novelty of MR/RT, it’s inevitable that many clinics are newcomers to the field and, as such, are still finding their way when it comes to the unique functionality and nuances of MR-Linac machines. “As they ramp up their MR/RT programmes, I can see these customers using the THALES 3D MR SCANNER more frequently – maybe once a month at the outset – to explore the impact of the magnetic field and really get to know their treatment system,” adds Mertens.

Over time, though, it’s likely that the water phantom will be needed less often – perhaps once or twice a year as part of standard machine QA and after any significant upgrade to the MR-Linac hardware or software. Mertens concludes: “This is where the water phantom really comes into its own, helping the medical physicist with rigorous beam data and beam model visualizations to verify that the delivered radiation as it applies to the patient is indeed correct.”

Light could levitate micron-thin aircraft in Earth’s mesosphere

A new light-driven levitation technique could soon enable tiny, low-cost aircraft to achieve the first sustained flight in the Earth’s mesosphere. Mohsen Azadi and colleagues at the University of Pennsylvania exploited the effect of photophoresis, combined with an intricately shaped light beam, to levitate thin mylar disks at low pressure in a vacuum chamber. Their microflyers could soon allow researchers to explore one of the most poorly understood parts of Earth’s atmosphere in unprecedented detail.

Situated 50–80 km above Earth’s surface, the mesosphere is a no-man’s-land for sustained flight. At these altitudes, air density is too low to generate lift for airplanes, but still high enough that space-based satellites would experience unsustainable drag and burn up. One emerging solution lies in the use of light-driven motion, known as photophoresis, as an alternative propulsion mechanism. The effect has been widely observed in small particles such as atmospheric aerosols: when illuminated by suitably intense light beams, they move due to non-uniform temperature distributions that form in the air surrounding them.

Azadi and colleagues’ aim is to use photophoresis to levitate much larger objects, starting with a design based on a circular mylar film that is 6 mm in diameter and 500 nm thick. On the underside of the disk, they deposited a 300 nm-thick layer of tangled carbon nanotubes, creating a network of microscopic air traps. They then placed their structure inside a vacuum chamber and reduced the air pressure to as low as 10 Pa – just a small fraction of the atmospheric pressure experienced on Earth’s surface (about 100 kPa).

Net upward force

To levitate the disks, the researchers illuminated them with a light intensity comparable to sunlight, causing them to heat the sparse surrounding air. Air molecules trapped by the carbon nanotubes on the undersides of the disks were heated for longer than those on the upper sides. As a result, these molecules reached higher velocities when they finally escaped from the traps, generating a net upward force.

To control the flight paths of their aircraft, Azadi and colleagues designed an optical trap using a specially shaped light field. At the centre of the beam, the light intensity was just high enough to levitate the disks, while a surrounding ring of higher intensity pushed the disks back towards the centre whenever their paths deviated. Finally, the researchers constructed a theoretical model from their results, allowing them to predict that the lifting force generated by a microflyer could be many times larger than its weight.
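
To put that claim in context, the weight of such a disk is tiny. A back-of-the-envelope estimate – taking a typical mylar density of about 1.4 g/cm³, a textbook value rather than a figure from the study, and neglecting the thin nanotube layer – runs as follows:

```python
import math

diameter = 6e-3     # m, disk diameter from the article
thickness = 500e-9  # m, mylar film thickness from the article
density = 1.4e3     # kg/m^3, typical for mylar (assumed, not from the study)
g = 9.81            # m/s^2

volume = math.pi * (diameter / 2) ** 2 * thickness
mass = density * volume

print(f"mass   ~ {mass * 1e9:.1f} micrograms")   # roughly 6 micrograms
print(f"weight ~ {mass * g:.1e} newtons")        # ~6e-8 N for the photophoretic lift to beat
```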

With further development, the concept could enable low-cost aircraft to achieve sustained flight at altitudes of 50–100 km, and even carry payloads as large as 10 mg. Their cargo-carrying abilities could increase even further if hundreds of microflyers were joined together by lightweight carbon fibres – enabling them to carry equipment such as smart-dust sensors and devices to track atmospheric circulation patterns. Ultimately, this would open broad new areas of research into one of the least understood parts of Earth’s atmosphere.

The research is described in Science Advances.

Long-distance space travel: addressing the radiation problem

A team of US and Netherlands-based scientists has published a review paper highlighting ways to protect astronauts from the negative cardiovascular health impacts associated with exposure to space radiation during long-distance space travel.

Cardiovascular impacts

Space radiation is currently regarded as the most limiting factor for long-distance space travel because exposure to it is associated with significant negative effects on the human body. However, data on these effects are currently only available for those members of the Apollo programme who travelled as far as the Moon – too small a number from which to draw any significant conclusions about the effects of the space environment on the human body. In addition, although exposure to space radiation, including galactic cosmic rays and solar “proton storms”, has previously been linked to the development of cancer and neurological problems, data on the consequences of space radiation exposure for the cardiovascular system are lacking.

In an effort to address these limitations, researchers based at the University Medical Center (UMC) Utrecht, Leiden University Medical Center, Radboud University and the Technical University Eindhoven in the Netherlands, as well as Stanford University School of Medicine and Rice University in the US, have carried out an exhaustive review of existing evidence to establish what we know about the cardiovascular risks of space radiation. They present their findings in the journal Frontiers in Cardiovascular Medicine.

Manon Meerman

As first author Manon Meerman, a graduate student at UMC Utrecht, explains, the majority of current knowledge comes from studies of people who have received radiotherapy for cancer, where cardiovascular disease is a common side-effect, or from animal and cell culture studies that demonstrate the major negative effects of exposure to space radiation on the cardiovascular system. Such effects include fibrosis, or stiffening, of the myocardium and accelerated development of atherosclerosis, the main cause of myocardial and cerebral infarction.

“You can argue that if NASA, ESA and other space agencies want to expand space travel, both in terms of location – for example, to Mars – and time, astronauts will be exposed to the specific space environment for longer periods of time. However, we currently do not know what the effects of exposure to these space-specific factors are,” says Meerman.

“NASA currently sees space radiation as the most limiting factor for long-distance space travel, but the exact short- and long-term effects are not fully understood yet. We are therefore exposing astronauts to extremely uncertain risks. However, research into the effects of space radiation has increased over the past few years and we’re constantly gaining more knowledge on this topic,” she adds.

 Space radiation-induced changes

Advanced models

According to Meerman, another important factor in this discussion is the fact that we currently cannot adequately protect astronauts from space radiation. Shielding with radiation-resistant materials is very difficult since exposure levels are far higher than on Earth and the type of radiation is much more penetrating. Pharmacological methods of protecting the cardiovascular system are hampered by the fact that no effective radioprotective compounds have yet been approved.

“The most important conclusion is that we actually do not know enough about the exact risks that long-distance space travel poses for the human body. Therefore, in our opinion, we should keep looking for new ways to protect astronauts from the harmful space environment before we expand human space travel,” says Meerman.

Moving forward, Meerman stresses that research on the effects of space radiation should incorporate advanced models that provide a more accurate representation of the cardiovascular impacts of space radiation – such as those based on lab-created human cardiac tissue and organ-on-a-chip testing technologies. Studies should also examine the effects of combinatorial exposure to different space radiation particles, as well as combined exposure to space radiation components and other space-specific factors, like microgravity, weightlessness and prolonged hypoxia.

“These are all crucial studies to be conducted in order to really understand the risks we’re exposing astronauts to,” says Meerman. “Therefore, we believe we are not there yet and we should debate whether it is safe to expand human space travel significantly.”
