
New device can detect airborne virus particles that cause COVID-19, says maker

A device that its maker claims can detect the virus that causes COVID-19 in ambient air within 2–3 minutes has been successfully tested at two universities in the US. It was developed by the company Smiths Detection, which says that the test has a sensitivity equivalent to PCR testing. The test is now commercially available in the US and is being prepared for a global rollout.

The gold-standard way of testing for the SARS-CoV-2 virus is the reverse transcriptase polymerase chain reaction (RT-PCR, or often just PCR). This involves taking a sample from the patient, using an enzyme to transcribe the viral RNA into DNA, then thermally separating that DNA into single strands. A short DNA fragment complementary to a sequence found only in the virus is then added, and it binds to recreate a short section of double helix if and only if the virus is present. Finally, this section is amplified over repeated cycles to produce a detectable positive result.
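The amplification step is exponential: each thermal cycle roughly doubles the number of copies of the target sequence. As a rough illustration (the threshold and copy numbers here are generic placeholders, not figures from any specific assay), a short calculation shows why even a handful of viral copies becomes detectable after a few dozen cycles:

```python
import math

def copies_after(initial_copies, cycles, efficiency=1.0):
    """Copies of the target sequence after a number of PCR cycles.

    Each cycle multiplies the copy number by (1 + efficiency);
    efficiency = 1.0 corresponds to perfect doubling.
    """
    return initial_copies * (1.0 + efficiency) ** cycles

def cycles_to_threshold(initial_copies, threshold=1e9):
    """Smallest number of perfect-doubling cycles needed to reach
    an (illustrative) detection threshold of copies."""
    return math.ceil(math.log2(threshold / initial_copies))

# Ten starting copies pass a 10^9-copy threshold in under 30 cycles
print(cycles_to_threshold(10))   # 27
print(copies_after(10, 27))      # 1342177280.0 (about 1.3 billion copies)
```

This doubling is also why PCR is so sensitive compared with faster antigen-based tests: the signal grows geometrically with cycle count rather than linearly with sample concentration.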

Although a well-established technique, PCR can take days to produce a result – days during which an infected patient may infect others. Tests that can analyse samples more rapidly tend to be less sensitive. Moreover, an asymptomatic patient may have no reason to provide a sample for testing. A product that could rapidly detect SARS-CoV-2 in the ambient air would therefore be highly valuable, and several academic research groups and companies are working towards that aim.

Smiths Detection developed their BioFlash Biological Identifier over 10 years ago to detect toxins such as ricin and pathogens like the anthrax bacterium. The device is currently used by the US government and commercial companies such as couriers.

Jellyfish genes

“Previously, our focus was on biothreats that were intentionally released,” explains Andrew Flannery, one of the company’s chief scientists. The system uses a technology called CANARY developed by researchers at Massachusetts Institute of Technology. Genetically engineered immune cells bind to one specific target such as a toxin or pathogen. When they do so, they begin to emit light: “When CANARY technology was originally developed, [the MIT researchers] basically cloned out the same genes that jellyfish use to be bioluminescent,” explains Flannery. In addition to BioFlash, Smiths Detection has used the CANARY platform in sensors to detect pathogens in food and to monitor the health of plants in agriculture.

The company realized that, if the BioFlash could detect SARS-CoV-2, it could defend not just against malicious threats but against unintentional ones posed by infected individuals too. “We had to identify antibodies that would bind specifically to SARS-CoV-2,” explains Flannery. “We had to screen several antibodies that would be resistant to any mutations that might happen and lead to false negatives, to make sure we picked the right one. That way we can be sure that, even if there are variants that pop up, we will still be able to detect them.”

The company now reports two real-world tests: one at the University of Maryland, Baltimore and the other at the University of Oregon. The first Maryland result detected the presence of SARS-CoV-2 in a locker room, leading to three positive diagnoses among members of a sports team, whereas the second confirmed its absence in a research facility in which a member had tested positive for COVID-19. The Oregon experiment detected the virus exhaled by quarantined patients with confirmed COVID-19.

Confirming virus mitigation

“Ultimately, we want to be part of the overall COVID mitigation strategy,” says Warren Mino, the managing director of biotechnology at Smiths. “Having a device that will confirm that their mitigation strategies are working, I think, helps people to know that what they’re doing is effective in keeping people safe – especially as we try to get back to our daily, regular lives.”

Laura Lechuga of the Catalan Institute of Nanoscience and Nanotechnology, whose group has developed a spectroscopic liquid biosensor for SARS-CoV-2, is cautiously impressed: “As far as I know, there is no solution for on-site detection of SARS-CoV-2 in air (or aerosols) commercially available. There are many developments ongoing at academic research and industrial level, but no one is close to commercialization. So, this Smiths device could become the first detector and could be [in massive demand],” she says. She cautions, however, that “the detection of any pathogen in air is really complex, due to the influence of the way the air sampling is performed, the specificity (to avoid cross-reactivity with other biomolecules and chemical molecules in the air) and, more importantly, the sensitivity level, as normally pathogens are present in the aerosols at a very low level”.

Lechuga adds that there is not currently enough public information available for her to evaluate the device. Smiths Detection told Physics World that the relevant proprietary information is provided to prospective customers.

ExoSCOPE monitors cancer treatment in real-time at the molecular level

The ExoSCOPE platform

Catching cancer drugs in the act may seem complicated – but could soon become a lot simpler.

One of the three common methods of treating cancer is chemotherapy, which uses cytotoxic chemicals to destroy abnormal or tumour cells. The fundamental challenge with chemotherapy is that it attacks all rapidly dividing cells in the body, and can therefore cause severe side effects.

To address these shortcomings of chemotherapy and improve on targeted therapies, scientists at the Institute for Health Innovation & Technology, National University of Singapore, have developed a treatment monitoring technology that evaluates specific drug interactions with cancer cells. The researchers describe the technology, named extracellular vesicle monitoring of small-molecule occupancy and protein expression (ExoSCOPE), in Nature Nanotechnology.

The ExoSCOPE technology

ExoSCOPE relies on the tiny nanoparticle-like vesicles secreted by mammalian cells, particularly cancerous cells. These so-called extracellular vesicles (EVs) contain several components, including the ones responsible for drug interaction and targeting. The ExoSCOPE technology examines the abundance of EVs produced in the blood, using plasmonic sensors to evaluate how drugs or other agents bind with EVs through specific protein receptors. This helps the researchers understand and monitor how cancer cells interact with drugs at the molecular level.

The team designed specialized probes consisting of gold nanorings, which were used to identify the docking sites of the drugs on the cancer cells. Compared with a previously designed probe, these nanorings, which act as plasmonic resonators, offer enhanced signal detection by increasing the number of capture sites for molecular reactions with cancerous cells.

The new probe can amplify signals from EVs with low capacity for drug targeting and enables real-time monitoring of molecular reactions during ongoing cancer treatment. Plasmonic resonators generate electromagnetic hotspots, which maximize detection sensitivity. To increase signal detection, the team ensured that the molecular reactions occurred within these hotspots.

Clinical applications  

The team analysed plasma samples from lung cancer patients using the plasmonic nanoring resonators to identify possible cancer markers. Compared with other EV analyses, the ExoSCOPE results showed the most accurate disease classification, with an area under the curve (AUC) of 0.982.
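The AUC quantifies how well a score separates two classes: 1.0 is perfect separation, while 0.5 is no better than chance. A minimal, library-free sketch of the statistic (the scores below are invented for illustration, not data from the study):

```python
def roc_auc(pos_scores, neg_scores):
    """Area under the ROC curve, computed as the probability that a
    randomly chosen positive sample scores higher than a randomly
    chosen negative sample (ties count half)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical marker readouts: higher values for disease samples
disease = [0.91, 0.84, 0.78, 0.66]
control = [0.40, 0.35, 0.70, 0.22]
print(roc_auc(disease, control))  # 0.9375
```

An AUC of 0.982, as reported for ExoSCOPE, therefore means a randomly chosen patient sample outscored a randomly chosen control in about 98% of pairings.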

The researchers further examined blood samples from lung cancer patients undergoing targeted treatment with erlotinib, to determine changes over time in drug occupancy in cancer-associated EVs. They note that the ExoSCOPE platform was able to differentiate between treatment outcomes (responders and non-responders) after just 24 hours, whereas conventional blood pharmacological analysis could not.

“This technology offers a promising approach for monitoring treatment outcomes in cancer cells,” says first author Sijun Pan.

One step closer to real-time MR imaging in proton therapy

Proton therapy is an advanced cancer treatment technique that delivers a highly targeted dose to the tumour while sparing surrounding normal tissue, enabled by the finite range of the proton beam. This precision targeting, however, is compromised by tumour motion or anatomical changes throughout a course of treatment. The absence of fast imaging tools to localize moving targets during dose delivery is a fundamental barrier to exploiting the full potential of proton therapy.

Real-time imaging during treatment delivery could visualize the tumour and synchronize the proton beam to its motion. MRI, which has recently been integrated into conventional photon-based radiotherapy systems, could provide high-resolution, high-contrast soft-tissue imaging, without depositing any additional ionizing dose into the patient. But operating an MRI scanner in conjunction with a proton beam is a major technological challenge that, for a long time, many considered to be impossible.

Aswin Hoffmann from the HZDR Institute of Radiooncology – OncoRay in Dresden thought otherwise. Hoffmann and his colleagues have been working for several years to integrate MRI with proton therapy. Now, the team is planning to build the world’s first whole-body prototype proton therapy system that can track moving tumours with MRI, in real time, during dose delivery from an actively scanned proton pencil beam.

The major challenge when integrating MRI into a proton therapy system is that MRI scanners need precisely defined magnetic fields to create geometrically accurate images, while proton therapy systems use electromagnetic fields to generate, transport and deliver the proton beam. Interference between these fields could distort the MR image and impact the delivered proton dose distribution. Hoffmann and his team showed that it is technically possible to combine both systems, and that these interference effects can be anticipated and thus compensated for. They also recently demonstrated that the proton beam range can be visualized with online MRI.

The prototype system will incorporate a 0.5 T rotating open MRI scanner produced by ASG Superconductors, which uses a helium-free, superconducting magnesium diboride magnet. The MRI scanner has been adapted to meet the requirements of real-time MRI-guided therapy by MagnetTx Oncology Solutions, a spin-off of the Alberta Health Services LINAC-MR group that developed the Aurora RT MR-guided radiotherapy system. Engineers at MagnetTx are also developing a gantry to rotate the scanner, as well as image processing methods to automatically track the tumour in real time.

In the summer of 2022, the team plans to incorporate the MRI system into a clinical-grade, actively scanned proton beamline at OncoRay.

The design of the new proton therapy system is based on the state-of-the-art Aurora RT. “As the Aurora RT has been optimized for image-guided radiation treatment, our prototype system will leverage its unique features to provide real-time image guidance for treatment with high-precision proton beams,” Hoffmann tells Physics World. “Our vision is to not only use it clinically for high-precision cancer treatments, but also for other pathologies that can be targeted non-invasively with highest precision comparable to surgical procedures.”

The MRI scanner will enable real-time, high-contrast imaging of organs in the thorax, abdomen and pelvis. Another advantage is that the scanner can be rotated around the patient relative to the proton beam. This will enable the team to study dosimetric and biological beam effects of MRI magnetic fields both perpendicular and parallel to the proton beam.

“MR-integrated proton therapy will have the capability to capture anatomical changes during therapy and allow for treatment adaptations to increase the targeting precision and reduce normal-tissue side effects,” explains Hoffmann. “The main benefit is expected for the treatment of tumours that show motion during irradiation, such as liver, pancreas, oesophagus, kidney, adrenal and cervical cancers.”

“Thanks to the collaboration with international industrial partners, my team and I are a big step closer to our goal of bringing significant innovation to the field, especially to real-time image-guided proton therapy,” he adds.

Nanoscale degradation of ferroelectric crystals observed for the first time

The first direct observation of the nanoscale degradation of a ferroelectric crystal has been made by researchers in Australia, China and the US. Qianwei Huang at the University of Sydney and colleagues used transmission electron microscopy (TEM) to discover how regions unresponsive to applied electric fields can build up at the domain walls of ferroelectric crystals, diminishing their performance. The discovery could lead to the design of nanoscale devices that are more resistant to the unwanted effects of ferroelectric degradation.

Ferroelectric materials have a spontaneous electric polarization, the direction of which can be reversed by applying an electric field. This useful property is widely exploited in electronic devices such as capacitors, sensors, actuators and memories. One important challenge facing device designers is that, after many cycles of electric field application, the ferroelectric response of a material can diminish steadily.

Known as ferroelectric degradation, this process can both reduce the reliability and shorten the lifespans of many electrical devices. Currently, it is widely believed that the effect is driven by build-ups of excess charge injected into ferroelectric materials by external electrodes. So far, however, the nanoscale mechanisms behind this unwanted phenomenon have remained poorly understood.

Diffraction patterns

Now, Huang’s team used an advanced form of TEM to acquire the diffraction patterns displayed by electron beams as they passed through thin sheets of ferroelectric crystal. The material they used contained alternating domains of perpendicular polarization directions, arranged in a striped pattern. For the first time, this setup allowed researchers to make real-time, nanoscale observations of evolving ferroelectric degradation, over successive exposures to electric fields – which they applied parallel to the plane of the sheet.

The team’s measurements revealed that charge distributions within the crystal gradually shifted during each cyclic application of an electric field. Over time, charges increasingly accumulated at the interfaces between the striped domains, from which a new domain developed and grew. Crucially, the polarization of this domain was no longer parallel to the crystal sheet, making the material less responsive to applied electric fields. This result was the first direct observation of ferroelectric degradation, and strengthens our understanding of how the process unfolds at the nanoscale.

Since ferroelectric degradation is one of the most significant factors responsible for shortening lifespans of electrical devices, the discovery could enable researchers to better understand device failure mechanisms. In turn, this knowledge could lead to the design of materials that are more resistant to these effects. If achieved, this could lead to nanoscale devices capable of operating over more successive cycles of electric loading, improving the efficiency of the many systems that depend on them.

The research is described in Nature Communications.

New technology for artisanal gold miners and the pros and cons of blockchain

In this episode of the Physics World Weekly podcast, we look at the science of mining precious commodities, both real and virtual. Our first guest is the geochemist Kevin Telmer of the Artisanal Gold Council. He explains how the Canada-based organization is developing and promoting technologies designed to improve the lives of subsistence gold miners, who are responsible for at least 20% of annual gold production worldwide.

Next up is Susanne Köhler who’s doing a PhD in the sustainability of blockchain technology at Aalborg University in Denmark. She explains how blockchains have a wide range of applications from cryptocurrencies like Bitcoin to the distribution of seeds for agriculture. Köhler also talks about the significant environmental impacts of Bitcoin mining.

Molecular compass tracks tiny forces

Scientists in China have devised what they describe as the molecular equivalent of a compass to measure the weak van der Waals interactions between atoms. They did so by using a new kind of electron microscopy to track the rotation of a single hydrocarbon molecule within a crystal. The scientists reckon their tiny sensor could provide new insights into molecular-scale processes such as catalysis and phase transitions.

Van der Waals forces arise from temporary fluctuations in the density of charges in neighbouring atoms or molecules. Although they are only effective over a limited distance, they are ubiquitous in nature and widely exploited in industry, playing key roles in fields from condensed matter physics to structural biology. However, measuring them directly usually requires sophisticated techniques best suited to the study of single atoms.
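The distance dependence of these interactions is commonly modelled with the Lennard-Jones potential, whose attractive r⁻⁶ term represents the fluctuation-induced van der Waals force. A minimal sketch in reduced units (the parameter values are generic placeholders, not numbers from this study):

```python
def lennard_jones(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones potential: 4*eps*[(sigma/r)**12 - (sigma/r)**6].

    The r**-6 term models the attractive van der Waals interaction;
    the r**-12 term models short-range repulsion between atoms.
    """
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 ** 2 - sr6)

# The potential minimum sits at r = 2**(1/6) * sigma, with depth -epsilon,
# which is why van der Waals forces only act over a limited distance:
# the attraction decays rapidly beyond a few sigma
r_min = 2.0 ** (1.0 / 6.0)
print(lennard_jones(r_min))  # approximately -1.0 (well depth of -epsilon)
```

The rapid r⁻⁶ decay of the attractive term is the quantitative reason these forces, although ubiquitous, are so hard to probe except at molecular separations.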

In the latest work, Fei Wei and Xiao Chen at Tsinghua University in China and colleagues turn to a snappily named technique called integrated differential phase contrast scanning transmission electron microscopy (iDPC-STEM). Like other forms of electron microscopy, iDPC-STEM exploits the short de Broglie wavelength of energetic electrons to create images at far higher resolution than is possible with light waves. However, it has a couple of important advantages over other types of electron microscopy.

Developed by researchers at Thermo Fisher Scientific in Eindhoven, Netherlands (including current group members Eric Bosch and Ivan Lazić), it uses integrated image data, which yields a higher signal-to-noise ratio for a given flux of electrons and thus allows smaller electron doses to be used. In addition, its image contrast scales roughly linearly with atomic number Z, rather than with Z², making it more suited to studying systems with both light and heavy elements.
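The resolution advantage mentioned above follows from the de Broglie relation. A sketch of the standard relativistically corrected formula for electron wavelength (the 300 kV accelerating voltage is chosen for illustration; the article does not state the operating voltage):

```python
import math

# Physical constants (SI units)
H = 6.626e-34     # Planck constant, J s
M_E = 9.109e-31   # electron rest mass, kg
E_CH = 1.602e-19  # elementary charge, C
C = 2.998e8       # speed of light, m/s

def electron_wavelength(accel_voltage):
    """Relativistically corrected de Broglie wavelength of an electron
    accelerated through accel_voltage volts:

        lambda = h / sqrt(2 m e V (1 + e V / (2 m c**2)))
    """
    ev = E_CH * accel_voltage  # kinetic energy gained, in joules
    momentum = math.sqrt(2.0 * M_E * ev * (1.0 + ev / (2.0 * M_E * C ** 2)))
    return H / momentum

# At 300 kV the wavelength is about 2 pm, far below typical
# atomic spacings (~100-300 pm), hence atomic-resolution imaging
print(electron_wavelength(300e3))  # ~1.97e-12 m
```

By contrast, visible light has wavelengths of hundreds of nanometres, some five orders of magnitude larger, which is why electron microscopes can resolve individual atoms while optical microscopes cannot.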

Single molecule in a crystal

The system in the Tsinghua team’s experiment is very sensitive to electron beams, and is made from single molecules of para-xylene (containing eight carbon and ten hydrogen atoms each) trapped within the voids of a zeolite crystal. (Zeolites are microporous minerals that occur naturally but are also produced industrially on a large scale.) The type of zeolite used in this research, known as ZSM-5, consists of rings of silicon and oxygen atoms that link up around large holes in a two-dimensional lattice sheet. When a few of these sheets are stacked on top of one another, the holes align, creating shallow channels through the structure. It was into these channels that Wei and colleagues placed their para-xylene molecules by mixing ZSM-5 powder and para-xylene liquid in a centrifuge.

Figure showing a traditional Chinese compass with schematic and STEM images of the molecular compass

To make their compass, Wei and colleagues took advantage of the fact that the perimeter of each hole consists of a ring of 10 silicon and 10 oxygen atoms interspersed at roughly equal intervals of 18 degrees. The idea was to use the para-xylene molecule inside each ring as a pointer. Because any shift in the molecule’s axis relative to the silicon and oxygen atoms (the “compass points”) would indicate changes in local van der Waals interactions, the researchers could determine the nature of these changes by imaging the molecule and observing the orientation of its long axis in the plane of the ring.

Tiny force changes

Wei and co-workers showed they could indeed use their compass to measure force changes in both space and time. They did this by comparing changes in the orientation of the para-xylene pointers, as seen in the images provided by iDPC-STEM, with variations in the shape of the (slightly elliptical) rings. They gauged these variations by using intensity measurements to establish the distance between pairs of atoms on opposite sides of the rings.

The researchers compared multiple rings and found that the pointers tend to line up along the major axis in each ellipse. They also found that the pointer in any given ring tends to move between compass points so that it stays aligned with the longest axis as the ring changes shape (as a result of increasing exposure to electrons).

To relate these responses to changes in van der Waals interactions, the researchers used first principles calculations to work out how the para-xylene molecules’ interaction energy ought to vary with ring geometry. They found that for each different ellipse, it was always energetically favourable for the molecule to line up along the longest axis – demonstrating, they say, that each molecule in its crystal void does indeed function as a “van der Waals compass”.

Challenges of interpretation

Chen argues that the work could have applications in optimizing zeolite-based catalysis – a process used, among other things, to convert alcohol into petrol. As she points out, ZSM-5 contains aluminium atoms, which can provide protons during the acid-base interactions vital to such catalysis.

Other experts have responded with a mixture of enthusiasm and caution. Shigeki Kawai of the National Institute for Materials Science in Tsukuba, Japan, says that while electron microscopy has already been used to image single molecules, no-one has previously anchored a single molecule inside zeolite pores. This capability, he reckons, might allow two different molecules to be aligned or even reacted with one another.

Bart Kooi of the University of Groningen in the Netherlands, meanwhile, praises Wei and colleagues’ “nicely prepared samples and state-of-the-art imaging”. However, he adds that using two-dimensional images to represent three-dimensional structures complicates the process of calculating the interaction energy. He also questions the extent to which the imaging electrons – even in low doses – might still affect the atomic structures being studied.

The research is published in Nature.

Crisis in a lockdown: how NIST coped with a radiation leak

On the morning of Wednesday 3 February 2021, the research reactor at the National Institute of Standards and Technology (NIST), just outside Washington, DC, was starting to come back online after routine maintenance. Shortly after 9 a.m., however, radiation monitors in the building detected above-normal background radioactivity, consisting of fission fragments from the ruptured cladding of a fuel element. The reactor automatically shut down.

NIST is one of the US’s leading metrology labs, located in Gaithersburg, Maryland, a city whose 60,000 or so inhabitants may not even know about the presence of a reactor in their midst. Although just half an hour’s drive from the White House, the NIST campus is in an ordinary residential area. However, the reactor’s long-term survival may depend on the local community – and its residents – being able to make a clear and accurate appraisal of its value and safety.


News of accidents at nuclear reactors, wherever they are in the world, often triggers uncertainty, fear and even panic. Whatever NIST did next to publicize the incident was therefore going to be highly risky for the lab, the reactor and the research community that depends on it. Commenting immediately, before much information was available, might make it appear as though the lab were in the dark. Holding off an announcement pending an investigation would invite accusations of a cover-up, and seem to open the possibility of worse news to follow.

Then and now

The Gaithersburg research reactor was built on farmland in 1967 by what was then known as the US National Bureau of Standards, which was renamed NIST in 1988. Now officially called the NIST Center for Neutron Research (NCNR), it’s used by some 3000 scientists each year and is still vitally important to the US neutron community. Indeed, the NCNR supports about half of all neutron-scattering work in the country. Any interruption in its operation, however brief, would be devastating for neutron research.

But how was NIST to handle the release of information about the incident? The messaging was critical. I immediately recalled how Brookhaven National Laboratory’s High Flux Beam Reactor was forced to shut down permanently in 1999 after tritium-containing water – which posed no environmental or health risk – leaked from its spent fuel pool. As it transpired, early in the evening on 3 February, NIST put a notice on its website, sent out the news on social media, and asked Gaithersburg city officials to help spread the word.

“We’ve just released a statement on today’s alert at the NIST Center for Neutron Research,” the lab tweeted that evening. “Health and safety of our staff and community is our top priority. No indications of elevated radiation levels outside the building.” This was followed a few minutes later by another tweet: “The ­public remains safe.”

Two days later, NIST posted a 500-word update on its website and via Twitter and Facebook, saying that members of staff who had been exposed to the radiation had been cleared to go home after showering. It also said that the Nuclear Regulatory Commission (NRC), which licenses and regulates the reactor, had declared the facility and the public safe.

The reactor, NIST reminded the public, had operated safely for 50 years and was a valuable national resource. Its notice also stressed the differences between the NCNR and power reactors, which typically run at more than 100 times the power of research reactors and are much different in size, scale, structure and operation.

A week after the incident, on 10 February, the lab held a virtual public town-hall meeting. Questions could be submitted by e-mail or via the meeting platform’s Q&A function (anonymously, if desired), and attendees’ voices and video were muted. Around 240 people attended, and NIST’s acting director Jim Olthoff answered all questions. The lab posted the questions and responses on an updateable web page.

A week later, on 18 February, Rob Dimeo, director of the NCNR, held a virtual town-hall meeting for the neutron-user community. It was attended by around 300 people concerned with how long the reactor would be down, and how its closure would affect their research. Here, too, attendees were muted and questions submitted via e-mail and the platform’s Q&A, as well as by chat.


Interestingly, the virtual format, mandated by the COVID-19 lockdown, meant that the meetings could be quickly arranged and draw possibly more people than might otherwise have attended. “The virtual environment allowed for a more measured interaction between NIST and members of the public,” said NRC public affairs officer Scott Burnell. “People were submitting questions via chat, not open mike, which kept the temperature of the room under control.”

But the format was risky, as accepting questions only in writing and muting participants ran the danger of leaving them feeling muzzled and patronized, spurring them to seek other forms of venting and information-spreading, which social media famously invites. So did NIST convey an accurate and transparent portrayal of the incident without condescension on the one hand or provoking unnecessary fears on the other? “It’s too early to draw conclusions,” Jennifer Huergo, NIST’s director of media relations, told me. “We did our best with the tools at hand.”

The critical point

Reactors are indispensable scientific instruments, which address national needs and would cost billions of dollars to replace, if indeed any government had the will to try. They are also lightning rods for political and social concerns. The future of neutron research depends not only on building and maintaining more neutron facilities, but also on understanding and addressing the concerns that they arouse. Paying attention to how episodes like the one at NIST play out is a first step.

2020 was Europe’s hottest year on record, finds report

Last year was the warmest on record for Europe, in part due to an exceptionally warm winter over the northeast of the continent. It was a year that saw wildfires rage in the Balkans and Eastern Europe, while Storm Alex brought record rainfall and led to above-average river discharge across much of western Europe. These are among the key points in the European State of the Climate Report released today.

Published each April, the report brings together data from national meteorological bodies and the European Union’s Copernicus climate change service. The previous year’s weather conditions in Europe and the Arctic are analysed in the context of global climate trends, based on data from satellites, ground stations and computer modelling. “The report confirms, among other things, 2020 as the warmest year, winter and autumn on record for Europe with temperatures in winter 3.4 °C above the average,” says the report’s lead author Freja Vamborg, a climate scientist at Copernicus.


Despite localized temporary reductions in air pollution, the COVID-19 pandemic appears to have had minimal impact on climate trends. Preliminary estimates from satellite data indicate that global atmospheric carbon dioxide levels increased by 0.6%, a slightly lower rate of increase than in recent years. Methane concentrations, meanwhile, increased by 0.8% – a higher rate than in recent years – which could be linked with melting permafrost in Siberia.

Perhaps the most alarming weather occurred in Arctic Siberia, where average temperatures were 4.3 °C above the 1981–2010 reference period, almost two degrees higher than the previous record. As a result, ice cover in the adjacent Arctic seas was at record lows for most of the summer and autumn. Regional snow cover was also reduced, which is likely to have further increased local warming with less solar energy being reflected off the white surfaces.

“The Arctic really saw quite a spectacular year,” says Vamborg. “We know that the Arctic is warming at a faster rate than the global average. But on top of this long-term trend, the Arctic is also a very variable environment.” Vamborg says the short-term fluctuations mean it is not inevitable that next year will be equally extreme.

‘Frightening’ findings

Europe’s climate report arrived in a busy week for climate developments. On Monday the World Meteorological Organization (WMO) released its State of the Global Climate 2020 report, stating that the average global temperature was 1.2 °C above pre-industrial levels. It was one of the three warmest years on record despite 2020 being a La Niña year when sea surface temperatures were cooler than average in the central and eastern tropical Pacific.

The WMO report estimates that 50 million people were doubly hit in 2020 by extreme weather and the COVID-19 pandemic. Evacuations, recovery and relief operations were affected by the pandemic, highlighting the need for a more integrated approach to managing climate hazards. The pandemic has also revealed cracks in global food security, as movement of goods was restricted and weather forecast services that support agriculture in the developing world were compromised.

The UN secretary general, António Guterres, described the WMO report as “frightening”. He called on governments to offer more ambitious emissions targets – referred to as nationally determined contributions – ahead of the UN climate summit in Glasgow in November (COP 26). Guterres stressed that co-operation between the US and China is vital to limiting global warming to well below 2 °C, the goal of the 2015 Paris climate accord.

China’s president Xi Jinping is among 40 world leaders invited by US president Joe Biden to attend a virtual summit on the climate crisis on 22–23 April. “We look forward to working with governments around the world to raise the level of global ambition to meet the climate challenge,” read a White House statement. Shortly ahead of the summit, the UK government announced that it would set in law a target to cut carbon emissions by 78% by 2035 compared to 1990 levels.

Super-resolution microscope sees deep inside the brains of living mice

Visualizing subcellular structures deep inside the brains of living animals could improve our understanding of how neurons function in their native environment. Thanks to an improved super-resolution microscopy technique developed at Yale University, that dream is now one step closer to reality.

Writing in Optica, the researchers combined stimulated emission depletion (STED) microscopy with two-photon excitation (2PE) to visualize, in three dimensions, the dendritic spines of live mice. These tiny, branch-like protrusions on nerve cells play a major role in neuron-to-neuron communication. Using their 3D-2PE-STED microscope, the researchers observed morphological changes in spines almost 100 μm deep in the brain.

The microscope marks the next generation of 3D-STED technology. Importantly, the technique could offer an in vivo view of several nanoscale structures buried deep within biological tissues.

“The ability to study cellular behaviour in this way is critical to gaining a comprehensive understanding of biological phenomena for biomedical research as well as for pharmaceutical development,” says lead investigator Joerg Bewersdorf.

Capturing the finer details

Designing a microscope that is compatible with in vivo imaging is far more challenging than designing one for cells cultured on a glass coverslip. Not only does the light have to navigate through thick, optically dense tissue, but any movement (such as breathing and heartbeat) can create motion artefacts in the image. The net result is poor resolution, particularly in the axial (z) direction.

The 3D-2PE-STED microscope developed by the Bewersdorf Laboratory addresses two key barriers to deep-tissue imaging: light scattering and optical aberration, in which refractive-index variations both within the tissue and between the tissue and the objective lens immersion medium create blurry, out-of-focus images. Incorporating 2PE – which uses near-infrared light to generate fluorescent signals in the region of interest – reduces light scattering. Meanwhile, adaptive optics and an optimized immersion lens cancel out optical aberrations from the tissue, allowing the light to focus properly.

Before testing their setup in vivo, the researchers benchmarked the super-resolution capabilities of their system against a standard 2PE microscope.

Image comparison

First, they took images of cells cultured on a coverslip. The 3D-2PE-STED system achieved lateral (xy) and axial resolutions of 70 and 151 nm, respectively – 4.2 and 6.5 times higher than the 2PE microscope. When the researchers tested thicker tissue samples, the resolving power of the 3D-2PE-STED system uncovered structural details within DNA that were lost in the 2PE image.
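As a quick consistency check on these figures (our arithmetic, not the paper's), the quoted improvement factors imply the resolution of the baseline 2PE microscope:

```python
# Back-of-envelope check of the quoted resolution figures.
# 3D-2PE-STED resolutions (nm) and the claimed improvement factors
# over the standard 2PE microscope.
sted_lateral_nm, sted_axial_nm = 70, 151
factor_lateral, factor_axial = 4.2, 6.5

# Implied resolutions of the baseline 2PE microscope
tpe_lateral_nm = sted_lateral_nm * factor_lateral  # ~294 nm lateral
tpe_axial_nm = sted_axial_nm * factor_axial        # ~982 nm axial

print(tpe_lateral_nm, tpe_axial_nm)
```

In other words, the STED depletion step shrinks a roughly 300 nm × 1 µm two-photon focal volume down to well below the diffraction limit in all three dimensions.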

The full potential of their microscope was then realized 76 μm deep inside the brain of a living mouse. The team successfully tracked the 3D structure of individual spines over three days.

“3D-2PE-STED now provides the means to observe changes [to dendritic spines], and to do so not only in the superficial layers of the brain, but also deeper inside, where more of the interesting connections happen,” explains first author Mary Grace Velasco.

Importantly, the researchers did not see any abnormal changes to the neurons or the mouse’s behaviour as a result of the imaging conditions. They believe that their microscopy technique could help uncover a vast number of structural and cellular relationships found deep within different tissues.

Physicists solve centuries-old brachistochrone problem for complex quantum operations

Complex quantum operations obey a “speed limit” of around 17 millimetres per second, say researchers led by physicists at the University of Bonn in Germany. The team obtained this result by working out how quickly they could transport an atom between two points while still preserving the information contained in its quantum state. The experiment, which required the team to transport the atom in a series of accelerating and decelerating sequences, could help optimize speed and accuracy in applications such as quantum sensing and quantum computing.

The concept of a minimum time or path to complete a certain process was put forward as long ago as 1696 by the Swiss mathematician Johann Bernoulli, whose famous brachistochrone problem involves determining the shape of the wire that allows a bead to slide down it in the least amount of time. While physicists know the solution to the brachistochrone problem for simple, two-level, quantum systems – that is, the fastest path connecting these two states – they run into problems for more complex, multilevel, quantum systems, like the ones on which most quantum computers are based.
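Bernoulli's answer to the classical problem is a cycloid, and the two-level quantum solution is its analogue. A minimal numerical sketch (ours, not from the paper) compares the descent time along a cycloid with that along a straight ramp between the same two points:

```python
import math

g = 9.81  # gravitational acceleration, m/s^2

# Brachistochrone: the bead follows a cycloid x = R(t - sin t), y = R(1 - cos t).
# Pick the endpoint at parameter t = pi with R = 1 m, i.e. (x1, y1) = (pi, 2).
R, t_end = 1.0, math.pi
x1 = R * (t_end - math.sin(t_end))
y1 = R * (1 - math.cos(t_end))

# Descent time along the cycloid has a closed form: T = t_end * sqrt(R / g).
t_cycloid = t_end * math.sqrt(R / g)

# Straight ramp between the same points: constant acceleration g*y1/L
# along a ramp of length L gives T = L * sqrt(2 / (g * y1)).
L = math.hypot(x1, y1)
t_straight = L * math.sqrt(2 / (g * y1))

print(f"cycloid: {t_cycloid:.3f} s, straight ramp: {t_straight:.3f} s")
```

The cycloid path is longer, yet faster, because the bead gains speed early in the descent – the same trade-off between path and speed that the quantum version of the problem must optimize.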

Conveyor belt of light

In the new work, Andrea Alberti at the University of Bonn’s Institute of Applied Physics and his colleagues began by making an optical trap using two overlapped (superimposed), counterpropagating laser beams. The superimposition (or interference) of the beams creates a lattice, or standing wave, of light that contains a sequence of crests and valleys that are initially static. The researchers then loaded an atom of caesium-133 into one of the valleys and used a microwave field to cool it to its lowest (ground) vibrational state. In this state, the atom can be described as a wave of matter that behaves not like a billiard ball but more like a liquid, Alberti says, and it oscillates with the smallest amplitude possible.
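The lattice geometry is easy to reproduce. In a minimal sketch – assuming an illustrative wavelength of 1 µm, not necessarily the experiment's – two counterpropagating beams give a time-averaged intensity proportional to cos²(kx + φ/2), with valleys spaced half a wavelength apart; sweeping the relative phase φ translates the whole lattice:

```python
import math

wavelength = 1.0e-6           # illustrative laser wavelength, m (assumed)
k = 2 * math.pi / wavelength  # wavenumber of each beam

def intensity(x, phase=0.0):
    """Time-averaged intensity of two counterpropagating beams.

    The relative phase between the beams shifts the whole standing
    wave, which is how the lattice can be set in motion.
    """
    return math.cos(k * x + phase / 2) ** 2

# Adjacent intensity minima (the trapping valleys) sit half a wavelength apart.
valley_spacing = wavelength / 2
print(valley_spacing)  # 5e-07 m, i.e. half a micron
```

Ramping the phase smoothly in time is what turns the static lattice into the moving “conveyor belt” described below.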

To transport this liquid-like atom, the researchers set the standing wave in motion, which displaces the position of the valley so that it moves as far as its nearest neighbour. They could vary the speed of this motion and track it on the sub-nanometre scale using a fast polarization synthesizer. This set-up gave them the flexibility they needed to transport the atom at both constant and varying speeds.

The goal of this experiment was to transport the atom wavepacket over a distance 15 times its size (a total of roughly 0.5 microns) in the shortest possible time, without the atom “spilling out” of the valley. Under these conditions, the atom arrives at its destination in the same energetic state in which it began. Conserving this state, which is fragile because it is sensitive to disturbances in its environment, means that any information stored in this state is not lost – a prerequisite for quantum computing.

Transport fidelity

To test their technique, Alberti and colleagues measured the atom’s transport fidelity – a parameter that characterizes how similar the initial and final states of the atom are. They found that fidelity improved when the atom was transported with varying sequences of acceleration and deceleration. This variation, or “wiggling”, effectively cancelled out the effects of transitions to the several intermediate states the atom had to pass through as it travelled.
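As a toy illustration of what fidelity measures – a sketch with assumed numbers, not the team's actual analysis – consider a Gaussian wavepacket that arrives displaced by a small distance d from its target. The fidelity is the squared overlap of the two states, exp(−d²/4σ²), which a direct numerical integral reproduces:

```python
import math

sigma = 1.0  # wavepacket width (std of |psi|^2), arbitrary units
d = 0.8      # displacement of the final state from the target (assumed)

def psi(x, centre=0.0):
    """Gaussian ground-state wavefunction whose |psi|^2 has std sigma."""
    norm = (2 * math.pi * sigma ** 2) ** -0.25
    return norm * math.exp(-((x - centre) ** 2) / (4 * sigma ** 2))

# Overlap <psi_target|psi_final> by a simple Riemann sum over [-20, 20].
dx, n = 1e-3, 40000
overlap = sum(psi(-20 + i * dx) * psi(-20 + i * dx, d) * dx for i in range(n))

# Fidelity is the squared overlap; analytically exp(-d^2 / (4 sigma^2)).
fidelity = overlap ** 2
analytic = math.exp(-d ** 2 / (4 * sigma ** 2))
print(round(fidelity, 4), round(analytic, 4))
```

Any residual displacement or sloshing left over from the transport therefore shows up directly as a fidelity below one.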

Such wiggling, which ensures that the atom is in the ground state when it reaches its final destination, is not needed in simple two-level systems in which the start and end positions of the atom are very close together. In this case, explains Alberti, the matter waves of the atom at the two locations overlap and the atom can be transported to its destination in one go – that is, without having to pass through any intermediate states.

He likens the wiggling to a waiter carrying an entire tray of half-full champagne glasses to a table at top speed without spilling a drop. In his analogy, the tray is the optical trap and the champagne the caesium atoms. When the waiter walks quickly, he slightly tilts the tray so that the champagne does not spill out of the glasses. And when he slows down again, as he approaches the table, he tilts the tray in the opposite direction. “Only when he has come to a complete stop does he hold it upright again,” Alberti explains.

“Anyone who wants to transport atoms from one position to another must be as skilful as the waiter, but even then, there is a speed limit that this transport cannot exceed,” he says. Indeed, the researchers calculated that the transport fidelity was best when the average speed of the transported atom was below around 17 millimetres per second.
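A quick back-of-envelope calculation (ours, not the paper's) shows what that limit means in practice: moving the wavepacket the roughly 0.5 micron quoted above at 17 mm/s takes about 30 µs per transport step:

```python
distance = 0.5e-6    # transport distance from the experiment, m
speed_limit = 17e-3  # quoted speed limit, m/s

transport_time = distance / speed_limit
print(f"{transport_time * 1e6:.1f} microseconds")  # ~29.4 microseconds
```

That timescale, compared against the qubits' coherence time, is what sets how many such operations a quantum processor could fit in before the stored information degrades.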

The work, which is detailed in Physical Review X, has implications for quantum computing, Alberti says. Because quantum states are fragile, they last only a very short time (known as their coherence time). That makes it important to pack as many computational operations as possible into that short interval. “Our study reveals the maximum number of operations we can perform in this time and thus make optimal use of it,” Alberti concludes.

Copyright © 2026 by IOP Publishing Ltd and individual contributors