Superconducting nanowires could shed light on dark matter

Superconducting nanowires could be used as both targets and sensors for the direct detection of dark matter, physicists in Israel and the US have shown. Using a prototype nanowire detector, Yonit Hochberg at the Hebrew University of Jerusalem and colleagues demonstrated the possibility of detecting dark matter particles with masses below about 1 GeV/c², while maintaining very low levels of noise. The team says it has already used its prototype to set “meaningful bounds” on interactions between electrons and dark matter.

While dark matter appears to make up about 85% of the matter in the universe, it has not been detected directly – despite the best efforts of physicists working on numerous detectors worldwide. So far, the search has been dominated by efforts to detect weakly-interacting massive particles (WIMPs) – hypothetical dark-matter particles that could be streaming through Earth in very large numbers. WIMP detectors are designed to look for particles with masses greater than 1 GeV/c², and are not expected to be sensitive to lighter particles.

To extend the search to lower masses, physicists have used several different sensor technologies made from materials including graphene, polar crystals, and superfluid helium. Superconducting nanowires are already used to detect single photons, and Hochberg and colleagues at the Massachusetts Institute of Technology and the National Institute of Standards and Technology believe that nanowires should join the hunt for dark matter. If a dark matter particle collides with an electron in a cold, current-carrying superconducting nanowire, the nanowire could heat up and briefly cease to be a superconductor. The resulting spike in the nanowire’s resistance would reveal that a dark matter interaction has taken place.
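
In essence, the readout is a pulse-counting scheme: the biased wire carries its current with no resistance until a hotspot forms, so any excursion of the readout signal above a small threshold can be logged as a candidate event. The short Python sketch below illustrates that discrimination step on a simulated trace; the pulse shape, noise level and threshold are arbitrary illustrative values, not parameters of the team’s prototype.

```python
import numpy as np

# Toy model of a nanowire readout trace (all values are illustrative only)
rng = np.random.default_rng(0)
n_samples = 10_000
trace = rng.normal(0.0, 0.01, n_samples)     # baseline readout noise (arbitrary units)

def add_pulse(trace, start, amplitude=1.0, tau=30.0):
    """Add a hotspot-like pulse: a sharp rise followed by an exponential recovery."""
    t = np.arange(len(trace) - start)
    trace[start:] += amplitude * np.exp(-t / tau)
    return trace

for start in (2500, 7200):                   # two injected "events"
    trace = add_pulse(trace, start)

# Threshold discrimination: an event is a rising edge through the threshold
threshold = 0.1                              # set well above the noise floor
above = trace > threshold
events = np.flatnonzero(above[1:] & ~above[:-1]) + 1
print(f"{len(events)} candidate events at samples {events.tolist()}")
```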

Low noise

The physicists tested their proposal by building a tungsten-silicide nanowire prototype, which had a detection energy threshold of 0.8 eV. During 2.8 h of operation, the detector registered no unwanted background counts, which demonstrates a very low level of intrinsic noise.

The team says the technique has several advantages over other detectors, including ultra-fast detection speeds and very low levels of noise. In addition, the wires could potentially pick up dark matter particles with kinetic energies below 1 eV, which is extremely low for a dark-matter detector, and could also detect “dark photons” with energies less than 1 eV. Dark photons are hypothetical particles that could mediate interactions between dark-matter particles.

The team says its early experiments have already placed meaningful bounds on the interaction between dark matter and electrons, including the strongest terrestrial bounds on the absorption of sub-electronvolt dark photons.

Hochberg and colleagues now hope to fabricate nanowires on larger scales and with even lower detection thresholds. When coupled with other detection techniques, they believe their nanowires will allow them to probe for dark matter in previously unexplored regions of mass and energy.

The research is described in a preprint on arXiv.

Emerging medical technologies, pentaquarks and borrowing heat from the Earth

In the latest episode of the Physics World Weekly podcast, Hamish Johnston is talking about the new pentaquark – an exotic hadron comprising five quarks – that’s been discovered by physicists working on the LHCb experiment at CERN.

Later in the show, Anna Demming discusses her highlights from the recent event at the UK House of Parliament called “What next for digital healthcare technologies?” You can hear interviews with a number of guests including Paul Drayson, the former Labour politician and current chief executive officer of Sensyne Health.

To close the podcast, James Dacey presents clips from his recent interview with Neil Lawson, a geoengineer involved in designing ground-source heat pumps. They discuss the underlying principles of these systems and the current outlook for the technology in the UK.

If you enjoy what you hear you can subscribe via Apple podcasts, or your chosen podcast app.

Technology innovations underpin condensed-matter research

At the beginning of April thousands of physicists from all over the world will gather in Regensburg, Germany, for one of four spring meetings of the German Physical Society (DPG). The focus for this particular meeting will be condensed-matter physics, with technical sessions covering everything from DNA nanostructures through to quantum systems and 2D materials.

Alongside the scientific programme, more than 100 companies will be showcasing the latest equipment for condensed-matter research. A few highlights are featured below.

Femtosecond fibre lasers target time-resolved microscopy and spectroscopy

A series of femtosecond fibre-laser systems from TOPTICA offers the flexibility needed to support advanced applications such as femtosecond pump-probe spectroscopy, nonlinear microscopy, and terahertz spectroscopy. The key to this flexibility is the ability to connect different laser amplifiers to a common master oscillator, providing a modular approach to designing standard configurations as well as highly customized systems for specific applications.

The FemtoFiber product line from TOPTICA

An example of this approach is the FemtoFiber Quantum Microscopy system – based on a combination of the established FemtoFiber ultra and FemtoFiber pro series with an Asynchronous Optical Sampling (ASOPS) system from Laser Quantum – which is designed for time-resolved Faraday-rotation measurements of coherent spin dynamics in semiconductor nanostructures. The combination of the fibre-laser technology with sophisticated electronics for ASOPS provides maximum flexibility in terms of laser parameters, pump-probe configurations and data-acquisition times.

You can find out more about TOPTICA’s FemtoFiber product line at Booth #2 (LH Wirtschaft/Recht).

Compact power supply drives magnetron sources

PREVAC, a designer and manufacturer of complete research systems for material deposition and analysis, will be demonstrating a compact switch-mode DC power supply for driving magnetron sputter sources. The M600DC-PS power supply delivers 600 W as standard, extendable to 1200/1800/2400 W with additional modules, and it can easily be switched between up to three magnetron sources.

The M600DC-PS power supply from PREVAC

All settings can be manually adjusted via a large touchscreen display, and the unit can also be controlled remotely using a variety of analogue or digital interfaces. Settings can be stored and recalled automatically when the unit is switched on, while it also features a built-in timer and automatic standby. During operation the power supply measures the thickness of the deposited layer, the rate of evaporation and the vacuum level inside the deposition chamber.

You can find out more about PREVAC and its full product line at Booth #81 (Audimax – Foyer).

Discharge protection for low-temperature experiments

An innovative measurement system from Oxford Instruments has been designed to protect sensitive samples from being damaged by electromagnetic discharges. The SampleProtect system allows researchers to monitor or ground individual experiment lines via a signal access box, while samples can easily be changed in standard chip carriers or sample holders – providing discharge protection even when the sample is being moved.

The SampleProtect switching unit

The end-to-end system comprises a rack-mounted switching unit that is linked with measurement-grade cables and sample probes to sample holders that include an additional socket for an equipotential plug. Each probe can accommodate multiple types of sample holders, and sample holders can be transferred between probes or even to different ultralow-temperature inserts. This ensures that the system can be used across a wide range of temperatures.

Representatives from Oxford Instruments will be available to discuss the SampleProtect system at Booth #83 (Audimax – Foyer).

Precise positioning at cryogenic temperatures

Cryogenic positioners from SmarAct allow samples in a cryostat to be manipulated with high precision in all conditions, ranging from atmospheric pressure through to ultrahigh vacuum. The company has developed stick-slip piezo actuators with low heat profiles, allowing sample positioning down to the millikelvin regime. Resistive cabling is optionally available to reduce the heat load on the positioning stages, while complete systems can be customized to specific applications.

A three-axis cryogenic positioner from SmarAct

The positioners can also be used at temperatures of up to 330 K, just like the company’s standard UHV stages, and they are bakeable at temperatures of up to 150 °C. Non-magnetic versions are available for use in high magnetic fields, while a linear design is offered for use in tight spaces. More recently, rotation stages have been introduced to combine linear and rotational sample manipulation at cryogenic temperatures.

SmarAct will be showcasing its full range of positioning equipment at Booth #51 (Audimax – Foyer).

Desktop computers can now simulate cardiac activity

Modelling the complex electrical waves that cause heart arrhythmias could be key to understanding and treating these abnormal heart rhythms. Until now, however, real-time modelling of cardiac dynamics within millions of interacting heart cells required access to powerful computer clusters and supercomputers, putting it out of reach for most physicians.

To make cardiac modelling more accessible, a team of US researchers has used graphics processing chips and software that runs on standard web browsers to move high-performance cardiac dynamics simulations onto less costly computers, and even high-end smartphones. This advance could enable clinicians to use 3D modelling data to design specific therapies or prevention strategies for their patients, and help scientists study a particular drug’s effect on heart arrhythmias (Science Advances 10.1126/sciadv.aav6019).

“Being able to do real-time simulations in three dimensions could open the door to clinical applications where we could actually obtain patient geometries and solve these equations in the cells that are packed into the heart,” says Elizabeth Cherry from Rochester Institute of Technology. “We could see applications in the clinic that could individualize treatments on the basis of their specific heart geometries. We could actually test possible therapies to see what would work for each patient.”

Key to this development is the use of graphics processing units (GPUs), which were originally developed to help computers display graphics and video for gaming applications. High-end smartphones can have up to 900 GPU cores, while high-end graphics cards for laptop or desktop computers may contain more than 5000.

“Over the past several years, GPUs have become really powerful,” explains Flavio Fenton from Georgia Institute of Technology. “Each one has multiple processors, so you can run problems in parallel like a supercomputer does. As many as 40 or 50 differential equations must be calculated for each [heart] cell, and we need to understand how millions of cells interact.”
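
At heart these simulations are reaction–diffusion problems in which every cell’s equations are advanced independently at each time step, which is why they map so naturally onto thousands of GPU cores. The sketch below is not the team’s WebGL library; it is a minimal NumPy stand-in that evolves a generic two-variable excitable-medium model (far simpler than the 40–50-equation cardiac models mentioned above) on a 2D grid, with the vectorized update playing the role of the per-cell parallelism and all parameter values chosen purely for illustration.

```python
import numpy as np

# Minimal two-variable excitable-medium model (FitzHugh-Nagumo type) on a 2D grid,
# advanced with explicit Euler time stepping. Every grid point ("cell") is updated
# independently, which is the step a GPU would execute in parallel.
nx = ny = 200
u = np.zeros((ny, nx))            # excitation variable (membrane-voltage-like)
v = np.zeros((ny, nx))            # recovery variable
u[:, :5] = 1.0                    # stimulate the left edge to launch a wave

D, dt, dx = 1.0, 0.02, 0.5        # diffusion coefficient, time step, grid spacing
a, b, eps = 0.1, 0.5, 0.01        # reaction parameters (illustrative values)

def laplacian(f):
    """Five-point Laplacian with no-flux (zero-gradient) boundaries."""
    fp = np.pad(f, 1, mode="edge")
    return (fp[:-2, 1:-1] + fp[2:, 1:-1] + fp[1:-1, :-2] + fp[1:-1, 2:] - 4 * f) / dx**2

for step in range(2000):
    du = D * laplacian(u) + u * (1 - u) * (u - a) - v
    dv = eps * (b * u - v)
    u += dt * du
    v += dt * dv

print("fraction of tissue currently excited:", float((u > 0.5).mean()))
```

In a WebGL implementation the same per-point update would be written once as a shader and executed for every grid point simultaneously on the graphics card, which is what makes browser-based, real-time simulations feasible.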

To allow the simulations to run on any GPU, Georgia Tech’s Abouzar Kaboudian developed a versatile programming library that enabled the team to develop programs in WebGL that can run through common web browsers.

“If you have access to the Internet and a modern web browser like Firefox or Chrome, you can just go to a web link and the simulation will start running on the graphics card of your computer,” says Kaboudian. “Any problem that can be parallelized can run on the library that we have created. It will accelerate simulations on any computer by several hundred times.”

The researchers have developed ten different models based on their WebGL programming, and are planning to make the tools available for other researchers to use. Future enhancements will include the ability to run simulations on more than one GPU card to achieve even higher computational speeds.

“Models that might have been accessible to only a handful of researchers in the world will now be available to many more groups,” says Fenton.

Introducing a tiny, wireless, battery-free tissue oxygen sensor

Researchers in the US have designed a completely implantable, wireless, battery-free oxygen sensor to monitor tissue oxygen levels when implanted subdermally or even in deep brain regions. With these features, the newly fabricated oximeter supports in vivo tissue oxygen monitoring in awake and free-moving animals such as mice (Science Advances 10.1126/sciadv.aaw0873).

Oxygen levels in different regions of tissue represent the balance between oxygen demand and supply. Imbalance and abnormalities in tissue oxygen levels are of relevance to various physiological or pathological processes, such as neural activity, tissue perfusion, the tumour microenvironment and wound healing. Therefore, determining tissue oxygenation is of significant importance.

To ascertain regional tissue oxygen levels, existing methods either measure oxygen partial pressure or assess changes in the concentration of oxygenated haemoglobin. However, most approaches developed to date interfere with the natural behaviours of the test subjects (for example, by requiring physical tethers or anaesthetics). Unfortunately, these limitations can alter oxygenation levels and lead to inaccurate oxygen measurements. In addition, depth of operation in several tissues has remained a challenge for many existing oximeters.

Design and working principle

To overcome these restrictions and challenges, the researchers — led by John Rogers at Northwestern University — designed a thin, fully implantable wireless oximeter. Their design contains an injectable filamentary measurement probe connected to an electronic module. The filamentary probe performs the optoelectronic measurements, while the electronic module supports wireless data communication.

The sensing probe exploits differences in the optical properties of oxygenated haemoglobin (HbO2) and deoxygenated haemoglobin (Hb) to infer local changes in their concentrations. This quantification is then used to estimate regional tissue oxygen saturation (rStO2) levels.

The estimation of rStO2 levels depends on the absorption spectra of both oxygenated and deoxygenated haemoglobin in the visible and near-infrared spectral range. At high oxygen concentrations, the ratio of HbO2/Hb increases and at low oxygen concentrations (hypoxia), it tends to decrease.

Sensitivity to oxygenation of haemoglobin is manifested in the differences between the molar extinction coefficients of HbO2 and Hb. The measurable optical properties, such as light attenuation by haemoglobin, then define the rStO2 as a function of HbO2 and Hb concentrations.
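
In the generic formulation used in tissue oximetry (quoted here for orientation rather than as the exact calibration applied in this device), the saturation is simply the oxygenated fraction of the total haemoglobin, and the two concentrations are extracted from the attenuation measured at two or more wavelengths through a modified Beer–Lambert relation:

$$\mathrm{rStO_2} = \frac{C_{\mathrm{HbO_2}}}{C_{\mathrm{HbO_2}} + C_{\mathrm{Hb}}}, \qquad A(\lambda) = \bigl[\varepsilon_{\mathrm{HbO_2}}(\lambda)\,C_{\mathrm{HbO_2}} + \varepsilon_{\mathrm{Hb}}(\lambda)\,C_{\mathrm{Hb}}\bigr]\,L(\lambda) + G(\lambda)$$

where the ε are the molar extinction coefficients, L is the effective optical path length and G accounts for scattering losses.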

The device operates wirelessly, harvesting power via magnetic resonant coupling and transmitting data over an infrared link. To ensure stable operation in chronic implants, and to shield the devices from biofluids, the researchers encased the device in a bioinert conformal coating of parylene.

Device characterization

To assess the device’s functionality, the researchers tested their system in vivo, in deep brain regions of both anaesthetized and free-moving mice, as well as in artificial blood solutions. They observed that the assembled optoelectronic platforms demonstrated continuous, sensitive and localized rStO2 sensing at regions of interest.

The fabrication concepts and electronic designs could also enable implantable platforms with other functionalities, such as heart rate tracking. “Other extended options include the integration of the oximeter probes with other functional modules for optogenetic modulation or microfluidic drug delivery,” the authors speculate. “These multimodal systems with colocalization of stimuli and oxygenation detection could support unique capabilities in coupling the metabolism of specific tissue regions with external physiological or pathological challenges.”

Are circular economies the answer?

Throughout the 20th century the prevailing assumption was that the solution to pollution is economic growth. The trajectories followed by many developed nations – from widespread poverty through heavily-polluting industrialization to clean technology and good standards of living – support this simplistic relationship. But recent decades have revealed that matters aren’t this straightforward. Now a study has reviewed the links between pollution and economic development and investigated ways to transition towards a more sustainable economy.

Between 1970 and 2006, GDP in the US, adjusted for inflation, grew by 195%. The number of cars and trucks in the country doubled and the total number of miles driven grew by 178%. In theory air pollution should have skyrocketed. Technological innovations and new regulations, however, led to significant decreases in emissions of carbon monoxide (by 37%), nitrogen oxides (30%), sulphur dioxide (52%), particulates (80%) and lead (98%).

This is a classic example of the inverted-U-shaped environmental Kuznets curve. It suggests that economic development initially leads to environmental degradation but once average income reaches a certain point, the curve undergoes a turning point and the environment improves as wages rise further.
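
In econometric studies the curve is often captured by a simple reduced-form regression (a textbook specification, not necessarily the one used in the models reviewed here):

$$\ln P = \alpha + \beta_1 \ln Y - \beta_2 (\ln Y)^2, \qquad \beta_1, \beta_2 > 0,$$

where P is a pollution measure and Y is per-capita income. Pollution then peaks at the turning-point income $Y^\ast = \exp(\beta_1 / 2\beta_2)$ and declines as income rises beyond it.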

The environmental Kuznets curve is a compelling hypothesis but there are lots of exceptions to the rule. Take deforestation. On paper many developed countries have their tree-felling under control and back to sustainable levels. More often the reality is that imports increase, exporting the deforestation elsewhere.

To understand the connections between the environment and economic development in more depth, Saleem Ali from the University of Delaware, US, and Jose Puppim de Oliveira from the São Paulo and Brazilian School of Public and Business Administration, Brazil, reviewed six fundamental models.

Recent studies have revealed flaws in the environmental Kuznets curve. Rather than being shaped like an upside-down U, it more commonly has an upward flick at the end that represents the renewed rise of pollution once the easiest pollution challenges have been tackled.

“The key take-home message is that there is no generic environmental Kuznets curve which can be used for policy-making,” says Ali, “and therefore pollution policy is better determined by monitoring and enforcement of standards that are based on the environmental and social impact of pollution.”

Sometimes technology enables countries to “tunnel through” the environmental Kuznets curve, avoiding the environmental degradation associated with the first phase of economic development. The paper industry, for example, traditionally relied on mercury-cell electrolysis but now tends to use a membrane-cell process. As a result, many developing countries can avoid the burden of mercury pollution associated with mass paper production.

If we really want to achieve sustainability, the research suggests we should strive for a “circular economy”.

“A circular economy requires us to mimic biological cycles in terms of reusing waste materials from industrial processes,” says Ali. “It is compatible with economic growth but focuses on reused, remanufactured and recycled material rather than virgin materials.”

Such an economy would need policies that make waste more economically useful and a redesign of manufacturing processes. Finland has taken a leading role in driving its economy towards a circular structure but there are potential negative consequences. In particular, as systems become more resource-efficient, prices start to drop, which can cause a “rebound effect” as people consume more.

“The increased consumption could lead to environmental concerns such as increased greenhouse gas emissions,” says Ali.

There are ways to improve human well-being, achieve sustainable development goals and have a healthy economy, but the research to date suggests that keeping a circular economy on track needs constant evaluation and refinement.

Ali and Puppim de Oliveira reported their review in Environmental Research Letters (ERL).

LHCb bags another pentaquark

A new pentaquark – an exotic hadron comprising five quarks – has been discovered by physicists working on the LHCb experiment at CERN. LHCb scientists have also found that a feature in their data that had previously been associated with one pentaquark could be evidence for two pentaquarks with similar masses.

Preliminary analysis of the three pentaquarks suggests that they have a molecular structure that resembles a meson bound to a baryon. Gaining a better understanding of how pentaquarks are bound together could provide important insights into the strong force and quantum chromodynamics.

Hadrons are heavy particles that are made of two or more quarks held together by the strong force. Until the early 2000s, physicists had concrete evidence for only two types of hadron: baryons (such as protons and neutrons), which contain three quarks, and mesons, which contain a quark and an antiquark.

Not surprising

Since then, physicists have discovered tetraquarks containing four quarks and pentaquarks containing five. This has not come as a complete surprise because when Murray Gell-Mann first proposed the quark model in 1964, he realized that quark–antiquark pairs could be added to mesons and baryons to create heavier particles.

The first tetraquark was discovered formally (with a statistical significance greater than 5σ) at Japan’s Belle experiment in 2008. The first two pentaquarks – called Pc(4450)+ and Pc(4380)+ – were discovered in 2015 at LHCb using proton–proton collision data from Run 1 of the Large Hadron Collider (LHC). The four-digit number refers to the mass of the pentaquark in MeV/c², which means that these pentaquarks are more than four times heavier than the proton.
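
As a quick check, taking the proton mass to be roughly 938 MeV/c²:

$$\frac{m[P_c(4380)^+]}{m_p} \approx \frac{4380}{938} \approx 4.7, \qquad \frac{m[P_c(4450)^+]}{m_p} \approx \frac{4450}{938} \approx 4.7$$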

Using new data from Run 2 of the LHC, physicists have discovered a third pentaquark called Pc(4312)+, which they have observed at a statistical significance of 7.3σ. What is more, they also have 5.4σ evidence that the mass peak in the Run 1 data associated with Pc(4450)+ is actually two peaks. They believe these correspond to two different pentaquarks, which they have dubbed Pc(4440)+ and Pc(4457)+.

LHCb team member Tim Gershon of the University of Warwick told Physics World that combining data from Run 1 and Run 2 means that the pentaquark peaks are now much better resolved than in previous studies. Gershon and colleagues found that the three peaks are narrow, which means that the pentaquark particles enjoy relatively long lifetimes before they decay.

Long lifetimes suggest that these pentaquarks resemble molecules that comprise a baryon and a meson bound together by the residual strong force – which is the force that binds neutrons and protons together in a nucleus. The mass of the Pc(4312)+, for example, is just below the combined mass of a Σc+ baryon and a neutral D meson. Such a configuration is expected to be relatively stable and therefore correspond to a narrow peak.
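
Using approximate values for the constituent masses (roughly 2453 MeV/c² for the Σc+ and 1865 MeV/c² for the neutral D meson, quoted here for illustration), the baryon–meson threshold sits just above the new state:

$$m(\Sigma_c^+) + m(\bar{D}^0) \approx 2453 + 1865 = 4318~\mathrm{MeV}/c^2,$$

so in this molecular picture the Pc(4312)+ would be bound by only a few MeV/c².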

Gershon says that the LHCb team is currently doing a much more sophisticated analysis of the collision data, which should reveal the spin and parity of the pentaquarks. This would provide crucial information about the internal structures of the pentaquarks.

The recent discoveries were described in a talk by Syracuse University’s Tomasz Skwarnicki at the Rencontres de Moriond conference in Italy.

Global energy challenges: is there room for growth?

Many global energy scenarios see renewables expanding rapidly, with projections from IRENA ranging past 50-60% by 2040 on the way to 80% or more of global electricity by 2050. Some even look to 100% by then. However, others offer slower expansion projections. One energy company thinks renewables will supply about 30% of global electricity by 2040, another 45%. Even so, a 50% share of electricity supply by around 2050 now seems an unexceptional global aim. Some countries can do much better than that. Several countries are already at well over 50% and, with generation costs falling, many others should be able to follow their lead. It may be worth noting that, according to an OECD/NEA report, between 2008 and 2015 renewable energy deployment “caused an electricity market price reduction of 24% in Germany and of 35% in Sweden”.

While attaining high electricity contributions seems relatively straightforward, many countries have found it harder to meet heat and transport needs directly from renewables. Instead the main approach has been to try to use electricity for electric vehicles while, in some cases, the plan is to install electric heat pumps for domestic heating. However, rates of growth of renewable electricity have fallen in some countries. For example, the demise of Feed-In Tariffs across Europe has slowed deployment rates and China has throttled back on its very rapid PV expansion to reduce subsidy costs. The result of this, along with continued growth in energy demand in most countries, for transport use especially, is that emissions have risen — the switch to electric vehicles has not as yet had much impact. Although the use of coal for power production is being phased out in many countries, it is still expanding in some.

The situation isn’t entirely gloomy. Demand for electricity has fallen in some countries, notably the UK, but with demand for oil (for transport) and coal (for power) still growing globally, the prospects are not too good and the IPCC is painting ever-worsening pictures of what horrors climate change may bring if we do not act soon. In response, while some call for much more attention to be given to energy efficiency and to much more rapid expansion of renewables, including non-electrical renewables, others suggest that urgent attention should also be given to carbon capture technology, including carbon negative initiatives, as well as to nuclear power and even to planetary geoengineering.

Can growth continue for ever?

The call to consider what some may see as desperate, expensive and possibly dangerous technical measures heightens the underlying, more fundamental debate over the role of economic growth. Can it really be sustained on a finite planet, even given clever new technologies? The debate has become rather polarized. Bloomberg’s Michael Liebreich has put the essentially technological/market-fix view that growth is vital for humanity. Pushing an ecological view, Tim Jackson from the University of Surrey, UK, says that it is not; in fact, he argues, it is lethal for the planet.

The Post Carbon Institute in the US has been struggling with this issue for some time, looking to stable state economics. It’s no longer a fringe issue. Leading US journal Foreign Policy has asked whether economic growth can continue, even so-called “green growth”. Certainly, a deep green view is that on its own, green energy is little use unless growth is tamed. But growth is collapsing as global economies falter. Tim Jackson says that economic stagnation is likely, a social and economic disaster given the way the world is run at present, but a boon for almost all – and the planet – if we consciously designed a sustainable stable state economy.

The debate continues, with a core issue being whether what’s sometimes called eco-modernism can really enable sustainable growth. Technology may be able to help but possibly the real issue is whether we can change our consumption patterns and expectations.

Although some look to major social and economic changes as vital, some of the changes needed may not have to be that radical — for example, changes in diet away from so much meat-eating would help a lot. And that has begun to appear in plans and scenarios. For some that might be perceived as an appalling infringement on individual freedom, almost as bad as the parallel recommendation that population growth should be reduced. And as for the idea that we should fly less, well that’s heresy! Dare I add to the sense of injury by noting that carbon emissions from the energy supplies needed to run IT systems, including mobile devices and servers, are now said to be comparable with those from aircraft. I can envisage laptop and smartphone users talking of resistance to prising these devices from their “cold dead hands”.

Hope for the future

Are things quite so desperate? Some global political trends are not good. For example, it’s commonplace to worry about the climate impacts of Trump’s policies. Yet US emissions are falling and renewables are still booming there, as in most places, although the export of coal by the US (and gas from Russia) will undermine global emission reduction. The shift to electric vehicles may help to reduce carbon emissions, as long as green power is used, even if that shift won’t reduce congestion or other social/eco impacts. While much of Europe seems to be in thrall to populist urges, there are nevertheless some progressive renewable energy programmes, as I noted in my last two posts. A lot more is needed and can be achieved if we do not get sidetracked into panicky measures. Cutting demand across the board would help too, making it easier for renewables to supply the reduced amount needed.

Interestingly, in that context, Mark Jacobson of Stanford University, US, has released a revised chart from his upcoming new book 100% Clean, Renewable Energy and Storage for Everything, updating his earlier data, with global energy demand falling by 57.9% by 2050 due to fuel substitution and energy efficiency upgrades, and with renewables then meeting the residual demand in all sectors. With similarly positive news, IRENA has backed a study by the Global Commission on the Geopolitics of Energy Transformation that includes a brave new global energy scenario, based on Shell data, with renewables accelerating exponentially to almost totally eclipse fossil fuels by 2100. In terms of winners and losers, the Commission said that “no country has put itself in a better position to become the world’s renewable energy superpower than China” and it warned that countries reliant on oil exports might lose out. That may also be true for those still backing coal — see my next post. Certainly there is a need for change, as a new and quite grim report from the World Economic Forum argues: while some progress has been made, few countries are ready for the transition, and the report calls for “swift action”.

Antimicrobial coating kills multi-resistant pathogens on the ISS

A new antimicrobial coating made of silver and ruthenium can kill multi-resistant pathogenic bugs. The substance, dubbed AGXX®, has been tested on contamination-prone surfaces inside the International Space Station, which is an extreme, closed, hostile habitat where bacteria develop particular defence mechanisms against antibiotics and detergents.

“On the ISS, bacteria develop a thicker cell wall, for example, or highly express virulence genes,” explains Elisabeth Grohmann of Beuth University of Applied Sciences Berlin, who led this research study. “But despite the harsh conditions and these defence mechanisms the AGXX® remains active.”

The microorganisms on a spaceship come from humans themselves – the crew and the helpers who prepared the mission. On Earth, these bacteria are generally harmless, but microgravity and cosmic radiation can increase their virulence and transform them into potential pathogens. These conditions also lower the immune defences of the astronauts, which, when coupled with the psychological stress associated with spaceflight, makes them much more prone to infection.

The bacteria humans carry in fact become hardier, says Grohmann. They develop thicker, protective cell walls and resistance to antibiotics, becoming more vigorous, multiplying and metabolizing faster. Unfortunately, that is not all: the genes responsible for this newfound resilience can readily be shared among different bacterial species as they come into direct contact with each other or via the increasing amounts of biofilm they produce.

Micro-galvanic silver and ruthenium

The new antimicrobial coating is made of micro-galvanic silver and ruthenium, conditioned with ascorbic acid. It can be coated onto any kind of surface, including steel, plastics and wood, and can be incorporated in bead/powder form in creams and lacquers, explains Grohmann.

Another of its advantages is that it is only slightly cytotoxic (it has been declared as a medical product). “No bug-resistance has been detected so far, and this is likely due to the reaction mechanism by which it damages biomolecules, such as DNA, proteins and lipids. These reactions occur via reactive oxygen species (ROS) that penetrate biological cells rather than via released silver ions, which would trigger silver resistance.”

Effective against a variety of Gram-negative and Gram-positive bacteria

The researchers tested the AGXX® on both Gram-negative and Gram-positive bacterial strains, including MRSA, Enterococcus faecalis, Staphylococcus epidermidis, pathogenic E. coli (ESTEC), Pseudomonas aeruginosa, Acinetobacter baumannii and Legionella. It was active against all of these strains, but to varying extents.

The effects of AGXX® are similar to bleach, except that it is self-regenerating so it never gets used up, says Grohmann. “As well as all kinds of bacteria, it also inhibits the growth of certain fungi, yeasts and viruses. And after six months of exposure on the ISS, no bacteria were recovered from AGXX®-coated surfaces.”

After 12 and 19 months, the researchers say they recovered a total of just 12 bacteria. This is 80% fewer than were recovered from bare steel, which was used as a control. A conventional silver antimicrobial coating that was also tested reduced the number of bacteria by only 30% compared with steel.

Contact effect

Since the coating works through contact with bacteria, however, its effectiveness can decrease over longer periods of time, says Grohmann. The antimicrobial test materials are static surfaces, on which dead cells, dust particles and cell debris can accumulate, interfering with the direct interaction between the coating and bacteria, she explains. The isolates obtained from such surfaces after 19 months are able to form immune-evading biofilms and are resistant to at least three antibiotics, including sulfamethoxazole, erythromycin and ampicillin. They are also able to share the genes responsible for resistance between them.

“However, by simply rinsing off the dead cells and dust particles with water, the efficiency of AGXX® can be fully recovered, at least on Earth where this was done successfully,” insists Grohmann.

“Immunosuppression, bacterial virulence, and therefore infection, increase with spaceflight duration, so we must continue to develop new approaches to combat bacterial infections if we are to attempt longer missions in the future – to Mars and beyond.

“AGXX® shows promise because it is already used on Earth in cooling towers to keep water free from contaminants and in water tanks in vans. It is also being tested as an antimicrobial coating in urine catheters and wound dressings, and as an anti-fouling agent on ship hulls.”

Improved filter systems and antimicrobials

The researchers, reporting their work in Frontiers in Microbiology, are now busy developing a prototype water filter system with Uwe Landau’s team at Largentec GmbH Berlin and Rainer Haag’s group at Free University Berlin. This filter consists of AGXX® and functionalized graphene oxides (GOX) and it should be more durable and longer lasting than existing filters. A similar system is planned for air filters (for air-conditioning).

“We are also testing AGXX® and GOX as antimicrobials in a four-month isolation project (SIRIUS habitat, IBMP Moscow) co-funded by ESA and NASA,” Grohmann tells Physics World. “This project started on 19 March.

“The next steps in our work will be to see if the coating materials can inhibit the germination process of the most resistant life-modes of bacteria – the so-called endospores,” she adds. “In addition, we are looking into the molecular stress response of the bacteria that do survive on the AGXX® coating to understand why they survived and what makes them that resistant. We will try and further improve the materials based on the findings of this study.”

Nine nanoparticles for multicolour electron microscopy

Lanthanide nanoparticles

Electron microscopy is unique in its ability to provide high-resolution cellular imaging, but it does not give information about the location of specific proteins in a cell. Labelling with electron-dense particles like gold enables visualization of proteins, but is limited to one type of protein at a time. To overcome this limitation, a group of researchers in the US have presented nine nanoparticles with distinct colours that might allow multicolour electron microscopy in the future (Nature Nanotechnol. 10.1038/s41565-019-0395-0).

Maxim Prigozhin, Peter Maurer and colleagues made nanoparticles from NaGdF4 or NaYF4 and spiked them with one of nine lanthanide ions: Eu3+, Er3+, Ho3+, Tb3+, Sm3+, Dy3+, Nd3+, Tm3+ and Yb3+. These rare-earth elements determine the colour of the nanoparticle via a phenomenon called cathodoluminescence. The resulting nanoparticles represent the first ever coloured labels for electron microscopy.

Stanford researchers

What is cathodoluminescence?

Cathodoluminescence is a phenomenon in which electrons impacting on a material, in this case the lanthanide-spiked nanoparticles, cause it to emit light at material-specific wavelengths. In this study, the electron beam of the electron microscope was used to excite the nanoparticles. The resulting light-emission spectra of the nine types of nanoparticle were distinct, making them a potential stepping stone towards multicolour imaging.

The researchers also tested three additional lanthanides, which did not yield sharp spectra like the other nine elements. A number of lanthanides remain to be tested and might yield further colours to expand the repertoire. To create even more colours, the researchers are also thinking about co-doping nanoparticles with multiple lanthanides.

The resulting nanoparticles had diameters of less than 20 nm, comparable to the quantum dots, gold nanoparticles and immunoglobulin antibodies typically used to label proteins in electron microscopy. The big advantage of the new nanoparticles is that they come in nine different colours, such that co-labelling of several proteins might be possible. At the same time, cathodoluminescence electron microscopy images of the new nanoparticles showed that this method is suitable for nanoscale imaging.

The main problem that the researchers encountered was variability between experiments in the level of the coloured signal emitted from the nanoparticles. However, they are confident that this will be manageable with further optimization.

Future potential

The team, led by Nobel laureate Steven Chu, outlines many ways in which the method could be optimized. These include further reducing the nanoparticle size, optimizing the nanoparticle surface, and making use of the long-lived excited states of the lanthanides to distinguish signal from noise by time-gating the measurements. Combined, these approaches might allow the design of even smaller labels for biological multicolour electron microscopy. Another way to improve the signal-to-noise ratio might be to reduce the energy of the electron beam, thereby matching the excitation volume to the size of the nanoparticle.

In the future, the researchers envision that instead of scanning the whole sample for cathodoluminescence, imaging might be accelerated by identifying the nanoparticles through conventional electron microscopy and then only using cathodoluminescence to identify the colour of each nanoparticle.
