
Dark-matter detector result is consistent with previous hint of exotic particles

New data from the PandaX-II particle detector in China leave open the possibility that the XENON1T experiment in Italy has found evidence of new physics. In June 2020 researchers working on XENON1T announced the detection of around 50 events above background levels and concluded that hypothetical solar axions or neutrinos with an unexpectedly large magnetic moment might be responsible. The new results from PandaX-II are consistent with these hypotheses, but further work will be needed to settle the issue.

XENON1T was built to hunt for a type of dark matter known as weakly interacting massive particles (WIMPs). Housed under a mountain at Italy’s Gran Sasso National Laboratory, it contained 3.5 tonnes of liquid xenon and operated between 2016 and 2018. Like other experiments of its type, it was designed to pick up the tiny flashes of light generated when WIMPs in the “halo” of dark matter thought to envelop the Milky Way collide with xenon nuclei.

The events reported in 2020 involved electron, rather than nuclear, recoils. Elena Aprile of Columbia University in the US and colleagues reported 53±15 such recoils at low energy that they could not tie to other identifiable sources of background (these events themselves being considered noise in the search for WIMPs). Careful not to claim any discovery, they instead laid out several possible explanations for the observation.

Two novelties

These explanations included two novelties associated with particles arriving from the Sun – either hypothetical particles known as axions (postulated originally to fix a problem with the strong nuclear force) or neutrinos with a greater magnetic moment than previously observed. Another possibility, they said, was “bosonic dark matter”, which would be absorbed, rather than scattered, by the xenon nuclei and cause electrons to be emitted.

However, as Aprile and colleagues pointed out, the events could also have had a more mundane explanation – the beta decay of tritium nuclei. This would come about when the few neutrons liberated from surrounding rock by cosmic rays create tritium by splitting xenon nuclei. Unlike other background processes, this one remains a nuisance because its contribution cannot be estimated reliably.

Aprile and colleagues calculated that the tritium could account for the excess events with a statistical significance of 3.2σ – compared to 3.4σ, 3.2σ and 3.0σ for solar axions, neutrino magnetism and bosonic dark matter, respectively.

Dimmer white dwarfs

Despite their cautious presentation, these results caught the attention of both the public and fellow physicists. For example, theorists put forward several ways to overcome one obvious sticking point with the Sun-based hypotheses – that the flux of the particles involved would make white dwarf stars dimmer than they appear.

In the latest work, Jianglai Liu of Shanghai Jiao Tong University and colleagues did an independent experimental check on the XENON1T results using the PandaX-II detector in the China Jinping Underground Laboratory in Sichuan, south-western China. Although PandaX-II contains just over half a tonne of xenon, the researchers ran their experiment for longer and acquired nearly half as much data as their XENON1T counterparts.

The Chinese group had the advantage of being able to better characterize their background spectra, thanks to direct measurement or calibration. In part, this was done by twice injecting methane with one of its hydrogen atoms replaced by tritium into the target. With the two injections carried out three years apart, they say they were able to measure the energy spectrum of the tritium contamination within the experiment.

By in effect subtracting the background spectra of tritium, krypton and radon, the group was able to quantify any signals from putative solar axions or a high neutrino magnetic moment – the two theoretical possibilities that Liu says the researchers used as a “benchmark” in their work. As they report in Chinese Physics Letters, they found that the remaining electron recoils were in fact consistent with the excess events seen by XENON1T. However, they stopped short of endorsing the earlier result because, they say, their data were also consistent with a “background-only hypothesis”.

Detector upgrades

To try to establish whether some new physical process really has been observed, the Chinese researchers are increasing their detector mass to 6 tonnes – meaning a sensitive target of 4 tonnes – while lowering background rates. The upgraded detector is called PandaX-4T and should start taking data this year. Also coming online are the upgraded 8.3-tonne XENONnT as well as the 10-tonne LUX-ZEPLIN detector currently being installed at the Sanford Underground Research Facility in South Dakota, US.

According to Liu, the new measurements should yield a verdict soon. “A year of low background data taking from PandaX-4T would be able to offer a definitive answer to the XENON1T excess,” he says, although he adds that it remains to be seen just how low they can make the background.

One group already has an explanation for the XENON1T excess – and it does not rely on exotic new physics. Matthew Szydagis, Cecilia Levy and colleagues at the State University of New York at Albany used what is known as the noble element simulation technique to model background interactions within the Gran Sasso detector and found that around 30 decays of the isotope argon-37 would generate the observed excess.

Levy says that their hypothesis could be investigated by carrying out a thorough calibration of the XENON detector, adding that her group does not know where the argon might come from. Beyond that, she agrees that the observed excess should be scrutinized by the new round of larger experiments. “If it is due to a new particle, it should predictably scale with the more massive detectors,” she says, “and a signal should be clear.”

United Arab Emirates’ Hope probe enters Martian orbit

The $200m Emirates Mars Mission successfully arrived in Martian orbit today, concluding its seven-month journey to the red planet. The arrival of the United Arab Emirates’ (UAE) probe – named Hope – marks the beginning of the science stage in the first interplanetary voyage undertaken by an Arab nation.

At 16:13 GMT, mission control at Dubai’s Mohammed bin Rashid Space Centre (MBRSC) received a signal, relayed via NASA’s Deep Space Network, confirming that the car-sized spacecraft had entered a stable orbit. That followed a nail-biting half hour as Hope fired its Delta-V thrusters, slowing from over 121,000 km/h to approximately 18,000 km/h so that it could be captured by Mars’ gravity. The 1500 kg craft will now undergo further manoeuvres and testing during the coming weeks before it begins returning science data from the Martian atmosphere.
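Taking the quoted speeds at face value, the scale of that slow-down is easy to check with a couple of lines of arithmetic (a rough sketch using the figures above, nothing more):

```python
# Rough check of Hope's capture burn, using the speeds quoted in the article.
v_before_kmh = 121_000  # approach speed, km/h
v_after_kmh = 18_000    # speed after the burn, km/h

KMH_TO_MS = 1000 / 3600  # convert km/h to m/s

delta_v_ms = (v_before_kmh - v_after_kmh) * KMH_TO_MS
print(f"speed shed during the burn: {delta_v_ms / 1000:.1f} km/s")
```

That works out to roughly 28.6 km/s shed over the half-hour burn.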

“We congratulate our leadership and our people of all nationalities in the UAE,” said Sarah Al Amiri, chair of the UAE Space Agency via Twitter. “The science team has a lot of work to do, and we are all confident that they will make new, great, and tremendous discoveries about the red planet.”

Hope is the first of three separate missions arriving at Mars this month and will be swiftly followed by China’s Tianwen-1 spacecraft, which arrives in orbit on 10 February ahead of landing a rover in May. Then, on 18 February NASA’s Perseverance rover will land and begin searching for signs of ancient life in a Martian crater that was once flooded with water.

Diagram of the Hope probe's journey to Mars

Weather watcher

Hope’s main scientific objective is to study daily and seasonal weather changes, as well as observing how hydrogen and oxygen are lost into space. These data could help us better understand how Mars turned into the dusty, barren planet we see today. Hope carries three main instruments: two spectrometers – one operating in the infrared and the other in the ultraviolet – and an imager that will study the lower atmosphere at visible and ultraviolet wavelengths.

Sending a mission to Mars was a bold statement from the UAE, an Arabian state with a population of 9.8 million that gained independence from the UK in 1971. Today’s achievement comes less than two years after Hazza al-Mansouri became the first Emirati in space when he spent eight days on the International Space Station.

The UAE could have easily just purchased a spacecraft to go to Mars, but they had a goal to build it, not buy it

Brett Landin

Having established a national space agency in 2014, the UAE quickly built up its space capacity by collaborating with established space nations. Launched from Japan’s Tanegashima Space Center in July, the Hope spacecraft was built in partnership with the Laboratory for Atmospheric and Space Physics (LASP) at the University of Colorado, Boulder. The mission team comprises 200 staff from MBRSC and 150 from LASP, supported by an international science team and roughly 100 partners.

“The UAE could have easily just purchased a spacecraft to go to Mars, but they had a goal to build it, not buy it,” says LASP engineer Brett Landin, who leads the mission’s spacecraft team. “I think the most fascinating part of this mission has been watching a nation decide to institute a meaningful change and then actually make it happen in a very short period of time.”

Building a ‘knowledge economy’

It may still be early days, but the UAE has grand ambitions for space exploration. Hope is just one project in the nation’s “Mars 2117” programme, which has the ultimate goal of establishing the first human settlement on Mars within the next century. Other key projects are to send an unmanned rover, dubbed Rashid, to the Moon in 2024 and to build a “Mars Science City” in the desert outside of Dubai – a research facility that will eventually host analogue “missions to Mars”.

By investing heavily in space exploration, the UAE hopes to kickstart its science and engineering sectors, to diversify its economy away from oil. Since the Hope mission was first mooted in 2014, mission leaders have spoken regularly about how it can foster interest in science among students. Under broader plans outlined in 2017, the UAE set the target that “knowledge workers” would make up 40% of its total workforce by the end of 2021.

“The UAE’s Mars Mission is a clear reflection of the UAE’s vision and ambition,” says Sanam Vakil, a Middle East researcher at Chatham House, an independent policy institute in the UK. “The project is designed to promote the knowledge-based economy while also inspiring Emiratis and attracting other regional nationals to the Emirati economic model.”

Laser-based autofocus unit transforms imaging and workflow outcomes

The PureFocus850 is a laser-based autofocus system that delivers enhanced imaging outcomes, sustained workflow efficiencies and reductions in capital spend across a range of applications in materials science, biological microscopy and industrial inspection. That’s the claim of Prior Scientific, a UK-based manufacturer specializing in the design and production of precision positioning devices, optical systems, automation solutions and microscope components.

The integrated autofocus unit – which comprises an infrared laser diode, precision optics, detector and on-board microcontroller – provides a real-time focusing capability for infinity-corrected optical systems and is suitable for both upright and inverted microscopes. “The PureFocus850 allows powerful autofocus functionality to be installed on the latest commercial microscopes or retrofitted as an upgrade to established imaging systems,” explains Simon Bush, sales engineer (UK and Ireland) at Prior Scientific. “The product is equally at home when integrated with a reflected-light optical system in a production-line context or when incorporated into an OEM imaging platform.”

In terms of operational specifics, the PureFocus850’s motorized offset lens adjusts the imaging depth into the sample, continuously holding the precise distance between imaging focal point and a reference boundary of choice. This capability can be put to work in all manner of imaging applications: from simply scanning across a biological sample on a slide without the need to manually refocus – yielding significant efficiencies for the user – to high-resolution tile-scanning or time-lapse imaging, where focus stability and accuracy determine the success of an experiment or sample analysis.

The commercial roadmap

Right now, Prior Scientific is busy developing several discrete customer segments for the PureFocus850. Within the research community, for example, the hardware autofocus offers scientists a cost-effective upgrade path for their existing microscope facilities, rather than recurring capital spend on new optical systems. A case study in this regard is the Nanofabrication Laboratory at Chalmers University of Technology in Gothenburg, Sweden, which required a retrofittable autofocus solution that would work with a variety of samples and objectives while ensuring compatibility with brightfield and darkfield imaging.


For a research facility like Chalmers, where a variety of experiments are run on the same microscope, other advantages include one-time installation and the ability of the PureFocus850 software to specify and store a range of autofocus settings (for example, laser power, recovery speed, focus stability and focus confirmation parameters). “We released an expanded range of inverted microscope kits at the end of 2020,” adds Bush, “and our aim is to cater for researchers who either require an out-of-the-box autofocus solution or one with an accessible SDK [software development kit] that can be used to develop a novel imaging system.”

Beyond the laboratory, the PureFocus850 provides a versatile option for specialist OEMs developing next-generation microscopy and imaging systems. There are several features – including extra-long-range recapture and interface detection – that are only accessible to OEM customers in a fully automated context, while the motorized offset has the potential to allow dynamic offset recalculation. The latter is paramount for ensuring focus stability over long periods when imaging different layers in multilayer samples, such as during laser scanning confocal microscopy (LSCM) or fluorescence-lifetime imaging microscopy (FLIM).

“The scope for modification is extensive,” notes Bush. “Customers can engage with our R&D team to change the optics, offset mechanics and laser wavelengths – for example, to support long-wavelength imaging such as two-photon microscopy or Raman spectroscopy. The autofocus also integrates with OpenStand, our instrument development platform for building OEM solutions and one-off customizations.”

Industrial inspection

Out in industry, meanwhile, the PureFocus850 has generated significant interest among customers engaged in low- to medium-throughput materials analysis and inspection. A case in point is Top-Electech, a China-based electronics supplier, which last year integrated the hardware autofocus into an existing microscope to fast-track the inspection and analysis of its PCB components – delivering a 95% reduction in processing time for multisample component arrays.

The PureFocus850 provides a neat fit for this application because it’s not tied to any particular software and can be used as a standalone microscope add-on or integrated into custom protocols via Prior Scientific’s SDK. What’s more, the technical challenges are non-trivial for the Top-Electech quality-assurance team, with electronic components first mounted in resin and then filed down to allow imaging of their internal structure – a process that creates a subtly uneven surface comprising materials with variable reflectance and contrast.

“Traditional software or hardware autofocus systems may struggle to maintain focus on this variable surface, but the PureFocus850 averages the signal reflected by the sample across the microscope’s field of view,” notes Bush. “This ‘line-mode’ capability allows a consistent, reliable signal to be obtained while scanning across each sample, even where parts of the field of view are non-reflective, ensuring the sample is constantly in focus without user intervention.”

Another issue is variation in the amount of light reflected by the sample depending on the magnification of the objective. As such, it helps that Top-Electech engineers can store the optimal laser power for each objective on the microscope to ensure seamless switching between high and low magnification while keeping the sample in focus. By also storing a sample detection threshold for each objective, the engineers are able to avoid large, unnecessary refocusing steps when moving between samples. This protective feature allows the loading of multiple samples onto the microscope simultaneously and the ability to image in sequence without the risk of refocusing onto areas of the microscope stage that do not contain a sample. “In this way,” notes Bush, “the Top-Electech team is able to undertake batch processing rather than loading and imaging samples sequentially.”

Elsewhere, STMicroelectronics, the multinational electronics and semiconductor manufacturer, has combined the PureFocus850 with image recognition software to automate the analysis of silicon-carbide wafers at its manufacturing facility in Sweden. The firm’s engineers use defect-selective etching to assess wafer quality – a process that creates etch pits on the silicon-carbide wafer surface, with the frequency, morphology and distribution of these pits linked to the type and location of potential defects within a sample. This uneven surface would typically cause problems for laser autofocus systems, but the line-mode configuration of the PureFocus850 uses a weighted average of the reflected signal to find the optimal focal plane.

In this use case, sample interrogation focuses on the bottom of the etch pits, such that the imaging plane of interest differs from the main reflective surface of the silicon-carbide wafer. “The analysis is enabled by the motorized optics of the PureFocus850, which allows the imaging plane to be offset from the wafer surface,” explains Bush. Equally important for STMicroelectronics is the efficient acquisition of brightfield and darkfield images, with the user interface allowing engineers to optimize the autofocus settings for each imaging technique based on the intensity of illumination and to easily switch between them (rather than having to find a compromise between the two).

“All told,” concludes Bush, “the PureFocus850 enables STMicroelectronics to acquire higher-quality tile-scans of their silicon-carbide wafers while delivering a massive improvement in throughput – typically an 85% reduction in the wafer scanning time.”

Breath holds protect the heart during proton therapy for breast cancer

Radiation therapy plays an integral role in the management of breast cancer. Following breast-conserving surgery, in which the tumour is removed while leaving as much healthy breast tissue as possible, irradiation of the whole breast is a standard follow-up procedure. In some cases, irradiation of regional lymph nodes and/or a boost to the tumour cavity are also performed.

As survival rates improve for patients with early breast cancer, it’s important to consider potential long-term complications for those undergoing such radiation treatments. In particular, irradiation of the left breast is challenging due to the possibility of delivering dose to the heart and the subsequent risk of long-term cardiac complications.

One way to reduce this risk is via the deep inspiration breath-hold (DIBH) technique, which physically separates cardiac structures from the target volume and helps reduce cardiac dose. Proton therapy can also be employed to help reduce cardiac dose, as protons target the tumour with high conformality while delivering almost zero dose to distal structures.

Researchers from the Rutgers Cancer Institute of New Jersey propose that combining the two techniques could reduce cardiac dose further. As previous studies of proton therapy with DIBH are scarce, they have compared photon DIBH with proton DIBH in 10 patients, reporting their findings in the International Journal of Particle Therapy.

Dose comparisons

The study included 10 patients with left-sided breast cancer who received radiation as part of breast-conserving therapy. All underwent lumpectomy, nine also had sentinel lymph node biopsy and one underwent axillary lymph node dissection. All patients then received photon DIBH, with two parallel-opposed beams used to irradiate the whole breast. Most patients also received a boost dose to the tumour cavity and, where required, nodal irradiation.

The researchers also created treatment plans for double-scatter proton therapy with DIBH, using clinical target volumes adjusted to match those of the delivered photon plans. Using each patient’s photon plan as a control for their proton plan, they investigated the doses to a number of cardiac subunits, including the entire heart, left ventricle (LV), left coronary artery (LCA), right coronary artery (RCA), left circumflex coronary artery (LCx) and left anterior descending coronary artery (LAD).

Both plan types provided adequate target coverage, but proton DIBH significantly reduced doses to cardiac structures compared with photon DIBH. This included reductions in: the mean dose to the heart (1.19 to 0.23 Gy); the mean dose to the LV (1.7 to 0.25 Gy); the mean, maximum and half-maximum doses to the LAD (5.54 to 1.15 Gy, 22.15 to 7.7 Gy, 4.42 to 1.61 Gy); and the maximum dose to the LCx (1.35 to 0.13 Gy).
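Expressed as percentages, the sparing implied by those paired figures is substantial. A short sketch using the article's numbers (the dict grouping is just for illustration):

```python
# Photon-DIBH vs proton-DIBH doses (Gy) quoted in the article, and the
# percentage reduction each pair implies.
doses_gy = {
    "heart (mean)": (1.19, 0.23),
    "left ventricle (mean)": (1.70, 0.25),
    "LAD (mean)": (5.54, 1.15),
    "LAD (max)": (22.15, 7.70),
    "LCx (max)": (1.35, 0.13),
}

for structure, (photon, proton) in doses_gy.items():
    cut = 100 * (photon - proton) / photon
    print(f"{structure}: {cut:.0f}% lower with protons")
```

The mean heart dose, for instance, falls by roughly 80% with protons.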

The team point out that the lower mean doses to the entire heart, LV and LAD could lead to an estimated reduction in long-term cardiac mortality of at least 7%. The mean doses to the LCx, LCA and RCA were already low for photon DIBH, thus the clinical significance of further dose reduction to these structures with protons is unknown.

Added lung protection

Radiation pneumonitis (lung inflammation arising from irradiation) is another area of concern when planning breast radiotherapy. The researchers observed that proton DIBH significantly lowered the dose to the left lung compared with photon DIBH. The mean left lung dose was reduced from 8.04 to 2.28 Gy, while volumes receiving 20 Gy and 5 Gy were reduced by 13% and 17%, respectively. Clinically, this lower lung dose from proton therapy may have long-term benefits.

The study also revealed that proton therapy reduced the maximum dose to the right breast, although the result was not statistically significant. The researchers note that skin dose was slightly higher with proton than photon therapy. While the difference was not statistically significant, they suggest caution in this regard.

“Proton DIBH significantly reduces dose to vital organs-at-risk in comparison to photon DIBH in patients requiring whole-breast radiation and/or nodal irradiation,” the researchers conclude. “This may be the new standard-of-care in the future because of its significant long-term clinical benefits.”

They point out, however, that understanding the clinical implications of this dosimetric advantage requires a randomized study with long-term follow-up. In addition, the expected toxicity profile from such proton treatment is difficult to anticipate because of the lack of adequate long-term clinical experience. “We hope that the ongoing RADCOMP study will be able to answer such questions in the coming few years,” they write.

Next-generation planetary missions could hunt for gravitational waves, say astronomers

Spacecraft heading to Uranus and Neptune in the next decade could be used to investigate gravitational waves as they venture into the outer solar system. That is according to a new study by a team of Swiss and Danish researchers, who say that examination of the radio signals from far-flung probes might reveal the signature of these subtle ripples in the fabric of space–time as they roll across our planetary neighbourhood (arXiv: 2101.11975).

The scientists say that gravitational waves would make themselves known through a Doppler shift in the transmissions from distant spacecraft. “When a gravitational wave passes through, it can slightly disturb the radio link by shifting its frequency. We can detect the small deviations in the carrier frequency we receive from the spacecraft and deduce that a gravitational wave has passed,” explains Deniz Soyuer from the University of Zurich, who led the work.
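To get a feel for how small those deviations are, note that a passing wave of strain h shifts the carrier frequency by a fraction of order h. In this sketch both numbers are assumptions chosen for illustration (a typical X-band carrier and a round-number strain), not values from the study:

```python
# Order-of-magnitude sketch: a gravitational wave of strain h imprints a
# fractional frequency shift of roughly h on the spacecraft's radio carrier.
f_carrier_hz = 8.4e9  # X-band deep-space downlink (assumed typical value)
strain_h = 1e-15      # illustrative strain amplitude (assumed)

delta_f_hz = strain_h * f_carrier_hz
print(f"carrier shift: {delta_f_hz * 1e6:.2f} microhertz")
```

A shift of a few microhertz on a multi-gigahertz carrier – which is why instrumental noise matters so much.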

Similar gravitational-wave hunts have been attempted – without success – by previous missions. In principle, it is even something that could have been tried with NASA’s New Horizons spacecraft, which is currently traversing the remote region of our solar system known as the Kuiper belt. What a future generation of outer planet explorers would have on their side, though, is time.

Overlapping observations

Proposed missions to Uranus and Neptune – which planetary scientists hope might launch around 2030 – would take many years to reach their targets, meaning they would have several opportunities to carry out searches for the elusive undulations. “There is one-and-a-half to two months of ideal time in a year to do these kinds of observations, when the Earth–Sun–spacecraft angle becomes favourable,” explains Soyuer. “So a 10-year cruise time would yield a total of 10 one-and-a-half-month-long observations.”

The technique would also not require any dedicated on-board equipment to be fitted to the probes. “All missions already have Doppler tracking instruments on them, since that is how you track the spacecraft and also measure the planets’ gravitational fields,” says Soyuer. “[It] sounds easy in principle, but the changes in frequency that we want to detect are extremely small.”

Another technical challenge is the noise in the data. Among the main contributors to this, adds Soyuer, is “the mechanical noise of the antenna” listening back on Earth. If those hurdles can be overcome with advances in technology, the missions could potentially detect the gravitational waves given off by bodies – such as stellar-mass black holes – whirling around gargantuan black holes, a phenomenon astronomers call “extreme mass-ratio inspirals”. They may also be able to pick up the space–time ripples emanating from colliding supermassive black holes.

If detections do occur, they could provide a “nice overlap” with observations by Europe’s upcoming gravitational-wave mission, LISA, says Laura Nuttall from the University of Portsmouth, who is a gravitational-wave expert and a member of the LIGO scientific collaboration. “[These missions] are more likely to see different events than LIGO/Virgo are sensitive to as they are probing a different part of the gravitational-wave spectrum,” she says. “So just like LISA, [they] would complement LIGO/Virgo.”

Bond distance of rare element einsteinium is measured

New insights into the physical and chemical properties of the rare heavy element einsteinium have been gained by researchers working at several labs across the US. The team, led by chemist Rebecca Abergel at Lawrence Berkeley National Laboratory in California, used cutting-edge approaches in both synthesis and analysis to overcome several significant setbacks in studying the element. Their results shed light on the poorly understood properties of the heaviest elements and could help scientists synthesize new, even heavier elements.

The periodic table displays the elements in a systematic way that provides great insights into their chemical properties. However, this appears to break down for the heaviest elements, which can behave in unexpected ways given their positions in the table. Understanding the chemistry of these elements is exceedingly difficult because they can only be synthesized in extremely small quantities and have short half-lives.

With an atomic number of 99, einsteinium is in the same actinide row of the periodic table as uranium. A metal, it is currently the heaviest element that can be produced in quantities large enough to carry out classical chemistry experiments.

Tiny sample

In their study, Abergel’s team used a variety of advanced techniques to learn more about the isotope einsteinium-254. This is the most stable form of the element, with a half-life of 276 days. First, they synthesized a 250 ng sample using the High Flux Isotope Reactor at Oak Ridge National Laboratory in Tennessee, bombarding curium targets with neutrons to trigger specific radioactive decay chains.

Unfortunately, the researchers encountered several setbacks in their initial analysis. They discovered californium (element 98) contaminants in their sample, which meant that they could not perform planned X-ray crystallography studies. Furthermore, delays related to the COVID-19 pandemic meant that they were losing their sample due to radioactive decay. To overcome these problems, Abergel and colleagues bonded their einsteinium atoms to groups of organic molecules called ligands, which acted as luminescent antennae. They placed their sample in a specialized holder, which was 3D printed at Los Alamos National Laboratory in New Mexico. With this setup, they could analyse the sample using X-ray absorption spectroscopy, carried out at the Stanford Synchrotron Radiation Lightsource.
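The decay losses are simple to quantify. With a 276-day half-life and a 250 ng starting mass (both from the article), a hypothetical six-month delay costs over a third of the sample (the delay length here is an assumed example):

```python
HALF_LIFE_DAYS = 276   # einsteinium-254 half-life quoted in the article
INITIAL_MASS_NG = 250  # mass of the synthesized sample

def remaining_ng(elapsed_days: float) -> float:
    """Mass (ng) left after exponential decay for elapsed_days."""
    return INITIAL_MASS_NG * 2 ** (-elapsed_days / HALF_LIFE_DAYS)

# Hypothetical six-month (183-day) delay:
print(f"{remaining_ng(183):.0f} ng remain")
```

After that delay only about 158 ng would be left to analyse.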

By measuring the resulting spectrum of the sample, complemented by the luminescence of the ligands, Abergel’s team determined the bond distance of einsteinium, which is crucial in understanding how metallic atoms bind to molecules. In addition, they uncovered aspects of einsteinium’s physical chemistry that deviate from expected trends across the actinide series. This knowledge could open up new avenues of research into how actinides could be used in areas including nuclear power and novel pharmaceutical drugs.

More broadly, the discoveries improve our understanding of how physics and chemistry are altered towards the edge of the periodic table. This could enable researchers to better predict the processes that occur when einsteinium and its actinide neighbours are bombarded with other atomic nuclei in the hope of creating even heavier elements that have yet to be discovered.

The research is described in Nature.

The ethics of quantum computing, starter homes on the Moon

Metallurgy, steam power and the integrated circuit are just a few of the many technologies that have had a profound effect on humanity. Will quantum computing soon join this list? Some people think so, and they are warning that we should begin thinking about the ethical implications of a technology that could solve problems well beyond the reach of even the most powerful classical computers. In a video from Quantum Daily, six leading lights in the nascent quantum-computing industry discuss the issue.

The UK developer Barratt London has teamed up with the British Interplanetary Society to create a prototype lunar home for future astronauts. Consisting of two floors with a 2 m-thick roof to protect inhabitants from radiation, the lunar module features three bedrooms – including an en-suite toilet – that are located on the lower floor as a further step to reduce the radiation dose. The open-plan upper floor features a kitchen, dining and living space as well as a gallery that offers the “perfect place to offer views across the lunar landscape, and if you’re lucky even be able to spot Earth in the distance”.

Barratt says that many amenities will be available, such as electricity, water, internet and – thankfully – air. While the house will be built using resources found on the Moon, including basalt bricks, Barratt adds that furnishing the home would be “one of the trickiest aspects of living on the Moon”, as wood and plastic would need to be imported from Earth. Connecting each terraced property, meanwhile, is a “street”, which looks more like an underground passage given the need to protect from radiation. Perhaps naming the street “The Crescent” would be fitting?

I’m not sure if Barratt London is planning to build a golf course community on the Moon, but if you fancy hitting a few balls on the lunar surface the BBC has an analysis of Alan Shepard’s attempt during the Apollo 14 mission 50 years ago. Instead of hitting a golf ball “miles and miles and miles,” as Shepard claimed, researchers now believe the ball went about 37 m. However, they point out that this was pretty good considering Shepard was wearing a highly restrictive spacesuit.
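Out of curiosity, a 37 m shot lets us estimate how fast the ball left the club. The sketch below assumes ideal projectile motion with lunar gravity (an assumption that is actually reasonable on the airless Moon) and an assumed 45-degree launch angle for maximum range; none of these numbers beyond the 37 m come from the BBC analysis.

```python
import math

# Lunar surface gravity (m/s^2) and the distance reported by the researchers
g_moon = 1.62
range_m = 37.0

# Ideal projectile range R = v^2 * sin(2*theta) / g; at theta = 45 deg, sin(2*theta) = 1,
# so the implied launch speed is v = sqrt(R * g)
v = math.sqrt(range_m * g_moon)
print(f"Implied ball speed: {v:.1f} m/s")  # roughly 7.7 m/s
```

A gentle chip by terrestrial standards, which is consistent with a one-handed swing in a pressurized suit.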

New quasiparticle may mimic Kondo-effect signal

A new type of quasiparticle – dubbed the “spinaron” by the scientists who discovered it – could be responsible for a magnetic phenomenon that is usually attributed to the Kondo effect. The research, which was carried out by Samir Lounis and colleagues at Germany’s Forschungszentrum Jülich, casts doubt on current theories of the Kondo effect, and could have implications for data storage and processing based on structures such as quantum dots.

The electrical resistance of most metals decreases as the temperature drops. Metals containing magnetic impurities, however, behave differently. Below a certain threshold temperature, their electrical resistance increases rapidly, and continues to increase as the temperature drops further. First spotted in the 1930s, this phenomenon became known as the Kondo effect after the Japanese theoretical physicist Jun Kondo published an explanation for it in 1964.
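The competition behind this behaviour can be captured by a standard textbook parametrization (general background, not taken from the research discussed here): phonon scattering falls with temperature while the impurity contribution grows logarithmically, producing a resistance minimum below which the resistance keeps rising.

```latex
% Schematic resistivity of a metal with dilute magnetic impurities:
% residual term, phonon scattering and the Kondo logarithm
\rho(T) \;=\; \rho_0 \;+\; a T^5 \;-\; c_m \ln T ,
\qquad
\left.\frac{d\rho}{dT}\right|_{T_{\min}} = 0
\;\Rightarrow\;
T_{\min} = \left(\frac{c_m}{5a}\right)^{1/5}
```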

Kondo showed that the spin of a magnetic impurity – the source of its magnetic moment – strongly couples, or “sticks”, to the spins of all the electrons in the region surrounding it. The resulting “cloud” of spin-coupled electrons effectively screens the conduction electrons and prevents them from moving, producing the observed increase in the metal’s resistance.

Kondo resonances

One key signature of the Kondo effect is the dips, or resonances, in the electron transport spectra observed when magnetic impurities are deposited on metal surfaces. These resonances can be detected by scanning tunnelling spectroscopy (STS), a technique that makes it possible to position and detect individual atoms on a surface and record energy spectra at exactly these positions.

Researchers measured such resonances for the first time in 1998, observing a dip in the measurement curve at the point where magnetic cobalt atoms were deposited on a gold surface. Before this pioneering series of experiments, the Kondo effect could only be detected indirectly, through resistance measurements. These first STS results – which were later confirmed for cobalt atoms on the surface of other metals, such as copper and silver – therefore opened up a new way of studying many-body physics at the sub-nanoscale.

Inelastic spin-excitations are important

The Jülich team, however, argues that this characteristic dip is not, in fact, an unambiguous sign of the Kondo effect. Instead, their studies suggest that another phenomenon altogether – magnetic anisotropy – is creating the dip.

In a paper published in Nature Communications, the researchers explain that, below a specific temperature, the magnetic moment of the cobalt impurity atom couples to the crystal lattice of the atoms in the gold surface. At this point, its moment essentially “freezes”. Meanwhile, above the critical temperature, certain excitations of the magnetic moment, known as inelastic spin-excitations, arise thanks to the spin properties of the tunnelling electrons used in STS.

Based on a recently developed mathematical model that combines relativistic time-dependent density functional theory (TD-DFT) and many-body perturbation theory (MBPT), Lounis and colleagues think the dip in the measurement curve could stem from interactions between the inelastic spin-excitations and electrons. These interactions form a bound “spinaron” state, and the physics of the overall system is dictated by the relativistic effects induced by these interactions. The combination of spin-excitations and the spinaron gives rise to transport curves that agree rather well with the ones supposedly arising from the Kondo effect, the researchers say.

“Much of what we thought we had learned about the Kondo effect over the last two decades, and which has already found its way into textbooks, needs to be re-examined,” Lounis says. “For our part, we are now planning to systematically investigate various nanostructures believed to host Kondo resonances,” he tells Physics World. “We also hope to unravel the complexity and richness of the physics behind the spinaron and explore new ways of identifying the interplay between spin-excitations, spinarons and Kondo features.”

Strain induces nonlinear Hall effect in monolayer material

Researchers in China have observed a novel version of the Hall effect in a single layer of tungsten diselenide (WSe2). The result, which they obtained by applying strain to a sample of the two-dimensional material, might aid the development of next-generation non-volatile magnetic memories.

The conventional Hall effect was first described in the late 19th century, and it occurs when electrons flow through a conductor in the presence of a magnetic field. The magnetic field exerts a sideways force on the electrons, leading to a voltage difference that is proportional to the strength of the field. More recently, scientists have uncovered a whole family of related effects, including the quantum, fractional and spin Hall effects.
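For a thin conducting strip, that proportionality takes the familiar textbook form (general background, not specific to the study discussed here):

```latex
% Conventional Hall voltage for a strip of thickness t carrying current I
% in a perpendicular magnetic field B, with carrier density n and charge q
V_H = \frac{I B}{n q t}
```

The inverse dependence on carrier density is why Hall measurements are routinely used to count charge carriers in semiconductors.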

In the nonlinear Hall effect, the voltage produced scales quadratically, rather than linearly, with the applied electric field. This type of Hall effect, which was predicted theoretically in 2015, originates when a Hall current is generated in response to a “second-order” component of the electric field rather than a magnetic field. This second-order component is related to the electrons’ orbital magnetism (that is, the magnetization induced by the particles’ orbital motion, rather than that caused by their spin), and it means that the charge carriers in a current travelling through a material can be deflected – thereby producing a Hall voltage – even without a magnetic field.

Few-layer TMDCs

In 2019, researchers observed the nonlinear Hall effect for the first time in few-layer tungsten ditelluride, WTe2, a material that belongs to the transition-metal dichalcogenide (TMDC) family of materials. Now, a team led by Zhi-Min Liao of the School of Physics at Peking University in Beijing, China, has found that the effect can also occur in a monolayer of a related TMDC, tungsten diselenide (WSe2), when the material is strained along its crystalline axis.

TMDCs have the chemical formula MX2, where M is a transition metal such as molybdenum or tungsten and X is a chalcogen such as sulphur, selenium or tellurium. In their bulk form, they act as indirect band-gap semiconductors, but when scaled down to a monolayer thickness, they behave as direct band-gap semiconductors, capable of efficiently absorbing and emitting light.

In the new work, reported in Chinese Physics Letters, Liao and colleagues studied flakes of WSe2 that they obtained by shaving off monolayer-thin slivers from bulk crystals of the material. To apply strain in the direction they wanted, they selected flakes with long, straight edges and transferred these onto a single-crystal piezoelectric substrate known as PMN-PT. After they aligned the WSe2 flakes along the [001] orientation of the PMN-PT crystal, they used external electrodes to apply a voltage to the crystal along this same axis, generating a piezoelectric displacement that induced strain in the WSe2 flakes in this direction.

Berry curvature dipole

By controlling the amount of strain applied, the researchers observed a quantum mechanical property known as a Berry curvature dipole, which dictates how moving charges (such as electrons) propagate through solid semiconductors. This Berry curvature dipole is directly related to the orbital magnetization in TMDCs when a current is applied to them. The researchers were therefore able to control this magnetization, which typically manifests itself perpendicular to the plane of the material, in a direct way by varying the external piezoelectric field.
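Schematically, in the 2015 theory the nonlinear Hall response is set by the Berry curvature dipole – roughly, the first moment of the Berry curvature Ω over the occupied states – so the second-order current grows with both the dipole and the square of the driving field. The relations below are a sketch of those standard expressions, not taken from the paper discussed here:

```latex
% Berry curvature dipole in 2D, with f_0 the equilibrium occupation,
% and the resulting second-order (nonlinear Hall) current
D_b = \int \frac{d^2k}{(2\pi)^2}\, f_0\, \frac{\partial \Omega}{\partial k_b},
\qquad
j^{(2)}_a \;\sim\; \frac{e^3 \tau}{\hbar^2}\, D_b\, E_a E_b
```

Strain matters because it lowers the crystal symmetry, which is what allows the dipole to be non-zero in the first place.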

Liao says that a similar tuning technique could be exploited in next-generation non-volatile magnetic memory devices using a phenomenon called orbit-magnetism torque (OMT). “Here, the information will be written by a charge current in a strained TMDC monolayer and the perpendicular magnetization switched by OMT,” he tells Physics World.

Whole-body MRI could prove optimal for detecting myeloma

Myeloma is a blood cancer that develops from plasma cells in the bone marrow and which affects 140,000 new patients each year across the globe. Diagnosis and assessment are usually performed using advanced imaging techniques such as whole-body CT, whole-body MRI (WBMRI) or PET/CT with the radiotracer 18F-FDG. But at present, it is not clear which is the best imaging test to use.

Researchers from King’s College London hypothesized that the greater sensitivity of WBMRI over 18F-FDG PET/CT for detecting myeloma could prove advantageous for patient management. To investigate this further, they compared the diagnostic performance of the two techniques in patients with a suspected or newly confirmed myeloma diagnosis, reporting their findings in the European Journal of Nuclear Medicine and Molecular Imaging.

“Our results showed that imaging with WBMRI changed how patients would have been managed by their doctors in 24% of cases, where review of clinical data alone would have resulted in surveillance only,” explains lead researcher Vicky Goh. “What this ultimately means for patients is improved outcomes from earlier treatment.”

Expert analysis

The study included 46 patients who underwent both pre-treatment WBMRI and 18F-FDG PET/CT. Forty-one patients had symptomatic myeloma, three had smouldering (asymptomatic) myeloma and two had multifocal plasmacytoma. The PET/CT and WBMRI scans were reviewed independently by an experienced nuclear medicine physician and radiologist, respectively, who recorded the presence or absence of focal and diffuse disease, as well as the number of focal lesions.

The per-patient sensitivity of WBMRI for detecting myeloma bone disease was 91.3% (42 of 46 patients), compared with 69.6% (32 of 46 patients) for PET/CT and 54.3% for the PET component alone. There were no cases where PET/CT was positive for disease detection and WBMRI was negative.
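The quoted sensitivities follow directly from the patient counts. The short check below recomputes them; the 25-patient figure for the PET component alone is inferred by back-calculation from the quoted 54.3% and is not stated in the article.

```python
# Per-patient sensitivity, recomputed from the reported counts (n = 46 patients)
total = 46
detected = {"WBMRI": 42, "PET/CT": 32}

for modality, n in detected.items():
    print(f"{modality}: {100 * n / total:.1f}%")  # 91.3% and 69.6%

# The 54.3% quoted for the PET component alone corresponds to 25/46 patients
# (25 is inferred arithmetically, not stated in the article)
assert round(100 * 25 / total, 1) == 54.3
```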

In 58.7% of patients, WBMRI and the CT component of PET/CT detected the same number of focal bone lesions; in the other cases, WBMRI detected a larger number of such lesions. The researchers note that in 10 patients with negative PET/CT studies, WBMRI detected more than 10 lesions. All three patients with smouldering myeloma had negative PET/CT studies. Two of these patients also had negative WBMRI studies, but the third had focal bone lesions seen on WBMRI and their diagnosis was upgraded to myeloma.

To assess inter-observer agreement, two additional readers (a nuclear medicine physician and radiologist) performed further evaluation in a subset of 12 patients. The readers showed moderate agreement in detecting focal disease using CT and PET. In the WBMRI scans, there was excellent agreement between the two readers in detecting the presence or absence of focal lesions in each patient.

Looking at the impact on clinical management, the researchers found that based on clinical data alone, 32 of the 46 patients would be treated for myeloma rather than undergoing active surveillance. Adding PET/CT to the clinical data resulted in treatment of 40 patients, while adding WBMRI led to 43 of the patients receiving treatment – an additional 7%.
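These management numbers are also consistent with the 24% figure quoted earlier. The arithmetic below is my own cross-check, assuming both percentages use the full cohort of 46 patients as the denominator:

```python
# Patients recommended for treatment under each information set (n = 46)
total = 46
treated_clinical = 32   # clinical data alone
treated_petct = 40      # clinical data + PET/CT
treated_wbmri = 43      # clinical data + WBMRI

extra_vs_clinical = treated_wbmri - treated_clinical  # 11 patients moved off surveillance
extra_vs_petct = treated_wbmri - treated_petct        # 3 further patients

print(f"{100 * extra_vs_clinical / total:.0f}%")  # ~24%, matching the figure quoted earlier
print(f"{100 * extra_vs_petct / total:.0f}%")     # ~7%, the 'additional 7%'
```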

Goh and colleagues concluded that WBMRI detects skeletal disease in a higher number of patients than 18F-FDG PET/CT and also detects a higher number of lesions per patient. Adding the imaging findings to the clinical data led to a significantly higher proportion of patients being recommended for treatment rather than surveillance, although the researchers note that treatment decisions were not statistically different between modalities – either would be appropriate for initial staging.

In a press statement, Goh notes that the study supports national guidance from the National Institute for Health and Care Excellence (NICE) that WBMRI should be performed as a first-line imaging test for suspected myeloma. “Earlier diagnosis and treatment is key to improving patient outcome. Forty percent of NHS hospitals still only perform X-rays, an insensitive test, for diagnosing bone disease in suspected myeloma,” she says. “This clearly needs to change.”

“We offer WBMRI as our first-line test in suspected myeloma patients,” Goh tells Physics World. “Going forward, we are clarifying the role of WBMRI compared to 18F-FDG PET/CT in the treatment setting.”

Copyright © 2026 by IOP Publishing Ltd and individual contributors