
Happy 24th birthday Hubble Deep Field, celebrating Tony Skyrme and skyrmions

It has been 24 years since the first Hubble Deep Field image was released, providing an astonishing view of thousands of galaxies that are lurking in just a tiny part of the sky. These include some of the youngest and most distant galaxies we have ever seen. But what is most amazing is that the image contains a jumble of galaxy shapes and sizes that hint at just how huge and diverse the universe is and how many stars it contains.

The folks at the Perimeter Institute for Theoretical Physics have put together “14 mind-bogglingly awesome facts about the Hubble Deep Field images” for your enjoyment.

2020 marks the 30th anniversary of the launch of the Hubble Space Telescope. In 2010, Physics World celebrated the 20th birthday of the observatory with the article “Hubble’s greatest hits” by the astrophysicist Mark Voit. He reminds us of the disastrous start of the mission, when a major fault in the telescope’s optics was discovered. Voit also explains how this was corrected, allowing Hubble to acquire some of the most amazing images of the universe that we have.

Our favourite quasiparticle here at Physics World is the skyrmion. It is named after the British physicist Tony Skyrme who postulated the eponymous particle in a series of papers published in the late 1950s and early 1960s.

Today, the skyrmion is familiar to condensed-matter physicists as a magnetic excitation that is topologically stable. But as Ian Aitchison explains in “Tony Skyrme and the origins of skyrmions”, Skyrme’s motivations were much more fundamental.

In the article, which is based on a talk he gave in December, Aitchison explains that Skyrme was trying to develop a description of the nucleon (proton or neutron) in terms of meson fields. His goal was to develop a unified description of physics that did not require both boson and fermion fields.

“But Skyrme didn’t really like fermions, a point I’ll return to in a moment,” Aitchison writes.

Aitchison also explains how Lord Kelvin influenced Skyrme’s thinking and points to a connection between Skyrme’s great-grandfather and Kelvin’s famous tidal predictor machine.

 

Tumour hypoxia tracked in real time during radiotherapy

Imaging oxygenation

Tumour hypoxia, defined as low levels of oxygen partial pressure (pO2), can cause resistance to radiotherapy. The ability to characterize tumour hypoxia could therefore help to optimize radiation treatments. But there is currently a shortage of methods that can monitor tumour oxygenation non-invasively and without averaging across the entire lesion.

Recently, a team based at Dartmouth-Hitchcock’s Norris Cotton Cancer Center developed a new imaging approach that combines phosphorescence quenching with excitation by the Cherenkov light generated within irradiated tissues during radiotherapy. They have now used this approach – called Cherenkov excited luminescence imaging (CELI) – to non-invasively image oxygen distribution in mouse tumours during radiation delivery (Nature Commun. 10.1038/s41467-020-14415-9).

For their study, Brian Pogue and his team employed time-resolved CELI with the phosphorescent probe Oxyphor PtG4. They irradiated mouse tumours with pulsed megavoltage X-ray irradiation, generating Cherenkov light that served as an internal source to excite the probe’s phosphorescence. They employed a time-gated camera to capture emission at different delay times after each radiation pulse, and then used these images to determine phosphorescence lifetimes and obtain tissue pO2 values.
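To make the conversion from time-gated images to oxygen values concrete, here is a minimal sketch (not the authors’ code) of the underlying analysis: fit a single-exponential decay to the gated intensities to extract the phosphorescence lifetime, then apply the Stern–Volmer quenching relation. The gate delays, intensities and calibration constants below are invented placeholders, not values from the study.

```python
# Illustrative sketch: phosphorescence lifetime fitting and Stern-Volmer conversion.
import numpy as np
from scipy.optimize import curve_fit

def decay(t, amplitude, tau):
    """Single-exponential phosphorescence decay."""
    return amplitude * np.exp(-t / tau)

# Hypothetical gate delays (microseconds) and gated intensities (arbitrary units)
delays = np.array([10.0, 20.0, 40.0, 80.0, 160.0])
intensities = np.array([0.82, 0.65, 0.41, 0.17, 0.03])

(amplitude, tau), _ = curve_fit(decay, delays, intensities, p0=(1.0, 50.0))

# Stern-Volmer quenching relation: 1/tau = 1/tau0 + kq * pO2
tau0 = 250.0   # assumed zero-oxygen lifetime (microseconds), placeholder calibration
kq = 0.002     # assumed quenching constant (1/(microsecond * mmHg)), placeholder
pO2 = (1.0 / tau - 1.0 / tau0) / kq
print(f"fitted lifetime: {tau:.1f} us, estimated pO2: {pO2:.1f} mmHg")
```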

“The imaging is all done without any additional radiation, simply by using a camera to monitor the emissions during radiotherapy,” explains Pogue. “We have a unique set of time-gated cameras in our radiation therapy department that were designed for Cherenkov-based radiation dosimetry, but we have used them for this additional purpose of monitoring oxygen in the tumours under treatment.”

Imaging of mice injected with PtG4 revealed that the probe stays in the tumour for at least five days. This provides the opportunity to employ CELI over multiple radiation fractions using a single PtG4 injection, allowing tracking of oxygen dynamics in tumours throughout the treatment course.

CELI

To test the approach, the researchers performed CELI on six mice with subcutaneous tumours during five days of hypofractionated radiation therapy (5 Gy/fraction), following a single intravascular injection of PtG4. The tumours were either radiosensitive MDA-MB-231 breast cancer or radioresistant FaDu head-and-neck cancer xenografts.

CELI performed during each fraction showed that pO2 distributions in both tumours were highly heterogeneous. The hypoxic regions decreased in size from one fraction to the next, with a far more pronounced decrease in the MDA-MB-231 tumours. The median pO2 values in the radiosensitive tumours increased markedly as treatment progressed, while no obvious changes were seen in the radioresistant tumours. Tumour response was delayed relative to the response of the local pO2, with MDA-MB-231 tumours starting to shrink five days after the start of therapy and FaDu tumours only shrinking on day nine.

“Following two tumour lines, one which is known to be responsive to radiation and one which is known to be resistant, we could see differences in the oxygenation of the tumour which are reflective of their differences in response,” says Pogue.

Clinical goals

The excitation source in CELI is the Cherenkov light generated along the treatment beam path, meaning that probe excitation occurs at all depths where radiation dose is deposited. Detection of the phosphorescence, however, is limited by tissue absorption and scattering. The researchers estimate that the current maximum imaging depth is about 2 cm.
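To get a feel for why the limit sits at the centimetre scale, a rough estimate can be made by assuming an effective attenuation coefficient of order 1–2 cm⁻¹, a typical magnitude for near-infrared light in soft tissue rather than a figure from the study. The detected phosphorescence then falls off roughly as

\[
I(d) \approx I_0\, e^{-\mu_{\mathrm{eff}} d}, \qquad e^{-(1\text{–}2\ \mathrm{cm}^{-1})\,(2\ \mathrm{cm})} \approx 0.14\text{–}0.02,
\]

so only a few per cent of the light emitted 2 cm below the surface reaches the camera.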

They suggest that this depth can be increased by employing brighter phosphorescent probes or using cameras with sensitivity optimized for the 750–850 nm phosphorescence spectral range. This could extend the potential applications of CELI to include imaging of near-surface tumours, intracavity measurements using catheter-based cameras, and pre-clinical assessments of oxygenation during therapy in lab animals.

The researchers conclude that, compared with current clinical pO2 measurement modalities, CELI is capable of significantly higher spatial resolution and allows image acquisition simultaneously with multiple fractions of radiotherapy. They note that the method can be easily added to clinical protocols, enabling evaluation of tumour pO2 at the time of the radiation dose delivery – a long-sought goal in tumour therapy.

The team is now characterizing how small a region they can track the oxygenation from, and how fast they can take measurements. “Our goal is to produce oxygen images at video rate, with a spatial resolution that allows us to see radiobiologically relevant hypoxia nodules in tumours of humans,” says Pogue.

Colloidal quantum dot photodetectors see further in the infrared

Optical sensors that work in the mid- to long-wavelength infrared part of the electromagnetic spectrum have a host of applications, including gas sensing, thermal imaging and detecting hazards in the environment. Such sensors are, however, costly: they rely either on toxic mercury-containing compounds or on epitaxial quantum-well and quantum-dot infrared photodetectors that are difficult and time-consuming to fabricate. Researchers at the Institute of Photonic Sciences (ICFO) in Spain have now overcome these flaws by constructing a mercury-free colloidal quantum dot (CQD) device that can detect light across these difficult-to-access wavelengths. The new detector is based on lead sulphide (PbS) and is compatible with standard CMOS manufacturing technologies.

“Although previous PbS CQD photodetectors had shown compelling performance in the visible/short-wave infrared (VIS/SWIR) range, we now show that they can also cover the mid-wavelength/long-wavelength infrared (MWIR/LWIR) range,” explains team leader Gerasimos Konstantatos. “This makes PbS CQDs the only semiconducting material to cover such a broad spectral range.”

From interband to intraband transitions

CQDs are semiconductor particles only a few nanometres in size. They can be synthesized in solution, which means that CQD films can readily be deposited on a range of flexible or rigid substrates. This ease of manufacture makes them a cost-competitive, high-performance photodetector material that integrates readily with CMOS technologies.

PbS CQDs have recently emerged as a promising basis for detectors in the SWIR (1–2 µm) wavelength range. They do have a drawback, however, in that they rely on interband absorption of light. This means that incident photons excite charge carriers (electrons) across the electronic bandgap of the material – the energy difference between the top of the valence band and the bottom of the conduction band. The size of this bandgap thus places a lower limit on the photon energies at which the technology can operate.

For PbS, this limit was thought to be 0.3 eV, but Konstantatos and co-workers have now reduced it to just above 0.1 eV by heavily doping their PbS CQDs with iodine. The presence of large amounts of iodine facilitates electronic transitions amongst higher excited states known as inter-sub-band (or intraband) transitions, instead of interband ones. This makes it possible to excite electrons using photons with much lower energies than before, in the mid- to long-wavelength IR (5-10 µm) range.
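A quick conversion shows why this matters: the cutoff wavelength corresponding to a transition energy E is

\[
\lambda_{\mathrm{cutoff}} = \frac{hc}{E} \approx \frac{1.24\ \mu\mathrm{m}\cdot\mathrm{eV}}{E},
\]

so the old 0.3 eV limit corresponds to a cutoff near 4 µm, while 0.1 eV pushes it out to roughly 12 µm – consistent with operation across the 5–10 µm band quoted above.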

First robust electron doping in PbS CQDs

The researchers’ doping method, which they developed in 2019, involves substituting iodine atoms for sulphur in PbS using a simple ligand-exchange process. Konstantatos explains that this heavy doping allows them to populate the first excited state of the conduction band (1Se) in the CQDs with more than one electron per dot. When they illuminate the material with low-energy light, these electrons are excited from the 1Se to the second excited state in the conduction band (1Pe). This modulates the material’s conductivity, rendering it sensitive to mid- and long-wave IR radiation.

In their most recent experiments, the researchers synthesized PbS CQD films using a wet colloidal chemistry technique, which creates a colloidal suspension of the spheres in a liquid. They followed this with a simple spin coating and ligand exchange with iodine molecules. The ligand exchange occurs on the surface of the dots and this doping renders the CQD films conductive. “To our knowledge, this is the first time that robust electronic doping has been achieved in this material,” says Konstantatos.

Doping is more effective in larger dots because these contain more exposed sulphur atoms. Indeed, for dots smaller than 4 nm in diameter, the 1Se band is almost empty, while for dots with a diameter between 4 and 8 nm, heavy doping occurs and the 1Se band is partially populated. For dots bigger than 8 nm, the 1Se band is nearly completely filled with roughly eight electrons per QD. In such heavily doped QDs, the populated conduction band bleaches interband photon absorption and intraband absorption then becomes possible. This means that the larger the QDs, the longer the wavelength of IR they can absorb.

Transmission measurements on two iodine-exchanged PbS QD samples, one heavily doped and the other undoped, allowed the team to confirm that light absorption in the undoped sample occurs via the interband (1Sh → 1Se) transition, whereas the doped sample shows a strong intraband (1Se → 1Pe) absorption peak. As mentioned above, this is possible because the 1Se state is partially populated with electrons.

Towards hyperspectral imaging

Such ultrabroadband operation could make it possible to perform multispectral or hyperspectral imaging, which provides not only visual but also compositional (chemical) information on an object or scene, as well as its temperature, says Konstantatos. “Until now, this could only be achieved by using several image sensors utilizing different technologies, with the IR part being very expensive,” he tells Physics World. “With a CQD technology now covering the full range from the visible to the LWIR, low-cost broadband photodetectors may now be possible.”

The researchers, who report their work in Nano Letters, are now planning to improve the performance of their detectors. They would also like to achieve efficient intraband absorption in smaller dots.

Boosting infrared spectroscopy, making complexity a predictive science, artificial intelligence in medical physics

In this instalment of the Physics World Weekly podcast we look at how a new technique for infrared spectroscopy uses ultrashort laser pulses to minimize problems associated with noise. We chat with the physicist and mathematician Ginestra Bianconi of Queen Mary, University of London about how our ability to make complexity science a predictive effort is providing important insights into epidemics and other collective behaviours.

We also discuss some recent examples of how artificial intelligence is being used in medical physics and argue the case for standards defining the graphene content of commercial products.

Convection cells the size of Texas dazzle with clarity on the Sun

An image of the Sun with the highest spatial resolution ever achieved has been taken by the Daniel K Inouye Solar Telescope. Shown above is the full-field image, which is displayed in false colour and captures infrared light at a wavelength of 789 nm.

The image covers a 36,500 × 36,500 km region of the Sun’s surface – about 2.6 times the surface area of Earth. Despite this huge coverage, features as small as 30 km can be resolved.
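The comparison is easy to check: taking the Earth’s mean radius as roughly 6371 km,

\[
A_{\mathrm{image}} = (36{,}500\ \mathrm{km})^2 \approx 1.33\times10^{9}\ \mathrm{km}^2, \qquad A_{\mathrm{Earth}} = 4\pi R_\oplus^{2} \approx 5.1\times10^{8}\ \mathrm{km}^2,
\]

giving a ratio of about 2.6.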

The granular structures in the image are convection cells, which are each about the size of the US state of Texas. Hot plasma from inside the Sun bubbles up in the bright centres of the cells. It then cools by radiating its heat and falls back into the Sun in the gaps between the cells – which are dark because they are cooler.

The occasional bright specks seen in the gaps mark out magnetic field lines, which are thought to channel energy from the surface of the Sun to the corona. The presence of these specks could explain why the corona is much hotter than the surface of the Sun.

Located on the Haleakala volcano in Hawaii, the Inouye Solar Telescope has a mirror that measures 4 m across, making it the largest solar telescope ever built. It acquired its first images in late 2019 and will operate for at least 44 years to capture four complete cycles of solar activity.

 

Making it in 2D materials research

Mar Garcia Hernandez

What is the focus of your own research?

We grow graphene on insulators of technological interest and then explore the nature of the interfaces between the substrates and graphene so as to optimize the electrical properties of the materials we grow. Examples include glass, Si/SiO2 wafers, TiO2, as well as grown layers of 2D transition metal dichalcogenides (TMDs). We are also developing a new platform that will allow us to functionalize 2D materials for various device applications under conditions compatible with the device itself, while preserving the properties of the pristine material. Our main focus is functionalization with biomolecules for biomedical applications.

How did you find yourself working in this field?

I had been working on the growth and characterization of complex perovskite heterostructures for spintronics applications for quite a number of years – in fact I still work in this field now. I was interested in the new physics that these systems reveal at the interfaces, such as the appearance of superconductivity at the interface of two insulating materials or the build-up of magnetism at the interface between non-magnetic materials. So I was familiar with exploring the interfaces between two grown 2D layers and was very much interested in these low-dimensional materials. Then graphene was discovered and shown to be the thinnest known material, as well as exhibiting fascinating electronic transport properties. A new series of free-standing 2D materials started to enlarge the catalogue of interest for 2D flatlands research and presented a feasible path towards the fabrication of multifunctional heterostructures built from different 2D materials. So I decided to explore these 2D materials with a view to building up heterostructures.

What would you identify as the most important recent achievement for characterization and fabrication of these materials?

There are some very recent results on the growth of hexagonal boron nitride (hBN) on platinum that deserve particular attention as these layers provide nearly perfect crystals that we can then transfer to encapsulate high-quality graphene and meet the requirements of electronic foundries. This is technologically very relevant as there is no way to preserve graphene’s outstanding properties without proper encapsulation.

Is standardization a problem for other 2D materials as it has been for graphene?

Yes indeed, different synthesis routes render materials with very different properties, so standardization is a problem for all 2D materials. The problem will continue until the international standardization bodies establish the corresponding standards. This will take a very long time because the procedures have to be unambiguous and robust, and these 2D materials are very new. In the meantime, it would be desirable to refer to a code of good practice for labelling these materials.

What prompted you to draw up this review for the journal 2D Materials?

The Graphene Flagship has gathered a large set of world-class researchers with outstanding expertise in the synthesis of materials, who attend meetings together and organize workshops to exchange their most recent findings. I saw that there was a great opportunity to offer an overall picture of the synthesis of graphene and other 2D materials to a broader audience, and started to lay out a basic scheme for this review.

The review covers synthesis methods for 2D materials, ranging from chemistry routes to ultrahigh vacuum methods. It also tackles the processing and characterization methods suitable for this class of materials. I lead the materials work package in the Graphene Flagship and succeeded in encouraging Flagship partners to send in their contributions. I also attended many other work package meetings to spot other groups working on the synthesis of these materials that were not covered by my own work package, so that I could invite them to contribute. Finally, I contacted other contributors to fill in the gaps for any topics that had not been specifically addressed by the Graphene Flagship. The result is this comprehensive review dealing with many aspects of the synthesis, processing and characterization of these materials.

The scope covers a huge field these days – how did you draw the line over what to include and what not?

We decided to include the topics that could attract a large audience, those that according to the literature have proven to be the most popular and useful. The Graphene Flagship had already targeted most of the hot topics related to the synthesis of 2D materials and this helped a lot. Once the basic content was decided, most of the co-authors knew pretty well how to describe the major findings in their respective fields. Obviously the participation of a large number of people in the writing of the review is only possible at the expense of losing some homogeneity in the coverage.

What would you consider the main outstanding challenge?

One of the main challenges from the viewpoint of synthesis is producing high-quality 2D materials in large quantities, in a reproducible way and under conditions compatible with current industrial technologies. This would speed up the integration of this new class of materials into real-world products.

Machine learning for tomographic imaging


The field of artificial intelligence and machine learning, particularly the subcategory of deep learning, has experienced massive growth in recent years, with applications ranging from speech recognition to material inspection, healthcare to gaming, to name but a few. One area that’s being transformed by machine learning is tomographic imaging – in which a series of data projections (X-ray radiographs, for example) are reconstructed into a three-dimensional image.
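As a point of reference for what the deep-learning methods discussed in the book build on, here is a minimal sketch of the classical pipeline – simulating projections of a test object and reconstructing it by filtered back-projection with scikit-image. It is an illustrative baseline, not an example taken from the book.

```python
# Classical tomographic reconstruction: simulate a sinogram, then filtered back-projection.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

image = rescale(shepp_logan_phantom(), 0.5)           # ground-truth test object
angles = np.linspace(0.0, 180.0, 90, endpoint=False)  # projection angles in degrees
sinogram = radon(image, theta=angles)                 # simulated projection data
reconstruction = iradon(sinogram, theta=angles, filter_name="ramp")

rmse = np.sqrt(np.mean((reconstruction - image) ** 2))
print(f"root-mean-square reconstruction error: {rmse:.3f}")
```

Deep-learning approaches typically replace or augment steps of this pipeline, for example by learning to suppress the artefacts that appear when far fewer projection angles are available.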

A newly published book, Machine Learning for Tomographic Imaging, presents a detailed overview of the emerging discipline of deep-learning-based tomographic imaging. The book arose from discussions among four colleagues with a long-standing interest in advanced medical image reconstruction: Ge Wang from Rensselaer Polytechnic Institute, Yi Zhang of Sichuan University, Xiaojing Ye from Georgia State University and Xuanqin Mou from Xi’an Jiaotong University.

“Deep tomographic reconstruction is a new area, and the development of this area has been rapid over the past years,” explains Wang. “Ours is the first and only book on this new frontier of machine learning.”

The book begins with an introduction to imaging principles, tomographic reconstruction and artificial neural networks. Parts two and three provide in-depth tutorials on CT and MR image reconstruction and describe a range of recent machine learning techniques. The final part of the book covers other imaging modalities, including PET, SPECT, ultrasound and optical imaging, as well as taking a look at image quality evaluation and quantum computing. The text also includes appendices describing relevant numerical methods and suggesting hands-on projects, with sample codes and working datasets.

Writing in a foreword to the text, Bruno De Man, a CT authority from GE Global Research, says that the book is ideal for introducing machine learning and tomographic imaging into applied disciplines such as physics and engineering, as well as for bringing application contexts into more theoretical disciplines such as mathematics and computer science. “Every medical imaging scientist who graduated before machine learning was taught in college should probably learn about this area in order to remain competitive,” he suggests.

Wang notes that the book is a little lengthy for classroom teaching. Consequently, he and his coauthors are working with the publisher (IOP Publishing) to consider a simplified version. “This will be based on my teaching experience last semester at Rensselaer Polytechnic Institute, where we performed a combined graduate/undergraduate course on medical imaging in the artificial intelligence framework. This course was the first of its kind in the world,” Wang tells Physics World. “We will keep working in this area, and hopefully produce the next version in about two years.”

Quantum calorimeter is as precise as nature allows

How do you define the position of something that won’t stay still? This is the problem physicists face when they try to measure a system’s properties with such precision that quantum effects contribute a significant source of uncertainty. Whatever the variable, and however refined the instrument, there comes a point at which the signal is lost in the noise.

A quantum calorimeter developed by researchers at Finland’s Aalto University School of Science and Lund University, Sweden, defines this limit for an ideal thermometer by measuring fluctuations in the electron temperature of a copper nanowire. The team found that the intrinsic thermal noise in the wire is small enough for them to detect a single microwave photon. As well as enabling new experiments in quantum thermodynamics, the device could be used to make noninvasive measurements of quantum systems such as qubits in superconducting quantum computers.

To work out where the measurement limit lies for a thermometer, Bayan Karimi and colleagues built a calorimeter sensitive enough to measure the tiny but unavoidable energy fluctuations that affect every system warmer than absolute zero. The team’s device consists of a copper nanowire 1 µm long and 35 nm wide, deposited on an insulating substrate. A tunnel junction at one end of the copper wire, and a direct contact 50 nm along its length, allow current to flow into an aluminium circuit that becomes superconducting in the temperature range studied (about 10–250 mK). At the other end of the wire, a second tunnel junction lets the researchers inject high-energy electrons so they can also study the system in an out-of-equilibrium state.

Because a tunnel junction’s conductance varies with the energy distribution of the electrons, the researchers measure the voltage across the circuit to find the temperature of the 10⁸ or so electrons in the copper wire. Averaged over time, a plot of this temperature would be a smooth line – flat for a system in equilibrium, sloping for a system out of equilibrium.

Different types of fluctuation

Look more closely, however, and a different picture emerges. While the energy in the system may stay the same overall, it is constantly being exchanged between electrons in the copper wire and random lattice vibrations in the wire and underlying substrate. At the 10 kHz sampling rate that Karimi and colleagues achieve with their device, the inconstant share of energy held by the electrons translates into a fluctuating electron temperature. In the out-of-equilibrium experiments, an additional source of fluctuations arises from the random arrival times and energies of the electrons injected into the nanowire.
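The size of these equilibrium fluctuations follows from the textbook result for a small system exchanging energy with a thermal bath: the mean-square temperature fluctuation is set by the heat capacity C of the electron gas,

\[
\langle \Delta T^{2} \rangle = \frac{k_{B} T^{2}}{C},
\]

so the smaller the electron system (and hence the smaller C), the larger the relative temperature fluctuations the calorimeter must contend with.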

These fluctuations are different from the instrumental noise that ambitious metrologists deal with daily. By isolating and characterizing temperature variations that originate in their experimental apparatus, the researchers confirm that their measured electron-energy fluctuations represent a fundamental and inescapable limit on temperature sensitivity. They are inherent to the system and mask any temperature changes smaller than a certain size.

The good news, the team found, is that even faced with this hard limit on sensitivity, the noise level is low enough for energy changes as small as a single microwave photon to be detected – without disturbing the system in the process. Since this is the amount of energy that separates qubit states in superconducting quantum computers, the researchers think their calorimeter could provide a non-invasive way of monitoring relaxation and decoherence processes.

Clever design

“The problem is that most thermometers at the nanoscale heat the system, and thus determining the temperature might destroy the delicate quantum features,” says Sebastian Deffner of University of Maryland, Baltimore County (UMBC) in the US, who was not involved with the project. “The authors of this paper seem to have found a clever design that works around this issue. If this thermometer is now taken up by others (if it becomes the iPhone of quantum thermometers), then this may be a very, very important breakthrough.”

Being able to spot such tiny temperature changes could also enable advances in more fundamental physics, as there are still significant gaps in our understanding of how energy relates to quantum mechanics.

“Thermodynamics is something that is largely missing from our understanding of the quantum world,” says Jukka Pekola, who leads the team at Aalto. “Our detector could look at the heat and temperature changes determined by the tiniest increments: quanta that mediate energy exchange. Approaches like stochastic thermodynamics could then be applied in the quantum regime, for the first time ever, by directly measuring the heat.”

Full details of the research are reported in Nature Communications.

UK unveils post-Brexit investment and visa changes to boost science

The UK government has announced a raft of measures to boost science in the country as it gets ready to leave the European Union on 31 January. They include investing £300m over five years in the mathematical sciences, lifting visa restrictions on scientists coming to the UK, and removing the need for researchers to make “impact” statements when submitting grant applications to UK funding councils.

The boost for mathematics comes via the Engineering and Physical Sciences Research Council (EPSRC), which will invest £60m per year in the field – double what it currently spends. EPSRC says it will provide £19m towards funding PhD students for four years “as standard” and offer five-year funding for research associates to “compete with the US and Europe”.

The cash will also include £34m per year for research, which will come with “more flexibility on the number and length of fellowships and will not be ring-fenced between sub-disciplines”. Finally, £7m each year will go towards PhDs and research fellows at the Heilbronn Institute for Mathematical Sciences in Bristol, as well as funding to increase the number of participants and workshops at both the Isaac Newton Institute in Cambridge and the International Centre for Mathematical Sciences in Edinburgh.

The new funding was announced at the same time as a new fast-track visa scheme to attract scientists, researchers and mathematicians to the UK. It is expected to replace the existing “Exceptional Talent: Tier 1” visa, which is capped at 2000 visas per year. The new visa scheme, which comes into force on 20 February, will not feature such a cap.

There remains a real need to continue to enable and support people with the right skills and experience to live and work in the UK.

Patrick Cusworth, Institute of Physics

“At present, 44% of science, engineering and technology firms report difficulties in finding recruits with the right skills,” says Patrick Cusworth, head of policy at the Institute of Physics, which publishes Physics World. “There remains a real need to continue to enable and support people with the right skills and experience to live and work in the UK. Developing a pragmatic means of doing so is therefore a big step in the right direction.”

It is still not clear, however, if the UK will remain part of the EU’s Horizon research programme, the latest of which ends this year. Members of the programme have to accept freedom of movement of scientists between EU member states, but this is likely to be terminated once the UK’s transition period finishes at the end of 2020.

The impact agenda

UK researchers will, however, benefit from no longer having to submit a “Pathways to Impact” plan or complete an “impact summary” when applying for cash from UKRI – the umbrella organization for the UK’s seven research councils. The Pathways to Impact requirement, which had been in place for around a decade, was controversial, but for grant applications made from 1 March 2020 researchers will no longer have to provide it. UKRI currently invests a total of £7bn in British science each year.

The removal of “Pathways to Impact” will be broadly welcomed by the many grant-writing physicists whose heart sank at the thought of churning out two pages of boilerplate on the ill-defined socioeconomic impact of their proposed research

Philip Moriarty

“The removal of ‘Pathways to Impact’ will be broadly welcomed by the many grant-writing physicists whose heart sank at the thought of churning out two pages of boilerplate on the ill-defined socioeconomic impact of their proposed research,” says physicist Philip Moriarty from the University of Nottingham. “Yet despite being a vocal opponent of it for many years, I feel it’s important to recognize that it played a role in shifting attitudes regarding the broader implications of academic research. For one thing, the ‘impact agenda’ led to a greater – albeit, often rather opportunistic – interaction between science and the arts and humanities. Hopefully this interdisciplinary activity will continue in its absence.”

Writing on the Wonkhe blog, James Wilsdon from the University of Sheffield, who is director of the Research on Research Institute, says he feels it is “premature” to see this move as the end of the impact agenda. “Rather, this is a reflection of impact’s maturity and the extent to which it has now been mainstreamed within research culture and practice,” he adds.

New metal detector finds small objects by how they disrupt Earth’s magnetic field

A compact, low-cost and low-power metal detector that can identify the unique magnetic fingerprints of small metallic objects has been created by a team led by Huan Liu at the China Institute of Geosciences. Comprising micrometre-scale sensor arrays, the instrument is sensitive to subtle disruptions of the Earth’s magnetic field that are caused by metallic objects. The new design could allow for significant reductions in the size and energy requirements of security screening systems.

The ability to detect hidden metallic objects is a critical requirement of public security systems. This is currently done by metal detectors that use inductor coils to transmit alternating electromagnetic fields, which induce eddy currents within metallic objects. These eddy currents generate secondary fields of their own, which are picked up by the coils. While effective, such systems are bulky and consume large amounts of energy.

Liu’s team has developed an alternative technique that focuses on time-dependent variations in the Earth’s magnetic field, which are affected by the presence of nearby metal objects. The effect gives an object its own unique magnetic “fingerprint”, which depends on factors including the object’s shape, size and composition. These fingerprints can be identified using a technique that the team has dubbed “weak magnetic detection” (WMD). This is a passive detection technique that does not transmit a probe signal, which means it has very low energy requirements.

A downside of WMD had been that the fingerprints of smaller objects are difficult to distinguish from background noise, and this had precluded its practical use in security scanning applications.

Central processing

Now, Liu and colleagues have shown that the signal-to-noise ratio of WMD can be improved using microelectromechanical systems (MEMS). These are arrays of micrometre-scale components that are highly sensitive to ambient electromagnetic fields and send information to a central unit for processing.

The detection mechanism is anisotropic magnetoresistance (AMR), whereby the electrical resistance of a detector component is a function of the strength and direction of an external magnetic field. This means that when the arrays are placed in a planar cross arrangement, noise can be significantly reduced.
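As a loose illustration of the idea – and not the team’s actual processing framework – the sketch below shows how differencing the outputs of two nearby sensors cancels background-field variations common to both while preserving the local disturbance produced by a metal object. All signal values are invented.

```python
# Illustrative common-mode rejection: subtract paired sensor readings so that the
# shared ambient field cancels and only the local "fingerprint" remains.
import numpy as np

rng = np.random.default_rng(seed=0)
samples = 1000
background = 50_000.0 + 5.0 * np.sin(np.linspace(0.0, 20.0, samples))  # ambient field (nT)
noise_a = rng.normal(0.0, 1.0, samples)
noise_b = rng.normal(0.0, 1.0, samples)

anomaly = np.zeros(samples)
anomaly[400:500] = 8.0          # disturbance seen mainly by the sensor nearer the object (nT)

sensor_a = background + anomaly + noise_a
sensor_b = background + 0.1 * anomaly + noise_b

difference = sensor_a - sensor_b             # common background cancels
flagged = np.abs(difference - np.median(difference)) > 5.0
print(f"samples flagged as anomalous: {flagged.sum()}")
```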

The team’s metal detector integrates three sets of magnetic sensor arrays with a microcontroller, a battery, and a PC that provides a noise-suppressing data-processing framework. With this setup, the physicists successfully identified the magnetic fingerprints of metal objects smaller than 50 cm, and could distinguish between multiple objects separated by less than 20 cm. Specifically, they picked up the characteristic field disturbances induced by objects including a phone, a hammer and a knife. Larger objects such as the hammer could be detected at distances of up to 80 cm.

Liu’s team now hopes to improve the accuracy of their sensor arrays to detect magnetic fingerprints from even further distances, and to higher resolutions.

The detector is described in AIP Advances.
