Do quantum effects such as entanglement and coherence play a role in consciousness? In this episode of the Physics World Weekly podcast, Betony Adams of the Centre for Quantum Technology at South Africa’s University of KwaZulu-Natal talks about recent developments in the burgeoning and contentious field of quantum biophysics.
Also in this week’s podcast is Giacomo Indiveri, a professor at the Institute of Neuroinformatics at the University of Zurich and ETH Zurich and editor-in-chief of the new journal Neuromorphic Computing and Engineering. Indiveri explains how researchers are developing new neuromorphic technologies that mimic how information is processed by the nervous system. He also talks about his own research on developing neuromorphic cognitive systems using very-large-scale integrated circuits and chats about the scope and aims of the new journal.
During my undergraduate degree at the University of St Andrews, UK, I decided that lab work and experimental physics weren’t for me and switched to a degree in theoretical physics. However, as I progressed, I found myself seeking out opportunities to get involved with hands-on research that had a measurable impact on the real world. Following an event hosted by the university physics society, I approached the university’s radar research group about carrying out a summer placement, in which I programmed a data logger to function together with a disdrometer, which measures rainfall. The outcome of the project was something I could hold in my hands that could be used in important scientific research. Only a month later it was part of field research on the Caribbean island of Montserrat.
After another internship that took me briefly into the world of X-ray spectroscopy, I returned to the radar group. One summer-long project had me assembling a radar for detecting drones and for my Master’s degree project with the same group I simulated a radar functioning at sea. Although I still have a deep love of maths and theoretical physics, I am now embarking on a PhD in engineering at the University of Edinburgh. Through studying radar technology I’m excited to develop and improve tools to measure and inform us about the world around us.
During these opportunities I have discovered a deep willingness among experts to share their knowledge and encourage young professionals. I have always felt supported, even when I did not start with much radar knowledge and my soldering resulted in smoking circuits. Taking the time to learn and make mistakes was allowed and expected, which not only gave me confidence to embark on a PhD course but was an integral part of my university experience.
The other aspect of that experience was developing a sense of community. By running events with the university’s physics society, I got to know fantastic staff and students. At the end of my first undergraduate year, I joined as one of the social conveners. Before leaving at the end of my fifth year I had served as president, podcast producer and the ever-popular baking representative. I also started an interview podcast called “insight St Andrews” and had fascinating conversations with leaders in many fields from physics to forensic science. These experts shared their experiences, which helped me realise that they were once where I am now – as someone passionate about learning and wanting to make an impact on the world.
An integral part of the society was the student community, whether that was other committee members, event attendees or even other societies that we collaborated with. My involvement on the committee allowed me not only to grow as a person but also to contribute to the community. I was involved in organizing more than 50 events with the physics society, from academic talks to theatre performances and end-of-year balls, and it was a privilege to share those spaces with like-minded people.
Through the university’s physics society, I was invited to join the Institute of Physics (IOP) student community panel, representing Scotland’s students. I had the joy and privilege of working with other regional student representatives, organizing events and conferences to bring people together. I made sure that student societies were at the heart of reaching out to the student community. My proudest achievement on the panel was championing change in how the IOP sponsors student societies: to provide more guidance and also financially reward those societies promoting IOP values of inclusion and personal development.
The physics community, its academics and institutions alike, recognizes that students bring a fresh perspective and encourages us to voice new ideas in the community. It is important that we use that voice and, when the time comes, listen to the voices of the students who follow after us.
For the latest careers advice, opportunities, case studies and employer directories, take a look at Physics World Careers 2021
An international research team from multiple medical centres has developed an automated image-analysis method to reveal the pathological progression of Alzheimer’s disease. Improving our understanding of this progression is critical to designing treatments that effectively halt the advancement of the disease.
Alzheimer’s disease is a degenerative brain disorder that affects memory, thinking and behaviour. It is widely believed that the toxic build-up of two abnormal structures – plaques and tangles – is responsible for the neuronal damage that causes the gradual worsening of symptoms. The plaques consist of deposits of the protein fragment amyloid-beta, while tangles are formed by twisted fibres of the tau protein.
Unfortunately, existing Alzheimer’s treatments have shown limited success, possibly because they target these plaques and tangles after they are already widespread throughout the brain and irreversible damage has set in. Therefore, the development of more effective treatments requires a deeper understanding of how these structures first appear and spread over time.
Justin Sanchez, from the Gordon Center for Medical Imaging at Massachusetts General Hospital (MGH) and Harvard Medical School, and colleagues developed an automated method that uses positron emission tomography (PET) to track the origin and advancement of tau protein in relation to amyloid-beta levels across the unique brain anatomy of 443 individuals. The researchers recently published their results in Science Translational Medicine.
Combining images of structure and pathology
Although the distribution of tau protein in the brain has previously been imaged, identifying the pattern of appearance and spread has been complicated by normal variations in individual brain anatomy. To overcome this limitation, Sanchez and colleagues combined high-resolution structural imaging with molecular imaging to assess amyloid-beta and tau levels in specific brain regions for each study participant.
First, they used MRI. This provided a three-dimensional picture of each individual’s unique brain structure, with enough detail to delineate various brain subregions.
Next, they performed two different PET studies. PET is a molecular imaging technique that uses radioactive substances, known as radiotracers, to visualize metabolic and/or physiologic processes. These radiotracers can be specifically designed to highlight a distinct biologic function or molecular abundance.
When injected intravenously, the positron-emitting radiotracer travels through the body and, based on its design, binds to its specific target. As the radioisotope decays, the emitted positrons meet electrons in the body and annihilate, producing two 511-keV photons that fly off in almost exactly opposite directions. By detecting both photons in coincidence, the origin of the annihilation, and therefore the location of the radiotracer in the body, can be determined.
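The localization logic can be sketched in a few lines of code. This is an illustrative toy, not the reconstruction pipeline used in the study: a real scanner accumulates millions of such coincidence lines and reconstructs an image tomographically, and the time-of-flight refinement shown here is an optional extra that only some PET systems provide.

```python
import numpy as np

C = 0.2998  # speed of light in metres per nanosecond

def annihilation_point(det_a, det_b, dt_ns=0.0):
    """Locate an annihilation event on the line of response (LOR)
    joining two coincident detector hits at det_a and det_b (metres).

    Because the two 511-keV photons are emitted almost back-to-back,
    the event must lie on this line. With no timing information only
    the line is known; time-of-flight PET uses the arrival-time
    difference dt_ns = t_a - t_b to pin down the position along it.
    """
    det_a = np.asarray(det_a, dtype=float)
    det_b = np.asarray(det_b, dtype=float)
    midpoint = 0.5 * (det_a + det_b)
    direction = (det_b - det_a) / np.linalg.norm(det_b - det_a)
    # if the photon reached det_a later (dt_ns > 0), the event sits
    # closer to det_b: shift from the midpoint by c*dt/2 towards det_b
    return midpoint + 0.5 * C * dt_ns * direction

# detectors 0.8 m apart; equal arrival times put the event at the centre
p = annihilation_point([-0.4, 0.0, 0.0], [0.4, 0.0, 0.0], dt_ns=0.0)
```

Without timing, each detected pair contributes one line of response, and the image is recovered from the intersections of many such lines.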
The first PET study used a radiotracer known as Pittsburgh compound B, which highlights areas of the brain where amyloid-beta is located. The second PET study used a radiotracer called flortaucipir that binds to sites in the brain with tau-protein tangles.
Finally, the researchers applied an automated method to the structural MRI data to identify the brain region most vulnerable to initial cortical tau build-up in each individual. They aligned the two PET images with the MR image to assess the amyloid-beta and tau protein levels in each brain subregion.
Putting the method to work
The researchers applied their novel image-analysis method to 443 adults ranging in age from 21 to 93 years, 55 of whom had Alzheimer’s disease. They additionally performed a two-year follow-up study in 104 of these individuals to assess the change in tau protein levels over time.
This study revealed that cortical tau protein first arises in a small region of the brain’s medial temporal lobe called the rhinal cortex. This initial emergence was independent of amyloid-beta levels and often occurred before amyloid-beta began accumulating. Then, when certain individuals additionally accumulated amyloid-beta throughout the brain, the researchers saw that tau was able to escape the rhinal cortex and catastrophically spread to other brain regions – the temporal neocortex and extratemporal neocortex.
The two-year follow-up study revealed that those individuals with the highest baseline tau levels in the rhinal cortex experienced the largest spread of tau throughout the neocortex. This finding is important because it suggests that tau PET measurements in the rhinal cortex can serve as a clinical indication of impending tau accumulation and spread.
These findings “have implications for clinical trials of Alzheimer’s disease-modifying therapeutics: they suggest that medical intervention against early tau accumulation could be effective in halting the progression of disease before the spread becomes catastrophic,” says Sanchez, adding that the new method has great potential to contribute to “advancing our efforts to provide effective interventions for patients at risk for Alzheimer’s disease.”
The quest for cheaper, safer energy storage with higher energy density, built from abundant resources, has driven battery innovations over the past several decades. Organic battery electrode materials that store charge with dedicated redox groups have emerged as an exciting option.
In this webinar, Yan Yao will discuss the design of organic battery materials for emerging rechargeable battery technologies, such as solid-state lithium batteries and multivalent metal batteries. He will also discuss an integrated battery diagnostic platform for structural, chemical and mechanical characterization that provides insights into the failure mechanisms of solid-state batteries.
Dr Yan Yao is Cullen Professor of Electrical and Computer Engineering at the University of Houston (UH). He received his PhD in materials science and engineering from UCLA in 2008. After working as a senior scientist at Polyera Corporation and a postdoctoral fellow at Stanford University with Prof. Yi Cui, Dr Yao joined the UH faculty in 2012. He was promoted to associate professor in 2017 and full professor in 2020.
Dr Yao has led research on the fundamental study of energy-storage materials and devices, spanning from solid-state batteries for electric vehicles to multivalent ion batteries and aqueous batteries for grid-energy storage. He has authored more than 110 journal articles with 25,000 citations and holds 10 US patents.
Dr Yao is a fellow of the Royal Society of Chemistry and senior member of the National Academy of Inventors and Institute of Electrical and Electronics Engineers. He received the Office of Naval Research Young Investigator Award (2013), UH Teaching Excellence Award (2016), UH Research Excellence Award (2018), Top 1% Clarivate Highly Cited Researcher (2018), and Scialog Collaborative Innovation Award (2018 and 2020). He founded the ECS University of Houston Student Chapter in 2016 and continues to serve as the faculty advisor.
Solar cells made from lead halide perovskites are good at converting solar power into electricity and relatively straightforward to manufacture. Unfortunately, they’re also unstable at room temperature and ambient humidity, which is something of a drawback for devices that tend to be located outside. Now, however, researchers at the US Department of Energy’s SLAC National Accelerator Laboratory and Stanford University may have found a solution. Their new technique involves pre-treating the material at high pressures and temperatures, and its developers say it could be scaled up for industrial production.
Perovskites are crystalline materials with an ABX3 structure, where A is caesium, methylammonium (MA) or formamidinium (FA); B is lead or tin; and X is chlorine, bromine or iodine. They are promising candidates for thin-film solar cells because they can absorb light over a broad range of solar spectrum wavelengths thanks to their tuneable bandgaps. Charge carriers (electrons and holes) can also diffuse through them quickly and over long distances. These excellent properties give perovskite solar cells a power conversion efficiency (PCE) of more than 18%, placing them on a par with established solar-cell materials such as silicon, gallium arsenide and cadmium telluride.
One of the most efficient perovskites for solar cell applications is composed of caesium, lead and iodine. This material, CsPbI3, has four possible phases: a yellow room-temperature non-perovskite phase (δ), plus three black high-temperature perovskite-related phases in which the crystal takes on a cubic (α), tetragonal (β) or orthorhombic (γ) structure. While the black phases are efficient at converting sunlight into electricity, heat and humidity quickly make them revert to the yellow phase, which is useless for photovoltaic applications.
High pressure squeeze
The SLAC-Stanford researchers have now shown that it is possible to nudge this yellow phase into an efficient and stable black configuration. Led by Yu Lin, Wendy Mao, Hemamala Karunadasa and Feng Ke, they did this by placing crystals of the yellow phase between the tips of a diamond anvil cell (DAC) and subjecting them to pressures of 0.1 to 0.6 GPa at a temperature of up to 450 °C. They then rapidly cooled the material down and removed the sample from the DAC.
Synchrotron X-ray diffraction and Raman spectroscopy measurements showed that this treatment yielded a version of orthorhombic γ-CsPbI3 that was stable in the presence of moisture (at a relative humidity of 20-30%) and remained efficient at room temperature for 10 to 30 days. This is a significant improvement over earlier efforts to stabilize the bulk black phase at room temperature using, for example, applied strain, surface treatments and changes to the material’s chemical composition – all of which produced good results only when the environment remained moisture-free.
“This is the first time that pressure has been used to control this material’s stability [under ambient conditions],” Lin says. “Now that we’ve found this optimal way to prepare the material, there’s potential for scaling it up for industrial production, and for using this same approach to manipulate other perovskite phases.”
Stabilizing the γ-CsPbI3 phase
Theoretical studies show that the secret to the SLAC-Stanford team’s success lies in the way the high-pressure, high-temperature treatment affects distortion within the perovskite lattice. Other researchers had previously identified three types of distortion: distortions of the BX6 octahedral units making up the material’s crystalline structure; B-cation displacements within the octahedra; and a tilting of individual BX6 octahedra relative to one another to create rigid corner-linked units. Of these three types, the third, known as octahedral tilting, is the most common.
In the current study, Chunjing Jia and Thomas Devereaux of the Stanford Institute for Materials and Energy Sciences (SIMES) used first-principles density functional theory calculations to show that applying pressure to the perovskite affects its tilt. More specifically, the pressure treatment makes it possible to control the relative energy difference between the desired γ-CsPbI3 phase and a competing non-perovskite phase (δ-CsPbI3) – and, ultimately, to stabilize the former.
In August 2020 the European Synchrotron Radiation Facility (ESRF) in Grenoble, France, unveiled the world’s brightest source of high-energy X-rays. An extensive upgrade costing €150m has transformed the facility into the Extremely Brilliant Source, or ESRF–EBS, a fourth-generation synchrotron that boosts the brilliance and coherence of the X-ray beams by around a factor of 100 over its predecessor.
Scientists from around the world started to use the machine on 25 August, for the moment via remote experiments, with the upgraded machine making it possible to study the structure of matter at the atomic level much faster and in greater detail than before. At this year’s ESRF User Meeting, a virtual event that will run from 8–10 February, attendees will be able to explore the technical capabilities of the ESRF–EBS as well as the new science it will enable.
Normally held at the ESRF, this year’s online event will enable scientists in all parts of the world to take part in tutorials, poster sessions and scientific symposia, and to interact with each other remotely to discuss new research ideas. A plenary session on Tuesday 9 February will feature five keynote lectures on different applications of synchrotron science – including industrial catalysis and research into the pathology of the virus that causes COVID-19 – as well as a facility update on the ESRF-EBS.
Suppliers to the ESRF will be taking part in a virtual commercial exhibition, and have also been working closely with the ESRF’s technical teams to deliver solutions that meet the needs of next-generation synchrotrons as well as other major scientific projects. A few examples are highlighted below.
Fast detectors deliver precision for high-energy applications
The ESRF has received eight EIGER2 detectors from DECTRIS as part of its EBS upgrade (Courtesy: DECTRIS)
DECTRIS, which specializes in producing high-performance hybrid photon-counting X-ray detectors, has developed two models for high-energy applications in next-generation synchrotron sources. The EIGER2 X and XE CdTe detectors combine high performance with easy integration and operation, and are equipped with a cadmium-telluride sensor to provide high quantum efficiencies for hard X-ray energies up to 100 keV. With a pixel size of just 75 µm, they also provide excellent spatial resolution.
The detectors’ readout eliminates any dead time, which ensures that no photons are lost between frames. Two adjustable energy thresholds make it possible to simultaneously determine the fluorescence background (the lower threshold) while also measuring the contribution of higher harmonics (the upper threshold). With a one-pixel point spread function (PSF) and a count-rate capability of 10⁷ photons/s per pixel, the EIGER2 X and XE CdTe detectors deliver more precise measurements than ever before.
Different versions of the detectors are available, with active areas ranging from 77 × 38 mm² to 311 × 327 mm². The smallest detectors in the EIGER2 X series offer frame rates of 2 kHz, while the XE versions have been optimized to combine a large active area with a frame rate of up to 550 Hz – ideal for applications such as macromolecular crystallography, materials science, and small-angle X-ray scattering. In addition, both versions have been designed to deliver high efficiency at high energies, yielding the best possible resolution and frame rate for next-generation synchrotrons.
The ESRF, as part of its EBS upgrade, has received eight EIGER2 detectors that are being taken into operation. “The combination of the extremely brilliant source and the high dynamic range and high sensitivity of the EIGER2 CdTe detector will allow us to detect weak features in diffraction data,” says beamline scientist Carlotta Giacobbe. “Ultrafast 3D mapping and ultra-fine slicing will now be accessible for monitoring systems as they evolve during in situ experiments”.
ITER sets new challenges for leak detection
ITER’s vacuum vessel must be kept under ultrahigh vacuum to maintain the right conditions for fusion (Courtesy: ITER)
40-30, a French company that specializes in process and vacuum technologies for large engineering projects, has been a long-time supplier to the ESRF and other major experimental facilities. That includes ITER, the fusion reactor now being built in Cadarache in the south of France, where 40-30 is part of a consortium designing and building one of the most complex leak detection systems in the world.
Achieving and maintaining the right conditions for the fusion reaction between deuterium and tritium requires the vacuum vessel that contains the plasma, along with the 14,000 m³ cryostat and other core components, to be kept under ultrahigh-vacuum conditions. Any leaks or contamination must be detected as soon as possible, but that’s a challenging task when there are around 2000 entry points to the overall system.
“We spent a whole year working out the basic functional requirements of the leak detection systems for ITER, and translating them into technical specifications and contract documentation,” says Roger Martín, the project manager at Fusion for Energy (F4E) who was responsible for procuring the detection system.
The resulting design is a series of helium leak-detection subsystems that exploit high-resolution mass spectrometry to detect helium ingress under specific conditions – including a high-radiation environment with strong magnetic fields, complying with the nuclear safety standards applying to ITER, and even the possibility of seismic activity. Although mass spectrometry is widely used for leak detection, adapting the technology to ITER standards is particularly demanding.
After an extensive tendering process, the €17m contract for delivering the leak-detection system was awarded to the IG4 consortium, led by Spanish engineering consultants IDOM. 40-30 will be responsible for process and vacuum engineering, as well as equipment specification, while Gutmar will manufacture, assemble and test the equipment.
“We are really proud to be involved in this project,” says Charles Agnetti, CEO of 40-30. “For this mission, 40-30 has assigned its most experienced specialists in vacuum technology and processes, leak detection and gas analysis.” It will take more than three years to deliver the project, with a preliminary design review due in the first half of 2021.
Compact LAMBDA detectors shrink even further
The first generation of the compact LAMBDA 750k X-ray detector (left) seems large compared to the re-engineered version (right).
X-Spectrum GmbH specializes in developing high-resolution and high-speed X-ray detectors built around Medipix3 RX detector chips, originally developed at CERN. The workhorse of the company’s line-up is the LAMBDA 750k, one of the most compact X-ray detectors on the market. In the latest development, the well-known wedge-shaped case of the LAMBDA 750k has been redesigned to create an even more compact box that measures just 235 × 100 × 42 mm³ – about half the length and height of the previous version.
The X-Spectrum engineers have worked hard to develop an efficient cooling system that fits inside this very small package. That will enable users to install the LAMBDA 750k X-ray detector in even the most crowded beamlines and tight experimental set-ups.
The LAMBDA 750k is one member of a family of next-generation pixel detectors for X-rays, all based on Medipix3 technology. The LAMBDA series are photon-counting detectors, effectively making them noise free, and offer speeds of up to 23,000 frames per second (with no readout dead-time) along with a small pixel size of 55 µm. A variety of sizes and configurations are available for different applications, and each one can be equipped with different sensor materials to allow high detection efficiency even at high X-ray energies.
The LAMBDA system also has “colour imaging” capabilities, where X-rays hitting the detector can be divided into two energy ranges. Originally developed by DESY in Hamburg, Germany, for use at the PETRA-III synchrotron, the system is designed for high reliability, and has external triggering and gating capability for synchronization with the rest of the experiment. It can also be easily integrated into common beamline control systems.
In most types of medical imaging, the aim is to gain information on the internal structure of the body, looking for growths, tumours or other abnormalities. In doing so, medical practitioners gain critical information that can be used in treatment.
However, for many illnesses we must move beyond simple structure and learn about the way in which an organ functions. This is particularly important for assessing disorders of the brain – such as epilepsy, dementia or mental health problems – where a structural image, no matter how detailed, often looks “normal”, even in patients with profound difficulties. In such instances it is the function of the brain that is perturbed. To gain insight, we must therefore develop technologies that image what the brain is doing, which means developing ways to assess the electrical activity of its 90 billion or so neurons.
One method of imaging brain function is magneto-encephalography (MEG) – a relatively new technology in which we measure the magnetic fields generated by current flowing through neuronal assemblies. Mathematical modelling of these fields generates 3D images showing moment-to-moment fluctuations in neural current (figure 1a). In this way, MEG is a safe and non-invasive way of imaging brain networks as they form and dissolve to support cognition. By letting us probe brain function, MEG is used extensively in research to investigate and understand both the healthy brain, and its perturbation by disease (figure 1b).
1 The power of magnetoencephalography (MEG) (a) When someone moves their fingers, clusters of neurons in the brain’s primary motor cortex are rendered active, and the resulting current passing through them generates magnetic fields. The component of this field that is radial to the head can be measured by MEG. In the visualization (left), red represents a field going away from the head, blue represents a field going into the head. Via a mathematical modelling process, called source reconstruction, we can use these fields to construct 3D images showing moment-to-moment changes in neural current (right). The yellow blobs show areas of fluctuating neural current during the finger movement. (b) With MEG, brain activity can be resolved not only in space but also in time. Electrical brain activity can be broken down into oscillations at characteristic frequencies (known as neural oscillations or brain waves). These figures show modulation of the amplitudes of electrical oscillations at different frequencies over time, occurring in the primary motor cortex as someone moves their finger. Blue shows a decrease in a neural rhythm relative to baseline, red shows an increase. While a subject moves their hand there is a decrease in beta (~20 Hz) oscillations, which then returns to baseline, post movement, via a ‘rebound’ (i.e. the signal goes above baseline). This rebound is abnormal in patients with schizophrenia and likely indicates a lack of communication between the primary motor cortex and other brain regions in patients. (Courtesy: Elena Boto, University of Nottingham; CC BY 4.0, adapted from Robson et al. NeuroImage: Clinical 12 869)
Despite excellent promise, the current generation of MEG scanners are severely limited, preventing their widespread adoption. The fields generated by the brain are low amplitude (~100 fT – about a billionth of the Earth’s field) meaning high-sensitivity detectors are required. For many years the only viable option was the superconducting quantum interference device (SQUID) – a cryogenic sensor that relies on quantum tunnelling through an insulating gap between two superconductors (the Josephson effect). The tunnelling current is a function of magnetic flux through the SQUID.
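That ~100 fT figure can be reproduced with a back-of-the-envelope forward model. The sketch below uses the simplest infinite-medium estimate for a tangential current dipole, B ≈ μ₀q/(4πr²); the dipole moment and distance are illustrative assumptions, and real MEG forward models (for instance the Sarvas formula for a conducting sphere) are considerably more involved.

```python
MU0_OVER_4PI = 1e-7  # T*m/A

def current_dipole_field(q_am, r_m):
    """Order-of-magnitude peak field (tesla) of a tangential current
    dipole of moment q_am (A*m) seen at distance r_m (metres), using
    the infinite-medium estimate B = mu0 * q / (4 * pi * r^2)."""
    return MU0_OVER_4PI * q_am / r_m**2

# a typical cortical source of ~10 nA*m seen from ~7 cm away
b = current_dipole_field(10e-9, 0.07)  # of order 1e-13 T, i.e. ~100 fT
```

The result lands at a couple of hundred femtotesla, consistent with the amplitudes quoted above and roughly a billion times weaker than the Earth’s field.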
However, to maintain their superconductivity, SQUIDs must be cooled to –269 °C, which limits the design and deployment of MEG scanners. First, because they operate at such low temperatures, a thermally insulating gap must be maintained between the sensor and the patient’s head to prevent injury. Because magnetic field decays with distance squared, this gap limits sensitivity to the brain’s magnetic field. Second, cryogenics mean that sensors must be fixed in position above the head inside a cryogenic dewar, which means that if a patient moves their head relative to the scanner, the quality of the data goes down drastically. Just a 5 mm shift can render data useless, and many people cannot tolerate this environment. The fixed nature of the sensors also results in a one-size-fits-all helmet. This is a significant barrier to scanning young children and babies, since the helmet is much too large. Finally, the complex combination of SQUID sensors, control electronics and cryogenics makes MEG expensive (typically a price tag of more than £2m plus high running costs).
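The cost of that thermally insulating gap can be illustrated with a quick calculation. All the numbers below are illustrative assumptions rather than values from a specific scanner: a cortical source ~2 cm below the scalp, a SQUID held ~3 cm off the scalp by the cryogenics, and an on-scalp sensor at ~6 mm.

```python
def relative_signal(source_depth_m, standoff_m):
    # for a current dipole the field falls off roughly as 1/r^2, so the
    # signal scales with the inverse square of the total source-to-sensor
    # distance (source depth plus sensor standoff from the scalp)
    r = source_depth_m + standoff_m
    return 1.0 / r**2

squid = relative_signal(0.02, 0.030)  # sensor held 3 cm off the scalp
opm = relative_signal(0.02, 0.006)    # on-scalp sensor, ~6 mm standoff
gain = opm / squid                    # roughly 3.7x more signal
```

Under these assumed geometries, simply eliminating the cryogenic standoff multiplies the measured signal severalfold, which is the central argument for on-scalp sensors.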
A quantum revolution
In recent years our research team has tackled MEG’s limitations by exploiting newly developed quantum technology. (This is a collaborative effort by the University of Nottingham’s Sir Peter Mansfield Imaging Centre and the Wellcome Centre for Human Neuroimaging at University College London, funded by the UK National Quantum Technologies programme and Wellcome.) In particular, the development of optically pumped magnetometers (OPMs) has been key. These are quantum-enabled magnetic field sensors that offer similar sensitivity to SQUIDs but without the need for cryogenics (figure 2).
2 Optically pumped magnetometers: the basics Each optically pumped magnetometer (OPM) contains a vapour of rubidium-87 atoms enclosed within a glass cell (a). When a beam of circularly polarized light that is resonant with an atomic transition (the D1 line) is directed through the vapour, it “pumps” the rubidium atoms into a quantum state in which their angular momentum is aligned with the beam. Because each atom has a magnetic moment linked to its angular momentum, the spin-polarized atomic vapour has a net magnetization that is highly sensitive to external magnetic fields. With all atoms in the same state and polarization induced, no further absorption of photons can occur, and the intensity of light passing through the cell from the laser is maximized. However, if polarization drops due to, for example, the interaction with an external magnetic field, light is absorbed again (b), and the signal at the photodetector decreases. Unfortunately, polarization is fleeting since atoms naturally “relax”. In a zero-field setting, this is mostly due to spin-exchange collisions, which cause a loss of coherence between precessing spins. In order to maintain a magnetically sensitive state, the effect of this relaxation must be counteracted. Perhaps counterintuitively, this is achieved by increasing the density of the vapour, and thus the rate of spin-exchange collisions. With lots of collisions in a very low magnetic field, the spins do not have enough time to decohere between collisions, meaning that polarization can be maintained along with sensitivity to external magnetic fields. This is called the spin-exchange relaxation free (SERF) regime. In the SERF regime, the bulk magnetic moment of the polarized gas obeys the Bloch equations – a set of equations describing the evolution of the macroscopic magnetization of a system as a function of time. 
The effect of an external field thus becomes well characterized, showing that polarization of the vapour – measured via light intensity passing through the cell – is a Lorentzian function of external field. Unfortunately, the symmetry of the Lorentzian means that positive and negative fields have the same effect on the vapour, but this is remedied by applying a second, known, external field. Specifically, applying an external oscillating field makes the polarization oscillate with an amplitude that is a linear function of small (<±1 nT) changes in the average field through the cell. Consequently, lock-in detection of the oscillation of the intensity of light passing through the cell provides a signal that varies linearly with field strength, producing a highly sensitive magnetometer.
Recent commercialization by the US company QuSpin has made OPMs robust, easy to use and readily available, while miniaturization has made the most recent generation small and lightweight (similar to a Lego brick in both size and weight). Based on this new design, our team has integrated OPMs into a working prototype MEG device. Because they are so small and don’t need cryogenics, OPMs can be mounted directly on the surface of a human head, increasing sensitivity by removing the thermally insulating gap and getting the sensor closer to the brain.
This also allows the sensor array to move with the head, making the MEG measurement resilient to subject motion. Similarly, flexibility of OPM placement means an array can adapt to any head size, enabling babies and children as well as adults to be scanned with the same system. The lack of complex cryogenics also means that OPM-based MEG systems are ostensibly cheaper to produce and run. This technology therefore allows MEG to evolve, making systems more practical, more powerful, significantly cheaper and consequently much more suitable for clinical use.
However, significant barriers had to be overcome before OPMs could be used in practice, and one of the biggest was controlling background magnetic fields. The field from the brain is much smaller than the time-varying fields that exist all around us, generated by laboratory equipment, computers, passing cars and even our bodies. Such sources create a cacophony of magnetic interference that makes measuring magnetic fields from the brain akin to hearing a pin drop at a rock concert.
Quantum toys Miniaturization has made each optically pumped magnetometer (OPM) sensor similar in size and weight to a Lego brick. These sensors are manufactured by QuSpin in Colorado, US. (Courtesy: Matthew Brookes, University of Nottingham; Lisa Gilligan-Lee, University of Nottingham)
To complicate matters, there is also the Earth’s magnetic field. It has no impact on SQUIDs because they are sensitive only to time-varying fields – and of course the Earth’s field does not change with time. But with OPMs we want patients to move freely during a scan, and this means allowing magnetometers to move relative to the Earth’s field. The moving OPMs will therefore measure magnetic field changes that have nothing to do with the brain, because they are rotating in a uniform field (or translating in a field gradient). In some circumstances, this can even prevent OPMs from operating.
For these reasons, both the time-varying environmental magnetic fields and the Earth’s static magnetic field must be removed in the vicinity of the OPM measurement. We achieved this high-fidelity control through a combination of active and passive magnetic screening.
Passive screening involves placing the MEG system in a magnetically shielded room with walls made from multiple layers of high-permeability metal. However, even the most advanced passive shield cannot sufficiently suppress the remnant Earth’s field so that patient movement can be allowed, and for this reason active field compensation systems have been introduced.
3 Active field control Helmholtz coils are devices that can create, or cancel out, a magnetic field in a region. But if we simply used them to control the external fields, the need to generate a complete 3D field vector (i.e. in three orthogonal orientations) would mean that the coils would have to completely surround and enclose the subject. Instead, we constrain the coils to two planar systems, each made up of eight independent coils stacked together. Using techniques adapted from MRI gradient coil design, each coil is constructed to generate a different field or field gradient (e.g. Bx, By, Bz, dBx/dx, dBx/dy) over a central region. This produces a similar performance to that of a Helmholtz cage, but without restricting the subject. These ‘fingerprint’ coils (NeuroImage 181 760) are mounted on two planes, each 1.6 × 1.6 m in size, separated by 1.5 m. The complexity of the wire paths and the scale of the system mean that fabricating such a design is challenging, yet the easier access and lesser constraint on the subject make the system much more practical for MEG studies. Importantly, these designs have other applications; similar techniques are being used to shield quantum gravimeters for advanced geophysical mapping of underground structures. (Courtesy: Niall Holmes, University of Nottingham; Lisa Gilligan-Lee, University of Nottingham)
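As a rough point of comparison for the currents involved, the field at the centre of an ideal Helmholtz pair is B = (4/5)^(3/2) μ0NI/R. The sketch below, using assumed (not actual) coil parameters, estimates the current such a pair would need to null the Earth’s ~50 μT field:

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability (T m/A)

def helmholtz_field(n_turns, current, radius):
    """Field (T) at the centre of an ideal Helmholtz pair:
    B = (4/5)**1.5 * mu0 * N * I / R."""
    return (4 / 5) ** 1.5 * MU0 * n_turns * current / radius

def current_to_cancel(b_target, n_turns, radius):
    """Current (A) for the pair to generate - and hence null - b_target."""
    return b_target * radius / ((4 / 5) ** 1.5 * MU0 * n_turns)

# Assumed parameters: 50-turn coils of 0.75 m radius, nulling Earth's ~50 uT field
current = current_to_cancel(50e-6, n_turns=50, radius=0.75)
print(f"Required current ≈ {current:.2f} A")  # about 0.83 A for these assumptions
```

The real planar ‘fingerprint’ coils must of course produce several independent field and gradient components, but the order of magnitude of the drive currents follows from the same scaling.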
These consist of intricate electromagnetic coil systems connected to a separate magnetometer array measuring the field close to the patient’s head. A feedback loop then uses these field measurements to generate an inverse field by controlling the currents in the coil array. In combination, these cancel the residual magnetic field almost entirely in the region of the patient’s head (figure 3). Working with industrial partner Magnetic Shields, we have built a unique hybrid active and passive shield in which the Earth’s field surrounding the patient is reduced from 50 μT to ~200 pT, a shielding factor of ~250,000. This technology generates a region of space around 0.5 m³ in which a patient can move freely during a scan.
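The quoted shielding factor follows directly from the two field values:

```python
B_EARTH = 50e-6       # ambient Earth's field: 50 microtesla
B_RESIDUAL = 200e-12  # residual field inside the hybrid shield: ~200 picotesla

shielding_factor = B_EARTH / B_RESIDUAL
print(f"Shielding factor ≈ {shielding_factor:,.0f}")  # ≈ 250,000
```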
To make the OPM-MEG system a reality we also had to solve a number of other problems. The exact arrangement of sensors on the head was optimized by a combination of analytical calculation and computer simulation. This allows an OPM array to efficiently sample the brain’s magnetic field while also minimizing “cross talk” – an effect where the measurement of field at one sensor is disrupted by the presence of other sensors. Meanwhile, physically holding an array of OPM sensors on the head was achieved by advanced 3D printing. Electronic control and data acquisition systems were also developed to enable synchronized measurements from up to 50 OPMs, while separate control systems were needed to integrate OPM outputs with coil systems and the control of patient stimuli. Finally, mathematical modelling packages were redeveloped to allow imaging of current density in the brain based on the OPM scalp-level field measurements. These developments have led to the world’s first wearable OPM-MEG system, giving complete coverage of the whole brain (figure 4).
4 The new generation of MEG Conventional MEG scanners (left) are cumbersome, one-size-fits-all devices that require patients to keep very still. Via a pathway of critical developments, physicists at the University of Nottingham, collaborating with neuroscientists at University College London, have used optically pumped magnetometers (OPMs) to create a new generation of MEG – a wearable device, adaptable to anyone, in which subjects are free to move, and which gives better data quality. (Courtesy: University of Nottingham)
Wearable imaging
This unique combination of quantum technology and electromagnetic theory makes it possible to conduct previously unimaginable neuroimaging studies where subjects can move freely and interact naturally with the world. Even at this early stage, neuroscientific demonstrations have ranged from measuring brain activity as someone plays ping-pong, to making MEG recordings while a subject explores a virtual world (figure 5).
With ongoing rapid exploitation, these examples are just the beginning; laboratories around the world are trying to gain access to this new technology, and a recently established spin-out company – Cerca Magnetics – is now bringing an integrated OPM-MEG system to the research market. Much of the ongoing technical development is now focused on imaging children’s brains, opening up the possibility of new studies on neurodevelopment. For example, imagine being able to see how a child’s brain function changes before and after they can walk, or talk. This opens a myriad of opportunities, prompting neuroscientists to conceive experiments in a completely new way.
5 Neuroscientific demonstrations (a) Brain activity evoked when playing a ball game. Neural oscillations in the ~20 Hz band modulate, with the largest effect centred on the brain regions controlling wrist and arm movement (Nature 555 657, reused with permission of Springer Nature). (b) To test a virtual-reality environment, here the participant finds themselves in a virtual room and must lean around a post to see a chess board in the distance. If the chess board appears on the left, activity is seen in the right visual cortex (red overlay); if the chess board appears on the right, brain activity is seen in the left visual cortex (blue overlay) (NeuroImage 199 408, reused with permission of Elsevier; Ben McGeorge Henderon). (c) Brain activity measured in a two-year-old subject during a sensory task (CC BY 4.0 Nature Communications 10 4785).
Neuroscience aside, perhaps the ultimate marker of success for OPM-MEG will rest in its clinical applications. Conventional MEG can be used to diagnose epilepsy by identifying specific types of “spike and wave” activity. It is also already used when “resective” surgery (removing the region of the brain causing seizures) is required to treat epilepsy that cannot be controlled by drugs – a MEG scan locates the affected area, significantly increasing the chances of a successful surgery.
MEG can also be used to map the “eloquent cortex” (areas of the brain that function normally) around the epileptic focus. This provides neurosurgeons with valuable information – for example, mapping the location in the brain of motor function allows a surgeon to avoid those regions, and so prevent paralyzing the patient. Now, however, the flexibility of OPM-MEG offers the chance for young children with epilepsy to benefit too, as it avoids some of the limitations of conventional imaging techniques such as EEG and MRI, which can struggle to localize where the problem is. Motion robustness should also enable us to record brain activity during a seizure. For epilepsy patients, OPM-MEG is therefore extremely promising.
But it could be applied to other disorders too. For example, this same practicality provides a better means of assessing individuals with Parkinson’s disease who struggle to remain still enough for conventional systems. Moreover, one in four people suffer from a mental health condition at some point in their lives and nascent recordings show that OPM-MEG can measure the connections in the brain that are thought to break down in severe mental health disorders. Meanwhile, characterization of “cortical slowing” (an effect where neural oscillations shift in frequency) in the elderly may generate new and early markers of the onset of dementia. These are just some of the brain disorders that may benefit from this new device.
Ultimately, understanding and managing human brain health is one of the major scientific challenges of the 21st century, and the steps needed to meet that challenge are far from clear. However, from X-rays to MRI, ultrasound to nuclear medicine, physics has always been able to deliver technology that changes lives. As we look to the future, perhaps this early success story of the UK’s quantum technology programme will become a cornerstone of the next generation of healthcare technology.
Robotic platforms enable surgeons to perform complex – usually minimally invasive – procedures with more precision, flexibility and control than possible using conventional techniques. To maximize its effectiveness, minimally invasive surgery requires not only well-engineered robotic instruments, but also precise target definition, achieved via image-guidance technologies.
Prior to surgery in complex anatomical regions, nuclear medicine techniques such as SPECT/CT or PET/CT can be used to create a map of the patient showing the number and location of surgical targets, such as tumours or tumour-related lymph nodes. Such maps are generated by detecting injected radiotracer that localizes in tumour cells.
The same radiotracer also enables “radio-guidance” during robot-assisted, minimally-invasive surgery. To detect the radioactive signal within the patient during the surgery itself, surgeons can use a new device known as a DROP-IN gamma probe, which was developed specifically for minimally invasive robotic procedures. The DROP-IN is a small tethered probe that detects gamma rays and generates a sound (and numeric readout) when it is close to a lesion containing radiotracer. The probe is positioned by the surgeon via the robot and is far more flexible than traditional laparoscopic gamma probes, allowing better localization of target lesions.
“The DROP-IN gamma probe is something we’ve been working on since 2014,” says Matthias van Oosterom from the Interventional Molecular Imaging laboratory at Leiden University Medical Center. “Since roughly 2018, we have been performing proof-of-concept studies in patients using this probe and have evaluated it in over 40 patients so far.”
Despite the availability of a patient map and intraoperative confirmation with the DROP-IN probe, it can still remain challenging to find an efficient route through the patient during surgery, especially when the surgical targets are situated at deep and complex locations. “In analogy to GPS navigation integrated in most modern cars, navigation is expected to take surgical localization to the next level,” says van Oosterom.
The challenge with surgical navigation, however, is that locating the exact position of the surgical instruments within the map of the patient (the SPECT/CT scan) is difficult, as many tool-tracking technologies aren’t suitable for use in a laparoscopic setting. To address this, van Oosterom and colleagues investigated the use of real-time fluorescence-based optical navigation of a DROP-IN gamma probe during robotic surgery. They tested the approach in phantoms and in pigs, reporting their results in the Journal of Nuclear Medicine.
Fluorescence tracking
To ease clinical translation, the team combined the DROP-IN gamma probe with two clinically approved technologies: a Da Vinci robot equipped with a Firefly Si fluorescence laparoscope; and a declipseSPECT navigation system.
To track the DROP-IN gamma probe, the researchers marked its housing with a three-ring pattern of fluorescein, which fluoresces in yellow. Automated detection of this fluorescence allowed the pose of the DROP-IN tip to be estimated with respect to the Firefly laparoscope. The declipseSPECT was then used to determine the pose of the Firefly and the patient within the operating room, based on reference targets attached to the Firefly and the patient.
The researchers first tested their navigation concept using a torso phantom containing bones, artery structures and pelvic lymph nodes filled with radioactive tracer. They fixed the navigation reference target on the phantom’s hip and acquired three SPECT/CT scans, using the reference target to register the scans within the operating room.
The view inside the phantom, as seen on the robotic surgical console, included an augmented reality overlay of the lesions segmented from the SPECT/CT scan. Maintaining a direct line-of-sight between the Firefly and the DROP-IN enabled real-time calculation of the distance between the targeted lesion and the DROP-IN tip, with audible feedback confirming effective navigation.
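The distance feedback amounts to expressing the tracked probe tip and the segmented lesion in a common (patient) frame and computing their separation. A toy sketch with hypothetical poses and coordinates – not the actual declipseSPECT pipeline – illustrates the geometry:

```python
import math

def transform(pose, point):
    """Apply a rigid transform (3x3 rotation R, translation t) to a 3-vector."""
    R, t = pose
    return tuple(sum(R[i][j] * point[j] for j in range(3)) + t[i] for i in range(3))

def tip_to_lesion_distance(tip_in_laparoscope, laparoscope_pose, lesion_in_patient):
    """Distance (m) between probe tip and lesion, both in the patient frame.
    laparoscope_pose maps laparoscope coordinates into the patient frame."""
    tip = transform(laparoscope_pose, tip_in_laparoscope)
    return math.dist(tip, lesion_in_patient)

# Hypothetical numbers: laparoscope frame rotated 90 deg about z, offset 10 cm in x
R = ((0, -1, 0), (1, 0, 0), (0, 0, 1))
pose = (R, (0.10, 0.0, 0.0))
d = tip_to_lesion_distance((0.02, 0.0, 0.05), pose, (0.10, 0.02, 0.02))
print(f"Tip-to-lesion distance: {d * 100:.1f} cm")  # 3.0 cm for these numbers
```

In the real system the laparoscope and patient poses come from the navigation system’s reference targets, and the DROP-IN tip pose from the fluorescein ring markers; the final distance calculation is exactly this kind of frame composition.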
To investigate DROP-IN tracking in a real-life surgical setting, the researchers next performed robot-assisted laparoscopic surgery on pigs, in the surgical training facility of Orsi Academy. By depositing indocyanine green (ICG, which fluoresces in pink) in the abdominal wall they created surgical targets to which the DROP-IN probe was navigated.
In fluorescence-imaging mode, both the fluorescein markers on the DROP-IN probe and the ICG deposits were visible. Uniquely, fluorescein and ICG fluorescence could be excited simultaneously and the emissions distinguished using the raw signal output of the Firefly camera, allowing the surgeon to visually confirm the localized lesions during navigation. There was no obvious difference in intensity between the two fluorescence signals, and bleaching of the fluorescein rings was not observed during the hour-long experiments.
The researchers conclude that these first steps towards optical navigation of the DROP-IN gamma probe could further integrate interventional nuclear medicine into robotic surgery. The first application could be in prostate cancer treatments, where robotic surgery is already well established, says van Oosterom. He notes that radio-guided surgery using a radiotracer that targets prostate specific membrane antigen (PSMA), is particularly promising. “The DROP-IN gamma probe allows the surgeon to operate on these patients with the robot, to specifically excise tumour cells using the PSMA-targeted tracer,” he explains.
The team has already demonstrated that the DROP-IN works well for robot-assisted radio-guided lymph node dissection in prostate cancer surgery, as described in European Urology. “Following our successful proof-of-concept studies, we are currently extending in-human trials that evaluate the PSMA-targeted robotic surgery,” van Oosterom tells Physics World. “Current findings suggest that our optical navigated DROP-IN gamma probe setup will provide additional value in these in-human studies.”
Testing times All about a new quantum-enabled MEG scanning technique
In most types of medical imaging, the name of the game is to find out about the internal structure of the body, looking for growths, tumours or other abnormalities. But for many illnesses, simple structural information is not enough: you need to know how an organ functions too.
That’s particularly important when assessing disorders of the brain – such as epilepsy, dementia or problems with mental health – where a structural image, no matter how detailed, often looks “normal”, even in patients with profound difficulties.
One method of imaging brain function is magnetoencephalography (MEG), which traditionally uses superconducting quantum interference devices (SQUIDs) to measure the tiny magnetic fields created by neuronal assemblies. Trouble is, these devices have to be cooled to –269 °C, which is one reason why MEG scanners are so expensive.
In the February 2021 issue of Physics World magazine, Hannah Coleman and Matt Brookes from the University of Nottingham in the UK explain how “optically pumped magnetometers” could allow MEG to be more widely used. These quantum-enabled magnetic devices are as sensitive as SQUIDs – but don’t need any fiddly cryogenics. You can also read the article online here.
For the record, here’s a run-down of what else is in the issue.
• Radio offers view of gravitational waves – New limits on the size of primordial gravitational waves have been set that are several orders of magnitude lower than the most sensitive laboratory experiment, as Edwin Cartlidge reports
• Physicists welcome Brexit deal – While there is relief that the UK has reached a deal with the European Union, the true impact of the agreement will be difficult to unravel given the ongoing pandemic, as Michael Allen reports
• Protect the scientists of tomorrow – Karel Green says that funders must support all PhD students due to the devastating effects on research caused by the COVID-19 pandemic
• Let’s go green – James McKenzie believes the UK government’s ambitious 10-point-plan for a “green industrial revolution” can deliver – if we put our collective minds to the problem
• Very deep thinking – Robert P Crease discovers why a nuclear-waste programme in Finland can help us to envisage the world a million years from now
• Quantum sensing the brain – Novel healthcare technology based on fundamental physics saves millions of lives every year. While these machines have revolutionized medicine, the next generation must meet even greater challenges. Hannah Coleman and Matt Brookes are hoping that the University of Nottingham’s new quantum-enabled MEG scanner will herald a new dawn in the study of human brain function
• Improving nuclear fuel safety – A decade after the Fukushima disaster, Michael Allen investigates claims from academia and industry that the next generation of nuclear fuels could reduce the risk of similar accidents occurring ever again
• The beams at the edge of physics – Creating a new cutting-edge accelerator isn’t cheap or easy. But as Kit Chapman discovers, the upcoming Facility for Rare Isotope Beams (FRIB) in Michigan promises great things for nuclear physicists, especially those with applications in mind
• Rudiments of reality – Philip Ball reviews Fundamentals: Ten Keys to Reality by Frank Wilczek
• Reinventing the science museum – Margaret Harris reviews Idea Colliders: the Future of Science Museums by Michael John Gorman
• Building a quantum powered future – Quantum physicist and chief executive of Oxford Quantum Circuits Ilana Wisby talks to Tushna Commissariat about deep tech, entrepreneurship and quantum technologies of the future
• Islands in the stream – Logan Chipkin on how other galaxies were discovered
Intense “blue jets” that form during thunderstorms and extend into the stratosphere have been studied in detail by researchers in Denmark, Norway and Spain. Led by Torsten Neubert at the Technical University of Denmark, the team used thunderstorm observations taken aboard the International Space Station (ISS) to work out how the jets form, and how they influence processes higher up in the atmosphere.
Blue jets are short-lived, lightning-like electrical discharges that can appear in the upper reaches of storm clouds. They occur when the potential difference between positively charged upper cloud regions and negatively charged boundary layers between the cloud and the stratosphere exceeds a certain breakdown voltage. At this point, previously insulating air will conduct intense electrical currents.
Within these environments, the jets begin their lives as channels of ionized air called “leaders”. These channels have two propagating tips, which travel in opposite directions due to their opposing charges. As the positive tip propagates upwards towards the negatively charged boundary layer, it transitions into branching discharges called “streamers”, which fan out to form the cone-shaped structures observed as blue jets.
Electric field pulses
Previous studies have associated blue jets with strong electric field pulses that last between 10 and 30 µs and are accompanied by intense bursts of radio waves. Until now, however, researchers had not fully characterized how this behaviour unfolds.
Neubert’s team studied this phenomenon using observations from the ISS, which offers an unimpeded view of the tops of thunderclouds. Measurements were made using the ISS’s Atmosphere-Space Interactions Monitor (ASIM) instrument, which contains three photometers that measure the intensity of electromagnetic radiation emitted by different atmospheric gases during lightning flashes. In addition, ASIM has two cameras that take images of flashes.
The team used ASIM to monitor a thunderstorm in the central Pacific Ocean in February 2019. During the event, they recorded five intense blue flashes, each lasting around 10 µs. From their measurements, they discovered that the flashes initiated pulsating blue jets between the stratosphere and the ionosphere – the region of Earth’s upper atmosphere where air molecules are ionized by solar radiation.
Neubert’s team concluded that these jets occur as branching streamers are initiated by short, localized leaders, like those found in more familiar cloud-to-ground lightning. As they fanned out, these streamers unleashed waves of ionization in the surrounding air. In addition, the researchers showed how blue jets are accompanied by “elves” in the ionosphere. These dim, expanding glows occur around 100 km above the ground, and can expand to hundreds of kilometres in diameter. Overall, the team’s results offer researchers a far greater understanding of how these dynamic events unfold.