A team of US-based researchers has created an innovative robotic platform that can remotely measure hospital patients’ vital signs – and could help to significantly reduce the infection risk faced by healthcare workers assessing people with symptoms of COVID-19.
The platform consists of four cameras attached to a dog-like robot developed by Boston Dynamics. The robot, which is operated via a handheld remote-control device, can also be equipped with a tablet that allows doctors to ask patients about their symptoms without needing to be in the same room.
The results of the research have been published on the preprint server TechRxiv, but have not yet been peer-reviewed by scientific or medical experts. The paper describes how researchers based at Massachusetts Institute of Technology (MIT), Boston Dynamics and Brigham and Women’s Hospital used the robotic platform to measure vital signs – including skin temperature, breathing rate, pulse rate and blood oxygen saturation – in healthy volunteers, from a distance of two metres. The team is now making plans to test the robot’s efficacy in patients with COVID-19 symptoms.
Vital signs
As Hen-Wei Huang, a post-doctoral researcher at MIT and one of the lead authors, explains, the measurement of vital signs is an “essential aspect of the initial patient clinical evaluation”. This is particularly true in the case of COVID-19, which is often associated with significant changes in vital signs, including fever, which can be detected via elevated skin temperature; shortness of breath, which can be detected by measuring the respiratory rate; and an increase in heart rate and a decrease in blood oxygen saturation.
“Identifying abnormal vital signs like these helps clinicians to triage patients and determine who needs the most urgent care,” says Huang. “However, current standard procedures require healthcare workers to put sensors and devices on patients, thus increasing the risk of the workers being infected with COVID-19.”
Using the robotic platform to measure the patient’s vital signs can mitigate the risk of spreading infection and reduce the consumption of personal protective equipment. Moreover, an agile mobile robot allows measurements to be taken in a dynamic indoor environment, such as an emergency department.
Next steps
In Huang’s view, one key advantage of using the robotic platform in a clinical environment is that the camera setup enables simultaneous monitoring of four vital signs that are highly relevant to COVID symptoms.
“As the setup is mobile and enabled with artificial intelligence, the robot would dominate the measurement procedures instead of asking patients to adapt to it. Moreover, the agile platform could follow and track patients’ movements, and thus give much freedom to patients during the measurement,” he explains.
Following an initial evaluation of the platform on healthy volunteers, Huang and his team are now stepping up efforts to evaluate patients presenting with symptoms and signs suggestive of COVID-19.
“Our next steps include development of a fully autonomous robotic system that can handle the vital signs monitoring by itself without intervention from the clinical staff,” he says. “In addition, we aim to further reduce the cost to maximize the cost-effectiveness of the system and its reach globally.”
Researchers at ETH Zurich in Switzerland have tracked the ultrafast movements of electrons in liquid water for the first time – an important step towards understanding the fine details of how chemical and biochemical reactions begin in liquid environments.
According to the classical Bohr model of the atom, an electron takes roughly 150 attoseconds (10⁻¹⁸ s) to orbit the proton in a hydrogen atom. Measurements on a timescale of a few dozen attoseconds have been possible for several years now, notes team leader Hans Jakob Wörner of ETH’s Laboratory of Physical Chemistry, and even at lower resolutions of 10 femtoseconds (10⁻¹⁵ s) it is already possible to observe certain subatomic-scale processes – up to and including the breaking of chemical bonds. “Electron movements are the key events in chemical reactions,” he says. “That’s why it’s so important to measure them on a high-resolution time scale.”
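That 150-attosecond figure follows from a rough back-of-the-envelope estimate: in the Bohr ground state the electron moves at a speed of αc (where α is the fine-structure constant) around an orbit of radius a₀, the Bohr radius, so its orbital period is

$$
T = \frac{2\pi a_0}{\alpha c} \approx \frac{2\pi \times 5.29\times 10^{-11}\,\mathrm{m}}{2.19\times 10^{6}\,\mathrm{m\,s^{-1}}} \approx 1.5\times 10^{-16}\,\mathrm{s} \approx 150\ \mathrm{attoseconds}.
$$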
Wörner’s team were among the first to detect electron movements at the attosecond scale. However, because the wavelength of their attosecond pulses is in the extreme ultraviolet (XUV) range, which is easily absorbed by air, they had to carry out their measurements in a vacuum chamber. This restriction meant that they were only able to perform attosecond measurements on molecules in the gaseous state.
Novel measuring scheme
The researchers have now developed a novel measuring scheme that relies on comparing photoemissions (that is, emissions of electrons prompted by light striking a material) from liquid water with those from water vapour. In their experiment, they inject a microscopic jet of liquid water into their vacuum chamber through a quartz nozzle with an inner diameter of roughly 25 μm. This jet flows smoothly (laminar flow) for a few millimetres before breaking up into droplets, which then freeze upon contact with a liquid-nitrogen cold trap. “The laminar-flow region of the microjet allows us to study liquid water in a quasi-equilibrium state, very close to room temperature,” Wörner explains.
To make their attosecond measurements, the team focus an XUV train of attosecond pulses (obtained from a 30-femtosecond near-IR laser pulse in an argon gas cell) onto the evaporating water. In addition to this XUV pulse train, they also focus a strongly attenuated replica of the IR pulse onto the liquid microjet. This set-up allows them to detect electrons photoemitted from the liquid and gas phases at the same time.
The ETH Zurich researchers observed that photoelectrons from liquid water molecules are emitted 50-70 attoseconds after those emitted from gaseous water molecules. This time difference stems from the fact that the molecules in liquid form are surrounded by other water molecules, Wörner explains. This affects the molecules’ electronic structure, producing a measurable time delay.
An important step forward
Going from measurements in gases to measurements in liquids is an important step forward, since most chemical reactions – especially those that are biochemically interesting – take place in liquids, Wörner notes. Reactions such as photosynthesis in plants and the biochemical processes in our retina that allow us to see are, like photoemission from water, triggered by light – and photoemission from water is also the dominant source of the DNA damage caused by X-rays and other ionizing radiation. “With the help of attosecond measurements, scientists should gain new insights into the most elementary steps of these processes in the coming years,” he says.
A new class of stars called “fast yellow pulsating supergiants” has been identified by astronomers in the US and Switzerland. The discovery could solve the “red supergiant problem” of astrophysics, which refers to the lack of observations of Type IIP supernova progenitor stars with masses in the range of 16–30 solar masses.
Stars heavier than about 8 solar masses are thought to spend their final phase of life as red supergiants (RSGs), before undergoing core collapse and exploding as supernovae. The mass of a Type IIP supernova progenitor can be determined by measuring the star’s brightness shortly before its core collapses and it explodes. Although red supergiants have been observed in the 16–30 solar mass range, none so far have been identified as progenitors of Type IIP supernovae. This contradiction of the current theory of stellar evolution is known as the red supergiant problem.
It seems that only the lower-mass RSGs explode, which raises the question: what is the fate of the more massive RSGs? One possibility is that many RSGs evolve back to the yellow or blue stages of their lifecycle. Such post-RSGs would end their lives as something other than RSGs, thereby resolving the red supergiant problem.
Evolving to yellow and blue
Recently, a team of astronomers sought to observe such post-RSGs. Trevor Dorn-Wallenstein at the University of Washington and colleagues reasoned that stars called pulsating yellow supergiants might be candidates for post-RSGs. If an RSG were to lose enough mass, it would evolve towards yellow and blue on the Hertzsprung-Russell (HR) diagram and pulsate noticeably. The HR diagram plots stars’ luminosity against their effective temperature.
“The red supergiant problem has been around for years,” notes Philip Massey of the Lowell Observatory in Arizona, who was not involved in this latest research. “One explanation is that these stars ‘turn around’ and return to the blue side of the HR diagram. Then the question becomes: how do you distinguish a post-red supergiant star from a pre-red supergiant star? One way to do it would be to look for pulsations – only post-RSGs should display unusual pulsations. And that’s just what [Dorn-Wallenstein and colleagues] did.”
The team used data from the Transiting Exoplanet Survey Satellite (TESS), which takes observations every two minutes and collects light curves – graphs of a star’s brightness as a function of time – for the brightest stars across 85% of the sky. Dorn-Wallenstein and colleagues analysed the variability of 76 supergiants in their search for stars that have the predicted properties of post-RSGs.
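To give a flavour of what such a variability search involves, the sketch below uses a Lomb–Scargle periodogram to hunt for sub-day periodicities in a single light curve. It is a minimal illustration only: the input file, frequency grid and threshold are hypothetical, not those used by Dorn-Wallenstein and colleagues.

# Minimal sketch of a periodicity search in a TESS-style light curve using a
# Lomb-Scargle periodogram. The file name and frequency grid are illustrative.
import numpy as np
from astropy.timeseries import LombScargle

# Hypothetical input: time in days and flux, one row per two-minute sample
time, flux = np.loadtxt("supergiant_lightcurve.txt", unpack=True)
flux = flux / np.median(flux) - 1.0  # convert to relative brightness variations

# Search frequencies from 0.1 to 20 cycles per day (periods of 0.05-10 days)
frequency = np.linspace(0.1, 20.0, 100_000)
power = LombScargle(time, flux).power(frequency)

best_period = 1.0 / frequency[np.argmax(power)]
print(f"Strongest periodicity: {best_period:.3f} days")
if best_period < 1.0:
    print("Sub-day variability - a candidate fast yellow pulsating supergiant")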
New class of supergiants
Their search revealed a group of five yellow supergiants that show rapid multiperiodic variability with periods of less than one day. These stars are also more luminous and warmer than typical Cepheid variables (stars that fluctuate periodically in brightness), fainter than outbursting yellow “hypergiants”, and cooler than the coolest Alpha Cygni variables, which are another type of supergiant. The five yellow supergiants are also concentrated in a region of the HR diagram not previously associated with pulsating stars. As a result, Dorn-Wallenstein and colleagues say that the stars appear to belong to a class of never-before-seen supergiants, which they have called fast yellow pulsating supergiants (FYPSs).
“They established that this variability could not have been a coincidence,” says Massey. “All massive stars have some variability, but they made a convincing case that this is a unique set of objects.”
Tantalizingly for the red supergiant problem, the lowest estimated mass of the observed fast yellow pulsating supergiants is close to the highest mass of the red supergiant supernova progenitors. It is therefore plausible that, rather than immediately dying in supernovae, the more massive RSGs evolve into FYPSs as part of their lifecycle.
Dorn-Wallenstein and colleagues leave it to future work to determine the exact evolutionary status of these FYPSs. While these objects may indeed have evolved from RSGs, more theoretical and observational work needs to be done to tell whether or not these FYPSs are definitively the solution to the red supergiant problem.
The news last week that scientists had spotted a potential signature of life in the clouds of Venus was always likely to cause a stir. But arriving in the middle of the COVID-19 pandemic – during which our everyday lives have changed significantly – the story has truly captured the public imagination. In the latest episode of the Physics World Stories podcast, Andrew Glester takes a broad view of the discovery: an inspiring example of lateral thinking, persistence and collaboration.
The deduction that Venus could be harbouring life is linked with the detection of phosphine gas in the planet’s atmosphere. For terrestrial planets such as Venus and Earth, the only known processes that generate phosphine in such a location are connected with metabolism. To learn more about the astrobiology involved, Glester catches up with two members of the team behind the discovery, both based at Massachusetts Institute of Technology.
Clara Sousa-Silva is a quantum astrochemist who for over a decade has studied phosphine as a potential signature for extraterrestrial life. She is joined by Sara Seager, an astronomer and planetary scientist, who among other things speaks about future missions to Venus to help resolve this mystery. As both researchers explain, the “life hypothesis” came as a last resort following a rigorous search for alternative explanations.
The first and largest study of focal high-intensity focused ultrasound (HIFU) ablation as a primary treatment for prostate cancer in the United States showed that HIFU provides an effective alternative to surgery or radiotherapy. Mirroring previous research conducted in Europe, the study of 100 men revealed encouraging outcomes and shortened patients’ recovery times.
Focal HIFU ablation uses a focused ultrasound beam to raise the temperature inside the prostate to approximately 90°C to destroy targeted areas of prostate tissue. HIFU hemigland ablation, which treats targeted and known areas of prostate cancer, has been used successfully worldwide, with few of the debilitating urinary and sexual side-effects associated with standard treatments such as whole-gland radical prostatectomy and radical radiotherapy.
HIFU was approved for prostatic tissue ablation by the US Food and Drug Administration in 2015. Following this, the Keck School of Medicine of the University of Southern California offered it as a primary treatment for localized prostate cancer and launched the study to assess men who underwent a HIFU procedure for prostate cancer between 2015 and 2019.
The patients, ranging in age from 59 to 70, had very low (8%), low (20%), intermediate favourable (50%), intermediate unfavourable (17%) and high (5%) risk prostate cancer. Only 13 of the 100 patients experienced minor complications after hemigland HIFU, such as difficulties with urination and urinary tract infection. No patients had major complications. Half of the patients completed quality-of-life questionnaires and reported improved urinary symptoms with no significant decrease in sexual potency. None experienced urinary incontinence.
Writing in the Journal of Urology, first author Andre Abreu, of USC Urology, and colleagues report that over a median follow-up of 20 months, 73% of the patients did not experience treatment failure, defined as clinically significant cancer recurrence, metastases or mortality, or the need for additional hormone therapy, chemotherapy, radiotherapy or surgery. Bilateral prostate cancer at the time of diagnosis was the sole predictor for recurrence. Ninety percent of patients did not require repeat focal HIFU, and 91% did not need any type of radical (whole-gland) treatment within the study timeframe.
The researchers also emphasize the importance of mandatory follow-up biopsies. This recommendation is based on data showing that multiparametric MRI has a sensitivity of only 44% for detecting clinically significant prostate cancer, despite it being the preferred imaging modality for follow-up after partial gland ablation.
“Focal HIFU ablation is safe and provides excellent potency and continence preservation with adequate short-term cancer control,” Abreu tells Physics World. “This study provides the initial US HIFU data to prostate cancer stakeholders, including clinicians and patients. We hope these positive data encourage urologists to offer focal HIFU ablation to effectively address prostate cancer without the intrinsic side-effects of radical treatments. Physicians should discuss in detail with their patients all potential treatment options for prostate cancer, to ensure that they receive personalized care that addresses their individual needs according to the disease they have.”
The Proton Improvement Plan-II (PIP-II) will be the first accelerator in the US with major contributions from other countries – India, UK, Poland, Italy and France. Is working with major international partners new for Fermilab?
Particle physics is a truly global enterprise because of the sheer size and complexity of projects and the expertise and resources needed. As a result, building and maintaining major facilities is often beyond the capability of a single country. Fermilab is therefore very used to collaborating internationally with partners. We have been contributing to CERN for many years, for example, and we are now building magnets for the high-luminosity upgrade project on CERN’s Large Hadron Collider.
What is new about PIP-II is that it is the first US Department of Energy (DOE) funded accelerator that will be built with significant international participation. Indeed, without our global partners, we could not build PIP-II right now. The partnership includes research institutes, laboratories, and universities in the collaborating countries, so we have had to devise a framework to bring everyone together. This framework includes agreements on several different levels: country-to-country; lab-to-lab; and institute-to-institute. This framework must satisfy DOE rules and the rules of our partner countries. We also need a framework to get the technical work done. As you can imagine, shipping large accelerator components from Europe and India to the US entails more risk than making the components at Fermilab or possibly elsewhere in the US.
We have learned a lot from European accelerator projects, which tend to be international in nature. CERN in Switzerland has been building accelerators with international contributions for decades. Other recent examples include the European X-ray Free-Electron Laser (XFEL), which began operation in 2017 at DESY in Germany, and the European Spallation Source (ESS), which is being built in Sweden. ESS, XFEL and CERN have all generously shared their experiences with us and we have adapted their approaches to create our own framework – which has been approved by the DOE. PIP-II is not going to be the last DOE project to be built with international contributions – there will be more to come. So, we want PIP-II to provide a successful example that other projects in the US can follow.
Is the design and construction of PIP-II on schedule? And when do you expect to have your first beam?
Before COVID-19 came about, the project was going to be completed at the end of fiscal year 2027. We will, undoubtedly, have some pandemic-related delays so some time in 2028 is a realistic completion date. We expect to have a lower-power beam running by early 2028, so at that point we can start sending beam to Fermilab’s Long-Baseline Neutrino Facility (LBNF). Full-power operation at 1.2 MW will come later.
We have already completed, and are currently commissioning, the PIP-II Injector Test Facility at Fermilab. This is a near full-scale prototype of the first section of the PIP-II linear accelerator (linac). It includes the entire room-temperature front-end section of the machine, which accelerates protons up to 2.1 MeV, and two cryomodules that then take the beam up to about 25 MeV. The design energy of PIP-II is 800 MeV, so this is a respectable prototype and will be used as a systems engineering test bed for the front-end of PIP-II.
How will PIP-II be integrated within Fermilab’s accelerator complex?
PIP-II will replace the 50-year-old linac at Fermilab, but it will operate at twice the energy (800 MeV rather than 400 MeV). It will deliver a much higher-quality beam to Fermilab’s existing Booster accelerator, which will increase the proton energy to 8 GeV. Some of these protons will be directed at a variety of physics experiments including several that will look at the physics of muons. Other protons will be sent from the Booster to the existing Recycler–Main Injector complex, which will give the beam a further boost to 60–120 GeV. It is these 60–120 GeV protons that will be sent to LBNF.
What types of experiments will PIP-II support when it is operational?
PIP-II fulfils two missions. The first is to create an intense beam of neutrinos – in fact, the world’s most intense beam of neutrinos – for Fermilab’s LBNF. These neutrinos will be sent 1300 km to detectors in South Dakota that are located a mile underground, allowing physicists to study neutrino oscillations.
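For context (a detail not part of the interview itself), the sensitivity of such an experiment depends on the ratio of baseline to neutrino energy. In the simplified two-flavour picture, the probability that a neutrino changes flavour after travelling a distance L with energy E is approximately

$$
P \approx \sin^2(2\theta)\,\sin^2\!\left(1.27\,\frac{\Delta m^2\,[\mathrm{eV}^2]\;L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]}\right),
$$

where θ is a mixing angle and Δm² the neutrino mass-squared splitting. For the atmospheric splitting of roughly 2.5 × 10⁻³ eV², a 1300 km baseline puts the first oscillation maximum at a neutrino energy of about 2.5 GeV – the few-GeV regime targeted by long-baseline experiments of this kind.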
Making neutrinos at LBNF involves firing an intense beam of protons at a target. Initially, PIP-II will deliver a beam power of 1.2 MW and we intend to upgrade this to 2.4 MW in the future. PIP-II is needed because the existing Fermilab accelerator complex can only deliver about 750 kW and for technical reasons we cannot push the current technology to 1.2 MW.
The second mission is to support and enable a broad programme of physics research at Fermilab for the international community for many decades to come. Some of this research will be related to muons, which can be created by firing protons at a target. Muons will be studied in detail in several new experiments such as Mu2e-II, which will measure how muons decay to electrons. The aim is to observe forbidden decay processes that would point to physics beyond the Standard Model of particle physics. If they exist, these forbidden processes would be very rare and therefore these experiments will benefit from the large beam power delivered by PIP-II.
Another experiment that could benefit from PIP-II’s proton beam power is based on the η and η’ mesons. These rare particles can be used for searching for physics beyond the Standard Model, provided more than 10¹³ such particles are produced, which PIP-II makes possible. An experiment has been proposed that utilizes a detector highly sensitive to processes from new physics, but mostly insensitive to background from old physics.
PIP-II can also enable a low-energy muon programme, using not only its large beam power but also its extremely flexible bunch structure to support two different classes of slow-muon experiments: those involving continuous beams and those involving pulsed beams. Muon spectroscopy is being used in a wide range of research areas, including superconductivity, magnetism, battery materials, semiconductors and more.
Last but not least, there is significant interest in using the excess protons at about 1 GeV that PIP-II could provide when operated in continuous mode (total linac beam power ∼2 MW). This could be coupled to a proton storage ring to drive a megawatt-class proton beam dump facility at Fermilab dedicated to and optimized for high-energy physics experiments. This facility would support a rich physics programme, including searches for accelerator-produced dark matter that could explain the observed cosmological abundance, and for active-to-sterile neutrino oscillations, which would provide evidence for sterile neutrinos.
Will all these experiments be able to operate at the same time?
PIP-II is designed with the ability to distribute the beam to different experiments in programmable bunch patterns. That means we can deliver a bunch pattern that is tailored to the particular requirements of an experiment. This has been used for many years at the Continuous Electron Beam Accelerator Facility at Jefferson Lab in Virginia, where it has been successful at supporting a very productive physics programme.
I have mentioned many times in this column the value of the business awards given by the Institute of Physics (IOP), which have benefited more than 70 physics-based firms in the UK and Ireland since they were introduced eight years ago. You may also remember me saying how in 2018 the IOP expanded the business awards so there was a new prize for start-up firms. We felt it was vital to support early-stage businesses, which had often been unable to provide sufficient evidence of commercial growth to be recognized by existing awards.
The following year we added another award, specifically for very-early-stage companies taking innovative products into the medical and healthcare sector. Financed in perpetuity by a generous donation from Ann and Mike Lee, who had founded the UK-based firm Research Instruments in the early 1960s, the £5000 Lee–Lucas award was given for the first time earlier this year to two new firms: Cellular Highways and Nebu~Flow.
Traditionally, the IOP hosts a reception at the Houses of Parliament in London where all business-awards winners can showcase their products and technology (space permitting) to funding-agency executives, senior industrialists and politicians. The winning firms are also invited to the IOP awards dinner, get highlighted by the IOP’s marketing and social-media teams, get to speak at relevant IOP meetings, and take part in the events run by the IOP’s Business Innovation and Growth (BIG) group.
As part of this year’s BIG group virtual summit, founders and chief executives from the winning businesses were able to explain their technology, its applications and their firms’ visions at two webinars held in July. In fact, the BIG group’s recent online talks, seminars and workshops have been attended by more people than ever went to our past physical events. I’m sure in future that the online and physical events will run in parallel.
This year’s 13 award winners, which were announced in July, are working on a wide variety of “deep-tech” physics-based research and innovations covering everything from quantum computing, compound semiconductor magnetic sensors and acoustic-nebulizer technology to magnetics technology and novel imaging solutions. As the IOP’s vice-president for business, I chair the judging panel, and it’s one of the most satisfying aspects of the role. Working alongside a group of IOP fellows and others with a track record of bringing physics-based innovation to market, we always have lively discussions. But nothing makes the companies’ potential come alive more than hearing directly from their founders and chief executives. Indeed, it was inspiring to hear such passionate and capable entrepreneurs from the award-winning companies at the BIG group webinars sharing their firms’ vision and potential.
Take the presentations by the two inaugural Lee–Lucas award winners. Cellular Highways – represented by founder Samson Rogers – is a Cambridge-based start-up that develops microfluidic devices for rapid cell sorting, while Nebu~Flow – represented by founder Elijah Nazarzadeh – is a University of Glasgow spin-out that is commercializing an acoustic nebulizer based on surface-acoustic-wave devices.
Conditions such as asthma and chronic obstructive pulmonary disease kill one person in the UK every five minutes on average and, as Nazarzadeh pointed out, current drug treatments suffer from a significant flaw: the medication doesn’t always reach the right part of the patient’s lungs. Nebu~Flow’s technology tackles this problem by using controlled surface acoustic waves to create smaller droplets.
In pre-clinical trials, the device boosted the efficiency of absorption of a common asthma drug, salbutamol, from 60% to 90%. To get the best results, the tiny battery-powered device will require a disposable drug cartridge, which is a great business model. Unlike its competitors, Nebu~Flow can also deliver more drug types including vaccines. Its future looks particularly bright given COVID-19, with 29 out of the 45 or so treatments under development expected to be delivered in this way.
As for Cellular Highways, its VACS (vortex-actuated cell sorting) technology works by generating a transient vortex in a microfluidic channel; this vortex flows downstream with a cell to be sorted, thereby gently deflecting it across the streamlines. Each deflection event lasts for 23 μs, making VACS the fastest chip-based cell sorter to date. Scalable, sterile cell sorting of this kind – vital for cell therapy and regenerative medicine – has long been in demand but never previously been possible. Its technology should also be valuable for virus and vaccine research.
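As a rough order-of-magnitude implication (not a figure quoted by the company), a 23 μs deflection time would cap the sorting rate at

$$
\frac{1}{23\ \mu\mathrm{s}} \approx 4\times 10^{4}\ \text{deflection events per second},
$$

assuming events could be triggered back to back; in practice the achievable throughput will also depend on how densely cells can be streamed through the microfluidic channel.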
One of the main business-award winners – Promethean Particles – has seen strong interest related to COVID-19. As its founder Ed Lester explained, the company uses a continuous hydrothermal synthesis process and advanced fluidics technology to make nanomaterials at 1% of the cost of conventional methods. The firm has already built a full-scale volume production facility to produce tonnes of the materials, but it’s the applications for such materials that are intriguing. Copper nanoparticles, for example, are great as antimicrobial and antiviral agents and can be added to textiles and a range of surface coatings.
Indeed, the COVID-19 crisis underlines the importance of finding ingenious, practical and timely solutions to the world’s most challenging problems. And as many of the IOP’s award winners illustrate, plenty of those innovations are firmly rooted in physics. Research and development will be critical as society and the economy recover from COVID-19, enabling us to build a greener, healthier and more resilient world. The IOP’s business-award winners are well placed to contribute to that endeavour.
The BIG seminar on 23 September, which will examine how close quantum technology is to market, will include two other IOP business award winners: Orca and QLM.
Deep within the interiors of gas-giant planets like Jupiter, materials can be subjected to millions of atmospheres of pressure. In the most extreme conditions, even hydrogen no longer exists in its usual molecular form. Instead, its covalent bonds break down and the material is believed to become an electrically conducting, metallic fluid. As astronomers seek to understand the physical structures of gas giants, a detailed knowledge of this metallization process is crucial.
To gain these insights, researchers must look to hydrogen’s phase diagram – which depicts how its physical properties vary with temperature and pressure. Although the phase diagram for hydrogen in less extreme conditions has been precisely mapped out, this is far from the case for extremely dense, metallic hydrogen. This gap in knowledge has prompted numerous investigations into where the boundary lies between molecular liquid hydrogen and its dense, metallic form. However, it is proving extremely difficult to create the necessary high pressures in the lab.
Distinct boundaries
Recently, physicists have used computer simulations to model hydrogen under extreme conditions, using statistical and quantum mechanics to recreate interactions between groups of virtual atoms. For samples containing a few hundred atoms, run over simulation times of several picoseconds, these models predict that under increasing pressure hydrogen undergoes an abrupt first-order phase transition from a molecular state to an atomic state. Applying these results to planetary interiors suggests that there are distinct boundaries between their insulating and conductive layers.
However, these latest efforts have faced a major limitation, throwing their findings into doubt. “Even state-of-the-art methods can only model hundreds of hydrogen atoms,” explains Bingqing Cheng at the University of Cambridge. “Since we were previously restricted to using such small system sizes, the first-order transition was assumed.”
The underlying problem with these simulations is that as more atoms are added, the complexity of the calculations quickly ramps up due to the quantum nature of the atomic interactions. Any larger models would be far beyond the capabilities of today’s most advanced supercomputers. In a new study, Cheng and colleagues in the UK and Switzerland have approached the problem from a different angle: this time using machine learning, which can build models of complex systems by “training” on existing data.
Learning from quantum mechanics
“In a nutshell, we exploited machine learning techniques to ‘learn’ the atomic interactions from quantum mechanics,” Cheng explains. “The inputs to the neural network were the positions of each atom, and the outputs were the energy and forces they experience.” This approach drastically reduced the required computing time and cost, enabling the team to simulate over 1700 atoms across a broad range of temperatures and pressures, and for close to one nanosecond.
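The sketch below illustrates the general idea of such a machine-learned potential: a small neural network maps a descriptor of each atom’s local environment to an atomic energy, the atomic energies are summed into a total energy, and forces follow by differentiation. It is a toy illustration only – the descriptors, network size and random weights are placeholders, not the model used by Cheng and colleagues, whose network was trained on quantum-mechanical energies and forces.

# Toy sketch of a machine-learned interatomic potential in PyTorch: a network
# maps per-atom environment descriptors to atomic energies; forces via autograd.
import torch
import torch.nn as nn

class AtomicEnergyNet(nn.Module):
    def __init__(self, n_descriptors=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_descriptors, 64), nn.Tanh(),
            nn.Linear(64, 64), nn.Tanh(),
            nn.Linear(64, 1),
        )

    def forward(self, descriptors):
        # descriptors: (n_atoms, n_descriptors), one row per atom
        atomic_energies = self.net(descriptors)
        return atomic_energies.sum()  # total potential energy of the sample

model = AtomicEnergyNet()

# Placeholder descriptors for 1700 atoms; a real potential would use
# symmetry-function-style encodings of each atom's neighbourhood, with weights
# fitted to quantum-mechanical reference data.
descriptors = torch.randn(1700, 32, requires_grad=True)
energy = model(descriptors)

# Forces are minus the gradient of the total energy; here we differentiate
# with respect to the descriptors as a stand-in for the atomic positions.
forces = -torch.autograd.grad(energy, descriptors)[0]
print(float(energy), forces.shape)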
Within these larger simulated samples, Cheng’s team discovered that molecular-to-atomic transitions did not occur across the clearly defined boundary that previous studies had suggested. Instead, hydrogen underwent a smooth transformation as the pressure rose. This is because the “critical point” in the phase diagram – beyond which higher temperatures and pressures erase the physical distinction between the molecular and atomic fluids – is “hidden”. As a result, the transformation between hydrogen’s insulating and conductive phases occurs both gradually and continuously.
Supercritical behaviour
Uncovering this “supercritical” behaviour could have significant implications for our understanding of hydrogen’s phase diagram at extreme pressures. As Cheng describes, her team’s discovery provides a far more accurate picture of how hydrogen exists deep within giant planets. “Inside these bodies, the insulating and the metallic layers have a smooth density profile between them, instead of an abrupt change,” she explains.
Beyond its implications for planetary science, the researchers believe their results highlight a promising potential for machine learning in exploring the hydrogen phase diagram. “We believe machine learning will not only change the way high pressure hydrogen is modelled, but will be in the standard toolbox of computational physicists,” says Cheng. “With machine learning, we can run nanosecond-scale simulations with thousands of atoms, with first principle accuracy. This was beyond our wildest dreams just ten years ago.”
John Proctor at the University of Salford suggests that exploring the detailed implications of this work for planetary interiors will not be easy. “[Cheng and colleagues] propose that there is no first order liquid-liquid transition in hydrogen, and that planetary models assuming this therefore need to be revised,” he comments. “Since the actual interiors of Jupiter and Saturn do not consist of pure hydrogen – they are fluid mixtures predominantly composed of hydrogen and helium – a lot of further work is required to fully explore the implications of this work for planetary science.”
Nanotweezers: a laser, an AC electric field and an array of gold “nanoholes” make up the new device. (Courtesy: Justus Ndukaife)
A new type of optical tweezer can trap and manipulate objects smaller than 10 nm in size without damaging them. The hybrid opto-thermo-electrohydrodynamic device, which was developed by Justus Ndukaife and colleagues at Vanderbilt University in the US, could be used to control photosensitive biomolecules and other nanoscale objects.
Optical tweezers were invented by the American physicist Arthur Ashkin, who received a share of the 2018 Nobel Prize for Physics for his work. These devices, which use a highly focused laser beam to generate forces that hold and move micron-sized objects near the beam’s focus, have emerged as powerful tools for biological research. They are, however, restricted in the size of the objects they can manipulate because of diffraction, which limits the spot size of the trapping laser beam to around half the wavelength of the illuminating light. For red light with a wavelength of 700 nm and a relatively low laser power of several mW, for example, tweezers can only stably trap and manipulate objects that have a diameter of roughly 350 nm or greater. To do the same for smaller objects, the laser power must be increased, which tends to destroy fragile biomolecules.
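The numbers quoted follow from the standard diffraction limit: a lens of numerical aperture NA cannot focus light of wavelength λ to a spot much smaller than

$$
d \approx \frac{\lambda}{2\,\mathrm{NA}},
$$

and since the NA of practical focusing optics is of order one, a 700 nm beam cannot be squeezed much below about 350 nm.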
Moving away from the laser focus
The new opto-thermo-electrohydrodynamic tweezers (OTETs) avoid this problem by trapping nanometre-scale objects at locations several microns away from the high-intensity laser focus. Such objects are thus protected from the intense light and heat the laser produces.
The device is made up of an array of tiny holes 300 nm wide and 120 nm thick drilled in a very thin gold film. These structures, which are deposited on a glass substrate, are plasmonic, meaning that they interact strongly with light via surface plasmons (collective oscillations of electrons at the surface of a metal) at specific resonance frequencies. These interactions produce multiple electric field “hot spots” at which particles can be trapped.
The researchers created their tweezers by combining this array with a laser that has a wavelength of 973 nm and a focused beam diameter of 1.33 μm. Particles become trapped far from the laser focus thanks to an applied AC field (of 83 kV/m at a frequency below 10 kHz), which sets up a trapping potential several microns from the edge of the array, along which the particles travel.
Trapping small molecules like DNA
Ndukaife thinks that OTETs could be used to trap small molecules such as DNA, and might also be used to sort objects based on their size. The latter task is important for researchers seeking to isolate specific biomolecules such as exosomes, which are extracellular vesicles that range from 30 to 150 nm in size and are known to play a role in cancer metastasis.
Other promising possibilities include detecting pathogens by trapping and analysing viruses, and studying proteins associated with neurodegenerative diseases like Alzheimer’s. The technique could even aid the early detection of diseases, since the tweezers are good at capturing molecules that are only present at low levels. The OTETs might also be combined with other imaging techniques such as biofluorescence and infrared spectroscopy.
“The sky is the limit when it comes to the applications of OTETs,” says Ndukaife, who collaborated with Vanderbilt’s Center for Technology Transfer and Commercialization to file a patent on the technology. “I am looking forward to seeing how other researchers harness its capabilities in their work.”
At a recent innovation webinar hosted by the UK’s National Physical Laboratory (NPL), speakers shared details of some of the breakthrough technologies being developed and commercialized by UK companies and laboratories. Two of the presentations described novel medical imaging developments that could improve cancer diagnosis and treatment, and examined how the resulting devices could get into the hands of users and make a difference to society.
Surgical guidance system
Lightpoint Medical, a specialist in precision-guided robotic cancer surgery, is developing miniaturized sensing tools for intra-operative cancer detection. Kunal Vyas, Lightpoint’s head of research, shared the rationale behind the company’s SENSEI cancer surgery guidance system.
“What is shocking is that when you go for surgery, the surgeon will only use their sight and touch to remove your cancer,” Vyas explained. “The lack of intra-operative tools to detect cancer means that tumour tissue is frequently missed, or in other cases, healthy tissue is needlessly removed.” He noted that in prostate cancer surgery, for example, one in four patients has some tumour left behind and one in six experiences complications.
To address this, Lightpoint Medical is developing tools to help surgeons more accurately detect and remove cancer during surgery. The company’s SENSEI system combines miniaturized sensors with cancer-targeting radiotracers developed and approved for use in PET and SPECT scans, incorporated into a single-use 12 mm-diameter probe.
The SENSEI cancer surgery guidance system uses approved diagnostic radiotracers to help surgeons detect and remove tumour tissue during surgery. (Courtesy: Lightpoint Medical)
Prior to surgery, the patient is injected with the radiotracer, which accumulates in cancerous tissue. The probe then detects emissions from these radiotracers to help surgeons locate cancer in real time in the operating room. “This allows cancer surgery to be molecularly targeted using state-of-the-art diagnostic tracers,” says Vyas.
The SENSEI system can be used during all types of cancer surgery, including keyhole and robotic surgery, and is suited for numerous cancer types including prostate, lung, colorectal and gynaecological cancers. Assessments of the system’s performance in a preclinical setting demonstrated a 100% identification rate at a tissue depth of 2.5 cm. The next step will be to move into clinical trials, with the first trials aimed at improving outcomes for patients with prostate cancer.
“I think precision guidance is the next revolution in surgery,” said Vyas. He noted, however, that uptake of new surgical technologies can be quite slow, particularly when it is a device that’s actually inserted into the body. “Surgeons and medical professionals are rightfully quite conservative; there’s quite a high-risk profile in introducing something new into the operating room.”
Lightpoint is expecting to receive US and EU market clearance later in 2020, said Vyas, adding that the company has secured £3m in its latest funding round. “We’ve been really grateful for having a combination of private equity and public funds to make this happen,” he said.
Ultrasound breast imaging
NPL Fellow Bajram Zeqiri discussed another medical imaging technology: ultrasound computed tomography (UCT), which could improve outcomes for breast cancer patients. “In the UK’s national breast cancer screening programme, three million women are invited annually for compression X-ray mammography,” he said. “Of these, only around 75% accept, partly due to the discomfort of the procedure.” But, as with all cancers, early detection is essential – increasing survival rates, as well as lowering treatment costs and reducing potential treatment-related side-effects.
And X-ray mammography has other shortcomings. It generates false positives, with 20,000 unnecessary biopsies performed each year, and is not ideal for imaging dense breast tissue, due to a significant cloaking effect on images. When such “white out” occurs on mammograms, ultrasound is used as an adjunct, as it is better at differentiating between dense background breast tissue and cancers.
So why is ultrasound not used as a primary screening tool? Likely because conventional ultrasound only targets small regions of tissue, requires highly skilled operators and, with the detectors used in current systems, also gives a high level of false positives, explained Zeqiri. Some of these obstacles could be overcome by using UCT, which works by transmitting ultrasonic sound waves through the breast and exploits the higher acoustic attenuation and speed of sound of malignant tissue as its contrast mechanism.
Various UCT systems are currently under development, but their reliance on conventional piezoelectric technology means that images are still subject to artefacts. So NPL has developed a completely new type of UCT detector that uses the pyroelectric effect to measure ultrasonic power accurately. “The new detector will support near error-free quantitative acoustic attenuation tomography,” said Zeqiri.
NPL’s UCT system generated breast phantom images (right) with excellent correlation to 3D X-ray CT images (left) provided by the phantom manufacturer. (Courtesy: National Physical Laboratory)
Using a laboratory platform, Zeqiri and colleagues demonstrated that UCT with the pyroelectric detectors could create quantitative images of breast phantoms, with excellent correlation to ground-truth values and X-ray images. They have also recorded the first in vivo measurements (not images) from volunteers, to generate engineering data for future development of the technology.
Ultimately, the UCT device will enable full 3D quantitative breast scanning in just 5–10 minutes. The system, which will be suitable for measuring dense breast tissue, will also employ AI to compare the scans with a reference library of images linked to known pathologies – leading to improved automated diagnosis and far fewer false positives.
Achieving this, explained Zeqiri, requires three phases of work: recording the first images from volunteers; building a mark II system for comparison with conventional imaging on cancer patients; and then performing a large multicentre trial. He also highlighted the need for investment, noting that the biggest obstacle at the moment is funding for the second phase. “The problem is funders want to see proof, but you need funding to actually demonstrate that it works.”
Zeqiri expects that the new UCT system will first be used diagnostically to identify specific cancer types and reduce biopsies, before ultimately being employed for screening. “It could be very versatile because it uses non-ionizing radiation so it’s safe. It could be deployed anywhere, in a GP’s surgery or even in a fitness centre. It’s got real potential,” he concluded.
The session also included presentations by speakers from three other organizations.
Graham Farnsworth from the Defence Science and Technology Laboratory (Dstl) described a forensic technology that can recover fingerprints in scenarios where no other technique can do so. He explained how a collaborative approach helped transition the technology from a development at Loughborough University into a commercial product being sold to multiple users, and even applied by law enforcement agencies to reopen cold cases.
Vivienne Parry, head of engagement at Genomics England, introduced the 100,000 Genomes Project, which has now sequenced around 120,000 genomes from 70,000 patients with either a rare disease or cancer. The project, which collected 21 petabytes of clinical data, enabled a 25% increase in the diagnostic rate for rare diseases and, for cancers, was able to find an actionable variant in nearly two-thirds of patients. The legacy of this project is the National Genomic Research Library, a constantly refreshed dataset of genomes and associated clinical data for academic and commercial research.
Claire A Fowler, a process development scientist at Leaf Expression Systems, discussed the use of plant-based expression to produce proteins, antibodies, enzymes, vaccines and complex natural products for research and commercial applications. The company focuses on using its plant-based expression technology to accelerate research and lower costs for developing and manufacturing therapeutic drugs, vaccines and diagnostics for human and animal health.
The next NPL innovation webinar takes place on 8 October and will focus on AI and robotics, fusion power and nuclear.