Scientists at the $3.5bn National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory in California say they have come a step closer to their ultimate goal of realizing “ignition”, at which fusion reactions generate at least as much energy as the lasers put in. In an experiment conducted on 8 August they say they achieved a yield of more than 1.3 megajoules (MJ) – about 70% of the energy that the laser pulse delivered to the sample.
NIF trains 192 pulsed laser beams on to the inner surface of a centimetre-long hollow metal cylinder known as a hohlraum. Inside is a fuel capsule, which is a roughly 2 mm-diameter hollow sphere containing a thin deuterium-tritium layer.
This result is a historic step forward for inertial confinement fusion research.
Kim Budil, Lawrence Livermore director
Each pulse lasts just a few nanoseconds and the lasers can deliver about 1.9 MJ of energy. This powerful blast causes the capsule to implode rapidly, creating immense temperatures and pressures inside a central hot spot, where fusion reactions occur.
Since NIF was turned on over a decade ago, its long-term goal has been to show that it can achieve ignition. This involves self-sustaining reactions, in which the alpha particles emitted during fusion deposit their energy in the fuel to initiate further fusion.
But after experiments between 2009 and 2012 fell well short of reaching ignition, NIF’s focus switched to supporting the US National Nuclear Security Administration’s work on the physics of nuclear weapons and maintaining America’s nuclear deterrent without further underground testing.
While ignition is part of that overall programme, scientists at NIF decided to change their ignition strategy, which included alterations to the shape of the laser pulses to create much more stable implosions as well as improvements to the precision of the laser and diagnostic equipment.
What has been achieved has completely altered the fusion landscape
Steven Rose, Imperial College
The work began to pay off: in 2014 these “high-foot” pulses each yielded up to 17 kJ of fusion energy (and later 26 kJ), compared to just 10 kJ in earlier experiments. In 2017 NIF researchers obtained 54 kJ of fusion energy per laser pulse – as measured by the number of neutrons and alpha particles produced – and by last year were creating regular shots that produced around 100 kJ of fusion energy.
More bang for your buck
The shot on 8 August, which was announced yesterday, produced 1.3 MJ, generating more than 10 quadrillion watts of fusion power for 100 trillionths of a second. Although still short of break-even, the figure far exceeded previous markers.
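The quoted figures hang together: dividing the 1.3 MJ yield by the 1.9 MJ of laser energy gives the roughly 70% of break-even mentioned above, and dividing the yield by the 100-trillionths-of-a-second burn time gives the quoted power. A quick back-of-envelope check (the variable names are ours, not NIF's):

```python
# Sanity check of the figures quoted in the text
laser_energy_j = 1.9e6        # laser energy delivered per shot (J)
fusion_yield_j = 1.3e6        # fusion yield of the 8 August shot (J)
burn_duration_s = 100e-12     # burn time: 100 trillionths of a second

gain = fusion_yield_j / laser_energy_j
power_w = fusion_yield_j / burn_duration_s

print(f"gain = {gain:.2f}")        # ~0.68, i.e. about 70% of break-even
print(f"power = {power_w:.1e} W")  # ~1.3e16 W, over 10 quadrillion watts
```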
“This result is a historic step forward for inertial confinement fusion research, opening a fundamentally new regime for exploration and the advancement of our critical national security missions,” says LLNL director Kim Budil.
Thomas Mason, director of the Los Alamos National Laboratory, says that the work is the culmination of scientific and technological effort stretching across nearly 50 years. “This [result] enables experiments that will check theory and simulation in the high energy density regime more rigorously than ever possible before and will enable fundamental achievements in applied science and engineering,” adds Mason.
Fusion experts outside NIF are also enthusiastic about the latest results. Plasma physicist Steven Rose, who is co-director of the Centre for Inertial Fusion Studies at Imperial College London, says the NIF team has done an “extraordinary” job, dubbing the latest breakthrough the “most significant advance” in inertial fusion since the field began in 1972.
“What has been achieved has completely altered the fusion landscape and we can now look forward to using ignited plasmas for both scientific discovery and energy production,” adds Rose.
NIF officials say the lab now plans to repeat the experiments to get a better understanding of what parameters were responsible for such a leap in energy production, cautioning, however, that it will take “several months” to do that work.
The internal structure of Saturn has been mapped by using data from the Cassini spacecraft to observe seismic oscillations in the planet’s rings. The study reveals that the core is both larger and more diffuse than previously thought.
The research is described in a paper in Nature Astronomy and could improve our understanding of Saturn’s formation and evolution.
“The conventional picture of Saturn’s interior is of a compact core of rocks and ices that is surrounded by an envelope of hydrogen and helium,” explains the paper’s co-author Christopher Mankovich who is at the California Institute of Technology (Caltech). “Based on the unique information now available from Cassini ring seismology, we found that this distinction between core and envelope is not so tidy. The transition must be gradual, hence the ‘diffuse’ or ‘dilute’ core.”
Rock and ice
Along with co-author Jim Fuller at Caltech, Mankovich has found a rock- and ice-dominated fluid at the centre of the planet. Moving outwards from the core, the hydrogen and helium content of the fluid gradually increases while the fraction of heavier elements falls.
In addition to discovering the lack of a clear boundary separating the core from the planet’s outer layers, the duo also found that the core is considerably larger than previous models had suggested.
“The diffuse core region occupies the inner 60% of Saturn by radius, a dramatically larger number than the 10% or 20% expected from conventional models with a neatly separated core and envelope,” explains Mankovich. “At 60% of Saturn’s total radius this is a dramatic departure from previous models for Saturn’s structure, which came as a surprise to both of us. But after a long and careful investigation it does simply seem to be what the data require.”
What separates this latest work from previous studies of Saturn is the unique use of seismology data collected from the rings of Saturn — arguably the gas giant’s most famous feature.
Shaken from the core
Saturn’s rings were first observed by Galileo in 1610. They comprise a multitude of objects made of ice and traces of silicates, ranging in size from microns to metres. The closest ring to the surface of the planet is about 7000 km away, so it might come as a surprise that monitoring this stunning feature can reveal details of the interior of Saturn.
Mankovich explains that this is possible thanks to spiral patterns stirred up in Saturn’s rings by the planet’s gravitational influence and natural oscillations. “The planet itself is constantly ringing at a variety of frequencies, just like a musical instrument has its own rich spectrum of sounds at any given time,” he explains, adding “These oscillations in the planet cause small amounts of mass in the planet to essentially wobble back and forth slightly as a function of time, and this carries over into a wobbling gravitational field that can stir up waves in the rings.”
Saturn’s ring system is sub-divided into separate bands and data from Cassini has revealed dozens of waves in the C ring of Saturn driven by the gas giant’s oscillations. The frequencies of these waves allowed Fuller and Mankovich to better constrain the planet’s interior than previous methods have allowed.
Internal gravity waves
“Typically the interior structures of the outer solar system planets are constrained using their gravity fields, but this information only goes so far, since the gravity measurements are not very sensitive to the deepest parts of the interior,” Mankovich says. “Seismology is a handy and independent way to study the interior, especially at Saturn where the ring waves include those produced by Saturn’s internal gravity waves which inherently probe the deepest parts of the interior. It’s the frequencies of these internal gravity waves that turned out to eliminate many otherwise plausible interior models and let us arrive at our surprising result.”
Mankovich says that when it comes to their approach to mapping Saturn, he and Fuller took considerable inspiration from helioseismology — the use of the Sun’s regular oscillations to model its interior. Despite decades of development in this field and the growth of the related field of asteroseismology, a powerful method of charting the interiors of stars, understanding gas giants with seismology is still no mean feat.
“It’s difficult! This success story never would have happened were it not for the Cassini mission, which orbited Saturn for longer than a decade and collected a wealth of data,” concludes Mankovich. “The crucial next step will be to search for an interior model in which both of these stable regions might coexist, with the aim of explaining the ring seismology, gravity field, and magnetic field simultaneously. The best picture ever for Saturn’s structure is really starting to come into focus.”
Fun and games During Marion Cromb’s placement at software company Metaswitch, they created a Pokémon Go-inspired game during a hackathon. (Courtesy: Marion Cromb)
While the summer can be a nice break from studying, three months can feel like a long time to have nothing to do. After an extra-long summer between finishing a one-year art foundation course and beginning a physics degree at the University of Birmingham, Marion Cromb “never wanted to have another summer without much to do”. So they sought out internships to keep them occupied every summer throughout university.
Cromb had mixed experiences of internships. The first one was at a 3D-printing company based in London, and was unpaid apart from travel expenses. “I learnt a lot about employment law,” says Cromb, who ended up negotiating an early end date because the work was repetitive and not very interesting. “You should make sure you’re getting something out of it, and don’t be afraid to quit early if you’re not,” they say, “especially if it’s unpaid.”
Despite this negative experience, Cromb searched online for internships the following year, and had better luck when applying to Metaswitch, a telecommunications software engineering company headquartered in London. The application process involved aptitude tests around basic maths and reasoning, as well as an interview. “You didn’t need specific coding skills. Metaswitch was going to train us, so it was just about making sure I had the fundamental skills and would be worth training.”
During the internship, Cromb was based at Metaswitch’s Enfield location, and built code designed to send manufactured data packets across a connection to check for any loss – a standard networking test. The code was written in the programming language C, and had to meet company standards. “That was great experience of learning good coding practice, and the code was actually sold to customers at the end, which was cool.”
Cromb was one of many interns at Metaswitch, and the company arranged lots of social activities for them to get to know each other. “It was 2016 – the summer of Pokémon Go,” they say. “The company organized a hackathon week, and we made a rip-off of Pokémon Go, called Metaswitch Go, where we used facial recognition to capture the faces of Metaswitch employees that you could put in your Metadex.”
Cromb did two further internships, both in academia. The first came about after they asked a professor at Birmingham, who was going to be their MSc supervisor the following year, about any potential projects they could do. The professor arranged a summer project in which Cromb investigated using laser light to improve the accuracy of particle tracking. Quantum uncertainty was the key: by increasing the uncertainty in the amplitude of the light you can reduce it in the phase, or vice versa, a trick known as “squeezing” light.
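The trade-off behind squeezing can be shown with a toy calculation (a generic textbook sketch of squeezed-light quadratures in vacuum units, not the actual analysis from the Birmingham project): scaling one quadrature's uncertainty up scales the other down, keeping their product at the Heisenberg limit.

```python
import math

# Illustrative sketch: quadrature uncertainties of squeezed light.
# In vacuum units a coherent state has equal uncertainty 0.5 in the
# amplitude and phase quadratures; squeezing by exp(r) trades one off
# against the other while the product stays fixed at 0.25.
def quadrature_uncertainties(r):
    """Return (amplitude, phase) uncertainties for squeezing parameter r."""
    return 0.5 * math.exp(r), 0.5 * math.exp(-r)

amp, phase = quadrature_uncertainties(1.0)  # boost amplitude noise...
print(amp * phase)                          # ...product remains 0.25
```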
Cromb heard about the second internship through an e-mail from their university, advertising a placement at Cardiff University. For this project they built a Michelson interferometer to be used in outreach demonstrations.
Comparing industry with academia, Cromb found that the industry placement was a lot more structured, with more guidance. “In academic placements, you often have less formal supervision, but you should ask for help if you need it. At one point in the project at Birmingham, we realized we’d spent a few weeks trying to do the wrong thing.” Another difference they found between academia and industry is that industry generally pays better. “Well, some industry pays better. Some doesn’t pay at all.”
Cromb advises prospective interns to apply to as many things as possible, “because you’ll probably get rejected from most things”. But there are lots of options out there, so it’s a numbers game.
Once you have a placement lined up, there are things you can do in advance to make for a smoother start. Cromb e-mailed their supervisor the week before to let them know what their pronouns are, and to explain what it means to be non-binary. “There are some things you can get out of the way before you arrive,” they say, “to make sure you feel welcome when you get there.”
Of the four internships Cromb has done, three were positive experiences, but even the less helpful one was a good learning opportunity, and they still got something out of it. “I got to keep some of the things we 3D-printed,” they say. “I still have some miniature replicas of museum statues and a wobble toy of BB8 from Star Wars.”
Scientists from the University of Birmingham and the University of Warwick have unveiled a new class of polymeric four-dimensional (4D) printable resins for use in soft-tissue engineering.
Commercialized by 4D Biomaterials under the tradename 4Degra, the resins display the physical properties needed to promote tissue regeneration following injury or surgery. When printed into scaffolds, they exhibit high compressibility and strain recovery, meaning that they can be used to create self-fitting, void-filling support structures. Moreover, their interconnected pore network allows cells to infiltrate the scaffold, which promotes ingrowth of tissue and blood vessels.
“4Degra is the first biocompatible and fully biodegradable UV-cured resin for 3D printing,” says lead researcher Andrew Dove. “This means that complex shapes tailored to an application can be printed with high resolution using UV-based 3D printing methods.”
Filling the void
For patients with soft-tissue trauma, regions of tissue loss (“dead space”) severely limit the healing process. The cells responsible for tissue regeneration cannot proliferate inside the dead space, leading to deformities.
Because shape memory materials conform to the dimensions of their surroundings, they can fill these voids and subsequently provide a physical structure for cells to migrate into. This ensures consistent tissue support during the healing process.
The polycarbonate-based 4Degra resins are formulated with a photo-initiator and photo-inhibitor to ensure that they rapidly form gels when exposed to visible light, making them compatible with stereolithography-based 3D printing techniques.
By modulating the resin composition, the researchers printed porous scaffolds with a wide range of thermomechanical properties. In addition, the scaffolds demonstrated excellent shape-recovery properties (up to 85% compressibility and less than 1 N expansion force) when inserted into alginate gel, a soft-tissue-mimicking material. These results suggest that the scaffolds can be implanted inside the body via minimally invasive surgery, without inflicting pressure (and therefore pain) on the surrounding tissues.
The scaffolds can undergo up to 85% compression before returning to their original geometry. (Courtesy: CC BY 4.0/A Weems et al Nat. Commun. 10.1038/s41467-021-23956-6)
The researchers also tested the biocompatibility of their scaffolds in a mouse subcutaneous model. After two months, they observed that adipocyte (fat) cells had infiltrated the pores and formed lobules, indicating healthy tissue regeneration. Interestingly, less scar tissue formed on the porous scaffolds than on solid polycarbonates investigated for comparison. The researchers concluded that the increased surface area of the porous scaffolds further improved the biocompatibility of the resins.
Controlled degradation
Long-term studies confirmed that the scaffolds can support tissue ingrowth for more than a year. The resins slowly degrade into non-acidic products throughout the tissue healing process, but maintain enough mass to provide the much-needed mechanical support in the early stages of cell infiltration.
“Polycarbonates facilitate healing while reducing the risk of acidosis [increased acidity in the blood] over the material’s lifespan,” explains Andrew Weems, first author of the study.
“When degrading, polycarbonates surface erode, meaning that the degradation and reduction in mechanical properties is more controlled. This is in contrast to many other biomaterials that bulk erode,” adds Dove.
Now that the researchers have validated the printability and the performance of the 4Degra resins, they are collaborating with partners to test the material in a variety of medical devices.
It’s not been a great year for holidays. Coronavirus restrictions mean you can’t just book tickets and jet off to your chosen beach or city destination – there are quarantine arrangements, lateral-flow tests and jabs to consider. Still, this month’s special issue of Physics World has got me thinking about why a break from the daily grind can be so important to get your creative juices flowing. And that’s just as true for academics as for those, like me, who work in industry.
According to a survey of 1000 small business owners by Sandler Training (UK) in 2014, nearly one in five (19%) of those who have been successfully trading for more than five years claim they dreamt up their business idea while on vacation. I’m not saying you should spend your valuable holiday time glued to your laptop doing market research. No, the trick is to let your mind wander. Daydreaming can let you “think outside the box” or have a “Eureka!” moment – and then explore it.
Daydreaming can let you “think outside the box” or have a “Eureka!” moment – and then explore it.
In the travel sector, perhaps the most famous example occurred when Maureen and Tony Wheeler (an engineering graduate) embarked on a journey in an old banger from Europe to Australia for their honeymoon in the early 1970s (hitching a lift on a yacht for the final leg). The couple decided to share their travel tips in a book, later called Across Asia on the Cheap. It was the start of the Lonely Planet travel-publishing empire, which has so far sold more than 145 million guidebooks.
Other successful businesses that started on holiday include the file-hosting service Dropbox. It was dreamt up in 2005 by Drew Houston, a former computer-science student at the Massachusetts Institute of Technology, who was travelling on a bus from New York to Boston. Frustrated he’d left his USB memory stick at home yet again, Houston recalled in a 2017 interview with Business Insider how he just decided to open his laptop and start writing code that let him store and share files. The company now has a $10bn valuation.
Then there’s the photo- and video-sharing app Instagram, which occurred to Kevin Systrom in 2010 while strolling on a beach in Baja California, Mexico. Having studied engineering and management science at Stanford University, Systrom had been developing a prototype app called Burbn. But when his fiancée Nicole said she’d never use it as images taken with her iPhone 4 camera didn’t look great, Systrom decided to add filters that could enhance the quality of the pictures. Renaming it Instagram, the company was snapped up by Facebook for $1bn two years later.
Perchance to dream
I sadly haven’t got a multi-billion-dollar business to my name, but some of my best ideas have occurred while away from the day-to-day grind, when my mind has time and space to explore ideas. I remember once returning to Heathrow airport after a business trip to the US and heading straight on to Thailand, where I’d booked a fly-and-flop beach holiday. At the time I was working for an optical-instrumentation company, which meant I had my laptop and a small spectrometer with me.
On my first morning in Thailand, jetlag woke me before sunrise. Feeling restless, I decided to fire up the spectrometer and measure the spectral characteristics of the Sun from my balcony. After recording data for a couple of days, I realized how much I liked the bright light in Thailand and began to wonder why I (and others too) couldn’t have it in gloomy old Britain. The idea spawned a business selling light-emitting diodes (LEDs) that emit “circadian” light.
After recording data for a couple of days, I realized how much I liked the bright light in Thailand.
These devices mimic the spectral content of the Sun by being bright and having lots of blue during the day, while transitioning smoothly to a dimmer, candle-like glow with lots of red at night. Our lighting products were mostly sold to hospitals and care homes, who used them to shorten patients’ recovery times and boost the well-being of long-term residents. I even installed them in my house, which magically took me back to Thailand whenever I turned them on.
The big picture
Sometimes, though, a good holiday isn’t about developing a new business idea – but getting a sense of perspective about what’s truly important in life. That was certainly the case on my last “proper” holiday, when my wife and I were driving from Spain, up into France and then down through Italy. Realizing we’d be passing near the huge ITER experimental thermonuclear fusion reactor, which is being built north of Marseille, I decided to call them up from the car and see if I could drop in.
Casually mentioning my links to the Institute of Physics, which publishes Physics World, I managed to bag a last-minute tour of the construction site the following day. The visit gave me a chance to see in the flesh this inspirational, international megaproject, which will harness the power of the stars to produce green and carbon-free energy – surely one of the last great problems humanity needs to solve. I didn’t get any new ideas on that occasion, but you never know what a break can do.
Take the Scottish microbiologist Alexander Fleming, who in 1928 famously left a sample of inoculated Staphylococcus bacteria on culture plates in his lab before heading off on holiday with his family to his Suffolk country home. Returning a few weeks later, Fleming saw that one culture was contaminated with a fungus that had destroyed nearby colonies of bacteria but left those further away untouched. The mould was from the genus Penicillium, which can be used to make penicillin – the first and most famous antibiotic. Simply by being on holiday, Fleming had stumbled across a world-changing discovery.
So for anyone who thinks they’re too busy to go on holiday or that it’s a waste of time – remember that a change can be as good as a rest.
Physicists in the US have developed a new platform for trapping and rapidly manipulating the positions of nanoscale quantum objects. Justus Ndukaife and colleagues at Vanderbilt University and Oak Ridge National Laboratory used a combination of gold nanopillar arrays and a specialized optical tweezer to transport individual nanodiamonds to specific locations within just a few seconds. Their techniques could pave the way for a diverse range of advanced quantum technologies.
Suspended colloidal nanodiamonds are highly effective tools for enhancing interactions between light and matter. Measuring less than 100 nm in diameter, each nanodiamond contains a point defect known as a nitrogen-vacancy centre that can emit single photons under room-temperature conditions – a key building block for quantum photonics.
To exploit these point defects in practical applications, the nanodiamonds’ emission properties must first be enhanced by trapping groups of them and then creating entanglement between the spin states of their nitrogen-vacancy centres. Previously, this has been done via a combination of optical tweezers and arrays of nanopillars that act like tiny antennas. When illuminated by their resonant wavelength, these structures create highly localized and enhanced electromagnetic fields within volumes that are much smaller than the smallest possible spot sizes of optical tweezer laser beams, thereby trapping nanoparticles within deep, narrow wells.
Despite these advanced capabilities, however, researchers have so far only been able to rapidly confine nanodiamonds at specific positions defined by the locations of the nanoantennas. It remains extremely difficult to transport individual particles into positions outside these “hotspots”, meaning that it can take hours to assemble groups of nanodiamonds with entangled nitrogen vacancy centres.
Field distortion
In their study, which is published in Nano Letters, Ndukaife’s team overcame this issue by developing a new manipulation tool known as a low-frequency electrothermoplasmonic tweezer (LFET). As well as a laser beam, this device incorporates an alternating current (AC) electric field that induces thermal gradients in the nanopillar array, distorting the electric field it experiences. This combination allowed the researchers to establish a robust electrohydrodynamic potential capable of stably trapping and dynamically manipulating individual nanoparticles.
As a proof of concept, the researchers used the LFET to trap a single nanodiamond on top of an array of gold nanopillars and manipulate it by moving a near-infrared laser beam over the array. This nanoparticle transport method proved so rapid that Ndukaife and colleagues ultimately cut the time required to assemble a group of nanodiamonds from several hours to just a few seconds.
The researchers hope that their techniques will pave the way for scalable assemblies of ultra-bright sources of single photons. Ultimately, the LFET could become a reliable tool for fabricating large, stable systems of quantum bits (qubits), thereby opening up new capabilities for technologies such as on-chip quantum information processing and low-noise, high-resolution quantum sensors.
Nobody really understands why cuprates – highly doped copper oxides – are high-temperature superconductors, and researchers in the UK and the Netherlands have now discovered that the materials don’t conform to conventional theories in their metallic state either. Instead, the researchers suggest that cuprates may contain a mix of “strange” and conventional metallic components, but this only poses a further question: which component is responsible for the cuprates’ superconductivity?
Superconductivity occurs when a material loses all resistance to an electrical current below a certain critical temperature. Conventional theory (also known as BCS theory after the initials of its authors) states that at this critical temperature, the electrons in the material overcome their mutual repulsion and join up to form so-called Cooper pairs that travel unimpeded through the material.
Conventional theory does not apply
While this theory holds true for most superconducting materials, it does not apply to the cuprates, which are special in several ways. First, they become superconducting at considerably higher temperatures than other superconductors. Second, they form a new type of metallic state in which their electrons behave in a peculiar way. Unlike electrons in ordinary metals, which travel freely with few interactions and little resistance, electrons in so-called “strange” metals move sluggishly and in a restricted fashion. They also dissipate energy at the fastest possible rate allowed by the fundamental laws of quantum mechanics.
In their experiments, researchers led by Nigel Hussey of the University of Bristol and Radboud University began by painting electrical leads onto tiny single crystals of two families of cuprates: Bi2201 and Tl2201. They then placed these crystals inside a cryostat housed in one of the world’s largest resistive magnets, located at the HFML-FELIX laboratory at Radboud. Next, they measured how the magnetoresistance of the materials varied with temperature – only to find that the data did not fit models based on the conventional theory for metallic transport. Instead of increasing quadratically with temperature, as expected, the increase in electrical resistance was linear.
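The distinction the team drew can be illustrated with synthetic data (a generic curve-fitting sketch, not the HFML-FELIX measurements): resistance that grows linearly with temperature is fitted far better by a T-linear model than by the T-squared form that conventional transport theory predicts.

```python
import numpy as np

# Synthetic illustration: a "strange metal" whose resistance rises
# linearly in temperature, fitted with two candidate models.
rng = np.random.default_rng(0)
T = np.linspace(5.0, 100.0, 50)                      # temperature (K)
R = 10.0 + 0.4 * T + rng.normal(0.0, 0.1, T.size)    # T-linear + noise

def fit_error(design):
    """Sum of squared residuals of a least-squares fit R ~ design @ coeffs."""
    _, residuals, *_ = np.linalg.lstsq(design, R, rcond=None)
    return residuals[0]

ones = np.ones_like(T)
linear_err = fit_error(np.column_stack([ones, T]))        # R = R0 + a*T
quadratic_err = fit_error(np.column_stack([ones, T**2]))  # R = R0 + b*T^2

print(linear_err < quadratic_err)  # True: the T-linear model wins clearly
```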
Signature for incoherent transport
“This result provides strong evidence that the magnetoresistance in these materials does not originate from normal charge carriers (electrons),” Hussey explains. “It instead provides a signature for ‘incoherent’ transport – that is, coming from carriers whose energy is being dissipated at the maximal rate allowed by quantum mechanics.”
The new work also backs up a previous study by the same group in which the data also hinted at the presence of incoherent carriers. The results from both studies suggest that superconductivity in high-temperature superconductors may derive from carriers that show signatures of incoherent transport in their normal (that is, non-superconducting) state, Hussey explains. This explanation is completely different from the coherent carriers that form the foundations of the conventional BCS theory of superconductivity.
“For a long time, we thought that we could explain the metallic states of highly doped cuprates using the same theory that we apply to conventional metals like copper, despite the fact they were high-temperature superconductors,” Hussey says. “This work shows that this is simply no longer possible.”
The researchers say they are now investigating how the unusual quadrature magnetoresistance develops across the entire phase diagram of the cuprates they studied to see if they can learn more about where the incoherent effect comes from. “We will also carry out a comprehensive quantitative study of the Bi2201 system in order to study how the superconducting charge carriers and incoherent carriers are empirically correlated,” Hussey reveals.
Many chronic illnesses lie hidden beneath the skin’s surface, making interrogation of disease initiation and progression a challenge.
To tackle this problem, a team of engineers at the University of California San Diego (UCSD) has developed a wearable sensor capable of detecting and continuously monitoring blood flow in deep tissues for cardiovascular diagnostics.
Wearable patches have seen much success when monitoring skin physiology at relatively shallow penetration depths. However, targeting deep tissues and resolving tissue-specific signals has previously posed an obstacle.
Such physiological signals arise from objects as small as a few micrometres, buried underneath strongly attenuating tissue layers. Therefore, a successful sensor must have both a deep probing depth and high spatial resolution.
Non-invasive sensor for deep tissue monitoring
The new patch, reported in Nature Biomedical Engineering, is capable of sensing microscale structures, such as red blood cells, from deep within the body and in real time. The valuable data obtained from the sensor may then aid clinicians in identifying cardiovascular problems at an early stage.
“This type of wearable device can give a more comprehensive, more accurate picture of what’s going on in deep tissues and critical organs like the heart and the brain,” explains principal investigator Sheng Xu, a professor of nanoengineering at the UCSD Jacobs School of Engineering.
Sheng Xu (far right) and his research group. (Credit: Sheng Xu)
How does it work?
The patch consists of a 12 × 12 array of millimetre-sized ultrasound transducers embedded in a thin stretchable polymer sheet. Importantly, it can sense and measure cardiovascular signals as deep as 14 cm inside the body. And by using an array of transducers, rather than a single transducer, the sensor can maintain a high degree of accuracy at this large penetration depth.
The sensor’s penetrative capabilities arise in part from controlling each transducer individually – a technique known as phased-array operation. The transducers can either be operated synchronously, emitting a focused, high-intensity ultrasound beam, or with staggered timings, which allows the beam to be steered to different angles.
While conventional wearable devices need to be moved and repositioned to sense different regions, the beams of the phased array can be actively steered over tilt angles from –20° to 20°. By propagating the ultrasonic waves at different angles, the patch can monitor a larger region than just the tissue directly beneath it.
“With the phased-array technology, we can manipulate the ultrasound beam in the way that we want,” says co-first author Muyang Lin, a nanoengineering PhD student at UCSD. “This gives our device multiple capabilities: monitoring central organs, as well as blood flow, with high resolution. This would not be possible using just one transducer.”
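The steering behind a phased array follows a standard plane-wave relation: to tilt the beam by an angle θ, each element fires with a delay proportional to its position along the array multiplied by sin θ and divided by the speed of sound in tissue. A minimal sketch of that relation in Python – the element count, pitch and speed of sound here are textbook illustrative values, not the geometry or firmware of the UCSD device:

```python
import numpy as np

def steering_delays(n_elements, pitch_m, angle_deg, c=1540.0):
    """Per-element firing delays (in seconds) that tilt a linear phased
    array's beam by angle_deg. c is the speed of sound in soft tissue
    (roughly 1540 m/s). Delays are shifted so the earliest-firing
    element fires at time zero."""
    theta = np.radians(angle_deg)
    positions = np.arange(n_elements) * pitch_m   # element x-positions
    delays = positions * np.sin(theta) / c        # plane-wave steering law
    return delays - delays.min()

# One 12-element row with a hypothetical 1 mm pitch,
# steered to the patch's +20 degree limit
d = steering_delays(12, 1e-3, 20.0)
```

Steering to –20° simply reverses the delay pattern, which is why the same array can sweep the beam across a wedge of tissue without being moved.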
Owing to its soft mechanical properties, the patch can be worn on the neck or the chest for prolonged periods with minimal obstruction to the motion of the user. This capacity for long-term, continuous monitoring can produce valuable data to help clinicians diagnose various cardiovascular diseases, including heart valve dysfunction, poor circulation, and blockages and clots that may cause heart attacks.
In tests, the patch performed on par with commercial ultrasound probes typically used in clinics. Unlike commercial probes, the wearable sensor eliminates the need for a trained technician for operation. This reduces the labour involved and the risk of human error, while also presenting a promising pathway for point-of-care and at-home diagnostics.
Data transfer and power supply for the preliminary prototype rely on wires connecting the front-end sensor to the back-end acquisition and processing system. Now that the researchers have proved the accuracy of their sensor, they plan to focus on developing wireless capabilities that will allow users to go about their daily activities while recording data.
Material structures are rarely perfect, but researchers at the Japan Advanced Institute of Science and Technology (JAIST) have now identified a way to make them more so. By monitoring in real time how defects called dislocations evolve in a 2D form of silicon, the researchers uncovered a way of “healing” these defects that could yield fresh insights into how to accommodate similar irregularities in other nanomaterials.
“Dislocations can strongly affect the physical and chemical properties of a crystal,” explains study leader Antoine Fleurence. “Moreover, they can undergo ‘reactions’ when, for instance, strain is applied on the crystal or atoms are added to its surface. Studying how dislocations react can, therefore, provide crucial insights on how to cure these crystal defects.”
Perfect testbed
In their work, the researchers used scanning tunnelling microscopy (STM) to study a 2D layer of silicon, or silicene, placed on zirconium diboride. STM works by exploiting the tiny current which, thanks to quantum tunnelling, flows between a very sharp metallic tip and the surface of a sample that is typically kept less than 1 nm away. By controlling the distance to the sample as the tip is scanned across the surface, this current can be kept constant, and the change in the tip’s vertical position reflects the underlying topography of the surface. The technique’s spatial resolution is determined by the sharpness of the tip: if the tip terminates with a single atom, the microscope can resolve the structure of the surface at an atomic scale.
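The constant-current feedback loop described above can be caricatured in a few lines of Python: because the tunnelling current decays exponentially with the tip–sample gap, holding the current at a setpoint holds the gap fixed, and the tip height recorded at each pixel traces the topography. All parameters below (decay constant, gap, gain) are illustrative, not real instrument settings:

```python
import math

def stm_line_scan(surface, kappa=10.0, target_gap=0.5, gain=0.04, settle=100):
    """Toy constant-current STM line scan (distances in nm).
    The tunnelling current ~ exp(-2*kappa*gap) falls off exponentially
    with the tip-sample gap, so holding the current constant holds the
    gap constant. A proportional feedback loop nudges the tip height z
    at each pixel; the recorded z values then trace the topography."""
    i_set = math.exp(-2 * kappa * target_gap)   # current at the target gap
    z = surface[0] + target_gap                 # start on the setpoint
    trace = []
    for h in surface:                           # scan across the pixels
        for _ in range(settle):                 # let the loop settle
            i = math.exp(-2 * kappa * (z - h))
            z += gain * math.log(i / i_set)     # current too high -> retract
        trace.append(z)
    return trace

# A 0.1 nm atomic step shows up directly in the recorded tip height
profile = stm_line_scan([0.0, 0.0, 0.1, 0.1])
```

The exponential sensitivity is also why the tip must stay atomically sharp: the current is dominated by whichever atom sits closest to the surface.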
The researchers say that silicene is the perfect testbed for their experiments because it harbours an array of dislocations that disappear when a small number of silicon atoms are deposited on top of it. The challenge, explains Fleurence, is that it is impossible to predict where and when the silicene sheet will begin to transform once these silicon atoms are added.
A sequence of reactions
By monitoring the surface in real time for more than 24 hours – a challenge in itself, since they needed to maintain an atomically sharp STM tip throughout – the researchers were able to observe how an array of dislocations in silicene naturally accommodates the newly added silicon atoms in a way that minimizes the energy of the system. They found that the 2D material undergoes a sequence of reactions in which the silicon atoms become integrated into the silicene. Locally nucleated single-domain islands then form and propagate across the entire sheet, eventually resulting in a dislocation-free, single-domain structure. “We have now been able to observe all the steps in this process and to my knowledge, this is the first example of a dislocation reaction giving rise to their annihilation,” Fleurence tells Physics World.
While there are no direct applications for the work, the researchers say it could yield clues as to how such dislocations in crystals might be healed. They detail their research in 2D Materials.
Lithium-ion batteries are powering a green revolution on our roads. Innovations in materials, electrodes and cell design have boosted their energy density enough for electric-car drivers to travel hundreds of kilometres on a single charge, reducing the oft-cited barrier of “range anxiety”. At the same time, high-volume manufacturing processes have reduced costs – a crucial step in making zero-emission driving affordable, since an electric car’s battery is its single most expensive component.
Together with government initiatives to reduce our reliance on fossil fuels, these developments are expected to drive a rapid rise in sales of fully electric vehicles. China’s government has made electric transport a national priority, while generous incentives in Norway have persuaded around half of Norwegian drivers to buy electric. What’s more, 13 major economies, including the UK, Japan and Germany, have pledged to increase the market share of electric vehicles to 30% of all new car sales by 2030. If that happens, the International Energy Agency predicts in its 2021 Global EV Outlook that more than 250 million electric vehicles will be on the roads by 2030 – around 15% of the global fleet – with sales reaching 44 million per year.
Mass adoption of electric vehicles will, however, require a rapid expansion in manufacturing capacity for lithium-ion batteries. In a detailed technology roadmap, the German Association for Mechanical Engineering (VDMA) estimates that current manufacturing facilities, most of which are located in China, produce batteries with a combined capacity of a few hundred gigawatt-hours (GWh) per year. By 2030, the VDMA predicts that demand will grow to almost 3000 GWh – and potentially double that in the most optimistic scenario. As a result, both established cell manufacturers and a clutch of newcomers have announced plans for new facilities, with a greater presence in the US and Europe to supply car producers in those regions.
But building new facilities won’t just be a case of replicating existing ones. The underlying technology is still in flux. Lithium-ion batteries are currently made in three different formats, each of which needs a slightly different approach. New materials and electrode designs continue to boost performance and lifetime. And while relatively little R&D effort has so far gone into optimizing battery production, that may be about to change. “There is a huge cost pressure on automotive batteries, which is dictating a move to larger facilities – so-called gigafactories – and more automated processes,” says Marc Locke, a battery production engineer at RWTH Aachen in Germany. “Another major focus is to improve sustainability, both through reducing the amount of wasted material and by developing more efficient production processes that will significantly reduce the carbon footprint of battery manufacture.”
Charging up With manufacturing capacity for electric vehicle batteries predicted to grow by a factor of 10 or more between now and 2030, the incentive to streamline the process is strong. (Courtesy: iStock/3alexd)
Vacuum technologies have a major part to play in these improvements. Vacuum is already used in several stages of battery production, from the fabrication of the electrodes through to cell assembly, final finishing and test. In many of these steps, vacuum systems work to eliminate contaminants and remove residual moisture, improving the quality and performance of the battery while avoiding wastage.
In the first stage of battery production, active electrode materials are mixed in a slurry with binding agents and solvents. Most mixers operate under vacuum to prevent particulate ingress and avoid the formation of air bubbles, improving both the purity and the homogeneity of the paste.
Later, after the slurry has been processed and formed into electrodes, vacuum drying is used to remove all traces of moisture and solvents from the structure. Combining heat with vacuum speeds up evaporation and reduces the required temperatures, making drying more efficient. The vacuum environment is critical for producing high-quality electrodes, says Sina Forster, product manager for industrial vacuum at Leybold. “To maintain purity, oil-free vacuum pumps are deployed to ensure that no oil particles from the pump can get into the drying chamber and degrade the electrode,” she says.
Vacuum technology is also crucial when the assembled battery cells are filled with electrolyte and the finished battery is degassed. Here, high-quality vacuum maintains safety as well as preventing contamination, as the electrolytes are toxic and highly flammable. Dry-running pumps have become the most popular option at this stage since they can handle toxic gases without the need for pump oil and filters that would otherwise need frequent changing.
Despite the advantages vacuum technology brings, there is still room to improve efficiency
Greening the vacuum
Despite the advantages vacuum technology brings, there is still room to improve the efficiency of these production processes. Vacuum drying, for example, is both time consuming and energy intensive, requiring electrodes to be heated under vacuum for 12–30 hours. Indeed, an analysis in 2017 by Chris Yuan and colleagues at Case Western Reserve University in the US suggests that vacuum drying accounts for 47% of the total energy needed to manufacture the battery pack for a Nissan Leaf.
Adding infrared lamps can speed up the process, as can more powerful vacuum pumps. “The performance of the pumps has a direct influence on the process times, since in battery cell production we usually have big installations with high volumes,” Forster says. “In vacuum drying, for example, pumps usually offer pumping speeds of around 500 m³/h – like our DRYVAC DV650 – often with an additional booster.”
While the pumps themselves account for only a small part of the energy used in vacuum drying, Forster says that Leybold is nevertheless seeking more energy efficient solutions, both through hardware changes and by introducing systems that only provide vacuum when it’s needed. “Typically, vacuum pumps always run at full speed, even though the required vacuum level might change during a process cycle,” Forster explains. “Intelligent pumps can achieve significant energy savings by only providing the pumping speed that is required.”
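The demand-based pumping that Forster describes can be illustrated with a toy controller that throttles the pump once the chamber is at its pressure setpoint and runs flat out only when the pressure sits far above it. The speeds, gain and setpoint below are invented for illustration and are not Leybold control parameters:

```python
import math

def pump_speed(pressure_mbar, setpoint_mbar, full_rpm=6000, idle_rpm=1200,
               gain=2000.0):
    """Hypothetical demand-based control: scale pump speed with how far
    the chamber pressure sits above its setpoint (on a log scale, since
    process pressures span decades), clamped between idle and full speed."""
    error = math.log10(max(pressure_mbar, 1e-9) / setpoint_mbar)
    rpm = idle_rpm + gain * error    # above setpoint -> speed up
    return min(full_rpm, max(idle_rpm, rpm))

# Pumping down from atmosphere: full speed; holding the setpoint: idle
fast = pump_speed(1013.0, 1.0)
slow = pump_speed(1.0, 1.0)
```

Because pump power rises steeply with rotation speed, spending most of a long drying cycle near idle rather than at full speed is where the claimed energy savings would come from.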
Production engineers would also like to replace conventional batch drying methods – in which the electrodes are loaded into an oven and the pressure and temperature are changed until all traces of moisture have been removed – with a continuous version in which the electrodes move through a series of chambers with different environmental conditions. “It’s quicker and cheaper, and uses much less energy, but to be used in an industrial setting we still need better shutters and better insulation between the chambers,” Locke says.
In the longer term, Locke says that the goal is to replace the wet slurry currently used to mix electrode materials with a dry mixing and coating process. That would avoid the need for vacuum drying altogether, while also eliminating an earlier drying step in which a layer of the wet slurry is passed through a convection oven up to 80 m long. “Dry coating is the best solution both for cost and for energy consumption,” he says. “A great deal of research effort is focused on developing these dry processes, but no practical solution has yet been established.”
Filling the battery with electrolyte is another efficiency bottleneck. It’s a complicated process, as the highly porous structure of the electrode must be fully wetted for the battery to function properly. Using vacuum reduces filling times because it makes the process more precise, and because capillary forces help draw the electrolyte into the electrode structure. However, this filling and wetting process must often be repeated several times – and each time the pressure must be cycled between normal atmospheric pressure and vacuum levels of 100–300 mbar.
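That repeated cycling amounts to a simple pressure schedule: pump down into the 100–300 mbar window, vent back to atmosphere, repeat. A minimal sketch, with illustrative values – real cycle counts and vacuum levels vary by cell design and are not given in the article:

```python
def filling_schedule(n_cycles, vac_mbar=200.0, atm_mbar=1013.0):
    """Pressure schedule for vacuum electrolyte filling: each wetting
    cycle evacuates the cell to a level in the 100-300 mbar range, then
    vents to atmosphere so the pressure difference and capillary forces
    drive electrolyte into the electrode pores."""
    assert 100.0 <= vac_mbar <= 300.0, "vacuum level outside the typical window"
    schedule = []
    for _ in range(n_cycles):
        schedule.append(("evacuate", vac_mbar))
        schedule.append(("vent", atm_mbar))
    return schedule

# Three wetting cycles -> six pressure steps, ending at atmosphere
steps = filling_schedule(3)
```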
In the mix Vacuum technologies are involved in several stages of lithium-ion battery manufacture, from electrode production through to cell assembly, final finishing and test. Maintaining clean and dry conditions throughout electrode production and cell assembly helps prevent contaminants from degrading battery performance. (Courtesy: Leybold)
Engineers are investigating ways to speed up this filling and wetting process. One option is to optimize the configuration of the cell to enable more efficient wetting. Another is to pre-treat the inert elements of the cell – which tend to be made from hydrophobic polymer materials – to enable them to soak up the electrolyte more quickly. Locke and his team are also developing measurement technologies based on electrochemical impedance spectroscopy (EIS) to enable real-time monitoring of the wetting process. “The EIS measurement can tell us if the filling and wetting process is complete, or whether we need to add more electrolyte or apply a lower pressure,” he says.
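A wetting-completeness check of the kind Locke describes reduces, in caricature, to watching the cell's impedance fall and settle as electrolyte fills the pores. The threshold and window below are invented for illustration, not values from the EIS work:

```python
def wetting_complete(impedance_ohm, threshold_ohm=5.0, window=3):
    """As electrolyte wets the porous electrode, ionic pathways open up
    and the measured impedance drops; declare wetting complete once the
    last few readings all sit below a threshold."""
    tail = impedance_ohm[-window:]
    return len(tail) == window and all(z < threshold_ohm for z in tail)

# Impedance falling as the electrode wets, then settling low
done = wetting_complete([52.0, 21.0, 8.4, 4.6, 4.2, 4.1])
```

In practice the decision would feed back into the process – adding more electrolyte or lowering the pressure, as Locke notes – rather than simply returning a flag.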
More generally, greater efficiencies could be achieved by replacing the individual batch processes that dominate battery manufacture with a more continuous approach. Processes that require vacuum would then need to be seamlessly connected to other production steps – which again is driving a trend towards greater use of software. “More intelligent pumps can communicate with other process steps and manufacturing equipment to help increase throughput,” Forster explains. This joined-up approach could also reduce the need for clean and dry conditions throughout electrode production and cell assembly – which accounts for 30% of energy consumption according to the analysis by Yuan and colleagues – by replacing a single large, conditioned room with a series of so-called glove boxes or mini-environments.
Further advances in battery materials and design may also impact the production plants of the future. As well as new material chemistries, researchers are also pursuing alternative, solid-state battery designs that avoid liquid electrolytes altogether. “From our standpoint it is essential to keep up with market trends and technology developments,” Forster observes. “Only then can we estimate the impact on the vacuum demand and develop new solutions if necessary.”