Many children are naturally curious and have vivid imaginations – two qualities that make them well-suited for careers in physics. So why do many children eschew science when they are asked what they want to be when they grow up? That is a research interest of Carol Davenport at the UK’s Northumbria University, who talks about how to broaden the career aspirations of children in this episode of the Physics World Weekly podcast.
The move to greener sources of energy should create many new job opportunities for future physicists. Heat pumps offer a much more energy-efficient way to heat and cool buildings, and Physics World columnist James McKenzie explains how the technology could play an important role in helping the UK meet its commitment to a net-zero-carbon economy by 2050. He explains how heat pumps work and describes what it is like to live in a house that is heated by the technology.
One of the main goals in quantum computing is to experimentally demonstrate that a quantum machine can perform some computational task faster than a classical one. A team of researchers based in France and the UK has now done just that using a simple quantum photonics experimental set-up. Their work shows that it is possible for a quantum computer to verify solutions to problems classified as NP-complete using a so-called interactive proof protocol and only minimal, unverified information about the solution.
The work is among several recent milestones in demonstrating quantum advantage. In 2019, Google claimed to be the first to the finish line with its set-up of 53 programmable superconducting qubits (quantum bits). More recently, a team in China announced that they had successfully performed “boson sampling”, a task known to be hard for a classical computer. Unlike these previous results, however, the new research, which is published in Nature Communications, not only demonstrates quantum advantage but also promises to be useful in applications like secure quantum cloud computing.
NP verification
Although NP-complete problems are hard to solve efficiently, once a solution is found it can be verified trivially. The challenge that the team at CNRS (the French National Centre for Scientific Research) and the University of Edinburgh focused on occupies a middle ground between these two extremes: verifying the solution to an NP-complete problem when provided with only part of that solution.
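To see why checking a complete solution is easy, here is a minimal sketch (illustrative only, not taken from the paper) that verifies a candidate assignment for 3-SAT, a canonical NP-complete problem, in time proportional to the number of clauses:

```python
# Illustrative only: verifying a full 3-SAT solution is fast.
# Literals are integers: +k means "variable k is true", -k means "false".

def verify_sat(clauses, assignment):
    """Return True if every clause contains at least one satisfied literal."""
    return all(
        any((lit > 0) == assignment[abs(lit)] for lit in clause)
        for clause in clauses
    )

# (x1 OR NOT x2 OR x3) AND (NOT x1 OR x2 OR x3)
clauses = [(1, -2, 3), (-1, 2, 3)]
assignment = {1: True, 2: True, 3: False}
print(verify_sat(clauses, assignment))  # True
```

Finding a satisfying assignment in the first place is what is believed to be hard; checking one, as above, is not.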
When the size of the message containing the partial solution, or proof, is fixed, it can be shown that a classical protocol for verifying the solution will take an amount of time that scales exponentially with the size of the message. For the quantum protocol, in contrast, the scaling is polynomial. This means that for large message sizes, a quantum computer would take minutes to verify the solution while a classical one could take years.
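To get a feel for that gap, the toy comparison below uses arbitrary constants (they are not the bounds derived in the paper) and simply contrasts an exponential with a polynomial verification cost as the proof size m grows:

```python
# Toy illustration: exponential versus polynomial growth in the proof size m.
# The per-step time and the exponents are arbitrary; only the trend matters.

STEP = 1e-6             # assume each elementary step takes one microsecond
SECONDS_PER_YEAR = 3.15e7

for m in (20, 40, 60, 80):
    classical = (2 ** m) * STEP   # exponential scaling (seconds)
    quantum = (m ** 2) * STEP     # polynomial scaling (seconds)
    print(f"m = {m}: classical ~ {classical / SECONDS_PER_YEAR:.1e} years, "
          f"quantum ~ {quantum * 1e3:.2f} ms")
```

Even at modest proof sizes the exponential cost runs to years while the polynomial cost stays at milliseconds, which is the essence of the speed-up being claimed.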
The algorithm the researchers use to demonstrate this is known as an interactive proof protocol. Here, one component of the experimental set-up acts as a “prover”, using coherent light pulses to send partial solutions to the NP-complete problem in the form of a quantum state. The second component fills the role of the “verifier”, deciding with high accuracy whether the solution is correct based on the limited information given. When certain bounds are placed on the verifier’s expected accuracy, as well as on the protocol’s speed and on the amount of information that can be communicated during the interaction, it is possible to demonstrate that the quantum algorithm far outperforms any classical attempt at the same task.
Quantum cloud computing
By showing that a quantum algorithm can verify solutions to NP-complete problems efficiently, the result could allow for new applications in secure remote quantum computing. A client with a rudimentary quantum machine could, for example, verify information they receive from a powerful quantum server without ever having access to the full solution. Such proof systems could then contribute to protocols like secure identification, authentication or even blockchain in a future quantum Internet. “In the current era of increasing focus on data privacy and secure computing, our demonstration provides yet another compelling piece of evidence that quantum computers can outperform their classical counterparts in achieving secure solutions,” adds Niraj Kumar, an Edinburgh researcher and a co-author on the paper.
I’m part of a team that provides support for computer, control and data acquisition systems for plasma operations at the DIII-D tokamak. Before the pandemic, I would arrive at the fusion research facility at around 7 each morning to check the readiness of our systems before attending a standing-room-only meeting. Other attendees included physicists overseeing and participating in the experiment, plus representatives from operations support groups in power, vacuum, heating, water, cryogenics, diagnostics and computer systems – as many as 60 people in total.
After that, I would spend most of my day in the DIII-D control room. This was a busy environment, with every seat taken and all eyes focused on large display monitors showing live tokamak status and experiment results. The physics session leader overseeing the experiment would sit at the front, working closely with other physicists and with the chief operator who oversees tokamak operations and safety. At nearby stations, other groups of scientists would monitor diagnostics and analyze results. These individuals would be in close contact with the session leader, often having direct personal conversations to provide immediate assistance and feedback.
In addition to the staff in the main control room, we had teams spread throughout the facility to support various aspects of the tokamak’s operation. These groups kept in contact via a direct overhead paging system, landline phones and walkie talkies. Doing experiments on DIII-D requires input from a lot of people, so there could be as many as 120 of us in the facility during a day of experiments.
From hands-on to mostly remote
The pandemic changed all of this. Since the start of restrictions, most fusion research staff have been working offsite, with only a few allowed in the DIII-D facility. We were, however, fortunate in that DIII-D staff had extensive experience in working with other tokamaks across the world and even, in at least one case, controlling a tokamak remotely from San Diego. This knowledge gave us a good base on which to build.
The first step was to provide offsite staff with access to the full set of computer systems and applications needed to run experiments in a secure manner. Since quite a few of these tools control critical pieces of hardware, it was crucial that access was restricted to authorized personnel.
A rare sight: Since the beginning of the pandemic, the DIII-D facility has taken steps to limit the number of staff who have to be physically present at its site in San Diego, California. (Courtesy: General Atomics)
The next step was to create a virtual control room that would allow offsite staff to see and hear the same information they were accustomed to having at the facility. We did this by adding to and improving upon an existing set of web-based displays showing live tokamak status information, real-time signal plots, and a tokamak plasma cross-section animation. We also set up two live camera feeds of the DIII-D control room along with broadcasts of the on-site paging system.
The final and most crucial step in setting up remote operations was to provide a communications infrastructure that could accommodate coordination amongst a large and varied group of individuals working both onsite and offsite. Initially, we tried having everyone in a single Zoom meeting room, but this quickly ran into problems. Communications during experimental operations tend to be very ad hoc and dynamic, and as the number of people in the meeting grew, it became very difficult to manage these kinds of conversations. With close to 100 people signing on, it was hard for individuals to break into general discussions or for smaller groups to have conversations amongst themselves – the friction involved in jumping from conversation to conversation was just too high.
A new use for a gaming app
Instead, we settled on Discord, an application that is mainly used in the video-gaming community. Our Discord server provides a platform for DIII-D group members to communicate on an as-needed basis, as well as listen in and participate in the main experiment. Discord’s architecture makes it possible for many users to “meet” in one place, while also allowing them to break off into smaller groups and move between separate dedicated channels. It proved very effective in bringing all the different DIII-D groups together.
Another problem that Discord helped solve related to contacting users working offsite. Thanks to Discord, people who could no longer answer phones in their offices could be reached easily by voice or text while signed into the app. The initial adaptation and deployment were a challenge – we had a large number of users requiring access to Discord, and they all needed to be registered and trained so they could fit it into their day-to-day work – but it has been hugely beneficial to our operations.
Telecommuting to a tokamak
Hands on: A pre-pandemic panorama of the tokamak at the centre of the DIII-D National Fusion Facility. (Courtesy: General Atomics)
Thanks to our remote-operations solutions, most of our scientific and support staff can, in effect, run the tokamak without having to step outside their homes. Scientists can analyze all aspects of the plasmas and status of the tokamak in real time, while coordinating and collaborating with one another using tools such as Discord and Zoom.
Of course, there are still some activities that require onsite staff to touch hardware in the lab. These include calibrating equipment at the start of each day, performing routine maintenance, and making necessary repairs and upgrades. To limit the number of people engaged in these tasks, we provided onsite workers with tablets equipped with cameras and microphones to connect them with remote experts who could offer guidance on any repairs or adjustments the hardware needed.
What’s missing – and what isn’t
Although the face-to-face camaraderie of working alongside colleagues in our quest for fusion has been much missed, we’ve found ways to adjust. We’ve also been a bit surprised at how easy and natural it’s become to do high-level physics experiments from our homes. In some cases, the transition to remote operations has even made our work more efficient. One example is the improved communications between different groups. It is now much more straightforward to find and contact the right people to resolve an issue thanks to our new Discord-based central communication system.
It remains to be seen whether some people will grow less inclined to sign on to communications tools such as Discord as more staff return to working onsite. However, the improvement in overall communications efficiency it offers should give people an incentive to continue using it.
More generally, it is easy to see how many of the tools developed to support remote operations during the pandemic could be left in place as more staff return to onsite working. The live web-based status displays, real-time plots and control room videos are as useful and effective to users working from home as to those working in their own offices. These tools give us greater flexibility and efficiency in supporting operations and monitoring the experiment.
Combining materials with different swelling ratios creates structures that transform into tubes when exposed to water. (Courtesy: Yu Bin Lee)
Materials that controllably change shape over time – often called four-dimensional (4D) materials – are excellent candidates for advanced tissue engineering applications. Many 4D materials, however, have only been loaded with low concentrations of cells, potentially limiting their use in regenerative medicine.
To address this problem, researchers from the University of Illinois at Chicago have developed a 4D biomaterial system using two types of biocompatible hydrogels. Manufactured as sheets, the materials curl into tubes when exposed to water. Each new hydrogel can support cell densities as high as 100 million cells/mL – approaching the same order of magnitude as that found in developing and healing tissues.
“Using a high density of cells can be advantageous in tissue engineering as this enables increased cell–cell interactions that can promote tissue development,” explains lead author Eben Alsberg in a press statement. The researchers describe their work in Advanced Functional Materials.
Water-activated shape change
Tissue development is a highly dynamic process. Clusters of cells are organized through a series of complex architectural changes until the final tissue structure is formed. It may therefore be beneficial for tissue engineered scaffolds to respond to, and even replicate, the geometric changes that occur within the tissue on the same time scale.
This is where shape-shifting materials shine. The researchers hypothesized that stacking hydrogels with different swelling rates would create a 4D material that gradually changes shape as it absorbs water. By controlling the spatial distribution of each hydrogel throughout the scaffold, the extent of swelling (and therefore shape change) could then be regulated over time.
First, the team investigated the swelling properties of two biocompatible hydrogels: oxidized and methacrylated alginates (OMAs) and methacrylated gelatin (GelMA). They not only found that OMAs swell more than GelMA, but that OMA expansion could be further augmented by altering its chemistry. The higher the degree of OMA oxidation, the faster its degradation rate and the more it expands.
Next, the researchers submerged a series of flat, bilayered OMA/GelMA scaffolds in cell culture media and monitored their subsequent deformation. Over the course of 21 days, each scaffold curved into a “C” shape, with some forming closed circles or even rolled, spiral-like structures. In each case, the OMA layer dictated the overall shape change because it absorbed water faster. What’s more, the extent of curling could be controlled by changing the thickness of the OMA layer or varying the OMA oxidation level.
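A rough way to rationalize these trends (a standard bilayer-bending estimate, not an analysis from the paper) is that the curvature \( \kappa \) of a swelling bilayer scales as

\[
\kappa \;\propto\; \frac{\Delta\varepsilon}{h},
\]

where \( \Delta\varepsilon \) is the differential swelling strain between the OMA and GelMA layers and \( h \) is the total sheet thickness, with a prefactor set by the thickness and stiffness ratio of the two layers. Raising the OMA oxidation level increases \( \Delta\varepsilon \), while changing the OMA layer thickness alters both \( h \) and that prefactor, so both knobs tune how tightly the sheet rolls.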
The material is also compatible with techniques like photolithography and bioprinting, which allowed the team to create 4D hydrogels with more complex starting geometries and shape transformations.
“Using our bilayer hydrogels, we can control how much bending the material undergoes and its temporal progression,” says first author Yu Bin Lee.
The team from the University of Illinois at Chicago (from top left): Yu Bin Lee, Oju Jeon, Sang Jin Lee, Aixiang Ding, Derrick Wells and Eben Alsberg. (Courtesy: Eben Alsberg)
Record-breaking cell encapsulation
To test the impact of cell density on 4D shape change, the researchers loaded each hydrogel layer with varying concentrations of fibroblast cells or stem cells. The material maintained its rolled structure after three weeks, even at extremely high cell densities (1.0 × 10⁸ cells/mL).
Prior to this study, the highest reported concentration of cells encapsulated within a shape-changing material was 1.0 × 10⁷ cells/mL – 10 times lower than that achieved by the team in this work. Both cell types remained viable throughout the 21-day period. Importantly, the stem cells could perform normal cellular activities like differentiation.
The researchers hope that the material could be used to mimic a range of target tissues that have varied cell concentrations. “This system holds promise for tissue engineering, but may also be used to study the biological processes involved in early development,” says Lee.
Cattle battle: how to measure and combat methane emissions from cows
“Consider a spherical cow” is probably the start of a joke about the abstractions of theoretical physicists. But in the real world, cows are no laughing matter.
That’s because when they chomp on grass and straw, cows produce methane, which they belch out. And while that might be pleasant for the cow, it’s not great for the environment given that methane is a potent greenhouse gas.
Studying bovine emissions and trying to reduce the amount of methane they emit is therefore a clever, short-term solution to climate change.
But the tricky bit is measuring how much methane cows produce in the first place – especially from a herd of them.
As Michael Allen explains in the cover feature of the April 2021 issue of Physics World magazine, physicists have turned to drones carrying spectrometers and even “frequency combs” – sensitive, laser-based systems that bagged a Nobel prize.
For the record, here’s a run-down of what else is in the issue.
• Pulled paper slows hunt for Majorana – Physicists remain convinced that the elusive Majorana particle will be found, despite the retraction of a paper claiming its discovery, as Alexander Hellemans reports
• China powers ahead in neutrino physics – Ling Xin examines the legacy of the recently closed Daya Bay Reactor Neutrino Experiment on neutrino physics, US–China collaborations and future neutrino facilities
• Changing bad exam habits – Paolo Elias says that the COVID-19 pandemic offers the chance to revamp how we assess physics students at school and college
• A decade of success – James McKenzie celebrates some of the firms that have won business innovation awards from the Institute of Physics over the last 10 years
• Crisis in a lockdown – Robert P Crease describes how news of a radiation leak at a US neutron facility was handled in today’s online, networked and locked-down world
• Battling bovine belching – Cutting methane production from livestock is considered vital to climate change mitigation, with lots of research focusing on how animals breed and are fed. But physicists are playing their part too by developing ways to measure the emissions from cattle, using techniques such as spectroscopic analysis and aerial sampling, as Michael Allen discovers
• A century of nuclear isomers – One hundred years after “nuclear isomers” were first discovered, Philip Walker and Zsolt Podolyák pick five examples of these long-lived, excited nuclear states to show why they are so important in medical physics and beyond
• Solar superpower – Multi-layered solar cells stand on the brink of 50% efficiency, but the practical benefits of such high-efficiency cells are more likely to be realized in space than on Earth, as Richard Stevenson explains
• Mission to Mars – Andrew Glester reviews Dream Big: How to Reach for Your Stars by Abigail Harrison
• The surprises essential to life – David Appell reviews Seven Pillars of Science: the Incredible Lightness of Ice, and Other Scientific Surprises by John Gribbin
• Living in a materials world – Materials scientist Arnab Basu, head of radiation-detection technology developer Kromek, talks to Tushna Commissariat about founding a spin-off, the challenges of COVID-19 and looking to the future
• Ask me anything – Tim Gershon is a professor of physics at the University of Warwick, UK.
• Counting muons in schools – Andrew Ferguson on how lockdown didn’t stop his project to get school pupils into particle physics
An international team of researchers has used Google’s Sycamore quantum computer to power an online Zoom meeting for the first time. The US tech giant’s device, which consists of 53 programmable superconducting quantum bits, has already been shown to outperform classical computers at certain tasks. The new discovery could allow meeting participants to appear in more than one break-out room at the same time – a phenomenon that the team has dubbed “quantum Zoom advantage”.
Conventional, classical computers store and process information as bits that can have one of two states – “0” or “1”. But quantum computers like Sycamore exploit the ability of quantum particles to be in “superposition” of two or more states at the same time. N such qubits can therefore be combined or “entangled” to represent 2^N values at once, allowing quantum devices to process information in parallel on a massive scale.
This unprecedented power has now been exploited for the first time in a video call when Benedetta Brassard – a quantum physicist at the University of Waterloo in Canada – accidentally installed Zoom on Sycamore during an online meeting. Brassard is part of the International Fault Tolerant Benchmarking Team (FiT/BiT), which she set up to diversify participation in measuring the performance of quantum computers.
Distracting meme
“I was in a FiT/BiT board meeting and just thought I would have a quick check of the Sycamore dashboard to see how my quantum calculation was going,” Brassard told Physics World. But after being distracted by an amusing Shor’s algorithm meme, she somehow ported the Zoom meeting to Google’s quantum processor.
The 11 participants became encoded in Sycamore’s 53 superconducting qubits and found themselves in confusing quantum superpositions of Zoom settings. “Some colleagues were telling me that I was on mute, while others could hear me,” recalls Brassard.
“I knew something was really wrong when multiple versions of the meeting kept popping up on my screen”. Brassard now believes that Sycamore was using the “many worlds” interpretation of quantum mechanics while running Zoom. “The only way to steer it back to the classical world was to keep making measurements – which meant that I actually had to pay attention to what other people were saying”.
Fortunately for her fellow FiT/BiT members, Brassard had supervised a PhD student on the implementation of Instagram on D-Wave’s 2000Q quantum annealer. “The problem was to work out the optimum time of the day for influencers to post pet-related images – which we discovered is an NP-hard problem,” she explained. As a result, Brassard already knew how to transform an app from a quantum to a classical state.
Brassard and colleagues have published a paper describing the Zoom incident in the journal Quantum Advances in Computing and Correlation (QUACC). They now hope to develop a quantum formalism to allow meeting participants to exist in multiple break-out rooms at once. “This could lead to the real quantum advantage of making online meetings shorter and more bearable,” she said.
Quantum sensors based on cold-atom interferometry are among the most accurate instruments in fundamental physics, with predicted applications that include mapping underground structures and creating more precise navigational systems. Their speed, however, is limited by the fact that the measurement process typically destroys the carefully prepared atomic sample, meaning that a new sample must be created for each measurement. This takes a few hundred milliseconds for even the fastest sensors.
Researchers at SYRTE in France’s Observatoire de Paris have developed a new non-destructive method that uses microwaves to measure the number, or population, of atoms in specific quantum states. The new method enables experimenters to perform quantum sensing measurements nearly 30 000 times a second – a rate that could make it possible to complete large surveys on a timescale competitive with current commercial devices.
The principle of microwave detection
Non-destructive measurements of cold atoms can be performed using existing methods. However, the complex optical systems often required make it difficult to create compact, practical sensors. The SYRTE team of William Dubosclard, Seungjin Kim, and Carlos L Garrido Alzar solved this problem by developing a system based on microwaves instead. Their solution uses the fact that microwave power radiated by an antenna into a medium depends on the radiation resistance of that medium.
Detection boost: A diagram illustrating the spectral sensitivity of reflected microwaves due to the presence of an atomic resonant transition. This is the working principle of the non-destructive detection. (Courtesy: C L Garrido Alzar)
In the SYRTE experiment, the medium is a sample of 10 000 rubidium atoms prepared at temperatures of around 3 µK. By using an antenna to direct a beam of microwaves onto the atoms, and then observing the microwave signal the atoms reflect, the experimenters could detect the atoms’ quantum state. Although the microwave reflections are weak, the researchers saw clear variations in them when they scanned the microwave frequency across a resonant atomic transition.
Proving it works: detecting cold atoms
The team proved the non-destructive nature of their detection method by measuring coherent quantum effects known as Rabi oscillations. These sinusoidal patterns appear as populations of atoms oscillate between two atomic energy states when a near-resonant driving field is applied, and they form the basis of cold-atom interferometry. In a typical experiment, these patterns are observed by creating multiple atom samples and taking one data point per sample. In this case, however, the researchers managed to observe Rabi oscillations with a single sample – by using one microwave antenna to drive the oscillations and another to perform repeated detection.
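For reference, in the textbook two-level picture (not specific to this set-up), an atom driven exactly on resonance has an excited-state population that oscillates as

\[
P_e(t) = \sin^2\!\left(\frac{\Omega t}{2}\right),
\]

where \( \Omega \) is the Rabi frequency set by the strength of the driving field. It is the amplitude and period of this oscillation that a genuinely non-destructive detection scheme must leave intact.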
They observed no difference between the amplitude of the oscillations obtained with their method and that obtained with the multiple-sample approach. This confirms that their technique did not cause additional atom loss or decoherence, even as it increased the detection bandwidth to 30 kHz.
Garrido Alzar tells Physics World that they now plan to characterize the noise level of their novel detection method and investigate how it would affect the performance of quantum inertial sensors.
Researchers in the US have improved the resolution for imaging the human eye by a third. This has enabled them to visualize the mosaic of rod and cone photoreceptors at the back of the eye in greater detail than ever before and allows the assessment of individual photoreceptors. The new imaging technique could facilitate earlier detection of diseases such as age-related macular degeneration and enhance the monitoring of treatments for eye diseases, the team claims.
Age-related macular degeneration is a progressive eye disease and a leading cause of sight loss. People with age-related macular degeneration lose the ability to see fine detail in their central vision, as the cells at the centre of their retina become damaged. Early diagnosis and treatment are crucial to preserve sight, in macular degeneration and other eye diseases.
“The goal of our research is to discern disease-related changes at the cellular level over time, possibly enabling much earlier detection of disease,” says Johnny Tam, at the National Eye Institute in Maryland. As well as improving the monitoring of degenerative changes in retinal tissue, enhanced image resolution could also help doctors see whether treatments for eye diseases are working and assist with the development and assessment of new therapies.
Imaging the eye is challenging. The light-distorting properties of parts of the eye like the lens and cornea reduce image resolution. Then there’s the diffraction limit of light to consider, with most conventional techniques for imaging beyond the diffraction limit using too much light to be safe for the eyes. Current retinal imaging techniques that use near-infrared light have a resolution of around 2−3 µm, while the smallest rod and cone photoreceptors range from 1–3 µm in diameter.
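For a rough sense of where that figure comes from (assumed typical values, not numbers from the study), the diffraction-limited spot size on the retina through a fully dilated pupil is approximately

\[
\delta \;\approx\; 1.22\,\frac{\lambda f}{d} \;\approx\; 1.22 \times \frac{0.79\ \mu\text{m} \times 17\ \text{mm}}{7\ \text{mm}} \;\approx\; 2.3\ \mu\text{m},
\]

taking near-infrared light of wavelength \( \lambda \approx 790\ \text{nm} \), an effective eye focal length \( f \approx 17\ \text{mm} \) and a dilated pupil diameter \( d \approx 7\ \text{mm} \) – comparable to the size of the smallest photoreceptors, which is why pushing beyond the diffraction limit matters here.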
To improve resolution beyond the diffraction limit, and allow these individual cells to be distinguished, Tam and colleagues combined two imaging techniques: annular pupil illumination and sub-Airy disk confocal detection. By combining these two approaches, the researchers improved upon a conventional retinal imaging technique known as adaptive optics scanning light ophthalmoscopy, which uses deformable mirrors and computational methods to correct for optical imperfections of the eye in real time.
Annular pupil illumination generates a hollow beam of light. This improves the transverse resolution across the mosaic of photoreceptors, but reduces depth resolution. However, the researchers then used a very small pinhole – known as a sub-Airy disk – to block the light coming back from the eye, which allowed them to regain the depth resolution.
“One might think that more light is needed to get a better image, but we demonstrate that we can improve resolution by strategically blocking light in various locations within our instrument,” explains Tam. “This approach reduces the overall power of light delivered to the eye, making it ideal for live imaging applications.”
The researchers tested their technique on five adults with no sign of eye disease. Describing their research in Optica, they claim that this approach improved resolution across the photoreceptor mosaic by 33% and depth resolution by 13%, compared with conventional adaptive optics scanning light ophthalmoscopy. This allowed them to reveal subcellular features of photoreceptors that were not clearly visible with previous techniques.
The new imaging technique can acquire images of the smallest cone photoreceptors in the living human eye (left). The team also combined this approach with non-confocal split detection (right) to better see the inner segments of the same photoreceptors. (Courtesy: Johnny Tam, National Eye Institute)
“The ability to noninvasively image photoreceptors with subcellular resolution can be used to track how individual cells change over time,” says Tam. “For example, watching a cell begin to degenerate, and then possibly recover, will be an important advance for testing new treatments to prevent blindness.”
The researchers say that this clear improvement in the ability to obtain higher resolution images of the retina could help answer fundamental questions about photoreceptor health. They add that their technique provides a straightforward method of achieving sub-diffraction limited resolution in point-scanning-based microscopy and imaging approaches. This could enable routine sub-diffraction imaging of cells in the human body, they state, and be useful in other applications where it is important to image with low levels of light.
Primary-school children, and the rest of us too, are continuously being showered by unseen muons, heavier relatives of the more familiar electron. These muons are created when energetic cosmic rays, including protons and alpha particles, hit our atmosphere and produce a shower of particles as they slow down. At sea level, muons arrive at a rate of about one per square centimetre per minute.
Muons remain unnoticed unless you have the right equipment to look for them. Several years ago I was therefore excited to read about a successful US-based outreach project called Cosmic Watch. Started by particle physicists, it allows members of the public to make muon detectors for less than $100 and observe these tiny particles for themselves. At the heart of the detector is a silicon photomultiplier chip, which measures the few blue photons emitted by a plastic scintillator whenever a muon passes through.
Inspired, I started to build muon detectors based on the Cosmic Watch design. But when I had got one working, I needed something to do with it. A work trip to Belgium on the Eurostar train presented one such opportunity. I took the detector with me (curiously, no questions were asked at security) and, sure enough, as we travelled through the Channel Tunnel between Britain and France, it recorded a lower rate of muons than at sea level. The sea and seabed were shielding the detector.
The rate of muon detections dropped as the detector went through the Channel Tunnel.
As fun as that was, muonic measurements are better shared. So, along with another physicist parent, Lisa Ibberson, I got in touch with Kate Cooke, who teaches science at Coton Church of England Primary School in Cambridgeshire, which my son attends. Together, the three of us applied for money from the Institute of Physics School Grants Scheme. Our idea was to work with pupils to teach them about muons, get them to design a muon detector and finally install it in the school.
In July 2019 we were delighted to hear that our grant was successful and the real work began. We held an initial assembly at school that October using a water pistol and a few slides to introduce ourselves and muons. With the pupils in years 3 and 4 (ages 7–9), we drew pictures of the cascading particles resulting from a cosmic-ray air shower. Meanwhile, the children in years 4 and 5 (ages 8–10) were in charge of how the detector looked.
Together we decided that it should have a muon counter and a display that flashed different colours, depending on the muon’s energy. We jointly defined various parameters of the display, including its size, the colours of the flashes and the number of digits on the counter. Finally, we talked with year-6 pupils about the data the detector would produce, with the help of some edible Smartie bar graphs of course. And behind the scenes we were busy ordering printed circuit boards, soldering components and programming microcontrollers.
In January 2020 we returned to the year 4/5 class with a red flashing prototype encased in a shoe box. We got some great feedback. The colour red was no good – it was too much like a warning light and too bright. We therefore dimmed the display and democratically chose blue, green and amber for the colours. The pupils also told us we needed a switch to turn off the display when the flashing got too distracting. Finally, we decided to have eight digits in the detector so it could count to 99,999,999 muons – over roughly the time pupils spend in school (at a count every two seconds we were expecting about 15 million counts per year).
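A quick back-of-the-envelope check of that figure, assuming a steady rate of one count every two seconds:

\[
0.5\ \text{counts/s} \times 86\,400\ \text{s/day} \times 365\ \text{days} \approx 1.6\times10^{7}\ \text{counts per year},
\]

so the eight-digit display gives the counter room for several years of running before it rolls over.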
The pupils designed the muon detector with 8 digits, so it could count up to 99,999,999 muons.
Unfortunately, when the pandemic struck we couldn’t continue to work directly with the children. Instead, over the summer I worked at home, quietly improving the detector’s electronics. In fact, on walks through the village where I live, people would often ask me what had happened to the muon project. I’d tell them we’d get back to it when we could and, fortunately, schools re-opened in September 2020 and we started to think about the final switch-on.
Originally we envisaged a school assembly with a rowdy group countdown to the detector being turned on. We couldn’t do that with social-distancing measures in place, so instead planned a virtual switch-on for December 2020 with pupils from years 4 to 6 in their classrooms, Cooke at school, and me and Ibberson joining from our homes. And so, after revving the pupils up with a quiz to remind them about muons, we turned the device on.
It worked, phew! We then spent a few minutes watching the detector count up to 60, before opening the floor to questions. And, wow, what great questions they asked. How can a muon travel through 10 metres of concrete? Why do they decay into electrons? If the detector had a bigger area, would it count more muons? Has a detector like this ever been made before? Where do the cosmic rays that generate the muons come from?
Overall, the project was a great success. We had wonderful engagement from the pupils, who had contributed to the design of a fun scientific instrument. In fact, the switch-on of the detector, though virtual, was every bit as exciting as if it had happened in school. Looking to the future, we hope that this strange machine from a strange year counts up to 99,999,999 many times over its lifetime – and that it continues to provoke curiosity from primary-school pupils in Coton for years to come.
While X-ray imaging is routinely employed in medical diagnosis and in industry for inspecting materials like semiconductors for defects, existing X-ray machines cannot image curved three-dimensional objects with high resolution. A team led by researchers at the National University of Singapore (NUS) and Fuzhou University in China has now developed a new flexible X-ray sensor that can do just this. The device, which relies on a series of nanoparticles that emit light for a long time after being excited with X-rays – a phenomenon known as persistent radioluminescence – might find use in healthcare applications such as portable X-ray detectors for mammography and imaging-guided therapeutics.
The X-ray detectors in today’s X-ray machines are usually flat panels in which each pixel has its own integrated circuit. This set-up makes the pixels bulky and limits the resolution of the detector, explains team member Xiaogang Liu at the NUS’ Department of Chemistry. The panels’ flatness also means the detectors struggle to capture images of curved objects.
Wrap-around device
In their work, Liu and colleagues focused on lanthanide-doped nanomaterials, which have unique luminescent properties that are already widely exploited in X-ray scintillation, optical imaging, biosensing and optoelectronics. They began by doping sodium lutetium fluoride (NaLuF₄) nanocrystals with ions of the rare-earth element terbium (Tb³⁺). They then embedded the doped nanocrystals (which are denoted as NaLuF₄:Tb@NaYF₄) into silicone rubber to make a highly flexible X-ray detector that can be wrapped around 3D objects.
The next step was to excite the NaLuF₄:Tb@NaYF₄ with X-rays generated at 50 kV. When they did this, the team observed that the material emitted intense light long after the source of X-rays had been removed. The light persisted for more than 30 days, which means it can be used to image objects throughout this time.
Slow “hopping” charge carriers
Liu and colleagues explain that the light is emitted as the lutetium ions in the NaLuF₄:Tb@NaYF₄ lattice absorb the energy of the X-rays, generating many energetic electrons in the process. When X-ray photons collide with small fluoride ions in the material, flaws known as anion Frenkel defects form in the nanocrystal and trap the energetic charge carriers (electrons and holes) created. The prolonged radioluminescence in the material comes from these electrons slowly “hopping” through the crystal scaffold towards the Tb³⁺ ions and radiatively recombining with hole–Tb³⁺ centres, they say.
Liu acknowledges that other persistently luminescent materials already exist. Phosphors are one prominent example, and in 2011 a team at the University of Georgia, US, reported that ZnGa₂O₄:Cr³⁺ phosphors have an afterglow lifetime of approximately 15 days. However, Liu notes that these other materials are either not very sensitive to X-rays or are difficult to manufacture at the nanoscale, which makes them unsuitable for making flexible detectors.
Sub-25-micron image resolution
The NUS team’s imaging technique, which they call X-ray luminescence extension imaging (Xr-LEI), can be used to produce images with a resolution of less than 25 micrometres, the researchers say. “Many research groups, including ours, have been taking on challenges in X-ray imaging over the past few years,” Liu notes. “The technology we report on may provide a much-needed solution for imaging highly-curved 3D objects. It could be particularly suitable for applications like point-of-care X-ray radiography and screening mammography without having to compress the breast, which is uncomfortable for the patient.”
As well as healthcare applications, the technique might also be used to detect defects in electronic materials like semiconductors, authenticate works of art and examine archaeological objects at the micron scale, he adds.
The researchers, who report their work in Nature, say they now plan to optimize the performance of their persistent luminescent nanomaterials to further reduce X-ray dosage and exposure time. “We will also be pursuing the development of dynamic X-ray imaging techniques that would benefit real-time monitoring of biological processes of living organisms,” Liu tells Physics World.