The ability to make artificial atoms containing exotic particles in place of electrons is giving physicists a new way of probing fundamental interactions. Now, researchers have created and interrogated a novel kind of helium atom in which one of the electrons is replaced by a sub-atomic particle known as a pion. The work could shed light on the nature of both pions and neutrinos – tiny, neutral particles for which certain attributes, including mass, remain relatively poorly understood.
Among the particles previously used to make these unusual atoms is the muon, which is about 200 times as massive as the electron but otherwise has identical properties. In 2010 Randolf Pohl and colleagues at the Max Planck Institute of Quantum Optics (MPQ) in Garching, Germany, carried out spectroscopic measurements on muonic hydrogen. They used their data to great effect, calculating a value for the proton’s charge radius that was completely at odds with the then-accepted value and forcing other groups to try to resolve the discrepancy.
Creating atoms containing pions is more difficult. Pions are mesons, meaning that they consist of a bound quark and anti-quark. When fired at high speed into a dense target, some of them will form atoms. But they also create lots of background particles, which makes it hard to pick out the atoms. What’s more, pionic atoms, like the pions themselves, have very short lifetimes, which has made it difficult to excite them with a laser beam and measure their internal transition frequencies.
Long-lived atoms
Masaki Hori, also at the MPQ, and colleagues have now overcome this barrier by creating relatively long-lived atoms of pionic helium. In an experiment at the Paul Scherrer Institute in Switzerland, they used an intense beam of protons to generate negative pions in a carbon target, then collided the pions with atoms of superfluid helium-4.
Most of the incoming pions slam directly into a helium nucleus, causing the helium to split into a proton, neutron and deuteron within around a picosecond (10⁻¹² s). But roughly 2% manage to displace one of helium’s two electrons and enter a weakly-bound orbit around the nucleus. The bound pions maintain this orbit for several nanoseconds (10⁻⁹ s), as they are protected from the effects of thermal collisions with other atoms by both the remaining electron and the very cold target, which is held at just a couple of degrees above absolute zero.
Left to their own devices, the pionic helium atoms will eventually split and generate fission products that can be picked up by detectors positioned around the target. The problem is that the signal from these fission products gets swamped by the signal from the nucleons created by the remaining 98% of incoming pions. The pions arrive in pulses every 20 ns, thanks to the 50 MHz frequency of the accelerator cavities, and they produce big peaks in the detector counts. In contrast, the spontaneous decays of the metastable pionic atoms produce only a smooth background that cannot be picked out.
Transition frequency measurement
This is where the laser spectroscopy comes in. As they report in Nature, Hori and colleagues synchronized an infrared laser with the pion beam so that it fires a 0.8 ns-long light pulse about halfway between successive pulses of pions. The laser causes some of the bound pions to drop to a lower orbit and eject the remaining electron, after which they crash into the nucleus within a few picoseconds. Although this process yields just three pionic atoms per hour above the background, that is enough to register a significant peak in the detector count.
By scanning the frequency of their laser beam, the researchers established that the transition they were studying occurred at 183,760 GHz. They point out that the accuracy of this measurement is limited by the many collisions with helium atoms, each of which shifts the frequency slightly, and the broad but easy-to-identify atomic resonance that they used. They hope eventually to use their experiment to set a new, more accurate value for the mass of the negative pion (currently known to just six decimal places), and propose both thinning out the target and studying atomic transitions with narrower linewidths.
In a commentary written to accompany the research, Niels Madsen of Swansea University in the UK points out that Hori’s group has previously achieved record precision for the antiproton-to-electron mass ratio by replacing one electron in a helium atom with an antiproton. However, he says that improving the negative-pion mass accuracy will be tougher, partly because a lower helium density means less signal. Nevertheless, he argues that, with persistence, the accuracy could improve by a factor of 10–100. “The experiment thus paves the way to fresh insights into the fundamental constituents of nature,” he writes.
Hori points out that shoring up the pion’s mass could also lower particle physicists’ current upper limit on the mass of the muon antineutrino, given that pions decay into muons (the mass of which is very well known) and neutrinos. “The muon neutrino mass can be estimated by indirect methods much more precisely,” he says. “But it is always nice to have a direct laboratory determination.”
Physics World’s Laser at 60 coverage is supported by HÜBNER Photonics, a leading supplier of high performance laser products which meet the ever increasing opportunities for lasers in science and industry. Visit hubner-photonics.com to find out more.
As a laser scientist, I’m responsible for ensuring that the CLF’s Vulcan laser operates smoothly for our experimental users. A workday for me can vary considerably depending on the operational needs of our users, but I would typically spend my time aligning the beam and making sure all aspects of the system are working, from the oscillators to laser diagnostics.
We had two experiments running before the lockdown. The first was from an Oxford University group studying dust charging and destruction in shocked plasmas, while the second was an experiment that was using our petawatt beamline to investigate the source mechanism of electromagnetic pulses. We were a few weeks into the first experiment and in the setup stages of the second when the seriousness of the pandemic in the UK started to affect our work.
Change of plan
My role has gone from being very experimental to more theoretical, and it is a challenge to change the way you work when you’re used to the activity and fast pace of doing experimental physics. However, as a recent graduate, my experience of remaining productive while working independently and remotely on a research degree is fresh in my mind, and I think that has helped.
At the moment, I am working alongside my colleagues on research relevant to Vulcan and the laser facility, and I believe the group is working well under the circumstances. The Vulcan group is meeting every day on the online platform Zoom, and it has been nice to keep in touch with colleagues and students. I like these meetings because they give me a sense of normality. It’s only when it’s taken away from you that you realize the importance of human interaction. Video conferencing is an invaluable tool at times like this to contact each other and stay well connected.
Supporting research
The CLF is run by the Science and Technology Facilities Council (STFC, part of UK Research and Innovation), and it has responded positively to the crisis. It’s inspiring to hear about the ways my organization is supporting efforts to tackle the pandemic – for example, by offering supercomputing power at the Hartree Centre to help with modelling work, or by giving rapid access to the CLF’s Octopus biological imaging facility for research relevant to COVID-19. We have also given most of our clean room personal protective equipment to the National Health Service.
The shutdown of experiments at Vulcan has given us time to focus on our research work and prepare articles for the STFC’s annual report. Already, we have produced an article on recent laboratory work developing a device to strengthen Vulcan’s diagnostic capabilities. The STFC is also encouraging staff to develop new skills, for example learning how to use software for optical design and computer-aided design. These personal development activities will prove helpful in our roles well beyond the pandemic.
Automation and remote access
I was hoping to attend more international conferences on laser and plasma physics, but with travel restrictions expected to be in place for the foreseeable future, I think 2021 will be a much better year for that. A lot of conferences have simply postponed until 2021 anyway.
At Vulcan, there are discussions about how we might automate some of our day-to-day tasks and provide remote access to users. Obviously, Vulcan requires human minds and experience to operate safely, but our turn-on procedures could incorporate some level of automation, and I can imagine changes to operations that would let users access the facility remotely. In the short term, this would help to minimize the number of staff members required to be in the facility at any one time, and if it proves successful it could become a positive change to standard practice.
As for me personally, I live in a small village surrounded by the Oxfordshire countryside, and one positive consequence of the lockdown is that nature has had some respite during the time we’ve all spent indoors. I have observed many more bees and butterflies on my walks and in my garden. The air feels and smells cleaner too. It will be interesting to look back on this time in the long-term future and see if the pandemic was a catalyst for worldwide environmental change.
X-ray imaging is a valuable tool for studying biological systems, providing extremely high resolution as well as the ability to see within thick samples. The recent development of coherent X-ray sources has enabled the introduction of lensless imaging techniques, which eliminate the problem of creating suitable optics for X-ray microscopy.
Generation of the coherent X-ray radiation needed for lensless imaging generally requires large facilities such as synchrotrons or free-electron lasers. But in the extreme ultraviolet (EUV) spectral region (10–124 nm), coherent radiation can be created via high harmonic generation (HHG) using intense femtosecond lasers. Such HHG sources could, in principle, enable coherent EUV imaging to be performed in a small-scale laboratory.
An international research team, led by Bill Brocklesby and Jeremy Frey at the University of Southampton, has now developed a laboratory-scale coherent EUV source from a femtosecond laser, and used it to create high-resolution images of lab-grown neurons. Such highly detailed images could have many potential applications in medicine and biology, including the study of neurodegenerative diseases.
Lose the lens
In lensless imaging, the object is illuminated with coherent radiation and the scattered radiation is collected onto a detector without requiring imaging optics. The image is created by algorithmic reconstruction of the phase of each pixel in the detected scatter pattern. In this study, the team employed ptychography – a type of lensless imaging in which the illumination is moved relative to the sample and multiple scatter patterns are recorded.
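To give a flavour of how such a reconstruction works, the snippet below is a minimal, illustrative ePIE-style object update written in Python/NumPy. It is a generic textbook version of this class of phase-retrieval algorithm, not the Southampton team’s actual code, and the function name, parameters and update rule are assumptions rather than details taken from the paper.

```python
import numpy as np

def epie_object_update(obj, probe, shift, pattern, alpha=0.1):
    """One ePIE-style object update for a single probe position.

    obj     : complex 2D array, current estimate of the object
    probe   : complex 2D array, the (assumed known) illumination
    shift   : (row, col) position of the probe on the object
    pattern : measured far-field intensity for this probe position
    """
    r, c = shift
    h, w = probe.shape
    view = obj[r:r + h, c:c + w]                 # object patch under the probe
    psi = probe * view                           # exit wave leaving the sample
    Psi = np.fft.fft2(psi)                       # propagate to the detector plane
    Psi = np.sqrt(pattern) * np.exp(1j * np.angle(Psi))  # impose the measured modulus
    psi_new = np.fft.ifft2(Psi)                  # propagate back to the sample plane
    # update the object patch, weighted by the probe amplitude
    step = np.conj(probe) / (np.abs(probe).max() ** 2) * (psi_new - psi)
    obj[r:r + h, c:c + w] = view + alpha * step
    return obj
```

In a full reconstruction this update would be looped over every probe position and many iterations, usually with the probe estimate refined at the same time as the object.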
To test their approach, the researchers grew mouse hippocampal neuron samples on silicon nitride membranes in vitro. They first imaged the samples using an optical microscope. They then performed ptychographic imaging of small areas of the samples, using coherent 29 nm (43 eV) illumination produced by generating high harmonics from a pulsed femtosecond laser.
Ptychographic imaging with the EUV source created images that were quantitative in both amplitude and phase, with a lateral resolution of 80 nm and an axial sensitivity of approximately 0.8 nm (equivalent to a layer of protein). The high lateral resolution is due to the short wavelength and the accuracy of the phase retrieval algorithm, while the high axial sensitivity arises from the strong interaction of EUV radiation with the biological materials being imaged.
Comparison of an image taken using a phase-contrast visible light microscope with the EUV ptychographic image showed that the EUV image had much higher lateral resolution than its optical counterpart. For example, the EUV image showed fine “flared” structures emanating from thicker dendrites that were not seen in the optical image.
(A) White light phase-contrast microscopy image of a neuron sample after fixing; (B) 29 nm EUV ptychography of the sample, demonstrating extra detail and higher resolution. (Courtesy: CC BY 4.0/Science Advances 10.1126/sciadv.aaz302)
Fluorescence comparison
The researchers – who performed these studies at Southampton and the Artemis facility at the Rutherford Appleton Laboratory – also used EUV ptychography and conventional fluorescence microscopy to image neuron samples that were stained for immunofluorescence imaging. They observed that the EUV resolution was significantly better. EUV imaging was also more sensitive to thin structures in the neurons, identifying features below 100 nm in width and only 10 nm thick, which could not be seen using fluorescence imaging.
The authors note that even with super-resolution fluorescence techniques, which can have lateral resolution smaller than the diffraction limit, fluorescence imaging of very thin structures will remain difficult, due to the extremely small quantities of fluorescent material present. They add that correlative EUV and fluorescence imaging allows the structural elements seen using EUV ptychography to be directly correlated to biological function.
Finally, the team examined the impact of the EUV radiation on the samples. Many other high-resolution techniques used for imaging biological samples, such as electron cryo-microscopy or hard X-ray microscopy, are limited by the damaging effects of the radiation. The 29 nm EUV radiation, however, did not damage the delicate neuron structure on the exposure timescales used for imaging.
“The ability to take detailed images of delicate biological structures like neurons without causing damage is very exciting, and to do it in the lab without using synchrotrons or other national facilities is a real innovation,” says Brocklesby. “Our way of imaging fills an important niche between imaging with light, which doesn’t provide the fine details we see, and things like electron microscopy, which require cryogenic cooling and careful sample preparation.”
“Spent all these days in my laboratory and found many interesting things,” wrote Belgian-American chemist Leo Baekeland in his journal in June 1907. For four days he had been experimenting with condensation reactions between phenol and formaldehyde impregnated in wood blocks. “Have applied for a patent for a substance which I shall call Bakalite.” It was the first plastic made from synthetic components and the beginning of a materials revolution.
There’s no denying that plastics are amazing materials. They are made from polymers – long, stringy molecules composed mostly of a carbon backbone with a cornucopia of different functional atoms and groups branching off, from simple halogen atoms to aromatic rings and oxygen-containing ester linkages. Plastics can be hard or bendy. They are easily melted and reformed. And they are among the cheapest and most enduring materials on the planet. But that’s the problem. Plastics have spawned a consumer revolution in disposable goods that persist for decades, even centuries. And while public concern about plastic pollution is now strong, turning those sentiments into positive action can be tricky.
One person seeking solutions is Sally Beken, a chemist from Innovate UK. She heads the UK Circular Plastics Network, which aims to cut waste by bringing users of plastic together. For her and many others working on plastic waste, the problem is not the plastics themselves, but our poor husbandry of them. The good news is that technology is constantly making it easier to retrieve, reuse and recycle plastic, with developments in physics playing an important role. But despite the progress, the biggest challenge may be yet to come.
Sorting out the problem
To minimize the carbon footprint of a plastic product, you would ideally reuse it multiple times. But the additional plastic needed to make something robust enough to be reused isn’t always balanced by the amount of reuse it actually gets. A Tupperware takeaway container, for example, would need to be used 200 times to leave a lower environmental footprint than its expanded polystyrene counterpart, even when the latter cannot be recycled. Sometimes it’s not even possible to reuse a plastic product – plastic tubs can crack, for example, rendering them useless. That’s why many people looking for more sustainable ways to handle plastics are trying to develop more efficient circular economies for these materials, rather than simply trying to make each product last longer.
What makes recycling plastics more complicated than, say, cardboard, is the sheer proliferation of different types, all of which need distinct treatments. “For food packaging, we don’t need all these different types, we could manage with just three,” says Beken. And if we did use fewer types, the volume of each would increase, making them more economical to recycle.
But if we can’t limit the number of types of plastic, why not find smarter ways to sort them? In 2017 more than two million tonnes of plastic were used as packaging in the UK alone, and with many recycling stations still requiring people to separate the plastics by eye and by hand, there’s clearly got to be a better way.
Humans and machines Many plastic recycling plants sort the material by hand at some stage (left). Where there is automated sorting (right), multiple methods are needed to separate the many different types of plastic. (Courtesy: iStock/andresr; Paprec)
One company leading the way is the French recycling business Paprec, which currently has 210 sites handling around 12 million tonnes of waste each year. Although it employs operators to sort some of the plastic by colour, and manually remove rivets, screws and the like from industrial plastic waste, a lot of the sorting is now automated. However, even the sorting approach depends on the type of plastic. In the case of polyvinyl chloride (PVC) and polyethylene terephthalate (PET), sorting is fully automated and based on the plastic’s optical properties. Cameras analyse the spectra of the waste before air nozzles blow off different types to redirect them, either for different treatments or ultimately landfill. Such optical sorting is the dominant automated approach in a lot of recycling units, but the technique often struggles with dark plastic. That’s because it traditionally uses near-infrared radiation, which is not reflected enough from darker materials to distinguish between different types.
While equipment covering a broader range of wavelengths is now commercially available, allowing dark plastics to be sorted, other properties of the plastic can be exploited instead. Paprec, for example, also uses “flotation sorting”, which differentiates materials according to whether they float or sink. The technique neatly combines sorting with the rinsing stage of the recycling process, but the materials need to have density differences of at least 0.2 g cm⁻³. Unfortunately, for polyethylene (PE, typically used in packaging and plastic bags) and polypropylene (PP, used in packaging and labelling) this is not the case. Another densimetric approach is to suck the lighter plastics off a vibrating inclined plate – plastics too heavy to suck up are then redirected elsewhere. This requires a density difference of at least 0.3 g cm⁻³, so it is also unable to separate PE and PP. In fact, these two polymers are particularly problematic because they are often mixed within materials, making it even harder to isolate them.
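As a rough, back-of-the-envelope illustration of these density rules of thumb, the short Python snippet below checks whether two plastics could in principle be separated by flotation. The densities are typical handbook values (real values vary with grade and additives) and the function is purely illustrative – it is not drawn from any sorting plant’s software.

```python
# Typical handbook densities in g/cm^3; real values vary with grade and additives.
DENSITY = {"PET": 1.38, "PVC": 1.40, "HDPE": 0.95, "PP": 0.905, "PS": 1.05}

def separable_by_flotation(a, b, min_difference=0.2):
    """Rough check of the ~0.2 g/cm^3 rule of thumb for sink-float sorting."""
    return abs(DENSITY[a] - DENSITY[b]) >= min_difference

print(separable_by_flotation("PET", "HDPE"))  # True: one sinks, the other floats
print(separable_by_flotation("HDPE", "PP"))   # False: PE and PP are too close
```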
Another option is “triboelectric sorting”. Friction is used to charge a plastic’s surface, after which charged electrodes attract or repel different types of plastic depending on whether they become positively or negatively charged. Triboelectric sorting is effective on quite a few plastics, including PET drinks bottles, engineering thermoplastics, plastics from electronic and cable waste, PVC window profiles and even plastic production waste. Unfortunately, the waste must be dry – and multiple stages of plastic recycling are wet or involve washing.
“One bin to rule them all”
Clever and ingenious though these different sorting approaches may be, mixed plastic waste still requires a host of sorting treatments to separate it all. And since not all recycling depots have all the available techniques, the reality is a lot of potentially recyclable types of plastic end up in landfill. As an alternative, researchers in the Wolfson Centre for Materials Processing at Brunel University London in the UK devised a fluorescent tagging system that not only allows all plastics to be optically sorted, but can also draw distinctions based on use, separating containers used for food from those for pesticides for example.
Known as PRISM (plastic packaging recycling using intelligent separation technologies for materials), the technique involves writing a code onto the plastic using a dye containing phosphors – light-emitting molecules like those used in strip lighting. The phosphors emit ultraviolet (UV) light, so the code is visible only to detectors operating at that frequency. All you need to do is attach a UV detector to the optical sorters and attract buy-in from plastic manufacturers to tag their products. The first full-scale trial of PRISM was demonstrated by TOMRA – a Norwegian firm working on recycling innovations – which in 2017 claimed it could collect 98% of labelled plastics with 95% accuracy.
The fluorescent-tagging approach has many fans, but what do you base the tagging system on? Sorting has traditionally been centred on distinguishing plastics by the polymer’s carbon backbone, or the monomer unit that is repeated to make up the polymer chain. But thanks to the versatility of many plastics, this approach has shortcomings. Milk bottles and laundry detergent bottles, for example, are often both made from high-density polyethylene (HDPE), but the former readily soften when filled with hot water while the latter are more robust. A plastic’s properties are also affected by the type and amount of additives, by the molecular weight (how many repeating units, on average, make up each molecule), by the percentage of recycled content, and by the source of that recycled content. As a result, a laundry bottle made from milk-bottle plastic will not have the flexural strength expected of it.
“The problems around plastics are not solved by a single discipline,” says Michael Shaver, a polymer scientist at the University of Manchester, who heads a UK Research and Innovation (UKRI) project called RE3 – Rethinking Resources and Recycling. Along with 25 stakeholders from across the supply chain, RE3 includes a project that Shaver runs called “One bin to rule them all”, which is seeking better recycling infrastructures so that all plastic waste can be sorted, recycled and valued even if it is disposed of in a single bin. For Shaver, what we need is a system or marker that sorts based on real value – the properties and potential uses of the retrieved plastic – not the attributes assumed from the chemistry of the backbone.
Shaping up
The simplest way to recycle plastic is to mechanically extrude and recast the product without meddling significantly with the chemistry of the polymers involved. The plastic is ground into small pieces, formed into pellets and poured like cereal into a turning “screw” that transports, melts and pressurizes the plastic so that it can be cast in a liquid form around moulds where it re-solidifies into the shape of the desired product. In Europe 99% of all recycled plastic is processed in this way.
Drinks bottles have been one success story in mechanical recycling, with most manufacturers now using the same PET material, making these much easier to sort and recycle. In fact, voluntary pledges from packaging producers alongside content targets set in the EU Single Use Plastics Directive have resulted in increased demand for recycled PET. “You can’t get enough of it,” says Beken, who points out that the UK has been importing recycled PET from Belgium because there isn’t the capacity to meet demand in the UK.
Popular polymer Plastic bottles are almost all made from the same PET material and are easily sorted. (Courtesy: Paprec)
But even with PET recycling, there are disparities. Bottles are readily retrieved and can be extruded into high-quality recycled products, whereas there are fewer separate reprocessing lines for the plastic trays used for meat or ready meals, where the cost per tonne obtained is higher. In addition, plastic trays are often lined with different materials. When they are extruded, materials of the same type start to congregate – “phase separate” from other types – so that the plastic is no longer homogenous, leading to lower-quality recycled products. These problems mean that potentially recyclable materials could end up in landfill.
“Mechanical recycling is about consistency,” says Shaver, who believes that current efforts towards circular plastic economies are hampered by a total lack of standardization in the quality of recycling. “Legislation is the big thing.”
Enlisting the chemists
One active area of research is working out the impact of repeated mechanical recycling on a plastic’s properties. Inevitably, when a plastic is extruded and reshaped, there will be some wear and tear – such as reductions in polymer lengths, or the introduction of impurities – that will limit how many times the same polymers can be recycled. Shaver’s group is, for example, extruding plastics to try and understand the chemical processes at play during mechanical recycling so that they can enlist chemistry to combat the degradation taking place.
Chemistry also offers alternative methods for recycling, albeit with a larger loop between the waste product and its reincarnation. Strategies include breaking down a polymer into its monomer units, which can then be subject to some of the original polymerization reactions to produce a high-quality product once again. Alternatively, the polymer could be broken into oligomers – shorter chains that contain multiple monomer units – therefore preserving some of the original work done to produce the plastic. Such chemical approaches could be vital in the textile industry, where fibres integrate plastics with other materials that mechanical processes can’t easily separate.
Shredder Most plastic recycling starts by shredding or grinding the material into small pieces. (Courtesy: iStock/Викентий Елизаров)
Chemical processing could also help recycle “microplastics” – microscopic scraps chipped off larger plastic objects – but the main challenge is retrieving them in the first place. Most efforts to capture microplastics have focused on preventing these substances from entering the environment where they can pollute waterways and ultimately the food chain. While macroplastics are not absorbed by the body, and simply pass through the digestive system, we do not entirely understand the impact of ingested microplastics. Shaver points out, for example, that some common plastics, such as PET, have oxygen-containing ester groups in the polymer chain. “These functional groups are common in biological systems, and thus have the potential for more significant ecotoxicological impact,” he points out.
The problem is we’re all creating microplastics even if we don’t mean to. Microplastic fibres, for example, are released whenever you wash your clothes, seeping through domestic sewage treatment plants and remaining in the sludge that is often spread on agricultural land. “None of the current domestic or utilities infrastructure was designed to deal with such tiny particles,” says Adam Root, founder of a new UK firm called Matter, which is developing commercial and domestic microplastic harvesting systems. Products to capture microplastics – such as bags to put clothes in during washing – are already available on the market, but few people know about them or use them. To combat the problem from another angle, Root has developed an external regenerative filter that can be retro-fitted to existing washing machines, and an internal unit for new models. Both will be launched later this year.
Melt and shape Shredded plastic is melted into pellets, which can be formed into new products. (Courtesy: Paprec)
Ideally, microplastics would be separated from all waste sewage before they spread into the environment. But traditional optical sorting is difficult for microplastics in sediments because background signals and surface degradation muddy the spectra collected. In 2017 researchers at the University of East Anglia in Norwich, UK, reported a fluorescent tagging technique for microplastics in which they stained the plastics with a compound called “Nile Red” to help identify them in sediments. Spectral shifts in the fluorescence of Nile Red due to the polarity of a plastic’s surroundings could also help identify how hydrophobic the material is and therefore distinguish certain types of plastic. This kind of staining might also be applicable to nanoplastics – even tinier bits of plastic – that Shaver points out are an emerging concern. But there is still a long way to go before anyone has the infrastructure to capture micro- and nanoplastics at a scale where their value can be retrieved in recycling.
PETase – a biological solution
In 2016 a team of scientists reported that, while picking through 250 pieces of debris from a PET recycling plant in Sakai prefecture, Japan, they had found a sediment sample harbouring a consortium of micro-organisms that appeared to be feeding on the PET.
Further analysis led the researchers – Kenji Miyamoto at Keio University, Kohei Oda at Kyoto Institute of Technology, and collaborators – to home in on a specific strain of bacteria that is key to the process. Named by the researchers as Ideonella sakaiensis, the bacterium releases two enzymes – PETase, which can hydrolyse the PET, and MHETase, which hydrolyses the reaction intermediate, mono(2-hydroxyethyl) terephthalic acid. The result: the environmentally benign by-products terephthalic acid and ethylene glycol (Science 351 1196).
The discovery galvanized the scientific community, with several groups across the world racing to understand and potentially improve the enzymes’ activity. Among those keen to develop the work of Miyamoto and Oda was an international team of researchers, led by H Lee Woodcock at the University of South Florida in the US, John McGeehan at the University of Portsmouth in the UK, and Gregg Beckham at the National Renewable Energy Laboratory in Colorado, US. In their efforts to establish the structure of PETase, Beckham and his team performed X-ray crystallography at the UK’s Diamond Light Source, which produces X-rays that are intense and bright enough to compensate for the fact that PETase crystals are hard to form. The team was able to identify the 3D structure of PETase and even engineer it to improve its activity. As Beckham explains, the molecular structure gives insights into the potential mechanism by which PETase evolved from an enzyme that most likely works on a natural substrate – such as plant-cell-wall polymers cutin or suberin – to one that can degrade man-made PET.
Colourful contaminants UK researchers have developed a way to find microplastics in sediments by staining them with fluorescent dye, then using optical sorting. (CC BY 4.0/T Maes et al. 2017 Sci. Rep. 7 44501)
It might seem impressive that in just the five decades since PET waste first began to amass around the world, bacteria have evolved to make it a source of carbon nutrition. However, Beckham suggests that what the discovery and analysis of Ideonella sakaiensis and PETase really illustrate is how much further the enzyme can be optimized for the task. “This is exciting for the scientific community, though, because it means that we can harness tools like directed evolution [for which Frances Arnold shared the Nobel Prize for Chemistry in 2018] to make even better variants of this and enzymes like it for industrial use.”
However, others remain sceptical of the positive role something like PETase can play. Setting aside scare stories of enzymes running amok and reducing all plastic to compost overnight, Shaver questions the focus on PET, a polymer that is already easily recycled. Mechanical recycling to recover the product, or even chemical recycling to recover the oligomers, are tighter circular systems that would seem more efficient than taking the plastic back to the monomer, he argues. Shaver highlights the potential interest in finding an enzyme that works on polymers with no oxygen in the backbone. Such an enzyme might tackle some of the most persistent types of plastic.
Beckham, meanwhile, points out that plastic bottles – the main success story of PET recycling – comprise just 30% of PET used. In fact, carpets and clothing are the main consumers of PET, but they are not readily recycled. Furthermore, mechanical recycling itself produces a fraction of small particles – “fines” – that are beyond the process’s reach.
“Understanding where and when biology can play a role in recycling of PET (among others) in terms of the economic and sustainability perspectives is important,” Beckham says. His group, for instance, is comparing chemical recycling technologies for PET and many other plastics (including mixed plastics) with biological solutions, which may reach beyond PET. In March 2020, researchers in Germany reported the discovery of a bacterium that can break down polyurethane, a plastic widely used in refrigerators, buildings, footwear and furniture that is currently very expensive to recycle.
The role of the past in the future
While companies can be doing all the right things in switching to products that are more compatible with sustainable circular plastics economies, it’s down to consumers to play their role too. Beken highlights Gumdrop, a UK-based firm that collects used chewing gum and turns it into bins for collecting more of the gum. Most commercial chewing gum is based on a synthetic rubber called polyisobutylene mixed with food-grade plasticizers, and it takes hundreds of years for each piece of gum to degrade completely. The problem Gumdrop has faced, however, is that people throw cigarettes in the bins, which contaminates the gum and prevents it from being recycled. As Shaver puts it: “What matters is not whether any type of plastic is recyclable, biodegradable or compostable – but whether it will be recycled, biodegraded or composted.” And that’s perhaps the biggest problem facing plastic recycling – how to deal with our fickle human behaviour.
Meet Jim Simons – mathematician, “quant”, billionaire, philanthropist and a unique kind of genius. After receiving his PhD from the University of California, Berkeley at the age of 23 in 1962, he spent the next few years doing everything. He broke Russian codes; he became a department chair and built a world-class mathematics programme at Stony Brook University; he left to start his own company – and eventually became not just one of the richest men in the world, but perhaps the greatest investor in history – a modern-day Medici.
At 30 Simons became chair of mathematics at Stony Brook University in New York, where he built a strong programme, especially in differential geometry. There he began working with the mathematician Shiing-Shen Chern, and in 1974 the pair published the work now known as Chern–Simons theory. The research has key implications for quantum field theory and condensed-matter theory, as well as for gravity and superstring theories.
But Simons always had a compulsion to make money. He left academia at the age of 40 to found what ultimately became the hedge fund Renaissance Technologies (RenTech), which invests in financial instruments such as stocks, options and commodity futures. RenTech has since had unparalleled success, with its Medallion fund achieving a flabbergasting average return of 66% a year from 1988 to 2018, representing $105bn in total trading profits. Investors were so keen to invest in the fund that they accepted management fees of 5% a year plus 44% of annual profits, after which the fund still delivered an average annual return of 39%, the most of any hedge fund.
How did Simons do this? With mathematics, and by hiring very smart people such as IBM computer scientist Robert Mercer, Elwyn Berlekamp, an electrical engineer who taught at Berkeley, and “always angry” mathematician James Ax. In The Man Who Solved the Market, author Gregory Zuckerman describes their complicated personalities, and the trials and tribulations of the company they and Simons brought to fruition. Each became richer than they ever could have imagined.
Using every piece of market data they could get their hands on, the scientists looked for patterns, regularities and correlations. Of course, market traders and technicians had been doing that – or trying to – for centuries, but Simons and his men brought new techniques such as kernel regression methods, stochastic differential equations and hidden Markov models. Eventually they built a very complex algorithm that ran autonomously, whose results even they sometimes didn’t fully understand. Essentially, they trade on behaviour, not economic fundamentals.
RenTech created the field of investing now known as quantitative trading, which blossomed on Wall Street in the 1990s and continues today. Wall Street hired scientists, mathematicians and engineers, who were employed as “quants” or quantitative analysts to bring rigorous modelling to the trading business, spring-boarding the computerized trading that dominates today’s markets. In the US, about 90% of trades are now done by computers, and about as much in the UK.
But Simons and RenTech did it first and still do it best. Their computers trade hundreds of thousands of times per day, and they win just over half the time. “We’re right 50.75% of the time…but we’re 100% right 50.75% of the time,” Zuckerman quotes Robert Mercer telling a friend. “You can make billions that way.”
Their trading has undoubtedly brought them great wealth, and with that wealth both opportunities and some troubling aspects. Today, Simons is worth over $20bn. He and his wife created the Simons Foundation, funded autism research, and gave eight-figure donations that established the Simons Institute for the Theory of Computing at the University of California at Berkeley and the Simons Center for Geometry and Physics at Stony Brook University.
But Simons and RenTech have also been involved in a protracted legal dispute with US tax authorities, who say the hedge fund owes $6.8bn in taxes because, they claim, the company illegally filed short-term capital gains as long-term gains. (The tax difference in the US is just shy of a factor of two.) In 2017 leaked documents regarding the super-rich showed that Simons had stashed more than $8bn in Bermuda, a haven from US taxes.
Simons has also donated $2.7bn to philanthropic causes over his lifetime; he has given $11m to Hillary Clinton’s 2016 presidential campaign and much more to other progressive causes. His RenTech colleague Robert Mercer, together with his daughter Rebekah, has offset that with heavy donations to right-wing causes in the US, including Breitbart News and Donald Trump. In the UK, Mercer had his data-analytics company Cambridge Analytica help Nigel Farage and the Leave campaign during the Brexit referendum.
Zuckerman spins a sharp narrative, and The Man Who Solved the Market is an absorbing look into a world that essentially manages to grow money on trees. I would have liked more on some of the maths involved, but Zuckerman is a financial journalist, not a science journalist, and RenTech keeps its techniques a closely guarded secret. But it’s no secret that Simons’ company has indeed solved the markets.
Space agencies are playing an increasing role in the race to understand the global spread of COVID-19. Satellite data are being used to explore links between pollution and COVID-19, while land-use tracking over the longer term might help to understand how pandemics emerge. These developments were discussed at the European Geosciences Union’s annual meeting, which took place online this week.
Medics across the globe have noted that underlying health issues play a significant role in mortality rates among COVID-19 patients. In the past few years, a growing body of research has established that exposure to air pollutants – such as fine particulate matter (PM2.5) and nitrogen oxides – can lead to health issues, particularly heart and lung problems.
Provisional studies in the US, Italy and England have all found tentative correlations between regions of higher pollution and raised mortality rates among COVID-19 patients. Right now, the tricky bit in these studies is trying to rule out all the confounding factors, such as the fact that pollution hotspots tend to overlap with densely populated areas.
At the same time, there is also an active debate about the length of time that SARS-CoV-2 – the virus responsible for COVID-19 – can remain suspended in the air in aerosols.
Tracking pollution
Recognizing the pressing need for further research, the European Space Agency (ESA) has created a dedicated resource frequently updated with maps and data. For tracking nitrogen oxide levels, the key satellite is Sentinel 5P – part of the fleet of the European Union’s Copernicus Earth-observation programme.
“I have asked my teams to quickly put together activities where Earth-observation data can help in the coronavirus health crisis,” said Josef Aschbacher, director of ESA’s Earth-observation programmes. “We are doing this very closely with our partners in the European Commission but also with NASA and the Japanese Aerospace Exploration Agency [JAXA].”
Space agencies are also looking to support novel ideas for using Earth-observation data in the COVID-19 crisis and the period of adaptation that will follow. ESA has launched a Euro Data Cube contest calling for ideas relating to economic activity, agricultural activity and other distributions of human activity.
NASA has invited researchers within several of its science divisions to refocus on the COVID-19 pandemic, and it will provide an additional $2 million for new satellite-related data projects. NASA has also teamed up with ESA and JAXA to run a special version of its Space Apps Challenge on 30–31 May – a virtual hackathon focused on the COVID-19 crisis.
Pandemics becoming more likely?
Aschbacher believes that Earth observation can also play a role in understanding the origins of COVID-19 and the likelihood of future pandemics. He pointed out that the scientific consensus is that COVID-19 most likely originated in an animal, probably a bat, which passed it to a larger mammal. A recent study by researchers in the US and Australia concluded that deforestation and urbanization are increasing the likelihood of diseases jumping from animals to humans. Tracking deforestation and other land-use changes could assist with future studies.
This view that the Earth system is dangerously out of balance is also shared by Silvia Peppoloni, secretary general of the International Association for Promoting Geoethics. “The growing impact of human activities on bio-geological systems, if not properly controlled in accordance with the transitory ecological balances, produces an increased pandemics risk,” she said. “COVID-19 is the effect of this non-functional, unhealthy, dangerous interaction for humanity.”
To help geoscientists engage with these types of complex interconnected questions, ESA is launching a new initiative called Digital Twin Earth. The idea is to bridge the gap between data collected by satellites and data recorded at the Earth’s surface. It will provide researchers with a range of tools – incorporating AI and machine learning – for modelling complex environmental scenarios, such as how deforestation in the Amazon impacts the entire Earth system.
As an antidote to those glossy, big-budget TV programmes about the wonders of the universe, the cosmologist Peter Coles of Ireland’s Maynooth University is putting out a series of videos that point out that the universe is actually a bit disappointing. In his first video, shown above, he explains why stars really aren’t that impressive after all.
Things are a bit different on the other side of the Atlantic, where glitz is good and Tom Cruise and NASA have apparently teamed up to begin discussions about shooting a movie on the International Space Station (ISS). NASA administrator Jim Bridenstine took to Twitter to announce the “exciting” news. “We need popular media to inspire a new generation of engineers and scientists to make NASA’s ambitious plans a reality,” he wrote.
NASA didn’t give any more details, but apparently the film will be an action-adventure and not part of the Mission: Impossible series that Cruise has starred in since 1996. Yet there is a serious side to this, with NASA eager to open up the ISS for commercial use – which includes space tourism and now, perhaps, Hollywood studios. Hopefully by the time the film is released we will all be allowed back into cinemas.
The Isle of Man Post Office has released a set of eight stamps to honour the contribution of key workers during the COVID-19 pandemic. The eight stamps feature words including care, compassion and community, along with the strapline “will carry us through”.
One of the stamps features the word “science” and, as well as the strapline, carries two of Stephen Hawking’s famous equations, describing Hawking radiation and black-hole entropy. The Stephen Hawking Foundation said that a copy of the stamp had been sent to every household on the Isle of Man as well as to “leaders all around the world”. “Science leads in everything we do in relation to Covid-19,” the Foundation noted. “It informs our decisions, our actions and our politics, it saves lives, it prevents harm and it will be the only way to return to anything resembling a normal life.”
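The article does not say exactly which formulas appear on the stamp, but Hawking’s two best-known results – presumably the ones meant here – are the temperature of the radiation emitted by a black hole of mass M and the Bekenstein–Hawking entropy of a black hole with horizon area A:

$$T_{\mathrm{H}} = \frac{\hbar c^{3}}{8\pi G M k_{\mathrm{B}}}, \qquad S = \frac{k_{\mathrm{B}} c^{3} A}{4 G \hbar}$$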
This week music lovers are mourning Florian Schneider, co-founder of Kraftwerk, who has died aged 73. Although Schneider had no background in physics, many of Kraftwerk’s songs touch on science and technology. These include “Radioactivity”, “Geiger Counter” and “Ohm Sweet Ohm”.
Winning challenge: Pit your wits against Caleb Rich. (Courtesy: iStock/Pobytov)
1 What did Max Planck, who first discovered energy quantization while working on the problem of black-body radiation, later call his discovery? An act of… A. God B. Genius C. Desperation D. Madness
2 Albert Einstein’s paper on the photo-electric effect, which hypothesized the quantum nature of light, was written while he was working as a patent clerk in which Swiss city? A. Bern B. Zurich C. Geneva D. Basel
3 Founding figure of quantum mechanics Niels Bohr was gifted a house by which brewing company? It was next door to the brewery and was connected to it by a pipeline providing free beer. A. Tuborg B. Carlsberg C. Becks D. Heineken
Nobel heavy: the 1927 Solvay conference in Brussels.
4 How many of the 29 attendees of the now-legendary Solvay conference of 1927 (pictured) were or later became Nobel-prize winners? A. 7 B. 12 C. 17 D. 25
5 Which physicist was given the nickname “The Crocodile” by one of his students, who commissioned a secret carving of the animal on one of the buildings of the old Cavendish Laboratory? A. James Chadwick B. Ernest Rutherford C. Paul Dirac D. Arthur Compton
6 Who is reported to have said about the Copenhagen interpretation of quantum mechanics: “I don’t like it and I’m sorry I ever had anything to do with it”? A. Erwin Schrödinger B. Wolfgang Pauli C. Albert Einstein D. Louis de Broglie
7 Which theoretical physicist was widely believed to cause experimental apparatus to break by his mere presence? A. Werner Heisenberg B. Richard Feynman C. Paul Dirac D. Wolfgang Pauli
8 In 2017 a group led by Jian-Wei Pan set the record for the longest distance over which quantum teleportation was performed. How far did they teleport a quantum state? A. 100 km B. 600 km C. 1400 km D. 2100 km
9 In 2019 Google claimed “quantum supremacy” by performing a calculation that it claimed would take 10,000 years on the most powerful classical computer, although IBM said this could be shortened to 2.5 days. How long did the calculation take on Google’s 54-qubit quantum computer? A. 10 seconds B. 200 seconds C. 20 minutes D. 1 hour
10 At the end of 2019, what name was suggested by 16 scientists in Nature as an alternative to “quantum supremacy”, due to perceived negative connotations? A. Quantum primacy B. Quantum ascendancy C. Quantum transcendence D. Quantum advantage
Stuck on the questions? Answers are below the sponsor’s message.
Sponsored by IOP SciNotes™ – an IOP Publishing peer-reviewed, open access journal dedicated to rapid publications of shorter research outputs. Currently all publication charges are waived so articles are free to publish. Visit iopscience.org/iopsn to learn more.
Diamond is a remarkable and useful material because of its rare beauty, hardness and extremely high thermal conductivity. At the microscopic level, the gem has crystal defects that are proving to be extremely useful for creating a range of quantum technologies.
In this episode of the Physics World Weekly podcast, Daniel Twitchen of the company Element Six talks about the unique quantum properties of diamond defects and how they are finding a wide range of applications including quantum computing and quantum sensing.
Caleb Rich was captain of this year’s winning team on the BBC’s University Challenge quiz show. He talks to Physics World’s Matin Durrani about his brush with fame and also gives some tips about how to win and set quizzes.
Indeed, Rich has set this week’s quiz on Physics World, which focuses on quantum mechanics. Have no fear, there are no equations.
A new technique that efficiently retrieves scattered light from fluorescent sources can be used to record neuronal signals coming from deep within the brain. The technique, developed by physicists at Sorbonne University in Paris, France, uses matrix factorization algorithms to overcome the fact that opaque biological tissues are strong scatterers of visible light, and thus hard to image except at shallow depths.
Brain imaging has traditionally relied on non-optical techniques such as X-ray computed tomography and magnetic resonance angiography. Thanks to its unprecedented combination of contrast, resolution and specificity, fluorescence-based imaging in the visible and near-infrared regions of the electromagnetic spectrum (400–900 nm) is an attractive alternative – especially for studying information processing by neurons. There is, however, a serious drawback: neuronal tissues are opaque at these wavelengths, quickly scattering any incident light. This opacity limits optical imaging techniques to depths of a few hundred microns, which corresponds to a few scattering lengths.
While researchers have developed techniques to focus light and image at greater depths, most of these methods rely on complex wavefront-shaping techniques. These techniques are also time-consuming, which means that they cannot be used to monitor real-time neuronal activity in the brain.
Analysing the activity of deeply buried neurons
The new approach, developed by Claudio Moretti and Sylvain Gigan in the Kastler-Brossel Laboratory, is different in that it does not aim to retrieve an image of a fluorescent object, nor indeed to localize its position. Instead, it relies on analysing the activity of deeply buried fluctuating sources – in this case, the functional activity of a set of neurons – by recording their fluorescence.
The researchers exploit the fact that each source will generate an extended, low-contrast but well-defined pattern of light after scattering through a thick opaque medium such as brain tissue. This so-called speckle pattern can be imaged at a detector or camera.
Moretti and Gigan have shown that they can use these fluctuating speckle patterns to extract functional signals from fluorescence sources, even when the light has passed through highly scattering tissue. They did this by making use of an advanced signal-processing algorithm known as non-negative low-rank matrix factorization.
Characteristic fingerprint
This algorithm is fairly widely employed in image and music analysis, as well as in other areas of machine learning and big data, Gigan explains. It essentially tries to factor a large matrix – the sequence of images recorded – into a product of two “thin” rectangular matrices.
“The ‘non-negative’ in this context means that we are looking for a solution in which both smaller matrices have positive coefficients,” he says. “This approach enormously simplifies the computations and the algorithm finds a solution even if there is a lot of noise in the data (as in our case).”
The main reason why such a factorization technique works, Gigan adds, is that the huge matrix they are trying to factor can be assumed to come from a limited number of sources (the neurons), each of which has its own characteristic “fingerprint”.
“While such an algorithm has been used in neuroscience applications before, it has never before been applied to such a ‘stringent’ scenario,” he tells Physics World. “In our experiment, the signals are completely mixed by the propagation of the functional light signals through the scattering medium – something that occludes and prevents direct imaging. We have shown that the algorithm effectively ‘de-mixes’ these signals so they can be efficiently retrieved.”
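To make the idea concrete, the core of such a factorization can be sketched in a few lines. The Python/NumPy snippet below uses the standard multiplicative-update rules for non-negative matrix factorization; the variable names, data layout and fixed iteration count are illustrative assumptions, not details of the Sorbonne group’s pipeline, which the paper describes only at a high level.

```python
import numpy as np

def nmf_demix(Y, n_sources, n_iter=500, eps=1e-9):
    """Non-negative matrix factorization by multiplicative updates.

    Y is the data matrix with one flattened camera frame per column,
    so Y has shape (n_pixels, n_frames). The factorization Y ~ W @ H
    yields spatial "fingerprints" (columns of W) and temporal
    activities (rows of H), both constrained to be non-negative.
    """
    n_pixels, n_frames = Y.shape
    rng = np.random.default_rng(0)
    W = rng.random((n_pixels, n_sources))
    H = rng.random((n_sources, n_frames))
    for _ in range(n_iter):
        H *= (W.T @ Y) / (W.T @ W @ H + eps)   # refine temporal activities
        W *= (Y @ H.T) / (W @ H @ H.T + eps)   # refine spatial fingerprints
    return W, H
```

In this picture each column of W would correspond to one source’s speckle fingerprint and each row of H to that source’s activity over time; in practice a library implementation such as scikit-learn’s NMF, with convergence checks and regularization, would normally be used.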
Proof-of-principle experiment
To test their approach, the researchers designed a proof-of-principle experiment in which they simulated the activity of a small network of synthetic neurons made from fluorescent beads 10 microns in size – roughly the same as most neuron bodies. They placed these beads in an ex vivo mouse skull that was around 300 microns thick and had a scattering length of about 40 microns.
They then excited the beads using blue laser light and collected the resulting fluorescence speckles using first a microscope objective and then a camera. Finally, they used their algorithm to extract information about the light emission and how it varied with time.
The technique proves that even strong light scattering does not fully destroy the information carried by the light and that it can be retrieved using computational means, Gigan says. “Using fluorescent beads that we could excite at will allows us to understand the physics at hand and determine the limitations of the technique,” he adds. “The most obvious application for the technique is in neuroscience optogenetic studies, but we hope that it will be used in fields outside of biomedical imaging too, such as in sensing, for example.”
The Sorbonne team, who report their research in Nature Photonics, are now working with biologists to apply the technique to real neurons in living systems. An important caveat is that the researchers did not actually image the neurons or pinpoint their locations, but only recorded their activity. “We don’t really know where the neurons are or what they look like, and this is definitely a problem that we hope we can crack in the future,” Gigan says.