Gravitational waves from the most massive merger of two black holes ever seen have been detected by the LIGO–Virgo observatories. Dubbed GW190521, the event was spotted in May 2019 and involved the creation of a black hole with a mass of about 142 Suns. This is the first intermediate-mass black hole to be observed using gravitational waves, with its mass falling between those of stellar-mass black holes and the supermassive black holes that dominate the centres of most large galaxies.
The initial pair of black holes are thought to have weighed in at 85 and 66 solar masses – making the heavier object the first black hole observed in the pair-instability mass gap, where black holes are not thought to form from collapsing stars.
The two black holes orbited each other, getting closer and closer together before they merged. In the last moments of this inspiral, the pair broadcast gravitational waves that were observed by the LIGO–Virgo detectors. These are three huge interferometers – two in the US and one in Italy.
A long time ago
Physicists working on LIGO–Virgo have calculated that the merger took place 17 billion light-years from Earth when the universe was half the age it is today – making this one of the most distant mergers seen by LIGO–Virgo to date. Detecting the gravitational waves at such a large distance is possible because of the huge amount of energy they carried away from the merger – the mass–energy equivalent of eight Suns.
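A quick back-of-the-envelope check, using only the central values quoted above and ignoring the sizeable measurement uncertainties, shows where that figure comes from:
\[
\Delta m \approx (85 + 66 - 142)\,M_\odot \approx 9\,M_\odot,
\qquad
E = \Delta m\,c^{2} \approx 9 \times 1.8\times10^{47}\ \mathrm{J} \approx 1.6\times10^{48}\ \mathrm{J},
\]
consistent, within the quoted uncertainties, with the roughly eight solar masses of mass–energy that the LIGO–Virgo team estimates was radiated away as gravitational waves.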
GW190521 is of particular interest to black hole experts because of the large masses of the three objects. Stellar-mass black holes are created when a large star collapses under its own gravity, creating a huge supernova explosion that leaves behind a black hole.
Astrophysicists believe that stars of up to 130 solar masses will collapse in this way to create black holes with a maximum mass of about 65 solar masses. Stars heavier than about 200 solar masses will likewise collapse, but they create black holes of more than 120 solar masses.
Electron–positron pairs
However, stars in the 130–200 solar-mass range experience an effect called pair instability as they collapse. Highly energetic photons within the star are converted into electron–positron pairs, which generate less outward pressure than photons. This causes a rapid gravitational collapse of the star and an extremely violent explosion that leaves no black hole behind.
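The threshold for this process is a standard piece of physics rather than anything specific to GW190521: a photon can convert into an electron–positron pair only if it carries at least twice the electron rest-mass energy,
\[
E_\gamma \gtrsim 2 m_e c^{2} \approx 1.02\ \mathrm{MeV},
\]
a condition met by the high-energy tail of the thermal photon population once core temperatures in these very massive stars reach roughly $10^{9}$ K. Converting radiation into the rest mass of pairs removes pressure support, which is what drives the runaway collapse.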
As a result, astrophysicists believe that there should be a gap in the mass spectrum of black holes ranging from about 65 to 120 solar masses. Therefore, it looks like one – or possibly both – of the GW190521 progenitor black holes lies in this gap, leading astrophysicists to speculate on their possible origins. One possibility is that the progenitor black holes may themselves have been formed in black hole mergers.
Although a black-hole merger is the most likely source of GW190521, the way in which the event was detected opens the door to other intriguing possibilities. LIGO–Virgo looks for signs of gravitational waves in two different ways. One method involves looking for signals that resemble the gravitational waves that astrophysicists expect to be given off by merging black holes or neutron stars. The other involves looking for anything out of the ordinary – and is called a “burst” search.
In the case of GW190521, a burst search was slightly better at identifying the signal, which opens the enticing possibility that the source of the signal is not a distant black-hole merger but rather the collapse of a star in the Milky Way. Another possibility, says the LIGO–Virgo team, is that the signal was created by a cosmic string that was produced just after the inflationary epoch of the early universe.
Earlier this year an international team of astronomers suggested that GW190521 may have been accompanied by a flare of light from a distant quasar, making it the first gravitational-wave signal from merging black holes to have an electromagnetic counterpart. However, more recent work suggests that the quasar is not in the same part of the sky as GW190521.
SARS-CoV-2 has infected more than 22 million people globally, leading to around 800,000 deaths in just 10 months. However, significant uncertainty remains regarding the prevalence of asymptomatic and mild cases of COVID-19, the disease caused by SARS-CoV-2, as well as the magnitude, effectiveness and duration of antibody responses. Gaining a better understanding of population immunity is critical to improving predictive models of infection spread and to safely reopening economies worldwide. Filling the knowledge gaps in these areas, however, requires population-scale testing using low-cost, non-invasive, and highly specific and sensitive assays that can be deployed broadly and serially to characterize antibody responses to SARS-CoV-2.
Benchmark detection approaches are based on sandwich immunoassays relying on optical readouts of fluorescence emission or colour change to report antibody levels. These technologies can be costly, often require centralized facilities with trained personnel and are, therefore, not amenable to at-home testing.
More affordable technologies, such as lateral flow assays, have been found to be inaccurate or prone to user misinterpretation. Motivated to circumvent such barriers, the Arroyo lab has undertaken a journey to develop an at-home electrochemical assay.
This presentation will report the results of our initial, three-month effort to produce a portable immunoassay:
Learn about SARS-CoV-2 immunoassays.
Discover how electrochemists help fight COVID-19.
Learn the challenges behind developing antibody-based sensors.
Netzahualcóyotl Arroyo Currás (Netz Arroyo) is an assistant professor in the Department of Pharmacology and Molecular Sciences at Johns Hopkins University School of Medicine. He obtained his PhD degree in analytical chemistry from the University of Texas at Austin, where he worked with Allen J Bard on electrochemical energy storage and studies of electrocatalysis employing scanning electrochemical microscopy. He graduated in 2015 and moved to California to complete his postdoctoral training with Kevin W Plaxco at the University of California Santa Barbara, where he developed electrochemical biosensing platforms supporting real-time measurements of specific molecular targets in the body. His current research focuses on the development of electrochemical biosensors for pharmacological applications.
What first sparked your interest in quantum physics?
I had an interest in physics at high school and just before the Chinese New Year in 1998, our school invited Jian-Wei Pan to give a public science lecture that was held in the largest cinema in Dongyang in Zhejiang province. At the time Pan was in Anton Zeilinger’s group at Innsbruck University in Austria and they had just reported their first quantum-teleportation experiments. The lecture was intriguing but to some extent it all sounded a little crazy.
But it was fascinating enough for you to study physics?
Yes. I then studied physics at the University of Science and Technology of China (USTC) in Hefei and joined Pan’s group where I worked on a number of interesting problems such as six-photon entanglement, quantum simulation of anyons and teleportation of quantum-logic gates. In 2008 I moved to the University of Cambridge in the UK to do a PhD before moving back to USTC in 2011.
What are you currently working on?
In the past few years, I have focused on the development of scalable quantum-light sources for “boson sampling” – an intermediate quantum-computing model. My current research covers a variety of topics from blue-sky research to emerging quantum technologies such as large-scale quantum entanglement, quantum teleportation and quantum computing. I am still young enough to start exploring new fields such as atom arrays in optical tweezers and superconducting circuits.
In 2015, you and colleagues were awarded the Physics World Breakthrough of the Year for your work teleporting two quantum properties of a photon. How has research moved on since then?
We have made steady progress, for example, making quantum teleportation more “complete” by carrying out the first experimental demonstration of “high-dimensional” teleportation of a quantum spin-1 system (Phys. Rev. Lett. 123 070505). My colleagues have also achieved teleportation over longer distances by exploiting the Micius satellite. This has allowed us to go from about 100 km to 1400 km, from a ground station to space (Nature 549 70). We have also proposed and demonstrated a teleportation-inspired method to efficiently simulate random quantum circuits, redefining the limits of “quantum advantage” (Phys. Rev. Lett. 124 080502).
How has COVID-19 affected your work and are you now starting to reopen the labs?
Students had already returned home for the Chinese New Year when the COVID-19 outbreak emerged. During the height of the pandemic, people showed a remarkable ability to come together. Thanks to the selfless dedication of medical workers and the strong measures taken by the Chinese government, COVID-19 was effectively controlled within a short time. Graduate students have returned to campus cautiously in a step-by-step and organized way. As far as I know, not a single student or faculty member of USTC was affected by the COVID-19 virus.
As chair of the Quantum 2020 conference in October, what do you hope it will achieve?
Quantum 2020 aims to bring together the international quantum-science community to learn, share and collaborate on the latest research and emerging areas of interest. Taking place over four days from 19 to 22 October, it is co-organized by the USTC, the Chinese Physical Society (CPS), the UK’s Institute of Physics and IOP Publishing, which publishes Physics World.
The virtual event gathers early-career and established researchers from universities, industry and governments worldwide. Beyond a high-profile programme of more than 30 invited talks from world-leading figures in the field, Quantum 2020 will also feature two special interactive panel sessions covering industry and worldwide initiatives in quantum technology.
The meeting is now virtual given the COVID-19 pandemic. How has the organization gone?
We began to organize the Quantum 2020 event at the end of last year. The original plan was to hold the conference at USTC’s Shanghai campus. But because of the COVID-19 situation, in March I suggested changing it to a virtual conference. Organizing an online meeting has been a new experience, but it has gone well thanks to the help of the advisory committee and the quantum-science community. The virtual format delivers several benefits in terms of time and efficiency and we have seen a real appetite from invited speakers and panellists to get involved. We are very excited by the calibre and diversity of the scientific programme.
Have there been any benefits of it going online?
I believe that virtual conferences will remain an important part of the scientific community beyond the current COVID-19 situation. We anticipate that the online format will have several benefits for participants including improved accessibility, reduced cost and personal disruption, as well as an overall reduction in carbon footprints from travel.
Are there any benefits of this conference to wider society?
In partnership with the CPS, we plan to translate some of the plenary and invited talks to raise the awareness of quantum physics and quantum technology in China. In addition, Quantum 2020 plans to establish new international awards to promote and encourage early-career researchers in quantum technologies. These will run alongside other high-profile awards in the field such as the Micius Quantum Prize awarded by the Micius Foundation, the biennial International Quantum Communication Award given out at the QCMC conference and the Rolf Landauer and Charles H Bennett award in quantum computing by the American Physical Society.
Quantum computing has accelerated recently. How do you see its development?
Scientists are gaining exquisite control over quantum systems at a fundamental level and pushing the physical boundaries. Such efforts will not only deepen our understanding of physics but also unlock unbelievable potential for new technology. In the not-too-distant future, it will become a reality to control and entangle thousands or even millions of quantum bits at will. Quantum simulators and computers will first become useful to physicists, chemists and engineers for applications such as materials and drug design. Many more surprises will emerge but, of course, they are hard to predict.
What excites you about the future of quantum technologies?
We are now beginning the second quantum revolution. Exploiting quantum superposition and entanglement offers a radical new way to transform our technologies, from communication and computation to metrology. Comparing quantum computers with their classical counterparts is like comparing lasers to light bulbs. I believe that the development of quantum computers will follow a similar trajectory to lasers – first as a useful tool inside the laboratory, then finding applications in many different areas. Inspired by what we know about the history of the laser, I believe what we have already discovered – for example, quantum key distribution and quantum algorithms – is just the tip of the iceberg of what is to come.
With many countries pushing the implementation of quantum technologies, is it driven more by co-operation or competition?
Quantum technology has a long way to go before it can be widely used. So international co-operation and open exchanges are imperative – just as we all need to work closely together to confront the COVID-19 pandemic. Quantum technology is for all and not just a single country. The potential outcomes of quantum research, such as solving energy problems, new materials and new medicines, will likewise benefit all people. Diseases have no borders and neither should research.
That’s according to the Doomsday Clock – a device created by the Bulletin of the Atomic Scientists in 1947 as a metaphor to indicate how near we are to a humanity-ending catastrophe. The clock started out at 11:53 p.m. and over the years has shifted backwards and forwards as the global situation has worsened or improved. But on 23 January 2020 the clock was moved closer to midnight than at any other time in its near 75-year lifetime.
This year’s historic decision was announced to “leaders and citizens of the world” at the National Press Club in Washington, DC by members of the Bulletin of the Atomic Scientists. In setting the clock to 100 seconds to midnight, they cited risks such as worsening nuclear threats, a lack of climate action, and the rise of “cyber-enabled disinformation campaigns that undermine society’s ability to act”.
The annual resetting of the Doomsday Clock is these days a major media event, providing grist for politicians, policy-makers and commentators around the world. But the clock actually emerged from the concerns of the physics community immediately after the Second World War, when two University of Chicago physicists – Eugene Rabinowitch and Hyman Goldsmith – started to think about the consequences of their work. They were among the many scientists and engineers who had taken part in the Manhattan Project, which developed the atomic bombs that the US dropped on Hiroshima and Nagasaki in August 1945.
Chicago was where the Italian physicist Enrico Fermi had in 1942 designed and built the first reactor to achieve a self-sustaining nuclear chain reaction, and where much of the science of the Manhattan Project was incubated. “Within three or four months of the bombs being dropped, Rabinowitch and Goldsmith created a publication called the Bulletin of the Atomic Scientists,” says current Bulletin president and chief executive Rachel Bronson.
According to Bronson – who is not a physicist, but an expert in international relations – many of the Manhattan Project physicists were already politically conscious, but their main concern had been to acquire a nuclear bomb before Germany. In the event, the Germans never built such a weapon and it was only after the war that the Manhattan researchers began to debate nuclear risk and proliferation with the wider physics community.
“The Bulletin was established to set a flag and say here is where scientists should engage in political issues,” says Bronson. But the journal was also set up to consider future dangers or – as Rabinowitch poetically put it – “to manage the dangerous presents of Pandora’s box of modern science”. And it was a desire to communicate these risks to the public that led to the Doomsday Clock being set up in 1947, two years after the first edition of the Bulletin.
The idea of the clock emerged from the cover of the June 1947 edition of the Bulletin, created by the artist Martyl Langsdorf, whose husband was a nuclear physicist. Langsdorf placed the first clock at seven minutes to midnight for purely aesthetic reasons, but its subsequent position was decided by Rabinowitch, the Bulletin’s founding editor. When he died in 1973, a science and security board took over that responsibility in consultation with the journal’s board of sponsors.
Originally set up by Albert Einstein with Robert Oppenheimer as its first chair, the board of sponsors currently includes 13 Nobel laureates – including the particle theorists Sheldon Glashow, Steven Weinberg and Frank Wilczek – as well as astronomer Martin Rees and theoretical physicist Lisa Randall. As for the science and security board, its composition has evolved over the years. It currently has 19 members and is chaired by Robert Rosner – a physicist and former head of the Argonne National Laboratory near Chicago.
“On the nuclear side we have physicists who know about all things nuclear, weapons and reactors, but also people who have been involved in negotiations with the government,” says Rosner, who has served on the board for a decade and currently leads the process to set the Doomsday Clock each year. “There’s a strong political aspect to it,” underlines Bronson, who points to the need for “people who understand the political process of different countries and how the science wraps up in what we’re doing”.
Closer and closer
Before 2020 the closest the Doomsday Clock had been to midnight was in 1953, when it was set to 11:58 p.m. after both the US and the Soviet Union carried out hydrogen-bomb tests the previous year (figure 1). Its furthest distance from midnight came in 1991 when the clock was moved back to 11:43 p.m. That heady moment followed the end of the Cold War and the signing of the Strategic Arms Reduction Treaty, which led to deep cuts in US and Soviet nuclear-weapon arsenals.
The nuclear threat is still with us, however. In 2019 there were almost 13,900 nuclear warheads in the world (albeit down from a high of more than 70,000 in the mid-1980s). And it was the continued existence of these arsenals – coupled with the lapse of several major arms-control treaties and America’s decision to quit the Iran nuclear deal – that shaped the Bulletin’s current risk assessment.
1 The time according to the Doomsday Clock Starting out in 1947 at 11:53 p.m., the Doomsday Clock has changed position depending on the threats to the world from nuclear proliferation and, more recently, from other risks too.
However, since 2007 the journal’s Doomsday Clock deliberations have also factored in the risks facing humanity from climate change. Bronson admits that adding climate change to its remit was “probably one of the most controversial decisions” her organization has made given its reputation as primarily an authority on the threat from nuclear weapons. “[But] if you believe that the Bulletin was founded to manage ‘the dangerous presents of Pandora’s box of modern science’, then the answer is, of course, you have to include climate change,” she says.
In recent years the Bulletin has also begun to focus on new disruptive technologies. “Take synthetic biology, for example, or gene editing,” says Steve Fetter, a physicist-turned-policy-expert from the University of Maryland, who serves on the Bulletin’s science and security board. “While this technology has tremendous promise for curing currently incurable diseases, you don’t have to have too much imagination to see how someone could misuse it.” The risks, Fetter adds, are compounded by how easy these technologies are to develop. “You needed a huge factory to make plutonium or highly enriched uranium, but gene editing, or artificial intelligence; these are things that can be done by an individual.”
Hands-on process
Setting the Doomsday Clock is a process the Bulletin says it takes very seriously. “We pride ourselves in it being consultative,” says Bronson. “Our leaders and experts can come together with different perspectives and put them out on the table and have them examined.”
The formal process begins in November each year, when members of the science and security board meet in Chicago for a day and a half, with its deliberations centred on two fundamental questions. First, is humanity safer or at greater risk this year compared to the year before? And second, is humanity safer or at greater risk this year than in all the years since the Bulletin began its deliberations in 1947? Rosner says discussions are not for the sensitive, with board members being “pretty hard-headed people who are vigorous in expressing their opinions”. With the calibre of minds involved, he says, “you can’t be a lazy thinker”.
To set the clock the closest it’s ever been to midnight, the Bulletin of the Atomic Scientists had to be convinced we are living in the most dangerous period since 1947
After the November meeting, board members start drafting a public statement that explains their decision, refining it with staff from the Bulletin in a process that lasts until the announcement of the clock’s new position in early January. To set the clock the closest it’s ever been to midnight, Bronson insists that the board this year had to be convinced we are living in the most dangerous period since 1947. She cautions, though, that setting the clock is not an exact science. “You’re not shovelling data into a big algorithm that spits out a time; it’s really a judgement,” she says.
The annual clock-setting is intended to serve as a challenge to politicians to do better in the year ahead. However, members of the board are currently concerned that the world lacks the strong leadership and co-operation to deal with the risks they have flagged. “You can see it now playing out in the coronavirus pandemic,” says Fetter, who points out how many governments have ignored the results of their regular security and pandemic-readiness exercises.
Bronson denies accusations – mostly from the right wing – that its risk assessment is arbitrary and politically motivated, pointing out that the clock has been moved forward and back during both Democratic and Republican administrations in the US. “We take criticism on both sides, quite frankly, and we take that as a sign that we’re doing something correct.” If anything, Bronson feels that the clock provides many with a sense of sanity. “It’s empowering, if it helps people feel that they’re not crazy by noting that something is not right [and that] we’re not where we should be in 2020.” She also laughs off criticism that the clock is just a scare tactic. “That’s fair, it’s called the Doomsday Clock!”
Closer than ever The press launch of the Doomsday Clock in January 2020, when members of the Bulletin of the Atomic Scientists announced in Washington, DC, that it had been set to 100 seconds to midnight. From left to right: the journal’s executive chair Jerry Brown, former Irish president Mary Robinson, and former UN secretary-general Ban Ki-moon. (Courtesy: Lexey Swall Photography/Bulletin of the Atomic Scientists)
Members of the Bulletin are also increasingly concerned by the rise of fake news and conspiracy theories, which have led to the anti-vaccination movement and other scare stories. Social media has allowed such disinformation to spread, making it much harder to address real risks. But could this move away from a pure focus on nuclear weapons and the arms race dilute the message behind the Doomsday Clock?
Stuart Parkinson, a physicist and executive director of the UK-based Scientists for Global Responsibility (SGR), thinks the Doomsday Clock may have become too simplistic given the variety of risks it now tries to encompass. “I’d like to see a more multi-dimensional risk-communication device, so maybe for each issue you have a traffic-light system or a five-point rating.” Bronson knows the clock is a blunt instrument but says that’s also its beauty, generating conversations at the highest levels of power and in the most local classrooms.
Engaging physicists
It’s true, though, that the issue of nuclear risk has suffered an overall decline in interest, including among physicists. “There was a very special period during the war and at least 20 years after, when physicists played a very important role in public policy,” says Rosner. “That’s changed dramatically.” The policy agenda has become more crowded, with the climate crisis forcing the Bulletin to consult not just physicists, but chemists, biologists and risk-assessment experts too. Indeed, he feels many of today’s physicists shy away from public or political debate and are uncomfortable dealing with the uncertainties around climate risk.
One physicist who is unafraid to contemplate the risks posed by scientific advances is Anthony Aguirre, a cosmologist from the University of California, Santa Cruz. Together with Max Tegmark from the Massachusetts Institute of Technology, in 2014 he founded the Future of Life Institute – a non-profit centre that investigates how to safely develop new technologies. “[The institute addresses] questions that aren’t part of the regular academic day-to-day research discourse, and what can we tangibly do to increase the probability of things going well,” Aguirre says.
The COVID-19 pandemic has convinced Aguirre that scientists need to think even more seriously about risk. “It’s woken up a lot of people to the idea that risks are not purely theoretical. I’m even more concerned about the lack of effective national and international institutions to both prevent and deal with risks as they arise.” And although the Future of Life Institute does look at nuclear risk, its current focus is on transformative artificial intelligence (AI), which he feels could compound existing risks.
If used in weapons, for example, AI could rapidly and inadvertently escalate minor incidents into nuclear wars without humans being able to stop them. And that, he feels, is exactly the kind of problem physicists are well placed to examine. “Within the physics and cosmology community there is a tendency to think on bigger scales and longer timescales,” Aguirre points out, which he believes gives them an ability to understand how small risks can still become significant over time.
Aguirre cites the physicists on the Manhattan Project, who – even before they had built a bomb – worried that a nuclear explosion could create such extreme temperatures that hydrogen atoms in the air and water would fuse to form helium. Literally igniting the atmosphere and oceans, this process would – they feared – generate a runaway reaction that could engulf the globe. Fortunately, when this possibility was studied, it proved unfounded.
For its part, the Bulletin of the Atomic Scientists is now looking to re-engage the physics community, given the nuclear threat posed by the current unravelling of Cold War agreements. The last remaining bilateral nuclear arms-control treaty, New START, is scheduled to expire in February 2021, leaving nuclear-weapons proliferation unconstrained for the first time in 50 years. Those events prompted physicists at seven US universities to launch the Physicists Coalition for Nuclear Threat Reduction earlier this year, with support from the American Physical Society. Although the Bulletin is not directly involved with this initiative, it is supportive of these efforts, according to Bronson.
Dangerous times The Doomsday Clock began in 1947 to assess the risks from nuclear weapons but these days also considers the threat of climate change, cyber-security and biosecurity risks – including pandemics. (Clockwise from top left: iStock/Gerasimov174; iStock/piyaset; iStock/martin-dm; iStock/matejmo)
As part of the initiative, Fetter and others are giving talks at university departments in the US, hoping to attract physicists to advocate for nuclear-threat reduction through public engagement and by lobbying their local Congressional representatives. He points to the effective role that physicists played in disarmament policy well into the 1980s, including arguing against America’s proposed ballistic-missile defence systems, which had been touted as a solution to the nuclear threat. Physicists pointed out that in the vacuum of space all objects, regardless of mass, follow the same trajectory once launched, making it very hard to distinguish warheads from lightweight decoys.
Over at SGR, Parkinson agrees that physicists have become complacent about nuclear risks and how their own work might endanger the world. That’s why his organization thinks ethics and responsibility must be embedded within scientific education. Physics, he feels, is not a pure, abstract exercise but has consequences that cannot just be ignored. Indeed, he calls for more protection for scientists who speak out, and thinks professional institutions should debate the risks of new and existing technologies more openly with their members and with the public. Parkinson also thinks such organizations should end their links with fossil-fuel and defence firms.
Aguirre even feels better incentives and rewards are needed for individuals who help the world to avoid catastrophes, citing the case of Soviet air-defence officer Stanislav Petrov, who in 1983 was on duty in a command centre near Moscow when a radar screen on a satellite early-warning system seemed to suggest the US had launched five nuclear missiles. Petrov refused to alert the authorities, suspecting – correctly as it turned out – a malfunction. Despite having potentially saved the world from nuclear war, he was reprimanded by the Soviet authorities.
The current COVID-19 pandemic, while not a threat created by humans, is clearly a warning of what can happen when risks are ignored
The current COVID-19 pandemic, while not a threat created by humans, is clearly a warning of what can happen when risks are ignored, which Aguirre says is something we continue to do with existing and new technologies. Indeed, he feels the effort we put into planning for all risks is just not enough. “We’ve all seen the catastrophic things that can in fact happen to us [with COVID-19],” he says, “[and] this is pretty minor in the spectrum of catastrophes.” For Aguirre, we need to put far more intellectual thought into other significant and potentially even more catastrophic risks.
Whether 2021 will bring us closer to midnight, we’ll find out soon.
The list of surprising behaviours in “twisted bilayer” graphene (TBG) just keeps getting longer. The material – which is made by stacking two sheets of graphene on top of one another, and then rotating one of them so that the sheets are slightly misaligned – was already known to support a wide array of insulating and superconducting states, depending on the strength of an applied electric field. Now researchers in the US have uncovered yet another oddity: when TBG is exposed to infrared light, its ability to conduct electricity changes. According to Fengnian Xia of Yale University, Fan Zhang of the University of Texas at Dallas, and colleagues, this finding could make it possible to develop a new class of infrared detectors using these stacked carbon sheets.
A single layer of graphene consists of a simple repetition of carbon atoms arranged in a two-dimensional hexagonal lattice. In this pristine state, the material does not have an electronic bandgap – that is, it is a gapless semiconductor. However, when two graphene sheets are placed on top of each other and slightly misaligned, they form a moiré pattern, or superlattice. In this new arrangement, the unit cell of the 2D crystal expands to a huge extent, as if it were artificially “stretched” in the two in-plane directions. This stretching dramatically changes the material’s electronic interactions.
From magic angle to twistronics
The misalignment angle in TBG is critically important. For example, at a so-called “magic” misalignment angle of 1.1°, the material switches from an insulator to a superconductor (that is, able to carry electrical current with no resistance below 1.7 K), as a team at the Massachusetts Institute of Technology (MIT) discovered in 2018.
The existence of such strongly correlated effects – which were first theoretically predicted in 2011 by Allan MacDonald and Rafi Bistritzer of the University of Texas at Austin – kick-started the field of “twistronics”. In this fundamentally new approach to device engineering, the weak coupling between different layers of 2D materials, like graphene, can be used to manipulate the electronic properties of these materials in ways that are not possible with more conventional structures, simply by varying the angle between the two layers.
Infrared light affects TBG’s conductance
Xia and Zhang’s teams have now studied how TBG interacts with infrared light – something that has never been investigated before. In their experiments, they shone light in the mid-infrared region of the spectrum, with a wavelength of between 5 and 12 microns, onto samples of TBG and measured how the electrical conductance varied at different twist angles. They found that the conductance reached a peak at 1.81° and that the photoresponse of the material was much stronger compared to untwisted bilayer graphene. This is because the twist significantly enhances the interactions between light and the material and induces a narrow bandgap (as well as superlattice-enhanced density of states). They also found that this strong photoresponse fades at a twist angle of less than 0.5°, as the bandgap closes.
Further investigations by the team revealed that the TBG absorbs the energy of the incident infrared photons. This increases its temperature, which in turn produces an enhanced photocurrent.
The results suggest that the conducting mechanism in TBG is fundamentally connected to the period of the moiré pattern, and the superlattice produced, which is itself connected to the twist angle between the two graphene layers, Zhang explains. The twist angle is thus clearly very important in determining the material’s electronic properties, Xia adds, with smaller twist angles producing a larger moiré periodicity.
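That relationship between twist angle and periodicity follows from simple geometry (a textbook relation, not a result of the new paper): for two identical lattices of constant $a$ rotated by a small angle $\theta$, the moiré period is
\[
\lambda = \frac{a}{2\sin(\theta/2)} \approx \frac{a}{\theta},
\]
so with graphene’s lattice constant of about 0.246 nm, a twist of 1.81° gives a superlattice period of roughly 8 nm, while the 1.1° magic angle gives about 13 nm – the smaller the angle, the larger the moiré pattern.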
Towards a new class of infrared detectors
The researchers, who report their work in Nature Photonics, now hope to find out whether they can combine photoresponsivity and superconductivity in TBG. “Can shining a light induce or somehow modulate superconductivity? That will be very interesting to study,” Zhang says.
“Blue whirls” are small, spinning flames that were first spotted in 2016. Now computer simulations suggest that this soot-free mode of combustion involves three different flames. Joseph Chung, Xiao Zhang, Carolyn Kaplan and Elaine Oran at the University of Maryland came to this conclusion by accounting for the distinct rates at which different types of laminar flame release heat. Their discovery could make it far easier for researchers to stabilize and increase the size of blue whirl flames in the lab, which could lead the way to low-emission combustion.
Blue whirls were first characterized four years ago at Maryland by Oran, Michael Gollner and Huahua Xiao, who were studying the behaviour of turbulent, sooty fire whirls. They were surprised to see these inefficient sooty flames spontaneously evolve into small, stable blue whirls. They found that these elegant new laminar flames could burn through a wide range of liquid hydrocarbon fuels without producing any soot – presenting a previously unknown route towards highly fuel-efficient combustion.
Subsequent simulations and experiments have revealed the conditions in which blue whirls can form. So far, however, researchers have been unable to determine the flame’s structure and dynamics in more detail. In their study, Chung’s team explored these aspects using numerical simulations. Their calculations were based on the 3D time-dependent Navier–Stokes equations, which describe the motion of viscous fluids.
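For readers who want the equations, the incompressible, constant-viscosity form of the Navier–Stokes equations is shown below; the combustion simulations themselves solve a more general reactive, variable-density version that also tracks chemistry and heat release:
\[
\rho\left(\frac{\partial \mathbf{u}}{\partial t} + \mathbf{u}\cdot\nabla\mathbf{u}\right) = -\nabla p + \mu\,\nabla^{2}\mathbf{u} + \mathbf{f},
\qquad
\nabla\cdot\mathbf{u} = 0,
\]
where $\mathbf{u}$ is the fluid velocity, $p$ the pressure, $\rho$ the density, $\mu$ the viscosity and $\mathbf{f}$ any body force such as gravity or buoyancy.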
Rich or lean
The researchers adjusted model parameters including the velocities and flow rates of fuel and air, until a blue whirl formed. The simulation suggested the presence of three types of laminar flame, each of which releases heat at a different rate. In diffusion flames, oxygen and fuel diffuse into each other, and the flame itself forms where they meet. Oxygen and fuel can also be premixed, creating a moving flame front driven by thermal expansion. Furthermore, these premixed flames can either be “rich” or “lean” in terms of the ratio of fuel to oxygen.
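In standard combustion terminology (not specific to this study), “rich” and “lean” are defined through the equivalence ratio
\[
\phi = \frac{(m_{\mathrm{fuel}}/m_{\mathrm{ox}})}{(m_{\mathrm{fuel}}/m_{\mathrm{ox}})_{\mathrm{stoich}}},
\]
where the denominator is the fuel-to-oxidizer ratio at which combustion is exactly balanced (stoichiometric). A premixed flame with $\phi > 1$ is rich, while one with $\phi < 1$ is lean.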
Chung and colleagues found that the purple “crown” of a blue whirl is a diffusion flame that is shaped like a large cone with an upward-facing point (see figure). This is surrounded by a premixed lean flame, while underneath, a premixed rich flame is shaped like a smaller, downward-pointing cone. A “triple flame” appears as a bright, whirling blue ring where the three flames meet, in which a significant proportion of the whirl’s combustion occurs.
The team also discovered that the transition from turbulent fire whirls to blue whirls unfolds through the process of vortex breakdown – a fluid instability that can occur in swirling flows. Through future simulations and experiments, the researchers now hope to explore how the blue whirl can be created on larger scales, and formed directly without transitioning through the dangerous turbulent whirl state. With this future work, sophisticated new techniques could soon emerge for achieving large-scale and highly fuel-efficient combustion in the lab.
What causes a bubble to burst? For bubbles on the surface of viscous liquids such as paint or lava, the finger of blame has long pointed to gravity. Now, however, researchers led by James Bird at Boston University in the US have turned this idea literally on its head, using an upside-down bubble-testing rig to show that surface tension, not gravity, is at fault. Their result has implications for industrial processes such as glass manufacturing and spray painting and could even shed light on the break-up of respiratory aerosols – a phenomenon that has taken on extra significance due to the COVID-19 pandemic.
When a bubble rises to the surface of a liquid, it typically forms a thin, liquid, dome-shaped film supported by the gas trapped inside it. Once this film develops a hole, surface tension causes the film to retract further, and the bubble bursts.
In runny, low-viscosity liquids, the bursting process is over in a matter of milliseconds. In thicker, viscous fluids it takes longer, yet the film collapses while the hole is still barely open. The reason is that as soon as the hole forms, trapped gas can escape from the bubble. Without the support of this gas, the forces on the liquid film become unbalanced, causing the bubble to collapse and radial wrinkles to form around its edge – rather like what happens in an elastic sheet or a parachute.
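The millisecond timescale for runny liquids can be estimated from the classic Taylor–Culick formula for the retraction speed of a low-viscosity film of thickness $h$, density $\rho$ and surface tension $\gamma$ (the numbers below are illustrative and not taken from the new study):
\[
v_{\mathrm{TC}} = \sqrt{\frac{2\gamma}{\rho h}}
\approx \sqrt{\frac{2 \times 0.07\ \mathrm{N\,m^{-1}}}{10^{3}\ \mathrm{kg\,m^{-3}} \times 10^{-6}\ \mathrm{m}}}
\approx 12\ \mathrm{m\,s^{-1}},
\]
so a centimetre-sized film on a water-like liquid vanishes in about a millisecond. In very viscous liquids the retraction is instead limited by viscous stresses, which is why the film has time to wrinkle and collapse before the hole opens appreciably.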
Exploiting super-slow flow
Until now, scientists thought that the weight of the thin liquid film was responsible both for the bubble’s collapse and for the formation of the radial wrinkles – suggesting that gravity was the main factor behind viscous bubble bursting.
Bird and colleagues set out to test this hypothesis by injecting an air bubble into a viscous silicone oil and filming the bubble’s collapse with a high-speed camera. Their first experiments were aimed at replicating how viscous bubble collapse had been studied in the past. Once they accomplished that, however, they did something different: they changed the orientation of the bubble relative to gravity. This, Bird says, was possible because of the extreme viscosity of their test liquid. “We exploited this super-slow flow by preparing the bubble in an upright position and then rapidly turning it upside down, quickly puncturing it before it could significantly readjust its shape,” he explains.
In these upside-down experiments, the researchers varied the thickness of the films and the viscosity of the fluid, while also providing a way for the air inside a bubble to escape without rupturing the film. Their results showed that gravity plays only a negligible role in bubble collapse. Instead, it is surface tension and the dynamic stress of the liquid that forms the bubble that are the main drivers of viscous bubble behaviour – including the formation of unstable wrinkles.
A delicate interplay of forces
This finding is exciting, Bird says, because it shows that such forces also play a role in situations where they might otherwise be overlooked – for example, at especially small scales and for multiple orientations of bubbles. It is also important because it resolves a longstanding paradox: while surface tension normally smooths out wrinkles, in this case it also initiates them.
“It is through a delicate interplay of capillary, viscous and inertial forces that wrinkles develop,” he tells Physics World. “Indeed, it took us a day to show that surface tension was responsible for the wrinkles and over a decade to adequately explain why.”
A timely result
Bird and colleagues say that the results of their study have implications wherever curved viscous films are prevalent – either because the surrounding liquid is highly viscous (as is the case for bubbles in molten glass or lava) or because the bubble is small (such as in bubbles that form on coated films). Since the wrinkling and folding of liquid films can trap air, the transport of heat and mass at a liquid interface will be affected, too.
The retraction and collapse dynamics the researchers uncovered will also influence any breakup or aerosolization of the thin film. “A timely example of where our study could be applicable involves the mechanism by which respiratory aerosols form when we breathe and speak,” says Bird. “These aerosols are believed to develop when thin, curved films repeatedly develop across small airways in the lungs and then rupture. Since surface tension rather than gravity is important for the collapse and viscous buckling instability, it is possible that it is also relevant in these films.”
The team, which includes researchers in applied mathematics at the Massachusetts Institute of Technology and the Department of Mechanical and Aerospace Engineering at Princeton University, has developed a theoretical model to explain their observations. They now hope to extend it to more complex liquids, such as viscoelastic fluids that have both liquid and solid-like behaviours. “For example, in respiratory tract fluids, elastic and dynamic surface tension are present and these might modify the phenomenon in interesting and unexpected ways,” says study lead author Alexandros Oratis.
The researchers also intend to explore the precise ways that a film’s thickness profile develops, and how this affects (and is affected by) the bubble collapse. They detail their work in Science.
Time is ticking: why the Doomsday Clock is further forward than it’s ever been.
Life has been turned upside-down since COVID-19 gripped the world earlier this year. I wonder, though, what members of the Bulletin of the Atomic Scientists will make of the pandemic.
Every year they meet to decide the time on the Doomsday Clock – a notional device created by physicists shortly after the Second World War to indicate how close the world is to destruction. In January the clock was set at 100 seconds to midnight – the furthest forward it’s ever been.
In the new issue of Physics World magazine, Rachel Brazil describes how the clock gets set, which these days is no longer based solely on the risk from nuclear conflict, but also takes into account climate change, cyber security and biosecurity risks – including pandemics. My bet is on Bulletin staff edging the clock even further towards midnight.
Elsewhere in the new issue, Sam Vennin shows how scientific modelling is increasingly critical to many aspects of medicine, including the spread of viruses, while Kate Ravilious finds out how lockdown and travel restrictions have led to air pollution plummeting, giving atmospheric scientists a unique and unexpected opportunity to study its impact on the weather and climate.
For the record, here’s a rundown of what else is in the issue.
• Tunnelling measured with ultracold atoms – Researchers have determined the time taken for an atom to tunnel through a laser beam, which could have implications for quantum technologies, as Philip Ball reports
• Trio of probes launch to Mars – China, the UAE and the US have sent craft to Mars to study the planet in greater detail, with NASA set to conduct the first controlled flight on another planet, as Ling Xin, Michael Banks and Liz Kruesi report
• A quantum revolution – Chaoyang Lu from the University of Science and Technology of China in Hefei talks to Michael Banks about planning Quantum 2020 – a major online conference – and the future of quantum technologies
• Hacking a path to innovation – Bonnie Tsim says “hackathons” are a great way for scientists to apply their skills to the commercial world
• First-hand insights – With so much spin, jargon and fake news, it can be hard to know what businesses are really up to. James McKenzie finds that hearing a company’s vision first hand is key to understanding its potential
• It’s a material world – From wine and skis to clocks and bone implants, Linn Hobbs has a passion for everyday materials, as he tells Robert P Crease
• 100 seconds to midnight – For almost 75 years, the Doomsday Clock has monitored how close humankind is to global catastrophe. With the clock now closer to midnight than ever before, Rachel Brazil talks to physicists who say we must step up our efforts to prevent disaster
• Has the COVID-19 lockdown changed Earth’s climate? – The lockdown measures imposed by many nations due to the COVID-19 pandemic have led to air pollution falling dramatically, thereby offering scientists a rare opportunity to study its links with climate and weather. But as Kate Ravilious discovers, it’s a complicated connection
• How modelling is transforming medicine – Computational modelling has been brought under the spotlight during the COVID-19 pandemic, with scientists trying to predict how the SARS-CoV-2 virus will spread. But epidemiology is not the only medical field in which modelling is sparking breakthroughs, as Sam Vennin explains
• Build a bot – Ian Randall reviews How to Grow a Robot: Developing Human-Friendly, Social AI by Mark Lee
• Beyond the bounds of Earth – Andrew Glester reviews the film Proxima and interviews its screenwriter-director Alice Winocour
• Closing the skills gap – Why do many science graduates lack the skills that industry is looking for? Sean Ryan and Veronica Benson of the South-East Physics Network explore the problem and discover that there is one straightforward way for universities and businesses to address this
• Ask me anything – Nanophysicist Deji Akinwande from the University of Texas at Austin gives his tips for a rewarding career in physics
• Lattice layabout in the park – Hamish Johnston muses on social distancing in public spaces.
A technique that improves the precision of radiotherapy used to treat wet (neovascular) age-related macular degeneration (AMD) is being developed at Stanford University School of Medicine. Principal investigator Wu Liu and a multinational team of researchers are using polycapillary X-ray optics to deliver focused kilovoltage (kV) radiation. The approach, described in Medical Physics, enables personalized conformal radiotherapy and spares critical structures in the eye, with potential to make AMD treatment more effective, less costly and more convenient for patients.
AMD is the leading cause of blindness in people aged over 50 in developed nations. The International Agency for the Prevention of Blindness estimates that around 196 million people throughout the world currently have AMD, with steady increases projected as the global population ages. The American Academy of Ophthalmology estimates that 10–15% of these AMD cases are wet AMD, which is caused by choroidal neovascularization (CNV), the growth of abnormal blood vessels into the macula.
The standard treatment for wet AMD involves repeated intravitreal injections with anti-vascular endothelial growth factor (VEGF) drugs, which helps to control the disease. However, injections are required every 30 to 60 days, are high in cost, and risk causing retinal detachment, injury, infection, bleeding and pain. Radiotherapy could provide an appealing alternative, destroying abnormal vessels and potentially inhibiting associated inflammation and fibrosis not addressed by VEGF inhibitors.
Principal investigator Wu Liu.
A recent study of radiotherapy for neovascular AMD concluded that the role of stereotactic radiotherapy combined with anti-VEGF is currently uncertain, due primarily to a lack of clinical trial evidence. But Liu and collaborators believe that a combination of radiotherapy with anti-VEGF treatment may be more effective if techniques to deliver stereotactic radiotherapy are improved.
Stereotactic radiotherapy involves targeting high doses of radiation onto a small area, minimizing exposure to surrounding healthy tissues. Two clinical trials (one completed and one with results expected in 2021) have investigated the use of stereotactic radiotherapy to treat wet AMD. The first, the randomized phase II INTREPID trial of 230 individuals dependent on anti-VEGF therapy, showed that a single 16 or 24 Gy radiation dose reduced the number of injections needed by 25%, compared with patients who did not receive radiotherapy.
A follow-up phase III trial, STAR, builds on INTREPID’s findings, which showed that the best response was in patients who had an AMD lesion that was actively leaking at the time of radiotherapy and was smaller than 4 mm. The STAR trial recruited 411 such patients to receive either a single 16 Gy dose or a sham treatment, supplemented by injections as needed over 24 months.
Both of these studies used a collimation-based kV radiotherapy device that delivers fixed 4 mm beams to the centre of the macula. This can, however, over- or under-treat CNV lesions of varying sizes and shapes.
To enable personalized treatment, Liu and colleagues are developing a system that uses polycapillary X-ray focusing lenses capable of withstanding continuous high-intensity radiation. The polycapillary lens consists of a large number of hollow, curved and tapered glass tubes. A converging arrangement of hundreds of thousands of these small channels can produce an X-ray beam with a focal spot measuring less than 0.2 mm perpendicular to the beam direction, enabling high-accuracy targeting.
Schematic of a single focused X-ray beam using polycapillary optics in the AMD treatment simulation (not to scale). The X-ray source is placed at the input focus of the X-ray lens, the macula at the output focus. (Courtesy: Med. Phys. 10.1002/mp.14404)
“This ultrasmall beam focal spot enables spatially fractionated grid therapy, which has been shown to preferentially damage abnormal neovascular blood vessels versus normal ones,” write the authors. “The grid dose delivery has the potential to control the CNV while sparing the retina and normal capillaries from the small possibility of radiation-induced retinopathy and capillary dropout associated with conventional radiotherapy.”
The researchers performed Geant4-based Monte Carlo simulations of conformal treatments of three clinical CNV cases, using 60-kVp focused X-ray beams. To estimate possible dosimetric uncertainties to the target and critical organs, they introduced positioning errors that modelled setup errors and patient eye motion during treatment. They also simulated spatially modulated dose delivery to the CNV lesion plus margin to demonstrate the potential of grid therapy.
“The simulated treatments showed highly conformal delivery of dose to the lesion plus 0.75 mm margin with sharp dose fall-offs and controllable spatial modulation patterns,” report the researchers. “The 90–10% isodose penumbra is less than 0.5 mm. With a prescription dose of 16 Gy to the lesions, critical structure doses are well below the tolerance.”
The team also noted that the proposed technique allowed adjustment of the dose distribution based on the distance to the optic disc, to balance the benefits and risks. The average CNV dose varied by no more than 10%.
Liu and colleagues suggest that, because of the conformal nature and grid therapy potential of the technique, it may be possible to use focused X-rays alone to treat AMD. The technique could also be used synergistically with anti-VEGF drugs, specifically for newly diagnosed wet AMD patients. Here, patients would receive a single focused X-ray treatment within 14 days of the first anti-VEGF injection, followed by two additional monthly injections and subsequent ones as clinically determined.
“Based on the promising computer simulation results, which show significant dosimetric improvement on wet AMD treatment compared with previously investigated devices, we plan to build a prototype focused kV X-ray platform with a Monte Carlo-based treatment planning system to enable subsequent pre-clinical translational laboratory and clinical research,” Liu tells Physics World. “We hope this technique can be combined synergistically with anti-VEGF drugs to improve the treatment of wet AMD and substantially reduce the frequency of anti-VEGF injections. After successful prototype development and preclinical verifications, we hope to conduct human clinical trials in the next few years.”
Such is their sensitivity to environmental noise that quantum computers might in future be shielded by thick layers of lead and even operated deep underground. So say physicists in the US, who have found that ionizing radiation significantly limits the coherence time of superconducting qubits. Indeed, they say that minimizing radiation effects will be crucial if general-purpose quantum computers are to be made using superconducting technology.
Quantum computers can perform certain calculations much more quickly than classical computers by storing and processing information using quantum bits (qubits). Superconducting circuits are among the leading types of qubit currently under development, generating superpositions of 0s and 1s from the ground and first excited states of an anharmonic oscillator formed from the combination of Josephson junctions and a capacitor. Although they need to be cooled down to very low temperatures, such qubits are solid state and therefore hold the promise of being relatively easy to manufacture and integrate.
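In the most common variant, the transmon, the circuit is described (schematically; conventions differ between groups) by the Hamiltonian
\[
\hat{H} = 4E_{C}\,\hat{n}^{2} - E_{J}\cos\hat{\varphi},
\]
where $\hat{n}$ counts the Cooper pairs transferred across the junction, $\hat{\varphi}$ is the superconducting phase difference, $E_{C}$ is the charging energy set by the capacitor and $E_{J}$ is the Josephson energy. The cosine potential makes the level spacing uneven – the anharmonicity mentioned above – so the lowest two levels can be singled out as the $|0\rangle$ and $|1\rangle$ states of the qubit.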
Indeed, last year John Martinis and colleagues at Google used a processor comprising 53 superconducting qubits to execute a very specific algorithm more than a billion times faster than they say would be possible using one of the world’s leading conventional supercomputers – although this billion-fold advantage has since been disputed.
Minimum coherence time
Superconducting qubits can currently retain their delicate quantum states – their “coherence” – for more than 100 µs. While this is much better than the nanoseconds of two decades ago, coherence times will need to increase by several orders of magnitude before the qubits can be used in general-purpose fault-tolerant computers. Such devices would rely on error correction, which only works efficiently if the error rates of individual qubits and gates are below a certain threshold – implying a minimum coherence time.
Coherence is impeded by a wide range of noise sources. On the timescale of tens or hundreds of microseconds, material defects, magnetic moments and trapped charges, among others, tend to cause the biggest headaches. However, pushing coherence times up to and beyond a millisecond will require overcoming the problem of ionizing radiation. Beta particles, gamma rays and cosmic rays create electron-hole pairs within devices, which lead to cascades of energy and the breakup of the Cooper pairs responsible for the frictionless current in a superconductor.
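A back-of-the-envelope estimate (not a figure from the paper) shows why such events are so damaging. The superconducting gap in aluminium is only about $\Delta \approx 180\ \mu\mathrm{eV}$, so breaking a Cooper pair costs roughly $2\Delta \approx 0.36\ \mathrm{meV}$, and a single 100 keV energy deposit from a gamma ray could in principle create up to
\[
N \sim \frac{10^{5}\ \mathrm{eV}}{3.6 \times 10^{-4}\ \mathrm{eV}} \approx 3 \times 10^{8}
\]
quasiparticles from a single event – which is why even rare radiation hits matter for coherence.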
Earlier this year, physicists in Germany and Italy reported that environmental radioactivity can impair the performance of superconducting resonators. The group, headed by Laura Cardani of the National Institute of Nuclear Physics in Rome and Ioan Pop of the Karlsruhe Institute of Technology, showed that cosmic rays and radioactive impurities can significantly increase the density of broken Cooper pairs, known as quasiparticles, within devices above ground. Conversely, by using a radio-pure set-up within Italy’s Gran Sasso laboratory – located under 1400 m of rock – it was able to reduce the incidence of what are known as quasiparticle bursts by up to a factor of 50.
Ionizing effects
Now, William Oliver and colleagues at the Massachusetts Institute of Technology (MIT) and the Pacific Northwest National Laboratory have taken this research a step further by measuring and modelling the effect of ionizing radiation on superconducting qubits themselves. As they report in Nature, they did so using qubits made from aluminium mounted on a silicon substrate.
The team began by exposing two such qubits to a known source of ionizing radiation – a thin disc of copper-64 – and repeatedly measuring the rate at which the qubits decohere over the course of several days (the copper having a half-life of just over 12 h). The idea was to establish how readily quasiparticles are generated in the qubits for a given flux of radiation.
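Because the activity of the copper-64 source decays exponentially (its half-life is about 12.7 h, the “just over 12 h” mentioned above), the radiation flux seen by the qubits falls as
\[
A(t) = A_{0}\,2^{-t/T_{1/2}},
\]
dropping by a factor of roughly $2^{72/12.7} \approx 50$ over a three-day run. A single source therefore lets the team map qubit decoherence across a wide range of radiation intensities (the three-day figure here is illustrative).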
The researchers then combined this information with measurements of the radiation present in the MIT lab, both from cosmic rays and naturally occurring radioactive isotopes – in the latter case, mainly from the lab’s concrete walls. They calculate that the decohering effects of this radiation on the qubits would impose an upper limit to their coherence time of about 3–4 ms.
Lead bricks for shielding
To check this result with an independent experiment and establish how well such qubits might be shielded from ionizing radiation, the team surrounded seven such qubits (or rather the cryostat used to keep them cool) with 10 cm-thick lead bricks. This is the kind of shielding often used in neutrino and dark-matter experiments. By placing the shield on a scissor lift and periodically raising and lowering it, they were able to establish the effect of the external radiation, thereby confirming the coherence limit of about 4 ms. They also found that the shield increased the coherence time by around 20%.
Given the existence of stronger sources of decoherence, Oliver and colleagues say that this shielding only raised the qubits’ overall coherence time by about 0.2%. But they have no doubt that such noise-reduction measures will be needed if quantum computing is really to take off. “Reducing or mitigating the impact of ionizing radiation will be critical for realizing fault-tolerant superconducting quantum computers,” they write.
One option, at least in the medium term, would be to operate devices underground. Oliver says that this would be a “good direction to go for verification and research”. But he argues that for practical applications it would be better to design qubits that are less susceptible to quasiparticles. “That would allow us to keep superconducting quantum computers above ground,” he says.