This webinar will introduce a “fairly new” technology for micro- and nanofabrication and characterization: Focused Ion Beam (FIB) technology, a tool that has been commercially available since the early 2000s.
With a FIB it is possible to image and to modify materials at the micro- and nanoscale by milling and deposition. Despite its complex architecture, it is a relatively user-friendly machine, especially when it is included in a dual-beam system with an electron microscope.
Giuseppe Firpo is a physicist and currently a technologist and head of the technical department at Dipartimento di Fisica – Università degli Studi di Genova. He is an expert on vacuum science and technology and has published patents and several scientific papers in peer-reviewed journals on this subject. Since 2005, he has used FIB to fabricate nanostructures for biomedical sensing devices (Lab on a Chip) and for materials science applications. His latest research is on the permeability of ultra-thin membranes for gas separation technology.
Osteoporosis is a bone disease characterized by the loss of bone mass and increased bone porosity. Such weakened bones are far more likely to fracture, making regular monitoring and early diagnosis of bone diseases essential.
The gold standard for evaluating bone status is dual-energy X-ray absorptiometry (DXA), which measures bone mineral density. But DXA is not suitable for quantifying other key mechanical parameters. Instead, ultrasound imaging could play a crucial role, with techniques such as ultrasonic computed tomography (USCT) able to characterize bone microstructure and biomechanical properties, as well as being inexpensive and non-ionizing.
Researchers at Fudan University in China have proposed a reconstruction algorithm that enables quantitative bone imaging using USCT. Writing in Chinese Physics B, they demonstrate the performance of their proposed method using a series of increasingly complex bone models.
Iterative approach
The main challenge when imaging bone with ultrasound is that the speed of sound in bone differs significantly from that in the surrounding soft tissue, while common medical ultrasound techniques assume a uniform sound velocity. Such methods therefore cannot accurately image irregular bone–soft tissue boundaries without prior knowledge of the sound-speed distribution.
“Conventional ultrasonic B-mode imaging has inherent limitations in imaging biological hard tissue and bone using uniform sound velocity assumption,” explains researcher Dean Ta. “USCT provides a promising alternative. Particularly, USCT with full-waveform inversion shows potential for high-resolution bone imaging.”
Full-waveform inversion (FWI) is an image reconstruction algorithm originally developed for geophysics. In this study, Ta and colleagues employed frequency-domain FWI (FDFWI), an inverse process that reconstructs parametric images by minimizing the mismatch between measured and numerically simulated signals. When used with USCT, the FDFWI algorithm iteratively updates bone material parameters – sound velocity and mass density – in a simulated model until it reaches the best match.
The algorithm solves the inverse problem by calculating parameters at gradually increasing ultrasound frequencies, and then uses the final values to create quantitative bone images. “The FWI starts from a relatively low frequency to avoid ‘cycle skipping’ and ensure the iteration is not trapped in a local minimum,” Ta explains. “By gradually increasing to a high frequency, it can obtain a high-resolution inversion for bone structure.”
In theory, using a maximum frequency of 2.5 MHz, FDFWI can image pores and trabeculae in bone tissue with a spatial resolution of around 0.6 mm.
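The quoted resolution follows from the half-wavelength limit of full-waveform inversion. A quick back-of-envelope check, assuming a sound speed in cortical bone of about 3000 m/s (a typical literature value, not stated in the article):

```python
# Half-wavelength resolution limit of full-waveform inversion.
# Assumption: sound speed in cortical bone of roughly 3000 m/s.
def fwi_resolution(sound_speed_m_s, max_freq_hz):
    """Return the half-wavelength resolution limit in metres."""
    wavelength = sound_speed_m_s / max_freq_hz
    return wavelength / 2

res_mm = fwi_resolution(3000.0, 2.5e6) * 1e3
print(f"Resolution at 2.5 MHz: {res_mm:.1f} mm")  # ~0.6 mm, as quoted
```

This reproduces the article's figure of around 0.6 mm at a 2.5 MHz maximum frequency.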
Computational models
To demonstrate the validity of their approach, the researchers estimated parametric bone images via FDFWI using a ring-array ultrasound transducer. They first modelled a simple 2 mm-thick tubular bone phantom with known mass density, using FDFWI to estimate the sound velocity. During the simulations, the ultrasound frequency increased from 100 kHz to 3.5 MHz in 100 kHz steps.
As the frequency reached 1.5 MHz, both the outer and inner edges of the phantom could be observed. At 2.5 MHz, the image was clearer with few artefacts, showing that FDFWI can accurately restore the macroscopic morphology. Increasing the frequency to 3.5 MHz made little further improvement.
Next, the researchers modelled a distal fibula (lower leg bone). Here, they used the FDFWI algorithm to simultaneously estimate sound velocity and mass density, using ultrasound frequencies from 100 kHz to 3.5 MHz in 50 kHz steps. They note that using this smaller interval ensures enough low-frequency components to reconstruct mass density.
FDFWI results for a distal fibula for a velocity model with different maximum frequencies. (a) True velocity model. (b–e) Results obtained with final frequencies of 0.5, 1.5, 2.5 and 3.5 MHz, respectively. (Courtesy: Chinese Phys. B 10.1088/1674-1056/abc7aa)
As the frequency gradually increased to 1.5 MHz, both the outer and inner edges of the velocity map were accurately recovered and microstructures in the bone could be clearly seen. At 2.5 MHz, the velocity map became clearer and some more subtle features appeared, demonstrating that FDFWI can accurately recover geometry and microstructure and provide high-resolution bone images.
The third numerical model was a distal tibia–fibula pair derived from a high-resolution peripheral quantitative CT image. The FDFWI algorithm used ultrasound frequencies from 100 kHz to 2.5 MHz in 50 kHz steps. Even in this challenging scenario, FDFWI reconstructed both the macroscopic morphology and the microstructure with sub-millimetre resolution. Compared with the true CT image, the simulations clearly and accurately reproduced pores and trabeculae in the bone images.
The researchers note that reconstruction of the density map was not as good as that of the velocity map. The reconstruction errors of both sound velocity and mass density were larger than seen in the single fibula model, attributed to multiple scattering and diffraction effects between the two bones.
Finally, to investigate the robustness of FDFWI against noise, the team added random noise to the synthetic data generated in the tibia–fibula pair to create cases with signal-to-noise ratios of 30, 10 and 0 dB. They found that sound velocity maps could still be recovered well in the presence of noise. Even in the 0 dB case, the bone geometry and some relatively larger microstructure could be reconstructed.
The next step is to verify the effectiveness of the method using experimental rather than synthetic data. “We are trying to perform experimental studies on the topic of musculoskeletal system imaging with full-waveform inversion,” Ta tells Physics World.
If January turns out to be a template for the rest of 2021, the product development team at Oxford Instruments NanoScience is set for a busy year after registering the first industry and academic installations of Proteox, a next-generation dilution refrigerator designed for applications in quantum computing R&D and ultra-low-temperature condensed-matter physics. The customers: Oxford Quantum Circuits (OQC), a University of Oxford start-up that’s pioneering a “quantum computing as a service” (QCaaS) business model, and the University of Glasgow’s quantum circuits group, a multidisciplinary research team working at the frontiers of quantum science, technology and application.
In terms of the back story, Oxford Instruments NanoScience is a division of parent group Oxford Instruments, a diversified and long-established UK provider of specialist technologies and services to research and industry. The NanoScience business unit, for its part, designs and manufactures research tools to support the development, scale-up and commercialization of next-generation quantum technologies. Think cryogenic systems (operating at temperatures as low as 5 mK) and high-performance magnets that enable researchers to harness the exotic properties of quantum mechanics – entanglement, tunnelling, superposition and the like – to yield practical applications in quantum computing, quantum communications, quantum metrology and quantum imaging.
Flexible solutions for cold science
It’s with this quantum opportunity front-and-centre that the fundamentals of the Proteox dilution refrigerator have been reimagined to support multiple scientific users and a variety of ultra-low-temperature experiments from a single system operating in the mK regime. That scalability is achieved with a side-loading “secondary insert” module that allows samples, communications wiring and signal-conditioning components – basically full experimental set-ups – to be installed and changed whenever necessary.
“Proteox is the largest dilution refrigerator in its class with an extensive capacity for integrating components, experimental services and sample mounting,” explains Harriet van der Vliet, product segment manager for quantum technologies at Oxford Instruments NanoScience. “Modularity and flexibility are key,” she adds, “and we work closely with our customers to offer them tailored solutions and experimental set-ups on standard lead times.”
Supercool physics the Proteox dilution refrigerator allows scientific and industrial end users to add new functionality as their R&D requirements evolve. (Courtesy: OQC)
With adaptability comes future proofing – effectively a “pay-as-you-grow” offering that allows end users to add new functionality to Proteox as their research requirements evolve and their funding permits. “The customer can specify an entry-level system that’s just a base refrigerator – for example, no magnet and no fast sample exchange,” notes van der Vliet. “Over time, as new research grants or start-up investments are secured, it’s possible to upgrade your Proteox and purchase different secondary inserts, such as the rapid-sample-exchange bottom-loader, as well as taller frames and a variety of magnets. The freedom to upgrade the Proteox is a key design feature, bringing unparalleled value for money to the dry dilution-refrigerator market.”
Quantum collaboration
The development of Proteox looks to be well-timed, tapping as it does the growing technology push and commercial pull within the “quantum economy” – not least in the UK. Last year, for example, a research/industry consortium led by OQC, and including Oxford Instruments NanoScience, secured £7 million in funding from Innovate UK, the UK’s innovation agency, to fast-track the commercialization of superconducting quantum technologies.
Broadly, that upfront investment will support fabrication of superconducting quantum circuits and the scale-up of core enabling infrastructure such as specialist cryogenic equipment and state-of-the-art test electronics – all of which currently represent a significant barrier to entry for companies seeking to access emerging quantum markets and applications. The consortium is eyeing multiple revenue opportunities in the near term, including QCaaS, cryogenic measurement as a service (MaaS) as well as a contract foundry offering.
“Our strategy is to build the core, in terms of our quantum computer, and to partner with the best,” says Ilana Wisby, CEO of Oxford Quantum Circuits. (Courtesy: OQC)
“Our strategy at OQC is to build the core, in terms of our quantum computer, and to partner with the best,” says Ilana Wisby, CEO of OQC. As such, Oxford Instruments NanoScience represents a natural partner when it comes to the enabling cryogenic technologies for QCaaS. “Put simply,” adds Wisby, “the Proteox platform allows us to efficiently and reliably generate the ultra-low temperatures needed to operate our quantum computer.”
There’s a pleasing circularity to this tie-up. While Oxford Instruments was one of the first spin-out companies to emerge from the University of Oxford’s research programme (back in 1959), the established manufacturer is now supporting OQC and other start-ups and research groups in the UK’s nascent quantum supply chain. That sense of collective endeavour, it seems, also informs OQC’s mindset. “With our main focus on QCaaS,” notes Wisby, “we do think it’s important to play our part in developing a healthy quantum ecosystem for the UK. That will pay off for us in the long run through new collaborations and the opportunity to work with the brightest talents in the field.”
Cool customers
The original version of OQC’s quantum computer was developed using Triton, the previous generation of cryogen-free refrigeration technology from Oxford Instruments NanoScience. The move to Proteox, and the incorporation of the new refrigeration system into OQC’s state-of-the-art laboratory earlier this month, marks a significant milestone in the start-up’s commercial roll-out of its QCaaS and MaaS offering. “We’ve been able to collaborate closely with the engineering team at Oxford Instruments NanoScience to develop high-density wiring solutions that meet our specific requirements,” explains Wisby. “Ultimately that is going to help us to scale the number of qubits in a cost- and space-efficient way.”
Another feature of Proteox is ease of use. For starters, an all-new web-based control system combines remote connectivity with push-button automation routines, while enhanced data interrogation and visualization software offers live plotting of key process parameters – for example, the temperatures and pressures at relevant stages of the system cool-down. “The intuitive user interface ensures we spend our time building cutting-edge quantum computers rather than focusing on whether things get cold,” adds Wisby.
Gearing up for exotic physics at ultra-low temperatures
The University of Glasgow’s quantum circuits group announced in January that it is also using the Proteox dilution refrigerator to support its wide-ranging R&D effort in superconducting quantum technologies, including dedicated initiatives spanning superconducting spintronics, quantum-engineered nanoelectronic circuits and quantum information processing.
“We’re excited to be using Proteox, the latest in cryogen-free refrigeration technology, and to have the system up and running in our lab,” explains Martin Weides, head of Glasgow’s quantum circuits group. “Proteox is designed with quantum scale-up in mind, and through the use of its secondary insert technology, we’re able to easily characterize and develop integrated chips and components for quantum computing applications.”
The University of Glasgow, its subsidiary and commercialization partner, Kelvin Nanotechnology, and Oxford Instruments NanoScience are part of the OQC-led R&D consortium developing specialist foundry and measurement services to support the commercialization of superconducting quantum technologies (see main article). Other consortium partners include quantum computing pioneer SeeQC UK and the SuperFab nanofabrication facility at Royal Holloway, University of London.
For research customers like Glasgow, one of the main features of Proteox is its enhanced workflow efficiency – especially when, as is often the case, there might be several PhD students and postdocs all seeking access to the system at the same time. “The larger experimental space of Proteox, and the flexibility afforded by the secondary insert architecture, means the user can now transfer their wiring set-up in and out of that space easily and quickly,” explains Harriet van der Vliet, product segment manager for quantum technologies at Oxford Instruments NanoScience.
“Turnaround is a big issue for scientists working at ultra-low temperatures,” she continues. “Proteox makes life a lot easier and translates into significant enhancements in terms of workflow, throughput and, ultimately, research productivity.”
A method of optically selecting and sorting nanoparticles according to their quantum mechanical properties has been developed by researchers in Japan. The method could prove a crucial tool for makers of nanostructures that have applications in quantum sensing, biological imaging and quantum information technology.
Scientists have several ways of manipulating and positioning tiny objects without touching them. Optical tweezers, for example, use a highly focused laser beam to generate optical forces that hold and move objects in the beam’s trajectory. Such tweezers have become powerful tools in biological research, microfluidics and micromechanics. However, they can only manipulate relatively large objects, because diffraction restricts the spot size of the trapping laser beam to around half the wavelength of the illuminating light. For red light with a wavelength of 700 nm and a laser power of several mW, for example, tweezers can only stably trap and manipulate objects with a diameter of roughly 350 nm or greater. Trapping and manipulating smaller particles is challenging because the optical force weakens as the volume of the particle decreases.
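The numbers in that paragraph follow from the simple half-wavelength estimate of the trapping spot size. A minimal sketch of that estimate:

```python
# Diffraction-limited spot size for optical tweezers: the trapping
# laser cannot be focused much below half its wavelength, which sets
# the smallest object that can be stably trapped at modest power.
def diffraction_limited_spot_nm(wavelength_nm):
    return wavelength_nm / 2

# For 700 nm red light this gives the ~350 nm figure quoted in the text.
print(diffraction_limited_spot_nm(700))
```

The gradient force also scales with particle volume, which is why trapping particles much smaller than this limit becomes rapidly harder.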
Nanodiamonds with luminescent centres
In the new work, a team led by Hajime Ishihara of Osaka University and Keiji Sasaki at Hokkaido University developed a way of sorting nanodiamonds, tiny crystals of diamond whose optoelectronic properties derive both from the bulk material and from certain defects. One such defect, known as a nitrogen-vacancy (NV) centre, occurs when neighbouring carbon atoms in the diamond lattice are replaced by a nitrogen atom and a vacancy. These defects are a promising platform for quantum optics devices because they serve as luminescent centres – meaning that they absorb light at one specific, resonant, frequency while emitting it at another.
When illuminated by a laser, the nanodiamonds scatter light while their NV centres (if they have any) absorb it. The combination of light scattering and absorption transfers momentum from the photons to the nanoparticles, and the different momentum transfers experienced by nanodiamonds with and without NV centres could, in principle, be used to differentiate them. In practice, though, it isn’t straightforward.
“While these two effects [scattering and absorption] produce optical forces that can be used to move the particles on the macroscopic scale, it is difficult to select nanodiamonds that contain NV-centres from surrounding pristine nanodiamonds that don’t contain these defects,” Ishihara and Sasaki explain. “This is because the scattering optical force of bulk diamond is much stronger than the optical force coming from light absorbed by NVs.”
Restricting particle motion
The researchers’ solution was to balance out the larger scattering force so that they could distinguish the absorbing force due to the NV centres. To do this, they send two different-coloured laser beams propagating in opposite directions along a nanofibre. An intense evanescent light field forms around this fibre, which is several millimetres long with a diameter of a few hundred nanometres. This field, explain Ishihara and Sasaki, allows light to propagate for long distances while remaining as a tightly focused beam, thus restricting the motion of nanoparticles trapped within it to one dimension.
Within such a waveguide, the momentum of the photons is constant, making the setup ideal for analysing the optical forces exerted on the nanoparticles. By balancing the absorption and scattering forces induced by the two laser beams along the nanofibre, the researchers were able to transport single nanoparticles according to whether NV centres were present.
While the team focused on nanodiamonds in this work, Ishihara and Sasaki point out that other nanoparticles could be “equally interesting targets”. Indeed, they now plan to study ways of applying similar methods to nanomaterials such as organic-dye doped nanoparticles and different types of semiconducting quantum dot structures. “We are aiming to sort nanoparticles with a single quantum state (that is, with a single luminescent centre) and develop a technique to sort a huge number of particles for use in practical applications,” they tell Physics World.
Full details of the present research are reported in Science Advances.
It is a scatological mystery that has puzzled biologists for some time – how and why do wombats produce cubes of poo? Now David Hu and colleagues in the US and Australia say they have the answer. Writing in the journal Soft Matter (where else?), the team says that the cubic faeces form in the marsupial’s intestines, rather than when they are excreted as was previously thought.
The team discovered that the muscles that line wombat intestines do not have cylindrical symmetry, but rather create two stiff and two flexible regions. As material moves through the intestines, rhythmical muscle contractions sculpt the poo into cubes. The team says its discovery could “have applications in manufacturing, clinical pathology, and digestive health”.
That is the how, but what about the why? Some speculate that wombats communicate via the smell of their poo and the cubic shape stops faeces from rolling away.
Chill in the air
It might be summer in the land of the wombats, but here in the northern hemisphere there is a chill and perhaps even snow in the air. But if you can’t enjoy the delicate intricacies of snowflakes in real life, the physicist-turned-polymath Nathan Myhrvold has taken the highest-resolution photographs of snowflakes ever.
Myhrvold made the trek to the Canadian town of Timmins, Ontario to find the right humidity and temperature conditions to photograph snowflakes. He used a camera of his own design – with a built-in cooler and other features to avoid melting the snowflakes as they are photographed. This is important because he takes about 100 photos of each snowflake before combining them to create a high-resolution composite image.
You can read more about Myhrvold’s photos and camera in the Smithsonian Magazine.
Physics World columnist Bob Crease visited Myhrvold in 2017 to talk about a massive five-volume book on the science of bread and bread-making that Myhrvold has written. See: “The physics of bread”.
The decay lifetime associated with the emission of Auger electrons from atoms has been measured to sub-femtosecond precision using a technique called self-referencing attosecond streaking. Daniel Haynes of the Max Planck Institute for the Structure and Dynamics of Matter in Hamburg and an international team of scientists are the first to make such a measurement using an X-ray free electron laser (XFEL). They found that the Auger decay lifetime is significantly longer than that predicted by a semi-classical approximation, but in agreement with a fully quantum calculation. The team says that their new technique could soon be used to study the quantum entanglement of electrons in molecules.
Since its invention in the early 2000s, attosecond streaking has illuminated atomic and molecular dynamics on unimaginably short timescales. In conventional attosecond streaking spectroscopy, an initial laser pulse triggers the ionization of an atom or molecule before the oscillating electric field of a second, longer-wavelength laser pulse modulates the energy of the emitted electrons. Some electrons will gain a boost in energy from the modulation, while others will lose energy. This streaking is measured and can be used to work out properties of the emitted electrons.
The temporal stability of the pulses is crucial, according to team member Adrian Cavalieri of the Paul Scherrer Institute in Switzerland: “One of the big enablers for attosecond spectroscopy in the laboratory was the phase stability of the field…So you can very carefully change the delay between your laser field and your ionization event and use the ramp of the laser field to get whatever information you want.”
Timing jitter
XFELs are large, accelerator-based radiation sources that deliver high-energy, laser-like pulses of radiation. However, they suffer from an unavoidable problem called timing jitter – an inconsistency in the gaps between successive pulses.
“There is a physical element in that the X-ray [laser] pulses are produced by self-amplified spontaneous emission, which is slightly stochastic, so you can’t predict with 100% accuracy and infinite precision when the pulse will be produced,” explains Haynes.
This had made XFELs unsuitable for delivering ionization pulses for attosecond streaking and prevented researchers from studying many important high-energy atomic processes such as Auger decay with sub-femtosecond resolution.
Ejected electrons
The Auger process begins when high-energy radiation such as X-rays eject an electron (called a photoelectron) from an inner orbital in an atom. This creates a core-hole, which is immediately filled by an electron from an outer orbital. This leaves the atom with extra energy, which is removed by the emission of a second (Auger) electron. Studying this extremely common effect is key to understanding many important processes in atomic physics.
Characterizing the Auger process on the appropriate timescale requires ultrashort, intense X-ray pulses that are beyond the capability of small-scale X-ray lasers. XFELs can deliver appropriate femtosecond pulses, but their timing jitter is about 100 times longer than the timescale of Auger decay.
In their new work, the team found a clever way to get around this problem. They could not control the timing of the X-ray pulses, they reasoned, therefore they could not control the phase of the streak field at the time the photoelectron was emitted. They did know, however, that this phase would evolve in the time between the emission of the photoelectron and the emission of the Auger electron. The streak field would therefore imprint a different phase on the two electrons, and this would be reflected in their relative energies.
Clear elliptical pattern
The researchers made 80,000 measurements at the Linac Coherent Light Source, an XFEL at SLAC in California. They plotted the detected photoelectron energies against the Auger electron energies. This revealed a clear elliptical pattern, which the researchers used to reconstruct the phase delays – and from those the time delays – between the two electrons. Their results showed an Auger decay lifetime of 2.2 fs. This is longer than predicted by a semi-classical approximation, which treats the two emission processes separately. Instead, the measurement agrees with a fully quantum model of the correlated electron decay.
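The final step, turning a relative streaking-field phase into a time delay, is a rescaling by the field's oscillation frequency. A minimal sketch of that conversion (the 200 THz streaking frequency below is a hypothetical infrared example, not the value used in the experiment):

```python
import math

# Convert a phase delay imprinted by the streaking field into a time
# delay: delta_t = delta_phi / (2 * pi * f_streak).
def phase_to_time_delay(phase_rad, streak_freq_hz):
    return phase_rad / (2 * math.pi * streak_freq_hz)

# A phase delay of pi radians at a hypothetical 200 THz streaking
# field corresponds to a 2.5 fs time delay, i.e. the few-femtosecond
# regime of Auger decay.
dt = phase_to_time_delay(math.pi, 2e14)
print(f"{dt * 1e15:.1f} fs")
```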
David Villeneuve of the Joint Attosecond Science Laboratory in Canada is impressed by the research: “XFELs have very poor timing jitter, making accurate time measurements unfeasible. The work by Haynes [and colleagues] has solved this problem,” he says. “This self-referencing technique greatly expands the range of experiments that can be done at XFEL facilities. It requires that there are two electrons emitted so that their relative timing can be measured but opens up new possibilities for experiments that benefit from the huge light intensities available from XFEL sources.”
Reinhard Dörner of Goethe University in Frankfurt agrees: “This is wonderful work, which really pushes the limits of time resolution achievable with FELs. It also paves the way towards time-resolved views of entanglement of electrons in molecules for the future.”
Click beetles use the elastic recoil of a spring-like mechanism in their exoskeleton to throw themselves into the air. The mechanism of the insects’ motion was discovered using high-speed synchrotron X-ray imaging, and researchers in the multidisciplinary team responsible say their techniques could further our understanding of energy storage and release strategies in other animals that perform ultrafast movements.
Small creatures like fleas, mantis shrimp and trap-jaw ants produce some of the fastest movements in the animal kingdom. These extreme accelerations are powered not by muscles, but by complex energy storage and amplification structures. One such animal is the click beetle, Elater abruptus, which can jump more than 20 body lengths without using its legs. It does this using a latch to bend its body at a hinge between its thorax and abdomen. When the latch releases, the beetle unbends extremely quickly, makes a loud clicking sound and – if unconstrained – leaps into the air.
“There have been many speculations about why beetles evolved this manoeuvre, including to escape predators or scare them away, or to free themselves from tight places such as wooded areas or the soil,” Aimy Wissa, a mechanical engineer at the University of Illinois Urbana-Champaign, says. “However, the jump itself is almost vertical, making it a bad escape strategy.”
Latch, load, release
While the click beetle’s jump has been studied before, there has been little analysis of its clicking motion or energy storage and release mechanism. To record the rapid movement of this mechanism, Wissa and her colleagues took four beetle specimens to the Advanced Photon Source at Argonne National Laboratory. Using a high-speed synchrotron X-ray imaging system, they captured the beetle’s internal movements at rates of 30 000 frames per second.
The beetle’s latch is on the underside of its exoskeleton and is composed of a peg on the thorax and a lip on the abdomen. The high-speed X-ray recordings show that the click has three phases: latching, loading and energy release. To “latch”, the beetle moves its head and thorax away from the underside of its body, causing the peg to slide over and latch onto the lip.
During the loading phase, the beetle maintains this bend in its body while the soft cuticle behind the peg deforms and stores elastic energy, acting like a spring. After this contraction – which takes around 240 milliseconds – the peg slips, the latch unlocks, and the energy stored in the cuticle is rapidly released. As a result, the beetle’s thorax is flung in the opposite direction and the insect fires itself into the air.
“The keys for the extreme accelerations are both the latch and the spring,” Wissa tells Physics World. “The spring allows for storing elastic energy and releasing it quickly, while the latch prevents the spring from recoiling prematurely.”
Damage limitation
The X-ray images show that the peg reaches a maximum velocity of 1.8 metres per second during the energy release. This velocity corresponds to about 1000 peg lengths per second and a maximum acceleration of more than 500 times the acceleration due to Earth’s gravity.
The images also show that the deformation in the soft cuticle is released in less than 1 millisecond. This is much faster than the loading time, which supports the idea that click beetles can amplify mechanical power during the clicking motion and is characteristic of an elastic recoil.
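The quoted figures are mutually consistent, as a quick arithmetic check shows: the peg speed and the relative speed in peg lengths imply a peg roughly 1.8 mm long, and the ratio of loading time to release time bounds the power amplification.

```python
# Consistency check of the numbers quoted in the article.
peg_speed = 1.8            # m/s, maximum peg velocity
peg_lengths_per_s = 1000   # quoted relative speed
peg_length_mm = peg_speed / peg_lengths_per_s * 1e3
print(f"Implied peg length: {peg_length_mm:.1f} mm")

# Elastic energy is loaded over ~240 ms but released in under 1 ms,
# so peak output power exceeds the loading power by a factor of >240.
loading_time_ms, release_time_ms = 240, 1
print(f"Power amplification > {loading_time_ms // release_time_ms}x")
```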
Despite the power they generate, Wissa says that the beetles can click repeatedly without apparent injury. “We found out that energy is dissipated using a nonlinear damping force,” she says, adding that “the exact mechanism that prevents damage is an active area of research”.
Understanding the physical mechanisms such small animals use to achieve extreme accelerations could have biomimetic applications. “The most obvious might be insect-scale robotic systems,” Wissa says. “However, what we discovered in the paper can also help us design new actuation strategies for small mechanical systems where power might be limited.”
I was recently reading through an update on the current COVID-19 situation, with particular reference to the number of secondary infections, R, produced by a single infected person. It suddenly dawned on me that, in the past, I had often encountered numbers with exactly the same significance as R, but in other situations, with the potential for global disruption of the integrity and wellbeing of the planet. In essence, if the number exceeds 1.0 then the effect becomes divergent, or in more dramatic terms a chain reaction occurs; whereas for a level below 1.0 the effect converges towards zero. Being mindful of the Doomsday Clock with which most scientists, as well as readers of Physics World, are familiar, I have come to think of these numbers as “Doomsday numbers”.
In the case of the SARS-CoV-2 virus, with a Doomsday number R greater than 1.0 the infection will strike everybody, unless it is checked by the natural immunity of those recovering from infection, or by immunity induced by vaccine. Essentially, the infection will terminate when whole populations have gained immunity. It seems logical to expect that any form of intervention, such as temporary isolation or protective equipment, would result in alternating periods of spikes and recovery, very reminiscent of simple harmonic motion, and would only prolong the passage towards herd immunity. Sadly, as with all serious health issues, there will be some sections of a community with very poor outcomes.
In August 1939 Albert Einstein wrote to the US president Franklin D Roosevelt, alerting him to the real possibility of the development of an atomic weapon based on the fission of uranium and of its possible use by a potential enemy. Einstein quoted convincing research by Leo Szilard and Enrico Fermi to substantiate his warning. After consultation with his own group of scientists the president approved the production of an atomic bomb and in November 1942 work began at Los Alamos.
A week later, working independently with their nuclear reactor in Chicago, Enrico Fermi and his team achieved the first sustained nuclear chain reaction in the 235 isotope of uranium. During its work the team used its own Doomsday number, k, to assess the progress of the reaction. Fermi defined k as the average number of secondary neutrons produced by one original neutron in a lattice of infinite size. They were, of course, seeking a k value of more than 1.0.
The safe operation of the pile depended critically on the manual raising and lowering of neutron-absorbing control rods. This work subsequently led to the design and construction of the first atomic weapons, eventually dropped on two Japanese cities in 1945. The brilliant but rather naive Fermi later expressed the hope that, with the dramatic end to the war, emphasis would shift from weapons to peaceful applications of nuclear energy, such as power generation and uses in medicine. He hoped it would be “hailed for the advantages that it may bring to man, never being feared on account of its destructive possibilities”.
In view of its potential implications for global resources, it seems appropriate to ascribe a unique Doomsday number to global population, which we may call P. The number would reflect the global balance of births over deaths. Overpopulation of the planet has implications of crop shortage, collapse of ecosystem services, overfishing, scarcity of safe drinking water and deforestation, along with many others. Since the end of World War Two, the average family size has decreased steadily, from around 7 in 1950 to around 4.5 today. Assuming that most families contain two adults, these figures suggest that P remains doggedly above 1.0, with the current global population increasing towards 8 billion.
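The arithmetic behind that claim can be made explicit. If each family of 4.5 contains two adults, the remaining 2.5 children eventually replace their 2 parents, giving a rough replacement ratio of 1.25, still above 1.0. A back-of-the-envelope sketch (illustrative figures only, not a demographic model):

```python
def replacement_ratio(family_size: float, adults: float = 2.0) -> float:
    """Children per family divided by the two parents they eventually replace."""
    children = family_size - adults
    return children / adults

print(replacement_ratio(7.0))  # 1950: ratio of 2.5
print(replacement_ratio(4.5))  # today: ratio of 1.25, still above 1.0
```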
It is tempting to suggest that other global threats such as pollution and climate change have their own Doomsday numbers associated with them, which we may call Pn and C respectively. If I were a little more adventurous and a lot more imaginative, I might propose further that all Doomsday numbers are somehow connected. Indeed, I might even attempt to formulate a relationship between R, C and P, or between P, R and Pn.
Many scientists have voiced concerns over the rapid pace at which technology has advanced, and many view the recent surge as a multi-pronged threat to humankind and the environment. In particular, artificial intelligence (AI), which seems to be spreading its influence into ever more aspects of human activity, causes anxiety for many. As long ago as 1951, the brilliant mathematician and computer scientist Alan Turing shared his concerns that future super-intelligent, possibly hostile, AI might pose a threat to humans. Much later, Stephen Hawking worried that AI “could spell the end of the human race”.
The derivation of the Doomsday numbers of every threat would be a challenging, if not impossible, task but in return they would offer guidance towards recovery. In the meantime, perhaps we should be mindful that in January 2020 the Doomsday Clock was set at 100 seconds to midnight, the closest it has been since its inception in 1947. I for one feel terrifyingly close to those final deafening chimes – or should that be final Big Bang?
In recent years, DNA nanotechnology has matured to enable robust production of complex nanostructures and hybrid materials. We have combined DNA nanotechnology with sensitive optical detection to create functional single-molecule devices that enable new applications in single-molecule biophysics and biosensing.
To improve the sensitivity of DNA detection assays, we created DNA origami optical antennas for metal-enhanced fluorescence. We first discuss the influence of the metal nanostructure on the photophysical properties of dyes. We then show that single molecules can even be detected on a battery-driven, portable smartphone microscope using DNA origami nanoantennas.
Further applications are also presented, including self-healing nanostructures as brightness references, a molecular force clamp and single-molecule assays on graphene.
Prof. Dr Philip Tinnefeld obtained his PhD in 2002 in the field of single-molecule spectroscopy. After postdocs in Los Angeles and Leuven, he completed his habilitation in physics at Bielefeld University in 2006. He became associate professor of biophysics at the Ludwig-Maximilians-Universität München (LMU) and then full professor of biophysical chemistry at the Technische Universität Braunschweig. He is now full professor of physical chemistry at LMU and head of the NanoBioScience group. He has authored more than 170 research items and filed nine patent applications.
His key publications include the first demonstration of the switchability of fluorescent dyes such as Cy5 and the discovery of the reducing and oxidizing system (ROXS). He was also involved in the development of the super-resolution techniques dSTORM, Blink Microscopy and DNA PAINT.
In recent innovations, he used DNA nanotechnology for self-assembled molecular devices including calibration nanorulers, nano-adapters, fluorescence enhancers and molecular force clamps. He has been initiator and mentor of GATTAquant, the first company commercializing DNA origami applications.
For the second year running the Doomsday Clock is at 100 seconds to midnight – its latest ever time. The decision was announced yesterday by the Bulletin of the Atomic Scientists, the publication founded in 1945 by physicists concerned by the existential threat of nuclear weapons.
“The pandemic revealed just how unprepared and unwilling countries and the international system are to handle global emergencies properly,” says Bulletin editor John Mecklin. “As a result, many hundreds of thousands of human beings died needlessly.”
Created by the Bulletin in 1947, the clock provides a metaphor for how close we are to a humanity-ending catastrophe. Having begun at seven minutes to midnight (mainly for aesthetic reasons), the clock has ticked further from or closer to midnight depending on prevailing nuclear concerns.
Each year, the clock is set by the Bulletin’s science and security board in consultation with its board of sponsors, which includes 13 Nobel laureates. Since the mid 2000s, the clock-setters have also been factoring in the risk of climate change, and more recently the threats of disruptive technologies such as artificial intelligence.
History was made in January 2020, when the clock was set to 100 seconds to midnight, the closest it had ever been to the doomsday scenario. That decision was based on the continuing existence of nuclear arsenals coupled with the lapse of several major arms-control treaties, and the US’s decision to quit the Iran nuclear deal.
Since then, the COVID-19 outbreak has evolved into a global pandemic, so many had expected the clock to tick even closer to midnight this year.
But in the eyes of Bulletin scientists, the COVID-19 pandemic is a tragedy but not an existential threat in the same realm as nuclear war and climate change. They also cite some recent positive developments, such as the US’s intention to rejoin the Paris Agreement on climate change and to extend the New START arms-control agreement with Russia.
Overall though, there is no attempt to sugarcoat the bleak global situation.
“The lethal and fear-inspiring COVID-19 pandemic serves as a historic wake-up call, a vivid illustration that national governments and international organizations are unprepared to manage the truly civilization-ending threats of nuclear weapons and climate change,” says Rachel Bronson, president and CEO of the Bulletin.
The Bulletin statement concludes by providing 16 practical steps for world leaders to initiate in 2021. Alongside the nuclear warnings, there are pleas to ramp up decarbonization, boost biological research and ethically combat Internet-enabled misinformation.