In a twist on a classic physics experiment, researchers in the Netherlands and France have worked out why room-temperature alcohol droplets will levitate and propel themselves across a pool of liquid nitrogen for long periods of time. The team, led by Anaïs Gauthier at the University of Twente, have studied the propulsion associated with the “inverse Leidenfrost effect” and their work could lead to more efficient ways to transport small amounts of biological material.
The Leidenfrost effect arises when a liquid droplet is deposited onto a surface hotter than its boiling point, causing the bottom of the droplet to evaporate continuously. This creates a repulsive cushion of vapour, which both prevents the droplet from quickly boiling away and causes it to hover above the hot surface. There is virtually no friction between droplet and surface, and changes in surface texture can cause the droplet to accelerate, climb small hills and even negotiate a maze.
Formally identified in 1756 by the German scientist Johann Leidenfrost, the effect has probably entertained people for millennia and has become a popular classroom demonstration.
Hot objects
The inverse Leidenfrost effect was first described in 1969 and involves a hot object such as a droplet levitating above a cold liquid. In this case, heat from the droplet causes some of the cold liquid to evaporate, creating the repulsive cushion of vapour.
Gauthier’s team observed the effect by depositing a room-temperature droplet of alcohol on top of a pool of liquid nitrogen at −196 °C. Within just a few seconds of deposition, the droplets propelled themselves from rest to speeds of several centimetres per second, gliding in straight lines before ricocheting off the container walls (see video). This continued for tens of minutes before the droplets slowed down as they cooled to the temperature of the nitrogen bath. Gauthier and colleagues propose that this self-propelling behaviour arises from subtle symmetry-breaking in the vapour film, which causes vapour to flow out from under the droplet asymmetrically and drag it along.
Based on their observations, the physicists constructed simulations of the inverse Leidenfrost effect. By modelling variations in the thickness of the vapour film and the cooling dynamics of the drops, they could accurately reproduce the observed droplet velocities.
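As a cartoon of the kind of model involved (not the team's actual simulation, and with purely hypothetical parameter values), one can let the droplet cool exponentially towards the bath temperature while a propulsive force proportional to the remaining temperature difference competes with a drag proportional to velocity:

```python
import numpy as np

# Toy model (illustrative only): the droplet cools towards the nitrogen bath,
# propulsion scales with the temperature difference, drag with velocity.
T_bath = 77.0        # liquid-nitrogen temperature (K)
T = 293.0            # initial droplet temperature (K)
tau_cool = 300.0     # hypothetical cooling time constant (s)
k_prop = 1e-5        # hypothetical propulsion coefficient (m s^-2 K^-1)
gamma = 0.05         # hypothetical drag coefficient (s^-1)

v, dt = 0.0, 0.1     # initial velocity (m/s) and time step (s)
for step in range(int(1800 / dt) + 1):      # simulate 30 minutes
    if step % 3000 == 0:                    # report every 5 minutes
        print(f"t = {step*dt:6.0f} s   T = {T:5.1f} K   v = {100*v:4.2f} cm/s")
    T += -(T - T_bath) / tau_cool * dt      # exponential cooling
    v += (k_prop * (T - T_bath) - gamma * v) * dt
```

With these made-up numbers the toy droplet accelerates to a few centimetres per second and then slowly loses speed as it approaches the bath temperature, qualitatively mimicking the behaviour described above.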
Gauthier’s team believe the effect could be used to develop efficient techniques for freezing and transporting biological materials, including cells and proteins. With the help of their simulations, they hope to show that such transport could take place with no risk of contamination or heat degradation to the materials.
While proton therapy is becoming a standard treatment option in radiation oncology – there are currently 92 operational proton facilities worldwide and a further 45 under construction – many challenges remain in terms of the fundamental physics, radiobiology and clinical use of protons for the treatment of cancer. Those challenges, and plenty more besides, were front-and-centre at the UK’s Fifth Annual Proton Therapy Physics Workshop held early in February at the National Physical Laboratory (NPL) in Teddington.
The timing of this year’s event was apposite. In 2018, the UK’s National Health Service (NHS) opened its first high-energy proton therapy centre at The Christie Hospital in Manchester, with a second facility at University College London Hospital (UCLH) scheduled to come online for patient treatment in 2020. Proton Partners International, a private health provider, is also rolling out a network of four proton-therapy facilities across the UK, with the first of its Rutherford Cancer Centres now treating patients in south Wales.
“PPRIG was set up by NPL in 2012 to progress UK deployment of high-energy proton therapy,” explained Russell Thomas, senior research and clinical scientist at NPL and chair of PPRIG. “We aim to help coordinate research activities, encourage multicentre collaboration and minimize duplication of effort. Our annual proton-therapy physics workshop is a logical extension of PPRIG’s remit, bringing the UK proton-physics community together with leading researchers from overseas.”
It’s good to talk: delegates at the annual conference on proton therapy (Courtesy: NPL)
Another objective of PPRIG is to promote the work of early-career researchers, helping them to build networks, collaborate on specific research problems, and support their grant applications. “The workshop fosters open, robust, but always good-humoured debate around the hot topics in proton therapy,” Thomas added.
Ana Lourenço, a postdoctoral research scientist at UCL and NPL, agrees that the PPRIG meeting provides a welcoming platform for younger scientists. “The PPRIG workshop was a great opportunity for early-career scientists to present their work and receive feedback from world-leading medical physicists and clinical scientists,” she said. “The reduced registration fee for students allowed many to participate, with plenty of time in the programme dedicated to more open, informal discussion to facilitate the interaction between students and senior researchers.”
Standardize and verify
Given NPL’s role as the UK’s national measurement institute, much of the discussion at last week’s meeting eddied around issues of dosimetry and quality assurance (QA). In other words, how to maximize clinical outcomes by ensuring that patients receive standardized and rigorously audited proton therapy – irrespective of where they’re being treated.
With those outcomes in mind, Thomas reported on NPL’s work to develop a code of practice for proton-beam dosimetry in collaboration with the UK Institute of Physics and Engineering in Medicine (IPEM). Underpinning that code is absolute dosimetry for calibration of the proton beam, which NPL is striving to improve by developing a primary standard for protons based on a portable graphite calorimeter, in which the temperature rise produced by a typical patient dose is measured to quantify the absorbed dose.
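A back-of-the-envelope calculation (basic physics, not NPL's procedure) shows just how small that temperature rise is: absorbed dose is energy per unit mass, so a dose D deposited in graphite with specific heat capacity c produces a temperature rise of roughly D/c.

```python
# Back-of-the-envelope estimate of the temperature rise in a graphite calorimeter.
# Assumed values: a typical fraction dose of 2 Gy and a specific heat capacity
# of roughly 710 J/(kg K) for graphite at room temperature.
dose = 2.0              # absorbed dose in gray (J/kg)
c_graphite = 710.0      # specific heat capacity of graphite, J/(kg K)

delta_T = dose / c_graphite
print(f"Temperature rise: {delta_T*1e3:.1f} mK")   # about 2.8 mK
```

A rise of a few millikelvin per fraction is why calorimetric primary standards demand such careful thermal design.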
Previously, the only option for proton-beam reference dosimetry was a 60Co-based calibration, for which the uncertainty is often quoted at 4.6% – though anecdotally it may be somewhat higher. With patient numbers for proton therapy increasing, it is desirable to bring this uncertainty down to the level achieved in the reference dosimetry of conventional photon radiotherapy, which is closer to 2%.
Thomas says that the NPL calorimeter has been transported to proton-therapy centres in Liverpool (Clatterbridge), Manchester (The Christie), Newport (Rutherford Cancer Centre), Sicily, Prague and Japan, where it has been operated successfully in clinical settings.
“We need to improve the uncertainty of the dose delivered to patients to ensure the best possible consistency across the patient population and to fully understand and interpret the patient outcomes,” he explained. “By bringing the uncertainty on reference dosimetry down to a similar level as that currently achievable in conventional photon radiotherapy, the primary standard will aid in the comparison of the results from high-quality, multicentre clinical trials featuring different treatment techniques.”
Stuart Green, director of medical physics at University Hospital Birmingham, told Physics World that the new IPEM code of practice for proton dosimetry relies heavily on calorimetry developments at NPL over the past 15 years. He says the new code will be ready for publication later this year and hopes that UK centres will transfer to the new approach soon after.
“What’s more,” Green added, “the NHS has a significant opportunity with the opening of the two new proton-therapy centres [at Christie and UCLH] to initiate definitive clinical trials. I am sure the rest of the world will watch with interest to see how well these trials are rolled out.”
Other speakers developed the QA theme in terms of reference and in-clinic proton dosimetry. NPL’s Lourenço, for example, reported findings from a team of UK and Danish scientists who compared the response of user ionization chambers at three clinical facilities against NPL reference ionization chambers.
Their study, which involved a low-energy passively scattered proton beam and two high-energy pencil-beam-scanning proton beams, showed “good agreement between the results acquired by NPL and the proton facilities”. Lourenço added that “reference dosimetry audits such as this are important to improve accuracy in radiotherapy treatments, both within and between treatment facilities, and to establish consistent standards that underpin the development of clinical trials.”
In the same session, Antonio Carlino of the MedAustron Ion Therapy Center, Austria, detailed a new approach for end-to-end auditing of the treatment workflow based on customized anthropomorphic phantoms featuring different types of detector. During the dosimetry audit, which was carried out at HollandPTC in Delft, the phantoms followed the patient pathway to simulate the entire clinical procedure, mimicking the human body as closely as possible in terms of material properties and movement. Carlino told delegates that human-mimicking phantoms deployed in end-to-end audits of this type “may serve as dosimetric credentialing for clinical trials in the future”.
Image and adapt
Standardization and QA notwithstanding, there was plenty of focus on new concepts and emerging technologies for the proton-therapy clinic, with imaging at the point of treatment delivery and online adaptive proton therapy very much to the fore.
“It is one thing to deliver the sophisticated treatment dose volumes that proton therapy is capable of, but it is another to be confident that the dose is being delivered to the right place,” explained Thomas. “During a course of treatment lasting up to six weeks, with daily fractions, the anatomy of a patient may change dramatically as a result of weight loss and/or tumour shrinkage. Imaging can ensure the dose is still being delivered correctly, while informing dynamic refinement of the treatment plan over the course of the treatment.”
Central to the success of adaptive proton therapy is a technique called deformable image registration (DIR), the use of powerful image-processing tools that maximize spatial correspondence between multiple sets of images (e.g. CT scans) collected over an extended treatment timeframe, and even across multiple imaging modalities.
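As a toy illustration of the underlying idea (not the clinical DIR software itself), the sketch below warps a synthetic "moving" image with a displacement field so that it lines up with a "fixed" image. In practice the displacement field is exactly what the registration algorithm has to find by optimization; here it is simply chosen by construction to undo a known shift.

```python
import numpy as np
from scipy.ndimage import map_coordinates

# Toy illustration of deformable image registration concepts (not a clinical tool).
ny, nx = 128, 128
yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")

# Synthetic "anatomy": a bright disc on a dark background.
fixed = ((yy - 64)**2 + (xx - 64)**2 < 30**2).astype(float)
# "Moving" image: the same disc shifted a few pixels (the anatomy has changed).
moving = ((yy - 70)**2 + (xx - 60)**2 < 30**2).astype(float)

# A displacement field that, by construction, undoes the shift.
dy = np.full((ny, nx), 6.0)
dx = np.full((ny, nx), -4.0)

# Resample the moving image at the displaced coordinates.
warped = map_coordinates(moving, [yy + dy, xx + dx], order=1)

print("Mean-squared error before registration:", np.mean((fixed - moving)**2))
print("Mean-squared error after warping:      ", np.mean((fixed - warped)**2))
```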
However, according to Jamie McClelland from UCL’s Centre for Medical Image Computing, a number of open questions remain around online deployment of DIR in the proton-therapy workflow: “What exactly do we want DIR to do? How do we know it’s doing it correctly? And how do the errors and uncertainties in DIR impact clinical applications?”
More broadly, what of the longer-term development roadmap for proton therapy? Harald Paganetti, director of physics research at Massachusetts General Hospital (MGH) and professor of radiation oncology at Harvard Medical School in the US, reckons proton therapy is currently at what he calls the “Adaptive 1.0” stage, with CT scans performed daily but treatment plans revised and adapted offline.
MGH’s evolution to “Adaptive 2.0” will see that adaptation taking place online in the proton treatment workflow. Key enablers of the MGH approach include cone-beam CT-based imaging and online measurement of the prompt-gamma emissions from delivery of a “partial dose” to the centre of the target (enabling an initial assessment of range accuracy). This is then followed by a rapid adaptation before the remainder of the dose is delivered for a given fraction.
For protons, it seems, the future is bright, with no shortage of opportunities for progress across core physics, emerging technologies and clinical applications. “Pencil-beam-scanning proton-beam therapy is still in its infancy,” noted Tony Lomax, chief medical physicist at the Paul Scherrer Institute in Villigen, Switzerland. “Many developments will take place in the next few years, [enabling] dose rates to increase and treatment times to reduce.”
An efficient new method to find out whether a material hosts topological states or not could help increase the number of known topological materials from a few hundred to thousands. The technique is very different to conventional target-oriented searches and uses algorithms to sort materials automatically according to their chemical properties and properties related to symmetries in their structure.
Xiangang Wan. Courtesy: Nanjing University
Topological materials – exotic materials whose surface properties are very different from those of their bulk – have created a flurry of interest in recent years and are revolutionizing modern condensed matter physics thanks to the unique properties that stem from their topology. Topological phases of matter are so called because they are mathematically described by global invariants that are unaffected by imperfections, such as defects or other variations, in a material.
An example of a topological material is the topological insulator (whose two-dimensional version is known as a quantum spin Hall insulator). These are materials that are electrical insulators in the bulk but can conduct electricity extremely well along their edges via special topologically protected electronic states. Electrons can only travel in one direction along these states and do not backscatter. This means that they can carry electrical current with near-zero dissipation of energy and so could be used to make energy-efficient electronic devices in the future.
Topological insulators were discovered over 10 years ago and we now know of a few hundred materials that belong to this category. Only a dozen or so of these appear to be suitable for real-world applications, however.
Symmetry indicators theory
To predict whether a material can host topological states, researchers mainly rely on complex theoretical calculations. To speed up the search, two teams – one at Princeton University in New Jersey and the other at Harvard University in Cambridge, Massachusetts – recently put forward a new approach based on the recently established theory of symmetry indicators. In these studies, the physicists use algorithms to sort materials automatically according to their chemical properties and properties that come from symmetries in their crystal structure. These symmetries constrain how electrons move through the crystal lattice and can be used to predict how they will behave – and therefore whether a material can host topological states or not.
Researchers at Nanjing University in China led by Xiangang Wan, together with the Harvard team led by Ashvin Vishwanath, have now shown that the computation of symmetry indicators for any crystalline symmetry setting can readily be integrated into standard first-principles calculations. “In stark contrast to conventional target-oriented searches, our technique does not presuppose any specific phase of matter but instead automatically identifies all nontrivial electronic band structures and then classifies them as either topological insulators, topological semi-metals or topological crystalline insulators,” explains Wan.
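The sketch below is purely schematic: it mimics only the final sorting step, using hypothetical material names and indicator values. In the real method those labels come out of first-principles calculations of band-structure symmetry eigenvalues, which is where the physics lives.

```python
# Schematic of the sorting step only (hypothetical data, not the real symmetry-
# indicator calculation, which derives these labels from first-principles
# band-structure symmetry eigenvalues).
materials = {
    "material_A": {"space_group": 11, "indicator": 2},
    "material_B": {"space_group": 12, "indicator": 0},
    "material_C": {"space_group": 2,  "indicator": None},  # bands forced to cross
}

def classify(entry):
    """Toy classification into trivial and topological classes."""
    if entry["indicator"] is None:
        return "topological semimetal"      # no consistent band insulator exists
    if entry["indicator"] == 0:
        return "trivial (or undiagnosed) insulator"
    return "topological (crystalline) insulator"

for name, entry in materials.items():
    print(f"{name} (space group {entry['space_group']}): {classify(entry)}")
```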
To show how powerful their algorithm is, the researchers began by analysing crystalline materials in eight different space groups (which describe the symmetry of a crystal) and say they have unearthed hundreds of materials capable of hosting topological phases.
“Two topological crystalline insulators in particular are worth mentioning,” says Wan. “The β-MoTe2 (space group 11) with screw-protected hinge states, and the BiBr (space group 12) with a rotation anomaly. The existence of the novel topological feature in β-MoTe2 was verified in a later study (arXiv:1806.11116).”
Fast technique
“In traditional topological-material-discovery algorithms, we have to first pre-assume a specific topological phase and then calculate the corresponding topological invariant(s), a process that is usually very time-consuming,” says Wan. “Our new technique is very fast and we have used it to comprehensively search for topological materials in 230 space groups in all (arXiv:1807.09744),” he tells Physics World.
The researchers, reporting their work in Nature Physics 10.1038/s41567-019-0418-7, say that many of the topological materials they have discovered using their approach could be promising for making next-generation electronic devices. Indeed, their work has already attracted much attention.
The approach is not just limited to topological materials either and could be extended to other 2D and magnetic materials, says Wan. “Our present work focuses on electronic band topology, but the symmetry indicators of phonons, photons and magnons could also be built and used to search for topologically non-trivial phases in these systems.”
The European Physical Society (EPS) has warned that a major open-access initiative in Europe could cause “irrecoverable damage” if it is implemented too quickly. In a statement, the EPS says that while it welcomes the proposal – known as Plan S – as a “medium to long-term vision”, its proponents must get more support by engaging further with the scientific community.
Plan S is an ambitious attempt to make research papers open access immediately after they are published. It was unveiled in September 2018 by 11 national research funding organizations – dubbed cOAlition S – that include UK Research and Innovation and the French National Research Agency. The group says that all scientific publications resulting from research funded by public grants provided by “national and European research councils and funding bodies” must be published in “compliant” open-access journals or on open-access platforms from 1 January 2020.
If implemented, the agreement means that authors funded by these agencies would not be allowed to publish in so-called “hybrid” journals, with funders being able to sanction researchers who are not compliant with the rules. Hybrid journals are publications that remain subscription based but give authors the choice to make their papers open access for a fee, known as an article-processing charge.
Reaching a tipping point
While the EPS states that it supports open science and that the physics community has often pioneered its implementation, it argues that several governing principles for Plan S are not “conducive” to a transition to open access. Notably, the EPS says that a forced transition in such a short period of time could “undermine the economic viability of many journals”, which would cause “irrecoverable damage to established, well-functioning networks of editors and referees”. The society adds that publication in open-access repositories can only “complement, not replace” publication in peer-reviewed outlets.
The EPS also warns that non-European authors may not have access to the same level of open-access funding as in Europe and that such a plan can “only succeed” when it is coordinated globally. A solely Euro-centric implementation, the society says, risks “accentuat[ing] knowledge divides, both inside Europe and between north and south”. In addition, the EPS is concerned that Plan S limits researchers’ freedom to choose where to publish, which could be a problem as academic recruitment and career advancement are still based on publication metrics and journal prestige.
Some of the EPS’s views are also shared by the Institute of Physics (IOP), which publishes Physics World. In a statement released today, the IOP calls on cOAlition S to extend the timeline for transition to fully open access and to continue to support the hybrid model until a “natural tipping point” has been reached. “As more funders and countries support open access, so a larger proportion of articles in IOPP’s journals will be published on an open access basis and they will be in a position to convert,” the IOP says. The statement also underlines the importance of income from publishing for learned societies, such as the IOP.
Springer Nature – the largest publisher of open-access papers – also says cOAlition S should rethink its opposition to hybrid journals. In a statement released earlier this week, it advises the group to carry out research to demonstrate the benefits of open access so that more funders and scientists support it. The company also calls for “highly selective journals and those with significant levels of non-primary research content” to be treated differently.
The statements from the publishers follow an open letter published in November 2018 by over 1700 scientists that called Plan S a “serious violation of academic freedom”. They noted that while Plan S was written with good intentions, the ban on hybrid journals would cause a “big problem” and that the initiative also threatens to split the global scientific community into two separate systems.
A couple of weeks ago the small-satellite company Capella was alerted that another craft was on a high-speed collision course with its pathfinder satellite, Denali. Manoeuvre commands were urgently sent to the satellite and the crisis was averted, although the two objects still passed very close to each other.
Space may be great and vast but it’s getting crowded closer to Earth. According to the European Space Agency (ESA), more than 29,000 large pieces of debris are orbiting our planet – and many more smaller bits besides. And as smaller satellites, such as the one that flew past Denali, often don’t have propulsion systems to help them avoid impacts, the risk of collisions – and of the debris count growing further – is increasing.
Many have gazed upon Vincent van Gogh’s The Starry Night in wonder. But if you’re a scientist, those famous spirals – all of varying sizes – in the painted sky might remind you instead of turbulent flow. After all, Andrei Kolmogorov’s 1941 description of subsonic turbulence does rely on vortices of different length scales.
Researchers in Australia have now published a paper weighing in on whether van Gogh’s famous artwork does indeed depict realistic turbulence.
After calculating a 2D power spectrum on a square region of painted sky, they concluded that the sky of The Starry Night is actually far more reminiscent of the supersonic turbulent flow inside molecular gas clouds. Such conditions are known to be the birthplace of stars in the universe, a coincidence that’s oddly fitting.
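For readers curious about the mechanics, here is a minimal sketch of that kind of analysis (assuming a square greyscale patch of the image loaded as a 2D NumPy array; the random test patch below is just a stand-in). The radially averaged power spectrum yields a spectral slope that can then be compared with the scalings expected for subsonic turbulence, such as Kolmogorov's k^(-5/3) energy spectrum, and for supersonic turbulence.

```python
import numpy as np

def radial_power_spectrum(patch):
    """Radially averaged 2D power spectrum of a square greyscale image patch."""
    patch = patch - patch.mean()
    power = np.abs(np.fft.fftshift(np.fft.fft2(patch)))**2
    n = patch.shape[0]
    ky, kx = np.meshgrid(np.arange(n) - n // 2, np.arange(n) - n // 2, indexing="ij")
    k = np.hypot(ky, kx).astype(int)
    # Average the power over annuli of constant wavenumber magnitude.
    counts = np.bincount(k.ravel())
    sums = np.bincount(k.ravel(), weights=power.ravel())
    spectrum = sums[: n // 2] / counts[: n // 2]
    return np.arange(n // 2), spectrum

# Example with a random test patch; in practice this would be a square region
# of the painted sky converted to brightness values.
patch = np.random.rand(256, 256)
k, P = radial_power_spectrum(patch)
slope = np.polyfit(np.log(k[2:100]), np.log(P[2:100]), 1)[0]
print(f"Fitted spectral slope: {slope:.2f}")
```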
Now, as it’s the weekend, you might be looking forward to a cheeky glass of limoncello, the liqueur originating from southern Italy. But did you know that while water-repellent industrial chemicals usually require surfactants to mix with water, in limoncello the alcohol effortlessly keeps the citrus oil and water together?
Scientists from the Institut Laue Langevin have now examined the microscopic composition of the liqueur. They uncovered that limoncello is composed of tiny oil droplets in a water-alcohol mix. And if we can work out how the mixture forms, it could lead to many applications for essential oils in specialty chemicals as well as environmentally friendly plastics and insect repellents.
You can find out more about the research in ACS Omega.
Measuring the vital signs of small conscious animals is important for assessing their health and behaviour. Current monitoring techniques, such as electrocardiograms (ECG), ultrasound and auscultation (listening to heart and breath sounds), however, rely on close skin contact with the animal and other invasive approaches. Such methods can cause discomfort or stress to animals and may even require anaesthesia, particularly for birds, reptiles and fish.
To address this problem, Xiaonan Hui and Edwin Kan from Cornell University are developing a less invasive way to monitor the health of small animals, based on radio-frequency (RF) near-field coherent sensing (NCS). They note that this approach has minimal impact on the daily rhythms of the animal, with most unlikely to even notice the ongoing measurements (Science Advances 10.1126/sciadv.aau0169).
Hui and Kan investigated two NCS setups. The first involves a wireless system that uses a harmonic RF identification (RFID) architecture with inexpensive passive sensing tags. In this setup, a harmonic reader transmits the downlink signal at a frequency, f, through the reader transmit antenna. This signal powers up a passive harmonic RFID tag placed in the vicinity of the animal and is converted to a second-harmonic frequency at 2f (the NCS sensing signal).
As long as the animal is within the near-field range of the sensing tag antenna, motion on and inside its body is coupled to the backscattered signal and sent to the harmonic reader’s receive antenna. This approach, deployable without need of maintenance, is suitable for use in the animals’ natural habitats with weather-proof RFID tags and the reader placed nearby.
The second approach replaces the wireless links between the reader and the harmonic tag with RF cables. Here, the harmonic reader transmits the NCS sensing signal directly at 2f and is placed in the near-field range of the animal under test. The NCS signal modulated by the animal’s vital signs is received by the reader antenna. This design reduces interference and is appropriate for deployment in an indoor laboratory.
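The sketch below is a toy baseband illustration of the general principle, not the authors' signal chain: a carrier standing in for the 2f sensing tone is weakly amplitude-modulated by hypothetical respiration and heartbeat signals, and a simple envelope detector recovers the vital-sign waveform.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

# Toy illustration: the backscattered 2f signal is modelled as a carrier whose
# amplitude is weakly modulated by respiration and heartbeat; an envelope
# detector then recovers the vital-sign waveform. All values are hypothetical.
fs = 10_000                      # sample rate (Hz)
t = np.arange(0, 10, 1 / fs)     # 10 s record
f_carrier = 1_000                # stand-in for the 2f sensing tone (Hz)
resp = 0.05 * np.sin(2 * np.pi * 0.8 * t)    # ~48 breaths per minute
heart = 0.02 * np.sin(2 * np.pi * 5.0 * t)   # ~300 beats per minute (small rodent)

backscatter = (1 + resp + heart) * np.cos(2 * np.pi * f_carrier * t)
backscatter += 0.01 * np.random.randn(t.size)     # receiver noise

envelope = np.abs(hilbert(backscatter))           # amplitude demodulation
b, a = butter(4, 20 / (fs / 2))                   # keep only sub-20 Hz motion
vitals = filtfilt(b, a, envelope - envelope.mean())

print("Recovered waveform peak-to-peak:", vitals.max() - vitals.min())
```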
To compare their NCS approach to current monitoring methods, Hui and Kan first performed synchronized NCS and ECG measurements on an anesthetized rat. The heartbeat intervals extracted from NCS and ECG data matched very closely. The researchers suggest that NCS is sufficiently accurate to replace ECG for behaviour studies based on heart rate variation.
Next, they demonstrated the possibility of using non-invasive NCS on several species of small conscious animals for which monitoring methods such as ECG are difficult, if not impossible, to apply. These included a pet golden hamster, a parakeet, a Russian tortoise and a betta fish.
Hamster under measurement with the sensing tag. (Courtesy: Xiaonan Hui, Cornell University)
The hamster, for example, was monitored in its cage using both wireless and wired NCS applied from outside the cage. The researchers acquired respiratory and heartbeat waveforms without the hamster being aware of the device. The heartbeat waveform features were similar for each beat during the recording, and were similar to those in the anesthetized rat (with 20% longer heartbeat intervals).
The researchers also successfully detected detailed features of heartbeat and respiration in the parakeet and tortoise. A heartbeat was thought to have been recorded in the fish, but further studies are needed to confirm this finding.
“Our NCS system with convenient setup not only provides the previously unachievable sensing capability but also improves the animal testing protocols with no harm to their welfare or interference to their circadian rhythms,” Hui and Kan conclude. “Our demonstrations can be further adapted to other species and laboratory settings to provide more humane study, care and assessment of animals; our method also provides measurement of unbiased vital signs of animals monitored in their natural state.”
The UK and US have announced a $35m upgrade to the Advanced Laser Interferometer Gravitational-wave Observatory (aLIGO). The improvement will see the twin observatories – located near Hanford, Washington and Livingston, Louisiana in the US – double their sensitivity to gravitational waves. Work on the upgrade will start in 2023 and be complete two years later.
Each LIGO facility works by sending twin laser beams down two 4 km-long tubes – arranged as an L-shape – that are kept under a near-perfect vacuum. The beams are reflected back down the tubes by mirrors precisely positioned at the ends of each arm. As a gravitational wave passes through the observatory, it causes extremely tiny distortions in the distance travelled by each laser beam.
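To get a sense of the precision involved (a textbook order-of-magnitude estimate, not a figure specific to this upgrade), a gravitational-wave strain h stretches an arm of length L by h × L:

```python
# Order-of-magnitude estimate of the arm-length change LIGO must resolve.
h = 1e-21          # typical strain amplitude of a detectable gravitational wave
L = 4_000.0        # LIGO arm length in metres

delta_L = h * L
proton_radius = 0.84e-15   # metres, for comparison
print(f"Arm-length change: {delta_L:.1e} m "
      f"(roughly 1/{proton_radius / delta_L:.0f} of a proton radius)")
```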
LIGO first turned on in 2002 and was upgraded between 2010 and 2015 to improve the facilities’ ability to spot gravitational waves by a factor of 10. Thanks to this $221m upgrade – known as Advanced LIGO, or aLIGO – researchers can detect gravitational waves that originate anywhere within a sphere of about 420 million light-years in radius, centred on the Earth.
That improved sensitivity paid off in September 2015 – while aLIGO was still being calibrated – when researchers directly detected gravitational waves for the first time, a breakthrough announced in February 2016. The waves were produced by the collision of two black holes of 36 and 29 solar masses, respectively, which merged to form a spinning, 62-solar-mass black hole some 1.3 billion light-years (410 Mpc) away, in an event dubbed GW150914. The finding ended the decades-long hunt for these ripples in space–time and marked the beginning of a new era of gravitational-wave astronomy that has since seen around 10 gravitational-wave events detected, including the merger of two neutron stars.
Reducing noise
While aLIGO is set to begin another observing run in the next couple of months, plans are now afoot to boost its sensitivity even further. The US National Science Foundation (NSF) announced today that it will provide $20.4m for a further upgrade to the facility, dubbed aLIGO+. UK Research and Innovation, meanwhile, will provide a further $14.1m, with additional support coming from the Australian Research Council.
The upgrades will include applying new coatings to the mirrors to reduce thermal noise, as well as improvements to the laser system. aLIGO+ is expected to probe the origins and evolution of stellar-mass black holes, allow precision tests of extreme gravity and enable detailed study of the equation of state of neutron stars.
“This award ensures that LIGO will continue to lead in gravitational wave science for the next decade,” says NSF Director France Córdova. “These detections may reveal secrets from inside supernovae and teach us about extreme physics from the first seconds after the universe’s birth.”
According to David Reitze, executive director of the LIGO Laboratory, the upgrades will see the observatory being able to detect binary black hole collisions on “an almost daily basis”. The improvement will also be made “as standard” to the planned LIGO facility in India, which, if built, is expected to come online in 2025.
A new method to read out the spin states of individual negatively charged nitrogen-vacancy (NV–) centres has been developed by researchers in Europe and Japan. The technique could make today’s bulky read-out systems obsolete and enable new uses of NV– centres in electronic devices. It could also be used to read out NV– centres that are very close together, which could be useful for developing quantum-information technologies.
A nitrogen vacancy occurs when two adjacent carbon atoms in a diamond lattice are replaced by a nitrogen atom and an empty lattice site. Together, the nitrogen atom and the vacancy can behave as a negatively charged entity with an intrinsic spin. NV– centres are well isolated from their surroundings, which means that their quantum behaviour is not immediately washed out by thermal fluctuations. As a result, they can be used to create a range of quantum technologies that operate at room temperature.
A green photon hitting an NV– centre can promote an electron to an excited state. As the electron decays back to the ground state, it may emit a red photon. The NV– centre has three spin sublevels, whose excited states have different probabilities of emitting a photon when they decay. By exciting an individual NV– centre repeatedly and collecting the red photons emitted, researchers can therefore detect its spin state – which is extremely useful for quantum computation. Moreover, because the spin state can be influenced by external variables such as magnetic field, electric field, temperature, force and pressure, NV– centres can also be used as atomic-scale sensors.
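A minimal sketch of the idea, with purely illustrative photon numbers: because one spin state fluoresces more brightly than the others, summing the photon counts over many excite-and-collect cycles is enough to tell them apart.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy illustration of optical spin readout by photon counting (numbers are
# illustrative, not measured values): the brighter spin state yields more
# photons per excite-and-collect cycle, so accumulating counts over many
# repetitions reveals which state the centre is in.
mean_bright = 0.03      # average photons collected per cycle, brighter spin state
mean_dim = 0.02         # average photons per cycle, dimmer spin states (~30% contrast)
n_repeats = 100_000

counts_bright = rng.poisson(mean_bright, n_repeats).sum()
counts_dim = rng.poisson(mean_dim, n_repeats).sum()
threshold = 0.5 * (mean_bright + mean_dim) * n_repeats

for label, counts in [("bright-state run", counts_bright), ("dim-state run", counts_dim)]:
    verdict = "bright spin state" if counts > threshold else "dim spin state"
    print(f"{label}: {counts} photons -> {verdict}")
```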
Bulky detection
Although NV– centres are tiny, the equipment required to collect the red photons is bulky and complicated. This has prevented the integration of NV– centres into chip-sized devices. It also poses a problem for using NV– centres in room-temperature quantum computing. Entangling two NV– centres requires them to be about 30 nm apart, which is much smaller than the diffraction limit for red light. As a result, detecting the spin states separately requires difficult and expensive microscopy techniques.
Also, the finite lifetime of the excited state slows down experiments: “To get information about the NV centre’s spin state, you have to repeat the measurement many times,” explains Milos Nesladek of Hasselt University in Belgium. “You can put in only a certain amount of laser power before the optical signature saturates.”
Nesladek and colleagues have created an alternative method of detecting an NV– centre’s spin that also uses green laser light. However, the same physics that causes the excited states of NV– centres with different spins to fluoresce differently on decay also causes them to have different probabilities of absorbing a second photon from the same laser. Absorbing this second photon ejects the extra electron from the NV– centre into the conduction band of the diamond. If a voltage is applied, the electron can move freely through the diamond and be detected. Measuring the photocurrent produced when light hits a specific NV– centre therefore allows researchers to infer its spin state. This process was first unveiled by Nesladek and colleagues in 2015 for NV– centre ensembles; the new work extends it to the detection of single NV– centre spins.
Stronger signal
The researchers demonstrated a higher signal-to-noise ratio than possible with optical detection under the same conditions. Furthermore, they found that, as the photocurrent was produced when electrons were promoted from the excited state rather than when they decayed from it, it continued to increase when they turned up the laser power. Most importantly, says team member Petr Siyushev of the University of Ulm in Germany, “you don’t need to implement complicated optical detection: you can just integrate everything into a tiny diamond chip which will be compatible with all current electronic technology”.
Ronald Walsworth of Harvard University in the US describes the work as “a very important technical step.” He cautions that the optics required to target specific NV– centres with green light are still quite complex, and says electrical readout presents its own difficulties: “You need to fabricate electrodes in specific places on the diamond,” he says. “Once you do that you can’t easily image many NVs over a wide field of view.” Nevertheless, he believes the technique has real promise for applications such as the use of NV centres in cryogenics. “Getting good optical detection of red photons coming all the way out of a cryostat is a real challenge,” he says. “With electrical detection that would become very straightforward.”
A new mechanically strong, double-pane ceramic aerogel made from hexagonal boron nitride that is resistant to high temperatures could be used in aerospace and industrial applications. The material, which boasts both a negative Poisson’s ratio and a negative thermal expansion coefficient, is very different to typical ceramic aerogels that are brittle and structurally degrade under thermal shocks.
Aerogels are exceptionally lightweight, highly porous materials containing more than 99% air. They can withstand high temperatures and are resistant to many chemicals. Most aerogels studied so far, however, are made from ceramic materials, such as silica, alumina and silicon carbide, and are thus very brittle.
Researchers recently made aerogels from graphene (a sheet of carbon just one atom thick). Here, the nanosheets of carbon stack up against each other, which makes the material incredibly strong. The nanosheets also divide the aerogel into nanosized cells through which air cannot pass. This means that the material has a thermal conductivity that is lower than that of air.
Sacrificial template
A team led by Xiangfeng Duan of the University of California, Los Angeles, has made a structurally similar aerogel from another 2D material, hexagonal boron nitride (hBN), by using a porous graphene aerogel as a sacrificial template. The researchers grew their aerogel using modified hydrothermal reduction and non-contact freeze-drying techniques: they used borazine as the hBN precursor and grew hBN layers on top of the graphene structure by chemical vapour deposition.
Since hBN resists oxidation better than graphene and also has a higher thermal stability, the researchers could easily remove the graphene with a thermal etching process, leaving behind the pure hBN aerogel.
The resulting material has a density as low as 0.1 mg/cm3 thanks to its highly porous structure with atomically thin cell walls made of highly crystalline hBN. It is superelastic (it can be compressed to 5% of its original length without breaking and fully recovers) and has an ultralow thermal conductivity of around 2.4 mW/m∙K in vacuum and 20 mW/m∙K in air. It can also withstand sharp temperature shocks: it can be heated to 900 °C and then rapidly cooled to −198 °C, at a rate of 275 °C per second, over several hundred cycles while hardly losing any of its strength.
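As a rough feel for what such a low conductivity means in practice (a simple one-dimensional steady-state estimate, not a result from the paper), the conductive heat flux through a slab is k × ΔT / d:

```python
# Rough steady-state estimate of conductive heat flux through an insulating slab,
# q = k * dT / d, using the reported conductivity of the hBN aerogel in air.
# The temperature difference and thickness below are hypothetical.
k_aerogel = 0.020      # thermal conductivity, W/(m K)  (20 mW/m K in air)
k_air = 0.026          # still air at room temperature, W/(m K), for comparison
dT = 500.0             # temperature difference across the slab (K)
d = 0.01               # slab thickness (m)

for name, k in [("hBN aerogel", k_aerogel), ("still air", k_air)]:
    print(f"{name:12s}: {k * dT / d:6.0f} W/m^2")
```

Even with a 500 K difference across a centimetre-thick slab, the aerogel lets through less heat per unit area than a layer of still air of the same thickness would.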
Careful microstructure engineering
The material’s superior properties come from the fact that it has both a negative Poisson’s ratio (it contracts inwards when compressed) and a negative thermal expansion coefficient (it contracts when heated). Duan and colleagues were able to confer both of these properties – the opposite of what is found in conventional materials – on their aerogel by carefully engineering its microstructure through hierarchical structuring, producing a material with hyperbolic frameworks. These have saddle shapes with negative curvature.
“The sub-cell wall features in our aerogel also have a double-pane architecture that reduces wall thickness without compromising the mechanical strength of the material and facilitates out-of-plane vibration modes for the negative thermal expansion coefficient effect,” Duan tells Physics World. “Such walls also retard heat transfer by gas molecules to ensure very low thermal conductivity.”
According to the researchers, the new ceramic aerogels could be used as thermal insulation in applications that require extreme temperatures, such as spacecraft and automobile components. Since the material also has a high surface area of more than 1080 m2/g, a value that is higher than those reported for other ultra-light materials (around 800 m2/g for silica or carbon aerogels), it might also be used in applications that call for high surface-to-volume ratios, like gas catalysis and thermal energy storage.
The team, reporting its work in Science, says that it is now busy making ceramic aerogels with even better flexibility and robustness, higher working temperatures and lower thermal conductivity.
The Blue Planet is to get a little bluer as the world warms and climates change. Where the seas turn green, expect an even deeper verdant tint, new research suggests.
But when US and British scientists tested a model of ocean physics, biogeochemistry and ecosystems – intended to simulate changes in the populations of marine phytoplankton, or algae – they also incorporated some of the ocean’s optical properties. Because phytoplankton photosynthesize, they absorb sunlight and change the reflectivity of the water.
And, as mariners have known for centuries, the blue ocean is blue because levels of marine life in the warmer mid-ocean waters are very low.
The researchers tweaked their simulation to see what the world would look like in 2100 if humanity carried on burning fossil fuels under the notorious business-as-usual scenario, taking global average temperatures to 3 °C above historic levels.
And they found that higher temperatures would alter the global palette. More than half of the world’s oceans would intensify in colour. The subtropics would become even more blue, and the oceans that sweep around the poles would become an even deeper green, they report in the journal Nature Communications.
“The models suggest the changes won’t appear huge to the naked eye, and the ocean will still look like it has blue regions in the subtropics and greener regions near the equator and the poles,” says Stephanie Dutkiewicz of the Massachusetts Institute of Technology, who led the research.
Wider effects
“That basic pattern will still be there. But it will be enough different that it will affect the rest of the food web that phytoplankton supports.”
The clearer the water, the bluer the reflection of the sunlight. From space, the world looks blue. Waters rich in phytoplankton are by definition rich too in chlorophyll that absorbs blue wavelengths and reflects a green tint. But changes in chlorophyll colouring, observed over the decades from satellite monitoring, can be affected by natural climate cycles and shifts in nutrient supply.
The researchers were looking for a more complete model of the wavelengths of visible light that are absorbed, scattered or reflected by living things. They devised one and tested it against the satellite evidence gathered so far. When they found agreement with the past, they had also found yet another way to read the future.
Explaining ecosystem change
They tuned their simulated planet to the 3 °C warming that seems inevitable unless humans rapidly shift from fossil fuels to renewable energy sources, and found that wavelengths in the blue-green part of the spectrum shifted the fastest. The shifts in colour could tell a story of altered ecosystems.
“The nice thing about this model is that we can use it as a laboratory, a place where we can experiment, to see how our planet is going to change,” Dutkiewicz says.
“There will be a noticeable difference in the colour of 50% of the ocean by the end of the 21st century. It could be potentially quite serious.
“Different types of phytoplankton absorb light differently, and if climate change shifts one community of phytoplankton to another, they will also change the types of food webs they can support.”