
Rainer Weiss, Barry Barish and Kip Thorne win 2017 Nobel Prize for Physics

The 2017 Nobel Prize for Physics has been awarded to Rainer Weiss, Barry Barish and Kip Thorne for decisive contributions to the LIGO detector and the observation of gravitational waves. The winners will receive their medals at a ceremony in Stockholm on 10 December.

The prize is worth SEK 9m (£823,000). One half of the prize goes to Weiss. Barish and Thorne have equal shares of the second half of the prize.

In February 2016 Weiss, Barish, Thorne and colleagues working on the Laser Interferometer Gravitational-wave Observatory (LIGO) in the US announced that they had made the first direct detection of gravitational waves. The observation was made in September 2015 and the gravitational waves had been created by the merger of two distant black holes.

A similar event was swiftly detected in December of that year and announced in June 2016. A third event was seen in January 2017 and just last week LIGO physicists announced that they had detected their fourth black-hole merger, this time in conjunction with the Virgo gravitational-wave detector in Italy.

Fully clothed

“I even have clothing on,” quipped US-based Weiss on an early-morning telephone call to Sweden shortly after the winners were announced. He said that the prize recognizes the work of thousands of people who worked on LIGO. “At first, we failed but slowly we got things to work,” he said.

When asked about the first detection by LIGO he said “Many of us didn’t believe it. It took us two months to convince ourselves that we had detected a gravitational wave.”


Weiss said he is looking forward to the detection of gravitational waves from supernovae and merging neutron stars by LIGO and Virgo. He also said that the next generation of gravitational-wave detectors could provide important clues about the early universe.

LIGO comprises two 4 km-long interferometers – one in Hanford, Washington and the other in Livingston, Louisiana (see above video). Each instrument can measure extremely small changes in distance along the interferometer arms that occur when a gravitational wave passes through Earth.
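
To get a feel for the scale involved, a one-line calculation suffices. The strain figure below is a typical peak value for a black-hole merger, not a number from this article:

```python
# A passing gravitational wave stretches and squeezes space by a
# fractional amount (the strain h = delta_L / L). A peak strain of
# ~1e-21 is typical of the black-hole mergers LIGO has observed.
arm_length = 4_000.0      # LIGO arm length, metres
strain = 1e-21            # dimensionless strain (illustrative value)

delta_L = strain * arm_length
print(f"Arm length change: {delta_L:.0e} m")
# ~4e-18 m, a few thousandths of a proton's diameter
```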

Weiss played a crucial role in designing and building LIGO. He was born in Berlin in 1932 and fled Germany with his family in 1939, settling in New York City. He studied physics at the Massachusetts Institute of Technology (MIT), where he gained a PhD in 1962. After a brief stint at Princeton, Weiss returned to MIT where he started an experimental group that focused on gravitation and cosmology.

Barish became the principal investigator of LIGO in 1994 and was named director of the observatory in 1997. He was born in 1936 in Omaha, Nebraska. He did a BA in physics and PhD in experimental high-energy physics at the University of California, Berkeley. He joined Caltech in 1963. He is Ronald and Maxine Linde Professor of Physics, Emeritus at Caltech.

Thorne was a co-founder of LIGO and did important calculations that showed how gravitational waves would be picked up by the detectors. Born in 1940 in Logan, Utah, Thorne studied physics at Caltech and then moved to Princeton University to complete a PhD in 1965. He then returned to Caltech, where he is now Feynman Professor of Theoretical Physics, Emeritus. Thorne’s research has focused on theoretical astrophysics and relativity.

Diagram of the LIGO detector

LIGO’s (and lately Virgo’s) success has opened a new window into the universe by ushering in the era of gravitational-wave astronomy. There is also the potential for multimessenger astronomy, whereby telescopes are pointed at the source of a gravitational wave to detect electromagnetic radiation emitted by the source. While no such radiation has been seen from the four black-hole mergers, events involving neutron stars and other astronomical objects are expected to be picked up by telescopes.

Accelerating masses

First predicted by Albert Einstein’s general theory of relativity, gravitational waves are ripples in the fabric of space–time that are generated by accelerating masses. In the century since the theory was published, Thorne and other physicists have predicted that binary-star and binary-black-hole systems would be prolific sources of gravitational waves.

In the above video, researchers from the LIGO collaboration, based at Cardiff University in the UK, talk about the history of gravitational waves, how LIGO’s interferometers work, the new era of gravitational-wave astronomy and what it was like finding out about their pivotal first observations.

You can read about Physics World’s visit to the LIGO detector in Livingston, Louisiana in “The great detector”.

  • Additional reporting for this article was provided by Tushna Commissariat, reviews and careers editor of Physics World.

DSIF protein caught in action

In animals, plants, fungi or any other eukaryotes (organisms whose cells have nuclei), transcription of DNA into RNA cannot function without a protein called DSIF, but little is known about the exact process enabling it. Carrie Bernecky, Jürgen Plitzko and Patrick Cramer from the Max Planck Institute for Biophysical Chemistry and Max Planck Institute of Biochemistry managed to image DSIF in action while the protein was interacting with RNA Polymerase II (Pol II). These images reveal the structure and function of DSIF, a modular protein that performs a whole range of tasks including holding and positioning DNA and RNA, regulating timing of transcription and subsequent RNA processing for translation.

In the study, the authors show that DSIF consists of two loops, which wrap around the DNA and RNA strands respectively, and a stabilizing domain (Nature Structural & Molecular Biology doi: 10.1038/nsmb.3465). The protein keeps all molecules involved in transcribing genes into RNA in place: it positions incoming DNA and outgoing RNA, and recruits factors for RNA processing. By physically contacting all key players, including the polymerase itself, it exerts its influence. The positioning of nucleic acids is done by forming ring-like clamps around the DNA and RNA.

Beyond the resolution of light

To capture images of this process, Bernecky and her colleagues turned to cryo-electron microscopy and X-ray crystallography. Instead of light, these techniques use electrons and X-rays, respectively, whose much smaller wavelengths overcome the resolution limit of light microscopy.
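
The wavelength gap is enormous, as a short calculation shows. The 300 kV accelerating voltage below is an assumed, typical value for a cryo-electron microscope, not a figure from this study:

```python
import math

# De Broglie wavelength of electrons in an electron microscope,
# with the relativistic correction (significant at 300 kV).
h = 6.626e-34        # Planck constant, J s
m_e = 9.109e-31      # electron rest mass, kg
e = 1.602e-19        # elementary charge, C
c = 2.998e8          # speed of light, m/s
V = 300e3            # accelerating voltage, volts (assumed typical value)

lam = h / math.sqrt(2 * m_e * e * V * (1 + e * V / (2 * m_e * c**2)))
print(f"Electron wavelength at 300 kV: {lam * 1e12:.2f} pm")   # ~2 pm

# Green light, by comparison, has a wavelength of ~500 nm --
# hundreds of thousands of times longer.
```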


The experiment highlighted DSIF’s interaction with the outgoing RNA, which keeps this RNA in the exit channel of Pol II. If this were not the case, the produced RNA would compete with the non-template DNA strand for the template DNA strand and destabilize the interaction between RNA, DNA and polymerase. As a result, transcription would stop.

DSIF pauses and un-pauses Pol II

Another noteworthy feature is the involvement of DSIF in pausing Pol II and releasing it from pause positions. The pausing regulates the rate of transcription and ensures that genes are expressed synchronously, for example to form new tissues during the development of embryos. This can be achieved by un-pausing Pol II simultaneously in many neighbouring cells. Whether DSIF initiates pausing or un-pausing depends on its phosphorylation state.

The DNA clamp, as observed by Bernecky and her co-workers, is important for maintaining stability of the elongation complex. Indeed, without it, Pol II dissociates from the template and transcription pauses. The RNA clamp of DSIF contributes more actively to the regulation of pausing, mediated by phosphorylation. This makes this RNA clamp a major developmental regulator.

The physics of bread, the quest for metallic hydrogen and adventures in LIGO land


By Matin Durrani

If you’re a student wondering whether to go into research or bag a job in industry, don’t miss our latest Graduate Careers special, which you can read in the October issue of Physics World.

Philip Judge from the US National Center for Atmospheric Research and his colleagues Isabel Lipartito and Roberto Casini first describe how budding researchers should pick a PhD project. The choice is vital, as that first project can determine the trajectory of your future career.

But if your eyes are set on a job outside academia, careers guru Crystal Bailey from the American Physical Society runs through your options and calls on academics to learn more about what’s on offer so they can advise their students better.

If you’d rather just stick your head in the sand about your career options, however, then why not enjoy the cover feature of the October issue, in which former Microsoft chief tech officer and Intellectual Ventures boss Nathan Myhrvold discusses his massive new five-volume treatise Modernist Bread.

Mixing history and science – as well as the results of more than 1600 of his own experiments – the book is sure to be the last word on this foodstuff that humans have been baking for millennia.

Don’t miss either Jon Cartwright’s feature on the quest for metallic hydrogen.

Remember that if you’re a member of the Institute of Physics, you can read the whole of Physics World magazine every month via our digital apps for iOS, Android and Web browsers.

For the record, here’s a run-down of what else is in the issue.

• Budget crunch hits Brazilian physics – Brazil’s political crisis is taking its toll on science, with thousands taking to the streets in protest, as Henrique Kugler reports

• A quantum mission – Toshio Hirano, president of Japan’s new National Institutes for Quantum and Radiological Science and Technology, tells Fred Myers how the organization will use its expertise in physics to tackle cancer

• Mobility matters – After recently being denied entry into the UK to attend a science festival, Jessamyn Fairfield warns against limits to the free movement of people

• The scientific sublime – Robert P Crease identifies a physics experiment that philosophers would dub “sublime”

• Show us your metal – One of the rarest metals in the universe, metallic hydrogen could solve many energy problems – but has it finally been isolated in the lab? Jon Cartwright tries to sort out claim from counter-claim

• The physics of bread – Nathan Myhrvold – the polymath physicist whose passions range from cosmology to cooking – is this month publishing a massive, five-volume book about the science of bread and bread-making. Robert P Crease catches up with this intellectual livewire at his Cooking Lab headquarters in Seattle

• The great detector – Tushna Commissariat travels to the forests of Louisiana to visit the LIGO Livingston gravitational-wave observatory, where one of the most sensational discoveries of recent times was made

• Tale of two physicists – Philip Ball reviews The Quantum Labyrinth: How Richard Feynman and John Wheeler Revolutionized Time and Reality by Paul Halpern

• Science at the fringe – Andrew Glester visits this year’s Edinburgh Festival Fringe and seeks out science-themed shows

• Colouring outside the lines – Tushna Commissariat reviews Visions of Numberland: a Colouring Journey Through the Mysteries of Maths by Alex Bellos and Edmund Harriss and Phases of Matter by Colm P Kelleher, Rodrigo E Guerra, Andrew D Hollingsworth and Paul M Chaikin

• Your pathway to industry – Physics graduates must sharpen and tailor their skills for careers in industry, says Crystal Bailey, who also calls on faculty members to provide more support in this area

• Starting out strong – Philip G Judge, Isabel Lipartito and Roberto Casini share their thoughts for the budding research scientist on how to choose meaningful research

• The measurement problem – David Faux has timing trouble at his local swimming pool

Go-ahead for protest-hit telescope in Hawaii

Hawaii’s Board of Land and Natural Resources (BLNR) has granted a construction permit for the protest-hit $1.4bn Thirty Meter Telescope (TMT). The board, which voted 5-2 in favour, attached 43 conditions to the construction, including that employees receive at least one day’s training in cultural and natural resources and that TMT officials implement an invasive-species control programme.

When built, the TMT will be one of the world’s largest ground-based telescopes with a 30 m primary mirror that is made up of 492 hexagonal segments. The structure that will house the telescope will be 66 m wide and 56 m tall. Mauna Kea was chosen as the observatory’s site in July 2009 and over the following six years, the TMT organization received a series of necessary approvals and permits. However, native Hawaiians, who regard the Mauna Kea summit as sacred – and who had previously objected to the growth in the number of telescopes there – carried out a protest at the telescope’s ground-breaking in October 2014.

Further demonstrations

Six months later, following further demonstrations, construction was postponed. Then in December 2015, the Hawaii Supreme Court invalidated the TMT’s building permit, ruling that the BLNR had not followed due process in approving it. Meanwhile, TMT chose La Palma in the Canary Islands as a back-up site earlier this year.

The court then remanded the case to the board, which appointed retired judge Riki May Amano to re-hear it. In July, Amano recommended that the BLNR reissue the permit, so long as a number of conditions are met, including that the building work abides by government rules.

With the approval by the BLNR, building work on the TMT could now begin in April 2018 with completion by 2022.

Black-hole growth accelerated by supersonic gas

Supersonic streams of gas could have triggered the birth of supermassive black holes in the early universe. That is the implication of hydrodynamic simulations done by Shingo Hirano of the University of Texas at Austin and colleagues. The research could explain how huge black holes were able to form when the universe was less than one billion years old.

Supermassive black holes have masses millions to billions of times that of the Sun and lurk at the centres of nearby large galaxies. While these behemoths have had billions of years to form, astronomers also know that supermassive black holes have powered quasars less than a billion years after the Big Bang. These early sightings are a mystery because astronomers do not have a good explanation of how these huge objects could have formed so quickly in the early universe.

One leading hypothesis is that enormous clouds of gas collapsed under the force of their own gravity, condensing directly into a black hole. However, scientists have struggled to model how enough gas can fall into the cloud to build up its mass before the cloud fragments to form stars. Now, work by Hirano and colleagues suggests that high-velocity streams of gas can quickly build up a gas cloud’s mass, facilitating its collapse.

The streams have their origins in the epoch of recombination, which occurred about 378,000 years after the Big Bang. This was when the cosmic microwave background radiation was emitted and baryonic matter and radiation became decoupled, allowing photons to travel unhindered through the universe. The decoupling set baryonic matter such as gas in motion, but not dark matter (which does not interact with light). Therefore, ordinary matter adopted streaming motions relative to the haloes of dark matter, inside which gas clouds congregated and the first stars and galaxies formed.

Counteracting feedback

In its simulations, Hirano’s team shows that the streaming motions initially prevented gas from settling inside dark-matter haloes. However, the haloes soon grew more massive and, about 100 million years after the Big Bang, the dark-matter halo in the team’s simulation had reached 22 million solar masses – with gravity now strong enough to trap even the fast-moving streaming gas. Thousands of solar masses’ worth of primordial hydrogen could then gather inside the halo and, at its centre, a protostar was born, surrounded by a massive, dense envelope.

Normally, protostars self-regulate their growth by emitting radiation that counteracts the infall of gas, blowing it away. This feedback grows stronger as the protostar’s mass increases, until its luminosity exceeds the Eddington limit and radiation pressure chokes off further accretion. This was another stumbling block on the road to explaining how enough mass could gather to form the first black holes.
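
The Eddington limit mentioned above has a standard textbook form, which sets the scale of the luminosities involved. The calculation below is illustrative and not taken from the paper:

```python
import math

# Eddington luminosity: the point at which radiation pressure on
# ionized hydrogen balances gravity, halting spherical accretion.
# L_Edd = 4 * pi * G * M * m_p * c / sigma_T
G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
m_p = 1.673e-27       # proton mass, kg
c = 2.998e8           # speed of light, m/s
sigma_T = 6.652e-29   # Thomson scattering cross-section, m^2
M_sun = 1.989e30      # solar mass, kg

def eddington_luminosity(mass_kg):
    """Maximum luminosity before radiation pressure blows away infalling gas."""
    return 4 * math.pi * G * mass_kg * m_p * c / sigma_T

# Roughly 1.3e31 W (about 33,000 solar luminosities) per solar mass
print(f"L_Edd for 1 solar mass:   {eddington_luminosity(M_sun):.2e} W")
print(f"L_Edd for 50 solar masses: {eddington_luminosity(50 * M_sun):.2e} W")
```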

However, Hirano explains, “our protostar is surrounded by a dense envelope of gas and rapidly grows via efficient gas infall, and this rapid infall can change the stellar structure and deactivate the self-regulation mechanism.”

Above an accretion rate of 0.04 solar masses per year, the stellar envelope begins to inflate, allowing the inner region of the cloud to cool to less than 6000 K. Below this temperature the star’s production of ultraviolet light falls off and the self-regulation mechanism is too weak to halt the accretion of gas. Within two millennia the core of the central protostar grows to 50 solar masses, while its extended envelope of infalling gas swells to an enormous 34,000 solar masses. It is at this point that gravity overwhelms all other processes and the entire cloud – protostar and all – collapses into a black hole.

Middle-mass black holes

The collapse of the cloud in Hirano’s simulation creates an intermediate-mass black hole, which can then reach supermassive status through a variety of processes including mergers with other black holes and the accretion of more gas. Meanwhile, the dark-matter halo around the black hole continues to build up, providing the scaffolding for what will become a galaxy.

Although the model works in theory, it is not the only plausible explanation for how intermediate-mass black holes could form from the direct collapse of gas clouds. For example, in work published in 2016, a team led by John Regan of Durham University ran its own simulations indicating that heating from background radiation and nearby starbursts could suppress star formation in primordial gas clouds, giving the clouds enough time to grow in mass before collapsing.

The birth of these black holes could potentially be seen by the next generation of gravitational-wave detectors. “For example, a gravitational-wave signal caused by the coalescence of intermediate-mass black holes could be detected by eLISA,” says Hirano, referring to the European Space Agency’s planned space-based gravitational-wave detector, set to launch in 2034. The characteristics of the gravitational-wave signals could therefore potentially validate models of how the first massive black holes were formed.

The research is described in Science.

Bringing neutrons into the steel industry

The world produces 1.6 billion tonnes of steel every year. According to analysts at AME Research, this figure could grow to as much as 3 billion tonnes by 2050. For the steel industry, however, improvements in quality matter as much as increases in quantity. The importance of continuous improvement can be seen by comparing the Eiffel Tower with modern steel structures such as the Tokyo Skytree. The former is 324 m tall – a record at the time of its completion in 1889 – and made of wrought iron, a material that begins to deform irreversibly when subjected to stresses of 100–200 MPa (the yield stress). In contrast, the weldable high-strength steel used for the 634 m Tokyo Skytree has a yield stress of up to 630 MPa. Without innovation in steel, such a huge construction would not have been possible.

The steel industry has long been alert to emerging technologies and keen to benefit from them. The Japanese steel firm Nippon Kokan (a predecessor of JFE, our employer) began using state-of-the-art microbeam analytical instruments such as the transmission-electron microscope (TEM) and electron probe micro-analyser (EPMA) as early as the 1960s. TEMs were used to observe microstructures such as dislocations and small precipitates in steel, while the EPMA was mainly used in the steel-making division for measuring elemental segregation and non-metallic inclusions such as oxides and sulphides. More recently, analyses based on synchrotron radiation have also become widely used for R&D in Japan’s steel industry.

History lesson

Compared with X-rays and electrons, neutrons are a less common tool. The relative rarity of neutron beams was, until fairly recently, a high hurdle to their use in the steel industry. Nevertheless, beginning in 2006, JFE began to promote the use of neutron scattering to analyse the microstructures of large-volume steel samples. The history of our approach is documented more fully in a recent report (2017 JFE Technical Report 22 1–5), but the following summary covers a few of the most important points.

One promising technology for developing high-tensile-strength steel involves dispersing tiny particles (mostly carbides such as titanium carbide and niobium carbide, but sometimes metallic copper) within the steel matrix. These nanometre-sized precipitates hinder the movement of dislocations in the matrix, making the steel strong without sacrificing its formability (at least, not too much). Electron microscopy is a powerful method for directly determining the size and distribution of these precipitates, but the area that can be observed with this technique is limited and the statistical procedures for extrapolating to larger (and thus more representative) areas are tedious. It is also not easy to evaluate nanometre-sized precipitates with chemical extraction methods, since such small particles are often chemically unstable and difficult to collect with filtration.


In comparison, we have found that non-destructive neutron measurements possess considerable advantages, combining reliable data from bulk steel with ease of use. In one series of experiments, we used small-angle neutron scattering (SANS) to determine the size of nanometre-sized titanium carbide (TiC) particles within 10 × 10 × 2 mm (thick) sheets of hot-rolled steel. The sample volume for this measurement was about 10⁸ times larger than that used for ordinary TEM thin-foil observations. In the “Particles” figure, panel (a) shows how the SANS profile changes as a function of heat treatment in this steel. This profile enabled us to determine the average size (panel (b)) and the number density of the TiC particles. Such knowledge is important because optimizing precipitates is a time-consuming procedure, with numerous combinations of micro-alloying elements and heat treatments that can be tried.
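
One standard route from a SANS profile to an average particle size is Guinier analysis, in which the low-q scattering follows I(q) = I₀ exp(−q²Rg²/3). The sketch below uses invented numbers and is illustrative only; it is not JFE’s actual fitting procedure, which may use full form-factor modelling:

```python
import numpy as np

# Guinier analysis: extract a radius of gyration Rg from a (synthetic)
# SANS profile. All numbers here are invented for illustration.
Rg_true = 2.0                       # assumed Rg of TiC precipitates, nm
q = np.linspace(0.05, 0.5, 50)      # scattering vector, 1/nm (Guinier region)
I = 100.0 * np.exp(-(q * Rg_true)**2 / 3.0)   # Guinier law

# A linear fit of ln(I) against q^2 has slope -Rg^2 / 3
slope, intercept = np.polyfit(q**2, np.log(I), 1)
Rg_fit = np.sqrt(-3.0 * slope)
R_sphere = np.sqrt(5.0 / 3.0) * Rg_fit   # Rg -> radius, for a uniform sphere
print(f"Fitted Rg = {Rg_fit:.2f} nm, equivalent sphere radius = {R_sphere:.2f} nm")
```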

A promising approach

The promising nature of our approach led the Iron and Steel Institute of Japan (ISIJ) to award its 2011 Tawara best paper award to JFE Steel and our collaborators at the Japanese National Institute for Materials Science (NIMS), the Japan Atomic Energy Agency and Ibaraki University. The easy and precise evaluation of precipitates in bulk steel, without the need for tedious specimen preparation, will certainly promote R&D of new types of steel.

In addition to studying precipitates with SANS, we are also using neutron diffraction to study steel texture, and to analyse residual stress in welds. The “Welds” figure shows an example of a contour plot of (transverse) lattice strain in the weld joint of a 980 MPa-grade steel. At the welding site, there is an increased risk of cracks due to residual stress from the welding and from absorption of hydrogen. Neutron diffraction revealed that the root portion of the weld experiences the highest stress – over 1000 MPa – in the transverse direction. Tensile stress at the root is also present along the weld line and in the plate-thickness direction. From this analysis, we were able to infer that high levels of tensile residual stress, and the resulting hydrogen accumulation, are the key factors in cold cracking. Hence, this neutron-based analysis gave us important insight into ways of optimizing the weld metal and the welding conditions. As demand for even higher-strength steels continues to increase, the development of steels that are less susceptible to hydrogen embrittlement is vital.
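
The step from a measured lattice spacing to a stress value is conceptually simple. The sketch below uses invented d-spacings and a uniaxial simplification; a real weld analysis combines strains measured in three directions through the full Hooke’s-law tensor:

```python
# Neutron diffraction gives the spacing d of a set of atomic planes via
# Bragg's law; comparing it with the stress-free spacing d0 yields the
# elastic lattice strain, which Hooke's law converts to a stress.
# All numbers below are invented for illustration.
d0 = 1.17020        # stress-free plane spacing, angstroms (assumed)
d = 1.17605         # measured spacing in the weld root, angstroms (assumed)
E_steel = 200e9     # Young's modulus of steel, Pa

strain = (d - d0) / d0
stress_MPa = E_steel * strain / 1e6    # uniaxial simplification
print(f"lattice strain = {strain:.2e}, stress = {stress_MPa:.0f} MPa")
```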


Another application of neutron diffraction concerns “retained” austenite in high-tensile strength steel. Austenite is a high-temperature phase of steel with a face-centred cubic structure, and normally it transforms into ferrite (which has a body-centred cubic structure) when cooled. However, with careful control of the heat treatment, areas of steel with high carbon and/or manganese content can be “retained” in the austenite phase even at low temperatures, including room temperature. This is useful because when steel containing retained austenite is deformed, the austenite is transformed into another phase, martensite, creating a strong, ductile material known as TRIP (transformation-induced plasticity) steel that is widely used in the automotive industry. Neutron diffraction is useful for measuring the fraction of retained austenite in steel by volume – something that electrons and X-rays cannot do, because they cannot penetrate very far below the material’s surface. In the future, we hope to fully exploit the potential of neutrons to improve dynamic measurements of phase transformations, and also to analyse hydrogen in steel and perform 3D imaging of embedded defects or inhomogeneities.
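The standard way to turn diffraction data into a volume fraction is to compare the integrated intensities of austenite and ferrite reflections, each normalised by a theoretical scattering factor R. The numbers below are invented for illustration:

```python
# Retained-austenite fraction from diffraction peak intensities.
# Each phase's integrated peak intensity I is divided by its theoretical
# R factor; the austenite volume fraction is the normalised ratio.
# All values here are invented for illustration.
I_gamma, R_gamma = 120.0, 35.0   # austenite (fcc) peak: intensity, R factor
I_alpha, R_alpha = 900.0, 50.0   # ferrite (bcc) peak: intensity, R factor

ratio_gamma = I_gamma / R_gamma
ratio_alpha = I_alpha / R_alpha
V_gamma = ratio_gamma / (ratio_gamma + ratio_alpha)
print(f"Retained austenite: {100 * V_gamma:.1f} vol%")
```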

Strong collaborations

As the application of neutrons to steel research is still relatively new, it is worth taking a closer look at how this research started and how it has been promoted in the steel community in Japan. The research described in the previous section would not have been possible without the efforts of Yo Tomota, a materials scientist then at Ibaraki University (now at NIMS), who pioneered the use of neutrons in steel research, and who organized a research group with JFE Steel under the auspices of the ISIJ. In 2006 the ISIJ chose the development of neutron-based techniques for steel research as its first industry-driven collaborative project. Scientists from other major steel companies and from academia – including both materials and characterization experts – joined in. Over a three-year period, members of the group tested various neutron techniques, including reflectometry as well as SANS and neutron diffraction.


During the second stage of this group’s work, something very unfortunate happened: the Tōhoku earthquake and tsunami devastated Japan, and the Japan Proton Accelerator Research Complex (J-PARC) and many nuclear reactors had to cease operations. This prompted us to think hard about possible alternative neutron sources. For the third stage of the ISIJ neutron research activity, we strengthened our collaboration between ISIJ members and RIKEN, Hokkaido University and Kyoto University (all of which possess compact or medium-sized neutron sources). In recent years, members of the group have been evaluating the feasibility of using compact neutron sources for steel-related research topics.

In July 2017 at the International Conference on Neutron Scattering in Daejeon, Republic of Korea, one of us (KS) gave a presentation outlining JFE’s use of neutrons. Afterwards, an audience member observed that it is not easy to stimulate interest in neutrons from industry scientists, and asked why JFE was an early adopter in this area. The immediate answer was that historically, Japanese companies have strong R&D divisions, and the research scientists who work there (in analysis, for example) are used to collaborating closely with academic scientists as well as with process development and steel product development engineers. This means that scientists working in analysis and characterization can understand the languages of both neutron scientists and materials scientists in the company.

Having scientists who are “multilingual” in both industry and academia, and who can therefore bridge or mediate between each side, is, we believe, the key to a successful synergy. It is worth noting that the steel industry is not the only sector of Japanese manufacturing to place high expectations on neutron technology. RIKEN has provided an excellent summary of the ways neutrons are being used in Japanese industry, as well as the fundamentals of neutron science.

Even so, we believe that further steps are still required for neutrons to become a common measurement tool within industry. Measurements using neutrons are generally easy; however, users often have to wait several months or even longer for beam time after their proposals are approved, whereas electron microscopes and X-ray diffractometers can be used daily. This lack of accessibility is a huge bottleneck for expanding neutron applications. Recognizing this problem, in 2015 the Japan Science and Technology Agency began an A-STEP (Adaptable and Seamless Technology Transfer Program through Target-Driven R&D) project aimed at developing key technologies for compact neutron sources and their industrial applications. This A-STEP includes 10 research themes and is being supervised by Hideki Yoshizawa of the University of Tokyo and seven research advisers – including five from industry.

Achieving goals

We hope that laboratory-scale neutron measurements will become routine in the near future. In order to achieve this goal, further collaboration between industry and academia is a must. Demonstrating useful results is the only way to ensure that industry will show more interest in neutrons and even invest in such research programmes.

Great wrong theories, new spin on golf, physics photo competition

By Hamish Johnston

What is the greatest wrong theory in physics? Physicist Chad Orzel asks that question in his latest blog for Forbes and by wrong he does not mean incorrect, but rather incomplete. He makes the argument for the Standard Model, which he says has been wrong for a very long time – but continues to be phenomenally successful.

Putting a golf ball the last few metres in front of a hushed crowd must be one of the most nerve-wracking aspects of being a professional golfer. Some are better at it than others and Canadian Adam Hadwin was notoriously bad until physics came to the rescue. You can read about how a careful analysis of the crucial first 0.5 m of Hadwin’s putts led to a dramatic improvement. The secret, according to “The hole in Hadwin’s game”, is controlling the bounce and backspin of the ball.

Have you taken any nice photos that capture the essence of physics? If so, you have two more days to enter the Physics Pics competition organized by the Manchester and District branch of the Institute of Physics. Prizes of £50 will be awarded in each of four categories: primary school age, secondary school age, sixth form/college, and adult. Entrants must be UK residents, and a selection of entries will be exhibited at the Manchester Central Library during Manchester Science Festival 2017.

Mechanical fluctuations track how bacteria respond to antibiotics

A new piezoelectric sensor can identify the most appropriate antibiotic for an infection in less than an hour, according to physicists in the US. While conventional antimicrobial tests can take days, the device detects changes in bacterial motion upon initial exposure to antibiotics. Faster antibiotic selection could improve treatment outcomes and help tackle antimicrobial resistance.

Antimicrobial susceptibility testing (AST) is used to identify the most appropriate antibiotic for a bacterial infection. Current tests are constrained by bacterial growth rates, as they examine the effect of antibiotics on the growth of bacteria colonies cultivated from patient samples. However, the two to three days this takes to produce results can cause problems.

David Livermore, medical microbiologist at the University of East Anglia and Public Health England’s lead on antibiotic resistance, told Physics World: “During the [testing period] the patient must be treated empirically, based on likely pathogens and local resistance rates. Many patients are over-treated – given potent broad-spectrum antibiotic when they transpire to have very susceptible pathogens. This is wasteful and may select resistance in the gut flora. A few have highly resistant pathogens, not covered by the empirical treatment, and so are under-treated, increasing their mortality risk.”

Major public health concern

The emergence of antibiotic resistant bacteria, which is often caused by using inappropriate or ineffective antibiotics, is a major public health concern. Resistance is rising in most pathogenic bacteria and infections that were once easily treated – such as pneumonia and gonorrhoea – are increasingly resistant to all antibiotics. In the US, antibiotic-resistant infections are responsible for at least two million illnesses and 23,000 deaths every year, according to the Centers for Disease Control and Prevention.

To accelerate AST, Ward Johnson and colleagues at the National Institute of Standards and Technology (NIST) in Boulder, Colorado are developing a biophysical method that can measure changes in mechanical fluctuations of bacteria.

Their sensor is based on a thin quartz crystal disc, with an electrode on each surface. One of the electrodes is used to deliver an electrical signal that is close to the disc’s resonance frequency, while the other measures the piezoelectric voltage created by the resulting crystal vibrations.

Mechanical fluctuations

The technique involves coating the disc in bacteria. Fluctuations in the mechanical properties of the organisms affect the frequency of the output signal, which in turn can be used to detect changes in the bacteria population. To test the process, the researchers coated sensors with around two million non-resistant Escherichia coli bacteria. These were then treated in three ways: some were exposed to polymyxin B, some to ampicillin (two antibiotics with different modes of action) and others received no antibiotics.

On exposure to polymyxin B, cell-generated frequency noise fell rapidly, reaching almost zero within seven minutes. Frequency noise began decreasing within 15 minutes of ampicillin being introduced, then dropped more rapidly as the bacteria cell walls began to rupture. Cell imaging and post-experiment counting of viable bacteria confirmed the efficacy of the antibiotics.

A non-motile strain of E. coli, with paralysed flagella, was used in the experiments to confirm that the frequency noise was generated by cell-wall vibrations, rather than bacteria movement. According to the researchers, the results provide evidence that bacteria cell death can be sensed through measurements of cell-generated frequency noise.
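The detection idea can be sketched roughly as follows: treat cell-generated frequency noise as the variance of resonance-frequency samples, and flag a kill when that noise collapses towards zero. This is a minimal illustration only; the function names, threshold and sample values are invented for the sketch and are not from the NIST work.

```python
import statistics

def noise_power(freq_samples):
    """Variance of resonance-frequency samples: a crude proxy
    for cell-generated frequency noise on the sensor."""
    return statistics.variance(freq_samples)

def bacteria_killed(baseline, treated, threshold=0.1):
    """Flag antibiotic efficacy if the noise power of the treated
    trace falls below a fraction of the untreated baseline."""
    return noise_power(treated) < threshold * noise_power(baseline)

# Noisy baseline (live, fluctuating cells) vs a near-flat trace
# after an effective antibiotic (illustrative numbers, in Hz):
baseline = [10.0, 10.5, 9.4, 10.8, 9.1]
after_drug = [10.0, 10.01, 9.99, 10.0, 10.0]
print(bacteria_killed(baseline, after_drug))
```

In the real experiment the analysis is of course more involved, but the essential observable is the same: a rapid drop in frequency noise signals that the antibiotic is working.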

Clinically better

Johnson says that the main advantage of this technique over conventional AST is speed. “Although observation of growth may remain the ultimate gold standard for predicting the effectiveness of an antibiotic in halting the progression of an infection, the timescale of such growth often makes conventional AST partly or entirely ineffective in establishing a course of treatment in many clinical situations,” he explains. “In contrast, our approach is focused on immediately sensing biophysical responses of microbes to antibiotics as these changes occur.”

Livermore says that while the results suggest that the technique “works for E. coli with polymyxin and ampicillin,” Johnson and colleagues “need to test it on a much wider range of bacterial species and antibiotics, including those that, unlike polymyxin and ampicillin, don’t rapidly kill bacteria”. He adds that if it was necessary to grow a bacteria culture from a patient sample before testing, then the time needed would be extended to around 24 hours.

Johnson hopes to achieve a level of sensitivity that will allow testing directly from clinical samples without the need for pre-culturing. He adds that “this would require add-on technologies to process the sample, to get it into the right state for passing to the resonators”.

The researchers have been granted a patent for the technique and are currently testing it on other bacteria and antibiotics.

The research is described in Scientific Reports.

Understanding vanadium leaching improves prospects for reuse of steel slag

World steel production resulted in between 160 and 240 million tonnes of slag in 2016. Now, a UK team has discovered a mechanism for vanadium leaching in basic oxygen furnace (BOF) slag that could make it possible to repurpose more of this byproduct.

The BOF method, used in around two-thirds of steel production, involves blowing oxygen over molten pig iron. Adding limestone or dolomite removes impurities; this step forms slag. Some of the slag can be recycled in a blast furnace or used as aggregate for road surfacing. The rest goes to landfill.

According to Andrew Hobson of the University of Leeds, the findings “will assist in the safe, cost-effective use and/or storage of BOF slag in addition to assessing environmental risks posed by legacy slag heaps”.

Although vanadium can help control diabetes in humans, its toxicity tends to increase with oxidation state. Its highest oxidation state, V(V), which is prevalent in steel slags, may be carcinogenic and harm the immune and nervous systems.

Hobson and colleagues at the University of Leeds, Nottingham Trent University and the University of Hull discovered that vanadium leaching is regulated by the solubility limits of calcium vanadate (Ca3(VO4)2), rather than being redox-controlled as initially thought.

The team identified four phases in the slag. Two – dicalcium silicate and dicalcium aluminoferrite – contained vanadium, whilst the free lime (calcium oxide) and refractory oxide did not.

Scanning electron microscope (SEM) analysis revealed that of the vanadium-bearing phases, only dicalcium silicate reacted during weathering, so it must be responsible for releasing vanadium into solution. Microfocus X-ray absorption near-edge spectroscopy (µXANES) spectra showed that all the vanadium in dicalcium silicate is V(V). Some subsequent incorporation of vanadium took place during calcium silicate hydrate precipitation, which readily occurs at high pH following dicalcium silicate dissolution.

Hobson and colleagues noticed an altered region depleted in calcium and silicon at the surface of the slag. When isolated from air, the layer formed was thicker, and the aqueous vanadium concentration lower. The team reckons that such a layer has an “armouring effect on slag” by reducing further dissolution.

What’s more, under aerated conditions calcium reacts with carbon dioxide to form calcium carbonate, which provides a sink for aqueous calcium. The inverse relationship imposed by calcium vanadate (Ca3(VO4)2) solubility limits means that when calcium is removed from solution, higher concentrations of vanadium can leach out.
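The inverse relationship follows from the solubility product of Ca3(VO4)2, Ksp = [Ca2+]^3 [VO4^3-]^2: at equilibrium, the vanadate concentration scales as [Ca2+]^(-3/2), so removing calcium from solution raises the vanadium that can leach. A rough sketch, with a purely hypothetical Ksp value for illustration:

```python
def vanadate_conc(ca_conc, ksp):
    """Equilibrium vanadate concentration (mol/L) set by the
    Ca3(VO4)2 solubility product: Ksp = [Ca]^3 [VO4]^2."""
    return (ksp / ca_conc ** 3) ** 0.5

KSP = 1e-18  # hypothetical value, for illustration only

# Halving the calcium concentration raises the equilibrium
# vanadate concentration by a factor of sqrt(8) ~ 2.8:
print(vanadate_conc(0.5e-3, KSP) / vanadate_conc(1.0e-3, KSP))
```

This is why carbonation under aerated conditions matters: calcium carbonate precipitation pulls calcium out of solution, and the solubility limit then permits more vanadium to dissolve.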

In real-world applications, aerated conditions prevail. Natural rainfall is often used to leach free lime from slag and so maximize potential for reuse. The team suspects that in real slag heaps the high concentrations of vanadium predicted would not be seen because the dicalcium silicate phases containing vanadium become progressively depleted, not to mention that there are kinetic limitations on vanadium release due to the altered region at the surface of the slag.

Beijing and Vienna have a quantum conversation

A quantum cryptography key has been shared between Beijing and Vienna using a satellite – allowing the presidents of the Chinese Academy of Sciences and Austrian Academy of Sciences to communicate via a secure video link. The call was made earlier today using China’s Micius quantum-communications satellite, which was launched in 2016 and orbits about 500 km above Earth.

Quantum key distribution (QKD) uses the laws of quantum mechanics to guarantee complete security when two people exchange information using a secret cryptographic key. The sender and receiver exchange a key made up of a series of quantum states. Quantum mechanics dictates that an eavesdropper is unable to intercept the key without altering those states and thereby revealing the eavesdropper’s actions.

Stream of photons

A team led by Jian-Wei Pan of the University of Science and Technology of China had previously shown that a quantum key can be encoded into the polarization states of a string of photons sent from the Micius satellite to a ground station near Beijing. A key was also stored on board Micius for two hours before being sent to a second ground station 2500 km away from Beijing in north-west China. This allowed QKD to be performed between these two ground stations.

The Chinese physicists also teamed up with Anton Zeilinger of the University of Vienna and colleagues to download a quantum key from Micius to a ground station near Vienna.

Now, the Chinese–Austrian collaboration has used Micius to share a QKD key between Beijing and Vienna – which are separated by about 7400 km.

Three keys

The process began with Micius transmitting a quantum key to Vienna so that Micius and Vienna share a key. This is then repeated between Micius and Beijing, creating a second shared quantum key. The two keys are then combined on board Micius and the result is distributed to both ground stations. Using their separate keys and the combined key, both ground stations can then generate a common code.

This morning, such a code was used as the encryption key for a secure video link conversation involving Zeilinger – who is president of the Austrian Academy of Sciences – and his Chinese counterpart Chunli Bai. Zeilinger described the demonstration as “a very important step towards a worldwide and secure quantum internet”.

Copyright © 2026 by IOP Publishing Ltd and individual contributors