
Evidence for string theory could be lurking in gravitational waves

Signatures of the extra dimensions required by string theory could be seen by future gravitational-wave detectors. That is the conclusion of David Andriot and Gustavo Lucena Gómez at the Max Planck Institute for Gravitational Physics in Germany, who have identified two ways in which gravitational waves could be affected by the consequences of string theory – a theoretical framework that invokes speculative concepts such as extra dimensions to try to fill important gaps in our understanding of physics, including the nature of quantum gravity.

Gravitational waves are ripples in space–time that are created when massive objects are accelerated under certain conditions. Andriot and Lucena Gómez calculate that adding N extra dimensions to 4D space–time results in a “breathing-mode” oscillation that would be present in a gravitational wave. The second distinct feature of extra dimensions, say the researchers, is a discrete set of higher-frequency signals accompanying a gravitational wave.

Third detector

The first gravitational-wave detection was made in 2015, when the LIGO observatory spotted a signal from a coalescing binary black hole. Andriot and Lucena Gómez say that LIGO’s current configuration of two detectors will not be able to detect the breathing mode. However, it is possible that a breathing mode could be detected once a third detector in Italy (called Virgo) reaches its full sensitivity in 2018.

As for the higher-frequency signals, Andriot and Lucena Gómez point out in the Journal of Cosmology and Astroparticle Physics that the trend in future detectors is towards lower frequencies, and therefore a special observatory would be needed to see that effect.

Ghana converts obsolete telecoms dish into radio telescope

Scientists in Ghana have successfully converted a communications antenna into a Very Long Baseline Interferometry (VLBI) radio telescope. The country is the first partner of the African VLBI Network (AVN) to complete a full refurbishment. The 32 m-diameter dish is located in the Ghana Intelsat Satellite Earth Station at Kutunse and has passed a series of detection tests.

VLBI networks – arrays of radio telescopes – are used to image the radio universe in unprecedented detail. They rely on the principle that each telescope records the signal from the same source at a slightly different time. When the recordings are combined, they yield an extremely high-resolution image of the radio source.
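The resolution gain follows from the diffraction limit θ ≈ λ/B, where the baseline B is the separation between telescopes rather than the diameter of a single dish. A minimal sketch of the arithmetic, with illustrative numbers (the observing frequency and baseline below are assumptions, not figures from the AVN):

```python
import math

def angular_resolution_rad(wavelength_m: float, baseline_m: float) -> float:
    """Diffraction-limited angular resolution of an interferometer: theta ~ lambda / B."""
    return wavelength_m / baseline_m

# Hypothetical example: a 5 GHz observation over a 7000 km intercontinental baseline
wavelength = 3e8 / 5e9          # ~6 cm
baseline = 7.0e6                # 7000 km in metres
theta = angular_resolution_rad(wavelength, baseline)

# Convert radians to milliarcseconds
milliarcsec = math.degrees(theta) * 3600 * 1000
```

With these assumed numbers the resolution comes out at roughly 2 milliarcseconds, far finer than any single dish of practical size could achieve.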

Continent-wide

Square Kilometre Array (SKA) Africa and its partners aim to convert redundant telecommunication dishes found across Africa into VLBI telescopes. These antennas have been made obsolete because of the introduction of optical-fibre networks. Alongside radio telescopes in South Africa, each converted telescope will be integrated into the AVN, forming a continent-wide array. The AVN will then be connected to similar groups across the world, thereby creating a global VLBI network.

Bubble cavitation spotted in liquid crystals

The formation and subsequent collapse of bubbles has been seen for the first time in a flowing liquid crystal. This process is called cavitation and occurs when the pressure drop in a flowing fluid is large enough to allow some of the fluid to vaporize and create a bubble. Cavitation is of great interest in hydrodynamics because the collapsing bubbles can dissipate large amounts of energy in small regions and cause significant damage to machinery such as propellers.

The discovery was made by Tillmann Stieger and colleagues at the Max Planck Institute for Dynamics and Self-Organization in Göttingen, the Technical University of Berlin and ETH Zürich. Liquid crystals are fluids made of rod-like molecules that tend to align under certain conditions. In its experiments, the team pumped liquid-crystal fluids through tiny channels just 0.1 mm wide. The channels contained obstructions, which increase the speed of the flow and encourage cavitation (see image).

Lower energy

The researchers found that cavitation occurs very easily in liquid crystals. This, they say, is because the molecules tend to align themselves in the same direction as they flow along the channel – and this lowers the energy required to create a bubble. Indeed, the team found that it could control the rate of cavitation by controlling the degree to which the molecules lined up.

The discovery could be useful for developing microfluidic systems in which cavitation is used to control the rate at which different fluids mix while flowing through tiny cavities. The research is described in Nature Communications.

Longest synchrotron experiment hits 1000 days at Diamond

The longest-running synchrotron experiment has reached 1000 days. The experiment has been taking place at the Long Duration Experimental (LDE) facility at the UK synchrotron Diamond Light Source. Led by Claire Corkhill from the University of Sheffield, the experiment investigates the hydration of cement used to store nuclear waste.

The only resource of its kind in the world, the LDE facility allows scientists to test complex materials under a range of conditions over long periods of time, and then regularly characterize the material with synchrotron X-ray powder diffraction. The cement experiment was the first to be set up at LDE and is still running 1000 days on, making it the longest-running experiment to take place at a synchrotron.

More and more cement

Cement (also known as cement paste) is a mixture of silicates and oxides that, after reacting with water, forms a hard grey solid. Special types of cement are used to encapsulate some radioactive waste before storage, such as that arising from reprocessing spent fuel rods or from decommissioning nuclear facilities. It is therefore very important to understand how different types of cement behave over long periods of time. “The cement is being used to safely lock away the radioactive elements in nuclear waste for timescales of more than 10,000 years,” explains Corkhill, “so it is extremely important that we can accurately predict the properties of these materials in the future.”

This also applies to designing future waste storage. The nuclear industry’s current plan is to create a giant hole 500 m to 1 km deep, lined with different layers of cement and containing a series of isolated vaults for intermediate- to high-activity-level waste. Once each vault is packed full of encapsulated waste, it will be closed and backfilled with even more cement. High-level waste poses more of a challenge because of its extreme radiation fields, but it too will involve more cement.

Common but tricky

Although cement is a common material, Corkhill explains that it is tricky to study. The combination of water and powder forms amorphous materials that are difficult to analyse. Because cement only stops reacting once the water added to it has run out, it can continue to transform over long timescales – indeed, some of the eight nuclear-waste cement samples running at the LDE are still changing. This is obviously a problem for underground storage, where groundwater is likely to be present at some point over the several-thousand-year storage period.

At the moment, the team’s samples are in a “straightforward” environment – no additional water is present and they are running at room temperature and pressure. But the data collected once a week by the beamline scientists have allowed the researchers to build a geochemical model that can predict the cement phases in 1000 years. The more promising, stable cement will then be investigated further during another long-term experiment incorporating groundwater. Corkhill also hopes to encapsulate enriched uranium within cement and record the damage caused by the radioactivity from the inside out.

Successful collaboration

LDE senior support scientist Sarah Day and Corkhill both attribute some of the experiment’s success to the collaboration between academic users and beamline scientists at Diamond. “Being able to run experiments over 1,000 days is a great testament to how well we’ve engaged with our user community, enabling them to use synchrotron light to probe the frontiers of science,” explains Day.

Diamond scientists plan to build five more beamlines by 2020 and have recently demonstrated a proof-of-concept upgrade to the machine, which if delivered would provide light that is 10 times brighter to each beamline. But in the meantime there are no plans to end the cement experiment and it will likely continue breaking records until the materials are no longer changing or another experiment needs the space.

Is the proton lighter than we thought?

The most precise measurement ever of the mass of the proton suggests that the particle is a tiny bit lighter than the current accepted value. Although the difference is less than one part in 10 billion, it has a statistical significance of 3σ.

The measurement was made by Sven Sturm of the Max Planck Institute for Nuclear Physics and colleagues, who compared the cyclotron frequency of the proton to that of an ionized carbon-12 atom. The particles were held in a Penning trap using magnetic and electric fields. The fields cause a charged particle to follow a looping orbit at the cyclotron frequency, which depends on the particle’s charge-to-mass ratio and the magnetic field strength. Comparing the frequencies of the proton and carbon-12 therefore gives the mass of the proton in atomic mass units, which are defined by the mass of the carbon-12 atom.

Thrice more precise

Writing in a preprint on arXiv, the team gives its measurement of the proton mass as 1.007 276 466 583(15)(29) atomic mass units, where the first brackets contain the statistical uncertainty and the second brackets the systematic uncertainty. The result is three times more precise than the current accepted value published by CODATA.

An extremely precise measurement of proton mass could help solve several important mysteries in physics, including the “proton radius puzzle” of why the radius of the particle appears to be smaller than expected. Knowing the mass of the proton (and the antiproton) to high precision could also lead to an understanding of why there is much more matter than antimatter in the universe.

Cheap electrocatalysts convert CO2 to CO using solar cells

Researchers in Switzerland and Spain have developed a solar-driven electrolyser that converts CO2 to CO using only Earth-abundant elements in place of precious metals. Reducing emissions of CO2 by converting it electrochemically to CO is an attractive prospect due to the potential use of CO as a precursor to fuels and high-value chemicals. The work leads the way to further exploration of Earth-abundant metals that might perform comparably to the rare elements used until now.

The conventional method of converting CO2 to CO uses precious-metal catalysts (gold, silver, palladium) at considerable overpotentials and also requires electrolyte additives. The researchers have now demonstrated an inexpensive and stable alternative: a bifunctional system of SnO2-modified CuO electrodes separated by a bipolar membrane and driven by a solar cell. The system is a milestone for the catalysis community because it avoids the need for noble metals.

Preparing and testing the electrodes

The team, led by Jingshan Luo and Michael Grätzel of École Polytechnique Fédérale de Lausanne (EPFL), anodized copper films to produce CuO nanowires (figure 1), and then applied an SnO2 coating by atomic-layer deposition (ALD). The synthesized electrodes’ structure and shape were characterized by X-ray diffraction and energy-dispersive spectroscopy (figure 2).

To evaluate the prepared electrodes, the researchers compared the CO selectivity of their SnO2-coated electrodes to the bare CuO equivalents. CO selectivity is a measure of the moles of CO2 converted to CO in relation to the total moles of all products. The unmodified electrodes achieved a 36% CO selectivity, while the new, surface-modified, electrodes reached 97%.
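Read literally, that selectivity figure is the fraction of product moles that are CO. A toy calculation (the full product splits below are invented for illustration; only the CO fractions match the values quoted in the article):

```python
def co_selectivity(product_moles: dict) -> float:
    """Moles of CO divided by total moles of all products."""
    return product_moles["CO"] / sum(product_moles.values())

# Hypothetical product distributions, normalized to 1 mol of total product
bare_cuo = {"CO": 0.36, "H2": 0.60, "other": 0.04}
sno2_coated = {"CO": 0.97, "H2": 0.02, "other": 0.01}
```

With these made-up splits, `co_selectivity` returns 0.36 for the bare electrode and 0.97 for the coated one, matching the reported figures.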

The spent electrodes exhibit copper in a metallic state coexisting with metastable tin and copper oxides. The researchers attribute the enhanced CO2-reduction activity to the presence of these metastable oxides, although further work is needed to confirm this fully under electrochemical conditions.

Understanding how they work

The group measured the evolution rates of H2 and CO at their respective current densities. At higher overpotentials, the unmodified samples primarily yielded hydrogen and carbon-containing products other than CO, but these were suppressed by the addition of SnO2. The boost in CO selectivity for the modified electrodes is therefore due largely to the suppression of hydrogen production.

Gas adsorption experiments showed that the binding strength of both CO and adsorbed hydrogen was substantially decreased with the SnO2-coated electrodes. The lower hydrogen availability restrains the reduction of CO to alcohols and hydrocarbons while CO’s weaker binding strength on copper is comparable to that exhibited for noble metals like gold and silver. The results from this experiment and the electrochemical tests suggest that SnO2 and CuO are working cooperatively.

Constructing the bifunctional solar-driven device for CO2 reduction

A complete electrochemical system needs a cathode for CO2 reduction, an anode for water oxidation, and a membrane for product separation. Grätzel and collaborators used the same cathode composition of SnO2-coated CuO to build the anode. Gas chromatography measurements confirmed oxygen production with no sign of anode corrosion, proving that SnO2-coated CuO electrodes can be used for both CO2 reduction and electrochemical oxidation of water.

The remaining challenge the researchers addressed was that each reaction requires an environment with a different pH. To solve this problem, the scientists utilized a bipolar membrane consisting of an anion exchanger on the anode side and a cation exchanger on the cathode side, allowing the use of a different electrolyte solution in each compartment. Finally, the solar-driven device was completed using a single three-junction GaInP/GaInAs/Ge photovoltaic (PV) cell, seen in figure 3. This was tested with simulated sunlight at room temperature, and achieved a solar-to-CO free-energy conversion efficiency of up to 13.4%.

There is no doubt that Grätzel and colleagues have constructed a novel, bifunctional and cost-efficient device for catalytic conversion for energy applications. This work will lead to new ways to alter the surfaces of plentiful and inexpensive elements in order to reach catalytic activities that rival or even surpass those of the noble metals.

Full details of the work can be found in Nature Energy.

Could volcanic eruptions be predicted using satellite observations?

A new way of monitoring volcanic activity by integrating ongoing satellite measurements into dynamic models has been demonstrated by researchers from France. Based on data assimilation, the method might one day allow for real-time eruption forecasts in volcanic regions.

As magma moves beneath the Earth’s surface – such as under a volcano – the ground above flexes. These ground movements can be measured using both GPS and satellite-based radar data, and used to develop models of the depth and shape of the underlying magma reservoirs.

A limitation of many of these models, however, is that they are kinematic in nature – focusing on motion alone. This means they cannot yield information on the pressures in the underlying magma system, which is important for determining when a magma chamber will rupture and how well the volcano can feed any resulting eruption. The surface deformation caused by a small pressure change in a large magma chamber, for example, may look identical to that caused by a large pressure change in a small magma chamber – even though the latter case is more likely to lead to an eruption.

Dynamic models

To distinguish between such scenarios, volcanologists must use dynamic models that can consider how the surface displacements change with time. A small chamber, for example, would pressurize much faster than a large chamber. Most dynamic models tend to be based on data inversion and require extensive calculation and the incorporation of all observations beforehand. This makes them unsuitable for real-time eruption forecasting because they are unable to incorporate ongoing measurements.

In their new study, geophysicist Mary Grace Bato and colleagues at the Institut des Sciences de la Terre in Grenoble have addressed this issue by turning to data assimilation. This is a time-stepping approach that combines models, observations and error statistics to forecast the state of a dynamic system. Data assimilation has long been used to produce weather forecasts and predict the effects of greenhouse-gas emissions.
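At its core, a data-assimilation step blends a model forecast with a new observation, weighted by their respective error variances. A minimal scalar sketch of such an update (a simple Kalman-style analysis step; the pressure values and variances below are invented, and the researchers' actual scheme is more sophisticated):

```python
def assimilate(forecast: float, p_forecast: float,
               observation: float, r_obs: float) -> tuple:
    """One scalar analysis step: blend a model forecast and an observation,
    weighting each by its error variance. Returns (analysis, analysis variance)."""
    gain = p_forecast / (p_forecast + r_obs)       # trust the observation more
    analysis = forecast + gain * (observation - forecast)
    p_analysis = (1.0 - gain) * p_forecast         # uncertainty shrinks
    return analysis, p_analysis

# Toy numbers: the model forecasts a chamber overpressure of 10 MPa
# (variance 4), while a deformation-derived observation gives 12 MPa
# (variance 1)
state, var = assimilate(10.0, 4.0, 12.0, 1.0)
```

Because the observation here is assumed less uncertain than the forecast, the analysis lands closer to 12 MPa than to 10 MPa, and the updated variance is smaller than either input. Repeating this step as each new satellite measurement arrives is what makes the approach suitable for real-time forecasting.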

We foresee a future where daily or even hourly volcanic forecasts will be possible – just like any other weather bulletin
Mary Grace Bato, Institut des Sciences de la Terre

The researchers simulated the ground deformation caused by a simple, two-chambered volcanic system, and then tested the capacity of a data-assimilation approach to interpret the results. They found that the approach was able to predict the evolution of the magma pressure. It could also constrain both the shape of the deepest reservoir and the rate of the basal magma flow into this chamber – providing results comparable with existing inversion methods.

“Data assimilation offers great potential for assessing volcanic unrest,” comments Bato, concluding: “We foresee a future where daily or even hourly volcanic forecasts will be possible – just like any other weather bulletin.”

While bulletins may be some way off – and require improvements in existing volcanic models for widespread application – Bato is optimistic about the potential of data-assimilation methods in volcanology. To that end, the researchers are now looking to apply their new approach to two real-life case studies: the Grímsvötn volcano in Iceland and Alaska’s Mount Okmok.

New forecast models

“Applying cutting-edge statistical approaches to active volcanoes is the first step towards developing a new generation of volcano forecast models,” says Patricia Gregg, a geophysicist from the University of Illinois at Urbana-Champaign. Advancements such as these, she adds, “are critical to take full advantage of the excellent data made available through new satellite missions as well as data collected by volcano observatories around the world”.

“For people working in volcano observatories, it is crucial to anticipate eruptions early enough to warn local authorities. However, providing an unequivocal identification of volcano reawakening remains a challenging problem,” comments Aline Peltier, a volcanologist who works at the Piton de la Fournaise volcanic observatory on Réunion Island in the Indian Ocean, where warnings of an eruption are only possible a few hours before its onset. The potential of such complementary methods to estimate magma-chamber pressurization weeks before the event, she notes, would be “of prime importance for eruptive risk mitigation”.

The research is described in Frontiers in Earth Science.

Infused Antarctic ice could boost neutrino detection


The sensitivity of the IceCube Neutrino Observatory at the South Pole could be boosted by adding optical materials to the icy boreholes that contain its detectors – according to physicists in the US.

Encompassing 1 km3 of ice, IceCube comprises 86 cables, each up to 2.5 km long, suspended inside vertical boreholes in the ice. Attached to each cable are dozens of photomultiplier tubes (see figure), which record the Cherenkov radiation given off by the secondary particles created when incoming neutrinos collide with nuclei inside the ice.

In 2013, IceCube made the first-ever detection of cosmic neutrinos from throughout the universe, and physicists are now thinking about how the detector could be upgraded. In a preprint on arXiv, Imre Bartos, Zsuzsa Marka and Szabolcs Marka of Columbia University in New York describe how filling sections of the boreholes with materials with desirable optical properties could boost IceCube’s detection efficiency.

Index increase

They first looked at the effect of surrounding each photomultiplier-tube detector with a material with a higher index of refraction than ice – the idea being that the change in refractive index will focus light towards the photomultiplier tubes. They found that for every 0.1% increase in the material’s index of refraction, a 10% increase in light flux to the photomultipliers could be achieved.

The trio also looked at the effect of filling parts of the boreholes with a material that would shift the wavelength of the Cherenkov light from ultraviolet to visible wavelengths – the latter being easier to detect using photomultiplier tubes. Their calculations suggest that filling most of a borehole with a wavelength shifter, but not the region immediately surrounding each detector, could lead to a very large increase in sensitivity.

The team does, however, point out several challenges that would have to be overcome to implement its scheme. These include dealing with the natural radioactivity of the optical materials and the effect of freezing on their optical properties.

The underlying physics of natural hazards

Natural hazards threaten lives and livelihoods across the globe and can result in huge financial costs. Despite significant progress in understanding hazards, we still feel powerless and inadequate in the aftermath of destructive events, which can strike with little warning and often affect vulnerable communities. One of the core missions of the US Geological Survey (USGS) is to conduct research into a range of natural hazards so that the public and policymakers can be better prepared for these events.

At the Menlo Park Science Center in California, USGS scientists are engaged in a range of basic and applied research. In our first video (above), geophysicist Eric Geist explains his studies of the mechanics of tsunamis – giant waves that can wreak havoc on coastal communities and infrastructure. Geist is particularly interested in tsunamis triggered by earthquakes occurring at subduction zones on the seafloor, regions where oceanic plates slide beneath continental plates.

One of the other major hazards affecting the west coast of the US is the threat of large earthquakes. Often they occur along the San Andreas fault, which forms the tectonic boundary between the Pacific Plate and the North American Plate. In our second video (below), you can take a look inside a couple of different earthquake labs at Menlo Park. Brian Kilgore spends his days triggering mini earthquakes on a bench-top and examining their characteristics. Meanwhile, David Lockner works in the rock deformation and friction laboratory where he recreates the conditions in the Earth under which earthquakes occur. This often involves subjecting rock samples to extreme pressures and temperatures and examining how they respond.

To find out more about the science of natural hazards, check out the July issue of Physics World, a special issue that includes feature articles on wildfires, tornadoes and “slow” earthquakes. Physics World’s editor Matin Durrani introduces that special issue on our blog.

Our hazardous planet: when the world is out to get you

By Matin Durrani

For people afflicted by last month’s devastating fire at Grenfell Tower in London or for those caught up in recent terrorist atrocities, it can seem that many problems in this world are entirely of our own making.

Yes, the modern world has benefited from our collective wisdom and creativity – especially through science and engineering – but it often feels as if irrational human behaviour lies at the root of many of our troubles.

Nevertheless, we should remember that our planet itself holds many natural hazards too, as the latest special issue of Physics World reminds us.

Remember that if you’re a member of the Institute of Physics, you can read Physics World magazine every month via our digital apps for iOS, Android and Web browsers.


Copyright © 2026 by IOP Publishing Ltd and individual contributors