US motorists prepared to pay more for fuel to lower emissions

Motorists in the US appear surprisingly flexible when it comes to choosing fuels that reduce carbon dioxide emissions, researchers have found.

After conducting surveys online and face-to-face at fuel stations, John Helveston of George Washington University, US, and colleagues determined that consumers of gasoline (petrol) and E85 – an ethanol-dominated blend – were most willing to consider alternative fuels, whereas diesel users were more strongly wedded to their current choice. Nearly all respondents, however, were prepared to pay more at the pump if it would reduce emissions.

The US transport sector is one of the country’s largest sources of carbon dioxide. In the long term, adoption of electric or hydrogen-powered vehicles might decarbonize the sector, but only at the cost of significant investment in new infrastructure. As transport emissions are dominated by “light-duty” vehicles – privately owned cars and small trucks – more modest improvements might be achieved in the meantime by influencing consumer habits. The question is, how amenable are motorists to changing their fuel choices?

“Most research in this area focuses on the vehicle,” says Helveston. “Because most cars only take one type of fuel (gasoline, diesel, electricity, etc.) it’s difficult to study how consumers feel about the fuel without also having to consider other issues with the vehicle.”

To separate fuel-related factors from other influences, Helveston and colleagues created a survey in which different combustion fuels – gasoline, diesel, E85 and compressed natural gas (CNG) – were compared to each other in a format that mimicked the information on a fuel pump. Presenting three random choices each time, the researchers included information about the fuels’ carbon dioxide emissions, origin, and the cost of a tank that would last 300 miles. Participants were asked to use this information to choose between the fuels under the assumption that their vehicle could run on any of them.
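
For readers curious about the mechanics, this pump-style format is a classic discrete-choice design. Below is a minimal sketch of how randomized three-alternative choice sets of this kind might be generated; the attribute names and value ranges are illustrative assumptions, not the levels used in the actual survey.

```python
import random

# The four combustion fuels compared in the survey.
FUELS = ["gasoline", "diesel", "E85", "CNG"]

def choice_set():
    """Build one pump-style question: three random fuels, each with
    randomized attributes. Ranges below are purely illustrative."""
    return [{
        "fuel": fuel,
        "cost_300mi_usd": round(random.uniform(20, 45), 2),      # cost of a 300-mile tank
        "co2_vs_gasoline_pct": random.choice([-30, -15, 0, 15]),  # emissions difference
        "origin": random.choice(["domestic", "imported"]),
    } for fuel in random.sample(FUELS, 3)]

if __name__ == "__main__":
    for option in choice_set():
        print(option)
```

In studies of this kind, willingness to pay is then typically estimated by fitting a choice model such as a multinomial logit to the responses and taking the ratio of the emissions and cost coefficients.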

“To me, the most surprising finding was the relatively high value we found that consumers stated they would be willing to pay to reduce carbon dioxide emissions from their fuel,” says Helveston. “This was a surprisingly robust finding.”

Based on fuel prices in 2015 (just before the survey was conducted), this premium works out to an average of $4.63 per 300-mile tank, or about $150 per tonne of carbon saved. For comparison, that is at least three times the estimated “social cost of carbon” commonly used to quantify the economic damage caused by emissions.
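
As a rough back-of-envelope check on that conversion (not the study’s actual choice-model calculation), the implied carbon price is simply the premium per tank divided by the carbon content of the fuel burned over 300 miles. The fuel economy and emission factor below are illustrative assumptions:

```python
# Convert a per-tank premium into an implied price per tonne of carbon.
# Inputs are illustrative assumptions, not the study's fitted coefficients.
MILES_PER_TANK = 300           # tank size used in the survey
MPG = 30.0                     # assumed fuel economy, miles per gallon
KG_CO2_PER_GALLON = 8.9        # typical gasoline emission factor
CARBON_FRACTION = 12.0 / 44.0  # mass of C per mass of CO2

premium_per_tank = 4.63        # average stated premium in USD (from the survey)

gallons = MILES_PER_TANK / MPG
tonnes_c = gallons * KG_CO2_PER_GALLON * CARBON_FRACTION / 1000.0
print(f"Carbon per tank: {tonnes_c:.3f} t")
print(f"Implied price: ${premium_per_tank / tonnes_c:.0f} per tonne of carbon")
```

With these assumptions the figure comes out around $190 per tonne, the same order of magnitude as the paper’s $150; the exact value depends on the fuel economy assumed and on how much of a tank’s carbon the premium is taken to offset.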

The results were also surprising because this premium was accepted by almost everybody: 96% of respondents were prepared to suffer a financial burden to reduce emissions. One might wonder, then, how the survey population’s near-universal willingness to pay can be reconciled with the prevalence of climate scepticism in the US.

“I think the key to this apparent contradiction are the words ‘survey population’,” says Helveston. To minimize the effect of self-selection, the researchers recruited participants without first revealing the subject of the questionnaire. The sample included 331 online respondents and 127 who were approached in person at fuel stations in California, where E85 and CNG are most widely available. But although individuals with a range of political and environmental opinions were among the respondents, the sample was certainly not representative of the population at large.

“Given the large variation in preferences and views of the American populace, I am certain we did not capture the full set of perspectives,” says Helveston. “I think this is a major limitation of the study and an important one to highlight.”

But even if the results are not entirely representative, it seems that at least a portion of the public is willing to shoulder additional costs to protect the environment. The finding suggests that improved labelling on the fuel-station forecourt might prompt some consumers to switch to less carbon-intensive fuels where possible.

Next, Helveston and colleagues will investigate consumer knowledge and preferences regarding electric vehicles (EVs), which promise greater emissions reductions – if they become widely adopted.

“Given how new EVs are to the world, most consumers know very little about them, and that lack of knowledge may be a significant barrier to consumers forming positive preferences for EVs,” says the researcher.

Helveston and colleagues reported their findings in Environmental Research Letters (ERL).

Turbulence transforms polymers into gecko-like dendrites

Turbulence – the culprit behind the heart-stopping moments on a bumpy aeroplane ride – is unpredictable and difficult to control, but it can come in handy sometimes. Now, Orlin Velev and colleagues at North Carolina State University have harnessed the power of turbulence to develop a versatile method for synthesizing dendritic polymer particles. These dendritic polymers have nanoscale wispy tendrils bearing striking structural similarities to the soles of gecko feet. They show promise for a wide range of applications from adhesives to gelation agents for cosmetics and foods.

A dissolved polymer that is exposed to a “bad” solvent in which the polymer is not soluble will precipitate. The shape of the polymer precipitate can be tuned by how rapidly the bad solvent is flowing around the polymer solution droplet.

Turbulence is associated with fast, chaotic flow. When the researchers injected a dissolved polymer into a bad solvent that was flowing at extremely high shear rates, the polymer was dispersed by the new solvent medium in all directions as it precipitated. This gave rise to dendrites that resemble split ends upon split ends that you might experience on an extremely bad hair day.

Surprisingly organized

“Turbulent flow is highly disorganized; nevertheless, when we precipitate polymers under these conditions, the resulting polymer structures exhibit multiscale branching,” says Velev. “This is surprising because turbulent flow is not expected to organize materials, but that is what we see in our structures.”

Velev calls the materials organized because the polymer dendrites are hierarchically structured like Russian dolls. The micron-thick cores of the polymers branch repeatedly, until the very tips of the strands are thousands of times finer.

As a result, the polymers have a huge surface area. This allows the omnipresent attractive van der Waals forces to act between the polymer and any surface it comes into contact with. Such a structure makes the polymer sticky – indeed, it resembles the bottom of gecko feet at the nanoscale.

Sticky situation

Almost any polymer can be processed into sticky dendrites via turbulent shearing regardless of its original structure or chemical makeup, as long as it can be dissolved.

When the researchers dried dendritic cellulose acetate between two glass slides, the lap shear force required to pull the slides apart was larger than that for double-sided adhesive tape. Cellulose acetate is typically not a sticky polymer, but the researchers had transformed it into a glue that outperforms common household adhesives.

The study shows that dendritic polymers have potential for other applications, such as mechanically resilient porous sheets or gelation agents. Velev also wants to untangle the notoriously intractable mathematics of turbulent flow, at least in the context of polymer precipitation.

“Turbulent flow is intriguing, but it is complex,” says Velev. “Copious fundamental work needs to be done to understand the correlation between the flow parameters, the size of the fluid eddies, and the nanofibres in the dendritic colloids. We hypothesize that they are all correlated and aim to develop a model for the correlations.”

The research is described in Nature Materials.

Doing physics in microgravity environments

In this month’s Physics World Stories podcast, Andrew Glester discovers why microgravity environments are such interesting places to do physics experiments. Perhaps the ultimate microgravity laboratory is the International Space Station (ISS), where astronauts carry out experiments designed by scientists across the globe. But microgravity conditions can also be created here on Earth, via parabolic flights and drop towers.

In the episode, Glester travels to Swindon to meet Libby Jackson, the human exploration programme manager at the UK Space Agency. Jackson explains why removing gravity from the equation can allow researchers to probe a range of questions, not necessarily related to space science. She herself has flown on a so-called “vomit comet”, and she describes the experience of adapting to weightlessness while trying to control a science experiment.

Marco Marengo, a thermal engineering researcher at the University of Brighton, UK, is another frequent flyer on parabolic flights. He describes some of the physics experiments he has been involved with and the process through which researchers can apply for time at these facilities. Unsurprisingly, he always finds time to have some fun while weightless in addition to doing the serious science.

Within Europe, researchers requiring a microgravity environment regularly visit the ZARM drop tower, located in Bremen, Germany. Just shy of 150 m tall, this facility comprises an experimental capsule housed inside a long steel tube. In the video below, you can see Paxi – the European Space Agency’s educational mascot – falling down the tower. The ESA website has full details of how to apply to use parabolic flights, drop towers and other related facilities.

While researchers are unlikely to make the trip to the ISS themselves, the options for sending your experiment there are expanding. Jackson explains how it is now possible to buy time on the ISS through the ICE Cubes service, which involves launching your experiment in a container measuring 10 cm on each side. Companies can also pay for time on the ISS, securing the rights to any resulting intellectual property.

Glester will be back with another episode of Physics World Stories next month. In the meantime you can listen to our more regular podcast Physics World Weekly. You can subscribe to both programmes via Apple Podcasts or your chosen podcast provider.

Support for this podcast came from Pfeiffer Vacuum.

Portable MR sensor diagnoses dehydration

Dehydration – defined as a total body water deficit – is associated with detrimental health outcomes across many populations. In patients admitted to hospital, dehydration can impair post-operative recovery and increase the risk of mortality by 80% on average. Elsewhere, soldiers are especially vulnerable to dehydration due to exposure to extreme environments, diminished thirst perception and limited availability of water.

But despite the serious consequences of unmanaged dehydration, there’s no quick and accurate diagnostic method available. Clinical symptoms are non-specific and easily confounded by other comorbidities, while existing diagnostic approaches require invasive bodily fluid sampling. To address this shortfall, researchers at Massachusetts Institute of Technology (MIT) are developing a portable MR sensor for non-invasive dehydration diagnosis (Magn. Reson. Med. 10.1002/mrm.28004).

“MRI provides an unparalleled non-invasive window into the body to study disease, physiology and fluid distribution,” says first author Ashvin Bashyam. “We felt that there was massive untapped clinical potential by bringing these systems out of centralized imaging facilities and towards the patient bedside and beyond.”

Measuring fluid shifts

Dehydration leads to relatively large volume depletion in the intramuscular fluid compartments (extracellular and intracellular), suggesting that measuring these tissue-specific fluid volumes could inform diagnosis. In particular, the extracellular fluid (ECF) compartment is highly responsive to fluid loss. The researchers propose that a portable MR sensor that measures ECF shifts could reliably identify clinically meaningful volume depletion.

To test this idea, Bashyam and colleagues examined fluid shifts in the skeletal muscle of dehydrated mice (exposed to high temperatures under fluid restriction) and control animals. They first used a benchtop NMR system to perform whole-animal multicomponent T2 relaxometry before and after dehydration, and showed that this MR measurement could identify dehydration.

Next, the researchers used a standard MRI scanner to perform multicomponent T2 relaxometry of skeletal muscle in the mice. They analysed the change in muscle ECF signal from before to after dehydration (to compensate for initial hydration variability) and observed significantly greater signal loss in dehydrated mice than in control animals. These findings demonstrate that the MR signal originating from muscle tissue alone is sufficient to identify and estimate the degree of dehydration.

Portable scanner

Using traditional MRI to diagnose dehydration is limited, however, by complexity and long measurement time, as well as the high cost and lack of portability of MR scanners. Thus, the team developed a portable, single-sided MR sensor – roughly 1000 cm3 in size and weighing 4 kg – that could perform comparable measurements.

“Traditional MRI systems involve a closed bore, so the sample must be placed inside the device where the magnetic fields are the strongest and most uniform,” explains Bashyam. “Our device utilizes a single-sided design, which means that the magnetic field is located outside of the sensor. This allows the sensor to be placed against a large sample and still perform a measurement, so the sensor can be miniaturized and highly portable.”

The sensor is based around a permanent magnet array that generates a static 0.28 T magnetic field. Permanent magnets benefit from low power consumption and far lower costs than superconducting coils used in conventional scanners. The resulting portable MR sensor can be used in environments including hospitals, outpatient facilities, sporting events and military operations.
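
For context, the sensor’s operating frequency follows directly from that field strength: protons precess at the Larmor frequency f = (γ/2π)B ≈ 42.58 MHz/T × 0.28 T ≈ 11.9 MHz, so the radio-frequency electronics work at around 12 MHz, far below the 64–128 MHz of clinical 1.5–3 T scanners. (This figure is inferred from the quoted field strength rather than stated by the authors.)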

T2 relaxometry

The team used the portable sensor to perform T2 relaxometry on skeletal muscle in the mouse leg, to characterize its ability to identify systemic fluid depletion in vivo. The relaxometry signal showed a decrease in decay time (from 67.1 to 45.4 ms) after dehydration, with the ECF component most responsive to fluid depletion.
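
Multicomponent T2 relaxometry of this kind is typically analysed by fitting the echo-train decay to a sum of exponentials and reading off the amplitude of the slow, extracellular component. Below is a minimal sketch of such a fit on synthetic data; the relaxation times and fractions are illustrative, not the team’s processing pipeline or fitted values.

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp(t, a_icf, t2_icf, a_ecf, t2_ecf):
    """Two-pool T2 decay: fast intracellular + slow extracellular water."""
    return a_icf * np.exp(-t / t2_icf) + a_ecf * np.exp(-t / t2_ecf)

# Synthetic echo train (times in ms) with illustrative parameters and noise.
t = np.arange(2.0, 400.0, 2.0)
rng = np.random.default_rng(0)
signal = biexp(t, 0.7, 35.0, 0.3, 130.0) + rng.normal(0.0, 0.005, t.size)

# Non-negative bounds keep the two components physically meaningful;
# the initial guess orders them as (fast, slow).
popt, _ = curve_fit(biexp, t, signal,
                    p0=[0.5, 30.0, 0.5, 120.0],
                    bounds=(0.0, [2.0, 100.0, 2.0, 500.0]))
a_icf, t2_icf, a_ecf, t2_ecf = popt

# The ECF fraction is the quantity tracked from before to after dehydration.
print(f"ECF fraction: {a_ecf / (a_icf + a_ecf):.2f}  (T2_ecf = {t2_ecf:.0f} ms)")
```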

Analysing the change in intramuscular ECF signal from before to after dehydration revealed significantly greater change in this signal among dehydrated versus control animals. The estimated change in fluid volume attributed to the ECF strongly correlated with weight loss in dehydrated mice, but not in the controls.

Bashyam says that the portable MR sensor can be adapted for use on human muscle tissue by increasing its size or applying novel pulse sequences to increase measurement depth. He also believes that, in future, it will be possible to measure dehydration changes from a single measurement through improved pulse sequence designs and establishing a range of normal hydration values.

“This work will move forward in two main directions,” Bashyam tells Physics World. “The first will be to develop more advanced sensors that can perform measurements more rapidly and deeper within tissue. The second will more robustly demonstrate the clinical utility of these sensors by testing them on healthy subjects as well as patients at nearby hospitals and medical centres. We are actively engaging with industrial partners to accelerate this work.”

Climate influencers recognized in Madrid

Having contributed to Physics World’s journalism from its Bristol headquarters for the past decade, I’ve recently relocated to sunnier climes – namely Spain’s lively capital, Madrid. By sheer coincidence, my Airbnb host upon arrival turned out to be a fellow science communicator, Manuel Montijano, who invited me along to an award ceremony last night to recognize Spanish climate influencers. The gala was organized by the Spanish wing of the Climate Reality Project, an NGO founded in 2011 from the consolidation of two of Al Gore’s environmental organizations.

Climate change is a global issue but it’s fair to say that the Iberian Peninsula is on the front line of the threats facing Europe. From the wildfires in 2017 to severe droughts in 2018 and flooding in 2019, Spain and Portugal are particularly vulnerable to extreme weather events. On the flip side, this location in south-western Europe is also well-placed to offer solutions to the problem. That was nicely illustrated in March 2018, when Portugal generated 104% of its electricity consumption from renewable sources, mainly hydroelectric and wind (the excess 4% was passed on to Spain).

Last night’s event took place at the Kinépolis de la Ciudad de la Imagen, a cinema complex to the west of Madrid’s urban area. Hosting a social event about the planet’s grave perils is a tricky gig. But the comedian Sara Escudero walked the line expertly between delivering jokes and a serious message about climate. As well as the gags, her dress made from recycled materials – including CDs and feathers – brought a touch of glam to the event.

Speaking of style, before the show began the Spanish press pack was huddled around the model and actor Jon Kortajarena, who was to be crowned the “public personality of the year” for his activities promoting climate issues. “My generation is pleading and demanding real, serious and immediate change from international companies and politicians. Don’t fail us because the world is watching,” said Kortajarena who has 2.3 million followers on Instagram.

Other winners on the night included the national newspaper El País for its climate coverage and the Basque regional government for its climate change strategy. Most recipients were Spanish, but there was also a special “under 18s” award for Greta Thunberg and all the students across the globe involved in the Fridays for Future movement. Although Thunberg could not attend, there was a slightly surreal stunt of a man dressed in Greta’s yellow raincoat taking to the stage with the slogan “todos somos Greta” (“we are all Greta”).

Preceding the award presentations was a screening of the documentary Hacia un Planeta Verde (Towards a Green Planet), a whistle-stop tour of humanity’s relationship with fossil fuels, which you can watch below, in Spanish. Produced by journalist Manuel Campo Vidal with technical support from the Spanish utility company Iberdrola, the science covered familiar ground. For me, the interesting part was the strong call for the world’s wealthiest companies to lead by example in the transition to lower-carbon economies. It also recognized the moral obligation to invest European money in communities built around fossil-fuel extraction and processing to help with the transition.

My one criticism of the film is that it could have taken a more nuanced view of the challenges of bringing about drastic cuts in carbon emissions. For instance, the UK was hailed as a European climate leader for reducing emissions. Yet there was no mention of the strong criticism from the government’s own climate-change advisers, who say the UK is not acting anywhere near the speed needed to achieve its climate targets.

But ultimately, this was an evening to celebrate achievements and inspire attendees about the possibilities of change. In that department the gala was a big success. In true local style, the evening was finished off with some vegetarian tapas and local wine. Look out for more coverage from Spain in the future.

Whiskey webs identify American spirits, undercompressive shocks give wine its tears

Derived from the Gaelic word for water, the spirit is usually spelled whiskey when made in the US and Ireland and whisky everywhere else. And whisky is big business in places like Scotland and Kentucky, where distillers are keen to protect their markets from counterfeits.

Now, physicists at the University of Louisville in Kentucky have found a way to identify genuine American bourbon whiskey from the pattern of residue it leaves after evaporating. Unlike other spirits, which leave spots of residue, the American tipple leaves a distinctive spiderweb pattern.

Describing the effect in Physical Review Fluids, Stuart Williams and colleagues believe that the pattern is linked to chemicals that seep into the whiskey as it is aged in barrels lined with charred wood. This is unlike other types of whisky, which are not aged in charred barrels.

Whisky is sometimes aged in wine casks, and it is often said that a good wine can be identified by an abundance of tears. These are droplets of wine that form on the side of a glass after it is swirled.

In “A theory for undercompressive shocks in tears of wine”, Yonatan Dukler, Hangjie Ji, Claudia Falcon and Andrea Bertozzi of the University of California, Los Angeles take a fresh look at this famous effect. They conclude that wine tears “emerge from a reverse undercompressive shock originating at the meniscus”.

Try saying that after a few glasses of research into the subject.

Open access from a researcher’s perspective: Gábor Csányi

This week it’s the 10th International Open Access Week, which is designed to help researchers learn about open access and share their ideas about how open-access science can best work. Open Access Week was set up by SPARC – a global coalition to help make “openness” the default for research and education.

But what’s open access like from the point of view of a researcher? In this audio interview, I talk to Gábor Csányi, a professor of molecular modelling at the University of Cambridge in the UK, who’s on the board of a new IOP Publishing open-access journal, Machine Learning: Science and Technology.

And if you want to find out more about open-access publishing, check out our interview with Antonia Seymour, publishing director at IOP Publishing, about its efforts to improve open access.

Strontium detection confirms heavy elements form in neutron star mergers

The first spectroscopic evidence that heavy elements are created by the merger of two neutron stars has been found by an international team of scientists. By analysing light captured by the Very Large Telescope in Chile, the team has shown that strontium was produced in a huge “kilonova” explosion that followed such a merger. As well as confirming that neutron-star mergers are a significant source of heavy elements, the study also provides the first spectroscopic evidence that neutron stars comprise neutron-rich matter.

Physicists already know that hydrogen and helium were created shortly after the big bang and that heavier elements up to and including iron are formed by nuclear fusion within stars. But the origins of the 64 naturally occurring elements heavier than iron have been difficult to pin down. One potential source is asymptotic giant branch (AGB) stars, which are cool and luminous objects. The slow neutron capture process (s-process) of nucleosynthesis is believed to occur in these stars, creating about half of the heavy elements in the universe.

The other half is believed to be created by the rapid neutron capture process (r-process) of nucleosynthesis. This requires an enormous flux of neutrons and is thought to be responsible for creating the heaviest naturally-occurring elements such as gold, platinum and uranium.

Prime candidate

Although the r-process was first proposed about 60 years ago, it had not been clear exactly where such neutron fluxes could occur. In August 2017, astronomers witnessed the collision of two neutron stars that resulted in a kilonova explosion and the creation of a black hole. This involved the violent acceleration of vast quantities of neutrons, making it a prime candidate for r-process nucleosynthesis.

Initial analysis of light from this event – dubbed GW170817 – revealed a glow that was attributed to the radioactive decay of heavy elements. This was interpreted as strong evidence that neutron-star mergers are responsible for the other half of the universe’s heavy elements.

Now, an international team led by Darach Watson at the Niels Bohr Institute in Copenhagen is the first to spot the spectroscopic signature of one of those heavy elements in light from the kilonova. That element is strontium, which has a strong spectroscopic feature at an observed wavelength of about 810 nm (on the boundary between visible and infrared light). While most strontium is expected to be created by the s-process, about 30% is attributed to r-process nucleosynthesis.

“Unequivocal evidence”

“Before this we were unable to identify any specific element created in a neutron star merger,” says Watson, describing the observation as “unequivocal evidence” that heavy elements are created in kilonovas.

Seeing relatively lightweight strontium came as a bit of a surprise, because it had been thought that only the heaviest elements would be made in neutron-star collisions. Team member Jonatan Selsing comments, “Now we know that the lighter of the heavy elements are also created in these mergers…it tells us that neutron star collisions produce a broad range of the heavy elements, from the lightest to the very heaviest”.

Watson and colleagues also point out that their detection of strontium is the first spectroscopic evidence that neutron stars are made mostly of neutrons. This comes almost exactly 85 years after the existence of neutron stars was first postulated by Walter Baade and Fritz Zwicky.

The team is now working towards finding spectroscopic evidence for heavier elements in light from GW170817. Potential candidates include barium and the lanthanides, which have strong spectroscopic features at wavelengths shorter than about 650 nm.

The observation is reported in Nature.

Multifunctional nanoparticles show promise for cancer therapy and imaging

A key objective of radiotherapy is to increase the therapeutic ratio – by maximizing the death of cancer cells while minimizing radiation dose to normal tissues. One approach is to use high-atomic-number nanoparticles to enhance the efficacy of the incident radiation. This works because photons interacting with such nanoparticles produce photoelectrons and Auger electrons, thereby amplifying the local radiation dose and upping the therapeutic ratio.

Bismuth-based nanoparticles, which benefit from a high X-ray absorption coefficient, low toxicity and low cost, hold promise as potential radiosensitizers. Now a research team at Tsinghua University has investigated the use of bismuth gadolinium oxide (BiGdO3) nanoparticles as a new theranostic agent for enhancing radiation therapy, as well as acting as an imaging agent (Phys. Med. Biol. 10.1088/1361-6560/ab2154).

"Bismuth and gadolinium with high atomic number have shown the ability to enhance radiation," explains first author Azimeh Rajaee. "Moreover, they both provide CT contrast, while gadolinium can be employed for MRI T1 contrast."

Rajaee and colleagues used a simple, inexpensive sol-gel method to synthesize BiGdO3 nanoparticles with an average diameter of 11.3±1.6 nm. For in vitro and in vivo experiments, they modified the surface with PEG to reduce the toxicity and increase blood circulation time. The researchers first assessed biocompatibility by exposing MCF-7 breast cancer cells to nanoparticle concentrations from 0 to 500 µg/ml. They saw negligible change in cell viability compared with a control group, demonstrating the nanoparticles' low cytotoxicity.

To evaluate the radiosensitizing effect of the BiGdO3 nanoparticles, the team performed a clonogenic assay on two breast cancer cell lines. The survival rate for MCF-7 cells irradiated at 2 Gy was 87%. Survival decreased to 70% and 49% for irradiation plus 50 and 100 μg/ml BiGdO3 nanoparticles, respectively.

For 4T1 breast cancer cells, the survival rate decreased from 70% (radiation only) to 53% (radiation plus 50 µg/ml) and 42% (radiation plus 100 µg/ml). Irradiation at 4, 6 or 8 Gy showed similar patterns. For the 100 μg/ml concentration, this represents a sensitizer enhancement ratio of 1.75 and 1.66, for MCF-7 and 4T1 cells, respectively.

The researchers also used gel dosimetry to measure the dose enhancement factor (DEF). Comparing dosimetry results in the presence and absence of nanoparticles (at 500 µg/ml) revealed a DEF of 2.3.
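
For readers wanting to reproduce this kind of analysis: a sensitizer enhancement ratio is conventionally obtained by fitting the linear-quadratic model S(D) = exp(−(αD + βD²)) to the clonogenic survival data with and without nanoparticles, then taking the ratio of doses that yield the same survival level. In the sketch below only the 2 Gy survival fractions are taken from the paper; the higher-dose values are illustrative placeholders.

```python
import numpy as np
from scipy.optimize import curve_fit, brentq

def lq(D, alpha, beta):
    """Linear-quadratic cell-survival model."""
    return np.exp(-(alpha * D + beta * D**2))

doses = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
# MCF-7 survival: the 2 Gy points (0.87 and 0.49) are from the paper;
# the remaining values are illustrative placeholders.
surv_rt = np.array([1.00, 0.87, 0.62, 0.38, 0.20])  # radiation only
surv_np = np.array([1.00, 0.49, 0.20, 0.07, 0.02])  # + 100 ug/ml BiGdO3

(a0, b0), _ = curve_fit(lq, doses, surv_rt, p0=[0.05, 0.01])
(a1, b1), _ = curve_fit(lq, doses, surv_np, p0=[0.2, 0.02])

def dose_at(sf, alpha, beta):
    """Dose (Gy) at which the fitted survival curve crosses sf."""
    return brentq(lambda D: lq(D, alpha, beta) - sf, 0.01, 50.0)

# SER at 10% survival: dose needed without vs with the sensitizer.
ser = dose_at(0.10, a0, b0) / dose_at(0.10, a1, b1)
print(f"Sensitizer enhancement ratio at 10% survival: {ser:.2f}")
```

The dose enhancement factor from gel dosimetry is simpler: it is just the ratio of the doses measured with and without nanoparticles under identical irradiation.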

In vivo investigations

Having proved the dose enhancement effect in vitro and using gel dosimetry, the researchers moved on to in vivo radiotherapy studies. They injected mice with 4T1 cells and divided the animals into four groups to receive: no treatment; intra-tumour injection of BiGdO3-PEG nanoparticles; X-ray irradiation at 10 Gy; nanoparticle injection plus 10 Gy irradiation.

In the control group, tumour volume steadily increased, growing by about 10 times after 20 d. Irradiating the mice slowed this tumour growth, with tumour volume increasing by about 6 times and 4.5 times after 20 d without and with BiGdO3-PEG nanoparticles, respectively. These results confirmed the radiosensitizing effect of the nanoparticles.

Finally, the team evaluated the use of BiGdO3 nanoparticles as a dual-modal imaging agent for CT and MRI. Comparing T1 MR images of mice tumours before and after intra-tumour nanoparticle injection showed that the nanoparticles increased the signal intensity and brightness. Likewise, CT images of mice demonstrated increased CT values after nanoparticle injection, confirming contrast enhancement.

Following these findings, the researchers propose that the nanoparticles could prove valuable for image-guided radiation therapy, not only improving CT and MR images but also increasing the local radiation dose. “Our results show that the nanoparticles deserve further study in radiation therapy,” says Rajaee.

The researchers note that for this proof-of-concept investigation they injected the nanoparticles directly into the tumour. "However, for investigation of BiGdO3 distribution in tissues and organs, the nanoparticles should be injected intravenously," Rajaee tells Physics World. "So further research needs to be done on surface modification of the nanoparticles with neutral polymers, and optimizing the size to enhance absorption into tumours according to the enhanced permeability and retention effect."

Flooding set to increase in Ganges-Brahmaputra-Meghna river basin

Warming of 1.5 °C is often portrayed as a “safe” limit, but the consequences of climate change are not distributed evenly around the world. For some regions even 1.5 °C of warming will have serious consequences. A new study shows that 1.5 °C of warming will bring significant increases in extreme precipitation events and flooding to the highly populated Ganges-Brahmaputra-Meghna Basin.

The Ganges-Brahmaputra-Meghna river system covers a wide area through Bangladesh, Bhutan, China, India and Nepal. Monsoon rains between June and September dominate the system. In 2017 monsoon flooding produced record river levels, resulting in around 1200 deaths and severe loss of crops and infrastructure. Will events like the 2017 floods become more common with 1.5 °C of warming?

Previous studies used high-emission scenarios to look at the impact of warming on this region. But no-one had investigated the impact of a low-emission scenario aimed at stabilizing the climate at 1.5 °C or 2 °C of warming. Some climate drivers, such as aerosol emissions, are likely to differ under a low-emission scenario and could therefore produce a different precipitation response at similar levels of warming. To investigate this further, Peter Uhe from the University of Bristol, UK, and colleagues combined a high-resolution flood hazard model for the Ganges-Brahmaputra-Meghna region with climate model simulations for 1.5 °C and 2 °C of warming.

All but one of the models analysed show a significant increase in rainfall in the region, with the trend being stronger for extreme rainfall. Flood hazard also increases: for floods with a 1-in-5-year probability, the flooded area under 1.5 °C of warming is estimated to increase by between 7% and 25% compared with the present day.
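
The “1 in N year” language refers to annual exceedance probability: a 1-in-5-year flood has a 20% chance of occurring in any given year, and over a multi-decade horizon the cumulative odds climb quickly. A short illustration of the standard return-period arithmetic (a textbook formula, not part of the study’s modelling):

```python
# Probability of at least one T-year flood within n years,
# assuming independent years: P = 1 - (1 - 1/T)**n.
def prob_at_least_one(T, n=20):
    return 1.0 - (1.0 - 1.0 / T) ** n

for T in (5, 20, 100):
    print(f"Chance of a 1-in-{T}-year flood within 20 years: "
          f"{prob_at_least_one(T):.0%}")
# Prints roughly 99%, 64% and 18% respectively.
```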

“Bangladesh will be particularly vulnerable to the more frequent floods because it is flat and very low-lying and a large portion of the country is a flood plain,” explains Uhe, whose findings are published in Environmental Research Letters (ERL).

More severe floods (1-in-20-year and 1-in-100-year probability) also increase in frequency and extent under the 1.5 °C and 2 °C warming scenarios. The percentage increase in area is not as great as for the 1-in-5-year floods, but the waters are deeper. This is partly because of the topography of river valleys: the more frequent floods spill out onto the flatter valley bottom, whilst more severe floods tend to create deeper waters because they are spatially constrained by the steeper valley sides.

“These floods could become more dangerous by impacting areas that are not often flooded, which may be less prepared,” says Uhe.

Even under low emission scenarios and just 1.5 °C of warming, the research shows that there will be an increase in extreme precipitation events for this region, and a greater likelihood of flood events like the one seen in 2017.

“People living there may have to change how and where they build, and their farming practices,” says Uhe. “Bangladesh, for example, has a large amount of flood defences along the Brahmaputra river and these could be modified to adapt to future flood risk. However, these measures will not remove the flood risk completely.” Preparing and adapting will be vital.
