Global temperature increases by the end of the century could be more than twice as high as previously forecast unless international policy action is taken. That is the conclusion of scientists using the Integrated Global Systems Model (IGSM), a project funded in part by the US Department of Energy.

IGSM is unique amongst climate predictors because it is underpinned by a flexible economic model that projects future changes in human activities such as trade between nations. Climate scientists at the Massachusetts Institute of Technology (MIT) have now, for the first time, run the model taking into account physical factors such as the cooling effect of volcanic eruptions.

The researchers predict a 90 % probability that surface temperatures will be 3.5° to 7.4° higher by 2100 under a scenario with no policies specifically aimed at reducing greenhouse gas emissions. These temperature increases are more than twice those predicted by the previous version of IGSM, which was run back in 2003. The model was also run for different scenarios involving “strong” policies to curb emissions, in which case the temperature never rose above 2.5°, a result relatively unchanged from the 2003 prediction.

The findings are published this month in the American Meteorological Society’s Journal of Climate.

Living with uncertainty

“4° is a very, very dangerous amount of warming - that’s 8° Celsius of polar warming,” said Ronald Prinn, one of the MIT modellers, speaking earlier this year at the European Geosciences Union Conference in Copenhagen.

The standard international reference for climate predictions is the SRES scenarios of the Intergovernmental Panel on Climate Change (IPCC) - a body which shared the Nobel Peace Prize in 2007. However, whilst the IPCC makes detailed predictions for the end of the 21st century, there is still a wide range of uncertainty within each prediction. What’s more, the IPCC scenarios are deliberately independent of policy and projected human responses.

Frustrated by the continued lack of clarity in climate change predictions, Prinn and his colleagues set out to quantify the likelihoods of specific climate outcomes. For each climate scenario they carried out 400 runs, where each run involved slight variations in the input parameters, with each set of parameters equally likely. In this way they could quantify the uncertainty in both the input parameters and the climate responses.
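The ensemble approach described above can be sketched in a few lines of Python. This is a toy stand-in, not the real IGSM: the "model", the parameter names and their ranges are all illustrative assumptions, but the logic - draw equally likely parameter sets, run the model for each, and read probabilities off the spread of outcomes - mirrors the 400-run method.

```python
import random

random.seed(42)

N_RUNS = 400  # the MIT team performed 400 runs per scenario


def toy_climate_model(sensitivity, forcing):
    """Hypothetical stand-in model: warming scales with climate
    sensitivity and radiative forcing. The real IGSM couples a full
    economic model to the physical climate system."""
    return sensitivity * forcing


# Sample each parameter set as equally likely (uniform draws here;
# the actual study used formally derived probability distributions).
warmings = []
for _ in range(N_RUNS):
    sensitivity = random.uniform(1.5, 4.5)  # assumed range, ° per unit forcing
    forcing = random.uniform(1.0, 2.0)      # assumed range, arbitrary units
    warmings.append(toy_climate_model(sensitivity, forcing))

# The probability of exceeding a threshold is simply the fraction of
# ensemble runs that exceed it.
threshold = 4.0
p_exceed = sum(w > threshold for w in warmings) / N_RUNS
print(f"P(warming > {threshold}°) is roughly {p_exceed:.2f}")
```

Because every parameter set is drawn with equal probability, the fraction of runs crossing a threshold is a direct estimate of the likelihood of that outcome - which is how a statement like "an 85 % chance of exceeding 4°" arises from the ensemble.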

The new standard?

The MIT scientists find that a business-as-usual approach to greenhouse gas emissions will result in 1400 parts per million of carbon dioxide equivalent by 2091-2100, leading to an 85 % chance of temperatures rising by more than 4°. However, in the case where carbon dioxide equivalent levels were stabilized at 552 parts per million, none of the 400 forecasts led to an increase of more than 4°. To further enhance the clarity of the results, Prinn and his colleagues conceptualize the scenarios as a game of roulette, in what they call the “climate gamble”.
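The roulette metaphor can be made concrete with a short sketch: divide a wheel into segments whose angles are proportional to the probabilities of the different warming outcomes. The probabilities and outcome bands below are illustrative placeholders, not figures from the study.

```python
# Toy sketch of the "climate gamble" wheel: each outcome gets a slice
# of the 360° wheel proportional to its probability. The numbers here
# are assumed for illustration only.
outcome_probs = {
    "warming above 7°": 0.05,
    "warming 5° to 7°": 0.40,
    "warming 4° to 5°": 0.40,
    "warming below 4°": 0.15,
}

# Convert probabilities to wheel-segment angles in degrees.
segments = {label: p * 360 for label, p in outcome_probs.items()}
for label, angle in segments.items():
    print(f"{label}: {angle:.0f}° of the wheel")
```

Spinning such a wheel once is one possible future; the larger a slice, the likelier that outcome, which is what makes the wheel an intuitive way to present ensemble probabilities to non-specialists.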

The MIT researchers say that many factors contribute to the stronger warming in these latest predictions. In particular, the models now take into account volcanic eruptions, which helped to cool the Earth in the second half of the 20th century. They also say that a more sophisticated method was used for projecting growth in Gross Domestic Product (GDP), which eliminated many low-emission scenarios.

“Whereas the IPCC scenarios can reproduce climate histories very accurately, they don’t take into account all the uncertainties of such a complicated system,” says John Reilly, one of the MIT scientists.

Richard Tol, an economist who studies the impacts of climate change, told physicsworld.com that these findings are an important development in climate modelling. “The study shows that the IPCC may have underestimated the size of the climate problem,” he said. However, he still has reservations about the economic predictions. “The model was calibrated in 2008, to data that had rapid economic growth and very energy-intensive growth at that,” he added.