
Web life: ComplexityBlog.com

URL: http://complexityblog.com

So what is the site about?

The site’s homepage calls it “a repository of ideas and perspectives regarding the science, engineering and philosophy of complexity”, and it pretty much does what it says on the tin. Part blog, part links archive, part library of modelling tips and tricks, the site is chock full of information that comes under the general heading of “complexity”.

What are some of the topics covered?

One of the joys – or problems, depending on your perspective – of complexity science is that almost anything worth thinking about can, potentially, fall under its aegis. The blog section of the site celebrates this trait. In between “serious” posts on topics like game theory, cosmology and the methodology of agent-based models, you can also find oddities like an evolutionary theory of dating and a map of the popular (and notoriously complex) TV series Lost. The non-blog portion of the site features a short-but-helpful glossary of complexity-related terms, a list of popular books on complexity, and a host of useful links to academic institutes and private companies involved in complexity research.

Who is behind it?

The site began as a joint effort by two University of Michigan PhD students, Aaron Bramson and Kenneth Zick, both of whom are interested in the science of complexity. Bramson’s academic background is in mathematics, economics and philosophy, and a lot of his posts focus on the social side of complexity theory. Zick, who left the project in late 2007, is a computer scientist and engineer with a keen interest in topics like game theory and evolutionary algorithms.

Why should I visit?

Because this is the nearest thing to a one-stop complexity shop available on the Web today. There have been other good sites in the past, notably The Complexity and Artificial Life Research Concept, but sadly, although www.calresco.org is still active, it suffers from an outdated site design and multiple broken links. Only time (and its authors’ future career paths) will determine whether ComplexityBlog.com meets a similar fate, but for the moment, there is plenty of up-to-date, fresh material here for experts and complexity neophytes alike.

Can you give me a sample quote?

Bramson, while discussing whether complexity science should have its own separate department within universities, notes that “Complex systems is still a young science; it suffers from conceptual immaturity and technical inability. It might appear to be a backwards argument to claim that we ought to establish a degree-granting institution so that individuals can work to make the discipline credible. But we must recognize that pursuing a clearer understanding of the underlying mechanisms of phenomena…will simultaneously enlarge [other] fields and ground complex systems. And by tying those mechanisms across disciplines, we all gain a better understanding of the individual applications.”

Obama cancels Moon return

US President Barack Obama has ended plans to return astronauts to the Moon by 2020. The administration’s budget request for the financial year (FY) 2011, announced yesterday, proposes cancelling the Constellation programme – outlined by President George W Bush in 2004 to develop, test and operate spacecraft that would return humans to the Moon by the end of the next decade – and makes no new provision for future manned missions to the Moon or Mars.

In place of Constellation, the Obama administration calls for “a bold new course for space exploration and scientific research” that will extend operation of the International Space Station (ISS) to at least 2020 and rely on commercial launch services to ferry astronauts to the station. According to NASA administrator Charles Bolden, NASA will “invest in critical and transformative technologies [that] will enable our path beyond low Earth orbit through development of new launch and space transportation technologies, nimble construction capabilities on orbit.”

To encourage those initiatives, the administration’s proposed budget gives NASA an extra $6bn over the next five years. For FY 2011, which starts on 1 October 2010, the proposal provides $11bn for NASA’s research budget – an increase of 18.3% over the FY 2010 figure. Obama’s request calls for NASA’s budget to rise to $19bn for FY 2011, with further increases taking the agency’s budget to $21bn in 2015.

‘Lacking innovation’

The FY 2011 budget is the first to be drawn up entirely under President Obama’s administration. Dismissing Constellation as “over budget, behind schedule and lacking in innovation”, Obama called on NASA to support the commercial spaceflight industry. The proposed budget would provide roughly $50m to a handful of companies to develop commercial support for human spaceflight.

The administration also intends to increase collaboration with other space-faring countries, and to develop new approaches to space exploration. “Imagine trips to Mars that take weeks instead of nearly a year, people fanning out across the inner solar system, exploring the Moon, asteroids and Mars nearly simultaneously in a steady stream of firsts,” Bolden told a press conference. “And imagine all of this being done collaboratively with nations around the world…we can’t underestimate the rich promise of space exploration to draw nations together, and this budget gives us the means and the guidance to build even stronger alliances in the future.”

Norman Augustine, the former chief executive of Lockheed Martin who chaired the panel that provided human spaceflight options for NASA last year, gave the budget proposal guarded approval. “We found that the current Constellation programme was unsustainable and was highly unlikely to get humans to the ISS before its planned de-orbit or back to the Moon until roughly 20 years in the future,” he explains. “While many of us who believe strongly in human spaceflight might have hoped that still further funding would have been possible, this is obviously a demanding period from a budgetary standpoint.”

Budget boosts

The scientific community’s fears that the administration would reduce funding for research proved unfounded. The budget proposal calls for an extra $824m – a rise of 6.6% from the FY 2010 figure – for the three major US science agencies: the National Science Foundation, the National Institute of Standards and Technology, and the Department of Energy’s Office of Science. “Even after adjusting for the expected inflation of 1.1% in the coming year, these focused increases in science and technology R&D promise to accelerate America’s economic advancement and assure America’s position as a global leader well into the future,” says presidential science adviser John Holdren.

In another change of direction, the proposed budget almost trebles (from $18.5bn to $54.5bn) loan guarantees intended to help the US nuclear energy industry build new reactors. The guarantees cover 80% of the costs of building new reactors. Energy secretary Steven Chu has also appointed a commission that will study interim options for storing nuclear waste. The administration had already decided not to pursue the use of Nevada’s controversial Yucca Mountain site as a waste repository.

The budget request, however, is just that: a request. Congress will inevitably demand changes before it approves the budget. The proposals for NASA’s future have already met opposition from representatives of states with heavy investments in the Constellation contracts. Other critics have argued that the administration cannot justify the cost of cancelling the programme – about $2.5bn beyond the $9bn already spent on it.

Drop in warming linked to water vapour decrease

Much of the flattening in global temperature rises seen over the last decade can be attributed to a reduction in the concentration of water vapour in the lower stratosphere, according to a group of US climate scientists. Conversely, it is likely that a significant part of the warming observed during the 1990s was due to a rise in such concentrations. The inability of current climate models to incorporate these effects represents a significant weakness in those models, say the researchers.

Global average temperatures on the surface of the Earth have increased by about 0.13 °C per decade over the last half century. In its 2007 assessment of global warming, the Intergovernmental Panel on Climate Change (IPCC) concluded that it is very likely that most of this rise is due to the accumulation of manmade greenhouse gases, such as carbon dioxide and methane, in the atmosphere. However, water vapour, which is responsible for some 60% of the overall greenhouse effect that keeps the planet habitable, also contributes to global warming via feedback effects.

These effects are relatively well understood in the lowest level of the atmosphere, the troposphere, where increased warming leads to greater evaporation, causing more water vapour and so further warming, although this is offset to some extent through the formation of clouds that reflect incoming sunlight back into space. However, the climatic effects of water vapour in the layer immediately above the troposphere, the stratosphere, are not well understood.

Difficult to model

Scientists know that levels of water vapour have been steadily rising in the upper part of the stratosphere owing to the oxidation of methane that has been building up there since the industrial revolution. But understanding what happens lower down is more difficult. Water vapour is transported upwards from the troposphere, but that transport is controlled by temperatures in a thin boundary layer in the tropics known as the tropopause. Unfortunately, modelling this process accurately requires a finer vertical resolution than most climate models possess.

To work out how much global warming these models might be missing by not tracking the fluctuating levels of water vapour in the lower stratosphere, Susan Solomon and colleagues at the National Oceanic and Atmospheric Administration’s Earth System Research Laboratory in Colorado, together with Gian-Kasper Plattner of the University of Bern in Switzerland, took the observed changes in these concentrations, worked out the heating or cooling effect of these changes and then plugged the resulting numbers directly into a model.

Using satellite data showing a 10% drop in concentrations over the last decade, the researchers found that global surface temperatures rose some 25% more slowly than they would have done had water vapour levels remained static – an increase of 0.10 °C per decade rather than 0.14 °C. Applying the same approach to data collected by balloon-borne atmospheric instruments, showing a rise in water-vapour concentrations between 1980 and 2000, the group found that global warming in the 1990s had been enhanced by about 30%.
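As a rough consistency check – using only the trend figures quoted above, not the team’s radiative calculations – the quoted slowdown can be reproduced with a couple of lines of arithmetic:

```python
# Back-of-the-envelope check using only the decadal trends quoted above
# (a rough sketch, not the authors' radiative-transfer calculation).
trend_with_drying = 0.10   # deg C per decade, with the observed 10% vapour drop
trend_no_change = 0.14     # deg C per decade, had stratospheric vapour stayed constant
slowdown = (trend_no_change - trend_with_drying) / trend_no_change
print(f"Warming slowed by about {slowdown:.0%}")  # ~29%, i.e. "some 25%" slower
```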

Long-term negative feedback?

Solomon and colleagues say they don’t know what is causing these changes in concentration. A possible correlation with tropical sea-surface temperatures suggests that levels of stratospheric water vapour might respond systematically to global warming, providing a long-term negative feedback to such warming (since warmer seas in the tropics reduce the transfer of water vapour from the troposphere to the stratosphere). But it is also possible that such concentration changes only take place over a period of a decade and therefore do not provide a long-term check on climate change.

Indeed, Gavin Schmidt, a climate scientist at NASA’s Goddard Institute for Space Studies in New York, points out that the El Niño event of 1997–1998 might have moistened the lower stratosphere more than usual, with the result that this part of the atmosphere has been drying ever since. Alternatively, he says, the cause might be increases in aerosol emissions in Asia, since these have affected temperatures in the tropics.

No matter what the origin is, however, Karen Rosenlof, a member of Solomon’s team, says it is now clear that stratospheric water vapour has a significant effect on global warming and that models’ inability to take this effect into account is a significant failing. “Given the calculated 25% drop in decadal warming,” she points out, “you could say that these models are only 75% right.” But she maintains that this result does not mean that the IPCC is barking up the wrong tree. “It doesn’t change the conclusion that global warming is manmade,” she says.

The research has been published online in Science.

Lasers zap fusion doubts at NIF

Researchers in the US say they have made a crucial breakthrough towards achieving laser fusion and that they expect to generate the conditions for a sustained nuclear reaction by the end of the year. These claims are backed up by the publication of the first science results from the National Ignition Facility (NIF), and among the highlights was a new world record for laser intensity.

Four times over budget and five years behind schedule, the NIF project has been under pressure to deliver since it finally began its operations last March at the Lawrence Livermore National Laboratory (LLNL) in California. NIF’s main goal is to focus its lasers – capable of delivering 1.8 MJ of energy, more than 60 times more than any other laser machine in existence – onto a hollow sphere 2 mm in diameter made of beryllium.

Sparking ignition

By dumping all of this energy into a tiny volume, the aim is to try to squeeze together the hydrogen isotopes deuterium and tritium contained inside the sphere. This could then spark “ignition”, which is the point at which the deuterium and tritium undergo sustained nuclear fusion that produces excess energy.

In these first experiments, the NIF scientists have been working to configure conditions within the targets’ surrounding containers, known as “hohlraums”, which enclose the fuel capsules. They zap these tiny targets with 192 lasers fired in synchrony to deposit 680 kJ of energy in the space of 10 billionths of a second – more than 20 times more power than any previous laser experiment.
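For a sense of scale, the figures quoted above translate into a peak power of tens of terawatts – a rough illustrative calculation, since the article does not give the baseline for the “20 times” comparison:

```python
# Rough arithmetic on the figures quoted above: 680 kJ delivered in about
# 10 ns corresponds to tens of terawatts of power (illustrative only).
energy_joules = 680e3      # 680 kJ deposited on the target
pulse_seconds = 10e-9      # 10 billionths of a second
power_watts = energy_joules / pulse_seconds
print(f"Average power during the pulse: {power_watts:.1e} W "
      f"(~{power_watts / 1e12:.0f} TW)")
```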

NIF scientists are relieved by these results because there had been a fear that conditions required for fusion within the hohlraums could be fatally disturbed by the laser pulses, which could trigger damaging plasmas. Fortunately, this effect did not prevent the researchers from emulating fusion conditions – compressing 1.8 mm capsules and forcing radiation temperatures of 3.3 million K.

Full-sized hohlraums

“We show that we can heat hohlraums to temperature and radiation symmetry close to what is needed for ignition,” says Siegfried Glenzer, an LLNL researcher who was involved in this research. “When we extrapolate the results of the initial experiments to higher-energy shots on full-sized hohlraums, we feel we will be able to create the necessary hohlraum conditions to drive an implosion to ignition later this year.”

Chris Edwards, a researcher at the Central Laser Facility in the UK, describes these results as “a very exciting development” and sees no reason to doubt the claim that NIF will achieve ignition within a year. “I am holding my breath and very excited about the opportunities this could open up for energy applications and astrophysical research,” he says.

The NIF project is a national collaboration between the US government, industry and academia with the aim of protecting national and global security. One application in the shorter term will be to use the facility to validate computer simulations of nuclear weapons to ensure that the US’s nuclear stockpile is safe. In the longer term it is hoped that NIF could lead the way towards practical fusion energy.

This research is published in Science.

Cyclone model forecasts more Katrina-like storms

The number of very intense hurricanes striking the east coast of North America and the Caribbean could increase over the next century if ocean temperatures continue to rise. The total number of storms, however, is set to fall. That is according to researchers in the US who have applied a popular model for forecasting cyclone activity to a series of climate projections made by the Intergovernmental Panel on Climate Change (IPCC).

Hurricane Katrina, which struck the southern US including New Orleans in 2005, is an example of the sort of havoc that can be wreaked when North Atlantic cyclones make landfall. Katrina was a category 5 hurricane – the most intense grade – with sustained wind speeds as strong as 280 kilometres per hour. Since the 1980s there has been an increase in the number of category 4 and 5 hurricanes forming over the North Atlantic.

Meanwhile, several modelling groups have predicted that rising sea temperatures in the North Atlantic will lead – perhaps counter-intuitively – to a reduction in hurricane activity over the North Atlantic over the course of the 21st century. These climate models, however, fail to predict the nature of individual storms because they lack the spatial resolution to determine hurricane intensity. In this latest research Morris Bender and his colleagues at the National Oceanic and Atmospheric Administration (NOAA) have addressed this shortcoming in a three-step process.

Seeking the eye of the storm

To begin with, Bender and colleagues predict the effect of rising sea temperatures on the large-scale flow of hurricanes over the Atlantic, with a resolution of about 200 km. They use data based on the average of 18 global models that contributed to the latest scientific report of the IPCC. In the second stage, the researchers feed this information into a regional model of the atmosphere over the Atlantic, known as the ZETAC model. At this point, however, they have a spatial resolution of about 18 km, which can only produce storms of the lowest intensity – categories 1 and 2.

The key comes in the final stage when Bender’s team feeds the ZETAC data into a hurricane model that has been used by the US navy since 1995 to predict storms. In this way the researchers are able to resolve the intense winds that circulate close to the eye of the storm in category 4 and 5 hurricanes, made possible by a spatial resolution of 8 km. Called the GFDL Hurricane model, it predicts nearly a doubling of category 4 and 5 hurricanes by the end of the 21st century.
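The nested, three-step chain can be pictured as a simple pipeline. The sketch below is purely conceptual – the function names and data structures are placeholders, not the actual NOAA/GFDL software:

```python
# Conceptual sketch of the three-step downscaling described above.
# Function names and return values are illustrative placeholders only.

def project_large_scale_flow(ipcc_ensemble_mean):
    # Step 1: ~200 km projection of the large-scale Atlantic flow,
    # based on the average of the 18 IPCC global models.
    return {"resolution_km": 200, "flow": ipcc_ensemble_mean}

def run_zetac_regional_model(large_scale):
    # Step 2: ~18 km ZETAC regional model of the atmosphere over the
    # Atlantic; at this resolution only category 1-2 storms emerge.
    return {"resolution_km": 18, "storms": "category 1-2", "input": large_scale}

def run_gfdl_hurricane_model(regional):
    # Step 3: ~8 km GFDL hurricane model, fine enough to resolve the
    # eyewall winds of category 4 and 5 storms.
    return {"resolution_km": 8, "storms": "up to category 5", "input": regional}

forecast = run_gfdl_hurricane_model(
    run_zetac_regional_model(
        project_large_scale_flow("warmer sea-surface temperatures")))
print(forecast["storms"])
```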

Bender admits that he did not expect quite such an increase, but he is not surprised that rising sea temperatures may have dramatic effects. “We may have the broad predictions for climate but the specific impacts and their locations are far from a done deal,” he tells physicsworld.com.

This research is published in Science.

Organic transistor mimics brain synapse

Researchers in France claim to have made the first transistor that mimics connections in the human brain. The device, which is based on pentacene and gold nanoparticles, could lead to a new generation of neuro-inspired computers as well as help connect artificial structures to biological tissue.

Dominique Vuillaume of the University of Lille and colleagues studied how electric charges flow through the device and discovered that they behave in the same way as chemical neurotransmitters moving through a synaptic connection in the brain. “This is the first time that an electronic device has been shown to mimic a biological synapse,” Vuillaume told physicsworld.com.

The team, which includes scientists from the CNRS (the French National Centre for Scientific Research) and the CEA (the French Atomic Energy Commission), began by adding gold nanoparticles to the interface between an insulating layer (gate dielectric) and an organic transistor made of pentacene. They fixed the nanoparticles, which were 5, 10 and 20 nm in diameter, into the source-drain channel of the device using surface chemistry techniques and finished the structure by covering it with a 35 nm thick film of pentacene. The resulting device is called a nanoparticle organic memory field-effect transistor or “NOMFET”.

Short-term plasticity

A biological synapse transforms a voltage spike (action potential) arriving from a pre-synaptic neuron into a discharge of chemical neurotransmitters that are then detected by a post-synaptic neuron. These are subsequently transformed into new spikes, leading to a succession of pulses that either become larger or diminish in size. This fundamental property of synaptic behaviour is known as short-term plasticity, which is related to a neural network’s ability to learn. It is this plasticity that Vuillaume and colleagues have succeeded in mimicking.

In the NOMFET, the pre-synaptic signal is simply the pulse voltage applied to the device and the output signal is the drain current, explains Vuillaume. The holes – the charge carriers in the p-type organic semiconductor employed – are trapped in the nanoparticles and act like the neurotransmitters. A certain number of holes are trapped for each incoming spike voltage and, in the absence of pulses, the holes escape in a matter of seconds.

The researchers carefully tune this time delay by optimizing the number of nanoparticles and the device geometry. “The output of the NOMFET is thus able to reproduce the decreasing or amplifying behaviour typical of a synapse depending on the frequency of spikes,” said Vuillaume.
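One way to picture this frequency-dependent response is a simple “trap-and-escape” toy model. The sketch below is an illustration only (the parameter values are arbitrary and it captures just the depressing side of the behaviour Vuillaume describes), not the team’s device physics:

```python
import math

# Toy model of the short-term plasticity described above. Each voltage
# spike traps some holes in the nanoparticles, lowering the normalized
# drain current; trapped holes then escape with a time constant of seconds.
def nomfet_output(spike_interval_s, n_spikes=10, tau_s=2.0, trap_fraction=0.15):
    trapped = 0.0
    currents = []
    for _ in range(n_spikes):
        trapped *= math.exp(-spike_interval_s / tau_s)  # holes escape between spikes
        trapped += trap_fraction * (1.0 - trapped)      # each spike traps more holes
        currents.append(round(1.0 - trapped, 3))        # normalized drain current
    return currents

print(nomfet_output(0.1))   # rapid spikes: output sags pulse by pulse (depression)
print(nomfet_output(10.0))  # slow spikes: traps empty in between, output recovers
```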

Neuro-inspiration

The technique could be used to build nanoscale devices for neuro-inspired computers, he added. “The human brain contains more synapses than neurons by a factor of 10^4 so we need to develop nanoscale, low-power, synapse-like devices if we want to scale neuromorphic circuits to the brain level.”

Although neural networks based on silicon chips have already been developed and used in certain applications, such approaches are limited because it takes at least seven transistors to build one electronic synapse. In this latest work, the same job is done with just a single NOMFET device.

The devices could also be used to increase the performance of neural-network computing circuits. And because the nanomaterials employed work on flexible, plastic substrates, they might be used to connect artificial neuromorphic circuits based on the NOMFET to “soft” biological tissue, speculates Vuillaume.

The work was reported in Advanced Functional Materials.

Planetary physics shrunk into a lab as MIT pursues fusion

Researchers in the US have simulated a magnetic field structure normally produced by the core of a planet, and they say that their design could lead to an efficient way of harnessing nuclear fusion for power generation. The experiment, based at the Massachusetts Institute of Technology (MIT), could also provide an opportunity for space physicists to model the dynamics of planetary magnetic fields and their interaction with charged particles from space.

Nuclear fusion is the powerhouse of stars, releasing vast amounts of energy and forming heavier elements – the building blocks of the world we see around us. Some physicists believe that fusion could be harnessed as a source of energy here on Earth by combining deuterium and tritium at high temperatures to form helium-4 plus a neutron. The abundance of its raw materials, the absence of direct carbon dioxide emissions, and the minimal amount of harmful waste are among fusion’s major selling points.

One of the most promising ways of reaching the appropriate temperature and pressure is to use magnetic fields to “confine” plasma – clouds of ionized gas. In the majority of these experiments, plasmas are confined inside large doughnut-shaped vessels called tokamaks. Physicists have so far failed, however, to get more energy out of a tokamak than the energy used to heat and confine the plasma.

Planetary inspiration

In this latest research, Michael Mauel of Columbia University, New York, and his colleagues explore an alternative design inspired by observations of planetary magnetic fields. They suspend a half-ton magnet using powerful electromagnetic fields, and use this to manipulate plasma at 10 million K trapped inside a steel ring structure in an experiment called the Levitated Dipole Experiment, or LDX. The results confirm the researchers’ prediction that random turbulence inside the magnetic chamber increases the density of plasma – a crucial step towards fusion.


“This experiment was inspired by space research that has occurred over the past 50 years,” says Mauel. “Satellites have explored the magnetospheres of planets such as Earth’s or Jupiter’s and these space observations showed a dipole magnetic field could confine hot ionized matter at high pressure.”

Mauel says that the LDX has distinct benefits over tokamak experiments because the dipole magnetic field is not “twisted or helical” and the plasma is able to circulate from the edge to the hot core without producing a drain on the plasma’s energy. He says that confining fusion with dipole fields would be particularly suitable for so-called “second-generation” fusion fuels, which avoid the need to breed radioactive tritium from lithium – a requirement of the deuterium–tritium fuel favoured in tokamaks.

Mauel believes that these results could also aid space science. “These results will be of interest to space physicists who study the dynamics of ionized gases confined to outer space by the dipole magnetic field of planets.”

To develop their work, the researchers intend to create hotter plasmas to increase the rate of fusion. They also wish to improve the precision of temperature measurement in their experiment.

This research is published in Nature Physics.

Carbon-cycle feedback smaller, but still positive

Researchers in Switzerland and Germany have analysed data stretching back 1000 years to get the best estimate yet of how changes in global temperature affect the biosphere’s ability to soak up carbon dioxide. The team found that this feedback coefficient is about five times smaller than previously expected – which suggests that the amplification of manmade global warming by carbon-cycle feedback will be less than previously thought. Furthermore, the reduced uncertainty of this latest result could lead to better predictions of climate change caused by increasing amounts of carbon dioxide in the atmosphere.

To understand climate change, scientists need to know how changes in global temperature affect the amount of carbon dioxide in the atmosphere. Rising temperatures could, for example, turn a green landscape into a desert, which would reduce that region’s ability to absorb carbon dioxide. Conversely, a warmer climate could lengthen the growing season in mid and high latitudes, increasing the absorption of carbon dioxide in these places. Changes in temperature could also affect the amount of carbon dioxide produced by the vast numbers of micro-organisms in soil.

The overall effect of this “feedback” is expected to be positive – higher temperatures lead to less carbon dioxide being absorbed, which means more of the gas in the atmosphere, which in turn makes the climate even warmer. However, scientists have struggled to get a precise value for the feedback coefficient – a process that involves studying historical carbon dioxide and temperature data.

Large uncertainty

The best estimate had been that a rise in the mean global temperature of 1 °C boosts the carbon dioxide concentration by about 40 parts per million by volume (40 ppmv/°C) – but this could be off by 30 ppmv/°C or more. Such a large uncertainty makes the prediction of future carbon dioxide levels – and therefore future temperatures – all the more difficult. Indeed, 40% of the uncertainty in such predictions can be attributed to carbon dioxide feedback.

Now, David Frank and colleagues at the Swiss Federal Research Institute in Birmensdorf, the University of Bern and the Gutenberg University in Mainz have performed the most comprehensive analysis of carbon dioxide and temperature data yet. The team studied the period 1050–1800 AD, when manmade emissions were small enough to be ignored. Carbon dioxide levels were determined from three Antarctic ice cores. Average temperatures in the northern hemisphere were derived from nine different “proxy reconstructions” of temperature – average temperatures derived mostly from tree rings and the isotopic content of ice cores.

Two distinct periods

Frank and colleagues conclude that the feedback coefficient is probably about 2–21 ppmv/°C, with 8 ppmv/°C being the median value. The team also found that the coefficient appears to be significantly different in the periods 1050–1549 and 1550–1800 – when it was about 4 and 16 ppmv/°C respectively. The former era corresponds roughly to the “medieval warm period” and 1550–1800 to the “little ice age” – which experienced different patterns of global temperature and precipitation. As a result, Frank and colleagues believe that the shift from one period to the next could have decreased the carbon storage capabilities of some parts of the globe.
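To see what the revised coefficient means in practice, the figures quoted above can be turned into a simple worked example (an illustration based on the quoted numbers, not a calculation from the paper):

```python
# Illustrative arithmetic using the coefficients quoted above.
warming_deg_c = 1.0               # hypothetical rise in mean global temperature
gamma_previous = 40               # ppmv of extra CO2 per deg C (earlier best estimate)
gamma_new_median = 8              # ppmv per deg C (median of the new 2-21 range)
for label, gamma in [("previous estimate", gamma_previous),
                     ("new median", gamma_new_median)]:
    print(f"{label}: ~{gamma * warming_deg_c:.0f} ppmv extra CO2 "
          f"for {warming_deg_c:.0f} deg C of warming")
# The roughly five-fold smaller coefficient implies a correspondingly
# weaker carbon-cycle amplification of manmade warming.
```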

The research excludes the previously accepted value of 40 ppmv/°C with a confidence of 95% and provides further evidence that the coefficient is positive rather than negative. The latter is important because it suggests that the biosphere will not be able to soak up manmade emissions of carbon dioxide, which are believed to contribute to global warming.

The work is described in Nature 463 527.

Helium sell-off risks future supply

The US must stop selling off its helium reserves so that the country has enough of the gas to meet the needs of researchers and medical programmes. That is the conclusion of a new report, Selling the Nation’s Helium Reserve, published by the National Academy of Sciences (NAS). It says that failure to halt the sale of helium could lead to a drop in the supply of the gas, which is vital for magnetic resonance imaging (MRI) and for research in low-temperature physics.

Helium is mostly produced by extracting it from natural gas fields, which contain up to 7% helium. There is estimated to be about 8.6 million tonnes of helium in the world, with the US having the biggest fraction of reserves at 35% followed by Qatar with 20%. About 32,000 tonnes of helium were produced around the world in 2008, three-quarters of which came from the US alone.

The US began building a huge stockpile of helium in 1925 as a strategic supply of gas for airships, and the reserve later became an important source of coolant for rockets during the Cold War. In 1996, however, the Helium Privatization Act came into law allowing US companies to recover and sell the helium, which is mainly stored in Amarillo, Texas, in a natural geological gas storage rock formation. The act was deliberately designed to exhaust the US stockpile by 2015 so that the government could recoup the cost of setting up the facility.

However, the report says that selling off the helium stockpile “has adversely affected critical users of helium and is not in the best interest of the US taxpayers or the country.” One problem posed by the sell-off, which the report says accounts for up to a third of global demand, is that the price for helium is low, being “not set by current market conditions but by the terms of the 1996 Act”. The fear is that once the supplies run out, the price will shoot up.

The NAS report, written by an 18-strong panel led by geologist Charles Groat from the University of Texas, recommends opening the price of federally owned helium to the market. It also warns that the US will become a net importer of helium in the next 10–15 years if it does not stop selling its reserves and will have to get its helium from new sources such as from the Middle East or Russia.

“US government sales are assumed to end by 2015, with future supplies coming from new extraction plants both in the US and elsewhere,” says William Nuttall, from the Cambridge Judge Business School, who, in collaboration with the Culham Centre for Fusion Energy in Oxfordshire and the gas-supplying firm BOC developed a model for the future of helium supply and use. “Supplies in the US are steadying while those in Europe, the Middle East and Asia are increasing.”

MRI uses the biggest chunk of the world’s helium, requiring 7000 tonnes (22% of the total) every year to cool the superconducting magnets that lie at the heart of these devices. However, large-scale research facilities also use lots of helium: the Large Hadron Collider (LHC) at CERN required 150 tonnes of liquid helium last year to cool its 27 km ring, while the ITER fusion project being built in France plans to use 50 tonnes once it is up and running in 2018.
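For a sense of scale, the figures quoted in this article fit together as follows (a rough cross-check using the article’s numbers, not figures from the NAS report):

```python
# Quick check that the helium figures quoted in this article are consistent.
world_production_tonnes = 32_000   # produced worldwide in 2008
us_share = 0.75                    # three-quarters of which came from the US
mri_use_tonnes = 7_000             # annual use for cooling MRI magnets
print(f"US output: ~{us_share * world_production_tonnes:,.0f} tonnes")             # ~24,000
print(f"MRI share of production: {mri_use_tonnes / world_production_tonnes:.0%}")  # ~22%
```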

One option to relieve the demand for helium is to develop new technologies that do not need the gas. “Intermittent shortages and price rises have now become an unwelcome feature of helium use and this encourages high-tech users, particularly those using cryogenics, to substitute technologies,” says Richard Clarke at the Culham centre. He adds that investment in new helium production facilities is also necessary.

Bohr, Dirac, Rutherford – and tea and buns in the library

By Matin Durrani

Skimming through the latest issue of CavMag – a glossy newsletter about the latest developments at the Cavendish Laboratory, Cambridge – I at first thought I had misread an article that stated: “In 1930, when I gratefully accepted a research studentship from Girton College…”

Marie Constable visiting the Cavendish Laboratory with director of development Malcolm Longair and Geoffrey Constable

In fact, it was not a misprint but part of a wonderful article by Marie Constable, a 101-year-old physicist who had done her PhD at the Cavendish back in the early 1930s.

Constable, who was writing all about her visit to the Cavendish in September last year to attend its Alumni Open Day, gave some marvellous insights into several legendary figures from physics.

Lord Rutherford, she writes, was “a big, bluff and hearty New Zealander”, who was “friendly and helpful”. He would, apparently, make random visits around the lab to his research students, knocking loudly on their doors before asking if they needed any help. It was Rutherford, she says, who also instituted the practice of the Cavendish afternoon tea break, serving tea and buns every Wednesday in the library.

James Chadwick, who discovered the neutron and was Rutherford’s effective second-in-command, is described as being “friendly and kind” although he had a reputation for not tolerating silly mistakes and could sometimes “get cross”.

Meanwhile, Constable recalls Patrick Blackett, another Nobel laureate who had served in the Royal Navy, as being “tall, handsome and helpful” and “a remarkable addition to the Cavendish staff”.

Marie also once attended a lecture by Niels Bohr although, perhaps not surprisingly given his taciturnity, she says little about Paul Dirac, other than that he “was often seen in the Cavendish and regularly attended the Wednesday afternoon tea-break”.

As for life at the Cavendish, it was apparently “serene and decent” and the atmosphere was “collaborative rather than adversarial”.

But despite being the only female graduate student at the Cavendish at the time, Constable says she did not encounter much discrimination. However, she admits that when she was an undergraduate, women had to sit at the front of the lecture room – “for fear their attention might be distracted by too much male proximity”. And she was later prevented from attending a workshop course for research students, forced instead to enrol at a local technical college.

Luckily that workshop experience proved handy and Marie carved out a career as a safety expert.

It’s a great little story and marvellous to see there are still living links with the glory days of Cambridge physics. The full article can be read in CavMag, which will be put online shortly.
