
Records reveal robust ocean warming

A re-analysis of conflicting ocean-temperature records covering the past 16 years shows that the seas are getting warmer, according to an international team of scientists. The group compared a number of different published records and found that the discrepancies arise mostly from how different scientists corrected for the variable performance of early temperature probes.

Armed with this knowledge, the researchers were able to combine the data to obtain a more definitive record. Although the calculated warming rate of 0.64 ± 0.11 W/m2 is in line with other climate observations and global-warming predictions, the study does not explain why the records suggest that ocean warming has stalled since 2004.

Scientists have long known that the Earth’s oceans can absorb and release tremendous amounts of heat and therefore have a large effect on climate. However, we know little about how ocean temperatures change on a decadal basis, which makes it hard to understand how oceans respond to climate change.

Expendable bathythermographs

Until about five years ago, most ocean temperature measurements were collected by expendable bathythermographs (XBTs) – devices that have been dropped from ships since the 1960s. XBTs are designed to sink at a known rate while measuring and transmitting the water temperature back to the ship via two wires.

However, there are two problems with XBT data. First, the probes were normally launched from ships on routine passages – and therefore missed out on much of the Southern Ocean and other less-travelled waters. Second, the design of the XBT has changed over the years, and researchers have discovered that some probes do not actually fall as expected. This means that temperature versus depth data could be flawed.
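The fall-rate problem is easy to picture with a little arithmetic. The sketch below is a minimal illustration, not any group’s actual processing code: it assumes a quadratic fall-rate equation of the standard form z(t) = a·t − b·t², with coefficients of the same order as published values, and an arbitrary 3% slowdown.

```python
# Minimal sketch of how an XBT infers depth: the probe carries no pressure
# sensor, so depth is computed from elapsed time via a fall-rate equation
# z(t) = a*t - b*t**2. Coefficients here are illustrative, of the same
# order as published ones; the 3% slowdown is an arbitrary assumption.

def depth(t, a=6.472, b=0.00216):
    """Depth in metres assigned to a reading taken t seconds after launch."""
    return a * t - b * t * t

t = 60.0                             # one minute into the drop
assumed = depth(t)                   # depth the standard equation assigns
actual = depth(t, a=6.472 * 0.97)    # if the probe really falls 3% slower
print(f"assumed {assumed:.0f} m, actual {actual:.0f} m")  # ~381 m vs ~369 m
```

Every temperature reading in such a profile would be tagged with a depth roughly 10 m too deep, which systematically distorts the inferred ocean heat content.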

The situation improved in the early 2000s, when researchers began to deploy about 3000 “Argo floats”, which constantly gather temperature data at depths of up to 2000 m. Scattered throughout the oceans, the network of sensors was fully installed by 2007 thanks to a massive international effort.

Networked findings

An important challenge facing scientists is how to put all these data together to get a better picture of how the heat content of the entire ocean changes on decadal time scales. A number of different research groups have used different methods to create “upper-ocean heat content anomaly” (OHCA) curves, but these are not consistent with each other. For example, the curves do not seem to display the same year-to-year patterns. Indeed, during the 1997–98 El Niño climate oscillation in the Pacific Ocean, some curves show cooling, some warming and others no change.
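For context, an OHCA value is essentially a depth integral of temperature anomalies weighted by the density and specific heat of seawater. The sketch below shows the basic calculation; the anomaly profile is invented purely for illustration and is not data from any of the studies discussed.

```python
# Sketch: upper-ocean heat content anomaly as the depth integral of
# rho * c_p * dT. The temperature-anomaly profile below is invented.
RHO = 1025.0   # seawater density (kg/m^3)
CP = 3990.0    # seawater specific heat (J/(kg K))

def ohca(depths_m, anomalies_k):
    """Trapezoidal integral of rho*c_p*dT over depth, in J/m^2."""
    total = 0.0
    for i in range(len(depths_m) - 1):
        dz = depths_m[i + 1] - depths_m[i]
        mean_dt = 0.5 * (anomalies_k[i] + anomalies_k[i + 1])
        total += RHO * CP * mean_dt * dz
    return total

depths = [0, 100, 300, 500, 700]            # upper 700 m of the ocean
anomalies = [0.10, 0.08, 0.04, 0.02, 0.01]  # hypothetical warming (K)
print(f"OHCA ~ {ohca(depths, anomalies):.1e} J/m^2")  # ~1.3e8 J/m^2
```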

Now, however, a team led by John Lyman of the National Oceanic and Atmospheric Administration (NOAA) in Seattle has taken a close look at these curves and has gained a better understanding of why they differ from one another.

The team examined five different methods used to correct the XBT data, one of which is supposed to take into account slight changes in the shape of XBTs that could affect the OHCA curve. However, this correction is particularly difficult to make because the XBT type and manufacturer were not always recorded by users. Other studies tried to correct XBT data by comparing them with more reliable measurements, such as those from Argo floats and conductivity, temperature and depth (CTD) probes.

‘Consistent’ with global warming

After gaining an understanding of the sources of uncertainty in each OHCA curve, the team was able to combine the data to obtain a curve that is more representative of global ocean temperature than its constituents. It reveals that the oceans have warmed at a rate of about 0.64 ± 0.11 W/m2 over the past 16 years. According to Kevin Trenberth of the National Center for Atmospheric Research in Colorado, this is “reasonably consistent with expectations from other indications of global warming”.

However, the re-analysis sheds little light on why ocean temperatures appear to have remained steady since about 2004. This is at odds with satellite measurements, which suggest the Earth has continued to heat up over the past six years, leading to questions over where the “missing heat” has gone.

Indeed, Stefan Rahmstorf, a climate scientist at Potsdam University near Berlin, says that the new study does not solve this problem. “The accuracy of measurements is still not sufficient to close the energy budget, particularly for short-term variations – in other words, over a few years, as associated with El Niño.”

Team member Doug Smith of the Hadley Centre in the UK points out that this stalling seems to occur just when the Argo floats became the primary data source. This could mean that further work is needed on how to interpret Argo results and how to integrate them into temperature records.

Argo data could also help scientists build a better picture of past ocean temperatures by revealing relationships between temperatures in regions where there is good historical data and parts of the oceans where there is not. Smith told physicsworld.com that he plans to exploit such relationships to fill gaps in the data back to 1950 and perhaps even earlier.

The work is reported in Nature 465 334.

Producing novel semiconductors en masse

Compound semiconductors like gallium arsenide (GaAs) could bring a revolution in optics and electronics, with the promise of highly efficient solar cells and a new generation of components. The trouble is that devices made from these materials are difficult to produce and their delicate nature leaves them prone to damage. But now a group of researchers based in the US and South Korea has designed a new assembly technique, which they say could produce these materials en masse and integrate them into devices with relative ease. They demonstrate their technique by producing a number of rudimentary electronic components.

While silicon is still the dominant material in many hi-tech industries, compound semiconductors could hold big advantages for certain applications. This is on account of the high mobility of electrons within the materials and their direct band gaps, which make them particularly effective at manipulating light. But the problem facing the compound-semiconductor industry is that it is competing in a market dominated by silicon, whose established manufacturing base is incompatible with compound semiconductors.

Printing technique

Now, John Rogers and colleagues at the University of Illinois at Urbana-Champaign, together with a related company, offer a production method that adapts a transfer-printing technique they have been developing for the past few years. They begin by growing stacks consisting of multiple layers of gallium arsenide and aluminium gallium arsenide, which they then “peel” off one-by-one using a silicone-based stamp. The layers detach easily on account of van der Waals forces, which are stronger at the stamp surface than between layers of compound semiconductor. The researchers then stamp these individual flakes onto target sites on a silicon wafer.

To demonstrate the precision and robustness of its technique, Rogers’ team creates three well-known types of component: field-effect transistors with logic gates, near-infrared (NIR) imaging devices and photovoltaic modules. The scientists use the NIR devices to image the full detail of the components.

The research group intends to build on this work by developing more complicated components, including detectors and solar cells. The advantage of using direct band-gap materials like these is that incoming photons can easily liberate electrons, which can then be collected as current. In silicon-based solar cells the band gap is indirect, so electron–hole pairs will only form if a lattice vibration known as a phonon – with the right momentum – is available.
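As a back-of-envelope illustration of the band-gap point, the sketch below converts a band gap into the longest wavelength the material can absorb, via λ = hc/E_g. The gap values are standard room-temperature textbook figures.

```python
# Sketch: a material absorbs photons with energy above its band gap, i.e.
# wavelengths shorter than lambda_max = hc / E_g. Gaps are textbook values.
HC_EV_NM = 1239.84  # h*c in eV*nm

band_gaps = {
    "GaAs (direct gap)": 1.42,
    "Si (indirect gap)": 1.12,
}

for material, eg in band_gaps.items():
    print(f"{material}: absorption edge ~ {HC_EV_NM / eg:.0f} nm")
# GaAs absorbs strongly right up to its ~873 nm edge, since no phonon is
# needed; silicon absorbs only weakly near its ~1107 nm edge.
```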

Practical challenges

“Our biggest opportunity is in solar cells, where compound semiconductors haven’t been competitive so far,” says Rogers. He also believes that there is a compelling opportunity in other optoelectronic applications, such as highly efficient switches.

Chris Phillips, a semiconductor researcher at Imperial College London, says that he welcomes this kind of practical approach to compound-semiconductor research. However, he warns that there are still major practical challenges to address regarding the “flimsy” nature of these materials. “With conventional semiconductors, the action is deep within the crystal. In these thin flakes, the layers could easily become dislocated, leading to electron–hole recombination and severe reductions in efficiency,” he says. Phillips feels that the biggest opportunities for compound semiconductors will be with hybrid devices, where new materials can be integrated into silicon-based circuitry.

This research is published in Nature.

The great life of Carl Sagan

By Hamish Johnston

As a young lad in the 1970s I remember enjoying Carl Sagan’s television programme Cosmos.

Did it inspire me to become a physicist? Not really, but it was entertaining and there was something very soothing about the way Sagan spoke in an accent best described as “Brooklyn intellectual”.

One physicist (and TV personality) who was inspired by Cosmos is Brian Cox, who was on BBC Radio 4’s Great Lives programme yesterday to sing the praises of Sagan.

“As a young boy of 13, Brian Cox stared at his television screen every Wednesday evening, as Carl Sagan took him on a journey across the Cosmos”, says the BBC’s promotional material.

Sagan, who died in 1996, was somewhat controversial as both a scientist and a promoter of science, and the BBC programme asks: “So just how good a scientist was he, and what is his legacy?”

You can listen to the programme here.

The art of science

By Michael Banks

Physicists have come out on top in Princeton University’s fourth “art of science” competition.

The annual exhibition features images created during scientific research, and this year’s event was held on 7 May with the theme of “energy”. Jerry Ross, a postdoc at the Princeton Plasma Physics Laboratory, won first place for his “xenon plasma accelerator” image. The picture (below) is of a so-called “Hall effect thruster” – a type of ion thruster in which electrons, held in a magnetic field, are used to ionize a propellant, which is then expelled to produce thrust.

Xenon Plasma Accelerator

Third place also went to a physicist. Tim Koby, a physics undergraduate at Princeton, produced a picture of the interaction of a neutron star with a black hole in the centre of a galaxy.

Koby was beaten into third place by David Nagib, a chemistry graduate student at Princeton, whose image “therapeutic illumination” took second.

Ross bagged $250 for winning best exhibit, with $154.51 awarded to Nagib in second place and $95.49 to Koby in third.

And if you are wondering why those last two figures are not rounded to $150 and $100, it is apparently because they are derived from the golden ratio – equal to 1.6180339887 – which, in this case, is the ratio of each prize to the next one down.
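For anyone wanting to check the arithmetic, it works out as follows (rounded to the nearest cent):

```python
# Each prize is the one above it divided by the golden ratio.
phi = (1 + 5 ** 0.5) / 2  # 1.6180339887...
first = 250.00
second = first / phi      # 154.51
third = second / phi      # 95.49
print(f"${first:.2f}, ${second:.2f}, ${third:.2f}")  # $250.00, $154.51, $95.49
```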

You can also watch a video of the exhibits here.

XENON100 is certain about its uncertainty

Part of the XENON100 experiment (Courtesy: XENON100 collaboration)

By Hamish Johnston

Is the XENON100 collaboration in the dark about dark matter, or will its critics see the light? The latest instalment of this debate has appeared on the arXiv preprint server.

On 6 May Jon Cartwright reported on a furore that has broken out in the dark-matter detection community.

Earlier that week the XENON100 collaboration posted a preprint with an analysis of the first experimental results from its dark-matter detector. It didn’t see any of the dark stuff, which means that the positive sightings reported by two other experiments (DAMA and CoGeNT) could be false.

But then two US-based physicists – Juan Collar and Dan McKinsey – posted a preprint that took XENON100 physicists to task on their analysis of the data. In particular, Collar and McKinsey believe that the XENON100 team is overconfident about how it extrapolated the known response of the detector to high-energy particles to lower energies – where the response is unknown.

This low-energy response is crucial because that is where XENON100, DAMA and CoGeNT have all looked for dark matter.

Now, XENON100 has responded with yet another preprint defending its analysis and claiming that it has “properly taken into account the uncertainty” in the low-energy response.

I can’t wait for the next preprint in this dark-matter “he said, she said”!

Spotting explosives with a puff of air

Transport authorities are about to be presented with a new type of body scanner that could identify explosives on the clothes of passengers with unprecedented resolution and fewer false alarms. The device has been created by Austria-based academics and their spin-off company, who adapted a mass spectrometry technique that is common in environmental science.

Foiled terrorist attacks in New York City and Detroit in the past year have raised security levels, which had already been high since the attacks on New York in 2001. In response, governments around the world have been seeking more advanced technologies for scanning travellers for weapons and explosives in airports. Any new technology deployed, however, must strike a difficult balance between many factors, including quality, cost and intrusiveness. Indeed, some passengers and human-rights groups have already protested that full-body scanning breaches an individual’s rights. And various scientists have said that existing scanners are not fit for purpose anyway, because some common explosives could pass through a scan undetected.

Spectrometry, a ubiquitous chemistry technique for determining the quantity or concentration of chemical species, offers a more directed approach to detecting chemical traces. One variety known as ion mobility spectrometry (IMS) is already used in a number of airport scanning systems. It works by taking a sample of dust from each passenger – usually by wiping their shoe with a cloth – and transferring this sample to an ionization chamber. The problem with IMS is that it is prone to false alarms, because it is sensitive to interference and cannot distinguish between different types of volatile compound.

Quick puff of air

Now, however, a research group based at the University of Innsbruck claims to have created a new type of scan, based on an alternative mass spectrometry technique, that is far more sensitive and reliable. It uses a process known as proton-transfer-reaction mass spectrometry (PTR-MS), and the scan begins by blasting each passenger with a quick puff of air, which should liberate any residual explosives. This passenger “dust” is then passed into a chamber containing protonated water – water molecules carrying an extra proton. If the passenger has traces of explosives about their person, the extra protons “jump” to these volatile compounds because they have a higher proton affinity than water. Finally, the vapour mix is passed into a high-resolution mass spectrometry device, which can identify protonated explosives in less than a second.
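The selectivity of the method comes down to a single comparison: a molecule can take the extra proton from H3O+ only if its proton affinity exceeds that of water. The sketch below illustrates this selection rule; the proton-affinity values are approximate literature figures, included purely for illustration.

```python
# Sketch of the PTR-MS selection rule: H3O+ transfers its proton to a
# molecule M only if M's proton affinity (PA) exceeds that of water.
# PA values below are approximate literature figures (kJ/mol).
PA_WATER = 691.0

proton_affinities = {
    "N2": 494.0,       # major air components fall below the threshold...
    "O2": 421.0,
    "CO2": 540.5,
    "acetone": 812.0,  # ...while most volatile organics, and explosives
    "TNT": 814.0,      #    such as TNT, lie above it
}

for molecule, pa in proton_affinities.items():
    verdict = "ionized and detected" if pa > PA_WATER else "ignored"
    print(f"{molecule:8s} PA ~ {pa:5.1f} kJ/mol -> {verdict}")
```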

Since inventing the technique, academics at the University of Innsbruck have created a spin-off company, Ionicon Analytic, which has gone on to fine-tune the technology. It now works to a sensitivity of one part per quadrillion (10^15), and the scanner is comparable in size to a standard home refrigerator. Kurt Becker, who works for Ionicon but is based at the Polytechnic University of New York, says that the company is seeking contracts with security agencies in the UK, Germany and France. He also revealed that he intends to meet with operators of New York airports to present the technology within the next few weeks.

Negotiations should be aided by the fact that this spectrometry technique is already established in a number of other applications, including waste incineration and air-quality control. Becker says that his company was inspired to adapt the technique for airport security after recent security alerts, which led to fears that terrorists are getting smarter with their methods and the type of substances they might deploy. “Terrorists may use improvised devices involving a combination of different explosives, including TNT, RDX, HDX and PETN,” he tells physicsworld.com.

Selectivity issues

Paul Monks, a chemist at the University of Leicester in the UK, agrees that the technique does hold benefits over established scanners. “Though ion mobility spectrometry is sensitive, it does suffer from selectivity issues,” he says. “The advantage of PTR-MS and its variants is that only those molecules with proton affinities in excess of that of H2O can accept a proton from H3O+, a criterion that excludes the major components of air such as N2, O2 and CO2, but includes many trace gases, including most volatile organic compounds.”

Ionicon and its various academic partners intend to develop the technology by improving the user interface to make life easier for security guards. “To make this airport-ready we need to create a push-button system,” says Chris Mayhew, one of Ionicon’s affiliates based at the University of Birmingham in the UK. Mayhew says that he will be presenting the technology to the UK government at a meeting in London this week.

Japanese robotics couple married by android

By James Dacey

We all know colleagues who allow their work to stray a little bit too far into their personal lives.

But a pair of robotics researchers in Japan surely took this to a whole new level when they were married yesterday by a 1.5 m android named I-Fairy.

The bride was 36-year-old Satoko Inoue, who works for Kokoro, the firm that produced I-Fairy, one of its new generation of androids.

“This was a lot of fun. I think that the Japanese have a strong sense that robots are our friends,” she told the Associated Press.

Her new husband, Tomohiro Shibata, a 42-year-old professor of robotics at the Nara Institute of Science and Technology, was a bit more critical of their plastic priest. “It would be nice if the robot was a bit more clever, but she is very good at expressing herself,” he said.

The service took place at a rooftop restaurant in central Tokyo and you can enjoy the happy couple exchanging vows in this short YouTube clip.

Happy birthday to the laser

By Margaret Harris

Fifty years ago today, a little-known scientist working in an underfunded lab in California set off a scientific and technological revolution. On 16 May 1960, Theodore Maiman and his assistant Irnee d’Haenens succeeded in coaxing a beam of coherent light out of a flashlamp-pumped crystal of pink ruby. The laser had arrived.

Of course, the events of that day were not the whole story. Although Maiman is rightly honoured for inventing the first working laser, many others played a role in the laser’s development, both before and (particularly) after the initial breakthrough. Among the key early figures were Einstein, whose predictions about stimulated emission laid the theoretical groundwork; and Charles Townes, who invented the laser’s microwave predecessor, the maser.

To learn more about the early days of the laser, I’d highly recommend downloading Physics World’s May special issue, which you can do for free via this link. On page 23, you’ll find a great article by Pauline Rigby called “And then there was light”, which describes the events leading up to Maiman’s breakthrough and some of the controversy that followed it.

As for what happened next, I think the thing that surprised me most when I was researching the special issue was just how quickly researchers in various fields found ways of putting Maiman’s new toy to use. Barely a year after its invention, a device that d’Haenens memorably called “a solution looking for a problem” was already being used for human eye surgery.

So what will we be doing with it in 2060? Well, as Niels Bohr supposedly said, “Prediction is difficult, especially about the future” – but if you want to hear some experts’ views, check out “Where next for the laser?” on p53 in the downloadable pdf. You can also watch our laser video series.

Update: Pauline Rigby has written an entry on her own blog about how the article “And then there was light” came into being – including additional material from her interview with Maiman’s wife Kathleen. You can read it here.

‘Pushy’ electrons move atoms

Researchers in the US have gained important new insights into how electrons travel through nanoscale metal wires. They discovered that the force with which electrons push atoms around in these structures is much stronger than previously thought – which could help improve next-generation nanoelectronic components.

As electronic devices become ever smaller, researchers need to better understand how electrical currents affect the atomic structure of tiny circuits. In particular, the electromigration of atoms in a nanowire could alter its electronic properties – or even cause it to fail. On a positive note, this movement of atoms could be used to assemble tiny structures.

Ellen Williams and colleagues at the University of Maryland began their study by creating a range of different nanoscale structures, such as islands and “steps” (containing between 100 and 100,000 atoms), on top of very thin silver wires measuring 2 to 50 nm across. The researchers then used a scanning tunnelling microscope to observe how the structures moved or changed shape when a current was sent through the wire. “It was amazing – when we changed the direction of the current, we found that we could move the structures back and forth,” Williams told physicsworld.com.

Twenty times stronger

The Maryland team says that the force with which charge carriers (in this case electrons) push atoms around in such nanostructures is much stronger – by up to 20 times – than previously thought. According to the researchers, this strong “electromigration force” could be used to intentionally move atoms around in nanoelectronic components – something that might help in self-assembling nanowires, for example, to create devices that can be cycled through different structures under an alternating current. It may even be used to move nanomachines in the future.
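To get a feel for the size of such forces, the sketch below evaluates the textbook “electron wind” estimate F = Z*·e·ρ·j. This is a generic illustration, not the Maryland team’s analysis: the effective valence Z* and current density j used here are assumed values.

```python
# Generic "electron wind" estimate for electromigration: F = Z* e rho j,
# with Z* the effective valence (material-dependent, often large and
# negative), rho the resistivity and j the current density. The Z* and j
# used here are assumptions for illustration, not the paper's values.
E = 1.602e-19  # elementary charge (C)

def wind_force(z_eff, rho, j):
    """Force per atom (N) from the current-carrying electrons."""
    return z_eff * E * rho * j

rho_silver = 1.6e-8  # resistivity of silver (ohm m)
j = 1e11             # assumed current density in a nanowire (A/m^2)
f = wind_force(-10, rho_silver, j)
print(f"force per atom ~ {abs(f):.1e} N")  # ~2.6e-15 N with these numbers
```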

And that’s not all: the team also found that the electromigration force could be greatly decreased by adding an electron-withdrawing structure (or defect), such as C60, along monatomic step edges.

The different ways that electrons can move through a nanowire can be described by how easily electrons travel, or are transmitted, through the structure, explains Williams. Most atomic structures allow electrons to travel through easily but defects slow down electron movement. This results in a local “resistivity dipole”, which means that the defect sites have a local resistance and local electrical field very different from that in the rest of the material. “The key point is that the special atomic structures of defects cause weak transmission of electrons (or strong scattering),” she said.
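Williams’ “weak transmission” point can be made concrete in the standard Landauer picture, in which a single conduction channel with transmission T contributes an extra resistance proportional to (1 − T)/T. The sketch below is a generic illustration of that relation, not the team’s model.

```python
# Landauer picture: a conduction channel with transmission T adds a
# scattering resistance R = (h / 2e^2) * (1 - T) / T on top of the
# contact resistance h/2e^2 (~12.9 kilohm for a spin-degenerate channel).
R_Q = 12906.4  # h / 2e^2 in ohms

def defect_resistance(T):
    """Extra resistance (ohms) from a defect with transmission T."""
    return R_Q * (1.0 - T) / T

for T in (0.9, 0.5, 0.1):  # strong scattering = low transmission
    print(f"T = {T:.1f}: extra resistance ~ {defect_resistance(T):,.0f} ohm")
```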

Graphene is next

Williams’ team is now studying similar effects in nanoscale structures on top of graphene (sheets of carbon just one atom thick) as well. “Our group has been creating defect structures on graphene and depositing small amounts of scattering atoms onto the carbon material,” revealed Williams. “Using our powerful microscope techniques, we expect to see comparable effects of atomic motion and local resistances when graphene is carrying current.”

The results of this work might eventually lead to new ways of exploiting graphene’s unique electronic properties, she says.

The work was reported in Science 328 737.

All together now…


With a little help from their friends: some of the DØ collaboration

By Hamish Johnston

Many years ago I wanted to be an experimental particle physicist (didn’t you?).

But then I cast my eyes over a few papers and realized that my name would be buried between D Johnston and A Jonckheere in a two-page list of authors (if I was lucky enough to join the DØ collaboration above).

I can’t say that was the only reason that I switched to condensed matter physics – I found it more interesting, for example – but the idea of being a small cog in a huge machine wasn’t that appealing.

Since then I’ve often wondered how hundreds (indeed, thousands) of particle physicists get together to write one paper.

If you are curious, Tommaso Dorigo has a blow-by-blow account on his blog.

Among other things, it involves committees referred to as “godparents” and arguments over British versus American spellings – although I would have thought the journal would have the last word on the latter.

Dorigo writes, “Now, if you think that the above baroque, surreal, ridiculous procedure is crazy, you might be right”.

However, he also points out that the process is “extremely democratic”, which he says is one of its “striking positive qualities”.

But is democracy the best way of doing science?

Cast your ballot now!
