Report slams UK’s leading physics funding agency

“Substantial and urgent changes” are required at the Science and Technology Facilities Council (STFC) — the UK’s main funding body for physics and astronomy — following its handling of an £80m budget shortfall, according to a damning report released today by a 14-strong committee of MPs.

The report claims that poor leadership and “lamentable” decision-making have damaged the UK’s international scientific reputation and exposed serious deficiencies within STFC’s senior management, with chief executive Keith Mason coming in for particular criticism.

It says that Mason’s explanation for axing support for solar-terrestrial physics as a result of the budget shortfall is “inaccurate, unconvincing and unacceptable”. The MPs also question his ability to retain the confidence of the scientific community as well as to carry through the substantial changes required to give the STFC “the leadership it desperately needs”.

Expressing dismay that STFC has been attempting to play down the effects of the cuts on the grounds that reductions in future grants are not problematic, the report calls for the council to postpone any cuts until the government-appointed “Wakeham review” of physics is published in September.

“It is very rare and rather sad that an individual or organization comes under such strong criticism in a report like this,” committee chairman Phil Willis MP told physicsworld.com. “So serious is the damage that appears to have been done to STFC that committee members took nearly four hours to agree on the wording.”

Mason, who declined to be interviewed, released the following statement: “I intend for STFC to look forward, though we will take account of some areas where we could have done better. I would hope that the difficulties will not overshadow the considerable advances and successes that we have achieved during our first 12 months.”

Failures on all sides

The 55-page report, published by the Innovation, Universities, Science and Skills committee of the House of Commons, is based on evidence provided by STFC bosses, civil servants and physicists at three hearings earlier this year.

The report is also critical of the government for its role in creating the funding shortfall. It claims that the Department for Innovation, Universities and Skills (DIUS) “should have known” of the £80m black hole when STFC was created in April 2007 by merging the Particle Physics and Astronomy Research Council (which awarded research grants) with the CCLRC, which built scientific facilities.

The merger meant that STFC would inherit the future operating costs of the ISIS neutron source and the brand new Diamond synchrotron at the Rutherford Lab in Oxfordshire, which amounted to £25m per annum that the former CCLRC did not have.

As well as calling upon the government to honour its original commitment to leave “no legacy funding issues” from the merger, the report states that the formation of STFC was “untimely and poorly conceived”, given its proximity to last autumn’s Comprehensive Spending Review (CSR), which set the UK’s science budget for the period 2008–11.

Full economic cost

Although the MPs welcome the 17.5% rise in the overall science budget in the CSR, they point out that the increase does not fully cover the requirement for research councils to pay a much bigger fraction of the “full economic cost” of research projects. Given that large parts of the budget are also tied to “cross-council” programmes that largely follow a government agenda, the MPs are concerned that the government has failed to protect the research base and that it might be meddling too much in decisions best left to scientists — violating the cherished “Haldane” principle.

The STFC-funded Daresbury laboratory in Cheshire is a case in point. While the government wants to develop the site into a world-leading “science and innovation campus” by teaming up with the private sector, the funding cuts — which will see 80 staff lose their jobs this year alone — could leave it without a major accelerator facility capable of generating the necessary scientific critical mass. “This report is calling for a fundamental reform not only in STFC but in terms of its relationship with government,” says Willis.

But weaknesses in STFC’s peer-review system, communications and management have only exacerbated the funding shortfall, the report claims. The first STFC programmes to be axed, in December when the £80m hole appeared, were UK participation in the International Linear Collider (ILC) and the Gemini observatory, plus all ground-based solar-terrestrial physics research.

According to the report, the fact that these decisions were made with little or no consultation cost the STFC the trust of the scientific community, and the way the ILC and Gemini decisions have played out make the UK look like an “unreliable” and “incompetent” international partner, respectively.

All 100 or so STFC programmes have since been ranked to help target the cuts and, following strong opposition by many particle physicists and astronomers, this ranking is currently under review with final cuts due to be announced in July. The MPs’ report expresses “grave” concerns about the effect of such cuts on renowned institutions such as Jodrell Bank, and states that the decision to de-prioritize two of the smaller experiments (ALICE and LHCb) at the Large Hadron Collider at CERN — where the UK has been a major and hitherto consistent partner — just as they are about to deliver results is a cause of “consternation and embarrassment”. It is in this “lower priority” category that Daresbury’s accelerator and laser-science project, also called ALICE, finds itself.

Management shortcomings

Not confident that STFC’s problems can be solved by rearranging the responsibilities of existing staff, the MPs’ report recommends that the council urgently appoint a permanent communications director.

Peter Main of the Institute of Physics, who gave evidence in the first select committee hearing, welcomes the more consultative approach recently adopted by STFC and is also pleased that the report has supported the Institute’s call for a moratorium on cuts. “It is now up to DIUS to clarify whether the Wakeham review will address the issues affecting STFC, and for DIUS and STFC to agree arrangements which will allow substantive changes to be delayed,” he says.

But Main points to a significant omission in the report: the fact that funding for particle physics and astronomy is exposed to short-term international currency fluctuations. “This has compounded the funding issues for STFC because a significant proportion of their money is tied up in international partnerships,” he says. Indeed, with the UK’s annual subscription to CERN taking up the bulk of these costs, and the pound declining in value against the Swiss franc, it seems STFC could face another black hole in its budget before the year is out.

Computing with Playstations

With $5000 of his research grant left burning a hole in his wallet earlier this year, Frank Mueller, a computer scientist from North Carolina State University, decided to hit the shops and buy eight Playstation 3 games consoles. Not for pleasure, you understand — no, Mueller figured that with eight Playstations strapped together he could create a modestly powered supercomputer. “The cost for performance is unbeatable,” he says.

Mueller’s “cluster” of games consoles doesn’t quite break into the TOP500 list of supercomputers. But if he had another $4m lying about, he reckons he could string together 10,000 Playstations to make the fastest supercomputer in the world.

So, could physicists use Playstations to save a few trips to Blue Gene? “Yes,” says Mueller, “if they are willing to substantially rewrite their most computer-intensive code portions in a non-standard API.” That’s the Application Programming Interface, for those of you who don’t know.
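
For a flavour of the restructuring involved, here is a generic cluster-computing sketch using MPI (via mpi4py), not the PlayStation 3’s actual Cell programming interface: a compute-heavy loop is split explicitly across the machines and the partial results are gathered back together.

```python
# Generic illustration only (mpi4py), not the PS3's non-standard Cell API:
# distributing a compute-intensive sum across the nodes of a small cluster.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N = 10_000_000
chunk = N // size
start = rank * chunk
stop = N if rank == size - 1 else start + chunk

# Each node works on its own slice of the problem...
local = np.sum(np.sin(np.arange(start, stop, dtype=float)))

# ...and the partial results are combined on a single node.
total = comm.reduce(local, op=MPI.SUM, root=0)
if rank == 0:
    print("total =", total)
```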

On his website, Mueller notes that he uses his Playstation cluster for “educational purposes”.

Tabloid climate change

You don’t need a research paper to tell you that tabloid newspapers aren’t the best source of scientific information.

Or do you? Maxwell Boykoff and Maria Mansfield from the University of Oxford, UK, seem to think so. In a paper published today in Environmental Research Letters, they have surveyed nearly 1000 articles published over the past seven years in the UK’s most-read tabloids: the Daily Mail, the Sun, the Express and the Mirror. It seems that around a quarter of the articles strayed from the scientific consensus — that is, that anthropogenic greenhouse gases are “very likely” to be causing the observed global warming of the past half century.

This conclusion is buttressed by interviews with journalists and editors, as well as examples of dodgy environmental reporting. Here are a few to whet your appetite:

“Experts are still arguing over whether [global warming] is a natural phenomenon, or the effect of industrial societies releasing heat-trapping gases into the atmosphere…” (Ivor Key of the Express)

“It seems that the most significant global warming is caused by the hotheads who are anxious to believe their own propaganda.” (Commentary in the Mail on Sunday)

“This confirms what I have been saying for years — cars do not cause global warming. Now we learn that all along it was bloody sheep and cows.” (Jeremy Clarkson, motoring journalist and regular aristarch of environmentalists, commenting in the Sun after learning that methane emissions from cattle contribute significantly to global warming)

There’s an interview with Boykoff on our sister website, environmentalresearchweb.

Physicists quantify the ‘coefficient of inefficiency’

Like many physicists, Stefan Thurner doesn’t like spending his time in long meetings. But after his employer — the Medical University of Vienna — underwent a major restructuring several years ago, he found that the time he had to devote to committees and other administrative tasks increased fivefold.

To understand why, Thurner and fellow physicists Peter Klimek and Rudolf Hanel turned to the British historian C Northcote Parkinson, who studied how the British Navy was once administered. Parkinson, who died in 1993, discovered a strong correlation between a committee’s ability to make a good decision and its size. In particular, he found that committees with more than about 20 members are far less effective at making decisions than smaller groups — a threshold he dubbed the “coefficient of inefficiency”.

While many organizations are aware of the 20-person rule, Thurner and colleagues had not been able to find any reference to a mathematical explanation of the coefficient. So they set out first to verify Parkinson’s law empirically and then to develop a mathematical model to describe it (arXiv:0804.2202v1).

Governance indicators

To do so the team looked at the governing cabinets of 197 countries worldwide. These bodies contain anywhere from five members (Liechtenstein and Monaco) to 54 members (Sri Lanka). The effectiveness of each cabinet was gauged using four parameters: the UN’s Human Development Index (HDI) — which assesses the health, wealth and education of people in a country — and three measures used by the World Bank to calculate its Worldwide Governance Indicators.

The team found a very strong negative linear relationship between cabinet size and each of the indicators: countries with larger cabinets, for example, tended to have lower HDI scores than those with smaller cabinets. According to Thurner, the statistical significance of the finding is orders of magnitude better than what is usually considered a good result in the social sciences.
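
The underlying calculation is essentially a correlation between cabinet size and each indicator. A minimal sketch of that check, using hypothetical illustrative numbers rather than the study’s real dataset of 197 countries, might look like this:

```python
import numpy as np

# Hypothetical (cabinet size, HDI) pairs for illustration only -- not the study's data.
cabinet_size = np.array([5, 8, 12, 15, 18, 22, 26, 30, 38, 45, 54])
hdi          = np.array([0.95, 0.93, 0.90, 0.88, 0.85, 0.80, 0.78, 0.72, 0.65, 0.60, 0.55])

r = np.corrcoef(cabinet_size, hdi)[0, 1]             # Pearson correlation coefficient
slope, intercept = np.polyfit(cabinet_size, hdi, 1)  # least-squares straight-line fit

print(f"correlation r = {r:.2f}, slope = {slope:.4f} HDI per extra member")
```

A strongly negative r, as reported by the team, indicates that the indicator falls as cabinet size grows.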

The team also discovered that, for all four indicators, most countries with above-average scores have fewer than 20 cabinet members, which appears to support Parkinson’s law.

Cultural factors

There were important exceptions, however. For example, Australia, Canada and New Zealand — which have high governance scores — have cabinets with 27, 32 and 27 members respectively. These three countries have similar political systems and cultures, and Thurner hopes to collaborate with social scientists to understand whether such factors affect the behaviour of committees.

Once they had verified Parkinson’s observations, the team developed a mathematical model to help them understand why 20 is a special number. Each cabinet member is defined as a node in a network. The state of a node (for or against an issue) can be influenced strongly by a subset of other members. This subset could represent a faction in a cabinet such as a political party. A node can also be influenced weakly by other nodes outside of its subset — reflecting the fact that a member can also be swayed by the opinions of a more distant colleague.

The dynamics of a cabinet with a fixed number of members were simulated by starting the model with each node in a specific state. The state of a node is then flipped if enough of its influencers are in the opposite state. This process is repeated many times until the system settles into a stable configuration of coalitions of “fors” and “againsts”. The procedure is repeated for every possible initial configuration, and all the outcomes are used to define a “dissensus” parameter quantifying the inability of a cabinet to reach a majority consensus: the larger the dissensus, the less able a cabinet is to agree.
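
A minimal sketch of this kind of opinion-dynamics simulation is shown below; the published model’s precise update rule, network construction and majority criterion may differ, so the details here (faction size, influence weights, the cut-off for “no majority”) are illustrative assumptions.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

def make_cabinet(n, faction_size=3, n_weak=2):
    """Each member is strongly influenced by a small faction and weakly by a few others."""
    strong, weak = [], []
    for i in range(n):
        others = [j for j in range(n) if j != i]
        strong.append(rng.choice(others, size=min(faction_size, n - 1), replace=False))
        weak.append(rng.choice(others, size=min(n_weak, n - 1), replace=False))
    return strong, weak

def settle(states, strong, weak, w_strong=1.0, w_weak=0.3, max_iter=50):
    """Flip a member's opinion when its influencers, on balance, hold the opposite view."""
    s = np.array(states, dtype=int)  # +1 = for, -1 = against
    for _ in range(max_iter):        # max_iter caps any oscillating configurations
        new = s.copy()
        for i in range(len(s)):
            field = w_strong * s[strong[i]].sum() + w_weak * s[weak[i]].sum()
            if field != 0:
                new[i] = 1 if field > 0 else -1
        if np.array_equal(new, s):
            break
        s = new
    return s

def dissensus(n):
    """Fraction of initial configurations that fail to settle into a clear majority."""
    strong, weak = make_cabinet(n)
    if n <= 10:                      # exhaustive for small cabinets, sampled otherwise
        configs = list(itertools.product([-1, 1], repeat=n))
    else:
        configs = [rng.choice([-1, 1], size=n) for _ in range(500)]
    failures = sum(abs(settle(list(c), strong, weak).sum()) <= n // 3 for c in configs)
    return failures / len(configs)

for n in (5, 10, 15, 20, 25):
    print(n, round(dissensus(n), 3))
```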

The team used this technique to study virtual cabinets with 5–35 members and found that the dissensus parameter climbed steadily with increasing cabinet size up to about 20 members — beyond which the rate of increase slowed by a factor of two. This seems to indicate that the dynamics of the cabinet change just where, and how, Parkinson predicted.

Multiple factions

Thurner and his colleagues believe that this change occurs at the point where a cabinet can support multiple independent factions — something that could impair its ability to make good decisions.

Thurner hopes that the team’s research could help committee-driven organizations such as the European Union create effective decision-making bodies. This will become more difficult as the EU admits more members (there are currently 27). Indeed, the EU is considering reducing the number of commissioners in its executive body, the European Commission, from 27 to 18, to avoid the curse of Parkinson’s coefficient of inefficiency.

Synchrotron proves Europeans were not the first painters to use oils

Europeans are often a little too eager to take credit for innovation. Copernicus may have formalized the heliocentric model of the solar system in the early 1500s, for example, but the Pole only did so with the help of vast tables of astronomical measurements taken 200 years earlier in Iran. Even the scientific method itself, often thought to have emerged from Galileo’s experiments in Italy around the same time, has its roots in the work of Arab scientists of the 11th century.

Similar lapses of history occur in the art world. Many still think of oil painting as a European invention of the early Renaissance, perfected by the 15th-century Flemish painter Jan van Eyck, who supposedly stumbled across the medium while experimenting with glazes. But they too are mistaken.

“A whole mythology sprang up around van Eyck’s so-called invention of oil painting,” explains Jenny Graham, an art historian from the University of Plymouth, UK, and author of the recent book Inventing Van Eyck. “But it has long been recognised that oil painting was documented in the 12th century or even earlier and may have originated outside Europe.”

Art historians have always lacked real examples to bear out this documentary evidence. Now, however, scientists performing experiments at the European Synchrotron Radiation Facility (ESRF) on samples of murals taken from Afghanistan say they have uncovered what could be the earliest known examples of oil paintings.

Seventh century art

The Afghan murals were discovered back in 2001 after Taliban fighters demolished two sandstone Buddha statues, each around 15 storeys tall, in the highland town of Bamyan. Behind the rubble was the entrance to a network of some 50 caves where the murals had been painted. They were dated to the mid-7th century, more than seven centuries before the Renaissance.

Yoko Taniguchi of the National Research Institute for Cultural Properties in Tokyo first looked at the paintings three years ago, and noticed what appeared to be a shrunken film on the surface. “I thought that it could be oil, but since it was not a major material [used in the Afghan region], I did not really consider it,” she says. Taniguchi decided to take some small samples to Grenoble, France, where she could work with Marine Cotte and colleagues at the ESRF.

The ESRF provides synchrotron light with a high brightness and wavelengths from infrared to X-rays, which means Cotte’s team were able to use three different imaging techniques to study the samples. Micro X-ray fluorescence and micro X-ray diffraction could penetrate deep into the samples to discern the composition of the pigments. But it was micro Fourier-transform infrared spectroscopy, which provides spectra for separate layers in the samples, that revealed the signatures of carbon–hydrogen and carbon–oxygen bonds. These bonds indicated that the pigments must have been bound with oil (J. Anal. At. Spectrom. doi: 10.1039/b801358f).

“We were very fortunate that analytical techniques using synchrotron radiation made it possible to analyse layer-by-layer at the micro level,” says Taniguchi. “If we could analyse samples from other areas — such as west-Asian and Mediterranean regions — we may find similar examples.”

Binding pigments

Aside from supporting the idea that oil painting may have been known to non-Western cultures before it was practised in Europe, the discovery could shift our understanding of when oils were first used to bind pigments, rather than simply to glaze a piece made with other materials. The medical writer Aetius described the use of drying oils as a varnish in connection with artists in the 6th century, but it was not until the 12th century, with the writings of the German monk Theophilus, that more concrete references were made to the mixing of oil with pigment to make paint.

“The significance of this find for art historians,” explains Graham, “rests on the distinction between glazing with oil as described by Aetius, and what we seem to have here, genuine oil painting, where the pigment itself is mixed with an oily binder, a practice usually dated to around the 12th century. So in Afghanistan, we not only have real rather than documentary evidence of one of the earliest instances of oil painting, we have a non-European example which supports a far more internationalist story of art.”

‘Buckypaper’ stretches in a strange way

When most materials are stretched in one direction, they get thinner in the other directions — like a rubber band. However, researchers in the US have discovered that some types of buckypaper — sheets made of woven carbon nanotubes — increase in width when they are stretched. This unexpected property could come in handy for making composite materials, artificial muscles, gaskets and sensors, say the researchers.

Conventional stretching is quantified by a positive Poisson’s ratio (the ratio of the percentage lateral contraction to the percentage applied stretch). However, over the past hundred years or so physicists have become aware of a small but growing number of materials with negative Poisson’s ratios. These “auxetic” materials get wider when stretched and include some rocks and living tissue. The structure of these materials tends to include scissor-like struts, much like a collapsible wine rack.
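
In symbols, with the strain measured along the stretch axis and perpendicular to it, this ratio is conventionally written as

```latex
\nu = -\frac{\varepsilon_{\mathrm{transverse}}}{\varepsilon_{\mathrm{axial}}}
```

so a material that narrows when stretched (negative transverse strain for positive axial strain) has ν > 0, while an auxetic material, which widens, has ν < 0.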

Now Ray Baughman and colleagues at the University of Texas at Dallas along with researchers in Brazil have discovered that some types of buckypaper are auxetic (Science 320 504).

The team made their buckypaper in the same way that ordinary writing paper is made — by drying a slurry of nanotube fibres. Carbon nanotubes are sheets of carbon one atom thick, rolled up into tubes just a few nanometres in diameter.

SWNTs and MWNTs

The slurry consists of a mixture of carbon single-walled nanotubes (SWNTs) and multiwalled nanotubes (MWNTs); an MWNT comprises several concentric nanotubes. The researchers found that increasing the proportion of MWNTs in the paper produces a sharp transition from a positive Poisson’s ratio of about 0.06 to a much larger negative value of around -0.20.

According to the researchers, this transition can be understood using a “wine-rack” model of buckypaper. If two neighbouring nanotube layers are coupled like the struts in a compressible wine rack, the Poisson’s ratio is positive and the rack becomes narrower when stretched. In contrast, if the rack is blocked so that it can no longer be collapsed (but the struts are stretchable), increases in strut length produce a negative Poisson’s ratio.
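
A toy version of that wine-rack argument can be written down directly; the geometry below is an idealized two-strut unit cell with illustrative parameters, so it reproduces the sign change rather than the measured values of +0.06 and -0.20.

```python
import numpy as np

# Idealized 2D wine-rack cell: struts of length L at angle theta to the x-axis,
# with cell dimensions x = L*cos(theta) and y = L*sin(theta).
L, theta = 1.0, np.deg2rad(45)
x0, y0 = L * np.cos(theta), L * np.sin(theta)

# Hinging mode (struts rigid, angle free): stretch along x by closing the angle slightly.
dtheta = -1e-4
x1, y1 = L * np.cos(theta + dtheta), L * np.sin(theta + dtheta)
eps_x, eps_y = (x1 - x0) / x0, (y1 - y0) / y0
print("hinging Poisson ratio:", round(-eps_y / eps_x, 3))   # positive (~ +1 at 45 degrees)

# Blocked mode (angle fixed, struts stretch): both dimensions grow together.
dL = 1e-4
x2, y2 = (L + dL) * np.cos(theta), (L + dL) * np.sin(theta)
eps_x, eps_y = (x2 - x0) / x0, (y2 - y0) / y0
print("blocked Poisson ratio:", round(-eps_y / eps_x, 3))   # -1 in this simple limit
```

Mixtures of the two modes, as in the real SWNT/MWNT sheets, would give intermediate values.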

The team found that nanotube sheets containing both single-walled and multiwalled nanotubes had a 1.6 times higher strength-to-weight ratio, a 1.4 times higher modulus-to-weight ratio and a 2.4 times higher toughness than sheets made of SWNTs or MWNTs alone. The researchers say that this enhancement from mixing nanotube types could also apply to other nanotube arrays, such as nanotube yarns.

Artificial muscles

The ability to tune the Poisson’s ratio could be exploited for creating special sheets that wrap around objects with concave, convex or saddle-shaped surfaces — something that could be used to make structures with a wide range of shapes. The team also believe that tuned materials could be used to make gaskets, artificial muscles and stress/strain sensors in which exposure to certain chemicals induces mechanical stress.

“By choosing a suitable ratio of SWNTs and MWNTs, the Poisson ratio can also be adjusted to zero, which is useful for designing cantilevers for sensing that do not distort in the width direction during bending,” explained Baughman. This, he claims, could improve the sensitivity of such sensors.

Astronomers get clear view of blazar

Researchers have obtained the clearest picture yet of the physics behind one of the most spectacular processes in the universe — the huge bursts of light from blazars, which are created as supermassive black holes suck in surrounding matter.

Using data from several different telescopes, the researchers identified a sequence of events whereby charged particles are accelerated and focused by magnetic fields into a “jet” that blasts into space at 99% of the speed of light, creating a burst of light. Their findings appear to back up the existing theory of how such jets form.

Lurking at the heart of many galaxies are supermassive black holes — hundreds of millions of times heavier than the Sun — that pull nearby material into a relatively thin “accretion disk” that spirals inwards. As it does so, the material becomes very hot, forming a plasma of charged particles that creates magnetic field lines perpendicular to the plane of the disk.

Twisted nozzle

Astrophysicists believe that these magnetic field lines are twisted by the rotation of the disk to form a tight nozzle-like structure that points out from the centre of the accretion disk. The pressure within this nozzle is expected to be greatest closer to the black hole, and physicists believe this pressure difference along the nozzle drives some plasma away from the accretion disk, creating a jet pointing out into space.

Blazar outbursts are thought to begin with a violent explosion near to the black hole, which sends a shockwave of energy through the nozzle. The twisted magnetic field lines are then thought to accelerate the shockwave and focus it into a tight “knot”. Because of the twisted nature of the magnetic field lines, the theory also predicts that the particles caught up in the knot will follow a spiral path as they are ejected.

Now, Alan Marscher and colleagues at Boston University in the US, along with an international group of collaborators, are the first to catch a glimpse of this process. The team focused on a blazar at the centre of BL Lacertae, a galaxy some 950 million light years from Earth (Nature 452 966). The blazar jet points almost directly at Earth and fires intense bursts of radiation that last for several days and occur about once or twice a year.

Velocity measurement

The team focused on an outburst that occurred in late 2005. When the outburst emerged from the nozzle, its movement through space was tracked using the Very Long Baseline Array (VLBA) — which comprises ten radio telescopes spread across the US. This allowed the team to determine the velocity of the knot.

The behaviour of the outburst within the nozzle was studied using a number of optical, X-ray and gamma-ray telescopes — both ground-based and in space.

Using optical telescopes, the team saw a burst of light that they believe was created when the knot reached the point of maximum velocity in the nozzle. The presence of very high-energy gamma rays in this burst suggests that at this point the plasma was being accelerated to velocities approaching the speed of light.

The time delay between the optical burst inside the nozzle and the radio burst outside the nozzle was measured. This was then multiplied by the velocity of the knot to get an idea of the length of the nozzle — which the team estimate to be about 10^13 km or about 10,000 times the distance between the Earth and the Sun.
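
As a rough illustration of that arithmetic (the measured delay is not quoted in this article, so the value below is a placeholder):

```python
# Hypothetical illustration of the length estimate: nozzle length ~ knot speed
# multiplied by the optical-to-radio time delay, as described above.
C_KM_PER_S = 2.998e5                  # speed of light in km/s
knot_speed = 0.99 * C_KM_PER_S        # knot speed quoted in the article

delay_days = 300.0                    # HYPOTHETICAL delay, for illustration only
delay_s = delay_days * 86_400

nozzle_length_km = knot_speed * delay_s
print(f"implied nozzle length ~ {nozzle_length_km:.1e} km")
```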

Rotating polarization

The team also found that the polarization of the light emitted from the knot rotated as the knot travelled through the nozzle. Mark Birkinshaw — an astrophysicist at the University of Bristol, UK, who was not involved in the research — told physicsworld.com that this is “consistent with the concept of a shocked structure moving along a helical magnetic field”. He added that Marscher’s findings “fit in nicely with most theories”.

According to Marscher, the team also gained some insight into what happens to the knot once it emerges from the nozzle. At this point the team detected a second burst of high-energy gamma radiation, which they believe is related to the onset of turbulence in the plasma flow, generating a great deal of heat and high-energy gamma rays.

One important question that the team was not able to answer, however, is why the initial explosion occurs.

Marscher is now looking forward to gaining a better understanding of the gamma rays coming from blazars using NASA’s Gamma-ray Large Area Space Telescope (GLAST), which is expected to launch in May 2008. Further in the future, a new radio telescope to be built in Japan in 2012 could be used in conjunction with the VLBA and other radio telescopes to peer more deeply into the nozzle of the jet.

Phonons fail to explain high-temperature superconductivity

Two independent teams have hammered what could be the final nails in the coffin of a mechanism long thought able to explain high-temperature superconductivity.

Since compounds of copper oxide, or “cuprates”, were first discovered in the mid-1980s to exhibit superconductivity at temperatures well above absolute zero, physicists have tried to explain how these materials behave by adopting the approach used to understand their low-temperature counterparts. This would mean that the signature of superconductivity — a flow of charge with zero resistance — results from electrons interacting with vibrations in the cuprate’s crystal lattice.

In recent years, despite mounting experimental evidence against it, some physicists have clung on to this interpretation. But now teams from Germany and the US have performed calculations to suggest that lattice vibrations in cuprates can at best account for just a small fraction of the materials’ superconducting behaviour.

“It’s a politically charged problem, a little like trying to find the tomb of Jesus,” says Tom Timusk of McMaster University in Canada, who is not involved in the research. “An atheist could not care less, nor would a Muslim or a Buddhist be excited, but for believers it would be a monumental discovery. So it is with the phonon mechanism of [high-temperature] superconductivity.”

From low to high

Low-temperature superconductors, which make a transition to the superconducting phase close to absolute zero, have been well described by Bardeen–Cooper–Schrieffer (BCS) theory for 50 years. In this theory, the natural repulsion between two electrons is overpowered by a lattice vibration, known as a phonon, which binds the electrons into a “Cooper pair”. It turns out that at low temperatures an electron can avoid the effect of any resistance by being with its partner, so the Cooper pair does not experience resistance at all.

BCS theory itself cannot explain how electrons pair up in high-temperature superconductors, but physicists — notably Alex Müller, who shared the 1987 Nobel Prize with Georg Bednorz for discovering the materials — have suggested that the pairing might still arise from some process based on phonons. Although results from many techniques, including neutron scattering and far-infrared spectroscopy, have not supported this picture, in 2001 separate teams performing photoemission spectroscopy found evidence to back up the phonon interpretation.

They found that, after shining light at different angles onto cuprates, electrons emitted by photoionization had an energy–momentum relationship containing a prominent “kink” between 50 and 80 meV. Such a kink implied that the electrons were interacting in the cuprates with some sort of boson (an integer-spin particle) — though it was not clear whether that boson could be a phonon.

Now Dirk Manske and colleagues at the Institute for Solid-State Physics in Karlsruhe and the Max Planck Institute for Solid State Research in Stuttgart have used first-principles calculations to work out how the energy–momentum relationship should look, to see to what extent phonons can be the cause of the kink. They used a “local density approximation” to calculate the number of energy states available to electrons in the high-temperature superconductor YBa2Cu3O7, which they combined with a calculation of how the electrons should change in energy due to a phonon interaction.

Small kink

Manske’s team found that the theoretical energy–momentum relationship produced by these calculations did contain a kink — but one about three to five times smaller than in the 2001 observations (Phys. Rev. Lett. 100 137001). This is bad news for physicists who have been hoping phonons can account for all of the behaviour of high-temperature superconductors. “It is embarrassing for people to admit they have worked on something for 20 years if it is not true,” jokes Manske.

Meanwhile, Steven Louie and colleagues at the University of California in Berkeley have come to a similar conclusion with the cuprate LaSrCuO4. From their calculations, the phonon contribution is almost an order of magnitude too small to account for the observed kink (Nature 452 975).

Although phonons are now effectively ruled out as the underlying cause of high-temperature superconductivity in cuprates, researchers still have the task of finding another process that works. One possibility is that the electrons pair up via interactions between their spins. Such interactions occur over much shorter ranges than phonons, and in fact there is already significant evidence to support this notion.

“There are many experimental techniques being used to interrogate the high-temperature superconductors,” says Louie. “With increasingly better samples and advances in experimental techniques, data from these measurements are revealing more and more detailed, systematic results. The correct theory of superconductivity for the high-temperature cuprates will have to consistently explain all these data.”

Dark matter claims disputed

Physicists remain sceptical about claims made last week that dark matter has been detected at a laboratory in Italy, insisting that the data need to be backed up by further experiments. The Italian team, which made the claims at a meeting in Venice, says it has strong evidence that dark matter takes the form of weakly interacting particles.

Dark matter was originally put forward to explain why galaxies appear to contain just a fraction of the mass they need to rotate at their observed speeds. Nobody has yet managed to see it directly, although astrophysical measurements imply that it should make up around 85% of the total matter in the universe, mostly in “haloes” around the centres of galaxies.

In 2000 physicists working at the DAMA experiment at the Gran Sasso laboratory in Italy said they had discovered what many believe is the most promising candidate for dark matter: a weakly interacting massive particle, or “WIMP”. Such a particle would not interact with light, although it would interact with normal matter through the weak nuclear force. In fact, it might interact enough to occasionally collide with a nucleus inside some 100 kg of sodium iodide, which the DAMA team kept in an array of detectors buried more than a kilometre underground.

The DAMA team found that flashes of light produced by recoiling nuclei inside their detectors tended to be more frequent in July than in December. This seasonal modulation fitted the prediction that the Earth’s motion should alternately add to and subtract from the prevailing “wind” of WIMPs as it orbits the Sun, which is itself moving through our galaxy’s dark-matter halo. After analysing their results, the team claimed this modulation pointed to a WIMP in the mass range 44 to 62 GeV — at the lighter end of the scale of those predicted by theory — and at a confidence level of 6.3 σ (or 99.9999998% certainty).
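
The annual modulation being discussed here is conventionally fitted with a cosine of one-year period (the standard form used in direct-detection analyses; DAMA’s own fit parameters are given in their papers):

```latex
S(t) = S_{0} + S_{m}\cos\!\left(\frac{2\pi\,(t - t_{0})}{T}\right), \qquad T \simeq 1\ \mathrm{yr}
```

where S0 is the constant event rate, Sm the modulation amplitude and t0 the time of year at which the Earth’s orbital velocity adds most strongly to the Sun’s motion through the galactic halo.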

At the time many physicists suspected that the seasonal modulation could be the result of something more mundane, such as a change in ambient temperature. So, over the past eight years, the team have increased the amount of sodium iodide in their detector array to 250 kg. Last week they claimed that this upgrade, branded as “DAMA/LIBRA”, has enabled them to confirm the existence of the WIMP with an improved confidence of 8.2 σ.

Even with this level of certainty in the modulation, however, many remain unconvinced. “The DAMA announcement has had a mixed reception,” says Henrique Araujo of ZEPLIN-III, another WIMP experiment. “No-one disputes that they observe an annual modulation, but the community needs to be persuaded that dark matter has anything to do with it.”

Agrees with predictions

The DAMA team insist that their signal matches the dark-matter signature predicted in the mid-1980s by theorist Katherine Freese. Rita Bernabei, a spokesperson for the DAMA team, says: “The former DAMA experiment over seven annual cycles and the new DAMA/LIBRA experiment over four annual cycles show, with high confidence level, agreement with all the features expected for the presence of dark-matter particles in the galactic halo.”

Other physicists point to other potential causes for the modulation. Any neutral particle can scatter off a nucleus to make it recoil, so it is possible that the modulation signature is caused by neutrons from cosmic rays. However, several independent experiments have failed to find cosmic-ray neutron bombardment at the required rate in the DAMA experiment’s energy range (2–6 keV), which counts against this interpretation.

Another possibility is that the DAMA team are actually observing flashes of light produced by particles scattering off atomic electrons, rather than nuclei. “ZEPLIN-III might be able to detect an electron-recoil modulation at the rates and energies reported by DAMA, but for that we need a longer run,” says Araujo.

A third possibility is that the recoils are being produced by a hypothetical particle known as the axion. These light particles, which were conjured up in the late 1970s to resolve the “strong-CP” problem in the theory of the strong force, have been suggested as dark-matter candidates before, but have so far evaded all attempts at detection. An axion able to explain the modulation in the DAMA experiment would need a mass of around 3 keV.

New physics?

Even if none of these alternative interactions is producing the seasonal modulation seen at the DAMA experiment, there is still a question as to why other direct WIMP searches — such as ZEPLIN-III in the UK, CDMS-II and COUPP in the US, and XENON10 in Italy — have not recorded similar results. In the DAMA experiment’s energy range, these experiments have found no evidence for coherent scattering off nuclei, which is the WIMP-scattering process favoured by theory. “For the DAMA/LIBRA result to be a true dark-matter signal requires a new interaction process, or other new physics,” says Alexander Murphy, also at the ZEPLIN-III experiment. “This would be extraordinary! The particle-physics theorists have predicted many scenarios, but not this.”

“If someone has detected dark matter, it’s clearly a Nobel-prize-winning experiment,” says Richard Gaitskell, who works on several direct WIMP searches. Gaitskell also notes that other interactions could be producing the signature seen at DAMA, but feels that the DAMA team has not yet been rigorous enough in calibrating its detector system. “What would be best would be if they could switch off their beam, like you can in a particle detector,” he explains. If such a thing were possible, they would be able to see whether the seasonal modulation is caused by anything else. “Unfortunately, you can’t switch off WIMPs,” he adds.

Bernabei told physicsworld.com that she and the rest of the DAMA team are confident they have performed all the necessary calibrations. But Gaitskell says it would have been better if the DAMA team had performed a running calibration while taking experimental data, by creating their own light flashes and looking for signs of seasonal modulation. “If I could have one thing for Christmas it would be to see clear calibration data showing the necessary stability,” he says.

• Preprints including the DAMA/LIBRA results can be found at arXiv:0804.2738 and arXiv:0804.2741.

Graphene transistors cut from ribbons into dots

Researchers at Manchester University in the UK have made the first transistors from graphene quantum dots, suggesting that graphene is a promising replacement for silicon in the next generation of electronic devices.

Graphene is a two-dimensional sheet of carbon just one atom thick, and is usually made by cleaving small crystals of graphite. At a molecular level it looks like a sheet of chicken wire — a continuous spread of joined-up benzene rings.

Because of its unusual physical properties, graphene is often touted as a replacement for silicon as the electronic material of choice. These properties include the fact that electrons in the material behave like relativistic particles with no rest mass, travelling at speeds of around 10^6 m/s. “The good thing is that these properties remain when we scale graphene devices down to a few benzene rings, which is what one needs for top-down molecular electronics,” team member Kostya Novoselov told physicsworld.com.
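
The “relativistic” behaviour referred to here is the linear, massless-Dirac-like dispersion of graphene’s charge carriers near the corners of its Brillouin zone, usually written as

```latex
E(\mathbf{k}) = \pm\,\hbar v_{\mathrm{F}}\,|\mathbf{k}|, \qquad v_{\mathrm{F}} \approx 10^{6}\ \mathrm{m\,s^{-1}}
```

with the Fermi velocity v_F playing the role that the speed of light plays for a genuinely massless particle.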

Until now, researchers have only been able to make transistors from ribbons of graphene. But such a long shape does not maximize the conductivity, which is why Novoselov and colleagues cut the ribbons down into pieces small enough to quantum-confine electrons. They do this using a combination of electron-beam lithography and reactive plasma etching to carve small islands out of large graphene sheets (Science 320 356). “We have demonstrated a proof of concept — that it is possible to create a transistor based on graphene quantum dots by standard technological procedures,” says Novoselov. “Furthermore, the device will be able to operate at room temperature.”

Andre Geim, another member of the team, says they can now make reproducible transistors with features as small as 10 nm, which should reduce to 1 nm in the future. “It is molecular electronics using the top-down approach. No other material allows this approach for making structures smaller than 100 nm, which is the dimension needed for operating single-electron transistors at room temperature.”

Jie Chen of the University of Alberta in Canada, whose team also works on making electronic devices from graphene, is impressed at how quickly Novoselov, Geim and colleagues are making developments with graphene. “They are the global leaders in this field,” he says.

Copyright © 2025 by IOP Publishing Ltd and individual contributors