
No extra cash for UK physics

Any remaining hopes that the UK government might plug an £80m hole in the nation’s physics-research budget were dashed yesterday. In a 27-page response to a parliamentary select committee inquiry that had criticized the way the government and the Science and Technology Facilities Council (STFC) handled the 2008–2011 science budget allocations, the government’s Department of Innovation, Universities and Skills (DIUS) says it is “simply wrong” to suggest that the STFC budget was cut by the government in the first place.

Instead, the government has reiterated its claim that the STFC’s budget will increase by £185m (about 13.6%) over the three-year period, adding that the £80m figure was derived from “STFC aspirations” drawn up before receiving its budget. “There are no plans to move money around,” a DIUS spokesperson told physicsworld.com.


Funding fiasco

In April this year a 14-strong panel of MPs on the Innovation, Universities, Science and Skills Select Committee came down heavily on the government and the STFC for their roles in the funding fiasco, which emerged late last year. In addition to prompting the STFC to announce it was pulling out of the International Linear Collider (ILC), the Gemini telescope and all ground-based solar terrestrial physics projects, research grants are expected to be slashed by up to 25% over the next three years as a direct result of the shortfall. This could cost tens if not hundreds of jobs and threaten observatories and experiments such as those at the Large Hadron Collider (LHC) at CERN.

The MPs’ report apportioned some of the blame to the way STFC was created in April 2007 by merging the Particle Physics and Astronomy Research Council (which awarded research grants) with the CCLRC (which managed scientific facilities). It claimed DIUS should have known that STFC would inherit the future operating costs of the ISIS neutron source and the brand new Diamond synchrotron at the Rutherford Appleton Lab in Oxfordshire, which amounted to £25m per year, and therefore should honour its original commitment to leave “no legacy funding issues” from the merger. But in its formal response to the select committee report, DIUS states that the STFC did not inherit a deficit from CCLRC and considers that the government “fully meets the commitment given at the time of STFC’s creation”.

International reputation

The government also believes that the UK remains a reliable international partner despite the committee’s findings that pulling out of the ILC and Gemini had made it look “unreliable” and “incompetent”. But European leader of the ILC project Brian Foster of Oxford University says the response is a non sequitur. “Like much of its response to the Select Committee, the government view regarding the ILC is actually very measured,” he told physicsworld.com. “But it is a nonsense to say that the way the STFC announced we were pulling out of the ILC has not damaged our international standing.”


The report acknowledges that communication between STFC and its community and between government and the STFC could be improved, and says the government is working closely with STFC on the lessons learnt from the allocations process. Although it says that changes to the leadership of the STFC (which received damning criticism in the committee report) would only be “disruptive”, the council has agreed to ministers’ request to conduct an independent organisational review to assess its effectiveness. This is due to report in September, along with another government-appointed review of UK physics in general chaired by Bill Wakeham of Southampton University.

In the meantime, the STFC has completed a major consultation with the scientific community to help determine which projects should take priority in the face of inevitable cuts. The original “programmatic review” announced in February caused uproar, with the ranking procedure (which was carried out largely behind closed doors by two panels of around 10 researchers: PPAN for particle physics, nuclear physics and astronomy; and PALS for all other STFC programs) leaving many physicists baffled.

In particular, the LHCb experiment — one of four giant detectors around the LHC ring about to start taking data and which has a large UK contingent — found itself in the “medium–low” category, while another LHC experiment, ALICE, and the e-Merlin radio telescope were classed in the “lower” category. The latter led to widespread concern about the future of the Jodrell Bank observatory.

Specialist panels

Some 1400 letters and emails sent in response to the review have now been digested by 10 specialist panels, and last week the STFC posted PPAN’s and PALS’s revised rankings on its website. The effort mostly seems to have paid off.

For each of the projects in the review, PPAN worked out an “alpha grade” where alpha 5 is the highest and alpha 1 means the project is unlikely to be funded. Overall the rankings have not changed that much, but significantly LHCb has been bumped up to alpha 4, ALICE to alpha 3, and e-Merlin notched up slightly to alpha 2. The MINOS neutrino experiment at Fermilab in the US has also been boosted to alpha 2, and the panel recommended that R&D for ILC detectors should be ramped down slowly in order to maximize their scientific value. The STFC executive expects to take a final decision at a meeting on July 1.


“On the particle physics side, it looks like the consultation panel did an excellent job,” says Tim Gershon of the University of Warwick. “The only problems remain in areas where PPAN has decided not to accept its recommendations, such as the decision not to fund the BaBar experiment and to cut LHCb at 10% rather than the recommended 5%. This is hard to understand given that the consultation panel had worked through the costings [e.g. by shaving costs off other projects].”

Mike Green of Royal Holloway, University of London, who was on the particle physics panel, says that even though a 10% cut on LHCb is better than the original 25%, it still means that the UK may have to hand over some of its crucial detector responsibilities to another country.

“Overall there is still a lot of uncertainty about how much money physics departments are going to lose from existing grants, which is making it very difficult to plan ahead,” says Green, who is also a member of the Particle Physics Action Group set up in response to the funding crisis. “While we welcome the government’s and STFC’s commitment to improve communications and engagement, let us not forget that the government continues to insist that STFC received a 13.6% increase in its budget when this fails to recognize that a substantial fraction of this increase represents past investment. The future ‘near-cash’ increase in the STFC budget is 8%, which is the lowest of all the research councils and will mostly be swallowed up to provide for full economic costing.”

NMR sheds new light on protein interactions

Biophysicists in Germany and the US have used nuclear magnetic resonance to gain an important new insight into how proteins interact.

The researchers were, for the first time, able to map out all the different shapes that the protein ubiquitin can take over a period of several microseconds. The team found that many of these shapes are nearly identical to those adopted by ubiquitin when it binds with other proteins — a discovery that suggests the prevailing theory of protein binding is incomplete, and could pave the way for new types of drugs (Science 320 1471).

All living things contain proteins and many biological processes involve these chain-like molecules binding with one another. Biophysicists know that a bound protein appears to have a different shape than its free counterpart — but understanding exactly how this change in shape occurs has proven very difficult.

For more than 50 years, most biophysicists subscribed to the “induced fit” theory, whereby a free protein is coaxed by its partner to undergo a gradual change in its shape during the binding process.

Many different shapes

However, biophysicists are beginning to realize that free proteins fluctuate between many different shapes in the absence of a binding partner — and it could be that binding simply occurs when a free protein spontaneously assumes the correct shape.

This theory of “conformational selection” had been hard to establish because standard techniques such as X-ray diffraction cannot identify the large number of short-lived shapes that a free protein can adopt.

Now, Bert de Groot and colleagues at the Max Planck Institute for Biophysical Chemistry in Göttingen and Vanderbilt University in Nashville have used nuclear magnetic resonance (NMR) to map out all the possible shapes of free ubiquitin, which is a protein found in many living cells.

The team used a relatively new NMR technique, which can follow the positions of atoms in a molecule on timescales up to several microseconds. This is much longer than traditional NMR, which can only detect motion on nanosecond timescales — not long enough to get a good look at all of the possible shapes the protein can adopt, according to De Groot.

Matching 46 bound structures

The team then compared this “structural ensemble” of free protein shapes with the 46 shapes that ubiquitin is known to adopt when bound within larger structures. They discovered that every one of the bound shapes also occurred in the free protein.

De Groot told physicsworld.com that the result implies that “no induced-fit motions are required for ubiquitin to adapt to its different binding partners”.

Writing in the same issue of Science, David Boehr and Peter Wright of the Scripps Research Institute in California point out that the NMR work does not necessarily mean that the induced fit theory is incorrect (Science 320 1429). Rather, it is possible that both mechanisms are involved in the binding process. They also point out that de Groot’s results suggest that structural fluctuations could play an important role in how the function of a protein evolves over time.

De Groot added that the findings will shed light on a number of processes that involve protein binding including biochemical signalling processes, and receptor-ligand recognition. Both of these processes are important to those designing new drugs, and de Groot believes that the team’s work could “contribute to the design of novel drugs”.

Two new reactors for Canada


By Hamish Johnston

Summer can be a miserable time in Toronto. It can get very hot and humid, causing folks to turn up their air conditioning, which in turn puts the region’s coal-fired generating plants into overdrive, blanketing the city in a sickly yellow smog that can harm those with breathing difficulties.

As a result, local utilities have begun to shut down aging coal-fired plants in an attempt to improve air quality — but that has left some wondering where the city and the surrounding province of Ontario will get their electricity.

Now, Ontario’s Minister for Energy Gerry Phillips has given the go ahead for two new nuclear reactors to be built at the Darlington generating plant just east of the city, which is already home to four reactors. They will be the first new power reactors built in Canada in more than 15 years.

According to the Toronto Star, the reactors will come online in 2018 and the design will be chosen in November from a short list of three firms: Atomic Energy of Canada; US-based Westinghouse; and Areva of France.

The reactors are expected to generate about 3200 MW of power, which will double Darlington’s current capacity.

The move is part of Ontario’s CDN$26 billion plan to maintain its current nuclear capacity of 14,000 MW through a series of upgrades over the next 20 years.

Darlington is located in a part of Ontario that has been hard hit by lay-offs in the automotive industry, so Phillips may be hoping that the promise of 3500 new jobs will offset the concerns of environmentalists.

A real gem on arXiv

By Hamish Johnston

I spent an hour or so this morning trawling through the arXiv preprint server, where many physicists post their research results before they are published formally. It’s a great way to keep up with the latest breakthroughs — and a good source of more controversial or off-beat stories.

That’s where I spotted this gem: “Growth of Diamond Films from Tequila” by Javier Morales, Miguel Apátiga and Victor M Castaño, who are physicists based in (you guessed it) Mexico.

It seems that the three physicists have used the famous spirit in their chemical vapour deposition (CVD) machine to create tiny diamonds.

Although the paper notes that diamonds have already been made by CVD using a number of other precursors, the trio suggest that tequila provides “an excellent alternative to produce industrial-scale diamond thin films for practical applications using low-cost precursors”.

It’s not clear from the paper why tequila was used rather than vodka, gin or whisky — and of course, if the paper had been entitled “Growth of Diamond Films from Water and Ethanol”, I wouldn’t have given it a second thought.

Cold-fusion demonstration: an update

By Jon Cartwright

Several of you have asked when I’m going to give you an update on Yoshiaki Arata’s cold-fusion demonstration that took place at Osaka University, Japan, three weeks ago. I have not yet come across any other first-hand accounts, and the videos, which I believe were taken by people at the university, have still not surfaced.

However, you may have noticed that Jed Rothwell of the LENR library website has put some figures with explanations relating to Arata’s recent work online. I’ve decided to go over them and some others here briefly to give you an idea of how Arata’s cold-fusion experiments work. It’s a bit more technical than usual, so get your thinking caps on.

[Figure: diagram of Arata’s apparatus (ColdFusionFig3.jpg)]

Above is a diagram of his apparatus. It comprises a stainless-steel cell (2) containing a sample, which for the case of the demonstration was palladium dispersed in zirconium oxide (ZrO2–Pd). Arata measures the temperature of the sample (Tin) using a thermocouple mounted through its centre, and the temperature of the cell wall (Ts) using a thermocouple attached on the outside.

Let’s have a look at how these two temperatures, Tin and Ts, change over the course of Arata’s experiments. The first graph below is one of the control experiments (performed in July last year) in which hydrogen, rather than deuterium, is injected into the cell via the valve (5), which is operated by the controller (8):

[Figure: hydrogen control run, Tin and Ts versus time (ColdFusionFig2.jpg)]

At 50 minutes — after the cell has been baked and cooled to remove gas contamination — Arata begins injecting hydrogen into the cell. This generates heat, which Arata says is due to a chemical reaction, and the temperature of the sample, Tin (green line), rises to 61 °C. After 15 minutes the sample can apparently take no more hydrogen, and the sample temperature begins to drop.

Now let’s look at the next graph below, which is essentially the same experiment but with deuterium gas (performed in October last year):

[Figure: deuterium run, Tin and Ts versus time (ColdFusionFig1.jpg)]

As before, Arata injects the gas after 50 minutes, although it takes a little longer for the sample to become saturated, around 18 minutes. This time the sample temperature Tin (red line) rises to 71 °C.

At a quick glance the temperatures in both graphs, after saturation, appear to peter out as one would expect if heat escapes to the environment. However, in the case of deuterium there is always a significant temperature difference between Tin and Ts, indicating that the sample and cell are not reaching equilibrium. Moreover, after 300 minutes the Tin of the deuterium experiment is about 28 °C (4 °C warmer than ambient), while Tin/Ts of the hydrogen experiment is at about 25 °C (1 °C warmer than ambient).

These results imply there must be a source of heat from inside the cell. Arata claims that, given the large amount of power involved, this must be some form of fusion — what he prefers to call “solid fusion”. This can be described, he says, by the following equation:

D + D → 4He + heat

(According to this equation, there should be no neutrons produced as by-products — thanks to those of you who pointed this out on the last post.)
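As a back-of-envelope check on why this channel would be so energetic, here is the standard mass-deficit arithmetic for D + D → 4He, using published atomic masses (a sketch of textbook bookkeeping only; it says nothing about whether Arata’s cell actually hosts this reaction):

```python
# Energy released (Q-value) for D + D -> 4He, from the mass deficit.
# Atomic masses are used; the two electron masses cancel on each side.
U_TO_MEV = 931.494      # energy equivalent of one atomic mass unit, MeV
M_D = 2.014101778       # atomic mass of deuterium, u
M_HE4 = 4.002603254     # atomic mass of helium-4, u

Q = (2 * M_D - M_HE4) * U_TO_MEV  # MeV released per fusion event
print(f"Q = {Q:.2f} MeV")         # roughly 24 MeV, carried away as heat
```

Compare that with a typical chemical bond energy of a few electron-volts: the nuclear route releases millions of times more energy per event, which is why even a small fusion rate would show up as measurable heat.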

If any of you are still reading, this graph below is also worth a look:

[Figure: long-term temperature records for runs A–D (ColdFusionFig4.jpg)]

Here, Arata also displays data from deuterium and hydrogen experiments, but starts recording temperatures after the previous graphs finished, at 300 minutes. There are four plots: (A) is a deuterium and ZrO2–Pd experiment, like the one just described; (B) is another deuterium experiment, this time with a different sample; (C) is a control experiment with hydrogen, again similar to the one just described; and (D) is ambient temperature.

You can see here that the hydrogen experiment (C) reaches ambient temperature quite soon, after around 500 minutes. However, both the deuterium experiments remain 1 °C or more above ambient for at least 3000 minutes while still exhibiting the temperature difference between the sample and the cell, Tin and Ts.

Could this apparently lasting power output be used as an energy source? Arata believes it is potentially more important to us than hot or “thermonuclear” fusion and notes that, unlike the latter, it does not emit any pollution at all.

Disorder puts the brakes on matter waves

Two independent teams of physicists have used ultracold atomic gases to demonstrate how a little bit of disorder can paralyse a quantum system — an effect called “Anderson localization”. Their experiments mark the first time that this phenomenon has been seen in matter, despite being predicted to occur in crystalline solids 50 years ago.

Although the experiments were done in 1D optical lattices with non-interacting atoms, the teams claim that their methods could be expanded to investigate 2D and 3D systems or to study the effects of interactions on localization. This could allow such systems to be used as “quantum simulators” to gain insight into real materials that are difficult to probe experimentally or simulate numerically.

Much of what we know about the electronic properties of metals and semiconductors is based on the idea that electrons with certain momenta can travel freely through a crystalline lattice, while others cannot. This is embodied in Felix Bloch’s 1928 quantum theory of conduction, which describes the lattice as a periodic electric potential through which some electrons (behaving as “matter waves”) diffract with ease.

Disordered lattice

Some 30 years later, Philip Anderson worked out what would happen in such a system if the potential lost its periodicity. This could happen, for example, if the lattice remained periodic but the potential had a different value at each lattice site. He found that electrons would be unable to move through such a “disordered” lattice, and instead become trapped by specific atoms. This prediction of “Anderson localization” won him the 1977 Nobel Prize in Physics.
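Anderson’s picture is easy to sketch numerically: a tight-binding chain with random on-site energies and nearest-neighbour hopping, where the inverse participation ratio (IPR) of each eigenstate distinguishes extended waves from trapped ones. This is an illustrative toy model, not the analysis used in either experiment, and the parameter values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
N, t, W = 200, 1.0, 3.0  # lattice sites, hopping amplitude, disorder strength

# 1D Anderson model: random on-site energies, nearest-neighbour hopping
H = np.diag(rng.uniform(-W / 2, W / 2, N))
H += np.diag(-t * np.ones(N - 1), 1) + np.diag(-t * np.ones(N - 1), -1)

energies, states = np.linalg.eigh(H)

# Inverse participation ratio: ~1/N for extended states, order one when localized
ipr = (np.abs(states) ** 4).sum(axis=0)
print(f"mean IPR with disorder: {ipr.mean():.3f}; extended states give ~{1.5 / N:.4f}")
```

With the disorder switched on, the mean IPR sits far above the ~1/N value of plane-wave-like states, showing that every eigenstate is pinned to a handful of sites — the 1D version of the trapping both experiments observed.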

Although Anderson localization has been seen in a number of systems — including those based on light and microwaves — it has never been observed directly with electrons. This is because the lattice vibrations and electron–electron interactions that occur in real materials wash out the subtle effects of localization.

Now, two independent groups claim to have seen Anderson localization in matter waves for the first time. Both teams saw the phenomenon in Bose–Einstein condensates (BECs) made from ultracold atomic gases. In both systems it was individual atoms — rather than electrons — that became localized. Instead of being in a solid crystal, the atoms were in an optical lattice created by overlapping laser beams.

Speckled pattern

In the first experiment, Alain Aspect and colleagues at the University of Paris-Sud and CNRS’s Institut d’optique began by confining about 17,000 rubidium atoms in a magnetic trap to make a BEC (Nature 453 891). Turning the fields off makes the atoms quickly leave the centre of the trap. But if the BEC is placed in a lattice created by a laser “speckle” pattern — a random distribution of light and dark regions that is created when a laser beam is reflected from a rough surface — the atoms stay put when the magnetic fields are turned off. The team claims this effect is caused by Anderson localization.

Philippe Bouyer at Paris-Sud told physicsworld.com that the team is now looking at how to repeat their experiment in 3D — something that will require a major redesign to overcome (among other things) the effect of gravity pulling down on the atoms.

Meanwhile in Italy, Massimo Inguscio and colleagues at the University of Florence began with a 1D optical lattice that is a standing wave created by an intense laser beam (Nature 453 895). If a second, weaker laser with a different wavelength is shone into the lattice, the two beams interfere to create a 1D “quasicrystal” with a certain amount of disorder. The degree of disorder in the quasicrystal can be adjusted by changing the intensity of the second laser.

The team then loaded the centre of their lattice with potassium atoms and watched what happened for different degrees of disorder. Sure enough, when the second laser was switched off, some of the atoms began to diffuse away from the centre of the lattice, finding it easy to move through the periodic potential. However, when the experiment was repeated with the second laser switched on, Anderson localization kept the atoms in the centre of the lattice.

Interacting atoms

The Florence team are planning to repeat their experiment with interacting atoms. Team member Giovanni Modugno told physicsworld.com that this will be done by tuning the applied magnetic fields that the team currently use to eliminate interactions between atoms — exploiting a phenomenon known as a “Feshbach resonance”.

Daniel Steck, a physicist at the University of Oregon, told physicsworld.com that “these experiments with cold atoms can certainly help in the study of disordered media”. He added, “in real materials, it’s very difficult to see coherent quantum effects directly, such as localization”.

Cylinders of silence


By Hamish Johnston

I have just updated the “Featured Journal” slot on physicsworld.com to include a paper published today that presents a “feasible” recipe for creating a metamaterial that could completely cloak an object from sound — at least in two dimensions.

Instead of absorbing and/or reflecting sound like traditional acoustic insulation, an acoustic cloak would guide sound waves around an object, much like water flowing around a stone in the middle of a stream. This analogy makes it easy to see why those charged with hiding submarines from sonar would like to get their hands on such a material.

But don’t expect to be able to cover the walls of your bedroom with the stuff and finally get a good night’s sleep, because the authors haven’t actually made the material yet (although if your neighbour’s microwave oven is keeping you awake, physicists have made a microwave cloak).

The paper is by Daniel Torrent and José Sánchez-Dehesa of the Polytechnic University of Valencia, Spain. I spoke with Sánchez-Dehesa earlier this year when I wrote a news story about a paper they published in February. This earlier work suggested that an acoustic cloak could be made by surrounding an object with an array of cylindrical rods. If the rods had the right elastic properties and their radii and spacing were varied in a specific way, silence would reign.
However, it looks like they have decided that this earlier design cannot be built using real materials. Now, they have refined their design to a multilayered composite of two different “sonic crystals” — with each crystal being a lattice of cylindrical rods.

Torrent and Sánchez-Dehesa have calculated that about 200 metamaterial layers would be required to cloak an object over a wide range of acoustic frequencies — although it’s not clear from the paper how thick this would be if real materials were used.

Both papers are published in the New Journal of Physics, which will be putting out a special issue on “Cloaking and Transformation Optics” later this year.

‘Plutoids’: the new name for Pluto-like dwarf planets

The International Astronomical Union (IAU) has decided that Pluto — and other dwarf planets in the Solar System that share similar characteristics — should now be sub-classified as “plutoids”.

The revised classification comes almost two years after Pluto was demoted from “planet” to “dwarf planet” in order to dispel inconsistencies in Solar System nomenclature that have arisen as more orbiting bodies have been discovered.

A dwarf planet is a body orbiting the Sun that has enough gravity to assume a near-spherical shape, but that is not the sole occupant of its orbit. To be a “plutoid”, according to the IAU, the dwarf planet must also be orbiting at a greater distance than Neptune. Aside from Pluto, the only other known dwarf planet fitting this specification is Eris, although more are expected to be found in the future.
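The taxonomy described above boils down to a few yes/no tests. As a hypothetical sketch (the function, its arguments and the orbit threshold are my own shorthand for the article’s definitions, not anything published by the IAU):

```python
NEPTUNE_SMA_AU = 30.1  # Neptune's semi-major axis, the "beyond Neptune" test

def classify(orbits_sun: bool, near_spherical: bool,
             clears_orbit: bool, semi_major_axis_au: float) -> str:
    """Toy classifier following the definitions given in the article."""
    if not orbits_sun:
        return "not classified here (e.g. a moon)"
    if not near_spherical:
        return "small Solar System body"
    if clears_orbit:
        return "planet"
    # A dwarf planet orbiting beyond Neptune is a plutoid
    return "plutoid" if semi_major_axis_au > NEPTUNE_SMA_AU else "dwarf planet"

print(classify(True, True, False, 39.5))  # Pluto
print(classify(True, True, False, 2.8))   # Ceres
```

Pluto and Eris both land in the “plutoid” branch, while a round body in the asteroid belt such as Ceres remains a plain dwarf planet.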

The IAU, which has been responsible for classifying planetary bodies and satellites since the early 1900s, had always planned to create sub-classes within the class of “dwarf planet”. The name “plutoid” was proposed by the IAU Committee on Small Body Nomenclature, and then accepted by the IAU Working Group for Planetary System Nomenclature and the board of IAU’s Division III, which concerns planetary systems. It was finally approved by IAU executives at a recent meeting in Oslo, Norway.

Still criticized

When Pluto was first reclassified, some astronomers criticized the definition for not being robust enough, in particular because the orbits of certain planets — including Earth — also overlap with those of other bodies. On top of that, they complained that not enough astronomers were consulted when the decision was made.

Catherine Cesarsky, president of the IAU, dismisses such past protests. “They form a very small part of the astronomy community,” she told physicsworld.com. She added that “practically nobody” is now trying to get Pluto reclassified as a planet.

However, Alan Stern, principal investigator for NASA’s New Horizons mission to Pluto, has already mocked the new plutoid definition. “Plutoids or haemorrhoids, whatever they call it. This is irrelevant,” he has been reported as saying.

Cesarsky admits that she has not yet heard the astronomy community’s response to the rebranding. “I don’t think there will be a big [reaction],” she says. “A few people make a lot of noise.”

Astronomy in the dock

By Hamish Johnston

Earlier this week our Paris correspondent Belle Dumé was back in her hometown of Liverpool, where she took in an exhibition of astronomy images on display around the city’s famous Albert Dock.

Belle took a selection of photos and reported back to us.


Called “From Earth to the Universe”, the exhibition runs until 29 June 2008. Belle says that it provides a taste of things to come during the International Year of Astronomy celebrations next year.

2009 has been proclaimed the International Year of Astronomy (IYA2009) by the International Astronomical Union (IAU) and UNESCO. Its mission is to bring astronomy into the wider public domain.


Belle tells me that the new exhibition, sponsored by the Science Photo Library and the ASTRONET Symposium, is probably the first real IYA2009 event and consists of 48 stunning images taken by professional as well as amateur astronomers. These include photographs of our Milky Way, the Andromeda galaxy, the Horsehead Nebula and the now famous image of the cosmic microwave background revealed by the Wilkinson Microwave Anisotropy Probe (WMAP) satellite in 2003.

Belle’s hometown was chosen to host the exhibit because it is the European Capital of Culture this year. The exhibition also coincides with a major European astronomy meeting, the ASTRONET Symposium, which will take place from 16 to 19 June at Liverpool John Moores University.


Belle tells me that the display is a prototype for an exhibit that will tour the world next year. So it might be coming to a park, shopping centre, metro station or airport near you.

There will be special coverage of IYA2009 in upcoming issues of Physics World.

Going once…a first edition of Copernicus’s magnum opus


By Jon Cartwright

Most of you will never have raised an arm at Christie’s auction house. But, if you’re partial to the odd extravagance, there’s a first edition of Nicolaus Copernicus’s De Revolutionibus Orbium Coelestium (“On the Revolutions of Celestial Spheres”) up for grabs. It’ll probably cost you around a million dollars.

Bidding for the 1543 volume starts on 17 June, and I expect it will end up in the vault of some blasé collector. No-one will ever read it, but then it is in Latin, and who understands that these days? Nil desperandum, though, that’s what I like to say.

Still, I know of at least one physicist who would love to get his hands on it. Owen Gingerich, a historian of astronomy from Harvard University, has spent years tracing copies of Copernicus’s masterpiece, partly as an exercise for a book he wrote in 2004. A first edition would be the darling possession on his mantelpiece. “There aren’t that many copies in private hands these days,” he lamented on the phone to me a few moments ago.

Nowadays Gingerich finds solace in a second edition. Although considerably less valuable, it does have annotations by Rheticus, the young mathematician who persuaded Copernicus to publish his radical ideas. Gingerich did get the opportunity a few years ago to buy a bona fide first edition for $50,000, which would have been a good investment but which unfortunately would have required him to re-mortgage his house.

Will Gingerich put in a bid at Christie’s this time round? “I figure that even if I had it I’d have to rent a bank safety deposit box to keep it in,” he says. “So I’ll give it a pass.”

Copyright © 2025 by IOP Publishing Ltd and individual contributors