
'Abundant health from radioactive waste'

By Hamish Johnston

Earlier this week I received a press release about a paper entitled ‘Abundant health from radioactive waste’, which was published today in the International Journal of Low Radiation.

Not surprisingly, this set the alarm bells ringing, but I couldn’t resist following it up.

The paper is by Don Luckey who is Emeritus Professor of Biochemistry at the University of Missouri. Luckey is a proponent of “radiation hormesis” — the idea that small doses of radiation can actually be good for you, even if much larger doses will kill you.

In his paper, Luckey goes so far as to suggest that schools be built “in used nuclear power plants”, and children be given sculptures that are impregnated with nuclear waste to boost their exposure to radiation (and their health). He does caution, “However, children should not ride [sculptures of] radioactive ponies for more than a few minutes every day”.

I had never heard of radiation hormesis, so I got in touch with several health physicists in the UK and I was genuinely surprised to get a mixed verdict on the theory. Although they all agreed that hormesis was at the fringes of health physics, some did say that there could be something to it.

Indeed, I was told that the theory has a small but very vocal group of supporters, particularly in France, Japan and the US, who have been lobbying the International Commission on Radiological Protection to look into revising its Linear No-Threshold (LNT) principle. The LNT maintains that there is no exposure level below which radiation has no harmful effects (although these effects are extremely small at very low levels).

The reality is that it is very difficult to understand the effects — good or bad — of very low levels of radiation. As a result, the literature is full of seemingly conflicting reports and scientists who have a passionate belief in radiation hormesis can pick and choose studies that support the theory, while dismissing those that don’t.

A case in point is the controversial 1995 study by Bernard Cohen, which suggested that people living in parts of the US with high levels of the radioactive gas radon tend to be less likely to die from lung cancer — strong evidence for radiation hormesis, according to Luckey. However, in 2003, Jerry Puskin showed that this could be explained by considering the different rates of smoking in these regions — something that Luckey seems to have ignored in his latest paper.

So, will my children be playing on a radioactive pony? I don’t think so!

Combination technique shows the strain

Generally you don’t want to put engineering materials under too much strain. But in transistors and other electronic devices just the right amount can improve the mobility of charge carriers, making them work faster with less energy loss. The image above is a two-dimensional map of the strain in a silicon transistor on a substrate. Bluer areas indicate where the silicon lattice has been compressed while yellower areas indicate where it has been stretched. “Without measurements, you can’t know what you’ve done is what you think you’ve done,” explains Martin Hÿtch at the National Centre for Scientific Research (CNRS) in France.

Hÿtch and colleagues at CNRS are able to produce such strain maps because they have successfully combined the existing “moiré” technique with electron holography. In the moiré technique, a coherent electron beam is sent through a strained sample that has been stacked on top of an unstrained sample. Each sample produces its own diffracted beam, and the interference between the beams creates fringes that reveal the samples’ relative strain.

Normally, the moiré technique gives only a modest spatial resolution, and it is difficult to stack nano-sized samples on top of each other. However, Hÿtch’s group places the samples side by side and, as in electron holography, interferes the diffracted beams using a “biprism” (Nature 453 1086). This method produces fringes that give both a high spatial resolution (down to 4 nm) and a wide field of view (up to 1 µm).
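To see why moiré fringes are such a sensitive probe of strain, here is a minimal sketch of the textbook fringe-spacing relation (my own illustrative numbers, not figures from Hÿtch’s paper): two lattices with slightly different spacings d₁ and d₂ produce fringes spaced d₁d₂/|d₁ − d₂| apart, which is far larger than either lattice spacing.

```python
def moire_fringe_spacing(d_ref, strain):
    """Spacing of the moire fringes formed by interfering diffracted beams
    from a strained lattice (spacing d_ref * (1 + strain)) with those from
    an unstrained reference lattice (spacing d_ref). Same length units out
    as in."""
    d_strained = d_ref * (1.0 + strain)
    return d_strained * d_ref / abs(d_strained - d_ref)

# Illustrative numbers: the silicon (220) plane spacing of about 0.192 nm
# under 1% compressive strain gives fringes roughly 19 nm apart -- a
# ~100-fold magnification that turns a tiny lattice distortion into
# something measurable in an electron microscope.
print(round(moire_fringe_spacing(0.192, -0.01), 2))
```

The smaller the strain, the wider the fringes, which is exactly what makes the technique sensitive at the sub-percent level.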

Renaissance man


By Matin Durrani

I was up in London yesterday at the headquarters of the Institute of Physics to listen to a talk by top quantum-information scientist Anton Zeilinger from the University of Vienna.

Zeilinger was giving the inaugural Isaac Newton lecture after being named the first recipient of the Institute’s Newton medal.

Unlike the Institute’s other medals, the Newton medal is awarded to “any physicist, regardless of subject area, background or nationality”, rather than to a physicist with specific links to the UK.

I’d say there were about 200 physicists in the audience to listen to Zeilinger whizz through topics like entanglement and decoherence — and how these have applications in quantum communication, quantum cryptography and quantum teleportation, some of which are being commercialized.

His basic message is that, thanks to various technological advances, we can now examine some of the fundamental questions in quantum mechanics that the likes of Heisenberg, Schrödinger, Bohr and Einstein posed as mere “thought experiments”, such as whether measurements on one particle can instantly affect an entangled partner a finite distance away. We are in fact living through a “quantum renaissance”.

Zeilinger and his colleague Markus Aspelmeyer are fleshing out these themes in an article to appear in the next issue of Physics World. I was delighted that he referred to the article several times, even flashing up a couple of figures that our art studio has redrawn from Zeilinger’s hand sketches.

After the lecture, I caught up with Zeilinger over champagne and quizzed him on the fact that he had put his neck firmly on the line when it comes to decoherence — the fact that fragile quantum states can be lost when they interact with the environment.

Having described how molecules as large as buckyballs can demonstrate quantum behaviour, Zeilinger had told the audience that he thinks “there is no limit” to how heavy, complex or warm a molecule can be while still showing quantum phenomena. “Decoherence won’t be a problem for molecules as large as viruses even at room temperature,” he speculated. “The limit is only one of money.”

After the lecture, delegates were treated to a concert by the Abram Wilson Jazz Quartet. Zeilinger is, apparently, something of a jazz buff.

No extra cash for UK physics

Any remaining hopes that the UK government might plug an £80m hole in the nation’s physics-research budget were dashed yesterday. In a 27-page response to a parliamentary select committee inquiry that had criticized the way the government and the Science and Technology Facilities Council (STFC) handled the 2008–2011 science budget allocations, the government’s Department for Innovation, Universities and Skills (DIUS) says it is “simply wrong” to suggest that the STFC budget was cut by the government in the first place.

Instead, the government has reiterated its claim that the STFC’s budget will increase by £185m (about 13.6%) over the three-year period, adding that the £80m figure was derived from “STFC aspirations” drawn up before receiving its budget. “There are no plans to move money around,” a DIUS spokesperson told physicsworld.com.

“It is a nonsense to say that the way the STFC announced we were pulling out of the ILC has not damaged our international standing” (Brian Foster, Oxford University)

Funding fiasco

In April this year a 14-strong panel of MPs on the Innovation, Universities, Science and Skills Select Committee came down heavily on the government and the STFC for their roles in the funding fiasco, which emerged late last year. In addition to prompting the STFC to announce it was pulling out of the International Linear Collider (ILC), the Gemini telescope and all ground-based solar terrestrial physics projects, research grants are expected to be slashed by up to 25% over the next three years as a direct result of the shortfall. This could cost tens if not hundreds of jobs and threaten observatories and experiments such as those at the Large Hadron Collider (LHC) at CERN.

The MPs’ report apportioned some of the blame to the way STFC was created in April 2007 by merging the Particle Physics and Astronomy Research Council (which awarded research grants) with the CCLRC (which managed scientific facilities). It claimed DIUS should have known that STFC would inherit the future operating costs of the ISIS neutron source and the brand new Diamond synchrotron at the Rutherford Appleton Lab in Oxfordshire, which amounted to £25m per year, and therefore should honour its original commitment to leave “no legacy funding issues” from the merger. But in its formal response to the select committee report, DIUS states that the STFC did not inherit a deficit from CCLRC and considers that the government “fully meets the commitment given at the time of STFC’s creation”.

International reputation

The government also believes that the UK remains a reliable international partner despite the committee’s findings that pulling out of the ILC and Gemini had made it look “unreliable” and “incompetent”. But European leader of the ILC project Brian Foster of Oxford University says the response is a non sequitur. “Like much of its response to the Select Committee, the government view regarding the ILC is actually very measured,” he told physicsworld.com. “But it is a nonsense to say that the way the STFC announced we were pulling out of the ILC has not damaged our international standing.”

“On the particle physics side, it looks like the consultation panel did an excellent job” (Tim Gershon, University of Warwick)

The report acknowledges that communication between STFC and its community and between government and the STFC could be improved, and says the government is working closely with STFC on the lessons learnt from the allocations process. Although it says that changes to the leadership of the STFC (which received damning criticism in the committee report) would only be “disruptive”, the council has agreed to ministers’ request to conduct an independent organisational review to assess its effectiveness. This is due to report in September, along with another government-appointed review of UK physics in general chaired by Bill Wakeham of Southampton University.

In the meantime, the STFC has completed a major consultation with the scientific community to help determine which projects should take priority in the face of inevitable cuts. The original “programmatic review” announced in February caused uproar, with the ranking procedure (which was carried out largely behind closed doors by two panels of around 10 researchers: PPAN for particle physics, nuclear physics and astronomy; and PALS for all other STFC programmes) leaving many physicists baffled.

In particular, the LHCb experiment — one of the four giant detectors around the LHC ring that is about to start taking data, and one with a large UK contingent — found itself in the “medium–low” category, while another LHC experiment, ALICE, and the e-Merlin radio telescope were classed in the “lower” category. The latter led to widespread concern about the future of the Jodrell Bank observatory.

Specialist panels

Some 1400 letters and emails sent in response to the review have now been digested by 10 specialist panels, and last week the STFC posted PPAN’s and PALS’s revised rankings on its website. The effort mostly seems to have paid off.

For each of the projects in the review, PPAN worked out an “alpha grade” where alpha 5 is the highest and alpha 1 means the project is unlikely to be funded. Overall the rankings have not changed that much, but significantly LHCb has been bumped up to alpha 4, ALICE to alpha 3, and e-Merlin notched up slightly to alpha 2. The MINOS neutrino experiment at Fermilab in the US has also been boosted to alpha 2, and the panel recommended that R&D for ILC detectors should be ramped down slowly in order to maximize their scientific value. The STFC executive expects to take a final decision at a meeting on July 1.

“Overall there is still a lot of uncertainty about how much money physics departments are going to lose from existing grants” (Mike Green, Royal Holloway, University of London)

“On the particle physics side, it looks like the consultation panel did an excellent job,” says Tim Gershon of the University of Warwick. “The only problems remain in areas where PPAN has decided not to accept its recommendations, such as the decision not to fund the BaBar experiment and to cut LHCb at 10% rather than the recommended 5%. This is hard to understand given that the consultation panel had worked through the costings [e.g. by shaving costs off other projects].”

Mike Green of Royal Holloway, University of London, who was on the particle physics panel, says that even though a 10% cut on LHCb is better than the original 25%, it still means that the UK may have to hand over some of its crucial detector responsibilities to another country.

“Overall there is still a lot of uncertainty about how much money physics departments are going to lose from existing grants, which is making it very difficult to plan ahead,” says Green, who is also a member of the Particle Physics Action Group set up in response to the funding crisis. “While we welcome the government’s and STFC’s commitment to improve communications and engagement, let us not forget that the government continues to insist that STFC received a 13.6% increase in its budget when this fails to recognize that a substantial fraction of this increase represents past investment. The future ‘near-cash’ increase in the STFC budget is 8%, which is the lowest of all the research councils and will mostly be swallowed up to provide for full economic costing.”

NMR sheds new light on protein interactions

Biophysicists in Germany and the US have used nuclear magnetic resonance to gain an important new insight into how proteins interact.

The researchers were, for the first time, able to map out all the different shapes that the protein ubiquitin can take over a period of several microseconds. The team found that many of these shapes are nearly identical to those adopted by ubiquitin when it binds with other proteins — a discovery that suggests the prevailing theory of protein binding is incomplete, and could pave the way for new types of drugs (Science 320 1471).

All living things contain proteins and many biological processes involve these chain-like molecules binding with one another. Biophysicists know that a bound protein appears to have a different shape than its free counterpart — but understanding exactly how this change in shape occurs has proven very difficult.

For more than 50 years, most biophysicists subscribed to the “induced fit” theory, whereby a free protein is coaxed by its partner to undergo a gradual change in its shape during the binding process.

Many different shapes

However, biophysicists are beginning to realize that free proteins fluctuate between many different shapes in the absence of a binding partner — and it could be that binding simply occurs when a free protein spontaneously assumes the correct shape.

This theory of “conformational selection” had been hard to establish because standard techniques such as X-ray diffraction cannot identify the large number of short-lived shapes that a free protein can adopt.

Now, Bert de Groot and colleagues at the Max Planck Institute for Biophysical Chemistry in Goettingen and Vanderbilt University in Nashville have used nuclear magnetic resonance (NMR) to map out all the possible shapes of free ubiquitin, which is a protein found in many living cells.

The team used a relatively new NMR technique that can follow the positions of atoms in a molecule on timescales of up to several microseconds. This is much longer than is possible with traditional NMR, which can only detect motion on nanosecond timescales — not long enough to get a good look at all of the possible shapes the protein can adopt, according to de Groot.

Matching 46 bound structures

The team then compared this “structural ensemble” of free protein shapes with the 46 shapes that ubiquitin is known to adopt when bound within larger structures. They discovered that every one of the bound shapes also occurred in the free protein.

De Groot told physicsworld.com that the result implies that “no induced-fit motions are required for ubiquitin to adapt to its different binding partners”.

Writing in the same issue of Science, David Boehr and Peter Wright of the Scripps Research Institute in California point out that the NMR work does not necessarily mean that the induced fit theory is incorrect (Science 320 1429). Rather, it is possible that both mechanisms are involved in the binding process. They also point out that de Groot’s results suggest that structural fluctuations could play an important role in how the function of a protein evolves over time.

De Groot added that the findings will shed light on a number of processes that involve protein binding including biochemical signalling processes, and receptor-ligand recognition. Both of these processes are important to those designing new drugs, and de Groot believes that the team’s work could “contribute to the design of novel drugs”.

Two new reactors for Canada


By Hamish Johnston

Summer can be a miserable time in Toronto. It can get very hot and humid, causing folks to turn up their air conditioning, which in turn puts the region’s coal-fired generating plants into overdrive, blanketing the city in a sickly yellow smog that can harm those with breathing difficulties.

As a result, local utilities have begun to shut down aging coal-fired plants in an attempt to improve air quality — but this leaves some wondering where the city and the surrounding province of Ontario will get their electricity.

Now, Ontario’s Minister for Energy Gerry Phillips has given the go-ahead for two new nuclear reactors to be built at the Darlington generating plant just east of the city, which is already home to four reactors. These are the first power reactors to be built in Canada in over 15 years.

According to the Toronto Star, the reactors will come online in 2018 and the design will be chosen in November from a short list of three firms: Atomic Energy of Canada; US-based Westinghouse; and Areva of France.

The reactors are expected to generate about 3200 MW of power, which will double Darlington’s current capacity.

The move is part of Ontario’s CDN$26 billion plan to maintain its current nuclear capacity of 14,000 MW through a series of upgrades over the next 20 years.

Darlington is located in a part of Ontario that has been hard hit by lay-offs in the automotive industry, so Phillips may be hoping that the promise of 3500 new jobs will offset the concerns of environmentalists.

A real gem on arXiv

By Hamish Johnston

I spent an hour or so this morning trawling through the arXiv preprint server, where many physicists post their research results before they are published formally. It’s a great way to keep up with the latest breakthroughs — and a good source of more controversial or off-beat stories.

That’s where I spotted this gem: “Growth of Diamond Films from Tequila” by Javier Morales, Miguel Apátiga and Victor M Castaño, who are physicists based in (you guessed it) Mexico.

It seems that the three physicists have used the famous spirit in their chemical vapour deposition (CVD) machine to create tiny diamonds.

Although the paper notes that diamonds have already been made by CVD using a number of other precursors, the trio suggest that tequila provides “an excellent alternative to produce industrial-scale diamond thin films for practical applications using low-cost precursors”.

It’s not clear from the paper why tequila was used rather than vodka, gin or whisky — and of course, if this paper was entitled “Growth of Diamond Films from Water and Ethanol”, I wouldn’t have given it a second thought.

Cold-fusion demonstration: an update

By Jon Cartwright

Several of you have asked when I’m going to give you an update on Yoshiaki Arata’s cold-fusion demonstration that took place at Osaka University, Japan, three weeks ago. I have not yet come across any other first-hand accounts, and the videos, which I believe were taken by people at the university, have still not surfaced.

However, you may have noticed that Jed Rothwell of the LENR library website has put some figures with explanations relating to Arata’s recent work online. I’ve decided to go over them and some others here briefly to give you an idea of how Arata’s cold-fusion experiments work. It’s a bit more technical than usual, so get your thinking caps on.

ColdFusionFig3.jpg

Above is a diagram of his apparatus. It comprises a stainless-steel cell (2) containing a sample, which in the case of the demonstration was palladium dispersed in zirconium oxide (ZrO2–Pd). Arata measures the temperature of the sample (Tin) using a thermocouple mounted through its centre, and the temperature of the cell wall (Ts) using a thermocouple attached on the outside.

Let’s have a look at how these two temperatures, Tin and Ts, change over the course of Arata’s experiments. The first graph below is one of the control experiments (performed in July last year) in which hydrogen, rather than deuterium, is injected into the cell via a valve (5) operated by the controller (8):

ColdFusionFig2.jpg

At 50 minutes — after the cell has been baked and cooled to remove gas contamination — Arata begins injecting hydrogen into the cell. This generates heat, which Arata says is due to a chemical reaction, and the temperature of the sample, Tin (green line), rises to 61 °C. After 15 minutes the sample can apparently take no more hydrogen, and the sample temperature begins to drop.

Now let’s look at the next graph below, which is essentially the same experiment but with deuterium gas (performed in October last year):

ColdFusionFig1.jpg

As before, Arata injects the gas after 50 minutes, although it takes a little longer for the sample to become saturated, around 18 minutes. This time the sample temperature Tin (red line) rises to 71 °C.

At a quick glance the temperatures in both graphs, after saturation, appear to peter out as one would expect if heat escapes to the environment. However, in the case of deuterium there is always a significant temperature difference between Tin and Ts, indicating that the sample and cell are not reaching equilibrium. Moreover, after 300 minutes the Tin of the deuterium experiment is about 28 °C (4 °C warmer than ambient), while both Tin and Ts of the hydrogen experiment are at about 25 °C (1 °C warmer than ambient).

These results imply there must be a source of heat from inside the cell. Arata claims that, given the large amount of power involved, this must be some form of fusion — what he prefers to call “solid fusion”. This can be described, he says, by the following equation:

D + D → ⁴He + heat

(According to this equation, there should be no neutrons produced as by-products — thanks to those of you who pointed this out on the last post.)
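As a quick consistency check (my own back-of-the-envelope arithmetic, not a figure from Arata’s paper), the “heat” term in this reaction follows from the mass defect: two deuterium atoms are slightly heavier than one helium-4 atom, and standard atomic masses put the difference at roughly 23.8 MeV per reaction.

```python
# Q-value of D + D -> 4He estimated from standard atomic masses
# (illustrative arithmetic only, not data from Arata's experiment).
U_TO_MEV = 931.494          # energy equivalent of 1 atomic mass unit, in MeV
M_DEUTERIUM = 2.014101778   # atomic mass of deuterium, in u
M_HELIUM4 = 4.002602        # atomic mass of helium-4, in u

q_value_mev = (2 * M_DEUTERIUM - M_HELIUM4) * U_TO_MEV
print(round(q_value_mev, 1))  # about 23.8 MeV released per fusion event
```

That enormous energy-per-reaction figure is why Arata argues the observed heat cannot be chemical in origin; whether fusion is actually occurring is, of course, the contested point.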

If any of you are still reading, this graph below is also worth a look:

ColdFusionFig4.jpg

Here, Arata also displays data from deuterium and hydrogen experiments, but starts recording temperatures after the previous graphs finished, at 300 minutes. There are four plots: (A) is a deuterium and ZrO2–Pd experiment, like the one just described; (B) is another deuterium experiment, this time with a different sample; (C) is a control experiment with hydrogen, again similar to the one just described; and (D) is ambient temperature.

You can see here that the hydrogen experiment (C) reaches ambient temperature quite soon, after around 500 minutes. However, both the deuterium experiments remain 1 °C or more above ambient for at least 3000 minutes while still exhibiting the temperature difference between the sample and the cell, Tin and Ts.

Could this apparently lasting power output be used as an energy source? Arata believes it is potentially more important to us than hot or “thermonuclear” fusion and notes that, unlike the latter, it does not emit any pollution at all.

Disorder puts the brakes on matter waves

Two independent teams of physicists have used ultracold atomic gases to demonstrate how a little bit of disorder can paralyse a quantum system — an effect called “Anderson localization”. Their experiments mark the first time that this phenomenon has been seen in matter, despite being predicted to occur in crystalline solids 50 years ago.

Although the experiments were done in 1D optical lattices with non-interacting atoms, the teams claim that their methods could be expanded to investigate 2D and 3D systems or to study the effects of interactions on localization. This could allow such systems to be used as “quantum simulators” to gain insight into real materials that are difficult to probe experimentally or simulate numerically.

Much of what we know about the electronic properties of metals and semiconductors is based on the idea that electrons with certain momenta can travel freely through a crystalline lattice, while others cannot. This is embodied in Felix Bloch’s 1928 quantum theory of conduction, which describes the lattice as a periodic electric potential through which some electrons (behaving as “matter waves”) diffract with ease.

Disordered lattice

Some 30 years later, Philip Anderson worked out what would happen in such a system if the potential lost its periodicity. This could happen, for example, if the lattice remained periodic but the potential had a different value at each lattice site. He found that electrons would be unable to move through such a “disordered” lattice, instead becoming trapped at specific atoms. This prediction of “Anderson localization” won him the 1977 Nobel Prize in Physics.
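Anderson’s result can be illustrated with the standard one-dimensional tight-binding toy model (a generic textbook sketch, not the actual systems used in these experiments): iterating the wavefunction recursion along a chain with random on-site energies yields an exponential growth rate, the inverse localization length, which is essentially zero for a clean lattice but positive once disorder is switched on.

```python
import math
import random

def inverse_localization_length(n_sites, disorder, energy=0.0, seed=1):
    """Transfer-matrix estimate for the 1D Anderson model with hopping t = 1:
    psi[n+1] = (E - eps_n) * psi[n] - psi[n-1], where each on-site energy
    eps_n is drawn uniformly from [-W/2, W/2]. A positive growth rate per
    site signals exponentially localized states."""
    rng = random.Random(seed)
    psi_prev, psi = 1.0, 1.0
    log_growth = 0.0
    for _ in range(n_sites):
        eps = disorder * (rng.random() - 0.5)
        psi_prev, psi = psi, (energy - eps) * psi - psi_prev
        # Renormalize to avoid overflow; the accumulated log of the norm,
        # divided by the chain length, converges to the Lyapunov exponent.
        norm = abs(psi_prev) + abs(psi)
        log_growth += math.log(norm)
        psi_prev /= norm
        psi /= norm
    return log_growth / n_sites

# A clean lattice (W = 0) gives essentially zero growth rate, i.e. extended
# Bloch-like states; strong disorder (W = 4) pins the states to a few sites.
print(inverse_localization_length(100000, 0.0))
print(inverse_localization_length(100000, 4.0))
```

The same qualitative picture, extended states killed off by randomness, is what the two cold-atom experiments below observe directly.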

Although Anderson localization has been seen in a number of systems — including those based on light and microwaves — it has never been observed directly with electrons. This is because the lattice vibrations and electron–electron interactions that occur in real materials wash out the subtle effects of localization.

Now, two independent groups claim to have seen Anderson localization in matter waves for the first time. Both teams saw the phenomenon in Bose–Einstein condensates (BECs) made from ultracold atomic gases. In both systems it was individual atoms — rather than electrons — that became localized. Instead of being in a solid crystal, the atoms were in an optical lattice created by overlapping laser beams.

Speckled pattern

In the first experiment, Alain Aspect and colleagues at the University of Paris-Sud and CNRS’s Institut d’optique began by confining about 17,000 rubidium atoms in a magnetic trap to make a BEC (Nature 453 891). Turning the fields off makes the atoms quickly leave the centre of the trap. But if the BEC is placed in a lattice created by a laser “speckle” pattern — a random distribution of light and dark regions that is created when a laser beam is reflected from a rough surface — the atoms stay put when the magnetic fields are turned off. The team claims this effect is caused by Anderson localization.

Philippe Bouyer at Paris-Sud told physicsworld.com that the team is now looking at how to repeat their experiment in 3D — something that will require a major redesign to overcome (among other things) the effect of gravity pulling down on the atoms.

Meanwhile in Italy, Massimo Inguscio and colleagues at the University of Florence began with a 1D optical lattice that is a standing wave created by an intense laser beam (Nature 453 895). If a second, weaker laser with a different wavelength is shone into the lattice, the two beams interfere to create a 1D “quasicrystal” with a certain amount of disorder. The degree of disorder in the quasicrystal can be adjusted by changing the intensity of the second laser.

The team then loaded the centre of their lattice with potassium atoms and watched what happened for different degrees of disorder. Sure enough, when the second laser was switched off, some of the atoms began to diffuse away from the centre of the lattice, finding it easy to move through the periodic potential. However, when the experiment was repeated with the second laser switched on, Anderson localization kept the atoms in the centre of the lattice.

Interacting atoms

The Florence team are planning to repeat their experiment with interacting atoms. Team member Giovanni Modugno told physicsworld.com that this will be done by adjusting the applied magnetic fields that the team currently uses to eliminate interactions between atoms — exploiting a phenomenon known as “Feshbach resonances”.

Daniel Steck, a physicist at the University of Oregon, told physicsworld.com that “these experiments with cold atoms can certainly help in the study of disordered media”. He added, “in real materials, it’s very difficult to see coherent quantum effects directly, such as localization”.

Cylinders of silence


By Hamish Johnston

I have just updated the “Featured Journal” slot on physicsworld.com to include a paper published today that presents a “feasible” recipe for creating a metamaterial that could completely cloak an object from sound — at least in two dimensions.

Instead of absorbing and/or reflecting sound like traditional acoustic insulation, an acoustic cloak would guide sound waves around an object, much like water flowing around a stone in the middle of a stream. This analogy makes it easy to see why those charged with hiding submarines from sonar would like to get their hands on such a material.

But don’t expect to be able to cover the walls of your bedroom with the stuff and finally get a good night’s sleep, because the authors haven’t actually made the material yet (although if your neighbour’s microwave oven is keeping you awake, physicists have made a microwave cloak).

The paper is by Daniel Torrent and José Sánchez-Dehesa of the Polytechnic University of Valencia, Spain. I spoke with Sánchez-Dehesa earlier this year when I wrote a news story about a paper they published in February. This earlier work suggested that an acoustic cloak could be made by surrounding an object with an array of cylindrical rods. If the rods had the right elastic properties and their radii and spacing were varied in a specific way, silence would reign.
However, it looks like they have decided that this earlier design cannot be built using real materials. Now, they have refined their design to a multilayered composite of two different “sonic crystals” — with each crystal being a lattice of cylindrical rods.

Torrent and Sánchez-Dehesa have calculated that about 200 metamaterial layers would be required to cloak an object over a wide range of acoustic frequencies — although it’s not clear from the paper how thick this would be if real materials were used.

Both papers are published in the New Journal of Physics, which will be putting out a special issue on “Cloaking and Transformation Optics” later this year.

Copyright © 2025 by IOP Publishing Ltd and individual contributors