Jon Cartwright: May 2008 Archives

There’s been another development in the nascent field of iron-based high-temperature superconductors, which were recently shown to superconduct at the very respectable temperature of 55 K.

Scientists at the National Institute of Standards and Technology (NIST) in the US have used neutron beams to investigate the magnetic properties of the iron-based materials. They found that, at low temperatures and when undoped, the materials make a transition into an antiferromagnetic state in which magnetic layers are interspersed with non-magnetic layers. But when the materials are doped with fluorine to make them into high-temperature superconductors, this magnetic ordering is suppressed.

This is reminiscent of the behaviour of the cuprates — the highest-temperature superconductors known to date. Is this more than a coincidence? We’ll have to wait and see.

The research is published online here in Nature.

Robert Aymar, the director-general of CERN, has said that the Large Hadron Collider (LHC) — the world’s biggest particle physics experiment — will be in “working order” by the end of June, according to the French news agency Agence France-Presse (AFP).

It is not clear what Aymar means by this, given that the last announcement from CERN was for a July start-up. It seems unlikely that the LHC has raced ahead of schedule, so it might be that he thinks the cooling of the magnets will be complete by the end of June. However, the status report on the LHC website would indicate otherwise.

I spoke to a press officer at CERN, and she said that the AFP journalists quoted Aymar from a recent meeting they had at the European lab. She said that, as far as she is aware, the beam commissioning is still set to take place in July.

I have not yet spoken to James Gillies, the chief spokesperson for CERN, because he is tied up in meetings all day. When he gets back to me, I will give you an update.

UPDATE 3.15pm: I have just spoken to Gillies and he said that there is no change to the start-up schedule — the plan is still to begin injecting beams towards the end of July. Aymar was indeed referring to the cooling of the magnets, which should be complete by the end of June. Four of the eight sectors have already been cooled to their operating temperature of 1.9 K; the last (sector 4–5) began the cooling process today.

The reason for the gap between cooling and beam injection is that a series of electrical tests, taking around four weeks, must first be carried out.


On 23 March 1989 Martin Fleischmann of the University of Southampton, UK, and Stanley Pons of the University of Utah, US, announced that they had observed controlled nuclear fusion in a glass jar at room temperature, and — for around a month — the world was under the impression that its energy woes had been remedied. But, even as other groups claimed to repeat the pair’s results, sceptical reports began to trickle in. An editorial in Nature predicted that cold fusion would prove unfounded. And a US Department of Energy (DOE) report judged that the experiments did “not provide convincing evidence that useful sources of energy will result from cold fusion.”

This hasn’t prevented a handful of scientists from persevering with cold-fusion research. They stand on the sidelines, diligently getting on with their experiments and, every so often, waving their arms frantically when they think they have made some progress.

Nobody notices, though. Why? These days the mainstream science media wouldn’t touch cold-fusion experiments with a barge pole. They have learnt their lesson from 1989, and now treat “cold fusion” as a byword for bad science. Most scientists* agree, and some even go so far as to brand cold fusion a “pathological science” — science that is plagued by falsehood but practised nonetheless.

[*CORRECTION 29/05/08: It has been brought to my attention that part of this last sentence appears to be unsubstantiated. After searching through past articles I have to admit that, although it has been written frequently, I can find no factual basis for the claim that “most scientists” think cold fusion is bad science (although public scepticism is evidently rife). Indeed, there have been surveys to suggest that scientific opinion is more likely divided. According to a 2004 report by the DOE, which you can read here, ten out of 18 scientists thought that the results of cold-fusion experiments to date warranted further investigation.]

There is a reasonable chance that the naysayers are (to some extent) right and that cold-fusion experiments in their current form will not amount to anything. But it’s too easy to be drawn in by the crowd and overlook a genuine breakthrough, which is why I’d like to let you know that one of the handful of diligent cold-fusion practitioners has started waving his arms again. His name is Yoshiaki Arata, a retired (now emeritus) physics professor at Osaka University, Japan. Yesterday, Arata performed a demonstration at Osaka of one of his cold-fusion experiments.

Although I couldn’t attend the demonstration (it was in Japanese, anyway), I know that it was based on reports published here and here. Essentially Arata, together with his co-researcher Yue-Chang Zhang, uses pressure to force deuterium (D) gas into an evacuated cell containing a sample of palladium dispersed in zirconium oxide (ZrO2–Pd). He claims the deuterium is absorbed by the sample in large amounts — producing what he calls dense or “pycno” deuterium — so that the deuterium nuclei become close enough together to fuse.
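For context, these are the textbook deuterium–deuterium fusion channels (standard values, not specific to Arata’s papers). As I understand it, cold-fusion proponents generally have to invoke the helium-4 branch, because the claimed excess heat comes without the neutron flux the first channel would imply:

```latex
% Standard D-D fusion channels in vacuum (kinetic energies in brackets):
\begin{align}
  \mathrm{D} + \mathrm{D} &\;\rightarrow\; {}^{3}\mathrm{He}\,(0.82\ \mathrm{MeV}) + \mathrm{n}\,(2.45\ \mathrm{MeV}) \\
  \mathrm{D} + \mathrm{D} &\;\rightarrow\; \mathrm{T}\,(1.01\ \mathrm{MeV}) + \mathrm{p}\,(3.02\ \mathrm{MeV}) \\
  \mathrm{D} + \mathrm{D} &\;\rightarrow\; {}^{4}\mathrm{He} + \gamma\,(23.85\ \mathrm{MeV})
  \quad \text{(heavily suppressed in vacuum)}
\end{align}
```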

So, did this method work yesterday? Here’s an email I received from Akito Takahashi, a colleague of Arata’s, this morning:

“Arata’s demonstration…was successfully done. There came about 60 people from universities and companies in Japan and few foreign people. Six major newspapers and two TV [stations] (Asahi, Nikkei, Mainichi, NHK, et al.) were there…Demonstrated live data looked just similar to the data they reported in [the] papers…This showed the method highly reproducible. Arata’s lecture and Q&A were also attractive and active.”

I also received a detailed account from Jed Rothwell, who is editor of the US site LENR (Low Energy Nuclear Reactions) and who has long thought that cold-fusion research shows promise. He said that, after Arata had started the injection of gas, the temperature rose to about 70 °C, which according to Arata was due to both chemical and nuclear reactions. When the gas was shut off, the temperature in the centre of the cell remained significantly warmer than the cell wall for 50 hours. This, according to Arata, was due solely to nuclear fusion.

Rothwell also pointed out that Arata performed three other control experiments: hydrogen with the ZrO2–Pd sample (no lasting heat); deuterium with no ZrO2–Pd sample (no heating at all); and hydrogen with no ZrO2–Pd sample (again, no heating). Nevertheless, Rothwell added that Arata neglected to mention certain details, such as the method of calibration. “His lecture was very difficult to follow, even for native speakers, so I may have overlooked something,” he wrote.

It will be interesting to see what other scientists think of Arata’s demonstration. Last week I got in touch with Augustin McEvoy, a retired condensed-matter physicist who has studied Arata’s previous cold-fusion experiments in detail. He said that he has found “no conclusive evidence of excess heat” before, though he would like to know how this demonstration turned out.

I will update you if and when I get any more information about the demonstration (apparently there might be some videos circulating soon). For now, though, you can form your own opinions about the reliability of cold fusion.

You might recall that a while back physicsworld.com reported on a prediction of a peculiar event that takes place on the two equinoxes. On 20 March and 22 September (or thereabouts), at two places on the Earth’s surface, many of the gravitational forces in the Milky Way should cancel out.

Such a quiet time in the turmoil of our galaxy provides an ideal opportunity for a ruthless test of Newton’s laws of motion. Some physicists think that if there were any deviation in the laws at very low accelerations, it would mean that dark matter — the elusive substance thought to make up most of the matter in the universe and the dream catch of experiments worldwide — does not exist. Instead, all the phenomena attributed to dark matter could be explained by a slight alteration in the laws, known as modified Newtonian dynamics (MOND).
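In case you’re wondering what “a slight alteration in the laws” actually looks like, here is the textbook statement of Milgrom’s idea (my summary, not taken from Ignatiev’s paper): Newton’s second law picks up an interpolating function that only matters at tiny accelerations.

```latex
% Milgrom's modified Newtonian dynamics (MOND):
F = m\,\mu\!\left(\frac{a}{a_0}\right) a,
\qquad
\mu(x) \approx
\begin{cases}
  1, & x \gg 1 \quad \text{(ordinary Newtonian regime)} \\
  x, & x \ll 1 \quad \text{(deep-MOND regime)}
\end{cases}
\qquad
a_0 \approx 1.2 \times 10^{-10}\ \mathrm{m\,s^{-2}}
```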

When Alex Ignatiev of the Theoretical Physics Research Institute in Melbourne, Australia, came up with the idea for the equinoctial experiment, there were a couple of problems with his proposal. First, there was a worry that stray icebergs, at the high latitudes where one of the experiments would have to be performed, might produce a false gravitational signal. Second, Ignatiev did not know the exact time at which the desired signal would occur.

Now, in a new paper, he has resolved both of these problems. He has shown that even the biggest icebergs would not produce a signal big enough to confuse the data. And he has also shown how to predict the exact signal times.
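To get a feel for why the equinoctial spots are so special, here’s a back-of-the-envelope check of my own (order-of-magnitude values only, not from Ignatiev’s paper). The galactic acceleration is of the same order as the MOND scale a0, so the far larger everyday accelerations a laboratory experiences must cancel to roughly one part in 10^8 before any MOND effect could reveal itself:

```python
# Rough comparison of the accelerations involved in Ignatiev's proposal.
# Values are textbook orders of magnitude, not taken from his paper.

a0 = 1.2e-10            # MOND acceleration scale, m/s^2 (Milgrom's value)

# Centripetal acceleration of a point on the equator due to Earth's spin
omega_earth = 7.292e-5  # Earth's angular velocity, rad/s
R_earth = 6.371e6       # Earth's radius, m
a_spin = omega_earth**2 * R_earth          # ~3.4e-2 m/s^2

# Centripetal acceleration of Earth's orbit around the Sun
v_orbit = 2.98e4        # orbital speed, m/s
r_orbit = 1.496e11      # 1 AU, m
a_orbit = v_orbit**2 / r_orbit             # ~5.9e-3 m/s^2

# Centripetal acceleration of the Sun's orbit around the Galaxy
v_gal = 2.2e5           # galactic orbital speed, m/s
r_gal = 2.6e20          # ~8.5 kpc, m
a_gal = v_gal**2 / r_gal                   # ~1.9e-10 m/s^2

print(f"a_spin  = {a_spin:.1e} m/s^2")
print(f"a_orbit = {a_orbit:.1e} m/s^2")
print(f"a_gal   = {a_gal:.1e} m/s^2 (same order as a0 = {a0:.1e})")
print(f"needed cancellation: one part in ~{a_spin / a0:.0e}")
```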

One of the referees for Ignatiev’s paper has given the proposal a ringing endorsement: “MOND is the leading alternative to cosmic dark matter. It has passed a surprising number of astronomical tests and is desperately in need of laboratory tests. The author’s idea for testing MOND in a terrestrial setting is the only viable suggestion I’ve ever heard for such a possibility. This is an incredibly important problem, and deserves to be explored just as much as CDMS and the many other dark matter search experiments.”

The dusty cosmos



Astrophysicists have a better idea of how dust obscures the light from galaxies, according to a paper published in Astrophysical Journal Letters.

It is already well known that dust, which permeates all galaxies, attenuates the light reaching Earth from the cosmos. It absorbs light of most wavelengths and then re-emits it as a blanket of infrared radiation. Now, Simon Driver of St Andrews University in the UK and colleagues have produced the first model that accounts for this absorption.

One of the model’s implications — that dust absorbs just under half the radiation produced by stars — will not be a surprise to astronomers. They already know this, having compared the average magnitude of the infrared radiation in the sky with the magnitude of the radiation from pinpoint sources like stars and galaxies. But what might be of interest is that Driver and colleagues can show how the dust affects the light output of galaxies depending on their orientation.
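As a toy illustration of that energy-budget reasoning (my own illustrative numbers, not taken from the paper): if dust absorbs a fraction f of all starlight, the light we see directly from stars is dimmed, on average, by −2.5 log10(1 − f) magnitudes:

```python
import math

# Hypothetical figure for illustration: dust absorbs just under half
# of the starlight, roughly the fraction quoted by Driver and colleagues.
f_absorbed = 0.45

# The astronomical magnitude scale defines a dimming of
# delta_m = -2.5 * log10(flux_seen / flux_emitted) magnitudes.
delta_m = -2.5 * math.log10(1.0 - f_absorbed)

print(f"average attenuation of direct starlight: {delta_m:.2f} mag")  # ~0.65
```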

I spoke with Alastair Edge of Durham University, who is familiar with Driver’s team’s work, and he was pleased that the researchers have managed to model the dust successfully. He followed up our conversation with an email: “The authors have made an important link between the observed properties of the galaxies we see from the light coming directly from their stars to the amount of long wavelength radiation we see coming from the dust within the galaxies. Obtaining a match between the energy absorbed and that re-radiated allows us to understand the global properties of galaxies in a more holistic fashion.”

[Image: pie chart of the poll results]

I’m sorry to say that, having taken a day’s leave on Monday, this snippet of news (above) about ScienceDebate 2008 escaped my attention. According to a poll conducted by Harris Interactive on behalf of ScienceDebate 2008 and Research!America, 85% of US adults agree that the presidential candidates should participate in a debate on science in the run-up to the November election.

(For those of you who have missed the protests of the 37,000 signatories of ScienceDebate 2008, see my last news story on their progress.)

Shawn Otto, CEO of ScienceDebate 2008, gave the following statement in a press release:

“This topic has been virtually ignored by the candidates, but this poll shows that Americans of all walks know how important science and technology are to our health and way of life. We’ve heard a lot about lapel pins and preachers. But tackling the big science challenges is critical to our children’s future — to the future of the country and the future of the planet. Americans want to know that candidates take these issues seriously, and the candidates have a responsibility to let voters know what they think.”

The poll also shows that:

  • 67% of adults think scientific research has contributed either “a lot” or “a great deal”
  • 67% think that scientific evidence, rather than personal belief, should influence science policy
  • 69% rate alternative energy as one of the most serious long-term issues
  • 53% rate climate change as one of the most serious long-term issues

You can read more here.


I wonder if many other scientists under the wing of the Holy See agree with Jose Gabriel Funes, the head of the Vatican observatory, or whether he’s something of a radical.

In an interview in yesterday’s edition of the Vatican newspaper L’Osservatore Romano, Funes not only admits that he believes in the Big Bang model of the universe’s creation, he states that humans should be open to the possibility of alien life. “Just as there’s a multiplicity of creatures on earth,” he says, “there can be other beings, even intelligent, created by God.”

To be clear, Funes is in no way dismissing the first two chapters of Genesis. In fact, he sees “no contrast” between the notion of aliens and the Catholic faith. The other beings might also be worshipping God, he says.

The interview is headlined “The extraterrestrial is my brother”.

Astrophysicists have known for more than three decades that black holes shouldn’t be totally black — they should emit a certain amount of “Hawking radiation” from the production of particle–antiparticle pairs around their event horizons. But detecting Hawking radiation has so far proved tricky, mostly because its temperature would be at least eight orders of magnitude lower than the cosmic microwave background left over from the Big Bang.
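A quick back-of-the-envelope check of that claim, using the standard formula for the Hawking temperature of a non-rotating black hole:

```python
import math

# Hawking temperature of a Schwarzschild black hole:
#   T = hbar * c^3 / (8 * pi * G * M * k_B)
hbar = 1.0546e-34   # reduced Planck constant, J s
c = 2.998e8         # speed of light, m/s
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
k_B = 1.381e-23     # Boltzmann constant, J/K
M_sun = 1.989e30    # solar mass, kg

def hawking_temperature(mass_kg):
    """Hawking temperature (kelvin) of a black hole of the given mass."""
    return hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)

T_bh = hawking_temperature(M_sun)   # ~6e-8 K for a solar-mass hole
T_cmb = 2.725                       # cosmic microwave background, K

print(f"T_Hawking = {T_bh:.1e} K")
print(f"the CMB is ~{T_cmb / T_bh:.0e} times hotter")  # ~4e7, more for heavier holes
```

Astrophysical black holes weigh in at a solar mass or more, so their Hawking glow sits a factor of tens of millions or more below the microwave background, which is why laboratory analogues are so appealing.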

One way round this problem, as Ulf Leonhardt and colleagues from the University of St Andrews, UK, demonstrated earlier this year, might be to create systems that are analogous to black holes in the lab in which the temperature of the radiation is much higher. The researchers showed that a pulse of light travelling through a fibre can behave like a black hole, and, although they didn’t actually detect Hawking radiation, they showed that in principle it should be possible.

Now, in a paper published today in the New Journal of Physics, it seems as though Leonhardt’s group is one step closer. Rather than using pulses of light as the black-hole analogue, they have built a system based on water waves. I confess that I haven’t yet studied the paper carefully enough to describe with any certainty what the researchers have done; suffice it to say that they claim to have observed “negative-frequency” waves — the classical analogue of the antiparticles that are the hallmark of Hawking radiation.
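For those who want a little more detail, here is the standard dispersion argument for how negative frequencies can arise for waves fighting a counter-flow. This is a sketch of the general idea, not necessarily the exact configuration of the St Andrews experiment:

```latex
% Water waves of wavenumber k riding against a flow of speed U (depth h).
% In the frame of the moving water the intrinsic frequency is
% Omega(k) = sqrt(g k tanh(k h)); in the lab frame it is Doppler-shifted:
\omega_{\mathrm{lab}}(k) = \Omega(k) - U k,
\qquad
\Omega(k) = \sqrt{g\,k\,\tanh(k h)}
% The lab-frame frequency turns negative wherever the flow outruns the
% waves' phase speed, U > Omega(k)/k, the analogue of a horizon.
```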

In a brief email conversation last week, Leonhardt told me that they are not yet sure whether this is enough to constitute an observation of a classical analogue of Hawking radiation: “Hawking’s effect is a quantum phenomenon, a spontaneous quantum process, but like all spontaneous processes it can be stimulated. This is what we did, we sent in waves and saw a tiny bit of stimulated negative-frequency waves, but there are quantitative differences between experiment and theory that we do not understand yet.”

Of course, if and when Leonhardt’s group do find negative-frequency waves that agree with theory, there will be a debate as to whether they are “real” Hawking radiation. No doubt you will be seeing more of this on physicsworld.com soon.

“So what would you do if string theory is wrong?” asks string theorist Moataz Emam of Clark University, US, in a paper posted on arXiv yesterday. It’s obvious, you might think: string theorists would briefly mourn the 40 years of misspent speculation and leave furtively through the back door, while anti-string theorists would celebrate their vindication.

Not so, says Emam — string theory will continue to prosper, and might even become its own discipline independent of physics and mathematics.

Oddly, the reason Emam gives for this prediction is precisely the same reason why many physicists despise string theory. For example, in reducing the 10 dimensions of string theory to our familiar four, string theorists have to fashion a “landscape” of at least 10^500 solutions. Emam says that such a huge number of solutions — only one of which corresponds to our universe — may make string theory unattractive, but that in studying them physicists are gaining “deep insights into how a physical theory generally works”:

So even if someone shows that the universe cannot be based on string theory, I suspect that people will continue to work on it…The theory would be studied by physicists and mathematicians who might no longer consider themselves either. They will continue to derive beautiful mathematical formulas and feed them to the mathematicians next door. They also might, every once in a while, point out interesting and important properties concerning the nature of a physical theory which might guide the physicists exploring the actual theory of everything over in the next building.

Peter Woit, author of the string-theory polemic Not Even Wrong, notes on his blog that physicists looking to pursue string theory for its beauty should “go and work in a maths department”:

The argument Emam is making reflects in somewhat extreme form a prevalent opinion among string theorists, that the failure of hopes for the theory, even if real, is not something that requires them to change what they are doing. This attitude is all too likely to lead to disaster.

[Image: Alia Sabur]

According to the Guinness Book of World Records, and what appear to be most major media outlets, Alia Sabur (pictured above) has broken the record for the world’s youngest professor.

Sabur, 19, will begin teaching physics next month at the Department of Advanced Technology Fusion at Konkuk University, Korea. It will be just another entry on the teenager’s already laden CV, which reveals she received a bachelor’s degree at 14 and a master’s in materials science at 17.

Something might be awry here, though. There’s nothing wrong with the media adopting the American English definition of “professor” (i.e. any university teacher) — after all, Sabur was born in New York. But it appears that the previous record holder was the Scottish physicist Colin Maclaurin, who was appointed professor of mathematics at the University of Aberdeen in 1717, when he was a few months over 19.

I might have to explain to our international readers that in the UK “professor” is a more distinguished title, reserved for heads of department and the like. (At least it has been as far back as any of us at Physics World can vouch for.) Sabur, I note, has yet to defend her PhD.

Does this mean the titles of Sabur and Maclaurin are being confused? Does Maclaurin, who is credited with the mathematical “Maclaurin series”, deserve to keep his accolade?

Of course, science was a considerably narrower discipline back in the 18th century, and achieving a professorship might have taken a little less time than it does today (it certainly wouldn’t have required a PhD). But Maclaurin can’t defend his honour, and offhand I don’t know enough about science in the early 1700s to cast a vote either way.

Do any of you have any thoughts? Feel free to comment below.