
Mind the hack

Late in the afternoon of Monday 19 June 2000, the US news website NASA Watch reported details of a classified briefing from NASA to the White House that concerned a potential landmark discovery in planetary science. Within 24 hours the news had spread onto other small websites, and after two days several of the world’s biggest media outlets, including the BBC and USA Today, were running the story on their homepages. Although the details were sketchy, the bottom line was clear: pictures from the Mars Global Surveyor spacecraft implied that the red planet had water on its surface.

The problem was that the stories appeared a little over a week early. In the eyes of the journal Science, in which the research was to be published, and NASA, which was to hold a press conference to announce the findings, NASA Watch had broken what is known as a science embargo — a system that aims to coordinate when the media report on new research.

As a result of the broken embargo, NASA decided to bring forward the press conference, while Science swiftly published the paper online to clarify the “varying degrees of accuracy” of the premature stories. But are the public and other scientists entitled to know about discoveries like this as soon as possible? Or is it right that research is embargoed, so that all journalists have an equal chance to report on it accurately?

Age-old system

Science embargoes date back to the emergence of dedicated science journalism in the 1920s. Around this time, the science editors of newspapers and news agencies began to demand advance copies of research papers in order to get to grips with difficult concepts and so prepare accurate stories. Publishers relented, and by the 1950s most elite journals were distributing preprints under embargo; in other words, stipulating that the editors should not run stories before the research papers were actually published.

Today, the two biggest journals employing embargo systems are Nature, published by Macmillan Publishers in the UK, and Science, published by the American Association for the Advancement of Science (AAAS). Both journals send preprints and summary press releases of upcoming research papers to journalists a little under a week in advance of publication; Science, for example, e-mails about 5400 journalists worldwide every week. In addition, media agencies such as EurekAlert!, also run by the AAAS, and AlphaGalileo, its European competitor, flood journalists’ inboxes daily with embargoed press releases from other journals and research institutions.


There is no legal obligation for journalists to adhere to embargoes, which is why the system is often referred to as a “gentleman’s agreement”: it frees journalists from the pressure of tight deadlines, while effectively guaranteeing journal publishers a spread of well-crafted news articles bearing the journal’s name. “I think [the embargo system] helps the media with complicated research,” says Ruth Francis, head of press relations at Nature. “It allows them time to gather information, speak to the researchers and know that they’ve got the story right.”

Not everyone agrees with this view. Some insist that the arbitrary timing of news coverage as dictated by embargoes gives the public the false impression that science is a progression of breakthroughs. Others object to the fact that embargoes ensure maximum coverage for a journal, because they allow slower reporters the time to catch up with those who are more able. “I mostly hate them,” says Dennis Overbye, science editor at the New York Times. “I think they serve the purpose of the journals very well. I’d just be happy to see them abolished.”

Media dissent

Given that embargoes are merely a gentleman’s agreement, where are the journalists who want to evade them? One journalist who prefers not to be in thrall to press officers is David Whitehouse, a freelance science writer who used to be science editor of BBC News Online. In fact, while Whitehouse was at the BBC he wrote one of the embargo-breaking news stories about water on Mars. “Yeah, we got in a bit of trouble about that,” he recalls.

Like other journalists who got wind of the Mars discovery early, Whitehouse was not able to put together an entirely accurate article. For example, he wrote that the images revealed “what appears to be brackish water seeping from beneath the Martian surface”, when in fact the images only showed the remnants of spring activity that could have been two million years old. (To this day there is still no consensus on the existence of surface water on Mars.) This error could be seen as a reason to favour embargoes, but it more likely highlights an awkward caveat of many embargo systems known as the Ingelfinger rule.

The rule, devised in 1969 by Franz Ingelfinger, then editor of the New England Journal of Medicine, states that scientists with papers pending publication must not discuss their work with journalists — essentially to prevent other scientists learning about the research from prior news stories rather than the journal. Nature and Science both employ variants of the Ingelfinger rule and only permit scientists to talk openly with journalists once an advance press release has been issued. This means that, even though Whitehouse and others sourced the information about the Mars discovery without the help of press officers — and so technically were free from the authority of Science’s embargo — they were unable to check facts with the scientists involved.

But Whitehouse says that sometimes he considered breaking embargoes even after he had received the Nature or Science preprint and press release. “You wouldn’t normally break an embargo that you’d agreed to,” he explains. “But if it was a humdinger of a story, you would, and you’d argue about it later. You’d say that the BBC was more important than Nature, [and] if you want to stop sending us your embargoed releases, then it’ll hurt you more than it’ll hurt us.”

Breaking embargoes is nothing new, although major cases are rare. In 1989 the University of Utah in the US invited journalists to a press conference to unveil a new type of room-temperature or “cold” fusion energy generation. But one of the scientists, Martin Fleischmann, unwittingly revealed details beforehand to a reporter from the Financial Times, which led to the UK newspaper running the story a day early. And way back in 1961, newspapers jumped the gun on new results from astrophysicist Martin Ryle that went against Fred Hoyle’s steady-state theory of the universe, in which new matter is supposed to be created endlessly to compensate for its expansion. As Hoyle’s theory was already unpopular with Christians, this led to the decisive headline from the Evening News: “The Bible was right”.

For Whitehouse, the main irritation of embargoes is that they “dragoon” everybody to report science news at the same time, which can only suit the “vested interests” of the journals. “They know that on a certain day, in certain newspapers, in certain outlets, their name is going to be mentioned,” he says. “And that is good for their advertising…I don’t see why research that is being funded by the taxpayer should be manipulated by a private company for its commercial benefit.”

Flawed articles

Naturally, press officers disagree. Ginger Pinholster, director of public programmes at Science, points out that the AAAS is a non-profit organization. She also believes that when one media outlet breaks an embargo, others must “scramble to catch up with the crowd”, sometimes producing flawed articles. “Distortion of research findings can jeopardize [both] public confidence in scientific discovery, and in funding for research, thus hindering advances that benefit society,” she says.

Most scientists contacted by Physics World were also at odds with Whitehouse’s argument. “[Embargoes] do not stop communication between scientists,” insists Andre Geim, a condensed-matter physicist from the University of Manchester in the UK. “They stop scientists making hype before [their work is published].”


Geim is correct to stress that, in general, scientists are not held back by embargo systems: the Ingelfinger rule adopted by Nature and Science still permits scientists to talk to each other, to speak at conferences and to upload preprints of their work to servers such as arXiv (though many are unaware that they can do the last). But the question is whether the public — and indeed scientists in unrelated fields — should also be privy to this circle of communication. In his 2006 book Embargoed Science, US journalist Vincent Kiernan suggests that shielding the public from scientists’ dialogue has the knock-on effect that they misunderstand the “essence” of scientific research, thereby leaving them open to pseudoscience. “The embargo,” he writes, “by promoting an unending stream of coverage of the ‘latest’ research findings, diverts journalists from covering the process of science, with all of its controversies and murkiness.”

Whitehouse also takes this view. “It’s a journalist’s job to reflect what’s going on, not to act as a cheerleader for science.” He adds that none of the reasons given by journal publishers for using embargoes has benefited him at all. “One of the reasons that the embargo was first put in place was to give everyone a level playing field. What journalist wants everybody to have the same chance? It’s ridiculous. Journalists by their nature are competitive and want to get their story out there first.”

No advantages

There is no doubt that the “level playing field” stymies the work of quick journalists by forcing them to wait for the embargo to expire. But some journalists contend that it is a small price to pay in order to communicate science accurately worldwide.

“In a lot of developing countries, journalists — even staff journalists — will be paid according to the number of words they get into a newspaper,” explains David Dickson, editor of the website SciDev.Net, which promotes effective science communication in the developing world. “And if the main stories that the news editor is interested in are the political or the sports or the crime stories, then those are the jobs that will get well paid…The cards are stacked against science reporters. If you think of all the hurdles that science journalists in developed countries face, just double them all.”

Dickson goes so far as to encourage trainee press officers to implement embargo systems. “If you’re frantic to be the first to publish, then frankly what you will come up with is not a very well researched story,” he says. As for the argument that the Ingelfinger rule inhibits reporting before publication, Dickson is not swayed. “When do you report on scientific research?” he asks. “When it’s in progress it’s actually rather boring. Why not wait until there’s something to report? That’s far more interesting.”

The truth is that even if the mainstream media were freer to cover more diverse research and incremental advances, they probably would not have the personnel to do so. For example, the Guardian, a UK daily newspaper, has nine science journalists to cover all disciplines. The New York Times has 17. BBC News has six, spread over various TV and radio programmes. In fact, such modest resources might be a symptom of present-day journalism in general. According to the recent book Flat Earth News by Nick Davies, a UK investigative journalist, on average just 20% of stories in quality UK newspapers are generated without the help of press releases or wire services. Davies has even coined a name for the modern practice: “churnalism”.

It is possible that if embargoes were removed, slower sections of the media would have to bow out of the race and switch their attention to the more overlooked journals such as Physical Review Letters (published by the American Physical Society), which has been devoid of anything resembling an embargo system for more than 30 years. Then again, they might simply give up on science reporting altogether. But whether or not embargoes are good for science communication, neither Science nor Nature has any plans to abandon them, so it looks as though they are here to stay. “They serve a lot of purposes,” Dickson continues. “They’re not written in stone. Everybody knows the rules. That’s the way journalism works.”


Mission impossible

Achieving the impossible might turn out to actually be impossible, but in the process of trying we can redefine the possible. It is the difference between ambition and complacency. Again and again — especially towards the end of the 19th century — complacent scientists have made future fools of themselves by proclaiming the impossibility of things such as determining the composition of stars or discovering the ultimate structure of matter. Ambitious authors, on the other hand, write books about “impossible” science. John Horgan tried with The End of Science, and John Barrow with Impossibility: The Limits of Science and the Science of Limits. Now the physicist and science communicator Michio Kaku offers us Physics of the Impossible.

Kaku has already given us one crystal ball with his book Visions: How Science Will Revolutionize the 21st Century and Beyond — aptly published in 1999 — which assessed the immediate future of computing, quantum physics and biomolecular science. Having explored these important frontiers, Kaku now focuses his sights much further ahead. Taking various examples from contemporary science fiction as a guide, he charts the possible future course of scientific and technological exploration.

The book, however, misses a good example of when science was actually influenced by fiction in this way. In Britain in the 1930s, while complacent politicians tried to appease the Nazi regime, imaginative scientists looked at the possibility of a high-energy electromagnetic-radiation weapon to zap enemy aircraft. They quickly found out that a radio wave “death ray” was impossible, but instead discovered an effective way of detecting planes using a technique that would eventually become known as radar.

Physics of the Impossible begins with force fields, which are firmly based in physics, and concludes with the similarly weighty topics of parallel universes, precognition and perpetual-motion machines. What comes in between, however, consists mainly of a menu of geeky technology — ray guns, starships, robots and so on — as well as a dose of telepathy and psychokinesis. So although physics is up there in the title, most of the physics content is confined to the final 100 of the book’s 350 pages.

A book about the impossible should point out that the quantum world already features plenty of “impossible” things, including time travel. Like embarrassing medical conditions, other puzzles and paradoxes of quantum physics used to be banished from view, but the graceful emergence of quantum entanglement from the fog of the Einstein–Podolsky–Rosen paradox, and the implications of the Aharonov–Bohm effect for nanoscience, have changed this. On the other hand, Schrödinger’s disreputable cat is still confined to a conceptual closet. Kaku lets it out and explores the possibility of reaching all kinds of “parallel universes”, where unfamiliar things happen. Elvis still lives in one such universe.

Quantum physics also points the way to achieving teleportation, which once belonged purely to science fiction. Now photons — and even entire atoms — can be teleported across hundreds of metres. But Kaku remains ambivalent about whether we will ever be able to teleport on the macroscopic scale. The negative energy of the Casimir effect is another quantum peculiarity that has become fact. However, this force operates only at microscopic dimensions, which seems to rule out using it, or something like it, to drive spaceships carrying humans, not to mention the negative implications that such travel has for those on board.

Whatever the scientific potential of his topic, Kaku invariably turns to the human dimension, which injects pace into the book. Science has taught us, however, that the human scale is arbitrary and fragile. Just because we can think about these things does not mean that we have a privileged place in the universe. If Elvis lives, for example, any topological conduit to his parallel universe would be microscopically small or fiendishly uncomfortable, so Kaku’s dogged insistence on human involvement pushes many of his prospects out of reach.

A slight deficiency with the book is that Kaku includes little on the promising technology of smart materials, although they could be useful for accomplishing his goal of invisibility. Meanwhile, his treatment of custom-built biomolecules and nanobots — which could fight disease or tune physiology — is confined to psychokinesis: the ability to influence material objects using the mind. Looking beyond current technology, the author points out that hitherto reliable extrapolations like Moore’s law cannot continue forever, and replacements for silicon could open up scary new possibilities where computers could take over from people.

Although Kaku mainly sidesteps the grim implications of contemporary arms and weapons, his chapter on ray guns points out that a weapon taming the mechanism of astronomical gamma-ray bursters would make a terrestrial thermonuclear bomb look like a fizzle. An alien civilization could use one to completely wipe us out. But if he is concerned about the future of human civilization, Kaku should perhaps have looked at the possibilities of future science for the environment and climate change. What of the possibility of using physics to control the weather? That is surely more important than alien civilizations with cyborgs or phaser guns.

He also highlights tachyons — unorthodox particles that travel faster than light and that could have been responsible for fundamental mechanisms such as the primordial inflation of the Big Bang, or the Higgs effect, which endows particles with mass. While we are about it, why not also have creationist tachyons capable of producing a universe of astronomical dimensions within a biblical timescale?

On the whole, Kaku’s future physics is mainly a reductionist “bottom-up” synthesis, which ignores collective behaviour, such as superfluidity, that also surely has a message for the future. Each chapter predictably conforms to a template — an introduction based on Star Trek or some other convenient science fiction, the conjecture, and a final hook to the next chapter. Despite these shortcomings, however, Physics of the Impossible is a stimulating and entertaining read, underlining the need for know-all scientists to avoid smug complacency. It should really have been called “Science of the Impossible”, but that title has been pounced on elsewhere.

Once a physicist: Ian Leigh

Why did you originally choose to study physics?

I always found the sciences much more satisfying than the arts — it seemed far tidier and more rewarding to be able to reach a solution to a problem rather than to interpret situations where there was no right answer. I studied maths, physics and computer science at the University of Edinburgh but chose to specialize in physics because it seemed to be so fundamental and to have so much relevance to everyday life.

How much did you enjoy the subject?

I enjoyed the practical side of physics far more than the theoretical side, but even found the theory fascinating when it could be related to real-life situations. That said, I developed a deep admiration for those of my tutors — such as Peter Higgs — who appeared to be so comfortable with theoretical concepts I could never grasp.

What did you do next?

After I graduated in 1979 I took a job at the National Physical Laboratory (NPL), where I researched the development of standards for micro-indentation hardness. The problems encountered in getting accurate and repeatable results on a microscopic scale were enormous because there were so many sources of error in the equipment, the measuring system and the materials themselves. I submitted this research for my PhD thesis, which was examined by the late David Tabor, the undisputed master of this subject.

Why did you move away from physics?

From very early in my career I was given opportunities to develop administrative and policy skills as well as practical scientific expertise. I rapidly became a “jack of all trades” — more politely known as a “technological generalist” — and before long I found myself doing scientific administration and programme formulation rather than working at the lab bench. For example, in the late 1980s I established the LINK programme in nanotechnology, which provided UK government support for projects undertaken jointly by industrial and academic partners. This laid the foundations for some of the work that is taking place in this field today.

How did you end up working for Postwatch?

After many years of technological generalism, I eventually transferred to a purely administrative job in the Department of Trade and Industry (DTI), dealing with their regulatory programme and its impact on business. From there it was a short step to Postwatch, which was sponsored by the DTI.

What does your role there involve?

My job is to try and ensure that consumer needs are at the heart of the debate over the transformation of postal services in the UK and beyond. The postal industry worldwide is evolving very rapidly in response to changing methods of communication and customer behaviour, and traditional monopolists such as Royal Mail are having to adapt. They not only need to improve efficiency by cutting labour costs and introducing new technological solutions, but also understand the requirements of consumers and provide products that will encourage these customers to continue to value postal services. I also manage the organization’s programme of research on consumer needs. As the only country with a statutory body dedicated to understanding and articulating the needs of postal consumers, the UK is well placed to contribute to current international debates — such as the full liberalization of the European postal market.

How does your physics training help the way that you work?

I’m really good at fixing the photocopier when it jams or needs the toner replacing! More seriously, the disciplined way of thinking and analysis that a physicist develops, and the metrologist’s attention to detail, are useful in any field of work, as is the experience of presenting complex concepts and ideas to a sceptical audience. And although it’s not the result of cutting-edge scientific research, people are often surprised to learn how much technology is actually involved in modern postal operations. That said, the accuracy with which postal operators measure size and weight is not quite up to the standards of NPL!

Do you still keep up to date with any physics?

Apart from trying (and often failing!) to help my son with his sixth-form physics homework, I still take a keen interest in developments at NPL. I also read the science pages in my daily newspaper and look forward to receiving Physics World every month, which is now my most regular contact with the world of physics.

Passing of a legend

John Wheeler, who died last month at the age of 96, was one of the few true giants in physics (p7 print version only). Best known for his work on nuclear fission and general relativity — and for introducing the terms black hole, wormhole and quantum foam — Wheeler was one of the last surviving links with Niels Bohr and Albert Einstein, with whom he famously debated the meaning of quantum mechanics. Wheeler counted Richard Feynman among his PhD students and, like all the best physicists, never turned his back on the enthusiasm of youth. As he wrote in his 1998 autobiography Geons, Black Holes, and Quantum Foam, “Throughout my long career…it has been interaction with young minds that has been my greatest stimulus and my greatest reward.” Helping those minds to achieve great things is sure to be one of Wheeler’s lasting legacies.

The Bohr paradox


In his book Niels Bohr’s Times, the physicist Abraham Pais captures a paradox in his subject’s legacy by quoting three conflicting assessments. Pais cites Max Born, of the first generation of quantum physics, and Werner Heisenberg, of the second, as saying that Bohr had a greater influence on physics and physicists than any other scientist. Yet Pais also reports a distinguished younger colleague asking with puzzlement and scepticism “What did Bohr really do?”.

We can sympathize with that puzzlement. In history books, Bohr’s chief contribution to physics is usually said to be “the Bohr atom” — his application in 1912–13 of the still-recent quantum hypothesis to overcome instabilities in Rutherford’s “solar-system” model of the atom, in which electrons travelled in fixed orbits around a positively charged nucleus. But this brilliant intuitive leap, in which Bohr assembled several puzzling features from insufficient data, was soon superseded by more sophisticated models.

Bohr is also remembered for his intense conversations with some of the founders of quantum mechanics. These include Erwin Schrödinger, whom Bohr browbeat into a (temporary) retraction of his ideas; Heisenberg, who broke down in tears under Bohr’s relentless questioning; and Einstein, with whom Bohr debated for years. Bohr is remembered, too, for “complementarity” — an ordinary-language way of saying that quantum phenomena behave, apparently inconsistently, as waves or particles depending on how the instruments that measure them are set up, and that we need both concepts to fully capture the phenomena.

He has, though, been mocked for the supposed obscurity of his remarks on this subject and for extending the idea to psychology, anthropology, free will, love and justice. Bohr has also been wrongly blamed for mystical ideas incorrectly ascribed to the “Copenhagen interpretation” of quantum mechanics (a term Heisenberg coined), notably the role of the subjectivity of the observer and the collapse of the wave packet.

Now Bohr is back in focus. Publishing giant Elsevier is this month putting the massive 12-volume Niels Bohr Collected Works online for the first time and has also created a print index for the entire set, which contains Bohr’s extensive correspondence and writings about various aspects of physics and society. Most of volume 10 and much of volumes 6 and 7, for instance, are about complementarity. The time is therefore ripe for re-evaluating Bohr and clarifying the “Bohr paradox”: why is he both revered and underappreciated?

What Bohr did

Bohr practised physics as if he were on a quest. The grail was to fully express the quantum world in a framework of ordinary language and classical concepts. “[I]n the end,” as Michael Frayn has Bohr’s character say in the play Copenhagen, “we have to be able to explain it all to Margrethe” — his wife and amanuensis who serves as the onstage stand-in for the ordinary (i.e. classically thinking) person.

Many physicists, finding the quest irrelevant or impossible, were satisfied with partial explanations — and Heisenberg argued that the mathematics works: that’s enough! Bohr rejected such dodges, and rubbed physicists’ noses in what they did not understand or tried to hide. However, he did not have an answer himself — and he knew it — but had no reason to think one could not be found. His closest approximation was the doctrine of complementarity. While this provoked debate among physicists on the “meaning” of quantum mechanics, the doctrine — and discussion — soon all but vanished.

Why? The best explanation I have heard is advanced by the physicist John H Marburger, who is currently science advisor to US President George Bush. By 1930, Marburger points out, physicists had found a perfectly adequate way of representing classical concepts within the quantum framework using (infinite-dimensional) Hilbert space. Quantum systems, he says, “live” in Hilbert space, and the concepts of position and momentum, for instance, are associated with different sets of coordinate axes that do not line up with each other, thereby resulting in the situation captured in ordinary-language terms by complementarity.

“It’s a clear, logical and consistent way of framing the complementarity issue,” Marburger explained to me. “It clarifies how quantum phenomena are represented in alternative classical ‘pictures’, and it fits in beautifully with the rest of physics. The clarity of this scheme removes much of the mysticism surrounding complementarity. What happened was like a gestalt-switch, from a struggle to view microscopic nature from a classical point of view to an acceptance of the Hilbert-space picture, from which classical concepts emerged naturally. Bohr brokered that transition.”

Thus while Bohr used the notion of complementarity to say that quantum phenomena are both particles and waves — somewhat confusingly, and in ordinary-language terms — the notion of Hilbert space provided an alternate and much more precise framework in which to say that they are neither. Yet the language is abstract, and the closest outsiders can come to grasping it is Bohr’s awkward and imperfect notion.
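Marburger’s picture of misaligned coordinate axes can be sketched in the smallest possible Hilbert space, a two-state system. The sketch below is purely illustrative (it is not Bohr’s or Marburger’s own calculation): it uses the Pauli matrices σz and σx as stand-ins for a pair of complementary observables and shows that their eigenbases do not line up, so a state that is sharp in one observable is maximally uncertain in the other.

```python
import numpy as np

# Two complementary observables on a two-dimensional Hilbert space.
sigma_z = np.array([[1, 0], [0, -1]], dtype=float)
sigma_x = np.array([[0, 1], [1, 0]], dtype=float)

# Their commutator is non-zero: the observables cannot be sharp at once.
commutator = sigma_z @ sigma_x - sigma_x @ sigma_z
print("commute?", np.allclose(commutator, 0))   # False

# Eigenbases of the two observables (columns of each matrix).
_, basis_z = np.linalg.eigh(sigma_z)
_, basis_x = np.linalg.eigh(sigma_x)

# Overlap probabilities |<z_i|x_j>|^2: every entry is 1/2, i.e. an
# eigenstate of sigma_z is an equal mixture of sigma_x eigenstates.
overlaps = np.abs(basis_z.T @ basis_x) ** 2
print(overlaps)   # [[0.5, 0.5], [0.5, 0.5]]
```

The two bases are “rotated” relative to one another in exactly the sense Marburger describes: neither set of axes is privileged, and translating between them is what complementarity expresses in ordinary language.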

The critical point

In the first generations of quantum theory, Bohr was revered for leading the quest to keep the field together within a single framework expressible in ordinary language and classical concepts. The Bohr paradox arises because the results of the quest were manifested not in citation indices linked with Bohr’s name, but in an increased integrity of thought that pervaded the entire field, which has proved hard for subsequent generations of physicists — and even historians — to appreciate.

If Bohr’s quest had a specific result, it was the idea of complementarity. But physicists soon found a more effective and satisfying way to represent quantum phenomena in a technical language using coordinate systems in Hilbert space. Scientists need to pursue possible and important paths even if they do not pan out. Bohr’s greatness was to recognize the importance of this quest, and to relentlessly carry it out with insight and passion. If it did not succeed, and if in the end he would not be able to explain it all to Margrethe — for whom it would have to remain esoteric — that was nature’s doing and not Bohr’s failing. It should not diminish our appreciation of his achievement.

The heat is on

Planes travelling from Europe to the west coast of the US usually fly directly over Greenland. Most passengers miss it, but if you have a window seat and keep watch at about the time that the dinner trays are being cleared away, then you may be lucky enough to catch a glimpse of a truly majestic landscape in which massive glaciers, fed from a vast and featureless ice sheet, spill into iceberg-choked fjords. Although your plane will be thousands of metres overhead, these remote glaciers are nonetheless feeling the impact of human activities like air travel. As the temperatures over Greenland rise as a result of climate change, the speed at which many of these glaciers are moving is increasing so rapidly that more ice is being lost from the ice sheet than is being replaced by new snowfall. In other words, the ice sheet is giving up its mass to the oceans, and, as a result, sea levels are rising.

The rate of sea-level rise has startled both scientists and policy-makers enough to make headlines and become embedded in government and international reports. It is easy to see why they are concerned — even a half-metre rise would cause flooding that would affect hundreds of millions of people in low-lying areas. Suddenly, “glacier dynamics” — the physics that controls how fast glaciers flow — has become a subject of international importance.

The 2007 report from the Intergovernmental Panel on Climate Change (IPCC) cites retreating glaciers and rising sea levels as evidence that warming of the climate system is unequivocal. And with enough water stored in the Greenland and Antarctic ice sheets to raise global sea levels by approximately 7 m and 57 m, respectively, being able to predict how these large ice sheets will behave in a warming climate is critical if we are fully to understand the consequences of climate change.

In the May issue of Physics World, Tavi Murray, Ian Rutt and David Vaughan describe how physicists can predict the movement and melting of glaciers.

To read the full version of this article — and the rest of the May issue of Physics World — please subscribe to our print edition.

Auroral light is polarized after all

Fifty years ago an Australian physicist called Bob Duncan reported that light from the Aurora Australis (or Southern Lights) was polarized. Although his discovery could have provided a new way of studying the atmosphere of Earth, other scientists at the time were unconvinced. Duncan’s finding was quickly cast aside and the prevailing wisdom for the last half century has been that such light is not polarized. Now, however, an international team of physicists has made a similar measurement from the Arctic island of Svalbard, which suggests that Duncan was right all along.

The Aurora Borealis (Northern Lights) and Aurora Australis are spectacular displays of light that can be seen at high and middle latitudes. Aurorae occur when charged particles (mostly electrons and protons) ejected by the Sun are focussed and accelerated by the Earth’s magnetic field. These particles are believed to follow helical paths along Earth’s magnetic field lines.

Duncan considered what would happen when such twirling electrons collide with gas atoms about 200 km up in the atmosphere. He proposed that the electrons transfer energy to the atoms, leaving them in a specific excited quantum state. When they return to the ground state, the atoms emit light with a specific polarization through the process of fluorescence. And on one night in 1958, he measured a degree of polarization of 30% in light coming from the southern sky over Tasmania.

Jostled atoms

However, leading physicists of the day challenged Duncan’s finding because, in the process of emitting the light, the atom is expected to remain in the excited state for some time. During this time, it would be jostled about by other gas atoms, which — or so fellow researchers of the time argued — should destroy the polarization. As a result, Duncan’s observation was quickly discredited and forgotten.

Now, Roland Thissen and colleagues at the CNRS Planetary Science Laboratory in Grenoble, France, together with collaborators in the Netherlands and Norway, have confirmed that auroral light is indeed polarized (Geophys Res Lett 35 L08804). The team focussed on red light emitted from fluorescing oxygen atoms and found that it has a maximum polarization of about 6% when it reaches their instrument.

This figure was seen during the “quiet” times between visible aurorae, when light is still being created even though it cannot be seen by the unaided eye. The polarization dropped to about 2% when an aurora was visible.

This drop, according to Thissen, means that Duncan’s critics were at least partially right. During visible aurorae the colliding electrons are thought to have higher energies than at quiet times. This means the electrons transfer more of their energy to the oxygen than at quiet times — and the process of giving up this extra energy before emitting red light tends to “smear” the polarization, said Thissen.

The observation was made using a standard optical technique involving splitting the light from the region of the aurorae into two channels, and passing one channel through a polarization filter. The polarization of the light can be determined by comparing the intensity of light in the filtered and unfiltered channels.
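The arithmetic behind this two-channel comparison can be sketched in a few lines. The sketch below is a toy calculation, not the team’s actual analysis: it assumes the filter’s axis is aligned with the polarization direction and ideal optics, so that unpolarized light loses half its intensity at the filter while the polarized fraction passes in full.

```python
def degree_of_polarization(i_filtered, i_unfiltered):
    """Estimate the polarized fraction p of the light.

    Assuming the filter axis is aligned with the polarization direction,
    the filtered channel sees
        i_filtered = (1 - p) * i_unfiltered / 2 + p * i_unfiltered
    (unpolarized part halved, polarized part passed in full).
    Solving for p gives the expression below.
    """
    return 2.0 * i_filtered / i_unfiltered - 1.0

print(degree_of_polarization(0.50, 1.0))  # fully unpolarized light -> 0.0
print(degree_of_polarization(0.53, 1.0))  # ~6% polarized, as at quiet times
```

A real measurement would also have to account for the filter’s imperfect extinction and for the unknown orientation of the polarization, which is why instruments typically rotate the polarizer rather than hold it fixed.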

This is a standard technique that could be adapted to study the polarization of light from aurorae on planets such as Saturn and Jupiter. It could even be used to examine light coming from “exoplanets” orbiting other stars, says Thissen. This could help astronomers gain a better understanding of both the magnetic fields and the atmospheres surrounding these distant worlds.

Here on Earth, polarization studies could lead to a better understanding of quiet times in the aurorae and why the lights can suddenly flare up — often disrupting radio communications and other technologies.

Enigmatic measurement

Sadly, Duncan did not live to see his ideas revived — he died in 2004 — and Thissen said that it was unfortunate that they could not invite him to participate in their research. He said that Duncan’s measurement of 30% polarization remains an “enigma” because it represents the maximum polarization of light emitted from oxygen — something that can be seen in the lab, but should not be seen in light that is created in the atmosphere and then travels over 200 km before being detected.

Dawn of the memristor

As all popular electronics textbooks will verify, any passive circuit can be created with a combination of just three standard components: a resistor, which opposes charge flow; an inductor, which opposes any change in the flow of charge; and a capacitor, which stores charge. But, if research by physicists in the US is anything to go by, the textbooks may have to be amended to include a fourth standard component: a “memristor”.

“Memristance will herald a paradigm shift not only in engineering, but also in the sciences” – Leon Chua, University of California at Berkeley

In simple terms, a memristor “remembers” the amount of charge that has flowed through it and as a result changes its resistance. The effect was predicted in 1971 by electronics engineer Leon Chua, but the only clues that it actually exists have been in the reports of strange “hysteresis” loops in the current–voltage relationships of thin-film devices. This means that when the voltage increases the current follows a different relationship to when the voltage decreases.

“[Scientists] essentially reported this behaviour as a curiosity or a mystery with some attempt at a physical explanation for the observation,” says Stanley Williams of Hewlett Packard Labs in Palo Alto, California. “No-one connected what they were seeing to memristance.” Now, Williams and colleagues from Hewlett Packard have made the first memristors.

Model behaviour

To make their memristors, Williams’s team first considered how memristance might originate at the atomic level. They came up with an analytical model of a memristor that consists of a thin piece of semiconductor containing two different regions: a highly doped region, which has a low resistance, and an undoped region, which has a high resistance. When a voltage is applied across the semiconductor, it causes some of the dopants to drift so that the combined resistance changes, thereby producing the characteristic hysteresis effect of memristance.
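The two-region drift model can be sketched numerically. The sketch below uses illustrative parameter values (not those of the Hewlett Packard device): a sinusoidal voltage drives the device, charge flow moves the boundary between the doped and undoped regions, and comparing the current at the same voltage on the rising and falling sweeps reveals the hysteresis loop.

```python
import math

# Illustrative two-region memristor model: a film of thickness D is split
# into a doped region of width w (low resistance R_on) and an undoped
# region (high resistance R_off). Charge flowing through the device
# drifts the boundary w, so the total resistance depends on the history
# of the current. All parameter values below are made up for the sketch.

R_on, R_off = 100.0, 16e3   # limiting resistances, ohms
D = 10e-9                   # film thickness, m
mu = 1e-14                  # dopant drift mobility, m^2 s^-1 V^-1
dt = 1e-4                   # time step, s
w = 0.1 * D                 # initial width of the doped region

def memristance(w):
    """Series resistance of the doped and undoped regions."""
    x = w / D
    return R_on * x + R_off * (1.0 - x)

trace = []
for step in range(20000):                    # two periods of a 1 Hz drive
    t = step * dt
    v = math.sin(2 * math.pi * t)            # 1 V sinusoidal voltage
    i = v / memristance(w)                   # instantaneous current
    w += mu * R_on / D * i * dt              # dopant drift moves boundary
    w = min(max(w, 0.0), D)                  # boundary stays inside film
    trace.append((v, i))

# Near v = +0.5 V the current differs between the rising sweep (step 833)
# and the falling sweep (step 4167): that is the hysteresis loop.
print(trace[833][1], trace[4167][1])
```

Plotting current against voltage for `trace` gives the pinched loop that is the fingerprint of memristance: the curve passes through the origin but encloses area, because the resistance at any voltage depends on how much charge has already flowed.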

To put this model into practice, Williams’s team attached a layer of doped titanium dioxide to a layer of undoped titanium dioxide. Through current–voltage measurements, they found that it did indeed exhibit the hysteresis effect of memristance (Nature 453 80).

Chua told physicsworld.com that he is “truly impressed” that the Californian team has proved his theory. “Most of the anomalous behaviours that have been widely reported in the nano-electronics literature over the last decade can now be understood as simply the manifestation of memristive dynamics,” he says. “It will herald a paradigm shift not only in engineering, but also in the sciences.”

Williams says his team has already made and tested thousands of memristors, and has even used them in circuits containing integrated circuits. Because the hysteresis of memristors allows them to operate like a switch, the team is now looking at how memristance can be exploited for tasks normally reserved for digital-logic electronics. These include a new form of non-volatile random access memory, or even a device that can simulate synapses — that is, junctions between neurons — in the brain.

LHC magnets pass test

On April 3 last year, the left-hand side of the Fermilab Today website had a graphical weather forecast depicting storm clouds. It was a fitting metaphor for the mood of the US lab, which had recently discovered that one of the “quadrupole” magnets it supplied to the European lab CERN for the Large Hadron Collider (LHC) had failed a preliminary test. On the right-hand side of the website, Pier Oddone, the director of Fermilab, admitted they had taken “a pratfall on the world stage”.

Indeed they had. The failure meant they had to replace all similar magnets with redesigned models and skip the low-energy test runs that were due to take place before winter. It also added to the problems that forced CERN to delay the LHC’s (already repeatedly delayed) start up from May to July this year.

Now, though, everything looks to be well again. On the right-hand side of Fermilab Today, Oddone writes that the first of the replaced magnets has passed the test it failed last year. He writes that the 50 or so scientists, engineers and technicians at CERN who made the repairs deserve “a crown”. And the left-hand side of the website is forecasting sunshine.

The original problem was that the magnets had inadequate support to withstand the forces produced during “quenching”. This is when a magnet gets warmed up above its 1.9 K operating temperature, and could happen, for example, if one of the LHC’s proton beams veers off course. Last Friday the replaced magnet passed the one-hour test designed to simulate quenching.

“Everyone commissioning the LHC,” writes Oddone, “both accelerator and detectors, is racing excitedly towards colliding-beam operation and the great physics results that we can almost taste.”

'He should seriously consider his position'

The current “crisis” in physics funding in the UK was big news this morning on BBC Radio 4’s Today programme.

In an interview that ran just before the 8:00 news, physicist Brian Cox of the University of Manchester said that Keith Mason — head of the UK’s main funding body for physics and astronomy (the STFC) — “should seriously consider his position”.

Cox was commenting on a report released today by a committee of British parliamentarians that poured criticism on recent funding decisions made by the STFC and also questioned the competency of its senior management.

Cox told the programme that the STFC has caused a great deal of anxiety in the physics community, with some researchers being told that they would have to pull out of long-standing international collaborations. He also said that morale was very low among people working at the STFC itself.

Interviewed in an earlier piece on Today was committee chairman Phil Willis MP, who said that the STFC’s actions had jeopardized the UK’s high international standing in physics research.

Mason declined the BBC’s request for an interview.

You can listen to a repeat of the Today programme using the BBC’s Listen Again service.

Copyright © 2025 by IOP Publishing Ltd and individual contributors