
Happy Birthday, PRL


That’s some cake!

Yesterday evening my IOP Publishing colleagues and I managed to blag our way into a posh reception celebrating 50 years of the journal Physical Review Letters. I forgot to take my camera, so the photo is courtesy of James Riordon at the APS.

And yes, we did sing:
Happy birthday Physical Review Letters,
Happy birthday to you.

A new spin on silicon and graphene

At last year’s March Meeting in Denver, Ian Appelbaum gave a ten-minute talk about how he had injected spin-polarized electrons into a piece of silicon, transported them micrometres and then detected a spin-polarized current at the other end. It was just one of thousands of talks given that year.

But then Appelbaum published his results in Nature and this year he has been invited back to speak for 30 minutes — which he did today in a packed session that focused on spin injection in silicon.

The ultimate goal of Appelbaum’s research is to find practical ways to make “spintronic” devices, which, in principle, could use the spin of the electron to process information much more efficiently than conventional electronics.

To make a spintronic device, you need a material through which electrons can flow without losing their spin polarization — and it would be nice if that material was compatible with chip-making processes. Silicon fits the bill on both counts, but it also has several drawbacks: it is difficult to inject spin-polarized electrons into the material, and once they are there it is difficult to measure their polarization.

Working at the University of Delaware, Appelbaum’s team were the first to overcome these problems — and you can find out how here.

I spoke with Appelbaum before his talk about how the fledgling field of silicon spin injection was shaping up. He described his breakthrough as a “clarification of the technologies that are needed”, and added that at least one more year of work by his team and others was needed before it would be possible to take a broader view of where the field was going.

Also speaking at the session was Berry Jonker of the Naval Research Lab. While Appelbaum detected spin polarization electrically, Jonker has worked out ways to detect it using light — something that is not usually possible because of silicon’s poor optical properties. Jonker finished his talk by declaring: “There is a bright future for silicon spintronics”.

The future could also be bright for spintronics based on pieces of graphene — which are tiny flakes of carbon just one atom thick. It turns out that graphene shares many of silicon’s spin-friendly properties including weak spin-orbit and hyperfine interactions.

Speaking at a session on graphene, Bart van Wees of the University of Groningen described a similar experiment to Appelbaum’s — but with graphene as the conductor. The experiment revealed that graphene is a good conductor of spin — but nowhere near as good as silicon. The Groningen team found that the spin polarization decays after the electrons have travelled about 2 micrometres — a tiny distance compared with silicon. Indeed, Appelbaum told me that he hopes to transmit spins through a centimetre of silicon by the end of the year.
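To get a feel for what a 2-micrometre spin-relaxation length means, here is a minimal sketch assuming simple exponential decay of the polarization, P(x) = P₀·exp(−x/λ). The numbers are illustrative assumptions for comparison, not figures from the talks:

```python
import math

def spin_polarization(distance_um, spin_length_um, p0=1.0):
    """Fraction of initial spin polarization surviving transport,
    assuming exponential decay P(x) = P0 * exp(-x / lambda_s)."""
    return p0 * math.exp(-distance_um / spin_length_um)

# Assumed spin-relaxation lengths: ~2 um for the Groningen graphene
# devices, and a much longer (hypothetical) value for silicon.
print(spin_polarization(2.0, 2.0))    # graphene after 2 um: ~0.37
print(spin_polarization(2.0, 350.0))  # silicon after the same 2 um: ~0.99
```

With a short relaxation length, the polarization falls to 1/e of its starting value over just one length, which is why the graphene result looks modest next to silicon.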

Van Wees described this shortcoming as a “mystery”.

Graphene has earned a reputation as a “wonder material” thanks to its outstanding electrical, thermal and mechanical properties. It’s comforting to know that graphene has been beaten by humble silicon when it comes to spintronics — at least for now!

Free-electron laser benefits from ‘seed’ light

Researchers in France and Japan have developed a technique that not only reduces the size of a free-electron laser (FEL) but also generates coherent light at X-ray wavelengths down to 32 nm for the first time. The technique, which involves seeding the laser with another light source, could be refined to produce fully coherent pulses with wavelengths of 2–4 nm, opening up the “water window” crucial for studying biological samples.

Conventional lasers work by amplifying the radiation emitted when electrons move between certain energy levels in atoms or molecules. Things are different in an FEL: a beam of electrons is accelerated in a wavy, sinusoidal trajectory along a series of magnets collectively called an undulator. As the beam approaches the speed of light, the electrons emit spontaneous radiation known as synchrotron radiation.

The electrons are never fully detached from this synchrotron radiation — they interact and amplify it to produce self-amplified spontaneous emission (SASE) at a variety of wavelengths. The FEL can be tuned by changing the energy of the electrons and the position of the undulator’s magnets. However, this configuration presents a problem in that “spikes”, or fluctuating light pulses, appear at short wavelengths in the SASE temporal and spatial profiles. These spikes make it difficult to perform time-resolved experiments, which are important for many applications.

An external source

One solution to this problem is to seed the FEL with a coherent, external source. “The FEL properties are significantly improved because they reflect the coherent properties of the seed source,” team member Guillaume Lambert told physicsworld.com.

The method, which the researchers developed during experiments at the SPring-8 Compact SASE Source (SCSS) test accelerator in Japan, involves focusing intense infrared light from a titanium–sapphire laser onto a cell containing xenon gas. The “higher-order harmonic generation” (HHG) beam produced — the seed — is then refocused onto the FEL to produce light that is coherent in both space and time. This particular technique leads to a much more compact FEL source for generating light with wavelengths ranging from ultraviolet to X-rays: the undulator length is reduced from nine metres in a normal SASE configuration to just four metres (Nature Phys. doi:10.1038/nphys889).

“Compared to conventional synchrotron sources, FELs provide a high degree of temporal coherence, pulses that last just tens or hundreds of femtoseconds [10⁻¹³–10⁻¹⁴ s] and a ten-billion times higher peak brightness,” explains Toru Hara of the Japanese team. Synchrotron radiation is widely used in biology for determining the structure of proteins, whereas fourth-generation light sources like this FEL will allow protein function within cells to be observed, because the light they produce is not absorbed by water.

At 160 nm, the seed has the shortest wavelength ever produced in such experiments — and non-linear harmonics are produced down to 32 nm. Moreover, the technique is the first demonstration of seeding an FEL device with HHG. Earlier seeding in FELs has been performed with conventional lasers at 10.6 µm (far infrared), 800 nm (infrared) and 266 nm (ultraviolet). The researchers are now planning to measure the wavefront of the light produced with a view to investigating its spatial properties, and later this year will try to seed at 50–60 nm.

The HHG seeding technique will be implemented shortly on the SPARC and FLASH FEL facilities in Italy and Germany, respectively.

Black holes as quantum-information mirrors


This is the tale of Alice, Bob and a black hole.

Alice and Bob are a couple with a big communication problem — they only talk using quantum information systems. Usually this involves sending encrypted messages via quantum dots or entangled photons, which are unreliable at the best of times.

But now Caltech’s John Preskill believes that they should try to exchange messages via a black hole. The idea is that Alice sends her message into the black hole, where it gets mixed up in whatever goes on inside the event horizon. The event horizon marks the boundary around the black hole from within which nothing — not even light — can escape, so you would be forgiven for thinking that her message would be lost for ever.

Not so, says Preskill — the information is slowly transmitted out of the black hole in the form of Hawking radiation. This is radiation that is thought to slowly leak out of a black hole, eventually causing the black hole to vanish.

All Bob has to do is gather the Hawking radiation and use it to build a quantum state that is entangled with the state of the black hole inside the event horizon. Eventually, Alice’s message will leak out with the Hawking radiation, and the entanglement will allow Bob to read it — or something like that!

But, like all Alice and Bob’s other quantum conversations, there are problems — the process would take a very long time, and no-one has actually been able to see Hawking radiation from a black hole. Alice and Bob need to talk this over before they agree to Preskill’s scheme.

Templeton Prize again goes to physicist

The Templeton Foundation today announced that the Polish mathematical physicist and former priest Michael Heller will receive the 2008 Templeton Prize. Heller, whose career of more than 40 years has encompassed research in theology, philosophy, mathematics and cosmology, intends to use the £820,000 prize to found an inter-university institute in Poland that will investigate questions in science, theology and philosophy.

The Templeton Prize, which was established in 1972 by the philanthropist Sir John Templeton, is awarded annually to a living recipient for “progress toward research or discoveries about spiritual realities”. According to the Templeton Foundation, the award is intended to encourage the concept that resources and manpower are needed to accelerate progress in spiritual discoveries, which means in practice that the award often goes to a scientist. Physicists have been particularly successful in recent years: former laureates include John Barrow (2006), Charles Townes (2005), George Ellis (2004), John Polkinghorne (2002), Freeman Dyson (2000) and Paul Davies (1995).

“It seems obvious that science just reads the mind of God” — Michael Heller

Heller continues this trend. Although he holds degrees only in theology and philosophy, he also studied physics unofficially while at the Catholic University of Lublin in Poland, which did not have the right to grant degrees in the subject. Since then Heller has conducted research in several areas of cosmology and mathematics, and is currently working on a new branch of mathematics known as non-commutative geometry.

“The standard model of cosmology treats space as a differential manifold, but this concept breaks down at singularities, so understanding the geometry of the big bang is a major issue for cosmologists,” Heller told physicsworld.com. “Non-commutative geometry — which is totally non-local in character so the idea of points, for example, is totally meaningless — can cope with singularities.” According to Heller, the theory also has the potential to unify the areas of gravity, quantum mechanics, thermodynamics and probability.

Heller, 72, is also a devout Catholic, and indeed he originally trained as a priest. This might seem like an unusual combination, but attitudes in Soviet-controlled Poland were supportive of religious scientists. “The communist regime was atheist, but the Poles were rebellious against that and unofficially I was welcomed by Polish physicists,” he says. And, in common with other Templeton Prize-winning scientists, Heller sees no incompatibility between religion and science.

“From the point of view of theology I see no contradiction because it seems obvious that science just reads the mind of God,” he says. Looking at it from the point of view of science, however, Heller admits that the situation is less clear cut. “It depends on your concept of rationality,” he explains. “If you think that the limits of rationality coincide with the limits of the scientific method, then science and religion are incompatible.” Heller, however, believes that rationality extends beyond what can be discovered with science — in other words that there may be aspects of the universe that are beyond the reach of science.

These views, together with his strong background in both science and religion, are undoubtedly part of the reason why Heller was chosen as this year’s winner. As the author of some 30 books, including The New Physics and a New Theology and Creative Tension: Essays on Science and Religion, he is, in his own words, a “bridging person” who can facilitate a dialogue between the two communities. Heller intends to use the prize money to do just that by setting up an inter-university institute to investigate questions in science, philosophy and theology. Dubbed the “Copernicus Centre”, the new institute will be affiliated with the Jagiellonian University and the Pontifical Academy of Theology in Cracow. And, of course, he will be continuing his physics research.

WMAP gives thumbs-up to cosmological model

It’s easy to forget that until recently cosmology was largely a theoretical science. Thanks in particular to the Wilkinson Microwave Anisotropy Probe (WMAP), which was launched by NASA in 2001 to study the cosmic microwave background, researchers are now able to talk about the first instants of the universe with the kind of certainty normally associated with a bench-top experiment.

With the analysis of two further years of WMAP data announced last week, that view of the early universe has just got even more detailed. As well as placing tighter constraints on parameters such as the age and content of the universe, the five-year WMAP data provide new, independent evidence for a cosmic neutrino background. The detection of such low-energy neutrinos, wrote Steven Weinberg in 1977 in his famous book The First Three Minutes, “would provide the most dramatic possible confirmation of the standard model of the early universe” — yet at the time no-one knew how to detect such a signal.

Ghostly radiation

WMAP measures the cosmic microwave background: a cold blanket of photons, which according to the new data hails from 375,900 years after the Big Bang (give or take 3100 years or so). This was the moment when the universe had expanded and thus cooled sufficiently for hydrogen atoms to form, allowing photons to flee what had previously been a dense plasma of charged particles. However, long before such “decoupling” between matter and radiation took place — just a second or two after the Big Bang — neutrinos (which interact much more weakly than photons) should have been similarly liberated, producing a shroud of even colder cosmic neutrino radiation.

The first year of WMAP data, which was announced in February 2003, measured the tiny temperature fluctuations in the cosmic background photons (thought to have been produced by the same density perturbations in the primordial plasma that led to the formation of galaxies) in unprecedented detail. The observations were a huge success for the standard model of cosmology, which describes a flat, homogeneous universe dominated by dark matter (unidentified gravitating but non-luminous matter) and dark energy (a mysterious entity speeding up the expansion of the universe).

By March 2006, having collected three times more data, the WMAP team had measured the polarization of the background photons. This provided rare if not rigorous constraints on models of inflation — a period of enormous expansion thought to have taken place 10⁻³⁵ seconds after the Big Bang, and a key component of the standard cosmological model.

The five-year data, which were collected between August 2001 and August 2006, determine the temperature fluctuations at small angular scales more precisely. In particular, the theoretical prediction of the third peak in the “angular power spectrum” — which shows the relative strength of the temperature variations as a function of their angular size — only matches the data if the very early universe was bathed in a vast number of neutrinos, which would have smoothed out the density perturbations very slightly. The neutrino background was first inferred in 2005 from WMAP data in conjunction with galaxy surveys, but this is the first time the incredibly faint signal has been measured solely from the cosmic microwave background.

As such, the cosmic microwave background provides an independent estimate of the number of neutrino “families” in nature: 4.4 ± 1.5. Despite having been inferred from a totally different cosmological epoch, this value agrees with constraints from Big-Bang nucleosynthesis, the first few minutes of the universe during which light nuclei were manufactured, and with precision measurements at particle accelerators which fix the number of families at three. The WMAP5 data also constrain the combined mass of all types of neutrino to be less than 0.61 eV.

“The discovery of the neutrino background tells us that our models are pretty much right,” says cosmologist Pedro Ferreira of the University of Oxford. “Stuff from particle physics that you’re not putting in by hand just drops out of them — that’s pretty cool if you ask me.”

“The discovery of the neutrino background tells us that our models are pretty much right” — Pedro Ferreira, University of Oxford

Polarizing results

The buzz surrounding the release of the three-year WMAP results in 2006 (the five-year release has been lower key) was partly due to their impact on inflation. By measuring the incredibly weak polarization signal of the photons, the WMAP3 data were able to tighten the limits on the “spectral index” of the fluctuations, ns. This is a central parameter in inflationary models which describes the slope of the angular power spectrum once its oscillatory features have been removed: the WMAP3 data favoured a “tilted” spectrum (ns < 1), which is a natural feature of simple inflation models. WMAP5 strengthens this picture: ns = 0.960 ± 0.014.
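As a quick back-of-envelope check on the quoted numbers (simple arithmetic only, not the WMAP team's likelihood analysis), the measured spectral index sits nearly three standard errors below the scale-invariant value ns = 1:

```python
# How far ns = 0.960 +/- 0.014 lies below the scale-invariant value ns = 1,
# in units of the quoted standard error.
n_s, sigma = 0.960, 0.014
tilt_significance = (1 - n_s) / sigma
print(round(tilt_significance, 1))  # 2.9
```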

Another hallmark of inflation is gravitational waves, which would have been produced by motion on the quantum scale and then blown up during inflation. “The five-year results put tighter limits on the gravitational-wave amplitude,” says Gary Hinshaw of NASA’s Goddard Space Flight Center in Maryland, who heads the data analysis for the WMAP science team. “Now gravity waves can contribute no more than 20% to the total temperature anisotropy [corresponding to a “tensor–scalar ratio”, r = 0.2], as opposed to r = 0.3 for the three-year result.” The new combined limits on the spectral index and gravitational waves rule out a swathe of inflation models.

A third big result for the five-year data concerns the origin of stars and galaxies. Because the polarization of the cosmic background photons is affected by the presence of ionizing material, WMAP provides new insights into the end of the “cosmic dark ages” when the first generation of stars began to shine.

“We now know that the first stars needed to come earlier than the first billion years in order to give a large enough polarization signal in the CMB,” says WMAP team member Joanna Dunkley. “There is evidence [from WMAP3] that this started at around 400 million years, but since quasars at later times indicate that the universe was still partly neutral at around 1 billion years, the lighting up process was likely quite extended.”

“The WMAP5 data don’t change anything we thought, they change what we know” — Gary Hinshaw, NASA’s Goddard Space Flight Center

Weaving threads

Among the tens of other cosmological parameters that have been tightened by the new WMAP results are the age of the universe (13.73 ± 0.12 Gyr) and its content, reinforcing the unsettling fact that 95% of the universe is made of stuff that science cannot explain. According to WMAP, neutrinos made up 10% of the universe at the time of recombination, atoms 12%, dark matter 63%, and photons 15%, while dark energy was negligible. Today, by contrast, 4.6% of the universe is made of atoms, 23% dark matter, 72% dark energy and less than 1% neutrinos.
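The composition figures quoted above can be sanity-checked with a line or two of arithmetic — a trivial sketch using only the numbers in this article (the “less than 1%” neutrino share today is rounded up to 1% here, which is why the modern-epoch total slightly overshoots):

```python
# WMAP five-year composition figures quoted above, by cosmic epoch.
recombination = {"neutrinos": 10, "atoms": 12, "dark matter": 63, "photons": 15}
today = {"atoms": 4.6, "dark matter": 23, "dark energy": 72, "neutrinos": 1}

print(sum(recombination.values()))        # 100
print(round(sum(today.values()), 1))      # 100.6 — slack from rounding the quoted figures
```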

“The WMAP5 data don’t change anything we thought, they change what we know,” says Hinshaw. “Despite the enigmas of dark matter and dark energy, the more different threads we can weave together [such as determining the number of neutrinos based on different mechanisms that occur at separate cosmic epochs], the more confident we become in the basic picture.”

• The five-year data are reported in seven papers submitted to The Astrophysical Journal

Comparisons


Having been at the APS meeting in Denver last year, I can’t help making some comparisons. Although I attended last year’s March Meeting in a different capacity (as a researcher, giving my 10+2-minute talk), it seems to me that this year’s event is on a much smaller scale.

Although the APS says more than 7000 participants are registered this year, the people I chatted to in the exhibition area — where most of the companies selling equipment are based — say business is down on last year. And the number of people walking around and sitting in sessions seems, to me at least, much lower.

Most people I have talked to seem to think along the same lines. Of course, everyone can draw their own conclusions, but maybe physics departments and institutions are tightening their belts as a result of funding squeezes in both the UK and the US.

I don’t have any figures to back up my claim about a possible decrease in participant numbers, but perhaps Bush’s last US science budget request, for the financial year 2009, will promise more money for physicists — and a return to form for the APS March Meeting in 2009.

Physicists and climate change

I just sat in on a press conference about how physicists are contributing to the study of climate change.

As a physicist writing for a physics publication, I often find it very difficult to cover stories about physicists tackling problems outside of “mainstream” physics. The problem is that I usually don’t know enough about the other discipline — be it botany, climatology or whatever — to really know if what the physicist has done is relevant.

One way to find out is to call up a botanist (say) and ask them. But it’s often the case that they are completely unaware that physicists are working in their field, and they speak a completely different language so it can be difficult to understand their take on the work.

Climate change offers rich seams of data and complex systems for physicists to study, and I’m guessing that research funds are not hard to come by. But despite the endless calls for more interdisciplinary collaboration, I fear that many physicists are tempted to look at climate change from a purely physics perspective — and miss the chance to make a more significant contribution.

I raised my concerns with the panel and John Wettlaufer of Yale University said that it was very important for physicists working outside the mainstream “to have a genuine interest in learning about someone else’s problem”. However, he admitted that “not many people want to do this”.

Brad Marston of Brown University added that it was important that physicists try to publish their work in general-interest journals such as the Proceedings of the National Academy of Sciences, where it will be peer reviewed (and hopefully read) by non-physicists.

Hurricanes and extreme weather systems


Now for a topic that is close to the heart of many inhabitants of New Orleans: hurricanes. The devastation caused by hurricane Katrina in 2005 displaced most of the inhabitants of New Orleans and generated news and debate around the world for months — and today, nearly three years on from the events, many inhabitants are still homeless.
A hurricane can be modeled as a vortex only around 50–100 ft deep but miles across. Katrina was a Category 5 hurricane that brought water levels of around 30 ft at the coast and was the sixth-strongest Atlantic hurricane ever recorded.

Greg Holland from the National Center for Atmospheric Research in Boulder, Colorado, talked about understanding extreme hurricanes. Hurricane activity has increased substantially since the 1970s, and Holland pointed out that in his simulations just a 5 m/s increase in wind speeds can lead to a 100% increase in category 5 hurricanes — which he says seems to correlate well with observed behavior.

Of course, the economic costs of the devastation brought by a hurricane such as Katrina can be great — Katrina has now cost around $140 per household in the US. A subsequent talk by Harold Brooks from the National Severe Storms Laboratory in Norman, Oklahoma, showed another aspect of climate change: large hail stones. He showed that the frequency of large hail (i.e. the size of a baseball — still going on the baseball theme) is increasing by about 6% a year, and that the “favorable severe environment” for such weather conditions is increasing by 0.8% per year — both taking off just after the 1980s.
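To put those annual rates in perspective, here is a back-of-envelope compound-growth sketch (an illustration of what the quoted percentages imply if sustained, not Brooks’s actual analysis):

```python
# Compound growth implied by a fixed annual rate sustained over a decade.
def growth_factor(rate_per_year, years):
    """Multiplicative growth after compounding a fixed annual rate."""
    return (1 + rate_per_year) ** years

print(round(growth_factor(0.06, 10), 2))   # 1.79 — large-hail frequency nearly doubles in a decade
print(round(growth_factor(0.008, 10), 2))  # 1.08 — "favorable severe environment"
```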

Town planning was also under scrutiny, with Holland pointing out the lemming-like way we build more and more communities next to the coast in some of the most dangerous places, such as on marshland.

However, it was noted during the session that Katrina is probably a few-hundred-year event — though that remains to be seen.

Doubt cast on liquid water on Mars

Is water flowing on Mars right now? Had you asked any planetary physicist two years ago, the response would have been that the planet is probably dry, bar deposits of non-frozen water hidden a few kilometres beneath the surface. But towards the end of 2006, images from NASA’s Global Surveyor spacecraft published in Science revealed what seemed to be sediment left by rivulets that had flowed down the sides of craters in recent years. This strengthened the possibility, however remote, that Mars supports life.

A new study by Jon Pelletier and colleagues of the University of Arizona in the US, together with Randy Kirk from the US Geological Survey, also based in Arizona, suggests the Global Surveyor team may have jumped the gun. Analysing stereo images taken from space by the High Resolution Imaging Science Experiment (HiRISE) on board NASA’s Mars Reconnaissance Orbiter, the researchers have mapped the topography of the Centauri Montes region on Mars where the sediment had been spotted in 2006. They then performed computer simulations to see how this 3D land profile would shape the flow of either water or granular material.

The Arizona researchers found that a flow of water, in which drag would be due to surface roughness, would produce wide channels that end at a rounded point. On the other hand, the flow of granular material, in which drag would be due to viscous and yield stresses, would produce fairly narrow channels that peter out into finger-like extremities. These fingers can clearly be seen in the HiRISE images, leading Pelletier and colleagues to conclude that the tracks at Centauri Montes are not sediment, but are probably, in fact, the remnants of a mini-landslide (Geology 36 211).

“Our findings show that the previous evidence for water was far from definitive,” Pelletier told physicsworld.com. “Of course we cannot prove that liquid water has not flowed on Mars, because it’s always possible that some very minor trickle of liquid water has occurred that is simply too small to see.”

Michael Malin of Malin Space Science Systems in San Diego, US, which operated the camera on Global Surveyor and which published the original analysis in Science, was not aware of the Arizona group’s work so does not yet want to pass judgement. He admits, though, that he “would not be surprised that some of the light-toned features seen in gullies are not formed by water deposition.” He added that definitive observations may not be possible with orbiting satellites.

Existing doubts

There have already been doubts about the conclusions of the 2006 report. Malin’s team attributed the whitish colour of the tracks to ice, frost or salt — deposits that would favour the theory of water flow. If they are indeed ice or frost, they ought to sublime — that is, turn into vapour — over a period of 1–2 Martian years (1.9–3.8 Earth years).

However, other images taken by HiRISE since 2005 have shown no change in the brightness of the tracks that would accompany sublimation. Spectra taken in 2007 by the Compact Reconnaissance Imaging Spectrometer (CRISM), also on board the Mars Reconnaissance Orbiter, have shown no evidence for salts, again implying that there has been no liquid water.

But these results alone have not been enough to rule out a water origin. Although planetary physicists agree that water has flowed on the surface within the last million years or so — HiRISE scientists also recently discovered the bed of an ancient Martian lake — and that ice is prevalent to some extent, it could be some time before a consensus on present-day water flow is reached. “We have only been closely monitoring Mars for ten years, so it is wrong to say that if we don’t have consensus now we will never achieve it,” says Pelletier. “It’s early days yet.”

Copyright © 2025 by IOP Publishing Ltd and individual contributors