
Sound causes colossal drop in resistance

Physicists know that the electrical resistance of certain manganese oxides called manganites can drop by as much as ten orders of magnitude when the materials are exposed to a magnetic field. While a full explanation of why this colossal magnetoresistance (CMR) occurs has evaded researchers, physicists have suspected for some time that it is related to interactions between electrons and phonons.

Now, an international team led by Andrea Cavalleri at Oxford University has performed an experiment that provides further insight into the role of phonons in CMR. The team fired a short terahertz (THz) laser pulse at a manganite sample while monitoring its electrical resistance by measuring the current flowing through it. When the laser is tuned to a specific phonon frequency, the resistance of the sample drops dramatically for about 5 ns before returning to its original value.

According to Cavalleri, the pulse – which is about 300 fs in duration – is long enough to create phonons at a specific frequency (about 17 THz), yet short enough to avoid exciting electrons or phonons at other frequencies. This allowed the team to conclude that the drop in resistance was caused exclusively by interactions between 17 THz phonons and electrons in their equilibrium state. The pulse was also short enough to ensure that the electrons did not heat up, which means that CMR does not necessarily require the electrons to be “hot”.
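A quick back-of-envelope check shows why these pulse parameters are selective, taking the article's figures of 300 fs and 17 THz and the rough transform limit Δf ~ 1/τ (the order-one prefactor depends on the pulse shape, so treat this as an estimate only):

```python
# Back-of-envelope check of the pulse parameters quoted in the article:
# a 300 fs pulse centred on a 17 THz phonon mode.
f = 17e12          # phonon frequency, Hz
tau = 300e-15      # pulse duration, s

cycles = f * tau           # number of field oscillations within the pulse
bandwidth = 1 / tau        # rough transform-limited spectral width, Hz

print(round(cycles, 1))            # -> 5.1 cycles per pulse
print(round(bandwidth / 1e12, 1))  # -> 3.3 THz spectral width
```

A spectral width of a few THz is much smaller than the 17 THz centre frequency, so the pulse contains enough oscillations to define the phonon frequency while remaining too narrow-band to excite modes at very different frequencies.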

By exciting only 17 THz phonons and not heating the sample, the team has managed to avoid the “chicken and egg” problem that normally makes it very difficult to study materials such as manganites. In such materials the electrons interact with each other via phonons, and if an experiment excites both the electrons and the phonons it can be impossible to determine, for example, what is a cause of CMR and what is an effect of CMR.

The “chicken and egg” problem also affects those studying cuprate high-temperature superconductors and Cavalleri and colleagues now plan to use the technique to gain a better understanding of the role of electron-phonon interactions in these materials.

Cavalleri told physicsworld.com that there could be practical applications for colossal phonoresistance – particularly because it works at room temperature. It could be used, for example, to make THz radiation detectors and other THz optoelectronic devices. He also believes that the technique could be used to change the magnetic properties of certain materials using a THz laser pulse.

Single-atom entanglement goes further

Unlike classical bits of information, which must take either the value 0 or 1, quantum bits or “qubits” can assume a mixed-up superposition of the values 0 and 1. Furthermore, two qubits can be entangled so that the value of one qubit is revealed by measuring the value of the other. Although these odd properties have spawned an array of applications such as quantum encryption, future devices will hinge on the ability to remotely entangle qubits that are separated by large distances in a network.

Ideally atoms would be used to store qubits because they would remain stable over long timescales, while photons — which can travel undisturbed over long distances — would entangle them. Now Chris Monroe from the University of Maryland and others from the University of Michigan have demonstrated that photons emitted towards each other from separated atomic qubits can — after they have met midway — entangle the qubits from afar.

In their experiment, two atomic ions trapped a metre apart by electric fields are excited into a higher energy state using a pulse of laser light. Moments later, each ion falls back into one of two distinct energy states while emitting a photon of a corresponding frequency that can show what the new state is. Both of these photons are captured by a lens and guided towards each other along optical fibres.

At the ends of the fibres the photons meet at a beamsplitter, and if they have the same frequency they interfere. Monroe and co-workers can then detect the photons at the two outputs of the beamsplitter, from which they learn what the atomic states are. However, because they cannot know which ion these states belong to, the ions are left in a superposition of the two possibilities – in other words, they are left entangled.
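The interference step can be illustrated with a toy two-photon calculation for an ideal 50:50 beamsplitter. This is a deliberately simplified sketch (perfect mode overlap, lossless optics), not a model of the actual apparatus, but it shows why detecting one photon at each output port carries information about the atomic states:

```python
import numpy as np

# Ideal 50:50 beamsplitter amplitudes for a photon entering one port:
t = 1 / np.sqrt(2)    # transmission amplitude
r = 1j / np.sqrt(2)   # reflection amplitude (pi/2 phase shift)

# Two photons enter opposite input ports. A "coincidence" means one
# photon exits each output port. There are two ways this can happen:
# both photons transmitted, or both reflected.

# Same frequency (indistinguishable): the two paths add coherently
# and cancel -- the Hong-Ou-Mandel effect.
p_same = abs(t * t + r * r) ** 2

# Different frequencies (distinguishable): the paths add incoherently,
# so probabilities rather than amplitudes are summed.
p_diff = abs(t * t) ** 2 + abs(r * r) ** 2

print(round(p_same, 6))  # -> 0.0  (no coincidences for identical photons)
print(round(p_diff, 6))  # -> 0.5  (coincidences occur for distinct photons)
```

In this idealized picture a coincidence click can only come from photons in different frequency states, which is why such a detection pattern projects the distant ions into an entangled superposition.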

Monroe’s team can prove this entanglement exists by using another laser to probe the two ions – which fluoresce differently depending on their state – for signs of correlations. Over many experimental runs, they found that the correlations persisted, even when the ions were “rotated” to satisfy all the statistical conditions. “Useful entanglement of such states of matter has never been established before over such a distance,” Monroe told physicsworld.com.

The system may not have practical applications just yet, however. The losses in the apparatus conspire to produce an entanglement probability of about 10⁻⁹, meaning the researchers only achieve a successful entanglement every few minutes despite repeating the process a million times a second. Moreover, the near-UV photons required suffer high losses in optical fibre, which limits the system’s long-distance potential. “We are looking at the possibility of efficiently converting these photons to more friendly – or even telecom – wavelength, where they could safely go many kilometres,” Monroe said.
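Taking the article's round figures at face value, the expected wait between successes is straightforward arithmetic (the real event rate depends on details not quoted here, so this is an order-of-magnitude estimate only):

```python
# Order-of-magnitude wait between successful entanglement events,
# using the round numbers quoted in the article.
attempts_per_second = 1e6   # repetition rate of the experiment
p_success = 1e-9            # entanglement probability per attempt

rate = attempts_per_second * p_success   # successes per second
mean_wait_minutes = 1 / rate / 60

print(round(mean_wait_minutes, 1))  # -> 16.7, i.e. tens of minutes at these round figures
```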

Dinosaur extinction linked to colliding asteroids

Many scientists believe that the 180 km-diameter Chicxulub crater in Mexico was created by a massive rock that came from the asteroid belt. Lying between Mars and Jupiter, the belt contains about a million objects that are greater than 1 km in size. Some of these asteroids are grouped in families, which appear to have been created when large asteroids collided with each other.

Now, Bill Bottke and colleagues at the Southwest Research Institute in Colorado and Prague’s Charles University have discovered a new family of asteroids and claim that there is a 90% probability that one of them created the Chicxulub crater. The team believes that this family – dubbed the Baptistina asteroid family – was formed when two large asteroids about 60 km and 170 km in diameter collided about 160 million years ago. Four different numerical simulations of the underlying physics of collisions and planetary motion were used to track the origin and subsequent movement of the collision fragments.

The team began by trying to understand how asteroids break up after colliding at more than 10,000 km/h. This was done using numerical “hydrocodes”, which have previously been used to model explosions on Earth including below-ground nuclear blasts. These simulations revealed the size distribution of the collision fragments — information that was then used with measurements of the chemical composition of asteroids to decide which asteroids are members of the Baptistina family.

The team used a second numerical model to simulate how thermal energy from the Sun caused these fragments to slowly shift in their orbits – the so-called Yarkovsky effect. By running these simulations backwards in time, the team concluded that the collision happened 160 million years ago.
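The logic of that backwards extrapolation can be caricatured in a few lines. In this toy version – all numbers are illustrative placeholders, not the team's model or values – fragments drift in semi-major axis at a rate roughly inversely proportional to their diameter, so dividing a fragment's observed displacement by its drift rate gives the family's age:

```python
# Toy version of a Yarkovsky family-age estimate. Smaller bodies drift
# faster, and the family's spread in semi-major axis grows with time.
# All numbers below are illustrative, not the published values.
DRIFT_RATE_1KM = 2.0e-4   # AU per Myr for a ~1 km body (typical scale)

def drift_rate(diameter_km):
    """Drift rate falls off roughly as 1/diameter."""
    return DRIFT_RATE_1KM / diameter_km

def family_age_myr(observed_drift_au, diameter_km):
    """Run the drift 'backwards': age = accumulated drift / drift rate."""
    return observed_drift_au / drift_rate(diameter_km)

# A hypothetical 1 km fragment found 0.032 AU from the family centre:
print(round(family_age_myr(0.032, 1.0), 1))  # -> 160.0 Myr
```

The real calculation integrates the orbits numerically and fits the whole size-dependent spread of the family, but the principle is the same: the observed dispersal, divided by the modelled drift, dates the original collision.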

The Yarkovsky simulations were then used along with models of how the motions of asteroids are affected by collisions with other asteroids to decide how many large fragments – bigger than about 1 km – managed to make their way to “escape hatches” in the asteroid belt. These are special regions where the gravitational pull of nearby planets such as Jupiter can hurl asteroids into orbits that can put them on a collision course with Earth.

Finally, the team focussed on the motion of the fragments as they pass through an escape hatch and on to Earth. This was done using a fourth computer simulation that works out the trajectories of the fragments by taking into consideration the gravitational pull of the Sun and planets.

The team calculated that tens of asteroids 10 km or larger managed to escape the asteroid belt and that a handful of these managed to strike Earth, along with many smaller objects. This could explain the relatively large number of craters on Earth formed during the Cretaceous period 145 to 65 million years ago. The team also believe that there is a 70% probability that a large Baptistina fragment struck the moon 108 million years ago, creating the 85 km-wide Tycho crater.

The connection between the Baptistina family and Chicxulub is further strengthened by the work of geologists, which suggests that the crater was formed by a huge chunk of carbonaceous chondrite – a composition that is also common among the Baptistina family.

Bottke told physicsworld.com that the team is now applying their methods to several other known asteroid collisions that could be related to craters on Earth.

Single-photon transistor plans unveiled

It is normally very difficult to use single photons from one beam of light to control another beam because photons rarely interact with each other. Physicists believe that the way to get photons to interact with each other is to “squeeze” them into tiny spaces such as a quantum dot or even a single atom in an optical cavity. Squeezing the photons is essential because it intensifies their electromagnetic fields, thereby increasing the chances that they will interact.

Now, Mikhail Lukin and fellow physicists at Harvard University, along with colleagues at the Niels Bohr Institute in Copenhagen, have proposed a new way of doing this by focusing photons onto tiny metallic nanowires. Here they are converted into surface plasmons – oscillations of conduction electrons – which travel along the nanowire. This process is analogous to sending a radio wave along a coaxial cable and squeezes the photons into a space that is smaller than their wavelength.

Lukin and colleagues have calculated that if a single atom is placed near the nanowire, it will absorb the first plasmon pulse that passes by, leaving the atom in an excited state. The excited atom will be unable to absorb subsequent photons and the transistor will be in the “on” position. The device could be switched “off” by firing either another single photon or a conventional laser pulse at it, causing the excited state to decay.
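The switching behaviour described above can be summarized as a simple state machine. The sketch below is a classical caricature of the protocol as the article describes it – a single stored excitation gating transmission – and not a model of the actual quantum dynamics:

```python
# Toy state machine for the proposed single-photon "transistor",
# following the behaviour described in the article. Purely classical
# caricature: one stored excitation gates the transmission of later pulses.
class PlasmonTransistor:
    def __init__(self):
        self.atom_excited = False   # ground-state atom absorbs plasmons

    def send_photon(self):
        """Send one plasmon pulse; return True if it is transmitted."""
        if not self.atom_excited:
            self.atom_excited = True   # atom absorbs the pulse: gate is now "on"
            return False               # this first photon is absorbed, not passed
        return True                    # an excited atom cannot absorb again

    def control_pulse(self):
        """A control pulse de-excites the atom, switching the gate 'off'."""
        self.atom_excited = False

gate = PlasmonTransistor()
print(gate.send_photon())  # -> False (first photon absorbed; gate switches on)
print(gate.send_photon())  # -> True  (subsequent photons are transmitted)
gate.control_pulse()
print(gate.send_photon())  # -> False (gate is off again)
```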

According to Lukin, the advantage of using a nanowire – rather than an optical cavity – to squeeze the photons is that a nanowire device would work over a wide range of wavelengths, whereas optical cavities are tuned and therefore will only work at certain frequencies.

The researchers believe that the device could someday be used as a very efficient single-photon detector in optical communication. They also point out that the device could function as a quantum logic gate that could be used in quantum computers. Key challenges in building a real device include identifying a suitable atom that can be strongly coupled to nanowire plasmons and connecting a fibre-optic cable to the nanowire to ensure that the photons are transmitted in and out of the device.

Lukin told physicsworld.com that the team is trying to build a device in the laboratory using artificial atoms such as quantum dots.

Stringscape


Problems such as how to cool a 27 km-circumference, 37,000 tonne ring of superconducting magnets to a temperature of 1.9 K using truck-loads of liquid helium are not the kind of things that theoretical physicists normally get excited about. It might therefore come as a surprise to learn that string theorists – famous lately for their belief in a theory that allegedly has no connection with reality – kicked off their main conference this year – Strings07 – with an update on the latest progress being made at the Large Hadron Collider (LHC) at CERN, which is due to switch on next May.

The possibility, however tiny, that evidence for string theory might turn up in the LHC’s 14 TeV proton–proton collisions was prominent among discussions at the five-day conference, which was held in Madrid in late June. In fact, the talks were peppered with the language of real-world data, particles and fields – particularly in relation to cosmology. Admittedly, string theorists bury these more tangible concepts within the esoteric grammar of higher-dimensional mathematics, where things like “GUT-branes”, “tadpoles” and “warped throats” lurk. However, Strings07 was clearly a physics event, and not one devoted to mathematics, philosophy or perhaps even theology.

But not everybody believes that string theory is physics pure and simple. Having enjoyed two decades of being glowingly portrayed as an elegant “theory of everything” that provides a quantum theory of gravity and unifies the four forces of nature, string theory has taken a bit of a bashing in the last year or so. Most of this criticism can be traced to the publication of two books: The Trouble With Physics by Lee Smolin of the Perimeter Institute in Canada and Not Even Wrong by Peter Woit of Columbia University in the US, which took string theory to task for, among other things, not having made any testable predictions. This provided newspaper and magazine editors with a great hook for some high-brow controversy, and some reviewers even went as far as to suggest that string theory is no more scientific than creationism (see “Stringing physics along”).

Some of the criticism is understandable. To most people, including many physicists, string theory does not appear to have told us anything new about how the world really works despite almost 40 years of trying. “Sadly, I cannot imagine a single experimental result that would falsify string theory,” says Sheldon Glashow of Boston University, who shared the 1979 Nobel Prize for Physics for his role in developing the unified electroweak theory that forms the core of the Standard Model of particle physics. “I have been brought up to believe that systems of belief that cannot be falsified are not in the realm of science.”

String theory is certainly unprecedented in the amount of time a theoretical-physics research programme has been pursued without facing a clear experimental test. But while one can debate whether it has taken too long to get this far, string theory is currently best thought of as a theoretical framework rather than a well-formulated physical theory with the ability to make specific predictions. When viewed in this light, string theory is more like quantum field theory – the structure that combines quantum mechanics and special relativity – than the Standard Model, which is a particular field theory that has been phenomenally successful in describing the real world for the last 35 years or so.

String theory is a theory of the “DNA” of a universe, but we only get to study a single “life form” – our own local patch of space. It’s as though Gregor Mendel had only a single pea and a simple magnifying glass to work with, from which he was expected to discover the double helix and the four bases A, C, G and T. Leonard Susskind, Stanford University

Ed Witten of the Institute for Advanced Study (IAS) at Princeton University, who is widely regarded as the leading figure in string theory, admits that it is difficult for someone who has not worked on the topic to understand this distinction properly. “String theory is unlike any theory that we have dealt with before,” he says. “It’s incredibly rich and mostly buried underground. People just know bits and pieces at the surface or that they’ve found by a little bit of digging, even though this so far amounts to an enormous body of knowledge.”

Some critics also slam string theory for its failure to answer fundamental questions about the universe that only it, as our best working model of quantum gravity, can seriously address. Some of these questions, says David Gross of the University of California at Santa Barbara (UCSB) – who shared the 2004 Nobel prize for his work on quantum chromodynamics (QCD) – have been around since the days of quantum mechanics. “String theory forces us to face up to the Big Bang singularity and the cosmological constant – problems that have either been ignored until now or have driven people to despair,” he says.

Gross also thinks that many people expect string theory to meet unfairly high standards. “String theory is full of qualitative predictions, such as the production of black holes at the LHC or cosmic strings in the sky, and this level of prediction is perfectly acceptable in almost every other field of science,” he says. “It’s only in particle physics that a theory can be thrown out if the 10th decimal place of a prediction doesn’t agree with experiment.”

So what is stopping string theory from making the sort of definitive, testable predictions that would settle once and for all its status as a viable theory of nature? And why does the prospect of working on something that could turn out to be more fantasy than physics continue to attract hundreds of the world’s brightest students? After all, a sizable proportion of the almost 500 participants at Strings07 were at the very beginning of their careers. “I feel that nature must intend for us to study string theory because I just can’t believe that humans stumbled across something so rich by accident,” says Witten. “One of the greatest worries we face is that the theory may turn out to be too difficult to understand.”

Irresistible appeal

In some ways, string theory looks like a victim of its own success. It did not seek to bridge the two pillars of modern physics – quantum mechanics and Einstein’s general theory of relativity – while simultaneously unifying gravity with the three other basic forces in nature: electromagnetism, the strong and the weak forces. Rather, string theory began life in 1970 when particle physicists realized that a model of the strong force that had been proposed two years earlier to explain a plethora of experimentally observed hadrons was actually a theory of quantum-mechanical strings (see timeline below).

In this early picture, the quarks inside hadrons appear as if they are connected by a tiny string with a certain tension, which meant that the various different types of hadrons could be neatly organized in terms of the different vibrational modes of such 1D quantum strings. Although this model was soon superseded by QCD – a quantum field theory that treats particles as being pointlike rather than string-like – it soon became clear that the stringy picture of the world was hiding something altogether more remarkable than mere hadrons.

String theory is different to religion because of its utility in mathematics and quantum field theory, and because it may someday evolve into a testable theory (aka science). Sheldon Glashow, Boston University

One of several problems with the initial hadronic string model was that it predicted the existence of massless “spin-2” particles, which should have been turning up all over the place in experiments. These correspond to vibrations of strings that are connected at both ends, as opposed to the “open” strings whose harmonics described various hadrons. But in 1974 John Schwarz of the California Institute of Technology and others (see timeline below) showed that these closed loops have precisely the properties of gravitons: hypothetical spin-2 particles that crop up when you try to turn general relativity, a classical theory in which gravity emerges from the curvature of space–time, into a quantum field theory like the Standard Model. Although the fundamental string scale had to be some 20 orders of magnitude smaller than originally proposed to explain the weakness of the gravitational force, string theory immediately presented a potential quantum theory of gravity.

“Quantum field theories don’t allow the existence of gravitational forces,” says Leonard Susskind of Stanford University, who in 1970 was one of the first to link hadrons with strings. “String theory not only allows gravity, but gravity is an essential mathematical consequence of the theory. The sceptics say big deal; the string theorists say BIG DEAL!”

String theory succeeds where quantum field theory fails in this regard because it circumvents the short-distance interactions that can cause calculations of observable quantities to diverge and give meaningless results. In the Standard Model – which is based on the gauge symmetry or gauge group SU(3) × SU(2) × U(1), where SU(3) is QCD and SU(2) × U(1) the unified electroweak theory – elementary particles interact by exchanging particles called gauge bosons. For instance, photons mediate the electromagnetic interaction, which is described by the original and most successful field theory of all time: quantum electrodynamics (QED), which was developed by Richard Feynman and others in the 1940s.

Pictorially, these interactions take place where and when the space–time histories or “world lines” of pointlike particles intersect, and the simplest of such Feynman diagrams corresponds to the classical limit of the quantum theory. Provided the strength of the underlying interaction – which is described by the coupling constant of the theory, or the fine-structure constant in the case of QED – is weak, theorists can calculate the probabilities that certain physical processes occur by adding up all the quantum “loop” corrections to the basic underlying diagram (see “Why can’t string theory predict anything?” in part 2 of this article).
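As a concrete illustration of such a power series in the coupling constant – my example, not one discussed in the article – the electron's anomalous magnetic moment in QED is computed as a sum of loop corrections, each suppressed by a further power of α/π. Three loops already reproduce the measured value to around nine decimal places:

```python
import math

# Perturbative QED: the electron's anomalous magnetic moment
# a_e = (g-2)/2 as a power series in alpha/pi. Each coefficient
# corresponds to one more "loop" of Feynman diagrams.
alpha = 1 / 137.035999       # fine-structure constant (the coupling)
x = alpha / math.pi

coefficients = [0.5, -0.328478965, 1.181241456]  # 1-, 2- and 3-loop terms

a_e = 0.0
for n, c in enumerate(coefficients, start=1):
    a_e += c * x ** n        # each term is smaller by a factor ~alpha/pi

print(round(a_e, 10))  # -> 0.0011596522  (measured: 0.00115965218...)
```

The series converges quickly because α/π ≈ 0.0023 is small; when the coupling is not small, or when (as with gravity) the coupling is not dimensionless, this perturbative machinery breaks down, which is exactly the problem described in the next paragraph.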

When trying to incorporate gravity into the Standard Model, however, such “perturbative expansions” of the theory (which amount to power series in the coupling constant) go haywire. This stems from the fact that Newton’s gravitational constant is not dimensionless like, say, the fine-structure constant. As a result, gravitons – which arise from quantizing the space–time metric in general relativity – lead to point-like interactions with infinite probabilities. String theory gets round this by replacing the 1D paths traced out by point-like particles in space–time with 2D surfaces swept out by strings. As a result, all the fundamental interactions can be described topologically in terms of 2D “world sheets” splitting and reconnecting in space–time. The probability that such interactions occur is given by a single parameter – the string tension – and the short-distance divergences never arise. “String theory grew up as the sum of the analogue of Feynman diagrams in 2D,” says Michael Green of Cambridge University in the UK. “But working out the rules of 2D perturbation theory is only the start of the problem.”

This is because perturbation theory only works if space–time has some rather otherworldly properties, one of which is supersymmetry. While the strings in the initial hadronic theory were bosonic (i.e. their vibrations corresponded to particles such as photons that have integer values of spin in units of Planck’s constant), the world is mostly made up of fermions – particles such as electrons and protons, which have half-integer spins. In the mid-1970s Schwarz and others realized that the only way string theory could accommodate fermions was if every bosonic string vibration has a supersymmetric fermionic counterpart, which corresponds to a particle with exactly the same mass (and vice versa). String theory is thus shorthand for superstring theory, and one of the main goals of the LHC is to find out whether such supersymmetric particles actually exist.

The other demand that string theory places on space–time is a seemingly ridiculous number of dimensions. The original bosonic theory, for example, only respects Lorentz invariance – an observed symmetry of space–time that states there is no preferred direction or frame of reference – if it is formulated in 26 dimensions. Superstrings require a more modest 10 dimensions: nine of space and one of time. But in order to explain the fact that there are only three spatial dimensions, string theorists have to find ways to deal with the additional six, which is usually done by “compactifying” the extra dimensions at very small scales.

“To call them extra dimensions is a misnomer in some sense because everything is granular at the Planck [string] scale,” says Green. “Because they are defined quantum mechanically, they should be thought of as some kind of internal space–time structure.” Indeed, while the job of string theorists would be much easier if the universe was 10D and not 4D, the fact that strings have six extra dimensions into which they can vibrate can account for the otherwise mysterious intrinsic properties of elementary particles, such as their spins and charges.

Box: Strings in context

1968
Gabriele Veneziano discovers that the Euler “beta function” brings order to the measured scattering amplitudes of different types of hadrons.
1970
Leonard Susskind, Yoichiro Nambu and Holger Nielsen independently identify Veneziano’s amplitudes with solutions to a quantum-mechanical theory of 1D bosonic strings.
1971
Claude Lovelace realizes string theory requires 26 dimensions; Yuri Gol’fand and Evgeny Likhtman discover supersymmetry in 4D; John Schwarz, André Neveu and Pierre Ramond realize that string theory requires supersymmetry to accommodate fermions as well as bosons; Gerard ’t Hooft shows that the electroweak unification proposed by Steven Weinberg in 1967 is “renormalizable”, thus making gauge theories physically viable.
1973
Julius Wess and Bruno Zumino develop supersymmetric quantum field theories; David Gross, Frank Wilczek and David Politzer discover asymptotic freedom and so establish QCD; combined with electroweak theory, the Standard Model is established.
1974
Schwarz and Joel Scherk (and, independently, Tamiaki Yoneya) realize that string theory contains gravitons, and propose a unified framework of quantum mechanics and general relativity; Sheldon Glashow and Howard Georgi propose grand unification of the Standard Model forces via the symmetry group SU(5).
1976
Stephen Hawking claims that quantum mechanics is violated during the formation and decay of a black hole; mathematicians reveal Calabi–Yau spaces.
1978
Eugène Cremmer, Bernard Julia and Scherk construct 11D supergravity, which incorporates supersymmetry in general relativity.
1981
Schwarz and Michael Green formulate Type I superstring theory; Georgi and Savas Dimopoulos propose the supersymmetric extensions of the Standard Model.
1982
Green and Schwarz develop Type II superstring theory; Andrei Linde and others invent modern inflationary theory from which the multiverse follows.
1983
The discovery of W and Z bosons at CERN seals a decade of success for the Standard Model; Ed Witten and Luis Alvarez-Gaumé show that the gauge anomalies cancel in Type IIB superstring theory.
1984
Green and Schwarz show that the anomalies in Type I theory cancel if the theory is 10D and has either SO(32) or E8 × E8 gauge symmetry; T-duality is discovered.
1985
Gross, Jeff Harvey, Ryan Rohm and Emil Martinec construct heterotic string theory; Philip Candelas, Andrew Strominger, Gary Horowitz and Witten find a way of compactifying the extra six dimensions using Calabi–Yau spaces.
1987
Weinberg uses anthropic reasoning to place a bound on the cosmological constant.
1994
Susskind proposes the holographic principle by extending work done by ‘t Hooft.
1995
Paul Townsend and Chris Hull, and Witten, propose that Type IIA theory is the weak-coupling limit of 11D “M-theory”; Joseph Polchinski discovers D-branes; Witten and others conjecture that all five string theories are linked by dualities, some of which are facilitated by D-branes.
1996
Witten and Polchinski discover that Type I theory and SO(32) heterotic theory are linked by S-duality; Witten and Petr Hořava show E8 × E8 is the low-energy limit of M-theory; Strominger and Cumrun Vafa derive the Bekenstein–Hawking black-hole entropy formula using string theory; Susskind and others propose a candidate for M-theory called Matrix theory.
1997
Juan Maldacena discovers the equivalence between string theory and quantum field theory (AdS/CFT duality), thus providing an exact manifestation of the holographic principle.
1998
The experimental discovery of the accelerating expansion of the universe suggests a small, positive vacuum energy in the form of a cosmological constant; Lisa Randall and Raman Sundrum propose braneworld scenarios as an alternative to compactification.
1999
Gia Dvali and Henry Tye propose brane-inflation models.
2003
The KKLT paper shows that supersymmetry can be broken to produce a small, positive vacuum energy using flux compactification to deal with the extra dimensions; Susskind coins the term “landscape” to describe the vast solution space implied by flux compactification, and invokes the anthropic principle and the multiverse to explain the cosmological constant; the KKLMMT paper extends KKLT to cosmology.
2004
Hawking admits he was wrong about black holes and concedes bet to John Preskill.
2005
String theory is applied to the quark–gluon plasma observed at RHIC thanks to the AdS/CFT duality, thereby returning the theory to its roots as a description of hadrons.

A testing time for strings

There is no getting away from it: string theory is an incredibly vast and challenging subject. With its talk of D-branes, 10- or 11-dimensional universes and a myriad of possible solutions – 10⁵⁰⁰ at the last count – string theory looks to outsiders, including many physicists, more like an arcane branch of mathematics than tangible physics. It appears to have told us nothing new about the real world, despite almost 40 years of trying.

But look into string theory in even a little detail and it is clear why so many young physicists are lured into the field (see “Stringscape”). First, although the details need to be worked out, string theory naturally unifies quantum mechanics and general relativity, thus providing a quantum theory of gravity and a framework that describes all the fundamental interactions in terms of just a single entity: strings, which vibrate in different ways. Second, contrary to what outsiders might expect, string theory is guided by problems in the real world, however remote these may seem.

For instance, string theory has given physicists a better understanding of black-hole entropy and has proved useful in modelling aspects of the quark–gluon plasma observed at the Brookhaven National Laboratory. String theory also offers the only explanation physicists have for the incredibly small value of the cosmological constant, which is thought to be causing the expansion of the universe to accelerate.

These are not, however, the kind of specific, testable predictions that all good physical theories must make before being accepted as a description of the real world. While this is, quite rightly, the main fuel for critics of string theory, such “falsifiability” is not the sole judge of a scientific theory (see “String theory under scrutiny”). Indeed, string theory raises several philosophical issues, such as the role of anthropic reasoning (pp16–17; print edition only), and forces us to face up to the meaning of space and time (pp18–19; print edition only).

With CERN’s Large Hadron Collider (LHC) due to switch on next year, now is the wrong time to slam string theory for its lack of predictive power. While not able to prove string theory is right, the discovery of supersymmetric particles at the LHC would give it a major boost, as would the discovery of “Kaluza–Klein” particles and possibly even mini black holes, which could be a signature of the universe’s putative extra dimensions. A flood of precision cosmological data due in the next few years will also offer new ways to put string theory to the test.

But string theory can be criticized for how it has promoted itself. Since the mid-1980s, many string theorists have oversold their subject by making grandiose claims about a “theory of everything”. Although that tendency has disappeared, it no doubt diverted some physicists from other, potentially more useful, lines of research in theoretical physics. Meanwhile, string theorists have not responded well to recent attacks based on the theory’s lack of testable predictions, most preferring to keep quiet rather than to engage in debate.

However, the richness of string theory that has become apparent in the last decade, and its increasing contact with the real world, gives theorists something to shout about. This is why our main feature on the subject, which started with fairly modest intentions, has ballooned into the longest ever to appear in Physics World. As the views of even many non-string theorists in the article make clear, the theory still holds all the potential it ever did to revolutionize our understanding of the universe.

Blog life: Galactic Interactions

Blogger: Rob Knop
URL: scienceblogs.com/interactions
First post: January 2006

Who is the blog written by?

Rob Knop, who until last month was an assistant professor of astronomy at Vanderbilt University in Nashville, US, researching galaxy formation, evolution and interactions – as the title of his blog suggests. He is now leaving academia to work as a software engineer for Linden Lab, the company that created and runs the online “virtual world” Second Life, which has over eight million registered users.

What topics does the blog cover?

As well as his research area of astronomy, Knop has written angrily and often about the problems faced by researchers attempting to gain tenure at US universities, including a post titled “The astronomy community to Rob Knop: ‘Get out. You aren’t good enough’ “. Knop is also a Christian, and wrote a series of detailed posts about why he believes in God and how he thinks religion fits with modern science.

Who is it aimed at?

In February Knop was invited to join the growing ScienceBlogs collective run by US popular-science magazine Seed, in recognition of his talent for communicating astronomy to the layperson. Indeed, Knop has frequently written that teaching and outreach activities are undervalued by the academic system in the US – he found that he was unable to gain tenure at Vanderbilt without first obtaining hard-to-come-by funding from the National Science Foundation (NSF), which prompted his departure from academia.

Why should I read it?

Knop is an excellent communicator of science, and his personal diatribes about his experience of academia make for impassioned, if sometimes uncomfortable, reading. Indeed, the “rant” category in Galactic Interactions contains more posts than the physics or astronomy categories. However, there is a happy postscript to Knop leaving research. Shortly after announcing his new job, he was awarded a share in the $250,000 Gruber prize for cosmology as one of the members of the Supernova Cosmology Project that in 1998 co-discovered that the expansion of the universe was accelerating. One comment on his blog noted “When you look up ‘ironic timing’ in the dictionary it links to this post.”

How often is it updated?

Once every few days. It remains to be seen how the blog will develop as Knop moves into his new career.

Can you give me a sample quote?

“I’ve had this Sword of Damocles about funding hanging over my head for years. In the last few years, it’s been weighing more and more heavily on me, as I contemplate how competitive NSF funding is, as I hear stories of even ‘good proposals from big shots’ being turned down, as I hear about established and successful full professors finding it difficult to figure out how to keep funding their graduate students…and as I realize that Vanderbilt has a veto criterion for tenure that requires me to compete successfully in this funding rat race.”

Governing science

Many scientists see US President George W Bush as being bad for science, as has been made clear in numerous books, editorials and Congressional testimony. Although these scientists say that federal funding for science is at a reasonable level, they claim that his administration has rejected the advice of its own scientists, suppressed unfavourable reports, allowed ideologies to damage the infrastructure, and used celebrities as consultants. Any replacement might therefore seem preferable – apart, perhaps, from the three of the 10 Republican presidential candidates (Sam Brownback, Mike Huckabee and Tom Tancredo) who do not believe in evolution.

Be careful what you wish for. Consider the record of Bill Richardson, one of eight or so Democrats seeking their party’s nomination, to be decided next August. Richardson, 59, currently the governor of New Mexico, served in the Clinton administration between 1998 and 2001 as secretary of the Department of Energy (DOE), which oversees 10 national labs. While Richardson is lower in the polls than his Democratic rivals Hillary Clinton and Barack Obama, his administrative experience and Hispanic roots earn him a following. Yet two episodes during Richardson’s tenure at the DOE are disturbing.

Troubled times

One episode is the case of Wen Ho Lee – a nuclear engineer at the Los Alamos National Laboratory. In 1999 media reports charged that Lee was spying for the Chinese government. Although the charges were based on information known to be false, Richardson had Lee fired and was later cited as having leaked Lee’s name to reporters (although Richardson denies this). Lee was arrested, chained in solitary confinement and threatened with execution. He eventually won a case against the DOE and other agencies for violation of privacy. Richardson has declared that he acted responsibly, but others saw him as pandering to the media crusade and being guilty of targeting members of specific ethnic groups – and the federal judge who oversaw the case publicly apologized to Lee.

The second episode concerns the High Flux Beam Reactor (HFBR) at the Brookhaven National Laboratory (BNL) – one of the most important neutron sources in the world. Its innovative design was partly devised by its senior user, Julius Hastings, and its research ranged from cancer cures to superconductivity. The HFBR was closed when Richardson took over. While the reactor was performing safely, its spent-fuel pool was leaking a small amount of tritium-containing water. A decision procedure for a restart had been worked out that included a new Environmental Impact Statement (EIS). Although the leak was confined to lab grounds and posed no threat to employees or the local community, it fostered an outcry among the media and anti-nuclear activists (2001 Hist. Stud. Phys. Bio. Sci. 32 41).

An antinuclear group, members of which included celebrities and Democratic party fundraisers such as supermodel Christie Brinkley, publicly campaigned against the restart in ways that Democrats would now call “Swiftboating” – the term means an unfair attack and derives from an episode during the 2004 presidential campaign. The attacks used material that BNL scientists found distorted and even dishonest, including the circulation of false rumours of the incidence of cancer clusters around the lab.

Richardson met with the antinuclear group and agreed to its demands to extend the comment period for the EIS. Hearing of this, Hastings called Richardson’s office to ask for a meeting. Hastings was refused. Meanwhile, the completed EIS draft concluded that “the environment and public health and safety would be protected” in an HFBR restart – but the DOE refused to release the document.

Instead, Richardson met again with Brinkley’s group. According to a report in George magazine, a then-popular political periodical, Brinkley “reminded Richardson that his aspiration to be Al Gore’s running mate [in the 2000 Presidential election] – a job he hadn’t been coy about lobbying for – would be seriously compromised if he didn’t acquiesce”. According to the article, Richardson was left speechless.

Richardson then terminated the HFBR, thus aborting the carefully arranged restart procedure. He did not bother to tell the lab; officials only learned of his action through the media. Scientists were outraged – not just by the decision and the lame reasons Richardson offered for it, but also for treating eminent scientists as bumpkins who need not be consulted or even informed.

Richardson denies that Brinkley influenced his decision. Brinkley evidently thought otherwise, and appeared on talk shows claiming responsibility for the HFBR’s demise. Richardson then came to Long Island to accept an award from Brinkley’s group, which was handed out at a pop concert given by Brinkley’s ex-husband Billy Joel.

“Remember the HFBR!” is unlikely to become a popular rallying cry. Richardson is surely banking that the public has forgotten his handling of the HFBR and in any event would not care much about the closure of a research reactor. Yet to those who do remember, the episode raises troubling questions. Would a Richardson administration be better for science than the current one? Would activism and ideology rule science policy? Would experts be consulted, or would fundraisers and celebrities lead officials by the nose?

The critical point

In the upcoming US presidential election, such questions will be important. For the Bush administration, inadvertently, has done a wonderful thing for science. It has shown the importance of political leaders who respect science: its infrastructure, its instruments, its experts and its data. It has shown the need to let facts dictate policies rather than vice versa. It has shown that robust science is essential to the safety and welfare of democracy, and of the planet.

When word surfaced that George W Bush had consulted blockbuster author Michael Crichton about global warming, scientists saw it as a cruel joke. We must therefore be equally critical of other potential presidents. The abuse of science by the left is as dangerous and despicable as that by the right. Politicians who damage our carefully assembled and precious scientific infrastructure for political gain cannot be taken seriously.

Life in the line of fire

For the past 10 years I have been working as an instrument scientist at the Institut Laue-Langevin (ILL) in Grenoble, France. Life in the French Alps is certainly a far cry from my origins in the flatlands of Victoria in Australia, although the quality of the wine is comparable. I came to Europe after completing a physics degree at the University of Melbourne and a doctorate in condensed-matter physics at Monash University. Initially I worked in the UK at Oxford University on neutron and X-ray scattering experiments, but then, just before my contract ended, I was offered a job at the ILL. Although I have now spent more time working here than on both my degrees and my postdoc put together, it feels much shorter!

The ILL is a high-neutron-flux research facility and is arguably the most powerful source of neutrons in the world. More than 40 instruments for experimental science are attached to the nuclear reactor that produces the neutrons. Most of the instruments are used for neutron scattering; four are devoted to nuclear physics, one to radiography and one to interferometry. All the instruments are different, although there is some overlap between the science that can be done with them. I like to think of the institute as a giant toolbox where scientists can choose the right tool to solve each problem that comes along.

Three jobs in one

The ILL employs about 60 full-time scientists, and our work is roughly divided into three parts. First, we each have responsibility for maintaining and developing one of the instruments. I work on D17, which is a neutron reflectometer that is designed to measure the properties of surfaces and subsurface interfaces buried inside a sample. The instrument is always changing as we think of ways to improve it – from boosting the neutron intensity to developing the software used to run it. The science done with D17 is very broad in scope, ranging from studies of biological membranes to chemical catalysis and magnetism, so careful thought and lateral thinking are required to optimize each experiment.

Second, instrument scientists have “local contact” duties, which means helping visiting academics to carry out and interpret their experiments. The ILL welcomes about 2000 scientists each year, who between them perform about 750 experiments. Anyone can propose an experiment at the ILL, but proposals must be approved through competitive scientific evaluation. Once a proposal is accepted, the researchers are assigned “beam time” to carry out the experiment. Neutron-scattering experiments typically take between two days and two weeks, depending on the instrument and the type of experiment, and the visitors want to get the best use out of every available neutron.

This aspect of the work can be very rewarding, as my colleagues and I are exposed to new and exciting ideas, and get to meet many people. We can act as local contact on any of the instruments at the ILL. As well as D17, I often act as the local contact for the “three-axis spectrometers”, which are instruments particularly suited for measuring structural and magnetic vibrations in crystals. Being able to work on other instruments means that I can collaborate closely with visitors as they move around the facilities at the ILL – far more satisfying than being confined to “one-off” experiments on a single instrument.

Finally, all instrument scientists have their own research programmes. Ultimately, the ILL is judged on the science that it produces, and we are encouraged to publish our work regularly. My research is in the measurement of magnetic structures and dynamics. A neutron has no electrical charge, but it does have a magnetic moment that will interact with any magnetic induction in a sample, which makes neutron scattering a sensitive probe for experiments in magnetism. One of my research programmes looks at the magnetic structures of iron-based metallic glasses, which are poorly understood but are used widely in industry for everything from transformer parts to magnetic read-heads. Another programme looks at magnetic structures and vibrations in low-dimensional materials, such as thin magnetic films. Research is probably the most challenging and the most fun part of my work, as I am free to use my imagination and pursue the science that I find the most interesting.

A career with neutrons

While the job of instrument scientist has three parts, the division of time between them can fluctuate enormously. You must fight to make time for all three, and sometimes it feels like you do, in fact, have three jobs! In particular, when there are problems on the instrument or visitors needing help, it can be very difficult to find time for your own research. The reactor runs for four cycles of 50 days each year, and during these cycles it can be very busy indeed. Between cycles, there is more time to concentrate on analysing data, writing journal articles and attending conferences, although any major modifications to the instrument must also be made while the reactor is shut down. Finding time for holidays during all this can lead to friction, particularly when trying to negotiate with one’s family.

Nevertheless, being an instrument scientist means having a great job, tremendous fun and plenty of career opportunities. A few years ago, neutron scattering was considered to be in decline, with many of the older neutron sources being closed down. However, there is a new wave of investment with the building of many new and powerful neutron sources all over the world, and also with new instrumentation at established sources like the ILL. Instrument scientists are in great demand, so it is an excellent time to be starting a career with neutrons.

A good way to begin is to try neutron experiments during doctoral work. I used neutron scattering during my PhD for magnetic-structure determination, which taught me the basics and introduced me to many other people who use neutrons, some of whom I now work with on a regular basis. A background in physics is useful for an instrument scientist, as the techniques are all physics-based, but centres like the ILL have excellent opportunities for multidisciplinary science and also employ instrument scientists who are trained in chemistry or biology.

Am I still enjoying working at the ILL after 10 years? In Nick Hornby’s novel How To Be Good, the main character compares science and the arts, saying one is “all empathy and imagination and exploration and the shock of the new, and the outcome is uncertain”. That is how I feel about my job, every day. In fact, Hornby’s character was actually talking about the arts, going on to say that science “presses this button, then that one, and bingo! Things happen. It’s like operating a lift”. Believe me, this is nothing like what we do, and just goes to show that Hornby should spend more time in a physics lab.

Once a physicist: Chandrika Nath


How did you first get interested in physics?

It was mainly through Patrick Moore and the Open University TV programmes. Even when I was seven, I would sit in front of Open University for the whole of Saturday morning watching really dire programmes about physics and finding them fascinating. Then, when I was 10, my dad got me a telescope and I started looking at the sky and wondering about the size of the universe. And finally, I didn’t want to become a medical doctor like every other Indian daughter I knew.

Where did you study physics and how much did you enjoy it?

I went to Imperial College London as an undergraduate and the course was really good – loads of variety and well taught – but I didn’t enjoy the university itself that much. At the time only one in 20 students was a woman, and being a student in central London isn’t that much fun because no-one can afford to go out and everyone lives miles apart. After graduating, I decided to take a gap year and somehow got from Mexico to Alaska in the space of one summer, mainly by hitchhiking. Then I went to teach in a school for blind children in India. I also worked at Texas Tech University in Lubbock on calorimeter design for the Superconducting Super Collider – hopefully I wasn’t responsible for its demise!

What did you do next?

I went to Oxford University to do a doctorate in particle physics and I enjoyed that university experience much more. Originally, I hadn’t wanted to go to Oxford because someone told me it was like an extension of a girls’ private school, but by that time I was 23 so it was easy enough to escape that side of things. I was based in Hamburg for a couple of years, working at the HERA accelerator and studying theoretical physics in German, which was quite an experience. I’ve completely forgotten the title of my thesis – it was something to do with instantons, though I don’t even remember what those are now.

How did your career develop from there?

In my gap year I ended up in Alaska and I decided then that one day I wanted to study ice. I had a grand plan to make my own way across the Bering Strait. Then one day I was flicking through New Scientist magazine and came across an advert for a job with the British Antarctic Survey (BAS). I didn’t expect to get it, but I did, and decided that it was an acceptable alternative to the Bering Strait. I spent four years with the BAS studying how crevasses form, including five months in the Antarctic living in a tent with three blokes. Having gone from relativistic quantum mechanics back to Newton’s laws of motion, I was intrigued to find that the macroscopic world holds as many mysteries as quarks and gluons.

How did you make the move into science communication?

At the end of my time at the BAS I did a media fellowship with Roger Highfield at the Daily Telegraph. I was vaguely thinking about becoming a journalist and then a job came up at the Parliamentary Office of Science and Technology (POST). I was intrigued by the idea of working within parliament, and it satisfied my need to not have to turn things around on really short timescales. POST’s job is to provide MPs and peers (members of the House of Lords) with balanced information on policy issues that relate to science and technology. Very few parliamentarians have a science background so they need an objective, understandable source of information to help them scrutinize government policy. My biggest piece of work to date has been a 200-page report “Assessing the risk of terrorist attacks on nuclear facilities” but I’ve also worked on reports covering military satellites, digital forensics and online privacy.

What are the main differences between working in policy and in research?

They’re completely different – I don’t think there are any similarities. But if you’ve been a researcher, you apply a lot of the skills you already have, so in a way you make the similarities. All the years of training as a scientist, documenting my work thoroughly and thinking logically are really useful. When you’re doing science, you are always wondering what else there is that you haven’t thought of. You never see something as an answer, just something that raises more questions. I think that pedantry is what makes me good at what I do now.

Copyright © 2025 by IOP Publishing Ltd and individual contributors