
US missile strikes stricken satellite

A missile fired from the American Navy cruiser USS Lake Erie has successfully disabled a malfunctioning spy satellite that was about to enter Earth’s atmosphere. Marine General James Cartwright said there was an 80–90% chance that the missile, fired from the Pacific Ocean west of Hawaii, had hit the satellite’s fuel tank.

Officials had cited the tank’s 450 kg of hydrazine as the key reason for disabling the satellite. Had the fuel tank reached Earth, the toxic gas could have harmed individuals close by. Cartwright said that debris from the stricken satellite appeared to be too small, however, to cause damage on Earth.

The firing represented an early and serendipitous rehearsal of American anti-missile defence. The SM-3 missile used to hit the satellite is under development as part of the American Navy’s sea-based missile defence system. It is intended to provide protection against medium- and long-range ballistic missiles.

Cartwright said that the missile collided with the 2250 kg satellite some 210 km above the Earth’s surface, at a combined speed of 35,000 kilometres per hour. “The technical degree of difficulty was significant here,” he said.
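Those figures are easy to sanity-check. A minimal sketch, assuming a circular orbit at the quoted 210 km altitude (the real satellite was in a decaying orbit, so this is only approximate):

```python
import math

# Orbital speed for a circular orbit: v = sqrt(GM/r)
GM = 3.986e14              # Earth's gravitational parameter, m^3 s^-2
r = (6371 + 210) * 1e3     # orbital radius = Earth's mean radius + altitude, in m

v_orbit = math.sqrt(GM / r)    # circular orbital speed, m/s
v_kmh = v_orbit * 3.6          # convert to km/h

print(round(v_kmh))  # roughly 28,000 km/h from the satellite's motion alone
```

The satellite’s own orbital motion thus accounts for most of the 35,000 km/h closing speed; the remainder came from the interceptor’s velocity relative to the target.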

“It’s a brilliant public relations opportunity for the ballistic missile defence system.” Ivan Oelrich, Federation of American Scientists

However, the difficulty of hitting a satellite in low orbit hardly compares with that of striking a missile in flight. “This is easier than the ballistic missile intercept that the system was designed for,” explains Ivan Oelrich, a security specialist at the Federation of American Scientists. “The satellite is a substantially larger target — probably between the size of a minivan and a small school bus. It’s also on a very predictable trajectory. And it’s high enough to predict with high accuracy where it will be.” The US Department of Defense chose to fire from a ship because the ship could be aligned with the track of the satellite.

Missile showcase

Oelrich is among critics who see the event as less a technical test than a chance for the Bush Administration to showcase its hotly disputed missile defence system. “I don’t think the public safety argument holds up. The US produces 16 million kilograms of hydrazine per year, which we ship around in trucks,” Oelrich says. “But it’s a brilliant public relations opportunity for the ballistic missile defence system. We were told that it has saved us from a grave threat.”

Critics have also suggested that the US destroyed the satellite because it feared that its ultrasecret technology could be made public if large parts of the satellite survived the plunge through the Earth’s atmosphere. That, however, seems unlikely. Analysts suggest that such technology would be designed to burn up before reaching Earth.

More troubling is that the event appears to be a response to China’s use of a ground-based missile to destroy a satellite last year. “We should be working on a treaty to ban antisatellite weapons,” Oelrich says. “But the Bush Administration opposes it and says it couldn’t be verified.”

Scientists probe fireballs with X-rays

Scientists have been baffled for centuries about the strange drifting balls of light that appear occasionally during thunderstorms. Theories put forward so far suggest that this “ball lightning” is either a moving electrical discharge or that it is some kind of self-contained object. Now, research from an Israeli group is making the latter seem more likely. The scientists have created artificial fireballs and then used the European Synchrotron Radiation Facility (ESRF) in Grenoble to analyse their composition.

Although a rare phenomenon, ball lightning has been glimpsed by several thousand people around the world, each of whom seems to recount a different set of properties. Diameters range from a centimetre to over a metre, colours are anything from red to white to blue, while motion encompasses both vertical descent and horizontal meandering. The fireballs have even been seen entering buildings via chimneys and windows. However, such sightings are always unexpected, so there is seldom an opportunity to observe ball lightning systematically.

A number of research groups are therefore trying to recreate ball lightning in the lab. Two years ago, electrical engineers Eli Jerby and Vladimir Dikhtyar of Tel Aviv University in Israel were able to make artificial fireballs by focusing microwaves onto substrates made from silicon and other solids placed inside a shoebox-sized cavity. They melted part of the silicon by delivering microwaves through a metal tip that, when pulled away, could drag some vaporized silicon with it. This created a column of fire that eventually detached to form a buoyant, quivering fireball coloured orange, red and yellow.

Now, Jerby and Brian Mitchell from the University of Rennes 1 in France — together with other researchers from the ESRF, Tel Aviv and Rennes — have tested the composition and properties of their fireballs by installing the cavity, which contains a substrate of borosilicate glass, in a beamline at the ESRF. After passing X-rays through the cavity and generating diffraction patterns, they discovered that the fireballs contain about 10⁹ particles per cm³, each of which has an average diameter of about 50 nm (Phys. Rev. Lett. 100 065001).

They believe that this observation supports a theory put forward by John Abrahamson, a chemical engineer at the University of Canterbury in New Zealand, proposing that ball lightning occurs when ordinary lightning vaporizes carbon and silicon oxides within soil, allowing the carbon to chemically reduce the silicon into its elemental form. The silicon atoms then cool, condense and group together into nanoparticles, which oxidise in the surrounding air and give off thermal radiation.

This latest research does not solve the mystery of ball lightning, however. Whereas many witnesses have reported glowing orbs that persist for several seconds — sightings that back up Abrahamson’s theory — Jerby and Dikhtyar’s fireballs glow for just 30–40 ms once the microwave source is turned off.

The Israeli group speculate that the mechanism responsible for the longevity of ball lightning in Abrahamson’s theory is masked in their experiments. Abrahamson points out that as the silicon nanoparticles oxidize, the rate at which further oxygen molecules can reach the silicon diminishes, thereby slowing the dissipation of the nanoparticles’ chemical energy. Jerby says that this oxidation process may occur while the silicon is being illuminated with microwaves, so that the nanoparticles’ chemical energy is almost spent by the time the microwave source is removed.

In addition to this problem of fireball lifetime, Abrahamson’s theory has still to explain exactly how ball lightning can pass through windows, walls and other objects. Jerby, however, is optimistic that these problems can be overcome and that eventually real ball lightning will be recreated in the laboratory. He also believes that his group’s work could have important practical applications, such as producing nanoparticles directly from solid materials.

Researchers create ‘self-healing’ rubber

Rubbery materials can be easily stretched, but it is not easy to mend them when they break, as anyone who has ever had a punctured car tyre will know. Now, however, researchers in France have created a new rubber-like material that can “self-heal” at room temperature. If the material is snapped in half, the two torn pieces can be made to mend themselves simply by bringing the broken surfaces back in contact with each other (Nature 451 977).

The new “supramolecular rubber” has been created by Ludwik Leibler and colleagues at the Ecole Supérieure de Physique et Chimie Industrielles (ESPCI/CNRS) in Paris, France. It consists of “fatty acids” — short chains of carbon atoms — linked together via hydrogen bonds to form a macroscopic 3D network. The material behaves just like an ordinary rubber in that it can stretch to several times its normal length when pulled.

But if the material is cut in half, the two broken pieces of the rubber can self-heal when brought together and simply held in contact for a few minutes. The fracture mends and the material can be stretched and pulled in all directions again. “It is important to stress that the material is not self-adhesive,” Leibler told physicsworld.com. “The surfaces of the material are never sticky to the touch and feel like a rubber band or a plastic bag. Self-mending is possible even 12 hours after the fracture occurred.”

Magic healing

Conventional rubbers usually consist of long polymer chains linked together by chemical bonds. However, the new material consists of small molecules that can link to two or more other molecules through hydrogen bonds, which are much weaker. If the material is fractured, any “open” hydrogen bonds on the surface of the broken material seek out other non-linked open hydrogen bonds, allowing the two broken halves to reform.

“The potential applications are manifold,” say Justin Mynar and Takuzo Aida of the School of Engineering at the University of Tokyo, writing in the same issue of Nature. “Tears in clothes that effectively stitch themselves together, long-lasting coatings and paints for houses and cars, and to take one example on the medical front, self-repairing artificial bones and cartilage.”

Other applications include adhesives, cosmetics, ink-jet printing, electronics and building materials for the construction industry. The work was carried out in collaboration with the chemical company Arkema, which is now planning to commercialize some products and materials based on this supramolecular technology.

Vector inflation points the way

You’ve got to love physics humour. To mark Valentine’s day last week, Fermilab’s in-house magazine ran a spoof personal ad that stated: “mature paradigm with firm observational support seeks a fundamental theory in which to be embedded”. It was referring to inflation — a period of exponential expansion thought to have taken place 10⁻³⁵ s after the big bang, which, although able to account for the large-scale appearance of the universe, lacks a firm theoretical footing.

By chance, that same day three theorists posted a paper on the arXiv preprint server that could help remedy this situation. Viatcheslav Mukhanov and co-workers at Ludwig Maximilians University in Munich, Germany, have proposed a model in which inflation is driven by vector fields as opposed to scalar fields as it is in existing models (arXiv:0802.2068v1). Although their model does not fundamentally explain inflation, vector fields are already known to exist in nature whereas scalar fields are not.

Smooth and flat

Inflation, which was developed in the early 1980s, smoothes out anisotropies that were present immediately after the big bang by causing the universe to expand by a factor of at least 10³⁰ in a fraction of a nanosecond. Without inflation, it is difficult for cosmologists to explain why the universe looks roughly the same no matter which direction they look or why the geometry of the universe is essentially flat.
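Cosmologists usually count inflation in “e-folds”, the natural logarithm of the expansion factor. The expansion factor quoted above translates directly; everything below is just logarithms:

```python
import math

# Number of e-folds N for an expansion factor a_end/a_start = e^N
expansion_factor = 1e30          # the "at least 10^30" quoted in the text
n_efolds = math.log(expansion_factor)

print(round(n_efolds, 1))  # about 69 e-folds, comfortably above the ~60
                           # usually needed to solve the flatness and
                           # horizon problems
```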

But the underlying theory is somewhat ad hoc. In most models, the rapid expansion of the early universe is driven by “negative pressure” produced when the potential energy of a scalar field drops from one state to a lower one. Quantum fluctuations in this field, which is associated with a fundamental spin-zero particle called the inflaton, would have been blown up to cosmic scales and produced density perturbations that later caused matter to clump together into galaxies.

“Ours is not a better model than scalar-field inflation, but it is also not worse.” Viatcheslav Mukhanov, Ludwig Maximilians University

This picture has recently gained strong support from measurements of the distribution of hot and cold spots in the cosmic microwave background. But physicists have never seen spin-zero particles and therefore have no real clue about the microscopic origins of inflation.

On the other hand, fundamental vector fields — which lead to spin-one particles — are known to exist in nature: the photon and the W and Z bosons, for example. The trouble is that vector fields tend to “pick” a preferred direction in space and therefore ruin inflation’s biggest selling point. Mukhanov and colleagues have now overcome this problem by invoking several vectors which, when averaged, lead to a nearly isotropic universe.

Random orientations

According to Larry Ford of Tufts University in the US, who in the late 1980s came up against the problem of anisotropy himself when attempting to build a vector-inflation model, the new model is analogous to the pressure in a gas. “A gas consists of many molecules, each with a specific velocity, but when the effects of the various molecules are averaged the result is a pressure that is isotropic to a very high degree of accuracy,” he explains.
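The averaging argument is easy to sketch numerically: the net direction picked out by N randomly oriented unit vectors shrinks roughly as 1/√N. A toy illustration (not the actual field-theory calculation):

```python
import math, random

random.seed(1)  # make the run reproducible

def net_anisotropy(n_fields):
    """Sum n randomly oriented unit vectors; |sum|/n measures the
    residual preferred direction left after averaging."""
    sx = sy = sz = 0.0
    for _ in range(n_fields):
        # uniform random point on the unit sphere
        z = random.uniform(-1.0, 1.0)
        phi = random.uniform(0.0, 2.0 * math.pi)
        rho = math.sqrt(1.0 - z * z)
        sx += rho * math.cos(phi)
        sy += rho * math.sin(phi)
        sz += z
    return math.sqrt(sx * sx + sy * sy + sz * sz) / n_fields

for n in (10, 100, 10000):
    print(n, net_anisotropy(n))  # falls off roughly as 1/sqrt(n)
```

With many fields, the residual anisotropy becomes tiny, just as the random molecular velocities in Ford’s gas analogy average to an isotropic pressure.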

“Ours is not a better model than scalar-field inflation, but it is also not worse,” says Mukhanov, who was one of the first to calculate the fluctuations in the inflaton field. “In addition, the model allows us to get a certain amount of anisotropy during inflationary expansion which could modify the produced perturbations and thus have observational imprints in the cosmic microwave background.”

Because the model requires many vector fields orientated in random directions to produce the required isotropy of the large-scale universe, its predictions depend heavily on the way the masses of these fields are distributed, which is not a desirable property. The upside is that the presence of fields with extremely small masses naturally accounts for the current phase of cosmic acceleration as well as inflation, although team member Alexey Golovnev is quick to point out that scalar fields can also do this job.

Optical lattice beats atomic-clock accuracy

Researchers in the US have built a new optical clock from strontium atoms that is accurate to about one part in 10¹⁶ — meaning it would neither gain nor lose a second in more than 200 million years. While the clock is not the world’s most accurate — that honour goes to an optical clock based on a single mercury ion, which is accurate to several parts in 10¹⁷ — the strontium device is the world’s most accurate clock to use neutral atoms.
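The headline figure follows from a quick back-of-envelope calculation: at a fractional accuracy of one part in 10¹⁶, how long does the accumulated error take to reach one second?

```python
fractional_accuracy = 1e-16              # one part in 10^16
seconds_per_year = 365.25 * 24 * 3600    # about 3.16e7 s

# Years for the accumulated timing error to grow to one second:
years_to_one_second = 1.0 / (fractional_accuracy * seconds_per_year)

print(f"{years_to_one_second:.1e}")  # about 3.2e8 years, i.e. over 300 million
```

That comfortably exceeds the “more than 200 million years” quoted above.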

The clock, which was built by Jun Ye and colleagues at JILA and the University of Colorado in Boulder, is based on thousands of strontium atoms that are trapped in an “optical lattice” made from overlapping infrared laser beams (Science Express). The atoms are bathed in light from a separate red laser at a frequency corresponding to an atomic transition in strontium — which causes this light to lock into the precise frequency of the transition. The oscillation of this light is just like the ticking of a clock — the first neutral-atom timekeeper to be more accurate than the standard caesium-fountain atomic clock.

3.5-km optical fibre

The team determined the clock’s accuracy by sending its time signal via a 3.5-km underground optical fibre from JILA to the National Institute of Standards and Technology (NIST) in Boulder, where the signal was compared to that from an optical clock based on neutral calcium atoms. This is the first time that researchers have been able to compare signals from two optical clocks separated by several kilometres in this way.

The time signals from the clocks were compared using two “frequency combs” located at either end of the fibre link. Each clock produces light at different frequencies and the combs are used to convert the signals to a lower common frequency. After being shifted to the lower frequency, the strontium signal was amplified using a laser and then transmitted to NIST.

Adding anti-noise

The two signals were then combined and the physicists looked for signs of interference — or “beating” — which occurs if the time signals have slightly different frequencies. Ye told physicsworld.com that a key challenge in transmitting the signal was to ensure that it was not disrupted by noise in the optical fibre. This was done by monitoring the noise in the fibre and introducing an “anti-noise” signal to cancel it out.
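“Beating” is just the trig identity cos a + cos b = 2 cos((a+b)/2) cos((a−b)/2) at work: the sum of two nearly equal tones is a fast oscillation modulated by a slow envelope, giving an intensity beat at the difference frequency. A toy sketch with made-up audio frequencies (the real comparison is done at optical frequencies far beyond anything that could be sampled directly):

```python
import math

f1, f2 = 1000.0, 1002.0    # Hz -- hypothetical tones, not the clocks' real frequencies
fs = 100_000               # samples per second
n = fs                     # one second of signal

for i in range(n):
    t = i / fs
    combined = math.cos(2*math.pi*f1*t) + math.cos(2*math.pi*f2*t)
    # product form predicted by the identity: a (f1+f2)/2 carrier inside
    # an envelope oscillating at (f1-f2)/2
    product = 2 * math.cos(math.pi*(f1 + f2)*t) * math.cos(math.pi*(f1 - f2)*t)
    assert abs(combined - product) < 1e-9  # the two forms agree pointwise

print(abs(f1 - f2))  # 2.0 -- the beat frequency, in Hz
```

A beat frequency of exactly zero would mean the two clocks tick at identical rates; any residual beat measures their disagreement.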

The team is now trying to improve the precision of the clock — a measure of how reliable its time signal is — by increasing the number of atoms used. The team also plan to use their technique to compare the strontium clock to a mercury-ion clock at NIST.

New standard

Most physicists believe that an optical clock will someday replace the caesium-fountain atomic clock as the time standard. That is because the stability of an atomic clock is proportional to its operating frequency, which means that clocks based on narrow transitions at optical frequencies will be much more stable than those based on much lower frequency microwave transitions. Clocks need to be both stable and accurate, with greater stability making it easier and faster to work out how accurate a clock really is.
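The frequency advantage can be put in rough numbers. The caesium value below is the exact frequency that defines the SI second; the strontium value is an assumed approximate figure of ~429 THz for its optical clock transition:

```python
# Comparing "tick rates": the caesium microwave standard versus an optical clock.
f_cs = 9_192_631_770     # Hz -- caesium hyperfine transition (defines the SI second)
f_sr = 429e12            # Hz -- strontium optical clock transition, roughly 429 THz

ratio = f_sr / f_cs
print(f"{ratio:.0f}")  # the optical clock "ticks" tens of thousands of times faster
```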

However, several different optical clocks are being developed around the world and no clear frontrunner has emerged. The best single-ion clocks are currently more accurate than their neutral-atom counterparts, but their signal comes from just one ion, making them noisier and perhaps less practical than neutral-atom clocks.

Patrick Gill of the UK’s National Physical Laboratory, which is also developing optical clocks, described JILA’s strontium clock as an “impressive step on the way [to an optical standard]”, but added that it “may not be a clear winner”. According to Gill, while strontium is a popular candidate for a next-generation time standard, there are about a dozen candidates for the job — both neutral atoms and ions.

Who gets the cash?

“Is the title an oxymoron?” Harold Shapiro asked rhetorically at the beginning of his talk, The Responsible Use of Public Resources in Elementary Particle Physics. He wanted to show how one goes about prioritizing funding within the US science budget for high-energy physics. Later in his talk he posed another rhetorical question: “Are we in the US silently executing an exit strategy?”

Shapiro is professor of economics and public affairs at Princeton University and also chairs the US elementary particle physics committee. The committee is composed of nine particle physicists, five non-particle physicists and six non-physicists, and recently submitted a report recommending research priorities to the US National Academy.

The report begins by summarizing, for the uninitiated, the main unresolved issues in physics: the nature of space and time; the origin of mass; and the beginning and fate of the universe. Then it notes that the most likely way for significant progress to be made will be to reconcile Einstein’s theory of general relativity, which describes how gravity arises from the curvature of space–time, with the Standard Model of particle physics. A theory known as supersymmetry might be able to do this, but to be tested it really needs the help of particle accelerators that operate at tera-eV (10¹² eV) energy scales.

One such accelerator is the Large Hadron Collider (LHC) at CERN, due to start up this June. However, it ought to be complemented by the International Linear Collider (ILC), a proposed electron–positron machine that is still in the R&D stage. The US would like to submit a credible bid to host the ILC, and that requires making significant R&D contributions. Hence the report firmly recommends ILC R&D as a priority to the National Academy.

Trouble is, particle accelerators are expensive pieces of kit. The LHC will clock in at around $9.2bn, while the ILC could easily be double or triple that. Playing the devil’s advocate, I asked why funds allocated for particle-physics facilities would not be better spent on research into more useful physics — alternative energy, for instance. Shapiro said there is no way of quantifying which is more important, adding that, for him, understanding the ways of the universe “is an extraordinarily important issue.”

Lawrence Krauss of Case Western Reserve University, the symposium organizer, also chipped in. “There is no other way of answering these big questions,” he said. “And it’s worth remembering that the entire cost of the LHC is the same as nine days in Iraq.”

Bubble chamber puts new constraints on WIMPs

Bubble chambers, which were first used in the 1950s to detect electrically charged particles, might sound as if they should belong firmly to particle-physics history books. Now, however, physicists working at the Chicagoland Observatory for Underground Particle Physics (COUPP) experiment in the US have resurrected the technique to search for dark matter. Although their bubble-chamber experiments have failed to find any dark-matter particles, the null result has imposed new limits on certain properties of weakly interacting massive particles (WIMPs), which are a leading candidate for dark matter.

The COUPP team also says that their results cast further doubt on claims made in 1998 by members of the Dark Matter (DAMA) experiment at the Gran Sasso National Laboratory in Italy to have observed WIMPs. The DAMA team — which used a large array of sodium-iodide detectors located 1400 m below ground — insists that it is impossible to make a direct comparison between the two experiments.

Invisible to telescopes

Dark matter was first proposed about 70 years ago to explain the abnormally high rotation speeds of galaxies, which would otherwise be torn apart if they did not contain vast amounts of hidden mass providing extra “gravitational glue”. It is fundamentally different from normal “luminous” matter such as stars as it is invisible to modern telescopes, giving off no light or heat, and seems to interact only through gravity.

Physicists believe that WIMPs might interact with ordinary matter through several different mechanisms that either depend on or are independent of the nuclear spin of atoms in ordinary matter. Although various experiments appear to have ruled out the possibility that the DAMA observations came from spin-independent interactions, they could still have been explained by a spin-dependent mechanism. Now that possibility appears to have been ruled out as well by the latest COUPP results.

Super-heated state

The COUPP experiment, which is located some 100 m underground in a tunnel at Fermilab, consists of a glass jar filled with about a litre of CF3I — a liquid that is normally used as a fire extinguisher (Science 319 933). The liquid is heated to just below its boiling point, at which point a piston is withdrawn, which makes the chamber expand. This reduces the pressure of the fluid, leaving it in a super-heated state.

If a WIMP collides with a nucleus in the fluid, the recoiling nucleus heats the surrounding fluid, which boils and forms a tiny bubble. The bubble gets bigger as the chamber expands until it is large enough — about a millimetre in size — to be photographed by digital cameras.

COUPP watches for WIMP collisions nearly all of the time. By doing a statistical analysis of the bubbles in a large number of photographs, the team says that they can work out which were caused by dark matter and which were caused by background radiation such as the alpha particles emitted during the radioactive decay of radon. Unlike old bubble chambers and most modern dark-matter search techniques, COUPP is insensitive to most other kinds of background radiation such as muons, gamma rays and X-rays.

If the DAMA result had been due to spin-dependent WIMPs, then COUPP researchers should have found hundreds of WIMPs. Instead, they found none above background. “These results establish the bubble chamber as a new competitive technique in the search for WIMPs as candidates for dark matter,” explains Fermilab’s Peter Cooper, a senior scientist on COUPP. “They also contradict the observation claimed by DAMA in the last region not already excluded by other dark matter searches”, he said.

DAMA disagrees

However, Rita Bernabei at the University of Rome — spokesperson for the DAMA experiment — disagrees with this conclusion. “It is impossible to make a direct comparison between the COUPP and DAMA results,” she told physicsworld.com. “In particular, COUPP uses different target materials and approaches [to DAMA].” She also pointed out that COUPP addresses just one of several possible models of spin-dependent interaction, whereas the DAMA result was not tied to any specific interaction model — spin-dependent or otherwise.

COUPP is not the first experiment to put the DAMA results in question and several dark-matter searches since 1998 have failed to turn up any particles. However, Bernabei believes that like COUPP, these experiments use targets and approaches that are different to DAMA and therefore do not contradict their finding.

Spin-independent interactions

While COUPP does an impressive job of rejecting background radiation, it is relatively insensitive to spin-independent interactions, which are believed to be much more prevalent than spin-dependent scattering. As a result, some physicists have suggested that it is not a significant step towards the detection of WIMPs — unlike competing contemporary experiments such as XENON10 in Italy; CDMS in the US; and ZEPLIN-II in the UK.

Cooper and his COUPP colleagues are now working to increase the sensitivity by increasing the amount of liquid to 30 l in the bubble chamber, and hope to start testing the larger chamber at Fermilab soon.

“The path to much larger bubble chambers is clear and underway now,” says Cooper. “We plan to be operating up to 100 kg of sensitive mass within a year. The ultimate limitations to this technique will be only the ability to control backgrounds.”

Nuclear unknowns


Don Geesman of Argonne National Laboratory is well-known not only for his work on quarks but also for serenading his audience with folk songs about underground neutrino detectors (“For it’s dark as a dungeon and damp as the dew/where neutrinos come slowly and the funding does too”). Although we were not so fortunate to experience any music at his symposium this afternoon, I did manage to corner the maestro afterwards for a quick chat about the ties between nuclear physics and astrophysics.

Geesman said that most of the crossover between the two disciplines occurs when studying explosive events such as supernovae and novae. Because such events occur on rapid timescales, they can be greatly affected by the physics of nuclear isotopes that have a very short lifetime themselves — ones that can only be glimpsed in facilities such as Argonne’s ATLAS accelerator. A lot of the experiments at ATLAS seek to understand how the nature of these short-lived isotopes changes when they are at the sort of high temperatures found in supernovae.

Why is this important? Thermonuclear, “Type Ia” supernovae have for a long time been considered the “standard candles” of the cosmos because their brightness is so consistent. It was using these standard candles as distance markers that, in 1998, physicists were led to conclude that the expansion of the universe is accelerating, and therefore that there must be some kind of all-pervasive “dark energy”. But recent observations have led astrophysicists to suspect that standard candles might not be so standard after all. Geesman explained that a better understanding of short-lived isotopes will give astrophysicists a solid theoretical basis to tell how reliable standard candles are.

Freebie round-up


There’s not a fantastic selection of freebies at this year’s AAAS meeting, although there are one or two gems. A brain that sticks to walls, a pen that unfolds at the push of a button and — if you can make it past the dark sunglasses and curly-wired earphones of the agents at the FBI stand — a copy of Weapons of Mass Destruction: A Pocket Guide.

However, the freebie producing the biggest buzz comes from ITER, the project that is desperately trying to get a fusion power plant up and running by 2018. Go to its stand and you can pick up a pair of magnetic bean-shaped “fusion particles”. The idea is that you take one in each hand, throw them up in the air, and gasp in amazement as they stick and flutter together, thus demonstrating the basic principle of fusion.

I spent a good two minutes earlier today trying to make those fusion particles work, but to no avail. Then, back in the press room, a journalist who wished to remain anonymous revealed the secret. “You have to believe in ITER,” he told me. “They won’t ever work unless you say you believe.”

Refashioning American science

The US science funding cuts revealed in last year’s omnibus bill were a terrific blow to US physicists, with Fermilab in particular being forced to lay off 200 of its staff. If funding doesn’t recover, the US might find that key research and development institutions begin to settle elsewhere. “It takes something like the race to the Moon to open up the coffers,” lamented Robert Rosner, director of Argonne National Laboratory, at a press breakfast this morning.

Rosner was appealing to the media to help the public see the benefit of physics research because, he said, they are not so good at it on their own. Many important discoveries in physics, from the transistor to the internet, only became widely adopted after a long period of development. “But culture is impatient,” he noted. The question is how to convey to a public accustomed to instant gratification the need to be patient in science.

Another problem in Rosner’s eyes is the narrow demographic drawn into science education. Most university science students are Caucasian males, even though they make up a shrinking cross-section of the population. This might be because of the “nerdy” image of science, which only seems to disappear in extraordinary circumstances such as the launch of Russia’s Sputnik I 50 years ago. Or it might be that scientists receive poor compensation for their efforts compared with, say, doctors.

Unlike in the past, when the US enjoyed a steady brain-drain of scientists from abroad, today fewer top scientists migrate to the country. Rosner thinks this is because of the difficulty in getting a work visa and a lack of clear economic incentives. Science, he says, is a global meritocracy: institutions take root wherever the talent is.

Rosner’s presentation echoed feelings from yesterday’s symposium on the media’s coverage of climate change, namely that the best way — if not the only way — to persuade the government and the public of the merit of scientific research is through the media. At the symposium, John Holdren of Harvard University pointed out that the erstwhile British prime minister Margaret Thatcher only became convinced of the importance of the environmental cause after she was lumbered with a pile of New Scientist magazines to take on holiday.

Copyright © 2025 by IOP Publishing Ltd and individual contributors