
New optical clock promises increased accuracy

Researchers from the National Institute of Standards and Technology (NIST) in the US have built an optical clock that promises to be the world’s most accurate. The clock is accurate to one part in 10¹⁷, which means it will neither gain nor lose a second in more than one billion years.

Optical clocks provide the most accurate timekeeping known. Unlike conventional atomic clocks, which use microwave radiation, an optical clock fires a beam of visible laser light at an ion. The frequency of the light is tuned so that the ion can absorb a photon and jump from a lower to a higher energy state. When the photon is emitted a moment later, the ion returns to the lower energy state, much like the ticking of a clock. In a conventional microwave clock this ticking occurs just over nine billion times per second; in an optical clock the frequency, and hence the tick rate, is roughly 100,000 times higher, which is what gives optical clocks their precision.
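As a back-of-the-envelope check, a few lines of Python show both the jump in tick rate and what an accuracy of one part in 10¹⁷ means in practice; the caesium and aluminium-ion transition frequencies below are standard textbook values rather than figures taken from the paper.

```python
# Rough comparison of microwave and optical clock "tick" rates, plus how long a
# clock with a given fractional accuracy takes to drift by one second.
CS_MICROWAVE_HZ = 9_192_631_770   # caesium hyperfine transition (defines the SI second)
AL_OPTICAL_HZ = 1.121e15          # aluminium-ion 1S0 -> 3P0 clock transition (~267 nm)

print(f"optical ticks per microwave tick: {AL_OPTICAL_HZ / CS_MICROWAVE_HZ:,.0f}")

fractional_accuracy = 1e-17       # one part in 10^17
seconds_per_year = 365.25 * 24 * 3600
years_to_drift = (1 / fractional_accuracy) / seconds_per_year
print(f"time to gain or lose one second: {years_to_drift:.1e} years")
```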

Till Rosenband and colleagues selected a single aluminium ion as the basis for their clock because aluminium is less sensitive to temperature and electric fields than mercury, the ion used for the present most accurate clock. But there is a catch: the transition from the ground to excited state in aluminium is hard to probe directly with light. The researchers, therefore, held a beryllium ion next to the aluminium ion in a trap, so that the aluminium ion’s internal state could be transferred to the beryllium ion. This internal state could then be measured — a technique known as quantum logic spectroscopy (Science Express doi: 10.1126/science.1154622).

Apart from the potential of allowing GPS satellites to track position with sub-metre precision, the improved accuracy of the optical clock has led the researchers at NIST to look at whether the fine-structure constant, which governs how light and electrons interact, has been changing over time. By measuring the ratio of clock frequencies for aluminium and mercury, the NIST researchers have concluded that any change in the fine-structure constant is smaller than 1.6 × 10⁻¹⁷ per year, a precision they believe is tight enough to rule out any significant drift. “This measurement of the ratio of aluminium and mercury clock frequencies is the most accurate known physical constant,” says Rosenband.
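To see how such a bound could be obtained, here is a minimal sketch, with synthetic drift-free data, of fitting a drift rate to repeated fractional frequency-ratio measurements; it is an illustration of the idea, not NIST’s actual analysis.

```python
# Fit a linear drift to fractional changes in the Al/Hg frequency ratio.
# A drifting fine-structure constant would show up as a non-zero slope.
import numpy as np

rng = np.random.default_rng(0)
t_years = np.linspace(0, 1, 12)                           # a year of monthly measurements
frac_ratio = 5e-17 * rng.standard_normal(t_years.size)    # synthetic, drift-free data

slope, intercept = np.polyfit(t_years, frac_ratio, 1)
residuals = frac_ratio - (slope * t_years + intercept)
slope_err = residuals.std(ddof=2) / np.sqrt(((t_years - t_years.mean()) ** 2).sum())
print(f"fitted drift = ({slope:.1e} +/- {slope_err:.1e}) per year")
```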

The NIST researchers believe that aluminium is a potential candidate for a clock accurate to one part in 10¹⁸ (a billionth of a billionth), even more accurate than mercury. This would allow researchers to further tighten the constraints on whether the fine-structure constant is changing.

Cold Fusion as Policy Posterboy

The March Meeting has everything, including a session on cold fusion.

It is almost 20 years since Pons and Fleischmann told the world that they had seen nuclear fusion in what is essentially an electro-chemistry experiment. The idea is that if you packed enough deuterium into a piece of palladium metal, the deuterium nuclei would somehow overcome considerable electrical repulsion (perhaps being screened by palladium electrons) and fuse together, releasing lots of energy.

The announcement set off a furore that pitted chemists against physicists and led to allegations that big-energy interests and the physics “establishment” were trying to cover up a genuine breakthrough. And sadly, as nuclear physicists scrambled to do experiments involving hydrogen and electricity, there was at least one deadly explosion.

However, other researchers were unable to confirm cold fusion and today most of the physics community has forgotten about it, except for a small band of researchers who have somehow convinced the APS to give them a session at the March Meeting.

This year’s session included a talk from a non-physicist, Thomas Grimshaw, who teaches public policy at the University of Texas at Austin. Grimshaw has adopted cold fusion as “a posterboy for rational policy making”. He looked at cold fusion research results using “evidence-based policy making” analysis techniques — the sort of thing a government would use to decide if lower speed limits save lives on the roads.

His conclusion is that there is a “preponderance of evidence” that funding cold fusion research is in the public interest. The minimum response, he believes, is that the US government should reinstate its cold fusion programme — and it would be a reasonable response to give cold fusion the same funding status as conventional approaches to fusion such as magnetic and inertial confinement.

While I doubt that this public-policy approach will raise the profile of cold-fusion research, there is something admirable in the fact that the people in session A14 have battled against conventional wisdom for nearly two decades. But writing as someone who did a cold fusion experiment in 1990, my personal opinion is that whatever they are seeing — it’s not fusion.

You can read more about Grimshaw’s work here.

Physicist wins by-election for US Congress

The number of physicists in the US Congress has risen to three after G. William “Bill” Foster, a veteran of the Fermi National Accelerator Laboratory (Fermilab), won a by-election in Illinois on Saturday. Foster, a Democrat who has never previously stood for political office, secures a seat in the House of Representatives previously occupied by Republican and former House speaker Dennis Hastert, who resigned after 20 years’ service. “Back in the laboratory, this is what we’d say was a pretty successful experiment,” Foster said.

The two existing physicists in Congress are Republican Vernon Ehlers of Michigan, a nuclear physicist and one-time chair of Calvin College’s physics department, and Democrat Rush Holt of New Jersey, formerly assistant director of the Princeton Plasma Physics Laboratory.

“Scientific training teaches you always to look at the facts first” – Bill Foster

Foster, 52, earned a bachelor’s degree in physics from the University of Wisconsin in 1975. Four years later, as a Ph.D. student at Harvard University, he joined the Irvine–Michigan–Brookhaven (IMB) collaboration, where he helped to design and build the IMB detector and conduct research with it. Designed to spot proton decay, this instrument gained fame when it detected a burst of neutrinos emitted by the SN 1987A supernova.

On earning his Ph.D. in 1984, Foster began what was to become a 22-year stint at Fermilab. As his first task, he designed and built components of the Tevatron’s CDF detector, which went on to co-discover the top quark in 1995. In the early 1990s, he led a team responsible for designing an integrated circuit that ratcheted up the speed and accuracy of measuring particle collisions. Later in that decade he co-invented Fermilab’s “recycler ring”. Foster left Fermilab two years ago to devote himself to political activism, although he retains links to the lab, which lies within his constituency.

Before he became a physicist, Foster had launched a successful business career. When he was 19, he and his younger brother Fred used $500 of their parents’ money to found a lighting company, Electronic Theatre Controls Incorporated, which now manufactures more than half the theatre lighting equipment used in the United States. The equipment has featured on Broadway, on Rolling Stones tours, and in half-time entertainment at the Super Bowl. Foster used some of the fortune generated by the company to finance his election campaign.

His background in physics played its own critical role in his political success. “The scientific training teaches you always to look at the facts first,” he said in reference to the Iraq war. “If you look at the places this country’s gotten itself in trouble, it’s very often where we ignore facts for political reasons.”

Streetcar to the Garden District


It’s a lovely day in New Orleans and I managed to get a sunburn walking around the French Quarter this morning….I suppose I’m a real redneck now!

Our hotel is right across the road from the convention centre and there are now lots of physicists milling about — the excitement is building. Like myself, many of them look like they haven’t seen the sun for quite some time, so local pharmacies better stock up on sunburn cream!

After lunch I took an ancient streetcar (tram to our European readers) out to the Garden District — a leafy area of huge moss-covered oak trees, ornate Victorian houses and of course, fragrant gardens.

As I was coming back on the St Charles streetcar, I noticed that the branches of the trees at the side of the road were festooned with hundreds of garish necklaces of every possible colour. I’m guessing that these were thrown from floats during a Mardi Gras parade.

I might go back to the Garden District and try to find the City of the Dead — a cemetery where all the tombs are above ground. Maybe I can persuade the IOP crew to make the journey tonight after dark!

Arrived and ready


The 21-hour door-to-door trip is behind us now as we focus on the start of the conference tomorrow. We arrived at the hotel early on Sunday morning after a quick connection in Chicago. The whole trip from Heathrow to New Orleans went quite smoothly, except for having to change planes in Chicago after we were all seated and ready to go – apparently there was a problem with the braking system, so I wasn’t complaining. It was also on the flight from Chicago that it became apparent that there were many physicists on board, most of them armed and ready with poster tubes.

Today we had our first chance to see some of New Orleans, with a brief walk around the French Quarter and along the Mississippi river, near which most of the hotels and the convention center are situated. From these areas, though, there was little sign of the devastation inflicted by hurricane Katrina in 2005.

I popped into the convention center itself and already saw a mass of physicists queuing up to register. It is only when you come to the center, with its massive exhibition halls, that the scale of the APS March Meeting hits you. Some people already have their hands on the thick conference book and are studying it meticulously, no doubt looking for the location of colleagues’ talks.

After a few too many shrimps this afternoon, we are ready for the conference tomorrow and look forward to keeping you updated on all the ins and outs of the 2008 APS March Meeting.

Off to New Orleans


Michael and I are leaving for New Orleans bright and early tomorrow morning — along with five other colleagues from IOP Publishing. Our journey begins in Bristol at about 9.30 in the morning and, if all goes well, we will arrive in New Orleans just before midnight (local time). I reckon that’s about 21 hours door-to-door. Unless we get snowed in in Chicago!

We have just put the finishing touches on our battle plan for what promises to be an intensive week of condensed matter physics. Actually, more than just condensed matter is on the agenda. Michael will be looking into “econophysics” and the physics of the stock market, while I’m looking forward to learning about the physics of hurricane formation and climate change.

See you in the Big Easy!

Nanotubes measure DNA conductivity

Ever since the famous double-helix structure of DNA was discovered more than 50 years ago, researchers have struggled to understand the complex relationships between its structural, chemical and electrical properties. One mystery has been why attempts to measure the electrical conductivity of DNA have yielded conflicting results, variously suggesting that the molecule is an insulator, a semiconductor, a metal — and even a superconductor at very low temperatures.

DNA’s apparent metallic and semiconductor properties, along with its ability to self-replicate, have led some researchers to suggest that it could be used to create electronic circuits that assemble themselves. Now, however, a team of researchers in the US has shown that DNA’s electrical conductivity is extremely sensitive to tiny structural changes in the molecule — which means that it could be very difficult to make reliable DNA circuits.

Reliable connection

Colin Nuckolls of Columbia University, Jacqueline Barton of Caltech and colleagues were able to make reliable conductivity measurements by inventing a new and consistent way of connecting a single DNA molecule to two carbon nanotubes (Nature Nanotechnology 3 163). Past methods had struggled to make a reliable connection between a DNA molecule — which is only about 2 nm wide — and two electrodes. Poor connectivity is thought to be behind many of the inconsistencies in previous measurements.

The team began with a nanotube — a tiny tube of carbon about as thick as DNA itself — that was integrated within a simple electrical circuit. A 6-nm section of the nanotube was removed using plasma ion etching. This procedure not only cuts the tube, but also oxidizes the remaining tips, making it possible to bridge the gap with a DNA molecule whose ends have been designed to form strong chemical bonds with the oxidized tips.

Similar to graphite

The conductivity was determined by simply applying 50 mV across the DNA and measuring the current that flowed through it. In a standard piece of DNA, the conductivity was similar to that seen in graphite. This is consistent with the fact that the core of the double helix of DNA consists of stacked molecular rings that have a similar structure to graphite.

A benefit of having the DNA attached securely to the electrodes is that the conductivity can be studied under ambient conditions — in a liquid at room temperature. This allowed the team to confirm that they were actually measuring the conductivity of DNA and not something else in the experiment. This was done by adding an enzyme to the surrounding liquid that cuts DNA – and as expected the electrical circuit was broken.

Mismatched bases

The team were also able to investigate the effect of base mismatches on conductivity. DNA double strands are normally connected through interactions between particular bases—adenine to thymine and cytosine to guanine. If one of the bases in a pair is changed, the two strands will still stick together, but with an altered structure around the mismatched bases.

The team first measured the conductivity of a well-matched strand and then exchanged it for a strand with a single mismatch. This single mismatch boosted the resistance of the DNA by a factor of 300. According to Barton, “this highlights the need to make measurements on duplex DNA that is well-matched, undamaged, and in its native conformation.”
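The measurement itself amounts to Ohm’s law: apply a small bias and record the current through the DNA bridge. The sketch below uses a made-up current value purely to show the arithmetic, including how a 300-fold rise in resistance would appear at the detector; it does not reproduce the paper’s numbers.

```python
# Ohm's-law arithmetic for the DNA bridge; the measured current is a placeholder.
V_BIAS = 50e-3            # applied bias: 50 mV
I_MEASURED = 2e-8         # hypothetical measured current (20 nA)

resistance = V_BIAS / I_MEASURED          # resistance of the DNA bridge
conductance = 1 / resistance
print(f"R = {resistance:.2e} ohm, G = {conductance:.2e} S")

# A single base mismatch raised the resistance ~300-fold, so at the same bias
# the current would drop by the same factor:
print(f"current with a mismatch = {I_MEASURED / 300:.2e} A")
```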

An important implication of this sensitivity to small changes in structure is that DNA by itself might not be an ideal component for future electronic devices.

Indeed, this inherent sensitivity to structural change could allow living cells to detect DNA damage, which can accumulate in cells and lead to problems including cancer. Cells have ways of repairing this damage, but the mechanism they use to detect damage is still not completely understood. Barton says that this “whole body of experiments now begs the question of whether the cell utilizes this chemistry to detect DNA damage.” This is a question her group is now trying to answer.

Artificial black hole created in lab

Everyone knows the score with black holes: even if light strays too close, the immense gravity will drag it inside, never to be seen again. They are thought to be created when large stars finally spend all their fuel and collapse. It might come as a surprise, therefore, to find that physicists in the UK have now managed to create an “artificial” black hole in the lab.

Originally, theorists studying black holes focused almost exclusively on applying Einstein’s theory of general relativity, which describes how the gravity of massive objects arises from the curvature of space–time. Then, in 1974, the Cambridge University physicist Stephen Hawking, building on the work of Jacob Bekenstein, showed that quantum mechanics should also be thrown into the mix.

Hawking suggested that the point of no return surrounding a black hole beyond which light cannot escape — the so-called event horizon — should itself emit particles such as neutrinos or photons. In quantum mechanics, Heisenberg’s uncertainty principle allows such particles to spring out of the empty vacuum in pairs all the time, although they usually annihilate shortly after. But if two particles were to crop up on either side of a black hole’s event horizon, the one on the inside would be trapped while the one on the outside could break free. To an observer, the black hole would look like a thermal body, and these particles would be the black hole’s “Hawking radiation”.

This is all very well in theory, but in practice Hawking radiation from a black hole would be far too faint to be detected above the noisy cosmic microwave background (CMB) left over from the Big Bang. Simply put, black holes are too cold. Even the smallest black holes, which according to Hawking should have the warmest characteristic temperature, would still be about eight orders of magnitude colder than the CMB.
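A quick calculation illustrates the mismatch. The sketch below takes three solar masses as a representative lower limit for a stellar-mass black hole; that figure is an assumption used only for illustration.

```python
# Hawking temperature T = hbar * c^3 / (8 * pi * G * M * k_B), compared with the CMB.
import math

hbar, c, G, k_B = 1.055e-34, 2.998e8, 6.674e-11, 1.381e-23   # SI units
M_SUN = 1.989e30                                             # kg

def hawking_temperature(mass_kg):
    """Characteristic temperature of a black hole of the given mass, in kelvin."""
    return hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)

T_bh = hawking_temperature(3 * M_SUN)   # assumed ~3 solar-mass black hole
T_cmb = 2.725
print(f"T_Hawking = {T_bh:.1e} K, about {T_cmb / T_bh:.0e} times colder than the CMB")
```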

Faced with the difficulty of observing Hawking radiation from astrophysical black holes, some physicists have attempted to make artificial ones in the lab that have a higher characteristic temperature. Clearly, generating huge amounts of gravity is both dangerous and next to impossible. But artificial black holes could be based on an analogous system in which the curved space–time of a gravitational field is enacted by another varying parameter that affects the propagation of a wave. “We cannot change the laws of gravity at our will,” Ulf Leonhardt at the University of St Andrews in the UK tells physicsworld.com. “But we can change analogous parameters in a condensed-matter system.” Leonhardt’s group at St Andrews is the first to create an artificial black-hole system in which Hawking radiation could be detected (Science 319 1367).

“We cannot change the laws of gravity at our will” – Ulf Leonhardt, University of St Andrews

Fishy physics

The idea of using analogous systems to create black holes was first proposed by William Unruh of the University of British Columbia in 1981. He imagined fish trying to swim upstream away from a waterfall, which represents a black hole. Beyond a certain point close to the waterfall, the current becomes so strong — like an event horizon — that fish cannot swim fast enough to escape. In the same vein, Unruh then considered what would happen to waves flowing from the sea into a river mouth. Because the current gets stronger farther up a river, the waves can only progress so far upstream before being defeated. In this way, the river is a “white hole”: nothing can enter.

In the St Andrews experiment, which uses the refractive index of an optical fibre as the analogue of a gravitational field, there are actually both black and white holes. It relies on the fact that the speed of light in a medium is determined not only by the light’s wavelength but also by the refractive index.

The group begins by sending a pulse of light through an optical fibre that, as a result of a phenomenon known as the Kerr effect, alters the local refractive index. A split-second later they send a “probe” beam of light, which has a wavelength long enough to travel faster through the fibre and catch up with the pulse. But due to the altered refractive index around the pulse, the probe light is always slowed enough to prevent it from overtaking — so the pulse appears as a white hole. Likewise, if the group were to send the probe light from the opposite end of the fibre, it would reach the pulse but would not be able to go through to the other side — so the pulse would appear as a black hole.
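A toy numerical picture of that horizon condition is sketched below; the group indices and the Kerr-induced index change are round-number assumptions, not parameters from the St Andrews experiment.

```python
# The pulse moves at its own group velocity; the probe is normally faster, but
# inside the Kerr-brightened region the index rises and the probe slows down.
C = 2.998e8          # speed of light in vacuum, m/s
N_PROBE = 1.450      # assumed group index seen by the probe in the unperturbed fibre
N_PULSE = 1.4505     # assumed group index of the intense pulse itself
DELTA_N = 1e-3       # assumed Kerr-induced index increase under the pulse

v_pulse = C / N_PULSE
v_probe_outside = C / N_PROBE
v_probe_inside = C / (N_PROBE + DELTA_N)

print(f"outside the pulse the probe is faster: {v_probe_outside > v_pulse}")
print(f"inside the pulse the probe is slower:  {v_probe_inside < v_pulse}")
# The point where the probe's speed drops to v_pulse acts as the event horizon:
# a white hole at the pulse's trailing edge, a black hole at its leading edge.
```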

“What are the minimal properties required to induce Hawking radiation in a lab system the way we think it is induced by gravitational black holes?” – Renaud Parentani, University Paris-Sud

Over the event horizon

Leonhardt and his colleagues proved that these black- and white-hole event horizons exist by monitoring the group velocity of the probe light, which never exceeded that of the pulse. More importantly, they have calculated that it should be possible to detect Hawking-radiation particles produced at either of the event horizons by filtering out the rest of the light at the far end of the fibre.

The detection of Hawking radiation would help physicists bridge the gap between quantum mechanics and general relativity, two presently incompatible theories. It might also help physicists investigate the mystery surrounding the wavelength of photons emitted at an event horizon, which is thought to start at practically zero before being stretched almost infinitely via gravity.

However, Renaud Parentani of University Paris-Sud in France thinks that, although it may be possible to glimpse radiation from an event horizon in future versions of the group’s system, the radiation might not possess all the expected properties of Hawking radiation generated by astrophysical black holes. For instance, the fibre-optic system is limited by dispersion, which means that the wavelength of photons produced at the event horizon will not be stretched very far. “What are the minimal properties required to induce Hawking radiation in a lab system the way we think it is induced by gravitational black holes?” he asks. “The answer, even on the theoretical side, isn’t clear. But these experiments will encourage us to consider the question more deeply.”

Entangled memory is a first

Physicists in the US are the first to store two entangled quantum states in a memory device and then retrieve the states with their entanglement intact. Their demonstration, which involves “stopping” photons in an ultracold atomic gas, could be an important step towards the practical implementation of quantum computers.

The basic unit of information in a quantum computer is the qubit, which can take the value 0, 1 or—unlike a classical bit — a superposition of 0 and 1 together. A photon could be used as a qubit, for example, with its “up” and “down” polarization states representing 0 or 1.

If many of these qubits are combined or “entangled” together in a quantum computer, they could be processed simultaneously and allow the device to work exponentially faster than its classical counterpart for certain operations. Entanglement could also play an important role in the secure transmission of information because the act of interception would destroy entanglement and reveal the presence of an eavesdropper.

Fragile states

However, this fragile nature of entanglement has so far prevented physicists from making practical quantum information systems.

Now, a team of physicists at the California Institute of Technology led by Jeff Kimble has taken a step towards this goal by working out a way to store two entangled photon states in separate regions of an extremely cold gas of caesium atoms (Nature 452 67).

The entangled states are made by firing a single photon at a beam splitter, in which half the light is deflected left into one beam, and the other half deflected right to form a second beam. These two beams are parallel, separated by about 1 mm and contain a pair of entangled photon states—one state in the left beam and the other in the right.

Slow-moving hologram

Once inside the cloud, a hologram-like imprint of the two entangled photon states on the quantum states of the atoms can be created using an effect called electromagnetically-induced transparency (EIT). This imprint moves through the gas many orders of magnitude more slowly than the speed of light, effectively “stopping” the entangled states for as long as 8 µs.

EIT is initiated by a control laser that is shone through the gas to create the holograms. When the control laser is switched off, the photon states vanish, leaving behind the holograms; when it is switched back on, the entangled photon states are recreated from the holograms. The storage time can be changed simply by varying the time that the laser is off.

When the recreated entangled photon states leave the atomic gas, one of the states passes through a device that shifts its phase, while the other does not. The two states then recombine at a detector. If the states remain entangled, adjusting their relative phase would create a series of bright and dark quantum interference effects at the detector.
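A minimal sketch of that read-out step is given below: scan the relative phase and look for interference fringes, whose visibility signals surviving entanglement. The visibility value used here is an arbitrary illustration, not the team’s measured figure.

```python
# Scan the relative phase between the two recreated beams and estimate the
# visibility of the resulting interference fringes at the detector.
import numpy as np

phase = np.linspace(0, 2 * np.pi, 200)
visibility = 0.2                                      # illustrative value only
intensity = 0.5 * (1 + visibility * np.cos(phase))    # normalised detector signal

estimate = (intensity.max() - intensity.min()) / (intensity.max() + intensity.min())
print(f"recovered fringe visibility = {estimate:.2f}")
```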

20% entangled

By repeating the experiment for a large number of single photons and measuring the interference intensity, the team concluded that about 20% of the entangled photon states were recovered from the atomic gas. While this might seem like a poor success rate, it is good by quantum-computing standards, where entanglement efficiencies of 1–2% are common.

According to Lene Hau of Harvard University, who pioneered EIT, the Caltech technique could be improved by cooling the atomic gas below the current 125 mK to create a Bose-Einstein condensate in which all the atoms are in a single coherent quantum state.

Sharing secrets

Hau also believes that the Caltech memory device could be modified to store the entangled states in two different atomic gases. This, she says, would allow quantum keys to be shared securely between users of a quantum encryption system.

The storage and retrieval of individual photons in an atomic gas was first demonstrated in 2005 by two independent groups — one led by Hau and the other working at the Georgia Institute of Technology. The ability to store photons without destroying entanglement is crucial for the transport of single photons over large distances, where the memories would work as quantum repeaters that boost the optical signal without destroying its quantum nature.

Gravity-test constrains new forces

Physicists in the US have used a tabletop experiment to rule out the existence of strong, gravitational-like forces at short length scales. Such forces, which could hint at additional space–time dimensions or weird new particles, would cause Newton’s inverse square law of gravity to break down. By directly measuring the gravitational force on a micromechanical cantilever, however, Andrew Geraci and co-workers at Stanford University have found no evidence for such effects down to a distance of about 10 μm.

“These are the most stringent constraints on non-Newtonian forces to date at this length scale” – Andrew Geraci

The result represents a small reduction in the amount of “wiggle-room” available in theories that attempt to incorporate gravity with the other three forces of nature, in particular string theory. “These are the most stringent constraints on non-Newtonian forces to date at this length scale,” says Geraci, who is currently based at the National Institute of Standards and Technology in Boulder.

Mysterious force

Gravity is the most mysterious of nature’s four known forces. Because it is so weak, researchers have only been able to test Newton’s inverse square law — which states that the gravitational force between two masses is inversely proportional to the square of their separation — down to distances of about 0.1 mm in the last few years. Compare this with electromagnetism, which is some 40 orders of magnitude stronger than gravity and has been tested at subatomic scales.

On the theoretical side, gravity poses even more of a challenge. Unlike electromagnetism and the strong and weak nuclear forces, which are described by quantum field theories, gravity is described by Einstein’s general relativity — a geometric theory which reduces to Newtonian gravity in everyday situations but which breaks down at the quantum scale. The lack of experimental constraints on gravity at distances less than a millimetre has given theorists broad scope to hypothesize how gravity relates to the other three forces. Testing gravity at short length scales therefore helps put these ideas on more solid ground.

Attonewton measurement

Geraci and co-workers have used an interferometer to measure how much a 0.25 mm-long silicon cantilever loaded with a 1.5 μg test mass is displaced by a source mass located 25 μm beneath it (arXiv:0802.2350; submitted to Phys. Rev. D). By keeping the apparatus at cryogenic temperatures to reduce thermal noise, the team was able to measure the attonewton (10⁻¹⁸ N) forces between the two masses directly. This contrasts with the precision torsion-balance experiments that have tested Newtonian gravity at sub-millimetre scales by measuring torque, namely those performed by Eric Adelberger’s group at the University of Washington (see: Exclusion zone).

As is usual when testing for departures from the inverse square law, the team then looked for evidence of a corrected “Yukawa-type” potential, V = V_N(1 + αe^(-r/λ)), where V_N is the Newtonian gravitational potential, α describes the relative strength of the new force and λ is its range. Although the Stanford apparatus does not allow the Newtonian gravitational interaction between the test masses (i.e. corresponding to α ≈ 1) to be measured, the experiment probes large forces in the regime where α = 10⁴–10⁸ at length scales of about 10 μm. It is here that some rather outlandish theories suggest powerful new forces will show up.
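For concreteness, here is a short sketch of that parametrisation; the values of α and λ are illustrative points inside the probed range (α of order 10⁴–10⁸ at roughly 10 μm), not fitted results from the experiment.

```python
# Yukawa-corrected gravitational potential: V(r) = V_N(r) * (1 + alpha * exp(-r / lam)).
# Here we just evaluate the correction factor at a few separations, since V_N
# itself is ordinary Newtonian gravity.
import math

alpha = 1e6        # assumed relative strength of the new force
lam = 10e-6        # assumed range: 10 micrometres

for r in (5e-6, 10e-6, 25e-6, 100e-6):
    factor = 1 + alpha * math.exp(-r / lam)
    print(f"r = {r * 1e6:5.0f} um: gravity enhanced by a factor of {factor:.3g}")
```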

Extra dimensions

A decade ago, theorists suggested that all matter and the quantum fields of the strong, weak and electromagnetic interactions (i.e. our entire material universe) are confined to a 3D “brane” that floats around in a higher dimensional space where gravitons (the supposed mediators of gravity) roam wild. The apparent weakness of gravity would therefore be an illusion to inhabitants of the brane: gravity is merely diluted as it spreads out into additional dimensions that we cannot perceive.

In accordance with Gauss’s law, each extra dimension would increase the exponent in Newton’s law: if a single extra dimension turned up at a certain length scale, then the inverse square law would become an inverse cube law at that scale, for example. Provided the extra dimensions are large enough, they could also harbour new “gauge” particles that mediate forces many orders of magnitude stronger than gravity. Unlike gravity, the range of such forces (which are of the order of a fraction of a millimetre owing to the finite masses of the particles) might be independent of the number and size of the extra dimensions, and the forces can be repulsive rather than attractive.
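A crude two-regime sketch of that change of exponent is given below; the 0.1 mm size assumed for a single extra dimension is chosen purely for illustration.

```python
# With n extra dimensions of size R, Gauss's law gives F ~ 1/r^(2+n) for r << R,
# returning to the familiar inverse square law for r >> R.
def force_exponent(r, r_extra=1e-4, n_extra=1):
    """Effective power-law exponent of gravity at separation r (crude two-regime model)."""
    return 2 + n_extra if r < r_extra else 2

for r in (1e-5, 1e-3):   # 10 micrometres and 1 millimetre
    print(f"r = {r:.0e} m: F ~ 1/r^{force_exponent(r)}")
```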

“We have taken out a sizeable chunk of the parameter space for moduli” – Andrew Geraci

Similar forces can arise in string theory, which is the inspiration behind such “braneworld” scenarios and the leading contender for a quantum theory of gravity. String theory demands six extra dimensions, and the exchange of certain “light moduli” particles that determine the way these dimensions are compactified at small scales could mediate forces at least 10,000 times as strong as Newtonian gravity. These forces may be present even if the dimensions are not large but curled up at the Planck length (about 10⁻³³ cm), as most string theorists think is the case.

Moduli constraints

According to Geraci, it is these light moduli that the Stanford experiment has constrained most dramatically. “We have taken out a sizeable chunk of the parameter space for moduli,” he says. “Of course we cannot rule out string theory, but we are able to place meaningful bounds on the mass [which determines the range, λ] and coupling strength of moduli [which determines α] that are quite generic.” This adds to constraints set in 2003 by researchers at the University of Colorado using a planar oscillator separated by a gap of 108 μm, which operates at length scales that are intermediate between the Stanford and Washington set-ups (see: Exclusion zone).

The results represent a four-fold improvement on previous experiments performed by the same team in 2003, but the researchers are currently working on a new “rotary” experiment that has a larger interaction area for the test and source masses. With this, the team expects to constrain the parameter space in α by a further one to two orders of magnitude within a year or so, which would allow the “strange-modulus region” to be surveyed almost entirely (see: Exclusion zone).

“This is a pioneering experiment,” says Stanford theorist Savas Dimopoulos, who was one of the first to propose braneworld scenarios in 1998.
