
And the winner is…

By Matin Durrani

For those of you wondering where we get all our ideas for news stories on physicsworld.com: well, obviously we have a bulging contacts book, we scour many of the leading journals, and we keep tabs on all of the key scientific experiments, facilities and space missions.

But, like all journalists, we also rely on press releases, including those supplied by the AlphaGalileo service, which lists many of the latest releases from institutions in Europe, and those from EurekAlert!, a similar US-based service run by the American Association for the Advancement of Science.

Now, EurekAlert! has revealed which press releases posted on its website were looked at most by journalists during 2010.

Nine of the top 10 were in biology and the biosciences, but the winner was one related to physics.

Curiously, it has nothing to do with anything that we at physicsworld.com would regard as all that significant – say the search for extrasolar planets or the hunt for the Higgs – and it certainly didn’t come anywhere near to making our list of the top 10 breakthroughs of 2010.

No, the top press release accessed by journalists on EurekAlert! was on a relatively obscure branch of physics. It concerned evidence, presented in the journal Science, that an unusual form of symmetry known as E8 – which a small number of physicists believe underlies a theory of everything – may have been spotted in a solid material for the first time.


We wrote about the paper at the time, in January last year, and you can read our story here.

The paper may have proved so popular because it claimed to have shown that this 8D symmetry group describes the spectrum of spin configurations that emerge when a 1D chain of spins is chilled to near absolute zero and subjected to a specific magnetic field. The finding also suggested that the “golden mean” – previously seen only in mathematics and the arts – exists in solid matter on the nanoscale.

But – and I’m guessing here – it may actually have been because journalists remember a controversial (and unrefereed) paper on E8, entitled “An exceptionally simple theory of everything”, by an obscure, independent physicist called Garrett Lisi, who is a keen surfer and does not follow a conventional academic life. Those traits – and some pretty pictures associated with E8 symmetry – led to a fair amount of press coverage, and far more than many string theorists felt, and still feel, it deserves.

In their view, this latest accolade from EurekAlert! will probably only make the situation worse.

A little bit of high-Tc history

[Image: Paul Grant holding a high-temperature superconductor demonstration kit]

By Hamish Johnston

That’s Paul Grant (right) holding one of the high-temperature superconductor demonstration kits that he and his colleagues developed at IBM’s Almaden lab. The idea is that you fill the reservoir with liquid nitrogen and then place a magnet above the superconductor, where it will float.

The kit that Paul is holding was made in 1987 for IBM board members – and if you look at the photograph below you can see “IBM” embossed on the disc of a YBCO superconductor.

But as a 1987 New Scientist article by Grant points out, it’s not that difficult to make your own high-Tc material.

The article describes how Grant’s daughter Heidi (pictured in the clipping held by Grant) and her high-school classmates were able to make their own YBCO superconductor – and then float a magnet over it.

Reading the article I had a strong sense of déjà vu. I had just been in a press conference where Kostya Novoselov was asked why graphene research took off like a rocket after he and Andre Geim worked out a way of making stand-alone sheets of the material. His answer was that it is fairly straightforward to make large, high-quality samples of graphene and study its many properties.

[Image: YBCO superconductor disc embossed with “IBM”]

But that’s where the similarity ends. Although physicists don’t understand everything about graphene, many properties have proven to be exactly as predicted by theory.

The same can’t be said about high-Tc superconductors, which have surprised and confounded physicists for 25 years. That was the subject of a talk today by the University of California’s Bob Dynes. “I will be surprised if there are no more surprises,” said Dynes.

Message from the Physical Society of Japan

By Susan Curtis at the APS March Meeting in Dallas, Texas

It was good to see representatives from the Physical Society of Japan at the APS March Meeting. Keizo Murata of Osaka City University, who is also editor of the Journal of the Physical Society of Japan (JPSJ), wanted us to pass this message on to anyone in the physics community who wishes to make a donation to the relief efforts following the earthquake and tsunami:

“We, the Physical Society of Japan and the JPSJ, deeply appreciate the encouragement we have received from our colleagues all over the world.

“We welcome your donations to the relief and recovery from Japan’s disaster in March 2011. To help this, as well as to avoid any problems with currency exchange, we recommend that you make your donations via authorized organizations in your own country, such as the American Red Cross.

“However, to share your warm sympathy with the worldwide physics community, we would like to recognize your donation. This will be sure to encourage members of the Physical Society of Japan and people around us.

“To achieve this:

1. Send an e-mail to save.japan3.11@jps.or.jp with the subject: “donation Japan disaster” and your name.

2. In the e-mail please note in this order:

• Your name
• Your email address
• Your institution/affiliation
• Your country
• Value of donation (optional)
• The organization that took your donation
• Date of donation
• Any message (optional)
• Permission to use your message with your name on our site (yes/no)

3. If you should make further donations, please send another e-mail to save.japan3.11@jps.or.jp but include “your name (nth time)” in the email.

“Thank you for your kind co-operation,

The Physical Society of Japan
The Journal of the Physical Society of Japan

Murata also told us that JPSJ is still offering online services as normal, although some publications may be rescheduled.

Nuclear physicists protest at isotope facility closure

Physicists in the US have expressed dismay over the planned closure of the Holifield Radioactive Ion Beam Facility (HRIBF) at the Oak Ridge National Laboratory in Tennessee. Around 700 researchers have signed letters to William Brinkman, director of the Office of Science at the Department of Energy (DoE), warning that its closure will threaten the country’s lead in ion-beam research.

The closure of HRIBF would save the DoE’s Office of Science around $10.3m, with the money going instead towards two other projects. These are an upgrade of the Continuous Electron Beam Accelerator Facility at the Thomas Jefferson National Accelerator lab and HRIBF’s successor – the Facility for Rare Isotope Beams (FRIB) at Michigan State University. FRIB is expected to begin operations in 2019.

HRIBF is one of only four facilities in the world to produce radioactive beams using the isotope separator on-line (ISOL) technique – the others being at TRIUMF in Canada, SPIRAL at Grand Accélérateur National d’Ions Lourds in France and the On-Line Isotope Mass Separator at the CERN particle-physics lab near Geneva.

ISOL involves firing a beam of ions at a hot target, such as uranium, and then bringing the reaction products to rest. The products are separated according to mass and reaccelerated for further studies. These facilities can produce neutron-rich unstable nuclei of the kind thought to have played a vital role in the synthesis of heavy elements in stars.

Wider concerns

Nuclear physicists protesting against HRIBF’s closure have urged their colleagues to send a personalized letter to Brinkman as well as sign a letter of support for the facility. The letter has already been signed by some 700 physicists, including Nobel laureate Douglas Osheroff from Stanford University, who say they are “dismayed” to learn that the facility may close, claiming it would “represent a step backward” for US leadership in low-energy nuclear physics. “The logic behind the decision escapes us,” they write. “We are deeply concerned about the process by which this decision was made. There was no direct input from the community.”

FRIB chief scientist Bradley Sherrill from Michigan State University, who penned his own letter to Brinkman, says that losing HRIBF will hurt not only US scientists but also those around the world. “Research demand for rare isotopes is growing worldwide due to the exciting science it enables,” he says. Indeed, Sherrill adds that the decision to close HRIBF will mean that the US is less prepared for FRIB and will put a strain on already highly subscribed facilities worldwide as US researchers look to Canada or Europe to do experiments.

“To allow continuation, HRIBF should not really close before FRIB opens,” says Peter Butler from Liverpool University in the UK, who is chair of the programme advisory committee at HRIBF. “We are all baffled by the logic of this decision.”

There is still a possibility that the letters will have an impact, as the 2012 funding proposal is only a request from the US administration and still has to go through Congress. Indeed, Sherrill is hopeful that the final budget can still include funding for HRIBF that does not affect either the Jefferson project or FRIB. But he admits that if Congress did accept the proposal, then HRIBF would indeed close.

Creating quantum dots from buckyballs

Researchers in Singapore have found that carbon-60 molecules, or “buckyballs”, can be used to make graphene quantum dots that are geometrically well defined. Such structures could be ideal in next-generation electronics as single-electron transistors in nanoscale circuits.

Graphene is a 2D sheet of carbon just one atom thick, and is usually made by cleaving small crystals of graphite. At the molecular level it looks like a sheet of chicken wire – a spread of joined-up benzene rings.

The “wonder material”, as it is sometimes called, is often touted as a replacement for silicon as the electronic material of choice thanks to its unusual electronic, thermal and mechanical properties. These include the fact that electrons in the material behave like relativistic particles that have no rest mass and can therefore travel at speeds of around 10⁶ m/s. These properties remain even when graphene devices are scaled down to a few benzene rings.

Getting the most out of graphene

Until now, researchers have only been able to make transistors from ribbons of graphene, but such long shapes do not maximize the conductivity of the material. Transistors made from structures that confine electrons quantum mechanically could be much more effective. One such structure is the quantum dot – a nanoscale object, comprising several thousand atoms, that forms a tiny semiconductor crystal. But existing “top-down” methods to fabricate dots – from ribbons, for example – are still quite complicated.

Now, Kian Ping Loh and colleagues at the National University of Singapore have developed the first bottom-up approach to making graphene quantum dots smaller than 10 nm in size, using fullerene molecules as precursors. What is more, the dots produced are all regularly sized and all the same shape – unlike those produced using top-down techniques.

Loh and co-workers generated their quantum dots by decomposing carbon-60 molecules at high temperatures on a ruthenium metal surface. The metal acts as a catalyst and causes the C60 to break down into carbon clusters. The researchers employed scanning tunnelling microscopy to observe how the carbon clusters diffused across the metal surface and how they aggregated to form quantum dots.

Controlled manufacture

“By carefully controlling the density of the clusters on the surface, we were able to limit cluster aggregation and found that different-shaped clusters (flower-shaped and hexagonal) merged and crystallized into geometrically well-defined, hexagonal graphene quantum dots at temperatures of around 825 K,” says Loh.

To their delight, the researchers also found that the quantum dots have an energy gap that scales inversely with their size – that is, the smaller the dot, the larger its energy gap. This is an important result because graphene is normally metallic, and engineering a band gap in this way to make the material semiconducting is one of the main goals of graphene research today.
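That inverse scaling is roughly what a simple confinement picture would suggest. The sketch below is a back-of-envelope estimate of my own, not the authors' analysis: it assumes graphene's linear (Dirac) dispersion with a Fermi velocity of about 10⁶ m/s and a smallest confined wavevector of order π/L for a dot of size L.

```python
import math

# Hedged back-of-envelope estimate (not the authors' model): the confinement
# gap of a graphene quantum dot of size L, assuming a linear Dirac dispersion
# E = hbar * v_F * k and a smallest confined wavevector k ~ pi / L.

HBAR = 1.054571817e-34      # J s
E_CHARGE = 1.602176634e-19  # C (i.e. J per eV)
V_FERMI = 1.0e6             # m/s, approximate Fermi velocity in graphene

def confinement_gap_eV(size_nm):
    """Order-of-magnitude gap (in eV) for a dot of linear size `size_nm`."""
    L = size_nm * 1e-9
    gap_joules = HBAR * V_FERMI * math.pi / L
    return gap_joules / E_CHARGE

for size in (2.5, 5.0, 10.0):
    print(f"{size:5.1f} nm dot  ->  gap ~ {confinement_gap_eV(size):.2f} eV")
```

Running it gives gaps of a few tenths of an electronvolt for sub-10 nm dots, halving as the dot size doubles – the inverse scaling the researchers report.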

Loh told physicsworld.com that he is confident the technique could be used to produce large quantities of these dots in the future.

Spurred on by their results, the researchers now plan to isolate the quantum dots they made and fabricate devices from them. “Being chemists, we are also interested in studying the reactivity of the dots,” added Loh.

The work is detailed in Nature Nanotechnology.

A robot with a wink and a smile

[Image: David Hanson with the Philip K Dick robot]

By Hamish Johnston at the APS March Meeting in Dallas

One of the people in the photo (right) is a robot called Philip K Dick, and the other is David Hanson of Hanson Robotics, which is located just outside of Dallas.

David was talking at the March Meeting about the challenges of making robots that are more lifelike. And not just in terms of superficial looks – he’s put a great deal of effort into getting facial expressions right and generally making the robot respond as if it were human.

The problem today is that lifelike robots tend to be creepy (I believe that is a technical term in the industry). This is because getting just a few minor things wrong about how the machine behaves puts it in an uncomfortable place between living and dead – at least, that’s my opinion.

So when will you meet the first robot that is so lifelike that you think it’s human? Hanson thinks it will happen in less than a decade.

I don’t know about you, but that’s creepy!

Tiny antennas and transistors

[Image: Mark Reed discussing single-molecule transistors at the APS March Meeting]

By Hamish Johnston at the APS March Meeting in Dallas

It’s the second day of the March Meeting and I’ve just done three video interviews, which should start appearing on physicsworld.com in April.

I also managed to make it to a few press conferences, including one on how to make extremely small transistors and antennas.

Above you can see Mark Reed of Yale University, who was the first to create a transistor from a single molecule. Reed and colleagues place an organic molecule between two electrodes, which function as the source and drain in a field-effect transistor. The molecule is suspended above a third electrode, which acts as the gate.

You might think that Reed wants to make these tiny transistors to ensure that Moore’s law – the relentless miniaturization of computer chips – continues right down to the molecular level. However, he points out that the biggest threat to Moore’s law today is how to get rid of all the heat generated by a dense clump of tiny transistors. The molecular transistor doesn’t help much with that, and Reed is more interested in studying the fundamental physics of these quantum devices.

Also speaking at the press conference was Niek van Hulst of the Institute of Photonic Sciences (ICFO) in Barcelona. Van Hulst and colleagues have made tiny antennas that can broadcast and receive visible light.

Such an antenna could be placed very close to a molecule of interest, for example, and capture all the light emitted by that molecule. Conversely, it could also be used to direct intense light at just one molecule. Both of these abilities could prove very useful for molecular spectroscopy.

The team has also managed to put a tiny antenna on a scanning tunnelling microscope (STM) tip. Since the antenna is much smaller than the wavelength of the light it emits, such a set-up could be used to image molecules with resolutions much smaller than the wavelength of the light – beating the diffraction limit.

The most beautiful application, though, was an array of antennas coupled to quantum dots, which broadcasts the flickering light of quantum noise within the dots.

Fire at US underground lab appears under control

Within the next 48 hours a crew is expected to enter the Soudan Underground Laboratory for the first time since a fire broke out last Thursday at the mine that houses it. The facility, which is managed by the University of Minnesota, is home to a number of high-profile physics experiments, including the MINOS neutrino detector and the detector of the Cryogenic Dark Matter Search (CDMS) experiment. The lab lies more than 700 m below ground where the rocks above shield the experiments from unwanted cosmic rays and other disturbances.

According to the Minnesota Department of Natural Resources (DNR), the fire was detected in the mine’s main lift shaft around 9 p.m. local time on Thursday (17 March) when nobody was in the mine. By Friday the DNR had established that the fire was blazing inside the shaft between levels 23 and 25 – just two levels above the physics laboratory, which is located 60 m below on the mine’s lowest level.

There were also fears that the lab could suffer flood damage after electrical systems automatically shut down, deactivating pumps that were designed to keep groundwater from entering the mine.

After fire-fighting efforts over the weekend, in which thousands of gallons of foam and water were sprayed into the mine, the Minnesota Interagency Fire Center reported on Sunday that the fire was 99% extinguished. Fire officials will only declare the blaze officially “out” once its source has been located and any smouldering ashes or embers have been extinguished.

A three-man team has already descended the lift shaft to restart some of the pumps. By Sunday night they had reached the physics lab on level 27, where they encountered a large amount of foam, which seems to have prevented them from entering the lab. The laboratory’s back-up systems, however, including infrared-sensitive cameras, have so far indicated that it escaped the initial fire.

“We already know that there was no flooding and, since we saw by video camera that – at least on Friday – there was no smoke on level 27,” says Kurt Riesselmann, a spokesperson for Fermilab, which manages the MINOS experiment, “our scientists expect that the experiments will look okay.”

Riesselmann told physicsworld.com on Monday that if conditions are safe, a crew will enter the underground lab in the next two days. The main question now, he says, is whether the unexpected and long power outage damaged the experiments.

Alfons Weber, a physicist at the University of Oxford who works on the MINOS experiment, is confident that the laboratory is robust enough to have escaped significant impact. “There is a lot of foam at the lowest level, but there seems to be no flooding,” he says. “The lab is sealed from the rest of the mine by heavy iron doors and we don’t expect damage.”

Update, 25 March: Marvin Marshak, associate director of the Soudan Underground Laboratory, told physicsworld.com that a team entered the laboratory on Tuesday and discovered that foam had entered the room after the main door failed under hydrostatic pressure. The foam was contained, however, to the 150-long entrance section of the room, and the steel face of the MINOS detector appears to have blocked its further progress. The detector’s main electronics, located on the side of the machine, appear to have escaped unscathed.

Staff will be re-entering the lab today to conduct a more comprehensive assessment of the damage and to re-power the equipment in stages. He says that one of the remaining fears for the lab is that an absence of electricity over the past week could have caused damage to equipment such as background germanium detectors that may have overheated.

Metrology in the balance


The event at the Royal Society in London in January began precisely on time. But after the final delegates had taken their seats, Stephen Cox, the society’s executive director, noted sheepishly that the wall clock was running “a little slow” and promised to reset it. Cox knew the audience cared about precision and would appreciate his vigilance. As the world’s leading metrologists, they had gathered to discuss a sweeping reform of the scientific basis of the International System of Units (SI), in the most comprehensive revision yet of the international measurement structure that underpins global science, technology and commerce.

If approved, these changes would involve redefining the seven SI base units in terms of fundamental physical constants or atomic properties. The most significant of these changes would be to the kilogram, a unit that is currently defined by the mass of a platinum–iridium cylinder at the International Bureau of Weights and Measures (BIPM) in Paris, and the only SI unit still defined by an artefact. Metrologists want to make these changes for several reasons, including worries about the stability of the kilogram artefact, the need for greater precision in the mass standard, the availability of new technologies that seem able to provide greater long-term precision, and the desire for stability and elegance in the structure of the SI.

Over the two days of the meeting, participants expressed varied opinions about the force and urgency of these reasons. One of the chief enthusiasts and instigators of the proposed changes is former BIPM director Terry Quinn, who also organized the meeting. “This is indeed an ambitious project,” he said in his opening remarks. “If it is achieved, it will be the biggest change in metrology since the French Revolution.”

The metric system and the SI

The French Revolution did indeed bring about the single greatest change in metrology in history. Instead of merely reforming France’s unwieldy inherited weights and measures, which were vulnerable to error and abuse, the Revolutionaries imposed a rational and organized system. Devised by the Académie des Sciences, it was intended “for all times, for all peoples” by tying the length and mass standards to natural standards: the metre to one-forty-millionth of the Paris meridian and the kilogram to the mass of a cubic decimetre of water. But maintaining the link to natural standards proved impractical, and almost immediately the length and mass units of the metric system were enshrined instead using artefacts deposited in the National Archives in 1799. The big change now being championed is to achieve at last what was the aim in the 18th century – to base our standards on constants of nature.

Despite its simplicity and rationality, the metric system took decades to implement in France. Other nations eventually began to adopt it for a mixture of motives: fostering national unity; repudiating colonialism; enhancing competitiveness; and as a precondition for entering the world community. In 1875 the Treaty of the Meter – one of the first international treaties and a landmark in globalization – removed supervision of the metric system from French hands and assigned it to the BIPM. The treaty also initiated the construction of new length and mass standards – the International Prototype of the Metre and the International Prototype of the Kilogram – to replace the metre and kilogram made by the Revolutionaries (figure 1). These were manufactured in 1879 and officially adopted in 1889 – but they were calibrated against the old metre and kilogram of the National Archives.

At first, the BIPM’s duties primarily involved caring for the prototypes and calibrating the standards of member states. But in the first half of the 20th century, it broadened its scope to cover other kinds of measurement issues as well, including electricity, light and radiation, and expanded the metric system to incorporate the second and ampere in the so-called MKSA system. Meanwhile, advancing interferometer technology allowed length to be measured with a precision rivalling that of the metre prototype.

In 1960 these developments culminated in two far-reaching changes made at the 11th General Conference on Weights and Measures (CGPM), the four-yearly meeting of member states that ultimately governs the BIPM. The first was to redefine the metre in terms of the light from an optical transition of krypton-86. (In 1983 the metre would be redefined again, in terms of the speed of light.) No longer would nations have to go to the BIPM to calibrate their length standards; any country could realize the metre, provided it had the technology. The International Prototype of the Metre was relegated to a historical curiosity; it remains in a vault at the BIPM today.

The second revision at the 1960 CGPM meeting was to replace the expanded metric system with a still greater framework for the entire field of metrology. The framework consisted of six base units – the metre, kilogram, second, ampere, degree Kelvin (later the kelvin) and candela, with a seventh, the mole, added in 1971 – plus a set of “derived units”, such as the newton, hertz, joule and watt, built from these six. It was baptized the International System of Units, or SI after its French initials. But it was still based on the same artefact for the kilogram.

This 1960 reform was the first step towards the current overhaul, which seeks to realize a centuries-old dream of scientists – hatched long before the French Revolution – to tie all units to natural standards. More steps followed. With the advent of the atomic clock and the ability to measure atomic processes with precision, in 1967 the second was redefined in terms of the hyperfine levels of caesium-133. The strategy involved scientists measuring a fundamental property with precision, then redefining the unit in which the property was measured in terms of a fixed value of that property. The property then ceased to be measurable within the SI, and instead defined the unit.

The kilogram, however, stubbornly resisted all attempts to redefine it in terms of a natural phenomenon: mass proved exceedingly difficult to scale up from the micro- to the macro-world. But because mass is involved in the definitions of the ampere and mole, this also halted attempts for more such redefinitions of units. Indeed, in 1975, at a celebration of the 100th anniversary of the Treaty of the Meter in Paris, the then BIPM director Jean Terrien remarked that tying the kilogram to a natural phenomenon remained a “utopian” dream. It looked like the International Prototype of the Kilogram (IPK) was here to stay.

Drifting standard

A pivotal event took place in 1988, when the IPK was removed from its safe and compared with the six identical copies kept with it, known as the témoins (“witnesses”). The previous such “verification”, which took place in 1946, had revealed slight differences between these copies, attributable to chemical interactions between the surface of the prototypes and the air, or to the release of trapped gas. The masses of the témoins appeared to be drifting upwards with respect to that of the prototype. The verification in 1988 confirmed this trend: not only the masses of the témoins, but also those of practically all the national copies, had drifted upwards with respect to that of the prototype – by about 50 µg on average, a rate of change of about 0.5 parts per billion per year. The IPK behaved differently, for some reason, from its supposedly identical siblings.
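The quoted rate is simple arithmetic: a 50 µg shift on a 1 kg artefact is 50 parts per billion, accumulated over roughly the century between the prototype's adoption in 1889 and the 1988 verification. A quick sketch of that check:

```python
# Quick check of the quoted drift rate: ~50 µg relative to a 1 kg artefact,
# accumulated over roughly a century (1889 adoption to the 1988 verification).
drift_kg = 50e-9        # 50 µg expressed in kg
mass_kg = 1.0           # the kilogram prototype
years = 1988 - 1889     # ~99 years between adoption and verification

relative_drift = drift_kg / mass_kg          # ~5e-8, i.e. 50 parts per billion
rate_ppb_per_year = relative_drift * 1e9 / years

print(f"total drift : {relative_drift * 1e9:.0f} ppb")
print(f"drift rate  : {rate_ppb_per_year:.1f} ppb per year")   # ~0.5 ppb/yr
```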

Quinn, who became the BIPM’s director in 1988, outlined the worrying implications of the IPK’s apparent instability in an article published in 1991 (IEEE Trans. Instrum. Meas. 40 81). Because the prototype is the definition of the kilogram, technically the témoins are gaining mass. But the “perhaps more probable” interpretation, Quinn wrote, is that “the mass of the IPK is falling with respect to that of its copies”, i.e. the prototype itself is unstable and losing mass. Although the current definition had “served the scientific, technical and commercial communities pretty well” for almost a century, efforts to find an alternative, he suggested, should be redoubled. Any artefact standard will have a certain level of uncertainty because its atomic structure is always changing – in some ways that can be known and predicted and therefore compensated for, in others that cannot. Furthermore, the properties of an artefact vary slightly with temperature. The ultimate solution would be to tie the mass standard, like the length standard, to a natural phenomenon. But was the technology ready? The sensible level of accuracy needed to replace the IPK, Quinn said, was about one part in 10⁸.

In 1991 two remarkable technologies – each developed in the previous quarter-century, and neither invented with mass redefinition in mind – showed some promise of being able to redefine the kilogram. One approach – the “Avogadro method” – realizes the mass unit using a certain number of atoms by making a sphere of single-crystal silicon and measuring the Avogadro constant. The “watt balance” approach, on the other hand, ties the mass unit to the Planck constant, via a special device that balances mechanical with electrical power. The two approaches are comparable because the Avogadro and Planck constants are linked via other constants the values of which are already well measured, including the Rydberg and fine-structure constants. Although in 1991 neither approach was close to being able to achieve a precision of one part in 10⁸, Quinn thought at the time that it would not be long before one or both would be able to do this. Unfortunately, his optimism was misplaced.

The sphere…

The Avogadro approach (figure 2) defines the mass unit as corresponding to that of a certain number of atoms using the Avogadro constant (N_A), which is about 6.022 × 10²³ mol⁻¹. It would, of course, be impossible to count that many atoms one by one, but the counting can instead be done by making a perfect enough crystal of a single chemical element and knowing the isotopic abundances of the sample, the crystal’s lattice spacing and its density. Silicon crystals are ideal for this purpose as they are produced by the semiconductor industry to high quality. Natural silicon has three isotopes – silicon-28, silicon-29 and silicon-30 – and initially it seemed that their relative proportions could be measured sufficiently accurately. Although measuring lattice spacing proved harder, metrologists drew on a technique using combined optical and X-ray interferometers (COXI), which was pioneered in the 1960s and 1970s at the German national-standards lab (PTB) and the US National Bureau of Standards (NBS) – the forerunner of the National Institute of Standards and Technology (NIST). It relates X-ray fringes – hence metric length units – directly to lattice spacings. For a time in the early 1980s the results of the two groups differed by a full part per million (ppm). This disturbing discrepancy was finally explained by an alignment error in the NIST instrument, leading to an improvement in the understanding of how to beat back systematic errors in the devices.
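In outline, the atom-counting works as follows: silicon crystallizes in a diamond cubic lattice with eight atoms per unit cell, so a measured lattice parameter, density and molar mass fix the Avogadro constant – and, turned around, the number of atoms in a kilogram. The sketch below is a back-of-envelope illustration using rounded textbook values for natural silicon; it ignores the surface-layer, defect and isotopic corrections that dominate the real experiment.

```python
# Hedged illustration of the Avogadro method (the real IAC analysis includes
# corrections for surface oxide layers, point defects and isotopic abundance).
# Silicon's diamond cubic unit cell contains 8 atoms, so
#   N_A = 8 * M / (rho * a**3)
# with M the molar mass, rho the density and a the lattice parameter.

ATOMS_PER_CELL = 8
MOLAR_MASS = 28.0855e-3     # kg/mol, natural silicon (approximate)
DENSITY = 2329.0            # kg/m^3 (approximate)
LATTICE_PARAM = 543.1e-12   # m (approximate)

avogadro = ATOMS_PER_CELL * MOLAR_MASS / (DENSITY * LATTICE_PARAM**3)
print(f"N_A ~ {avogadro:.4e} mol^-1")       # ~6.02e23 mol^-1

# Turned around, a fixed N_A says how many silicon atoms make up one kilogram:
atoms_per_kg = avogadro / MOLAR_MASS
print(f"atoms in 1 kg of silicon ~ {atoms_per_kg:.3e}")
```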

The source of uncertainty that turned out to be much more difficult to overcome involved determining the isotopic composition of silicon. This appeared to halt progress toward greater precision in the measurement of the Avogadro constant at about three parts in 10⁷. Not only that, but the first result, which appeared in 2003, showed a difference from the watt-balance results of more than 1 ppm. There was a strong suspicion that the difference stemmed from the measurements of the isotopic composition of the natural silicon used in the experiment. The leader of the PTB team, Peter Becker, then had a stroke of luck. A scientist from the former East Germany, who had connections to the centrifuges that the Soviet Union had used for uranium separation, asked Becker if it might be possible to use enriched silicon. Realizing that a pure silicon-28 sample would eliminate what was then thought to be the leading source of error, Becker and collaborators jumped at the opportunity. Although buying such a sample would be too costly for a single lab – an eye-watering €2m for 5 kg of the material – in 2003 representatives of Avogadro projects from around the world decided to pool their resources to buy the sample and form the International Avogadro Coordination (IAC). Becker at the PTB managed the group, parcelling out tasks such as characterization of purity, lattice spacing and surface measurements to other labs.

The result was the creation of two beautiful spheres. “It looks like what we’ve made is just another artefact like the kilogram – what we are trying to get away from,” Becker said at January’s Royal Society meeting. “It’s not true – the sphere is only a method to count atoms.”

…and the balance

The second approach to redefining the kilogram involves an odd sort of balance. Whereas an ordinary balance compares one weight against another – a bag of apples, say, versus something else of known weight – the watt balance matches two kinds of forces: the mechanical weight of an object (F = mg) with the electrical force of a current-carrying wire placed in a strong magnetic field (F = ilB), where i is the current in the coil, l is its length and B is the strength of the field. The device (figure 3) is known as a watt balance because if the coil is moved at speed u, it generates a voltage V = Blu – and hence, by mathematical rearrangement of the above expressions, the electrical power (Vi) is balanced by mechanical power (mgu). In other words, m = Vi/gu.

In modern watt balances, the current, i, can be determined to a very high precision by passing it through a resistor and using the “Josephson effect” to measure the resulting drop in voltage. Discovered by Brian Josephson in 1962, this effect describes the fact that if two superconducting materials are separated by a thin insulating layer, pairs of electrons on either side couple in such a way that microwave radiation of frequency f creates a voltage across the junction of V = hf/2e, where h is Planck’s constant and e is the charge on the electron. The resistance of the resistor, meanwhile, is measured using the “quantum Hall effect”, which describes the fact that the flow of electrons in 2D systems at ultralow temperatures is quantized, with the conductivity increasing in multiples of e²/h. The voltage, V, is also measured using the Josephson effect, while the speed of the coil, u, and the value of g can also be easily obtained.

What is remarkable about the watt balance is how it relies on several astonishing discoveries, none of which were made by scientists attempting mass measurements. One is the Josephson effect, which can measure voltage precisely. Another is the quantum Hall effect (discovered by Klaus von Klitzing in 1980), which can measure resistance precisely. The third is the concept of balancing mechanical and electrical power, which can be traced back to Bryan Kibble at the UK’s National Physical Laboratory (NPL) in 1975, who had actually been trying to measure the electromagnetic properties of the proton. These three discoveries can now be linked in such a way that the kilogram can be measured in terms of the Planck constant. By “bootstrapping”, the process could now in principle be reversed, and a specific value of the Planck constant used to define the kilogram.
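To see how the balance ties mass to the Planck constant, here is a deliberately simplified sketch with illustrative numbers – nothing like the error budget or operating values of a real instrument. The induced voltage is measured against a Josephson standard, V = nfh/2e; the current is obtained from a Josephson voltage dropped across a quantum-Hall resistance R = h/(je²); and in the product Vi the electron charge cancels, leaving the mass expressed in terms of h, measured frequencies, g and the coil speed u.

```python
# A hedged, highly simplified sketch of the watt-balance bookkeeping --
# the step numbers, frequencies and speeds below are illustrative only.

H_PLANCK = 6.62607015e-34   # J s
E_CHARGE = 1.602176634e-19  # C

# Moving phase: the coil, moving at speed u, generates a voltage V that is
# measured against a Josephson standard, V = n2 * f2 * h / (2e).
f2, n2 = 75e9, 6450                         # microwave frequency (Hz), step number
V = n2 * f2 * H_PLANCK / (2 * E_CHARGE)     # ~1.0 V

# Weighing phase: the current i is obtained from a Josephson voltage dropped
# across a quantum-Hall resistance R = h / (j * e^2).
f1, n1, j = 75e9, 832000, 2
R = H_PLANCK / (j * E_CHARGE**2)            # ~12.9 kOhm
i = (n1 * f1 * H_PLANCK / (2 * E_CHARGE)) / R   # ~10 mA

# Power balance: V * i = m * g * u  =>  m = V * i / (g * u)
g, u = 9.81, 2.0e-3                         # local gravity (m/s^2), coil speed (m/s)
m = V * i / (g * u)
print(f"m = {m:.4f} kg")

# The same mass written so that e cancels: only h, the measured frequencies,
# the integers, g and u remain -- mass is traceable to the Planck constant.
m_from_h = (n1 * n2 * f1 * f2 * j / 4) * H_PLANCK / (g * u)
print(f"m = {m_from_h:.4f} kg (explicitly proportional to h)")
```

Both print statements give the same number, which is the point: once h is fixed, a watt-balance measurement realizes the mass unit from frequencies and mechanical quantities alone.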

In his famous popular lectures “The Chemical History of a Candle”, Michael Faraday called candles beautiful because their operation economically interweaves all the fundamental principles of physics then known, including gravitation, capillary action and phase transitions. A similar remark could be made of watt balances. Though not as pretty as polished silicon spheres, they nevertheless combine the complex physics of balances – which includes elasticity, solid-state physics and even seismology – with that of electromagnetism, superconductivity, interferometry, gravimetry and the quantum in a manner that exhibits deep beauty.

Towards the “new SI”

The Avogadro approach and watt balances each have their own merits (see “A tale of two approaches” below), but as the 21st century dawned, neither had reached an accuracy of better than a few parts in 10⁷, still far from Quinn’s target of one part in 10⁸. Nevertheless Quinn, who stepped down as BIPM director in 2003, decided to pursue the redefinition. Early in 2005 he co-authored a paper entitled “Redefinition of the kilogram: a decision whose time has come” (Metrologia 42 71), the subtitle derived from a (by then ironic) NBS report of the 1970s heralding imminent US conversion to the metric system. “The advantages of redefining the kilogram immediately outweigh any apparent disadvantages,” Quinn and co-authors wrote, despite the then apparent inconsistency of 1 ppm between the watt balance and silicon results. They were so confident of approval by the next CGPM in 2007 that they inserted language for a new definition into “Appendix A” of the BIPM’s official SI Brochure. Furthermore, they wanted to define each of the seven SI base units in terms of physical constants or atomic properties. In February 2005 Quinn organized a meeting at the Royal Society to acquaint the scientific community with the plan.

The reaction ran from lukewarm to hostile. “We were caught off guard,” as one participant recalled. The case for the proposed changes had not been elaborated, and many thought it unnecessary, given that the precision available with the existing artefact system was greater than that of the two newfangled technologies. Not only were the uncertainties achieved by the Avogadro and watt-balance approaches at least an order of magnitude away from the target Quinn had set in 1991, but there was also still this difference of 1 ppm to be accounted for. Nevertheless, the idea had taken hold, and in October 2005 the BIPM’s governing board, the International Committee for Weights and Measures (CIPM), adopted a recommendation that not only envisaged the kilogram being redefined as in Quinn’s 2005 paper, but also included redefinitions of four base units (kilogram, ampere, kelvin and mole) in terms of fundamental physical constants (h, e, the Boltzmann constant k and N_A, respectively). Quinn and his colleagues then published a further paper in 2006 in which they outlined specific proposals for implementing the CIPM recommendation, with the idea that they be agreed at the 24th General Conference in 2011.

In recent years both approaches have made significant progress. In 2004 enriched silicon in the form of SiF4 was produced in St Petersburg and converted into a polycrystal at a lab in Nizhny Novgorod. The polycrystal was shipped to Berlin, where a 5 kg rod made from a single crystal of silicon-28 was manufactured in 2007. The rod was sent to Australia to be fashioned into two polished spheres, and the spheres were measured in Germany, Italy, Japan and at the BIPM. In January this year the IAC reported a new determination of the Avogadro constant with a relative uncertainty of 3.0 × 10⁻⁸, tantalizingly close to the target (Phys. Rev. Lett. 106 030801). The result, the authors write, is “a step towards demonstrating a successful mise en pratique of a kilogram definition based on a fixed N_A or h value” and, they claim, is “the most accurate input datum for a new definition of the kilogram”.

Watt-balance technology, too, has been steadily developing. Devices with different designs are under development in Canada, China, France, Switzerland and at the BIPM. The results indicate the ability to reach an uncertainty of less than one part in 10⁷. Their principal problem, however, is one of alignment: the force produced by the coil and its velocity must be carefully aligned with gravity. And as the overall uncertainty is reduced, it gets ever harder to make these alignments. The previous difference of 1 ppm has been reduced to about 1.7 parts in 10⁷ – close but not quite close enough.

Still, these results have led to a near-consensus in the metrological community that a redefinition is not only possible but likely. One BIPM advisory committee has proposed criteria for redefinition: there should be at least three different experiments, at least one from each approach, with an uncertainty of less than five parts in 10⁸; at least one should have an uncertainty of less than two parts in 10⁸; and all results should agree within a 95% confidence level. The proposal drafted by Quinn and colleagues, for what is called the “new SI”, is almost certain to be approved. It does not actually redefine the kilogram but “takes note” of the intention to do so. The redefinition is now in the hands of the experimentalists, who are charged with meeting the above criteria.
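As a purely illustrative sketch of how such acceptance criteria might be checked against a set of results – the hypothetical numbers and the simple pairwise two-sigma consistency test below are stand-ins of my own, not the committee's actual statistical procedure – consider:

```python
# Hedged sketch of the stated acceptance criteria; the measurements are
# hypothetical and the 95% agreement test is approximated by a crude
# pairwise two-sigma check, not the committees' real procedure.

measurements = [
    # (label, method, offset of result from a common reference, relative uncertainty)
    ("watt balance A", "watt",     0.0e-8, 3.6e-8),
    ("watt balance B", "watt",     4.0e-8, 4.5e-8),
    ("IAC sphere",     "avogadro", 1.5e-8, 3.0e-8),
]

def meets_criteria(results):
    methods = {method for _, method, _, _ in results}
    enough_experiments = len(results) >= 3 and {"watt", "avogadro"} <= methods
    all_below_5e8 = all(u < 5e-8 for _, _, _, u in results)
    one_below_2e8 = any(u < 2e-8 for _, _, _, u in results)
    consistent = all(
        abs(v1 - v2) < 2 * (u1**2 + u2**2) ** 0.5
        for idx, (_, _, v1, u1) in enumerate(results)
        for _, _, v2, u2 in results[idx + 1:]
    )
    return enough_experiments and all_below_5e8 and one_below_2e8 and consistent

print("criteria met?", meets_criteria(measurements))
# False: no result is yet below two parts in 10^8 -- the situation in 2011.
```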

The greatest change of all?

These developments gave Quinn the confidence to organize another meeting. This time, he and his fellow organizers devised a careful strategy. The 150 participants at January’s Royal Society meeting included three Nobel laureates: John Hall of JILA (whose work contributed to the redefinition of the metre); Bill Phillips of NIST; and von Klitzing himself. Under the new proposals, the defining physical constants will no longer be measured quantities: within the SI, their numerical values are fixed (see “Towards a new SI”). Furthermore, the definitions are similar in structure and wording, and the connection to physical constants is made explicit. The language makes it clear what these definitions really say – what it means to tie a unit to a natural constant – giving them a conceptual elegance. The proposed new definition for the kilogram, for example, is “The kilogram, kg, is the unit of mass; its magnitude is set by fixing the numerical value of the Planck constant to be equal to exactly 6.626068… × 10⁻³⁴ when it is expressed in the unit s⁻¹ m² kg, which is equal to J s.” (The dots indicate that the final value has not yet been determined.)

The metrological community is vast and diverse, and different groups tend to have different opinions about the proposals. Those who make electrical measurements tend to be enthusiastic: h and e would now be exactly defined and much easier to work with. Moreover, the awkward split between the best available electrical units introduced by Kibble’s device and those available in the SI is eradicated. The only outright objection from this corner at the meeting, by von Klitzing, was tongue-in-cheek. “Save the von Klitzing constant!” he protested, pointing out that his eponymous constant, R_K = h/e², which has been conventionally set (outside the SI) at 25,812.807 Ω exactly for two decades, now becomes revalued in terms of e and h, making it long and unwieldy rather than short and neat. Yet he went on to express sympathy with the redefinition, citing Max Planck’s remark in 1900 that “with the help of fundamental constants we have the possibility of establishing units of length, time, mass and temperature, which necessarily retain their significance for all cultures, even unearthly and non-human ones” (Ann. Phys. 1 69).
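The revaluation is easy to see numerically: with CODATA-style approximate values of h and e, h/e² reproduces the conventional 25,812.807 Ω figure, but expressed in terms of exactly fixed h and e it is no longer a tidy round number. A quick check:

```python
# Quick check that the von Klitzing constant R_K = h / e^2 reproduces the
# conventionally fixed value of 25,812.807 ohms (constants approximate).
H_PLANCK = 6.62607015e-34    # J s
E_CHARGE = 1.602176634e-19   # C

r_k = H_PLANCK / E_CHARGE**2
print(f"R_K = h/e^2 = {r_k:.3f} ohms")   # ~25812.807 ohms
```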

The mass-measurement community tends to be less sanguine. Mass measurers can currently compare masses with about an order of magnitude greater precision – one part in 10⁹ – than they can achieve by directly measuring a constant. The new definitions thus appear to introduce more uncertainty into mass measurements than exists at present; in place of careful traceability back to a precisely measurable mass, you now have traceability back to a complicated experiment at various national laboratories. As Richard Davis, the recently retired head of the BIPM mass division, remarked about the SI, “It’s got to be like a piece of Shaker furniture: not just beautiful but functional.” Advocates, however, point out that comparison measurements conceal the uncertainty present in the kilogram artefact itself, so that ultimately no new uncertainty is introduced. “Uncertainty is conserved,” as Quinn remarked.

One group not present at the meeting was the students, educators and other members of the public interested in metrology. As the Chicago Daily Tribune complained after the SI was created in 1960, “We get the feeling that important matters are being taken out of the hands, and even the comprehension, of the average citizen.” Woe to the colour-blind seamstress, it continued, who can use a tape measure but cannot tell an orange–red wavelength. The half-joke concealed the worry that measurement matters, which should be simple for the average person to understand, were about to become too complex for anyone except scientists. One of the attractions of science for students is that the concepts and practices are perspicuous, or aim to be – but the new SI seems to put the foundations of metrology out of reach of all but insiders. Woe to the butcher and grocer, someone may jest when the new SI takes effect, who is not proficient in quantum mechanics.

Still, each age bases its standards on the most solid ground it knows, and it is appropriate that in the 21st century this includes the quantum. None of the attendees at the Royal Society objected in principle to the idea that the kilogram should eventually be redefined in terms of the Planck constant. “It’s a scandal that we have this kilogram changing its mass – and therefore changing the mass of everything else in the universe,” Phillips remarked at one point. A few people were bothered that it now appeared impossible for scientists to detect whether certain fundamental constants were changing their values, though others pointed out that such changes would be detectable by other means.

Many participants, however, were troubled that the Avogadro and watt-balance teams have produced two measurements that are still not quite in sufficiently good agreement, throwing a spanner, if not perhaps a large one, into the attempt to pick a single value. “The person who has only one watch knows what time it is,” said Davis, citing an ancient piece of metrology wit. “The person who has two is not sure.”

Quinn appears confident that the discrepancy will be resolved in a few years. The only clear controversy on show at the meeting concerned what would happen if it is not. Quinn wants to plunge ahead with the redefinition anyway, arguing that the discrepancy lies at a level of precision so far removed from everyday measurement that it would not affect measurement practice. Some objected, fearing that there would be secondary effects, such as in legal metrology, which concerns legal regulations incorporated into national and international agreements. Others worry about the perception of metrology and metrological laboratories if a value is fixed prematurely and then the mass scale must be changed in the light of better measurements in the future. NPL director Brian Bowsher, referring to the current climate-change controversy in which sceptics leap on any hint of uncertainty in measurements, stressed the importance of being “the people who take the time to get it right”.

Calling the new SI the greatest change since the French Revolution may be hyperbole. The advent of the SI in 1960 was possibly just as important in that it introduced new units and tied existing ones to natural phenomena for the first time. The new changes will also have scarcely any impact on measurement practice, and are largely for pedagogical and conceptual reasons. Nevertheless, the change is breathtaking in its ambition – the most extensive reorganization in metrology since the creation of the SI in 1960 – and the realization of a centuries-old dream.

The new SI also represents a transformation in the status of metrology. When the SI was created in 1960, metrology was regarded as something of a backwater in science – almost a service occupation. Metrologists built the stage on which scientists acted. They provided the scaffolding – a well-maintained set of measuring standards and instruments, and a well-supervised set of institutions that cultivated trust – that enabled scientists to conduct research. The new SI, and the technologies that make it possible, connect metrology much more intimately with fundamental physics.

A tale of two approaches

Avogadro method
Best measurement
The lowest uncertainty is the 30 parts per billion measurement made by the International Avogadro Coordination (IAC)

Advantages

  • The definition of the kilogram using a fixed value of the Avogadro constant is reasonably intuitive, but requires auxiliary defining conditions

Disadvantages

  • It is a very difficult experiment and requires a worldwide consortium (the IAC) to make the measurements on the two silicon-28 spheres currently in existence
  • The measurements are unlikely to be repeated, so the existing spheres will become a form of artefact standard subject to questions about their long-term stability

Watt-balance approach
Best measurement
The lowest uncertainty is the 36 parts per billion measurement made by the National Institute of Standards and Technology (NIST) in the US

Advantages

  • It is not an artefact standard; the technique measures a range of masses and is not limited to multiples and sub-multiples of 1 kg
  • A watt balance can be built and operated by a single measurement laboratory
  • The results from a worldwide ensemble of watt balances can be compared and combined. This would provide the world with a robust mass standard, which is better than the individual contributions
  • A fixed value of h, combined with a redefined ampere that fixes the value of the elementary charge, e, will be a big plus both for physics and high-precision electrical measurements

Disadvantages

  • Although the principle is simple, the implementation is not, and requires significant investment of money, time and good scientists
  • The definition of the kilogram using a fixed value of Planck’s constant is less intuitive than the current definition

Ian Robinson, National Physical Laboratory, UK

Fall-out from the Japan quake

By Matin Durrani

[Image: Randall Munroe's radiation dose chart]
The impact of the earthquake and subsequent tsunami that hit Japan earlier this month has been truly devastating, with the latest reports suggesting that 9000 people have died and a further 13,000 are currently unaccounted for.

But if you spend your days following media reports of the disaster, you’d be forgiven for thinking that the biggest catastrophe has been the damage to the Fukushima Daiichi nuclear plant.

I’ve sometimes felt as if the mainstream media almost want an epic nuclear disaster to take place so that they have something to get their teeth into and fill their rolling TV news bulletins.

I was therefore pleased to see a sober assessment of the true nuclear danger from the plant in a recent blog entry by Randall Munroe, a physics graduate best known for his comic-strip website xkcd.

The picture above, which you’ll need to click here to see in full, tries to quantify to the best of Munroe’s ability the real risks from the plant.

Sure, it would be great if the reactor had survived the earthquake and tsunami – and there’s no harm in making sure other reactors around the world are as safe as they can be, as many countries are doing – but this shouldn’t be the signal for the world to end the recent revival in nuclear power.

You only have to think about the damage caused by the BP oil spill in the Gulf of Mexico last year to see a true environmental disaster.

Of course, the Achilles heel of the nuclear industry is the fear of “radiation” and ionizing radiation in particular. You can’t see it or smell it, which makes it, to some at least, creepily scary.

But hopefully Munroe’s chart puts things in perspective a bit.

In the meantime, we’ll continue to follow how the quake is affecting Japan’s physics community. Things are not looking too bad, and the odd bent beamline is far from catastrophic given what else has been taking place.
