
An ethnic theory for plane crashes

By Joao Medeiros

Malcolm Gladwell
Malcolm Gladwell (Courtesy; Brooke Williams)

Malcolm Gladwell, the virtuoso author of The Tipping Point (which covered the work of physicists like Duncan Watts and Albert-Laszlo Barabasi) and Blink, came to London for one day to present his new book, Outliers, to a packed audience at the Lyceum Theatre.

Gladwell is a maverick science journalist (or what “maverick” used to mean pre-Sarah Palin). He invented “pop economics” with his writing, spawning a whole new class of books like Freakonomics, The Long Tail, Here Comes Everybody, …. He works for the New Yorker, where he regularly writes about his niche subject: everything.

Gladwell is not a typical science journalist. He’s an original observer (not necessarily an original thinker — he defines himself as a communicator of science) who is driven by his own curiosity rather than by the agenda of scientists. Whereas most science journalists browse the scientific literature in search of what’s hot in science, Gladwell follows his own instinct and curiosity. He starts his stories by asking very simple questions about pretty much anything that crosses his path: “What is Cesar Millan’s (from the TV show The Dog Whisperer) secret?”, “Why is there only one variety of ketchup?”, “Why do we usually relate genius to precocity?”, and so on. These are questions that most people would probably dismiss as random daydreaming.


Europe unveils 20-year astronomy roadmap

Europe’s funding agencies must increase astronomy spending by 20% in order to construct the next generation of space and ground-based telescopes. That’s the main conclusion of a roadmap published by scientists and funding bodies from 28 European nations as well as the European Space Agency (ESA) and the European Southern Observatory (ESO). The report also makes two ground-based projects — the European Extremely Large Telescope (E-ELT) and the Square Kilometer Array (SKA) — “clear top priorities” for construction within the next 10 years.

The ASTRONET Infrastructure Roadmap: A Strategic Roadmap for European Astronomy recommends how €2bn of European funds should be spent on astronomy over the next 10–20 years, prioritizing projects due to be built both within Europe and worldwide. The ASTRONET consortium was set up in 2005 as a European version of the US National Research Council’s decadal survey of astronomy and astrophysics.


The consortium recommends five ground-based projects for construction in the next 10 years. The €800m E-ELT and the €1.5bn SKA are among the large ground-based projects deemed “high priority” in the roadmap.

Largest optical telescope ever

The E-ELT will be the largest optical telescope ever built, with a 42 m mirror consisting of 900 hexagonal segments to study visible and infrared light. The SKA, to be built in either South Africa or Australia, will be an array of radio telescopes that will search for dark matter and look back to the first 100 million years after the Big Bang to study the evolution of galaxies.

The three other projects considered to be top priority in the roadmap are the Cherenkov Telescope Array to study high-energy gamma rays from black holes, Km3NeT — a 1 km3 neutrino detector in the Mediterranean — and a 4 m-class European Solar Telescope to be built in the Canary Islands.

Among space-based projects, ASTRONET recommended the space-based Laser Interferometer Space Antenna (LISA) project to detect gravitational waves and XEUS — a next-generation X-ray observatory — designed to explore how large black holes influence galaxies.

Mission to a gas giant

Also on the list is one of two missions to study Jupiter or Saturn and their satellites. The LAPLACE mission would study Jupiter and its ice-covered moon Europa, while TANDEM would place a balloon probe in the hazy atmosphere of Saturn’s moon Titan. One of the two will be selected next year, and the chosen mission will then compete with XEUS and LISA to be the first to be launched.

To fund all the projects recommended in the roadmap, Europe would have to increase spending on astronomy from €2bn to €2.4bn a year. “Of course it is down to governments to decide, but we believe we have a strong argument for the additional funding,” says Michael Bode, task leader for the ASTRONET Roadmap and head of the Astrophysics Research Institute at Liverpool John Moores University in the UK.

Bode also points to the positive effect that astronomy has on science education and the potential spin-offs into the high-tech industry. “The figure [of €2.4bn per year] we estimate includes everything from employees to construction”, Bode told physicsworld.com. “To put it in context, a 20% increase on this represents around 1 euro per citizen in Europe per year”, he added.
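As a quick sanity check of those figures (assuming a population of roughly 500 million for “Europe” — the population base is our assumption, not stated in the report), the arithmetic works out as follows:

```python
# Back-of-envelope check of the ASTRONET budget figures.
# Assumption: "Europe" is taken here as ~500 million citizens.
current_spend = 2.0e9  # euros per year on astronomy
target_spend = 2.4e9   # euros per year recommended by the roadmap
population = 500e6     # assumed, not stated in the report

increase = target_spend - current_spend
print(increase / current_spend)  # 0.2, i.e. the quoted 20% increase
print(increase / population)     # 0.8 euro per citizen per year
```

With these assumptions the increase comes to about €0.80 per citizen per year, consistent with Bode’s “around 1 euro”.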

Adopt “dual-track” policy on nuclear weapons, scientists tell Obama

By Hamish Johnston

A report released yesterday by a group of US scientists, including representatives of the American Physical Society, urges president-elect Obama to follow a “dual-track nuclear arms control and refurbishment/updating policy”.

This, says the report, is in line with Obama’s “vision of a nuclear-free world, and the continuing need to have a credible US deterrent as long as nuclear weapons exist”.

Although this “I’ll drop mine when you drop yours” approach makes as much moral sense as mutually assured destruction, the pragmatist in me knows the best we can hope for is that the US and others refrain from developing any new weapons — something the report calls for. And of course, pray that no-one (government, terrorist or otherwise) is demented enough to actually use one.

On a more cheery note, the report urges Obama to address the challenge of boosting global reliance on nuclear energy while controlling the risks of weapons proliferation.

The report also calls for the US and Russia to come to a new agreement on the simultaneous reduction of their nuclear weapons stockpiles.

Even more physics on film

Anton Zeilinger being interviewed in London

By Hamish Johnston

First it was Einstein and Eddington, then Leon Lederman …now, it’s Anton Zeilinger’s turn to hit the silver screen — or at least your computer screen.

Zeilinger was in London earlier this year to accept the inaugural Isaac Newton Medal from the Institute of Physics.

He also delivered the 2008 Isaac Newton Lecture, which was recorded and can now be viewed on the IOP’s website.

Zeilinger, who is at the University of Vienna, spoke on “Quantum Information and the Foundations of Quantum Mechanics”. You can also view an interview with the medal-winner on the website.

Our physics on film series continues shortly when Margaret Harris reveals whether her universe will ever be the same after BLAST!

A simpler route to invisibility

Two years ago researchers at Duke University in the US unveiled the first “invisibility cloak” — a device that can make objects vanish from sight, at least when viewed using a narrow band of microwave frequencies. Such cloaks work by causing electromagnetic waves to flow smoothly around the object and recombine on the other side in such a way as to make it appear that the waves travelled straight through the object unhindered.

Since then physicists have struggled to create cloaks that work across a wider range of frequencies and could be used, for example, to hide an object from radar. Now, Ulf Leonhardt of the University of St Andrews in the UK and Tomáš Tyc of Masaryk University in the Czech Republic have come up with a new way of using mathematics to describe an invisibility cloak — a breakthrough that the physicists say could lead to the development of broadband invisibility cloaks (Science DOI: 10.1126/science.1166332).

From a mathematical point of view, an invisibility cloak can be described as a transformation of flat space that makes the light follow a curved path around the object. The idea is to make a coordinate transformation that takes a point in space and expands it into a sphere, the interior of which is invisible to an observer on the outside. For this to work light must traverse the surface of the sphere in the same, infinitesimally short time it would take to pass the original point. As a result, the light must travel at an infinitely high speed on the surface of the sphere.

Infinitely fast

Amazingly, the phase velocity of light can approach infinity in some materials and metamaterials (without violating the special theory of relativity because the “signal velocity” remains the speed of light). This has allowed the Duke team and others to actually build invisibility cloaks. The problem, however, is this only occurs for light at certain resonant frequencies.

Leonhardt and Tyc made their theoretical breakthrough by using non-Euclidean geometry to describe the workings of their cloak. Unlike the more familiar Euclidean geometry, non-Euclidean geometry is not restricted to describing space in terms of perpendicular axes. In their work, the physicists used a non-Euclidean geometry based on the surface of a sphere, which they intersected with a Euclidean plane in an arrangement that resembles a globe partially wrapped by a piece of paper (see figure).

The plane represents the region away from the cloak containing the light source and the observer, while the spherical geometry contains the region to be cloaked. If the sphere is between the source and observer, some of the light from the source will travel from the plane onto the sphere, where it will naturally follow a curved path.

Resonances not needed

However, because of the way that the plane intersects the sphere, there is a small region on the sphere that these curved paths don’t cross. The trick, according to Leonhardt and Tyc, is to use a coordinate transformation to expand this into a space that could enclose a cloaked object. Because this does not involve expanding an infinitesimally small point, it does not require the light to travel at an infinitely high speed. This means that the operation of the cloak is no longer dependent on resonances in the material and should therefore work over a wider range of frequencies.

While the physicists haven’t actually built such a cloak, they say that their non-Euclidean approach could provide a blueprint for building a broadband cloak. In particular, it could be used to define the index of refraction at a specific point in the cloak — and for light travelling in a specific direction. This quantity is given by the ratio by which the transformation stretches space at that point to create the cloaked region.
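Schematically (in our own notation; the paper’s precise formulation differs in detail), if the cloaking transformation maps a line element dℓ of the original space to a line element dℓ′ of physical space, the index required along that direction is the local stretch factor:

```latex
n(\mathbf{r}, \hat{\mathbf{e}}) = \frac{\mathrm{d}\ell'}{\mathrm{d}\ell}
```

Because the non-Euclidean construction expands a finite region rather than a single point, this ratio, and hence n, stays finite everywhere in the cloak.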

Of course, it will be a challenge to actually engineer a material with an index of refraction that varies just so, but Leonhardt told physicsworld.com: “I’m sure it can be done. Technical challenges certainly need to be overcome, but there are no longer problems of principle.”

Fermilab on film

Still the frontier? Bison graze at Fermilab. Credit: Fermilab

By Margaret Harris

What does it feel like to work for an organization that — despite its considerable fame and all the talent it has nurtured over the years — is frankly on the verge of being outclassed? This is among the many questions raised by The Atom Smashers, an oddly moving little film about life at Fermilab in the months before its European rival, CERN, switched on the Large Hadron Collider. It’s scheduled to air on American public television stations starting from 25 November as part of PBS’ Independent Lens series, with repeats around 27 January; check local listings for specific dates and times.

The documentary focuses on the period between early 2006 and late 2007, and there is plenty of material for filmmakers Clayton Brown, Monica Long Ross and Andrew Suprenant to explore here. Over the course of the film, scientific enthusiasm collides with sharp budget cuts and promising results that don’t pan out — all while a neon “doomsday clock” marking the days, hours and minutes to LHC’s first collisions ticks down in the background.


And the most popular cover is…


And the winner is…September 2008

By Hamish Johnston

The results are in and your favourite Physics World cover comes from the September 2008 issue of the magazine (right). The collage of galaxies was inspired by an illustration in John D Barrow’s book Cosmic Imagery: Key Images in the History of Science.

The cover, which garnered 13% of the 1303 votes in our recent survey to mark the 20th anniversary of Physics World, contains 56 striking images of galaxies that were derived from actual photos taken by NASA’s Hubble Space Telescope.

The cover highlights the Galaxy Zoo project, which recruits members of the public to help classify the thousands of galaxy images taken by the SDSS telescope in New Mexico.


March 1998 is done in John Richardson’s “Lichtenstyle”

My favourite cover – a brilliant homage to the late American pop artist Roy Lichtenstein – was the runner-up with 11% of the vote.

The cover of the March 1998 issue was created by the UK-based cartoonist John Richardson and shows “Alice and Bob, the central characters in many quantum information papers”. You can view a gallery of Richardson’s art here .


November 2001 is a salute to Andy Warhol

Pop art is also the theme of the third-place cover from November 2001, which uses eight slightly different “Schrödinger’s cats” to illustrate the concept of quantum cloning à la Andy Warhol.

The cat belonged to then features editor Val Jamieson (now at New Scientist), and I’m told it only had one eye – the other being cloned in our design studio. Sadly, the final measurement has been made on this cat.

The covers were voted on by our readers from a shortlist of 20 chosen by Matin Durrani and Dens Milne.

…and which cover was the least favourite? It’s the cover from July 2006 that illustrates an article on Hollywood physics – this attracted about 1% of the vote.

LHC repairs under way

Two months after an electrical fault put CERN’s brand new Large Hadron Collider (LHC) out of action, the first damaged sections of the machine are making their way out of the tunnel for repair.

In the past week or so, seven of the LHC’s magnets (mostly 15 m long, 35 tonne “dipoles”) have been transported approximately 6 km through the 27 km LHC tunnel from the scene of the incident to a shaft on the main CERN site. From there, the magnets have been craned 50 m to the surface and taken to different locations for inspection. Some 50 magnets are expected to have to come to the surface in total, about 20 of which will not return, and the last one should be above ground before Christmas.


On 19 September, just nine days after protons were circulated in both directions of the €3bn LHC, an electrical connection between a dipole magnet (one of 1232 that bend the protons around the ring) and a neighbouring quadrupole magnet (one of 392 that focus the proton beam) failed during circuit tests in the last of the LHC’s eight sectors. At the time, a current of 8.7 kA was being pushed through superconducting cables the width of a stick of chewing gum to generate the enormous magnetic fields required to bend protons at high energies.

Magnets broke their anchors

Due to a bad connection, a splice linking cables between two magnets in “sector 34” suddenly developed a resistance and disintegrated — producing an electrical arc that punctured the liquid-helium plumbing that keeps the magnets (i.e. the superconducting cables) at their 1.9 K operating temperature. Two tonnes of helium were released with such force that some magnets broke their anchors to the concrete floor, and a further four tonnes of helium were also discharged into the LHC tunnel.

Although some equipment was ready to be transported from the affected area within a couple of weeks of the incident, engineers have had to wait until two independent LHC sectors — sectors 23 and 12 — were purged of helium before it was safe to do so. That’s because only one of the LHC’s access shafts, located at the north end of the CERN site in the middle of sector 12, is wide enough to handle the dipoles. Indeed, with transport vehicles in the LHC tunnel travelling at 2 km/h, this is partly why it took two years for all 1232 dipoles to be installed underground.
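The transport arithmetic alone gives a feel for the logistics (a rough estimate from the figures above; craning and handling add further time):

```python
# Rough transport-time estimate from the figures quoted in the article.
trip_km = 6.0     # sector 34 to the wide access shaft in sector 12
speed_kmh = 2.0   # speed of transport vehicles in the LHC tunnel
magnets = 7       # dipoles moved so far

hours_per_trip = trip_km / speed_kmh
print(hours_per_trip)            # 3.0 hours of driving per magnet, one way
print(hours_per_trip * magnets)  # 21.0 vehicle-hours for the first seven
```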

In a presentation to the LHC experiments committee (LHCC) on Wednesday, LHC project director Lyn Evans stated that the repair was well underway with 100 people from CERN and contractors working on it. He expects that about 20 dipoles will be replaced with spares, and said that techniques have been developed to detect resistive splices at low currents to help prevent a similar incident. Although CERN has not yet finalized the costs of getting the LHC up and running again, it estimates the maximum cost of repairs and consolidation to be CHF15m (€10m) plus CHF10–20m to replace the spare magnets.

Testing times


Of the 30 or so magnets that will be repaired and reinstated in the tunnel next year, many will require a major refit. “It’s not just a case of removing the magnets and cleaning them [the electrical arc produced soot that contaminated the proton beam pipes in some magnets],” Nick Chohan, who spent five years testing each LHC dipole before it was installed underground, told physicsworld.com. “We need to check for damage to the super insulation, which means de-cryostating, re-cryostating and then testing all over again.”

Among many safety modifications, such as reinforcements to the magnet anchors, CERN is considering modifications to the spring-loaded relief discs on the magnets’ vacuum enclosures, which were unable to cope with the huge helium discharge on 19 September. Although the latter could be done in situ, it would require all 140 tonnes of the LHC’s liquid helium to be taken out (raising the problem of where to store it) and all eight sectors to be warmed up to room temperature and cooled back down again — a process that would take months and possibly incur damage to the RF “fingers” between magnets.

CERN released an interim summary of the incident on 15 October, and plans to release a fuller report in early December outlining the repair schedule and plans for LHC operation in 2009. Despite frustration that the LHC did not provide even a few minutes of low-energy proton collisions, which machine operators were poised to deliver just days after the sector 34 incident, physicists working on the four LHC detectors are making the best of various data recorded during the start-up and of those from cosmic rays.

“The incident in September was a major blow for us,” LHC operations leader Roger Bailey told Physics World. “But things are moving fast now and we can see a way ahead.”

Proton and neutron masses calculated from first principles

The next time you step on bathroom scales, remember that it’s not your big bone structure or that extra helping of pudding that makes you heavy, it’s the motion of the quarks and gluons buzzing around inside the protons and neutrons that make up your body.

That’s the finding of a massive computer simulation of quantum chromodynamics (QCD) — the theory of the strong force — carried out by researchers in Germany, Hungary and France. They say theirs is the first to control all systematic uncertainties, thereby providing an accurate description of how quarks bind together to form baryons such as protons and neutrons.

The idea that protons and neutrons are made of three quarks has been around for over 40 years. However, the mathematical nature of the strong force that binds quarks together makes it impossible to make exact analytic calculations of the fundamental properties of both the quarks and the particles they make up.

Sea of quarks

This is further complicated by the “sea” of quark–antiquark pairs that is also believed to reside within heavy particles. These pairs bubble up from the gluons — massless particles that mediate the strong force — and must be included in any credible calculation.

To overcome these problems, physicists have come up with clever ways of doing approximate QCD calculations using supercomputers. But despite the development of sophisticated numerical techniques and ever faster computers, physicists have struggled to make a reasonable prediction of the mass of the humble proton — a quantity that is known experimentally to great precision.

Now, Zoltan Fodor and colleagues at DESY, Bergische University and the Juelich Supercomputing Center in Germany, along with researchers at Eotvos University in Hungary and CNRS Aix-Marseille in France, have used an established computational technique called “lattice gauge theory” to make the first calculations of the masses of the proton and neutron that incorporate all the relevant physics; make all the appropriate numerical approximations; and deliver a comprehensive analysis of possible errors and uncertainties in the results. Their success was possible thanks to the combined power of two IBM Blue Gene supercomputers and two cluster-computing centres (Science 322 1224).

4D lattice does the trick

The technique keeps track of a vast number of quarks and gluons by describing the space and time inside a proton with a set of points that make up a 4D lattice. This allows the equations of QCD to be solved in an iterative process using standard numerical techniques.

The problem is that this discretization introduces systematic errors. Although those errors can be controlled by making the lattice spacing smaller, that in turn requires even more computing power. The approximation also had to keep track of the sea quarks, which is another computationally intensive task.
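As a toy illustration of how a discretization error is controlled by shrinking the grid spacing (a simple finite-difference derivative, not actual lattice QCD):

```python
import math

def central_diff(f, x, a):
    """Approximate f'(x) on a grid of spacing a."""
    return (f(x + a) - f(x - a)) / (2 * a)

exact = math.cos(1.0)  # d/dx sin(x) at x = 1
for a in (0.1, 0.05, 0.025):
    error = abs(central_diff(math.sin, 1.0, a) - exact)
    print(a, error)  # halving the spacing cuts the error roughly fourfold
```

The error here shrinks as a², but each refinement multiplies the number of grid points, which is why finer lattices demand so much more computing power.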

The calculations put the mass of the nucleon (the calculation cannot distinguish between protons and neutrons) at 936 MeV/c², with statistical and systematic uncertainties of ±25 and ±22 MeV/c² respectively. The known masses of the proton and neutron are 938 and 940 MeV/c² respectively.
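Combining the two quoted uncertainties in quadrature (a common convention; the paper itself quotes them separately) shows how comfortably the experimental values sit within the error bars:

```python
import math

m_calc = 936.0           # lattice result, MeV/c^2
stat, syst = 25.0, 22.0  # quoted uncertainties, MeV/c^2
m_proton, m_neutron = 938.0, 940.0  # experimental masses, MeV/c^2

combined = math.sqrt(stat**2 + syst**2)    # ~33.3 MeV/c^2
print(abs(m_proton - m_calc) / combined)   # ~0.06 sigma
print(abs(m_neutron - m_calc) / combined)  # ~0.12 sigma
```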

The team has also been able to use the technique to calculate the mass of two mesons and seven heavier baryons.

Separating weak from strong

While calculating the mass of the proton hardly seems Earth shattering, Fodor believes that the team’s work shows that it is possible to make meaningful predictions of the role of the strong force in the particle interactions that will soon be occurring in the LHC.

For example, CP violation is usually associated with the weak force between quarks, but quarks always interact strongly, so the strong force must also be considered. The LHCb experiment at the LHC is intended to explore such CP-violating processes, and perhaps even new physical phenomena, and interpreting its results will require accurate calculations of the properties of particles containing the bottom quark.

On a more philosophical note, Fodor points out that the calculations confirm that the QCD-driven motion of quarks within nucleons — rather than the mass of the quarks themselves — is responsible for the vast majority of the visible mass in the universe.

US firm unveils plans for mini nuclear reactors

Nuclear power is normally associated with gigawatt-scale facilities costing billions of dollars and run by armies of scientists and engineers. But some in the nuclear industry have long argued that much smaller, unmanned reactors could play a role too. Such reactors, which would have power outputs of only a few tens of megawatts, would be particularly suitable for people or companies in remote parts of the world.

Now, however, Hyperion Power Generation — a US company based in New Mexico — has brought the dream of tiny nuclear reactors one step closer with its Power Module. This nuclear reactor — or “battery” as the firm calls it — is not much larger than a hot-tub and could supply thermal energy at a rate of about 70 MW. That could be converted into about 27 MW of electricity, which would be enough to supply about 20,000 US households.
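The quoted figures imply a thermal-to-electric conversion efficiency of just under 40% and an average draw of about 1.35 kW per household (a sketch based only on the numbers in the article):

```python
thermal_mw = 70.0    # thermal output quoted by Hyperion
electric_mw = 27.0   # electrical output after conversion
households = 20_000  # US households the firm says it could supply

print(electric_mw / thermal_mw)         # ~0.39 conversion efficiency
print(electric_mw * 1000 / households)  # 1.35 kW average per household
```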

Unlike conventional nuclear power plants, Hyperion’s reactor uses uranium hydride, which is essentially enriched uranium metal that has absorbed a large amount of hydrogen. As the uranium nuclei undergo fission, they release neutrons that are slowed down by the hydrogen, which acts as the moderator. The slow neutrons can then split further uranium nuclei and sustain a chain reaction.

No moving parts

The novel feature of the reactor is that the power output is kept steady without the need for any moving parts, flowing water or human intervention. If the uranium hydride gets too hot, the hydrogen is driven out of the uranium metal and the chain reaction stops. But as the system is sealed, the hydrogen flows back into the uranium when it has cooled, allowing the reaction to restart. The upshot is that the temperature and the concentration of hydrogen stabilize, although if the sealed core is breached for any reason, the hydrogen will escape and fission will stop.
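The self-regulation described above amounts to a negative feedback loop, which can be caricatured in a few lines (an illustrative toy model with arbitrary units and made-up coefficients, not reactor physics):

```python
# Toy negative-feedback loop: hotter hydride retains less hydrogen,
# so moderation and hence fission power fall, while heat removal
# grows with temperature. All coefficients are arbitrary illustrations.
T = 400.0  # starting temperature, arbitrary units
for _ in range(100):
    moderation = max(0.0, 1.0 - T / 1100.0)  # hydrogen retained drops as T rises
    fission_power = 100.0 * moderation       # power scales with moderation
    heat_removal = 0.1 * T                   # cooling scales with temperature
    T += 0.5 * (fission_power - heat_removal)
print(round(T))  # settles at a steady temperature (~524 in these units)
```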

Heat from the reaction is removed by liquid metal flowing in pipes with mesh wicks. According to the firm, these sealed systems are about 1000 times better than solid metals at transferring heat. Using these pipes is also an important safety feature because they keep water, which can act as a moderator and slow down the neutrons (thereby speeding up the chain reaction), well away from the reactor core.

The technology was first developed by Otis Peterson and colleagues at Los Alamos National Laboratory in the US and then licensed to Hyperion, which was set up in 2006 to commercialize the technology. Peterson retired from the lab in 2006 to join the company as its chief scientist and in April 2008, the US-based venture-capital firm Altira Group invested several million dollars in Hyperion, which has 100 employees.

Eastern European launch in 2013

The firm says it will have a prototype of its reactor fully designed next year and that it has already secured an order for six units from a group of investors in eastern Europe, including the Czech engineering company TES, which has an option to buy a further 44. It also claims to have other commitments from various parties — mostly energy utilities that currently use diesel generators in remote locations — for a further 100 units. The company expects to deliver its first reactor in June 2013.

Reactors would be configured and sealed at its factory, which has not yet been built, before being shipped to customers. Installation would take as little as six months and a reactor could remain in place for at least five years before the spent reactor would have to be returned to the factory and recharged with fresh fuel.

Will licensing be a problem?

The firm says that it started the process of getting the reactor — which is expected to cost about $25m — licensed for use in the US about two years ago. However, Paul Norman, a nuclear-reactor physicist at Birmingham University in the UK, warns that licensing could be a problem because regulatory agencies are not used to evaluating such small systems for commercial use. While he thinks that the concept of a mini-reactor is reasonable in principle, he believes that conventional power sources could still prove more cost-efficient. “There is a reason why we have not seen such reactors before,” he said.

But Peterson says that licensing Hyperion’s reactor will not be a stumbling block because regulators already know how to evaluate research reactors, which can be very different from conventional power reactors. Moreover, he points out that the reactor is designed to run on uranium that is enriched to 10%, which — although higher than the 5% enrichment in most commercial light-water reactors — is well below the 20% threshold for what is considered highly enriched material and should not raise the hackles of security agencies.

Hyperion believes that its reactors are ideally suited for companies needing a source of power in remote areas, such as mining companies or those wishing to extract oil from, say, the Canadian oil sands — an application that Altira has long been interested in. This is an energy intensive process that is currently fuelled by natural gas. According to Peterson, the large quantities of greenhouse gases that are generated during extraction could be reduced greatly if a Hyperion reactor were used — although the company has no firm orders from the oil industry.

Peterson also says that the used uranium hydride could be heated to drive out the hydrogen, leaving enriched uranium that could be re-used as fuel in a conventional nuclear reactor. While the company is still working on a way of doing this that is commercially viable, Peterson says that the “physics exists” to do so.

Copyright © 2025 by IOP Publishing Ltd and individual contributors