
Those puzzling infinities

Since it opened for business a couple of years back, CERN’s Large Hadron Collider (LHC) has been confirming the validity of the Standard Model of particle physics to ever-greater precision and accuracy. In the process, it has been causing ever-greater frustration among theorists, many of whom had hoped that the collider would quickly uncover new physics. Given the Standard Model’s current robust status, it is easy to forget that during the 20th century, its theoretical bedrock – quantum field theory – was left for dead at least twice by its own creators. Frank Close’s book The Infinity Puzzle contains a timely reminder of these near-death experiences.

The first convincing quantum description of a field, the reader learns, arrived in 1928, in the form of Paul Dirac’s theory of the electron and of the electromagnetic interaction. Dirac’s equations had some indisputable successes: they fitted spectroscopic data, explained photons and quantum spin, and even foresaw the existence of the positron. But his theory seemed incomplete. If a field is supposed to be a “thing” with a quantum life of its own, then it surely should interact with the electron that generated it – yet Dirac’s equations seemed unable to account for such “self-interaction”. Theorists’ fears were confirmed in 1947 when Willis Lamb announced that he had found a small deviation from the predictions of Dirac’s theory in the hydrogen spectrum.

Fortunately, the theoretical physics community was full of creative sparks with plenty of time on their hands, as the Second World War and the Manhattan Project had finished. Before the end of the 1940s, Richard Feynman, Julian Schwinger, Shin’ichiro Tomonaga and Freeman Dyson had created quantum electrodynamics, or QED, in its modern form. The new quantum field theory modelled the interactions of the electron with its own field and also with the sea of virtual particles that pop in and out of existence in the vacuum, as required by the Heisenberg uncertainty principle. Whereas simple-minded calculations of these interactions gave paradoxical results in which quantities added up to infinity, QED made the infinities go away thanks to an accounting trick called renormalization. Although most theorists, beginning with Dirac himself, saw renormalization as contrived and probably wrong, QED’s agreement with experiment was – and still is – unparalleled by that of any other scientific theory.

However, as Steven Weinberg would later write, “it was not long before there was another collapse in confidence”, when “shares in quantum field theory tumbled at the physics bourse”. The reasons went beyond people’s qualms about cooking the books in their calculations. The real problem came when physicists turned their attention to the weak interaction, and realized to their horror that this force resisted being renormalized at all. Instead, its infinities remained untamed – hence the puzzle referred to in the book’s title.

Throughout the 1950s and 1960s, researchers kept trying to fix quantum field theory and to expand it to include the weak force. It was during this period that theorists proposed ways to unify the electromagnetic and weak force, to apply gauge invariance to the weak force, and to show that this could be done while keeping the weak force short range, as experiment says it should be. The latter effort led to what is now known as the Higgs mechanism, and its saga rightfully occupies a substantial part of the book.

But as long as infinities kept popping up, all these efforts were doomed. Some theorists, such as Lev Landau, proposed scrapping the entire framework of quantum field theory and focusing instead only on the things one can actually observe: the inputs and outputs of particle collisions. This back-to-basics approach led to ideas known as nuclear democracy and the bootstrap model, as well as ambitious proposals to abandon the notions of elementary particles and wavefunctions. Close’s account does not really delve into these alternative attempts, focusing instead on how even the creators of the electroweak unified theory – which is now an integral part of the Standard Model – seemed to lose hope that quantum field theory would survive. Weinberg’s paper of 1967 on electroweak symmetry breaking is now one of the most widely cited physics papers in history, but it went practically unnoticed at the time of publication. For several years, Close reminds us, no-one cited it at all – not even Weinberg himself.

Everything changed in 1971, when Gerard ‘t Hooft showed how to tame the infinities of the electroweak interaction. In one fell swoop, the Dutchman rescued years of work by many of his older colleagues, and put quantum field theory back onto its feet. The day of ‘t Hooft’s breakthrough is, appropriately, what Close – a noted science writer and a theoretical particle physicist himself – chose as the opening scene of his book. As Close recounts, the subsequent “gauge-field revolution” included the development of the theory of the strong force, called quantum chromodynamics, and was crowned by some of the most heroic experimental discoveries in high-energy physics, culminating in 1983 with the discovery of the W and Z bosons.

Close deserves praise simply for picking quantum field theory as the topic of a popular book. This is as hard a topic as they come, and he doesn’t cut too many corners when it comes to conceptual depth. Just as important, though, is the fact that he does not hide the complexities of the historical development of the theory. Scientists rarely, if ever, come out with fully formed ideas, and Close demonstrates how science proceeds through false starts, strokes of luck, missed opportunities and, as he puts it, “comedies of errors”.

Close is especially diligent in investigating the priority of ideas and in crediting researchers who may have been left behind, either by the Nobel committee or by popular imagination. He interviewed virtually all surviving protagonists and, when possible, went back to their private letters and lecture notes. The result is a much more nuanced picture of history. For example, we learn that Tom Kibble and John Ward may have had as much to do with the development of the electroweak theory as Weinberg and his fellow Nobel laureates Abdus Salam and Sheldon Glashow. In Salam’s case, in particular, the reader is left with the impression that the committee rewarded a physicist widely seen as being of Nobel calibre, but that it may have picked the wrong reason to do so. We also learn how, in the late 1960s, James “B J” Bjorken taught physicists how to demonstrate the existence of quarks – thus leaving his footprint on much of the experimental particle physics that followed – but has yet to receive due recognition. Similarly, the Higgs mechanism and the Higgs boson were (as Peter Higgs himself acknowledges) the product of at least seven minds: Philip Anderson, Gerry Guralnik, Carl Hagen, Tom Kibble, François Englert and the late Robert Brout as well as Higgs. If and when the particle is found, this notorious issue will no doubt make the Nobel committee’s life very hard, as only three people can share the prize each year.

For all its qualities, however, The Infinity Puzzle is hardly a safe choice for beach reading. Its subject is inherently weighty, and the reader’s task is complicated by Close’s highly nonlinear style of narration. His story keeps bifurcating into rivulets and eddying backwards in time; often, one wonders whether the author is recapitulating an earlier chapter or telling the same story anew. One section contains no fewer than six repetitions of the sentence “hardly anyone at the time believed that quarks were real” and variations thereof. Readers with a passing interest in the Higgs will probably get more out of a smoother, more focused and more accessible treatment, such as Ian Sample’s book Massive (reviewed by Physics World in November 2010).

Serious physics-history buffs, on the other hand, will find The Infinity Puzzle invaluable. Meanwhile, LHC physicists are confident that they will finally settle the question of the existence of the Higgs boson by the end of 2012. If it does turn up, some people in Stockholm will likely be among the book’s most avid readers.

Bouncing droplets simulate Zeeman effect

Physicists in France have used pairs of bouncing droplets on a fluid surface to simulate the Zeeman effect – a phenomenon that played an important role in the early development of quantum mechanics. The ability to simulate purely quantum effects using such a classical system could provide insight into how the mathematics of quantum mechanics should be interpreted.

What does the Schrödinger equation mean? The question has been debated by physicists since this central tenet of quantum mechanics was introduced nearly 90 years ago. While its predictive power has been verified many times over in laboratories all around the world, exactly how the solutions to the equation (the wavefunctions) should be interpreted is still not clear.

The most popular school of thought is the famous “Copenhagen interpretation”, formulated by Niels Bohr and Werner Heisenberg in the 1920s. This probabilistic interpretation of quantum mechanics holds that the observable properties of a particle do not have definite values until they are measured. However, this view is not universally accepted and another interpretation of quantum mechanics favoured by some physicists is the so-called “pilot wave” interpretation, formulated by Louis de Broglie in 1927 and later developed by David Bohm. This assumes that the observable properties of quantum particles are defined at all times but that they are guided by a wave, which neatly explains wave–particle duality. This is an example of a hidden variable theory because it explains the measurable properties of quantum mechanics as the consequence of a physically real, but experimentally inaccessible, feature – the wave.

Contrived or intuitive?

The two theories are mathematically indistinguishable, so some physicists see the so-called “Bohm interpretation” as a contrived attempt to explain the experimental results of quantum mechanics without embracing the weirdness of the Copenhagen interpretation. However, in 1980 Michael Berry and colleagues at the University of Bristol in the UK used an analogy with surface waves in a classical fluid to come up with a more intuitive explanation of a bizarre quantum phenomenon called the Aharonov–Bohm effect (predicted by the same Bohm).

Now, Yves Couder at the University of Paris Diderot and colleagues have explored this analogy further by looking at the behaviour of tiny, bouncing droplets called “walkers” as they move across the surface of a vibrating bath of silicone oil. The drops create waves on the surface of the fluid and are, in turn, influenced by these waves. According to Couder, this provides an interesting parallel with the “pilot wave” model of quantum mechanics.

“There is a symbiosis between the droplet and the wave,” explains Couder, “because if there is no droplet there is no wave. And if there is no wave the droplet doesn’t move.” Couder and his colleagues believe that this interaction between a walker and the waves that it creates is an example of wave–particle duality in a classical system because, while the droplet is localized in space like a particle, its motion can be influenced by anything that affects the pilot wave.

Bound states

Couder is emphatic that his group’s system is not an exact analogy to quantum mechanics because, for example, it requires a continuous input of energy by vibrating the bath. Nevertheless, in previous research his group has managed to use walkers to create classical analogies to the quantum effects of single-particle diffraction and tunnelling. It has also shown that two walkers can orbit each other to form bound states, in an analogy to the quantized bound states in an atom.

In the new research, the group investigated the Zeeman effect – a quantum effect whereby the energy levels in an atom split in the presence of an external magnetic field. An atom is a bound state of a nucleus and one or more electrons – and this is simulated using a bound state of two walkers.

To create an analogy to an applied magnetic field, the researchers rotated the bath. The two-walker bound state was then free to rotate either with or against the rotation of the bath – simulating the orbital angular-momentum states of an atom. In the absence of the simulated magnetic field, both of these rotational states have the same energy. However, when the bath is rotated the energies of the rotational states split, with one increasing and the other decreasing – just like the angular-momentum states of an atom in a magnetic field. The team also saw abrupt transitions between energy levels.
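The correspondence can be sketched with a standard textbook relation (the notation here is mine, not the authors’, and the paper’s own quantitative analysis may differ). In the true Zeeman effect, orbital states of opposite sense shift by equal and opposite amounts in a field B; in the rotating bath, the Coriolis force on a droplet plays the role of the Lorentz force on a charge, so the two senses of orbital motion split linearly in the rotation rate:

```latex
% Zeeman splitting of opposite orbital states in a magnetic field B:
\Delta E \;=\; \pm\, m_\ell\, \mu_B B, \qquad \mu_B = \frac{e\hbar}{2m_e}.
% In the rotating bath, the Coriolis force on a drop of mass m,
%   \mathbf{F}_{\mathrm{cor}} = 2m\,\mathbf{v}\times\boldsymbol{\Omega},
% mimics the Lorentz force q\,\mathbf{v}\times\mathbf{B}, so the
% co- and counter-rotating bound states split linearly in \Omega:
\Delta E_{\mathrm{bath}} \;\propto\; \pm\,\Omega .
```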

Fernando Lund of the University of Chile in Santiago, who has headed a research group looking at similar problems but who was not involved in the current research, says: “The most significant feature of this paper, and of others by the same team, is the masterful use of state-of-the-art technology to bring out analogies between classical and quantum physics that can be easily visualized.” He suggests that it might be interesting to try to visualize other quantum phenomena using classical means. “My favourite candidate would be the half-integer spin of some particles like electrons,” he says. “Not that I can see any way of going about answering this question!”

The research is published in Physical Review Letters.

No-touch technique to measure softness

Two teams of researchers in France have measured the rigidity of a material without touching it. The method involves a tiny amount of fluid flowing over the surface of the material and is non-invasive and non-destructive. As such, the technique could be used to achieve nanometre-scale analysis of the elastic properties of thin films or fragile objects such as bubbles or living cells.

A simple way of measuring the rigidity of a body is to touch it with an object that is harder than it. The problem with this technique is that the harder “probe” object can destroy the object of interest, especially if the latter is extremely fragile, such as a living cell. As a result, researchers are keen to develop a less-destructive method. This latest work was done by two teams – one including Samuel Leroy and colleagues at the Laboratoire de Physique de la Matière Condensée et des Nanostructures in Lyon and the other including Frédéric Restagno and colleagues at the Laboratoire de Physique des Solides in Paris. The physicists had originally set out to measure softness by blowing on an object with a stream of air and measuring any deformation. However, they dropped that idea when they found that a flow of air is hard to control because of the vortices that can form.

Nanoflows

From that setback came the idea of using a tiny “nanoflow” of fluid, which the researchers found to be easier to control than air. To make the measurements, the researchers create a very weak and extremely thin flow of liquid between the probe and the material of interest. In one study the material was an elastomer (rubber) film just a few hundred nanometres thick, placed on a rigid glass support and immersed in a mixture of water and glycerol. The tiny flow is created by a special technique developed in 2000 by Leroy. The probe is a millimetre-sized Pyrex glass sphere attached to a rod, which can be vibrated with great precision. This fine movement is produced via a “piezoelectric ceramic” system, which can control the displacement of the sphere at distances as small as 0.01 nm. It is this tiny glass bead, according to the researchers, that creates the nanoflow.

Distorting films

When the sphere is very close to the material being probed – say about 1 µm away – it pushes the liquid towards the object and so induces very gentle pressure on the surface of the material. If the material is flexible, it is deformed by the pressure. If it is completely rigid, no deformation occurs. Any deformation affects the motion of the sphere and how it vibrates, and this change can be used to calculate the elasticity of the film.

The researchers also used the set-up to measure the rigidity of an array of bubbles – something so fragile that it would be destroyed when touched. By combining this new method with established techniques, such as using “colloidal-probe atomic force microscopy”, the researchers hope to measure the softness of a wide range of materials.

The research is published in Physical Review Letters.

The Fukushima accident: 'made in Japan'

By Michael Banks

The Fukushima nuclear accident last year “could and should have been foreseen and prevented” according to a report released yesterday by the 10-member Fukushima nuclear accident independent investigation commission. Chaired by Kiyoshi Kurokawa, former president of the Science Council of Japan, the report says the accident was a profoundly “man-made disaster” that was “made in Japan” and could have been mitigated by a more effective human response.

The 88-page English version of the report says the accident was the result of “collusion” between the government, regulators and the plant’s operators. “They effectively betrayed the nation’s right to be safe from nuclear accidents,” the authors write.

In its introduction, Kurokawa writes that the commission’s report “catalogues a multitude of errors and wilful negligence that left the Fukushima plant unprepared”. Kurokawa adds that the “fundamental” failures of the plant were because of the “ingrained conventions of Japanese culture: our reflective obedience; our reluctance to question authority; our devotion to ‘sticking with the program’; our groupism; and our insularity”.

The Fukushima nuclear accident was caused by an earthquake and tsunami of a scale not seen in more than 1000 years, which struck north-eastern Japan at 2.46 p.m. local time on 11 March 2011.

The Fukushima Daiichi nuclear plant, located some 225 km north-east of Tokyo, seemed to withstand the magnitude-9.0 earthquake, with the three operating reactors shutting down automatically as it struck. However, the tsunami that followed a few minutes later poured over a seawall designed to protect the nuclear plant from waves up to about 6 m high (the tsunami produced waves more than 14 m high).

The plant was then flooded, causing the back-up diesel generators to fail, and – with nothing to cool the reactors – their cores started to melt.

The report offers seven recommendations, including establishing a new regulator for nuclear power as well as a committee that would monitor this new body.

See also “In the wake of Fukushima” and “Lessons from Fukushima”.

Dark-matter filament spotted

Physicists claim to have made the first reliable detection of a mammoth filament of dark matter stretching between two galaxy clusters. If the detection is bona fide, it could be one of the best confirmations yet of the “standard model” of the universe’s evolution, the so-called lambda cold-dark-matter (ΛCDM) model.

The ΛCDM model posits that, in the early universe, dark matter was spread out in a web of filaments. Over time, this cosmic web would have helped all the normal “baryonic” matter to clump together, particularly in the regions where its filaments intersected. Today, we see the result of this clumping at the filament intersections: galaxy clusters and, on a smaller scale, individual galaxies and stars.

Universal webbing

The ΛCDM model seems to explain most aspects of the universe, from the large-scale structure through to that lasting remnant of the Big Bang – the cosmic microwave background. Yet, if the universe did evolve according to the model, the dark-matter filaments ought still to exist, strung between galaxy clusters like ancient cobwebs. Unfortunately, like a cobweb, dark matter is difficult to make out, despite being thought to make up roughly a quarter of the universe’s total mass–energy. It does not interact strongly with light and is, therefore, invisible.

Astronomers have been searching for it nonetheless. In the 1980s they managed to map some of the baryonic (visible) matter running along the filaments, thereby showing that the cosmic web does indeed exist. Later, the possibility of detecting dark matter in the filaments opened up too, with use of a technique known as “weak gravitational lensing”. In this technique, astronomers examine the light from a backdrop of many distant galaxies, and determine how much of it is distorted because of the gravity of matter in the foreground. From the late 1990s, observations seemed to suggest that there was more gravitational distortion in the regions between certain galaxy clusters than could be accounted for by baryonic matter alone: this, the astronomers claimed, must have been the dark-matter filaments.

But it was not to be. As the lensing studies required observations over a very large field of view, the astronomers had been forced to coat their telescopes’ focal planes with not one but several CCD detectors. In subsequent tests, it was discovered that a slight misalignment between these detectors could have caused false distortion signals.

Advanced arrays

Now, however, Jörg Dietrich at the University Observatory Munich, Germany, thinks that physicists need not be fooled any longer. “Our understanding of the behaviour of those optical systems, the cameras with the multi-chip arrays, and the ways to correct for them, have advanced immensely over the past 10 years,” he says. Together with colleagues in the US and Europe, Dietrich has found evidence for a dark-matter filament between two neighbouring galaxy clusters – Abell 222 and Abell 223.

Dietrich and colleagues picked these galaxy clusters because, lying at a redshift of 0.21 and separated on the sky by just 0.23°, they are relatively close together, and are therefore likely to be bound by a thick dark-matter filament. Indeed, the lensing signal found by the researchers was strong: a maximum of just 9% of it could be accounted for by X-ray emission from hot gas. Add another 5% or so for stars and 5% for colder, inconspicuous matter, says Dietrich, and one is left with the normal mass estimate for dark matter as a proportion of all matter: roughly 80%.
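As a quick sanity check of that mass budget – treating the quoted percentages as exact round numbers, which the paper itself surely does not:

```python
# Rough mass budget for the Abell 222/223 filament, using the round
# numbers quoted in the article (illustrative only, not the paper's fit).
hot_gas = 0.09    # at most 9% of the lensing signal: X-ray-emitting gas
stars = 0.05      # roughly 5%: stars
cold_gas = 0.05   # roughly 5%: colder, inconspicuous baryonic matter

baryonic = hot_gas + stars + cold_gas   # about 19% in total
dark_matter = 1.0 - baryonic            # about 81%, i.e. "roughly 80%"

print(f"baryonic: {baryonic:.0%}, dark matter: {dark_matter:.0%}")
# → baryonic: 19%, dark matter: 81%
```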

“I find this important because it provides the – so far – clearest confirmation of a key prediction of our current cosmology paradigm, where most of the matter in the universe is constituted by unseen dark matter,” says theoretical astrophysicist Håkon Dahle of the University of Oslo.

Dahle admits that the observation does not necessarily rule out alternatives to dark matter – a minority of theorists believe that they can explain these sorts of phenomena with tweaked theories of gravity – modified Newtonian dynamics, or MOND, is a popular example. “There are, and will still be, potential alternatives to dark matter – but [the latest observation] adds to an extensive set of observations that are fully consistent with our current cosmological paradigm,” he says.

Confident case

Yet the fact that astrophysicists have wrongly claimed sightings of dark-matter filaments before raises a question: could this latest observation also be an instrumental artefact?

“That is a fair question,” says Dietrich. He believes that his group’s observations will stand the test of further scrutiny because, unlike the previous observations, they were not recorded near the gaps between the CCD detectors. Moreover, he says, the dark matter they observed hugged the same outline as the X-ray emission and light from galaxies, which is what the ΛCDM model predicts. “I am very confident in this case,” he adds.

The results are published online in Nature.

What is the most significant experimental discovery in particle physics?

By James Dacey

So the excited researchers at CERN have finally found the Higgs boson, or at least a particle that resembles the Higgs. With these scientists now preparing for a blitzkrieg of further analysis, it may turn out that their Higgs possesses properties that cannot be explained by the Standard Model of particle physics alone. But, as the CERN director-general said during a press conference yesterday, “I think we have it.”

Yesterday was a truly great day for science. But just how significant is this discovery in the history of particle physics? Let us know your thoughts by answering this week’s Facebook poll question.

What is the most significant experimental discovery in particle physics?


The electron
The atomic nucleus
The neutron
Antimatter
Neutrinos
Quarks
The Higgs boson

Let us know by taking part in the poll. And of course, as this list does not cover all the discoveries in particle physics, feel free to make a case for something else.

In last week’s poll we encouraged you to join us in speculating whether CERN was about to announce the discovery of the Higgs boson. The signs were there that the LHC data had thrown up something large – journalists were invited from around the world to attend a special seminar in Geneva on the eve of this year’s major particle-physics conference in Australia.

Some 59% of respondents had put two and two together and predicted that CERN officials would be revealing the discovery of the Higgs boson. The remaining 41% thought that they would not be making this announcement, perhaps assuming that we would have to wait longer or that the scientists would be declaring that the particle does not exist after all.

Thanks to everyone for taking part and we look forward to hearing from you in this week’s poll.

Tired but happy

By Hamish Johnston in Geneva


All gone well: André David

By far the best part of being at CERN for yesterday’s Higgs announcement was talking to the physicists who did all the hard work. Needless to say, it was smiles all around. Indeed, it seemed as if it was the euphoria itself – brought on by more than two weeks of intense effort – that allowed the results to be presented yesterday.

For André David (right), who works at CERN on the CMS experiment, family life has been put very much on hold for the last fortnight. “I’ve only seen my two young daughters for a total of 15 minutes over the past 48 hours,” he claims. But David, who is with the Laboratory of Instrumentation and Experimental Particle Physics in Portugal, was not alone – he emphasized that about 400 other people on CMS had been working just as hard to get the results ready. Indeed, many of them were doing exactly the same analysis – only coming together at the end to compare their results.

While this might sound like a wasteful use of human resources, David says that it allows the team to be supremely confident of its results. “That’s why it has gone so well,” he explains.


Still smiling: Josh Bendavid

Josh Bendavid (right) is another CMS physicist who had been burning the midnight oil. “I haven’t slept much,” he admits. “I’m very tired but very happy.” Bendavid is just finishing his PhD thesis at the Massachusetts Institute of Technology and quipped that he was going to have to change its title, replacing “search for” with “evidence for”. With the exception of a quick dash back to the US to defend his thesis, he is looking forward to analysing lots more CMS data over the summer.

Everyone I spoke to saw this week’s announcement as just the start of our understanding of the particle they have discovered. And with a bit more hard work, we could have a far better understanding by early next year.

“We’re just on the edge,” says CMS member Yves Sirois of Ecole Polytechnique in Paris. “I won’t be taking a summer holiday.”

Nobel laureates react to Higgs news


Chewing the fat: (left to right) David Gross, Martinus Veltman, Carlo Rubbia and George Smoot

By Matthew Chalmers in Lindau, Germany

The organizers of this year’s 62nd Lindau Nobel Laureate Meeting couldn’t believe their luck. Having invited 27 Nobel-prize winners (average age 73.5) plus 600 young physicists to an island in Lake Constance, what should happen right in the middle of their shindig but the announcement of the biggest physics discovery in a generation. CERN’s new boson created quite a stir on the island, and it would seem that a certain Peter Higgs could soon be among the annual event’s invitees.

David Gross, who shared the 2004 prize for his work on the strong interaction, hasn’t stopped smiling, and is sure that CERN has discovered a Higgs – if not the Higgs boson as predicted by the Standard Model of particle physics. “This is a great day for me, for physics and for all of humanity,” he enthused on Wednesday after the news emerged.

Gross is particularly happy because the mass of the new particle (125 GeV) suggests that his favourite candidate for a deeper theory of physics – supersymmetry – is on the money. However, he also admitted that this particular mass value, should the particle indeed turn out to be the Higgs, would imply that the universe is in a metastable state that could decay at any moment and cause everything we know to simply disappear. The prospect prompted hearty laughter from the crowd.

Particle-physicist heavyweight Carlo Rubbia – who was responsible for the discovery of the W and Z particles at CERN in 1983, for which he picked up a Nobel prize the following year – was not getting too caught up in the elation. He wants to know why CERN’s new boson appears to be produced at a rate twice as large as would be expected. “The Standard Model should give us an exact value for this, and here there is a direct disagreement: what’s going on?” he asked. No stranger to getting major particle-physics experiments off the ground, he demanded a dedicated new collider to pin down its properties.

The third Nobel-prize-winning particle physicist at this year’s meeting was Martinus Veltman, who shared the 1999 gong for his work on the electroweak sector of the Standard Model (for which the Higgs is crucial). Veltman had not yet organized his thoughts on the discovery – indeed, he seemed somewhat subdued about the affair. “The Standard Model has now got a degree of validity that has extended way beyond what we had before the Higgs,” he said. “However, the one aspect that dominates here is that a Higgs could close the last door of the Standard Model that could lead us to a deeper theory.”

But not all Nobel laureates at the meeting were so elated. Condensed-matter physicist Robert Laughlin, who shared the 1998 prize for the discovery of a new form of quantum fluid, thinks that particle physics is in trouble, no matter what is discovered at the Large Hadron Collider. His view is that governments justified “big physics” research for defence reasons because particle physics followed nuclear physics, which had given countries the bomb. “Those motivations are less obvious today, which is good for the world but bad for the field in the long term,” he told physicsworld.com.

Meanwhile, the UK’s Harry Kroto, who shared the 1996 chemistry prize for the discovery of fullerenes, is worried about the cash and publicity consumed by big physics. “I do see that the [Higgs] discovery is wonderful, but I also see the huge amounts of money going into this field, and I wonder whether we are getting the balance right when it comes to science funding,” he told physicsworld.com. “I’m concerned, given the current funding situation, that large numbers of chemists doing fundamental work will lose out.”

New boson sparks call for ‘Higgs factory’

CERN’s discovery of a new fundamental particle – most likely a Higgs boson – was barely hours old when physicists speaking at this year’s Lindau Nobel Laureate Meeting in Germany argued the case for a new facility to measure its properties in detail. Speaking out in favour of a new machine was former CERN boss Carlo Rubbia, who shared the 1984 Nobel Prize for Physics for the discovery of the W and Z bosons. “The technology is there to construct a Higgs factory,” he claimed. “You don’t need €10bn; it could be done relatively cheaply.”

Rubbia is regarded as the "godfather" of the Large Hadron Collider (LHC), having pushed hard for the development of the machine in the 1980s, and was instrumental in adapting CERN's previous flagship accelerator – the Super Proton Synchrotron – into a collider capable of producing W and Z bosons. Speaking at Lindau, he argued that physicists should adopt the same approach as then: follow up the discovery in detail with a cleaner lepton collider. In the case of the W and Z, this was the Large Electron–Positron (LEP) collider that previously occupied the 27 km-long tunnel that now houses the LHC.

Currently, the facility tabled to follow the LHC is a linear electron–positron collider tens of kilometres long with a price tag of £10–20bn. But now that the mass of CERN’s new particle has been established – roughly 125 GeV – physicists have a clearer idea of what sort of machine they need to build in order to pin down its properties and measure its interactions with other particles. That work would be vital in establishing whether the particle is the Higgs boson predicted by the Standard Model or something more exotic that opens the door to a deeper theory of elementary particles.


“The question is whether the LHC will do it all, or whether there is a need for another facility,” said Rubbia. “With a Higgs of 125 GeV we need only a modest machine, perhaps not a large linear collider.” Rubbia points out that muons colliding at a combined energy of roughly 125 GeV would suffice – just over half the energy of LEP and requiring a machine with a much smaller radius.

Like electrons, muons are point particles that produce much cleaner collisions than protons, which would allow Higgs-like particles to be studied with far less background noise than is generated by proton collisions. Being some 200 times heavier than electrons, muons lose considerably less energy via synchrotron radiation when travelling in a circular collider, and their collisions would produce Higgs bosons in much greater quantities.
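To see why mass matters so much here, a rough back-of-the-envelope check (not from the article) is instructive: the energy a particle radiates per turn in a circular machine scales as the fourth power of its energy-to-mass ratio, so at equal beam energy and bending radius a muon radiates less than an electron by a factor of (m_e/m_μ)⁴. The short sketch below, using standard particle masses, illustrates the scaling:

```python
# Back-of-the-envelope sketch: synchrotron energy loss per turn in a circular
# collider scales as (E/m)^4 / rho. At fixed beam energy E and bending radius
# rho, the muon-to-electron ratio of radiated energy is therefore (m_e/m_mu)^4.
m_e = 0.511    # electron mass in MeV/c^2
m_mu = 105.7   # muon mass in MeV/c^2

mass_ratio = m_mu / m_e           # ~207 -- the "200 times heavier" in the text
suppression = (m_e / m_mu) ** 4   # relative synchrotron loss for a muon

print(f"muon/electron mass ratio: {mass_ratio:.0f}")
print(f"muon synchrotron loss relative to electron: {suppression:.1e}")
```

The fourth-power scaling means the roughly 200-fold mass difference suppresses synchrotron losses by around nine orders of magnitude, which is what makes a compact circular muon machine plausible where a circular electron collider at the same energy would be radiation-limited.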

Cool questions

The big challenge in building a muon collider is how to "cool" the muons so that they can be funnelled into narrow beams and accelerated. However, muon-cooling technology is already being investigated at several institutions, including the UK's Rutherford Appleton Laboratory, and Fermilab in the US has its own dedicated programme to develop a muon collider. Such a facility would also allow intense beams of neutrinos to be fired underground at detectors on the other side of the Earth, to further understand how neutrinos oscillate between their three flavours.

One physicist who shares Rubbia’s vision is David Gross from the Kavli Institute for Theoretical Physics in Santa Barbara, US, who shared the 2004 Nobel Prize for Physics for his work on asymptotic freedom. “A muon collider is a great idea,” he told delegates in Lindau, “and one that the US should take the lead on.”

Speaking via a live video link to the Lindau meeting from CERN, theorist John Ellis of King’s College London agreed that the discovery of the new boson puts a Higgs factory on the agenda, but thinks it is premature to fix ideas. “We need to see what else the LHC will find, after a couple of years of high-energy running,” he said. “It would be better to have a facility that wasn’t just limited to a Higgs factory.”

Snapshots of sporting success

[Image: long jump (iStockphoto/technotr)]

By James Dacey

Sport can produce some stunning photos, capturing the drama and the triumph that come with the pursuit of sporting success. In today's world, sport lovers are increasingly looking to science and technology for ways to enhance performance.

The theme for our latest photo challenge is "the physics and technology of sport". As always with these challenges, we want you to be creative with the theme. But if you are looking for a steer, you might try to get some interesting shots of sports equipment in action. For example, you might aim for images that juxtapose the exertions of athletes with the stresses and strains on their equipment – rowers cutting through water with their oars, or golfers looking for that perfect swing.

Or you might explore ways of visualizing the human body as a type of machine that athletes try to optimize for sport – the long, lean physique of distance runners, say, or the more muscular, powerful bodies of sprinters. To take part, please submit photos to our Flickr group by Monday 6 August, after which we will choose a selection of our favourite images to be showcased on physicsworld.com.

[Image: running with prosthesis (iStockphoto/MichaelSvoboda)]

You may take inspiration from the July issue of Physics World, which looks at some of the challenges in the "physics of sport", including the physics of the prosthetic devices that are leading disabled athletes to success, and how gymnasts, divers and long jumpers are all unconscious masters of manipulating the law of conservation of angular momentum. Members of the Institute of Physics can access the entire new issue free online through the digital version of the magazine, or by downloading the Physics World app onto an iPhone, iPad or Android device, available from the App Store and Google Play respectively.

In our previous photo challenge, we asked readers to share their astrophotography and we had some stunning submissions. You can see a selection of these photos in this showcase.

Copyright © 2025 by IOP Publishing Ltd and individual contributors