
The final outcome

By Michael Banks

Many physicists in the UK have spent the past six months fuming after the Science and Technology Facilities Council (STFC) announced an £80m shortfall in its budget late last year. The STFC said it would deal with the shortfall by pulling out of plans for the International Linear Collider and withdrawing from the Gemini telescopes in Hawaii and Chile. Numerous experiments in particle physics and astronomy also faced the axe or severe cuts.

Stunned by the reaction from the community, the STFC quickly set up a consultation with physicists, the final outcome of which was due to be announced at a meeting with physicists and the media at the Royal Society in London yesterday. In fact, the STFC told the media of its plans last week, although I still decided to attend the meeting as I knew that STFC chief executive Keith Mason, science-board chair Peter Knight and John Womersley, director of science programmes at the STFC, would all be there.

But what could have turned out to be a lively debate about how the £80m black hole in the STFC’s budget came about ended up as a rather drab affair. Generally, members of the physics community were pleased with how the consultation process went and had even accepted that some projects would sadly have to face the axe. Indeed, almost all physicists who asked a question began by praising the STFC for how it conducted the consultation.

Any chance of a surprise announcement at the meeting had of course vanished with last week’s unveiling of the final programmatic outcome. The STFC had decided that some projects that had faced the funding axe would now be saved, including the e-MERLIN project based at the Jodrell Bank observatory. In other words, the community and media already knew which projects were going to be funded.

What was bizarre for me and other members of the media is that we were prevented from asking any questions in the open session because there was going to be time to do so afterwards in a separate session for journalists. However, this session never materialised, and when I went and asked an STFC spokesperson what happened to the media session I was told that as the outcome had been made public last week, there was no need for one!

It was a shame that the media were not given the chance to ask questions with the community present. Instead we had to approach Mason, Knight and Womersley to quiz them at the end of the meeting as they were preparing to leave.

I myself had wanted to ask Mason about the Gemini project. I am still intrigued to know how, and by whom, the decision was made in early March to pull UK involvement in the project completely. Not only would this have been a scientific loss, it would also have been a financial one, as the UK has put around £35m into the 8m-class telescopes based in Hawaii and Chile. Additionally, the UK would have incurred a financial penalty for pulling out.

Both these points must have been known to the STFC’s management. In the final outcome, the UK has now been reinstated as a full member of the Gemini consortium, but will sell 50% of its observing time to other members.

There was a change of tune within the community from the hostility earlier in the year. Both sides probably recognize that damage is now being done to the image of physics, not only in the UK but internationally as well.

Michael Rowan-Robinson of Imperial College London, a former president of the Royal Astronomical Society and possibly one of the most outspoken critics of the STFC, was pleased with how the STFC had listened to the community.

But, he added, “I am still saddened that we had to go through all that in the last eight months and damaged our international reputation. I still don’t understand why we went through it.”

The first words that came from Mason in response were, rather discouragingly, “I don’t either.”

Do cosmic rays get bogged down in the cosmos?

Physicists are closer to understanding how ultrahigh-energy cosmic rays make their way to Earth thanks to new measurements made at the Pierre Auger Observatory in Argentina. The study shows that the number of such cosmic rays reaching Earth drops off rapidly for rays with energies of more than about 4 × 10¹⁹ eV.

The observations are consistent with a 40-year-old theory that ultrahigh-energy cosmic rays cannot travel very far through the universe without losing energy as they scatter off the cosmic microwave background (arXiv:0806.4302v1).

A similar fall in the number of such rays reaching Earth was seen earlier this year in a separate measurement made by the HiRes observatory in Utah, US (Phys Rev Lett 100 101101). The rapid drop-off in both studies appears to contradict earlier measurements made elsewhere, which suggested that there is no fall in the number of ultrahigh-energy cosmic rays at very high energies.

Slowed by the CMB

Ultrahigh-energy cosmic rays (UHECRs) are charged particles (believed to be mostly protons) with energies greater than about 10¹⁸ eV. Above a certain energy threshold, they are expected to interact with the cosmic microwave background — the remnants of light that was given off shortly after the Big Bang — putting a limit on how far the most energetic cosmic rays could travel before reaching Earth.

Called the Greisen-Zatsepin-Kuzmin (GZK) suppression after the three physicists who first proposed it in 1966, the theory has also been backed up by the Auger team’s recent observation that UHECRs seem to emanate from black holes lying at the heart of nearby galaxies.

However, because UHECRs are very difficult to detect, there is currently some debate as to whether the GZK effect has actually been confirmed observationally — in particular whether the suppression begins at about 6 × 10¹⁹ eV as originally predicted.
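For a rough sense of where this threshold lies, one can estimate the energy at which a proton can produce a pion off a typical CMB photon. The Python sketch below is an order-of-magnitude illustration only — it assumes a head-on collision with a photon of mean CMB energy, rather than the full integral over the Planck spectrum — but it lands within a factor of two of the 6 × 10¹⁹ eV figure quoted above.

```python
# Rough GZK threshold estimate for photopion production,
# p + gamma_CMB -> Delta+ -> p + pi0, assuming a head-on collision
# with a CMB photon of mean energy (~6.3e-4 eV at 2.725 K).
M_PI = 0.135e9   # neutral-pion rest energy, eV
M_P = 0.938e9    # proton rest energy, eV
EPS = 6.3e-4     # mean CMB photon energy, eV

# Relativistic threshold condition for a head-on collision:
# E_th = m_pi * (m_pi + 2 * m_p) / (4 * eps)   (all masses in energy units)
E_th = M_PI * (M_PI + 2 * M_P) / (4 * EPS)
print(f"GZK threshold ~ {E_th:.1e} eV")  # ~1e20 eV
```

The more careful calculation, averaging over photon energies and collision angles, pushes the effective suppression energy down towards the quoted 6 × 10¹⁹ eV.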

The AGASA mystery

Indeed, several early UHECR studies have discovered particles from apparently distant sources with energies greater than 10²⁰ eV. In particular, the Akeno Giant Air Shower Array (AGASA) in Japan recorded many events with energies well above the GZK threshold.

However, earlier this year the HiRes cosmic-ray observatory in the US claimed the first observation of GZK suppression at about 6 × 10¹⁹ eV. The measurement was made using data from two detector stations taken over nine years of clear, moonless nights — during which thousands of UHECRs were detected.

Now, the HiRes claim appears to be backed up by an analysis of about 20,000 UHECR events detected by the Pierre Auger Observatory from January 2004 to August 2007. The Auger data show a significant drop in the number of UHECRs detected at energies greater than about 4 × 10¹⁹ eV.


However, unlike the HiRes collaboration, which claimed to have found the first real evidence for GZK suppression, the Auger team has taken a more cautious approach. Auger emeritus spokesman Alan Watson of the UK’s Leeds University told physicsworld.com that neither the HiRes nor the Auger results should be seen as confirmation of GZK suppression. Instead, he argues, both simply show a marked reduction in the number of higher-energy UHECRs reaching Earth. An alternative explanation, according to Watson, is that UHECR sources are not very efficient at creating the highest-energy cosmic rays. “Nature often sets us these traps!”, he said.

Others, however, are more confident in the GZK interpretation of the HiRes and Auger results. Venya Berezinsky of Italy’s Gran Sasso Laboratory told physicsworld.com that the HiRes energy spectrum in particular is consistent with the GZK theory.

As for the surfeit of higher-energy events in the AGASA data, Watson believes that this could be a result of certain particle-physics assumptions that are needed to interpret the results of that experiment. These assumptions are made at energies beyond those currently accessible in particle accelerators, and are therefore largely unproven. Some light could be shed on this problem by the Large Hadron Collider (LHC), which is due to start up at CERN next month. The LHC includes a small experiment called LHC Forward (LHCf), which aims to improve our understanding of cosmic rays.

The Auger team have now turned their attention to determining the mass composition of UHECRs — how many are protons and how many are helium nuclei, for example — which could have an effect on predictions of the UHECR energy spectrum.

Reliving the 'Victorian Internet'


By Matin Durrani

In this age of e-mails, satellite navigation and mobile phones, which allow scientists (and everyone else, of course) to communicate pretty much instantly, sending messages around the world by telegram down underwater cables seems very much old hat.

But for the pioneers of cable telegraphy in the 1860s, the ability to communicate globally in seconds was a huge advance over the standard method of giving a handwritten letter to a ship’s captain. There was no guarantee the letter would arrive and, even if it did, it could take months to get a reply.

A network of undersea cables was soon formed, now dubbed the “Victorian internet” by science popularizers. Sending telegrams was a bit of a faff — and very expensive to the average punter — but soon thousands of messages were being sent and received every day by governments, military officials and even journalists.

As I discovered while on holiday in rain-soaked Cornwall last week, many of these cables arrived in Britain at a remote, secluded beach at Porthcurno in the south-west of the country, near Land’s End.

Porthcurno soon became established as the centre of international telecommunications for the UK. The first cable, laid in 1870, stretched from Cornwall to Gibraltar, before linking up with other cables that continued to Malta, Alexandria and Aden before finally reaching Bombay in India.

At each site, the signal — weakened as it travelled down copper wires coated with a natural rubber-like material called gutta percha — would be amplified and sent on its way.

I haven’t got space to go into all the details of how the technology worked, but if you’re in Cornwall, as I was, you can get a first-rate understanding by visiting the Porthcurno Museum.

Celebrating its 10th birthday this year, it includes fascinating details of how, in the early 1900s, scientists at Porthcurno grew more and more anxious about the work of the future Nobel-prize-winning physicist Guglielmo Marconi, who was experimenting with trans-Atlantic radio-wave transmissions from a nearby station at Poldhu on the Lizard.

Fearing that Marconi’s wireless transmission could put cable companies out of business, they set up a clandestine operation to check out what he was up to.

Of course the snag with wireless transmissions is that they can be intercepted by anyone with a suitable receiver. Cable telegraphy is much more secure.

In the end, firms using the rival technology merged to form Imperial and International Communications Limited, which later rebranded itself as…Cable and Wireless.

So important was the Porthcurno station deemed to be to the fortunes of the British Empire, that during the Second World War it was hidden in a maze of tunnels to protect it from enemy attack.

Sadly, submarine telegraphy was eclipsed by telephone cables and later fibre optics. Porthcurno’s role as a telegraph station ended in 1970, after exactly a century as a working station. A Cable and Wireless college at the site continued for some years, but it too closed, in 1993, when it was transferred to Coventry.

Thankfully the story of Porthcurno can be relived in the museum. Partly housed in the war-time tunnels, it of course makes a great break from the rain.

Nuclear fallout used to spot fake art

Scientists and art historians have developed what they say is a foolproof way of identifying forged works of art. They can distinguish between art created before 1945 and that produced after that date by measuring levels of the isotopes caesium-137 and strontium-90. These isotopes do not occur naturally but are released into the environment by nuclear blasts.

Over 2000 nuclear tests have been carried out since the first atomic explosion took place in New Mexico in July 1945; the Japanese cities of Hiroshima and Nagasaki were bombed a few weeks later. Among the by-products of these explosions are caesium-137 and strontium-90, tiny quantities of which make their way into the Earth’s soil and plants. It is then via natural oils, such as linseed from the flax plant, which are used as binding agents in paints, that these isotopes end up in post-1945 art.
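The decay arithmetic behind the method is straightforward. As an illustration (not part of the team’s published procedure), the Python sketch below uses standard half-lives to show how much of the fallout from the early-1960s peak of atmospheric testing would still have been present in 2008, when this article appeared.

```python
# Fraction of a fallout isotope remaining after a given time,
# using standard half-lives. The 1963 start date is illustrative
# (roughly the peak of atmospheric nuclear testing).
HALF_LIFE_YEARS = {"Cs-137": 30.1, "Sr-90": 28.8}

def remaining_fraction(isotope: str, years_elapsed: float) -> float:
    """Fraction of the original quantity left after years_elapsed."""
    return 0.5 ** (years_elapsed / HALF_LIFE_YEARS[isotope])

elapsed = 2008 - 1963  # years between peak fallout and this article
for iso in HALF_LIFE_YEARS:
    # roughly a third of each isotope is still present after ~45 years
    print(f"{iso}: {remaining_fraction(iso, elapsed):.0%} remains")
```

With around a third of each isotope still undecayed after four decades, the signal in post-1945 paint binders remains well within reach of a mass spectrometer.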

According to The Art Newspaper, the idea of using non-naturally occurring isotopes to identify forged paintings occurred to Elena Basner while she was working as curator of 20th-century art at the Russian Museum in St Petersburg. Basner, who is now a consultant for the Swedish auctioneers Bukowskis, said she was spending lots of time trying to distinguish between fakes and genuine works of art, given the high quality that forgers have managed to attain.

Mass spectrometer

Basner contacted a number of scientists to find a reliable and systematic way of weeding out forgeries. She ended up working with Andrey Krusanov, a freelance chemist and writer with an interest in the history of Russian avant-garde painting. Several physicists, chemists and mineralogists at the Russian Academy of Sciences in St Petersburg were also involved. Together they hit upon the idea of looking for the presence or absence of caesium-137 and strontium-90 using a mass spectrometer.

The patented technique involves extracting tiny (of the order of 1 square millimetre) samples from paintings. The team were able to show that the two isotopes are not present in paintings from the first half of the 20th century, but that there were traces in paintings done in the 1950s.

Basner says she plans to use the technique to identify post-war imitations of Russian avant-garde works produced between 1900 and 1930. According to Basner, the originals are now probably outnumbered by the imitations, which first started to appear on the market in the 1960s. She believes that if these paintings contain traces of caesium-137 and strontium-90 then, like any other work of art, they can be definitively declared post-1945 forgeries.

However, as Antonia Kimbell points out, while the presence of these isotopes would prove that a supposedly pre-1945 painting is a fake, their absence does not necessarily mean that the painting is genuine. Kimbell, who works for the Art Loss Register, a UK organization that helps to tackle art crime, says that cunning forgers can reproduce paintings using canvases and paints from the period in question. She therefore believes that while the technique will prove to be useful in identifying forgeries, it must be combined with research on other indicators of authenticity, such as stylistic composition and provenance.

Nanotubes get sorted

When single-walled carbon nanotubes are made, a mixture of both metallic and semiconducting nanotubes is produced. This is a problem for those trying to make electronic devices from nanotubes, who need pure samples of either semiconducting or metallic tubes (depending upon the application), not both.

Now, researchers in the US and South Korea have developed a new and simple technique that not only efficiently separates the two types of nanotube but also allows them to be patterned onto a substrate as thin films. These films could be used to make electronic devices with desirable properties, and could even replace silicon as the material of choice for integrated circuits.

Single-walled nanotubes are essentially rolled-up sheets of graphite just one atom thick and can be metallic or semiconducting depending on the direction in which the sheet has been rolled. They have enormous potential as the building blocks of nanoscale electronics, and are often touted as the perfect alternative to silicon thanks to their tiny size and their ability to carry large currents. Metallic tubes could function as transparent conducting leads, while semiconducting tubes could make good nanoscale transistors.

Selective molecules

Although researchers have already proposed several techniques to separate nanotubes, most of these have proved difficult to perform on an industrial scale. However, help may be at hand: recent experiments have shown that specific molecules tend to interact selectively with metallic or semiconducting tubes in solution. Now, new work by Zhenan Bao of Stanford University and colleagues builds on this by using such molecules to create a special surface that interacts selectively with nanotubes (Science 10.1126/science.1156588).

Bao and co-workers obtained their results by treating silicon substrate surfaces with molecules containing amines that “grabbed” the semiconducting tubes and ignored the metallic ones. Once this surface modification step was complete, the researchers then created thin films of the nanotubes on the substrate using a method called spin coating, in which the nanotubes are placed on a rapidly spinning surface so that they spread out thanks to centrifugal forces.

The scientists found that the thin films behaved as excellent field-effect transistors, with on-off ratios as high as 900,000, which is very close to the value for transistors used in liquid crystal displays (LCDs), for example. “The sub-monolayer films can also be completely transferred to different substrates,” team member Melburne LeMieux told physicsworld.com. “They could be used to better understand nanotube networks by electrical testing and in techniques such as scanning Kelvin probe microscopy in which every nanotube can be characterized (since none are ‘buried’).”

To selectively adsorb metallic tubes from a mix, the researchers used phenyl-terminated silanes on the silicon substrates. This selectivity is possible because the nanotubes are extended pi-electron systems that interact with other such systems via a mechanism called pi-pi stacking. Metallic nanotube films are excellent transparent conducting materials and could find applications in solar cells and touch screens, said LeMieux.

The researchers now hope to be able to scale up their technique and increase the density of sorted nanotubes on a substrate. One way to do this is by multiple transfers onto a target substrate, added LeMieux.

Voyager 2 reports from the edge of the solar system

Over 30 years after it was launched, NASA’s Voyager 2 space probe has reached the “edge” of the solar system.

In doing so the probe has confirmed that the heliosphere — an immense bubble-like structure surrounding the Sun and formed by the solar wind — is not a perfect sphere but is a squashed ellipsoid.

Voyager 2 crossed the “heliospheric termination shock” in August 2007 at a distance of about 12bn kilometres from the Sun. This is about twice as far from the Sun as Pluto and about 1.5bn kilometres closer to the Sun than where its partner, Voyager 1, crossed this threshold in 2004. This confirms telescope-based observations of the flow of hydrogen and helium in this region made in 2005, which suggested that the heliosphere is squashed by interstellar magnetic fields.
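To put that figure in perspective, here is a quick unit conversion (an illustration using only the roughly 12bn km distance quoted above):

```python
# Voyager 2's distance from the Sun at the shock crossing, converted
# to astronomical units and to the light travel time of a radio signal.
DIST_KM = 12e9           # ~12bn km, as quoted above
AU_KM = 1.496e8          # one astronomical unit in km
C_KM_S = 299_792.458     # speed of light in km/s

dist_au = DIST_KM / AU_KM                # ~80 AU
light_hours = DIST_KM / C_KM_S / 3600    # ~11 hours each way
print(f"{dist_au:.0f} AU; radio signals take {light_hours:.1f} hours")
```

In other words, the probe was roughly 80 times farther from the Sun than the Earth is, and every command or reading took around 11 hours to cross the gap one way.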

This termination shock is a turbulent region where the solar wind — a fast-moving stream of electrically charged particles expelled by the Sun in all directions — slows down significantly. It marks the boundary between the inner heliosphere — where the solar wind dominates — and the heliosheath, where the effects of the interstellar gas begin to dominate.

Scientists are hopeful the data gathered by Voyager 2 as it travels beyond the shock will give them an insight into how the Sun interacts with the rest of the Milky Way. Although Voyager 1 crossed this threshold four years ago, its plasma sensor had stopped working by then – forcing scientists to estimate many of the properties of the shock that they had hoped to measure directly.

Working plasma sensor

Voyager 2 has a working sensor, allowing the probe to measure directly the velocity, density and temperature of the solar wind at this juncture. Coupled with the fact that Voyager 2 had at least five shock crossings over a couple of days thanks to the tumultuous nature of the shock front, researchers were able to study the shock front in unprecedented detail – publishing their first findings in a series of six papers in Nature (Nature 454 63-81).

The data reveal that the shock front is indeed irregular and turbulent, as expected. However, the temperature of the heliosheath is about one million kelvin, roughly ten times cooler than had been predicted. This indicates that energy is being transferred from the solar wind in collisions with interstellar particles, which are accelerated to high speeds at the shock.

Researchers expect the two Voyager spacecraft to continue providing invaluable observations of the heliosheath for years to come. Scientists are hopeful that the probes will continue to function for another decade, by which time they should cross the heliopause — giving us a glimpse of pure interstellar space.

In the meantime, NASA plans to launch the Interstellar Boundary Explorer later this year, which aims to image the entire termination shock and heliosheath from Earth orbit, providing further insight into the interaction between the heliosphere and interstellar space.

33 years later, NASA finds Mercury to be even more active

The first flyby of Mercury by the Messenger probe has shown the innermost planet to be a surprisingly dynamic place, according to its NASA mission team.

Among the barrage of new data are observations that Mercury’s magnetic field is probably generated by a molten outer core, and that the planet’s peculiar surface features were produced by ancient volcanic flow that solidified and slowly contracted. Scientists had been speculating about these properties since 1974, when NASA’s Mariner 10 probe made the first scouting mission to the planet.

Sean Solomon, the principal investigator for the Messenger mission, says that the flyby — which took place in January this year — hints at what other observations will be possible when the spacecraft settles into orbit in 2011.

“Our Mercury flyby in January was our first close-up view of the innermost planet in nearly 33 years,” Solomon told physicsworld.com. “We have a payload that takes advantage of three decades of technology beyond that available to Mariner 10, and the flyby provided a wonderful return of new observations by every instrument on board.”

Unusual planet

Many scientists consider Mercury — with its high density composition, heavily cratered surface and magnetic field — to be the most unusual planet. It was produced at roughly the same time and via the same processes as the other terrestrial planets, but turned out very different.

The number of impact craters on its surface implies that Mercury’s geological activity stopped fairly early. However, when Mariner 10 imaged 45% of the planet’s surface, it found that much of it was broken up into prominent escarpments up to 600 km in length, implying that the surface has suffered a period of contraction. Messenger has now imaged another 21% of the surface in more detail, and has found that these escarpments are indeed widespread.

“We’ll image another 30% of the planet on our next flyby, on 6 October,” explains Solomon. “Once in orbit we will be able to image the entire planet, except for the floors of the polar craters in permanent shadow. [But] we will peer into even those craters with other instruments, including our gamma-ray and neutron spectrometers, and our laser altimeter.”

Messenger has also found the best evidence yet that Mercury — like Earth — generates its magnetic field through motion in its molten outer core. This evidence is a measurement of a strong dipolar field and a lack of shorter-wavelength fields. If the latter had been detected, it would have implied that — like Mars — a remnant magnetic field had been “frozen” into the planet’s surface.

Waiting for orbit

Other firsts for Messenger include a measurement of ionized particles in the atmosphere, which tell how Mercury’s magnetic field interacts with the solar wind; a chemical analysis of the surface, in which iron makes up less than 6% by weight (despite being abundant in the core); and a record of surface strain generated by contraction, which is at least a third more than previously thought.

However, it is the data taken when Messenger reaches orbit in 2011 that the mission team is waiting for. This will bring global data so that, for example, the scientists can test competing models for how Mercury was formed and how its magnetic dynamo works. “The scientific goals of this mission will be addressed most fully from the orbital measurements,” says Solomon.

The data from the Messenger flyby are presented in 11 reports published in Science.

US presidential candidates receive questions on science

In a bid to trigger a televised debate, organizers of ScienceDebate 2008 have sent the US presidential candidates a series of questions about the role of science in the nation’s future. The questions come after the candidates twice ducked the opportunity to participate in a debate on science during the primary season.

The ScienceDebate organizers started with a list of 3,300 questions put forward by the group’s 38,000 signatories. Working with 11 other organizations — including the American Association for the Advancement of Science and the National Academy of Sciences — the group whittled down the list to 14.

“I remain convinced that we will see them debate these issues,” Matthew Chapman, president of ScienceDebate 2008, told physicsworld.com. “It’s one of those situations where the whole thing could suddenly lurch into place. What we really need is some support from mainstream media.”

‘Unresolved challenges’

According to the ScienceDebate website, the 14 questions are designed to be “broad enough to allow wide variations in response, but specific enough to help guide the discussion towards many of the largest and unresolved challenges currently facing the US.” Shortened versions of the questions include:

  • To maintain a growing economy, how will you ensure the US remains the world leader in innovation?
  • Given spending constraints, how will you prioritize investment in basic research?
  • Is it acceptable for elected officials to hold back or alter scientific reports if they conflict with their own views, and how will you balance scientific information, politics and personal beliefs in your decision making?
  • What role do you think the federal government should play in preparing primary and secondary school students for science and technology in the 21st century?
  • How can science and technology be used best to ensure national security?
  • What is your position on the following measures that have been proposed to tackle climate change: a cap and trade system; a carbon tax; increased fuel-economy standards; or research?
  • How will you meet energy demands while ensuring an economically and environmentally sustainable future?
  • How would you prioritize different areas of space exploration?

The latest poll, conducted by Lake Research Partners, shows that 72% of the US public is more likely to vote for presidential candidates who support scientific research. The poll also shows that 87% of the US public is more likely to vote for candidates who will invest in science education, while 43% consider science to be “extremely important” in influencing policy decisions.

UK physics funding plans are approved

After six months of often bitter debate and recriminations, the UK’s Science and Technology Facilities Council (STFC) has approved its funding programme for astronomy and nuclear and particle physics for the next three years. The funding package totals nearly £2bn.

The announcement comes in the wake of accusations of mismanagement against the STFC by both British physicists and a parliamentary committee after an apparent £80m shortfall in funds for 2008-11 came to light in December 2007.

In a decision taken by the council on Tuesday and announced today, the STFC has broadly accepted the recommendations of its two committees — the Particle Physics, Astronomy and Nuclear Committee (PPAN) and the Physical and Life Sciences Committee (PALS) — which first reported in March 2008. The committees ranked all STFC-funded programmes with the aim of deciding which projects deserved continued funding.

Both committees then received input from ten panels that were convened in response to complaints from the physics community regarding the consultation process. The concerns of more than 1400 physicists were considered by the panels, which reported in May.


Keith Mason, chief executive of the STFC, told physicsworld.com that there was little difference between the recommendations of PPAN and PALS and those of the ten panels. As a result, there are no reprieves for any programmes originally slated to receive no further funding. This includes UK involvement with the International Linear Collider (ILC) — the next big particle-physics facility after CERN’s Large Hadron Collider.

Brian Foster of Oxford University, who is European director of the ILC’s global design effort, told physicsworld.com that the STFC’s decision to “claw back” money already promised to researchers involved in the ILC “is an unprecedented step which makes it impossible for universities to plan sensibly”.

Some good news

Today’s announcement brings some good news for physicists working on the LHCb experiment at the Large Hadron Collider. The STFC had originally planned to cut UK funding of this experiment by 25%, but has been persuaded by PPAN to reduce the cuts to 5% in the first year and 10% in the two subsequent years.

Nick Brook of Bristol University, who has worked on the LHCb experiment for more than 10 years, said that he is “relatively pleased” by the decision, but pointed out that PPAN had chosen to ignore a panel’s warning that even the 5% and 10% cuts would still cause significant damage to the UK’s LHCb programme. Brook fears that the cuts could lead to a reduction in the number of UK physicists working on the experiment, just as it is about to begin later this year. In the worst case, Brook says, the UK researchers may have to renege on commitments they have already made to the project.

A parliamentary report on the debacle apportioned some of the blame to the way STFC was created in April 2007 by merging the Particle Physics and Astronomy Research Council (which awarded research grants) with the CCLRC (which managed scientific facilities). Mason said “change often brings controversy”, adding “we have had a healthy debate over the past six months”.

The debate looks set to continue until at least September, when an external review of the STFC will report, along with a separate government-appointed review of UK physics in general.

On a lake in southern Germany

(Photograph courtesy of Edda Praefcke)

By João Medeiros

Part of my job as Features Editor on Physics World is to dig up great ideas for possible feature articles. That’s one reason why I am spending this week on an island in Lake Constance in southern Germany at the 58th meeting of Nobel Laureates at Lindau.

The meeting, which is held every year, gives top young students the chance to hear, talk to and debate with leading researchers from a particular field of endeavour. This year’s meeting is dedicated to physics and there are some 25 Nobel-prize-winning physicists here as well as over 550 students.

Yesterday we were treated to a fascinating debate about the Large Hadron Collider (LHC) at CERN, featuring Nobel laureates David Gross, Martinus Veltman, George Smoot, Gerard ‘t Hooft and Carlo Rubbia, along with LHC accelerator supremo Lyn Evans and CERN chief scientific officer Jos Engelen.

Chairing the session was my predecessor in the Physics World features hot-seat, Matthew Chalmers, who is now forging a career as a freelance science journalist.

Some speakers, like Smoot and Gross, preferred to talk about the hope that the LHC will yield a cornucopia of new physics, most prominently Higgs bosons and supersymmetric particles. Others, like Veltman and Rubbia, took a more cautious stance as to what might be discovered.

The experiment itself is a complex beast and it will take years before the experimentalists understand it completely. The computing challenge is also gargantuan: the proton-proton collisions will yield some 10⁹ events per second, of which only about 200 can be saved to disk.

This means there is a huge responsibility on the shoulders of the thousands of young researchers working in the bowels of the LHC to make sure that the interesting events are the ones that get saved to the computing grid.
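Those two numbers imply a ferocious selection: the trigger and data-acquisition chain must discard all but roughly one event in five million. A back-of-the-envelope check:

```python
# Trigger rejection implied by the rates quoted above:
# ~1e9 collision events per second produced, ~200 written to disk.
EVENTS_PER_SECOND = 1e9
SAVED_PER_SECOND = 200

rejection = EVENTS_PER_SECOND / SAVED_PER_SECOND
print(f"Only 1 in {rejection:,.0f} events is kept")
```

Deciding, in real time, which one-in-five-million events to keep is precisely the responsibility Rubbia alludes to below.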

As Rubbia told the meeting: “The discussion about the Higgs is not the right discussion at the moment. This is a very complex machine, and presumably, it will take years before we understand it properly. One should let the physicists do their work instead of pressuring the scientists for results.”

I hope to tell you more about what’s been happening here on Lake Constance later this week. Meanwhile, back to those Nobel laureates…

Copyright © 2025 by IOP Publishing Ltd and individual contributors