
Making CERN’s best even better

It is hard to imagine upgrading an instrument as big and complex as the SwFr6.5bn (€10bn) Large Hadron Collider (LHC) at the CERN particle-physics lab near Geneva. The 27 km-circumference collider, which was switched on in September 2008 after 25 years of planning and construction, was built to drive 2808 bunches of protons – each containing about 100 billion protons – into one another 40 million times per second inside four detectors the size of large buildings. A tiny fraction of head-on collisions, physicists hope, will hold clues about nature’s fundamental structure, in particular what gave certain elementary particles their masses.

At full luminosity – a measure of the rate of particle collisions – of around 10³⁴ cm⁻² s⁻¹, the LHC beam will store enough energy to melt a tonne of copper as it circulates within a whisker of highly sensitive and expensive components. To protect the accelerator and its detectors from stray protons, the LHC is equipped with around 100 movable carbon or tungsten collimators, each with a small slit through which the beam can pass. Yet long before the LHC fired its first protons, CERN was planning ways to produce even more intense collisions that will improve the chances of discovering rare, new particles or forces.
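The scale of the stored beam energy mentioned above is easy to check with a rough calculation. The sketch below combines the beam parameters quoted in this article with standard textbook thermal data for copper; the copper values are assumptions of this estimate, not figures from the article.

# Rough check of the "enough energy to melt a tonne of copper" claim
e_charge = 1.602e-19           # J per eV

bunches = 2808                 # bunches per beam (quoted above)
protons_per_bunch = 1.0e11     # "about 100 billion protons"
proton_energy_eV = 7.0e12      # 7 TeV per proton at design energy

stored_energy = bunches * protons_per_bunch * proton_energy_eV * e_charge
print(f"Energy stored in one beam: {stored_energy/1e6:.0f} MJ")        # ~315 MJ

# Heat a tonne of copper from room temperature to its melting point (1358 K)
# and then melt it, using assumed textbook values for copper
mass = 1000.0                  # kg
c_p = 385.0                    # J/(kg K), specific heat
latent = 2.05e5                # J/kg, latent heat of fusion
melt_energy = mass * (c_p * (1358.0 - 293.0) + latent)
print(f"Energy to melt a tonne of copper: {melt_energy/1e6:.0f} MJ")   # ~615 MJ
# The two counter-rotating beams together store roughly this much energy.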

The upgrade to the LHC – dubbed the High Luminosity LHC (HL-LHC) – will pack a beam luminosity more than 10 times the LHC’s design goal: up to 5 × 10³⁵ cm⁻² s⁻¹. The HL-LHC will therefore need more sophisticated collimation to avoid unacceptable heat loads and require upgrades to the LHC’s injection system, which currently relies on CERN’s more elderly accelerators and proton transfer lines, to ensure beam quality and stability.

More bang for your buck

Lucio Rossi, HL-LHC co-ordinator, says that the project is one of CERN’s two main R&D priorities over the next 10 years, the other being the Compact Linear Collider – one possible design for the next big particle-physics experiment after the LHC. “[HL-LHC] will be like turning up the lights in a darkened room from the point of view of the experiments,” he says. The realities of building, commissioning and operating the LHC have meant that its high-luminosity incarnation will not materialize until around 2022, however, with a price tag of around €1bn. Half of this money will go on upgrading the collider and half on refitting the LHC’s four detectors – ALICE, ATLAS, CMS and LHCb – so that they can cope with the HL-LHC’s harsh collision environment.

The LHC was designed to circulate two 7 TeV beams, generating 14 TeV collisions, but it has initially been forced to operate at half this value after an unstable magnet interconnect evaporated during high-current tests just nine days after the LHC switched on in September 2008. Intervening in the LHC is no easy task because, when running, it is kept at a temperature of 1.9 K using 130 tonnes of liquid helium to ensure that the niobium–titanium cables that power its dipole magnets are below their superconducting transition temperature.

It takes months to warm the whole machine to room temperature and then to cool it back down, and three long shutdowns are planned during the next decade. The first, in 2013–2014, will involve fixing around 1000 defective interconnects so that the magnets can operate closer to their target bending field (8.3 T), which allows them to carry 7 TeV protons, while all 10,000 joints will be fitted with a lateral restraint to ensure stability. Further improvements are anticipated in the second shutdown, likely to happen in 2017 or 2018, while the HL-LHC and associated improvements in the detectors will mainly take shape during the third long shutdown scheduled for 2021.

As well as almost doubling the number of protons in each bunch, the HL-LHC relies on improved electromagnetic “optics” to bring the beams of protons into collision in the LHC’s four detectors. Two key technologies are under development: high-field superconducting quadrupole magnets that squeeze the beam more tightly in the vertical and horizontal directions; and radio-frequency “crab” cavities that reduce the angle at which the bunches cross.
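To see why these are the two key levers, it helps to look at how they enter the standard luminosity formula: stronger final-focus quadrupoles shrink the beam size at the collision point (a smaller β*), while crab cavities recover the luminosity otherwise lost to the crossing angle. The sketch below uses illustrative, LHC-like design numbers that are assumptions of this estimate rather than figures from the article.

from math import pi, sqrt

N       = 1.15e11    # protons per bunch
n_b     = 2808       # bunches per beam
f_rev   = 11245.0    # revolution frequency (Hz)
gamma   = 7460.0     # Lorentz factor at 7 TeV
eps_n   = 3.75e-6    # normalized transverse emittance (m rad)
beta    = 0.55       # beta* at the interaction point (m)
theta_c = 285e-6     # full crossing angle (rad)
sigma_z = 7.55e-2    # bunch length (m)

sigma_star = sqrt(eps_n * beta / gamma)        # transverse beam size at the collision point
# Geometric reduction factor from the crossing angle; crab cavities push F back towards 1
F = 1.0 / sqrt(1.0 + (theta_c * sigma_z / (2.0 * sigma_star))**2)

L = N**2 * n_b * f_rev * gamma / (4.0 * pi * eps_n * beta) * F   # in m^-2 s^-1
print(f"F = {F:.2f}, L = {L*1e-4:.1e} cm^-2 s^-1")               # F ~ 0.84, L ~ 1e34 cm^-2 s^-1

Raising the bunch charge, lowering β* and pushing F back towards 1 are, in this formula, precisely how the upgrade multiplies the collision rate.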

“The LHC luminosity upgrade is very demanding technologically, with the LHC already representing the apex of 30 years of work worldwide on superconducting magnets,” explains Rossi. “Existing quadrupole magnets go up to 8 T, and it has so far taken six years of solid work by many US teams to build the first 11.5 T prototype, but we need 13 T and an even larger aperture.” The magnets are mostly being developed by researchers at Fermilab in the US in conjunction with staff at CERN. The crab cavities present an even bigger challenge because they have never been used to kick a beam of protons in the transverse direction, not least at a steady rate of 40 MHz. Much of the R&D for crab cavities, which must be compact and have very precise phase control, is taking place at the Cockcroft Institute of Accelerator Science and Technology in the UK, with a test cavity that may be installed at the LHC during 2017–2018.

In addition to developing “radiation-hard” electronics by making the semiconductor chips and associated hardware able to withstand higher radiation doses, CERN is considering moving the power supplies that currently sit close to the detectors, 100 m underground, as far away from the beam as possible – ideally above ground. The only way to carry the enormous 200 kA currents required is to use superconductors as well. But low-temperature superconductors, such as niobium alloys, are problematic for this application because the liquid helium required to cool them warms as a result of the hydrostatic pressure that builds up when the cables are suspended vertically. Instead, CERN may have to turn to high-temperature superconductors such as yttrium barium copper-oxide materials, which do not require liquid helium to reach the superconducting state. “Longer high-temperature superconducting cables exist, but none carrying such high currents,” says Rossi.

Detecting more events

The higher collision rate delivered by the HL-LHC demands major modifications to the ATLAS, CMS, ALICE and LHCb experiments, not least to deal with the increased radiation dose that they will suffer. In normal running, the two general-purpose ATLAS and CMS detectors are flooded with the debris from around 20 proton–proton collisions every time two bunches cross, which must be assessed in less than 25 ns (i.e. before the next bunch crossing) by a “trigger” that decides whether or not the collision is worth recording to disk. At the HL-LHC, however, this “event pile-up” will be more like 400 per bunch crossing, requiring much faster front-end electronics and data-acquisition systems.

The innermost layer of the LHC experiments – the semiconductor pixel detectors that track the collision debris just a few centimetres from the interaction region – will be replaced in all four experiments to cope with the onslaught. As well as being more radiation hard, the upgraded trackers will be more granular to reduce occupancy on pixels, for example by using better nanofabrication techniques to create smaller sensitive regions. Without this change, the vital task of picking out which particles are associated with a particular proton–proton collision (so-called vertex reconstruction or “vertexing”) will be impossible.

“Ideally, you would have a zero-mass tracker so that the particles fly through it without losing energy, but that’s not possible,” says Craig Buttar of Glasgow University, who is a member of the UK ATLAS upgrade team. “Plus, you have all the services – the power cables, the optical signals for control and read-out, and the cooling system – to contend with.” Both ATLAS and CMS are considering using pressurized carbon dioxide in place of current fluorocarbons to cool the upgraded inner detectors, for instance, because it takes up less space.

ATLAS researchers plan to add a new pixel layer to the existing tracker during the 2013–2014 shutdown to improve vertex reconstruction, and those working on CMS are planning a similar intervention in 2017–2018. “For CMS, the new tracker is an enormous operation because we need to increase the number of channels by a factor of 10 without increasing the power,” says Dave Newbold of Bristol University, who is software co-ordinator for the CMS upgrade. “Even without the luminosity upgrade, though, we would still have to maintain and improve our detectors. We might have spent 15 years building them, but they’re never really finished.”

The LHCb collaboration, which is devoted to the physics of B-mesons, is considering replacing its current particle-identification detector based on Cerenkov light, as well as installing a new tracker capable of better vertexing, although in general LHCb operates at a lower luminosity than ATLAS and CMS. Plans to upgrade the ALICE experiment, which is designed to study collisions between lead ions during dedicated LHC runs, are only indirectly linked to the HL-LHC because the detector has been optimized for a lead–lead luminosity of 10²⁷ cm⁻² s⁻¹.

Despite taking up two decades of R&D at the edge of what is technologically possible, the HL-LHC will not be the end of the story for CERN’s flagship collider: it is a stepping stone to an even more powerful machine, perhaps some time in the 2030s, incorporating new superconducting magnets with bending fields of 20 T that would allow an energy of 16.5 TeV per beam. The magnet technology does not yet exist, but in May 2010 CERN established a working group to explore the High Energy LHC (HE-LHC). With a likely price tag of several billion euros, the HE-LHC will also require completely new accelerators to feed it, but the project’s chances of success will depend on what the LHC and HL-LHC discover. “The HE-LHC will happen,” says Rossi. “The question is when?”


Galaxy clusters back general relativity

A study of light coming from galaxy clusters has yet again given the thumbs up to the general theory of relativity, Albert Einstein’s famous theory of gravity. Done by physicists in Denmark who measured gravitational redshift, the research appears to rule out some alternative models of gravity – particularly those that deny the existence of dark matter.

Since its publication in 1916, the general theory of relativity has defied all experimental attempts to prove it wrong. In the currently favoured “cosmological constant and cold dark matter” model (ΛCDM) of cosmology, general relativity has successfully explained many aspects of the universe, including the cosmic-microwave background, gravitational lensing and large-scale structure.

However, gravity acting on ordinary matter cannot explain all of the large-scale structure seen in the heavens. Galaxies appear to be bound together with invisible dark matter, which is thought to make up almost a quarter of the entire universe’s mass–energy content. An even less well-understood entity, dark energy, appears to be accelerating the expansion of the universe, and is thought to account for nearly three-quarters of the mass–energy content. Meanwhile, the proportion in the universe of ordinary matter such as atoms seems to be a little under 5%.

Ailing theory

Many physicists expect to understand the nature of dark matter and dark energy in due course. However, others believe that these concepts are merely symptoms of an ailing theory and are looking at alternative models of gravity that can explain observations without invoking dark matter or dark energy. One alternative is modified Newtonian dynamics (MOND), and its generalized partner tensor–vector–scalar (TeVeS) theory, which is supposed to obviate the need for dark matter. Another is f(R) gravity, which does away with dark energy.

Now, Radoslaw Wojtak and colleagues at the University of Copenhagen have used data from the Sloan Digital Sky Survey to test these theories against one another. The study focuses on the gravitational redshift of galaxies within galaxy clusters. This quantity describes how much energy it costs photons to leave a cluster. As they leave and lose energy, the photon wavelengths stretch to the red side of the spectrum. Importantly, the different models of gravity predict different amounts of redshift.

Unfortunately, measuring the gravitational redshift is not easy. There are other sources of redshift including the universe’s expansion and the individual motions of galaxies within a cluster. Wojtak and colleagues therefore calculated the average redshift as a function of distance from the cluster’s centre – a process that should exclude these other sources.
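The size of the effect being sought can be gauged from the weak-field expression for gravitational redshift, Δλ/λ ≈ GM/(Rc²). The cluster mass and radius in the sketch below are illustrative round numbers, not values taken from the Copenhagen study.

# Order-of-magnitude estimate of a galaxy cluster's gravitational redshift
G = 6.674e-11        # m^3 kg^-1 s^-2
c = 2.998e8          # m/s
M_sun = 1.989e30     # kg
Mpc = 3.086e22       # m

M = 1e15 * M_sun     # assumed mass of a rich cluster
R = 1.0 * Mpc        # assumed characteristic radius

z_grav = G * M / (R * c**2)
print(f"z_grav ~ {z_grav:.1e} (equivalent velocity ~ {z_grav*c/1e3:.0f} km/s)")
# ~5e-5, i.e. a shift of order 10 km/s -- far smaller than the ~1000 km/s random
# motions of individual galaxies, which is why the signal only emerges after
# averaging over many galaxies and clusters.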

MOND and TeVeS fail

The Copenhagen group discovered that the redshifts agreed with the predictions of both general relativity and f(R) gravity, the theory that tries to avoid dark energy. However, the error bars on the redshifts excluded MOND and TeVeS, the theories that try to avoid dark matter. This backs the conclusions of a separate galaxy study performed earlier this year – but the Copenhagen study has the added clout that it has not been based on any assumptions of the generally accepted ΛCDM model.

“I always find it remarkable how general relativity performs well in all the tests we can conceive,” says Alberto Cappi, an astronomer at the Observatory of Bologna, Italy, who tried to perform a similar study in 1995. “Of course the error bars are large, and it is difficult to see a statistically significant trend…but it is true that the relativistic version of MOND [TeVeS] does not perform well in describing the data.”

‘Punching-bag proxy’

However, other astronomers point out that the Copenhagen group has not necessarily ruled out TeVeS. Hongsheng Zhao of the University of St Andrews, UK, thinks the researchers’ detection is “still in the early stages”, and that there may be other variations of TeVeS they have not looked into. Pedro Ferreira of the University of Oxford, UK, shares this concern. “I am not an advocate of TeVeS – never have been – but it is surprising how it has become the punching-bag proxy for alternative theories of gravity,” he says.

Evan Scannapieco of Arizona State University in Tempe, US, says more data could be the answer. This might come from Euclid, a space telescope planned to be launched by the European Space Agency in 2017. “While there are other reasons to argue against [alternative] gravity models, at this point the constraint from the gravitational redshift of clusters is weak,” says Scannapieco. “More detailed measurements are needed to rule out such models using this approach.”

The study is described in Nature 477 567.

Slippery surface inspired by pitcher plant

If an unfortunate insect finds itself trapped inside a Nepenthes pitcher plant, its chances of survival are pretty slim – these tube-shaped plants are lined with a slippery surface that causes victims to slide into a chamber filled with digestive juices. A group of researchers in the US has taken inspiration from these carnivorous plants to design a surface that is both slippery and highly repellent to other liquids. The scientists say their material would be cheap to produce in bulk and has a range of possible applications, including slippery pipes for the efficient transport of oil.

Nepenthes acquire their slipperiness from a thin lubricating film that lines the inside surface of the plant. These films are created when water or nectar becomes locked into microscale textures in the plant’s surface, creating a continuous layer of lubrication. When the films come into contact with the oils on the feet of insects, the friction is very low, making it difficult for these creatures to maintain their grip when attempting to climb out.

This technique for slipperiness used by Nepenthes has now been mimicked by Joanna Aizenberg and her colleagues at Harvard University, who have created an “omniphobic” surface that repels oils as well as water. Described as a “slippery liquid-infused porous surface”, or SLIPS, it is fabricated from a sponge-like material composed of a random network of nanofibres, which is then coated in a lubricating film that is immiscible with a broad range of liquids. When a drop of complex fluid, such as crude oil or blood, was placed on the surface, it quickly slid off even if the surface was tilted only slightly.

Robust slipperiness

The researchers say that one big advantage of the new material over alternative slippery surfaces in industry is its robustness. Materials based on the water-repelling properties of lotus leaves, for instance, rely on a layer of trapped air, which can become unstable at high pressures – leading to poor performance or permanent damage. “[Our] lubricating film is intrinsically smooth, making it almost perfectly slippery toward substances of any surface tension,” Aizenberg told physicsworld.com. “Lotus-inspired surfaces have a much harder time repelling liquids with low surface tension, such as oils, since these tend to sink into the spaces between the textures.” She believes that the smooth nature of SLIPS means it could be used to create stain-resistant coatings on optical surfaces, such as solar cells and sensors.

The group is now working closely with other academic institutions to study various features of SLIPS, including its performance at extreme temperatures and under high-shear conditions. They are also seeking industrial partners to commercialize different aspects of the SLIPS technology. “The temperature and pressure stabilities of SLIPS make it ideal for energy-efficient, high-temperature transport of economically important fluids such as crude oil and biofuels,” said Aizenberg. She believes that SLIPS could also be used as an ice-resistant coating for instruments operating in refrigeration technologies, or even in polar environments.

Michael Nosonovsky, a biomimetics engineer at the University of Wisconsin-Milwaukee in the US, agrees that the technology shows a lot of promise. “One could use it for various purposes, such as household appliances, which will require much less cleaning or all applications where moving parts can stick together and prevent proper operation,” he said. Nosonovsky envisages that in the longer run SLIPS could be used in applications where biofouling is undesirable, such as underwater hulls of ships and submarines.

Chuan-Hua Chen, a hydrodynamics researcher at Duke University in the US, is also impressed by the new design. “This is a clever way to develop the slippery surface, which reminds me of the lubricants used in automobile engines and hydraulic machinery,” he said. Chen agrees that using a liquid lubricant eliminates a lot of problems associated with air-filled cavities, though he believes this feature could also be a weakness. “Lubricants would work well in enclosed machinery, but would have to be replenished if exposed,” he explained.

This research is published in Nature.

Orbiting standards lab could improve climate predictions

Policy makers would be much better placed to combat the effects of global warming if scientists had access to accurate measurements of the Earth’s radiation balance from a dedicated satellite, claims an international group of physicists. As well as collecting its own data, the spacecraft would also calibrate other Earth-observation satellites. The group is led by scientists at the UK’s National Physical Laboratory (NPL) and it estimates that the satellite could cut a decade or more from the time needed to make useful projections of global temperature at the end of the 21st century.

Climate scientists have become increasingly convinced that much of the global temperature rise seen over the last 50 years or so is due to the emission of man-made greenhouse gases. But they are not able to predict with any certainty the extent to which temperatures will increase over the course of the coming century. Indeed, the 2007 report from the Intergovernmental Panel on Climate Change said the increase could vary anywhere from about 1 to 6 °C. This uncertainty stems from the fact that a variety of different models are used – each making different assumptions about the Earth’s climate. One of the biggest single sources of uncertainty is the nature and magnitude of the feedback provided by changes to cloud cover as the planet warms.

Time cut to a third

Reducing the uncertainties will involve continued space-based measurement of key climate variables such as cloud cover in order to compare these data with the values predicted by each of the various models. According to Nigel Fox of NPL, today’s space-based instruments require an observing period of 30 or 40 years before the uncertainties can be restricted to a range of about 1–2 °C. At this point governments will know whether and when they need to take major steps to combat climate change, such as building large flood barriers, or whether more modest changes will do the job. However, he and colleagues from the UK, US and Switzerland argue that this period could be cut to just 12 years following the launch of a satellite known as TRUTHS.

TRUTHS would measure the intensity and spectral composition of radiation coming directly from the Sun and radiation reflected back into space from Earth – with an accuracy about 10 times better than existing satellites. At the heart of the spacecraft would be an instrument containing a black cavity that absorbs incoming light. The power of that light is obtained by measuring the cavity’s temperature rise and then using an electrical heater to deliver a known power to cause the same increase in temperature.

This “electrical substitution radiometry” is already used in existing satellites, but is carried out at ambient temperatures, whereas the instrument inside TRUTHS would operate at about –250 °C. As such, it would be as accurate as radiometers used in metrology institutes on the ground. Although this accuracy will degrade with time, the TRUTHS instrument will remain more accurate than today’s instruments.
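The substitution principle itself is simple: the unknown optical power is replaced by a known electrical heater power that produces the same temperature rise in the black cavity. The toy sketch below illustrates the idea with made-up numbers; it is not a description of the TRUTHS instrument.

# Toy model of electrical-substitution radiometry
def temperature_rise(power_W, conductance_W_per_K):
    """Steady-state temperature rise of the cavity for a given heat input."""
    return power_W / conductance_W_per_K

G_th = 1e-4               # thermal conductance of the cavity's link to its heat sink (assumed)
optical_power = 2.3e-4    # the unknown being measured; set here only to generate fake data

dT_optical = temperature_rise(optical_power, G_th)   # observed rise with the shutter open

# Scan the electrical heater until it reproduces the same temperature rise
diff, inferred = min((abs(temperature_rise(P, G_th) - dT_optical), P)
                     for P in (i * 1e-6 for i in range(1000)))
print(f"Inferred optical power: {inferred*1e6:.0f} uW")   # ~230 uW

Running the cavity at cryogenic temperatures, as TRUTHS would, suppresses the stray heat flows that limit how precisely the optical and electrical temperature rises can be matched.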

Taking NPL into orbit

Another satellite would be calibrated by pointing it and TRUTHS at the same bright surface (such as a snow field) and comparing the values obtained by each. “We would be effectively taking NPL into orbit”, says Fox, “just as if we were checking a customer’s light meter against our reference light meter.”

TRUTHS was first proposed to the European Space Agency (ESA) in 2002, and the proposal was updated last year with a €50–100m cost estimate. Since then a very similar but larger NASA mission called CLARREO has been put on hold, so Fox is hoping that ESA, or perhaps even the UK, will back the project on its own. “I’ve no doubt the mission will happen at some point,” he says, “but it is a question of how quickly it will happen.”

Some are unconvinced

However, Michael Mann, a climate scientist at Pennsylvania State University in the US, says he is “unconvinced that such a mission will provide any definitive answers”. In particular, he believes it will be difficult for TRUTHS to quantify cloud feedbacks given the dominant natural year-to-year variability in cloud cover.

Michael Lockwood of Reading University in the UK is more persuaded. He believes that poor calibration between different satellites hampers our understanding of long-term climate change, adding that “future generations will curse us” for not paying more attention to the problem. And he thinks that TRUTHS could offer a way of improving such calibration for measurements of cloud and surface reflectance. But he says the detailed implementation of this improvement still needs to be worked out.

The research is described in Phil. Trans. R. Soc. A 369 4028.

Ferrofluid pump has no moving parts

Scientists in the US have developed a new way of pumping ferrofluids without the use of any mechanical components. They claim that their technique, dubbed “ferrohydrodynamic pumping”, can be easily scaled up or down to be used in microfluidic devices or industrial-scale pumping devices, and anything in between.

Ferrofluids were developed by NASA in the 1960s as a way to pump fuel non-mechanically in space. They fall under the umbrella of “smart fluids” – fluids whose properties can be changed by applying a magnetic or electrical field. Today, ferrofluids have a wide range of applications, being used in liquid O-ring seals, in high-end audio speakers and computer circuitry, as well as in biomedical devices.

Strange brew

Ferrofluids are colloidal liquids made of nanoscale ferromagnetic particles suspended in a carrier fluid. They respond to magnetic fields while retaining liquid properties and can be manipulated by external magnetic fields. The essential difference between ferrofluids and magnetorheological fluids (MR fluids), another type of smart fluid, is the particle size: the nanoparticles in a ferrofluid are kept suspended by Brownian motion and so do not settle under normal conditions, while the particles in MR fluids are micron-sized and too heavy to be suspended by Brownian motion.
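A quick estimate shows why the particle size matters so much. Comparing thermal energy with the work needed to hold a particle up against gravity gives a sedimentation length; the densities below are assumed textbook values rather than figures from the study.

from math import pi

k_B = 1.38e-23      # J/K
T = 300.0           # K
g = 9.81            # m/s^2
delta_rho = 4300.0  # kg/m^3: magnetite (~5200) minus a light carrier oil (~900), assumed

for radius in (5e-9, 5e-7):                       # 10 nm vs 1 um diameter particles
    V = 4.0 / 3.0 * pi * radius**3
    l_sed = k_B * T / (delta_rho * V * g)         # height over which thermal motion beats gravity
    print(f"diameter {2*radius*1e9:6.0f} nm: sedimentation length ~ {l_sed:.1g} m")
# ~0.2 m for 10 nm particles (larger than a typical container, so they stay mixed),
# but only ~2e-7 m for micron-sized particles, which therefore settle out.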

When ferrofluids are exposed to a magnetic field, the bulk of the liquid becomes magnetized and its surface acquires a shape that minimizes the energy of the system. Sometimes, exotic spikes form on the liquid’s surface in the presence of strong magnetic-field gradients – something that has been exploited in some interesting special effects and art projects. “If you have ever seen a movie scene in which there is a strange, black liquid creeping towards the protagonist, seemingly of its own accord – think X-Files – there is a good chance that the liquid is a ferrofluid,” says Hur Koser, one of the authors of the study recently published in Physical Review B.

Fluid loops

Koser and colleagues at the University of Georgia and the Massachusetts Institute of Technology in the US came up with their pumping design after Koser became interested in designing tiny magnetic-field generators during his PhD work. He says that a common query that puzzled his team then was, “Can these machines be used to pump fluids in microfluidic devices?” Creating a tiny motor to pump fluids is complex, but Koser realized that if the fluid itself was magnetic, it could be actuated and pumped without any motors. Koser and colleague Leidong Mao then used computer simulations to work out an experiment to demonstrate such actuated pumping, and built their device. “In retrospect, the experiment was the easy part. The difficulty was in taking into account all of the nonlinearities associated with ferrohydrodynamics in our computer simulations, which took considerably longer – almost years – to conclude,” explains Koser.

The apparatus comprises a closed fluidic loop that they built using PVC pipes bought at the local hardware store. They added manual valves to the loop to stop the circulating flow whenever necessary as well as two pressure ports to measure the pressure created by the electrical windings – many turns of copper tape around the circumference of the tube – in a differential fashion. “We passed electrical current through the windings to create a magnetic excitation that travelled along the length of the tube on one arm of the fluidic loop. The currents were driven by a stereo amplifier, purchased from a local music store. The ferrofluid used was a cheap, commercially available formulation based on mineral oil and magnetite nanoparticles,” says Koser, explaining just how simply their device was built.

Chain reactions

The electromagnetic coils generate a magnetic field, which the researchers refer to as a “travelling wave”. Mao explains that these fields rotate the nanoparticles within the liquid. “We can control the strength, frequency and direction of the travelling waves, which in turn result in locally rotating magnetic fields within the ferrofluid. The field is set up to generate a gradient of nanoparticle rotation – those deeper inside the pipe rotate slower than those near the surface. This spin gradient sets up a shear gradient within the ferrofluid, propelling it linearly,” he says. A large spin gradient means that each particle’s rotation is strongly coupled with those of its neighbours, while a zero spin gradient means the particles’ rotations do not affect each other at all.

The researchers also noted a discrepancy between their simulations and the observed pumping characteristics. They found that individual nanoparticles could not have been responsible for the measured pumping, because the flow they observed requires coupling between the physical rotation of magnetic nanoparticles and the surrounding liquid medium. They therefore deduced that a small percentage of the magnetite nanoparticles must have dynamically formed short, linear, reversible chains under the travelling wave, and that it was the rotation of these chains that led to the differences observed.

The pumping method that the researchers have developed can be used for almost all types of ferrofluids, whether oil- or water-based. Because there is no secondary liquid to pump, the ferrofluid can be optimized individually for maximum shelf-life and optimum pumping. They believe that their technique could lead to compact, integrated and efficient liquid-cooling schemes based on ferrofluids that could be used in miniaturized cooling systems for computers. “Your laptop could be twice as thin and a third lighter and faster with more efficient cooling,” says Koser.

Do neutrinos move faster than the speed of light?

Can particles travel faster than the speed of light? Most physicists would say an emphatic “no”, invoking Einstein’s special theory of relativity, which forbids superluminal travel. But now physicists working on the OPERA experiment in Italy may have found tantalizing evidence that neutrinos can exceed the speed of light.

The OPERA team fires muon neutrinos from the Super Proton Synchrotron at CERN in Geneva a distance of 730 km under the Alps to a detector in Gran Sasso, Italy. The team studied more than 15,000 neutrino events and found that the neutrinos appear to travel at a velocity 20 parts per million above the speed of light.

Simple measurement

The principle of the measurement is simple – the physicists know the distance travelled and the time it takes, which gives the velocity. These parameters were measured using GPS, atomic clocks and other instruments, which gave the distance between source and detector to within 20 cm and the time to within 10 ns.
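Translating those figures into times shows what the claim amounts to; the sketch below uses only the numbers quoted above.

# What "20 parts per million above the speed of light" means over 730 km
c = 2.998e8          # m/s
baseline = 730e3     # m, CERN to Gran Sasso
excess = 20e-6       # fractional speed excess reported by OPERA

t_light = baseline / c
early = t_light - baseline / (c * (1 + excess))
print(f"Light transit time: {t_light*1e3:.3f} ms")                    # ~2.435 ms
print(f"Neutrinos arrive early by ~{early*1e9:.0f} ns")               # ~49 ns
print(f"20 cm of distance uncertainty is worth ~{0.2/c*1e9:.1f} ns")  # ~0.7 ns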

This is not the first time that a neutrino experiment has glimpsed superluminal speeds. In 2007 the MINOS experiment in the US looked at 473 neutrinos that travelled from Fermilab near Chicago to a detector in northern Minnesota. MINOS physicists reported speeds similar to that seen by OPERA, but their experimental uncertainties were much larger. According to the OPERA researchers, their measurement of the neutrino velocity is 10 times better than previous neutrino accelerator experiments.

‘Totally unexpected’

“This outcome is totally unexpected,” stresses Antonio Ereditato of the University of Bern and spokesperson for the OPERA experiment. “Months of research and verifications have not been sufficient to identify an instrumental effect that could explain the result of our measurements.” While the researchers taking part in the experiment will continue their work, they look forward to comparing their results with those of other experiments so as to fully assess the nature of this observation.

Although a measurement error could be the cause of the surprising result, some physicists believe that superluminal speeds could be possible. If confirmed, the finding could help physicists to develop new theories – such as string theory – beyond the Standard Model of particle physics. However, the OPERA measurements will have to be reproduced elsewhere before they are accepted by the physics community.

Jenny Thomas of University College London, who works on MINOS, said “The impact of this measurement, were it to be correct, would be huge. In fact it would overturn everything we thought we understood about relativity and the speed of light.”

Alexei Smirnov, a high-energy physicist at the Abdus Salam International Centre for Theoretical Physics, Italy, says that he finds the OPERA result “extremely surprising” because, while some small deviation could have been expected, the observed deviation is very large – much larger than what is expected from even very exotic theories. “If this result is proved to be true, the consequences for modern science would undoubtedly be enormous,” he says. He agrees with the conclusion of the OPERA collaboration that currently unknown systematic effects should be looked for and that the observations should continue. Smirnov was one of three researchers who discovered the “matter” (MSW) effect that modifies neutrino oscillations in matter.

Talking about neutrinos

On Friday afternoon, OPERA researcher Dario Autiero from the Institut de Physique Nucléaire de Lyon discussed the details of the experiment at a seminar at CERN. Autiero addressed possible reasons for the result, taking into consideration everything from inherent errors during the calibration of clocks to tidal forces and the position of the Moon with respect to CERN and Gran Sasso at the time of the readings.

They considered the possibility of problems internal to the detector itself, the chances of which OPERA researchers say were reduced thanks to the independent external calibration methods they used. They also discussed whether it would be possible to re-create the results at different energies. “We don’t claim energy dependence or rule it out with our level of precision and accuracy,” said Autiero. The final note of the seminar seemed to suggest that the real reason remains a mystery for the time being and that further analysis will be required.

The research is described in arXiv:1109.4897.

Electrons surf between qubits

Two independent groups of physicists have taken an important step towards the creation of a practical quantum computer by showing how to transfer single electrons over relatively long distances between quantum dots. Both schemes involve using sound waves on the surface of a material to propel electrons between the quantum dots – which are sub-micron-sized pieces of semiconductor. The teams are confident that they will soon be able to show that electrons arrive at their destination with their quantum information intact, making the system a viable “quantum data bus” for a quantum computer.

Quantum computers, which exploit purely quantum phenomena such as superposition and entanglement, should in principle be able to outperform classical computers at certain tasks. But building a practical quantum computer remains a challenge because the physical entities that store and transfer quantum bits (qubits) of information are tricky to implement and are easily destroyed.

The advantage of using quantum dots as qubits is that they can hold zero, one or two electrons, thereby defining the “logic state” of the qubit data. Furthermore, two electrons in a dot are entangled – a condition that persists even if one electron is carefully removed and transported some distance away. This process, which is known as “quantum teleportation”, can play an important role in quantum computers.

Avoiding decoherence

While physicists can reliably transfer a single electron short distances between adjacent quantum dots, moving it around an integrated circuit containing hundreds or thousands of qubits is a significant challenge. The problem is that an electron in a metal or semiconductor travels through a vast “sea” of other electrons that can destroy the entanglement. One way to avoid this “decoherence” is to essentially drain the sea of electrons from the appropriate channels in the circuit – effectively making them insulating. The challenge is then how to give the electron enough energy to send it flying through the channel without causing decoherence.

Now, however, Tristan Meunier and colleagues at the Institut Néel in Grenoble, the University of Tokyo and the University of Bochum in Germany – and, independently, Rob McNeil and colleagues at the University of Cambridge in the UK – have devised a way to deliver that kick. Both teams fabricated similar semiconductor devices, each with two quantum dots separated by several microns. In both cases the dots are connected by a narrow semiconductor channel between two electrodes.

To deplete the channel of all its conduction electrons, both teams applied a negative voltage to both electrodes. The kick is supplied by a piezoelectric actuator that injects a surface acoustic wave (SAW) pulse along the channel. A SAW is a sound wave that travels on the surface of a material, where it causes the positive ions in the channel to oscillate back and forth. The result is a changing electric field that drives the electron forward.

Extremely fast transfer

Meunier and colleagues employed one piezoelectric actuator, which was able to drive an electron the 3 µm between the two dots in just 1 ns. This is much quicker than the several microseconds it takes for decoherence to destroy a quantum dot qubit, something that is essential for a practical quantum computer, according to Meunier. Meanwhile, in Cambridge, McNeil and colleagues used two opposing piezoelectric actuators to bounce an electron back and forth between quantum dots separated by 4 µm. Indeed, McNeil said that they were able to do this up to 60 times, which means that the electron travelled a total of 0.25 mm.
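A quick check of the quoted figures gives a feel for the speeds involved; the comparison with typical surface-acoustic-wave speeds in GaAs-based devices is an assumption, not a figure given by either team.

# Consistency check of the quoted transfer figures
grenoble_distance = 3e-6     # m between the two dots (Meunier and colleagues)
grenoble_time = 1e-9         # s
print(f"Implied transport speed: {grenoble_distance/grenoble_time:.0f} m/s")   # ~3000 m/s
# comparable to the few-km/s surface-acoustic-wave velocities typical of GaAs

cambridge_distance = 4e-6    # m between the dots (McNeil and colleagues)
bounces = 60
print(f"Total distance after 60 transfers: {bounces*cambridge_distance*1e3:.2f} mm")  # ~0.24 mm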

Both experiments were carried out at extremely low temperature, which means that there are few random sound waves in the channel that would cause decoherence. The SAW wave itself is coherent and should not destroy entanglement, according to McNeil. However, neither team has confirmed that the electron does not suffer decoherence on its journey – something that both labs are currently investigating.

The work of both teams is described in two separate papers in Nature.

Is Sheldon an inspiration or a grotesque parody?

By James Dacey


Earlier this week US actor Jim Parsons picked up an Emmy Award for his portrayal of Sheldon Cooper, the socially inept physics postdoc, on the hit CBS TV comedy show The Big Bang Theory. Parsons received the award for “outstanding lead actor in a comedy series” at the ceremony in Los Angeles on Saturday night, as reported by my colleague Matin Durrani. Much of the show’s appeal stems from the relationship between Sheldon and his friends and colleagues (two other physicists and an engineer), and their interactions with “near-normal” neighbour Penny (played by Kaley Cuoco).

The idea of humour on screen being derived from a geeky scientist is not particularly new: Eddie Murphy in The Nutty Professor and Christopher Lloyd in the Back to the Future series are two obvious examples that spring to mind. But the thing that strikes me as novel about The Big Bang Theory is that the vast majority of the humour comes from the geeks’ responses to everyday situations, outside of their work. A rich source of humour, for instance, is Sheldon’s excessively analytical approach to social situations, where he is aware of what “people” do in given situations but is not sure why.

On the one hand, it is refreshing to see that viewers have taken Sheldon and his crew into their hearts, and they seem to love him because of all his physics geekiness. But on the other hand, it is rarely clear whether we are laughing with Sheldon or at him. The extreme view is that Sheldon is a grotesque parody of a socially inept physicist who simply does not fit in with everyday life.

We’d like to hear your thoughts about this. Which of the following statements best describes your feelings about Sheldon?

He’s got me down to a tee!
He’s an exaggerated version of a physicist for comic effect
He’s a grotesque parody that insults physicists
Who is Sheldon?

Have your say by taking part in our Facebook poll. And please feel free to explain your answer by posting a comment on the poll.

Last week’s poll addressed the issue of money, given that the worsening economic conditions on either side of the Atlantic have kept fiscal affairs in the headlines of late. We asked you: “Can ideas borrowed from physics lead us to financial recovery?”

80% of respondents said yes and 20% said no. This suggests that there is still faith in the ability of science to predict “the madness of men”, as Newton once described stock trading after losing a lot of money in the South Sea Bubble. Luis Rico, one of the respondents who voted yes, believes that one of the main advantages of applying physics ideas to economics is “the lack of both political bias and conflicts of interest”. He believes that economics needs to develop more sophisticated systems to better reflect the real world. “Working with a single model of a complex system that has already proven to fail seems unnatural to me and the inability to question axioms makes impossible any real advance.”

Thank you for all your responses. And check physicsworld.com next Thursday for the results of our latest poll.

Searching for a star

By Matin Durrani


I learned earlier today that Lord Sainsbury – the former UK science minister – is launching a search for the UK’s “most inspirational technician”.

It seems a worthwhile initiative, given how important lab technicians are for the smooth running of science. (We can all probably speak from experience – I recall some fabulous technicians during my time at the Cavendish Laboratory in Cambridge, including Dick the glassblower, who once saved my bacon after I blew up a mercury thermometer that I’d left too long in a beaker on a hot plate. The beaker dried out and the thermometer exploded. Fortunately the embarrassing incident took place inside a fume cupboard and was not witnessed by anyone else.)

Supported by Sainsbury’s Gatsby Foundation and STEMNET – a charity that tries to get young people involved in science, technology, engineering and maths (STEM) – the award seeks to recognize “the excellent work of technicians who inspire young people to follow technical careers” and to improve the image of a profession in “high demand by employers”. It is one of five categories in the National STEMNET Awards 2011, sponsored by the Science and Technology Facilities Council (STFC), the others being for best teacher, best employer, best STEM club and best STEM ambassador.

There is no limit to the number of categories you can nominate in, and all finalists will be decided by an expert panel. The deadline for nominations is Monday 3 October.

The top technician – and the other award winners – will win a day trip to the CERN particle-physics lab in Geneva, sponsored by the STFC. Winners will be announced at an awards ceremony at the House of Lords in December.

If he were alive today, I reckon veteran Cavendish lab technician Ebenezer Everett would be in the running for an award – by all accounts he did some fabulous work that played a key role in J J Thomson’s discovery of the electron in the late 1890s.

The reason I mention Everett is that I recently came across the following passage in Robin Strutt’s biography of Thomson, which appeared in the Cavendish magazine CavMag last year, concerning the switching on of a powerful electromagnet surrounding a discharge tube.

JJ: “Put the magnet on.”

There followed a click as Everett closed the large switch.

JJ: “Put the magnet on.”

Everett: “It is on.”

JJ: (eye still to the microscope) “No, it isn’t on. Put it on.”

Everett: “It is on.”

A moment later JJ called for a compass needle. Everett returned with a large needle 10 inches long. JJ took it, and approached the electromagnet. When about a foot away the needle was so strongly attracted to the electromagnet that it swung round and flew off its pivot, crashing into the bulb (which burst with a loud report) and coming to rest between the poles of the magnet. Everett was glowing with triumph, and JJ looking at the wreck with an air of dejection.

“Hmm,” he said. “It was on.”

Neutrons for the future at the Institut Laue-Langevin

Since its reactor first went critical 40 years ago, the Institut Laue-Langevin (ILL) in Grenoble, France, has maintained its reputation as Europe’s leading centre for neutron scattering. In this audio interview, ILL’s scientific director Andrew Harrison explains how the lab is in the middle of a major upgrade that aims to maintain its position as one of the world’s leading neutron centres.


This effort is particularly important now that construction of the European Spallation Source (ESS) in Sweden is set to get under way. The ESS is an accelerator-based facility that will offer a wide range of researchers, from biologists to engineers, neutron beams that are not available at ILL. However, Harrison insists that ILL will not become a white elephant when the ESS comes online in 2025 and explains how the two facilities will in fact complement each other.

Indeed, physicists at ILL have plenty of experience of working with other major science facilities because Grenoble is also home to the European Synchrotron Radiation Facility (ESRF) and several other major research institutes. Harrison explains how this brings the best science and scientists to ILL and continues to encourage the development of other facilities in Grenoble.

The recent trend towards accelerator-based neutron sources, such as the ESS, is, however, leading to a fall in the number of research reactors worldwide, which is a concern to the medical community as it could threaten the supply of medical isotopes. Nevertheless, Harrison has some good news for medical physicists because, as he explains, the ILL has several pilot projects to look at how it could produce isotopes – particularly those that are not easily made elsewhere. Harrison also discusses how ILL is working with a commercial isotope supplier to establish how the institute’s high-flux reactor could serve the medical community.
