
Do citations take the shine off the Nobel prize?

Come October, it will be the 10 million kronor question: who’s going to win the 2008 Nobel Prize for Physics?

Of course, every physics buff will have a tip to share. But if a study by researchers in Canada is anything to go by, it is much harder to predict — and perhaps choose — winners today than it was half a century ago.

Yves Gingras and Matthew Wallace at the University of Quebec, Montreal, based the study on citation data of physics and chemistry Nobel laureates from the prize’s inception in 1901, when Wilhelm Röntgen picked up the first physics award for the discovery of X-rays, through to last year, when Albert Fert and Peter Grünberg shared the physics prize for their discovery of giant magnetoresistance.

Using this data, Gingras and Wallace ranked the laureates among their contemporaries in terms of how often their papers are cited by others, and then analyzed how the ranks changed with time. The result was a year-by-year list of the most influential scientists, which the researchers could compare with the awarding of Nobel prizes.
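To make the method concrete, here is a minimal sketch of how such a year-by-year ranking could be computed. It uses invented records and author names purely for illustration; it is not the authors' actual code or data.

```python
from collections import defaultdict

# Hypothetical (author, year, citations received that year) records.
# Real bibliometric data would come from a citation index; these numbers are invented.
records = [
    ("roentgen", 1901, 150), ("curie", 1901, 110), ("planck", 1901, 90),
    ("roentgen", 1902, 120), ("curie", 1902, 140), ("planck", 1902, 95),
]

# Total citations per author in each year
yearly = defaultdict(lambda: defaultdict(int))
for author, year, cites in records:
    yearly[year][author] += cites

# Rank each author among their contemporaries, year by year (rank 1 = most cited)
for year in sorted(yearly):
    counts = yearly[year]
    ordered = sorted(counts, key=counts.get, reverse=True)
    ranks = {author: position + 1 for position, author in enumerate(ordered)}
    print(year, ranks)
```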

‘More difficult now than ever’

In the run-up to 1945, Gingras and Wallace’s analysis portrays Nobel laureates as leading lights in physics. The data reveal that, during the course of the scientists’ careers, their citation rankings peaked strongly about a year before their Nobel recognition, implying they were dead certs. After they received the prize, their rankings decreased more slowly — presumably, say Gingras and Wallace, propped up by the kudos or “halo effect” of being a laureate.

After 1945, however, it is a different story. With every year gone by, the ranking of Nobel laureates goes down and down until they are barely distinguishable from other top-level scientists. In fact, in the period from 1971 to 2007 there appears to be no peak in the citation ranking of the prize winners at all (arXiv:0808.2517).

Gingras and Wallace attribute the lack of stand-out laureates to the growing number of sub-disciplines in science: “It is obvious that science has grown exponentially over the 20th century…to such an extent that the fragmentation of science makes it more difficult now than ever to identify an obvious winner for a discipline as a whole.

“Whereas it was still relatively easy around 1910 to know who the most important scientists in a discipline were, such a judgement is much more difficult since at least the 1970s.”

Prediction is ‘almost futile’

The analysis raises the question of whether the Nobel Prize for Physics is as prestigious as it was in the first half of the 20th century. Gingras and Wallace stop short of making this connection, though they do suggest that the Nobel committee has a much harder time picking out the best candidates; this might be borne out in the fact that there have been no unshared awards in the past 15 years. Certainly, they say that “the game of prediction” is “almost futile”.

“It is true that it takes longer for recognition now than, say, in the 1920s and 1930s,” says Lars Brink, a member of the 2008 Nobel Prize for Physics Committee. “This is partly due to the fact that physics is more developed and that it takes a longer time to do experiments and to verify theoretical ideas.”

Brink adds, however, that he does not think this makes the job any harder for him and his colleagues. “It has always been a difficult and time-consuming job,” he says.

The Quebec researchers also sidestep the fact that Nobel prizes are, more often than not, awarded for a single discovery rather than for a scientist’s overall standing in citation league tables. As Arne Tiselius, erstwhile head of the Nobel chemistry committee, wrote: “You cannot give a Nobel for what I call ‘good behaviour in science.’”

Spinning electrons make for an unconventional metal

An international team of physicists has turned a common semiconductor material into an unconventional type of metal called a “non-Landau–Fermi liquid”.

While this is not the first such metal to be made, the team claims that it is the first that can be described by a simple theoretical model. This could help physicists understand more complicated non-Landau–Fermi liquids such as high-temperature superconductors and also could lead to new ways of controlling spin-polarized electrons.

One of the most remarkable properties of individual conduction electrons in most metals is that they appear to go about their business oblivious to the fact that they are crammed into a solid lattice along with some 10²² other electrons and ions. Instead of ricocheting from one collision to the next, the electrons appear to flow smoothly through a metal much like a fluid through a pipe.

Non-interacting electrons

This behaviour was first explained in 1956 by the Soviet physicist Lev Landau, who argued that the effects of the interactions between the electrons could be accounted for by describing the system as a collection of non-interacting electrons, called “quasiparticles”, each with an “effective mass” greater than the mass of a free electron.
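In this picture the low-energy excitations near the Fermi surface behave like free particles with a renormalised mass. In standard textbook notation (not taken from the article), with Fermi wavevector k_F and Fermi energy ε_F, the quasiparticle energy is approximately:

```latex
% Landau quasiparticle dispersion linearized about the Fermi surface; interactions
% renormalise the bare electron mass m_e into a larger effective mass m*.
\[
  \varepsilon(k) \;\simeq\; \varepsilon_F + \frac{\hbar^{2} k_F}{m^{*}}\,(k - k_F),
  \qquad
  m^{*} > m_e .
\]
```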

The Landau-Fermi model relies on the idea of “screening” — that the electrostatic forces within a cloud of negatively charged electrons are cancelled exactly by the positively charged ions. Similarly, the interactions between the spin magnetic moments of electrons and ions are also screened.

Landau–Fermi liquid theory has been extremely successful at describing the behaviour of systems as varied as metals and liquid helium. However, physicists have discovered a small but growing number of materials in which electrons move freely but cannot be described by the Landau–Fermi liquid model. The most notable are the high-temperature cuprate superconductors, which have proven very difficult to understand.

Researchers are therefore keen to understand the properties of such “non-Fermi liquids”, but have yet to find a simple system that can be studied both experimentally and theoretically.

Simple theory

Now Gabriel Aeppli at the London Centre for Nanotechnology and University College London, and colleagues at the National University of Lesotho, Louisiana State University and Bell Laboratories in New Jersey, have discovered that doped iron silicide (FeSi) can be a non-Landau–Fermi liquid. What’s more, they say that the behaviour can be predicted by a simple theory (Nature 454 976).

FeSi is a small-gap semiconductor that has been widely studied because its optical and electronic characteristics can be easily altered by doping it with impurities such as aluminium and cobalt. While this creates disorder in the material, the doped FeSi samples studied previously were Landau–Fermi liquids.

In this new work Aeppli and colleagues doped FeSi with manganese atoms, each contributing an ionic core with a spin magnetic moment of one. However, each manganese impurity also creates a positively charged “hole”, which is a quasiparticle with a spin magnetic moment of 1/2. This means that the spin magnetic moments of the impurities cannot be fully screened by the spins of the holes.

Quantum ambiguity

At very low temperatures of a few kelvin, quantum mechanics dictates that some manganese impurities will be fully screened while the rest are not screened at all. According to Aeppli, this ambiguity over which impurities are being screened at any particular time means that the quasiparticles in the material cannot endure for long enough to constitute a Landau–Fermi liquid.

The team confirmed this by measuring the electrical conductivity of the doped material as a function of temperature. The conductivity of a disordered Landau–Fermi liquid always varies as the square root of the temperature, but that was not the case for the manganese-doped FeSi. The team also applied a relatively weak magnetic field to the sample, which caused the material to conduct like a disordered Landau–Fermi liquid. The team believes that the manganese spins tend to align themselves in the direction of the magnetic field, thus removing the quantum-mechanical ambiguity.
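As a rough illustration of the kind of scaling check described above (a sketch with synthetic numbers, not the team's actual analysis), one can fit low-temperature conductivity data to σ(T) = σ₀ + A·T^α and ask whether the exponent α is close to the disordered-Fermi-liquid value of 1/2:

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic low-temperature conductivity data (arbitrary units), for illustration only.
T = np.linspace(0.1, 4.0, 40)                  # temperature in kelvin
sigma = 10.0 + 2.0 * T**0.67                   # pretend the true exponent is not 1/2
sigma += np.random.default_rng(0).normal(0, 0.02, T.size)

def model(T, sigma0, A, alpha):
    """sigma(T) = sigma0 + A * T**alpha"""
    return sigma0 + A * T**alpha

popt, _ = curve_fit(model, T, sigma, p0=(10.0, 1.0, 0.5))
print("fitted exponent alpha = %.2f" % popt[2])
# A disordered Landau-Fermi liquid would give alpha close to 0.5;
# a clearly different exponent signals non-Fermi-liquid behaviour.
```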

According to Aeppli, it should be possible to take advantage of this effect in electronic devices in which the electrical current flowing through a junction can be switched between a Landau–Fermi liquid and non-Landau–Fermi liquid. Such devices could find use in spintronics — electronic circuits that use both the charge and spin of electrons to process information.

This quantum ambiguity is an example of quantum entanglement – a process that is vital to the operation of quantum computers. As a result, Aeppli believes that the effect could be used to control the transfer of quantum information from stationary electrons found near the cores of ions to moving electrons.

Mooning over ultrahigh-energy neutrinos

By Hamish Johnston

When it comes to designing detectors for neutrinos, the bigger the better.

At the South Pole, for example, physicists have begun work on the IceCube experiment, which will pepper a cubic kilometre of ice with over 4000 photomultiplier tubes with the aim of detecting tiny bursts of light created when neutrinos interact with the ice.

However, this experiment is tiny by comparison to the NuMoon experiment, which is trying to use the Moon to detect ultra-high energy neutrinos from the far reaches of the universe.

The NuMoon collaboration has just published an analysis of the first 10 hours of observation on the arXiv preprint server.

NuMoon uses the Westerbork Synthesis Radio Telescope in the Netherlands to look for short radio pulses that are believed to occur when an ultra-high energy neutrino creates a cascade of charged particles within the layer of rocks and sand that covers the Moon.

These particles move through this rubble at speeds faster than the local speed of light, creating pulses of Cherenkov radiation that can be detected on Earth using a radio telescope.
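“Faster than the local speed of light” here means faster than light’s phase velocity in the lunar material itself, which has a refractive index n greater than 1. The standard Cherenkov condition and emission angle are:

```latex
% Cherenkov radiation is emitted when the particle speed v exceeds the phase
% velocity c/n of light in the medium; it appears on a cone of half-angle theta_c.
\[
  v > \frac{c}{n},
  \qquad
  \cos\theta_c = \frac{c}{n\,v}.
\]
```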

The NuMoon team reckon that 100 hours of data will allow them to set the best limit yet on the “GZK neutrino flux” — the number of neutrinos with energies in excess of 10²⁰ eV that pass through the Moon.

Such neutrinos are believed to be produced when ultra-high energy cosmic rays from distant sources scatter from the cosmic microwave background. While this scattering prevents the cosmic rays themselves from reaching Earth, much could be learned about their origins (such as massive black holes) by studying GZK neutrinos.
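A rough back-of-the-envelope estimate (standard kinematics, not a number from the NuMoon analysis) shows why this scattering only matters at such extreme energies. Photopion production on a CMB photon of energy E_γ of order 10⁻³ eV in a head-on collision requires a proton energy of roughly:

```latex
% Threshold for p + gamma_CMB -> p + pi, using m_p c^2 ~ 0.94 GeV and
% m_pi c^2 ~ 0.14 GeV; this is the origin of the GZK cut-off scale.
\[
  E_{\mathrm{th}} \;\approx\;
  \frac{m_{\pi}c^{2}\left(m_{p}c^{2} + \tfrac{1}{2}m_{\pi}c^{2}\right)}{2E_{\gamma}}
  \;\approx\;
  \frac{(0.14\ \mathrm{GeV})\,(1.0\ \mathrm{GeV})}{2\times 10^{-3}\ \mathrm{eV}}
  \;\sim\; 7\times 10^{19}\ \mathrm{eV}.
\]
```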

This is not the first time that astronomers have tried to use the Moon as a giant neutrino detector. The idea was first proposed about 20 years ago and there have since been two other experimental attempts — including the GLUE experiment, which failed to spot any ultra-high energy neutrinos.

And if NuMoon fails to detect any neutrinos, the team plan to use successively more powerful radio telescopes, such as the Lofar array currently under construction in the Netherlands and, ultimately, the Square Kilometre Array that should be built by 2020 in either South Africa or Australia.

In the dark about dark matter

By Jon Cartwright

Has a European satellite detected dark matter? That’s the question on the lips of many people who attended the recent International Conference on High-Energy Physics (ICHEP) in Philadelphia, US.

Several physicists who attended the conference have told me that Mirko Boezio, a representative of the PAMELA (Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics) mission, briefly showed data depicting an excess of high-energy positrons in the ionosphere. If true, it would seem to be evidence of annihilating dark matter — an elusive substance thought to make up some five-sixths of all matter in the universe.

Unfortunately, neither Mirko Boezio nor the principal investigator of PAMELA, Piergiorgio Picozza, wants to comment on their data. They told me this was because they are planning to publish in either Nature or Science, and are therefore prohibited from talking to journalists because of those journals’ embargo policies. (Another little birdie told me that the PAMELA team is specifically aiming to submit to Nature by September, so if they fast-track it we might get to see the paper before Christmas.)

I’m going to tell you all I know about this, because frankly it’s not that much at the moment. The slides available from the ICHEP website only show positron data up to about 6 GeV, which doesn’t reveal much. Slightly better is a slide from another PAMELA team member, Elena Vannuccini, who gave a presentation at the recent SLAC Summer Institute.


Entanglement remains a mystery

If two particles are “entangled”, so quantum mechanics says, any tinkering with one can cause an instantaneous change in the other, no matter how far apart they are.

Einstein rejected this notion as “spooky action at a distance”. But what if quantum mechanics is not quite right — what if the change is not instantaneous, but instigated by a signal transmitted between the two entangled particles? Now an experiment performed in Switzerland has shown that, if such a signal does exist, it would have to travel at least as fast as light, and probably thousands of times faster.

The experiment, which has been performed by Nicolas Gisin and colleagues from the University of Geneva, is similar to other experiments that attempt to test entanglement, albeit on a larger scale. The researchers first entangle two photons at Geneva, and then send them 9 km in opposite directions — due east and west — to interferometers based at the Swiss villages of Jussy and Satigny. At these two locations they look for any interference between the photons. If the interference is above a reasonable level — given by the so-called Bell inequality — it implies the photons are changing their properties instantaneously to suit each other.
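The “reasonable level” referred to here is the bound set by a Bell-type (CHSH) inequality. In textbook notation (a general statement, not the specific inequality used in this experiment), where E(a,b) denotes the correlation between measurement outcomes for analyser settings a and b:

```latex
% CHSH form of the Bell inequality: local hidden-variable models obey |S| <= 2,
% while quantum mechanics allows correlations up to |S| = 2*sqrt(2).
\[
  S = E(a,b) - E(a,b') + E(a',b) + E(a',b'),
  \qquad
  |S|_{\mathrm{local}} \le 2,
  \qquad
  |S|_{\mathrm{quantum}} \le 2\sqrt{2}.
\]
```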

Any reference frame

On its own, this experiment would not rule out the possibility that the photons are signalling to one another. This is because, according to Einstein’s theory of special relativity, the measurements may not be synchronous from the point of view of a moving observer or “reference frame”. In other words, there may be a finite time gap for the signal to be sent.
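The finite time gap is just the relativity of simultaneity. As an illustration (standard special relativity, not the paper’s full analysis): for an observer moving at speed v = βc along the line joining the two detectors, two measurement events that are simultaneous in the Earth frame and separated by a distance d acquire a non-zero time offset:

```latex
% Lorentz transformation of the time interval between the two measurement events;
% with Delta t = 0 and Delta x = d in the Earth frame, the moving observer sees
% a gap proportional to beta*d/c.
\[
  \Delta t' = \gamma\left(\Delta t - \frac{v\,\Delta x}{c^{2}}\right)
            = -\,\gamma\,\frac{\beta\,d}{c},
  \qquad
  \gamma = \frac{1}{\sqrt{1-\beta^{2}}}.
\]
```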

To get around this, Gisin and colleagues performed their experiment many times over a 12-hour period. The rotation of the Earth throughout this time means that the researchers could put a limit on the duration of the time gap for any reference frame. They found that even for a reference frame that would produce the biggest gap — one moving relative to the Earth at close to the speed of light — the signal itself would have to travel more than 10 times the speed of light. For more realistic reference frames — say, one moving at a thousandth the speed of light relative to Earth — the signal would have to travel even faster, at more than 10,000 times the speed of light (Nature 454 861).

Gisin told physicsworld.com that his team’s work, which he says is the first to take every hypothetical reference frame into account, “confirms the predictions of quantum theory”. He also hopes it will enable other researchers to find a more palatable explanation for the mysteries of entanglement.

Feynman 50 years ago

By Matin Durrani

It’s time for me to bow out of the LT25 low-temperature conference here in Amsterdam, which has just ended. The cool crowd will reconvene in three years’ time for LT26, which, I can reveal, will take place in Beijing, at a venue next to the current Olympic park. It will be the first time that China has hosted this triennial shindig.

Conference organiser Peter Kes from the University of Leiden gave some amusing insights into organising a conference of this scale, which saw a staggering 1482 participants. For example, stuffing the massive 380-page (double-sided) conference brochure into delegates’ shoulder bags required a small army of students, who hit a peak rate of 450 bags stuffed per hour.

Then there were the logistics of bussing 640 delegates on a trip to the University of Leiden to see the lab where Heike Kamerlingh Onnes won the race against Scottish physicist James Dewar to liquefy helium 100 years ago last month. Plus sorting out the conference dinner for nearly 600 people, which included hiring a flotilla of nine boats for the trip from the conference halls into town.


Freezing physics

[Image] Levitating a magnet using liquid nitrogen. (Credit: Yorick van Boheemen)

By Matin Durrani

Tucked away in the corner of the foyer at the RAI Convention Center in Amsterdam, where the 25th International Conference on Low-Temperature Physics has been taking place for the past week, I found a series of great little demonstrations by a group of students from the University of Leiden.

The students were showing highlights from a roadshow — dubbed “Freezing physics” — that they perform at about 120 schools and numerous science fairs around the Netherlands each year in an attempt to get people hooked on physics.

You won’t be surprised to find the usual “ooo, watch how this rubber band/tennis ball/banana goes really stiff when we dunk it into a bucket of liquid nitrogen” demonstrations, which are a staple of many public shows of this kind.

But the students, known collectively as the Rino Foundation, had some clever stuff up their sleeves too. One involved using the frozen banana to hammer a nail into a piece of wood. Another saw a hand-bell being cooled in liquid nitrogen and then rung after being frozen. As the material had stiffened considerably, the bell’s ring tone was much higher than when warm.


Camera captures at record rate

Around 130 years ago, a wealthy businessman enlisted the expertise of British photographer Eadweard Muybridge to settle, once and for all, the then-popular question of whether horses lift all four of their legs off the ground at once when they trot. Rigging together a dozen or so cameras with a neat shutter mechanism, Muybridge not only proved that trotting horses do indeed spend fleeting moments in mid-air, but also demonstrated a way to expose images within two-thousandths of a second of one another.

High-speed photography has come a long way since then, and is used for more prosaic tasks, such as examining crash tests, ballistics and biomechanics. Modern systems can take photos hundreds of thousands of times per second at high resolutions, or even millions of times per second at lower resolutions. But now researchers in Jordan and the US have come up with a system that could potentially give the best of both worlds. “The combination of image quality, frame rate and frame count [our] camera system is capable of is unprecedented,” says Ala Hijazi of Hashemite University.

The main problem with achieving high frame rates in photography is that cameras are, on the whole, slow to respond. It is possible to capture an instantaneous image by leaving a camera’s shutter open in the dark and lighting up the desired object for an instant with a flash lamp. Unfortunately, if this principle is extended to several cameras and flashes, the result is a string of messy multiple exposures.

Selective wavelengths

Hijazi and his colleague Vis Madhavan of Wichita State University found that they can get around this problem by using four camera-and-flash combinations, each sensitive to only one particular wavelength — either 440, 532, 600 or 650 nm. This means that each camera can capture its own instantaneous image without interference from the other three.

“This concept allows the time separation between each image that is captured to be infinitesimally small,” explains Hijazi. “[It] results in a camera system that can capture a sequence of high-resolution images at ultra-high speeds.”

Hijazi and Madhavan’s prototype system — which employs charge-coupled devices (CCDs) for the cameras and dual-cavity Nd:YAG lasers for the flashes — is capable of producing four images in succession at 200 MHz (200 million frames per second), or eight images in succession at 8 MHz (Meas. Sci. Technol. 19 085503). However, Hijazi thinks that the system should be able to take up to 100 megapixel-resolution images at frame rates in the gigahertz range. This would make it faster than both electronic high-speed cameras, which can only produce frame rates of the order of 100 kHz, and “rotating drum” or “rotating mirror” cameras, which can produce low-resolution images at up to 20 MHz.
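To put those repetition rates in perspective (simple arithmetic, not a figure from the paper), the time between successive frames is just the reciprocal of the rate:

```latex
% Inter-frame separation at the quoted repetition rates:
\[
  \Delta t = \frac{1}{f}:
  \qquad
  f = 200\ \mathrm{MHz} \;\Rightarrow\; \Delta t = 5\ \mathrm{ns},
  \qquad
  f = 8\ \mathrm{MHz} \;\Rightarrow\; \Delta t = 125\ \mathrm{ns}.
\]
```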

The researchers think their system will be useful in imaging materials during high-speed machining. They have already set up a company called Spectrum Optical Solutions, which they say will begin marketing the high-speed camera system in the second quarter of next year.

Are supersolids not so super?

By Matin Durrani

This is my second full day at the 25th International Conference on Low-Temperature Physics in Amsterdam — LT25 in the jargon — and it’s been a busy morning, despite last night’s marathon conference dinner at the five-star Hotel Krasnapolsky that lasted until gone 11 p.m.

Almost 600 delegates, myself included, were treated to a fairly decent three-course dinner that culminated in what was billed as a “grand dessert buffet”, which seemed to take forever to set up. Thankfully the wait for the profiteroles, fruit slices and cheesecake was ameliorated by a performance by a Dutch philosophy-graduate-turned-magician, whose name escapes me but who did some clever things with various delegates’ wedding rings.

We were also serenaded by a roving accordion player and guitarist who went from table to table and who claimed they could sing songs in 24 different languages. Which was great, I suppose, as long as you didn’t mind the fact they were all sung with a painfully thick Dutch accent. A Malaysian guy on my table, for example, seemed pretty unconvinced by the pair’s children’s song about a parrot.

But back to the physics. This morning I sat in on a session on “supersolids” — a strange new form of matter that some physicists think exists when helium-4 is cooled to sufficiently low temperatures and subjected to a high enough pressure. The jury is still out on whether this form of matter exists, although the consensus, as far as I could tell from today, is that it does.


LHC sees first protons

Physicists at the CERN laboratory near Geneva are starting the week with a spring in their steps, having successfully injected the first protons into the Large Hadron Collider (LHC) over the weekend. The test saw protons travelling 3 km through one of the LHC’s eight sectors, which bodes well for the start-up proper on September 10.

“There are a lot of very happy people here today,” says CERN spokesperson James Gillies. “The test couldn’t have gone better.”

The main purpose of the injection test was to synchronize the LHC with the smaller accelerators that will feed it with protons. When the machine is up and running, pulsed magnets have to “kick” the proton bunches from one accelerator into another with nanosecond precision.
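To see why nanosecond precision is needed (a rough illustration rather than a CERN specification): the protons travel at essentially the speed of light, so a timing error of one nanosecond corresponds to a displacement of about 30 cm along the beamline.

```latex
% Distance travelled by a near-light-speed proton in one nanosecond:
\[
  \Delta x \approx c\,\Delta t \approx
  \left(3\times 10^{8}\ \mathrm{m\,s^{-1}}\right)\times\left(10^{-9}\ \mathrm{s}\right)
  \approx 0.3\ \mathrm{m}.
\]
```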

At 15:20 on Friday, a small bunch of protons was successfully kicked out of the Super Proton Synchrotron (SPS) using a pulsed magnet and sent down a 2.7 km-long transfer line towards the LHC. Then at 21:40, after a few hours spent optimizing the process, one bunch was kicked out of the transfer line into the LHC where — to the surprise of many — it travelled about 3 km until it was stopped by a screen (see image).

“The passage of the beam first time caused some excitement in the control room, and champagne was rolled out,” machine operator Roger Bailey told physicsworld.com. “We expected to have to thread the protons round using position detectors and local correction magnets, but now we know that the fields and polarities of several hundred superconducting magnets are pretty much okay.”

The test was repeated several times on Saturday, giving the operations team lots of data to help make the start-up as smooth as possible. A similar test for protons travelling in the other (counter-clockwise) direction is planned for the weekend of August 22.

Start-up fever

In its search for new fundamental particles, the LHC will produce the highest energy densities ever created in a lab. But the project has not been a smooth ride, with inevitable technical problems and cost overruns that have forced the machine to slip at least five years behind schedule.

Now, after nigh-on three decades — half of which has been spent building the CHF6bn collider and its four gargantuan detectors — CERN is on the finishing stretch. Almost all of the 1600 superconducting magnets that will guide the protons around the LHC ring have been cooled to their operating temperature of 1.9 K. The next step is some 1400 hardware and electrical tests that must be performed over the next few weeks before the machine is finally ready to go on September 10.

On the start-up day itself, the 24/7 operations team will attempt to thread a single, low-intensity bunch containing a few billion protons all the way around the 27 km circumference of the LHC. Following that they will do the same in the other direction, taking perhaps a few days in total. Then they will adjust magnets so that the protons can circulate happily for periods of hours without veering off course.

Next, strong focussing magnets will bring the counter-rotating beams into collision at the LHC’s four interaction points (requiring the beam to be configured in four bunches). Initially the beams will have an energy of 450 GeV (the energy at which protons are injected from the SPS), producing 900 GeV collisions. But the target this year is to provide record-breaking 10 TeV collisions (5 TeV per beam) with 43 bunches, each containing a few tens of billions of protons. If all goes well, the first LHC data could be streaming out of the experiments just in time for the official LHC inauguration on October 21.

The LHC will be shut down for the winter, during which time the main bending magnets will be “trained” to handle beams at the full energy of 7 TeV (producing 14 TeV collisions) in March or April 2009. When the machine is running at full whack, nearly 3000 bunches each containing up to 100 billion protons could be hurtling around in each direction, producing half a billion collisions every second. Picking through the debris, physicists hope to find traces of particles such as the Higgs boson — which would complete the standard model of particle physics — or even more exotic entities such as black holes and extra dimensions. However, it will likely take at least a year for physicists to amass enough data and to understand their detectors well enough to be sure what they’re seeing is real.
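For what it’s worth, the half-a-billion figure is consistent with a simple order-of-magnitude estimate (rough textbook numbers, not values quoted in the article): the event rate is the luminosity times the inelastic proton-proton cross-section, which at the design luminosity of about 10³⁴ cm⁻² s⁻¹ and a cross-section of roughly 60 mb gives:

```latex
% Event rate R = luminosity x inelastic cross-section
% (60 mb = 6 x 10^-26 cm^2); an order-of-magnitude check only.
\[
  R \;=\; \mathcal{L}\,\sigma_{\mathrm{inel}}
  \;\approx\; \left(10^{34}\ \mathrm{cm^{-2}\,s^{-1}}\right)
  \times \left(6\times 10^{-26}\ \mathrm{cm^{2}}\right)
  \;\approx\; 6\times 10^{8}\ \mathrm{s^{-1}}.
\]
```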
