
And the winner is…

The winning image of the 2010 Astronomy Photographer of the Year award (Courtesy: Tom Lowe)

By Michael Banks

US photographer Tom Lowe has beaten hundreds of amateur and professional photographers from around the globe to win the 2010 Astronomy Photographer of the Year award, run by the Royal Observatory, Greenwich, and Sky at Night Magazine.

Lowe’s winning shot, “Blazing Bristlecone”, which secured him the top prize of £1000, was taken on 14 August 2009 and shows the star-riddled Milky Way arching over an ancient bristlecone pine tree, which can live as long as 5000 years.

The photo was taken in the White Mountains of California with a Canon 5D Mark II camera and an exposure time of 32 seconds. “I like the way the tree follows the Milky Way and the definition is very good,” says astronomer Patrick Moore, one of the 10 panellists who judged the images.

The competition received over 400 entries from more than 25 countries and was split into three categories – Earth and space, our solar system, and deep space – together with a young photographer of the year award, which was won by 14-year-old Dhruv Arvind Paranjpye from India. The winners of each category are here.

Selected images will be shown in a free exhibition at the Royal Observatory, which begins today and runs until February.

Peer review highly sensitive to poor refereeing, claim researchers

Just a small number of bad referees can significantly undermine the ability of the peer-review system to select the best scientific papers. That is according to a pair of complex systems researchers in Austria who have modelled an academic publishing system and showed that human foibles can have a dramatic effect on the quality of published science.

Scholarly peer review is the commonly accepted procedure for assessing the quality of research before it is published in academic journals. It relies on a community of experts within a narrow field of expertise to have both the knowledge and the time to provide comprehensive reviews of academic manuscripts.

While the concept of peer review is widely considered the most appropriate system for regulating scientific publications, it is not without its critics. Some feel that the system’s reliance on impartiality and the lack of remuneration for referees mean that in practice the process is not as open as it should be. This may be particularly apparent when referees are asked to review more controversial ideas that could damage their own standing within the community if they give their approval.

Questioning referee competence

Stefan Thurner and Rudolf Hanel at the Medical University of Vienna set out to assess how the peer-review system responds to incompetent refereeing. “I wanted to know what would be the effects on peer review as a selection mechanism if referees were not all good, but behaved according to different interests,” Thurner told physicsworld.com.

The researchers created a model of a generic specialist field in which referees, selected at random, fall into one of five categories. There are the “correct”, who accept good papers and reject bad ones. There are the “altruists” and the “misanthropists”, who accept or reject all papers, respectively. Then there are the “rational”, who reject papers that might draw attention away from their own work. And finally, there are the “random”, who are not qualified to judge the quality of a paper because of incompetence or lack of time.


Within this model community, the quality of scientists is assumed to follow a Gaussian distribution, and each scientist produces one new paper every two time-units, its quality reflecting the author’s ability. At every step in the model, each new paper is passed to two referees chosen at random from the community (self-review is excluded), and each referee can either accept or reject the paper. The paper is published if both reviewers approve it and rejected if both turn it down; if the reviewers are split, the paper is accepted with a probability of 0.5.
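
The flavour of such a model can be captured in a few lines of code. The sketch below is a minimal illustration in this spirit – it is not the authors’ actual simulation, it includes only the “correct”, “rational” and “random” referee types discussed in the results, and the acceptance threshold, role fractions and random seed are all assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

N_SCIENTISTS = 1000   # size of the model community (as in the paper)
N_STEPS = 500         # number of time-steps (as in the paper)
FRAC_RATIONAL = 0.05  # assumed fraction of "rational" referees
FRAC_RANDOM = 0.05    # assumed fraction of "random" referees

# Scientist ability (and hence paper quality) is Gaussian-distributed
ability = rng.normal(size=N_SCIENTISTS)
threshold = 0.0       # assumed quality cut separating "good" from "bad" papers

roles = rng.choice(
    ["correct", "rational", "random"],
    size=N_SCIENTISTS,
    p=[1 - FRAC_RATIONAL - FRAC_RANDOM, FRAC_RATIONAL, FRAC_RANDOM],
)

def accepts(referee, quality):
    """Does this referee vote to accept a paper of the given quality?"""
    if roles[referee] == "correct":
        return quality > threshold          # accept good, reject bad
    if roles[referee] == "rational":
        return quality < ability[referee]   # reject anything better than own work
    return rng.random() < 0.5               # "random": effectively a coin toss

accepted = []
for step in range(0, N_STEPS, 2):           # one paper per author every two time-units
    for author in range(N_SCIENTISTS):
        quality = ability[author]
        referees = rng.choice(N_SCIENTISTS, size=2, replace=False)
        while author in referees:           # exclude self-review
            referees = rng.choice(N_SCIENTISTS, size=2, replace=False)
        votes = sum(accepts(r, quality) for r in referees)
        # Published if both accept; coin toss if the referees disagree
        if votes == 2 or (votes == 1 and rng.random() < 0.5):
            accepted.append(quality)

print("mean quality of published papers:", round(np.mean(accepted), 3))
```

Raising the rational and random fractions towards one-third each in a sketch like this drags the mean quality of the accepted papers back towards the community average, which is the qualitative effect the researchers report.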

Big impact on quality

After running the model with 1000 scientists over 500 time-steps, Thurner and Hanel find that even a small proportion of rational or random referees can significantly reduce the quality of published papers. When just 10% of referees do not behave “correctly”, the quality of accepted papers drops by one standard deviation. If the fractions of rational, random and correct referees are about one-third each, the quality-selection aspect of peer review practically vanishes altogether.

“Our message is clear: if it can not be guaranteed that the fraction of rational and random referees is confined to a very small number, the peer-review system will not perform much better than by accepting papers by throwing (an unbiased!) coin,” explain the researchers.

Daniel Kennefick, a cosmologist at the University of Arkansas with a special interest in sociology, believes that the study exposes the vulnerability of peer review when referees are not accountable for their decisions. “The system provides an opportunity for referees to try to avoid embarrassment for themselves, which is not the goal at all,” he says.

Kennefick feels that the current system also encourages scientists to publish findings that may not offer much of an advance. “Many authors are nowadays determined to achieve publication for publication’s sake, in an effort to secure an academic position and are not particularly swayed by the argument that it is in their own interests not to publish an incorrect article.”

Don’t forget the editors

But Tim Smith, senior publisher for New Journal of Physics at IOP Publishing, which also publishes physicsworld.com, feels that the study overlooks the role of journal editors. “Peer-review is certainly not flawless and alternatives to the current process will continue to be proposed. In relation to this study however, one shouldn’t ignore the role played by journal editors and Boards in accounting for potential conflicts of interest, and preserving the integrity of the referee selection and decision-making processes,” he says.

Michèle Lamont, a sociologist at Harvard University who analyses peer review in her 2009 book How Professors Think: Inside the Curious World of Academic Judgment, feels that we expect too much from peer review. Lamont believes that we should never hope for “uncorrupted” evaluation of new science as all researchers are embedded in social and psychological networks. She feels that one way to improve the system, however, is to make assessment criteria more relevant to specific disciplines.

When asked by physicsworld.com to offer an alternative to the current peer-review system, Thurner argues that science would benefit from the creation of a “market for scientific work”. He envisages a situation where journal editors and their “scouts” search preprint servers for the most innovative papers before approaching authors with an offer of publication. The best papers, he believes, would naturally be picked up by a number of editors leaving it up to authors to choose their journal. “Papers that no-one wants to publish remain on the server and are open to everyone – but without the ‘prestigious’ quality stamp of a journal,” Thurner explains.

This research is described in a paper submitted to the arXiv preprint server.

Nanostructure filter zaps bacteria

A water-purifying filter made from normal cotton coated in nanostructures has been developed by researchers at Stanford University in the US. They say that the device, which works by killing bacteria with electrical impulses, is 80,000 times faster than conventional filters and could become a useful tool for remote communities in the developing world. But the breakthrough has already been met with a degree of scepticism by other scientists in the field, who question elements of the design.

Instead of physically trapping bacteria like most existing filters do, the new filter lets them flow on through with the water. But by the time the harmful pathogens have passed into a water container they have been exposed to an electric field, generated by the coated cotton, which kills large swathes of them.

To develop their nanocoating, Yi Cui and his Stanford colleagues built on recent work investigating the use of silver nanoparticles for antibacterial treatment of a variety of substrates, including cloth and medical devices. Instead of particles, however, Cui’s team use silver nanowires ranging from 40 to 90 nm in diameter and combine them with carbon nanotubes of similar dimensions, which are exceptionally strong and good electrical conductors.

“We got it at Wal-mart”

The nanowires and nanotubes are prepared separately and then added to simple dyes – the nanotubes in a water-based dye and the nanowires in an alcohol-based dye. The dyes are then applied to cotton, which was chosen as the foundation material because it is cheap as well as being relatively strong and chemically robust. “We got it at Wal-mart,” explains Cui, adding: “The amount of silver used for the nanowires was so small that the cost was negligible.”

In lab tests, a piece of fabric comprising several layers of coated cotton was connected to an electrical power source set to 20 V. Water containing a common strain of E. coli was then poured onto the fabric and allowed to pass through at a rate of 10,000 L/h/m². With this set-up, the researchers found that they could rid the water of 98% of the bacteria at a rate 80,000 times faster than is possible with existing filters.

“The technique is interesting because it uses widely available cotton fabric as the basis for the filter and it relies on an electric current rather than mesh size to remove the pathogen hazard,” says Andrew Scott, a director at Practical Action, a charity that promotes technology for development. Scott is concerned, however, that the filter requires a source of electricity, which is not always available in remote communities.

Energy troubles

The issue of energy requirements also troubles Mark Shannon, a water engineer at the University of Illinois in the US. “[The researchers] reported about 2 logs reduction of pathogens, which is a “modest” reduction – typically you want more than 4 logs,” he says. “So the amount of energy used to kill off the pathogens per unit volume of water is likely much greater than that for boiling water”.
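
For context (this calculation is not from the paper), a “log reduction” is defined as the base-10 logarithm of the ratio of pathogen counts before and after treatment, so the 98% removal quoted above corresponds to roughly 1.7 logs, while a 4-log reduction would mean 99.99% removal:

```python
import math

def log_reduction(fraction_removed):
    """Log10 reduction corresponding to a given removal fraction."""
    return math.log10(1.0 / (1.0 - fraction_removed))

print(round(log_reduction(0.98), 2))    # ~1.7 logs for 98% removal
print(round(log_reduction(0.9999), 2))  # 4.0 logs for 99.99% removal
```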

The Stanford team acknowledge that further development is needed to improve the purification process. “With one filter, we can kill 98% of the bacteria,” says Cui. “For drinking water, you don’t want any live bacteria in the water, so we will have to use multiple filter stages.

“It will also be important to investigate how well the filter retains the nanomaterials we use, so that we can be sure the filtered water does not contain silver nanowires and carbon nanotubes,” he says.

Peter Dobson, an engineering scientist at the University of Oxford, calls the electrically controlled membrane a “worthy and interesting idea” and is impressed by the use of cotton as an inexpensive base material. However, he is frustrated by the lack of experimental detail in the related research paper. “The actual experimental set-up is very poorly described, and the manner in which the filter has been constructed is not clear, and it is also not clear how the electrical connections were made.”

This research is described in a paper in Nano Letters.

‘Self-repairing’ photovoltaics not damaged by the Sun

The test cell the team built to measure the properties of the self-assembling photosynthetic system

Researchers at the Massachusetts Institute of Technology have fabricated the first synthetic photovoltaic cell capable of repairing itself. The cell mimics the self-repair system found in plants, which capture sunlight and convert it into chemical energy during photosynthesis. The device could ultimately be 40% efficient at converting sunlight into electricity – roughly twice the efficiency of the best commercial photovoltaic cells on the market today.

During photosynthesis, plants harness solar radiation and convert it into chemical energy. Scientists have been trying to mimic this process in synthetic materials, but this has proved difficult because the Sun’s rays damage and gradually destroy solar-cell components over time. Plants have evolved a highly elaborate self-repair mechanism to overcome this problem, which involves constantly breaking down and reassembling photodamaged light-harvesting proteins. The process ensures that these molecules are continually refreshed, and so always work like “new”.

Michael Strano and colleagues have now succeeded in mimicking this process for the first time by creating self-assembling complexes that convert light into electricity. The complexes can be repeatedly broken down and reassembled by simply adding a surfactant (a solution of soap molecules). The researchers found that they can indefinitely cycle between assembled and disassembled states by adding and removing the surfactant, but the complexes are only photoactive in the assembled state.

Light reaction centre

The complexes are made up of light-harvesting proteins, single-walled carbon nanotubes and disc-shaped lipids. The proteins (which are isolated from a purple bacterium, Rhodobacter sphaeroides) contain a light reaction centre (carried by the lipids) comprising bacteriochlorophylls and other molecules. When the centre is exposed to solar radiation, it converts the sunlight into electron-hole pairs (excitons).

The excitons then shuttle across the reaction centre before separating into electrons and holes. The nanotubes – which act as wires – channel the electrons, so producing a current. The nanotubes also serve to align the lipid discs in neat rows, ensuring that the reaction centres are uniformly exposed to sunlight.

“The beauty of this system is that a jumbled solution of components can spontaneously arrange itself into highly organized structures, containing thousands of molecules in a specific arrangement, by simply removing the surfactant,” team member Ardemis Boghossian explained.

Apples and oranges

“Using the regeneration process, we are able to prolong the lifetime of our solar cell indefinitely, increasing our efficiencies by more than 300% over 164 hours of continuous illumination compared to a non-regenerated cell,” added Boghossian. “If we were to increase the concentration of these complexes to make a completely stacked, highly packed formation, we could approach the theoretical limit of 40% – which is well beyond the efficiencies we see in commercial solar cells on the market today.”

Comparing the MIT complexes to existing solar cells is like “comparing apples to oranges” though, she insists. “Most solar cells are static because they are made of solid slabs of silicon or thin films. Our solar cells are dynamic, just like plant leaves that can recycle their proteins as often as every 45 minutes on a really sunny day.”

“We’re basically imitating tricks that nature has discovered over millions of years – in particular ‘reversibility’, the ability to break apart and reassemble,” added Strano.

The work was reported in Nature Chemistry.

M-theory, religion and science funding on the BBC

Vince Cable believes in cuts, but what about God and M-theory?

By Hamish Johnston

This morning there was lots of talk about science on BBC Radio 4’s Today programme – but I think it left many British scientists cringing under their duvets.

Stephen Hawking was on the show explaining why M-theory – an 11-dimensional structure that underlies and unifies various string theories – is our best bet for understanding the origin of the universe.

Hawking explained that M-theory allows the existence of a “multiverse” of different universes, each with different values of the physical constants. We exist in our universe not by the grace of God, according to Hawking, but simply because the physics in this particular universe is just right for stars, planets and humans to form.

There is just one tiny problem with all this – there is currently little experimental evidence to back up M-theory. In other words, a leading scientist is making a sweeping public statement on the existence of God based on his faith in an unsubstantiated theory.

This, and other recent pronouncements from Hawking in his new book The Grand Design, were debated in a separate piece on Today by brain scientist Susan Greenfield and philosopher AC Grayling. Neither seemed too impressed with many of Hawking’s recent statements, and Greenfield cautioned scientists against making “Taliban-like” statements about the existence of God.

That brings me to another bit of news making the headlines in the UK – huge and looming cuts in science funding.

The cuts will be implemented by Vince Cable who is the UK’s secretary of state for business, innovation and skills.

He was interviewed in a third piece on Today and made the remarkable claim that “45% of research grants [in the UK] go to research that is not of an excellent standard”.

Ouch…and to save money, the government will soon be “rationing funds by quality”.

So what does this have to do with Stephen Hawking and M-theory?

Physicists need the backing of the British public to ensure that the funding cuts don’t hit them disproportionately. This could be very difficult if the public think that most physicists spend their time arguing about what unproven theories say about the existence of God.

The challenge, of course, is how to make the public aware of all the fantastic work done by other British physicists.

Filaments swarm and swim in circles

 

Swarms of insects and flocks of birds are examples of natural systems in which individual components act independently, yet together display complex collective motion. Scientists have extensively modelled such systems theoretically but have lacked the experimental apparatus to put their theories to the test. Now, a group of biophysicists in Germany has studied a simple biological “active system” in the laboratory and has shown that collective motion kicks in when the system becomes dense enough.

Active systems occur when a source of energy keeps groups of particles away from thermal equilibrium. Those of greatest interest consist of entities that are self-propelled and orientable, such as the actin filaments that make up the cytoskeleton of biological cells. Powered by myosin proteins, these filaments allow cells to move and divide coherently.

In 1995 the theoretical physicists John Toner and Yuhai Tu put forward a model to describe the collective motion of large groups of organisms, and other researchers have since expanded it to cover active systems more generally. However, experimentalists have so far been unable to create systems in the laboratory that are simple and adjustable enough to test the models.

Thrusting proteins

Now Andreas Bausch of the University of Technology, Munich (TUM) and colleagues have created such a system in a sample made up of actin and myosin. One end of each myosin molecule was attached to a glass slide immersed in water, with the other end left free to bind to actin filaments roughly 10 µm long. A thrusting movement of the myosin then set the actin in motion.

Such samples have been prepared by biologists since the 1970s, but these studies have focused on the behaviour of the myosin. This latest breakthrough came one Friday afternoon when Bausch and colleagues decided to see what would happen when they increased the density of actin filaments in the sample by up to a factor of 1000.

The researchers found that the filaments move around randomly in samples with an actin density less than about five per square micron. But above this critical density, the filaments form distinct clusters, between 20 and 500 µm across, that move around erratically and endure for several minutes.

Spirals and bands

Things get even more interesting at densities greater than about 20 filaments per square micron, where the filaments group together in bands that move across the sample as waves. These bands remain stable for as long as the observations are carried out (up to half an hour) and span several centimetres. In addition, the researchers found that at all densities above the critical density the filaments can also create spiral patterns lasting up to 10 minutes.

To try to understand the origin of this collective motion, Bausch, together with Erwin Frey of the Ludwig Maximilians University in Munich and colleagues, carried out a computer simulation of the system. This involved two simple assumptions: that filaments repel each other when they get close enough – without specifying the mechanism responsible for the repulsion – and that filaments tend to align themselves along the average direction of neighbouring filaments. The simulation successfully reproduced the cluster motion and the waves, but not the spiral patterns. The researchers think that the spirals are caused by longer-range interactions, brought about by the flow fields that each filament sets up in the water, which were not included in the simulation.
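
The alignment rule at the heart of such simulations is essentially that of the Vicsek model of flocking. The sketch below is a minimal Vicsek-style illustration of how a density-dependent transition to collective motion can arise – it is not the authors’ simulation: it omits the short-range repulsion, treats filaments as points, and uses assumed parameter values throughout.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 500        # number of "filaments" (treated as points here)
L = 50.0       # box size with periodic boundaries; density = N / L**2
R = 1.0        # interaction radius: neighbours within R are averaged over
SPEED = 0.3    # constant propulsion speed per step
NOISE = 0.3    # angular noise amplitude (radians)
STEPS = 200

pos = rng.uniform(0, L, size=(N, 2))
theta = rng.uniform(-np.pi, np.pi, size=N)

for _ in range(STEPS):
    new_theta = np.empty(N)
    for i in range(N):
        # Nearest-image displacements in a periodic box
        d = pos - pos[i]
        d -= L * np.round(d / L)
        neigh = np.hypot(d[:, 0], d[:, 1]) < R
        # Align with the mean direction of the neighbours, plus some noise
        mean_dir = np.arctan2(np.sin(theta[neigh]).mean(),
                              np.cos(theta[neigh]).mean())
        new_theta[i] = mean_dir + NOISE * rng.uniform(-np.pi, np.pi)
    theta = new_theta
    pos = (pos + SPEED * np.column_stack((np.cos(theta), np.sin(theta)))) % L

# Polar order parameter: close to 0 for random motion, close to 1 when
# the whole system moves collectively
order = np.hypot(np.cos(theta).mean(), np.sin(theta).mean())
print(f"polar order parameter: {order:.2f}")
```

Rerunning such a sketch at different densities (by changing N or L) shows the order parameter jumping from near zero to a finite value above a critical density – the same qualitative behaviour as the clustering transition seen in the experiment.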

The work is reported in Nature 467 73. Writing in the “News and views” section of the journal, physicists Jean-François Joanny of the Institut Curie in Paris and Sriram Ramaswamy of the Indian Institute of Science in Bangalore describe the work as a “crucial quantitative, experimental demonstration” of collective motion in a biological system.

The physicist’s approach

The researchers maintain that the close similarity between the patterns seen in the experiments and those predicted by simple theoretical models underlines the value of what they call “the physicist’s approach” to studying such systems; in other words, the strategy of ignoring chemical and biological details. They propose building up a phase diagram of filament behaviour by systematically varying the density and activity of the motor molecules in future experiments, and then testing the universality of this diagram by comparing it to the results of experiments carried out on real systems.

Graphene transistor beats speed records

Researchers in the US have developed a new way of making transistors from graphene – a sheet of carbon just one atom thick. The technique overcomes a major obstacle facing those who want graphene to replace silicon as the material of choice in future electronic devices. It has also been used to make the highest-speed graphene transistors ever.

Graphene is seen by many as an ideal material for electronic devices because it is extremely thin yet has high electrical and thermal conductivity and great physical strength. Unfortunately, however, the processing techniques currently used by the semiconductor industry cannot be applied to graphene because they introduce defects into the material, which ultimately degrade device performance.

Now, Xiangfeng Duan and colleagues at the University of California, Los Angeles have developed a new fabrication technique that employs an alumina-coated nanowire as the gate electrode in a graphene transistor. The device’s source and drain electrodes are then made in a self-aligned process that uses the nanowire as a “mask” – an approach that also minimizes resistance in the transistor, so improving its performance even further.

Graphene consists of a single, flat sheet of carbon atoms arranged in a honeycomb lattice. Since the material was first isolated in 2004, its unique electronic and mechanical properties have amazed researchers, who have been eyeing it up for a host of device applications. In particular, it could be used to make ultrafast transistors because the electrons in graphene behave like relativistic particles with no rest mass. This means that they whiz through the material at extremely high speeds.

There are still lots of challenges to be overcome, however, before the dream of all-graphene electronics becomes reality. One of these is to develop a fabrication technique that produces nearly defect-free devices, something never achieved until now.

Conventional processing methods for making state-of-the-art silicon metal-oxide-semiconductor field-effect transistors (MOSFETs) use a self-aligned gate structure to ensure that the edges of the source, drain and gate electrodes are precisely positioned. This avoids any overlap between the electrodes, thus minimizing resistance in a device. (High resistance is a bane for nanoscale devices because it slows them down.) The same approach does not work for graphene, though, because it unavoidably introduces defects into the material’s lattice.

Duan and colleagues have instead used a cobalt-silicide–alumina core–shell nanowire as the top gate in their graphene transistor. This nanostructure – a conducting core wrapped in a dielectric shell – is made in a separate step and then simply placed on top of a monolayer of graphene. Such an approach does not introduce any appreciable defects into the material, says Duan.

Nanowire mask

The researchers then deposit a thin layer of platinum on top of the graphene and across the nanowire, so that the wire separates the platinum film into two isolated regions. These two regions form the self-aligned source and drain electrodes on either side of the nanowire gate. The nanowire mask also defines the gate length of the device, in this case about 140 nm.

The finished devices have the highest transconductance ever reported for such devices, at 1.27 mS µm–1 – transconductance being a measure of how strongly the gate voltage controls the current, and hence of how well a transistor performs. Microwave measurements on the transistors also show that they have a record-breaking intrinsic cut-off frequency in the range of 100–300 GHz, about twice that of the very best silicon MOSFETs of a similar size. Finally, the mobility of the devices (which determines how fast electrons move through them) is about 20,000 cm²/Vs – a value that is around two orders of magnitude better than that of similarly sized commercial silicon transistors.
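
As a rough consistency check (not a calculation from the paper), the intrinsic cut-off frequency of a field-effect transistor scales as f_T ≈ g_m/(2πC_G), where g_m is the transconductance and C_G the gate capacitance. Taking the reported 1.27 mS per micron of gate width and the 140 nm gate length, and assuming an illustrative gate capacitance per unit area for the thin alumina dielectric, lands squarely in the reported 100–300 GHz range:

```python
import math

# Figures reported in the article
g_m_per_width = 1.27e-3   # transconductance, siemens per micron of gate width
gate_length = 140e-9      # gate length in metres

# Illustrative assumption (not quoted in the article): gate capacitance per
# unit area for a thin alumina dielectric, ~7 fF per square micron
c_per_area = 7e-3         # F/m^2

# Gate capacitance per micron of gate width
c_g_per_width = c_per_area * gate_length * 1e-6   # farads per micron of width

f_T = g_m_per_width / (2 * math.pi * c_g_per_width)
print(f"estimated intrinsic cut-off frequency: {f_T / 1e9:.0f} GHz")  # ~200 GHz
```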

“Demonstrating graphene transistors with a cut-off frequency comparable to the very best transistors out there marks an extremely important step in graphene research,” Duan said. “This clearly demonstrates the exciting potential of graphene-based electronics for future high-frequency circuits.”

The team now plans to fabricate transistors with smaller gate lengths to push the cut-off frequency even higher – perhaps towards 1 THz. “We also hope to scale up the approach to fabricate big arrays of high-speed graphene transistors on large-area substrates, including flexible substrates,” revealed Duan.

The work was published in Nature.

Will a new law stifle physics in Canada?

By Hamish Johnston

UPDATE: A tentative agreement has been reached by CAP and PEO on the natural sciences exemption.

Professional engineering is a closed shop and rightly so – you wouldn’t want to fly in an aeroplane designed and built by someone with no knowledge of aeronautical engineering principles. As a result, many jurisdictions use laws to define a set of tasks that can only be done by professional engineers.

But could this prevent physicists from doing their jobs? Yes, according to the Canadian Association of Physicists (CAP), which is trying to stop changes to the Ontario Engineering Act in Canada’s most populous province.

The offending revision ensures that only a professional engineer can apply engineering principles to an activity that “concerns the safeguarding of life, health, property, economic interests, the public welfare or the environment”.

The problem is that many engineering principles are also principles of physics (or chemistry, biology etc.). Here’s an example…

F = ma is an engineering principle and it makes perfect sense that only a professional engineer should be allowed to approve a bridge design based on such principles.

However, F = ma could also be used by a physicist to design an ion-trap-on-a-chip for a commercial quantum computer. Because economic interests are involved, the new act would require that an engineer “sign off” on the physicist’s design before it is implemented – even if the engineer knows little or nothing about quantum computing.

CAP president Henry van Driel says that such restrictions “could make it impossible for many, if not most, natural scientists to practice their professions in industry, government and universities”.

In the past, CAP and other scientific societies have negotiated with lawmakers and provincial engineering bodies to win exemptions for natural scientists. Indeed, these are spelled out in guidelines that can be downloaded from the website of Engineers Canada, the country’s national engineering association.

But now in a letter to its members, CAP is claiming that the professional body of Ontario engineers (PEO) is intent on removing the exemption and did not consult with Canada’s scientific societies while the new legislation was being drafted.

As a result, CAP was in the dark about the changes until the bill had already made significant progress through the Ontario legislature.

Now, van Driel has called on the Ontario government to make a last-minute amendment to the bill exempting natural scientists. You can read his letter here.

Each of Canada’s 10 provinces has its own engineering laws and professional bodies, so the PEO is probably within its rights to ignore the Engineers Canada guidelines. However, the affair doesn’t reflect well on relations between the nation’s engineers and physicists.

I’m also surprised that CAP seems to have been caught out by the revisions. The organization has been fighting this battle for nearly 30 years, so it should have seen this coming.

Talking Hawking and God

By James Dacey

It hasn’t even been released yet, but the media is awash with commentaries about Stephen Hawking’s new book, The Grand Design. People are jumping on the astrophysicist’s assertion that we no longer need a God to explain our existence because M-theory – a unified version of string theory – can now explain how the universe emerged from the vacuum.

“Because there is a law such as gravity, the universe can and will create itself from nothing. Spontaneous creation is the reason there is something rather than nothing, why the universe exists, why we exist. It is not necessary to invoke God to light the blue touch paper and set the universe going,” writes Hawking in an extract from The Grand Design, published yesterday in The Times.

“M-theory is the most general supersymmetric theory of gravity. For these reasons, M-theory is the only candidate for a complete theory of the universe. If it is finite – and this is yet to be proved – it will be a model of the universe that creates itself. We must be part of this universe, because there is no other consistent model.”

But the backlash from certain religious spokespeople has already begun, including the chief rabbi, Lord Sacks, who wrote an accompanying opinion piece in The Times warning of the dangers of overvaluing scientific knowledge. “There is more to wisdom than science. It cannot tell us why we are here or how we should live. Science masquerading as religion is as unseemly as religion masquerading as science,” he writes.

The story was also covered in detail last night by the UK’s Channel 4 News, which hosted a discussion between Jon Butterworth, a particle physicist at University College London, and Alister McGrath, the chair of theology, religion and culture at King’s College London.

McGrath, a Christian theologian who previously studied physics at the University of Oxford, unsurprisingly questions whether M-theory can hold the answers to all fundamental questions. “All [Hawking] has done really is to simply move things one step into the distance…where do all these laws come from given they are of such importance?” he asks.

Butterworth, a self-professed atheist, agrees that M-theory is far from a grand unified theory of everything, but questions the need for a deity to fill in the gaps and explain the origin of the rules of physics. “Whether you find it helpful to label the primary cause as God or some form of ‘pre M-theory quantum vacuum’ doesn’t really have much impact on our understanding of the universe to me, and it doesn’t really have much impact on my life as far as I can see.”

The Grand Design is published on 7 September.

Brazilian wondergoal was no fluke, say physicists

By James Dacey

Many fans consider it one of the most brilliant (soccer) goals ever scored, but others dismiss it as a bizarre fluke, probably caused by rare atmospheric conditions.

The free kick scored by Brazilian fullback Roberto Carlos against France in 1997 is said to have “defied physics” on account of its wicked late swerve that stunned both the French goalkeeper and thousands of fans.

Now, 13 years on, physicists in France say that they can finally explain what happened and they believe that the wonder strike was no fluke.

On that early summer night in Lyon, Carlos struck the ball at around 35 m from the French goal. It was heading so far to the right that it initially cleared the wall of defenders by at least a metre and made a ballboy, who stood metres from the goal, duck his head. Then, almost magically, the ball curved to the left and entered the top right-hand corner of the goal.

In all the talk over the years, pundits and the occasional scientist have suggested a number of possible causes. They range from a gust of wind, to a materials effect in the ball, to unusually dry localized conditions as explained in this Physics World feature article from 1998. But the case has never been closed.

Guillaume Dupeux and his colleagues at the Ecole Polytechnique in Palaiseau have taken a more practical approach by modelling the flight of the football in a more controlled environment, firing tiny polymer spheres through water using a slingshot.

The lightness of the balls and the density of the water enabled them to track the tiny spheres as they traced out a spiral that curves in progressively tighter orbits. The researchers dub this the “spinning ball spiral” effect and explain that it only appears when drag slows the ball enough for the spin-induced sideways force to become comparable with its forward motion.

Tracking the trajectory of plastic spheres in water

Dupeux’s group argues that, before it smashed into the back of the net, Carlos’ free kick had also begun to follow a spinning ball spiral, which explains why it seemed to bend significantly more towards the end of its flight. The Brazilian’s skill lay in kicking the ball with enough power and spin, from far enough out, for the spiral to take effect.
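
The essence of the spinning ball spiral can be captured with a toy trajectory calculation. The sketch below is a minimal two-dimensional illustration, not the authors’ model: gravity is ignored, the spin is taken as constant, and the drag and sideways (Magnus) coefficients are assumed purely for illustration.

```python
import numpy as np

dt = 0.01       # time-step (s)
drag = 0.013    # drag per unit mass, ~ (1/m); assumed value
magnus = 1.2    # sideways (Magnus) acceleration per unit speed, ~ (1/s); assumed

v = np.array([35.0, 0.0])   # initial velocity (m/s), roughly a hard free kick
r = np.array([0.0, 0.0])    # initial position (m)
path = [r.copy()]

for _ in range(1000):                        # 10 s of flight: far longer than a
    speed = np.linalg.norm(v)                # real free kick, but long enough to
    perp = np.array([-v[1], v[0]]) / speed   # show the full spiral
    # Drag opposes the motion, while the Magnus term scales with speed, so the
    # curvature of the path (~ magnus/speed) grows as the ball slows down –
    # this is what tightens the trajectory into a spiral.
    a = -drag * speed * v + magnus * speed * perp
    v = v + a * dt
    r = r + v * dt
    path.append(r.copy())

path = np.array(path)
print("positions every 2 s (m):")
print(path[::200].round(1))
```

A real free kick lasts only about a second, so it samples just the start of such a spiral – which is why, in a sketch like this, the bend appears to arrive so abruptly near the end of the flight.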

It’s a shame Carlos never quite managed to repeat the trick, but at least now we know it was worth him trying!

The research is published today in New Journal of Physics.
