
New twist on Brownian motion seen for the first time

An important aspect of Brownian motion predicted decades ago has been observed for the first time by researchers in Europe. The team has measured how micrometre-sized spheres interact with a surrounding fluid and has shown that the spheres “remember” their previous motion. Their experimental technique, the researchers claim, could be used as a biophysical sensor.

Famously explained by Albert Einstein in 1905, Brownian motion describes the erratic motion of a tiny particle in a fluid. It is caused by the many small “kicks” that the particle receives as a result of the thermal motion of the fluid. Initially, Einstein and other physicists believed these kicks to be independent of the motion of the particle and to be characterized by white noise.

Remembering motion

In the mid-20th century, however, physicists began to realize that when the densities of the particle and fluid are similar, the kicks are not completely random. Instead, “persistent correlations” are predicted between the motions of the fluid and the particle. These arise because particles moving through a fluid will cause the surrounding fluid to move, which in turn will affect the motion of the particle and so on. For example, a person swimming at a constant speed will pull some of the surrounding water with them. But if they stop suddenly, they will feel a push forward from the moving water. Researchers refer to this as “hydrodynamic memory”, but its observation has remained elusive for the tiny single particles that undergo Brownian motion.

Now, Sylvia Jeney at EPFL in Switzerland and colleagues in Switzerland and Germany claim to have seen clear evidence for this effect in the Brownian motions of particles. Their measurements are based on the idea that this hydrodynamic “memory” gives rise to the power spectrum of the particle being described by “coloured noise”, rather than white noise.

In the context of Brownian motion, white noise means that the particle fluctuates with the same magnitude (or power) regardless of the frequency of the fluctuation. Jeney’s experiments, however, show that higher frequencies actually have higher magnitudes of fluctuation – which means that the noise is no longer white but is coloured.
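The distinction can be illustrated with a toy spectrum. The following sketch is purely schematic – the square-root form and all the numbers are illustrative assumptions, not the team’s measured spectrum:

```python
import numpy as np

# Frequencies spanning four decades (arbitrary units)
f = np.logspace(0, 4, 50)

# White noise: the power spectral density is flat in frequency
S_white = np.ones_like(f)

# "Coloured" noise of the kind hydrodynamic memory produces:
# fluctuation power grows towards higher frequencies. The sqrt(f)
# form here is a schematic stand-in, not the measured law.
S_coloured = 1.0 + np.sqrt(f / f.max())

print(S_white.std())                     # 0.0 - no frequency dependence
print(np.all(np.diff(S_coloured) > 0))   # True - power rises with f
```

The point is only the qualitative contrast: a flat spectrum carries no memory, whereas a frequency-dependent one encodes the correlations described above.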

Specialized trap

Jeney’s group made the measurement by trapping a single micrometre-sized melamine sphere in optical tweezers created by a tightly focused laser beam. Although similar to a commercial set-up already used by biophysicists, the researchers spent several years optimizing their apparatus. In particular, they improved the time resolution of the system by a factor of 1000 and boosted its spatial resolution so it can measure distances of less than a nanometre.

The experiments involved single particles trapped by the tweezers and immersed in liquid. The parameters of the experiment were chosen so that the time it takes for the fluid to diffuse over the diameter of the particle is about one-sixth of the time it takes for the sphere to reach its equilibrium position in the tweezers. This diffusion time is the timescale on which the hydrodynamic memory is expected to occur, and the set-up therefore allowed the researchers to study the correlated behaviour.
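As a rough back-of-envelope check that these two timescales can indeed be made comparable, one can plug in generic values for a micrometre-sized sphere in water. All the numbers below are assumed, typical values – not the team’s actual experimental parameters:

```python
import math

# Assumed, generic values - not the actual experimental parameters
rho_f = 1000.0   # fluid density, kg/m^3 (water)
eta   = 1.0e-3   # dynamic viscosity, Pa*s (water)
d     = 1.0e-6   # particle diameter, m
k     = 1.0e-3   # trap stiffness, N/m (~1 pN/nm, a stiff optical trap)

# Time for momentum to diffuse across the particle diameter
tau_f = rho_f * d**2 / eta

# Relaxation time of the particle in the harmonic trap
# (Stokes drag on a sphere of radius d/2)
gamma = 6 * math.pi * eta * (d / 2)
tau_p = gamma / k

print(f"tau_f = {tau_f:.1e} s")        # ~1e-6 s
print(f"tau_p = {tau_p:.1e} s")        # ~1e-5 s
print(f"ratio = {tau_f / tau_p:.2f}")  # ~0.1, same order as the
                                       # one-sixth quoted above
```

With these generic values the two timescales come out within an order of magnitude of each other, which is the regime the article describes.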

“Currently, there are two, maybe three, labs in the world that have similar high-precision set-ups,” explains Jeney. She says that the team wants to establish the optical-trapping technique as an advanced biophysical tool.

Andrew Harrison takes over at ILL

Director General of the Institut Laue-Langevin

By Hamish Johnston

It just might have been the audio interview with physicsworld.com that swung it – Andrew Harrison has been appointed Director General of the Institut Laue-Langevin (ILL) in Grenoble, France. He replaces Richard Wagner, who has retired.

Harrison had been scientific director of the neutron lab – a post that is now filled by Helmut Schober, who has been at ILL since 1994.

Harrison is pictured above right, with Schober centre and José Luis Martínez Peña, who will continue in his role as director of ILL’s Projects and Technique Division.

When I spoke with Harrison earlier this year, I discovered that we had both spent time at Canada’s McMaster University, where Harrison did a postdoc with the chemist John Greedan. Harrison spent much of his time at the Chalk River lab, where he tells me he shared a house with McMaster graduate student Thom Mason. Mason is now director of the Oak Ridge National Laboratory in the US – and I wonder if the two had any inkling back then that together they would control a huge chunk of the world’s neutron flux!

You can listen to my interview with Harrison here. One thing we chat about is the relationship between ILL and the European Spallation Source (ESS), which is currently being built in Sweden. Two weeks ago, ILL and ESS signed a memorandum of understanding that defines how the two neutron labs will collaborate on the development of new instrumentation and other technologies.

Oxygen isotopes boost neutron scattering

Neutron-scattering measurements using different isotopes of oxygen have been made for the first time by an international team of researchers. The scientists used the new technique to determine important differences between the molecular structures of normal and heavy water, and say that it could be used to study a wide range of oxide materials, including some glasses.

Neutron scattering involves firing a beam of neutrons at a sample and studying the resulting diffraction pattern, which occurs when the wavelength of the neutrons is comparable to the distance between nuclei in the sample. While normally associated with the study of crystalline materials, the technique can be useful for studying disordered materials such as liquids.

Scientists cannot normally extract much information from a disordered sample because it is impossible to differentiate between scattering from different pairs of atomic nuclei. However, much more information can be obtained by repeating the scattering experiment with a sample in which atoms of one type (typically hydrogen) are replaced by a suitable isotope of the same atom (typically deuterium). This alters the diffraction pattern because neutrons scatter differently from different isotopes of the same nucleus. Subtracting one pattern from the other therefore leaves the contribution to the pattern from the substituted nuclei alone.
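Schematically, the subtraction works because the total diffraction pattern is a weighted sum of contributions from each pair of atom types, and only the weights of pairs involving the substituted species change. The toy sketch below uses the classic H/D substitution described above; all partial patterns and scattering lengths are made-up numbers for illustration (the oxygen-isotope version discussed later is analogous, with the oxygen scattering length changing instead):

```python
import numpy as np

# Toy partial patterns for the three pair types in water (made up)
S_OO = np.array([1.0, 2.0, 3.0])
S_OH = np.array([0.5, 1.5, 2.5])
S_HH = np.array([0.2, 0.4, 0.6])

# Weights are products of the neutron scattering lengths of each pair.
# Swapping H (b = 1.0, arbitrary units) for D (b = 2.0) changes only
# the weights of pairs that involve hydrogen.
b_O, b_H, b_D = 1.5, 1.0, 2.0

F_H2O = b_O*b_O*S_OO + 2*b_O*b_H*S_OH + b_H*b_H*S_HH
F_D2O = b_O*b_O*S_OO + 2*b_O*b_D*S_OH + b_D*b_D*S_HH

# The difference no longer contains the unchanged O-O term:
# only pairs involving the substituted nucleus survive
diff = F_D2O - F_H2O
expected = 2*b_O*(b_D - b_H)*S_OH + (b_D**2 - b_H**2)*S_HH
print(np.allclose(diff, expected))   # True
```

The O–O contribution cancels exactly in the difference, which is why the subtracted pattern isolates the substituted nuclei.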

The same, but different

Substituting hydrogen with deuterium has become a successful technique because the scattering between these two isotopes is so markedly different. However, there are two drawbacks. First, the technique is based on the assumption that hydrogen and deuterium are chemically identical, which is only true up to a point. Deuterium is twice the mass of hydrogen, which means that the two isotopes do not always behave the same way within molecules. The second problem is that physicists have long assumed that for some other important atoms of interest – particularly carbon and oxygen – the scattering contrast between the isotopes would be too small to give a useful signal.

Now, Philip Salmon and Anita Zeidler of the University of Bath, Henry Fischer at the Institut Laue-Langevin (ILL), and their colleagues at the Oak Ridge National Laboratory, Stanford University and the Vienna University of Technology, have demonstrated a new isotope-substitution technique that solves both problems. Their breakthrough draws on a technique developed in 2008 by several members of the team, who had hoped to exploit the difference in scattering between two isotopes of carbon. Although that difference proved to be smaller than previously thought, the researchers then turned their attention to oxygen-18 and oxygen-16. This time they found a much larger difference than expected – so much so that the researchers went ahead with an oxygen-isotope substitution measurement of water using the D4C instrument at the ILL neutron source.

In their experiment, Salmon and colleagues made one neutron-diffraction measurement on a water sample comprising oxygen-16 and hydrogen, and a second on a sample comprising oxygen-18 and hydrogen. Subtracting one diffraction pattern from the other gives the separation between the oxygen and hydrogen nuclei in water – the oxygen–hydrogen bond length. The measurements were then repeated using deuterium instead of hydrogen. Using these data, the team was able study important differences between normal and heavy water for the first time.

Competing effects

In particular, the researchers found that there is a 0.5% difference between the lengths of the oxygen–hydrogen and oxygen–deuterium bonds. This finding supports a “competing quantum effects” model that describes the structure and dynamics of liquid water. This information can now be used to create more realistic computer simulations of liquid water – a substance that has proven very difficult to simulate effectively.

Alan Soper of the ISIS neutron lab in the UK says that the researchers’ apparent confirmation of the viability of the oxygen-isotope method is “a major breakthrough”. However, he points out that even with the revised difference in scattering between the two oxygen isotopes, an extremely stable instrument is required. “Currently, D4C at ILL is ahead of the game in terms of stability,” Soper told physicsworld.com. He also warns that the next generation of neutron sources will be spallation sources – based on accelerators rather than reactors like that of the ILL – and may therefore not be stable enough for these kinds of measurements.

Salmon is more confident, and points out that tests on the NOMAD instrument – currently being built for the SNS spallation source at Oak Ridge by team member Jörg Neuefeind – suggest that it will offer comparable stability to D4C.

As well as studying water, Salmon and colleagues believe that oxygen isotope-substitution could be applied to other materials with oxygen contents that are greater than about 33%. It could, in principle, be applied to glassy solids – materials that have so far proved very difficult to model. However, Soper points out that such analysis would have to work around the possibility that small, random structural differences between different glass samples could overwhelm any differences arising from isotope substitution. Salmon, on the other hand, believes that the potential rewards associated with applying the technique to oxide glasses “will more than offset any remaining technical issues”.

The work is reported in Physical Review Letters.

Link found between solar output and colder winters

How solar output affects northern winters (Courtesy: Nature)

By Hamish Johnston
The last two winters in north-western Europe have been relatively cold. Here in normally mild Bristol, for example, our garden was frozen and snow-covered in November 2010 – something that is very rare indeed.

In a paper published yesterday in Nature Geoscience, UK-based researchers at the Hadley Centre, the University of Oxford and Imperial College have proposed a link between the recent dip in the output of the Sun and the recent cold winters.

You can hear an interview with one of the scientists on BBC Radio Four’s Today Programme here. The interview has the unfortunate title “Sun spots explained” – but solar physicists can rest easy because they haven’t been!

Fire from a celestial dragon

Comet Giacobini-Zinner, a fairly frequent visitor to the inner solar system, was captured by the Kitt Peak 0.9 m telescope on 31 October 1998 (Credit: N A Sharp/NOAO/AURA/NSF)

By Tushna Commissariat

If you have some time to spare tomorrow evening – especially if you live in the UK or northern Europe – I would suggest packing a picnic supper and heading out to a park or other open space with a clear view of the sky, because the heavens might just be putting on quite a show. This Saturday, 8 October, the Draconid meteor shower will be at its peak – and scientists predict that we might be in for a meteor storm!

As the Earth revolves around the Sun during the year, it passes through clumps of comet dust – some of which falls towards the Earth’s surface and burns up in the atmosphere, creating meteor showers. The Draconids are dust left behind by the periodic comet 21P/Giacobini-Zinner. Generally, the Draconids, which peak from around 8 to 11 October, are a quiet affair, but every now and then the Earth travels through a particularly dense patch of dust, and this year researchers predict that the shower might be much stronger than normal. Indeed, you could see up to 10 meteors a minute, which is well worth camping out for on a chilly autumn evening. While the shower begins tonight, 7 October, and lasts until 11 October, the peak is set to occur on 8 October at 9 p.m. (20:00 UT), with activity expected to begin at about 5 p.m. (16:00 UT) – so the best views for the UK will probably be just after sunset.

Unfortunately, as the Moon is waxing right now, there will be some bright moonlight, so try to keep the Moon behind you by directing your gaze towards the northern half of the sky. Here are a few more pointers for observing meteor showers:

• Look up a star chart before you embark, or at least pull one up on a phone app, so that you can identify the constellation Draco. The shower is called the Draconids because the meteors appear to originate from the Draco constellation.

• Keep your eyes open – even the slowest shooting star streaks across the sky in seconds. This is literally a blink-and-you-miss-it situation!

• Take along a reclining chair or a picnic blanket so that you can comfortably lie on your back and not have a crick in your neck to deal with come Sunday morning. And wrap up warm!

• Even if you are in possession of a good pair of binoculars or a telescope, do not bother – meteor showers are best seen with the naked eye because of how quickly the meteors zip across the sky.

• Wherever you are watching the sky from – be it your back garden, the neighbourhood park or the top of a multi-storey building – try to keep away from all sources of light. This means avoiding not only bright city lights, but also flashing a torch in someone’s face once their eyes have become used to the dark. Take along a few coloured plastic sheets – most are red or blue – and fold one over your mobile phone, tablet or torch bulb, securing it with an elastic band.

While just going out and enjoying the sight is lovely, some of you might want to record your observations and contribute your data to one of a number of global organizations that collate information about meteor showers from amateur astronomers the world over. If you know how to record your observations properly, you can send your data to the International Meteor Organization, which also provides information on how to do this. In the UK, the British Astronomical Association (BAA) is happy to receive data from any individual or society, wherever they are in the world. Its website contains information on how to submit data, along with some handy maps and charts.

Lastly, with today’s Twitter generation, it is of course possible to tweet your data about the meteors you see. The Meteorwatch website is hosting a “Twitter Meteor Map”.

Tweet your observations using one of the hashtags #meteorwatch, #bbcstargazing, followed by your postcode, your country code (UK, US, etc) and, optionally, the meteor count. An example on their website reads – #meteorwatch SW5 0TR uk 1.

If you do happen to take any pictures of the showers, do send them in to Physics World at pwld@iop.org or tag them on our official Facebook page here.

And lastly, don’t forget to do your special no-rain dance/good-weather chant at least an hour before sunset tomorrow to ward off all the rain clouds and mists that plague any starry sky! Then lie back and enjoy the view.

Falling atoms measure the Earth’s rotation

A new type of gyroscope based on interfering atoms has been developed that can determine the latitude where the instrument is located – and also measure true north and the Earth’s rate of rotation. The device has been developed by physicists in the US, who hope to scale it up so that it can test Einstein’s general theory of relativity. They also want to miniaturize the technology so it can be used in portable navigation systems.

The gyroscope has been built by a team led by Mark Kasevich at Stanford University in California. It works by firing a cloud of atoms upwards at a slight angle to the vertical so that the atoms follow a parabolic trajectory as gravity pulls them down. A series of laser pulses is then fired at the cloud while in flight, which separates the atoms into a number of different bunches that follow different trajectories. The pulses are carefully selected so that two of these trajectories cross paths at a detector.

Given that the atoms are governed by quantum mechanics, they behave like waves with a relative shift in phase between the atoms taking different paths. The resulting interference at the detector is dictated in part by the relative orientations of the laser pulses, gravity and the rotation of the Earth.

Where in the world?

The device is set up so that the laser pulses are fired horizontally – that is, perpendicular to gravity – and was tested by rotating the orientation of the laser pulses about the gravitational axis. The resulting interference pattern is a near-perfect sinusoid with an amplitude that depends on the Earth’s rate of rotation and the latitude of the location where the measurement is made. Because we know how fast the Earth is spinning, the latitude can be easily determined. The directions of true north and south are given by the direction of the laser pulses when the amplitude of the sinusoid is zero.
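The underlying geometry is that the horizontal component of the Earth’s rotation vector scales with the cosine of the latitude, so a measured amplitude can be inverted to give a latitude. The sketch below is schematic: the proportionality constant (`A_max`) and the test values are illustrative assumptions, not the experiment’s actual calibration:

```python
import math

OMEGA_E = 7.292e-5   # Earth's sidereal rotation rate, rad/s (for scale)

def latitude_from_amplitude(A, A_max):
    """Invert a measured sinusoid amplitude A into a latitude in degrees,
    assuming A is proportional to the horizontal rotation component
    (i.e. to cos(latitude)) and A_max is the equatorial (cos = 1) value."""
    return math.degrees(math.acos(A / A_max))

# Illustrative round trip at the latitude of Stanford (~37.4 deg N)
A_max = 1.0                                    # hypothetical calibration
A = A_max * math.cos(math.radians(37.4))       # simulated "measurement"
print(round(latitude_from_amplitude(A, A_max), 1))   # 37.4
```

With a real amplitude calibration in place of the hypothetical `A_max`, the same inversion would yield the latitude of the laboratory.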

As the gyroscope is also sensitive to its own motion relative to its surroundings, Kasevich and colleagues have shown that it could be used for “inertial navigation”, whereby the location of a vehicle (or person) is calculated by knowing its starting point and all the movements that it has made. The team demonstrated this by rotating the gyroscope about the axis perpendicular to both gravity and the laser pulses, which led to a steady change in the interference as the angular velocity was increased from zero to about 1.6 revolutions per second.

Testing Einstein

Although this is not the first atom gyroscope to be made, the team says that its dynamic range is 1000 times greater than previous versions. Another important difference between this and other atom gyroscopes is that the interference pattern does not depend on the velocity of the atoms, which means that noise and uncertainty in those measurements do not degrade its performance.

Kasevich believes that the technique could also be adapted to measure – for the first time in a laboratory setting – the tiny corrections to the trajectory of any object resulting from Einstein’s general theory of relativity. “As our atom-interferometry technique essentially determines trajectory, ultimately, the interferometer phase shift should reflect those trajectory corrections related to general relativity,” he says. Kasevich and colleagues now plan to refine their technique so that it is sensitive enough to measure this effect, known as “geodetic precession”, and implement it in a 10 m “drop tower” that is being built at Stanford.

Although the “geodetic precession” of general relativity has previously been measured using instruments on board satellites, Holger Müller of the University of California, Berkeley thinks that “confirmation by atom interferometers would be received with great interest”. However, he warns that implementing the upgraded experiment in the 10 m tower will be “a challenge”.

Kasevich also has plans to implement the technology in small devices that could be used in navigation systems – and indeed is already associated with a small company called AOsense, based in Sunnyvale, California, that plans to do just that. Kasevich told physicsworld.com that a device with a volume of just 1 cm³ could be useful for terrestrial navigation applications. The current experiment is contained within a cubic magnetic shield with sides that measure about 50 cm.

The research is described in Physical Review Letters.

‘Tension’ emerges within OPERA collaboration

The claim by a team of researchers in Italy that neutrinos can travel faster than the speed of light will require extra checks before being submitted to a peer-reviewed journal. That is the position of a number of researchers in the OPERA collaboration, which announced on 23 September that it had observed superluminal neutrinos travelling from the CERN particle-physics lab near Geneva to the Gran Sasso underground lab in central Italy.

The announcement made headlines around the world, since it appears to contradict Einstein’s special theory of relativity. However, not everyone within OPERA was happy to release the results publicly, with several of the 30 group leaders within the 160-strong collaboration being opposed to the release of a paper on the arXiv preprint server and the accompanying seminars and press release without further tests of possible systematic errors being carried out. Now, a larger fraction of the group leaders is concerned about the paper being submitted to a research journal. One member of OPERA, who does not wish to be named, says there is a “lot of tension” within the collaboration and that up to half of the members are opposed to an immediate submission.

Precision measurements

Neutrinos are produced by accelerating protons at CERN’s Super Proton Synchrotron accelerator and colliding proton bunches 10 µs in length into a graphite target, generating mesons that in turn decay into neutrinos. The 1300-tonne OPERA detector, which has been running since June 2008, measures the properties of muon neutrinos as they travel 730 km through the Earth’s crust from CERN to Gran Sasso.

The experiment was originally designed to study the oscillation of muon neutrinos into tau neutrinos, but following tentative results in 2007 from the MINOS experiment in the US that showed neutrinos appearing to travel faster than light, researchers realized accurate velocity measurements could also be carried out with OPERA. Researchers installed atomic clocks at both ends of the neutrino beam to establish exactly when the neutrinos are created and detected, and used GPS-based measurements to precisely measure the length of the baseline – the velocity being derived by dividing the baseline by the time of flight.

Collecting more than 16,000 events between 2009 and 2011, the OPERA collaboration calculated that muon neutrinos arrive on average 60.7 ns earlier than they would have done had they travelled at the speed of light, which corresponds to a fractional increase over light speed of 25 parts in a million. Having accounted for a host of possible systematic errors, including uncertainties relating to the precise moment of creation and detection of the neutrinos plus errors introduced by cabling and clock synchronization, the researchers arrived at a total systematic error of 7.4 ns, comparable with the statistical error of 6.9 ns.

The OPERA collaboration calculated a confidence level of “6σ”, or a one in a billion chance the result was a statistical fluke, and this persuaded most of the collaboration that the result was solid enough to publish. However, some members were worried that unknown sources of systematic error might potentially destroy the confidence level. They argued that before making an announcement, further checks should be carried out – a process that could take several months.
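The quoted numbers are internally consistent, as a few lines of arithmetic show. This is just a consistency check on the published figures, not a re-analysis:

```python
# Consistency check on the quoted OPERA figures
c = 299_792_458.0    # speed of light, m/s
baseline = 730.0e3   # CERN to Gran Sasso, m
early = 60.7e-9      # mean early arrival, s

# Time of flight at light speed, and the fractional speed excess
tof_light = baseline / c                 # ~2.44 ms
print(round(early / tof_light * 1e6))    # 25 (parts per million)

# Combine the systematic (7.4 ns) and statistical (6.9 ns) errors
# in quadrature and express the 60.7 ns excess in units of sigma
sigma = (7.4**2 + 6.9**2) ** 0.5         # ~10.1 ns
print(round(60.7 / sigma, 1))            # 6.0 (sigma)
```

The 60.7 ns early arrival over a ~2.44 ms flight time reproduces the 25 parts-per-million figure, and dividing it by the quadrature-combined error reproduces the 6σ significance.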

One such check regards the timing of the neutrinos’ arrival at Gran Sasso, and involves carrying out an analysis of timing data collected by monitoring the charge, rather than the light, generated by particles passing through the detector. This analysis relies on a very precise and painstaking measurement of the length of the cabling used to collect the timing data, in order to isolate any systematic errors that may be present within the electronics or other parts of the timing system.

Another independent check involves the statistical analysis of the data collected by OPERA. The researchers are not able to track, and therefore time, individual neutrinos as they travel from Geneva to Gran Sasso, but instead they measure the temporal distribution of the protons within each bunch just before the protons hit the graphite target and then compare this with the distribution of the corresponding neutrinos as they are detected in OPERA – with the temporal offset between the two revealing the time of flight. Some members of the collaboration argue that this offsetting procedure needs to be carried out independently, in order to be sure that the temporal profile of the neutrinos leaving CERN can be inferred accurately from that of the protons that produced them.
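The offsetting procedure described above is, in essence, a cross-correlation: slide the detected-neutrino time distribution against the proton waveform until the two best overlap, and read off the lag. The toy version below uses synthetic data and a simple argmax – it is a sketch of the idea, not OPERA’s actual maximum-likelihood analysis:

```python
import numpy as np

# Synthetic proton-bunch temporal profile (arbitrary Gaussian shape)
t = np.arange(200)
proton = np.exp(-0.5 * ((t - 60) / 15.0) ** 2)

# Neutrino arrival profile: same shape shifted by a known lag of 25 bins
true_lag = 25
neutrino = np.roll(proton, true_lag)

# Best-fit offset = lag that maximizes the circular cross-correlation
corr = [np.dot(neutrino, np.roll(proton, k)) for k in range(len(t))]
print(int(np.argmax(corr)))   # 25 - the shift is recovered
```

The dissenters’ point is that this step assumes the neutrino profile really does mirror the proton profile; checking that assumption independently is what they are asking for.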

Heated debate

Discussions about whether or not the collaboration was ready to publish took place in early September. As these discussions were quite animated, the decision was put to a vote, with collaboration spokesperson Antonio Ereditato from the University of Bern proposing that initially the research be published on arXiv while at the same time being presented in a series of scientific seminars, before later being submitted to a peer-reviewed journal. This strategy received a majority, but not a unanimous, vote. It was then left to individual researchers to sign the arXiv paper, with about 10 senior members out of a total of 170 people (including some non-official members) deciding not to do so.


Ereditato says that the collaboration will continue to carry out checks but will do so in parallel with the journal submission. He maintains that no-one outside the collaboration, either at the seminars or via e-mail, has yet presented “smoking guns against what we have seen” and adds that “as experimentalists we have done everything we can”. However, Caren Hagner, leader of the OPERA group at Hamburg University and one of the people whose name does not appear on the arXiv paper, believes that the collaboration should carry out the extra checks before submitting the paper for peer review. “Many of the collaboration are convinced that if a mistake is subsequently found, it won’t be down to OPERA,” she says. “But I am not really convinced. There are so many things that people outside can’t check. It is these things that we have to do before publishing.”

Laura Patrizii, who is leader of OPERA’s Bologna group and who did sign the preprint, clarifies the motivation of the dissenters. “It is not that people think there is a mistake that is being hidden,” she says. “But since something going faster than light would kill modern physics as we know it, some researchers would feel more at ease with these independent checks.”

Looking to the outside

In addition to the checks that can be carried out within the collaboration, there are also some additional checks that CERN could perform, such as using detectors downstream of the graphite target to provide a better estimate of the profile of the departing neutrinos. The MINOS experiment is also currently improving its cabling and electronics, and collaboration co-spokesperson Jenny Thomas from University College London says that new data collected with the upgraded detector combined with a better analysis of existing data could allow MINOS to largely rule out the OPERA result within the next four to six months (but not to rule it in, given that this would require a higher level of accuracy).

Giovanni Amelino-Camelia, a theoretical physicist at the University of Rome “La Sapienza”, believes that a confirmed OPERA result would lead to a “revolution” within physics. But he thinks that this confirmation is unlikely, pointing out that in the history of physics there have been many experimental “alarms” suggestive of a revolution but that only a small fraction of these, such as the Michelson–Morley experiment, have been confirmed.

With OPERA in the spotlight, collaboration members also disagree about their future research programme. Luca Stanco, leader of the OPERA group from the University of Padova and one of the people who did not sign the preprint, believes that the priority now should be further investigation of the superluminal-neutrino result, rather than neutrino oscillations. Ereditato, however, says that even though the collaboration will pursue superluminal research, “the main focus will continue to be oscillations”.

Electrons heat up in graphene

Graphene has once again amazed researchers with its bizarre properties – this time in the way it reacts to light. A team in the US has discovered that the material does not behave like a conventional semiconductor when exposed to light but instead produces “hot carriers” that generate a photocurrent. The finding could be useful for creating new types of ultrafast and highly efficient photodetectors and energy-harvesting devices such as solar cells.

Graphene is a layer of carbon just one atom thick that has a range of unique electronic, mechanical and optical properties that could have great technological promise. Indeed, since its discovery in 2004 the “wonder material” has been used to create transistors and other prototype components.

Hot carriers at all temperatures

Researchers are also keen to create optical devices using graphene and this latest discovery by Pablo Jarillo-Herrero and colleagues at the Massachusetts Institute of Technology and Harvard University could point the way forward. “This so-called hot-carrier regime is very unusual and is normally only seen at extremely low temperatures or in very non-linear processes,” explains Jarillo-Herrero. “However, in graphene it occurs at all temperatures from very low up to room temperature – and in the linear regime – when the material is excited with a laser.”

When a conventional semiconductor is excited with light, high-energy electron–hole pairs are produced. These charge carriers subsequently generate a photocurrent, which is usually driven by an electrostatic potential difference. Such processes form the basis of modern optoelectronic devices.

Graphene is different

Until now, researchers believed that graphene was no different in its reaction to light – although some suspected that thermoelectric processes could be at play in the material. The new research by the MIT–Harvard team has unambiguously confirmed for the first time that these processes are indeed responsible for photocurrent generation in graphene.

The researchers obtained their results by making a host of optoelectronic measurements on complex graphene p–n-junction nanodevices that they had fabricated themselves in the laboratory. In particular, they performed precise spatially resolved optical-excitation microscopy and electron-transport measurements by shining laser light with a wavelength of 850 nm onto the graphene p–n interfaces. They then measured the photocurrent produced in the devices as the laser spot was scanned over the samples.

The team observed that a strong photocurrent was produced at the p–n contact that increased as the power of the laser beam was increased. The maximum photocurrent recorded was 5 mA/W at low temperatures, a value that is six times higher than that seen in previous graphene optoelectronic devices.

Running hot and cold

According to the researchers, such high values are a result of the photothermoelectric effect. “It turns out that when you shine a light on graphene, the electrons in the material heat up, and remain hot, while the underlying carbon lattice remains cool,” explains Jarillo-Herrero. “It is these hot electrons that then produce a current.” The electrons in the excited graphene cannot cool down easily because they couple poorly to the carbon lattice and so cannot transfer their heat to it, he adds.

“Our study is of a very fundamental nature,” says Jarillo-Herrero, “and forces us to ask myriad questions.” For example, how efficient are the photogenerated charge carriers and can the dimensions of the devices we made be optimized to maximize the current produced? What happens if the number of graphene layers is changed and what happens if the devices are coupled to optical cavities?

“All of these questions will be relevant when making new types of ultrafast and highly efficient photodetectors and energy-harvesting devices, the basic operating principles of which could be quite different from those of standard semiconductor devices because they rely on hot-carrier generation,” he says.

“Graphene with its new exciting properties allows for unprecedented engineering of novel thermo-optoelectronic structures,” Gerasimos Konstantatos of the Institut de Ciències Fotòniques in Barcelona, Spain, who was not involved in the work, tells physicsworld.com. “This new research shows that delocalized photogenerated hot carriers produce a high photoresponse using electrostatic control of doping in a dual-gated graphene device. Harnessing hot carriers in this material is indeed an important finding, given its bandgap-less nature,” he adds.

The work will appear in Science.

Are big-science projects worth the money?

By Tushna Commissariat

Published with the October issue of Physics World is a special big-science supplement in which we take a good look at the specific challenges of designing and building humongous facilities such as ITER and the LHC – from how to get them funded to the engineering and scientific issues that have to be tackled before construction can begin. You can download a free copy of the PDF here.

So, our poll question for this week:
Do you think that “big science” facilities are value for money?
The options are “Yes”, “No” and “Depends on the project”.
Vote now on <a href="https://www.facebook.com/physicsworld">Facebook</a>.

Results just in

Last week we ran two polls, both of which were about the 2011 physics Nobel prize.
In our first poll we asked you which fields of physics deserved to win this year’s Nobel. We had more than 200 responses and here are the results.

Most people felt that “quantum information” would be a shoo-in, giving it 104 votes, followed by “neutrino oscillations” with 63 votes. Unfortunately, we did not have the foresight to include dark energy in the list, but one of our Facebook followers, Peter Moon, commented about an hour before the prize was awarded, saying “The acceleration of the expansion of the universe is the most important and unexpected discovery of the last 30 years. Saul Perlmutter and his team from Berkeley responsible for the 1998 achievement deserve the prize, right now!” It looks as if he knew something that we didn’t!

Indeed, the prize was given “for the discovery of the accelerating expansion of the universe through observations of distant supernovae” to Perlmutter and two members of a rival group that came to the same surprising conclusion. (Read an extensive history of the discovery here.)

Our second poll question asked “Has the 2011 Nobel Prize for Physics ‘for the discovery of the accelerating expansion of the universe’ gone to the right people?” Some 88 of you answered with a “Yes”, while only 7 of you said “No”. So, that wraps up our Nobel polls, until next year!

Space missions look from the dark into the light

Hot on the heels of this week’s announcement of the physics Nobel prize “for the discovery of the accelerating expansion of the universe through observations of distant supernovae”, the European Space Agency (ESA) has chosen its next two science missions, one of which will explore the nature of dark energy – which many physicists believe is the cause of the accelerating expansion. Called Euclid, this mission will study the large-scale structure of the universe with the aim of understanding how it evolved following the Big Bang. The other mission is named Solar Orbiter and will gauge the influence of the Sun on the rest of the solar system, with a special focus on the effects of the solar wind.

The projects are the first in ESA’s Cosmic Vision 2015–2025 plan and fall into the category of medium-class missions. There are three such missions planned for launch in 2017–2022, but the third has yet to be chosen.

The dark side

In 1998 physicists were astounded by the discovery that the rate of expansion of the universe was increasing – not decreasing as had been previously thought. The cause of the acceleration remains one of the most enduring mysteries in cosmology. Euclid is a space-based telescope that aims to create the most accurate map yet of the large-scale structure of the universe. According to ESA’s mission objectives, this will enable astronomers “to understand the nature of dark energy and dark matter by accurate measurement of the accelerated expansion of the universe through different independent methods”.

Euclid will observe galaxies and clusters of galaxies out to redshifts of z ~ 2 at visible and near-infrared wavelengths. Its view will stretch back across 10 billion light-years, revealing details of the universe’s expansion and how its structure has developed over the last three-quarters of its history. Euclid’s launch, on a Soyuz launch vehicle, is planned for 2019 from Europe’s Spaceport in Kourou, French Guiana.
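The link between a redshift of z ~ 2 and looking back across roughly three-quarters of the universe’s history can be checked with a quick numerical integration of the Friedmann equation. The sketch below is a rough back-of-the-envelope calculation, not part of the Euclid mission analysis; the flat-ΛCDM parameters (H0 = 70 km/s/Mpc, Ωm = 0.3, ΩΛ = 0.7) are illustrative assumptions:

```python
import math

# Assumed, illustrative flat-LambdaCDM parameters (not from the article)
H0 = 70.0          # Hubble constant, km/s/Mpc
OMEGA_M = 0.3      # matter density parameter
OMEGA_L = 0.7      # dark-energy density parameter

# Hubble time 1/H0 in gigayears (1 Mpc = 3.0857e19 km, 1 Gyr = 3.156e16 s)
HUBBLE_TIME_GYR = (3.0857e19 / H0) / 3.156e16

def lookback_time_gyr(z, steps=100_000):
    """Lookback time to redshift z, from the Friedmann equation:
    t_lb = (1/H0) * integral_0^z dz' / [(1+z') * E(z')],
    with E(z') = sqrt(Omega_m (1+z')^3 + Omega_Lambda),
    evaluated with the trapezoidal rule."""
    def integrand(zp):
        e = math.sqrt(OMEGA_M * (1.0 + zp) ** 3 + OMEGA_L)
        return 1.0 / ((1.0 + zp) * e)

    dz = z / steps
    total = 0.5 * (integrand(0.0) + integrand(z))
    for i in range(1, steps):
        total += integrand(i * dz)
    return HUBBLE_TIME_GYR * total * dz

print(f"Lookback time to z = 2: {lookback_time_gyr(2.0):.1f} Gyr")
```

With these parameters the lookback time to z = 2 comes out at roughly 10 Gyr – consistent with Euclid’s view stretching back about 10 billion light-years, some three-quarters of the universe’s ~13.7-billion-year age.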

Here comes the Sun

While the Euclid mission will probe the furthest corners of the universe, the other Cosmic Vision mission will be looking at something rather closer to home. The Solar Orbiter will investigate how the Sun creates and controls the heliosphere – the bubble in space that is “blown” by the Sun and engulfs the solar system. The mission is designed to better our understanding of the influences our Sun has on its neighbourhood. In particular, it will study how the Sun generates and propels the solar wind, which is the flow of particles in which the planets are bathed. Solar activity, such as solar flares, affects the solar wind, creating strong perturbations and making it turbulent. This can have dire consequences for radio communications, satellites and space missions, as well as triggering spectacular auroral displays visible from Earth and other planets.

The Solar Orbiter will maintain an elliptical orbit around the Sun and will venture closer to it than any previous mission. This will allow the mission to measure how the solar wind accelerates over the Sun’s surface and to sample this solar wind shortly after it has been ejected. The mission’s launch is planned for 2017 from Cape Canaveral using a NASA-provided Atlas launch vehicle.

Asking the right questions

Early in 2004, the Cosmic Vision 2015–2025 plan identified four scientific aims: What are the conditions for life and planetary formation? How does the solar system work? What are the fundamental laws of the universe? How did the universe begin and what is it made of? A “call for missions” around these aims was issued in 2007, with Euclid and the Solar Orbiter chosen this year. ESA is now evaluating five other medium-sized missions for the final launch slot in 2022. Among them is the PLATO mission, which narrowly missed out in this latest round; it would look at nearby stars to study the conditions required for planet formation and the emergence of life.

“It was an arduous task for the Science Programme Committee to choose two from the three excellent candidates. All of them would produce world-class science and would put Europe at the forefront in the respective fields. Their quality goes to show the creativity and resources of the European scientific community,” says Fabio Favata, head of the Science Programme’s planning office.

Copyright © 2026 by IOP Publishing Ltd and individual contributors