
‘Quantum trampoline’ measures gravity

Physicists in France have come up with a new way of using bouncing ultracold atoms to measure the acceleration due to gravity. The technique involves firing vertical laser pulses at a collection of free-falling atoms, which bounces some atoms higher than others. When the atoms recombine at the centre of the experiment, they create an interference pattern that reveals that g is 9.809 m/s² – just as expected for their Paris lab.

The new technique builds on pioneering work carried out earlier this year by Cass Sackett and his group at the University of Virginia in the US, who were the first to measure gravity by bouncing ultracold rubidium-87 atoms using a laser firing pulses straight up. What happens is that an atom can absorb a photon from the pulse and then re-emit a photon at a slightly different energy. If the timing of the pulses and the momentum absorbed by the atom are just right, the atoms bounce up and down like a gymnast on a trampoline.

However, as the amount of momentum transferred to the atom is quantized, an atom can sometimes receive twice the usual upward momentum – or no upward momentum at all. These breakaway atoms follow different trajectories from the main group – the double-momentum atoms bounce much higher, while those that get no extra kick drop down. Although these atoms are initially lost to the main group of regular bouncers, both types can rejoin it after some time, creating an interference pattern when the atoms recombine.

Sackett’s team initially sought to minimize the number of atoms lost from the regular bouncers so that they could determine g classically just by measuring how fast the rubidium atoms in this group fall. Sackett’s team then carefully allowed some atoms to follow a second trajectory, and the resulting interference patterns were used to give an even more precise measure of g.

On the bounce

Now, however, Thomas Bourdel and colleagues at the University of Paris and CNRS have shown that g can also be obtained by allowing a fraction of the atoms to take lots of different trajectories and observing the interference patterns when they recombine. Bourdel and colleagues began with about 150,000 rubidium-87 atoms that were cooled to ultracold temperatures to form a Bose–Einstein condensate (BEC), in which all the atoms settle into the same quantum state. The period of the pulses was set at about 1.2 ms and each pulse lasted about 35 µs.
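
To see where the roughly 1.2 ms pulse period comes from, here is a minimal back-of-envelope sketch in Python. It assumes a two-photon recoil kick of 2ħk per bounce and the standard 780 nm rubidium D2 line – plausible values for such experiments, but not ones spelled out in the article.

```python
# Back-of-envelope check of the trampoline pulse period (a sketch, not the
# authors' calculation). Assumes a two-photon (2*hbar*k) momentum kick per
# bounce and the standard Rb-87 D2 line at 780 nm.
import math

hbar = 1.054571e-34           # reduced Planck constant (J s)
amu = 1.660539e-27            # atomic mass unit (kg)
m_rb87 = 86.909 * amu         # mass of a rubidium-87 atom (kg)
g = 9.809                     # local gravitational acceleration (m/s^2), from the article
wavelength = 780e-9           # Rb D2 transition wavelength (m), assumed

k = 2 * math.pi / wavelength  # photon wavenumber (1/m)

# For the atom to keep bouncing, the momentum gained from gravity between
# pulses must be cancelled by the two-photon recoil kick:
#   m * g * T = 2 * hbar * k   =>   T = 2 * hbar * k / (m * g)
T = 2 * hbar * k / (m_rb87 * g)
print(f"Required pulse period: {T * 1e3:.2f} ms")  # ~1.20 ms, matching the article
```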

The atoms were bounced by a fixed number of pulses (10–30) before the interference pattern was measured. This allowed the physicists to determine the acceleration due to gravity to be 9.809 m/s² to three decimal places. This agrees with the local gravity as measured by the World Geodetic System, and is slightly more precise than Sackett’s value of 9.814 ± 0.008 m/s².

Sackett told physicsworld.com that Bourdel and company were “very creative” to realize that the lost atoms could be recovered and their loss could be turned into a “benefit rather than a hindrance”.

Towards portable gravimeters

The Sackett and Bourdel teams are not the first to measure gravity using ultracold atom interferometry, but other such experiments have involved dropping the atoms in a vacuum chamber – an approach that is limited by the length of the chamber. Although Mark Kasevich and colleagues at Stanford University in the US have built a 10 m drop tower, such systems are not really practical as portable gravimeters for oil and mineral exploration. Such devices would only become possible if physicists find a way of making “BEC-on-a-chip” technology work.

Bourdel and colleagues are now seeking to refine their technique by increasing the number of bounces; changing the shape of the pulses to increase the number of trajectories contributing to the interference pattern; and using photons of higher energy and/or lighter atoms such as helium to increase the time between bounces.

The research will be published in the journal EPL.

Earth’s response to CO₂ underestimated

Global warming resulting from slowly changing Earth systems could be up to 50% greater than previously thought, according to research by UK and US scientists. The study reinforces the notion that certain poorly understood systems such as ice sheets or vegetation are integral to accurately predicting future temperatures. It also paints an ever-bleaker outlook for our planet at a critical time when world leaders are gathering for a United Nations conference in Copenhagen to discuss practicable ways of mitigating climate change.

“If we want to build an agreement that is going to last for many, many centuries – so for our grandchildren’s grandchildren’s grandchildren’s grandchildren – then we need to be taking in these issues,” lead author Dan Lunt of the University of Bristol told physicsworld.com.

A fiendishly complicated system

Modelling climate change across the whole Earth system is a difficult task that requires fluid-dynamics equations to be solved over a rotating sphere at small time increments. The models must also account for relatively small-scale phenomena such as clouds, and interactions between, for example, the atmosphere, ocean and biosphere. One of the most challenging aspects of climate modelling is to factor in the processes that evolve over thousands, even millions, of years.

Yet even with the best supercomputers, certain systems have proved too complex to model accurately, or evolve too slowly to reach equilibrium within a simulation’s duration. As a result, no one is sure what the true response of these systems might be to mounting CO₂ emissions.

Lunt’s group – which also includes members from the University of Leeds, Northumbria University, the British Antarctic Survey, NASA and the US Geological Survey – has tackled this problem by trying to unravel the elements of an ancient climate retrospectively. They studied the Earth’s “mid-Pliocene” period, three million years ago, for which they have long-term data on temperature and on some of the more troublesome systems, such as ice sheets and vegetation.

Warming underestimated

The researchers discovered that the model gave the correct retrospective temperature predictions only when ice-sheet and vegetation data for the mid-Pliocene period were included. Surprisingly, when they ran the model with more modern ice-sheet and vegetation data, the retrospective predictions underestimated the mid-Pliocene period’s global warming by 30–50%.

Although this result highlights how integral some slow systems are for accurate long-term predictions, Lunt is quick to point out that his group cannot say how these factors could affect short-term predictions. “We’re not saying that in a decade the temperature will be 30–50% more than old predictions would be,” he says. “What we are saying is that the predictions of the climate’s equilibrium state are likely to be underestimates by that much.”
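
To put the 30–50% figure in context, here is a trivial bit of arithmetic in Python. The 3 °C baseline is an assumed, commonly quoted mid-range value for the fast-feedback (“Charney”) sensitivity, not a number taken from the study itself.

```python
# Illustrative arithmetic only: if the equilibrium climate sensitivity is
# underestimated by 30-50% once slow feedbacks (ice sheets, vegetation) are
# included, the full Earth-system sensitivity scales accordingly. The 3 C
# baseline per CO2 doubling is an assumed mid-range value for illustration.
charney_sensitivity = 3.0  # C of equilibrium warming per CO2 doubling (assumed)

for underestimate in (0.30, 0.50):
    earth_system = charney_sensitivity * (1 + underestimate)
    print(f"{underestimate:.0%} underestimate -> {earth_system:.1f} C per CO2 doubling")
```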

Reto Knutti, a climate scientist at ETH Zurich, thinks it is an “important” study, but agrees that it does not say when the extra warming would come into effect. “[It] confirms in a more quantitative way what people have been speculating: that the sensitivity of temperature to CO₂ could be significantly larger if slow feedbacks are included,” he says. “What is missing at this point is an estimate of timescales, i.e. whether these slow feedbacks become important after a few centuries or after thousands of years.”

“This is certainly very exciting science,” says Gabriele Hegerl, a climate scientist at the University of Edinburgh. However, she also believes that it might have little relevance to present discussions on mitigation, because short-term climate change is likely to be governed more by our CO2 emissions. “These estimates are helpful, but they can only be guides,” she adds.

This research is published in Nature Geoscience.

The dark-matter rumour mill

Any WIMPs in here?

By Michael Banks

You shouldn’t believe everything you read in the blogs (except this one of course).

Yesterday, the rumour mill was in overdrive as the Resonaances blog said a paper about a possible detection of dark matter was due to be released a week on Friday in the journal Nature.

What constitutes dark matter – which is thought to make up around 90% of the material in the universe – is a hot topic of research these days, with researchers vying to be the first to provide direct evidence of it. If the rumours were true, it would perhaps be the discovery of the year.

The new rumours are based on the latest results from the Cryogenic Dark Matter Search (CDMS), located in the Soudan Underground Laboratory in Minnesota, which is searching for weakly interacting massive particles, or WIMPs — a prime candidate for dark matter.

We were a little suspicious of the rumours, as Nature is published on Thursdays, with embargoes for news items about its papers lifting on Wednesday evenings at 6pm GMT. However, the paper could have been an advance online publication in Nature, or perhaps was due to be published in Science, which comes out every Friday.

The rumours were also bolstered by a series of talks to be given by various members of the CDMS team at labs such as CERN on 18 December – the same date on which the paper would supposedly be published.

However, Leslie Sage, a senior editor at Nature, wrote to Resonaances saying there was no such Nature paper and the rumours were unfounded.

I contacted Priscilla Cushman of the CDMS collaboration, who is based at the University of Minnesota, and she confirmed to me that they have indeed not submitted a paper to Nature.

So why are they presenting the results at different labs on the same day? “Since there is no major conference at this time in which to present them we are coordinating our talks,” Cushman told physicsworld.com.

CDMS researchers will, however, be publishing an arXiv paper on the morning of Friday 18 December about their latest results, so we will have to wait until then.

Cushman says the group were quite taken aback by the rumours going around. “It is certainly an interesting social phenomena [sic],” says Cushman. But ultimately it was “lots of smoke and not much fire”.

2.36 TeV collisions at LHC?

ATLAS collisions

By Hamish Johnston

According to several physics bloggers (and backed up by the above image), physicists at the ATLAS experiment have managed to collide bunches of 1.18 TeV protons to achieve the highest collision energy yet — 2.36 TeV.

This makes the Large Hadron Collider the most energetic particle collider, beating the Tevatron’s previous record of 1.96 TeV.

The LHC became the world’s most energetic accelerator ten days ago, when proton pulses were first boosted up to 1.18 TeV.
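
For two identical beams colliding head-on, the collision energy is simply the sum of the two beam energies (the proton mass is negligible at these energies). A quick Python sanity check of the numbers above:

```python
# Centre-of-mass energy for symmetric head-on collisions: sqrt(s) = 2 * E_beam
# (beam rest masses are negligible at TeV energies).
def cm_energy(beam_energy_tev: float) -> float:
    """Return the centre-of-mass energy (TeV) for two equal head-on beams."""
    return 2 * beam_energy_tev

print(cm_energy(1.18))  # LHC: 2.36 TeV
print(cm_energy(0.98))  # Tevatron's record: 1.96 TeV
```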

There hasn’t been an official statement from CERN about this — we’ll keep you updated.

Securing the supply of medical isotopes

By Hamish Johnston

Over the past few years the supply of Mo-99 — which is used to make the medical isotope Tc-99m — has been threatened by two unscheduled shutdowns of the ageing NRU reactor in Chalk River, Canada.

Normally NRU supplies North America with Tc-99m and accounts for a significant chunk of world production, so any prolonged shutdown is bad news.
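
The urgency comes from basic decay arithmetic: Mo-99 has a half-life of about 66 hours, so it cannot be stockpiled against outages. A minimal sketch of the numbers:

```python
# Why reactor outages bite so quickly: Mo-99 decays with a half-life of
# about 66 hours, so existing inventory disappears within days of a shutdown.
import math

half_life_h = 66.0                      # Mo-99 half-life (hours)
decay_const = math.log(2) / half_life_h # decay constant (per hour)

for days in (1, 3, 7):
    remaining = math.exp(-decay_const * days * 24)
    print(f"After {days} day(s): {remaining:.0%} of the Mo-99 remains")
# After a week only ~17% is left, so even short shutdowns squeeze supply.
```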

That’s why the Canadian government convened the Expert Review Panel on Medical Isotope Production earlier this year to identify the most viable options for future isotope production.

The panel has just submitted its report and you can read all 135 pages of it here.

The main recommendation is the replacement of NRU with another multi-purpose research reactor that would supply isotopes as well as fulfilling other scientific functions. However, revenues from isotope production would only offset about 10–15% of the cost of such a reactor – so other research activities would have to justify the bulk of the price tag.

The panel also recommends that Tc-99m production in cyclotron accelerators be investigated. Although this would involve a significant amount of research and development, the infrastructure is already in place at several sites in Canada.

Could we see the rebirth of the University of Manitoba cyclotron?

Nobel prizes and the credit crunch

Future winners may have to make do with less

By Michael Banks

You could say physicists have much to be gloomy about these days, with the Science and Technology Facilities Council in the UK cutting funding for projects to patch up its budget, and scientists in Japan bracing themselves for deep cuts to the country’s science budget next year.

And now future winners of the Nobel prizes could end up feeling short-changed if the Nobel Foundation, which manages the finances of the prizes, cuts the amount of money it dishes out every year.

The Foundation announced at the weekend that it might cut the $1.5m it hands out for each of the six prizes awarded every year. The reason, it says, is the credit crunch and the impending recession, which have led to losses in the foundation’s assets.

Indeed, when the credit crunch struck in 2008 the foundation’s assets lost nearly one-fifth of their value, and they have only slightly recovered since. “We have sailed the storm, but have taken on some water,” said Michael Sohlman, executive director of the Nobel Foundation, at a press conference.

So as this year’s Nobel prize winners — including US president Barack Obama, who won the Nobel Peace Prize — attend the awards ceremony in Stockholm on Thursday, future winners may have to make do with less.

Attacking tumours with tiny discs


Researchers in the US are developing a new way of destroying cancer tumours that involves attacking them with tiny magnetic discs. Although the research is still a long way from finding medical application, it joins a growing number of innovations that are seeking to apply fundamental physics to the treatment of cancer.

The idea that magnetic particles could be used to target cancer tumours has been circulating in the research community for several decades. The general principle is that drugs intended to destroy targeted cells could be attached to magnetic particles and guided to the appropriate places in the human body using external magnetic fields. Given the precision promised by this approach, it could offer obvious advantages over the crude targeting of chemotherapy, which can leave patients feeling extremely unwell.

Despite their early promise, however, these therapies have yet to yield much success in the field of oncology, mainly because of a number of technical issues. First, most experimental work to date has used superparamagnetic particles, which can only be controlled with strong magnetic fields that are not available in clinical settings. If the particles are made larger to compensate, they retain a remanent magnetization even in the absence of external magnetic fields, and this can cause them to aggregate in clusters. Inside the body this process would be dangerous because it could lead to an embolism.

A new spin

In this latest research an interdisciplinary team from the Argonne National Laboratory and the University of Chicago offer a new spin on this technology that could help to overcome some of these technical problems. Instead of viewing the magnetic nanoparticles as a means of transport for drugs, the materials science researchers have designed a technique in which the particles themselves attack the cancerous cells by exerting a mechanical force.

“This treatment is not being designed to replace surgery, but it could accompany an operation for a number of types of brain cancer,” Elena Rozhkova, Argonne National Laboratory

Using a gold-shelled iron-based alloy that they developed, the researchers have created tiny circular discs that are just 60 nm thick with diameters of approximately 1 µm. In this geometry, the magnetic moments follow the disc circumference, forming a vortex-like structure with almost perfect closure of the magnetic flux within the disc itself. Instead of being guided by a magnetic field, the tiny discs are coated in antibodies that are able to home in on the affected cells.

Once a disc is alongside a cancerous cell, an alternating magnetic field can be applied, which causes the vortex structure in the disc plane to shift and the magnetic disc to oscillate. As a result, the discs exert a lateral force on the targeted cancer cell. This very small force, of the order of a few tens of piconewtons, is strong enough to trigger a redistribution of calcium inside the cancer cell that can result in a form of cell death known as apoptosis.
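
As a very rough plausibility check on that force scale, the sketch below estimates the torque on a disc in a weak alternating field and the force it could transmit at its rim. The material parameters, field amplitude and effective moment fraction are all illustrative assumptions – none of them appear in the article.

```python
# Order-of-magnitude estimate only (not the authors' calculation). Assumes a
# permalloy-like magnetization, a ~10 mT applied field, and an effective
# in-plane moment of a few per cent of saturation, since a vortex-state disc
# in a weak field is far from saturated. All three values are assumptions.
import math

radius = 0.5e-6          # disc radius (m), from the ~1 um diameter in the text
thickness = 60e-9        # disc thickness (m), from the text
M_sat = 8.0e5            # saturation magnetization (A/m), permalloy-like, assumed
B = 10e-3                # applied field amplitude (T), assumed
moment_fraction = 0.05   # effective moment as a fraction of saturation, assumed

volume = math.pi * radius**2 * thickness   # disc volume (m^3)
m_eff = moment_fraction * M_sat * volume   # effective magnetic moment (A m^2)
torque = m_eff * B                         # torque on the disc (N m)
force = torque / radius                    # force transmitted at the disc rim (N)
print(f"Estimated force: {force * 1e12:.0f} pN")  # lands in the tens of pN
```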

To demonstrate the technique, the team used brain cancer cells in a controlled environment outside of the body. An alternating magnetic field frequency of just a few tens of hertz was sufficient to destroy roughly 90% of the cancer cells.

Efficiency matters

Elena Rozhkova, a member of the research team, told physicsworld.com that the choice of cancer cells was something of an “arbitrary” decision. She adds, however, that a brain tumour is the sort of cancer that could benefit from this form of treatment because of the low efficiency of existing brain-cancer chemotherapy and radiotherapy. “This treatment is not being designed to replace surgery, but it could accompany an operation for a number of types of brain cancer,” she says.

Rozhkova says that the Argonne–Chicago team is seeking to develop smaller discs that would be more suitable for clinical application. In the longer term, the researchers hope to test the spin-vortex discs on small animals before moving towards full clinical trials.

Jon Dobson, a biomedical engineering researcher at Keele University in the UK, believes that the innovation does indeed have the potential to solve some of the major problems involved in tumour targeting. He warned, however, of several technical hurdles that would first need to be overcome, including the need for rigorous toxicological testing. Dobson also foresees an issue with the proposed down-sizing of the magnetic discs: “The discs will need to be smaller but it is not yet clear whether one would see the same effects with smaller discs,” he says.

The latest research was funded in part by the US National Institutes of Health (NIH), which last month unveiled a new five-year initiative to encourage more researchers to apply fundamental physics to the treatment of cancer. Worth $22.7m over five years, the project will include the creation of 12 new research centres that will bring non-traditional approaches to oncology by considering the physical properties and dynamics of cancer cells.

This research was published in Nature Materials.

Is this the world's smallest snowman?

“I’m riding in the midnight blue”

By James Dacey

Measuring just one-fifth the width of a human hair, this must be a very strong contender for the smallest snowman in the world.

His body has been formed by welding together two tin microparticles (10 µm in diameter), which are usually used in the calibration of electron microscopes. A focused ion beam was deployed to etch out his eyes and mouth and a tiny fleck of platinum forms the nose.

The little fella’s creator is David Cox, a researcher at the National Physical Laboratory (NPL) in London. He was taking time out from his usual job of fabricating devices for the Quantum Detection Group. “I guess I was just born to make stuff,” Cox writes on his homepage.

There is a video on the NPL homepage showing just how Cox sculpted his miniature friend.

Fish swishing mixes the oceans

Fluid mixing processes in the oceans play a major part in governing the world’s heat and carbon balances, as well as providing a food-delivery service to a variety of organisms. The wind and tides play key roles in this process, but several research groups have claimed that, perhaps surprisingly, the combined motion of marine organisms could also make a significant contribution to global mixing. Now, a pair of mathematicians in the US have developed a model that, they say, suggests this could indeed be the case.

The idea that marine swimmers are having a significant impact on ocean mixing was first proposed in the 1960s by Walter Munk, a celebrated researcher in the oceanography community. “Munk set out to list all the factors that could influence mixing in the ocean, and he threw ‘biological processes’ in there, by which he meant the effect of swimming organisms such as fish or plankton,” says Jean-Luc Thiffeault of the University of Wisconsin, who was involved in this latest research.

Despite his insight, however, Munk had neither the data nor the means to establish the extent of marine life’s contribution. Given the lack of experimental results and the ongoing debates surrounding the complex interactions between the atmosphere and the oceans, Munk’s idea seemed to fall by the wayside. One of the major objections has always been the issue of scaling – researchers have been unable to see how organisms of the order of a few centimetres in size could have any significant impact over the thousands of kilometres of the oceans.

Swimming into the 21st century

Earlier this year, however, physicsworld.com reported on how researchers in California had demonstrated an alternative mixing mechanism that could enable a significant contribution from smaller organisms. By studying the movement of jellyfish through a water column, the scientists looked at how marine animals drag water with them as they swim up and down. As water density increases with depth, the swimming animals affect the total potential energy of the fluid, leading to further ambient fluid motions and eventually molecular mixing.

In this latest research, Thiffeault and his colleague Steve Childress from New York University take this mechanism as a starting point and develop a more complex physical model to gauge the effect this mixing could have on a global scale. They model marine organisms as spheres of 1 cm radius travelling at 1 cm/s, because these are the approximate parameters of krill, which are exceptionally common throughout the world’s oceans.

In the model the spheres move around randomly without interacting, much like particles in an ideal gas. Thiffeault and Childress are interested in the displacement of water as a result of these moving spheres. Given the large number of spheres involved, tools from statistical physics are used to calculate an “effective diffusivity”, which is related to the flux of molecules triggered by the disturbed sea water.
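
The statistical idea can be illustrated with a toy model (not the authors’ actual calculation): give tracer particles a small random kick each time a swimmer passes, and an effective diffusivity emerges from the linear growth of the mean-square displacement. All the numbers below are arbitrary illustrative choices.

```python
# Toy 1D illustration of an "effective diffusivity": tracers receive a kick
# of random sign at random encounter times, and D_eff is read off from the
# mean-square displacement, MSD = 2 * D_eff * t. Parameters are arbitrary.
import random

random.seed(1)          # reproducible toy run
n_tracers = 2000
encounter_rate = 2.0    # swimmer encounters per tracer per unit time (assumed)
kick = 0.01             # displacement per encounter, random sign (assumed)
t_total = 100.0
dt = 0.1

positions = [0.0] * n_tracers
for _ in range(int(t_total / dt)):
    for i in range(n_tracers):
        if random.random() < encounter_rate * dt:
            positions[i] += random.choice((-kick, kick))

msd = sum(x * x for x in positions) / n_tracers
d_eff = msd / (2 * t_total)                # 1D diffusivity from the MSD
d_theory = 0.5 * encounter_rate * kick**2  # variance grows at rate * kick^2
print(f"measured D_eff ~ {d_eff:.2e}, expected ~ {d_theory:.2e}")
```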

By no means salt of the Earth

Based on well-established estimates of total biomass in the oceans, the researchers calculate an effective diffusivity triggered by marine swimmers of 6 × 10⁻⁵ cm²/s. Significantly, this is four orders of magnitude larger than the figure for the equivalent volume of salt, meaning that if all marine life behaved like grains of salt then its impact on mixing would be substantially reduced. Although this is still 100 times smaller than the effect caused by heat redistribution in the ocean, it suggests that marine swimmers are indeed making a noticeable contribution to ocean mixing, say the researchers.

What is more, the researchers say that this figure is likely to be a lower limit on the true contribution that marine animals make to ocean mixing. If the model were to include the effect of vortices in the wake of a moving fish, or fish “sticking” due to the viscosity of seawater, then the mixing effect could be enhanced. Fish travelling in large schools could also increase the effect, as the mixing scales nonlinearly with body size, and water currents may also travel through the school.

“The model is simplified but will allow future studies to probe a wide variety of scenarios of fish schooling and the corresponding fluid mixing that is achieved,” says John Dabiri of the California Institute of Technology, who carried out the earlier research on water displacement. “Water parcels can become trapped within the shoal and transported over large distances. This process can significantly enhance the stirring of the water, which ultimately leads to enhanced mixing.”

Dabiri also recognizes the significance of this latest research to climate science. “The model appears sufficiently flexible that one could simulate various scenarios of future animal population dynamics, perhaps including dynamics caused by climate change.”

Thiffeault told physicsworld.com that he hopes to develop this research by making the model more sophisticated and testing the results through collaborations with experimental teams.

This research is available for free download on the arXiv preprint server.

Budget deficit threatens Japanese science

Researchers in Japan are expressing concerns that their research funds could be cut in half – or possibly even terminated – following the government’s decision to slash over $35bn from next year’s budget. A number of research labs in the country, including the SPring-8 synchrotron in Hyogo and the KEK particle-physics lab in Tsukuba, could see large cuts to their research budgets next year that, they say, would threaten the country’s competitiveness in science.

In a statement issued yesterday, Atsuto Suzuki, director general of KEK, said that it could take “years” for the country to recover from the cuts. “Neglect of the importance of fundamental research could result in a long-term stagnation of our national competitiveness,” he says.

The problems have been brewing since early November, when the new Japanese government, elected in August, announced plans to slash next year’s budget, which covers the fiscal year beginning in April 2010. Some suggest the cuts stem from the new government – led by Prime Minister Yukio Hatoyama of the Democratic Party of Japan – having promised during the election campaign to make public high schools free to attend. Such schools currently charge tuition fees and attendance is not compulsory, even though most students do attend.

“If the government starts slashing funds it really hurts Japanese science and its credibility,” Hitoshi Murayama, University of Tokyo

However, this pledge significantly adds to the budget of the Ministry of Education, Culture, Sports, Science and Technology (MEXT), which is dominated by elementary- and middle-school education. As a result, MEXT is seeking to cut its expenditure elsewhere, which means that science could be in the firing line.

In early November the Japanese government set up working groups to re-evaluate around 400 ongoing research projects in the country, as well as university budgets for next year. These working groups report to the Government Revitalization Unit (GRU) – an 11-member panel chaired by Hatoyama himself – that is mostly made up of politicians, industry leaders and a few academics.

This arrangement has caused concern among scientists that they are not being properly represented when the fate of their projects is decided. “We scientists do not have any opportunity to defend our programmes,” says Hitoshi Murayama, director of the Institute for the Physics and Mathematics of the Universe (IPMU) at the University of Tokyo and one of the leaders of the opposition to the proposed budget cuts.

Worrying times ahead

The working groups have now evaluated the 400 projects and the university budgets. “Only very few came out unscathed,” says Murayama. “Most of them were recommended for reduction, and quite a few for near termination.” Among those earmarked for cuts is the SPring-8 synchrotron in Hyogo, which could see its budget slashed by a third to a half, while what would be the world’s fastest supercomputer, due to be built at the RIKEN lab, could receive no money at all.

At the end of November four Nobel laureates, including Makoto Kobayashi of KEK, who shared the 2008 Nobel Prize for Physics with Yoichiro Nambu and Toshihide Maskawa for their work on broken symmetry in particle physics, took the unprecedented step of criticizing the government’s plan to cut the research budget. The laureates called on the government to listen to scientists when deciding how to allocate funding for research projects.

Things were made worse in late November when the GRU finished looking at the overall budget for universities and national laboratories, such as the KEK particle-physics lab in Tsukuba and the J-PARC experimental complex in Tokai. Although the GRU proposed holding another review of the regular operating budget for personnel and infrastructure at universities, 50% of the GRU’s members voted to terminate the operating budget for projects such as the Super-Kamiokande neutrino experiment, the B-meson factory at KEK and Japan’s involvement in the Subaru optical and infrared telescope on Mauna Kea, Hawaii.

Research concerns

Murayama is also concerned about the threat to the IPMU. The institute is part of Japan’s World Premier International Research Center Initiative (WPI programme), which founded five institutes in 2007, covering fields from nanoscience to immunology, to attract researchers from abroad to work in Japan. The five institutes are funded for 10 years, each receiving $10m per year, and Murayama is bracing himself for a 50% cut to the WPI programme.

As the IPMU’s costs mostly go on salaries, this could potentially mean cutting staff numbers in half. “Previously there had not been many jobs for non-Japanese scientists,” says Murayama, “so this hurts the case for science in Japan.”

Negotiations will now take place in the Ministry of Finance, with the finalized budget for 2010 set to be announced at the end of December. “I am very worried at the moment,” says Murayama. “If the government starts slashing funds it really hurts Japanese science and its credibility.”
