China builds super-sized radio telescope

Construction has begun on a massive new 500 m diameter radio telescope in Guizhou province, China, that will allow astronomers to detect galaxies and pulsars at unprecedented distances. The $102m facility, known as the Five-hundred-meter Aperture Spherical Telescope (FAST), will boast a collecting area equal to 30 football fields — more than twice as big as the 305 m diameter radio telescope at Arecibo Observatory in Puerto Rico, which has been the world’s largest since it opened in 1963.
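As a quick sanity check on that comparison (a back-of-envelope calculation rather than a figure from the FAST team), the collecting area of a filled circular dish scales with the square of its diameter:

$$\frac{A_{\mathrm{FAST}}}{A_{\mathrm{Arecibo}}} = \left(\frac{500\ \mathrm{m}}{305\ \mathrm{m}}\right)^{2} \approx 2.7,$$

which is consistent with the "more than twice as big" figure quoted above.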

The geography and remoteness of FAST’s site — located some 170 km by road from the provincial capital Guiyang, near the village of Dawodang — make it unusually radio-quiet, says Nan Rendong, FAST chief scientist and a researcher from the National Astronomical Observatories at the Chinese Academy of Sciences. Like Arecibo, the new telescope will sit in a natural karst depression that mimics the shape of the collecting surface, simplifying the support structure and shielding the telescope from stray human-generated radio waves.

The site’s potential for long, uninterrupted observations — coupled with the telescope’s huge size, which will give it twice the sensitivity of Arecibo — means that researchers there will be able to detect objects such as faint, short-period pulsars that are too weak to be measured accurately by smaller instruments. The team also expects to discover the first pulsar outside the Milky Way, according to Nan.

‘Extraordinary’ impact on astronomy

“The FAST science impact on astronomy will be extraordinary,” Nan told physicsworld.com, adding that although the telescope is located in China, once it is completed in 2014 it will be open to astronomers from around the world.

In addition to being big, FAST is designed to be flexible: a system of motors attached to its 4600 panels will allow astronomers to change its shape from a sphere to a paraboloid, making it easier to move the position of the telescope’s focus. This will allow the south-pointing telescope to cover a broad swathe of the sky — up to 40 degrees from its zenith, compared to the 20-degree-wide strip covered by Arecibo.

“Arecibo points straight up, and it’s a real chore to move more than 20 degrees,” says Murray Lewis, head of the radio astronomy group at Arecibo. “In that respect, they definitely have an advantage.” However, he notes that in its initial phase, FAST will only be sensitive to low-frequency (less than 3 GHz) radio waves. This range includes the commonly observed 1.4 GHz hyperfine transition in atomic hydrogen, the universe’s most abundant element and an important marker for a variety of stellar objects. Arecibo’s bandwidth, by contrast, stretches up to 10 GHz, allowing astronomers to collect data on molecular transitions in this region of the spectrum.

A planned second phase of construction will extend FAST’s range to 5 GHz, but a date for the upgrade has not yet been set.

CERN: the view from inside

How do you get a billion people excited about a giant superconducting proton collider?

Thinking about how to communicate the start-up of the LHC had been the focus of everything we’d done since 2004. We saw a huge opportunity to put CERN and particle physics on the map. So we started saying to journalists: “Wanna see this? Well, you’d better come soon because it’s going to be closed up — and it’s impressive.” Until 2004 we’d have about 200 visits per year, but in 2007 we had 600.

It’s now clear that 10 September was the only day the LHC was functioning long enough to get protons around in both directions. Does it unnerve you that so much luck was involved when the world was watching?

We took the decision to show people the reality of doing science at this level, and that carries risk. It could have gone wrong, but on the other hand I know the people who run these machines and they’re just an amazing bunch.

How did you feel when you woke up on 10 September?

Slightly terrified. Plan B was for me to stand up in the Globe [where the media were corralled] and tell 340 journalists who had come to CERN that it ain’t going to happen today. As it was, a minor cryogenic outage added a bit of drama, but that drove home to people that the event was live. We hadn’t rehearsed it, and that openness was appreciated.

And by the time you went to bed?

Well, fantastic. It was just the most amazing day. It was amazing to see how much emotion there was in witnessing this big machine coming to life. The event was huge. Eurovision [the broadcast service which beamed footage to television networks and cost CERN about 50,000 CHF] estimated we had exposure to 1bn people. The result now is that the LHC is mentioned without explanation in contexts that have nothing to do with science.

Rumour was that the switch-on date was arranged around BBC presenter Andrew Marr’s holiday plans?

It’s hilarious. The BBC did ask if we could put the date back if Andrew couldn’t make it, and we said “no”. But on the other hand, BBC Radio 4 pulled out all the stops and decided to do something unprecedented in science by devoting a day to the event, so in return we gave them a room just off the CERN Control Centre to use as a studio. The fact that Radio 4 went so big on CERN drove it out to the rest of the BBC, culminating in “Big Bang Day”, and then out to the rest of the UK media and the world.

Did the black-hole Armageddon frenzy aid or hinder your communication efforts?

Ultimately it helped us by generating interest, but it also worried an awful lot of people and that makes me somewhat angry. People were phoning us up genuinely worried about the end of the world and demanding to know who CERN is accountable to. Of course we’re accountable — 20 countries have to say “yes” before we do anything! We ran a strict press accreditation procedure and there was heightened security on the day.

Did CERN handle the issue well?

With hindsight I would have treated the black hole stuff in exactly the same way we dealt with Angels and Demons [the Dan Brown novel in which antimatter is stolen from CERN to destroy the Vatican]. We were very proactive with that — we put up a webpage and had fun with it — but we didn’t envisage the black hole story going as far as it did. On one hand we didn’t want to engage with the scaremongers, but the particle physics community worldwide was slow to pick up and say “this is nonsense”. It’s one thing for CERN to say everything is safe, but we needed other voices, which have since surfaced.

How did you feel when you realized the full extent of the damage caused by the electrical fault on 19 September?

I was genuinely sad, and I think a lot of people at CERN felt the same. When you’ve been so intimately involved with something for such a long time, and when the start-up went so well, the incident was a huge shock. But there’s a story to tell here: mishaps like this are part of life when working at the cutting edge of technology and research.

Wouldn’t it have made more sense to test all the LHC circuits before the media event on the 10th?

I don’t think it would have been any easier to live with what happened. In fact, it may have been less easy because at least now we know that the LHC works extremely well.

So the incident wasn’t the result of pressure to switch on before the machine was ready?

No. The timetable was driven by [LHC project director] Lyn [Evans] and the machine operators. The plan was to get some collision data at low energies, then finish testing the hardware to run at higher energies. Had an electrical transformer not broken down three days after the 10th, we would have had that collision data and the incident would have happened later.

CERN’s new director general [Rolf-Dieter Heuer] told staff on 12 January that from now on people would hear about events first from him, not the press. Was there a lack of communication internally following the incident?

It wasn’t organized in a way that it needs to be now that there is such a huge demand for information. We knew the warm up and cool down would take two months minimum so we quickly put that out in a statement, although with hindsight we should have been more cautious because we soon realized there was no way the LHC was going to be back up that year. Internally, people from the machine and management side were giving talks to the experiments, but we could have used our intranet, website and the CERN Bulletin better.

Did CERN try to withhold information from the media?

There’s a great quote in a Salman Rushdie book: whenever information is tightly controlled, rumour becomes a valued source of news. That was happening at CERN. All the way through, the then director-general [Robert Aymar] genuinely wanted to put out factually accurate information as soon as it was available, but CERN probably tried too hard to keep tight control. Although we were quick off the mark with releasing official statements, there were long gaps in between. Even though there wasn’t very much to say, there was stuff that could have been said which would have capped those rumours.

Why was the LHC logbook modified retrospectively on the day of the incident?

There’s nothing sinister about it. The person on shift that morning just wrote down what had happened, then someone came along and said: “Everybody can see that, let’s take it away!” It was a wrong decision made in the heat of the moment, but it was naivety rather than anything systematic. It’s an issue we’re probably going to look at. A logbook should be somewhere people can write down whatever they feel, but that’s not necessarily something that should be visible to the whole world.

Were the long-awaited official photos of the damage chosen because they were taken after the tunnel had been cleaned up?

No. They were specifically chosen because they showed where the damage was worst, at the end of the helium-induced pressure wave. I think the rumours had led people to expect something more dramatic.

Who ordered links to photos and some presentations to be password protected after they appeared on blogs?

[Aymar] wanted the CERN community to receive the news from him before it was made more widely available, so access to slides was temporarily restricted. People just hadn’t realized how much in the spotlight we are now.

Is it true that people were being threatened with disciplinary action if they circulated pictures of the tunnel before they had been officially released?

It’s true that the former DG wanted to be the one issuing the information. Look, we have to be more effective in the way that we communicate at CERN both internally and externally. The world is watching this. We’ve created what we’ve created, there is a demand for information and we need to provide it. That’s something that the new management is very aware of.

How are the repairs going?

They’re going well, but there’s a lot to do. We need to increase the LHC’s capacity to vent helium in the event of another leak, so already on all the sectors that are warmed up (half the machine) we are changing the valves on all the quadrupole magnets and putting new ones on the dipoles.

Where’s the logic in making only half of the machine safer?

What we’re doing is about as conservative as you can get. The LHC will now be able to vent ten times as much helium as before, and on top of that we’ve got lots of extra monitoring which will allow us to see a similar electrical fault coming.

CERN has a history of overly optimistic LHC timetables. Isn’t the current schedule of first-beam in July/August rather aggressive?

I’m pretty confident there will be collisions this year. There’s a great determination, but caution is the guiding principle. There will be a meeting in Chamonix in early February after which a realistic schedule will be announced.

Does the world have the appetite for “Big Bang Day II”?

It won’t be as big as 10 September. We’re not going to be inviting anyone back for first-beam this year. Journalists are keen to see first collisions even though it may involve camping out with us for a week. The whole process will be webcast. CERN was unwilling to invest in bandwidth before the 10th so the webcast fell over very early in the morning, but we’ve since had companies offering us bandwidth in exchange for having their logo displayed.

What do you think the LHC will find and when?

Well, you should probably ask the people who are working on it.

Come on, you’ve got a PhD in physics!

We’re going to find the Higgs particle if it exists, and I can’t think of any reason why it doesn’t. But that will take a year’s worth of good data. What I would really like to see, although it would be a nightmare from a communications point of view, is a flood of supersymmetric particles as soon as we switch on. Some models say that could happen, and if so we’ll have a group of people saying “wait!” and another group saying “look, this is a signal!”

How are you going to manage information when data arrive and rumours spread?

We’ve got protocols in place for the experiments so that if they really feel they are ready to make an announcement then we move very fast and organize a seminar here very quickly. One thing you’ll see this year is that CERN’s official communications won’t be chasing the media or the blogs. We’ll be the primary source of news about CERN.

Is that realistic, given that bloggers can brain-dump a post in a matter of minutes?

I think we’ve got to try. If there’s someone who’s blogging about a “three sigma” effect [meaning there is less than 1% chance it is a statistical fluke] that’s been verified, then there’s no reason why we wouldn’t talk about it as well. But if someone is blogging about a three sigma effect in their own particular analysis which hasn’t gone through the official verification process in their experiment, then we will deny it, which may mean releasing a statement.

Are you planning to implement rules on blogging, as the CDF collaboration at Fermilab has done in response to rumours about Higgs sightings?

Yes. Some of the experiments have them already as a result of what happened there.

Isn’t that an attempt to censor information?

It’s an attempt to stop blogs fuelling rumour. Nobody wants to clamp down on people releasing information about results that have passed through official quality control in an experiment.

What’s the best question you’ve ever been asked about the LHC?

Visitors often ask first whether they can ask a really stupid question, and then come out with something profound and unanswerable that goes straight to the core of what we’re doing here.

Chilly solution to neutrino mass problem

Physicists in the US have put forward a new way of measuring one of the most important but elusive quantities in particle physics — the mass of the neutrino. The proposed experiment borrows techniques from atomic physics to lower the temperature of tritium atoms to nearly absolute zero and then study their beta decay.

A more precise measurement of neutrino mass is important because it will tell physicists what sort of theory is needed to extend the Standard Model of particle physics. A better value could also help astrophysicists work out how much of the universe’s dark matter can be accounted for by neutrinos.

The Standard Model itself assumes that neutrinos have zero mass. However, several different experiments have shown that neutrinos oscillate between three different types or “flavours” — which can only happen if neutrinos have mass. Unfortunately, oscillation measurements are only sensitive to differences between the squares of the masses of different flavours, not the absolute masses.
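To see why, it helps to write down the standard two-flavour oscillation probability (a textbook simplification, not tied to any particular experiment). A neutrino of energy $E$ that has travelled a distance $L$ changes flavour with probability

$$P(\nu_\alpha \to \nu_\beta) = \sin^2(2\theta)\,\sin^2\!\left(\frac{\Delta m^2 L}{4E}\right), \qquad \Delta m^2 = m_2^2 - m_1^2,$$

where $\theta$ is a mixing angle. The formula contains only the mass-squared difference $\Delta m^2$, so the absolute mass scale drops out entirely.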

Physicists do know that the neutrino mass is less than 2.2 eV thanks to two independent experiments involving beta decay — whereby a neutron in a tritium nucleus turns into a proton, emitting an electron and an antineutrino. Careful measurements of the energies of a large number of such electrons give an upper limit on the neutrino mass. A larger version of these experiments, called KATRIN, switches on in Germany in 2012 and should be able to further constrain the neutrino mass to 0.2 eV.
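The shape of the measured spectrum near its endpoint is what carries the mass information. In a simplified form (ignoring nuclear recoil and final-state effects, and working in units where $c = 1$), the electron energy spectrum close to the endpoint energy $E_0$ behaves as

$$\frac{\mathrm{d}N}{\mathrm{d}E} \propto (E_0 - E)\sqrt{(E_0 - E)^2 - m_\nu^2},$$

so a non-zero neutrino mass $m_\nu$ both pulls the cut-off down to $E_0 - m_\nu$ and distorts the spectrum just below it. Because only a tiny fraction of decays land this close to the endpoint, enormous electron statistics are needed, which is why these are such demanding experiments.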

Other experiments, meanwhile, will measure neutrino mass by studying a very rare form of beta decay in which two neutrons inside a nucleus transform into protons at the same time, emitting two electrons but no neutrinos. The half-life of this process depends on the value of neutrino mass. Neutrino mass can also be deduced from measurements of the cosmic microwave background — the current upper limit is 1 eV — but this result depends on certain assumptions about the evolution of the universe.

Measuring motions of various particles

Now, a team including Mark Raizen at the University of Texas at Austin and Joshua Klein at the University of Pennsylvania has proposed a new way of making extremely precise measurements of the motions of the various particles involved in the beta decay of tritium.

To ensure that the system is initially as close to absolute standstill as possible, this proposal involves cooling a gas of tritium atoms to within a few millionths of a degree of absolute zero. Raizen’s group is currently working out how to cool tritium and other hydrogen isotopes by slowing down beams of these atoms using magnets and then gradually transferring their remaining momentum to individual photons. They then propose two alternative ways of measuring the neutrino mass (arXiv:0901.3111).

The first of these involves studying what is known as the “bound-state” beta decay of tritium. In this process, which has not yet been observed, the emitted electron becomes bound into the helium atom rather than escaping. This makes the decay a straightforward two-body process, in which the energy of the emitted neutrino is equal to the energy difference between the initial tritium atom and the daughter helium atom. The neutrino mass can be calculated simply by measuring the velocity of the nuclear recoil, which can in principle be done by detecting the time it takes for the helium atom to arrive at a detector placed some known distance from the tritium source.
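In this two-body picture the kinematics are especially clean. Schematically (in units where $c = 1$, with $\Delta$ the energy difference between the initial tritium atom and the daughter helium atom), momentum conservation forces the helium atom and the neutrino to recoil back-to-back with equal momentum, so

$$m_\nu^2 = E_\nu^2 - p_\nu^2 = (\Delta - T_{\mathrm{He}})^2 - (M_{\mathrm{He}}\,v)^2, \qquad T_{\mathrm{He}} = \tfrac{1}{2} M_{\mathrm{He}} v^2,$$

and a single time-of-flight measurement of the recoil velocity $v$ fixes both the neutrino’s energy and its momentum, and hence its mass.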

The second method studies the three-body beta decay of tritium and involves using various detectors to work out the momenta of the daughter helium ion and the emitted electron. The neutrino mass is then calculated directly from these quantities.
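Schematically, the neutrino’s energy and momentum are simply whatever is missing from the measured products. Since the cold parent atom is essentially at rest (again in units where $c = 1$, with $M_{\mathrm{T}}$ the tritium rest mass and $E_{\mathrm{He}}$, $E_e$ the total energies of the helium ion and electron),

$$\vec p_\nu = -(\vec p_{\mathrm{He}} + \vec p_e), \qquad E_\nu = M_{\mathrm{T}} - E_{\mathrm{He}} - E_e, \qquad m_\nu^2 = E_\nu^2 - |\vec p_\nu|^2,$$

so each fully reconstructed decay gives an independent estimate of $m_\nu$.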

Maximizing the number of decays

Both of these techniques rely on maximizing the number of decays in order to achieve as high a sensitivity as possible. Raizen and co-workers have worked out that by trapping some 10¹³ tritium atoms — the maximum that can feasibly be attained over the course of about a year — they would reach a somewhat disappointing upper limit on the neutrino mass of around 9 eV with the first approach, but could get down to roughly the limit possible with KATRIN — 0.2 eV — with the second.

These figures, however, were generated using a computer simulation. Whether the 0.2 eV limit could be reproduced in an actual experiment — which the researchers hope to carry out within the next decade — depends on overcoming several significant engineering challenges, not least of which is finding out how many atoms can in fact be trapped.

KATRIN project leader Guido Drexlin says that the approach of Raizen’s team is, in principle, very interesting. But he emphasizes that much work must be done to put the idea into practice. “Nature does not easily reveal the secrets of the neutrino mass,” he adds. “The name of the game in direct neutrino mass sensitivity is always the same: statistics, statistics, statistics, and carefully control all your systematics.”

Happy Chinese New Year

Lanterns in Kota Kinabalu, East Malaysia

by James Dacey

Last Monday was dubbed Blue Monday after “official” calculations deemed it to be the most depressing day of the year. Thankfully, this Monday, things are a lot more celebratory; the colour red takes centre stage as more than a billion people across the globe celebrate Chinese New Year.

Physics World would like to extend you all a warm welcome to the year of the Ox!

It struck me today that this year’s celebrations have fallen especially close to the Gregorian New Year. In my ignorance I’ve only just realised that the date changes each year – but how and why?

Well, if you were as in the dark as I was, check out this short video by Xinhua, a Chinese Government news agency. It gives a nice overview of key dates in the Chinese New Year Calendar.

And this year’s festivities seem to be in full flow already. According to Xinhua, Beijing last night was covered in 68 tonnes of firework debris.

UN Secretary-General Ban Ki-moon on Friday sent a message in Chinese, which read: “Happy New Year to the Chinese people and all the ethnic Chinese all over the world.”

One more slightly interesting fact for you: 2009 is the year of the Ox — the “brave leader” — and famous “oxen” include Barack Obama…

… but I’ll leave it there because this is rapidly slipping away from physics!

China to lure more scientists from abroad

China has announced a new five-year plan to attract more scientists to the country. The Chinese Academy of Sciences (CAS) says that it will pay for “thousands” of overseas scholars and scientists to come and work in China over the next five years.

The country hopes to scoop 1500 “leading” scientists, accelerating the “100 People” plan, begun in the mid-1990s, which sought to attract 100 top overseas scientists each year. In selecting scientists, “practical contributions” will be considered over academic achievements, according to a statement on the CAS website.

Long-term plan

This new drive by the CAS follows the release of new government guidelines that call on state enterprises and academic institutions to attract more overseas scientists, especially those on the cutting edge of science and technology.

“With this new talent project, China expects to break technological bottle-necks and enhance its research abilities and sci-tech levels in the least time,” said the CAS statement.

Over the past decade, the Chinese government has doubled the percentage of its GNP that it spends on research and development, representing a total investment of $140bn. This has led to a large increase in the number of articles by Chinese researchers in academic journals across the globe.

During this time, China has tended to focus investment on specific areas of science where practical applications are likely — such as nanoscience, where it is second only to the US in terms of number of published papers.

China plans to double investment again by 2020 and the latest projects will continue to be “in line with the national strategic developments” by concentrating on “key technology, sci-tech industries and new emerging subjects.”

Despite this scientific boom, one concern is that many young Chinese researchers, after training in China, then leave to apply their skills in the US and Europe.

One of the aims of the 100 People programme has been to attract Chinese scientists home after developing skills in the West. According to the academy, 81% of CAS academicians and 54% of CAE (Chinese Academy of Engineering) academicians are now returnees, and 72% of leaders of the National Key Projects are scientists returned from abroad.

Muons reveal upper atmosphere’s temperature

Scientists interested in the upper atmosphere should turn their attention to measurements made deep underground — says an international team of physicists who have noticed that the number of cosmic-ray remnants hitting Earth is linked to freak warming events in the upper atmosphere. The link implies that measurements of cosmic rays — both future and past — could help scientists improve climate and weather forecast models.

Cosmic rays are mostly high-energy protons that constantly bombard atoms in Earth’s atmosphere to create pions. These pions either decay into lighter muons or interact with nearby atoms before they can decay. If the atmosphere is cool and dense, the chance of further interactions is much higher, so far fewer muons are generated than when the atmosphere is warmer and more rarefied.

As a result, underground experiments that detect these characteristic muon flashes tend to see more events in summer than in winter. But now a team of researchers has found that muons can also reveal big changes in atmospheric temperature on very short timescales.
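The quantity analysts typically extract from such data is a dimensionless coefficient relating fractional changes in muon rate to fractional changes in an “effective” atmospheric temperature. The sketch below shows the idea in Python; it is a minimal illustration, not the collaboration’s analysis code, and the file names and single-slope model are assumptions for the example.

```python
# Minimal sketch: regress fractional muon-rate deviations against fractional
# deviations of an effective stratospheric temperature to extract alpha_T.
# The input files and their one-column layout are hypothetical.
import numpy as np

rate = np.loadtxt("daily_muon_rate.txt")   # muons per day (hypothetical file)
t_eff = np.loadtxt("daily_t_eff.txt")      # effective temperature in K (hypothetical)

d_rate = (rate - rate.mean()) / rate.mean()     # fractional rate deviation
d_temp = (t_eff - t_eff.mean()) / t_eff.mean()  # fractional temperature deviation

# Fit d_rate ~ alpha_T * d_temp + const; np.polyfit returns [slope, intercept]
alpha_T, intercept = np.polyfit(d_temp, d_rate, 1)
print(f"alpha_T = {alpha_T:.2f}")
# Deep detectors see alpha_T close to 1: a 1% rise in effective temperature
# gives roughly a 1% rise in the underground muon rate.
```

In such a picture, a sudden stratospheric warming would show up as a short-lived spike in the temperature series matched by a spike in the muon rate.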

Lurking in the background

The collaboration — which is led by Scott Osprey and others at the National Centre for Atmospheric Science (NCAS) in the UK — began considering this possibility while studying the background particles reaching the underground AMANDA neutrino experiment at the South Pole. In data recorded in 2002, when the upper atmosphere in the southern hemisphere was warmer than usual, they noticed a rise in the number of muons.

“They [the AMANDA team] didn’t know this was in their data until we showed them,” says Giles Barr, an Oxford University particle physicist who is part of the NCAS collaboration.

To confirm that the effect was real, the collaboration turned to their own muon data, taken in the northern hemisphere from the MINOS neutrino experiment at a disused iron mine in Minnesota, US. “It’s much easier to know what systematic detector effects need checking in your own data,” adds Barr. Again, they saw the correlation.

The most surprising feature of the correlations was that they could even flag spikes in high-altitude temperature, sometimes of as much as 40 degrees, taking place over just a few days. These events, known as sudden stratospheric warmings, are thought to occur owing to the passage of huge atmospheric “Rossby” waves, which shift heat from the tropics to the poles. In the past, meteorologists had only satellites and weather balloons to study sudden stratospheric warming — now, claims the NCAS collaboration, they could use cosmic-ray measurements too.

“They are important to study not only because they represent the most striking display of variability in the stratosphere, but also because of their links with weather occurring [lower down] in the troposphere, where our weather takes place,” explains Osprey, also at Oxford University.

More data needed

Serious measurements of cosmic rays go back more than 50 years, way before satellites began examining the atmosphere. This means that the cosmic-ray measurements could be used to corroborate past balloon data and perhaps give meteorologists a better handle on how the climate has evolved. Also, says Osprey, it is possible that cosmic rays will turn out to have niche applications in weather monitoring.

“Historical records of muon rate might be interesting — certainly the stratosphere was poorly observed prior to the 1960s,” says Peter Haynes, an atmospheric scientist at the University of Cambridge who is not involved with the research. “The muon rate seems to provide an absolute measurement of temperature — that is, there is no bias problem.”

However, Haynes points out that the muons reflect atmospheric temperature at a single location, which may make it difficult to see how the climate is changing as a whole. There are only a handful of other muon detectors in the world, and it would be hard to justify building more for weather purposes alone.

Nevertheless, Barr thinks that it is an exciting discovery: “Lots of things affect cosmic rays on their journey through the galaxy, into the solar system and then into the atmosphere. Historically, cosmic rays have been notoriously hard to work with, so to understand how to use them to measure something is very satisfying.”

The research will be reported in an upcoming issue of the journal Geophysical Research Letters.

Atoms teleport information over long distance

Experimental set-up

Physicists have teleported quantum information between two atoms separated by a significant distance, for the first time. Until now this feat had only been achieved between photons, and between two nearby atoms through the intermediary action of a third. According to researchers, this advance could be a significant milestone in the quest for a workable quantum computer.

Quantum teleportation is a remarkable form of transport only available to particles at the atomic and subatomic scales. Information such as the spin of a particle or the polarization of a photon can be transferred between particles without travelling across a physical medium. Teleportation is made possible by the feature of quantum mechanics known as “entanglement”.

According to quantum mechanics, when particles become entangled, the very act of measuring the quantum state of one particle instantly reveals information about the state of the second. In theory, this effect should occur regardless of the distance between the particles. In practice, it is very difficult to observe because of external influences: if the particles interact uncontrollably with the environment, or if you try to measure the two quantum states directly, the entanglement vanishes.

Now researchers at the University of Maryland and the University of Michigan have successfully teleported quantum information between two ytterbium ions separated by 1 m, reporting a 90% success rate. They employ a new method of teleportation where ions are stimulated to emit photons and the quantum states are inferred from the colour of these emissions (Science 323 486).

“Our system has the potential to form the basis for a large-scale ‘quantum repeater’ that can network quantum memories over vast distances,” said group leader Christopher Monroe of the University of Maryland.

Double entanglement

In quantum teleportation the sender (Alice) instantaneously transfers the quantum state of a particle to a receiver (Bob). In 1997 physicists achieved teleportation of quantum states between photons for the first time. Their methods exploited the uncertainty principle: Alice could not know the exact state of her photon, but the effect of entanglement meant that she could still teleport that state to Bob.

Then in 2004 separate teams of physicists at the National Institute of Standards and Technology (NIST) in Colorado and the University of Innsbruck in Austria demonstrated teleportation at the atomic scale for the first time. Using slightly different methods, they transferred spin states between pairs of ions trapped in a harmonic potential. Unfortunately, teleportation using these methods is restricted to very short distances, because both ions must be held in the same microscopic trap.

Now Monroe and his team have taken teleportation in a different direction. They first isolate the ytterbium ions in separate vacuum traps surrounded by electromagnetic fields. Each ion — in its ground state — is then irradiated with a microwave burst, which puts it into a superposition of two different quantum states. Next, a short laser pulse excites each ion, which then emits a photon whose colour is a superposition of red and blue — linked with the two available quantum states.

The red photon or the blue one?

Once generated, these photons are then directed towards a beamsplitter where they have an equal chance of passing through or being reflected. There is a detector on either side of the splitter. According to the researchers, a red-blue combination detected at exactly the same time is a clear sign that the ytterbium atoms are entangled. When this occurs the researchers immediately return to the ions where they determine the quantum states using a process known as quantum tomography.
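Schematically, in a simplified two-level picture (not the full ytterbium level scheme used in the experiment), each laser pulse leaves an ion entangled with the colour of its emitted photon,

$$|\psi\rangle_{\mathrm{ion+photon}} = \tfrac{1}{\sqrt{2}}\big(|{\uparrow}\rangle\,|\mathrm{blue}\rangle + |{\downarrow}\rangle\,|\mathrm{red}\rangle\big),$$

and because the beamsplitter erases any record of which ion produced which photon, a simultaneous red-blue coincidence projects the two distant ions onto an entangled state of the form $\tfrac{1}{\sqrt{2}}\big(|{\uparrow}\rangle|{\downarrow}\rangle - |{\downarrow}\rangle|{\uparrow}\rangle\big)$.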

“One particularly attractive aspect of our method is that it combines the unique advantages of both photons and atoms,” said Monroe. “Photons are ideal for transferring information fast over long distances, whereas atoms offer a valuable medium for long-lived quantum memory.”

The next step for this research is to further improve the success rate of the measurements. “We are looking into putting an optical cavity around each atom — which could yield orders of magnitude improvement in the success rate of the system,” said Steven Olmschenk, a member of the research team.

Boris Blinov at the University of Washington told physicsworld.com: “Olmschenk and his colleagues have generated an entanglement and used it to teleport quantum data in what I consider the most promising ‘qubit’ [quantum bit] candidate yet — the trapped-ion system.” He added, “We are one major step closer to our elusive goal.”

Chu sworn in as DOE head

The Nobel-prize-winning physicist Steven Chu has been sworn in as secretary of the US Department of Energy (DOE) — a post that makes him a member of President Barack Obama’s cabinet.

Chu is the first working scientist to head the Department of Energy — which is a major source of physics research funding — since it was created in 1977. He is an expert on energy policy and well matched to the challenges he will face. In his previous role as director of the Lawrence Berkeley National Laboratory in California, he refashioned the lab to focus on alternative-energy research and is a passionate advocate of biofuels.

Meanwhile, Obama has named physicist and long-time arms reduction advocate John Holdren as his science advisor. Holdren will be an “assistant to the president”, which should see him attending Cabinet meetings.

More money for physics?

Elsewhere in Washington, members of the Democratic Party in the US House of Representatives have unveiled an $800bn bill to stimulate the US economy. It includes an extra $3bn this year for the National Science Foundation — half its 2008 total of $6bn. Similarly, the Department of Energy’s science office, which received $4bn last financial year, stands to gain an extra $2bn, including $400m to start up an Advanced Research Projects Agency for Energy, modelled on an existing network for defence technologies.

The National Institute of Standards and Technology does even better, with $520m extra; its FY 2008 budget was $737m. NASA, meanwhile, whose budget totalled $17.1bn last year, would receive $600m, $400m of which would go towards science projects.

While some of this money could find its way to US physicists, how much of this two-year package will go on new research projects as opposed to infrastructure — or even if the bill will be passed at all — is not yet clear.

Aspiring physicists should rock

Astrophysicist Brian May shows how it’s done

By Hamish Johnston

…or dance or act, if they want to succeed.

If you are a British teenager aspiring to a career in physics, you could be better off at a “School of Rock” or “Fame Academy” than a school that specializes in science — at least according to a study by researchers at the University of Buckingham.

The work, which was reported today by the BBC, reveals that students who attend schools that focus on the arts do better on physics exams than those at schools that were set up to encourage the sciences.

In 2007, for example, about 24% of students at specialist science schools who sat the A-level physics exam achieved an A grade. Compare this with the 36% of pupils who achieved a physics A grade at music schools.

This is a big difference — but you must keep in mind that 124 science schools were polled, whereas only seven music schools were looked at — so I’m not sure of the statistical significance of the 36% figure.
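As a rough illustration of that caveat, here is a back-of-envelope error estimate in Python. The study gives school counts but not pupil counts, so the 20-candidates-per-school figure below is purely an assumption for the sake of the example.

```python
# Back-of-envelope standard errors for the quoted A-grade percentages.
# Candidate numbers per school are NOT reported; 20 per school is assumed.
from math import sqrt

def prop_se(p: float, n: int) -> float:
    """Standard error of a sample proportion p estimated from n candidates."""
    return sqrt(p * (1.0 - p) / n)

n_music = 7 * 20      # assumed: 7 music schools, ~20 physics entrants each
n_science = 124 * 20  # assumed: 124 science schools, ~20 entrants each

print(f"music:   36% +/- {100 * prop_se(0.36, n_music):.1f}%")
print(f"science: 24% +/- {100 * prop_se(0.24, n_science):.1f}%")
# With only ~140 music-school candidates, the 36% figure carries an
# uncertainty of several percentage points, and clustering by school
# would widen it further.
```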

When the team looked at 34 schools that specialized in languages, they found that 26% of students bagged an A in physics. Meanwhile, aspiring physicists enrolled at specialist maths and computing schools managed 24%.

The study focussed on physics and didn’t look at other science exams such as chemistry or biology.

Why do students at music schools do better?

Micromotor could navigate human bloodstream

Researchers in Australia have built an electric motor just 250 µm wide that could be used to power tiny robots narrow enough to be injected into the human bloodstream, making new kinds of surgery possible.

The motors work by converting the vibrations of a piezoelectric material into rotary motion that could then be used to drive whip-like structures called flagella — mimicking how some bacteria and other micro-organisms swim.

The team claims that their piezoelectric motor is the first such device to be smaller than 1 mm and — with some improvements — could be powerful enough to drive a robot against the flow in the human bloodstream.

Propulsion is a challenge

Modern vascular surgery often involves inserting a very thin tube — or catheter — into a blood vessel in order to remove a blockage or repair damage. While this technique is often much safer than cutting open a patient, it is sometimes not possible to perform because blood vessels can be too narrow or too labyrinthine to navigate using a catheter.

Some researchers believe that surgery could be made even less invasive by using tiny, self-propelled robots that could be injected into a patient and controlled remotely by a surgeon. One challenge facing designers of such devices is how to propel them through the bloodstream to the right place in the body.

The new motor, built by James Friend and colleagues at Monash University, could do the trick. It comprises a tubular “stator” with a helical slit cut in it and mounted on a piezoelectric material (J. Micromech. Microeng. 19 022001). When an alternating voltage is applied to the material, it vibrates at about 660 kHz. This causes the stator to act like a whip, with its free end following an elliptical path. The free end of the stator is in frictional contact with a rotor, which is spun around by being whipped by the stator.

While this design is not new — much larger piezoelectric rotational motors were first developed in the 1980s — Friend told physicsworld.com that the Monash design is much simpler than existing motors, making it easier to scale down to sub-millimetre dimensions.

Nearly enough power

The team was able to operate the motor at 1295 revolutions per minute with a torque of 13 nN m — corresponding to a swimming power of about 4 μW. This power could then be used to propel the motor through a fluid by attaching a whip-like structure called a flagellum to the rotor.
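For context, the mechanical power a rotary motor delivers follows from its torque and angular speed,

$$P = \tau\,\omega,$$

with $\omega$ in radians per second (1295 rpm is about 136 rad/s). Plugging in the quoted torque gives an output in the microwatt range, which is the scale relevant for swimming at these dimensions.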

Calculations done by the team suggest that the motor can only deliver about one-fifth of the power needed to drive a tiny robot against the flow in a small human artery — however, the team are hopeful that the power could be boosted in the future.

Friend added that he hopes that tiny piezoelectric motors could be commercially available by 2020.

A video of the micromotor can be seen here.
