
Is entanglement always good for quantum computers?

The entanglement of quantum bits (or qubits) is what should allow quantum computers to perform certain calculations much faster than the computers we use today. But now, physicists in Germany and Canada are saying that most quantum states could be “too entangled” to be of any use in quantum computers.

At the heart of any quantum computer are qubits that are entangled — which means that they have a relationship much stronger than anything allowed in classical physics. In qubits that are photons, for example, “1” and “0” could be represented by two different polarization states. If two photons are entangled, a measurement of the polarization of one of the photons would reveal the polarization of the other — no matter how far apart the photons are.

It is this phenomenon that can be used to perform certain calculations much faster than is possible with conventional computers.

Is more entanglement better?

The conventional wisdom is that the greater the entanglement, the better — but an important question facing anyone trying to build a quantum computer is whether any entangled state could be used to perform quantum calculations.

For most states, no such trick exists David Gross, University of Braunschweig

If it doesn’t matter, one could choose the entangled state that is technologically easiest to work with to create a computer. But if only a few suitable states exist, the challenge becomes how to realize these specific states in a given physical system.

Any useful quantum computation must ultimately involve measuring the values of the quantum states. However, if one measures the state of an individual object, the statistical nature of quantum mechanics means that the result will be random — and completely useless for doing quantum computations.

Correlations are key?

All is not lost, however, because the outcomes of measurements on several entangled objects are correlated — and it is these correlations that could be used to do calculations. Physicists have proposed specific schemes for measurement-based quantum computation (MBQC) that exploit such correlations. However, it is not clear whether there exists a universal approach that would work with any system in an entangled state.
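To make the distinction concrete, here is a toy numerical sketch (our illustration, not taken from any specific MBQC proposal): for a pair of photons prepared in the maximally entangled state (|00⟩ + |11⟩)/√2 and measured in the same polarization basis, each individual outcome is an unbiased coin flip, yet the two outcomes always agree.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model of a maximally entangled photon pair, (|00> + |11>)/sqrt(2),
# measured in the same polarization basis: each outcome is individually
# random, but the pair of outcomes is perfectly correlated.
n_pairs = 10_000
outcomes_a = rng.integers(0, 2, n_pairs)  # photon A: random 0 or 1
outcomes_b = outcomes_a.copy()            # photon B: always matches A

print("P(A = 1):", outcomes_a.mean())                  # ~0.5, individually useless
print("P(A = B):", (outcomes_a == outcomes_b).mean())  # 1.0, the usable correlation
```

It is correlations of this kind, not the individual random readings, that measurement-based schemes try to harness.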

Now, David Gross of the University of Braunschweig and colleagues at the University of Potsdam and the Perimeter Institute have shown that “for most states, no such trick exists”. The team came to this conclusion by studying a general system of highly-entangled qubits used to perform a certain mathematical calculation. They were able to prove that the number of states that could actually be used to perform the calculation was incredibly small — meaning that entanglement would not speed up the calculation at all (arXiv:0810.4331).

This result is sure to disappoint physicists who are trying to create real-life entangled systems in the lab because it suggests that most states can never be used to perform quantum calculations — no matter how hard they try.

The good news, according to Gross, is that certain special systems — some of which have already been discovered — can still fulfil the requirements of universality. The key to quantum computing success could therefore be to first identify these special systems, and then try to realize them in the lab.

Einstein and Eddington film

By Margaret Harris

Einstein and Eddington
Andy Serkis as Einstein and David Tennant as Eddington

Albert Einstein is certainly the most famous scientist of the 20th century, and probably one of the most important in all of human history. So great is Einstein’s reputation that it makes that of Arthur Stanley Eddington — a fine observational astronomer and a gifted popularizer of science — seem like footnote fodder. Yet without Eddington’s 1919 eclipse expedition, which provided early proof of general relativity, Einstein’s discoveries might have languished for years before becoming known outside the German-speaking scientific community, let alone amongst the general public.

The connections between Einstein and Eddington are the subject of a new film from the BBC, starring David Tennant of Doctor Who fame as a troubled, repressed Eddington and Andy Serkis (best known as the model for Gollum in the Lord of the Rings films) as a flawed but likeable Einstein. Einstein and Eddington airs on BBC2 on 22 November at 21:10 and is well worth a watch — if mostly for the human drama, rather than the scientific content.


Excess of electrons could point to dark matter

An experiment that dangled from a balloon high above the Antarctic may have found the most convincing evidence yet that dark-matter particles are annihilating within our own galaxy. Indeed, Earth could even be whizzing through a clump of annihilating dark matter right now.

The Advanced Thin Ionization Calorimeter (ATIC) detected an excess of high-energy cosmic ray electrons in the 300–800 GeV range — an unexpected feature that could be caused by the annihilation of weakly interacting massive particles (WIMPs) — one of the leading candidates for dark matter. If confirmed, the signal could be the first direct detection of dark matter and help physicists understand the nature and origin of this mysterious stuff that appears to make up about 22% of the mass of the universe.

The discovery comes hot on the heels of a similar report of an excess of cosmic-ray positrons by the PAMELA experiment, which could also be a signature of WIMP annihilation.

In either case, we believe this is an unexpected and exciting result John Wefel, principal investigator for ATIC

According to the international team of scientists running ATIC, the excess of cosmic electrons cannot be explained by the standard model of cosmic ray origin and propagation in the galaxy. Instead, it suggests the existence of a hitherto unknown and nearby source of high-energy electrons (Nature 456 362).

WIMPs or a new astronomical object?

“We have possibly either the first detection of a nearby source of particle acceleration from an as yet unstudied object, or the detection of the signature of predicted dark matter particle annihilation,” explains John Wefel at Louisiana State University in the US. Wefel, who is principal investigator for the ATIC experiment, added: “In either case, we believe this is an unexpected and exciting result.”

Indeed, the team was not searching specifically for dark matter — ATIC was launched in 2002 to gain a better understanding of the very high-energy cosmic-ray electron spectrum.

The researchers had expected to see electrons from unknown sources outside our galaxy, as well as cosmic-ray electrons produced closer to Earth in the interstellar medium by the interactions of cosmic-ray protons and helium nuclei with interstellar gas. Instead, they found something very different.

Mysterious ‘bump’

Their measured spectrum followed the predicted spectrum up to about 100 GeV but then began to rise, producing a distinct ‘bump’ between about 300 and 800 GeV that far exceeded the predicted flux of electrons, before dropping off cliff-like to return to the predicted spectrum above 800 GeV.
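As a rough illustration of the shape being described (a toy model with assumed numbers, not ATIC’s actual fit): take a smooth power-law background and add an excess that peaks at roughly three times the predicted flux near 600 GeV before being cut off above 800 GeV.

```python
import numpy as np

# Illustrative toy model only: the spectral index (-3.2) and the Gaussian
# "bump" parameters are assumptions chosen to mimic the described shape,
# not values taken from the ATIC analysis.
energies = np.array([100.0, 300.0, 600.0, 800.0, 1000.0])      # GeV
background = energies ** -3.2                                   # assumed smooth power law
bump = 1 + 2 * np.exp(-((energies - 600.0) / 150.0) ** 2)       # peaks at ~3x near 600 GeV
measured = background * np.where(energies <= 800.0, bump, 1.0)  # drops back above 800 GeV

for energy, ratio in zip(energies, measured / background):
    print(f"{energy:6.0f} GeV: measured/predicted = {ratio:.2f}")
```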

What is unique is that it is the first time a feature of this kind has been discovered in the cosmic ray electron spectrum James Adams, NASA

“This bump was not expected and at its peak, the excess flux of electrons is about three times above the predicted flux,” says James Adams, a member of the team at NASA’s Marshall Space Flight Center in Alabama. “What is unique is that it is the first time a feature of this kind has been discovered in the cosmic ray electron spectrum and it cannot be explained by the standard model of cosmic ray origin and propagation in the galaxy.”

One possible explanation may be that the bump is due to a strong nearby cosmic ray electron source, such as a supernova remnant, a pulsar wind nebula or an intermediate-mass black hole. However, no suitable nearby object is known to exist.

A more intriguing alternative explanation is that the excess is caused by the annihilation of dark-matter particles. Scientists now believe dark matter makes up the bulk of all matter in the universe, outnumbering normal matter by about 5 to 1. However, its nature remains a mystery. It is fundamentally different from normal “luminous” matter such as stars: it is invisible to modern telescopes, giving off no light or heat, and interacts only through gravity, making it difficult to detect.

Physicists believe dark matter particles such as WIMPs can collide and annihilate each other, producing electron-positron pairs and emitting tell-tale electron patterns that could be detected by space-based telescopes.

Kaluza-Klein a likely candidate

“There is such a candidate dark matter particle, the Kaluza-Klein particle,” explains Adams. “The electron spectrum from the annihilation of Kaluza-Klein particles fits the measured bump well, if we assume that there is a local clump of Kaluza-Klein particles with a density about 200 times the galactic average.”

Other cosmologists are excited by the finding. “The possibility that we may be seeing the products of dark matter annihilation within a short distance from the Sun is very exciting,” says Piero Madau at the University of California, Santa Cruz.

The possible dark matter interpretation is exciting, but it may be quite some time before the smoke clears on this and a consensus emerges Stephane Coutu, Pennsylvania State University

Others are more cautious. “The electron excess they observe is intriguing as cosmic electrons must come from relatively local sources, but what these sources are is very speculative,” says Stephane Coutu at Pennsylvania State University. “The possible dark matter interpretation is exciting, but it may be quite some time before the smoke clears on this and a consensus emerges.”

Meanwhile, there are other hints of interesting signals, for instance in the recent cosmic positron measurements by the PAMELA satellite experiment. While it is not yet clear whether the two results are fully consistent with each other, both seem to indicate either “new physics” or “new astrophysics” in the 100–500 GeV region of the cosmic-ray spectrum.

“We will have to see what other instruments observe, such as the Fermi satellite project now in operation to detect energetic cosmic gamma rays,” says Coutu. “Only through a convergence of many separate independent hints will the interpretation crystallize.”

New accelerator enhances radiotherapy accuracy

A new linear accelerator for calibrating the energy doses delivered during radiotherapy has been unveiled at the National Physical Laboratory (NPL) in the UK.

Costing £1.5m, the new accelerator (or linac) will be used to calibrate radiation dosimetry equipment used in hospitals in the UK and Ireland. The linac replaces an accelerator installed in 1974, and will allow much quicker and more accurate equipment calibration than was possible before.

Quality assurance plays a fundamental role in radiation treatment of cancer: while modern techniques offer the ability to deliver precise doses of radiation to tumour tissue, this advantage is lost if the equipment is not stable and accurate. Regular and precise calibration of radiotherapy apparatus is thus an essential procedure for hospitals.

The new clinical linac — officially launched at NPL last week — is scheduled to provide its first calibration services early next year. The custom-designed system will enable NPL to calibrate the full range of energies currently in therapeutic use in the UK, as well as characterize newer techniques such as intensity-modulated radiotherapy (IMRT) and image-guided treatments.

“I am absolutely convinced that this facility will play an important part in ensuring radiotherapy in this country is of the highest quality and contributes to the overall fight against cancer,” Mike Richards, national clinical director for cancer at the UK’s Department of Health, told the assembled crowd at the formal opening ceremony.

Absolute standard

As the UK’s National Measurement Institute, NPL is tasked with developing, maintaining and disseminating the UK’s primary standards of absorbed dose (as well as locking these into international standards). Radiotherapy facilities throughout the UK and Ireland (and potentially further afield) can then send their secondary standards — such as ionization chambers or other newer dosimetry devices — for calibration against the primary standards.
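To see how that calibration chain is used in practice, here is a generic, hedged sketch of reference dosimetry (the numbers are invented for illustration and the exact formalism used by any given hospital will differ): the hospital’s ionization-chamber reading is converted to absorbed dose using the calibration coefficient supplied by the standards laboratory, plus a beam-quality correction.

```python
# Generic illustration of reference dosimetry with a calibrated ionization
# chamber; all numbers are invented examples, not NPL or hospital data.
reading_nC = 15.20   # corrected chamber reading for a reference irradiation (nC)
N_D_w = 0.054        # calibration coefficient from the standards lab (Gy per nC)
k_Q = 0.991          # beam-quality correction factor for the clinical beam

dose_Gy = reading_nC * N_D_w * k_Q
print(f"Absorbed dose to water: {dose_Gy:.3f} Gy")
```

If the chamber’s calibration drifts, every dose computed this way drifts with it, which is why regular recalibration against the primary standard matters.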

Such a service is not a new venture for NPL, which has been offering radiation dosimetry calibration since 1969. However, the existing linac was installed in 1974 (and was second-hand then) and is reaching the end of its operational life. In addition, while the old linac offered calibration using beam qualities “similar to those used for photon radiotherapy”, the new system will provide calibration using identical set-ups to those employed clinically to irradiate cancer patients.

The machine in question is an Elekta Synergy digital linac, manufactured not too far from NPL at Elekta’s Crawley facility. The Swedish medical technology vendor designed a one-off system for NPL that can deliver seven X-ray energies (as opposed to the two or three usually used in hospital linacs), plus up to ten electron-beam energies.

“Instead of spending two to three hours coaxing beams out of a 40-year-old linac, we can now just press a button and a stable beam comes out,” said Martyn Sené, interim managing director of NPL. “This streamlined quality assurance maximizes the availability of radiotherapy facilities without compromising treatment quality.”

Crucially, the Elekta Synergy enables dosimetry of the small fields and composite fields employed in advanced modalities such as stereotactic radiosurgery and IMRT. The machine also comes with a range of image-guidance facilities, including iViewGT portal imaging and Synergy XVI 3D X-ray volumetric imaging, and NPL plans to support research into new procedures for using such imaging capabilities more accurately.

Work in progress

NPL researchers are now pulling out the stops to test and commission the new equipment. Current work includes performing Monte Carlo modelling of beams from the old and new linacs to compare the standards operating in the two facilities. To date, three X-ray energies have been characterized (6, 10 and 15 MV photon beams), plus nine electron beams, enabling NPL to deliver its first calibration services early next year.

The first stage, scheduled for spring 2009, will be the transfer of existing reference dosimetry services (using the abovementioned X-ray beams) into the new facility. Further down the line, NPL will commission the remaining beams (4, 8, 18 and 25 MV) around June of next year and begin offering calibration of small and composite fields. The new facility will also be employed for training in dosimetry techniques, with courses scheduled to start next summer.

As well as offering calibration services, NPL is putting a strong emphasis on the R&D capabilities offered by the advanced linac. “We plan to exploit the new features on this facility to support radiotherapy developments,” said Sené. “We want to ensure that all radiotherapy facilities can take advantage of the latest technologies such as IMRT or IGRT that enable delivery of dose exactly where it’s needed.”

Having been the first to come up with ideas for tomotherapy standards, NPL is focusing much research effort on developing new techniques and recommendations for reference dosimetry of non-standard fields. “We are building a calorimeter to perform absorbed-dose calorimetry on small-field treatments like IMRT, tomotherapy and the Gamma Knife,” said Mark Bailey, senior research scientist at NPL.

UK schools to get 1000 telescopes

More UK pupils could soon be peering through telescopes (Courtesy: RAS).

By Hamish Johnston
In 1609 the Tuscan polymath Galileo Galilei was the first astronomer to point a telescope skywards. He went on to discover sunspots, mountains on the Moon and four of the moons of Jupiter.

To mark this milestone in the development of modern science, the United Nations has declared 2009 the International Year of Astronomy.

Now, to celebrate the 400th anniversary of telescope-based astronomy, 1000 secondary schools in the UK will be given telescopes — paid for by the Society for Popular Astronomy, the Royal Astronomical Society and the UK science-research funding body STFC.


Nanorotors move together

Researchers in China and the UK have made a new type of nanometre-sized rotor with an off-centre axis of rotation. The researchers have also made arrays of the devices that spread over distances as large as micrometres.

The individual rotors in the arrays work in concert, something that the team believes will be crucial for making molecular machines. Such machines could be used as tiny autonomous “nanorobots” in the future that would perform a wide range of tasks, such as assembling electronic circuits or delivering drugs to specific parts of the body.

Easier to control

Although physicists have been able to make molecular rotors, previous devices had no fixed axis and were thus not easy to control, which made it difficult to integrate them into real working devices. Using scanning tunnelling microscopy, Hongjun Gao and colleagues of the University of Liverpool, UK, and the Chinese Academy of Sciences have now shown that single tetra-tert-butyl zinc phthalocyanine molecules on gold surfaces have a well-defined rotation axis fixed on the surface (Phys. Rev. Lett. 101 197209). “The result is an important milestone in making practical single molecule devices,” Gao told physicsworld.com.

The researchers made the device by evaporating molecules of tetra-tert-butyl zinc phthalocyanine onto a gold crystal. The molecules adsorb by attaching to a single gold atom off centre, at the position of one of the nitrogen atoms on the zinc phthalocyanine.

Thermal excitation

The device works using thermal excitation — the molecules are not in their ground state and so rotate. The fact that the centre of rotation is at the edge of the device and not at its molecular centre means that it acts like a wheel with an axle attached at the perimeter, explained Gao. “We can therefore also excite rotations by means other than heat, such as electron transport through the molecule from its gold attachment,” he said.

The device might also be rotated by placing a magnetic atom at the molecular centre. In this way, current travelling through the molecule from the gold lead would interact with the magnetic moment at the centre, and the magnitude of the current could be used to determine the rotation velocity. “Now we have a rotating magnetic field — a key component of a generator,” explained Gao.

Back in the lab, the team is currently trying to create arrays with magnetic atoms at the centre. The researchers are also trying to change the rotation direction of their rotors with current flow.

Droplets wobble and dance

High-speed photos of actual dancing droplets (grey left half of objects) along with the mathematical description of the normal modes (coloured right half of objects).

By Hamish Johnston

There’s a paper in the New Journal of Physics today about how to make droplets of oil “dance” on the surface of a vibrating bath.

As well as floating over the surface, the droplets also seem to deform periodically in a number of distinct normal modes. In my favourite example, a droplet literally goes pear-shaped before wobbling back to something resembling a doughnut.

The research suggests that it may be easier than previously thought to levitate tiny amounts of liquid.

You might be wondering why it is important to levitate droplets. Well, it could be used to manipulate tiny amounts of liquid without actually touching them — something that could be useful in chemical or biological analysis techniques that are very sensitive to contamination.

Movies of real-life droplets as well as computer simulations can be seen here. WARNING: Their lava-lamp-like oscillations can mesmerize!

The trick to talking science: explain the ‘how’ and the ‘why’

By Jon Cartwright

De Regules: “Science is the stance that the scientist adopts vis-à-vis the natural world”. (Credit: Sergio de Regules)

One of the best features of the web is that it allows readers to give their opinion freely on the news, and at physicsworld.com we appreciate all your comments. In fact, it was while looking back at an article I wrote earlier this year that I came across an interesting comment by a reader called Sergio de Regules, who suggested we ought to have more “science commentators” to cover the history, philosophy, controversies and murkiness that make science so fascinating.

De Regules, 44, is a physicist, writer and musician living in Mexico City. As he tells me via e-mail, he has written a science column for the English-language newspaper The News (a selection of which is now archived on his blog), has worked as an editor at the Mexican science title Cómo Ves, has written several books, and has appeared on radio shows and given talks. Presently he is a science communicator at the National Autonomous University of Mexico (UNAM).

I decided it would be worthwhile to ask him for his thoughts on science writing, and what academia is like in Mexico.

JC: What do you mean by “science commentator”?

SdR: I like to think of science communication as a way of sharing science with the public. But we all know that science is not so much in the results of research as in the spirit of research, or in the stance that the scientist adopts vis-à-vis the natural world. If the scientific results reported in the news can be viewed as newly conquered territories, science is the strategy by which they are conquered. Explaining the what in a scientific development is very good, but it is the how and the why which are memorable. The science commentator provides these.


Rocket to study origin of radio loss in Northern Lights

By Jon Cartwright

Joran Moen waits to fire his rocket to investigate radio-transmission loss in the Arctic (Credit: Yngve Vogt)

Flying over the Arctic can be like being on the far side of the moon: if the Northern Lights are particularly active, they will sometimes block all radio signals, thus severing communications with aircraft.

Joran Moen, a physicist at the University of Oslo, might have the key to explaining this phenomenon. Over the next few weeks he will be waiting for the right moment to launch his rocket ICI-2 so that it can fly 350 km into the sky to find the origin of the radio blocking, or “high-frequency backscattering”.

Scientists think the backscattering is caused by turbulent structures in the ionosphere’s electron plasma, which are related to the Northern Lights, so Moen is going to investigate. “The formation mechanisms of the structures are not yet determined, not even the altitude range,” he writes in an e-mail. “We want to study the instability mechanisms that drive the electron plasma turbulent.”


First ‘bona fide’ direct images of exoplanets

Two teams comprising researchers from Canada, the US and the UK have taken what appear to be the first “bona fide” direct images of planets orbiting stars outside our solar system, an achievement that has long been considered vital in the search for planets like our own.

Christian Marois of the Herzberg Institute of Astrophysics in Canada and colleagues have used the ground-based Gemini telescopes in Hawaii and Chile, and the Keck telescopes, also in Hawaii, to take infrared images of three giant planets that they claim are orbiting a star about 130 light-years away in the Pegasus constellation (Science Express 10.1126/science.1166585).

These two papers are quite important because they show how far we have come developing the techniques for direct imaging Andreas Quirrenbach, University of Heidelberg

Meanwhile, a team led by Paul Kalas of the University of California has used a camera onboard the Hubble telescope to image a giant planet that appears to be orbiting Fomalhaut, a bright star that lies about 25 light-years away in the Piscis Austrinus constellation (Science Express 10.1126/science.1166609). The planet imaged by Kalas’s team is also the first giant planet to be observed at visible-light wavelengths since Neptune in 1846 — a particular boon for astronomers, as this faint part of the spectrum is where they hope to see evidence for life-supporting atmospheres.

Indeed, it is because of the potential for observing atmospheric spectra that direct imaging is often referred to as the “holy grail” of searches for planets outside our solar system. “Imaging discoveries like this are important because they give us the best glimpses into the planets’ luminosities, temperatures and characteristics,” says Joseph Carson, an astronomer at the Max Planck Institute for Astronomy in Germany.

Exoplanets galore

Astronomers first began detecting extra-solar planets or “exoplanets” in the late 1980s, and since then have found more than 300. The majority of these discoveries have resulted from two methods: either looking for a star’s “wobble” caused by the gravity of a planet as it orbits; or looking for the dimming of a star as a closely orbiting planet moves in front and blocks the star’s rays.

Both of these methods can reveal the orbits and masses of the exoplanets, and the latter “transiting” method can also imply other properties, such as radii and atmospheric composition. But to get a good view of the atmosphere of exoplanets orbiting at a reasonable distance — that is, where criteria for habitable conditions are likely to be met — a direct image is essential.
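For a sense of scale of the transit signal (a back-of-the-envelope example using rough solar-system numbers, not figures from either paper): the fractional dimming during a transit is simply the ratio of the projected disc areas of planet and star.

```python
# Back-of-the-envelope transit depth: fractional dimming = (R_planet / R_star)^2.
# Radii below are approximate values for Jupiter and the Sun.
R_star_km = 696_000
R_planet_km = 71_500

transit_depth = (R_planet_km / R_star_km) ** 2
print(f"A Jupiter-sized planet dims a Sun-like star by about {transit_depth:.1%}")
```

A dimming of about 1% is measurable, but it says nothing directly about what the planet looks like, which is part of why a direct image is so prized.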

Unfortunately, direct imaging is not easy. Ground-based telescopes suffer from distorting turbulence in the Earth’s atmosphere, and often exoplanets will be hidden in the glare of their parent star. Although astronomers have laid claim to the first direct images of exoplanets before — most recently in September — others have questioned whether these planets are really orbiting, or in fact whether they are planets at all.

To get around the problems of direct imaging, Marois’s team has made use of the wide, 8–10 m apertures on Gemini and Keck, together with “adaptive optics” that help cancel out atmospheric distortion, to get their images of a trio of exoplanets. Kalas’s team has used a device onboard Hubble known as a coronagraph, which artificially eclipses the parent star to make the glow of their orbiting exoplanet more distinct.

Reliance on theory

The fact that Marois’s team has imaged three exoplanets around one star will reassure astronomers that there are other systems like ours with many orbiting bodies. On the other hand, both parent stars are much more massive than our Sun, and this feature will add to the growing body of evidence suggesting that it tends to be larger stars that harbour the bigger planets.

However, Andreas Quirrenbach, an astronomer at the University of Heidelberg in Germany and a member of the Exoplanet Task Force set up by NASA and the US National Science Foundation, cautions that it is “a matter of debate” whether the two teams’ observations mark the first “unambiguous” direct images of exoplanets, because their masses may be too large. Although all four of the new exoplanets are thought to be well below the threshold of 13 times the mass of Jupiter, above which a planet technically becomes a brown dwarf, the teams’ estimates rely on models of planetary evolution that are subject to sizeable uncertainties.

Still, Quirrenbach agrees that the observations are the most “bona fide” direct images to date. “These two papers are quite important because they show how far we have come developing the techniques for direct imaging,” he says.
