
Do religion and nanotechnology mix?

By Hamish Johnston

No, at least according to a paper published yesterday in Nature Nanotechnology by researchers in the US and Singapore.

The team discovered that people who live in countries with a relatively high level of “religiosity” are less likely to agree that “nanotechnology is morally acceptable”.

Their study on public attitudes towards nanotechnology involved surveys of over 30,000 people in the US and 12 European countries. Americans topped the religiosity scale with a score of about 9 out of a possible 10, and also had the highest percentage of respondents (25%) who did not agree that nanotechnology is morally acceptable.

At the other end of the scale were countries such as the Netherlands and Sweden (5 and 4 on religiosity, respectively), where far fewer respondents had moral objections to nanotechnology.

On the surface, this study seems to go against the popular notion of Europeans as Luddites and Americans as eager adopters of new technologies.

The study is also interesting in relation to another paper published in the same issue of the journal but by a different team. This concludes that, “people who had more individualistic, pro-commerce values tended to infer that nanotechnology is safe”.

In terms of national stereotypes, that sounds more like your average American than your average Swede. Indeed, the UK and Ireland — which are usually thought of as individualistic and pro-commerce — also tended to be less morally accepting of nanotechnology than many of their more “socialist” neighbours.

Lamb shift spotted in solid qubit

A tiny shift in quantum energy levels usually associated with individual atoms has been seen in a solid for the first time by physicists in Switzerland and Canada. The team spotted the Lamb shift in a small piece of superconductor that functions as a quantum bit or “qubit”.

The interactions that cause the Lamb shift are also responsible for making qubits unstable, and therefore the team believes that insights from their experiments could lead to more robust qubits for quantum computers.

The Lamb shift is a tiny change in certain atomic energy levels. It occurs because the atom is interacting with the empty space surrounding it by absorbing and emitting “virtual” photons. Discovered in 1947 by the American physicist Willis Lamb, the shift provided important experimental evidence for the then emerging theory of quantum electrodynamics (QED), which describes the interaction of charged particles in terms of the exchange of photons.

While the Lamb shift should also affect electrons in a solid, it has proven difficult to see because electron energy levels in solids are wide bands, rather than discrete atomic levels.

Shifting transmon

Now, Andreas Wallraff and colleagues at ETH Zurich in Switzerland and the University of Sherbrooke in Quebec have spotted the Lamb shift in the energy levels of a qubit called a “transmon”, which is made from two tiny pieces of superconductor connected by two tunnel junctions (Science 322 1357).

The superconductor contains a large number of “Cooper pairs” of electrons that can move through the material without any electrical resistance. The energy levels of the qubit are defined by the precise distribution of Cooper pairs between the two tiny pieces of superconductor.

The team’s transmon was placed in a microwave cavity, and its shape was chosen to give it a large electric dipole moment. This increases the strength with which it interacts with both microwave photons and the virtual photons of the vacuum. In addition, the shape and size of the cavity were designed to enhance the electric field of the photons in the region of the qubit.

Transitions between qubit energy levels occur when electrons in the superconductor collectively absorb or emit photons at certain wavelengths. This process can be enhanced by tuning the frequency of microwave radiation injected into the cavity so that a single photon of the correct wavelength bounces back and forth across the qubit many times.

In their experiment, the team used the cavity to enhance the effect of the virtual photons responsible for the Lamb shift, making it more likely that the qubit absorbs and emits them. Indeed, Wallraff and colleagues measured a cavity-enhanced shift of 1% in the difference between the two energy levels. This is 10,000 times greater than the relative Lamb shift seen in hydrogen, which has no enhancing cavity.
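As a rough order-of-magnitude check (using textbook hydrogen values that are not quoted in the paper), the hydrogen Lamb shift of about 1057 MHz can be compared with the roughly 2.5 × 10^15 Hz frequency of the hydrogen Lyman-α transition:

\[
\frac{\Delta\nu_{\rm Lamb}}{\nu_{{\rm Ly}\alpha}} \approx \frac{1.06\times 10^{9}\ \mathrm{Hz}}{2.5\times 10^{15}\ \mathrm{Hz}} \approx 4\times 10^{-7},
\qquad
\frac{10^{-2}}{4\times 10^{-7}} \sim 10^{4},
\]

so a 1% relative shift is indeed of order 10,000 times larger than the relative shift in hydrogen.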

Tricky measurement

Despite its relatively large size, the shift was tricky to measure, Wallraff told physicsworld.com. This is because any measurement on the qubit must be made using photons as a probe — and their presence in the waveguide could cause a shift in the energy levels (the a.c. Stark effect), which would overwhelm the Lamb shift.

To get around this problem, the team used a very small number of probe photons that were off-resonance with the cavity. This means that they remain in the region of the qubit long enough to measure the transition energy, but not long enough to cause any significant a.c. Stark shift.
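The interplay between the two shifts can be sketched using the standard dispersive approximation of circuit QED (a textbook expression, not necessarily the exact model used by the team). For a qubit of frequency ω_q coupled with strength g to a resonator of frequency ω_r, with detuning Δ = ω_q − ω_r, the effective Hamiltonian is

\[
H \approx \hbar\omega_r\, a^{\dagger} a \;+\; \frac{\hbar}{2}\left(\omega_q + \chi + 2\chi\, a^{\dagger} a\right)\sigma_z,
\qquad \chi = \frac{g^{2}}{\Delta},
\]

where the photon-number-independent term χ is the cavity-enhanced Lamb shift and the 2χ a†a term is the a.c. Stark shift, which grows with the number of probe photons in the resonator. This is why the probe power has to be kept so low.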

The discovery of such a large Lamb shift is a mixed blessing for those trying to design practical qubits. On one hand, the virtual photons induce spontaneous emission in qubits, which limits their usefulness for quantum computing. On the other hand, Wallraff and colleagues have established that the Lamb shift can be minimized in a transmon qubit if it is set far from resonance with the virtual photons. This suggests a way of making qubits more robust, as demonstrated in recent work by Robert Schoelkopf and colleagues at Yale University (Phys. Rev. Lett. 101 080502).

Detecting the Lamb shift in a solid system also suggests the possibility of seeing the effects of other virtual particles such as phonons — which are quantized vibrations in solids. According to Wallraff, such an acoustical Lamb shift due to the mechanical quantum fluctuations of nanometre-scale electromechanical oscillators could similarly affect the energy levels of a qubit.

LHC will restart end of June 2009

The Large Hadron Collider (LHC) will switch on again at the end of June next year, according to a report to be published later today by CERN.

On 19 September 2008, nine days after the highly successful “switch on” day in which the LHC circulated its first proton beams, commissioning of the accelerator was interrupted as an electrical fault damaged part of the machine. The fault generated an electrical arc that punctured the cooling enclosure, allowing some six tonnes of liquid helium to evaporate into the tunnel with such ferocity that it broke floor anchors and magnet connections.

At the time, the LHC operations team was commissioning for operation at energies of 5 TeV, and had not yet collided any protons.

Upgrade of pressure-release valves

James Gillies, the chief spokesperson for the CERN particle-physics lab near Geneva, told physicsworld.com that engineers will upgrade pressure-release valves on the machine’s focusing or “quadrupole” magnets to prevent such a helium discharge being so damaging in the future.

This involves warming up the relevant sectors and lifting out 53 magnets to perform the necessary repairs. So far three sectors have been warmed up and 28 magnets have been lifted out. Two magnets have already received the upgrades and are back in place.

Engineers will also upgrade the valves on the dipole magnets, which steer the proton beams. However, they will only do this when the sectors containing those magnets are warmed up for other reasons. “It really is extremely cautious to do this,” says Gillies.

Although it is unlikely that beams will reach their full energy of 7 TeV next year, they should be able to collide at a record-breaking 5 TeV.

BLAST takes off

The BLAST team. Credit: Mark Halpern

By Margaret Harris

Things are not going well for the astrophysics “balloonatics” at the bottom of the world. After weeks spent waiting for decent weather, their Balloon-Borne Large Aperture Submillimeter Telescope, or BLAST, has hit a stumbling block. Fairly literally, in fact: the fragile, sensitive instrument has just slammed into the truck being used to launch it. “Oh, you’re (expletive) kidding me,” someone cries in the background, as the stricken telescope sways gently beneath its balloon in the still Antarctic air.

“Step by tedious step, we stumble away from abject failure,” says Barth Netterfield, a Canadian astrophysicist and co-star of the feature-length documentary BLAST, which chronicles the 18 rocky months leading up to the equally rocky launch of the telescope. “And that’s on a good day.” It’s a statement that will bring grimaces of recognition to many an experimentalist’s face, and as a summary of the film, it’s as good as any. If you’re reading this as a PhD student, and your experiment is not going well, take heart: at least it isn’t scattered over a 120-mile stretch of frozen wilderness, with the bulk of it halfway down a crevasse.


‘Echoes’ shine a light on Tycho Brahe’s supernova

The supernova first observed by Tycho Brahe in 1572 helped change our conception of the universe, by undermining the Aristotelian idea of the immutability of the heavens. Now, a new study of this massive explosion could help shed more light on the nature of the cosmos.

Research carried out by astronomers in Europe and Japan on SN1572, as the event is known, should help us understand exactly how supernovae occur and might also lead to a better understanding of how the universe has been expanding. By analysing “echoes” of light from SN1572 reflected off a nearby dust cloud, Oliver Krause of the Max Planck Institute for Astronomy in Heidelberg, Germany, and colleagues have proved that it is a so-called type-Ia supernova (Nature 456 617).

Supernovae are the explosions of aging stars that produce enough light to outshine entire galaxies for a few weeks. They are important scientifically because they seed the universe with heavy elements, providing the raw material for successive generations of new stars. The very well defined luminosity of type-Ia supernovae also makes them ideal “standard candles”: by comparing their observed and intrinsic luminosities, astronomers can gauge distances within the universe and thereby chart the cosmic rate of expansion.
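The “standard candle” argument boils down to the inverse-square law (a generic relation, not a result of the new paper): if a type-Ia supernova of known intrinsic luminosity L is observed with flux F, its distance d follows from

\[
F = \frac{L}{4\pi d^{2}} \quad\Longrightarrow\quad d = \sqrt{\frac{L}{4\pi F}},
\]

so a set of such explosions at different redshifts maps out how the expansion rate of the universe has changed with time.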

Supernovae occur throughout the universe continually, but those close enough to be of use to researchers happen only rarely. Indeed, in the last 1000 years only six supernovae have been observed taking place in the Milky Way. Unfortunately, the historical observations of these events are not of a high enough quality to reveal precisely what happens during a supernova. Astronomers can study the material left over, known as the remnant, but this cannot provide detailed information on the explosion itself.

Scattered from dust

To get round this problem, Krause and colleagues have instead studied light from SN1572 that has been scattered by a nearby dust cloud. This is possible because the cloud, which is of suitably high density, lies several hundred light years from the site of the supernova, so the scattered light “echo” is only now reaching the Earth.
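The timing works out because the scattered light follows a longer path than the light that reached Earth directly in 1572 (a generic light-echo relation; the labels below are illustrative rather than values from the paper). If the supernova-to-cloud and cloud-to-Earth distances are d_1 and d_2, and the direct supernova-to-Earth distance is D, the echo lags the direct light by

\[
\Delta t = \frac{(d_1 + d_2) - D}{c},
\]

so for an echo arriving now, roughly 436 years after Tycho's observation, the scattered path must be about 436 light years longer than the direct one.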

Using the 8.2 m Subaru telescope in Hawaii, Krause’s team found a tell-tale patch of brightness in the night sky close to the supernova remnant that was moving away from the remnant. This, they say, is the echo formed as the flash of light produced by the explosion moves through the cloud.

Echoes of runaway fusion

The discovery of light echoes from SN1572 and another ancient supernova was announced by Armin Rest of Harvard University and colleagues earlier this year. What Krause’s team has done is to use echoes to establish the nature of Brahe’s supernova. Indirect evidence, such as radio and X-ray observations of the remnant, had suggested that SN1572 was a type-Ia supernova, which occurs when a white dwarf star accumulates enough material from a companion star to raise its core temperature sufficiently that it initiates runaway fusion reactions.

However, it had also been suggested that the event could instead have been either a type Ib, Ic or II, which occur when an aging massive star no longer undergoes fusion reactions and its weight causes it to collapse in on itself, flinging off its outer layers in the process.

Krause and colleagues have proved that SN1572 is in fact a type-Ia supernova. They did this by measuring the absorption spectrum of the echo, showing that the event generated silicon but no hydrogen, as would be expected of a type Ia. The researchers also found evidence of calcium ions moving at much higher velocity than the bulk of the explosion debris, suggesting, they say, that the explosion could have been asymmetrical and as such a challenge to existing models of type-Ia explosions.

Comparing old to new

The observation of light echoes will now allow astronomers to characterize other historic supernovae, enabling them to correlate the many observations of the remnants with new direct measurements of the corresponding explosions.

In addition, says Krause, studying light echoes from different parts of the sky could enable researchers to perform a three-dimensional study of supernovae and hence identify any asymmetry in the explosions. Krause also points out that an improved knowledge of the luminosity of type-Ia supernovae could have implications for our understanding of how the universe expands.

Superconductor switches on and off

A superconductor that can be switched on and off with an electric field has been made by physicists in Switzerland, France and Germany. The material could help make resistance-free electronic devices that are faster and more efficient than the transistors of today.

Most electronic devices are structures that contain interfaces between relatively simple materials such as silicon, some sort of metal and silicon oxide, which simply acts as an insulator. However, some researchers are keen to use more complex oxides — with properties such as superconductivity, ferromagnetism and ferroelectricity — which could result in new and more efficient types of devices.


It used to be very difficult to make tiny devices using complex oxides, but thanks to recent progress in experimental techniques, scientists can now create atomically abrupt interfaces between these materials by growing them on top of each other in sandwich-like structures.

Ultrathin superconductor

Now, Andrea Caviglia of the University of Geneva and colleagues at the University of Paris-Sud 11 and the University of Augsburg have shown that potentially useful electronic states can be found at the interface between two complex insulating oxides, lanthanum aluminate (LaAlO3) and strontium titanate (SrTiO3), and that these states are very sensitive to external disturbances such as electric fields (Nature 456 624).

Their work builds on experiments done last year when the team discovered that a thin superconducting layer forms between the two insulators when atomically thin layers of the materials are grown on top of each other (Science 317 1196). “This is like putting two slices of bread on top of each other and finding that a slice of ham appears in the middle,” quip the researchers.

The team has applied the same principle used in the CMOS field-effect transistor to modulate the transport properties of this superconducting layer. In a CMOS field-effect transistor, an external voltage is employed to change a semiconducting channel’s resistance to an electrical current. “We report that we can switch the interface from a superconducting state to an insulating state just by applying an electric field to it,” the scientists told physicsworld.com. “In other words, you can drive the system from being a perfect conductor (which offers no resistance to electrical current) to being an insulator (which has a very high resistance to electrical current) just by applying a voltage.”

Faster and more efficient

A transistor made with a superconductor would carry current without any dissipation and should therefore be very efficient. Thanks to the lack of electrical resistance, the device would run faster than its semiconducting counterparts while using up much less power. Scientists have been trying to achieve such a result with high-temperature superconductors (which belong to the same family as LaAlO3 and SrTiO3 because they have very similar crystal structures) for the last 20 years.

Although there are no immediate commercial applications for this research because of the very low device operation temperatures involved, the electric field effect could be applied to other materials, such as ferromagnets. “We can now imagine switching a magnetic system on and off just by applying a voltage, something that would have immediate applications in electronic devices,” said the team.

The researchers next want to apply their technique in the field of nanoelectronics. “We will realize nanoscale devices in which superconductivity can be dynamically defined using local electric fields,” they said.

First pictures of LHC magnet damage

By Michael Banks

Whilst trawling the web this morning I came across a few blog posts showing the first pictures of the damage caused by the magnet failure at CERN’s Large Hadron Collider (LHC) on 19 September.

The pictures were apparently shown during a presentation by the lab’s director general Robert Aymar on Friday at a meeting of the European Committee for Future Accelerators held at CERN.

The US/LHC blog posted a link to slides of Aymar’s talk. However, within an hour of the post (on 1 December) access to the talk had been restricted. Fortunately, particle physicist Stephanie Majewski from Brookhaven National Laboratory, who is at CERN for a year, posted the pictures from the talk on her blog.

New thermometer could help redefine temperature

Physicists in Finland and Japan have invented a new type of electronic thermometer that relates temperature directly to the Boltzmann constant. Although not the first device to do so, the team say that their thermometer could easily be mass produced and therefore could be used as a highly accurate laboratory instrument as well as a calibration standard.

The current definition of the unit of absolute temperature is very messy indeed — the International Committee for Weights and Measures (CIPM) in Paris defines the kelvin as 1/273.16 of the temperature difference between absolute zero and the triple point of pure water (roughly 0 °C) at a specific pressure. However, the CIPM would prefer to define the kelvin, along with other SI units, in terms of fundamental constants — the Boltzmann constant kB, in the case of temperature.
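In symbols, the two approaches can be contrasted as follows (a schematic comparison; the numerical value of kB is given here only approximately):

\[
1\ \mathrm{K} = \frac{T_{\rm TPW}}{273.16}\ \ \text{(current: fix the triple-point temperature } T_{\rm TPW}\text{)},
\qquad
T = \frac{E_{\rm thermal}}{k_B},\ \ k_B \approx 1.38\times10^{-23}\ \mathrm{J\,K^{-1}}\ \ \text{(proposed: fix } k_B \text{ exactly)},
\]

so that under the proposed scheme a temperature measurement reduces to a measurement of energy divided by an exactly defined constant.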

As a result, teams of physicists around the world are dreaming up new techniques that relate temperature directly to kB. The latest is “single-junction thermometry” (SJT), which has been unveiled by Jukka Pekola and colleagues at the Helsinki University of Technology and NEC’s Nano Electronics Research Laboratories in Tsukuba (Phys. Rev. Lett. 101 206801).

A variation on Coulomb blockade

Their technique is a variation on Coulomb blockade thermometry (CBT), which was invented by Pekola a decade ago and is currently used in some commercial devices. CBT is based on the fact that the electrical conductance of an array of tunnel junctions — tiny bits of insulator sandwiched between two metals — changes with temperature.

While CBT works very well at temperatures above about 1 K, small variations in the electronic properties of individual junctions result in an unacceptably large measurement uncertainty at very low temperatures.

Now, Pekola and colleagues have got around this problem by arranging a collection of tunnel junctions in a circuit such that the conductance depends on the properties of just one junction.

The tunnel junctions are created by first allowing a very thin layer of aluminium oxide to grow on the surface of micrometre-wide aluminium electrodes. Another electrode is then deposited on the oxide, creating metal-insulator-metal junctions through which electrons can tunnel.

Drop in conductance

Applying a voltage across the electrodes causes a current to flow through the junction. In principle, the size of the current depends on the number of electrons that can pile into the negative electrode — the more electrons available for tunnelling, the greater the conductance. At voltages above about 0.4 mV, however, this number is limited by a compromise between the Coulomb repulsion between electrons — which tends to reduce the number — and the thermal energy of the electrons, which tends to boost it. The upshot is that at these voltages the conductance does not vary with voltage.

At smaller voltages, however, the conductance drops off rapidly, until it falls to a minimum value at 0 V, before rising again as a negative voltage is applied. The dramatic drop in conductance occurs because at lower voltages, the junction behaves more like a capacitor, with the number of electrons that can pile into an electrode being proportional to the applied voltage as well as the thermal energy.

Works down to 150 mK

According to Pekola, the width of the dip (which can be measured by scanning the applied voltage and measuring the current through the junction) is directly proportional to the Boltzmann constant multiplied by the temperature. The team measured this width over several temperature ranges (the lowest being 150–450 mK) and confirmed that it is directly proportional to temperature.
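The working relation can be sketched as follows (the numerical prefactor is the one usually quoted for multi-junction Coulomb blockade thermometry and should be taken as indicative rather than as the exact single-junction result): the full width of the conductance dip at half its depth scales linearly with temperature,

\[
eV_{1/2} \approx 5.44\, N k_B T
\quad\Longrightarrow\quad
T \approx \frac{e\,V_{1/2}}{5.44\, N k_B},
\]

where N is the number of junctions in series (N = 1 for a single junction), so a measurement of V_{1/2} gives the temperature directly in terms of the Boltzmann constant.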

As well as providing a way to express temperature in terms of the Boltzmann constant, Pekola says that the device is suitable for mass production and could therefore form the basis of a new thermometry system for use in low-temperature labs.

Sam Benz, a thermometry expert at NIST in the US, told physicsworld.com that the HUT-NEC team have done an “interesting experimental demonstration of a primary electronic-based thermometer that may prove useful at low temperatures”.

Pekola says that the team may try to commercialize the technology for use as an electronic thermometer, but several challenges must be overcome first. For example, he points out that their SJT devices are not optimized for any particular temperature range — something that would have to be done to make them useful in the lab. For measuring temperatures lower than about 150 mK, for example, the junction electrodes would have to be made with a relatively large volume to ensure that the electrons are in thermal equilibrium with their surroundings.

Seeing is believing

Ebb and Flow by P Mininni et al

By Matin Durrani

Two silent round flashes on a dark screen. That was the image witnessed by researchers crowded into the control room of the Large Hadron Collider (LHC) at the CERN particle-physics lab near Geneva on 10 September that heralded the successful passage of the first beam of protons around the 27 km collider. Later that day physicists watched as one of the LHC’s main experiments – the Compact Muon Solenoid – generated its first images from the debris of particles produced when the proton beam was deliberately steered into a tungsten collimator block.

Particle physics has long been a rich source of iconic images – from the tracks in the bubble chambers of the 1950s to the particle collisions that signalled the detection of everything from the W-boson to the top quark. But visualization has a proud history in other areas of science too. Ever since Galileo turned his telescope to the heavens in 1609 and saw mountains on the Moon and spots on the Sun, researchers have sought to see beyond what is possible with the naked eye. Indeed, astronomers now claim to have directly observed extrasolar planets for the first time.


Is a new force at work in the dark sector?

Dark matter — the elusive substance that makes up most of the matter in the universe — may be far more complex than physicists had previously thought. Indeed, it may even be influenced by a hitherto unknown “dark force” that acts exclusively on dark matter particles. That’s the claim of a team of US cosmologists, who believe that their new theory of dark matter could explain the recent intriguing and anomalous results of several high profile searches for the first direct evidence of dark matter.

Physicists believe that there is about five times more dark matter in the universe than normal matter — the latter being the familiar stuff that makes up planets and stars. While dark matter appears to interact via gravity and has a strong influence on the motion of massive objects such as galaxies, it does not interact with light and has proven very difficult to detect directly — let alone study in any detail.

In recent years, several different experiments have found unusual results that might be linked to dark matter. Researchers operating a balloon-borne cosmic-ray detector called ATIC published a paper last week detailing an unexpected excess of electrons at energies of about 300–800 GeV. The results cannot be explained by standard models of cosmic-ray origin and propagation in the galaxy and instead suggest a nearby and hitherto unknown “source” of high-energy electrons.

Growing evidence of new physics?

Earlier this year it was revealed that the PAMELA satellite had found an excess of positrons at energies of around 10–100 GeV, which is also unexpected for high-energy cosmic rays interacting with the interstellar medium. And the INTEGRAL satellite discovered an unexpected excess of low-energy positrons at the galactic centre. While it is not yet clear whether these results are fully consistent with each other, they all seem to point to “new physics”.

One possibility is that these excess particles are caused by the annihilation of weakly interacting massive particles (WIMPs) — one of the leading candidates for dark matter. In theory, WIMPs can collide and annihilate each other, producing electron-positron pairs. However, the annihilation rate required to explain the observed excesses is far higher than expected from standard theories of dark matter.

Now, Douglas Finkbeiner at the Harvard-Smithsonian Center for Astrophysics and colleagues believe they may have a possible answer (arXiv:0810.0713). “If we believe dark matter annihilation may be the culprit for the ATIC and PAMELA excesses, we come straight to a couple of interesting conclusions,” explained Finkbeiner. “First, dark matter must annihilate to electrons or muons, either directly or indirectly, and second, it does so about 100 times more readily than expected. Both can be accomplished with a theory containing a new force in the dark sector.”

A new force and particle

According to Finkbeiner and his colleagues, their proposed new fundamental force is felt only by dark matter and is mediated by a new particle, “phi”, in much the same way that another fundamental force — electromagnetism — is mediated by photons. Crucially, the force is attractive, bringing slow-moving dark matter particles together much more effectively and so leading to a greater annihilation rate than otherwise expected. Dark matter particles would collide and annihilate to produce phi particles, with each phi then decaying to produce the electrons, positrons and other particles observed by experiments.
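The effect of such an attractive force is often described by a Sommerfeld-type enhancement of the annihilation cross-section (a schematic expression with convention-dependent numerical factors; the team's detailed calculation may differ). For a dark-sector coupling strength α_d and relative velocity v (in units of c),

\[
S \;\approx\; \frac{\pi\alpha_d/v}{1 - e^{-\pi\alpha_d/v}}
\;\xrightarrow{\;v \,\ll\, \alpha_d\;}\; \frac{\pi\alpha_d}{v},
\]

so dark matter moving at the low velocities typical of the galactic halo today (v ~ 10⁻³) annihilates far more readily than it did in the hot early universe, which is how an enhancement of order 100 could arise.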


“This theory is partially a synthesis, but in bringing all the ideas together and realizing how simply they could fit together in a single framework, it is much more than just that,” says Neal Weiner, another member of the team at the Center for Cosmology and Particle Physics at New York University. “What’s remarkable is that we found how easily the different elements supported each other and can explain a number of different anomalies simultaneously.”


Others are more cautious. “The detections by ATIC and PAMELA are compelling but it is far too early to say that we have detected dark matter,” says Dan Hooper at Fermilab in Illinois. “Moreover, there are several possibilities to explain a dark matter signal and this is only one of them. There is no compelling reason why the universe has to be this way.”

Finkbeiner, however, is optimistic. “All we have given up is the relative simplicity of recycling the same old forces we already know about,” he says. “But we have no philosophical problem with this. Can we really expect to discover a whole new ‘dark sector’ of particles and not find any new forces at all?”

Call for better measurements

Hooper believes the only way to resolve the question is ultimately to have more precise measurements of the electron bump and more data to nail down exactly what scientists are seeing.

“It will require many pieces, probably, to figure it out,” says Weiner. “We’re really just at the beginning of thinking about these things.”
