
Earth-sized planets may be more common than we thought

Almost one in four stars like the Sun could harbour an Earth-mass planet, according to US researchers. Their finding challenges conventional models of planetary formation, which suggest that low-mass planets close to their parent stars should be rare, and implies that solar systems like ours could be more common than we thought. The result also suggests that NASA’s Kepler mission, currently hunting for Earth-like planets, could discover more than 250 “plausibly terrestrial worlds”.

The population of known alien worlds, a total of almost 500 discovered since the mid-1990s, is currently skewed towards the more easily detectable Jupiter-mass planets that orbit close to their host stars. Only recent advances in technology have made it possible to search for planets with masses similar to that of the Earth. Yet existing models of solar-system formation predict a “planetary desert” close to the star: a lack of planets with 1–30 times the mass of Earth and an orbital period of less than 50 days. Now, a team of astronomers, including Geoff Marcy at the University of California at Berkeley, is challenging this received wisdom.

“This is the first time anyone has measured the fraction of stars that have smaller planets,” Marcy, often credited as the most prolific planet hunter of all time, told physicsworld.com. His team used data from the Keck telescope in Hawaii relating to 166 stars between 0.54 and 1.28 solar masses, all within 80 light-years of Earth. Doppler shifts in the starlight, the result of the star wobbling under the gravitational influence of an orbiting exo-planet, revealed a total of 33 planets around 22 of the stars.
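For a sense of the scale involved, the size of the stellar wobble can be estimated from the standard radial-velocity relation for a circular, edge-on orbit with a planet much lighter than its star. The sketch below uses textbook constants and Solar System values, not numbers from the Keck survey:

```python
import math

# Radial-velocity semi-amplitude K for a circular, edge-on orbit with
# planet mass m_p much smaller than stellar mass M:
#   K = (2*pi*G/P)**(1/3) * m_p / M**(2/3)
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
YEAR = 3.156e7     # one year, s

def semi_amplitude(m_planet, period, m_star=M_SUN):
    """Stellar wobble speed (m/s) induced by an orbiting planet."""
    return (2 * math.pi * G / period) ** (1 / 3) * m_planet / m_star ** (2 / 3)

k_jupiter = semi_amplitude(1.898e27, 11.86 * YEAR)  # Jupiter analogue: ~12 m/s
k_earth = semi_amplitude(5.972e24, 1.0 * YEAR)      # Earth analogue: ~9 cm/s
```

A Jupiter at Jupiter’s orbit tugs the Sun at roughly 12 m/s, while an Earth analogue produces a wobble of only about 9 cm/s – which is why the known exo-planet population is skewed towards massive, close-in worlds.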

Don’t forget the missed planets

Marcy’s team also made an attempt to account for any planets that might have been missed due to limitations in the sensitivity of their equipment. “We asked what would be the maximum planet mass that could hide in our data. If there were a more massive planet than that we would have seen it,” Marcy explained. This statistical sampling analysis enabled them to infer the “missed” planets that sit alongside the confirmed planets.

This information, from both the confirmed and inferred cases, was used to model the likelihood of close-in planets as a function of planetary mass. A power law turned out to be the best fit to the data – one implying that the smaller the mass of the planet, the more likely it is to exist. This suggests the “planetary desert” is far from the desolate wasteland previously envisioned. “Our observations don’t agree with theoretical predictions. We now know that the universe has more Earth-mass planets than Jupiter-mass planets,” said Marcy. His power law puts the chance of a Sun-like star having a one-Earth-mass planet at 23% – almost one in four. The findings are published in Science.
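A minimal sketch of the kind of fit described, with invented occurrence fractions purely for illustration (these are not the team’s data): a power law is a straight line in log–log space, so fitting one there recovers the exponent, and a negative slope means occurrence rises towards lower masses.

```python
import numpy as np

# Hypothetical occurrence fractions per planet-mass bin (illustrative only)
mass = np.array([1.5, 5.0, 15.0, 50.0, 150.0])     # planet mass, Earth masses
frac = np.array([0.12, 0.065, 0.035, 0.02, 0.01])  # fraction of stars with such a planet

# A power law f = A * M**k becomes a straight line in log-log space
k, log_A = np.polyfit(np.log10(mass), np.log10(frac), 1)
# k < 0: the lower the planet's mass, the more common it is
```

Extrapolating such a fit down to one Earth mass is how a survey sensitive mainly to heavier planets can still yield an occurrence estimate for Earth-mass worlds.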

However, the research was limited by current technology to modelling only planets orbiting at less than one quarter of the Earth–Sun distance. So Marcy’s finding could still be promising in the hunt for Earth’s “twin”: a one-Earth-mass planet orbiting at the full Earth–Sun distance. “Current models suggest most planets form far away from their stars; you should find more planets at longer orbital periods,” Coel Hellier, an exo-planet researcher at Keele University, told physicsworld.com. “This research predicts a 23% chance of finding short-period Earth-mass planets, so there should be even more further out; perhaps then nearly all solar-type stars have an Earth-mass planet,” he added.

Searching for a second Earth

However, just because a planet has a near-Earth mass does not necessarily mean it is Earth-like. “Planets with a few Earth masses may be qualitatively different from one-Earth-mass planets. They might be much larger, more like mini-Neptunes, with a lot more water and a lot less rock,” Marcy warned.

But early results from NASA’s Kepler space telescope, which measures a planet’s radius rather than its mass, are promising. “Many of Kepler’s planet candidates appear to have small radii, which is consistent with our research; they could be Earth-like after all,” said Marcy. The team predicts that Kepler could find 120–260 “plausibly terrestrial worlds”. “We are starting to see suspicious signs that Earths are out there in large numbers,” he added.

Meanwhile, a pair of researchers based in the US and Switzerland has begun to study a contender Earth in more detail. Kevin Heng at ETH Zurich and Steven Vogt at the University of California have simulated atmospheric circulation on Gliese 581g, a “super Earth” discovered in 2009. Publishing their findings in a paper submitted to the arXiv preprint server, the researchers argue that the specific locations for habitability depend on whether the planet is tidally locked and how fast radiative cooling occurs on a global scale.

Fusion – from here to reality

Fusion – the nuclear process that powers the Sun – has long held promise as a potential source of energy here on Earth. But recreating such conditions in the laboratory is far from easy, which is one reason why a commercial fusion plant is still many decades away. Still, if physicists and engineers can pull the trick off, fusion could become a vital part of our future energy mix.

In this exclusive physicsworld.com video, David Ward from the Culham Centre for Fusion Energy (CCFE) in the UK discusses the challenges in going from the ITER fusion experiment being built in southern France to a working fusion plant, dubbed DEMO. Ward, who is head of power plants and energy at the CCFE, has spent a quarter of a century in the fusion field. What’s interesting is that he predicts not just one version of DEMO, but lots, with China potentially leading the way.

Kenyan physics graduate builds aircraft via Wikipedia

By James Dacey

After studying physics at university and then moving into the computer hardware business, Nairobi resident Gabriel Nderitu has now set his sights truly skywards. He has cobbled together a two-seater aircraft after reading about the principles of aeronautics on the internet.

In this special video report for Citizen TV Kenya, Nderitu describes his inspiration for the mission. “My boyhood interest was in aviation, so maybe it was a missed career that I’m trying to recreate or something,” he says.

The Kenyan aviator is currently putting the finishing touches to his craft, which will weigh in at 800 kg and is built around the engine of a Toyota NZE. Its wings are made from sheets of aluminium, and attached to its nose is a 74-inch propeller that turns at 4000 revolutions per minute. Construction began one year ago, with the help of five assistants, and it has cost just shy of $8000.

Nderitu tells the Guardian that in designing the craft he has downloaded roughly 5 GB of data, with Wikipedia serving as the main source of information. “It was a bit of a re-inventing the wheel – not really looking and trying to copy…it’s a matter of reading the science of it,” Nderitu tells Citizen TV Kenya.

However, despite all Nderitu’s hard work, the tale is not yet guaranteed its cheery Hollywood ending. It seems that all the media coverage surrounding the mission has also attracted the attention of the Kenyan Civil Aviation Authority, which has advised that Nderitu cease working on his plane. You can hear about this latest spanner in the works in this follow-up video report.

Boost for Tevatron extension


A plan to extend the life of Fermilab’s Tevatron collider by a further three years was given a boost yesterday when an advisory panel to the US Department of Energy (DOE) recommended the extension.

The Tevatron, which collides protons with antiprotons, is due to close in September 2011, when work will begin adapting parts of the accelerator to produce neutrinos and muons. In late August, however, Fermilab’s Physics Advisory Committee recommended that collisions at the Tevatron should instead continue until 2014 to take advantage of a 15-month shutdown of the rival Large Hadron Collider (LHC) at CERN in 2012.

Fermilab director Pier Oddone last month unveiled a financial plan to keep the collider going for an extra three years, which would cost around $50m a year. Around $15m would be freed up by delaying the NOvA neutrino and the Mu2e muon experiments but an additional $35m per year would still be needed to fund the Tevatron through to 2014.

Yesterday, the DOE’s 21-member High Energy Physics Advisory Panel (HEPAP) recommended the extension. “With a three-year extension, [the Tevatron] could make a significant contribution to understanding the central issue of our field, the existence and nature of the Higgs boson, complementing the information physicists can expect from the LHC while it ramps up to its full energy and luminosity,” the report says.

“With extended running, the two Tevatron experiments – D0 and CDF – have an excellent chance of finding evidence for the Higgs boson if it exists,” says D0 spokesperson Stefan Soeldner-Rembold from the University of Manchester. “I am confident that Fermilab and the DOE will provide the financial support to make the extension possible.”

Finding funds

However, the panel warns that finding the extra $35m should not come at the expense of the nation’s other high-energy-physics programmes. “The panel recommends that the agencies proceed with a three-year extension of the Tevatron program if the resources required to support such an extension become available in addition to the present funding for [high-energy physics],” the report says. “Given the strong physics case, we encourage the funding agencies to try to find the needed additional resources.”

The DOE will now work on its budget request to be sent to US President Barack Obama. The final decision, however, will lie with the Obama administration, which will present its budget request for 2012 to Congress in February 2011.

Is the comms industry speeding towards an energy bottleneck?

Courtesy: Bas van Schaik

By James Dacey

When faced with the need to cut energy usage, those who favour the business-as-usual approach often argue that improvements in energy efficiency will enable us to continue consuming as before. But in the world of communications, at least, it seems that gains in energy efficiency are not keeping pace with the unrelenting growth of traffic.

That is according to a team of scientists at Alcatel-Lucent Bell Labs that has compared the energy-consumption trends of communications equipment with the related efficiency savings. On the one hand, they find that although traffic growth will slow as networks approach saturation, the current annual growth rate of 40–60% will still be as high as 25–50% in 2020. Annual efficiency savings, on the other hand, are falling from around 20% to roughly 10%.

The consequence, they say, is that energy is going to become an increasingly important problem for communication networks, threatening a bottleneck. The findings will be presented this week at the annual meeting of the Optical Society, which is taking place in Rochester, NY.
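The gap between the two rates compounds quickly. An illustrative back-of-the-envelope calculation using the quoted figures (this is not the Bell Labs model, just simple compounding):

```python
# Illustrative compounding: traffic growing 40% per year while
# energy efficiency improves only 10% per year
traffic_growth = 0.40    # annual growth in traffic
efficiency_gain = 0.10   # annual improvement in energy per bit

# Net annual growth in network energy consumption
net_growth = (1 + traffic_growth) / (1 + efficiency_gain) - 1  # ~27% per year

# Energy use after a decade, relative to today
energy_after_10_years = (1 + net_growth) ** 10  # roughly an 11-fold increase
```

Even with the quoted efficiency gains, energy use grows by over a quarter each year under these assumptions, which is the essence of the bottleneck warning.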

These findings also chime with a study published recently in Journal of Physics D, which examined lighting usage over the past 300 years, finding that as lighting becomes more energy efficient, and thus cheaper, we use ever more of it. Interpretations of this study caused quite a stir in the blogosphere, leading one of the authors, Harry Saunders, to clarify the issues in this article.

Compact X-ray source could rival accelerators


A compact source of high-quality X-ray pulses has been unveiled by an international team of researchers. The group claims that the source – which is contained in a vacuum chamber of about 1 m³ – produces intense and highly coherent X-ray pulses that rival those produced by “wiggling” electrons in large and extremely expensive particle-accelerator facilities.

The source was created by Zulfikar Najmudin and colleagues at Imperial College London along with researchers at the University of Michigan, Instituto Superior Técnico in Lisbon and Ecole Polytechnique Palaiseau in France.

The device creates X-rays using the “plasma wakefield” effect whereby an intense laser pulse is fired into gas to create a plasma. As the pulse travels through the gas, its electric field separates electrons from atoms. This creates an extremely large electric field in the wake of the pulse, which accelerates electrons. As the wake collapses, electrons are “wiggled” violently, causing them to radiate X-rays.

Broad energy distribution

The team made the plasma at the University of Michigan using Hercules – a petawatt laser that creates some of the most intense laser pulses ever produced. The pulses are fired into a jet of helium gas, and the X-rays are created in a region about 1 µm in size and propagate in the direction of the laser pulse. They are produced with a broad energy distribution and have an average energy of about 10 keV, with some as energetic as 100 keV. According to the team, the X-ray source is 1000 times brighter than previous schemes for generating X-rays in “plasma wigglers”.

The team evaluated the quality of the X-ray pulses by using them to image a number of microscopic test patterns. They concluded that pulses have a large degree of spatial coherence – which makes them well suited for studying the structural properties of materials on the nanometre scale. In addition, the pulses last only a few femtoseconds, which means that they can be used to study processes such as atomic and molecular interactions that occur on very short timescales.

“We think a system like ours could have many uses,” said Najmudin. “For example, it could eventually increase dramatically the resolution of medical imaging systems using high energy X-rays.”

Large laser required

However, the technique has one important shortcoming at the moment – it requires an extremely powerful and relatively large laser to work. While the Hercules laser is smaller than an accelerator facility, it still occupies several rooms at the University of Michigan.

“High-power lasers are currently quite difficult to use and expensive, which means we’re not yet at a stage when we could make a cheap new X-ray system widely available,” admitted Najmudin. “However, laser technology is advancing rapidly, so we are optimistic that in a few years there will be reliable and easy-to-use X-ray sources available that exploit our findings.”

The work is described in Nature Physics DOI: 10.1038/nphys1789.

Fluorescent beads illuminate sugar in blood


The days of pricking fingers several times a day to draw blood could become a thing of the past for diabetes sufferers, thanks to a new way of monitoring sugar levels in the body. The minimally invasive test, developed by researchers in Japan, could also provide patients with a more detailed picture of glucose concentrations as they fluctuate over time.

Shoji Takeuchi and his colleagues at the University of Tokyo, working with collaborators in industry and at Kyoto University Hospital, have developed a type of fluorescent bead that can be injected into skin. The beads contain a type of boronic acid that binds reversibly with glucose molecules. Once inside the body, these beads, which are of the order of 1 × 10² µm in size, can be tracked to reveal blood-sugar levels by exposing the skin to light.

In practice, diabetic patients would need devices containing an excitation light source and photodiode to read out the fluorescent signal. “The device would be as small as a watch or a piercing, and diabetic patients would wear the device,” Takeuchi tells physicsworld.com.

Glowing hair

In a series of in vitro tests, the researchers observed the relative fluorescence increase five-fold as they raised glucose concentrations from zero to 1000 mg/dl. They also carried out a series of in vivo tests by injecting their fluorescent beads into the ear skin of a mouse using a needle commonly used in clinical settings. The implanted beads were still visible through skin layers thicker than 200 µm, even when impeded by hair.

Takeuchi believes that the technique could be commercialized within five years, but there are still aspects of the science that need to be developed before it can be taken to a clinical trial. “For the next step, we are going to improve the microbeads so that they have little protein adsorption – this is important to keep the microbeads performing stably in a body,” he explains.

Ishan Barman, an analytical chemist at Massachusetts Institute of Technology, is impressed by the breakthrough. “The real highlight of this sensor is its capability to provide high intensity levels that lead to sensitive detection, even for transdermal monitoring,” he says.

Outstanding challenges

But Barman also realizes that there are several remaining challenges. “Since this is a minimally invasive design (as opposed to a non-invasive one), one has to further explore the material toxicity question, especially in a human population.” Barman also notes that the researchers need to examine how the particles clear from the body over days and weeks, not just hours.

This research joins a number of other innovations that could provide diabetes sufferers with an alternative to the “finger prick” test. These techniques could offer a more immediate and continual response to changing glucose levels in the blood, as opposed to taking blood samples, which provide only a snapshot in time. Examples include a carbon “tattoo” containing nanotube sensors and a non-invasive Raman-spectroscopy technique, both of which are being developed at MIT.

The work of Takeuchi and his colleagues is described in a recent paper in Proceedings of the National Academy of Sciences.

Is this really the start of Italy’s nuclear renaissance?

By Edwin Cartlidge in Rome

Last week Italy’s energy agency ENEA put on a conference to celebrate the 50th birthday of its Casaccia research centre outside Rome. The occasion also marked the official restart of two veteran research reactors at the site, which, said the agency, represented the symbolic return of nuclear power to the country. A referendum held in the wake of the Chernobyl disaster in 1986 led to all of Italy’s power reactors being shut down, but the current government announced two years ago that it was to return to the nuclear fold and start construction of a number of modern plants by the end of 2013.

There are still many people in Italy who oppose nuclear power, notwithstanding its newfound green credentials. And there are also plenty who believe that the government’s ambitious plans, ultimately to generate 25% of the country’s electricity from nuclear, are destined to become an expensive flop. Certainly, the meeting at Casaccia did not instill confidence.

Being a 50th birthday party, it was natural that scientists and engineers should take a look back at the early days of Italy’s nuclear programme, entertaining us with film clips that re-enacted some of the crucial events leading to Italian physicist Enrico Fermi’s operation of the world’s first nuclear reactor in 1942. But in all the various talks there seemed precious little to indicate that concrete steps are being taken to revive nuclear. Indeed, the politician in charge of Italy’s energy policy, Stefano Saglia, was supposed to come and tell us about the new nuclear programme, but he failed to show up.

Although Casaccia focused predominantly on renewable energies and the environment throughout the 1990s it never entirely left behind its nuclear roots. In particular, it continued to operate two research reactors, the 1 megawatt thermal reactor TRIGA and the 5 kilowatt fast reactor TAPIRO. And it is the restart of these devices, following a two-year period of maintenance, that ENEA boss Giovanni Lelli declared on Wednesday marks the symbolic return to nuclear. But in fact it seems more of a case of business as usual.

TAPIRO will carry out tests that should help in the design of future generation-IV reactors and could also provide useful data in the construction and operation of the generation-III plants that Italy wants to start building in the next few years. But by and large the two reactors seem set to carry on doing what they have done for many years – developing nuclear medicine, providing isotopes to hospitals and industry, and analysing a range of materials. All laudable activities, but nothing to do with building new power stations. Being shown around a 50-year-old reactor, particularly one that gives off an eye-catching blue glow, is fun, but it does not provide convincing evidence that in a few years’ time Italians will once again enjoy the benefits of homegrown nuclear energy.

Topological insulators could help define fundamental constants

A newly discovered class of materials known as “topological insulators” could help physicists to obtain new ways of defining the three basic physical constants – the speed of light (c); the charge of the proton (e); and Planck’s constant (h). That’s the claim of a team of physicists in the US, which has proposed a new experiment to measure the fine-structure constant (α), which is a function of h, c and e, by scattering light from such a material. Topological insulators are unusual in that electrical current flows well on their surface, but not through their bulk.

The new measurement has been proposed by Shou-Cheng Zhang and colleagues at Stanford University as well as researchers at the University of California at Santa Barbara and the University of Maryland. Although there are many other ways of determining α, their technique is the only method that involves measuring a phenomenon that is quantized in units of α. In principle, this means it could provide an extremely precise metrological definition of α.

A new way of defining h, c and e could then be obtained by combining the value of α with measurements of magnetic flux quanta and electrical conductance quanta in the materials, which both depend on h and e.
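As a reminder of how α ties the three constants together, here are the standard SI definitions (these are textbook relations, not formulas taken from the paper): α = e²/(4πε₀ħc), while the magnetic flux quantum and the conductance quantum depend only on h and e.

```python
import math

# CODATA values, SI units
e = 1.602176634e-19      # elementary charge, C
h = 6.62607015e-34       # Planck constant, J s
c = 299792458.0          # speed of light, m/s
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m

hbar = h / (2 * math.pi)

# Fine-structure constant: alpha = e^2 / (4*pi*eps0*hbar*c), roughly 1/137
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)

# Quanta mentioned above, which involve only h and e:
flux_quantum = h / (2 * e)           # magnetic flux quantum, ~2.07e-15 Wb
conductance_quantum = 2 * e**2 / h   # conductance quantum, ~7.75e-5 S
```

Because these three quantities involve h, c and e in different combinations, precise measurements of all of them would in principle pin down the individual constants.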

Insulators that conduct

The strange properties of topological insulators arise from the fact that the shape – or topology – of the electron energy bands makes it impossible for a surface electron to backscatter. Zhang believes that, under certain conditions, this topology leads to an “exact quantization” of how a material responds to an external field. Physicists are already familiar with similar topological responses, which occur when a superconductor is exposed to a magnetic field – resulting in magnetic flux quanta. It is also seen in the quantum Hall effect, when a 2D conductor is exposed to a magnetic field resulting in quanta of electrical conductance.

The physicists argue that a “topological magneto-electric effect” – whereby an electric field can induce a magnetic polarization and a magnetic field can induce an electric polarization – can occur in some topological materials. Furthermore, the topology of the effect means that the response of the material to applied electromagnetic fields is quantized in units of α.

Rotating incident light

To measure the effect, Zhang and colleagues propose an experiment whereby light is shone at a thin film of topological insulator and the Kerr and Faraday rotations are measured. The Kerr rotation is the shift in the direction of the polarization of the reflected light relative to the polarization of the incident light. The Faraday rotation is the shift in the polarization of the transmitted light.

Both effects involve the interaction between light and matter in the presence of a magnetic field. The physicists have derived a formula suggesting that in some topological insulators a certain combination of the Kerr and Faraday angles is quantized in integer multiples of the fine-structure constant.

The next step is to measure this effect – Zhang says that three independent labs are currently attempting to do so in real materials.

The work is reported in Phys. Rev. Lett. 105 166803.

End of the road for prestigious science books prize?


By James Dacey

Yesterday, the biochemist Nick Lane was crowned as this year’s winner of the Royal Society Prize for Science Books. But he may well be the last recipient of this award because the Royal Society has announced that it can no longer afford to finance the prize. It’s a sad state of affairs and could serve to reduce public interest in science in the UK at a time when science budgets are being squeezed.

Lane took the £10,000 prize for his book Life Ascending: The Ten Great Inventions of Evolution, which charts the history of life on Earth through 10 of its most wondrous features, such as sex, photosynthesis and consciousness. Upon winning the award Lane said, “I’ve been following the prize since its inception and I know it’s the highlight of the year for many scientists.”

Among the six books shortlisted for the prize were Why Does E = mc²?, co-written by particle physicist Brian Cox, and We Need To Talk About Kelvin by established science writer Marcus Chown.

The prize was created in 1988 as the science-writing equivalent of the Booker prize for literature. That was the same year that Stephen Hawking published A Brief History of Time, although interestingly Hawking did not win the 1989 prize, which instead went to Roger Lewin for Bones of Contention, a book about controversies in the search for human origins.

Over the years physics-related books have fared well, with winners including David Bodanis in 2006 for Electric Universe: How Electricity Switched on the Modern World and Philip Ball in 2005 for Critical Mass: How One Thing Leads to Another.

But in recent years the prize has struggled to attract media attention after the pharmaceutical company Aventis withdrew its sponsorship in 2006. Now, the Royal Society has said that it will terminate the awards unless it can secure financial support from an external sponsor.

Copyright © 2026 by IOP Publishing Ltd and individual contributors