How to cook up a new topological insulator

By Hamish Johnston
First predicted in 2005 and confirmed in the lab in 2007, topological insulators (TIs) are perhaps the hottest materials in condensed-matter physics these days. As well as constituting a new phase of quantum matter that should keep physicists busy for some time, these materials have recently been shown to harbour quasiparticles resembling Majorana fermions. First predicted by the Italian physicist Ettore Majorana in 1937, such particles could be used to store and transmit quantum information without being perturbed by the outside world. As such, they could find use in the quantum computers of the future.

It’s not surprising that scientists worldwide are working hard to discover and study new variants of TIs. However, researchers at Duke University in the US believe that, until now, discoveries have been based on trial and error.

To encourage a more systematic approach, Stefano Curtarolo and colleagues have created a “master ingredient list” that describes the properties of more than 2000 compounds that could be combined to make TIs. The clever bit of the work is a mathematical formulation that helps database users search for potential TIs that are predicted to have certain desirable properties.

The system is based on Duke’s Materials Genome Repository, which has already been used to develop both scintillating and thermoelectric materials.

According to Curtarolo, the system gives practical advice about the expected properties of a candidate material – for example, whether it will be extremely fragile or robust.

Commenting on the fragile materials, Curtarolo says “We can rule those combinations out because what good is a new type of crystal if it would be too difficult to grow, or if grown, would not likely survive?”

The research is also described in a paper published in Nature Nanotechnology.

Bow-shock no-show shocks astronomers

The Sun moves much more slowly relative to nearby interstellar space than was previously thought, according to scientists working on NASA’s Interstellar Boundary Explorer (IBEX) mission. Their study casts doubt on the existence of an abrupt “bow shock” where the edge of the solar system meets the interstellar medium – instead suggesting that the boundary between the two regions is much gentler than previously thought. The discovery could lead to a better understanding of how some cosmic rays can enter the solar system, where they pose a threat to space travellers.

The bow shock refers to the region where the heliosphere – the huge bubble of charged particles that surrounds the Sun and planets – is believed to plunge into the interstellar medium. The commonly accepted idea is that the solar system moves through the interstellar medium faster than the local speed of sound, so that a shock forms rather like the one ahead of a supersonic aircraft. Charged particles therefore pile up at the front of the shock, with their density dropping off rapidly where the heliosphere meets the interstellar medium.

Astronomers have always had good reason to believe the bow shock exists because similar structures can be seen surrounding nearby stars. But the new analysis of IBEX data – which has been carried out by David McComas of the Southwest Research Institute in Austin, Texas, and an international team – suggests that the bow shock does not exist after all. In other words, the solar system is not moving as fast as we thought relative to the interstellar medium.

Fast-moving atoms

Launched in 2008, IBEX orbits the Earth and is designed to study fast-moving neutral atoms. What McComas and colleagues did was to use IBEX to characterize neutral atoms from the interstellar medium that cross into the heliosphere. Because these atoms are not electrically charged, they are not affected by magnetic fields – and so their speed should correspond to the relative velocity of the interstellar medium.

The study suggests that the relative speed is about 84,000 km/h, which is about 11,000 km/h less than previously thought. In addition, data from IBEX and earlier Voyager missions suggest that the magnetic pressure found in the interstellar medium is higher than expected. When these parameters were fed into two independent computer models of the heliosphere, both suggested that a bow shock does not exist, but rather a gentler “bow wave” occurs at the interface.
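
The distinction between an abrupt shock and a gentle wave comes down to whether the heliosphere outruns the fastest pressure disturbances the magnetized interstellar medium can support – the fast magnetosonic speed, which rises with magnetic-field strength. The back-of-the-envelope sketch below illustrates that logic only; the sound and Alfvén speeds are assumed placeholder values, not IBEX results.

```python
import math

# Illustrative check (assumed values, not IBEX data): a bow SHOCK requires
# the flow speed to exceed the fast magnetosonic speed, which grows with
# the strength of the interstellar magnetic field.
v_flow = 84_000 / 3600   # relative flow speed from the article, km/s (~23 km/s)
c_sound = 8.0            # assumed interstellar sound speed, km/s
v_alfven = 25.0          # assumed Alfven speed for a strong field, km/s

# Fast magnetosonic speed for perpendicular propagation: sqrt(cs^2 + vA^2)
v_fast = math.sqrt(c_sound**2 + v_alfven**2)

mach = v_flow / v_fast
print(f"flow {v_flow:.1f} km/s, fast magnetosonic {v_fast:.1f} km/s, Mach {mach:.2f}")
# Mach < 1: no abrupt shock, just a gentler "bow wave"
```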

Gliding through space

“A wave is a more accurate depiction of what is happening ahead of our heliosphere – much like the wave made by the bow of a boat as it glides through the water,” explains McComas.

According to McComas, decades of research based on the presumption of a bow shock must now be redone using this latest information. “It’s too early to say exactly what these new data mean for our heliosphere,” he says. Given that the heliosphere shields the solar system from some cosmic rays, McComas says “there are likely [to be] implications for how galactic cosmic rays propagate around and enter the solar system, which is relevant for human space travel”.

The work is described in Science.

Physics in 100 seconds

Ready, steady, GO!

James Dacey

“What is dark matter?…you’ve got up to 100 seconds to answer…your time starts…NOW!”

This was the challenge facing Luke Davies during a day of filming at the University of Bristol, where academics were asked to give super-condensed lectures on some of the biggest questions in physics. Participants at this UK university were armed with nothing more than a whiteboard and a couple of marker pens. And just to make the experience that bit more thrilling/nerve-racking, speakers were faced with a countdown alarm that sounded once their time was up.

The idea is to compile a series of short films for physicsworld.com that will provide introductions to topics across the whole spectrum of physics and its related disciplines. Films are presented by various physicists and cover everything from antimatter to fracking to black holes. Oh, and I almost forgot to mention the presentation about recognizing penguins in a crowd. From behind the camera, I certainly learned an awful lot about an awful lot!

The scientists appeared to get a lot from the day too. Several of them commented about what a vast departure it was from their usual experiences of presenting: standing in front of students and lecturing for an hour or so. Clearly 100 seconds is not very much time to explain topics as complex and detailed as dark energy or the Higgs boson, but everybody rose to the challenge and it was fascinating to observe the different styles that people adopted.

These films will be appearing on physicsworld.com over the coming weeks.

Astronomers lift the veil on hidden exoplanet

An international team of researchers has detected and characterized an “unseen” exoplanet, simply by looking at its gravitational effects on the orbit of another exoplanet within the same system. While the technique has been used before to confirm the existence of certain exoplanets found using other methods, this is the first time it has been used to discover an exoplanet.

NASA’s Kepler mission is currently looking at more than 150,000 stars, any of which could harbour their own planetary systems. Kepler looks for small dips in a star’s brightness as a function of time – its light curve – that would occur when a planet crosses the face of the star in a “transit”. It also looks for changes in the radial velocity of the star itself – the tiny shifts that occur because of the gravitational pull of a planet. In 2005 two different groups of researchers outlined a method that could help find exoplanets – especially very small ones – for which radial-velocity data are inconclusive and whose transits are not visible from Kepler’s vantage point. The method focuses on irregularities in the transit timings of known exoplanets – dubbed “transit-time variations”.

Predicted twists

A lone exoplanet following a strict Keplerian orbit around its parent star would produce transits that are strictly periodic – equally spaced in time. Any variation in this could mean that the exoplanet is not alone: it may have a moon, or there may be other unseen exoplanets in the system. Since the method was proposed, the Kepler team has used it to confirm exoplanets in a number of systems where the companions also showed transits of their own, rather than as a method to discover hidden exoplanets.
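
In practice, a transit-time-variation search boils down to fitting a strictly periodic ephemeris to the measured mid-transit times and examining the residuals. The sketch below illustrates this with invented timings; the times, period and offsets are hypothetical, not KOI-872 data.

```python
import numpy as np

# Minimal observed-minus-calculated (O-C) analysis with made-up transit times.
# A lone planet on a Keplerian orbit would satisfy t_n = t0 + n*P exactly.
t_obs = np.array([0.00, 33.61, 67.19, 100.84, 134.38, 168.05])  # days (invented)
n = np.arange(len(t_obs))

# Best-fit linear ephemeris t = t0 + n*P (least squares)
P, t0 = np.polyfit(n, t_obs, 1)

# Residuals: structured, non-periodic deviations hint at an unseen
# perturbing planet (or a moon)
o_minus_c = t_obs - (t0 + n * P)
print(f"fitted period: {P:.3f} days")
print("O-C residuals (minutes):", np.round(o_minus_c * 24 * 60, 1))
```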

It is only now that an analysis of publicly available Kepler data has allowed David Nesvorny of Southwest Research Institute in Boulder, Colorado, and colleagues to successfully identify and characterize an exoplanet in a system previously thought to contain only one exoplanet. “We had been looking at hundreds of systems that could host possible exoplanets as identified by Kepler – each is known as a ‘Kepler Object of Interest’ [KOI] – and this one in particular seemed the best candidate as it showed a huge transit-time variation of almost two hours and was not periodic at all,” explains Nesvorny. The team looked at the data from 15 transits of the known exoplanet-b that has an orbit of 33.6 days around its parent star – known as KOI-872 – and then ran complex computer simulations to allow for all possible solutions that could cause the variations.

Hidden effects

The researchers found that the best solution for the gravitational perturbations seen for exoplanet-b is the existence of another nearby exoplanet, exoplanet-c, that orbits the parent star every 57 days. According to Nesvorny, the data supporting the existence of exoplanet-c are “very convincing and have a statistical significance of better than 3σ”. To put their findings to the test, the researchers have predicted the timing variations for numerous upcoming transits of exoplanet-b. Some of the data for this are already available to the Kepler team, while other data will be collected over the next three years. “The Kepler team can check for the predicted trend and they can even check for whatever faint radial-velocity data it can get for the effect of the other exoplanet in the system to test our predictions,” says Nesvorny.

Unexpected addition

As it turns out, soon after the researchers predicted exoplanet-c, a third exoplanet – a super-Earth approximately 1.7 times the size of the Earth and with a very short orbit of 6.8 days – was found in transit data from Kepler. So could it be that this third exoplanet has an equal or greater effect on exoplanet-b? “We did immediately test for that too, but our simulations showed that for a smallish exoplanet like the super-Earth, its gravitational effects would be on the order of minutes and not hours. Of course, there is a chance that the super-Earth is extremely dense or that exoplanet-c is extremely light, but the chances of that are slim. For the observed two-hour transit-time variation, the super-Earth would have to be almost as dense as a neutron star, which is not going to be the case,” says Nesvorny. So the team is sticking to its original prediction for exoplanet-c.

In the months to come, Nesvorny and the team will be looking at more data from the Kepler mission to test their predictions. Nesvorny, who is also part of the Hunt for Exomoons with Kepler (HEK) project, will be looking at data on transit-time variations to search for exomoons. “Exomoons are the next big thing, I feel,” says Nesvorny. “They help determine the mass of an exoplanet and could be situated within the habitable zone of a star and be as interesting as their parent planets,” he says.

The research will be published in Science.

Frequency comb takes a measure of distance

A new method for measuring distance based on an optical frequency comb has been unveiled by physicists in the Netherlands. The main benefit of the technique, which involves passing the light from an optical comb through a Michelson interferometer and analysing the resulting interference patterns, is that it allows distances to be measured accurately without already knowing the value to within half a wavelength of the light used. The technique could be used to measure the distance between satellites or to make very precise measurements of the dispersion of light in optical materials.

A Michelson interferometer consists of two optical paths – or arms – that are at right angles to each other. Incoming light strikes a beamsplitter where the two arms meet and light is sent simultaneously along each arm until it hits a mirror at the end of each path and is reflected back. The two beams are then recombined and sent to a screen, where an interference pattern appears if the two arms are different lengths. By analysing the pattern, the difference in length between the arms can be determined in terms of a fraction of the wavelength of the light.

Half-wavelength limit

In its simplest form, a Michelson interferometer is used with monochromatic light. However, this limits its effectiveness because, before a measurement is made, the distance to be measured must already be known to within one half of the wavelength (λ/2) of the light used – typically less than 500 nm. The problem is that the distance being measured can be expressed as an integer multiple of λ/2 plus a fraction of λ/2 – but this integer multiple cannot be determined from the interference data.
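
The ambiguity is easy to see numerically: the fringe pattern pins down only the fractional part of the distance in units of λ/2, and every integer multiple of λ/2 on top of that fits the data equally well. A minimal illustration (the wavelength and distance below are arbitrary examples):

```python
# The half-wavelength ambiguity of a monochromatic Michelson interferometer.
wavelength = 633e-9           # example helium-neon wavelength, metres
true_distance = 0.1500000213  # metres (arbitrary example)

half = wavelength / 2
fraction = true_distance % half  # all the fringe pattern actually reveals
print(f"measured fraction of lambda/2: {fraction * 1e9:.1f} nm")

# Every one of these distances produces identical interference data:
for N in range(473931, 473936):
    print(f"candidate distance: {N * half + fraction:.9f} m")
# Without prior knowledge of the distance to within lambda/2,
# the integer N cannot be determined.
```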

Physicists have found two ways round this problem. One is to use several lasers of different colours to gain more information about the system. The other is to use a light source with a range of wavelengths and then look for phase differences in the interfering light, which can be related to distance. These techniques have their own problems – not least that thousands of different lasers would be needed to make a precise measurement, and installing them all in a lab would be impractical.

9000 different lasers

What Steven van den Berg and colleagues at the National Metrology Institute in Delft and the Technical University of Delft have done is to combine the two techniques to allow even better distance measurements. The team makes use of a frequency comb – a device that outputs laser light at regularly spaced wavelengths. The result is that the interferometer is effectively operated using about 9000 different lasers in the near-infrared, spanning the 808–828 nm range.

The researchers demonstrated their new technique by firing pulses from their frequency comb into an interferometer with a measurement arm about 15 cm long. After being recombined, the light from the two arms was passed through an optical device called an etalon and then reflected from an optical grating. Both of these devices change the trajectory of the light according to its wavelength, creating an interference pattern that was captured using a CCD camera. This image was then analysed to produce a plot of the intensity of light as a function of wavelength – with each data point corresponding to a different wavelength of the comb. The result is a sine wave from which the measured distance can be calculated.
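
The final analysis step can be sketched in code: the intensity sampled across the comb's wavelengths oscillates with optical frequency at a rate set by the path-length difference, so recovering that oscillation rate yields the distance. The toy model below uses an idealized, noise-free signal and a simple Fourier peak search; it illustrates the principle only and is not the group's actual analysis.

```python
import numpy as np

c = 2.998e8   # speed of light, m/s
L = 0.15      # arm-length difference to recover, metres (example value)

# About 9000 comb lines spanning 808-828 nm, converted to optical frequency
wl = np.linspace(808e-9, 828e-9, 9000)
nu = c / wl

# Spectral interference: one intensity sample per comb line (idealized).
# The fringe oscillates as cos(2*pi*nu * 2L/c), so its period in nu is c/(2L).
I = 0.5 * (1 + np.cos(2 * np.pi * nu * 2 * L / c))

# Resample onto a uniform frequency grid and locate the dominant
# Fourier component, whose "delay" equals the round-trip time 2L/c.
nu_grid = np.linspace(nu.min(), nu.max(), 1 << 15)
I_grid = np.interp(nu_grid, nu[::-1], I[::-1])  # nu decreases with wl, so reverse
spectrum = np.abs(np.fft.rfft(I_grid - I_grid.mean()))
delays = np.fft.rfftfreq(len(nu_grid), d=nu_grid[1] - nu_grid[0])
L_est = c * delays[spectrum.argmax()] / 2
print(f"recovered distance: {L_est * 100:.2f} cm")
# A real measurement refines this coarse estimate by fitting the fringe
# phase, reaching a small fraction of a wavelength.
```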

The team found that the distance could be found to within λ/30 – or about 27 nm. While this is a typical result for a Michelson interferometer, the new set-up allows any distance that is already known to within 15 cm to be measured – compared with the 400 nm prior knowledge required by a monochromatic device.

Measuring optical properties

Van den Berg believes that the system could find uses in space, where light can travel long distances without interference. As such, he says it could be used to make extremely accurate measurements of the distances between satellites. The fact that the system makes very sensitive measurements across a range of wavelengths could also see it being used to measure the properties of optical materials – particularly how the index of refraction changes with wavelength.

The team is currently working on improving the stability of the system and also investigating how it performs over longer distances.

The work is described in Physical Review Letters.

What is your primary source of online physics news?

By James Dacey

These days, pretty much every major newspaper, science magazine or broadcaster has an associated website, and these sites almost always provide the breaking news stories before their printed counterparts. In addition, the Internet is awash with blogs, podcasts and social-media sites, where it is often the scientists themselves who are the first to break new developments to the outside world.

When it comes to slightly longer news and analysis articles, just a few years ago printed media was still the first choice for most people, as reading at length from an antiquated screen could leave you with serious eyestrain. What’s more, busy people on the go didn’t always have immediate access to a computer or an Internet connection to access their chosen news websites. Today the situation is different. Screens have improved, and the proliferation of Internet connectivity, combined with the advent of smartphones and tablets, means that many people can access many forms of news at any time, nearly anywhere.

We want to know where you get most of your updates when it comes to physics news. Let us know via this week’s Facebook poll.

What is your primary source of online physics news?

General news sites
Specialist media sites
Blogs
Social media
Internet radio/podcasts

To share your online habits, please visit our Facebook page. And, if you get the majority of your physics news from a different source, then please let us know what that is by posting a comment on the Facebook poll.

In last week’s poll we were interested to know how you see astronomy in relation to physics. We asked “Do you consider astronomy to be a distinct academic discipline from physics?”

The results are in and 70% of you think that astronomy is not a distinct academic discipline from physics.

Michael Danielides voted with the majority and commented that an astrophysics lecturer once told him that astronomy and astrophysics were both branches of theoretical physics, “because you can’t interfere with the ongoing experiment”.

Thank you for all your participation and we look forward to hearing from you in this week’s poll.

Share your astrophotography

The Moon (Courtesy: Nose in a book, via Flickr)


By James Dacey

To tie in with next month’s transit of Venus, in which our sister planet passes across the face of the Sun, we want you to submit your astronomy photos to our Flickr group. The images could be of star trails, the Moon, meteor showers, the night skies – or, even better, of the transit of Venus itself, which will occur on 5/6 June.

To take part please submit photos to our Flickr group by Saturday 16 June, after which we will choose a selection of our favourite images to be showcased on physicsworld.com.

Please also feel free to include a caption to explain your photo. You may have photographed a rare astronomical event, or perhaps you travelled to a remote location in search of clear skies.

In our previous photo challenge, we asked readers to submit images on the theme of “doing physics”. We had some great submissions, which conveyed the excitement of new physics in the making, with both theorists and experimentalists featuring. You can see a selection of these photos in this showcase.

Doing physics – readers’ pictures

In the latest Physics World photo challenge, we asked readers to submit photos to our Flickr group on the theme of “doing physics”. Thank you to everybody who took part, and here is a selection of the images we received.

This photo, submitted by Flickr member P^2-Paul, manages to look both retro and futuristic at the same time. It is a digitized slide dating back to 1986, showing the set-up of a laser experiment designed to look at the extremely fine spectroscopic features in heavy metal oxides to probe their nuclear structures. “There were 26 degrees of freedom in that set-up. It took a whole morning to tune it up,” Paul recalls.

Constructing physical models can help scientists to visualize structures and processes occurring at small scales. It can also help scientists to communicate their research to others, as is the case in this photo submitted by the University of Wisconsin-Milwaukee in the US. It shows graduate student Haihui Pu (left) and professor Michael Weiner (right), holding aloft a model of a molecule while standing in front of a scientific poster explaining their work.

The precision and beauty of scientific instruments is on display in this image taken by Anjan Reijnders at the University of Toronto. It shows a “homebuilt” device for recording spectroscopic data in the infrared. Reijnders told Physics World that his group currently uses the instrument to study the properties of novel materials such as topological insulators, high-temperature superconductors and quantum-dot solar cells.

The thrill of physics experiments can be enjoyed by people of all ages at all stages in their careers. This photo, submitted by Flickr member SuperDewa, shows budding physicists in a homeschool group testing the forces required to smash a banana with a toy car.

Physicists can sometimes be accused of spending long hours in their offices as they ponder some hideous equation or other. This photo submitted by RubyT shows one young physicist attempting to dispel this image by doing his sums while busy cooking in the kitchen. “On this side of the kitchen, physics homework. On the other side, the dinner he was cooking for the family,” writes RubyT.

Physics can help us to address some of the big philosophical questions of our time, but it can also be extremely useful for some more mundane but equally important tasks – such as finding new water supplies. This picture, simply titled “trabajando” (working), shows four scientists on Mexico’s Sierra Negra mountain searching for an aquifer. The photo’s author, Manolo Arrubarrena, said that his team was using a resistivimeter to take electrical measurements at the surface and build an image of the subsurface.

Physicists can be highly creative at times, building complex experiments from the most basic equipment. This image, submitted by Humboldt State University in the US, shows physics student Brandon Merrill standing behind a plasma generator created out of an old beer keg. The experiment is designed to electrostatically accelerate charged particles to help demonstrate the feasibility of plasma-based fusion.

A good demonstration can capture the imagination and it can be a great tool in science communication. This picture, submitted by Flickr user binnysharma, shows how a compressed air gun and coloured balls were used to creative effect to demonstrate how a gamma-ray camera operates.

Thank you to everyone who submitted images and you can see all the images in our Flickr group, the Physics World photo challenge.

The theme for our next photo challenge is astrophotography. To tie in with next month’s transit of Venus, in which our sister planet passes across the face of the Sun, we want you to submit your astronomy photos to our Flickr group. The images might be of star trails, the Moon, meteor showers, the night skies – or, even better, of the transit of Venus itself, which will occur on 5/6 June.

To take part please submit photos to our Flickr group by Saturday 16 June, after which we will choose a selection of our favourite images to be showcased on physicsworld.com.

In the wake of Fukushima

The Iitate region of Japan is a beautiful place. Its pleasant roads wind through spectacular, steep-sided mountains and terraced paddy fields, and its villages are far away from the coastal areas devastated by the 2011 tsunami. But anyone who visits Iitate – as we did at the end of last year – will immediately notice a problem. There are very few people there. None of the shops were open as we drove around the region’s villages. There were no children playing in the gardens, and ours was one of only a handful of cars on the road.

A look at a map (figure 1) reveals the reason for their absence. The Iitate region is located in Japan’s Fukushima prefecture, around 60 km from the Fukushima Daiichi nuclear power station. After the Tōhoku earthquake on 11 March, a 15 m high tsunami wiped out the station’s power supplies, sending three of the six reactors into a partial meltdown. Tens of quadrillions (10¹⁵) of becquerels of radioactive caesium were released into the air, and over the next few days, wind and rain carried these radioactive isotopes to the surrounding countryside – including the villages of Iitate.

In the months since the meltdown, much of the media attention has focused on what went wrong at the power station, and how safety measures could be improved to prevent it from happening again (see Physics World May 2011 pp12–13 and March 2012 pp19–20). But the people in areas such as Iitate and even further afield have a more immediate concern: their health. It was this concern that led us to visit the region in late 2011, in connection with a project sponsored by TUV Rheinland Japan. The company had been receiving many enquiries about safety and the import/export of goods, and wanted to be able to provide clear, accurate and independent information about the radiological situation in Japan. The goal of the project was to set up new instrumentation at TUV Rheinland’s laboratories in Yokohama to check food and soil for radioactive contamination. Before doing this, we drove across the affected region, collecting soil samples at different distances from the stricken reactor. Then, when we returned to the laboratory, we conducted detailed studies of the samples to help us devise the procedures we would be using for food and soil measurements.

Areas of concern

At first sight, the numbers we found were scary. Tests on the soil around Iitate showed nearly 100,000 caesium disintegrations per second for every kilogram of soil, 2500 times the level found in soil near nuclear power stations in the UK. Such high levels of radioactivity meant that the dose rates in the region’s villages were similar to those present in restricted areas inside nuclear power stations, which only radiation workers wearing protective equipment and dosimeters are permitted to enter. The rates were particularly high near drains and ditches, and on mossy ground – anywhere that rainwater had collected after the accident.

It would be a mistake, however, to get too caught up in the raw numbers. While 100,000 disintegrations per second sounds like a lot, it is not immediately obvious what such figures mean for human health. It is also worth remembering that detecting radioactivity is much easier than detecting other industrial pollutants. The soil in the Iitate region might be contaminated with agrochemicals and heavy metals as well, but this would be much harder to detect. Most of us are willing to delegate our worries about these harder-to-detect contaminants to authorities such as the UK’s Food Standards Agency. Why can we not seem to do the same for radioactivity? Do we not trust the authorities to act in our best interests? If not, who can we trust to keep us safe?

For the citizens of Japan, these are pressing questions. People in this earthquake-prone country are used to living under the threat of natural disasters, and coping with their aftermath is a part of everyday life. Outwardly, the reactions of authorities and individuals have been pragmatic and efficient. Thousands of people have been trained as radiation workers to help with the clean-up efforts, many organizations including supermarket chains have invested in radiation detection equipment to check foodstuffs for contamination, and many private inspection and certification firms (including TUV Rheinland) have taken on new responsibilities for checking industrial products and export cargoes for radioactivity.

But underneath the country’s calm exterior, there is real concern. Low levels of radioactivity can be detected in the soil in Tokyo, more than 200 km from the Daiichi reactors. Reports that traces of radioactive caesium have been found in the urine of children living in Fukushima town, 90 km away, have given parents some sleepless nights. Schools are particularly worried. The International School at Yokohama, for example, has tried to reassure parents by arranging regular radiation surveys and placing restrictions on where it buys food.

The risk to health

One thing that has not helped the situation is the plethora of scientific terms for measuring radioactivity. Confusion between units such as sieverts and becquerels – not to mention prefixes such as milli and micro – has sown confusion among journalists and members of the public, while creating headaches for scientists and officials trying to evaluate and communicate the risks.

In the midst of this muddle, however, one thing is clear: away from the Daiichi site itself, the principal risk to health comes from an increased probability of contracting cancer caused by interactions of radiation with the body. This risk is related to the energy deposited by the radiation in tissue, a quantity known as the dose and measured in sieverts (Sv), where 1 Sv = 1 J kg⁻¹ (see “Dose”). Ionizing radiation can reach the body in two ways. One is via direct gamma radiation from, for example, radioactivity deposited on surfaces. The other is by ingesting radioactive substances (including alpha, beta and gamma emitters), for example in food.

The external gamma dose can be measured directly with a hand-held meter. The dose from ingestion, however, has to be estimated from another quantity called the activity, which is measured in units of disintegrations per second. The standard SI unit of activity is the becquerel (Bq), where 1 Bq = 1 disintegration per second. Unfortunately, this means that the cause of the problem (radioactive isotopes of caesium) is measured in becquerels, while the risk of harm (the effective dose) is measured in sieverts. To add to the communication difficulties, these are not the only units in common usage. The US nuclear industry uses the unit rem instead of sieverts to measure dose (1 Sv = 100 rem), and the curie instead of becquerels to measure activity (1 curie = 3.7 × 10¹⁰ Bq). But even if you stick to SI units, it is still confusing that the number 1 can denote both a very large quantity and a very small one: a dose of 1 Sv is enough to cause immediate radiation sickness, whereas a radioactivity content of 1 Bq is tiny, thousands of times less than the activity caused by traces of radioactive potassium in the human body. As a rule of thumb, millisieverts (mSv, 10⁻³ Sv) and megabecquerels (MBq, 10⁶ Bq) are the levels at which the risks to human health may start to become significant.
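
Keeping the units straight is mostly bookkeeping, as this small conversion sketch shows (it uses only the conversion factors quoted above):

```python
# Unit bookkeeping for the quantities discussed above.
SV_PER_REM = 0.01        # 1 Sv = 100 rem
BQ_PER_CURIE = 3.7e10    # 1 curie = 3.7e10 Bq

def rem_to_sieverts(rem: float) -> float:
    return rem * SV_PER_REM

def curies_to_becquerels(curies: float) -> float:
    return curies * BQ_PER_CURIE

print(f"5 rem = {rem_to_sieverts(5) * 1000:.0f} mSv")
print(f"1 mCi = {curies_to_becquerels(1e-3):.2e} Bq")
# The same digit "1" spans a huge range: 1 Sv causes radiation
# sickness, while 1 Bq is a vanishingly small amount of activity.
```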

In most countries, members of the public receive a dose of about 2–3 mSv per year from a combination of cosmic rays and naturally occurring radioactivity in the environment. To quantify how much the Fukushima disaster has added to this dose, you need to take into account two factors: the external dose (as measured on the dosimeter) and the internal dose from ingesting contaminated food. For foodstuffs, the Japanese authorities initially imposed a limit of 500 Bq of caesium isotopes per kilogram. Someone who ate food at this limit for an entire year would receive, at most, an additional dose of 5 mSv.
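
That 5 mSv figure follows from simple arithmetic: annual food intake multiplied by the activity limit and by a dose-per-becquerel coefficient. The sketch below uses an assumed intake and an assumed effective ingestion coefficient (the published ICRP coefficients vary by isotope and age group), so it reproduces the order of magnitude rather than any official calculation.

```python
# Back-of-the-envelope ingestion dose at the food limit (assumed inputs).
limit_bq_per_kg = 500        # initial Japanese limit for caesium, Bq/kg
food_kg_per_year = 500       # assumed total annual food consumption, kg
sv_per_bq_ingested = 2e-8    # assumed effective dose coefficient, Sv/Bq

intake_bq = limit_bq_per_kg * food_kg_per_year
dose_msv = intake_bq * sv_per_bq_ingested * 1000
print(f"annual ingestion dose at the limit: about {dose_msv:.0f} mSv")
```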

As for the external dose, the radioactivity in some of the villages near the Fukushima nuclear site would result in an annual dose of about 70 mSv. This is high enough to pose a measurable threat to health, and is well in excess of the 20 mSv annual limit for radiation workers in the UK (in practice, occupational radiation doses are much less than this, since employers must ensure that doses are kept “as low as reasonably practicable”). If a worker is exposed to 20 mSv every year for their entire working life, experts estimate that it will increase that person’s risk of dying from cancer by about 0.1%, so exposure to 70 mSv per year increases the risk by about 0.4% – over and above the 20–25% chance we all have of dying from cancer.
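
Those percentages are linear scalings of dose (the “linear no-threshold” assumption): if 20 mSv adds about 0.1% to the risk of dying from cancer, the implied risk is roughly 5% per sievert, and other doses scale proportionally. A minimal sketch of that arithmetic:

```python
# Linear no-threshold scaling using the figures quoted above.
RISK_PER_SV = 0.001 / 0.020   # 0.1% per 20 mSv, i.e. about 5% per sievert

def added_cancer_risk_percent(dose_msv: float) -> float:
    return dose_msv * 1e-3 * RISK_PER_SV * 100

for dose_msv in (20, 70):
    print(f"{dose_msv} mSv -> +{added_cancer_risk_percent(dose_msv):.2f}% risk")
# ...on top of the 20-25% baseline chance we all have of dying from cancer.
```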

However, few people are still living close to the Fukushima Daiichi site, and the dose rate falls off rapidly further from the nuclear site. In major population centres such as Fukushima town, an additional dose of 1 mSv per year might be more typical, although the dose for an individual will depend heavily on diet and lifestyle factors – such as how much time a person spends near highly contaminated areas.

So in total, a person living in a “typical” village in Fukushima prefecture and eating food with the maximum allowed level of radioactive contamination may receive a dose that is roughly double that of the natural background in Japan. This sounds like a lot, but in fact it is less than the additional dose someone would receive by relocating from London to Cornwall, which has a higher natural background because of uranium-bearing ores in the rock. This is a risk that most people would probably be prepared to accept.

However, the fact remains that there are large quantities of radioactive caesium isotopes in the soil, which can be taken up by food grown in the soil, and results from studies around Chernobyl have shown that these isotopes tend to remain in the top 20 cm of soil for decades. A greater challenge than assessing the average or “typical” risk will be to check and control contaminated areas and foodstuffs to prevent higher levels of activity from affecting large populations. Some foreign governments have responded to this risk by carrying out their own measurements of foodstuffs imported from Japan. For example, the Hong Kong government has increased its food surveillance programme and last year measured 70,000 samples.

The view from the ground

When we went to carry out checks on contaminated areas near Iitate, our task was made simpler by the fact that contamination from the Daiichi plant is almost entirely down to caesium-137 and caesium-134 isotopes, which are easy to detect using a technique called high-resolution gamma spectrometry (see “High-resolution gamma spectroscopy”). Moreover, advances in digital signal processing and software developments mean that such spectrometers can be operated by relatively inexperienced staff. In contrast, the Chernobyl disaster in 1986 involved a more complex mix of radionuclides, and the instrumentation available at the time was difficult to set up and run.

Another factor in our favour is that the accuracy of measured dose rates and radioactivity content is underpinned by a mature international measurement system. The starting point for all the measurements is a set of specialist instruments and techniques known as “primary standards”, which are held at national standards laboratories, such as the National Metrology Institute of Japan in Tsukuba and the National Physical Laboratory in the UK, and are used to measure dose rates and activity in terms of fundamental units. These primary standards are cross-checked against each other and are independent of any possible government interference. The primary standards are used to calibrate other instruments or reference materials in an unbroken chain that leads to the measurements themselves.

By and large, then, the raw measurements of the activity and dose rates should be accurate, thanks to historical work at national laboratories worldwide. The more difficult questions concern how these measurements are interpreted, and what authorities choose to do in response. Standard techniques for interpreting the results do exist thanks to work done primarily by the US Environmental Protection Agency, and the nuclear industry has adopted them for decommissioning plants and surveying contaminated land. But unfortunately, at the moment there is not enough information from the Japanese authorities in the public domain to assess how they are interpreting the measurements.

To understand why interpretations matter, consider the task of keeping radioactivity out of the food supply. At the time of our visit, the Japanese government was planning to check one sample of rice per hectare cultivated as part of its efforts to prevent contaminated foodstuffs from appearing in grocery stores. The question we heard often was “Surely this isn’t enough?”. In fact, it depends on how the result is used. If a single result on a single sample of rice is used to decide whether that particular hectare of rice is below the legal limit, then no, it is not enough. Without further measurements, you cannot know the variation in the results and consequently how close one result is to the “true mean activity”. On the other hand, if that result is combined with others from many fields in a particular area, you can obtain a very accurate estimate of the true mean activity for the whole area – and thus be fairly sure whether food grown there is safe for humans.
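
The statistical point can be made concrete with a toy simulation: a single sample says little about its own field, but averaging N single-sample results shrinks the uncertainty on the area-wide mean by a factor of √N. All numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented example: true area-wide mean activity of 80 Bq/kg,
# field-to-field scatter of 40 Bq/kg, 400 fields sampled once each.
true_mean, scatter, n_fields = 80.0, 40.0, 400
samples = rng.normal(true_mean, scatter, n_fields)

single = samples[0]
area_mean = samples.mean()
sem = samples.std(ddof=1) / np.sqrt(n_fields)  # standard error of the mean

print(f"single field sample: {single:.0f} Bq/kg (uncertainty ~ {scatter:.0f})")
print(f"area-wide estimate:  {area_mean:.1f} +/- {sem:.1f} Bq/kg")
# One sample cannot certify one field, but hundreds of samples pin
# down the true mean activity of the whole area rather precisely.
```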

In perspective

The problem of radioactive contamination of the environment around the Fukushima area is real, with levels of radioactivity in some places far in excess of the natural background. However, all the building blocks for limiting the risk to human health are available, in the form of quantified dose limits, accurate measurements with easy-to-use instrumentation and statistical techniques for interpreting the results. It remains to be seen how these resources will be deployed by the Japanese authorities and supported by a regulatory framework. Our impression from talking to our colleagues in Japan during our visit is that people have a palpable fear of radioactivity above and beyond the real risks, and this is bound to influence the authorities in the long term.

But regardless of the methods Japan decides to use to deal with the problem of radioactivity, the overriding impression we got from our visit is that the environmental consequences of the explosions at the Fukushima Daiichi plant are insignificant compared with the human cost of the natural disaster that preceded them. The tsunami killed more than 20,000 people and left 100,000 homeless. Large swathes of the coastal areas are still full of the foundations of former family homes, piles of building rubble and poignant sights such as lines of wrecked motorbikes in a disused school yard and small shrines to lost relatives. Such destruction is a very distressing and sobering sight, and puts into perspective our fears about radioactivity.

Exoplanet burning bright…

An artist’s impression of the 55 Cancri system, with 55 Cancri e nearly lost in the glare of its star. (Courtesy: NASA/JPL-Caltech)

By Tushna Commissariat

Exoplanetary scientists will rejoice to hear that NASA’s Spitzer Space Telescope has, for the first time, managed to detect and analyse the tiny amount of infrared light that comes directly from a super-Earth exoplanet. A few dozen super-Earths – planets that are 2–10 times more massive than the Earth – have been officially detected, and countless other possible candidates have been found.

The exoplanet in question – known as 55 Cancri e – belongs to the 55 Cancri star system, which is a measly 41 light-years away from the Earth – a small distance on astronomical scales. Indeed, 55 Cancri is so bright and close that it can be seen with the naked eye on a clear, dark night. The system is known to have five planets, with 55 Cancri e being the closest to its parent star. The planet is about eight times more massive than the Earth, completes its orbit in a dizzying 18 h – the shortest orbit known for an exoplanet – and is tidally locked, so one side always faces the star.

Previous studies of the planet revealed that 55 Cancri e is an extreme exoplanet with a rocky core surrounded by a layer of water in a “supercritical” state – the water is heated to such a degree that it is somewhere in between a liquid and a gas – and topped off by a blanket of steam. In the new study, Spitzer measured the amount of infrared light that comes from the planet itself by looking at the slight dip in total light intensity when the planet undergoes an occultation – that is, when it circles behind the face of its parent star. When viewed in the infrared, the planet is relatively bright compared with its star because its scorching surface blazes at the infrared end of the spectrum. This information reveals the temperature of a planet and, in some cases, its atmospheric components. Most other current planet-hunting methods obtain indirect measurements of a planet by observing its effects on the star’s light. In this case, the data revealed that the star-facing side of the exoplanet is hotter than 2000 K – hot enough to melt metal.
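
The logic of such an occultation measurement can be sketched numerically: the fractional dip in light when the planet hides behind the star equals the planet-to-star flux ratio, which for thermal emission depends on their relative areas and Planck functions; inverting that relation gives a brightness temperature. Every input below is an illustrative placeholder, not a published Spitzer value.

```python
import math

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23  # Planck, light speed, Boltzmann (SI)

def planck(wavelength_m: float, temp_k: float) -> float:
    """Planck spectral radiance B_lambda, up to a constant factor."""
    x = H * C / (wavelength_m * KB * temp_k)
    return 1.0 / (wavelength_m**5 * (math.exp(x) - 1.0))

# Illustrative placeholder inputs (NOT the published measurements):
wl = 4.5e-6           # a Spitzer infrared band, metres
t_star = 5200.0       # assumed stellar temperature, K
radius_ratio = 0.026  # assumed planet-to-star radius ratio
depth = 1.5e-4        # assumed occultation depth, fraction of total light

# depth = (Rp/R*)^2 * B(T_planet)/B(T_star) -> solve for T_planet by bisection
target = depth / radius_ratio**2 * planck(wl, t_star)
lo, hi = 500.0, 5000.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if planck(wl, mid) < target else (lo, mid)
print(f"brightness temperature ~ {0.5 * (lo + hi):.0f} K")  # ~2000 K here
```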

“Spitzer has amazed us yet again,” says Bill Danchi, who works on the Spitzer programme in Washington, DC. “The spacecraft is pioneering the study of atmospheres of distant planets and paving the way for NASA’s upcoming James Webb Space Telescope to apply a similar technique to potentially habitable planets.”
