Keeping ahead: a look at physics in Japan

With a string of new high-profile international research facilities, Japan is maintaining its world-leading status in physics and astronomy. Yet there remain both challenges and opportunities for physicists from abroad to go and work in Japan or to collaborate with Japanese researchers.

In this lecture, Adarsh Sandhu, who has worked in the country for more than 25 years, gives his personal take on physics in Japan.

Date: Wednesday 10 October 2012

Speaker: Adarsh Sandhu, Toyohashi University of Technology, Japan
Professor Adarsh Sandhu has been a faculty member of the Quantum Nanoelectronics Research Centre at Tokyo Institute of Technology since 2002, and director of research at the Advanced Interdisciplinary Electronics Research Institute, Toyohashi University of Technology since April 2010. His research activities include scanning Hall probe microscopy and the development of biosensors based on magnetic labels for rapid medical diagnosis. He is also a visiting professor at Tsinghua University in Beijing and IIT Delhi.

Moderator: Dr Michael Banks, news editor, Physics World

Free access to Nobel winners' papers

Serge Haroche's cat states


The time evolution of a Schrödinger’s cat state realized by Serge Haroche and colleagues.

By Hamish Johnston

To celebrate the 2012 Nobel Prize for Physics, IOP Publishing has collected 30 papers published in its journals by winners Serge Haroche and David Wineland. Articles in this collection are free to read until the end of February 2013.

The above image is taken from a paper by Haroche and colleagues entitled “Manipulating and probing microwave fields in a cavity by quantum non-demolition photon counting” (Phys. Scr. T137 014014). You can read it here.

Quantum-control pioneers bag 2012 Nobel Prize for Physics

The 2012 Nobel Prize for Physics has been awarded to Serge Haroche and David Wineland for their work on controlling quantum systems. The prize is worth SEK 8m (£750,000) and will be shared by the pair, who will receive their medals at a ceremony in Stockholm on 10 December.

According to the prize citation, Haroche and Wineland won “for ground-breaking experimental methods that enable measuring and manipulation of individual quantum systems”.

Haroche is a French citizen and works at Collège de France in Paris. Wineland is a US citizen and works at the National Institute of Standards and Technology in Boulder, Colorado.

In a statement, the Royal Swedish Academy of Sciences said “Serge Haroche and David Wineland have independently invented and developed methods for measuring and manipulating individual particles while preserving their quantum-mechanical nature, in ways that were previously thought unattainable”.

According to Nobel committee member Anne L’Huillier, the pair’s work represents “the first tiny steps towards building a quantum computer”.

Quantum-optics pioneer Alain Aspect of Laboratoire Charles Fabry in Paris told physicsworld.com “Observing, manipulating and controlling individual quantum systems has been a major breakthrough of the last few decades. Schrödinger doubted that it might ever be possible, but this year’s laureates have done it.”

CQED pioneer

Haroche was born in 1944 in Casablanca, Morocco, and in 1971 gained a PhD from Université Pierre et Marie Curie in Paris. He shares half of the prize for developing a new field called cavity quantum electrodynamics (CQED) – whereby the properties of an atom are controlled by placing it in an optical or microwave cavity. Haroche focused on microwave experiments and turned the technique on its head – using CQED to control the properties of individual photons.

In a series of ground-breaking experiments, Haroche used CQED to realize Schrödinger’s famous cat experiment in which a system is in a superposition of two very different quantum states until a measurement is made on the system. Such states are extremely fragile, and the techniques developed to create and measure CQED states are now being applied to the development of quantum computers.

Had to sit down

In a telephone interview with Swedish journalists shortly after the announcement was made, Haroche said that he knew that he had won the prize when his mobile phone rang this morning as he was out walking and a Swedish number was on the display. “I sat down on a bench before I answered,” he said. Although Haroche knew that he was in the running for the prize, he was overwhelmed upon hearing the news. He said that he will enjoy a glass of champagne at lunch with family and friends then “go back to the office to celebrate with colleagues”.

Haroche also said that he was “glad to share the prize with Dave Wineland – he is a fantastic physicist and to be in his company is certainly a great pleasure for me and a great recognition”.

Wineland returned the compliment by saying, “[Haroche] and I have been friends for a long time, so it’s nice to share it with him”.

Master of ion control

David Wineland was born in 1944 in Milwaukee, Wisconsin, and received his PhD in 1970 from Harvard University. As well as being Group Leader and NIST Fellow at the National Institute of Standards and Technology, he also has an appointment with the University of Colorado at Boulder.

Wineland bagged his half of the Nobel for his ground-breaking work on the quantum control of ions. One of his many achievements was the creation and transfer of a single ion in a Schrödinger’s cat state using trapping techniques developed at NIST. Ion traps are created in ultrahigh vacuum using carefully controlled electric fields and a trap can hold just one ion or several in a row.

Ions vibrate as they are held in a trap, and this vibrational energy must be removed in order to cool the ion to its lowest energy state. To achieve this cooling, Wineland developed a laser-based technique to remove quanta of vibrational energy from ions. This “sideband” technique of cooling can also be used to put an ion into a superposition of states – including a Schrödinger’s cat state.

Wineland has also used ion-control techniques to develop extremely accurate optical clocks, as well as circuits for quantum computers.

Groundwork for quantum information

Rainer Blatt of the University of Innsbruck in Austria does experiments in both CQED and ion trapping, and he told physicsworld.com that the Nobel committee chose well in awarding the prize to Haroche and Wineland. Blatt points out that the pair developed similar quantum-control techniques for use on different physical systems – techniques that have laid the groundwork for many of today’s nascent quantum-information systems.

Blatt cites Wineland’s 2008 development of “quantum-logic spectroscopy” – which allows a single ion to be used as an optical clock – as an important application of the control techniques, along with the creation in 2009 of a small-scale device that performs all the functions required in large-scale ion-based quantum processing.

Haroche’s work provides a framework for controlling the interaction between a single atom and a single photon – something that Blatt says is currently being used to develop ways of exchanging quantum information between atoms and photons. This could allow physicists to create quantum computers in which data are stored in stationary quantum bits (qubits) based on atoms, which are relatively stable over long periods of time. Data could then be transmitted between atoms using photons, which can preserve their quantum information while travelling relatively large distances.

The hue of alien Earths

An international team of researchers claims that the link between the colour of a planet and its surface features can be used to prioritize which newly found exoplanets, especially rocky planets with clear atmospheres, should be studied in-depth for signs of life. The work provides an important link between Earth-based geomicrobiology and observational astronomy.

A huge number of exoplanets have been discovered in recent times – just over 800 confirmed examples are known today, with more than 2000 candidates waiting to be confirmed. Of the candidate exoplanets, it is difficult to decide which ones are the most likely to harbour life.

Home sweet home?

“What is now observed is that smaller Neptune-sized planets are, in fact, far more abundant than larger Jupiter-sized ones. This is exciting and one feels that it is only a matter of time before the same can be said for Earth-sized planets around other stars. The question then naturally arises as to how one could characterize these rocky planets to check for their potential habitability,” explains Siddharth Hegde of the Max Planck Institute for Astronomy in Germany. He and colleague Lisa Kaltenegger from the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts, in the US have explored how filter photometry can be used to pinpoint Earth-like exoplanets and study their atmospheric bio-signatures – whether they have aerobic or anaerobic atmospheres. Looking at the diversity of life on Earth, even under extreme conditions, the researchers wonder whether planets around other stars with extreme surroundings could also harbour some form of life.

In astronomy, photometry is a way of measuring the flux of the electromagnetic radiation of an astronomical object. “Filter photometry basically means that you split the collected light [from a celestial object] only into a few wavelength bins that are defined here by the commonly used filters in the visible called ‘B, V, I Johnson–Cousins filters’ [or blue, green and red colour bins],” explains Hegde. The advantage of this approach is that lots of photons are gathered per bin, meaning a good signal-to-noise ratio is achieved – which, in turn, means that it may be possible to characterize dimmer planets. The researchers use this method to identify planets that have surfaces similar to those on Earth that harbour life. This is done by plotting the blue–green versus blue–red bins using customized filters, creating what is known as a “colour–colour diagram”. While the technique does not provide the finer details of a planet, it can very easily be used to put together a follow-up prioritized “target list” of planets that should be studied in detail with spectroscopy.
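The colour-colour approach can be sketched in a few lines of code. This is an illustrative toy, not the Hegde-Kaltenegger analysis: the band fluxes below are hypothetical placeholder values, and a colour index is just the magnitude difference between two filter bands.

```python
# Illustrative sketch of the colour-colour method described above.
# The flux values are invented for demonstration.
import math

def colour_index(flux_a, flux_b):
    """Astronomical colour index: magnitude difference between two bands."""
    return -2.5 * math.log10(flux_a / flux_b)

# Hypothetical reflected fluxes (arbitrary units) in the Johnson-Cousins
# B (blue), V (green) and I (red) bands
flux = {"B": 1.2, "V": 1.0, "I": 0.8}

b_minus_v = colour_index(flux["B"], flux["V"])  # "blue - green" axis
b_minus_i = colour_index(flux["B"], flux["I"])  # "blue - red" axis

# Each planet becomes one point (B-V, B-I) on the colour-colour diagram;
# surfaces that could host extremophiles cluster in a band of this plane.
print(f"B-V = {b_minus_v:.3f}, B-I = {b_minus_i:.3f}")
```

Plotting many such points, one per planet, builds the colour-colour diagram from which a prioritized target list could be drawn.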

True colours

A way of looking for these extreme environments is to study the “albedo” of a planet – its reflectivity as a function of wavelength. For example, snow has a high albedo, meaning that it reflects well, while water has a low albedo and so does not reflect as well. A previous study, conducted in 2003, compared the colour–colour diagrams of rocky and Jupiter-like planets in our solar system to see whether they were the same – they were not. That study concluded that a colour–colour diagram can be used to make a first-order basic characterization of a planet’s nature. Hegde and Kaltenegger extended this idea to rocky exoplanets based on the assumption that these habitats best determine the environmental limits for harbouring Earth-type extremophiles.

Going to extremes

An extremophile is an organism that exists in physically or geochemically extreme conditions – such as extreme temperature, radiation, pressure, dryness, salinity or pH – that are detrimental to most other life-forms on Earth. “By splitting the light from a hypothetical planet, with a surface covered with a material that can harbour extremophiles on Earth, into the three filter bins, we found that those planets fall into a tight band when plotting a colour–colour diagram,” says Hegde.

The method is similar to another already used by exoplanet hunters who look for the “red-edge” – a telltale sign of vegetation – in the spectra of planets. This is a large and abrupt change in the absorption of light by plants that occurs at about 700 nm. At shorter wavelengths, chlorophyll absorbs very strongly and therefore plants reflect little light; above 700 nm, chlorophyll does not absorb light, which means that leaves are able to reflect much more sunlight back into space. Combining such spectral readings with colour–colour diagrams could clearly indicate if a planet has any Earth-like life, or is capable of harbouring it.
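As a minimal sketch of the red-edge idea, one could look for the largest jump in reflectance between adjacent wavelength samples and check whether it sits near 700 nm. The reflectance values here are made up for illustration:

```python
# Toy sketch of spotting a vegetation-like "red edge": a sharp rise in
# reflectance near 700 nm. The reflectance values are invented.
wavelengths_nm = [650, 670, 690, 710, 730, 750]
reflectance = [0.05, 0.05, 0.06, 0.40, 0.48, 0.50]  # chlorophyll absorbs strongly below ~700 nm

# Find the largest jump between adjacent wavelength samples
jumps = [(wavelengths_nm[i], reflectance[i + 1] - reflectance[i])
         for i in range(len(reflectance) - 1)]
edge_wavelength, edge_size = max(jumps, key=lambda j: j[1])

# A large jump starting just below 700 nm is the red-edge signature
print(f"Largest reflectance jump ({edge_size:.2f}) starts at {edge_wavelength} nm")
```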

In the future, the researchers are keen to study possible changes in a planet’s atmosphere caused by different kinds of extremophiles that might inhabit its surface – for the moment, their model assumes the extremophiles do not affect the atmosphere significantly. “Maybe, with the help of biologists who culture such extremophiles in the lab, we can find out if there are gases in the atmosphere that can tell us whether such surfaces really harbour life,” muses Hegde.

A paper on the work is available on the arXiv preprint server.

Nobel for the Higgs is the people's choice

By Hamish Johnston

Our Facebook followers have spoken – when I last checked, about 63% of respondents to last week’s poll believe that scientists involved in the discovery of the Higgs boson should share the 2012 Nobel Prize for Physics.

I agree. The Nobel committee should reward the fantastic work done by those who built the Large Hadron Collider and those who designed and ran the ATLAS and CMS experiments while the discovery is fresh in the minds of the public. A public, I should add, who will be left scratching their heads if and when a breakthrough made 30 years ago trumps the Higgs and bags the Nobel.

Some of you may be shouting “But they don’t yet know if it is the Higgs” at your screen. I would argue that the act of building such a colossal facility, getting it to work, analysing vast quantities of data, and finding something, is worthy of a Nobel – regardless of what that something is.

But alas, I don’t think that a Higgs-related prize will come this year – it’s likely to be a shoo-in for 2013, when we will have a much clearer idea of what has been found at the LHC.

So what other topics have our readers tipped for the prize? Runner-up in our poll with 9% is a Nobel related to the first experimental test of Bell’s theorem. Because this pioneering work – done in 1981 by Alain Aspect and others – marks the beginning of the burgeoning experimental field of quantum information, I’d say it’s a frontrunner for tomorrow’s prize.

Just behind at 8% is the discovery of neutrino mass, which could see my fellow Canadian Art McDonald making the trip to Stockholm.

The prize will be announced tomorrow at 10:30 BST, so stay tuned to physicsworld.com for comprehensive coverage of the 2012 physics Nobel.

Graphene tunnel barrier makes its debut

Researchers in the US have found yet another use for the “wonder material” graphene. Instead of exploiting the material’s exceptional ability as an electrical conductor, the team has found a way to use graphene as an extremely thin “tunnel barrier” to conduction. The team says that this new application is particularly suited to developing spintronics – a relatively new technology that exploits the spin of an electron as well as its charge.

Graphene is a sheet of carbon just one atom thick and ever since the material was first isolated in 2004, researchers have been trying to create electronics devices that make use of its unique properties. Most of this effort has focused on how electrons flow in the plane of the sheet – which can behave both as a conductor and semiconductor. But now Berry Jonker and colleagues at the US Naval Research Laboratory (NRL) have shown that graphene can serve as an excellent tunnel barrier when current is directed perpendicular to the plane of carbon atoms. The spin polarization of the current is also preserved by the tunnel barrier, a finding that could have important implications for spintronics.

Low-energy switching

The spin of an electron can point in an “up” or “down” direction and this property could be used to store and process information in spintronics devices. Circuits that employ a spin current – electrons with opposite spins moving in opposite directions – could, in principle, be smaller and more efficient than conventional electronic circuits that rely on switching charge alone. This is because switching spins from up to down can be done using very little energy.

Spintronics devices are typically made from ferromagnetic materials and semiconductors. Ferromagnetic metals, such as iron or permalloy, have intrinsically spin-polarized electron populations – that is, different numbers of up-spin and down-spin electrons – and thus make ideal contacts for injecting spins into a semiconductor. However, ferromagnets and semiconductors have a large conductivity mismatch, so spin is injected via a tunnel barrier – an electrically insulating barrier through which electrons tunnel quantum mechanically. The problem is that the oxide barriers normally employed as tunnel barriers introduce defects into the system and have resistances that are too high – factors that adversely affect device performance.

Enter the graphene tunnel barrier

To overcome this problem, Jonker and colleagues decided to employ single-layer graphene as the tunnel barrier, because the material is defect resistant, chemically inert and stable. These properties can be exploited to make low-resistance graphene spin contacts that are compatible with both the ferromagnetic metal and semiconductor.

The researchers began by mechanically transferring graphene grown by chemical vapour deposition onto hydrogen-passivated silicon surfaces. They achieved this by floating the graphene on the surface of water and bringing the silicon substrate up from below. This common technique ensures that there is no oxide layer between the silicon surface and the graphene. The team then injected electron spins from a ferromagnetic nickel–iron alloy into the silicon via the graphene tunnel barrier. The voltage arising from the resulting spin polarization in the silicon was then measured using the Hanle effect, a method that is routinely employed by spintronics scientists.

Beyond Moore’s law

“Our discovery clears an important hurdle to the development of future semiconductor spintronics devices – that is, devices that rely on manipulating the electron’s spin rather than just its charge for low-power, high-speed information processing beyond the traditional size scaling of Moore’s law,” Jonker says. “These results identify a new route to making low-resistance-area spin-polarized contacts, which are key for semiconductor spintronics devices that rely on two-terminal magnetoresistance, including spin-based transistors, logic and memory.”

Using graphene in spintronics structures may provide much higher values of the tunnel spin polarization thanks to so-called spin-filtering effects that have been predicted for selected ferromagnetic metal/graphene structures, Jonker adds. “Such an increase would improve the performance of semiconductor spintronics devices by providing higher signal-to-noise ratios and corresponding operating speeds, so advancing the technological applications of silicon spintronics,” he says.

The work, which was supported by programs at the NRL and the US Office of Naval Research, is reported in Nature Nanotechnology.

Tiny spheres simulate crystal melting

Physicists in Hong Kong have used tiny spheres to simulate what happens when a crystalline solid starts to melt. Instead of seeing the emergence of crystalline defects during the melting process, the team found that the transition from solid to liquid is driven by small groups of spheres that move co-operatively in small loops.

Almost all solid objects – including ice cubes – melt from the surface in a process that is well understood by physicists. But while some materials could be heated internally by focussing a beam of light at a point well below their surface, actually seeing how the atoms or molecules make the transition from solid to liquid would be extremely hard. As a result, scientists have little understanding of how internal melting occurs.

To get around this problem Yilong Han and colleagues from the Hong Kong University of Science and Technology simulated the melting process using a crystal made of tiny N-isopropylacrylamide (NIPA) spheres. These spheres all have the same radius of about 700 nm, which is about 1000 times larger than a typical unit cell in a crystalline solid. Crucially, though, this is large enough to be seen using an optical microscope.

Tracking the spheres

The team began by compressing a collection of these spheres to create a face-centred cubic (FCC) lattice – the crystal structure adopted by copper, aluminium and many other metals. A light was then focussed to a point about 45 μm beneath the surface, which is well within the bulk of the crystal. The researchers then watched how the spheres moved, using an optical microscope equipped with a camera that could take pictures at a rate of 15 frames per second.

In a crystalline material, melting is driven by the increase in random thermal motion of the constituent atoms or molecules as the material heats up. For much larger spheres, however, the amount of energy that would be needed to make them move about like atoms would be impossible to deliver. But what is interesting about NIPA is that it shrinks as it is heated, with the volume of the spheres dropping by more than 30% as they are warmed up from 26 °C to 31 °C. The spheres therefore have enough room to move around when heated.

The team began with a NIPA FCC crystal in a “solid state” in which more than 54.5% of the volume is occupied by spheres and the rest by water. While this “volume fraction” is much smaller than the familiar 74% for a hard-sphere FCC lattice, Han points out that this is still the most stable configuration until the volume fraction drops below 54.5%. Below this value, the material becomes disordered and resembles a liquid.
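The arithmetic behind this melting threshold is simple: because the NIPA spheres lose roughly 30% of their volume on heating while the box volume stays fixed, the crystal's volume fraction drops by the same factor. The starting fraction below is an illustrative value above the 54.5% threshold, not a figure from the paper:

```python
# Worked example of the volume-fraction argument above.
phi_start = 0.60       # hypothetical initial volume fraction (solid FCC state)
shrinkage = 0.30       # ~30% volume loss on warming from 26 C to 31 C
phi_threshold = 0.545  # below this, the FCC structure is no longer stable

phi_hot = phi_start * (1 - shrinkage)  # sphere volume shrinks; box volume fixed
print(f"volume fraction after heating: {phi_hot:.2f}")
print("melts" if phi_hot < phi_threshold else "stays solid")
```

Heating thus plays the same role for the sphere crystal that thermal agitation plays for an atomic one: it carries the system across the stability threshold.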

Deviating from equilibrium

Han’s team then focused a heating lamp on a spot of the crystal that is about 75 µm across and 20 μm deep and gently heated the spheres for about 107 minutes. The degree to which the spheres deviate from their equilibrium positions is analysed by calculating their Lindemann parameter (L).
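A minimal sketch of a Lindemann-style calculation is shown below: the root-mean-square displacement of a tracked sphere from its average position, divided by the lattice spacing. The trajectory and spacing are made-up numbers, and the analysis in the actual study is more sophisticated (for example, using displacements relative to neighbours):

```python
# Minimal sketch of a Lindemann parameter from tracked positions.
# Trajectory and lattice spacing are hypothetical illustrative values.
import math

lattice_spacing = 1.5  # assumed, in units of sphere diameter

def lindemann(trajectory):
    """trajectory: list of (x, y, z) positions of one sphere over time."""
    n = len(trajectory)
    mean = [sum(p[k] for p in trajectory) / n for k in range(3)]
    # mean-squared displacement about the time-averaged (equilibrium) position
    msd = sum(sum((p[k] - mean[k]) ** 2 for k in range(3)) for p in trajectory) / n
    return math.sqrt(msd) / lattice_spacing

# Small thermal wiggles about a lattice site give a low L (solid-like)
traj = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (-0.1, 0.0, 0.0), (0.0, 0.1, -0.1)]
print(f"L = {lindemann(traj):.3f}")
```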

The team concluded that the melting process involves two main steps. The material begins in the solid FCC phase with a low L value. As it heats up, large domains of particles then emerge with high L numbers. Finally, a small region of liquid is created that then grows throughout the crystal. According to the team, this two-step process is in line with a century-old principle of melting and crystallization called “Ostwald’s step rule”, which says that an intermediate unstable structure is involved.

When the physicists took a closer look at how melting occurred, they found that the process seemed to begin via “loop rearrangements” of the spheres. This involves a small number of spheres moving in a loop, with one sphere moving into the space vacated by its neighbour and another sphere moving in behind it. While the integrity of the FCC lattice is maintained by these loops, they become surrounded by regions of high L, which then make the transition to the liquid state.

Surprisingly long time

Han told physicsworld.com that the team was initially surprised to see that the large L domains lasted for relatively long times of about 30 s. “Then we realized that particle swapping can stabilize the large-L region and the large-L region can, in turn, promote particle swapping,” he says.

One interesting phenomenon that had not been predicted by theory – but that was seen by the team – was the coalescence of two or more liquid regions into a larger region. This big domain rapidly assumes a spherical shape because of the surface tension at the boundary between the liquid and solid phases. However, one thing that is predicted by some theories – but that was not seen in the experiment – is the emergence of crystalline defects, such as disclinations, during the melting process. Han, however, points out that a defect-mediated melting theory is “just a guess without a theoretical derivation”.

Difficult measurement

Gary Bryant of RMIT University in Australia told physicsworld.com that the new work is an important step towards a better understanding of crystal melting, adding that the study “goes a long way to measuring homogeneous melting in bulk materials, which is normally tricky because temperature variations are usually transferred from the container walls, leading to inhomogeneous or heterogeneous melting”.

Although the work provides important insights into the melting process, both Han and Bryant admit that there are few practical applications of understanding internal melting because it rarely occurs in practice. “One possible application might be in selective annealing – for those trying to create large, defect-free colloidal crystals,” says Bryant. Such crystals could find use in photonics or other nanotechnology applications.

Han and colleagues are now planning to use their technique to look at the second stage of melting – when the nucleus of liquid expands rapidly. They also want to study the effect of local heating on regions of a crystal that contain a single defect, such as a dislocation or grain boundary, to see how its presence affects the melting. Studying solid–solid transitions between two crystal structures as well as solid–glass transitions are also on the researchers’ to-do list.

The research is described in Science.

How to vote

On 6 November Americans go to the polls to elect a president, a third of the US Senate, all 435 members of the House of Representatives, and numerous governors, mayors and local officials. To this extent, it is an ordinary election year. This time around, however, campaigning seems thicker than usual with claims and counter-claims. Many people say they are unsure how to vote.

Not me. I’m a single-issue voter. I vote for the candidates with the most respect for science. I am not saying I necessarily want to see scientists in office; while scientists are prone to wait for definitive answers, political decisions often have to be made before all relevant data are in. I also don’t think political candidates need be scientifically knowledgeable – able to recite π or the periodic table, say. What I am saying is that a necessary qualification for candidates for public office is a respect for the scientific process and its infrastructure.

In case that sounds vague and abstract, let me illustrate. A little over a year ago, Hurricane Irene slammed into the North Carolina coast and made its way north, leaving devastation in its wake. It flooded towns, smashed property, and caused dozens of deaths and many billions of dollars in damage. On Long Island alone it left hundreds of thousands of homes without electricity for days, including mine in Stony Brook. Meteorologists expressed some frustration with their inability to predict the hurricane’s exact path and intensity, citing several unknowns in the myriad factors that govern the paths of storms. More research was needed, they said, before their models would markedly improve. In the light of that uncertainty, what do you think ought to be the rational response for politicians? Is it (a) to reinforce hurricane safety measures while supporting the efforts of scientists to improve their models? Or (b) to ignore hurricane warnings, attack hurricane scientists as a self-promoting cabal of dishonest conspirators and de-fund hurricane research?

Ignoring the science

Okay, it’s a stupid example. Only an insane person answers (b). But why do more and more political candidates – even current office-holders – resort to a version of (b) in responding to issues such as climate change, medical care, vaccines, epidemics, evolution and birth control? For instance, the Republican candidate for vice-president, Paul Ryan, wrote an article in the Wisconsin newspaper The Journal Times on 11 December 2009 in which he said that leading climatologists were perverting the scientific method so as to “intentionally mislead the public”. Ryan subsequently reacted to what he felt was uncertainty over climate change by voting to prevent the US Department of Agriculture from implementing a climate-change protection plan, and to eliminate White House climate advisers.

The most outrageous case of ignoring scientific findings in this election – so far – involves Congressman Todd Akin, a Republican from Missouri and a Senate candidate, and a member of the House Committee on Science, Space and Technology. In August Akin defended his opposition to abortion following rape by saying that pregnancy resulting from rape is rare because “the female body has ways to try to shut the whole thing down”. Akin’s remarks directly contradicted numerous studies on pregnancy resulting from rape, such as one published in 1996 in the American Journal of Obstetrics and Gynecology (175 320).

Lest you think that physics is immune from such treatment – or that I am targeting only Republicans – consider Congressman Dennis Kucinich, an Ohio Democrat who, citing possible hazards, introduced a bill requiring radiation warning labels on mobile phones. But although the world’s seven billion people own an astounding 5.6 billion mobile phones, the National Cancer Institute (Journal of the National Cancer Institute 93 166) and other federal scientific agencies agree that there is no scrap of evidence that electromagnetic radiation can break DNA bonds. Though not as repugnant as Akin’s remark, Kucinich’s action involves shameless posturing over a health issue that can needlessly frighten innocent people.

Politics and science are two very different professions. Politicians come up with socially desirable visions and fact-based plans for achieving them, balancing side effects and costs. Scientists collect technical data and other information that may enter into these plans. In normal situations, scientific work is uncontroversial and operates below the political radar. The danger arises when issues become political lightning rods, and ideological aims cause people to override basic, unavoidable facts bearing on health and safety.

Candidates’ views about science are important because these views tend to reflect how the candidate approaches other issues. This is why I look carefully at each candidate’s attitude towards science, reflected in their web pages, pronouncements and voting record. Do they arrive at a decision first, and then cherry-pick information to support it? Or do they inquire first before coming to a decision, respecting and supporting the scientific infrastructure that has been built to acquire such data?

The critical point

“But it’s real for us!
It’s real for us!
Doesn’t matter what the muggles say,
it’s real for us!”

Lauren Fairweather’s affecting song, “It’s real for us”, is a cult classic among Harry Potter fans. It is about how a youngster’s love for the fantasy land of the young magician helps her to cope with the more intractable adult world. This election year, I’m hearing more political candidates than ever express the same sentiment, though with neither Fairweather’s self-conscious irony nor the realization that the election is not about who governs Hogwarts but the US. It’s a world where wishes and wands cannot control things such as hurricanes, disease, pregnancy and evolution.

Do any candidates realize that? Do they have a track record of supporting the network of institutions that’s been established to discover what does control these things? If so, they get my vote.

Star seen whizzing around supermassive black hole

Astronomers using the Keck telescope have found a new star orbiting very near to the supermassive black hole believed to be at the centre of the Milky Way. This is only the second star that researchers have observed completing an entire orbit – and its discovery confirms the black hole’s presence beyond reasonable doubt. Future observations of both orbiting stars could provide a unique test of general relativity.

The Keck telescope atop Mauna Kea in Hawaii has been used since the mid-1990s to systematically probe the area surrounding the centre of the Milky Way. In doing so, astronomers revealed several stars that appear to be orbiting a central object dubbed Sgr A* (“Sagittarius A-star”). From measurements of the stars’ orbital characteristics, it was calculated that Sgr A* must weigh in at around four million times the mass of the Sun. The only known astrophysical object that could be so massive, yet exist in such a small space, is a black hole.

However, only the orbit of one star – S0-2 – had data covering its entire 16.5-year journey around the centre. Data on the rest of the stars cover less than 40% of their orbits – the remainder has been projected using modelling. In order to characterize an orbit, astronomers believe that at least 50% of a star’s orbit needs to be observed. With only S0-2 breaking this threshold, some sceptics questioned whether a central black hole existed at all.

Better adaptive optics

Now, astronomers, including Andrea Ghez at the University of California, Los Angeles, have revealed the discovery of a new star named S0-102. “The orbital period of this star is just 11.5 years – the shortest of any star known to orbit the black hole,” Ghez told physicsworld.com. “Improvements in adaptive optics have allowed us to find fainter stars and measure them more accurately,” she says. With adaptive optics, the telescope’s mirror is not a single surface but rather a tiled surface made up of smaller mirrors. A laser guide is fired into the sky above the telescope and the distortion of the laser due to atmospheric turbulence is measured. The shape of the mirror can then be adapted by moving individual tiles in order to compensate for the distortion.
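The measure-and-compensate loop described above can be sketched in a few lines of code. This is a toy simulation, not a real telescope control system: each mirror “tile” is reduced to a single correction value, the atmospheric distortion is a fixed random pattern, and the guide-star measurement is that distortion (minus the current correction) plus noise. The tile count, loop gain and noise level are all illustrative assumptions.

```python
import random

random.seed(1)

N_TILES = 16   # number of mirror segments (assumed for illustration)
GAIN = 0.5     # control-loop gain (assumed)

# Unknown atmospheric distortion across the aperture, one value per tile
atmosphere = [random.uniform(-1.0, 1.0) for _ in range(N_TILES)]
# Correctable mirror surface, initially flat
mirror = [0.0] * N_TILES

def residual():
    # Wavefront error the camera sees: distortion minus applied correction
    return [a - m for a, m in zip(atmosphere, mirror)]

for step in range(50):
    # Noisy laser-guide-star measurement of the residual distortion
    measured = [r + random.gauss(0, 0.01) for r in residual()]
    # Nudge each tile towards cancelling what was measured
    mirror = [m + GAIN * e for m, e in zip(mirror, measured)]

rms = (sum(r * r for r in residual()) / N_TILES) ** 0.5
print(f"residual RMS after correction: {rms:.3f}")
```

After a few tens of iterations the residual error settles near the measurement-noise floor, which is why better wavefront sensing directly translates into fainter, sharper stars.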

This technique will also allow the future observation of S0-102 at apoapsis – its furthest distance from the black hole. “This will reduce our uncertainties in parameters like the black hole’s mass,” says Ghez. Having a second star to observe will also allow astronomers to improve their understanding of S0-2’s orbit. In particular, it will help provide a more precise measurement of S0-2’s periapsis – its closest approach to the black hole – in 2018. During periapsis, the star experiences a stronger gravitational force, causing an additional redshift in its light. The precise amount of redshift is predicted by Einstein’s general theory of relativity. The experiment can be repeated when S0-102 reaches its own periapsis in 2021.
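The size of the redshift effect at periapsis can be estimated with the leading-order gravitational term, z ≈ GM/(rc²). The black-hole mass of roughly four million solar masses comes from the article; the periapsis distance of about 120 AU for S0-2 is an assumed figure used here purely for illustration (and the star’s orbital speed contributes a comparable special-relativistic term on top of this).

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
AU = 1.496e11      # astronomical unit, m

M = 4.0e6 * M_SUN  # Sgr A* mass, as quoted in the article
r = 120 * AU       # assumed periapsis distance of S0-2

# Leading-order gravitational redshift of light climbing out of the potential
z_grav = G * M / (r * c**2)
print(f"gravitational redshift z ≈ {z_grav:.1e}")  # of order 1e-4
```

A fractional shift of order 10⁻⁴ is small but well within reach of modern spectrographs, which is why the 2018 periapsis passage is such an attractive target.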

General relativity also predicts the precession of a star’s periapsis. “The fact that space is warped by the gravity of the black hole means that orbits overshoot each time. The point of periapsis moves on in the direction that the star is already orbiting,” explains Ghez. This is similar to the precession of Mercury’s orbit within our own solar system – a puzzle that, when explained by Einstein in 1915, provided an early endorsement of his ideas.
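The Mercury comparison can be made quantitative with the standard Schwarzschild formula for the periapsis advance per orbit, Δφ = 6πGM/(c²a(1 − e²)). The sketch below checks it against Mercury’s famous anomalous precession of about 43 arcseconds per century; the orbital elements are standard textbook values.

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

a = 5.791e10       # Mercury's semi-major axis, m
e = 0.2056         # Mercury's orbital eccentricity
T_days = 87.969    # Mercury's orbital period, days

# Relativistic periapsis advance per orbit, in radians
dphi = 6 * math.pi * G * M_SUN / (c**2 * a * (1 - e**2))

orbits_per_century = 100 * 365.25 / T_days
arcsec = dphi * orbits_per_century * (180 / math.pi) * 3600
print(f"precession ≈ {arcsec:.1f} arcsec per century")  # ≈ 43
```

For a star skimming a four-million-solar-mass black hole the same formula gives a vastly larger per-orbit advance, which is what makes the galactic centre such a promising laboratory.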

Unknown parameter

However, this particular test of relativity is not possible with a single star. “The situation isn’t as simple as two stars orbiting a single black hole,” says Ghez. “There are likely to be other things orbiting in there too, such as stellar-mass black holes and neutron stars,” she adds. This means that the orbiting stars do not see a symmetrical distribution of mass as they pass through this crowded region. If general relativity is to be tested, its effect has to be treated as an unknown parameter – and with the mass distribution also unknown, two stars are needed to solve the equations. “With future advances in adaptive optics, and the next generation of telescopes, we will now be able to see whether Einstein’s relativity stands up in this extreme gravitational environment,” Ghez hopes.

“It is pretty spectacular that they’ve observed the whole orbit of a second star,” says Nils Andersson, head of the General Relativity Group at the University of Southampton, UK. “It shows there has to be a black hole in the centre, and it helps pinpoint how massive it is,” he adds. However, he believes there are stronger tests of general relativity. “I think the best test beyond the solar system is still two pulsars orbiting around each other. That sort of system puts more constraints on Einstein’s theory,” he explains.

The observations are described in Science.

Red carpet physics

Sir Peter Knight


IOP president, Sir Peter Knight. (Courtesy: IOP/Mark Earthy)

By Tushna Commissariat

Yesterday, I was in London attending the annual awards dinner of the Institute of Physics (IOP), which publishes Physics World, as well as the first ever IOP Innovation Awards, held earlier in the afternoon. It proved to be an exciting and jam-packed day, to say the least.

The IOP Innovation Awards have been set up to recognize and celebrate businesses from the UK and Ireland that have achieved significant commercial success by finding a niche in the market and developing physics-based applications to fill it. This year, the four inaugural awards went to a wide range of products.

All afternoon long, the Innovation Awards room was full to the brim with scientists, developers, students and recruiters keen on finding out what the companies did and the products they had to offer. In fact, the room was so busy that IOP president, Sir Peter Knight, who came along to address the ceremony, found it hard to make his way to each event desk and promised a larger space for next year’s meet. He was keen to show the world the “vibrancy of investment in the technology” the various companies had developed. “Physics is not just about cosmology or particle physics – that’s great too – but it’s about making a difference in the world,” he told visitors. Knight finished by promising that next year, the Innovation Awards would be “bigger and better”.

Attendees at the Innovation Awards


Visitors at the Innovation Awards. (Courtesy: IOP/Mark Earthy)

One winner was a small, noiseless, high-volume pump that has many applications in medical devices, developed by Technology Partnership, based at the Melbourn Science Park in Hertfordshire. The tiny pump runs at 20 kHz and is already being used for wound therapy devices and in an electronic atomizer that is used for more efficient drug delivery. The device has only been on the market for 18 months, but has already earned the company more than £1m in additional revenue.

Another award went to Canterbury-based Naneum, which has developed a portable and easy-to-operate particle monitor to detect and identify nanoparticle pollutants, with applications in environmental monitoring, occupational health and atmospheric physics. The company was keen to develop a device that was easy to transport and could be used by any engineer, rather than only by a specifically trained operator. The device – the Nano-ID NPS 500 – is forecast to earn the company more than £1.5m over the next two years.

“Personal confocal” is how the next award winner, Aurox, describes its microscope attachment that lets researchers take 3D high-resolution images without the costs of investing in a confocal laser scanning microscope. Spun out from the University of Oxford, the firm has now partnered with Andor and Carl Zeiss to develop the Viva Tome imaging system. Having developed the new technology three years ago, it has already earned the company almost £1m in additional revenue. Aurox also won a Queen’s Award for Enterprise in Innovation earlier this year.

Samples of ZBD's e-paper


Some of ZBD’s e-paper supermarket labelling. (Courtesy: Physics World/Tushna Commissariat)

The final company to be lauded was ZBD Solutions, which has spent the past 12 years perfecting a novel e-paper display that makes shelf-edge labelling easier. The Malvern-based company was spun out from the liquid-crystal research centre at DERA, formerly the UK Ministry of Defence’s research arm. The current incarnation of the company’s e-paper (pictured above) was developed four years ago and has since created 62 jobs and earned the firm an additional £20m. ZBD Solutions was ranked 5th in this year’s Sunday Times Hiscox Tech Track 100 league table.

The IOP Awards dinner took place later in the evening and 600 of the “who’s who” of the UK physics community were out in their finest clothes. IOP medals span the entire spectrum of physics research, physics education and outreach, as well as the application of physics and physics-based technologies. They are given to “identify and honour physicists who are today making remarkable contributions and to encourage younger members of our community to greater success in the future”. A complete list of the 2012 medals and their winners can be found here.

In Knight’s address to the gathering he highlighted, among other issues, the lack of girls in physics, after it was noted that over the past 20 years girls have made up only 20% of those taking physics A-level. His comments were made in the light of a new report published by the Institute on the same day entitled It’s Different for Girls. The report looks at changing the attitudes of school teachers in all subjects, as well as parents, to encourage girls to take up A-level physics.

Professor Brian Cox, who was awarded the President’s Medal 2012 for his “achievements in promoting science to the general public and inspiring the next generation of physicists”, was the guest speaker at the dinner. In his witty and engaging speech, a video of which you can watch below, he addressed the excitement of the Higgs discovery made earlier this year as well as the sophistication of the Large Hadron Collider. But he also had some strong words to say about promoting bad science and how it was not acceptable – he highlighted homeopathy and some ill-advised comments made by Jeremy Hunt, the current health secretary, on the issue.

To much laughter, Cox followed that up with some amusing comments about “faith-based aviation” or the serious lack thereof by saying, “There is a reason why we don’t have…homeopathic aircraft that run on the memory of petrol.” He also spoke of how it was important for the government to invest in increasing the number of STEM graduates in the UK. He ended his address by thanking the physics community, saying “Without you, I would have nothing to say the next time I stand on a mountain!”

All in all, it was an entertaining and illuminating evening for the people in the UK who are involved in physics…and the raspberry and chilli ice-cream for dessert was excellent too!

Copyright © 2025 by IOP Publishing Ltd and individual contributors