
How green is green gas?

Green gas is being talked up of late as one new way forward for decarbonisation. So what exactly is green gas? It could, in fact, be a lot of different things, some of which are far from green. In general, that depends on the sources involved and on the counterfactuals of using or not using them. Biomass use can be near carbon-neutral if the replanting rate keeps up with consumption, but destroying natural carbon sinks (e.g. by aggressive use of forest products for fuel) can lead to a net rise in carbon dioxide, while greenhouse gas (GHG) emissions from using fossil gas may be reduced if carbon capture and storage (CCS) is included. In between these extremes there are all sorts of options, including synthetic gas made from surplus renewable electricity via power-to-gas (P2G) electrolysis.

A helpful new report for the European Commission (EC) from The International Council on Clean Transportation offers a three-part classification system:

  • High-GHG: Gases with a life-cycle greenhouse gas intensity that is 30% or more of that of business-as-usual natural gas. The EC says that this category of gas “likely needs to be phased out in the near future in order to meet Europe’s climate targets”.
  • Low-GHG: Gases that reduce lifecycle greenhouse gas emissions by a substantial degree. For example, the Renewable Energy Directive 2018/2001 requires renewable power-to-methane to reduce greenhouse gas emissions by 70% compared to fossil fuels.
  • GHG-neutral: Gases with zero net greenhouse gas emissions. This includes pathways with negative greenhouse gas emissions on a life-cycle basis.
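
To make the boundaries concrete, here is a minimal Python sketch of how a gas pathway might be sorted into these three categories from its life-cycle GHG intensity. The comparator value and the example intensities are purely illustrative, and the thresholds follow one plausible reading of the categories above (with the RED II 70% saving standing in for “substantial”); this is not the ICCT’s actual methodology.

```python
# Illustrative sketch only: assign one of the three categories from a
# life-cycle GHG intensity (gCO2e/MJ), using a fossil natural gas comparator.
# Thresholds follow one plausible reading of the text above; they are not
# the ICCT's official definitions.

FOSSIL_GAS_INTENSITY = 94.0  # gCO2e/MJ, indicative fossil comparator


def classify_gas(lifecycle_intensity_g_per_mj: float) -> str:
    """Return 'GHG-neutral', 'Low-GHG' or 'High-GHG' for a gas pathway."""
    saving = 1.0 - lifecycle_intensity_g_per_mj / FOSSIL_GAS_INTENSITY
    if lifecycle_intensity_g_per_mj <= 0:
        return "GHG-neutral"   # zero or net-negative life-cycle emissions
    if saving >= 0.70:
        return "Low-GHG"       # e.g. renewable power-to-methane under RED II
    return "High-GHG"          # little or no saving versus fossil gas


if __name__ == "__main__":
    # Made-up example intensities (gCO2e/MJ), purely for illustration
    for name, intensity in [("waste-based biomethane", 20.0),
                            ("maize biomethane incl. ILUC", 80.0),
                            ("P2G methane from direct air capture", -5.0)]:
        print(f"{name}: {classify_gas(intensity)}")
```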

Gaseous complexity

It certainly can get complicated. For example, the report says “the use of additional or excess zero-GHG electricity from wind and solar to produce hydrogen via electrolysis, or green hydrogen, is a renewable, GHG-neutral pathway. However, that same process utilizing conventional fossil-powered electricity in the absence of CCS will be neither renewable nor GHG-neutral. As with fossil gases, the production of hydrogen or synthetic methane using electrolysis utilizing conventional power with CCS could place those fuels in the low-GHG category, but they would be non-renewable”.

And here’s a nice conundrum — is carbon dioxide really saved when it is used to make new fuels? The report says that it is important to draw a distinction between CCS and carbon capture and utilization (CCU). It notes that “in CCU, captured carbon is utilized for a given fuel production process or end-use. In this case, CO2 from a fossil fuel power plant or used gases from a steel mill may be utilized as an input for synthetic methane production.” But of course, when that syngas is burnt, you get CO2 again.
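
A back-of-the-envelope comparison makes the distinction concrete. The hedged sketch below uses made-up, idealized numbers (no capture energy, no leakage, perfect displacement) and is not drawn from the report; it simply shows that with CCU the captured tonne of CO2 reaches the atmosphere anyway, so the saving comes from displacing other fossil fuel rather than from permanent removal.

```python
# Illustrative carbon accounting with idealized numbers, not figures from the
# report. Baseline: a fossil plant vents 1 t of CO2 and, separately, 1 t of CO2
# is emitted by burning fossil gas elsewhere.

baseline = 1.0 + 1.0                 # plant vents + fossil gas burned elsewhere

# CCS: the plant's tonne is stored permanently; the fossil gas is still burned.
ccs_total = 0.0 + 1.0

# CCU: the plant's tonne is captured, made into synthetic methane and burned
# in place of the fossil gas -- so that tonne reaches the atmosphere after all.
ccu_total = 1.0 + 0.0

print(f"baseline emissions: {baseline:.1f} t")
print(f"with CCS:           {ccs_total:.1f} t (saving {baseline - ccs_total:.1f} t, tonne stored)")
print(f"with CCU:           {ccu_total:.1f} t (saving {baseline - ccu_total:.1f} t, via displacement only)")
```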

What’s more, the report worries that “in the long-term, it is possible that policy tools may be necessary to ensure that CCU does not perpetuate the use of fossil fuels. If all industrial CO2 emissions are phased out in the future or claimed by other circular economy processes, all CCU must utilize CO2 from direct air capture to continue delivering climate benefits”.

Yes, true, CCU may lead to fossil “lock in”. But so may CCS. Whatever is done with the carbon removed from the atmosphere, or from power station exhausts, there will still be more coming unless we switch to non-carbon energy sources or cut demand so that less energy is needed. Indeed, some say that carbon removal offsets and zero-carbon generation are fundamentally at odds. A focus on compensatory carbon removal deflects resources from zero-carbon renewable energy generation and also from energy-saving — arguably the only long-term solutions to climate change.

Then again, not all ostensibly renewable gas generation is necessarily green. The report says: “The cultivation of silage maize for biomethane production in the EU can be expected to displace food production on cropland and thereby cause indirect land-use change” and that “in conjunction with cultivation and processing emissions, pushes that gas source into the high-GHG category”. What’s more, “within a given combination of feedstock and process, factors such as the degree of methane leaks for a given facility may further influence the GHG categorization for that producer”. So some green gases may have issues.

New sources

How about grass – converted to biogas by anaerobic digestion (AD)? UK energy supplier Ecotricity is currently pushing it as a vegan energy option, since it does not involve animal products: a metric the EC evidently hasn’t as yet taken on board! Ecotricity says it will start work on grass-powered biogas plants soon. AD conversion of domestic food and farm waste to biogas is another key option, with no new land-use implications. There are already 1 million UK biogas users, and biogas and other green gases can play a role in fuelling local combined heat and power plants linked to district heating networks.

The UK and many other countries certainly need to get on with decarbonising heat, and green gas could be a key option, replacing fossil gas. But, as the report for the EC notes, green gas of various types can also be used to make other fuels and products, including synfuels for use in some vehicles. Will there be enough green gas for that, as well as for heating and for power generation? Although there are obvious land-use limits to growing biomass as an energy crop, there are other sources, including farm and other wastes, and the biogas resource overall does look quite large. The Global Biogas Association says that only 2% of available feedstocks currently undergo anaerobic digestion to be turned into biogas and that, if fully developed, biogas could cut global greenhouse gas emissions by up to 13%.

The synthetic gas resource could be even larger, if renewable power generation expands as much as is expected, creating large surpluses at times, which can be converted to hydrogen and possibly methane at reasonably low cost. Some projects may be designed specifically to make hydrogen. For example, there is a UK proposal for a £12 billion 4 GW floating offshore wind farm for on-board hydrogen production, the gas then being piped to shore. That may be some way off — probably not until the early 2030s. A 2 MW prototype off Scotland is planned first, followed, if all goes well, by a 100 MW multi-unit test project in 2026 and leading on, if further backing can be found, to the full 4 GW version. It is a fascinating idea. The developer says, “if you had 30 of those in the North Sea you could totally replace the natural gas requirement for the whole country, and be totally self-sufficient with hydrogen”.
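
For a rough sense of scale, the sketch below estimates the annual hydrogen output of a single, fully dedicated 4 GW farm. The capacity factor and electrolyser efficiency are assumed, illustrative values, not figures from the project.

```python
# Back-of-the-envelope estimate with assumed values (capacity factor and
# electrolyser efficiency are illustrative, not project figures).

capacity_gw = 4.0
capacity_factor = 0.5          # assumed for floating offshore wind
electrolyser_eff = 0.7         # assumed, on a lower-heating-value basis
h2_lhv_kwh_per_kg = 33.3       # lower heating value of hydrogen

hours_per_year = 8760
electricity_twh = capacity_gw * capacity_factor * hours_per_year / 1000  # TWh/yr
hydrogen_twh = electricity_twh * electrolyser_eff                        # TWh/yr (LHV)
hydrogen_tonnes = hydrogen_twh * 1e9 / h2_lhv_kwh_per_kg / 1000          # t/yr

print(f"electricity generated: {electricity_twh:.0f} TWh/yr")
print(f"hydrogen energy (LHV): {hydrogen_twh:.0f} TWh/yr")
print(f"hydrogen mass:         {hydrogen_tonnes / 1e6:.2f} Mt/yr")
```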

One way or another, even if bold plans like that do not win through, the UK could have plenty of easily stored and pipe-transportable green energy for multiple uses, as long as any leakage problems can be dealt with: see my earlier post. Many other countries also have large green gas/hydrogen potentials. For example, in addition to biogas inputs, in most EU decarbonisation scenarios, hydrogen and derived fuels add up to between 10% and 23% of 2050 final energy consumption. And a new IRENA report is quite optimistic about the global prospects for green P2G hydrogen: “Hydrogen from renewable power is technically viable today and is quickly approaching economic competitiveness”.

More than electricity

Not much of this, however, seems to be taken on board in the scenarios reviewed in the recent Global Energy Outlook produced by Washington-based Resources for the Future (RFF). As in an earlier World Energy Council (WEC) review, the outlook compares scenarios from oil companies, the IEA (International Energy Agency) and so on, most of which see fossil fuels still romping ahead, feeding growing energy demand, with renewables, although expanding, barely keeping up. In most of the scenarios RFF looks at, the emphasis is on electricity-producing renewables, and under the highest renewables scenarios examined, they only reach a 31% primary energy share by 2040, matching oil’s share. Emissions rise in all scenarios, even those using CCS.

However, neither the WEC nor the RFF reviews look at the many, much more ambitious “100% renewables by 2050” global scenarios from NGOs like GENI and academics like Mark Jacobson and the team at Lappeenranta University of Technology (LUT). In all, there are now 42 peer-reviewed “100%”, or near-100%, renewables studies. Most of them cast the net wider than just electricity, with some including direct green heat supply on a large scale, with solar and biogas combined heat and power (CHP)-fed heat networks and heat stores, though nearly all of them still focus heavily on the power-supply side, leavened in some cases by P2G conversion to green syngases.

Electricity-generating renewables will obviously rule the roost, as costs continue to fall, and electricity has many good end-uses, but there are also other options that can play a part in meeting energy needs and in balancing variable green electricity power sources. Green gas is one.

Lithium-ion battery pioneers bag chemistry Nobel prize

The 2019 Nobel Prize for Chemistry has been awarded to John Goodenough, Stanley Whittingham and Akira Yoshino for the development of lithium-ion batteries.

The trio will share the SEK 9m (about £740,000) prize money equally and the award will be presented at a ceremony in Stockholm on 10 December.

“Electrochemistry is at the heart of this award, but other branches of chemistry have played a part,” said Nobel Committee member Olof Ramstrom from the University of Massachusetts when the winners were announced. “This prize is also connected to physics and even engineering – it’s a good example of how these disciplines have come together.”

Speaking over the telephone at the press conference, Yoshino said that it was “amazing” to hear he had won the Nobel prize. He added that it was “curiosity” rather than thinking about the immediate applications that pushed the work, and said that lithium-ion batteries are being used to build a “sustainable society”.

The story of the lithium-ion battery begins in the oil crisis of the 1970s when Whittingham was trying to develop new energy systems. He discovered that a battery cathode made of titanium disulphide can absorb large numbers of lithium ions from a lithium anode. In 1980, Goodenough showed that an even better performing cathode can be made from cobalt oxide.

Metallic lithium is an excellent anode material because it readily gives up electrons; however, it is highly reactive and early batteries were prone to exploding. In 1985 Akira Yoshino solved this problem by creating a carbon-based anode that is able to absorb large numbers of lithium ions. This removed the need to use reactive metallic lithium, and the first commercial lithium-ion battery appeared in 1991. Since then, the devices have powered a revolution in handheld electronics and electric vehicles.
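
For reference, the electrode chemistry described above is usually summarized by the idealized discharge half-reactions of a graphite | lithium cobalt oxide cell (a textbook description, not taken from the laureates’ original papers):

```latex
% Idealized discharge half-reactions of a graphite | LiCoO2 lithium-ion cell
\begin{align*}
\text{anode:}\quad   & \mathrm{Li}_x\mathrm{C}_6 \;\longrightarrow\; 6\,\mathrm{C} + x\,\mathrm{Li}^+ + x\,e^- \\
\text{cathode:}\quad & \mathrm{Li}_{1-x}\mathrm{CoO}_2 + x\,\mathrm{Li}^+ + x\,e^- \;\longrightarrow\; \mathrm{LiCoO}_2
\end{align*}
```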

Oldest ever winner

Goodenough did a PhD in physics at the University of Chicago and at 97 is the oldest person ever to receive a Nobel prize. An American citizen, he has worked at the Massachusetts Institute of Technology, University of Oxford and the University of Texas at Austin.

Whittingham is a chemist who was born in the UK in 1941 and is based at Binghamton University in the US.

Also a chemist, Akira Yoshino was born in Osaka, Japan in 1948 and is at the Asahi Kasei Corporation in Japan and Meijo University in Nagoya.

The physical chemist and nanoscientist Rodney Ruoff of the Institute for Basic Science Center for Multidimensional Carbon Materials in Korea had been a colleague of Goodenough’s at the University of Texas. He told Physics World that Goodenough is “fun and fascinating”. “He’s a well-rounded individual, well read on many topics as well as, of course, topics of science surrounding the things he has done during his research career,” says Ruoff. “I’m really delighted that he has received this honour along with [Whittingham and Yoshino]”.

We have also created a collection of key papers by Goodenough, Whittingham and Yoshino, which are free to read until 31 October 2019. You will find these papers below the list of papers written by the 2019 physics Nobel Laureates.

Physics World’s Nobel prize coverage is supported by Oxford Instruments Nanoscience, a leading supplier of research tools for the development of quantum technologies, advanced materials and nanoscale devices. Visit nanoscience.oxinst.com to find out more.

Quality control tests for PET/MRI vary widely in Europe

© AuntMinnieEurope.com

A survey of eight large European imaging facilities with extensive clinical experience of PET/MRI has found wide variations in daily quality control and assurance procedures that ensure their proper function and image quality. The international study was published on 24 September in Frontiers in Physics.

One reason for the differences in performance testing protocols, the researchers suggested, is the lack of dedicated quality control recommendations for the hybrid imaging modality beyond standard vendor guidelines.

“The reported variations of local PET and MRI quality control procedures and testing frequencies are assumed to partly reflect the variations seen in the existing guidelines for the single modalities and the nonexistence of specific recommendations for PET/MRI,” wrote the authors, led by Alejandra Valladares, a doctoral candidate from the Medical University of Vienna.

European PET/MR sites

“However, in part, the variability in the reporting seems to be caused by differences in the definition of the specific tests between the centres and a lack of in-depth knowledge about the implemented quality control measures included in the daily quality assurance and the tests performed by the vendor during the preventive maintenance,” they explained.

Hybrid hype

The term “hybrid” means more than simply the integration of two imaging modalities. It is also the acronym for Healthcare Yearns for Bright Researchers for Imaging Data (HYBRID), a training network project funded by the European Commission (MSCA ITN 764458) with goals that include the promotion of molecular and hybrid imaging modalities to advance personalized medicine and non-invasive disease evaluations. The HYBRID consortium consists of international academic, industrial and non-governmental partners at eight European locations with extensive clinical experience in PET/MRI, along with three PET/MRI vendors (www.hybrid2020.eu).

During the first phase of commercialization almost a decade ago, three PET/MRI systems came to market, all of which had similar technological components and state-of-the-art image reconstruction algorithms, and all of which require stringent quality control procedures and careful monitoring of image-quality parameters.

“Hybrid imaging systems such as PET/CT or PET/MRI, in particular, present additional challenges caused by differences between the combined modalities,” the authors wrote. “However, despite the increasing use of this hybrid imaging modality in recent years, there are no dedicated quality control recommendations for PET/MRI.”

So, how are institutions handling their duties to keep these hybrid scanners running at peak efficiency? Researchers set out in this study to summarize PET and MRI quality control programs employed by European hybrid imaging sites, as well as guidelines on maintenance and performance measures recommended by the systems’ manufacturers. The end goal was to develop a consensus on minimal quality control standards for PET/MRI for use throughout the HYBRID consortium.

Of the eight facilities with PET/MRI systems, five had Biograph mMR systems from Siemens Healthineers, two had Signa PET/MR systems from GE Healthcare and one had an Ingenuity TF PET/MR from Philips Healthcare. The GE and Philips systems both have PET time-of-flight (TOF) capabilities.

Survey results

All PET/MRI centres stated they perform daily quality assurance, which includes assessing detector stability in accordance with the manufacturers’ guidelines. There were, however, cases where facilities handled certain chores less frequently than others (Front. Phys. 10.3389/fphy.2019.00136).

Take sensitivity testing, for example, which showed “high variability across the centres,” the authors wrote, ranging from “daily, quarterly, annually or at the commissioning of the system (during the acceptance test).” The sensitivity of a PET system reflects “how many events can be detected for a given activity within the field-of-view. A change in sensitivity can possibly be caused by malfunctioning detectors, changes in the energy resolution or the coincidence timing window.”

Regarding spatial resolution, two centres provided no information on how often they tested, while one facility indicated the task was performed daily and the other five participants indicated their evaluations ranged from yearly to once every 10 years.

For image uniformity, three centres indicated they tested daily, compared with three other centres that conducted their assessments every three months, one centre that tested annually, and one location that did not perform the test as part of its routine quality control.

As for image quality, attenuation- and scatter-correction accuracy and quantitation, three centres ran those tests annually. One centre reported this assessment every 10 years, two facilities conducted the test as part of their acceptance of the PET/MRI, and one centre does not perform it as part of the routine quality control.

Vendor evaluations

The trio of PET/MRI manufacturers was also involved in the survey to describe their standards for quality assurance. For daily quality control of the PET component, all three vendors provide dedicated phantoms containing long-lived positron-emitting sources. Using these phantoms, a detector stability test is included within all daily quality assurance procedures, but other system and component testing varied between vendors, the authors noted.

For the MRI component, for instance, Siemens’ quality assurance and preventive maintenance procedures usually include an image quality test using a spherical head phantom with an outer diameter of 180 mm and a spherical body phantom with an outer diameter of 300 mm, both of which the company provides. This test is done to assess signal-to-noise ratio (SNR) and artefacts.

GE provides at least one specific phantom in the daily quality assurance package to check the system’s MRI component for its geometric accuracy, ghosting level, and SNR. The imaging centre has the choice of how frequently to perform the test.

While Philips’ daily quality assurance for the Ingenuity PET/MR system does not include any test for the MRI component, the company does provide a 200 mm head phantom to validate the MRI component of its system through an MRI image quality evaluation.

Consensus recommendation

After reviewing the quality control procedures of the imaging centres and the vendors’ guidelines, Valladares and colleagues developed their own set of recommendations.

Quality control measures

Their minimum consensus for all the three PET/MRI systems includes the following:

  • Daily quality assurance implemented by the vendor for the PET component, with quarterly cross-calibration testing and yearly image quality evaluations
  • A monthly coil check for the MRI scanner and a quarterly MR image quality test
  • An assessment of the PET/MRI system’s alignment after any mechanical adjustments or work on the system’s gantry and after software updates

“It is also recommended to check the function and quality of additional hardware,” the authors added. “Checking the [MRI] coil performance permits the detection of issues with the coils before these affect clinical scans and clinical image quality. This test is particularly important when using flexible coils, which are more susceptible to deterioration.”
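
As a convenience, a site could encode this minimum consensus as a simple machine-readable schedule, as in the hedged sketch below; the structure and field names are hypothetical and not part of the authors’ recommendation.

```python
# Hypothetical encoding of the minimum consensus QC schedule described above.
# Field names and structure are illustrative only.

CONSENSUS_QC_SCHEDULE = {
    "PET": {
        "vendor daily QA (detector stability)": "daily",
        "cross-calibration": "quarterly",
        "image quality": "yearly",
    },
    "MRI": {
        "coil check": "monthly",
        "image quality": "quarterly",
    },
    "PET/MRI system": {
        "alignment check": "after mechanical work on the gantry or software updates",
    },
}

for component, tests in CONSENSUS_QC_SCHEDULE.items():
    for test, frequency in tests.items():
        print(f"{component}: {test} -> {frequency}")
```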

Taken together, they concluded, the consensus recommendation is a positive step toward the standardized PET/MRI procedures and protocols that will help support high-quality assessments in clinical routine and research using this multiparametric, hybrid imaging modality.

  • This article was originally published on AuntMinnieEurope.com ©2019 by AuntMinnieEurope.com. Any copying, republication or redistribution of AuntMinnieEurope.com content is expressly prohibited without the prior written consent of AuntMinnieEurope.com.

Hamiltonian learning technique advances quantum spin register

Researchers have developed an efficient way to characterize the effective many-body Hamiltonian of the solid-state spin system associated with a nitrogen-vacancy (NV) centre in diamond. The technique will be important for making and controlling high-fidelity quantum gates in this multi-spin quantum register, they say.

A quantum register is a system made up of many quantum bits (qubits) and it is the quantum equivalent of the classical processor register. Quantum computers work by manipulating qubits within such a register and making and understanding such systems is essential for quantum information processing.

Crosstalk is a problem

Researchers have made quantum registers from many physical systems thus far, including trapped ions, solid-state spins, neutral atoms and superconducting qubits. Whatever their nature, however, they all suffer from the same problem: crosstalk between multiple qubits when an individual qubit is addressed during a measurement. This induces state errors on the other qubits and reduces gate fidelity.

One way to overcome this problem is to fully characterize the many-body Hamiltonian of the system, which determines how it evolves, including its crosstalk interactions. “Once we have this information, we can predict the evolution of any initial states and, what is more, design optimized gate operations to reduce crosstalk errors and achieve high-fidelity gates in a multi-qubit register,” explains Panyu Hou of the Center for Quantum Information at Tsinghua University in Beijing, who is the lead author of this new study. “The Hamiltonian can often be fully described by some essential parameters, which characterize the coupling between the qubits.”

Diamond defects

Hou and colleagues studied the electron spin and the surrounding nuclear spins of a single NV centre in bulk diamond. An NV centre, or defect, is one of the hundreds of colour centres found in natural diamond and it is known to be a promising platform for quantum information processing. The centre involves two adjacent carbon atoms in the diamond lattice being replaced by a nitrogen atom and an empty lattice site (or vacancy).

The electron and nuclear spins in the NV centre each play a different role. Electron spins allow for fast control and high-quality readout and, when coupled to light, allow for entanglement between qubits, even over long distances. This means that they can be used to realize a scalable quantum network. The nuclear spins for their part provide additional qubits that are stable – that is, they have long coherence times – and can thus be used to store and process quantum information.

Constantly-on interactions

In this solid-state spin register, however, the interactions among all the spin qubits are constantly on, explains Hou. This is why crosstalk errors arise from nuclear spins other than the target one being addressed.

In their work, the researchers characterized the effective many-body coupling Hamiltonian of an NV centre containing one electron spin and ten weakly coupled 13C nuclear spins. They then used the learnt Hamiltonian parameters to optimize the dynamical decoupling sequence and so minimize the effect from the unwanted crosstalk coupling. This adaptive scheme was first put forward by researchers at Delft University in the Netherlands.
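
To give a flavour of the kind of model being characterized, here is a hedged toy sketch in Python: an effective electron–nuclear Hamiltonian of the secular hyperfine form, with made-up coupling values and only three nuclear spins rather than ten, plus the free-evolution signal whose modulations encode the parameters a learning routine would fit. It is an illustration of the general idea, not the authors’ 11-qubit register or their learning protocol.

```python
# Hedged toy model: an NV electron spin (two-level subspace) coupled to a few
# 13C nuclear spins via secular hyperfine terms, plus the nuclear Larmor term.
# All coupling values are made up for illustration; this is not the authors'
# 11-spin register or their Hamiltonian-learning protocol.
import numpy as np
from scipy.linalg import expm

I2 = np.eye(2)
sx = np.array([[0.0, 0.5], [0.5, 0.0]])   # spin-1/2 x operator
sz = np.array([[0.5, 0.0], [0.0, -0.5]])  # spin-1/2 z operator

def embed(op, site, n):
    """Place a single-spin operator at position `site` in an n-spin register."""
    ops = [I2] * n
    ops[site] = op
    out = ops[0]
    for o in ops[1:]:
        out = np.kron(out, o)
    return out

# Toy parameters (angular frequencies, rad/s)
omega_L = 2 * np.pi * 430e3                          # 13C Larmor (illustrative)
A_par = 2 * np.pi * np.array([60e3, -25e3, 10e3])    # parallel hyperfine (made up)
A_perp = 2 * np.pi * np.array([30e3, 15e3, 5e3])     # perpendicular hyperfine (made up)

n = 1 + len(A_par)          # electron (site 0) plus nuclear spins
Sz = embed(sz, 0, n)
H = np.zeros((2**n, 2**n))
for k in range(len(A_par)):
    Iz = embed(sz, k + 1, n)
    Ix = embed(sx, k + 1, n)
    H = H + omega_L * Iz + A_par[k] * Sz @ Iz + A_perp[k] * Sz @ Ix

# Free evolution of an electron superposition: the modulation of this signal
# encodes (A_par, A_perp), the parameters a learning routine would fit to data.
psi0 = np.array([1.0, 1.0]) / np.sqrt(2)             # electron in (|0> + |1>)/sqrt(2)
for _ in range(len(A_par)):
    psi0 = np.kron(psi0, np.array([1.0, 0.0]))       # nuclear spins start in |0>

Sx_e = embed(sx, 0, n)
times = np.linspace(0, 50e-6, 200)                   # seconds
signal = []
for t in times:
    psi_t = expm(-1j * H * t) @ psi0
    signal.append(np.real(np.vdot(psi_t, Sx_e @ psi_t)))

print(f"signal range: {min(signal):.3f} to {max(signal):.3f}")
```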

The researchers validated their technique by designing a universal quantum gate set that includes three single-qubit gates for each of the 11 qubits and the entangling two-qubit gate between the electron spin and each of the 10 nuclear spins. “In principle, we can realize any unitary operation on this 11-qubit register by combining the above gate set,” Hou tells Physics World.

“This learning technique could be a useful tool to characterize many-body Hamiltonians with a constantly-on interaction,” he adds. “The 11-qubit quantum register that we made might also be used as a proof-of-principle test of some quantum algorithms.”

The work, which is reported in Chinese Physics Letters, comes hot on the heels of new research from Delft, described in this Physics World article.

Handheld gas sensor distinguishes methanol from ethanol

Drinking as little as 6 ml of methanol can be fatal. Since the start of the year, there have been 1636 reported cases of methanol poisoning across a number of countries. The problem arises with adulterated, counterfeit or informally produced spirit drinks which, according to the World Health Organisation, can lead to inadvertent methanol consumption. Outbreaks of methanol poisoning are particularly common in developing countries, partly because current methods to detect methanol are resource-intensive, making them inaccessible. Jan van den Broek and colleagues at ETH Zurich and the University Hospital Zurich in Switzerland have now developed a handheld gas analyser that can successfully discriminate between methanol, ethanol and acetone. They also showed that the device can distinguish low concentrations of methanol (1 ppm) from high concentrations of ethanol (up to 62,000 ppm) within two minutes, which is essential for the analysis of alcoholic beverages.

Developing the sensor for methanol

The device comprises a column of polymer resin that separates methanol from interfering compounds such as ethanol before they reach the detector. The column works like a gas chromatography system, separating the components based on differences in their volatility. Downstream of the column, the researchers incorporate a highly sensitive but non-specific microsensor composed of palladium-doped tin oxide nanoparticles on interdigitated sensing electrodes – a measurable change in the conductivity of the semiconductor indicates the presence of methanol, ethanol or acetone as each compound elutes from the column.

The researchers then investigated the efficacy of the device at differentiating untainted liquor from contaminated beverages. They laced Arrack (a Southeast Asian liquor) with varying concentrations of methanol (0.3–1% vol/vol). There was a peak in the conductance at 1.7 minutes for the more volatile methanol, which elutes from the column first, with a peak for ethanol following at approximately 8.3 minutes. The device could not only clearly differentiate between the tainted Arrack and the unadulterated liquor at all concentrations tested, but was also capable of quantifying the amount of methanol present. This provides a first step towards implementing low-cost methanol analysis of alcoholic beverages, particularly in developing countries.
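
Conceptually, identifying each compound then reduces to finding conductance peaks and reading off their retention times. The sketch below is a hedged illustration of that step on synthetic data, using the roughly 1.7-minute and 8.3-minute retention times reported; the retention windows and peak parameters are made up and this is not the authors’ signal-processing code.

```python
# Illustrative sketch only: classify conductance peaks by retention time.
# Retention windows and synthetic peak shapes are made up for illustration.
import numpy as np
from scipy.signal import find_peaks

t = np.linspace(0, 12, 1200)  # minutes
# Synthetic trace: two Gaussian conductance peaks plus a little noise
trace = (0.8 * np.exp(-((t - 1.7) / 0.15) ** 2)
         + 2.5 * np.exp(-((t - 8.3) / 0.4) ** 2)
         + 0.01 * np.random.default_rng(0).normal(size=t.size))

peaks, _ = find_peaks(trace, height=0.1, prominence=0.1)
for idx in peaks:
    rt = t[idx]
    if 1.2 <= rt <= 2.2:          # assumed retention window for methanol
        species = "methanol"
    elif 7.0 <= rt <= 9.5:        # assumed retention window for ethanol
        species = "ethanol"
    else:
        species = "unknown"
    # Peak height would be mapped to concentration via a calibration curve
    print(f"peak at {rt:.2f} min, height {trace[idx]:.2f} -> {species}")
```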

Additionally, the researchers investigated the potential to detect methanol in breath samples. An ethanol-intoxicated volunteer with a blood alcohol content of 0.54‰ provided a breath sample, which was subsequently spiked with a methanol concentration slightly above the level associated with serious methanol intoxication (135 ppm). When the sample was analysed, the device could clearly differentiate between methanol and ethanol. The researchers confirmed the device’s results by analysing the same breath samples with benchtop proton-transfer-reaction time-of-flight mass spectrometry (PTR-TOF-MS).

The future of this device

The authors hope this “proof-of-concept” device will lead to the development of a fast and non-invasive method for detecting methanol poisoning. Additionally, it could pave the way towards an inexpensive means of monitoring alcoholic beverage production to prevent instances of accidental poisonings in the future. Full details of this research are reported in Nature Communications.

 

Forests reduce cyclone intensity

Planting trees could protect against storm damage. A new study suggests that major reforestation across Europe has the potential to reduce the number of extra-tropical cyclones by up to 80%.

Previous research showed that extra-tropical cyclones lose intensity when the land surface becomes rougher. This set Danijel Belušić from the Swedish Meteorological and Hydrological Institute and colleagues to wondering how much difference it would make to extra-tropical cyclone activity across Europe if the continent were covered in trees.

Using a regional climate model and a cyclone-tracking algorithm, the researchers investigated long-term changes in the number and intensity of cyclones, and their effect on precipitation, under conditions of deforestation and complete afforestation.

The results suggest that if Europe was completely covered in trees, the number of cyclones would drop by between 10% and 80%, compared with a continent devoid of trees. West European coastal areas experienced the smallest reduction (10%), most likely because cyclones tend to arrive from the west and have less opportunity to interact with trees before hitting these areas. But by the time weather systems reach eastern Europe, Belušić and his colleagues estimate that afforestation could reduce the number of cyclones by as much as 80%.

“The increased roughness due to forests spins down a cyclone and reduces its intensity,” says Belušić.

With weaker and fewer cyclones, Europe would inevitably have less heavy rain. The model suggests that complete afforestation would reduce extreme winter rainfall by as much as 25%. However, Belušić and his colleagues suggest that this loss could be at least partially compensated by trees’ ability to access deep soil moisture.

“For example, due to the increased evapotranspiration, the atmosphere receives more moisture from the surface which can be converted into cloud and precipitation,” says Belušić, whose findings are published in Environmental Research Letters (ERL).

Indeed, the team’s model shows that total precipitation is similar for both scenarios, but that the afforested Europe experiences an increase in weak to moderate precipitation and a decrease in very heavy rain compared to a deforested Europe.

It’s likely that the total area of trees is important. Small pockets of forest are unlikely to increase surface roughness sufficiently to impact an extra-tropical cyclone, which is typically several hundred kilometres wide. Right now, Belušić and colleagues don’t know how large a forest must be in order to impact a cyclone, but their model does confirm that very large areas of roughness do have an impact. For the British Isles, which are similar in area to a typical cyclone, the researchers show that a covering of trees would reduce the number of cyclones over both the British Isles and the North Sea.

In theory, surface roughness could be increased by covering Europe with wind turbines or tall buildings but these structures wouldn’t bring the evapotranspiration benefits that accompany trees. Belušić speculates that artificial increases in roughness like these would be accompanied by significant decreases in rainfall.

The researchers also note that mass tree-planting to reduce cyclones should work in any mid-latitude location, not just Europe.

Currently, covering Europe with trees is hypothetical; the scientists say that more research is needed to understand the science better and to investigate the feasibility of such a major project. But as climate continues to change, solutions like this could become serious contenders in helping us adapt to increasingly extreme weather.

How are winners of the Nobel Prize for Physics selected?

Winners of the Nobel Prize for Physics join a group of elite scientists, including Albert Einstein, Marie Curie and Enrico Fermi. But who are the mysterious kingmakers who select the laureates? And how exactly does that decision process work? This video introduces the process and the different players involved in reaching that dramatic annual announcement. Find out more in this recent Physics World article, ‘Inside the Nobels: Lars Brink reveals how the world’s top physics prize is awarded’.

James Peebles, Michel Mayor and Didier Queloz share Nobel Prize for Physics

James Peebles, Michel Mayor and Didier Queloz have won the 2019 Nobel Prize for Physics “for contributions to our understanding of the evolution of the universe and Earth’s place in the cosmos”.

Peebles bags half the prize for “theoretical discoveries in physical cosmology”. Mayor and Queloz share the other half for “the discovery of an exoplanet orbiting a solar-type star”.

The prize is worth SEK 9m (about £740,000). Peebles will get one half of the prize money and Mayor and Queloz will share the other half. The new laureates will gather in Stockholm on 10 December, where they will receive their medals at a ceremony.

“They have painted a picture of a universe that is far stranger and more wonderful than we could ever have imagined,” said Nobel Committee member Ulf Danielsson of Uppsala University when the winners were announced. “Our view of the universe will never be the same again.”

In the 1960s, Peebles developed the Big Bang model of cosmology as a theoretical framework for describing the universe. Using the model, he predicted properties of the cosmic microwave background radiation (CMB), which was created 400,000 years after the Big Bang. His work has meant that measurements of the CMB have provided profound insights into how much matter was created in the Big Bang and how this matter then clumped to form galaxies and galaxy clusters.

In the early 1980s Peebles was a driving force behind the idea that the large-scale structure of the universe can be explained by the existence of an invisible substance called cold dark matter. In the 1980s he also helped revive the idea of Albert Einstein’s cosmological constant to account for theoretical indications that about 69% of the mass–energy in the universe was missing. In 1998 the first measurement of the accelerating expansion of the universe confirmed the existence of this missing mass–energy in the form of dark energy. As a result Peebles played a crucial role in the development of the Standard Model of cosmology, which assumes the existence of cold dark matter and dark energy without describing the precise nature of either substance.

“We have clear evidence that the universe expanded from a hot, dense state,” Peebles said on the telephone during the press conference that followed the prize announcement. “But we must admit that dark energy and dark matter are mysterious. There are still many open questions – what is dark matter and Einstein’s cosmological constant?”

Wobbling star

In October 1995, Mayor and Queloz made the first discovery of a planet orbiting a star other than the Sun (an exoplanet). Using custom-made instruments at the Haute-Provence Observatory in France, the pair detected the tiny wobbling motion of a star 50 light-years away. This wobble is caused by the close orbiting of a gas-giant planet called 51 Pegasi b, which is about half the mass of Jupiter.
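
The size of that wobble is given by the standard radial-velocity semi-amplitude relation (a textbook formula, not specific to Mayor and Queloz’s analysis):

```latex
% Radial-velocity semi-amplitude K of a star of mass M_* orbited by a planet
% of mass m_p with period P, inclination i and eccentricity e
K = \left(\frac{2\pi G}{P}\right)^{1/3} \frac{m_p \sin i}{\left(M_* + m_p\right)^{2/3}} \frac{1}{\sqrt{1 - e^2}}
```

For a roughly half-Jupiter-mass planet on a circular orbit of a few days around a Sun-like star, this works out at a stellar wobble of a few tens of metres per second – large by exoplanet standards, but still a demanding target for a spectrograph in 1995.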

As well as being the first-ever sighting of an exoplanet, the discovery signalled a major shift in our understanding of planet formation and evolution because the distance between 51 Pegasi b and its Sun-like star is just 5% of the distance between Earth and the Sun. This is unlike the much larger orbits of gas giants in the Solar System, yet many more large exoplanets with similar orbits have since been found. This has prompted planetary scientists to develop new theories that describe gas giants as “wandering Jupiters”, forming in relatively large orbits and then migrating inwards.

“It’s a great day for exoplanets,” planetary scientist Sara Seager from the Massachusetts Institute of Technology told Physics World. “To see a field go from obscure, fringe and laughable to Nobel-prize-worthy is a huge tribute to all the people around the world that make exoplanets real.”

Charming discovery

Commenting on the work of Mayor and Queloz, Peebles said: “It is so charming to me to think that we once had a good theory of how planets formed until we found the first [exoplanet] – it’s a good illustration of how science works”.

Since the 1995 discovery, new techniques and telescopes have been developed to make increasingly sophisticated observations of exoplanets. This includes spectroscopic measurements of the chemical compositions of the atmospheres of some exoplanets, which could reveal whether they are suitable for life.

As of 1 October 2019, 4118 exoplanets have been found in the Milky Way. These reside in 3063 exoplanetary systems, 669 of which are known to contain more than one exoplanet – illustrating how rich the study of exoplanets has become since the 1995 discovery.

Peebles was born in 1935 in Winnipeg, Manitoba, Canada. He completed a degree in physics at the University of Manitoba in 1958 before doing a PhD at Princeton University, which he completed in 1962. He has remained at Princeton ever since and is still active in research.

Mayor was born in Lausanne, Switzerland in 1942. He did an MSc in physics at the University of Lausanne in 1966 before completing a PhD in astronomy in 1971 at the Geneva Observatory, which belongs to the University of Geneva. Mayor remained at the university – becoming director of the Observatory of Geneva from 1998 to 2004 – until retiring in 2007.

Queloz was born in Switzerland in 1966. In 1995, he completed a PhD in astronomy at the University of Geneva under the supervision of Mayor. After a spell at NASA’s Jet Propulsion Laboratory from 1997 to 1999, he returned to the University of Geneva, where he has remained since. In 2013 he took up a joint position at the University of Cambridge.

Learn more about cosmology and exoplanets:

Some of our reviews of books on these topics:

  • “Commonly uncommon” – Hamish Johnston reviews One of Ten Billion Earths: How we Learn About our Planet’s Past and Future from Distant Exoplanets by Karel Schrijver
  • “How to build a planet” – Louisa Preston reviews The Planet Factory: Exoplanets and the Search for a Second Earth by Elizabeth Tasker
  • “The dark-energy game” – Robert P Crease reviews The 4% Universe: Dark Matter, Dark Energy, and the Race to Discover the Rest of Reality by Richard Panek

We have also created a collection of key papers by Peebles, Mayor and Queloz, which are free to read until 31 October 2019.

Physics World’s Nobel prize coverage is supported by Oxford Instruments Nanoscience, a leading supplier of research tools for the development of quantum technologies, advanced materials and nanoscale devices. Visit nanoscience.oxinst.com to find out more.

My favourite Nobel prize: weighing up neutrino mass

I’ve always been fascinated by neutrinos, those tricksy fundamental particles that abound in the universe (you’ve probably heard it before, but some 65 billion neutrinos pass through a space as small as your fingernail every second) but are rather hard to detect, as they are electrically neutral and only interact with matter via gravity and the weak force. First predicted 89 years ago by Wolfgang Pauli as a “desperate remedy” for discrepancies arising in the study of beta decays, these so-called ghostly particles were thought to be impossible to detect. In fact, Pauli himself famously bet a case of champagne that it could never be done, supposedly saying “I have done a terrible thing, I have postulated a particle that cannot be detected.”

Pauli was happily proven wrong, though, when Frederick Reines and Clyde Cowan detected antineutrinos emitted by a nuclear reactor in 1956 – a feat that earned Reines the 1995 Nobel Prize for Physics. Indeed, neutrino research is a rather Nobel-friendly topic, with the 1988 prize awarded jointly to Leon M Lederman, Melvin Schwartz and Jack Steinberger “for the neutrino beam method and the demonstration of the doublet structure of the leptons through the discovery of the muon neutrino”, as well as the 2002 prize, one half of which was awarded to Raymond Davis Jr and Masatoshi Koshiba “for pioneering contributions to astrophysics, in particular for the detection of cosmic neutrinos”.

But the neutrino Nobel that fascinates me the most is the 2015 prize, given to Arthur B McDonald and Takaaki Kajita “for the discovery of neutrino oscillations, which shows that neutrinos have mass”. You see, early theories suggested that neutrinos were massless particles. But by 1957, Italian physicist Bruno Pontecorvo was already considering the possibility that multiple types, or “flavours”, of neutrinos – electron, muon and tau – exist, and that they can change, or “oscillate”, from one to another. The existence of multiple types was confirmed in 1962, when Lederman, Schwartz and Steinberger showed at Brookhaven National Laboratory in the US that the muon neutrino is distinct from the electron neutrino, while the tau neutrino was ultimately discovered in 2000.

But a rather chilling conundrum soon raised its head, in the form of the “solar neutrino problem”, when an experiment at the Homestake Gold Mine in South Dakota, carried out by physicists Raymond Davis and John Bahcall, detected only about 30% of the amount of solar neutrinos that it should have, as predicted from the Sun’s luminosity. The only way to explain this discrepancy (short of worrying that the Sun itself was dying, which was considered for a while) was that the solar neutrinos were oscillating between flavours as they travelled from the Sun to the Earth. The Homestake experiment (which could only detect electron neutrinos) was therefore picking up only a third of the actual amount. A direct consequence of this solution was that neutrinos must possess mass for these oscillations to take place – but this was contrary to the Standard Model of particle physics, which predicted them to be massless.
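
The logic that links oscillation to mass is visible in the standard two-flavour oscillation probability (a textbook formula, not tied to any particular experiment):

```latex
% Two-flavour neutrino oscillation probability (natural units), for neutrinos of
% energy E after a distance L, with mixing angle \theta and squared-mass splitting \Delta m^2
P(\nu_e \to \nu_\mu) = \sin^2(2\theta)\,\sin^2\!\left(\frac{\Delta m^2 L}{4E}\right)
```

The oscillation depends on the difference of the squared neutrino masses, Δm², so a non-zero oscillation probability means the neutrinos cannot all be massless.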

This is one of the reasons why I find neutrinos so interesting – while not being able to explain some pretty major phenomena such as gravity, dark matter, dark energy and the matter–antimatter asymmetry, the Standard Model is one of the most successful and pervasive theories of particle physics to date, one that has consistently been proven to be accurate. To find out, then, that neutrino physics is beyond the ken of the Standard Model must have been a rather exciting prospect for physicists. But the proof of these neutrino oscillations, and therefore their mass, was a long time coming, and was what won Kajita and McDonald their Nobel in 2015. Kajita and his colleagues were part of the Japanese Super-Kamiokande experiment, which in 1998 showed that the ratio of electron to muon neutrinos coming from opposite sides of the Earth was different, showing that neutrinos changed flavour as they travelled long distances and passed through the Earth. For the first time, physicists were able to experimentally show that neutrinos must have mass, albeit only about 0.1 eV. A year later, the Sudbury Neutrino Observatory (SNO), led by McDonald, began collecting data, and was able to determine how many of the electron neutrinos produced in the Sun change into muon neutrinos or tau neutrinos as they travel to the Earth. SNO data confirmed that about two-thirds of the solar electron neutrinos change flavour by the time they reach the Earth.

While neutrino science has advanced significantly in the years since both those experiments, there are still many questions, including the fact that we still don’t know the exact masses of the three neutrino flavours, or how they acquire this mass. The day after the 2015 prize, I wrote a blog about why improving our understanding of neutrino mass is so important when it comes to some of the big unanswered questions in physics today. Enticingly, the answer to one or even more of those Standard Model problems that I previously mentioned may lie with an as-yet-undiscovered, but long-coveted, fourth type of neutrino dubbed the sterile neutrino … but you’ll simply have to read that blog to find out more. I do hope, though, that I have made a good case for why neutrino Nobel prizes have fascinated me so. In any case, I will leave you with the captivating words of physicist Frederick Reines, who once described neutrinos as “the most tiny quantity of reality ever imagined by a human being”.

Physics World’s Nobel prize coverage is supported by Oxford Instruments Nanoscience, a leading supplier of research tools for the development of quantum technologies, advanced materials and nanoscale devices. Visit nanoscience.oxinst.com to find out more.

Material lattice morphs into doubly curved shapes

Researchers have succeeded in 4D-printing material lattices that can shape-morph into complex and doubly curved structures in response to changes in temperature. The lattices are printed using inks composed of elastomeric matrices with tunable cross-link density and an anisotropic filler, which means that their elastic modulus and thermal expansion coefficient can be precisely controlled. The technique could be extended to many other temperature-responsive materials and different material designs to produce scalable, reversible, shape-shifting structures with hitherto unseen complexity for use in applications such as stents or scaffolds for artificial tissue, deformable lenses in telescopes and soft robotics.

Materials that reversibly change their shape in response to an external stimulus, such as temperature, could lead to a host of new applications in areas such as additive manufacturing, robotics and biomaterials. Most of the shape-morphing structures made so far, however, have been limited in their ability to transform into complex and doubly curved shapes. This is because such transformations require the surface of a material to curve in two perpendicular directions at the same time by different amounts at different places.

Making a shape-shifting sheet doubly-curved

This double curvature effect (which forms the basis of a famous nearly two-centuries-old theorem from the German mathematician Carl Friedrich Gauss) is familiar to anyone who has tried to gift-wrap a soccer ball, explain the researchers, led by Jennifer Lewis, William Boley and L Mahadevan of Harvard University. To transform paper, which is a flat (curvature-free) object, to the shape of a ball, which has positive double curvature, the paper needs to be creased and crumpled at the sides and bottom of the ball. Only if the paper sheet were able to naturally stretch or contract, or indeed both, could it then adapt to the shape of the ball.
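
Gauss’s result, the Theorema Egregium, is what forbids a stretch-free transformation here: the Gaussian curvature, the product of the two principal curvatures, is unchanged by any deformation that bends a surface without stretching it, so a flat sheet (K = 0) cannot become a ball-like cap (K > 0) unless it strains in its own plane (a standard statement of the theorem, not taken from the PNAS paper):

```latex
% Gaussian curvature as the product of the two principal curvatures; by the
% Theorema Egregium it is invariant under bending without stretching
K = \kappa_1 \kappa_2
```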

To make a shape-shifting sheet doubly curved, Boley, van Rees and colleagues first calculated the form of the printed planar lattice that would deform into a given shape when heated. The lattice contained curved bilayer ribs with individually programmable shapes. The ink they then used to print it comprises elastomeric matrices made of PDMS (which naturally expands when heated) with tuneable cross-link density, plus an anisotropic filler of short glass fibres that have a far lower thermal expansion coefficient than the silicone matrix.

The ribs bend in response to changes in temperature, with the lattice nodes expanding and contracting much more than would a continuous sheet. The voids in the lattice also easily accommodate large changes in surface area when the ribs are designed to grow at different rates across the sheet, say the researchers.

Multiplexed bilayer ribs

To independently control extrinsic curvature, the researchers created multiplexed bilayer ribs composed of four different materials that allowed them to encode a wide range of 3D shape changes in response to temperature. As an example, they designed and printed planar lattices embedded with a conductive liquid metal that morphs into a dome shape to form an antenna whose resonance frequency changes as it deforms.

They also printed a flat mesh that deforms into the shape of a human face (that of Gauss himself in this work) when subjected to a certain temperature. They did this by designing each individual rib of the lattice to bend by a predetermined amount to preferentially form the shape of a nose or that of an eye socket. They say they can vary the arrangement of the four ribs to pre-programme whether the rib as a whole curves up to morph into the shape of a nose, for example, or slopes down to form part of an eye socket.

The team, which includes researchers from Boston University and the Massachusetts Institute of Technology (MIT), says that it is now looking to apply its technique to stiffer materials for applications such as self-propelling fins and wings.

Full details of the present research are reported in PNAS.

Copyright © 2025 by IOP Publishing Ltd and individual contributors