
There’s no place like home


It’s usual to think of physicists as globe-trotting rebels who uproot their lives to go wherever the opportunities lie. Moving from city to city and country to country, they coolly switch locations, following their physics dreams – be it CERN in Geneva, oil rigs in the North Sea or start-ups in Silicon Valley.

The reality, though, is rather different. Many young people – physics graduates included – prefer to stay where they grew up or studied. They have an emotional attachment to the town, city or region they’re already living in, and are reluctant or unable to move. While it’s not exactly clear why, we attribute this to a sense of geographic belonging, which is related to place, family, social and community relations. In some cases, money is probably a key factor too.

The importance that students give to this “emotional geography” also affects the kinds of jobs they can get after graduation. In the UK particularly, the graduate labour market is geographically unbalanced, with most graduate jobs concentrated in London and south-east England and in the larger cities of Birmingham, Manchester and Leeds. So if you’re a physics graduate from a part of the country with few physics-based employers, it’ll be harder to find a job using your physics degree if you can’t – or don’t want to – move.

According to the 2015/16 Destinations of Leavers from Higher Education (DLHE) report, some 69% of UK graduates took their first job in the region of their home domicile. Further analysis using data from the survey shows that 45% of graduates did not move regions at all – they both studied and sought work in their home domiciled region. This inability or unwillingness of graduates to move means that university physics departments need to do more to recognize the importance of “place-based” decision-making when it comes to graduate career choices. Departments also need to adapt their physics degrees so that students can more easily find graduate jobs in the local area – and have the skills that local employers need.

Local skills for local people

The impact of emotional geography on physics graduate outcomes was a key theme of a meeting in London hosted by the Institute of Physics (IOP) in July. The meeting was organized by the South East Physics Network (SEPnet), which links nine physics departments in south-east England, and by the White Rose Industrial Physics Academy (WRIPA) – a collaboration between the universities of York, Sheffield, Hull, Nottingham and Leeds. Emotional geography, it turns out, is a particular headache for physics departments in regions with a low Gross Value Added (GVA) – one way of measuring economic output that can be used as an indicator of regional productivity.

This problem is borne out in an analysis carried out by Alastair Buckley, a physicist at the University of Sheffield, who examined the mobility of students who studied physics at the university between 2011 and 2017. He found that a high proportion (about 50%) of Sheffield physics students had a domiciled (permanent) address within 100 km of the university when they applied for the course. What’s more, after graduation 65% of Sheffield physics students chose to return to their domiciled address (most likely the parental home) to work. Buckley and colleagues consider these physicists to be “work immobile”, meaning that remaining in a desired location may be more important to them than the type of job they do.

Buckley’s analysis shows, however, that physics graduates at Sheffield who are “work mobile” – i.e. prepared to move after graduation – are more likely to get better, graduate-level work. That’s because other parts of the country with higher productivity and growth have a greater number of graduate-level jobs that make the most of a physicist’s specific skills and knowledge. Indeed, when the five WRIPA departments analysed the DLHE data, they found that physics graduates who stay in the Yorkshire, Humberside and East Midlands region (where those departments are based) have significantly lower prospects than those choosing to leave. Only 55% of them attain graduate-level jobs, compared with 80% of those who move.

82% of all students live in their original home region a year after graduating – and 65% are still based there a decade later

This insight corroborates a recent analysis by the UK Department for Education, based on the government’s controversial Longitudinal Education Outcomes (LEO) project, which provides information on how much UK graduates of different courses at different universities earn after graduating, by linking up data on tax, benefits and student loans. The LEO data suggest that 82% of all students live in their original home region a year after graduating – and that 65% are still based there a decade later.

Next steps

So what can be done to improve the situation – either by helping students become more mobile, or by improving the prospects of those who cannot or don’t want to move? The WRIPA has recently been awarded funding from the Office for Students’ Challenge Competition to boost links between physics departments and regional employers, to develop inclusive modes of work-based learning and to support physics students to be more work mobile. However, several strategies are already being tried at universities around the UK (see box).

Case study: Amy Hearst

One physicist who has benefited from her university’s links with local firms is Amy Hearst. She graduated from the University of Southampton, UK, and is now a production engineer at vivaMOS, which designs and builds CMOS image sensors for flat-panel X-ray detectors at its premises in Southampton Science Park.

However, Hearst might not have realized the opportunities on offer in Southampton were it not for a summer industrial placement she did at the defence and aerospace firm Leonardo, which has a centre of excellence for infrared detectors in the city. During her placement, she made ultrasensitive thermal-imaging cameras used on the TV shows Springwatch and Planet Earth.

Before the placement, Hearst had seen Leonardo as uninteresting. “To me it was just a grey building, it was defence,” she told delegates at a meeting in July about the impact of emotional geography on physics graduates in the UK.

After working at Leonardo as a summer student, Hearst was taken on full-time after graduating. She and her fellow graduate recruits formed a close social and professional network, which led to her wanting to stay in the Southampton area, and she found her current job at vivaMOS thanks to this network.

Hearst’s experience underlines the value of university physics departments developing links with local employers and also of helping physics students to sharpen their professional skills while studying.

One example is Lancaster University’s regional “employer engagement strategy”, which physicist Manus Hayne and colleagues developed to support students who want to stay in the area after graduating. Their strategy includes curriculum-based projects for final-year physics students, who work with local industrial firms on real-world problems. These projects are an effective way of connecting students with employers that are less likely to participate in traditional recruitment events. Together with guest lectures, industrial placements and opportunities for internships at local firms, such projects help to create what you might call a departmental “employability ecosystem”.

How can local hi-tech firms shape the nature of physics degrees offered by local universities to ensure that graduates have the appropriate skills?

But how can local hi-tech firms shape the nature of physics degrees offered by local universities, to ensure that graduates have the appropriate skills? One solution adopted by the University of Portsmouth’s School of Mathematics and Physics has been to set up an industry advisory board, which currently has about 20 representatives from local businesses. Complaints from one major local employer that it could not find graduates with the right skills in radar-systems engineering led to the university reintroducing this subject into its undergraduate physics degree.

Portsmouth also delivers a course on the applications and “impact” of physics in partnership with industry and healthcare professionals, developed in response to local skills needs. Indeed, the IOP’s revised degree-accreditation scheme gives physics departments the flexibility to respond to regional economic needs and focus on developing students’ skills. Physics departments are encouraged by the IOP to deliver degree programmes that maintain academic rigour while also embedding key skills such as innovation, leadership, creativity, entrepreneurship and self-management.

Working together

Another attempt to make physicists more employable to local firms has been developed by Trevor Farren, a director of business and knowledge exchange at the University of Nottingham. He delivers an optional interdisciplinary module entitled “Enterprise for scientists”, in which final-year physics students can elect to work with students from other disciplines to develop their innovation, business and entrepreneurial skills. The course involves groups of students pitching ideas to a “Dragons’ Den” panel of local businesses, thereby helping them to develop an understanding of finance, negotiation, sales and marketing as well as enhance their problem-solving and creative-thinking skills.

With jobs evolving all the time, programmes such as these help students to develop the transferable skills needed to adapt to the changing employment environment. They also help students build contacts with potential local employers, in the process raising the value and success of physics degrees offered by local universities. And with graduate recruiters saying that their ideal candidates should not only have a strong academic knowledge of a particular field, but also an ability to work outside their core area, it’s more important than ever to ensure that physicists develop the skills that will help them succeed in the marketplace.

Exoplanet researchers welcome ‘cataclysmic’ Nobel-prize announcement

Physicists and astronomers have welcomed yesterday’s decision to award this year’s Nobel Prize for Physics for the discovery of the first exoplanet as well as for research on the evolution of the universe. Despite some surprise at the mix of the two subjects in one prize, the award has been particularly welcomed by exoplanet researchers, who had doubted that the Nobel Committee for Physics would ever deem their field “fundamental enough” to be recognized.

James Peebles, Michel Mayor and Didier Queloz bagged the 2019 Nobel Prize for Physics “for contributions to our understanding of the evolution of the universe and Earth’s place in the cosmos”. Peebles won half the prize for “theoretical discoveries in physical cosmology” while Mayor and Queloz share the other half for “the discovery of an exoplanet orbiting a solar-type star”.

Over the past five decades, Peebles has made pioneering contributions to cosmology, including work on dark matter and dark energy. He made vital headway in predicting the properties of the cosmic microwave background even before it was discovered, as well as improving our understanding of how elements were formed soon after the Big Bang. He also came up with “cold dark matter” as a description of a universe filled partly with invisible matter and, according to astrophysicist Jo Dunkley from Princeton University, “also put back in the vacuum energy ‘Lambda’ to our list of cosmological ingredients — the dark energy that is still a mystery today”.

To see a field go from obscure, fringe, and laughable to Nobel-worthy is a huge tribute to people all around the world making exoplanets real

Sara Seager

“It was so obvious that Peebles should win the prize,” Dunkley told Physics World. “He really founded our entire field of modern cosmology — wherever you look in cosmology, you find Peebles’ vital contributions.”  That view is shared by other astrophysicists, including Ofer Lahav from University College London, who says that the award to Peebles is “long overdue” and “well deserved”. “Through his articles and books he has inspired generations of cosmologists as well as pioneered the creation of large cosmological surveys, which are now at the core of cosmology research,” Lahav says.

Positive impact

Mayor and Queloz, meanwhile, have been awarded the prize for making the first discovery of a planet orbiting a star other than the Sun. In 1995, using custom-made instruments at the Haute-Provence Observatory in France, the pair detected the tiny wobbling motion of a star 50 light-years away. This wobble is caused by the close orbiting of a gas-giant planet called 51 Pegasi b, which is about half the mass of Jupiter.

Alan Boss, from the Carnegie Institution for Science in the US, told Physics World that the award is “a splendid surprise” and that Mayor and Queloz “fully deserve” to be recognized given they were the first to find proof of a planet in orbit around another sun-like star. “The surprise is that it took the Nobel committee nearly 25 years to recognize this huge discovery,” says Boss. He adds that their finding helped to foster support for NASA’s Kepler Space Telescope, which has since spotted thousands of exoplanets and determined that potentially habitable, Earth-like worlds are commonplace in our galaxy. “[It] is certainly the most astounding discovery in the last decade about our physical universe,” says Boss.

Sara Seager, an astronomer at the Massachusetts Institute of Technology, says the award for exoplanets is a “cataclysmic shift” for the subject. “To see a field go from obscure, fringe, and laughable to Nobel-worthy is a huge tribute to people all around the world making exoplanets real,” she told Physics World, adding that she hopes that there will be a “huge positive impact” in terms of funding for dedicated exoplanet missions in future.

Like all huge achievements, the discovery of 51 Pegasi b by Mayor and Queloz had important forerunners all of whom could claim a first for exoplanets

Elizabeth Tasker

Exoplanet researcher Jason Wright from Penn State University, meanwhile, says that he is “very pleased” to see the award go to Mayor and Queloz and that it is “well deserved”. He also admits never imagining the Nobel committee would dedicate a prize to the first exoplanets. “I did not think the committee would think of it as sufficiently fundamental to physics,” he says.

Early pioneers

As with all Nobel prizes, the requirement to limit the award to no more than three people — a rule set out in the statutes of the Nobel Foundation — will have made the 2019 prize as tricky as ever. Other pioneers whom the committee will surely have considered include astronomer Geoff Marcy and his colleague Paul Butler from the Carnegie Institution for Science, both early leaders in exoplanetary science. Butler, who is currently carrying out observations on the Magellan telescopes in Chile, told Physics World that the committee “had recognized the accomplishments of three great researchers,” adding that “their work has helped illuminate our understanding of the universe, and to redefine our relationship with the universe.” Another thorny matter is that in 2015 Marcy was found to have violated the University of California, Berkeley’s sexual-harassment policy.

Exoplanet researcher Elizabeth Tasker at the Japan Aerospace Exploration Agency told Physics World that there were many people before Mayor and Queloz – and indeed Marcy and Butler – who made important contributions. “Like all huge achievements, the discovery of 51 Pegasi b by Mayor and Queloz had important forerunners, all of whom could claim a first for exoplanets,” she says.

In particular, in 1988 a Canadian team of astronomers led by Gordon Walker, Bruce Campbell and Stephenson Yang detected the “wobble” of Gamma Cephei A using a new absorption cell that allowed much more precise measurements of a star’s radial velocity. However, the trio didn’t have enough supporting evidence to claim it was a planet – something that was only confirmed in 2004. Then in 1992, Alex Wolszczan from Penn State University and Dale Frail at the National Radio Astronomy Observatory in Socorro, New Mexico, announced two rocky planets around a pulsar – yet at the time these results were not widely accepted. “While these previous discoveries were equally as important to our knowledge of planet formation — it was the discovery of 51 Pegasi b that marked the floodgates opening to the discovery of new worlds,” adds Tasker.

Despite the plaudits for the individual researchers in this year’s award, some say that the mix of exoplanets and cosmology was unexpected. “Sharing the award between cosmology and extrasolar planets is a bit surprising given these are such different disciplines,” says Lahav. “But, in the spirit of the award’s citation about understanding ‘the Earth’s place in the cosmos’, this is perhaps an encouragement to study the universe in a more holistic way.”

That view is backed up by Dunkley. “I think the mix of subjects awarded this year is interesting,” she says. “But there is a common thread — Peebles taught us how we came to be here in a galaxy full of stars and Mayor and Queloz discovered that planets beyond our own Solar System were created in this bigger arc of cosmic history. So together they take these huge strides towards understanding our place in the cosmos.”

  • We have created a collection of key papers by Peebles, Mayor and Queloz, which are free to read until 31 October 2019.


Physics World‘s Nobel prize coverage is supported by Oxford Instruments Nanoscience, a leading supplier of research tools for the development of quantum technologies, advanced materials and nanoscale devices. Visit nanoscience.oxinst.com to find out more

How green is green gas?

Green gas is being talked up of late as one new way forward for decarbonisation. So what exactly is green gas? It could, in fact, be a lot of different things, some of which are far from green. In general, that depends on the sources and on the counterfactuals of using or not using them. Biomass use can be near carbon-neutral if the replanting rate keeps up with consumption, but destroying natural carbon sinks (e.g. by aggressive use of forest products for fuel) can lead to a net rise in carbon dioxide, while greenhouse gas (GHG) emissions from using fossil gas may be reduced if carbon capture and storage (CCS) is included. In between these extremes there are all sorts of options, including synthetic gas made by power-to-gas (P2G) electrolysis using surplus renewable electricity.

A helpful new report for the European Commission (EC) from the International Council on Clean Transportation offers a three-part classification system (a rough sketch of the threshold logic follows the list):

  • High-GHG: Gases with a life-cycle greenhouse gas intensity of 30% or higher than business-as-usual natural gas. The EC says that this category of gas “likely needs to be phased out in the near future in order to meet Europe’s climate targets”.
  • Low-GHG: Gases that reduce lifecycle greenhouse gas emissions by a substantial degree. For example, the Renewable Energy Directive 2018/2001 requires renewable power-to-methane to reduce greenhouse gas emissions by 70% compared to fossil fuels.
  • GHG-neutral: Gases with zero net greenhouse gas emissions. This includes pathways with negative greenhouse gas emissions on a life-cycle basis.
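
To make the idea of threshold-based classification concrete, here is a minimal Python sketch. The three category names follow the report, but the baseline intensity, the cut-off value and the example pathways are placeholders invented for illustration (the 70% figure simply echoes the RED II example above); the report itself defines the actual boundaries.

```python
# Illustrative only: classify a gas pathway by its life-cycle GHG intensity
# relative to a business-as-usual fossil natural gas baseline. The three
# category names come from the report; the numerical cut-offs and baseline
# value below are placeholders (the 70% figure simply echoes the RED II
# example quoted above), so consult the report for the authoritative limits.

FOSSIL_GAS_INTENSITY = 94.0   # gCO2e/MJ, assumed baseline for illustration
REDUCTION_THRESHOLD = 0.70    # placeholder low-GHG cut-off (cf. RED II example)

def classify_gas(lifecycle_intensity_gco2e_per_mj: float) -> str:
    """Return an illustrative GHG category for a gas pathway."""
    if lifecycle_intensity_gco2e_per_mj <= 0.0:
        return "GHG-neutral"   # zero or net-negative life-cycle emissions
    reduction = 1.0 - lifecycle_intensity_gco2e_per_mj / FOSSIL_GAS_INTENSITY
    if reduction >= REDUCTION_THRESHOLD:
        return "low-GHG"       # substantial reduction versus the fossil baseline
    return "high-GHG"          # little or no improvement on fossil natural gas

if __name__ == "__main__":
    examples = {
        "fossil gas with partial CCS": 60.0,
        "renewable power-to-methane": 20.0,
        "biomethane with negative life-cycle emissions": -5.0,
    }
    for name, intensity in examples.items():
        print(f"{name}: {classify_gas(intensity)} ({intensity} gCO2e/MJ)")
```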

Gaseous complexity

It certainly can get complicated. For example, the report says “the use of additional or excess zero-GHG electricity from wind and solar to produce hydrogen for electrolysis, or green hydrogen, is a renewable, GHG-neutral pathway. However, that same process utilizing conventional fossil-powered electricity in the absence of CCS will be neither renewable nor GHG-neutral. As with fossil gases, the production of hydrogen or synthetic methane using electrolysis utilizing conventional power with CCS could place those fuels in the low-GHG category, but they would be non-renewable”.

And here’s a nice conundrum — is carbon dioxide really saved when it is used to make new fuels? The report says that it is important to draw a distinction between CCS and carbon capture and utilization (CCU). It notes that “in CCU, captured carbon is utilized for a given fuel production process or end-use. In this case, CO2 from a fossil fuel power plant or used gases from a steel mill may be utilized as an input for synthetic methane production.” But of course, when that syngas is burnt, you get CO2 again.

What’s more, the report worries that “in the long-term, it is possible that policy tools may be necessary to ensure that CCU does not perpetuate the use of fossil fuels. If all industrial CO2 emissions are phased out in the future or claimed by other circular economy processes, all CCU must utilize CO2 from direct air capture to continue delivering climate benefits”.

Yes, true, CCU may lead to fossil “lock in”. But so may CCS. Whatever is done with the carbon removed from the atmosphere, or from power station exhausts, there will still be more coming unless we switch to non-carbon energy sources or cut demand so that less energy is needed. Indeed, some say that carbon removal offsets and zero-carbon generation are fundamentally at odds. A focus on compensatory carbon removal deflects resources from zero-carbon renewable energy generation and also from energy-saving — arguably the only long-term solutions to climate change.

Then again, not all ostensibly renewable gas generation is necessarily green. The report says: “The cultivation of silage maize for biomethane production in the EU can be expected to displace food production on cropland and thereby cause indirect land-use change” and that this, “in conjunction with cultivation and processing emissions, pushes that gas source into the high-GHG category”. What’s more, “within a given combination of feedstock and process, factors such as the degree of methane leaks for a given facility may further influence the GHG categorization for that producer”. So some green gases may have issues.

New sources

How about grass – converted to biogas by anaerobic digestion (AD)? UK energy supplier Ecotricity is currently pushing it as a vegan energy option, since it does not involve animal products: a metric the EC evidently hasn’t as yet taken on board! Ecotricity says it will start work on grass-powered biogas plants soon. AD conversion of domestic food and farm waste to biogas is another key option, with no new land-use implications. There are already 1 million UK biogas users, and biogas and other green gases can play a role in fuelling local combined heat and power plants linked to district heating networks.

some say that carbon removal offsets and zero-carbon generation are fundamentally at odds

The UK and many other countries certainly need to get on with decarbonising heat, and green gas could be a key option, replacing fossil gas. But, as the report for the EC notes, green gas of various types can also be used to make other fuels and products, including synfuels for use in some vehicles. Will there be enough green gas for that, as well as for heating and for power generation? Although there are obvious land-use limits to growing biomass as an energy crop, there are other sources, including farm and other wastes, and the biogas resource overall does look quite large. The Global Biogas Association says that only 2% of available feedstocks are currently turned into biogas by anaerobic digestion and that, if fully developed, biogas could cut global greenhouse gas emissions by up to 13%.

The synthetic gas resource could be even larger, if renewable power generation expands as much as is expected, creating large surpluses at times, which can be converted to hydrogen and possibly methane at reasonably low cost. Some projects may be designed specifically to make hydrogen. For example, there is a UK proposal for a £12 billion 4 GW floating offshore wind farm for on-board hydrogen production, the gas then being piped to shore. That may be some way off — probably not until the early 2030s. A 2 MW prototype off Scotland is planned first, followed, if all goes well, by a 100 MW multi-unit test project in 2026 and leading on, if further backing can be found, to the full 4 GW version. It is a fascinating idea. The developer says, “if you had 30 of those in the North Sea you could totally replace the natural gas requirement for the whole country, and be totally self-sufficient with hydrogen”.

One way or another, even if bold plans like that do not win through, the UK could have plenty of easily stored and pipe-transportable green energy for multiple uses, as long as any leakage problems can be dealt with: see my earlier post. Many other countries also have large green gas/hydrogen potentials. For example, in addition to biogas inputs, in most EU decarbonisation scenarios, hydrogen and derived fuels add up to between 10% and 23% of 2050 final energy consumption. And a new IRENA report is quite optimistic about the global prospects for green P2G hydrogen: “Hydrogen from renewable power is technically viable today and is quickly approaching economic competitiveness”.

More than electricity

Not much of this, however, seems to be taken on board in the scenarios reviewed in the recent Global Energy Outlook produced by Washington-based Resources for the Future (RFF). As in an earlier World Energy Council (WEC) review, the outlook compares scenarios from oil companies, the IEA (International Energy Agency) and so on, most of which see fossil fuels still romping ahead, feeding growing energy demand, with renewables, although expanding, barely keeping up. In most of the scenarios RFF looks at, the emphasis is on electricity-producing renewables, and under the highest renewables scenarios examined, they only reach a 31% primary energy share by 2040, matching oil’s share. Emissions rise in all scenarios, even those using CCS.

However, neither the WEC nor the RFF review looks at the many, much more ambitious “100% renewables by 2050” global scenarios from NGOs like GENI and academics like Mark Jacobson and the team at Lappeenranta University of Technology (LUT). In all, there are now 42 peer-reviewed “100%”, or near-100%, renewables studies. Most of them cast the net wider than just electricity, with some including direct green heat supply on a large scale, via solar and biogas combined heat and power (CHP)-fed heat networks and heat stores – though nearly all of them still focus heavily on the power-supply side, leavened in some cases by P2G conversion to green syngases.

Electricity-generating renewables will obviously rule the roost, as costs continue to fall, and electricity has many good end-uses, but there are also other options that can play a part in meeting energy needs and in balancing variable green electricity power sources. Green gas is one.

Lithium-ion battery pioneers bag chemistry Nobel prize

The 2019 Nobel Prize for Chemistry has been awarded to John Goodenough, Stanley Whittingham and Akira Yoshino for the development of lithium-ion batteries.

The trio will share the SEK 9m (about £740,000) prize money equally and the award will be presented at a ceremony in Stockholm on 10 December.

“Electrochemistry is at the heart of this award, but other branches of chemistry have played a part,” said Nobel Committee member Olof Ramstrom from the University of Massachusetts when the winners were announced. “This prize is also connected to physics and even engineering – it’s a good example of how these disciplines have come together.”

Speaking over the telephone at the press conference, Yoshino said that it was “amazing” to hear he had won the Nobel prize. He added that it was “curiosity” rather than thinking about immediate applications that pushed the work, and said that lithium-ion batteries are being used to build a “sustainable society”.

The story of the lithium-ion battery begins in the oil crisis of the 1970s when Whittingham was trying to develop new energy systems. He discovered that a battery cathode made of titanium disulphide can absorb large numbers of lithium ions from a lithium anode. In 1980, Goodenough showed that an even better performing cathode can be made from cobalt oxide.

Metallic lithium is an excellent anode material because it readily gives up electrons. However, it is highly reactive and early batteries were prone to exploding. In 1985 Akira Yoshino solved this problem by creating a carbon-based anode that is able to absorb large numbers of lithium ions. This removed the need to use reactive metallic lithium, and the first commercial lithium-ion battery appeared in 1991. Since then, the devices have powered a revolution in handheld electronics and electric vehicles.

Oldest ever winner

Goodenough did a PhD in physics at the University of Chicago and at 97 is the oldest person ever to receive a Nobel prize. An American citizen, he has worked at the Massachusetts Institute of Technology, University of Oxford and the University of Texas at Austin.

Whittingham is a chemist who was born in the UK in 1941 and is based at Binghamton University in the US.

Also a chemist, Akira Yoshino was born in Osaka, Japan in 1948 and is at the Asahi Kasei Corporation in Japan and Meijo University in Nagoya.

The physical chemist and nanoscientist Rodney Ruoff of the Institute for Basic Science Center for Multidimensional Carbon Materials in Korea had been a colleague of Goodenough’s at the University of Texas. He told Physics World that Goodenough is “fun and fascinating”. “He’s a well-rounded individual, well read on many topics as well as, of course, topics of science surrounding the things he has done during his research career,” says Ruoff. “I’m really delighted that he has received this honour along with [Whittingham and Yoshino].”

Learn more about lithium-ion batteries:

We have also created a collection of key papers by Goodenough, Whittingham and Yoshino, which are free to read until 31 October 2019. You will find these papers below the list of papers written by the 2019 physics Nobel Laureates.


Physics World‘s Nobel prize coverage is supported by Oxford Instruments Nanoscience, a leading supplier of research tools for the development of quantum technologies, advanced materials and nanoscale devices. Visit nanoscience.oxinst.com to find out more

Quality control tests for PET/MRI vary widely in Europe

© AuntMinnieEurope.com

A survey of eight large European imaging facilities with extensive clinical experience of PET/MRI has found wide variations in daily quality control and assurance procedures that ensure their proper function and image quality. The international study was published on 24 September in Frontiers in Physics.

One reason for the differences in performance testing protocols, the researchers suggested, is the lack of dedicated quality control recommendations for the hybrid imaging modality beyond standard vendor guidelines.

“The reported variations of local PET and MRI quality control procedures and testing frequencies are assumed to partly reflect the variations seen in the existing guidelines for the single modalities and the nonexistence of specific recommendations for PET/MRI,” wrote the authors, led by Alejandra Valladares, a doctoral candidate from the Medical University of Vienna.

European PET/MR sites

“However, in part, the variability in the reporting seems to be caused by differences in the definition of the specific tests between the centres and a lack of in-depth knowledge about the implemented quality control measures included in the daily quality assurance and the tests performed by the vendor during the preventive maintenance,” they explained.

Hybrid hype

The term “hybrid” means more than simply the integration of two imaging modalities. It is also the acronym for Healthcare Yearns for Bright Researchers for Imaging Data (HYBRID), a training network project funded by the European Commission (MSCA ITN 764458) with goals that include the promotion of molecular and hybrid imaging modalities to advance personalized medicine and non-invasive disease evaluations. The HYBRID consortium consists of international academic, industrial and non-governmental partners at eight European locations with extensive clinical experience in PET/MRI, along with three PET/MRI vendors (www.hybrid2020.eu).

During the first phase of commercialization almost a decade ago, three PET/MRI systems came to market, all of which had similar technological components and state-of-the-art image reconstruction algorithms, which require stringent quality control procedures and image quality parameters.

“Hybrid imaging systems such as PET/CT or PET/MRI, in particular, present additional challenges caused by differences between the combined modalities,” the authors wrote. “However, despite the increasing use of this hybrid imaging modality in recent years, there are no dedicated quality control recommendations for PET/MRI.”

So, how are institutions handling their duties to keep these hybrid scanners running at peak efficiency? Researchers set out in this study to summarize PET and MRI quality control programs employed by European hybrid imaging sites, as well as guidelines on maintenance and performance measures recommended by the systems’ manufacturers. The end goal was to develop a consensus on minimal quality control standards for PET/MRI for use throughout the HYBRID consortium.

The eight facilities operated five Biograph mMR systems from Siemens Healthineers, two Signa PET/MR systems from GE Healthcare and one Ingenuity TF PET/MR from Philips Healthcare. The GE and Philips systems both have PET time-of-flight (TOF) capabilities.

Survey results

All PET/MRI centres stated they perform daily quality assurance, which includes assessing detector stability in accordance with the manufacturers’ guidelines. There were, however, cases where facilities handled certain chores less frequently than others (Front. Phys. 10.3389/fphy.2019.00136).

Take sensitivity testing, for example, which was conducted with “high variability across the centres,” the authors wrote, ranging from “daily, quarterly, annually or at the commissioning of the system (during the acceptance test).” The sensitivity of a PET system reflects “how many events can be detected for a given activity within the field-of-view. A change in sensitivity can possibly be caused by malfunctioning detectors, changes in the energy resolution or the coincidence timing window.”
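
The definition quoted above (detected events per unit activity in the field of view) lends itself to a simple daily trend check. The sketch below is a hypothetical illustration of such a check, not any vendor’s or guideline’s actual routine; the baseline, count rates and 10% tolerance are invented for the example.

```python
# Hypothetical daily sensitivity trend check for the PET component of a hybrid
# scanner. Sensitivity here follows the definition quoted in the article:
# detected (true) events per unit activity in the field of view, expressed as
# counts per second per MBq. The baseline, measured values and 10% tolerance
# are illustrative assumptions, not vendor or guideline figures.

def sensitivity_cps_per_mbq(true_count_rate_cps: float, activity_mbq: float) -> float:
    """Detected true-event rate divided by the activity in the field of view."""
    return true_count_rate_cps / activity_mbq

def check_drift(measured: float, baseline: float, tolerance: float = 0.10) -> str:
    """Flag the measurement if it deviates from the baseline by more than `tolerance`."""
    deviation = (measured - baseline) / baseline
    status = "OK" if abs(deviation) <= tolerance else "INVESTIGATE"
    return f"sensitivity = {measured:.0f} cps/MBq ({deviation:+.1%} vs baseline): {status}"

if __name__ == "__main__":
    baseline = sensitivity_cps_per_mbq(true_count_rate_cps=150_000, activity_mbq=10.0)
    today = sensitivity_cps_per_mbq(true_count_rate_cps=128_000, activity_mbq=10.0)
    print(check_drift(today, baseline))
```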

Regarding spatial resolution, two centres provided no information for how often they tested, while one facility indicated the task was performed daily and the other five participants indicated their evaluations ranged from yearly to once every 10 years.

For image uniformity, three centres indicated they tested daily, compared with three other centres that conducted their assessments every three months, one centre that tested annually, and one location that did not perform the test as part of its routine quality control.

As for image quality, attenuation- and scatter-correction accuracy and quantitation, three centres ran those tests annually. One centre reported this assessment every 10 years, two facilities conducted the test as part of their acceptance testing of the PET/MRI system, and one centre does not perform it as part of its routine quality control.

Vendor evaluations

The trio of PET/MRI manufacturers was also involved in the survey, describing their standards for quality assurance. For daily quality control of the PET component, all three vendors provide dedicated phantoms containing long-lived positron-emitting sources. Using these phantoms, a detector stability test is included within all daily quality assurance procedures, but other system and component tests varied between vendors, the authors noted.

For the MRI component, for instance, Siemens’ quality assurance and preventive maintenance procedures usually include an image quality test using a spherical head phantom with an outer diameter of 180 mm and a spherical body phantom with an outer diameter of 300 mm, both of which the company provides. This test is done to assess signal-to-noise ratio (SNR) and artefacts.

GE provides at least one specific phantom in the daily quality assurance package to check the system’s MRI component for its geometric accuracy, ghosting level, and SNR. The imaging centre has the choice of how frequently to perform the test.

While Philips’ daily quality assurance for the Ingenuity PET/MR system does not include any test for the MRI component, the company does provide a 200 mm head phantom to validate the MRI component of its system through an MRI image quality evaluation.

Consensus recommendation

After reviewing the quality control procedures of the imaging centres and the vendors’ guidelines, Valladares and colleagues developed their own set of recommendations.

Quality control measures

Their minimum consensus for all the three PET/MRI systems includes the following:

  • Daily quality assurance implemented by the vendor for the PET component, with quarterly cross-calibration testing and yearly image quality evaluations
  • A monthly coil check for the MRI scanner and a quarterly MR image quality test
  • An assessment of the PET/MRI system’s alignment after any mechanical adjustments or work on the system’s gantry and after software updates

“It is also recommended to check the function and quality of additional hardware,” the authors added. “Checking the [MRI] coil performance permits the detection of issues with the coils before these affect clinical scans and clinical image quality. This test is particularly important when using flexible coils, which are more susceptible to deterioration.”

Taken together, the consensus recommendation is a positive step toward overcoming the inertia that has hindered standardized PET/MRI procedures and protocols, and it will help support high-quality assessments in routine clinical work and research using this multiparametric, hybrid imaging modality, they concluded.

  • This article was originally published on AuntMinnieEurope.com ©2019 by AuntMinnieEurope.com. Any copying, republication or redistribution of AuntMinnieEurope.com content is expressly prohibited without the prior written consent of AuntMinnieEurope.com.

Hamiltonian learning technique advances quantum spin register

Researchers have developed an efficient way to characterize the effective many-body Hamiltonian of the solid-state spin system associated with a nitrogen-vacancy (NV) centre in diamond. The technique will be important for making and controlling high-fidelity quantum gates in this multi-spin quantum register, they say.

A quantum register is a system made up of many quantum bits (qubits) and it is the quantum equivalent of the classical processor register. Quantum computers work by manipulating qubits within such a register and making and understanding such systems is essential for quantum information processing.

Crosstalk is a problem

Researchers have so far made quantum registers from many physical systems, including trapped ions, solid-state spins, neutral atoms and superconducting qubits. Whatever their nature, however, they all suffer from the same problem: crosstalk between multiple qubits when an individual qubit is addressed during a measurement. This induces state errors in the other qubits and reduces gate fidelity.

One way to overcome this problem is to fully characterize the many-body Hamiltonian of the system, which determines how it evolves, including the crosstalk interactions. “Once we have this information, we can predict the evolution of any initial states and, what is more, design optimized gate operations to reduce crosstalk errors and achieve high-fidelity gates in a multi-qubit register,” explains Panyu Hou of the Center for Quantum Information at Tsinghua University in Beijing, who is the lead author of this new study. “The Hamiltonian can often be fully described by some essential parameters, which characterize the coupling between the qubits.”
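
For an NV electron spin coupled to a set of weakly interacting 13C nuclear spins, those “essential parameters” are commonly written into an effective Hamiltonian of the following form (this is the standard parametrization used in earlier NV-centre work; whether the paper uses exactly this form is an assumption here):

$$ H \;=\; \omega_{\mathrm L}\sum_j \hat I_z^{(j)} \;+\; |1\rangle\langle 1| \otimes \sum_j \Big(A_\parallel^{(j)}\,\hat I_z^{(j)} + A_\perp^{(j)}\,\hat I_x^{(j)}\Big), $$

where $\omega_{\mathrm L}$ is the nuclear Larmor frequency set by the applied magnetic field, $|1\rangle\langle 1|$ projects onto one of the electron spin’s $m_s = \pm 1$ states, and $A_\parallel^{(j)}$ and $A_\perp^{(j)}$ are the parallel and perpendicular hyperfine couplings of the $j$-th 13C spin. Those couplings are the parameters that a Hamiltonian-learning procedure has to estimate.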

Diamond defects

Hou and colleagues studied the electron spin and the surrounding nuclear spins of a single NV centre in bulk diamond. An NV centre, or defect, is one of the hundreds of colour centres found in natural diamond and is known to be a promising platform for quantum information processing. The centre is formed when two adjacent carbon atoms in the diamond lattice are replaced by a nitrogen atom and an empty lattice site (or vacancy).

The electron and nuclear spins in the NV centre each play a different role. Electron spins allow for fast control and high-quality readout and, when coupled to light, allow for entanglement between qubits, even over long distances. This means that they can be used to realize a scalable quantum network. The nuclear spins for their part provide additional qubits that are stable – that is, they have long coherence times – and can thus be used to store and process quantum information.

Constantly-on interactions

For this solid-state spin register, however, the interactions among all the spin qubits are constantly on, explains Hou. This is why there is crosstalk error from nuclear spins other than the target spin being addressed.

In their work, the researchers characterized the effective many-body coupling Hamiltonian of an NV centre containing one electron spin and ten weakly coupled 13C nuclear spins. They then used the learnt Hamiltonian parameters to optimize the dynamical decoupling sequence and so minimize the effect of the unwanted crosstalk coupling. This adaptive scheme was first put forward by researchers at Delft University of Technology in the Netherlands.
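
As a generic illustration of the idea (and emphatically not the authors’ protocol), the sketch below simulates one electron spin coupled to a single 13C spin using the parametrization given earlier, generates noisy synthetic free-precession data and then recovers the two hyperfine couplings by least-squares fitting. All numerical values are invented and chosen so that both couplings visibly affect the signal; in the real experiment the weak couplings are resolved with dynamical-decoupling sequences rather than simple free precession.

```python
# Toy Hamiltonian learning for one NV electron spin coupled to one 13C nuclear
# spin, using H = wL*Iz + |1><1| (x) (Apar*Iz + Aperp*Ix). This is a generic
# least-squares illustration, not the protocol of the paper; all numbers are
# invented and chosen so that both couplings visibly affect the signal.

import numpy as np
from scipy.linalg import expm
from scipy.optimize import least_squares

I2 = np.eye(2, dtype=complex)
sx = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)   # spin-1/2 x operator
sz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)  # spin-1/2 z operator
P1 = np.array([[0, 0], [0, 1]], dtype=complex)         # projector on electron |1>

def hamiltonian(wL, Apar, Aperp):
    """Two-spin Hamiltonian (angular frequencies in rad/us): electron (x) nucleus."""
    return (wL * np.kron(I2, sz)
            + Apar * np.kron(P1, sz)
            + Aperp * np.kron(P1, sx))

def signal(times, wL, Apar, Aperp):
    """<Ix> of the nuclear spin versus time, with the electron prepared in |1>
    and the nucleus prepared along x (a simple free-precession experiment)."""
    H = hamiltonian(wL, Apar, Aperp)
    psi0 = np.kron(np.array([0, 1], dtype=complex),                # electron |1>
                   np.array([1, 1], dtype=complex) / np.sqrt(2))   # nucleus |+x>
    Ix = np.kron(I2, sx)
    out = []
    for t in times:
        psi = expm(-1j * H * t) @ psi0
        out.append(np.real(psi.conj() @ Ix @ psi))
    return np.array(out)

# synthetic "measurements" from a hidden Hamiltonian, plus noise
wL = 2 * np.pi * 0.10                                   # rad/us, illustrative only
true_Apar, true_Aperp = 2 * np.pi * 0.05, 2 * np.pi * 0.08
times = np.linspace(0.0, 20.0, 200)                     # microseconds
rng = np.random.default_rng(1)
data = signal(times, wL, true_Apar, true_Aperp) + rng.normal(0.0, 0.02, times.size)

# least-squares estimate of the two hyperfine couplings (wL assumed known)
fit = least_squares(lambda p: signal(times, wL, p[0], p[1]) - data,
                    x0=[2 * np.pi * 0.04, 2 * np.pi * 0.06])
print("fitted (A_par, A_perp) in kHz:", fit.x / (2 * np.pi) * 1e3)
```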

The researchers validated their technique by designing a universal quantum gate set that includes three single-qubit gates for each of the 11 qubits and the entangling two-qubit gate between the electron spin and each of the 10 nuclear spins. “In principle, we can realize any unitary operation on this 11-qubit register by combining the above gate set,” Hou tells Physics World.

“This learning technique could be a useful tool to characterize many-body Hamiltonians with a constantly-on interaction,” he adds. “The 11-qubit quantum register that we made might also be used as a proof-of-principle test of some quantum algorithms.”

The work, which is reported in Chinese Physics Letters, comes hot on the heels of new research from Delft, described in this Physics World article.

Handheld gas sensor distinguishes methanol from ethanol

Drinking as little as 6 ml of methanol can be fatal. Since the start of the year, there have been 1636 reported cases of methanol poisoning across a number of countries. The problem arises from adulterated, counterfeit or informally produced spirit drinks, which (according to the World Health Organization) can lead to inadvertent methanol consumption. Outbreaks of methanol poisoning are particularly common in developing countries, partly because current methods to detect methanol are resource-intensive, making them inaccessible. Jan van den Broek and colleagues at ETH Zurich and the University Hospital Zurich in Switzerland have now developed a handheld gas analyser that can successfully discriminate between methanol, ethanol and acetone. They also showed that the device can distinguish a low concentration of methanol (1 ppm) from a high concentration of ethanol (up to 62,000 ppm) within two minutes, which is essential for the analysis of alcoholic beverages.

Developing the sensor for methanol

The device comprises a column of polymer resin that separates methanol from interfering compounds such as ethanol before they reach the detector. The column works like a gas-chromatographic system, separating the components based on differences in their volatility. Downstream of the column, the researchers incorporate a highly sensitive but non-specific microsensor composed of palladium-doped tin oxide nanoparticles on interdigitated sensing electrodes – a measurable change in the conductivity of the semiconductor indicates the presence of methanol, ethanol or acetone as each compound emerges from the column.

The researchers then investigated the efficacy of the device at differentiating untainted liquor from contaminated beverages. They laced arrack (a Southeast Asian liquor) with varying concentrations of methanol (0.3–1% vol/vol). There was a peak in the conductance at 1.7 minutes for the more volatile methanol, which emerges from the column first, with the ethanol peak following at approximately 8.3 minutes. The device could not only clearly differentiate between the tainted arrack and the unadulterated liquor at all concentrations tested, but was also capable of quantifying the amount of methanol present. This is a first step towards implementing low-cost methanol analysis of alcoholic beverages, particularly in developing countries.
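
As an illustration of how such retention-time peaks might be picked out and labelled automatically, here is a minimal Python sketch using a synthetic conductance trace. The roughly 1.7-minute and 8.3-minute retention times come from the article; the trace shape, noise level, detection thresholds and retention-time windows are invented for the example.

```python
# Illustrative only: pick out and label chromatographic peaks in a
# conductance-versus-time trace by retention time. The ~1.7 min (methanol) and
# ~8.3 min (ethanol) retention times come from the article; the synthetic
# trace, noise level, thresholds and windows below are invented for this sketch.

import numpy as np
from scipy.signal import find_peaks

def gaussian(t, centre, width, height):
    return height * np.exp(-0.5 * ((t - centre) / width) ** 2)

# synthetic sensor trace: small methanol peak, large ethanol peak, plus noise
t = np.linspace(0.0, 12.0, 1200)                   # minutes
rng = np.random.default_rng(0)
trace = (gaussian(t, 1.7, 0.15, 0.8)
         + gaussian(t, 8.3, 0.60, 3.0)
         + rng.normal(0.0, 0.02, t.size))

# retention-time windows used to label detected peaks (illustrative values)
windows = {"methanol": (1.2, 2.2), "ethanol": (7.0, 9.5)}

peaks, _ = find_peaks(trace, height=0.2, prominence=0.2)
for idx in peaks:
    rt = t[idx]
    label = next((name for name, (lo, hi) in windows.items() if lo <= rt <= hi),
                 "unidentified")
    print(f"peak at {rt:.2f} min, height {trace[idx]:.2f}: {label}")
```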

The researchers also investigated the potential to detect methanol in breath samples. An ethanol-intoxicated volunteer with a blood alcohol content of 0.54‰ provided a breath sample, which was subsequently spiked with a methanol concentration slightly above that associated with serious methanol intoxication (135 ppm). When the sample was analysed, the device could clearly differentiate between the methanol and the ethanol. The researchers confirmed the results by analysing the same breath samples with benchtop proton-transfer-reaction time-of-flight mass spectrometry (PTR-TOF-MS).

The future of this device

The authors hope this “proof-of-concept” device will lead to the development of a fast and non-invasive method for detecting methanol poisoning. Additionally, it could pave the way towards an inexpensive means of monitoring alcoholic beverage production to prevent instances of accidental poisonings in the future. Full details of this research are reported in Nature Communications.

 

Forests reduce cyclone intensity

Planting trees could protect against storm damage. A new study suggests that major reforestation across Europe has the potential to reduce the number of extra-tropical cyclones by up to 80%.

Previous research showed that extra-tropical cyclones weaken when the land surface becomes rougher. This set Danijel Belušić from the Swedish Meteorological and Hydrological Institute and colleagues to wondering how much difference it would make to extra-tropical cyclone activity across Europe if the continent were covered in trees.

Using a regional climate model and a cyclone-tracking algorithm, the researchers investigated long-term changes in the number and intensity of cyclones, and their effect on precipitation, under conditions of deforestation and complete afforestation.

The results suggest that if Europe were completely covered in trees, the number of cyclones would drop by between 10% and 80% compared with a continent devoid of trees. West European coastal areas experienced the smallest reduction (10%), most likely because cyclones tend to arrive from the west and have less opportunity to interact with trees before hitting these areas. But by the time weather systems reach eastern Europe, Belušić and his colleagues estimate that afforestation could reduce the number of cyclones by as much as 80%.

“The increased roughness due to forests spins down a cyclone and reduces its intensity,” says Belušić.

With weaker and fewer cyclones, Europe would inevitably have less heavy rain. The model suggests that complete afforestation would reduce extreme winter rainfall by as much as 25%. However, Belušić and his colleagues suggest that this loss could be at least partially compensated by trees’ ability to access deep soil moisture.

“For example, due to the increased evapotranspiration, the atmosphere receives more moisture from the surface which can be converted into cloud and precipitation,” says Belušić, whose findings are published in Environmental Research Letters (ERL).

Indeed, the team’s model shows that total precipitation is similar for both scenarios, but that the afforested Europe experiences an increase in weak to moderate precipitation and a decrease in very heavy rain compared to a deforested Europe.

It’s likely that the total area of trees is important. Small pockets of forest are unlikely to increase surface roughness enough to affect an extra-tropical cyclone, which is typically several hundred kilometres wide. Right now, Belušić and colleagues don’t know how large a forest must be to influence a cyclone, but their model does confirm that very large areas of roughness have an impact. For the British Isles, which are similar in area to a typical cyclone, the researchers show that a covering of trees would reduce the number of cyclones over both the British Isles and the North Sea.

In theory, surface roughness could be increased by covering Europe with wind turbines or tall buildings but these structures wouldn’t bring the evapotranspiration benefits that accompany trees. Belušić speculates that artificial increases in roughness like these would be accompanied by significant decreases in rainfall.

The researchers also note that mass tree-planting to reduce cyclones should work in any mid-latitude location, not just Europe.

Currently, covering Europe with trees is hypothetical; the scientists say that more research is needed to understand the science better and to investigate the feasibility of such a major project. But as climate continues to change, solutions like this could become serious contenders in helping us adapt to increasingly extreme weather.

How are winners of the Nobel Prize for Physics selected?

Winners of the Nobel Prize for Physics join a group of elite scientists, including Albert Einstein, Marie Curie and Enrico Fermi. But who are the mysterious kingmakers who select the laureates? And how exactly does that decision process work? This video introduces the process and the different players involved in reaching that dramatic annual announcement. Find out more in this recent Physics World article, ‘Inside the Nobels: Lars Brink reveals how the world’s top physics prize is awarded’.

James Peebles, Michel Mayor and Didier Queloz share Nobel Prize for Physics

James Peebles, Michel Mayor and Didier Queloz have won the 2019 Nobel Prize for Physics “for contributions to our understanding of the evolution of the universe and Earth’s place in the cosmos”.

Peebles bags half the prize for “theoretical discoveries in physical cosmology”.  Mayor and Queloz share the other half for “the discovery of an exoplanet orbiting a solar-type star”.

The prize is worth SEK 9m (about £740,000). Peebles will get one half of the prize money and Mayor and Queloz will share the other half. The new laureates will gather in Stockholm on 10 December, where they will receive their medals at a ceremony.

“They have painted a picture of a universe that is far stranger and more wonderful than we could ever have imagined,” said Nobel Committee member Ulf Danielsson of Uppsala University when the winners were announced. “Our view of the universe will never be the same again.”

In the 1960s, Peebles developed the Big Bang model of cosmology as a theoretical framework for describing the universe. Using the model, he predicted properties of the cosmic microwave background radiation (CMB), which was created 400,000 years after the Big Bang. His work has meant that measurements of the CMB have provided profound insights into how much matter was created in the Big Bang and how this matter then clumped to form galaxies and galaxy clusters.

In the early 1980s Peebles was a driving force behind the idea that the large-scale structure of the universe can be explained by the existence of an invisible substance called cold dark matter. In the 1980s he also helped revive the idea of Albert Einstein’s cosmological constant to account for the theoretical observation that about 69% of the mass–energy in the universe is missing. In 1998 the first measurement of the accelerating expansion of the universe confirmed the existence of this missing mass–energy in the form of dark energy. As a result Peebles played a crucial role in the development of the Standard Model of cosmology, which assumes the existence of cold dark matter and dark energy without describing the precise nature of either substance.

There are still many open questions – what is dark matter and Einstein’s cosmological constant?

James Peebles

“We have clear evidence that the universe expanded from a hot, dense state,” Peebles said on the telephone during the press conference that followed the prize announcement. “But we must admit that dark energy and dark matter are mysterious. There are still many open questions – what is dark matter and Einstein’s cosmological constant?”

Wobbling star

In October 1995, Mayor and Queloz made the first discovery of a planet orbiting a star other than the Sun (an exoplanet). Using custom-made instruments at the Haute-Provence Observatory in France, the pair detected the tiny wobbling motion of a star 50 light-years away. This wobble is caused by the close orbiting of a gas-giant planet called 51 Pegasi b, which is about half the mass of Jupiter.

As well as being the first-ever sighting of an exoplanet, the discovery signalled a major shift in our understanding of planet formation and evolution because the distance between 51 Pegasi b and its Sun-like star is just 5% of the distance between Earth and the Sun. This is unlike the much larger orbits of gas giants in the Solar System, yet many more large exoplanets with similar orbits have since been found. This has prompted planetary scientists to develop new theories that describe gas giants as “wandering Jupiters”, forming in relatively large orbits and then migrating inwards.

“It’s a great day for exoplanets,” planetary scientist Sara Seager from the Massachusetts Institute of Technology told Physics World. “To see a field go from obscure, fringe and laughable to Nobel-prize-worthy is a huge tribute to all the people around the world that make exoplanets real.”

Charming discovery

Commenting on the work of Mayor and Queloz, Peebles said: “It is so charming to me to think that we once had a good theory of how planets formed until we found the first [exoplanet] – it’s a good illustration of how science works”.

Since the 1995 discovery, new techniques and telescopes have been developed to make increasingly sophisticated observations of exoplanets. This includes spectroscopic measurements of the chemical compositions of the atmospheres of some exoplanets, which could reveal whether they are suitable for life.

As of 1 October 2019, 4118 exoplanets have been found in the Milky Way. These reside in 3063 exoplanetary systems, 669 of which are known to contain more than one exoplanet – illustrating how rich the study of exoplanets has become since the 1995 discovery.

Peebles was born in 1935 in Winnipeg, Manitoba, Canada. He completed a degree in physics at the University of Manitoba in 1958 before finishing a PhD at Princeton University in 1962. He has remained at Princeton ever since and is still active in research.

Mayor was born in Lausanne, Switzerland in 1942. He did an MSc in physics at the University of Lausanne in 1966 before completing a PhD in astronomy in 1971 at the Geneva Observatory, which belongs to the University of Geneva. Mayor remained at the university – becoming director of the Geneva Observatory from 1998 to 2004 – until retiring in 2007.

Queloz was born in Switzerland in 1966. In 1995, he completed a PhD in astronomy at the University of Geneva under the supervision of Mayor. After a spell at NASA’s Jet Propulsion Laboratory from 1997 to 1999, he returned to the University of Geneva, where he has remained since. In 2013 he took up a joint position at the University of Cambridge.

Learn more about cosmology and exoplanets:

Some of our reviews of books on these topics:

  • “Commonly uncommon” – Hamish Johnston reviews One of Ten Billion Earths: How we Learn About our Planet’s Past and Future from Distant Exoplanets by Karel Schrijver
  • “How to build a planet” – Louisa Preston reviews The Planet Factory: Exoplanets and the Search for a Second Earth by Elizabeth Tasker
  • “The dark-energy game” – Robert P Crease reviews The 4% Universe: Dark Matter, Dark Energy, and the Race to Discover the Rest of Reality by Richard Panek

We have also created a collection of key papers by Peebles, Mayor and Queloz, which are free to read until 31 October 2019.


Physics World‘s Nobel prize coverage is supported by Oxford Instruments Nanoscience, a leading supplier of research tools for the development of quantum technologies, advanced materials and nanoscale devices. Visit nanoscience.oxinst.com to find out more
