
Physicists break distance record for electron spin-state transmission in spin qubits

For quantum computers to be feasible, quantum error correction is necessary to protect the information in qubit arrays even if individual qubits become corrupted. Its implementation requires that multiple qubits can interact with one another.

Researchers at the University of Rochester and Purdue University, US, have demonstrated the ability to manipulate the interactions between electron spin qubits in the form of spin swapping between electron pairs. They were able to transmit electron spin states with single-electron precision in a linear array of spin qubits – a significant step towards realizing fault-tolerant quantum information processing.

A spin on quantum state transmission

Led by John Nichol at Rochester, the researchers successfully demonstrated coherent spin-state transfer along an array of four electrons confined in a quadruple quantum dot in a GaAs/AlGaAs heterostructure. When they applied a voltage pulse to a gate between two quantum dots, the electrons in the dots exchanged their spin states via Heisenberg exchange coupling, which occurs when the wavefunctions of neighbouring electrons overlap. By applying a series of voltage pulses to specific gates, the researchers were able to shuttle the spin states of the electrons back and forth. They were also able to transmit the spins of entangled electrons using the same technique.
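For readers who want to see the bookkeeping, here is a minimal numerical sketch of how a chain of pairwise SWAPs shuttles a spin state from one end of a four-dot array to the other. It is an idealization, not the researchers' control sequence: a correctly timed exchange pulse implements a SWAP only up to phases, and pulse shapes, noise and decoherence are all ignored here.

import numpy as np

SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)

def swap_on(n_qubits, i):
    """Operator on n_qubits spins that swaps neighbours i and i+1 (0-indexed)."""
    op = np.eye(1, dtype=complex)
    q = 0
    while q < n_qubits:
        if q == i:
            op = np.kron(op, SWAP)
            q += 2
        else:
            op = np.kron(op, np.eye(2, dtype=complex))
            q += 1
    return op

# arbitrary spin state on the leftmost dot, spin-down on the other three
alpha, beta = 0.6, 0.8j                     # |alpha|^2 + |beta|^2 = 1
psi = np.array([alpha, beta], dtype=complex)
down = np.array([1, 0], dtype=complex)
state = psi
for _ in range(3):
    state = np.kron(state, down)

# exchange pulses between neighbouring dots walk the spin state to the right end
for i in range(3):
    state = swap_on(4, i) @ state

target = np.kron(np.kron(np.kron(down, down), down), psi)
print(np.allclose(state, target))           # True: the spin state has moved three dots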

“Spin swapping in electron pairs has been known for some time, but we were the first to experimentally apply it to long-distance quantum information transfer,” said Nichol. His team demonstrated the transmission of spins in four electrons, breaking the previous record of two.

These demonstrations are significant in two ways. Firstly, the process of spin transfer by Heisenberg exchange coupling is scalable to larger spin qubit arrays. Secondly, the researchers were able to transmit spins without having to move a single electron. This is akin to a row of rugby players swapping their jerseys without moving. However, unlike jerseys, an electron cannot be stripped of its spin. After all, spin is a fundamental property of the electron. Yet Nichol and colleagues were able to shuffle spin states as if they were physical objects to be moved around arbitrarily – the spins had become independent of the parent electrons.

A promising outlook

There is no theoretical limit on how far spin transmission can reach. Nichol and his group are planning to go to greater lengths – literally – and incorporate more electrons in the same GaAs/AlGaAs system.

Additionally, they aim to recreate these observations in silicon spin qubits. Unlike GaAs/AlGaAs, silicon can be isotopically purified to remove nuclear spins, which can serve as a source of magnetic noise. Nichol predicts that the transmission of electron spins in silicon qubits can be conducted with much greater fidelity, propelling humankind towards a quantum future.

“Having the capability to transmit quantum states paves the way for more bizarre tricks such as quantum teleportation – the instantaneous transfer of quantum information from one physical location to another,” said Nichol. “Our coherent spin state transfer should in principle allow us to do that, which is pretty exciting.”

Full details of this research are reported in Nature.

Infection model could explain spread of a curious type of virus, but some scientists disagree

New insights into our understanding of a curious type of virus have been claimed by physicists in China, Japan and the US – but some scientists are not convinced. The team modified the established “SIR” model of how viruses spread to try to explain the surprising success of multipartite viruses. These viruses have genetic material that is split amongst multiple viral particles, with each particle needing to infect the same cell for replication to occur. While some multipartite virus experts welcome the efforts of the physicists, they have questioned the biological accuracy of the modified model.

Numerical models are the bread and butter of theoretical physics and their application in biophysics has helped scientists glean important insights into biological processes. But it can be extremely difficult to turn the intricacies of nature into equations and on occasion crucial details can be lost in translation between biology and physics. According to some virus experts, this problem has cropped up in research described in a recent paper in Physical Review Letters, where, much to the excitement of experts in the field, physicists have tried to tackle the baffling behaviour of multipartite viruses.

For most viruses, it only takes one infection for a virus to replicate, but multipartite viruses are weird. Their genetic material is split between different particles, so multiple infection events are needed for replication. This seems disadvantageous and so the evolutionary success of multipartite viruses, primarily in plant and fungal hosts, has long baffled scientists.

Static interactions

Now, Yi-Jiao Zhang and colleagues at Lanzhou University, Tokyo Institute of Technology and Indiana University have used epidemiological models to evaluate a core factor that differs between plants and animals. Plants usually stay in one place and therefore interact only with their neighbours. In contrast, animals usually move around and have more dynamic interactions with others.

“Intuitively in a dynamic network with connections that change rapidly, you expect spread to be faster and broader. But we found that the multipartite viruses had a lower colonization threshold in the static networks, meaning that in plants the virus spreads more easily than in animals,” says Zhang. “This is very unusual for an epidemic model.”

Zhang and colleagues used a minimal model of a bipartite virus in two types of host networks – one that shuffles its connections, mimicking the changing interactions between animals, and a static network mimicking the relatively fixed interactions between plants. They began with a classic epidemiology model that defines viral hosts as being in one of three states: these are susceptible (S) when no virus particles are present in the host; infectious (I) with both parts of the virus present; and recovered (R) or dead from infection. They extended this “SIR” model to include a fourth latent (L) state where a host is infected with one type of virus particle but not the other. In this L state they cannot infect other individuals.
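As a rough illustration of that model structure (not the authors' actual implementation), the sketch below tracks susceptible (S), latent (L), infectious (I) and removed (R) hosts for a two-particle virus on a contact network that is either fixed, as for plants, or reshuffled every step, as for animals. All rates, sizes and the network construction are illustrative assumptions, so it only makes the bookkeeping concrete rather than reproducing the paper's threshold result.

import random

def simulate(n=500, k=4, beta=0.3, gamma=0.1, steps=200, shuffle=False, seed=1):
    """Toy S-L-I-R dynamics for a bipartite virus on a random contact network.
    Each transmission event passes one of the two particle types, chosen at
    random; a host becomes infectious only once it carries both types."""
    rng = random.Random(seed)

    def random_edges():
        # crude random contact network: k random partners per host
        return [(u, rng.randrange(n)) for u in range(n) for _ in range(k)]

    edges = random_edges()
    carried = [set() for _ in range(n)]   # particle types (0/1) held by each host
    state = ["S"] * n
    carried[0] = {0, 1}                   # patient zero carries both types
    state[0] = "I"

    for _ in range(steps):
        if shuffle:                       # "animal" case: contacts change every step
            edges = random_edges()
        received = [set() for _ in range(n)]
        for u, v in edges:
            for a, b in ((u, v), (v, u)):
                if a != b and state[a] == "I" and state[b] in ("S", "L") and rng.random() < beta:
                    received[b].add(rng.choice((0, 1)))   # pass one particle type
        for h in range(n):
            if state[h] in ("S", "L") and received[h]:
                carried[h] |= received[h]
                state[h] = "I" if carried[h] == {0, 1} else "L"
            elif state[h] == "I" and rng.random() < gamma:
                state[h] = "R"            # recovery or death
    return sum(s in ("I", "R") for s in state) / n   # final outbreak fraction

print("static network (plant-like)  :", simulate(shuffle=False))
print("shuffled network (animal-like):", simulate(shuffle=True))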

“Difficult to justify”

It is this L state that has sparked debate in the research community. Susanna Manrubia is a physicist at the Spanish National Centre for Biotechnology in Madrid, who has studied multipartite viruses for over a decade. She says: “Unfortunately, this L state is difficult to justify. Once the genomic fragments enter the cell, they are exposed to the action of proteins that degrade polynucleotides different from the DNA of the host (of the plant, in this case). Therefore, there is degradation in the unreplicated genomic material of the virus that affects L states.”

This concern is echoed by Mark Zwart of the Netherlands Institute of Ecology, who studies the ecology and evolution of viruses and bacteria. “This latent state doesn’t really mesh with what we know about the biology of these viruses,” he says.

Zhang acknowledges these concerns but points to studies that her team believes support the possibility that multipartite viruses survive in host cells for a sufficiently long time.

Manrubia disagrees, pointing out that replication is occurring in each of the experimental studies highlighted and therefore she does not think that they support a long-lived L state. “The problem is not trivial and, since experimental approaches to viruses are often highly intricate, there are important details that may have been inadvertently overlooked,” she says.

Important applications

Despite their reservations, both Manrubia and Zwart voiced their excitement that these types of models are starting to be applied to multipartite viruses.

Manrubia is pleased to see the bigger ecological picture of how host dynamics impacts multipartite virus spread being considered: “I have the hunch that their [multipartite viruses’] competitive strength relies on ecological features, precisely of the type this paper addresses.”

Zwart thinks the techniques used in Zhang’s research will be very useful in pinning down how multipartite viruses spread from cell to cell within a single plant host.

To spread within a host plant, viruses move from an infected cell to the adjacent cells using the existing communication and contact routes between cells. Perhaps these restrictions on virus spread are the reason why multipartite viruses have been so evolutionarily successful in plants. Zwart is hopeful that Zhang and colleagues will apply their methods to find out. “Plant viruses have really different lifestyles, and if they use these type of models at the within-host level then it might be more relevant and could capture those sort of dynamics,” said Zwart.

Trends in Nanotechnology: nano pioneers celebrate the conference’s 20th anniversary

How do you define the cutting edge of nanotechnology these days? Or should I say how do you define the edge states – topological materials cropped up more than once in the talks at Trends in Nanotechnology 2019, as did 2D materials and a number of other topics. Researchers from around the world convened in San Sebastian in Spain for the 20th anniversary of the conference and I tagged along for a couple of days to get the lowdown.

Top-trending topological materials 

The existence of materials where certain regions have a separate portfolio of properties to the bulk has captivated a lot of research groups, and for some the theories to describe these systems are infiltrating the way other systems are considered. “Symmetry and topology are the cornerstones of physics nowadays,” Ikerbasque research fellow Maia Vergniory told attendees – a bold claim, but as she pointed out herself, she is not the first to say so. What her group are among the first to do is to put the weight of numbers behind the claim. Treating her audience to a whistle-stop tour of Euler’s theorem for the vertices, faces and edges of polyhedra, Bloch’s theorem for periodic structures, the impact of spin-orbit coupling on densities of states, and comparisons of triangular, hexagonal and kagome lattices, she then outlined a predictive theory for topological materials that makes the link between real-space orbitals and momentum-space topology. With this theory she highlighted work analysing thousands of atomic limits found in nature that led to the conclusion that 27% of materials are in fact topologically “nontrivial” (for more details see Hamish Johnston’s article on Physics World from earlier this year). She then invited those interested to try the online tool she and her collaborators have developed that allows you to combine elements of your choice and identify whether or not the resulting material will have topologically nontrivial properties.

With topological behaviour so unexpectedly common it is little wonder that this line of research is having such an impact in nanotechnology and materials science. While a number of fundamental questions about the nature of these topological states still remain, Eugene Chulkov and Mikhail Otrokov were happy to relieve attendees of one of them – the existence of an antiferromagnetic topological insulator. Taking us through a series of options for achieving magnetism in a non-magnetic topological insulator, Chulkov – UPV/EHU professor and Materials Physics Center (CFM) and Donostia International Physics Center (DIPC) researcher – highlighted a paper currently on arXiv but accepted for publication in a scholarly journal soon, where they report predictions and the first observations of an antiferromagnetic topological insulator. Those keen to know more of the details did not have long to wait before further updates on the work from CFM’s Otrokov later in the afternoon. You can find out more too in a Physics World news story coming soon.

2D stays in trend

Of course, the topic that inspired and cross-fertilized the most talks, posters and discussions was graphene and 2D materials. Although many have been champing at the bit for years to see the commercial impact of graphene, Amaia Zurutuza, scientific director of Graphenea, advised her audience to be patient, adding “things take time – there are many examples of materials taking more than 20 years to get to market”. In the meantime research in the field remains prolific, which has been good news for Graphenea, who plan to release two new single-crystal graphene products later this year and have even moved the company’s output up the product line to supply field-effect transistors. You can hear her describe developments herself in the audio clip.

 

Amaia Zurutuza talks about commercial graphene

In terms of the practical aspects of fabricating graphene devices much has been learned. “There is some hope on the horizon that we can manufacture devices from 2D materials at around an 8-inch wafer,” AMO GmbH’s Daniel Neumaier told attendees following a round-up of progress in graphene device fabrication, adding, “But we need to evaluate a complete value chain.” Stephan Hofmann from Cambridge University highlighted some of the complexities in the chemical vapour deposition growth process favoured for graphene crystal growth that are not always appreciated in efforts to yield reproducible high-quality samples. He described it as a “heterogeneous growth” process because of all the different facets on the growing graphene islands, each with a potentially different growth-limiting step, be that dissociation-limited, interface-limited or something else. He also called into question how much of a problem imperfections such as holes in graphene pose, highlighting experiments suggesting that the size of the hole seemed to be less of a factor than the Debye length. As he described efforts to connect fundamental science with the road to market he stressed the need to consider the “interplay between software and hardware” and to think beyond CMOS and its prescriptive perfect-crystal requirements. Talking with him over the coffee break he laughed at my suggestion that the holy grail of graphene research might be holey after all.

One fabrication technique that raised eyebrows a few years ago is laser-induced graphene, where the intensity of the laser converts the polymer into graphene. The approach is attractive for flexible electronics, such as the in-plane microcapacitors that Joan Marti Gonzalez, a PhD student at the University of Barcelona, described in his poster, and has since been demonstrated on an unlikely range of materials including toast. (“Next must be skin, right,” said Marti Gonzalez, only half jokingly it seemed.) Flexible devices are an increasingly active field of research, as evident in the talk by Elvira Fortunato at the University of Lisbon. In addition, the 2019 Chemistry Nobel Prize for lithium-ion batteries has highlighted the importance of this research for some (many were already persuaded of the field’s significance) and these and other energy-storage devices will likely stay on trend in nanotechnology research for some time to come.

At the other end of the spectrum Diego Peña Gil from the Universidade de Santiago de Compostela in Spain described some of the cordon bleu of bottom-up graphene synthesis, from monomers carefully tailored for bespoke nanoribbon structures. He listed “beauty” among the motivations for the research, alongside the search for new materials, synthesis methods, structures and properties, adding “Why not? We produce very expensive buildings because they are beautiful and we are the same, we are organic architects.”

The poster session featured a lot of work on 2D materials, from β12 borophene doped with transition metals to aluminium doping for better blue phosphorene sensors. The emphasis on graphene and 2D materials may be no great surprise but it perhaps gives pause for thought to recognize that neither graphene nor topological insulators really existed as research topics twenty years ago when the first Trends in Nanotechnology conference took place. “I came 15 years ago and it was totally different – there was no graphene, it was all magnetic nanoparticles,” Neumaier told me over coffee.

The enduring attraction of magnetism and nanotechnology

While some things change others stay the same. Magnetism certainly still cropped up a lot, although with a radical makeover in the form of magnetic 2D materials, novel magnetic proximity effects, and – another phenomenon that had not yet been experimentally observed 20 years ago – skyrmions. Javier Junquera from Universidad de Cantabria described his group’s ultimately successful hunt for an electronic analogue to these magnetic vortex-like quasiparticles, despite several explanations of why their apparent absence so far was unsurprising. The interaction primarily responsible for the emergence of magnetic skyrmions – the Dzyaloshinskii–Moriya interaction – has no electronic analogue. Another factor is that electric dipoles – like those in a crystal lattice when a positive cation is slightly displaced with respect to the counter charges around it – can ultimately disappear, which is not the case for spin. The trick was to look at an interface layer between regions of opposing polarity. You can hear him describe the hunt himself too in the audio clip.

Interview with Javier Junquera

Organic nanostructures also featured, such as the impact of “unconventional” substrates on polymer growth. (The Ullmann reaction for polymerization typically only takes place on gold and silver surfaces.) Michele Pisarra at IMDEA Nanociencia described catalytic effects of the nanostructured Moiré patterns from graphene grown on ruthenium. His talk described reactions between the organic molecule TCNQ and CH2CN, but catching up with him in the break it seems the possibility of catalytic effects on other reactions is already under investigation.

I attended my first Trends in Nanotechnology conference in Tenerife in 2011. The calibre of the speakers is another aspect of the conference that doesn’t seem to have changed, so it was with genuine regret that I could only attend the first couple of days this year and hence missed a lot of photonics and bio-oriented talks among others scheduled for later in the week. The programme and attendance may be lean compared to some other conferences but the organizers seem to have mastered the kind of boutique conference that neatly blends prestigious presentations with a healthy student attendance and plenty of conferring time in breaks. Trends in Nanotechnology is organized by the Phantoms Foundation headed by Antonio Correia.

Does climate change mean Wales will hit truffle jackpot?

Known as the “prince” of black truffles, Périgord truffles are some of the most sought-after ingredients in the world. A combination of poor harvests and increasing demand has driven prices sky high, with top chefs paying as much as 5000 euros per kilo in recent years.

Unlike the super-rare white truffles from northern Italy and the Balkans, Périgord truffles can be semi-cultivated; there are plantations in the Périgord region of southern France, parts of north-eastern Spain and northern Italy. However, the days of farming truffles in the Mediterranean may be numbered. A study has shown that summer rainfall is key for a good truffle harvest. As drought becomes more common across the Mediterranean, truffle farming could become increasingly challenging but, as a new venture in Wales demonstrates, the Mediterranean's loss could be northern Europe's gain.

Cultivating Périgord truffles is still something of a dark art. Farmers grow a host tree seedling in a greenhouse then inoculate its root system with the spores of the Périgord truffle. When the tree is strong enough, they transplant it into a suitable environment – alkaline soil in limestone or chalk areas – and then leave it to establish for 6 to 8 years. Finally, specially trained truffle-sniffing dogs begin their hunt.

When truffle harvests plummeted in recent years it wasn't clear what was driving the decline. Farmers guard their techniques closely, and the truffle life-cycle is relatively poorly understood. Ulf Büntgen from the University of Cambridge, UK, and colleagues had a hunch that climate was responsible. To test this theory, they compared 49 years’ worth of truffle harvest data from Spain, France and Italy with climate data from the same period.

Their data clearly showed that truffle yield between November and March was significantly linked to summer rainfall. Wet summers produced bumper harvests the following winter, whilst dry summers led to meagre hauls. The researchers also noticed that wet autumns were detrimental to truffle harvests in the following months.
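For readers curious what that comparison looks like in practice, here is a minimal sketch of the kind of analysis described above, using invented numbers in place of the real harvest and climate records (the values below are placeholders, not the study's data):

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
years = np.arange(1970, 2019)                       # one value per harvest year
summer_rain_mm = rng.uniform(40, 160, len(years))   # invented Jun-Aug rainfall totals
# invented yields for the following Nov-Mar harvest, loosely tied to summer rain
winter_yield_kg_per_ha = 2 + 0.05 * summer_rain_mm + rng.normal(0, 2, len(years))

r, p = stats.pearsonr(summer_rain_mm, winter_yield_kg_per_ha)
print(f"summer rain vs following-winter yield: r = {r:.2f}, p = {p:.2g}")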

“It shows that there is a short but crucial period when truffle plantations need to be irrigated,” says Büntgen, who published the findings in Environmental Research Letters (ERL).

Exactly why Périgord truffles require rainfall at this time is not known. Truffles are not at fruiting stage at this point in the year, but it could be a crucial time for the host tree, whose roots the truffles eventually grow upon. “We think that summer rainfall supports the vitality of the host trees,” says Büntgen.

Some truffle farmers do irrigate, but until now there has been no consistent advice over when to do this and how much water to use. “Our work suggests that there is no point irrigating outside of the summer months,” says Büntgen.

The scientists also believe that it is crucial that farmers and scientists work together, sharing data to understand how to optimise truffle harvests. “We don't know how deep the water needs to penetrate into the soil, how dense the plantations should be, and what kind of understory might help to retain moisture,” Büntgen says.

To help answer some of these questions, and better understand which trees host truffles best, Büntgen and his colleagues are experimenting at the botanic gardens in Cambridge, where there are 2400 species of tree.

In the Mediterranean the increasing probability of summer droughts, combined with a shortage of water for irrigation, may make truffle farming high-risk. However, enterprising farmers further north are demonstrating that Périgord truffles are happy to adapt. Just over ten years ago, Matt Sims planted truffle-inoculated host trees on the sun-drenched chalk soils at his farm in Monmouthshire, Wales. Two years ago, Sims and his sprocker spaniel Bella harvested his first Périgord truffles – the first time these fungi had been cultivated in the UK.

This year Büntgen and colleagues published a study showing that white truffles are already marching north, most likely in response to climate change. Black Périgord truffles are likely to follow suit. Helping the truffles adapt to a changing climate will ensure they don't fall off the menu.

Queensgate reaches the pinnacle of nanopositioning performance

Queensgate – a brand of precision optical and mechanical instrumentation manufacturer Prior Scientific – has come a long way since it was formed in 1979 as a spin-out from Imperial College London. Initially formed to supply academic colleagues with Fabry–Pérot interferometers for use on large astronomy telescopes, the business grew to the point where NASA commissioned Queensgate to incorporate one of its systems on the Space Shuttle. This success allowed the company to diversify into nanotechnology instrumentation and fibre-optic devices.

Queensgate founders Paul Atherton and Thomas Hicks quite literally wrote the book on nanopositioning, publishing The Nano Positioning Book in 1997 and cementing Queensgate’s position as a thought leader in the field. Today, 40 years later, Queensgate remains a relatively small team working at the forefront of nanopositioning innovation, offering a range of custom and OEM nanotechnology products from nanosensors to nanopositioning devices, control electronics and software.

Disk-drive demands

It was this vanguard reputation that first piqued the interest of global data-storage company Seagate some 20 years ago – sparking a relationship that continues to this day. Seagate has been a major supplier of hard-disk drives in the computer industry for decades, and still dominates the market today. To maintain its position, the company must ensure the quality of its existing products while also constantly pushing technological limits to meet the demands for higher and higher capacity from data-hungry consumers.

This means building test systems for determining the quality of every read/write head Seagate ships – around 1 million heads per day – before building them into the disk drives. And that task is becoming increasingly difficult. “To keep the test equipment aligned to product, the positioning error budget is approximately halved every 5 years,” explains Ron Anderson, Seagate’s Managing Principal Engineer. “Today, the total position error budget is less than 0.5 nm, which means we are measuring at the angstrom level. Everything from conversations in the same room to fractional degrees of temperature change now affect positioning.”
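As a rough illustration of that scaling – the 4 nm starting value below is an assumed placeholder, not a Seagate figure – halving the error budget every five years pushes it below 0.5 nm within about 15 years:

budget_nm = 4.0                      # assumed starting error budget
for year in range(0, 26, 5):
    print(f"year {year:2d}: {budget_nm:.3f} nm")
    budget_nm /= 2                   # halved every five years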

For over a decade, Seagate had been using Queensgate’s controller and nanopositioning stage in their test systems. However, with the need to reduce the positioning error to the angstrom level, Anderson could see that the existing positioning stage was reaching the limit of its capabilities. Seagate needed new technology: “I performed competitive comparisons with other suppliers, but in the end we went back to Queensgate because of the strength of our partnership and their technical advantages.”

Beyond today’s requirements

Seagate’s brief for a new positioning stage arrived at the door of Queensgate Principal Mechanical Engineer John Clarke. Clarke immediately realized he was in for a challenge. “The stage had to fit the footprint of the earlier stage we had produced, but the specification we were given pushed for more range, more speed and better accuracy,” he recalls. These factors are generally conflicting; for example, if you increase the range you reduce the resolution, or if you increase speed you reduce accuracy.

To solve this engineering challenge, Clarke went on a fact-finding mission to Seagate’s engineering department and factories in the US. Talking with Seagate’s R&D teams and employees allowed him to get a better understanding of how Queensgate could help improve Seagate’s testing performance.

Queensgate products

After numerous design iterations and weekly calls with Seagate’s R&D team, Clarke’s efforts culminated in a prototype positioning stage that almost met Seagate’s requirements. However, although the new stage delivered in terms of range and speed, testing revealed that the accuracy of the prototype was not quite at the level Seagate had hoped. Unsurprisingly, given his passion to “push the performance of products and make leading solutions”, this setback only served to galvanize Clarke and his team. As Anderson recalls: “At no time did Queensgate stop and say, ‘This is the best we can do’. I enjoyed working with John for his perseverance and commitment to our performance targets.”

Clarke says that over the course of about a year, his team gradually improved on the accuracy of the stage – working in collaboration with Anderson’s Seagate engineers – until it was superior to the previous version. “It allowed us to offer longer range, higher speed and greater accuracy, and it was a drop-in replacement of the previous system so that Seagate could upgrade their test capabilities easily.”

Now embedded in Seagate’s quality-assurance systems, Queensgate’s patented new technology is testing Seagate’s two most recent products on a continual basis. It is also helping Seagate’s development labs to characterize new read/write head designs, allowing them to push hard-disk drive technology forward. Reflecting on the collaboration, Anderson concludes: “I feel that we have challenged them, helped them make their products better, and in return we have received what I believe is the best nanopositioner on the market.”

One image, one prediction: MRI may foretell cognitive decline after stroke

With a single MRI brain scan and an automated tissue segmentation tool, researchers from the UK can predict cognitive problems in individuals with stroke-related small vessel disease in the brain.

Their research, published in Stroke (10.1161/STROKEAHA.119.025843), may establish useful image-based biomarkers for small vessel disease severity. The technique is simpler than previously published measures and yields comparable results.

Diffusion-based MRI technique

A hallmark of small vessel disease is damage to tiny blood vessels in the brain caused by stroke and other diseases. Outwardly, individuals with small vessel disease may have problems in planning, organizing information and processing information quickly, although their memory functions often appear relatively stable. Others may develop dementia.

While early treatment could help individuals at risk, identifying those who may go on to experience cognitive decline or dementia remains difficult.

“Our objective was to find a measure of brain tissue microstructural damage,” says senior author Rebecca Charlton from Goldsmiths, University of London. “Using a new technique based on readily available MRI scans, we can predict which people go on to show cognitive decline and develop dementia.”


In 2001, Charlton and her research team began investigating the use of an advanced MRI technique, diffusion tensor imaging (DTI), to evaluate cerebral small vessel disease, track disease severity and identify individuals vulnerable to dementia and cognitive decline.

DTI is important when a tissue, such as the neural axons of the brain’s white matter, has a fibrous structure. Water will diffuse more rapidly in the direction aligned with this structure, and more slowly as it moves perpendicular to the preferred direction. To sensitize the MRI system to this diffusion, DTI techniques vary the magnetic field in the MRI equipment.
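As a hedged aside on how such directional diffusion is usually quantified – this is generic DTI background using the standard fractional anisotropy (FA) formula, not the DSEG pipeline described below – a minimal sketch:

import numpy as np

def fractional_anisotropy(D):
    """FA of a 3x3 diffusion tensor D, computed from its eigenvalues
    (standard definition: sqrt(3/2) * |lambda - mean| / |lambda|)."""
    lam = np.linalg.eigvalsh(D)
    dev = lam - lam.mean()
    return np.sqrt(1.5) * np.sqrt((dev ** 2).sum() / (lam ** 2).sum())

# illustrative tensors in units of mm^2/s
white_matter_like = np.diag([1.7e-3, 0.3e-3, 0.3e-3])   # strongly directional diffusion
isotropic_like = np.diag([1.0e-3, 1.0e-3, 1.0e-3])      # equal diffusion in all directions

print(fractional_anisotropy(white_matter_like))   # ~0.8, high anisotropy
print(fractional_anisotropy(isotropic_like))      # 0.0, no preferred direction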

DSEG predicts cognitive decline and vascular dementia

Unlike other MRI-based markers of brain damage in individuals with small vessel disease, the diffusion tensor image segmentation (DSEG) technique assesses brain microstructural damage from small vessel disease using a single DTI acquisition at 1.5 T and automatic segmentation of the cerebrum. The researchers obtained MRI scans annually for three years, and cognitive tests annually for five years, from 99 individuals participating in the study.

DSEG maps the cerebrum into 16 segments. By comparing the segments of an individual with small vessel disease to those of a healthy brain, the researchers derived a DSEG spectrum containing information about grey matter, white matter, cerebrospinal fluid and regions with diffusion profiles that deviate from those of healthy tissue.
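Purely as an illustration of the idea of comparing a 16-segment spectrum against a healthy reference – this is not the published DSEG metric, and every number below is invented – one could score each scan by how far its segment fractions sit from the reference:

import numpy as np

rng = np.random.default_rng(42)
# invented tissue-fraction "spectra" over 16 diffusion-defined segments (each sums to 1)
healthy_reference = rng.dirichlet(np.ones(16))
patient_scan = 0.85 * healthy_reference + 0.15 * rng.dirichlet(np.ones(16))

# one simple, illustrative way to score deviation from the healthy reference
deviation = np.abs(patient_scan - healthy_reference).sum()
print(f"deviation from healthy reference: {deviation:.3f}")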

The researchers found that DSEG measures increased over time, indicating progression of small vessel disease burden. DSEG measures also predicted decline in executive function and global cognition and identified stable individuals versus those who developed dementia.

The researchers found no relationship between DSEG measures and information processing speed, perhaps because DSEG covers the entire cerebrum and not just white matter tracts, with which information processing and small vessel disease are strongly associated.

The efficacy of DSEG needs to be validated using large-scale studies that follow patients over time. Research is also required to assess the utility of DSEG in small vessel disease caused by events and processes other than ischemic stroke.

The technique is currently limited by the researchers' choice of a single healthy reference brain, which defines a narrow representation of the healthy, aging brain. Charlton notes that the MRI arm of the upcoming Biobank study, which will acquire MRI scans, including DTI, from 100,000 individuals by the year 2020, should help with this.

“In the future, DSEG technology could be used as a decision support system for clinicians,” Charlton says. “This technique has the potential to identify those patients at risk for cognitive decline and vascular dementia.”

There’s no place like home


It’s usual to think of physicists as globe-trotting rebels who uproot their lives to go wherever the opportunities lie. Moving from city to city and country to country, they coolly switch locations, following their physics dreams – be it CERN in Geneva, oil rigs in the North Sea or start-ups in Silicon Valley.

The reality, though, is rather different. Many young people – physics graduates included – prefer to stay where they grew up or studied. They have an emotional attachment to the town, city or region they’re already living in, and are reluctant or unable to move. While it’s not exactly clear why, we attribute this to a sense of geographic belonging, which is related to place, family, social and community relations. In some cases, money is probably a key factor too.

The importance that students give to this “emotional geography” also affects the kinds of jobs they can get after graduation. In the UK particularly, the graduate labour market is geographically unbalanced, with most graduate jobs concentrated in London and south-east England and in the larger cities of Birmingham, Manchester and Leeds. So if you’re a physics graduate from a part of the country with few physics-based employers, it’ll be harder to find a job using your physics degree if you can’t – or don’t want to – move.

According to the 2015/16 Destinations of Leavers from Higher Education (DLHE) report, some 69% of UK graduates took their first job in the region of their home domicile. Further analysis using data from the survey shows that 45% of graduates did not move regions at all – they both studied and sought work in their home domiciled region. This inability or unwillingness of graduates to move means that university physics departments need to do more to recognize the importance of “place-based” decision-making when it comes to graduate career choices. Departments also need to adapt their physics degrees so that students can more easily find graduate jobs in the local area – and have the skills that local employers need.

Local skills for local people

The impact of emotional geography on physics graduate outcomes was a key theme of a meeting in London hosted by the Institute of Physics (IOP) in July. The meeting was organized by the South East Physics Network (SEPnet), which links nine physics departments in south-east England, and by the White Rose Industrial Physics Academy (WRIPA) – a collaboration between the universities of York, Sheffield, Hull, Nottingham and Leeds. Emotional geography, it turns out, is a particular headache for physics departments in regions with a low Gross Value Added (GVA) – one way of measuring economic output that can be used as an indicator of regional productivity.

This problem is borne out in an analysis carried out by Alastair Buckley, a physicist at the University of Sheffield, who examined the mobility of students who studied physics at the university between 2011 and 2017. He found that for a high proportion (about 50%) of Sheffield physics students, the domiciled (permanent) address they gave when applying for the course was within 100 km of the university. What’s more, after graduation 65% of Sheffield physics students chose to return to their domiciled address (most likely the parental home) to work. Buckley and colleagues consider these physicists to be “work immobile”, meaning that remaining in a desired location may be more important to them than the type of job they do.

Buckley’s analysis shows, however, that physics graduates at Sheffield who are “work mobile” – i.e. prepared to move after graduation – are more likely to get better, graduate-level work. That’s because other parts of the country with higher productivity and growth have a greater number of graduate-level jobs that make the most of a physicist’s specific skills and knowledge. Indeed, when the five WRIPA departments analysed the DLHE data, they found that physics graduates who stay in the Yorkshire, Humberside and East Midlands region (where those departments are based) have significantly lower prospects than those choosing to leave. Only 55% of them attain graduate-level jobs, compared with 80% of those who move.

82% of all students live in their original home region a year after graduating – and 65% are still based there a decade later

This insight corroborates a recent analysis by the UK Department for Education, based on the government’s controversial Longitudinal Education Outcomes (LEO) project, which provides information on how much UK graduates of different courses at different universities earn after graduating, by linking up data on tax, benefits and student loans. The LEO data suggest that 82% of all students live in their original home region a year after graduating – and that 65% are still based there a decade later.

Next steps

So what can be done to improve the situation – either by helping students become more mobile, or by improving the prospects of those who cannot or don’t want to move? The WRIPA has recently been awarded funding from the Office for Students’ Challenge Competition to boost links between physics departments and regional employers, to develop inclusive modes of work-based learning and to support physics students to be more work mobile. However, several strategies are already being tried at universities around the UK (see box).

Case study: Amy Hearst

One physicist who has benefited from her university’s links with local firms is Amy Hearst. She graduated from the University of Southampton, UK, and is now a production engineer at vivaMOS, which designs and builds CMOS image sensors for flat-panel X-ray detectors at its premises in Southampton Science Park.

However, Hearst might not have realized the opportunities on offer in Southampton were it not for a summer industrial placement she did at the defence and aerospace firm Leonardo, which has a centre of excellence for infrared detectors in the city. During her placement, she made ultrasensitive thermal-imaging cameras used on the TV shows Springwatch and Planet Earth.

Before the placement, Hearst had seen Leonardo as uninteresting. “To me it was just a grey building, it was defence,” she told delegates at a meeting in July about the impact of emotional geography on physics graduates in the UK.

After working at Leonardo as a summer student, Hearst was taken on full-time after graduating. She and her fellow graduate recruits formed a close social and professional network, which led to her wanting to stay in the Southampton area, and she found her current job at vivaMOS thanks to this network.

Hearst’s experience underlines the value of university physics departments developing links with local employers and also of helping physics students to sharpen their professional skills while studying.

One example is the University of Lancaster’s regional “employer engagement strategy”, which physicist Manus Hayne and colleagues developed to support students who want to stay in the area after graduating. Their strategy includes curriculum-based projects for final-year physics students, who work with local industrial firms on real-world problems. These projects are an effective way of connecting students with employers that are less likely to participate in traditional recruitment events. Together with guest lectures, industrial placements and opportunities for internships at local firms, such projects help to create what you might call a departmental “employability ecosystem”.

How can local hi-tech firms shape the nature of physics degrees offered by local universities to ensure that graduates have the appropriate skills?

But how can local hi-tech firms shape the nature of physics degrees offered by local universities, to ensure that graduates have the appropriate skills? One solution adopted by the University of Portsmouth’s School of Mathematics and Physics has been to set up an industry advisory board, which currently has about 20 representatives from local businesses. Complaints from one major local employer that it could not find graduates with the right skills in radar-systems engineering led to the university reintroducing this subject into its undergraduate physics degree.

Portsmouth also delivers a course on the applications and “impact” of physics in partnership with industry and healthcare professionals, developed in response to local skills needs. Indeed, the IOP’s revised degree-accreditation scheme gives physics departments the flexibility to respond to regional economic needs and focus on developing students’ skills. Physics departments are encouraged by the IOP to deliver degree programmes that maintain academic rigour while also embedding key skills such as innovation, leadership, creativity, entrepreneurship and self-management.

Working together

Another attempt to make physicists more employable to local firms has been developed by Trevor Farren, a director of business and knowledge exchange at the University of Nottingham. He delivers an optional interdisciplinary module entitled “Enterprise for scientists”, in which final-year physics students can elect to work with students from other disciplines to develop their innovation, business and entrepreneurial skills. The course involves groups of students pitching ideas to a “Dragon’s Den” panel of local businesses, thereby helping them to develop an understanding of finance, negotiation, sales and marketing as well as enhance their problem-solving and creative-thinking skills.

With jobs evolving all the time, programmes such as these help students to develop the transferable skills needed to adapt to the changing employment environment. They also help students build contacts with potential local employers, in the process raising the value and success of physics degrees offered by local universities. And with graduate recruiters saying that their ideal candidates should not only have a strong academic knowledge of a particular field, but also an ability to work outside their core area, it’s more important than ever to ensure that physicists develop the skills that will help them succeed in the marketplace.

Exoplanet researchers welcome ‘cataclysmic’ Nobel-prize announcement

Physicists and astronomers have welcomed yesterday's decision to award this year’s Nobel Prize for Physics to the discovery of the first exoplanet as well as research on the evolution of the universe. Despite some surprise regarding the mix of the two subjects in one prize, the award has been particularly welcomed by exoplanet researchers, who had worried that the Nobel Committee for Physics would never deem the field “fundamental enough” to be recognized.

James Peebles, Michel Mayor and Didier Queloz bagged the 2019 Nobel Prize for Physics “for contributions to our understanding of the evolution of the universe and Earth’s place in the cosmos”. Peebles won half the prize for “theoretical discoveries in physical cosmology” while Mayor and Queloz share the other half for “the discovery of an exoplanet orbiting a solar-type star”.

Over the past five decades, Peebles has made pioneering contributions to cosmology, including work on dark matter and dark energy. He made vital headway in predicting the properties of the cosmic microwave background even before it was discovered, as well as improving our understanding of how elements were formed soon after the Big Bang. He also came up with “cold dark matter” as a description of a universe filled partly with invisible matter and, according to astrophysicist Jo Dunkley from Princeton University, “also put back in the vacuum energy ‘Lambda’ to our list of cosmological ingredients – the dark energy that is still a mystery today”.

To see a field go from obscure, fringe, and laughable to Nobel-worthy is a huge tribute to people all around the world making exoplanets real

Sara Seager

“It was so obvious that Peebles should win the prize,” Dunkley told Physics World. “He really founded our entire field of modern cosmology – wherever you look in cosmology, you find Peebles’ vital contributions.” That view is shared by other astrophysicists, including Ofer Lahav from University College London, who says that the award to Peebles is “long overdue” and “well deserved”. “Through his articles and books he has inspired generations of cosmologists as well as pioneered the creation of large cosmological surveys, which are now at the core of cosmology research,” Lahav says.

Positive impact

Mayor and Queloz, meanwhile, have been awarded the prize for making the first discovery of a planet orbiting a star other than the Sun. In 1995, using custom-made instruments on the Haute-Provence Observatory in France, the pair detected the tiny wobbling motion of a star 50 light-years away. This wobble is caused by the close orbiting of a gas-giant planet called 51 Pegasi b, which is about half the mass of Jupiter.
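As a back-of-the-envelope illustration of how small that wobble is, the standard radial-velocity formula for a circular orbit, with roughly the published parameters of 51 Pegasi b (about half a Jupiter mass, a four-day orbit, a roughly solar-mass host – values taken from the literature rather than from this article), gives a stellar velocity of a few tens of metres per second:

import math

G = 6.674e-11                 # gravitational constant, m^3 kg^-1 s^-2
M_sun, M_jup = 1.989e30, 1.898e27

P = 4.23 * 86400              # orbital period of 51 Peg b in seconds (~4.23 days)
m_planet = 0.46 * M_jup       # minimum (m sin i) planet mass
m_star = 1.05 * M_sun         # roughly solar-mass host star

# radial-velocity semi-amplitude for a circular orbit with m_planet << m_star
K = (2 * math.pi * G / P) ** (1 / 3) * m_planet / m_star ** (2 / 3)
print(f"K ≈ {K:.0f} m/s")     # about 56 m/s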

Alan Boss, from the Carnegie Institution for Science in the US, told Physics World that the award is “a splendid surprise” and that Mayor and Queloz “fully deserve” to be recognized given they were the first to find proof of a planet in orbit around another sun-like star. “The surprise is that it took the Nobel committee nearly 25 years to recognize this huge discovery,” says Boss. He adds that their finding helped to foster support for NASA’s Kepler Space Telescope, which has since spotted thousands of exoplanets and determined that potentially habitable, Earth-like worlds are commonplace in our galaxy. “[It] is certainly the most astounding discovery in the last decade about our physical universe,” says Boss.

Sara Seager, an astronomer at the Massachusetts Institute of Technology, says the award for exoplanets is a “cataclysmic shift” for the subject. “To see a field go from obscure, fringe, and laughable to Nobel-worthy is a huge tribute to people all around the world making exoplanets real,” she told Physics World, adding that she hopes that there will be a “huge positive impact” in terms of funding for dedicated exoplanet missions in future.

Like all huge achievements, the discovery of 51 Pegasi b by Mayor and Queloz had important forerunners all of whom could claim a first for exoplanets

Elizabeth Tasker

Exoplanet researcher Jason Wright from Penn State University, meanwhile, says that he is “very pleased” to see the award go to Mayor and Queloz and that it is “well deserved”. He also admits never imagining the Nobel committee would dedicate a prize to the first exoplanets. “I did not think the committee would think of it as sufficiently fundamental to physics,” he says.

Early pioneers

As with all Nobel prizes, the requirement to limit the award to no more than three people – stipulated in the will of Alfred Nobel – will have made the 2019 prize as tricky as ever. Other early pioneers of exoplanetary science who will surely have been considered by the committee are astronomer Geoff Marcy and his colleague Paul Butler from the Carnegie Institution for Science. Butler, who is currently carrying out observations on the Magellan telescopes in Chile, told Physics World that the committee “had recognized the accomplishments of three great researchers,” adding that “their work has helped illuminate our understanding of the universe, and to redefine our relationship with the universe.” Another thorny matter is that in 2015 Marcy was found to have violated the University of California, Berkeley's sexual-harassment policy.

Exoplanet researcher Elizabeth Tasker at the Japan Aerospace Exploration Agency told Physics World that there were many people before Mayor and Queloz – and indeed Marcy and Butler – who made important contributions. “Like all huge achievements, the discovery of 51 Pegasi b by Mayor and Queloz had important forerunners all of whom could claim a first for exoplanets,” she says.

In particular, in 1988, a Canadian team of astronomers, led by Gordon Walker, Bruce Campbell and Stephenson Yang, detected the “wobble” of Gamma Cephei A using a new absorption cell that allowed much higher precision measurements of a star’s radial velocity. However, the trio didn’t have enough support to claim it was a planet – something that was only confirmed later, in 2004. Then in 1992, Alex Wolszczan from Penn State University and Dale Frail at the National Radio Astronomy Observatory in Socorro, New Mexico, announced two rocky planets around a pulsar – yet at the time these results were not widely accepted. “While these previous discoveries were equally as important to our knowledge of planet formation – it was the discovery of 51 Pegasi b that marked the floodgates opening to the discovery of new worlds,” adds Tasker.

Despite the plaudits for the individual researchers in this year’s award, some say that the mix of exoplanets and cosmology was unexpected. “Sharing the award between cosmology and extrasolar planets is a bit surprising given these are such different disciplines,” says Lahav. “But, in the spirit of the award's citation about understanding ‘the Earth’s place in the cosmos’, this is perhaps an encouragement to study the universe in a more holistic way.”

That view is backed up by Dunkley. “I think the mix of subjects awarded this year is interesting,” she says. “But there is a common thread – Peebles taught us how we came to be here in a galaxy full of stars and Mayor and Queloz discovered that planets beyond our own Solar System were created in this bigger arc of cosmic history. So together they take these huge strides towards understanding our place in the cosmos.”

  • We have created a collection of key papers by Peebles, Mayor and Queloz, which are free to read until 31 October 2019.


Physics World's Nobel prize coverage is supported by Oxford Instruments Nanoscience, a leading supplier of research tools for the development of quantum technologies, advanced materials and nanoscale devices. Visit nanoscience.oxinst.com to find out more

How green is green gas?

Green gas is being talked up of late as one new way forward for decarbonisation. So what exactly is green gas? It could, in fact, be a lot of different things, some of which are far from green. In general, that depends on the sources and on the counterfactuals of using or not using them. Biomass use can be near carbon-neutral, if the replanting rate keeps up with its use, but destroying natural carbon sinks (e.g. by aggressive use of forest products for fuel) means there can be net carbon dioxide rises, while greenhouse gas (GHG) emissions from using fossil gas may be reduced if carbon capture and storage (CCS) is included. In between these extremes there are all sorts of options, including synthetic gas made from surplus renewable power via power-to-gas electrolysis (P2G).

A helpful new report for the European Commission (EC) from The International Council on Clean Transportation offers a three-part classification system:

  • High-GHG: Gases with a life-cycle greenhouse gas intensity of 30% or higher than business-as-usual natural gas. The EC says that this category of gas “likely needs to be phased out in the near future in order to meet Europe’s climate targets”.
  • Low-GHG: Gases that reduce lifecycle greenhouse gas emissions by a substantial degree. For example, the Renewable Energy Directive 2018/2001 requires renewable power-to-methane to reduce greenhouse gas emissions by 70% compared to fossil fuels.
  • GHG-neutral: Gases with zero net greenhouse gas emissions. This includes pathways with negative greenhouse gas emissions on a life-cycle basis.
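Applying the three categories above is essentially a thresholding exercise. The sketch below is a hedged illustration in which the fossil-gas reference intensity and the 30%/70% reduction cut-offs are assumptions read off the list, not the report's exact numbers.

def classify_gas(ghg_intensity, fossil_reference=66.0):
    """Rough categorization by life-cycle GHG intensity (e.g. gCO2e/MJ).
    The 66 gCO2e/MJ fossil-gas reference and the 30%/70% reduction cut-offs
    are illustrative assumptions, not figures from the report."""
    if ghg_intensity <= 0:
        return "GHG-neutral"
    reduction = 1 - ghg_intensity / fossil_reference
    if reduction < 0.30:
        return "High-GHG"
    if reduction >= 0.70:
        return "Low-GHG"
    return "in between (depends on the pathway's exact accounting)"

for name, intensity in [("business-as-usual natural gas", 66.0),
                        ("power-to-methane from surplus wind", 12.0),
                        ("synthetic methane made with direct-air-capture CO2", 0.0)]:
    print(f"{name:50s} -> {classify_gas(intensity)}")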

Gaseous complexity

It certainly can get complicated. For example, the report says “the use of additional or excess zero-GHG electricity from wind and solar to produce hydrogen via electrolysis, or green hydrogen, is a renewable, GHG-neutral pathway. However, that same process utilizing conventional fossil-powered electricity in the absence of CCS will be neither renewable nor GHG-neutral. As with fossil gases, the production of hydrogen or synthetic methane using electrolysis utilizing conventional power with CCS could place those fuels in the low-GHG category, but they would be non-renewable”.

And here’s a nice conundrum – is carbon dioxide really saved when it is used to make new fuels? The report says that it is important to draw a distinction between CCS and carbon capture and utilization (CCU). It notes that “in CCU, captured carbon is utilized for a given fuel production process or end-use. In this case, CO2 from a fossil fuel power plant or used gases from a steel mill may be utilized as an input for synthetic methane production.” But of course, when that syngas is burnt, you get CO2 again.

What’s more, the report worries that “in the long-term, it is possible that policy tools may be necessary to ensure that CCU does not perpetuate the use of fossil fuels. If all industrial CO2 emissions are phased out in the future or claimed by other circular economy processes, all CCU must utilize CO2 from direct air capture to continue delivering climate benefits”.

Yes, true, CCU may lead to fossil “lock in”. But so may CCS. Whatever is done with the carbon removed from the atmosphere, or from power station exhausts, there will still be more coming unless we switch to non-carbon energy sources or cut demand so that less energy is needed. Indeed, some say that carbon removal offsets and zero-carbon generation are fundamentally at odds. A focus on compensatory carbon removal deflects resources from zero-carbon renewable energy generation and also from energy-saving -- arguably the only long-term solutions to climate change.

Then again, not all ostensibly renewable gas generation is necessarily green. The report says: “The cultivation of silage maize for biomethane production in the EU can be expected to displace food production on cropland and thereby cause indirect land-use change” and that “in conjunction with cultivation and processing emissions, pushes that gas source into the high-GHG category”. What’s more, “within a given combination of feedstock and process, factors such as the degree of methane leaks for a given facility may further influence the GHG categorization for that producer”. So some green gases may have issues.

New sources

How about grass, converted to biogas by anaerobic digestion (AD)? UK energy supplier Ecotricity is currently pushing for it as a vegan energy option, since it does not involve animal products: a metric the EC evidently hasn’t as yet taken on board! Ecotricity says it will start work on grass-powered biogas plants soon. AD conversion of domestic food and farm waste to biogas is another key option, with no new land-use implications. There are already 1 million UK biogas users, and biogas and other green gases can play a role in fuelling local combined heat and power plants linked to district heating networks.

some say that carbon removal offsets and zero-carbon generation are fundamentally at odds

The UK and many other countries certainly need to get on with decarbonising heat, and green gas could be a key option, replacing fossil gas. But, as the report for the EC notes, green gas of various types can also be used to make other fuels and products, including synfuels for use in some vehicles. Will there be enough green gas for that, as well as for heating and for power generation? Although there are obvious land-use limits to growing biomass as an energy crop, there are other sources, including farm and other wastes, and the biogas resource overall does look quite large. The Global Biogas Association says that only 2% of available feedstocks undergo anaerobic digestion and are turned into biogas and, if developed, biogas could cut global greenhouse gas emissions by up to 13%.

The synthetic gas resource could be even larger, if renewable power generation expands as much as is expected, creating large surpluses at times, which can be converted to hydrogen and possibly methane at reasonably low cost. Some projects may be designed specifically to make hydrogen. For example, there is a UK proposal for a £12 billion 4 GW floating offshore wind farm for on-board hydrogen production, the gas then being piped to shore. That may be some way off -- probably not until the early 2030s. A 2 MW prototype off Scotland is planned first, followed, if all goes well, by a 100 MW multi-unit test project in 2026 and leading on, if further backing can be found, to the full 4 GW version. It is a fascinating idea. The developer says, “if you had 30 of those in the North Sea you could totally replace the natural gas requirement for the whole country, and be totally self-sufficient with hydrogen”.
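A rough, hedged sanity check of that ambition – the capacity factor, electrolyser efficiency and UK gas-demand figure below are assumptions, not numbers from the article or the developer – suggests the output would at least be of the same order as national gas demand:

# back-of-envelope: hydrogen from 30 x 4 GW floating wind farms
farms, capacity_gw = 30, 4.0
capacity_factor = 0.5          # assumed for floating offshore wind
electrolyser_efficiency = 0.7  # assumed electricity-to-hydrogen conversion

hours_per_year = 8760
electricity_twh = farms * capacity_gw * capacity_factor * hours_per_year / 1000
hydrogen_twh = electricity_twh * electrolyser_efficiency

uk_gas_demand_twh = 800        # assumed ballpark for annual UK natural-gas use
print(f"≈ {hydrogen_twh:.0f} TWh/yr of hydrogen vs ~{uk_gas_demand_twh} TWh/yr of gas demand")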

One way or another, even if bold plans like that do not win through, the UK could have plenty of easily stored and pipe-transportable green energy for multiple uses, as long as any leakage problems can be dealt with: see my earlier post. Many other countries also have large green gas/hydrogen potentials. For example, in addition to biogas inputs, in most EU decarbonisation scenarios, hydrogen and derived fuels add up to between 10% and 23% of 2050 final energy consumption. And a new IRENA report is quite optimistic about the global prospects for green P2G hydrogen: “Hydrogen from renewable power is technically viable today and is quickly approaching economic competitiveness”.

More than electricity

Not much of this, however, seems to be taken on board in the scenarios reviewed in the recent Global Energy Outlook produced by Washington-based Resources for the Future (RFF). As in an earlier World Energy Council (WEC) review, the outlook compares scenarios from oil companies, the IEA (International Energy Agency) and so on, most of which see fossil fuels still romping ahead, feeding growing energy demand, with renewables, although expanding, barely keeping up. In most of the scenarios RFF looks at, the emphasis is on electricity-producing renewables, and under the highest renewables scenarios examined, they only reach a 31% primary energy share by 2040, matching oil’s share. Emissions rise in all scenarios, even those using CCS.

However, neither the WEC nor the RFF reviews look at the many, much more ambitious “100% renewables by 2050” global scenarios from NGOs like GENI and academics like Mark Jacobson and the team at Lappeenranta University of Technology (LUT). In all, there are now 42 peer-reviewed “100%”, or near-100%, renewables studies. Most of them cast the net wider than just electricity, with some including direct green heat supply on a large scale, with solar and biogas combined heat and power (CHP)-fed heat networks and heat stores – though nearly all of them still focus heavily on the power-supply side, leavened in some cases by P2G conversion to green syngases.

Electricity-generating renewables will obviously rule the roost, as costs continue to fall, and electricity has many good end-uses, but there are also other options that can play a part in meeting energy needs and in balancing variable green electricity power sources. Green gas is one.

Lithium-ion battery pioneers bag chemistry Nobel prize

The 2019 Nobel Prize for Chemistry has been awarded to John Goodenough, Stanley Whittingham and Akira Yoshino for the development of lithium-ion batteries.

The trio will share the SEK 9m (about £740,000) prize money equally and the award will be presented at a ceremony in Stockholm on 10 December.

“Electrochemistry is at the heart of this award, but other branches of chemistry have played a part,” said Nobel Committee member Olof Ramstrom from the University of Massachusetts when the winners were announced. “This prize is also connected to physics and even engineering – it’s a good example of how these disciplines have come together.”

Speaking over the telephone at the press conference, Yoshino said that it was “amazing” to hear he had won the Nobel prize. He added that it was “curiosity”, rather than thoughts of immediate applications, that pushed the work, and said that lithium-ion batteries are being used to build a “sustainable society”.

The story of the lithium-ion battery begins in the oil crisis of the 1970s when Whittingham was trying to develop new energy systems. He discovered that a battery cathode made of titanium disulphide can absorb large numbers of lithium ions from a lithium anode. In 1980, Goodenough showed that an even better performing cathode can be made from cobalt oxide.

Metallic lithium is an excellent anode material because it readily gives up electrons; however, it is highly reactive and early batteries were prone to exploding. In 1985 Yoshino solved this problem by creating a carbon-based anode that is able to absorb large numbers of lithium ions. This removed the need to use reactive metallic lithium, and the first commercial lithium-ion battery appeared in 1991. Since then, the devices have powered a revolution in handheld electronics and electric vehicles.
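As a hedged aside, the chemistry described above is commonly summarized by the overall cell reaction for the cobalt-oxide/graphite couple (a standard textbook form, not taken from the Nobel citation), in which charging drives lithium ions out of the cobalt oxide and into the carbon, and discharge runs the reaction in reverse:

$$\mathrm{LiCoO_2} + \mathrm{C_6} \;\rightleftharpoons\; \mathrm{Li}_{1-x}\mathrm{CoO_2} + \mathrm{Li}_x\mathrm{C_6}$$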

Oldest ever winner

Goodenough did a PhD in physics at the University of Chicago and at 97 is the oldest person ever to receive a Nobel prize. An American citizen, he has worked at the Massachusetts Institute of Technology, University of Oxford and the University of Texas at Austin.

Whittingham is a chemist who was born in the UK in 1941 and is based at Binghamton University in the US.

Also a chemist, Akira Yoshino was born in Osaka, Japan in 1948 and is at the Asahi Kasei Corporation in Japan and Meijo University in Nagoya.

The physical chemist and nanoscientist Rodney Ruoff of the Institute for Basic Science Center for Multidimensional Carbon Materials in Korea had been a colleague of Goodenough’s at the University of Texas. He told Physics World that Goodenough is “fun and fascinating”. “He’s a well-rounded individual, well read on many topics as well as, of course, topics of science surrounding the things he has done during his research career,” says Ruoff. “I’m really delighted that he has received this honour along with [Whittingham and Yoshino].”

Learn more about lithium-ion batteries:

We have also created a collection of key papers by Goodenough, Whittingham and Yoshino, which are free to read until 31 October 2019. You will find these papers below the list of papers written by the 2019 physics Nobel Laureates.


Physics World's Nobel prize coverage is supported by Oxford Instruments Nanoscience, a leading supplier of research tools for the development of quantum technologies, advanced materials and nanoscale devices. Visit nanoscience.oxinst.com to find out more
