
Virgo bags its first gravitational waves

The Virgo observatory in Italy has detected its first gravitational waves just two weeks after the upgraded facility was switched on. The signal came from the merger of two distant black holes and was also seen by the two LIGO detectors in the US, which have already detected three black-hole mergers on their own.

Detecting the same event at three separate locations on Earth gives astronomers a much better idea of where in the sky the gravitational waves were produced. Telescopes can quickly be pointed at that part of the sky to search for electromagnetic radiation given off by the same event – an emerging discipline called multimessenger astronomy.

Located in the flat countryside near Pisa, Virgo is an interferometer comprising two 3 km-long arms in a perpendicular configuration. The LIGO detectors are of similar design but have 4 km arms. Laser light is reflected multiple times between mirrors suspended at the ends of each arm and then combined at a detector. A gravitational wave is a ripple in space–time and when it passes through an interferometer, it can change the distances between the mirrors. This is detected as a change in how the laser light interferes at the detector.
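To give a sense of scale, a short back-of-envelope sketch (an illustration, not from the article): a passing wave with strain h changes an arm of length L by roughly h·L/2, which for realistic strains is far smaller than a proton's width.

```python
def arm_length_change(strain, arm_length_m):
    """Approximate change (m) in one interferometer arm for a given strain."""
    return strain * arm_length_m / 2

# Typical strain from a black-hole merger is of order 1e-21.
h = 1e-21
print(arm_length_change(h, 3000))  # Virgo's 3 km arms, ~1.5e-18 m
print(arm_length_change(h, 4000))  # LIGO's 4 km arms, ~2e-18 m
```

Displacements of order 10⁻¹⁸ m are why the mirror positions must be read out interferometrically rather than by any direct ruler.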

In a spin

The first gravitational waves to be detected by both Virgo and LIGO passed through Earth on 14 August 2017 and the event has been dubbed GW 170814. Physicists working on the LIGO–Virgo collaboration think that the signal was created by the merger of two black holes with masses about 31 and 25 times that of the Sun. The merger occurred about 1.8 billion light-years away and created a spinning black hole of about 53 solar masses. During the merger, a huge amount of energy, equal to about three solar masses, was radiated as gravitational waves, a tiny proportion of which eventually passed through Earth.
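As a rough consistency check (an illustrative calculation, not the collaboration's analysis), the quoted three solar masses of radiated energy follow directly from E = Δmc²:

```python
M_SUN = 1.989e30   # solar mass in kg
C = 2.998e8        # speed of light in m/s

def radiated_energy(solar_masses):
    """Energy (J) equivalent to a given mass in solar masses."""
    return solar_masses * M_SUN * C**2

# 31 + 25 solar masses in, ~53 out: ~3 solar masses radiated.
print(f"{radiated_energy(31 + 25 - 53):.2e} J")  # ~5.4e47 J
```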

Seeing the event in three (rather than two) geographically-separated detectors makes it easier to pinpoint where the gravitational waves came from. This is done by comparing the delay between signal arrival times at the detectors and by comparing the relative strengths of the signals at each detector. The distance to the source is calculated from the intensity of the gravitational waves.
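The arrival-time comparison can be sketched as follows. The geometry is simplified to a plane wave and the baseline is only approximate – a hypothetical illustration, not collaboration code:

```python
import numpy as np

C = 2.998e8  # speed of light in m/s

def arrival_delay(baseline_m, sky_direction):
    """Delay (s) between two detectors separated by baseline_m (a 3-vector)
    for a plane wave arriving from the given sky direction."""
    n = np.asarray(sky_direction, dtype=float)
    n = n / np.linalg.norm(n)  # normalize to a unit vector
    return np.dot(baseline_m, n) / C

# A ~3000 km baseline (roughly LIGO Hanford to Livingston) gives a
# maximum possible delay of about 10 ms:
baseline = np.array([3.0e6, 0.0, 0.0])
print(arrival_delay(baseline, [1.0, 0.0, 0.0]))  # ~0.01 s
```

Measuring the delay pins the source to a ring on the sky for each detector pair; with three detectors the rings intersect, which is why the third instrument shrinks the localization region so dramatically.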

A LIGO–Virgo measurement reduces the volume of the universe likely to contain the source of a detected gravitational wave by a factor of 20 compared with a detection by LIGO alone. The area of sky that is likely to contain GW 170814 can be limited to about 60 square degrees, about 10 times smaller than possible with data from just LIGO.

25 telescopes

Restricting the source to a smaller area of the sky allowed astronomers to train 25 telescopes on the source of GW 170814 in the hope of seeing electromagnetic radiation from the black-hole merger. None was spotted, which is not surprising for this type of event.

Other sources of gravitational waves, such as merging neutron stars, are expected to emit significant amounts of electromagnetic radiation. While such events have yet to be detected by LIGO–Virgo, Imre Bartos of the University of Florida points out that the longer duration and higher frequency of a signal from a neutron-star merger would make its location easier to pinpoint than GW 170814.

Having a third detector also provides more information about the intrinsic angular momenta (spins) of the black holes before they merge. Astrophysicists had thought that binary black holes form from binary stars, which means that their spins should be aligned. Like the previous three detections, however, the spins of the black holes that created GW 170814 appear to be misaligned. Bartos says that this suggests that the black holes formed independently, perhaps in the dense centres of galaxies, and then paired-up before merging.

Ambitious objective

Built in the early 2000s, Virgo ran from 2007 to 2011 without detecting any gravitational waves. It then underwent a lengthy upgrade that lasted until this year.

“The Virgo upgrade to Advanced Virgo had an ambitious objective: to significantly improve the sensitivity of our detector, in order to maximize the probability to detect gravitational wave signals,” says Federico Ferrini, director of the European Gravitational Observatory, which operates Virgo.

Both Virgo and LIGO have now shut down for further upgrades and Bartos is hopeful that all three detectors will be up and running in autumn 2018.

Physics World visited the LIGO Livingston detector in the US recently and you can read about it in “The great detector”.

Bartos and Marek Kowalski of Humboldt University and DESY in Germany have written the Physics World Discovery book Multimessenger Astronomy, which is free to read.

Australia to create space agency

The Australian government has announced it will create a national space agency to help grow its domestic industry. Speaking at the 68th International Astronautical Congress in Adelaide this week, Michaelia Cash, acting minister for industry, innovation and science, outlined how the move would create thousands of jobs and make the country less reliant on international partners for satellite and Earth-observation data.

Australia is one of the few developed nations not to have its own agency with the government eager to tap into the “rapidly” growing global space industry. “A national space agency will ensure we have a strategic long-term plan that supports the development and application of space technologies and grows our domestic space industry,” says Cash.

Expert reference group

The establishment of a space agency was recommended by an expert reference group chaired by Megan Clark, head of the Commonwealth Scientific and Industrial Research Organisation. The group was appointed in July to review Australia’s space industry and received almost 200 written submissions that highlighted the need for such an agency. It will now develop a charter for the space agency by the end of March 2018.

Physicists create and measure quantized mechanical oscillations

Two groups of physicists – one in the US and one in Europe – have developed two different techniques to create and measure quantized mechanical oscillations. The research has fundamental implications for the macroscopic limits of quantum mechanics, and the techniques could also be used to transfer quantum states between different systems – a capability that could underpin new technologies for quantum computing and quantum communication.

Quantum mechanics is usually thought of as the physics of very small objects such as atoms or subatomic particles. It also governs larger objects, but even a relatively small macroscopic system must be cooled to extremely low temperatures for quantum effects to be observed above fluctuations related to thermal noise.

In 2010, Andrew Cleland and colleagues at the University of California, Santa Barbara observed quantum effects in an object visible to the naked eye by coupling a microwave-frequency mechanical oscillator to a superconducting circuit. John Teufel of the National Institute of Standards and Technology (NIST) says that this work was “ahead of its time”, and because it was so technically challenging there has been very little follow-up work. “In the original Cleland paper, their heroic measurement was being able to measure [the quantum state] before this thing decayed,” he says. Further work such as constructing unusual quantum states or performing intricate operations was impossible.

Simple and improved

In one of the new experiments, Robert Schoelkopf and colleagues at Yale University fabricated a conceptually similar, but technically much simplified and improved, set-up using a superconducting circuit with multiple, tunable energy levels. Team member Yiwen Chu explains that two of these energy levels can be used to create a quantum bit (qubit).

One of the circuit’s electrodes is connected to a thin disc of piezoelectric aluminium nitride at one end of a sapphire resonator cavity. This resonator contains multiple independent, quantized vibrational modes. The aluminium-nitride’s piezoelectricity allows the researchers to inject quantized vibrations (phonons) into the sapphire or to draw them out.

By adjusting the spacing between the energy levels, the researchers can tune the qubit into resonance with any of the vibrational modes. They performed several operations on one of the quantum states, such as measuring its decay rate by exciting a single phonon mode, waiting a variable amount of time and extracting the energy again to measure the probability that the phonon had been lost. They found that the lifetime of the state is approximately 17 μs – around 1000 times longer than the lifetime measured by Cleland’s team in 2010.
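The decay-rate measurement described above amounts to fitting an exponential to survival probabilities at increasing wait times. A minimal sketch, with synthetic data standing in for the real measurements:

```python
import numpy as np

def fit_lifetime(times, survival_probs):
    """Estimate lifetime T1 (same units as times) from decay data,
    assuming p(t) = exp(-t / T1), i.e. log(p) is linear in t."""
    slope, _ = np.polyfit(times, np.log(survival_probs), 1)
    return -1.0 / slope

# Synthetic data with T1 = 17 us, matching the value reported for
# the Yale device.
t = np.linspace(0, 50, 11)          # wait times in microseconds
p = np.exp(-t / 17.0)               # idealized survival probabilities
print(round(fit_lifetime(t, p), 1))  # recovers ~17.0
```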

Schrödinger’s cat states

The Yale researchers now hope to use this extended lifetime to create more complex quantum states in the resonator. “In principle, we can, for example, put two phonons into our mechanical resonator,” says Chu. “That just involves exciting the qubit, putting in one phonon, exciting the qubit again and putting in another phonon: it’s just a matter of how many quantum operations we can do before the quantum state is lost.” They also hope to demonstrate “Schrödinger’s cat” states, in which the resonator vibrates in two independent ways at once. This would be a first for a mechanical system.

The other experiment was done by researchers at the University of Vienna in Austria and Delft University of Technology in the Netherlands. It involves firing laser pulses at a microscopic optical cavity that is cooled almost to absolute zero. Each laser photon has an energy greater than that of a photon at the optical resonance of the cavity. The excess energy is equal to the energy of a phonon at the cavity’s mechanical resonance frequency. While most of the photons are simply reflected from the cavity, about 1% create a photon and phonon inside the cavity. “If we detect a photon that’s on resonance with the cavity, we know we must have excited the mechanical system,” explains Delft’s Simon Gröblacher.

When this happens, the researchers switch to probing the cavity with much brighter laser pulses with photons of energy lower than the cavity’s optical resonance frequency. Occasionally, one of these photons will absorb energy from a phonon giving it a new energy that matches the optical resonance of the cavity.

Preserved states

By looking at the statistical properties of the photons emitted on resonance with the cavity, the researchers can show that these photons are in the same quantum state as the photons originally used to excite the mechanical resonator. The quantum state of the input photon had therefore been preserved in the single phonon they created and was then transferred back onto the output photon.

The team is now working to achieve more control over precisely when a phonon is inserted into the cavity and extracted, which would be necessary for creating a quantum memory: “At present, we get a receipt when the state has been transferred,” says Vienna’s Marcus Aspelmeyer. “We would like to know ahead of time with 100% probability that, when we push the button, the state will be transferred.”

The European researchers’ control over their quantum states is less sophisticated than that achieved by the Yale group. However, their scheme has practical advantages because the input and output quantum states are carried by photons at telecommunication wavelengths, and it could therefore be used in quantum networking.

Quantum connections

Chu agrees: “Superconducting circuits are very good at generating complex quantum states, but photons have the advantage of being able to send information over very long distances using light,” she says. “One of the holy grails of this field is to be able to connect these two systems – for example to use a superconducting circuit to create a complicated quantum state, convert it into light and send it off down an optical fibre. One possible way of doing this is by coupling them through a mechanical oscillator: each of these two papers is demonstrating half of that conversion.”

Teufel, who was not involved in the two experiments, is impressed: “One of the frontiers of what people are trying to do [in quantum mechanics] is to measure mechanical systems at the quantum level,” he says. “These two papers, in two very different ways, are at the forefront of that.”

The research is described in two papers that are published in Science.

Quantum quartet is on the cards

By Hamish Johnston

Theoretical physicist and blogger Sabine Hossenfelder has created a lovely set of cards featuring pioneering quantum physicists. You can see my two favourites above.

Resembling football or baseball cards, they each feature a portrait plus a fact or two about the physicist – including a salacious aspect of Erwin Schrödinger’s personal life. You will have to go to Hossenfelder’s blog to learn more about that – and see the rest of the cards, including a feline “Bra-ket” as the joker of the deck.

Could ‘cellular Lego’ be the future of regenerative medicine?

French researchers have aggregated stem cells containing magnetic nanoparticles to engineer a tissue that’s deformable at will using magnets. This approach does not require any external supporting matrix for the tissue and could be the building brick of the regenerative medicine of tomorrow (Nature Communications 8 400).

Stem cells are the cornerstone of regenerative medicine due to their unique ability to develop into a range of specialized cell types – a process called cellular differentiation. We know that mechanical factors, such as stretching the cells, can greatly influence stem-cell differentiation, as highlighted by many studies in both 2D and 3D cell cultures.

When studying the behaviour of stem cells in volumes, the most common technique uses scaffolds to control the spatial distribution of the cells; but the stretching motion is limited by the scaffolds, potentially hindering the differentiation process.

Bypassing scaffolds

The research team, led by senior author Claire Wilhelm, bypassed the need for scaffolding by injecting magnetic nanoparticles into stem cells and using magnets to aggregate the cells, like Lego bricks, and form a tissue. A second magnet then stretched and compressed the tissue at will to drive differentiation.

Securing those results was not a foregone conclusion; Wilhelm and her team had to ensure that many conditions were met. They first had to show that incorporating the nanoparticles had no impact on the functioning of the stem cell or its ability to differentiate, and that the cell would remain magnetic long enough to force differentiation. When the viability of the created tissue was established, the researchers magnetically stretched the tissue and observed whether the mechanical stimulation impacted cell differentiation.

By imposing a “magnetic beating” that imitated the contraction of the heart, they observed that the magnetic stem cells were differentiating toward cardiac cell precursors, providing further evidence of the potential for purely mechanical forces to induce cell differentiation.

A powerful tool for regenerative medicine

Resorting to magnetic nanoparticles in regenerative medicine is not a novel idea, as various strategies have used such particles for noninvasive in vivo tracking of stem cells or for targeting. This is, however, the first time that researchers have used the particles’ magnetic properties to drive differentiation. The approach outperforms the gold-standard hanging-drop method in both the success rate of tissue formation and the control over the tissue created, while requiring fewer manipulation steps.

Building and manipulating the tissue with magnets constitutes an “all-in-one” approach that avoids the usual caveats of traditional techniques. These promising results must now be repeated for differentiated cells other than cardiac cell precursors, but this method is a step forward toward easier and more customizable tissue engineering in regenerative medicine.

‘Look happy dear, you’ve just made a discovery’

When Jocelyn Bell Burnell discovered pulsars 50 years ago, in 1967, she was not asked about her groundbreaking finding, but how she compared in height to Princess Margaret. Unfortunately, this was not the first time, nor the last, that she would face such sexist attitudes. Bell Burnell persevered, however, and has gone on to become a fellow of the Royal Society, a dame and a multi-award-winning scientist – although most controversially, not a Nobel prize winner.

In July Bell Burnell was presented with the prestigious President’s Medal of the Institute of Physics (IOP) “for her outstanding contributions to physics through pioneering research in astronomy, most notably the discovery of the first pulsars, and through her unparalleled record of leadership within the community”. The award, which is given at the discretion of the IOP president – currently Roy Sambles – was presented at the University of Birmingham in the UK as part of the International Conference on Women in Physics. Rather than giving an Oscars-style acceptance speech, Bell Burnell delivered a fascinating lecture, outlining her career in physics as well as the obstacles she had to overcome along the way. As her story would reveal, she faced much bias and stereotyping, but she refused to yield to society’s expectations.

Success in spite of bias

It was at a young age that Bell Burnell first had to fight for the right to learn science. At her school in Northern Ireland, only boys were taught science and girls were instead given cooking lessons – after all, they were going to get married and raise a family, not study or have a career. To say Bell Burnell was unhappy about this would be an understatement. Her parents “hit the roof” and, soon after, she and two of her female friends joined the boys’ science class.

It did not take Bell Burnell long to realize that of all the sciences, she found physics the most interesting. She was “not so keen” on chemistry, and biology was “boring”, but she found physics fascinating and came top of the class, going on to study the subject at the University of Glasgow and graduating in 1965.

At this time it was traditional for male students to cat-call, heckle and stamp their feet when a female student entered a lecture theatre – something Bell Burnell found herself facing alone in the final years of her undergraduate degree as the only woman left on her course. Undeterred, she went on to earn a PhD at the University of Cambridge. She was passionate about astronomy but also loved a good night’s sleep, and so optical astronomy – which involves working a lot at night – did not appeal. Instead, she opted for radio astronomy.

(right) Photograph of Jocelyn Bell Burnell in 1977 with an analogue trace from a radio telescope. (left) Photograph of Bell Burnell with her supervisor Antony Hewish among the wires of the radio telescope in East Anglia, UK

As part of her PhD, Bell Burnell and five colleagues built a radio telescope on a patch of land two-and-a-half times the size of a football pitch. It took two years to construct and required a lot of manual labour. Her colleagues assumed she would be too “girly” to do such things – “It’s not suitable for a woman, you’re working up a ladder all the time,” she was told. But Bell Burnell had other ideas – she was too hands-on for that attitude to remain in place for long – and she quickly got stuck into the construction work.

After the team finished the telescope, its operation became Bell Burnell’s responsibility. Data collection was yet to be digitized and so a complete scan of the sky took four days and required 120 m of paper. Bell Burnell’s six-month period of observation generated 5.3 km – all of which she analysed by hand. In the autumn of 1967, her attention to detail identified a mysterious trace – a 0.5 cm-long “scruff” of signal showing a series of regular peaks in luminosity. This bugged her. “I just couldn’t understand it,” she recalls. When she detected the signal again in the same patch of sky, her brain jumped into gear. “You’ve seen this somewhere before, haven’t you?” she asked herself. After digging through shoe boxes full of data, she concluded that the signal was neither a scintillating source, nor man-made interference. Instead, the near-perfect regularity of the peaks, which repeated every one-and-a-third seconds, led to her noting “LGM-1: Little Green Men?” in the margin of her lab book.

It would soon become clear that the little “scruff” was the first ever detection of a pulsar – a rotating neutron star that emits a regular ticking signal of radio waves. “It was a totally unexpected result that caused a major reappraisal within astrophysics,” says Bell Burnell. In fact, her supervisor Antony Hewish initially refused to believe she had found a pulsar and was convinced the signal was a product of either humans or aliens. Bell Burnell and her colleagues tested for all such possible signals. If the source were aliens on a planet, the pulses would show a Doppler shift as the planet orbited its own star – which they didn’t. The pulses couldn’t be made on Earth because they kept sidereal time – a timescale based on Earth’s rotation relative to the distant stars rather than the Sun. Another observing team on a different telescope also detected the pulsations, ruling out instrument error. With these other possible sources eliminated, in January 1968 the Cambridge team – with Hewish as lead author and Bell Burnell second – submitted a paper to Nature, announcing the “Observation of a rapidly pulsating radio source” (217 709).

In Bell Burnell’s mind, the discovery was not a consequence of her ingenuity or skill, but rather it was a result of her “imposter syndrome”. She didn’t feel bright enough to be at Cambridge and had struggled to find her place among her predominantly male colleagues, who were described by some as “frequently in error, but never in doubt”. She therefore worked like crazy to avoid being thrown out, and refused to give up.

Test of tenacity

In the press coverage that followed the finding, Bell Burnell found that sexist attitudes spanned beyond academia. In one photo shoot she was told to pose triumphantly – “Look happy dear, you’ve just made a discovery!” And despite her being listed as second author on the Nature paper, the press was more concerned with her love life than her scientific accomplishments. “Initially, the discovery didn’t have a major effect on my career… except to help me survive,” she says.

There was also the matter of the Nobel prize, which even in the 1970s sparked controversy. Hewish and his colleague Martin Ryle were awarded the Nobel Prize in Physics in 1974 for the pulsar discovery, while Bell Burnell did not get recognition. While many people nowadays assume that this was down to her gender, she in fact attributes it to her being a student at the time. She doubts, however, that such an injustice could happen again, pointing out that the 1993 Nobel prize for finding a binary pulsar went to Russell Alan Hulse, who was a student at the time of the discovery, along with his supervisor Joseph Hooton Taylor Jr. “At least they don’t make the same mistake twice,” she says.

Photograph of Jocelyn Bell Burnell at the Oxford Literary Festival in March

Continuing her groundbreaking research, Bell Burnell went on to find three more pulsars. Her friends were more interested in her love life, though, and when she got engaged to be married between pulsars two and three, this was all they spoke about. Bell Burnell wore her engagement ring to the lab, but later regretted it, as she says it was seen as a signal to the scientific community that she was quitting her research: “Society expected young women to get married, not make major astronomical discoveries!”

When Bell Burnell became pregnant, it presented new challenges to an already difficult research career. She explains that there was a consensus that “if mothers work, the children will become delinquent”, and no-one quite understood why she wasn’t delighted at the prospect of becoming a housewife. “You’ve got a husband, a new baby and a new house and you say you’re bored – what’s wrong with you?” another woman once said to her.

But attitudes in general were changing and Bell Burnell believes her generation experienced a turning point; for the first time, women could have successful professional careers alongside running a family. It was tough, though, and a woman’s career was not deemed as important as a man’s. Once, when her son was taken ill at school and the teachers couldn’t get hold of Bell Burnell because she was working at a telescope, they turned to him and asked “You’re not so ill we have to disturb daddy are you?”

Bell Burnell’s career, which “felt like snakes and ladders”, has spanned right across the academic spectrum, just as her research career has spanned the electromagnetic spectrum. She has been a researcher, a university lecturer, a tutor, a manager, a professor, a head of department, a dean, a principal researcher, an outreach ambassador and IOP president.

Lessons still to learn

Despite her own success, Bell Burnell is concerned that “younger women think the battles have all been fought”. The statistics are still bad – women are still under-represented in physics, and it gets worse in senior positions, where women progress more slowly than men and are less successful when they put in grant or job applications.

Bell Burnell describes how the attitude for solving this problem is often “Fix the women!” – make them braver, give them special training to address their inability to communicate and so on. However, this assumes the problem is with women, not with the scientific community or society in general. Bell Burnell notes that, while individuals may be fine, flawed institutional structures and policy can nurture bias.

To this end, she identified that universities would only become more gender balanced if they were competing for something. This revelation led to the creation of Athena SWAN – the world’s first award scheme to recognize commitment to women’s academic careers. Bell Burnell was pivotal in establishing the scheme in 2005, which is a gender-equality charter for university departments that challenges people to address and stop institutional sexism. But, she says, there is still a long way to go.

Bell Burnell concluded her IOP President’s Medal lecture with the famous quote by Laurel Thatcher Ulrich: “Well-behaved women seldom make history.” It is a statement that perfectly describes her fight against society’s expectations – and a career that continues to inspire both men and women today.

    • The Jocelyn Bell Burnell Award Event, which recognizes an outstanding very early career female physicist, will be held on 11 October in London. For more details see ow.ly/UIVC30dZlhV

 

Sarah Tesh is features editor of Physics World and Jess Wade is a postdoc at the Centre for Plastic Electronics at Imperial College London, UK. @TeshSarah and @jesswade

Honeybees under threat from pesticide use on oilseed rape

While previous research in the UK has suggested that bees do not feed on oilseed rape, a new study shows it is the most important nectar source to honeybees while it is in flower. This must now be considered in future discussions on neonicotinoid pesticides, which are widely used on oilseed rape.

Matthew Pound and student collaborators from Northumbria University in the UK analysed pollen from samples of honey, which they took from different times of the year to assess the bees’ changing preferences. Honey samples taken in June and July suggested that bees visited oilseed rape flowers more than any other, with the plant flowering in spring and providing the bees with copious quantities of nectar.

This contrasts with previous research, which indicated only limited honeybee foraging on oilseed rape. Pound and colleagues believe this is due to the previous study looking at pollen pellets the bees collected to feed their colony, whereas bees use oilseed rape solely as a source of nectar for producing honey.

The researchers used a standard method of chemical processing to extract pollen from the honey samples. This involved diluting the honey with ethyl alcohol and using a centrifuge to concentrate the pollen. The pollen grains were then subjected to acetolysis using acetic anhydride and sulphuric acid. Finally, the pollen samples were mounted on microscope slides, allowing the individual pollen grains to be counted and analysed.

The result of this study impacts on our thinking around neonicotinoid pesticides. These chemicals are known to be harmful to honeybees, leading them to be banned by the European Union in 2013 (though their use can still be permitted in emergencies). Previous findings disguised the risk posed to honeybee colonies of using these pesticides on oilseed rape. Additionally, with the UK set to leave the European Union and draft its own policy on pesticide use, the potential risk to bees must now feature in the debate.

The importance of bees lies in their pollination of plants, including crops. An estimated 75–80% of global crop pollination is carried out by insects – a service worth some €153bn to the global economy. But honeybees are in decline in Europe and North America, part of a much wider global fall.

Environmental change, pathogens, parasites and beekeeping practices have all been linked with this decline, along with the use of pesticides. While the European ban on neonicotinoids has reduced the yield from oilseed rape, this loss must be balanced against the lasting damage these substances do to honeybees, which are of great importance to the ecosystem and the economy.

Pound and his colleagues published their work in Palynology.

Can geofinancial engineering get us out of hot water?

We need additional levers beyond governmental policy to improve the odds of averting catastrophic climate change. Financial market pressure is already playing a central role. Yet, while long-term institutional investors such as sovereign wealth and pension funds are integrating climate-change considerations into their investment decisions, such efforts fall well short of the climate challenge. One of the main reasons for this shortfall is the difficulty of connecting specific investment assets, such as stocks of publicly traded companies, to the negative consequences of companies’ actions. However, if such environmental externalities can be both quantified and attributed to a given company or asset in near real-time, then asset prices may begin to reflect the costs of irresponsible behaviours more accurately.

With that in mind, we propose a new approach: geofinancial engineering. This uses financial tools and scientific knowledge to leverage the capital markets to change human impact on the physical world and improve the odds of averting catastrophic climate change. As a market-based initiative, geofinancial engineering can operate independently from government action or can amplify the effectiveness of market-based regulation like carbon pricing or emission penalties. Given the recent policy swing from the science-driven Obama administration to the climate denialism of the Trump administration, policy-independent mechanisms are particularly timely.

figure 1

As a mitigation strategy, geofinancial engineering aims to preemptively reduce the prevalence of behaviours that cause the most environmental damage – such as extracting and burning high-carbon fossil fuels, deforestation, methane-intensive agricultural practices, and loss of non-renewable water sources (figure 1).

Geofinancial engineering aims to improve climate outcomes by increasing the cost of capital for companies engaged in harmful behaviour, and by reducing the cost of capital for more climate-resilient options (figure 2).

figure 2

As we mentioned, many long-term institutional investors are already integrating climate-change considerations into their investment decisions. However, hedge funds and quantitative (or algorithmic) trading systems, which often buy and sell in a matter of milliseconds, are, with few exceptions, inherently climate agnostic. Although these now-pervasive strategies focus more on signals than on the fundamentals of the underlying securities, they do not take real-time signals linked to the environment into account. Such signals either aren’t accessible in a systematic and standardized way or simply aren’t connected to the applications used by the trading community.

Making relevant, real-time environmental risk data readily available to climate-agnostic traders and investors – both automated and human – could shift financial market sentiment against behaviour that damages climate stability. At the same time, such transparency could enable those traders and investors who decide to use the new datasets proactively to avoid climate-related market risks and seize opportunities. Through this mechanism, environmental researchers can spur action by traders and investors and accelerate shifts in the global capital markets, where approximately $350 billion in equities are traded daily.

Satellite data on methane emissions by publicly traded fossil-fuel producers and utilities are one promising example. Information aggregated from current and planned public satellites has the potential to detail methane flaring and venting activity at a fine enough spatial and temporal scale to allow for a nearly real-time assessment of methane emissions (figure 3). Venting or flaring (burning) methane accelerates climate change, the global impacts of which are well documented. Venting is particularly problematic because methane is 86 times more potent than carbon dioxide as a greenhouse gas over a 20-year timescale, and venting accounts for about a third of global methane emissions.
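The gap between venting and flaring can be made concrete with some back-of-envelope arithmetic. The sketch below is our own illustration, not drawn from the article's sources: it treats the 86× figure cited above as a CO2-equivalence factor (it corresponds to methane's 20-year global warming potential) and assumes complete combustion when flaring (CH4 + 2 O2 → CO2 + 2 H2O).

```python
# Back-of-envelope comparison of vented vs flared methane.
# Assumptions (ours): GWP20 of methane = 86, complete combustion when flaring.

GWP20_CH4 = 86              # CO2-equivalence factor over a 20-year horizon
M_CO2, M_CH4 = 44.0, 16.0   # molar masses (g/mol)

def co2e_vented(kg_ch4):
    """CO2-equivalent (kg) of venting methane directly to the atmosphere."""
    return kg_ch4 * GWP20_CH4

def co2e_flared(kg_ch4):
    """CO2-equivalent (kg) of flaring: each CH4 molecule becomes one CO2."""
    return kg_ch4 * (M_CO2 / M_CH4)

tonne = 1000.0
print(co2e_vented(tonne))   # 86000.0 kg CO2e
print(co2e_flared(tonne))   # 2750.0 kg CO2e
```

By this rough measure, venting a tonne of methane does roughly 31 times more climate damage than flaring the same amount.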

figure 3

Several existing remote-sensing satellites are capable of measuring both methane flaring and venting. In particular, the VIIRS instrument on NOAA’s Suomi NPP satellite can measure flaring at a sufficiently high resolution to track daily changes at each flare site and link them directly to the company running the facilities. Private ventures such as Planet Labs’ satellite constellation can provide even higher-resolution data. Venting is harder to measure, since spectrometers must capture non-visible parts of the spectrum. Planned and proposed satellites, particularly the German EnMAP satellite (set for launch in 2019), may for the first time provide a global dataset at the frequency and resolution needed to pinpoint venting emissions by source. Local tracking is also available using spectrometers mounted on airplanes and drones, as well as other sensors on ground vehicles. Collated, these sources can build a state-of-the-art, real-time dataset that provides detailed information on five key methane indicators: consistent flaring, consistent venting, changes in methane flaring practices, changes in methane venting practices and large methane venting anomalies (see figure 3). By assigning these emission data to the spatial locations of specific company assets, market players can act on changes in both long-term behaviour and short-term anomalous emission activity.
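The attribution step – assigning a detected emission to a specific company asset – is essentially a geospatial matching problem. The sketch below is entirely hypothetical: the asset list, detection format and 5 km matching radius are invented for illustration, not taken from any real pipeline.

```python
# Hypothetical sketch: attribute a satellite methane detection to the
# nearest known company facility. All names and thresholds are invented.
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Detection:
    lat: float
    lon: float
    kg_per_hr: float   # estimated emission rate
    kind: str          # "vent" or "flare"

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in km."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def attribute(detection, assets, max_km=5.0):
    """Assign a detection to the nearest asset, or None if nothing is close."""
    best = min(assets, key=lambda a: haversine_km(detection.lat, detection.lon,
                                                  a["lat"], a["lon"]))
    d = haversine_km(detection.lat, detection.lon, best["lat"], best["lon"])
    return best["company"] if d <= max_km else None

# Illustrative data: one facility and one detection ~3 km away from it
assets = [{"company": "ExampleCo", "lat": 31.9, "lon": -102.1}]
det = Detection(lat=31.92, lon=-102.08, kg_per_hr=1200.0, kind="vent")
print(attribute(det, assets))   # ExampleCo
```

A production system would of course need uncertainty handling (plume transport, overlapping facilities, shared well pads), which is exactly where the asset-level attribution research mentioned later comes in.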

Up to the minute

Delivering environmental risk data to investors, human traders and algorithmic trading systems – in near real-time, before it is generally known, and in a format that integrates seamlessly with their workflow – would pull the geofinancial engineering lever.

A successful system must create transparency, preferably in near real-time, since knowing about intangible disasters before others do is what makes the information actionable to traders. Knowing first – by a day, an hour or a few hundred milliseconds – can deliver the information advantage they seek.

figure 4

Increasingly, such algorithmic strategies seek trading and investment signals from non-financial sources to gain an advantage, however slight. In this context, environmental researchers can trigger shocks in risk perceptions or confidence around investment in fossil fuels or other damaging activities. They can do so by delivering timely empirical data and analysis to machine-readable apps and other analytic tools, and incorporating them into the financial terminals that drive these apps (figure 4).

A hypothetical methane risk analyser (figure 5) illustrates how real-time methane risk data on a specific publicly traded company might appear on a (climate-agnostic) trader’s terminal in the near future. When a new anomaly is detected, this information would be fed into the terminal and immediately affect the risk score, allowing traders to react given the size of the leak, past company behaviour, and the risk scores of its peers. Such an app could be programmed to include a notification system that triggers buy/sell trading decisions given pre-selected risk thresholds.
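A minimal sketch of such a threshold-triggered notification system is shown below. The scoring weights, decay factor and sell threshold are invented for illustration and do not reflect any real product's logic.

```python
# Toy version of the hypothetical "methane risk analyser" described above:
# a running per-ticker risk score that fires a signal when a new venting
# anomaly pushes the score past a trader-chosen threshold.

class MethaneRiskAnalyser:
    def __init__(self, sell_threshold=75.0, decay=0.9):
        self.sell_threshold = sell_threshold
        self.decay = decay      # older anomalies matter less over time
        self.scores = {}        # ticker -> current risk score (0-100)

    def on_anomaly(self, ticker, kg_per_hr):
        """Fold a newly detected venting anomaly into the risk score."""
        prev = self.scores.get(ticker, 0.0) * self.decay
        bump = min(50.0, kg_per_hr / 100.0)   # cap a single event's impact
        self.scores[ticker] = min(100.0, prev + bump)
        if self.scores[ticker] >= self.sell_threshold:
            return ("SELL_SIGNAL", ticker, self.scores[ticker])
        return ("HOLD", ticker, self.scores[ticker])

analyser = MethaneRiskAnalyser()
print(analyser.on_anomaly("XYZ", 3000.0))   # first large leak: score 30 -> HOLD
print(analyser.on_anomaly("XYZ", 6000.0))   # repeat offender: score 77 -> SELL_SIGNAL
```

The key design point mirrors the text: the signal reflects both the size of the new leak and the company's past behaviour (through the decaying running score), so a repeat offender trips the threshold sooner than a first-time emitter.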

figure 5

Though we have characterized the benefits of a trading-based approach to changing corporate behaviour, there are potential side effects. Privatizing the benefits of greenhouse-gas reduction is a key strength of the geofinancial engineering approach because it drives uptake in a large and diverse marketplace, and we need large-scale transformation to prevent catastrophic climate change. But it may also have downsides, including a potential further concentration of financial gains, and a shift of power in the carbon markets away from socially minded investors and policy makers. At the Geofinancial Engineering Initiative we propose a collaborative, civic-oriented “open data” approach to mitigate some of these concerns.

Another important caveat concerns the effectiveness of geofinancial engineering. Specifically, the newsworthiness or public perception of certain greenhouse gas emissions may not correlate well with the climate impact of different greenhouse gas sources. For instance, venting is far worse than flaring since unburned methane has higher impacts than burned methane, but flaring means more fire, which could be more “newsworthy” and viewed by traders as more likely to move the short-term market. In other words, information may not move the market for the right reason or may incentivize the wrong things. In the same vein, a focus on publicly traded “super emitters” could overlook cumulative venting by numerous, smaller emission sources, as well as the many private- and state-owned “super emitters” that are shielded from financial market pressure for transparency and accountability. In those cases, engagement may be the best and only option.

Although climate change as an investment consideration is moving into mainstream asset management and institutional investing, the financial market sectors that now generate most of the daily trading volume – hedge funds and algorithmic trading systems – simply do not include it.

Our proposed concept takes the next step: real-time, satellite-captured information relevant to climate change is linked to specific assets and fed into the financial markets. This “real-time” pricing of externalities can apply to both climate-motivated and climate-agnostic traders and investors as long as there are market signals that impose price penalties on the heightened liabilities of corporations engaging in climate-irresponsible behaviours.

Geofinancial engineering would benefit from further research and initial testing to improve information collection, analysis, and dissemination. Investments in improved remote and local sensing options should enhance the ability of markets to respond to real-time information from geofinancial engineering tools. Given the value of improved temporal detail and spatial accuracy, for example, one could imagine a cluster of nanosatellites, such as the private Planet Labs constellation, carrying spectrometers similar to those of EnMAP or other proposed satellites. This would take real-time, accurate information on greenhouse gas emissions to the next level. Careful consideration of the indirect effects of geofinancial engineering is also critical, particularly since the social need for greenhouse gas emission reduction is large and pressure to take action could and should intensify in the near future.

Geofinancial engineering is now moving from the drafting board to the lab. For example, testing of methane emissions data by traders will begin shortly to determine if data offer actionable trading insights (see Geofinancial Engineering Initiative). Exploration of environmental remote sensing data other than methane and complementary information on supply chains and asset-level data for attribution is also under way.

Longer term, given the declining cost of launching satellites, privately funded consortia consisting of NGOs, academic institutions, foundations, and climate impact-oriented tech firms like Google, Tesla or SpaceX might make such information publicly available. If coupled with publicly funded satellite data and a freely-accessible platform such as the one illustrated in figure 5, then all investors, traders and stakeholders, regardless of their capital resources, could leverage this information to make environmentally sound decisions – and improve climate outcomes.

  • The full version of this article has previously been published by the Journal of Environmental Investing in volume 8 and can be obtained at www.thejei.com.

Reminiscing about Fermilab, CAPTCHA tests your physics, science of guitar strings

By Hamish Johnston

What do huge snowstorms, pioneering childcare and bison have in common? The answer is that they all feature in video recollections of Fermilab, the particle physics facility that is celebrating its 50th anniversary this year. A playlist of the videos is available and you can watch Cindy Joe’s musings over Snowpocalypse 2011 above.

According to physics blogger ZapperZ, the online retailer Amazon is developing a new CAPTCHA technology that relies on humans’ innate understanding of the laws of physics. The idea, apparently, is that a user would be presented with before-and-after scenarios and asked which seem plausible. This could be a ball rolling down a ramp, or a projectile flying through the air. It seems that web robots can’t solve simple mechanics problems – at least for now.

Ending on a bright note, music technologist and erstwhile physicist Jonathan Kemp of the University of St Andrews in Scotland claims to have invented a revolutionary type of guitar string. He riffs on his new creation in the video above.

Collider serves up drop of primordial soup

A tiny drop of an exotic ultra-hot “soup” that permeated the universe for an instant immediately after the Big Bang appears to have been created in collisions between gold nuclei and deuterons at the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Lab in the US. Evidence that a quark–gluon plasma (QGP) may be generated even in collisions involving very light nuclei such as the deuteron first emerged five years ago from data at the Large Hadron Collider in Geneva. But the new RHIC results push this evidence to record-low collision energies, which should help physicists better understand how a QGP forms and evolves.

A QGP occurs at a temperature about 100,000 times that at the centre of the Sun: protons and neutrons “melt” into an unbound mass of quarks and their force-carrying particles, gluons. This extreme state of matter is believed to have persisted for only a few microseconds after the Big Bang, but it can be created artificially by colliding very heavy nuclei travelling close to the speed of light.

The first, albeit indirect, evidence for a QGP was claimed by scientists at CERN in 2000, who smashed high-energy beams of lead ions into fixed targets made of lead or gold. Five years later, physicists working at RHIC showed that they too could create a QGP – in this case by colliding two beams of gold ions head on. However, these results contained a surprise: physicists had expected a QGP to behave like a gas, but instead it appeared to resemble a liquid.

Perfect liquid

Plotting the trajectories of the many thousands of particles created in each collision, the RHIC researchers found that the particles were not emitted in random directions as they would be in a gas. Rather, some of the emissions were correlated, meaning that more particles flew off at a narrow angle to the plane of the collision than were emitted at roughly right angles. This elliptical distribution signified that the particles were moving collectively and in fact “flowing” together like an almost friction-free perfect liquid, according to the researchers.

The latest work shows that this liquid state appears to be created even when heavy nuclei collide with very light nuclei at low energies. The first, unexpected, glimpse of such a small-scale QGP came in 2012 from the analysis of collisions between lead nuclei and protons at CERN’s Large Hadron Collider – events that were intended simply as a control for lead–lead collisions rather than to generate the scalding plasma themselves. That result prompted scientists in RHIC’s PHENIX collaboration to re-analyse data from collisions between gold nuclei and deuterons, particles consisting of one proton and one neutron. As they reported in 2013, they too saw telltale signs of fluid flow. A couple of years later the PHENIX team saw similar behaviour in collisions between nuclei of gold and helium-3.

Now, the PHENIX researchers have returned to gold–deuteron events. But this time they have studied the effect of varying the collision energy to see if the liquid behaviour disappears. Carrying out measurements over five weeks last year, they set the collision energy at four different levels: 200, 62.4, 39 and 19.6 GeV. They found very similar correlations at all energies when comparing the trajectories of particles grouped into pairs and into sets of four. At 200 GeV, they also looked at six-particle correlations. Their conclusion: liquid flow probably occurs all the way down to at least 20 GeV.
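As a toy illustration of the pair-correlation technique (our own sketch, not the PHENIX analysis code), elliptic flow can be estimated from the azimuthal angles φ of the emitted particles via the two-particle cumulant, v2{2}² = ⟨cos 2(φᵢ − φⱼ)⟩ averaged over all particle pairs:

```python
# Toy two-particle cumulant estimate of elliptic flow v2.
# The pair average <cos 2(phi_i - phi_j)> equals (S_c^2 + S_s^2 - N) / (N(N-1)),
# where S_c = sum(cos 2phi) and S_s = sum(sin 2phi), avoiding an O(N^2) loop.
import math
import random

def v2_from_pairs(phis):
    """Estimate v2 from azimuthal angles using the two-particle cumulant."""
    n = len(phis)
    s_c = sum(math.cos(2 * p) for p in phis)
    s_s = sum(math.sin(2 * p) for p in phis)
    pair_mean = (s_c ** 2 + s_s ** 2 - n) / (n * (n - 1))
    return math.sqrt(pair_mean) if pair_mean > 0 else 0.0

# Generate particles with a built-in elliptic modulation
# dN/dphi ~ 1 + 2*v2*cos(2*phi), using acceptance-rejection sampling.
random.seed(1)
true_v2 = 0.1
phis = []
while len(phis) < 2000:
    phi = random.uniform(0, 2 * math.pi)
    if random.uniform(0, 1 + 2 * true_v2) < 1 + 2 * true_v2 * math.cos(2 * phi):
        phis.append(phi)

print(v2_from_pairs(phis))   # should recover roughly the input v2 of 0.1
```

For a perfectly isotropic (gas-like) emission pattern the pair average vanishes, which is exactly why the correlated, elliptical emission seen at RHIC points to collective liquid-like flow; the real analyses use four- and six-particle cumulants as well to suppress non-flow correlations.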

Tiny volumes

According to PHENIX deputy spokesperson Julia Velkovska of Vanderbilt University in Tennessee, the droplets created this time around are about 100 times smaller than those generated in collisions between large nuclei – occupying the volume of just four protons. Calculations suggest that these plasmas also last only about a fifth as long – a fleeting 7 × 10⁻²⁴ s. “We have found correlations on scales much smaller than previously thought possible,” she says. “That means that as long as we achieve a high-enough energy density, size is not so important. A QGP may still be formed.”

The researchers’ next step is to see whether other expected signatures of a QGP, already verified in collisions with heavier nuclei, are also present in these smaller-scale collisions. One such signature is the abundance and flow pattern of different types of particles emitted in the collisions. Collaboration member Darren McGlinchey of Los Alamos National Laboratory says that abundance data provide information on the temperature of a QGP.

PHENIX colleague Ron Belmont of the University of Colorado says it is still possible that the elliptical emission they have observed is due not to the formation of tiny QGPs but to properties of the nuclei before the collision. For nuclei accelerated close to the speed of light, time slows down, which means, according to quantum chromodynamics, that they appear as a dense wall of gluons – a state known as a colour-glass condensate. The fact that these condensates are thicker at the centre of the nuclei might explain why particles generated in the collisions are not emitted in random directions, he says.

Theoretical predictions

The expected effects of these condensates on the gold–deuteron collisions are still being worked out by theorists, says Velkovska, and as such have not yet been compared to the latest results. In contrast, she notes, predictions from hydrodynamics are in, and, she says, “agree with the data well”.

The results have been posted to the arXiv server (arXiv: 1707.06108 and arXiv: 1708.06983) and have been submitted for publication in Physical Review Letters and Physical Review C.

Copyright © 2026 by IOP Publishing Ltd and individual contributors