
Top 10 physics breakthroughs of the decade

Physics World has been selecting its breakthrough of the year since 2009. Without doubt, the past decade has included some truly quantum leaps in physics – quite literally in one case. Winners of the award all met the following criteria:

  • Significant advance in knowledge or understanding
  • Importance of work for scientific progress and/or development of real-world applications
  • Of general interest to Physics World readers

This short video takes you on a flyby tour of all 10 winners since 2009.

Organic electrochemical transistor monitors bone cell differentiation

A new way of detecting chemicals secreted by stem cells as they differentiate into bone cells could make it possible to electrically monitor the differentiation process in real time. The technique relies on an organic electrochemical transistor (OECT) with a gate electrode that is sensitive to one of the molecules involved in differentiation, and the researchers who developed it say it offers a simple and practical route to understanding how stem cells transform into other types of tissue.

Mesenchymal stem cells are “multipotent”, meaning they can differentiate into other types of cells such as fat, bone, cartilage, tendon or muscle cells. The differentiation process that produces bone is highly complex, involving a range of molecules that includes collagen type I, osteopontin, osteonectin, osteocalcin and a cytokine known as Bone Morphogenetic Protein 2 (BMP-2). All of these molecules can be used as biomarkers to monitor stem cell differentiation, but current techniques do not allow their concentrations to be monitored as they are secreted.

Anchored antibodies

The new OECT was made by a team of researchers led by Róisín Owens and Donata Iandolo from the University of Cambridge in the UK and Mines Saint-Etienne in France, as well as Beatrice Fraboni of the University of Bologna in Italy. The device is a three-terminal transistor composed of a source and a drain connected by a channel of the conducting polymer PEDOT-PSS, plus a gate on which antibodies against BMP-2 are anchored. When BMP-2 binds to the antibodies, the current through the OECT changes by an amount that the researchers can measure.
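A rough way to picture the readout – a generic textbook-style approximation rather than the specific analysis in the paper – is through the transistor’s transconductance g_m, which links a shift in the effective gate potential to a change in the current flowing through the channel:

|ΔI_D| ≈ g_m × |ΔV_G,eff|

Here ΔV_G,eff is the shift in effective gate potential produced when BMP-2 binds to the antibody-coated gate, and ΔI_D is the resulting change in drain current. Because PEDOT-PSS OECTs typically have large transconductances, even small amounts of bound analyte can translate into a measurable current signal.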

The researchers say that their device detects BMP-2 at levels approaching those employed in in vitro experiments to induce stem cell differentiation. This means it might be used in future experiments of this type, such as those that use applied electric fields to kick-start the differentiation process.

The transistor might also be able to detect other cytokines or analytes (such as osteocalcin, osteopontin and osteonectin) produced during stem cell osteogenic differentiation by simply changing the selected capture element on the PEDOT-PSS-coated gate.

Applications beyond differentiation monitoring

Stem cell differentiation monitoring might not be the only application, either. Iandolo says that the team’s device could also be used to detect early-stage diseases in small-volume samples of body fluids. Another option might be to integrate the transistor into structures such as bandages, where it could be used to detect clinically-relevant markers of disease.

The researchers say they are now working on a European Space Agency-funded project called BONUS, which aims to develop in vitro and in vivo pre-screening models and services to detect (and potentially prevent) the bone and muscle fragility that can develop in astronauts during long-term space missions. The team are particularly interested in investigating the effect of space missions on bone cell differentiation. “We are looking into integrating highly specific sensors within a platform containing cell cultures in a controlled fluidic environment to establish in vitro models of bone tissue in this context,” Iandolo tells Physics World. These models might also prove useful for developing stem-cell-based therapies for osteoporosis, or brittle bone disease, which affects one in three women and one in five men over the age of 50.

The present work is detailed in Flexible and Printed Electronics, which (like Physics World) is published by IOP Publishing.


Screening platform traps, images and then retrieves single bacteria

Bacterial cultures are highly diversified, with individual bacteria differing markedly from one another. And just as it is not fair to tar everyone with the same brush, researchers are looking for tools with which to investigate the properties of individual cells. Indeed, studying cells one by one is extremely helpful for understanding the dynamics and behaviour of cellular populations.

Locating and studying many individual bacteria over time, however, is not an easy task. Most single-cell techniques come with an undesirable trade-off: they can either measure many single cells for a very short time each, or track fewer cells over longer times. Selectively picking and retrieving the “odd ones out” for further analysis is even more difficult.

A recent study by Scott Luro and colleagues from Harvard University reports on a new tool that overcomes this trade-off (Nature Methods 10.1038/s41592-019-0620-7).

The research team used a microfluidic device, known as the mother machine, to localize thousands of individual bacteria in microscopic channels. Once a single bacterium, referred to as the mother cell, enters one of the channels, its growth is constrained to a single direction so that daughter cells can be characterized with time-lapse microscopy over many generations. Such lengthy observations capture dynamic cell-to-cell differences while providing enough data to reliably quantify bacterial traits (the so-called phenotype, such as shape, growth rate or behaviour).
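As a purely illustrative sketch of the sort of trait quantification described above – not the authors’ actual analysis pipeline, and with hypothetical function and variable names – a single cell’s growth rate can be estimated by fitting an exponential to its length over successive time-lapse frames:

    import numpy as np

    def growth_rate(times_min, lengths_um):
        """Estimate a cell's exponential growth rate (per minute) from
        time-lapse length measurements, assuming L(t) = L0 * exp(r * t)."""
        t = np.asarray(times_min, dtype=float)
        length = np.asarray(lengths_um, dtype=float)
        # Fit a straight line to log-length versus time; the slope is r
        r, _intercept = np.polyfit(t, np.log(length), 1)
        return r

    # Hypothetical measurements for one mother cell over 30 minutes
    times = [0, 5, 10, 15, 20, 25, 30]
    lengths = [2.0, 2.2, 2.45, 2.7, 3.0, 3.3, 3.65]
    print(f"growth rate ≈ {growth_rate(times, lengths):.3f} per minute")

Repeated across thousands of channels and many generations, fits of this kind are what turn raw time-lapse images into the distributions of phenotypes that the screening step then acts on.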

After the long-term screening process, an external optical trap can pick up any bacterium of choice, using a focused low-power laser beam to draw the bacterium into the laser’s focal spot. This allows the safe transport of cells to a second parallel collection channel. From here, single bacteria are flushed out of the chip and collected for further analysis, such as genome sequencing or plating. This process enabled the researchers to link specific phenotypes observed on the chip to their genomic origins.
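The “attraction into the focal spot” can be understood, at least in the simplified picture of a small polarizable particle, as an optical gradient force:

F ≈ (1/2) α ∇⟨E²⟩

where α is the particle’s polarizability and ⟨E²⟩ is the time-averaged squared electric field of the laser, which peaks at the focus. A cell with a refractive index higher than that of the surrounding medium is therefore pulled towards the most intense part of the beam. Bacteria are larger than this Rayleigh-regime picture strictly allows, so the expression is only a heuristic, but it captures why a gently focused, low-power beam can hold a cell and carry it to the collection channel.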

“The mother machine has enabled quantitative measurements and analyses of many subtle but important biological phenomena,” says Luro, lead author and graduate student in the Johan Paulsson Lab at Harvard. “The tool we created transforms this powerful imaging platform into a screening device, with the ability to cleanly collect live individual cells of interest.”

Large-scale screening of microbes

In a set of proof-of-principle experiments, the researchers created a mock microbial culture containing three labelled sub-populations mixed into a larger population of unlabelled bacteria to simulate a screening run. The three minority populations showed different phenotypes to the rest: they were fluorescent bacteria emitting red, yellow or cyan light. After growing the culture for 24 hours, the team separately picked three individual different-coloured bacteria, flushed them out of the chip and plated them to prove successful isolation of a bacterium from each sub-population.

In a second series of experiments, the team studied synthetic gene oscillators, namely systems that periodically produce certain proteins induced by deliberate and specific modifications of bacterial plasmids. An important requirement for the study was long-term imaging to accurately measure the amplitude and phase of the oscillations, for which the newly developed chip provided a perfect tool. In addition, the large size of the screened population enabled exploration of oscillatory properties from many different genetic variants. These features allowed the researchers to reliably measure and design some of the most regular periodic gene oscillators known to date.

A new tool for efficient genetic screening

The method developed by Luro and colleagues offers potential for many applications. The ability to screen such a huge number of phenotypes, pinpoint those of interest, and strongly link them to their genotype, could be the cornerstone of new genetic screening procedures. Furthermore, the device enables such screening to be performed at the single-cell level.

“We believe this screening platform could be particularly useful for generating ‘designer’ cells engineered to carry out complex functions, since live top-performing variants are physically collected. This circumvents the need for strain reconstruction, as would be required for similar methods reliant on in situ barcoding,” explains Luro. “We are also looking beyond microbes and have begun adapting our chips for mammalian cell cultivation and isolation, which looks promising.”

In short, the method allows users to trap and track thousands of bacterial lineages and then retrieve a single bacterium of interest from the chip – be it a phenotypic outlier, a specific mutant or simply your favourite-looking bacterium.

Festive gift ideas for your physics-loving friends and family

A few weeks ago I was chatting to CERN physicist Kate Kahle about CERN Courier, which IOP Publishing (publisher of Physics World) publishes on behalf of the Geneva-based lab.

I happened to notice Kate was wearing a fantastic shirt covered in mathematical equations, which she told me came from the specialist Paris-based online store Coton Doux. Envious, I looked online and was pleased to find that it’s also available for men, offering “elegance and unusualness in a perfect equation”.

Now I’m not saying I’d buy the shirt myself, but it got me thinking: what physics-themed presents would be perfect for the physicists among your family and friends? Well, who better to ask than the Physics World editorial team themselves!

Kate Gardner has spotted some great LEGO space R&D sets, these cool science-themed XKCD T-shirts, plus Andrea Beaty’s book Ada Twist, Scientist. Aimed at children aged 4–8, the book won the 2017 Little Rebels Award for its efforts to upend science stereotypes.

Tushna Commissariat, our resident science-fiction nut, sent me a huge wish-list, which includes this funky Apollo 11 lunar lander, a 20-cm diameter particle accelerator clock (er, right), some wooden Higgs boson coasters, and this Schrödinger’s cat in a box, which apparently is a “very unique present for the special geek, nerd or cat person in your life”. The box contains a 1-inch-square cat enamel pin that’s either dead or alive until you choose to open the box (or not if you’d rather stay in limbo). Tushna also suggests this “Map of the Universe” sculpture, which is “officially the smallest-scale commercially available map ever made”. (I think I’ll stick with the Coton Doux shirt.)

Michael Banks has his eyes on this Playmobil Space 9487 Mars space station, ideal for children aged 6+. I’m only worried that it contains a “187-piece play figure set” and a further 183 accessories, which are bound to be lost down the back of the settee before you know where you are.

Margaret Harris says her older niece is getting the book Rosie Revere, Engineer, also by Andrea Beaty, in which quiet-by-day Rosie turns at night into “a brilliant inventor of gizmos and gadgets who dreams of becoming a great engineer”. Margaret’s younger niece, aged not quite two, also “wants” a good science-based alphabet book – if Margaret can find one, that is.

Sarah Tesh, meanwhile, fancies some science-themed jewellery from Boutique Academia, including a π-to-35-decimals necklace, earrings featuring the iconic image of a black hole taken by the Event Horizon Telescope earlier this year, and an atomic-physics necklace boasting a “dark grey Swarovski pearl on a hand-wrapped silver wire” that apparently “matches everything, from T-shirts to prom dresses”.

Over in the US, the Institute for Systems Biology has drawn up its own gift list, which includes this cool solar-system crystal ball, a DIY kit that lets you insert a jellyfish gene into bacteria, creating microbes that glow green when you shine a light on them, as well as these interesting “science pants” – not underwear, but trousers featuring “prints of real microscopic cellular images digitally printed on organic, recycled fabric”.

Our friends at the American Physical Society, meanwhile, have got plenty of gifts on offer, including this red T-shirt featuring the slogan “If this shirt is blue, you’re going too fast”. Now that made me smile, so APS, if you’re listening, I’ll have one please!

Event Horizon Telescope’s Shep Doeleman explains how to image a black hole

This episode of the Physics World Weekly podcast features an exclusive interview with Shep Doeleman of the Harvard–Smithsonian Center for Astrophysics, who is founding director of the Event Horizon Telescope (EHT). Doeleman and colleagues bagged the 2019 Physics World Breakthrough of the Year for capturing that iconic image of the shadow of a supermassive black hole at the centre of the Messier 87 galaxy.

Doeleman explains to me how the EHT uses eight radio dishes distributed across the western hemisphere to collect vast amounts of data, which are then combined to create images with remarkable angular resolution. He also says that the team would like to put dishes in space to improve the resolution even further. We also chat about how EHT astronomers are trying to image the supermassive black hole at the heart of the Milky Way – and how this could eventually lead to movies of black-hole dynamics.

I also chat with my colleagues Sarah Tesh and Matin Durrani about whether the 2010s has been the decade of black holes and ponder what field of physics could define the 2020s.

This is the final episode of the weekly podcast for 2019. Tune in on 9 January 2020 for the next instalment.


Gas-pressure standard gets down to fundamentals

A new standard for pressure measurement that does not rely on artefacts such as mechanical pistons or columns of mercury has been developed by researchers in Germany. The method, which draws instead on first-principles calculations and sensitive measurements of the electrical properties of helium gas, is accurate to within 5 parts per million at pressures of up to 7 MPa and could eventually replace pressure standards based on physical objects.

In the mid-1600s, scientists such as Evangelista Torricelli and Christiaan Huygens began using open-ended columns and tubes of mercury to measure the pressure exerted by a gas relative to atmospheric pressure. Systems of this type are still used as pressure standards, but in recent decades metrologists have worked to develop alternatives that eliminate the need for toxic mercury. The new pressure standard grew out of one such effort, led by Christof Gaiser and colleagues at the Physikalisch-Technische Bundesanstalt (PTB) in Berlin.

The PTB team’s first step was to replace mercury columns with precision-engineered pistons. The pressure below a piston can be calculated in a straightforward way, by dividing the force exerted by the load – its mass multiplied by the local gravitational acceleration – by the surface area of the piston. The difficulty, Gaiser explains, is that both the surface area of the piston and the gap between the piston and the surrounding cylinder must be measured to a very high degree of accuracy. Moreover, since the pistons are physical objects, each one is slightly different. “The piston gauges we have at PTB with an uncertainty of one part per million are artefacts,” Gaiser says. “They are all unique, and they must be characterized very accurately.”
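In its simplest form – ignoring the corrections for the piston–cylinder gap, thermal expansion and buoyancy that metrologists must also apply – the pressure generated by a piston gauge is just

p = m g / A_eff

where m is the mass of the load, g is the local gravitational acceleration and A_eff is the effective area of the piston–cylinder assembly. It is this effective area that must be characterized to parts-per-million accuracy for every individual piston.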


An independent method

To eliminate this disadvantage, and to check the accuracy of their mechanical piston gauges, Gaiser and colleagues developed an alternative gas-pressure standard based on a technique known as dielectric constant gas thermometry (DCGT). This method, which was invented in the early 1980s and refined at PTB as part of an international effort to fix the value of the Boltzmann constant (and thereby redefine the kelvin unit of temperature), involves measuring how electrical capacitance changes when the space between a capacitor’s electrodes is filled with a pressurized gas. The change in capacitance is related to the gas’ dielectric constant, which depends in turn on its density. Once you know the density, Gaiser says, it is straightforward to calculate temperature using a modified version of the ideal gas law.

The PTB team’s latest result offers a new twist on DCGT. Instead of using the change in capacitance to calculate temperature, they used it to derive pressure, taking advantage of the now-fixed definition of temperature and theoretical calculations of two quantities: the electrical polarizability of the gas and the strength of interactions between gaseous atoms.
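In simplified form – leaving out the virial corrections for atom–atom interactions that the full analysis includes – the measurement chain runs from the measured dielectric constant ε of the helium, via the Clausius–Mossotti relation, to a molar density ρ, and then through the gas law to a pressure:

(ε − 1)/(ε + 2) = A_ε ρ   and   p ≈ R T ρ,   so   p ≈ (R T / A_ε) × (ε − 1)/(ε + 2)

Here A_ε is helium’s molar polarizability – the quantity that can be calculated from first principles – R is the molar gas constant and T is the temperature, now fixed by the redefined kelvin. The accuracy of the calculated polarizability and interaction terms is therefore what ultimately limits the accuracy of the pressure standard.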

In helium, such calculations can be performed from first principles. Gaiser notes that they have a long history, with the first values for helium’s electrical polarizability being derived in the 1920s and 1930s. More recently, the kelvin-redefinition project spurred major advances in the accuracy of these calculations, with several research groups independently developing better methods of performing them. These improvements, together with the PTB group’s own work in the laboratory, made it possible to develop the new capacitance-based standard.

‘Probably the world’s best pressure measurements’

“[The PTB researchers] had to do (probably) the world’s best pressure measurements when determining the Boltzmann constant by DCGT, so I am not surprised you can do the reverse and use a similar approach to measure pressure accurately,” says Graham Machin, a fellow at the National Physical Laboratory in Teddington, UK.

Machin, who led the UK’s contribution to redefining the kelvin but was not involved in the present work, says that the PTB team’s method could form the basis of a future non-mechanical pressure standard. Although he cautions that mechanical standards will not be replaced overnight, “in the longer term, when older standards reach the end of their life, this would be a serious alternative.”

James Schmidt, a physicist and pressure expert at the US National Institute of Standards and Technology (NIST), concurs. “In the same way that the kelvin has been replaced by a definition of the Boltzmann constant, the pressure scale could be replaced by the density of a well-characterized gas and electronic measurements of either the dielectric permittivity or refractive index of that gas,” he says. “While the re-definitions and new techniques may not immediately replace operations on the factory floor, they are important for a few of the highest-level national standards institutions, such as PTB and NIST.”

Gaiser agrees that the PTB method will need further work before it is widely adopted. “It’s a quite huge experiment and you have to take into account that you need a very pure gas and good temperature stability,” he says. Pushing the method beyond its current 7 MPa limit will also require further advances in the theory of helium gas at very high pressures, he adds.

The team report their work in Nature Physics.

MRI enables robotic navigation in deep blood vessel networks

A team of researchers from the NanoRobotics Laboratory at Polytechnique Montréal has demonstrated a new technique that uses the fringe field of a clinical MRI scanner to enable robotic navigation of tethered instruments in deep vascular regions. The approach could one day lead to significant improvements in a number of medical procedures, including neurosurgery for the treatment of aneurysms (Science Robotics 10.1126/scirobotics.aax7342).

The pioneering technique, dubbed fringe field navigation (FFN), used the superconducting magnet of an MRI scanner to generate a strong magnetic field that pulls a micro-catheter capped with a spring-shaped magnetic tip through complex vascular structures. The team demonstrated that the instruments could successfully travel through narrow and complex areas in the neck and brain arteries of live pigs, which are well out of reach for existing manual procedures and magnetic platforms.

Table positioning

As co-author Arash Azizi, until recently a graduate research assistant at Polytechnique Montréal, explains, the field lines of the scanner’s superconducting magnet extend well beyond its bore. Manufacturers typically use shimming techniques to increase the field gradient at the entrance of the tunnel, so that the stray field – also known as the fringe field – decays rapidly with distance from the scanner.

“The high gradient of the magnetic field available at the entrance of the tunnel of MRI scanners motivated us to use it for the purpose of magnetic navigation,” Azizi explains. “The MRI fringe field is static. So, we used a robotic system to position the sample with six degrees-of-freedom in the fringe field to apply directional magnetic gradient forces. Also, between [moving] the MRI and the sample, it is easier to move the sample.”
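The underlying physics is the force on a magnetized tip in a non-uniform field, which – in the simple dipole approximation – depends on the field gradient rather than the field strength itself:

F = ∇(m · B)

where m is the magnetic moment of the catheter’s tip and B is the scanner’s fringe field. Deep inside the bore the field is strong but nearly uniform, so the pulling force is small; near the entrance of the tunnel, where the gradient is steepest, the same tip experiences a much larger directional pull. That is why the robotic system positions the sample in this steep-gradient region rather than inside the magnet.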

Potential applications

The main application for the new magnetic system is the improved navigation of a micro-tethered instrument through what Azizi describes as the “bifurcations and tortuous paths” of deep vascular regions. Existing techniques make use of a guidewire that comes into contact with vessel walls at multiple locations – with a small-diameter micro-guidewire of negligible stiffness required for targeting narrower vessels.

However, because the insertion of such devices into vessels is often impractical after passing along what Azizi calls a “distance of tortuous vessels”, he and his team proposed an innovative method that applies a magnetic pulling force on the tip of the device, enabling users to reach deeper locations. Looking ahead, Azizi believes the technique has a high potential for use in neurosurgery to treat aneurysms and occlusions in the cerebral arteries.

“FFN has been developed to navigate tethered instruments in the vascular systems, so it can be used for endovascular intervention in different organs,” Azizi says. He notes that the researchers believe that FFN is not an appropriate tool for cardiologists, as it relies on robotic positioning of the patient, which is relatively slow compared with the fast dynamics of the heart.

“The next step to advance this project is to develop custom micro-guidewire depending on the application and the region of intervention inside the body, and designing intervention protocols for FFN intervention in different regions of the body,” Azizi adds.

Co-author Sylvain Martel, director of the NanoRobotics Laboratory, agrees that interventions in the brain, as well as in other physiological spaces that are difficult to access – for example, in urology – are excellent applications for the technology.

“Miniaturization in technology is also progressing very fast. As a result, technology becomes smaller [and it becomes more and more possible] to bring them deeper in the human body, as FFN does for diagnostics,” he says. “I believe that the need for – as well as the number of – applications will increase in line with the level of miniaturization of instruments that can be navigated deeper into the body.”

How Feynman diagrams transformed physics

Tools can change not only how theorists calculate but also what they calculate about.

As the International Year of the Periodic Table draws to a close, I’m reminded of this lesson through the work of the Swedish scientist Jacob Berzelius, one of the fathers of modern chemistry. Back in the early 19th century, he developed a new tool for writing chemical formulae. It involved giving elements simple labels such as Si for silicon – one of four elements Berzelius discovered – along with numbers denoting their proportions. Vinegar, for example, is C2H3O2, though Berzelius used superscripts rather than subscripts. The system is still used today, and we assume that it represents chemicals “as they really are”.

But two decades ago, Ursula Klein, a scholar from the Max Planck Institute for the History of Science in Berlin, pointed out that the tool changed chemistry. It not only organized complex information in the “jungle of organic chemistry” but also transformed the way chemists looked at chemicals. To explain how all this happened, Klein introduced the notion of a “paper tool” – it showed how Berzelius’ notation system transformed ideas about what chemicals were and how to study them, thereby providing chemists with new perspectives, concepts and goals.


Klein’s notion of a paper tool has since been applied elsewhere. Michael Gordin, a historian of science at Princeton University in the US, applied the concept to the early history of periodic tables in his 2004 book A Well-Ordered Thing (Basic Books). Meanwhile, David Kaiser – a physicist and historian of science at the Massachusetts Institute of Technology – has used paper tools to explore the impact of Feynman diagrams. As he writes in his 2005 book Drawing Theories Apart: the Dispersion of Feynman Diagrams in Postwar Physics, these illustrations – pioneered by Richard Feynman in the late 1940s and early 1950s – “helped to transform the way physicists saw the world and their place within it”.

Doodling physics

Feynman diagrams, you’ll recall, are line drawings that represent mathematical expressions of the behaviour of subatomic particles. Feynman developed them to keep track of calculations of self-energy, or how charged particles interact with their own fields. These calculations are done by perturbation expansions, which work by viewing each self-energy interaction as a small change, or perturbation, of some known state. A perturbation calculation then adds up a series of such small changes.
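As a generic illustration – the details differ from calculation to calculation – a quantity A computed in quantum electrodynamics is expanded in powers of the fine-structure constant α ≈ 1/137:

A ≈ A_0 + A_1 α + A_2 α² + A_3 α³ + …

Each coefficient A_n gathers the contributions of all the Feynman diagrams of that order, with every diagram standing for one mathematical term in the sum.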

Unfortunately, keeping track of any corrections beyond the simplest case, let alone adding them all up, makes such calculations forbidding. In work for which he would share the 1965 Nobel Prize for Physics with Shin’ichiro Tomonaga and Julian Schwinger, Feynman said he used the diagrams as a “book-keeping device for wading through complicated calculations”. That was disingenuous; the diagrams did far more than that. Kaiser’s account of this imaging tool reveals at least four ways in which Feynman diagrams went beyond being a simple tool and transformed particle physics itself.

First, the diagrams required apprenticeship. Feynman diagrams have a deceptive visual simplicity, but at first physicists did not find them natural or intuitive. They could not spread, Kaiser writes, through the equivalent of “correspondence courses”, in which training happens by sending and receiving information from a distance rather than through face-to-face encounters. Instead, Feynman had to tutor colleagues, notably Freeman Dyson, who helped spread the new techniques to a group at the Institute for Advanced Study in Princeton. Members of that cohort, in turn, spread them further. The diagrams required something like “craft skill or artisanal knowledge”, Kaiser writes, and the mentored and often laborious acquisition of techniques of the sort associated with new traditions of painting, fashion or art.

The second transformative impact of Feynman diagrams was that they framed the projects that theorists undertook in a new way. Here again, there are affinities to painting and the way that artists created new approaches to traditions like realism. Growing confidence in the calculations enabled by Feynman diagrams thus reinforced confidence in the diagrams themselves as a tool, and thereby in the diagrams’ applications, and so on.

Third, Feynman diagrams are what philosophers of technology call “multistable”, rapidly mutating in their application and structure. In traditional history of science, Kaiser points out, theoretical tools are thought to spread like “batons in a relay race – stable objects that retained their meaning and form as they were passed from one user to another in a growing network”. Feynman diagrams, instead, transformed into tools that could be used not only in high-energy physics, but also in nuclear physics, solid-state physics, gravitational physics and an ever-widening circle of applications. “Improvization and bricolage,” Kaiser writes, “can lead to applications that had never been envisioned by the tool’s inventors.”

Fourth, Feynman diagrams transformed what physicists conceived as real. In an old story familiar to philosophers of science and technology, tools such as Feynman diagrams not only shaped the practices of those inside the workshop but also came to be taken for granted and seemingly transparent avenues to what appears to be the “real”. Kaiser, again, compares this to traditions in art history that each appear (deceptively) to represent nature to the “innocent eye”.

The critical point

A still more radical lesson of Kaiser’s book concerns not what it says about how Feynman diagrams are used, but what the use of the diagrams says about the nature of theory itself. Theory, Kaiser suggests, is ultimately less important to theorists than the tools that mediate their calculations. Moreover, tools fashioned within one theoretical framework can take on lives of their own, and find new uses even when the original theory, for which they had been drafted, falls out of favour. Theoretical tools, like experimental ones, can outlive the theories they were meant to elucidate.

Elves and gamma rays emerge simultaneously from thunderstorm

New insights into two types of radiation flash associated with lightning have been gained from observations made aboard the International Space Station. Torsten Neubert at the Technical University of Denmark and colleagues deduced that both “elves” and terrestrial gamma-ray flashes are powered by the onset of lightning – with each phenomenon visible at the tops of thunderclouds.

From our perspective on the ground, lightning is both a beautiful and violent effect. The flashes are created by the strong electric fields that build up between the ground and the free electrons inside storm clouds, driving the bolts that reach down to Earth. At the same time, other processes take place above the clouds, out of our normal view.

Terrestrial gamma-ray flashes (TGFs) are short-lived emissions of radiation that are believed to be generated by electrons that have been accelerated to very high speeds by the strong electric fields. Also above the clouds are transient ultraviolet and optical emissions called “elves”. These are believed to be created in the lower ionosphere by electromagnetic waves, which expand outwards from powerful lightning currents.

Unsure connections

Both effects are known, but meteorologists continue to debate the precise generation processes of elves and TGFs. Scientists are also unsure as to whether the two effects are connected. To resolve the issue, Neubert’s team used the Atmosphere-Space Interactions Monitor (ASIM), which simultaneously captures emissions from across the electromagnetic spectrum. ASIM is mounted on the International Space Station, and the researchers pointed its sensors straight down to minimize any gamma-ray flux losses from atmospheric absorption, allowing them to obtain images with a temporal resolution of just 10 µs.

Through this setup, Neubert and colleagues gathered high-speed observations of emissions at heights up to 12 km above a thunderstorm that struck Indonesia in October 2018, in which one particular TGF was closely accompanied by an elve. From this behaviour, the team deduced that at the onset of lightning current pulses, strong electric fields form inside thunderstorm clouds – giving rise to TGFs a few milliseconds later. This suggested that lightning currents form quickly, and at high altitudes. Together, the observations suggest that both the storm’s elve and its TGF arose from the same sequence of events.

The images enabled Neubert’s team to conclude that both TGFs and elves are powered by lightning, which is strong evidence that the two processes are, in fact, connected. They now hope to explore the sequences of events triggering each type of flash in more detail, potentially helping meteorologists to understand more about the vibrant physical processes that play out within thunderstorms.

The observations are described in Science.

When TEXAS came to Portsmouth: black holes, neutrinos and gravitational waves

Growing up in coastal Somerset, UK, once a year I used to watch from the living-room window with incredulity as American muscle cars and pick-up trucks roared past our house. This procession of Fords, Dodges and Chryslers was filled to the brim with people dressed up in their finest Deep South regalia – on their way to the windswept Brean Country and Western Festival.

The name of the conference I find myself attending kindles images from those bizarre childhood memories. But luckily, attendees at TEXAS 2019 are not sporting cowboy boots or questionable American Indian costumes. They are physicists from all over the world, meeting at the beautiful Guildhall venue in the heart of Portsmouth, UK, to hear about and discuss the latest discoveries in fields related to the relativistic theory of gravitation and cosmology. TEXAS 2019 is hosted by the University of Portsmouth’s Institute of Cosmology and Gravitation.

Though the first day is not yet over as I write – with a number of in-depth parallel sessions still ongoing – there has already been plenty for conferencegoers to talk about. For instance, the morning’s plenary talks began with a bang as the University of Arizona’s Dimitrios Psaltis got straight into the most talked-about news from astrophysics this year – the first-ever picture of a black hole, which last week bagged the Physics World Breakthrough of the Year award for 2019.


Psaltis was one of the lead scientists behind the image, which was taken by the Event Horizon Telescope (EHT), an international collaboration involving over 200 astronomers and eight radio dishes in six different locations across the globe.

Somehow Psaltis adroitly managed to condense over 40 years of black hole imaging theory, the 10-year history of the EHT, and how black hole imaging might offer tests for some of the big outstanding mysteries in relativity, into just 40 minutes. And he still managed to provide fascinating and often amusing insights: “The data [from EHT] is so immense that we cannot transfer it over the Internet, so we literally put it in crates and FedEx them,” he revealed. “One of those crates when we opened it in Bonn, we found fabric – and there was a factory in Germany that opened their crates expecting fabric and found hard drives.”


The black-hole image Psaltis and the EHT team took was of a supermassive black hole residing at the heart of nearby (cosmologically speaking) giant elliptical galaxy M87. It was apt then that the following speaker – the University of Cambridge’s Chris Reynolds – continued the supermassive black hole theme by explaining how they are not only the “ultimate laboratory for studying relativistic gravity,” but may also offer hints towards building a fundamental theory of how the universe works beyond the Standard Model of particle physics.

The Standard Model is one of the great triumphs of 20th-century physics, as precise and well tested as relativity. Yet though it accurately explains everything we see – matter – it does not explain 95% of the theorised universe. Hypothesised uncharged particles called axions are candidates for about 27% of the missing universe: dark matter. But they have never been observed.

One way supermassive black holes are already helping physicists to go beyond the Standard Model – and to illuminate axions’ existence (or not) – is, Reynolds explained, through their spin. “If axions exist and are very low mass, they can actually form gravitational atoms with black holes… and there will be a wholesale zapping of the black hole spin energy that will then get radiated away,” he said. “So the very observation of spin black holes in nature is constraining these axions.”

After a fascinating and detailed talk by Elena Gallo from the University of Michigan on models and new evidence suggesting jets – extremely powerful streams of particles emanating from active galactic nuclei (AGNs) – could be powered by black hole spin, the last plenary of the day was delivered by Elisa Resconi.

Resconi is a neutrino physicist from the Technical University of Munich. As part of the IceCube Collaboration – the team behind the IceCube South Pole neutrino observatory – she was involved in another astrophysics breakthrough that made the headlines when, in 2018, the team announced they had, for the first time, pinpointed a cosmological source for the neutrinos they had detected. Since the catchily named blazar TXS 0506+056 – an AGN with a relativistic jet pointed towards Earth – was identified as an IceCube neutrino source, Resconi revealed during her talk that the team had identified around 70 high-energy neutrino events they could associate with blazars.

Impressive though this is, Resconi thinks astrophysics has only scratched the surface of neutrino astronomy’s usefulness. “The questions that concern neutrinos are: where are the most extreme cosmic accelerators? What are their compositions? How do particles get boosted in relativistic jets? And eventually, is there any exotic physics happening inside these jets?” she stated. “We need to build more neutrino telescopes and we need to be able to operate them together ‘in plenum’.”


With the black-hole-focused first day of talks coming to a close, attendees have four more fascinating days to look forward to, covering a broad range of hot topics. But they (and I) also face a challenge: given that many major astrophysical discoveries have been announced at the biennial TEXAS symposia since 1963, picking which lectures to attend is going to be fraught with difficulty.
