
Pushy bacteria create their own superfluids

Certain types of swimming bacteria can lower the viscosity of an ordinary liquid, sometimes even turning it into a superfluid, according to work done by researchers in France. The team studied how the collective swimming motion of bacteria can substantially alter a fluid’s hydrodynamic properties. In some cases, the change is so great that highly active bacteria create a “negative-viscosity” liquid and are then pushed along by the fluid itself. The researchers suggest that the energy from such bacterial suspensions could be used to drive tiny mechanical motors in microfluidic systems.

The viscosity of a liquid is a measure of its resistance to flow. For example, honey and oil have much higher viscosities than water, and therefore do not flow as easily. Viscosity arises because neighbouring regions of a fluid moving at different velocities exchange momentum through collisions between their particles. A superfluid, on the other hand – such as liquid helium below its transition temperature – is an ideal fluid with zero viscosity that flows without any resistance.
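As a back-of-the-envelope reminder (not part of the study itself), the shear viscosity is the ratio of the applied shear stress to the resulting shear rate:

$$\eta = \frac{\tau}{\dot{\gamma}},$$

where $\tau$ is the shear stress in pascals and $\dot{\gamma}$ is the shear rate in s$^{-1}$. A superfluid corresponds to the limit $\eta \to 0$, so any applied stress produces flow; roughly speaking, a negative value means the active suspension does work on the walls rather than resisting them.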

All shook up

In the past, scientists had suspected that the presence of certain bacteria in a fluid could change its viscosity because of the bacteria’s motion. Some models even sought to explain how this might happen: for example, by proposing that the bacteria’s swimming creates local changes in the liquid’s flow as the cells align themselves so as to reduce the velocity gradient of the liquid.

Now, Harold Auradou of the University of Paris-Sud and colleagues have studied the well-known bacterium Escherichia coli, or E. coli. These are a type of “pusher swimmer” that force fluid to flow outwards away from their flagella as they propel themselves forward. Auradou’s team studied E. coli suspensions made up of varying amounts of bacteria in solutions of water with just enough nutrients to keep the cells alive, but not enough for them to reproduce. The flow of the solutions was studied as they were spun at different speeds in a rheometer – a device that applies shear stress through a rotating outer wall and is used to measure the viscosity of a liquid.
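For a narrow-gap rotational rheometer of this kind, the apparent viscosity follows from the measured torque and rotation rate. The sketch below is purely illustrative – the Couette-cell geometry and numbers are hypothetical, not those used by Auradou’s team.

```python
import math

def apparent_viscosity(torque, omega, r_inner, r_outer, height):
    """Apparent viscosity (Pa s) in a narrow-gap Couette cell.

    torque   : torque transmitted to the wall (N m)
    omega    : rotation rate of the outer wall (rad/s)
    r_inner, r_outer, height : cell geometry (m)
    """
    shear_stress = torque / (2 * math.pi * r_inner**2 * height)  # stress at the inner wall
    shear_rate = omega * r_inner / (r_outer - r_inner)           # narrow-gap approximation
    return shear_stress / shear_rate

# Hypothetical readings: a weak torque at a modest rotation rate
eta = apparent_viscosity(torque=1e-7, omega=1.0,
                         r_inner=0.013, r_outer=0.014, height=0.04)
print(f"{eta:.1e} Pa s")
```

A suspension that lowers the torque needed to keep the wall turning shows up here as a reduced apparent viscosity; a torque of zero or below corresponds to the zero or negative viscosities reported by the team.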

Bacterial brews

With this set-up, the researchers were able to determine that, for low to moderate stress values, the bacteria do indeed lower the viscosity of the liquid, as predicted. When the team then increased the number of bacteria and “doped” them with extra nutrients, the higher activity meant that the viscosity plummeted to zero – and even below zero.

The team still cannot say for sure what causes the viscosity drop, although the pusher-swimming motion may play a key role. Auradou and colleagues are confident, however, that the viscosity drop was indeed caused by the motion of the bacteria, rather than their mere presence, because adding dead bacteria to the solution made no difference to the viscosity. The team says that it may, in theory, be possible to somehow harness the viscosity-lowering ability of such bacterial cocktails. This could involve placing tiny rotors in the fluid that would be dragged around and could power a small device such as a microfluidic pump.

The research is published in Physical Review Letters.

The Magnus effect in action, destroying the world, an astrophysicist camps out in Manchester and more

By Hamish Johnston and Michael Banks

This week’s Red Folder opens with a fantastic video (above) from the folks at Veritasium. It involves dropping a spinning basketball from the top of a very tall dam in Tasmania and watching as the ball accelerates away from the face of the dam before bouncing across the surface of the water below. In comparison, a non-spinning ball simply falls straight down. This happens because of the Magnus effect, which has also been used to create flying machines and sail-free wind-powered boats. The effect also plays an important role in ball sports such as tennis and is explained in much more detail in our article “The physics of football”.
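To get a feel for the size of the effect, the sideways Magnus force on a spinning ball can be estimated with the standard lift expression F = ½ C_L ρ A v², where the lift coefficient C_L depends on how fast the ball spins relative to its speed. The numbers below are illustrative guesses, not values from the video.

```python
import math

def magnus_force(speed, radius, lift_coeff=0.2, rho_air=1.2):
    """Rough Magnus (lift) force on a spinning ball: F = 0.5 * C_L * rho * A * v^2.

    speed      : speed through the air (m/s)
    radius     : ball radius (m)
    lift_coeff : assumed lift coefficient for a spinning sphere (typically ~0.1-0.3)
    rho_air    : air density (kg/m^3)
    """
    area = math.pi * radius**2
    return 0.5 * lift_coeff * rho_air * area * speed**2

# A basketball (radius ~0.12 m) falling at ~20 m/s with modest backspin
print(f"{magnus_force(speed=20, radius=0.12):.1f} N sideways")  # roughly 2 N
```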


Metal foams could make promising radiation shields

Lightweight composite metal foams are effective at blocking harmful radiation, according to a new study carried out by researchers in the US. Indeed, the foams can efficiently block X-rays, gamma rays and neutron radiation, and could also absorb the energy of high-impact collisions. The research may pave the way to metal foams being used in medical imaging, nuclear safety, space exploration and other shielding applications.

With their unusual mechanical and thermal properties, composite metal foams are attracting increasing levels of interest. Afsaneh Rabiei and colleagues at North Carolina State University first began investigating the potential for metal foams to be used in military and transport applications such as for blast protection and as armour, before turning their attention to possible uses in space exploration or nuclear shielding. The team was keen to see if such metal foams could actually block various types of radiation, provide structural support and protect against high-energy impacts.

Foamy tests

Composite metal foams consist of pre-fabricated metallic hollow spheres that are embedded within a metallic matrix. In the latest work, different varieties of the composite foams were made using three main materials – aluminium–steel, steel–steel and tungsten- and vanadium-enriched “high-Z steel–steel” – and in different sphere sizes of 2, 4 and 5.2 mm in diameter.

To test their effectiveness, Rabiei’s team measured the attenuation of different types of radiation – specifically, X-rays, gamma rays and neutron radiation. The results of the foam tests were compared with those from bulk samples of an aluminium alloy (A365), lead and steel – three materials that are commonly used in shielding applications. To make a fair comparison, the researchers prepared each sample with the same areal density: the samples differed in thickness and volume but presented the same mass per unit area to the radiation.
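Matching areal density matters because, for photons at a given energy, the transmitted fraction depends on the mass attenuation coefficient multiplied by the areal density (the Beer–Lambert law), so equal areal densities isolate the effect of the material itself. The coefficients below are illustrative placeholders, not values from the study.

```python
import math

def transmitted_fraction(mass_atten_coeff, areal_density):
    """Beer-Lambert attenuation: I/I0 = exp(-(mu/rho) * (rho * x)).

    mass_atten_coeff : mu/rho in cm^2/g at the photon energy of interest
    areal_density    : rho * x in g/cm^2 (held fixed across the samples compared)
    """
    return math.exp(-mass_atten_coeff * areal_density)

# Illustrative comparison at a fixed areal density of 10 g/cm^2
# (placeholder mu/rho values, not the measured ones)
for name, mu_rho in [("lead", 0.07), ("steel", 0.06), ("high-Z foam", 0.065)]:
    print(f"{name}: {transmitted_fraction(mu_rho, 10.0):.1%} transmitted")
```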

The foams were seen to perform well, with the most effective shielding material overall being the “high-Z” steel–steel foam. This foam was the best material against neutron radiation and low-energy gamma rays, and was comparable with the alternatives at blocking high-energy gamma rays. For X-ray applications, the high-Z foam was also effective, being bested only by pure lead. The effectiveness of the foam was seen to be largely independent of the size of the sphere used to form the material’s cavities.

Environmentally friendly

“This work means there’s an opportunity to use composite metal foams to develop safer systems for transporting nuclear waste, more efficient designs for spacecraft and nuclear structures, and new shielding for use in CT scanners,” says Rabiei. In addition, she notes, “[The] foams have the advantage of being non-toxic, which means that they are easier to manufacture and recycle.”

Russell Goodall, a metallurgist at the University of Sheffield who was not involved in this study, told physicsworld.com that metal foams have “a range of useful properties; unusually, many of these derive from both the material they are made from and the foam structure”. Noting that radiation absorption is a property of the foam’s material, rather than structure – and therefore that the same degree of protection would be afforded by solid plates of the same material – he adds that it will be important to assess the other advantages foam will provide, such as impact energy absorption or structural support, for potential applications.

“The use of high-tungsten steel as a matrix of these foams is a breakthrough, as it provides excellent radiation shielding via the tungsten, while also offering outstanding strength, ductility and toughness and maintaining low density through the pores,” says David Dunand, a metallurgist at Northwestern University in the US, who was also not involved in the current work. “There are no known materials with this combination of properties,” he adds.

With its initial study complete, Rabiei’s team is looking to modify the composition of its foams to make them more effective than lead at blocking X-rays.

The research is described in the journal Radiation Physics and Chemistry.

The search for alien life gathers pace

 

By Hamish Johnston

Earlier this week in London the billionaire physics enthusiast Yuri Milner joined forces with some of the biggest names in astronomy and astrophysics to announce a $100m initiative to search for signs of intelligent life on planets other than Earth. The money will be used to buy time on a number of telescopes to search for radio and optical signals created by alien civilizations.


Weyl fermions are spotted at long last

Evidence for the existence of particles called Weyl fermions in two very different solid materials has been found by three independent groups of physicists. First predicted in 1929, Weyl fermions have unique properties that could make them useful for creating high-speed electronic circuits and quantum computers.

In 1928 Paul Dirac derived his eponymous equation, which describes the physics of spin-1/2 fundamental particles called fermions. For charged, massive particles, the equation describes the electron and predicts the existence of its antiparticle, the positron, which was discovered in 1932.

Other solutions

However, there are other solutions of the Dirac equation that suggest the existence of more exotic particles than the familiar electron. In 1937 Ettore Majorana discovered a solution of the equation that describes a neutral particle that is its own antiparticle: the Majorana fermion. Although there is no evidence that Majorana fermions exist as fundamental particles, Majorana-like collective excitations (or quasiparticles) have been detected in condensed-matter systems. Another solution of the Dirac equation – this time for massless particles – was derived in 1929 by the German mathematician Hermann Weyl. For some time it was thought that neutrinos were Weyl fermions, but now it looks almost certain that neutrinos have mass and are therefore not Weyl particles.

Now, a group headed by Zahid Hasan at Princeton University has found evidence that Weyl fermions exist as quasiparticles – collective excitations of electrons – in the semimetal tantalum arsenide (TaAs). In 2014 Hasan and colleagues published calculations suggesting that TaAs is a “Weyl semimetal”. This means that TaAs should host Weyl fermions in its bulk and a distinct feature on its surface called a “Fermi arc”. Using a standard technique called angle-resolved photoemission spectroscopy (ARPES), the team found evidence of a Fermi arc. The team then used a technique called soft X-ray ARPES to probe deeper into the bulk of the material, where it found further evidence for Weyl fermions in the form of “Weyl cones” and “Weyl nodes” – both of which were in agreement with the researchers’ previous calculations.
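A “Weyl cone” is simply the linear, massless dispersion of the quasiparticles around a Weyl node. In the standard low-energy description (a textbook form, not a result specific to these papers), the effective Hamiltonian near a node of chirality ± is

$$H_{\pm}(\mathbf{k}) = \pm\,\hbar v_{\mathrm{F}}\,\boldsymbol{\sigma}\cdot\mathbf{k},
\qquad
E(\mathbf{k}) = \pm\,\hbar v_{\mathrm{F}}\,|\mathbf{k}|,$$

where $\boldsymbol{\sigma}$ is the vector of Pauli matrices and $v_{\mathrm{F}}$ is the Fermi velocity. The conical energy surfaces $E(\mathbf{k})$ are the Weyl cones, and the band-touching points at their tips are the Weyl nodes.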

Fermi arcs have also been predicted and then spotted in TaAs by an independent research group that includes Hongming Weng and colleagues at the Chinese Academy of Sciences. The team, which has members at the Collaborative Innovation Center of Quantum Matter in Beijing and Tsinghua University, also used ARPES in its study.

Double-gyroid crystal

Meanwhile, at the Massachusetts Institute of Technology and Zhejiang University in China, Marin Soljačić and colleagues have spotted evidence for Weyl fermions in a very different material – a “double-gyroid” photonic crystal. This crystal is made from slabs of plastic with a matrix of holes drilled in them. The slabs are then stacked in such a way that there are continuous paths through the crystal for microwave radiation to follow.

The physics of the Weyl fermion are so strange, there could be many things that arise from this particle that we’re just not capable of imagining now
Zahid Hasan, Princeton University

The team fired microwaves at the crystal and measured microwave transmission through the crystal while changing its orientation relative to the incident microwave beam – and varying the frequency of the microwaves. This allowed the researchers to map out the photonic band structure of the crystal, which shows which microwave frequencies can travel through the crystal and which cannot. The band structure revealed the presence of “Weyl points”, which are indicative of Weyl-fermion-like states existing in the photonic crystal.

“The discovery of Weyl points is not only the smoking gun to a scientific mystery,” says Soljačić, adding, “it paves the way to absolutely new photonic phenomena and applications.”

Fast and robust

Weyl fermions could be very useful because their massless nature would allow them to conduct electric charge through a material much faster than normal electrons – which could be used to create faster electronic circuits. This property is also shared by electrons in graphene. However, unlike graphene, which is a 2D material, Weyl fermions should exist in more practical 3D materials. Furthermore, Weyl particles are topologically protected from scattering, which means that they could be useful in quantum computers.

Hasan is also enthusiastic about how the discovery of Weyl fermions could lead to further advances: “The physics of the Weyl fermion are so strange, there could be many things that arise from this particle that we’re just not capable of imagining now.” Indeed, Hasan told physicsworld.com that Weyl semimetals could become a “motherboard for future electronic devices”, because they combine high mobility with topological protection.

The Hasan and Soljačić studies are described in separate papers in Science: 10.1126/science.aaa9297 and 10.1126/science.aaa9273. The work by Weng and colleagues is described in a preprint on arXiv.

What can we learn by listening to the ocean?

From marine earthquakes to dolphins chattering, the ocean rings with noises

    In less than 100 seconds, Philippe Blondel explains that activities within the world’s oceans create a cacophony of sound that can reveal vast amounts of information about the environment in which the noises are generated. Waves, wind and rain at the surface, earthquakes beneath the seafloor, shipping, the movements and communication sounds of marine animals – the sources of noise go on and on.

    Blondel is a physicist at the University of Bath in the UK with a focus on the physical understanding of acoustic remote sensing and its uses in underwater environments. In this 100 Second Science video, he explains how tracking the sounds of the sea can be used to study environmental change, such as the sound created by melting ice in the polar regions. Among other uses, this information can provide an early warning of the mini tsunamis caused by sudden ice collapse.

    Room at the bottom

    Everyone has heard of Silicon Valley, but few really understand how it became the home of the global computing industry. Before the 1980s, the area’s technological economy was dominated by the development and manufacture of magnetic recording storage, and it has been said, somewhat tongue-in-cheek, that it was then more of a “Rust Valley” due to the prevalence of various ferrous–ferric oxide mixtures employed in this industry. The label “Silicon Valley” did not gain currency until later, when Fairchild Semiconductor and its descendants Intel and Advanced Micro Devices began to commercialize their metal–oxide–semiconductor field-effect transistor (MOSFET) technology, usually based on silicon – thus transforming both the valley and its worldwide image.

    Moore’s Law celebrates the life and career of a scientist who played a major role in these developments. In the geek world, Gordon Moore is best known as the progenitor of “Moore’s Law”, the empirical observation (made in 1965) that the density of MOSFETs on an integrated circuit would double every 18–24 months. This doubling has indeed occurred more or less on schedule. The book’s subtitle describes Moore as a “quiet revolutionary” and the first word is certainly accurate; Moore is definitely not a superstar who attracts the kind of press promotion received by the likes of Bill Gates, Steve Jobs and, most recently, Elon Musk. But I prefer the description “quiet hero”. In his own industry, Moore has been to his colleagues what Steve Wozniak was to Steve Jobs at Apple Computer – the real font of technical (not sales) innovation behind their respective enterprises.
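The observation itself is a simple exponential: if the density doubles every T years, then after t years it has grown by a factor of 2^(t/T). A toy calculation (using a nominal two-year doubling time, purely for illustration):

```python
def density_growth_factor(years, doubling_time=2.0):
    """Growth factor implied by a fixed doubling time, in years."""
    return 2 ** (years / doubling_time)

# From Intel's first microprocessor (1971) to the book's publication (2015),
# a nominal two-year doubling time gives a factor of a few million.
print(f"{density_growth_factor(2015 - 1971):.1e}x")
```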

    The hardback book I now hold in my hands is some 4 cm thick and contains much more material than can be absorbed in one or two evenings of reading. To summarize, it describes how Moore was born and raised in the San Francisco Bay Area; attended local universities in San José and Berkeley; graduated from the latter in 1950 with a degree in chemistry; and obtained a PhD in that discipline from the California Institute of Technology. Following postdoctoral studies at Johns Hopkins, Moore joined William Shockley at Beckman Instruments in California, but in 1957 he and seven other young researchers broke with the notoriously difficult Shockley and accepted financial support from an entrepreneur, Sherman Fairchild. Over the next 10 years, their new company, Fairchild Semiconductor, pioneered the development of MOSFET devices, but not their successful commercialization. That began in 1968, when Moore and Robert Noyce founded the company that became Intel – arguably one of the most successful American enterprises of the later 20th century.

    Narrating this tale takes up most of the book, which is replete with moving family memorabilia and corporate intrigue. An example of the latter was Intel’s uneasy alliance with IBM, which Moore engineered in the early 1980s. With demand for IBM’s line of personal computers and mainframes exceeding its in-house manufacturing and development resources, it purchased, temporarily, a 15% interest in Intel to assure continuity of supply. Ordinarily, such a purchase could have fallen foul of US antitrust legislation, but at the time, IBM mainframes underpinned a large number of US defence and intelligence resources. This led to concerns that if the company had to source parts for its machines off-shore (particularly in Japan), it could engender a security risk. Hence, IBM was assured that its temporary funding of Intel would not be subject to antitrust action.

    So much for biography. Now let’s put on our physicist hats. Just how did Moore’s law come to be, and when will it be repealed? The basic concept behind MOSFETs was revealed in patents filed by two physicists, Julius Edgar Lilienfeld in the US and Oskar Heil in the UK, in 1926 and 1935 respectively. (Perhaps these dates should be the real “t = 0” for Moore’s law.) So why did it take almost four decades for the device to be realized in practice? Developing ancillary tools for fabrication took time, of course, but lack of demand was also a factor. Simply put, it took a while for the window of “conventional” technology (the vacuum tubes, junction transistors and bulk diodes that underpinned the devices that emerged after the Second World War) to slam shut, and for demand for faster and smaller “1” and “0” switches to take off. The micro- and nano-“wrenches”, “hammers” and “pliers” (actually vacuum deposition chambers, X-ray and electron diffraction, lithography and an alphabet soup of other technologies) required for manufacture had actually existed in the tool sheds of academic research institutions, US national laboratories, and a few hi-tech companies (notably IBM and Bell Labs), but it took the opening window of economic promise to get these tools off the shelf.

    So, was the inevitability of Moore’s law foreseen in the basic physics of MOSFETs and of the tools needed for its commercialization? I would argue that it was, and here Richard Feynman deserves a lot of credit. In 1959, well before Moore’s 1965 speculation, Feynman gave his now-famous lecture “There’s Plenty of Room at the Bottom” (a play on the title of the 1959 film Room at the Top). In the lecture, Feynman pointed out that our known laws of materials physics more than allowed the evolution of micro-nano fabrication that gave rise to Moore’s law. And the rest is history.

    Well, almost. On current trends, MOSFET dimensions will approach atomic scales within a decade, and the last section of Moore’s Law (entitled “All Good Exponentials End”) discusses this problem. Keep in mind we’re talking physics here, not economics. Today, all computers, whether in the cloud or in your pocket, are based on the Turing–von Neumann stored-program concept using “irreversible” binary logic and switching devices. By “irreversible”, I mean that the storage technology is incapable of “remembering” whether it contained a 1 or a 0 before its current state. In 1961 – barely a year after Feynman and four before Moore – Rolf Landauer of IBM postulated a thermodynamic limit on irreversible binary logic: each bit erased dissipates at least kT ln 2 of energy, so the dissipation scales as the number of switching events per unit volume times kT ln 2. This per-bit Landauer limit was verified experimentally in a 2012 article in Nature.
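The Landauer bound is easy to put a number on: erasing one bit at temperature T dissipates at least k_B T ln 2, which at room temperature is a few zeptojoules per bit. A quick check (this arithmetic is mine, not the book's):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temperature_kelvin):
    """Minimum energy dissipated per irreversible bit erasure: k_B * T * ln 2."""
    return K_B * temperature_kelvin * math.log(2)

print(f"{landauer_limit(300):.2e} J per bit at 300 K")  # about 2.9e-21 J
```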

    So when will Moore collide with Landauer? This has been a point of debate for at least a decade, but unfortunately it is not clearly addressed in Moore’s Law. Some have suggested that Landauer’s limit could be overcome by storing and manipulating our 1s and 0s in a black hole – a sort of Feynman cellar, if you will. If we could somehow convey this to Feynman’s spirit today, his response might be, “Of course. There’s still plenty of room at the bottom…and the top of the universe as well!”

    • 2015 Basic Books $35.00 hb 560pp

    Searching for life on other planets

    The search for signs of extraterrestrial life looks set to be one of the most exciting scientific endeavours of the 21st century and scientists have no shortage of places to look. Astronomers have already discovered nearly 2000 exoplanets and they look set to find many more. While most of these known exoplanets are gas giants that appear to be inhospitable to life, the discovery of Earth-like rocky exoplanets could come courtesy of the next generation of telescopes.

    In this podcast recorded at the Canadian Association of Physicists Congress in Edmonton, Sara Seager tells physicsworld.com editor Hamish Johnston how astronomers are gearing up to use the James Webb Space Telescope – due to launch in 2018 – and other ground- and space-based facilities to look for water vapour, oxygen and other gases in the atmospheres of rocky exoplanets. These and other gases such as methane could indicate the presence of life on these distant worlds, but Seager points out that many measurements on many different exoplanets will be needed before we can say with reasonable certainty that life exists.

    ‘Metasheet’ blocks a narrow band of radiation, letting the rest pass

    A “metasheet” that is extremely efficient at absorbing electromagnetic radiation in a very narrow band of wavelengths, while remaining transparent elsewhere in the spectrum, has been produced by researchers in Finland and Belarus. It was made by placing simple wire helices in strategic locations throughout the material, so that the helices absorb and dissipate energy contained in both electric and magnetic fields. The metasheet works for microwave radiation, and could be useful for making radiation detectors, telecommunications devices, energy-harvesting systems and even radar-cloaking devices. In principle, the design could also be modified to work for visible light.

    The idea of a device that absorbs radiation at specific wavelengths is not new, but most existing devices reflect the unabsorbed radiation back to its source. This rules out many useful applications where transmission of the unabsorbed radiation is needed. To address this shortcoming, several groups have attempted to develop “Huygens’ metasurfaces”, which comprise arrays of sub-wavelength inclusions that scatter radiation only in the forward direction. If the resonant wavelength of the inclusions is chosen correctly, they can collectively dissipate radiation at a wavelength of choice. Radiation at other wavelengths is diffracted by the inclusions and its wavefronts are reconstructed to achieve transmission.

    Previous designs have used different inclusions to absorb the electric and magnetic components of the incident radiation. While the different inclusions can be designed to have their central resonance peaks at the same wavelength, the absorption tends to fall away at different rates either side of the peaks. This prevents other wavelengths from being perfectly transmitted and leads to undesirable reflections. One solution is to use a material that is “bianisotropic”, which means that it can interact with both the electric and magnetic fields of the incident radiation. While this solves the problem of mismatched interactions away from the resonance peak, previous metasurfaces based on this principle could only absorb one circular polarization of radiation.
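The mismatch problem can be illustrated with a toy model: give the electric and magnetic responses Lorentzian resonances centred on the same frequency but with different widths, and the reflected wave – which for a thin sheet scales roughly with the difference between the two responses – vanishes only at the shared peak. The sketch below is a schematic illustration of that argument, not a model of the actual metasheet.

```python
import numpy as np

def resonance(freq, f0, width):
    """Lorentzian response normalized so its on-resonance value is width-independent."""
    return width / (f0**2 - freq**2 - 1j * width * freq)

freqs = np.linspace(0.5, 1.5, 11)
alpha_e = resonance(freqs, f0=1.0, width=0.05)   # electric response
alpha_m = resonance(freqs, f0=1.0, width=0.10)   # magnetic response, broader

# For a thin Huygens-type sheet the backscattered field scales roughly with the
# difference of the electric and magnetic responses, so mismatched widths leave
# residual reflection everywhere except at the shared resonance.
reflection = np.abs(alpha_e - alpha_m) ** 2
print(np.round(reflection / reflection.max(), 3))
```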

    Handy helices

    Earlier this year, Viktar Asadchy and colleagues at Aalto University in Finland produced a “metamirror”. This device is transparent to wavelengths away from a resonant wavelength, while reflecting the resonant wavelength at a specific angle. The team has now extended this work to produce a surface that absorbs radiation at a specific wavelength and dissipates its energy as heat.

    With these multifunctional structures, we can achieve completely amazing properties
    Viktar Asadchy, Aalto University

    The researchers made their resonators from helices of nickel–chromium wire, a dissipative material commonly used in electrical-resistance heaters. These helices are bianisotropic: when excited by incident electromagnetic radiation, they become electrically polarized along the axis of the helix and magnetically polarized azimuthally. Because of its chirality – a helix can twist with either a right-handed or a left-handed orientation – each helix is polarization sensitive, and therefore only absorbs radiation with a single circular polarization. The researchers therefore designed their metamaterial to include both right-handed and left-handed helices embedded in a plastic-foam substrate. This, they calculated, should produce metasheets that absorb radiation at a desired wavelength regardless of its polarization.

    Manufacturing imperfections

    They then tested the absorption of materials containing both single- and double-turn helical inclusions. At the resonance wavelength, they found that the single-turn helices absorbed 92% of the incident microwave radiation, whereas the double-turn helices absorbed 81%. These absorption figures, while impressive, are lower than the researchers’ theoretical predictions that single-turn helical arrays would absorb 96.5% of radiation and double-turn arrays 99.9%. The researchers attribute this difference to manufacturing imperfections.

    The team is now looking to build on this and previous research to produce non-reflecting arrays of transmitters that can be stacked in layers. “For example, the first layer will transmit the wave in one direction or focus it at one point,” Asadchy explains. “The second layer, which will stay behind this first one, will focus a wave of another wavelength at another point. Then we can combine several layers of these transmit arrays because they are transparent at their non-operational wavelengths.” “With these multifunctional structures, we can achieve completely amazing properties,” says Asadchy.

    “I think that this work is very significant for our community because it points out a very new device, which is an invisible filter,” says Filiberto Bilotti of the University of Rome. He cautions, however, that while in principle the physics is scalable to work at shorter wavelengths, creating a practical material that works for near-infrared or visible light is “not that trivial” because the conductivity of metals drops at shorter wavelengths. As a result, a different strategy would be needed to create short-wavelength metasheets.

    The metasheets are described in Physical Review X.

    Spain and Chile will host next-generation gamma-ray observatory

    Sites in Spain and Chile have been chosen to host the Cherenkov Telescope Array (CTA) – a huge gamma-ray observatory 10 times more sensitive than existing instruments, which will study supernova explosions, binary star systems and active galactic nuclei. Astronomers working on the project expect they will get approval at the end of the year to start building the arrays. It is hoped that the CTA will begin taking data at both locations by the end of 2020, with full operations by 2023.

    High-energy gamma rays are generated in the most energetic events in the universe, and studying these messengers can reveal important information about the violent processes that created them. When a gamma ray interacts with a particle in the Earth’s atmosphere, it produces a shower of lower-energy particles. These particles travel faster than the speed of light in air, creating a cone of blue Cherenkov light akin to a sonic boom. Telescopes on the ground collect this Cherenkov radiation, which scientists then analyse to determine the energy of the original gamma ray and the direction it came from.
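The geometry of that cone follows from the standard Cherenkov relation cos θ = 1/(nβ), where n is the refractive index of air and β is the particle's speed as a fraction of the speed of light; for relativistic shower particles near sea level the light is emitted at roughly one degree to the particle's track. A quick check (the refractive index used is a nominal sea-level value, for illustration):

```python
import math

def cherenkov_angle_deg(n, beta=1.0):
    """Cherenkov emission angle from cos(theta) = 1 / (n * beta)."""
    return math.degrees(math.acos(1.0 / (n * beta)))

print(f"{cherenkov_angle_deg(n=1.0003):.2f} degrees")  # about 1.4 degrees in air
```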

    The CTA will consist of two arrays. The smaller array – consisting of 15 telescopes 12 m in diameter and four at 23 m – will study the northern sky from the Spanish island of La Palma, which is off the Atlantic coast of North Africa. The larger observatory will have 70 telescopes at 4 m diameter, 25 at 12 m and four at 23 m. It will look toward the southern sky from Paranal in Chile’s Atacama Desert, and the first few small telescopes are likely to be deployed to the Chile site in mid-2016.

    ‘Excellent astronomical conditions’

    According to CTA project manager and technical director Christopher Townsley, these two sites were chosen over other competitors for several reasons, including the level of existing infrastructure and the estimated long-term operation costs. “Both the sites chosen have well-established observatories nearby and proven excellent astronomical conditions,” Townsley explains. If for any reason the La Palma and Paranal site negotiations fall through, however, the group has alternative north and south locations in Mexico and Namibia, respectively.

    The CTA will build on the technologies developed for current ground-based gamma-ray telescopes – such as the Very Energetic Radiation Imaging Telescope Array System in the US as well as the High Energy Stereoscopic System in Namibia – that use Cherenkov imaging techniques. Apart from being 10 times more sensitive than any rival, the CTA will also study a wider range of energies, from about 10 GeV to 300 TeV, although energies above 10 TeV will be accessible only with the southern site’s 4 m instruments.

    Scientists plan to do two astronomical surveys with the CTA: one of the galactic plane that contains the galactic centre – a site swarming with high-energy sources – and the other of one-quarter of the full sky. The observatory could also shed light on dark matter, according to CTA spokesperson Werner Hofmann of the Max Planck Institute for Nuclear Physics in Heidelberg. He explains: “If dark matter is indeed made of neutralino particles in the TeV mass range, CTA is in the best position to detect this radiation – a discovery that would have tremendous impact and far-reaching consequences for astrophysics, particle physics and cosmology.”

    The most exciting science that the next-generation gamma-ray detector can do, however, is uncovering new surprises. “When you build an instrument that is much more capable than existing instruments, and you’re exploring a new waveband,” says CTA co-spokesperson Rene Ong of the University of California, Los Angeles, “you’re going to discover something unexpected.”
