
The Arecibo Observatory’s ‘powerful radiation environment’ led to its collapse, claims report

The Arecibo Observatory’s “uniquely powerful electromagnetic radiation environment” is the most likely initial cause of its destruction and collapse in December 2020. That’s according to a new report by the National Academies of Sciences, Engineering, and Medicine, which states that failure of zinc in the cables that held the telescope’s main platform led to it falling onto the huge 305 m reflector dish – causing catastrophic damage.

While previous studies of the iconic telescope’s collapse had identified the deformation of zinc inside the cable sockets, other reasons were also put forward. They included poor workmanship and the effects of Hurricane Maria, which hit the area in 2017 and subjected the telescope’s cables to the highest structural stress they had endured since the instrument opened in 1963.

Inspections after the hurricane showed some evidence of cable slippage. Yet these investigations, the report says, failed to note several failure patterns and did not provide plausible explanations for most of them. In addition, photos taken in 2019 gave “a clear indication of major socket deterioration”, but no further investigation followed.

The eight-strong committee that wrote the report, chaired by Roger McCarthy of the US firm McCarthy Engineering, found that omission surprising. “The lack of documented concern from the contracted engineers about the inconsequentiality of cable pullouts or the safety factors between Hurricane Maria in 2017 and the failure is alarming,” they say.

Further research

The report concludes that the root cause of the catastrophe was linked to the zinc sockets, which suffered “unprecedented and accelerated long-term creep-induced failure”. Metallic creep – the slow, permanent deformation of a metal – is caused by stress and exacerbated by heat, and can ultimately cause metal components to fail. “Each failure involved both the rupture of some of the cable’s wires and a deformation of the socket’s zinc, and is therefore the failure of a cable-socket assembly,” the report notes.

As to the cause of the creep, the committee sees the telescope’s radiation environment as “the only hypothesis that…provides a plausible but unprovable answer”. The committee proposes that the telescope’s powerful transmitters induced electrical currents in the cables and sockets, potentially causing “long-term, low-current electroplasticity” in the zinc. The increased induced plasticity accelerated the natural ongoing creep in the zinc.

The report adds that the collapse of the platform is the first documented zinc-induced creep failure, despite the metal being used in such a way for over a century. The committee now recommends that the National Science Foundation (NSF), which oversees Arecibo, offer the remaining socket and cable sections to the research community for further analysis on the “large-diameter wire connections, the long-term creep behavior of zinc spelter connections, and [the] materials science”.

  • Meanwhile, the NSF had planned to reopen the telescope site as an educational center later this month, but that has now been delayed until next year to coincide with the NSF’s 75th anniversary.

Top-cited author Vaidehi Paliya discusses the importance of citations and awards

More than 50 papers from India have been recognized with a top-cited paper award for 2024 from IOP Publishing, which publishes Physics World. The prize is given to corresponding authors whose papers, published in IOP Publishing’s own or its partners’ journals between 2021 and 2023, are in the top 1% of the most-cited papers.

The winners include astrophysicist Vaidehi Paliya from Inter-University Centre for Astronomy and Astrophysics (IUCAA) and colleagues. Their work involved studying the properties of the “central engines” of blazars, a type of active galactic nucleus.

Vaidehi Paliya

“Knowing that the astronomy community has appreciated the published research is excellent,” says Vaidehi. “It has been postulated for a long time that the physics of relativistic jets is governed by the central supermassive black hole and accretion disk, also known as the central engine of an active galaxy. Our work is probably the first to quantify their physical properties, such as the black hole mass and the accretion disk luminosity, for a large sample of active galaxies hosting powerful relativistic jets called blazars.”

Vaidehi explains that getting many citations for the work, which was published in Astrophysical Journal Supplement Series, indicates that the published results “have been helpful to other researchers” and that this broad visibility also increases the chance that other groups will come across the work. “[Citations] are important because they can therefore trigger innovative ideas and follow-up research critical to advancing scientific knowledge,” adds Vaidehi.

Vaidehi says that he often turns to highly cited research “to appreciate the genuine ideas put forward by scientists”, with two recent examples being what inspired him to work on the central engine problem.

Indeed, Vaidehi says that prizes such as IOP’s highly cited paper award are essential for researchers, especially students. “Highly cited work is crucial not only to win awards but also for the career growth of a researcher. Awards play a significant role in further motivating fellow researchers to achieve even higher goals and highlight the importance of innovation,” he says. “Such awards are definitely a highlight in getting a career promotion. The news of the award may also lead to opportunities. For instance, to be invited to join other researchers working in similar areas, which will provide an ideal platform for future collaboration and research exploration.”

Vaidehi adds that results that are meaningful to broader research areas will likely result in higher citations. “Bringing innovation to the work is the key to success,” he says. “Prestigious awards, high citation counts, and other forms of success and recognition will automatically follow. You will be remembered by the community only for your contribution to its advancement and growth, so be genuine.”

  • For the full list of top-cited papers from India for 2024, see here.

How to boost the sustainability of solar cells

In this episode of the Physics World Weekly podcast I explore routes to more sustainable solar energy. My guests are four researchers at the UK’s University of Oxford who have co-authored the “Roadmap on established and emerging photovoltaics for sustainable energy conversion”.

They are the chemist Robert Hoye; the physicists Nakita Noel and Pascal Kaienburg; and the materials scientist Sebastian Bonilla. We define what sustainability means in the context of photovoltaics and we look at the challenges and opportunities for making sustainable solar cells using silicon, perovskites, organic semiconductors and other materials.

This podcast is supported by Pfeiffer Vacuum+Fab Solutions.

Pfeiffer is part of the Busch Group, one of the world’s largest manufacturers of vacuum pumps, vacuum systems, blowers, compressors and gas abatement systems. Explore its products at the Pfeiffer website.

 

Lightning sets off bursts of high-energy electrons in Earth’s inner radiation belt

A supposedly stable belt of radiation 7000 km above the Earth’s surface may in fact be producing damaging bursts of high-energy electrons. According to scientists at the University of Colorado Boulder, US, the bursts appear to be triggered by lightning, and understanding them could help determine the safest “windows” for launching spacecraft – especially those with a human cargo.

The Earth is surrounded by two doughnut-shaped radiation belts that lie within our planet’s magnetosphere. While both belts contain high concentrations of energetic electrons, the electrons in the outer belt (which starts from about 4 Earth radii above the Earth’s surface and extends to about 9–10 Earth radii) typically have energies in the MeV range. In contrast, electrons in the inner belt, which is located between about 1.1 and 2 Earth radii, have energies between 10 and a few hundred kiloelectronvolts (keV).

At the higher end of this energy scale, these electrons easily penetrate the walls of spacecraft and can damage sensitive electronics inside. They also pose risks to astronauts who leave the protective environment of their spacecraft to perform extravehicular activities.

The size of the radiation belts, as well as the energy and number of electrons they contain, varies considerably over time. One cause of these variations is sub-second bursts of energetic electrons that enter the atmosphere from the surrounding magnetosphere. These rapid microbursts are most commonly seen in the outer radiation belt, where they are the result of interactions with phenomena called whistler-mode chorus radio waves. However, they can also be observed in the inner belt, where they are generated by whistlers produced by lightning storms. Such lightning-induced precipitation, as it is known, typically occurs at low energies of tens of keV to around 100 keV.

Outer-belt energies in inner-belt electrons

In the new study, researchers led by CU Boulder aerospace engineering student Max Feinland observed clumps of electrons with MeV energies in the inner belt for the first time. This serendipitous discovery came while Feinland was analysing data from a now-decommissioned NASA satellite called the Solar, Anomalous, and Magnetospheric Particle Explorer (SAMPEX). He originally intended to focus on outer-belt electrons, but “after stumbling across these events in the inner belt, we thought they were interesting and decided to investigate further,” he tells Physics World.

After careful analysis, Feinland, who was working as an undergraduate research assistant in Lauren Blum’s team at CU Boulder’s Laboratory for Atmospheric and Space Physics at the time, identified 45 bursts of high-energy electrons in the inner belt in data from 1996 to 2006. At first, he and his colleagues weren’t sure what could be causing them, since the chorus waves known to produce such high-energy bursts are generally an outer-belt phenomenon. “We actually hypothesized a number of processes that could explain our observations,” he says. “We even thought that they might be due to Very Low Frequency (VLF) transmitters used for naval communications.”

The lightbulb moment, however, came when Feinland compared the bursts to records of lightning strikes in North America. Intriguingly, he found that several of the peaks in the electron bursts seemed to happen less than a second after the lightning strikes.

A lightning trigger

The researchers’ explanation for this is that radio waves produced after a lightning strike interact with electrons in the inner belt. These electrons then begin to oscillate between the Earth’s northern and southern hemispheres with a period of just 0.2 seconds. With each oscillation, some electrons drop out of the inner belt and into the atmosphere. This last finding was unexpected: while researchers knew that high-energy electrons can fall into the atmosphere from the outer radiation belt, this is the first time that they have observed them coming from the inner belt.

Feinland says the team’s discovery could help space-launch firms and national agencies decide when to launch their most sensitive payloads. With further studies, he adds, it might even be possible to determine how long these high-energy electrons remain in the inner belt after geomagnetic storms. “If we can quantify these lifetimes, we could determine when it is safest to launch spacecraft,” he says.

The researchers are now seeking to calculate the exact energies of the electrons. “Some of them may be even more energetic than 1 MeV,” Feinland says.

The present work is detailed in Nature Communications.

First human retinal image brings sight-saving portable OCT a step closer

Image of a human retina taken with the Akepa photonic chip

UK health technology start-up Siloton is developing a portable optical coherence tomography (OCT) system that uses photonic integrated circuits to miniaturize a tabletop’s worth of expensive and fragile optical components onto a single coin-sized chip. In a first for a commercial organization, Siloton has now used its photonic chip technology to capture a sub-surface image of a human retina.

OCT is a non-invasive imaging technique employed as the clinical gold standard for diagnosing retinal disease. Current systems, however, are bulky and expensive and only available at hospital clinics or opticians. Siloton aims to apply its photonic chip – the optical equivalent of an electronic chip – to create a rugged, portable OCT system that patients could use to monitor disease progression in their own homes.

Siloton's Akepa photonic chip

The image obtained using Siloton’s first-generation OCT chip, called Akepa, reveals the fine layered structure of the retina in a healthy human eye. It clearly shows layers such as the outer photoreceptor segment and the retinal pigment epithelium, which are key clinical features for diagnosing and monitoring eye diseases.

“The system imaged the part of the retina that’s responsible for all of your central vision, most of your colour vision and the fine detail that you see,” explains Alasdair Price, Siloton’s CEO. “This is the part of the eye that you really care about looking at to detect disease biomarkers for conditions like age-related macular degeneration [AMD] or various diabetic eye conditions.”

Faster and clearer

Since Siloton first demonstrated that Akepa could acquire OCT images of a retinal phantom, the company has made some major software enhancements. For example, while the system previously took 5 min to image the phantom – an impractical length of time for human imaging – the imaging time is now less than a second. The team is also exploring ways to improve image quality using artificial intelligence techniques.

Price explains that the latest image was recorded using the photonic chip in a benchtop set-up, noting that the company is about halfway through the process of miniaturizing all of the optics and electronics into a handheld binocular device.

“The electronics is all off-the-shelf, so we’re not going to focus too heavily on miniaturizing that until right at the end,” he says. “The innovative part is in miniaturizing the optics. We are very close to having it in that binocular headset now, the aim being that by early next year we will have that fully miniaturized.”

As such, the company plans to start deploying some research-only systems commercially next year. These will be handheld binocular-style devices that users hold up to their faces, complete with a base station for charging and communications. In focus groups with more than 100 patients, Siloton confirmed that users prefer this binocular design over the traditional chin rest employed in full-size OCT systems.

“We were worried about that because we thought we may not be able to get the level of stability required,” says Price. “But we did further tests on the stability of the binocular system compared with the chin rest and actually found that the binoculars showed greater stability. Right now we’re still using a chin rest, so we’re hopeful that the binocular system will further improve our ability to record high-quality images.”

The Siloton founding team

Expanding applications

The principal aim of Siloton’s portable OCT system is to make the diagnosis and monitoring of eye diseases – such as diabetic macular oedema, retinal vein occlusion and AMD, the leading cause of sight loss in the developed world – more affordable and accessible.

Neovascular or “wet” AMD, for example, can be treated with regular eye injections, but this requires regular OCT scans at hospital appointments, which may not be available frequently enough for effective monitoring. With an OCT system in their own homes, patients can scan themselves every few days, enabling timely treatments as soon as disease progression is detected – as well as saving hospitals substantial amounts of money.

Ongoing improvements in the “quality versus cost” of the Akepa chip have also enabled Siloton to expand its target applications outside of ophthalmology. The ability to image structures such as the optic nerve, for example, enables the use of OCT to screen for optic neuritis, a common early symptom in patients with multiple sclerosis.

The company is also working with the European Space Agency (ESA) on a project investigating spaceflight-associated neuro-ocular syndrome (SANS), a condition suffered by about 70% of astronauts and which requires regular monitoring.

“At the moment, there is an OCT system on the International Space Station. But for longer-distance space missions, things like Gateway, there won’t be room for such a large system,” Price tells Physics World. “So we’re working with ESA to look at getting our chip technology onto future space missions.”

‘Buddy star’ could explain Betelgeuse’s varying brightness

An unseen low-mass companion star may be responsible for the recently observed “Great Dimming” of the red supergiant star Betelgeuse. According to this hypothesis, which was put forward by researchers in the US and Hungary, the star’s apparent brightness varies when an orbiting companion – dubbed α Ori B or, less formally, “Betelbuddy” – displaces light-blocking dust, thereby changing how much of Betelgeuse’s light reaches the Earth.

Located about 548 light-years away, in the constellation Orion, Betelgeuse is the 10th brightest star in the night sky. Usually, its brightness varies over a period of 416 days, but in 2019–2020, its output dropped to the lowest level ever recorded.

At the time, some astrophysicists speculated that this “Great Dimming” might mean that the star was reaching the end of its life and would soon explode as a supernova. Over the next three years, however, Betelgeuse’s brightness recovered, and alternative hypotheses gained favour. One such suggestion is that a cooler spot formed on the star and began ejecting material and dust, causing its light to dim as seen from Earth.

Pulsation periods

The latest hypothesis was inspired, in part, by the fact that Betelgeuse experiences another cycle in addition to its fundamental 416-day pulsation period. This second cycle, known as the long secondary period (LSP), lasts 2170 days, and the Great Dimming occurred after a minimum of the LSP coincided with a minimum in the 416-day cycle.

While astrophysicists are not entirely sure what causes LSPs, one leading theory suggests that they stem from a companion star. As this companion orbits its parent star, it displaces the cosmic dust the star produces and expels, which in turn changes the amount of starlight that reaches us.

Lots of observational data

To understand whether this might be happening with Betelgeuse, a team led by Jared Goldberg at the Flatiron Institute’s Center for Computational Astrophysics, together with Meridith Joyce at the University of Wyoming and László Molnár of the Konkoly Observatory (HUN-REN CSFK) in Budapest, analysed a wealth of observational data from the American Association of Variable Star Observers. “This association has been collecting data from both professional and amateur astronomers, so we had access to decades worth of data,” explains Molnár. “We also looked at data from the space-based SMEI instrument and spectroscopic observations collected by the STELLA robotic telescope.”

The researchers combined these direct-observation data with advanced computer models that simulate Betelgeuse’s activity. When they studied how the star’s brightness and its velocity varied relative to each other, they realized that the brightest phase must correspond to a companion being in front of it. “This is the opposite of what others have proposed,” Molnár notes. “For example, one popular hypothesis postulates that companions are enveloped in dense dust clouds, obscuring the giant star when they pass in front of them. But in this case, the companion must remove dust from its vicinity.”

As for how the companion does this, Molnár says they are not sure whether it evaporates the dust away or shepherds it to the opposite side of Betelgeuse with its gravitational pull. Both are possible, and Goldberg adds that other processes may also contribute. “Our new hypothesis complements the previous one involving the formation of a cooler spot on the star that ejects material and dust,” he says. “The dust ejection could occur because the companion star was out of the way, behind Betelgeuse rather than along the line of sight.”

The least absurd of all hypotheses?

The prospect of a connection between an LSP and the activity of a companion star is a longstanding one, Goldberg tells Physics World. “We know that Betelgeuse has an LSP, and if an LSP exists, that means a ‘buddy’ for Betelgeuse,” he says.

The researchers weren’t always so confident, though. Indeed, they initially thought the idea of a companion star for Betelgeuse was absurd, so the hardest part of their work was to prove to themselves that this was, in fact, the least absurd of all hypotheses for what was causing the LSP.

“We’ve been interested in Betelgeuse for a while now, and in a previous paper, led by Meridith, we already provided new size, distance and mass estimates for the star based on our models,” says Molnár. “Our new data started to point in one direction, but first we had to convince ourselves that we were right and that our claims are novel.”

The findings could have more far-reaching implications, he adds. While around one third of all red giants and supergiants have LSPs, the relationships between LSPs and brightness vary. “There are therefore a host of targets out there and potentially a need for more detailed models on how companions and dust clouds may interact,” Molnár says.

The researchers are now applying for observing time on space telescopes in hopes of finding direct evidence that the companion exists. One challenge they face is that because Betelgeuse is so bright – indeed, too bright for many sensitive instruments – a “Betelbuddy”, as Goldberg has nicknamed it, may be simpler to explain than it is to observe. “We’re throwing everything we can at it to actually find it,” Molnár says. “We have some ideas on how to detect its radiation in a way that can be separated from the absolute deluge of light Betelgeuse is producing, but we have to collect and analyse our data first.”

The study is published in The Astrophysical Journal.

Black hole in rare triple system sheds light on natal kicks

For the first time, astronomers have observed a black hole in a triple system with two other stars. The system is called V404 Cygni and was previously thought to be a closely-knit binary comprising a black hole and a star. Now, Kevin Burdge and colleagues at the Massachusetts Institute of Technology (MIT) have shown that the pair is orbited by a more distant tertiary star.

The observation supports the idea that some black holes do not experience a “natal kick” in momentum when they form. This is expected if a black hole is created from the sudden implosion of a star, rather than in a supernova explosion.

When black holes and neutron stars are born, they can gain momentum through mechanisms that are not well understood. These natal kicks can accelerate some neutron stars to speeds of hundreds of kilometres per second. For black holes, the kick is expected to be less pronounced — and in some scenarios, astronomers believe that these kicks must be very small.

Information about natal kicks can be gleaned by studying the behaviour of X-ray binaries, which usually pair a main sequence star with a black hole or neutron star companion. As these two objects orbit each other, material from the star is transferred to its companion, releasing vast amounts of gravitational potential energy as X-rays and other electromagnetic radiation.

Wobbling objects

In such binaries, any natal kick the black hole may have received during its formation can be deduced by studying how the black hole and its companion star orbit each other. This can be done using the radial velocity (or wobble) technique, which measures the Doppler shift of light from the orbiting objects as they accelerate towards and then away from an observer on Earth.
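
As a rough illustration of the wobble principle (a minimal sketch with made-up numbers, not the team’s actual analysis), the non-relativistic Doppler formula v_r = c·Δλ/λ converts a measured shift of a spectral line into a line-of-sight velocity:

C_KM_S = 299_792.458  # speed of light in km/s

def radial_velocity(lambda_observed, lambda_rest):
    # Non-relativistic Doppler shift: v_r = c * (lambda_obs - lambda_rest) / lambda_rest
    return C_KM_S * (lambda_observed - lambda_rest) / lambda_rest

# Purely illustrative values: the H-alpha line (rest wavelength 656.281 nm)
# observed 0.1 nm redward of its rest position
print(radial_velocity(656.381, 656.281))  # ~45.7 km/s, i.e. the source is receding

Repeating such measurements over an orbit traces out the periodic velocity curve from which the orbital parameters can be inferred.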

In their study, Burdge’s team scrutinized archival observations of V404 Cygni that were made using a number of different optical telescopes. A bright blob of light thought to be the black hole and its close-knit companion star is prominent in these images. But the team noticed something else: a second blob of light that could be a star orbiting the close-knit binary.

“We immediately noticed that there was another star next to the binary system, moving together with it,” Burdge explains. “It was almost like a happy accident, but was a direct product of an optical and an X-ray astronomer working together.”

As Burdge describes, the study came as a result of integrating his own work in optical astronomy with the expertise of MIT’s Erin Kara, who does X-ray astronomy on black holes. Burdge adds, “We were thinking about whether it might be interesting to take high-speed movies of black holes. While thinking about this, we went and looked at a picture of V404 Cygni, taken in visible light.”

Hierarchical triple

The observation provided the team with clear evidence that V404 Cygni is part of a “hierarchical triple” – an observational first. “In the system, a black hole is eating a star which orbits it every 6.5 days. But there is another star way out there that takes 70,000 years to complete its orbit around the inner system,” Burdge explains. Indeed, the third star is about 3500 au (3500 times the distance from the Earth to the Sun) from the black hole.
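
As a back-of-the-envelope consistency check (a sketch under the assumption that the quoted 3500 au separation is the semi-major axis of a roughly circular outer orbit), Kepler’s third law in solar units, M ≈ a³/P², gives the total mass enclosed by the outer star’s orbit:

# Kepler's third law in solar units: M_total [solar masses] = a^3 [au] / P^2 [years]
a_au = 3500       # quoted separation of the tertiary star, assumed here to be the semi-major axis
p_years = 70_000  # quoted orbital period of the tertiary star

m_total = a_au**3 / p_years**2
print(f"Enclosed mass ~ {m_total:.1f} solar masses")  # ~8.8 solar masses for these round numbers

That figure is in the right ballpark for a stellar-mass black hole plus its close companion.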

By studying these orbits, the team gleaned important information about the black hole’s formation. If it had undergone a natal kick when its progenitor star collapsed, the tertiary system would have become more chaotic – causing the more distant star to unbind from the inner binary pair.

The team also determined that the outer star is in the later stages of its main-sequence evolution. This suggests that V404 Cygni’s black hole must have formed between 3 and 5 billion years ago. When the black hole formed, the researchers believe it would have removed at least half of the mass from its binary companion. But since the black hole still has a relatively low mass, this means that its progenitor star must have lost very little mass as it collapsed.

“The black hole must have formed through a gentle process, without getting a big kick like one might expect from a supernova,” Burdge explains. “One possibility is that the black hole formed from the implosion of a star.”

If this were the case, the star would have collapsed into a black hole directly, without large amounts of matter being ejected in a supernova explosion. Whether or not this is correct, the team’s observations suggest that at least some black holes can form with no natal kick – providing deeper insights into the later stages of stellar evolution.

The research is described in Nature.

UK particle physicist Mark Thomson selected as next CERN boss

The UK particle physicist Mark Thomson has been selected as the 17th director-general of the CERN particle-physics laboratory. Thomson, 58, was chosen today at a meeting of the CERN Council. He will take up the position on 1 January 2026 for a five-year period succeeding the current CERN boss Fabiola Gianotti, who will finish her second term next year.

Three candidates were shortlisted for the job after being put forward by a search committee. Physics World understands that the Dutch theoretical physicist and former Dutch science minister Robbert Dijkgraaf was also considered for the position, while the third shortlisted candidate was reported to have been the Greek particle physicist Paris Sphicas.

With a PhD in physics from the University of Oxford, Thomson is currently executive chair of the Science and Technology Facilities Council (STFC), one of the main funding agencies in the UK. He spent a significant part of his career at CERN, working in the 1990s on precise measurements of the W and Z bosons as part of the OPAL experiment at CERN’s Large Electron-Positron Collider.

In 2000 he moved back to the UK to take up a position in experimental particle physics at the University of Cambridge. He was then a member of the ATLAS collaboration at CERN’s Large Hadron Collider (LHC) and between 2015 and 2018 served as co-spokesperson for the US Deep Underground Neutrino Experiment. Since 2018 he has served as the UK delegate to CERN’s Council.

Thomson was selected for his managerial credentials in science and his connection to CERN. “Thomson is a talented physicist with great managerial experience,” notes Gianotti. “I have had the opportunity to collaborate with him in several contexts over the past years and I am confident he will make an excellent director-general. I am pleased to hand over this important role to him at the end of 2025.”

“Thomson’s election is great news – he has the scientific credentials, experience, and vision to ensure that CERN’s future is just as bright as its past, and it remains at the absolute cutting edge of research,” notes Peter Kyle, UK secretary of state for science, innovation and technology. “Work that is happening at CERN right now will be critical to scientific endeavour for decades to come, and for how we tackle some of the biggest challenges facing humanity.”

‘The right person’

Dirk Ryckbosch, a particle physicist at Ghent University and a delegate for Belgium in the CERN Council, told Physics World that Thomson is a “perfect match” for CERN. “As a former employee and a current member of the council, Thomson knows the ins and outs of CERN and he has the experience needed to lead a large research organization,” adds Ryckbosch.

The last UK director-general of CERN was Chris Llewellyn Smith, who held the position between 1994 and 1998. Yet Ryckbosch acknowledges that, within CERN, Brexit has never clouded the relationship between the UK and EU member states. “The UK has always remained a strong and loyal partner,” he says.

Thomson will have two big tasks when he becomes CERN boss in 2026: ensuring that the upgraded LHC, known as the High-Luminosity LHC (HL-LHC), starts operations by 2030, and securing plans for the LHC’s successor.

CERN has currently put its weight behind the Future Circular Collider (FCC), which would cost about £12bn and, with a 91 km circumference, be more than three times as large as the LHC. The FCC would first be built as an electron-positron collider with the aim of studying the Higgs boson in unprecedented detail. It could later be upgraded to a hadron collider, known as the FCC-hh.

The construction of the FCC will, however, require additional funding from CERN member states. Earlier this year Germany, which is a main contributor to CERN’s annual budget, publicly objected to the FCC’s high cost. Garnering support for the FCC, if CERN selects it as its next project, will be a delicate balancing act for Thomson. “With his international network and his diplomatic skills, Mark is the right person for this,” concludes Ryckbosch.

That view is backed by particle theorist John Ellis from King’s College London, who told Physics World that Thomson has the “ideal profile for guiding CERN during the selection and initiation of its next major accelerator project”. Ellis adds that Thomson “brings to the role a strong record of research in collider physics as well as studies of electron-positron colliders and leadership in the DUNE neutrino experiment and also extensive managerial experience”.

Timber! Japan launches world’s first wooden satellite into space

Researchers in Japan have launched the world’s first wooden satellite to test the feasibility of using timber in space. Dubbed LignoSat, the small “cubesat” was developed by Kyoto University and the logging firm Sumitomo Forestry. It was launched on 4 November to the International Space Station (ISS) from the Kennedy Space Center in Florida by a SpaceX Falcon 9 rocket.

Given the lack of water and oxygen in space, wood is potentially more durable in orbit than it is on Earth where it can rot or burn. This makes it an attractive and sustainable alternative to metals such as aluminium that can create aluminium oxide particles during re-entry into the Earth’s atmosphere.

Work began on LignoSat in 2020. In 2022 scientists at Kyoto sent samples of cherry, birch and magnolia wood to the ISS where the materials were exposed to the harsh environment of space for 240 days to test their durability.

While each specimen performed well with no clear deformation, the researchers settled on building LignoSat from magnolia – or Hoonoki in Japanese. This type of wood has traditionally been used for sword sheaths and is known for its strength and stability.

LignoSat is made without screws or glue, and is equipped with external solar panels and encased in an aluminium frame. Next month the satellite is expected to be deployed in orbit around the Earth for about six months to measure how the wood withstands the environment and how well it protects the chips inside the satellite from cosmic radiation.

Data will be collected on the wood’s expansion and contraction, the internal temperature and the performance of the electronic components inside.

Researchers are hopeful that if LignoSat is successful it could pave the way for satellites to be made from wood. This would be more environmentally friendly given that each satellite would simply burn up when it re-enters the atmosphere at the end of its lifetime.

“With timber, a material we can produce by ourselves, we will be able to build houses, live and work in space forever,” astronaut Takao Doi, who studies human space activities at Kyoto University, told Reuters.

Physicists propose new solution to the neutron lifetime puzzle

Neutrons inside the atomic nucleus are incredibly stable, but free neutrons decay within 15 minutes – give or take a few seconds. The reason we don’t know this figure more precisely is that the two main techniques used to measure it produce conflicting results. This so-called neutron lifetime problem has perplexed scientists for decades, but now physicists at TU Wien in Austria have come up with a possible explanation. The difference in lifetimes, they say, could stem from the neutron being in not-yet-discovered excited states that have different lifetimes as well as different energies.

According to the Standard Model of particle physics, free neutrons undergo a process called beta decay that transforms a neutron into a proton, an electron and an antineutrino. To measure the neutrons’ average lifetime, physicists employ two techniques. The first, known as the bottle technique, involves housing neutrons within a container and then counting how many of them remain after a certain amount of time. The second approach, known as the beam technique, is to fire a neutron beam with a known intensity through an electromagnetic trap and measure how many protons exit the trap within a fixed interval.

Researchers have been performing these experiments for nearly 30 years but they always encounter the same problem: the bottle technique yields an average neutron survival time of 880 s, while the beam method produces a lifetime of 888 s. Importantly, this eight-second difference is larger than the uncertainties of the measurements, meaning that known sources of error cannot explain it.
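
Both figures parameterize the same exponential decay law, N(t) = N₀·exp(−t/τ). A minimal sketch (with an assumed 1000 s storage time, chosen only for illustration) shows how an 8 s shift in τ changes the surviving fraction by roughly one per cent, comfortably larger than the quoted measurement uncertainties:

import math

def surviving_fraction(t, tau):
    # Exponential decay law: N(t)/N0 = exp(-t/tau)
    return math.exp(-t / tau)

t_storage = 1000.0  # assumed storage time in seconds, for illustration only
for tau in (880.0, 888.0):
    print(f"tau = {tau:.0f} s -> surviving fraction = {surviving_fraction(t_storage, tau):.4f}")
# tau = 880 s -> ~0.3210; tau = 888 s -> ~0.3243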

A mix of different neutron states?

A team led by Benjamin Koch and Felix Hummel of TU Wien’s Institute of Theoretical Physics is now suggesting that the discrepancy could be caused by nuclear decay producing free neutrons in a mix of different states. Some neutrons might be in the ground state, for example, while others could be in a higher-energy excited state. This would alter the neutrons’ lifetimes, they say, because elements in the so-called transition matrix that describes how neutrons decay into protons would be different for neutrons in excited states and neutrons in ground states.

As for how this would translate into different beam and bottle lifetime measurements, the team say that neutron beams would naturally contain several different neutron states. Neutrons in a bottle, in contrast, would almost all be in the ground state – simply because they would have had time to cool down before being measured in the container.
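
One very schematic way to see how such a mixture could pull the two numbers apart (a toy model with illustrative parameters, not the TU Wien calculation): if a fraction of the neutrons in a beam sit in a hypothetical longer-lived excited state, the decay rate per beam neutron is the weighted average of the individual rates, so counting decay protons yields a longer apparent lifetime than the pure ground-state population stored in a bottle:

tau_ground = 880.0    # ground-state lifetime in seconds (the bottle-like value)
tau_excited = 967.0   # assumed lifetime of a hypothetical excited state
f_excited = 0.10      # assumed excited-state fraction in the beam

# Average decay rate per beam neutron, then the lifetime the beam method would infer
rate_beam = (1 - f_excited) / tau_ground + f_excited / tau_excited
tau_beam = 1 / rate_beam
print(f"Inferred beam lifetime ~ {tau_beam:.0f} s")  # ~888 s for these assumed numbers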

Towards experimental tests

Could these different states be detected? The researchers say it’s possible, but they caution that experiments will be needed to prove it. They also note that theirs is not the first hypothesis put forward to explain the neutron lifetime discrepancy. Perhaps the simplest explanation is that the gap stems from unknown systematic errors in either the beam experiment, the bottle experiment, or both. Other, more theoretical approaches have also been proposed, but Koch says they do not align with existing experimental data.

“Personally, I find hypotheses that require fewer and smaller new assumptions – and that are experimentally testable – more appealing,” Koch says. As an example, he cites a 2020 study showing that a phenomenon called the inverse quantum Zeno effect could speed up the decay of bottle-confined neutrons, calling it “an interesting idea”. Another possible explanation of the puzzle, which he says he finds “very intriguing”, has just been published; it describes the admixture of novel bound electron-proton states, known as “Second Flavor Hydrogen Atoms”, in the final state of a weak decay.

As someone with a background in quantum gravity and theoretical physics beyond the Standard Model, Koch is no stranger to predictions that are hard (and sometimes impossible, at least in the near term) to test. “Contributing to the understanding of a longstanding problem in physics with a hypothesis that could be experimentally tested soon is therefore particularly exciting for me,” he tells Physics World. “If our hypothesis of excited neutron states is confirmed by future experiments, it would shed a completely new light on the structure of neutral nuclear matter.”

The researchers now plan to collaborate with colleagues from the Institute for Atomic and Subatomic Physics at TU Wien to re-evaluate existing experimental data and explore various theoretical models. “We’re also hopeful about designing experiments specifically aimed at testing our hypothesis,” Koch reveals.

The present study is detailed in Physical Review D.
