New dawn for South African radio astronomy as major telescope nears completion

A $25m radio telescope in South Africa that is dedicated to observing the early universe is expected to be complete early next year. Nearly six years after construction began, the remaining dozen 14 m-diameter dishes belonging to the Hydrogen Epoch of Reionization Array (HERA) will be installed over the coming months. It will then aim to study the first galaxies and black holes in the universe.

Funded by several institutions, including the US National Science Foundation and the Gordon and Betty Moore Foundation, HERA is situated next to MeerKAT in the arid Karoo region, near Carnarvon in the Northern Cape province. Using an array of 350 14 m dishes, HERA’s primary science goal will be studying the “Epoch of Reionization”. This occurred about 13 billion years ago, when the universe was still young, and is the period when the first stars and galaxies formed.

The first billion years of cosmic history are shrouded in mystery and the shroud only began to lift when ultraviolet light from the first stars and galaxies ionized the fog of neutral hydrogen gas that filled the universe. HERA will allow scientists to probe this epoch directly.

HERA currently has the best sensitivity to the faint radiation emitted by hydrogen atoms that is used to observe the epoch, and this month the collaboration released its first set of observations, taken in 2017–2018 using about 50 dishes.

“The signal we are trying to detect has travelled over 13 billion light-years to reach us, so we need a big telescope in order to receive enough signal to detect it,” says David DeBoer, HERA project manager and an astronomer at the University of California, Berkeley. “The data taken to date with the first antennas is allowing us to place some limits, but not actually detect our elusive signal. When done, and when we’ve sufficiently understood the complex details, we expect to detect the signal over many times and many different spatial scales.”
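The radiation in question is the 21 cm hyperfine line of neutral hydrogen, stretched to much lower frequencies by the expansion of the universe. A minimal sketch (the redshift values below are illustrative assumptions, not figures from the article) shows the observed frequencies a telescope like HERA must target:

```python
# Observed frequency of the redshifted 21 cm hydrogen line.
# The redshift values used below are illustrative, not from the article.

REST_FREQUENCY_MHZ = 1420.405752  # 21 cm hyperfine line of neutral hydrogen


def observed_frequency_mhz(z: float) -> float:
    """Frequency at which radiation emitted at redshift z is received today."""
    return REST_FREQUENCY_MHZ / (1.0 + z)


for z in (6.0, 10.0, 20.0):
    print(f"z = {z:4.1f} -> {observed_frequency_mhz(z):6.1f} MHz")
```

The higher the redshift (i.e. the further back in time), the lower the observed frequency, which is why reionization-era experiments operate in the low-frequency radio band.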

Dawn of an HERA

When complete, HERA will have enough collecting area to quickly detect this signal from far away in the universe. The full HERA array will also be able to probe the universe even further back in time before the stars started to ionize the universe, known as the “cosmic dawn”.

“It will tell us something fundamental about the initial processes of star formation and about cosmology in the early universe, just a mere few hundred million years after the Big Bang,” says Mario Santos of the University of the Western Cape and the South African Radio Astronomy Observatory, who sits on HERA’s board. “But such detection will require very sophisticated signal processing and analysis techniques, as well as lots of computing power, that are being developed hand in hand with construction of the telescope.”

South Africa currently has several research groups that are working on HERA’s data and its theoretical interpretation. “As we are able, we will make the data public and provide some tools for its use,” says DeBoer.

Hyperbaric oxygen therapy could slow, or even reverse, Alzheimer’s disease pathology

Hyperbaric oxygen therapy (HBOT), an established medical treatment that involves breathing pure oxygen in a pressurized environment, could provide a new means to slow the progression, or even prevent the development, of Alzheimer’s disease. That’s the conclusion of a new study from researchers at Tel Aviv University and Shamir Medical Center.

Reporting their findings in Aging, the researchers demonstrate that HBOT can improve cerebral blood flow and cognitive function in mouse models of Alzheimer’s disease, as well as in elderly patients with significant memory loss. They also show for the first time that HBOT can reduce the volume of amyloid plaques in these mouse models. As amyloid plaques are one of the main hallmarks found in the brains of people with Alzheimer’s disease, this gives hope for similar improvements in Alzheimer’s disease patients.

Reduced cerebral blood flow, and the resulting decrease in oxygen supply to the brain (hypoxia), is associated with the onset of dementia and correlates with the degree of cognitive impairment in Alzheimer’s disease. The team hypothesized that a technique targeting vascular dysfunctions, such as reduced vessel diameters, may offer a way to treat Alzheimer’s disease – a disorder for which there is presently no effective intervention.

Animal investigations

The researchers first examined the impact of HBOT on 5XFAD transgenic mice, which have impaired cognitive abilities. They exposed the mice to HBOT at twice atmospheric pressure for 60 min per day, 20 times over four weeks. Post-mortem staining revealed a significant reduction in amyloid burden in the hippocampus of treated mice, with fewer and smaller plaques than seen in control untreated 5XFAD mice.

Two-photon microscopy of amyloid plaques

The team also performed two-photon microscopy imaging in live animals to study the dynamics of amyloid plaque formation. Without treatment, pre-existing plaques increased in size over time, with smaller plaques growing the most.

HBOT halted the growth of these small plaques and reduced the volumes of medium-sized and large plaques. Overall, existing plaques in control mice grew by an average of 12.3% over one month, while plaques in treated mice shrank by 40.05% on average.
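The quoted percentages are simple relative changes in plaque volume over the observation period. A minimal sketch (the volumes below are made-up illustrative numbers, not data from the study) shows the arithmetic:

```python
# Relative (percentage) change in plaque volume, the statistic quoted above.
# The volumes are invented for illustration, not taken from the study.


def percent_change(initial: float, final: float) -> float:
    """Percentage change from an initial to a final volume (negative = shrinkage)."""
    return 100.0 * (final - initial) / initial


control = percent_change(initial=100.0, final=112.3)  # plaque grew
treated = percent_change(initial=100.0, final=59.95)  # plaque shrank
print(f"control: {control:+.2f}%  treated: {treated:+.2f}%")
```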

Using HBOT to increase oxygen delivery to the brain also prevented the formation of new plaques. Over one month, the number of plaques seen in untreated mice almost doubled, while in mice receiving HBOT the number remained constant. Live animal imaging also revealed that HBOT alleviated the reduction in vessel diameter seen in Alzheimer’s disease mice, leading to increased cerebral blood flow and reduced hypoxia.

Finally, the researchers investigated whether these physical responses to HBOT affected the behavioural performance of the mice. They found that HBOT improved the animals’ nest construction abilities and exploratory behaviour, compared with control mice. In addition, control mice showed decreased spatial recognition and contextual memory performance over time, which was not seen in treated animals.

“We have discovered for the first time that HBOT induces degradation and clearance of pre-existing amyloid plaques (treatment) and stops the appearance of newly formed plaques (prevention),” explains corresponding author Uri Ashery in a press statement.

Clinical study

Motivated by these findings, Ashery, co-author Shai Efrati and colleagues treated six elderly patients with significant memory loss with 60 HBOT sessions over three months. Each session included breathing 100% oxygen via a mask at twice atmospheric pressure for 90 min, with 5 min breaks every 20 min. High-resolution MR imaging before and after HBOT revealed that this treatment improved cerebral blood flow in several brain areas, with increases of 16–23%.

The team employed computerized cognitive tests to evaluate the subjects’ cognitive functions before and after HBOT. At baseline, patients attained a mean global cognitive score of 102.4±7.3 (a score of 100 represents the average in their age group), and a lower mean memory score of 86.6±9.2. HBOT improved both measures, increasing the global cognitive score to 109.5±5.8 and the mean memory score to 100.9±7.8.

“Elderly patients suffering from significant memory loss at baseline revealed an increase in brain blood flow and improvement in cognitive performance, demonstrating the potency of HBOT to reverse core elements responsible for the development of Alzheimer’s disease,” says Efrati.

The researchers conclude that the mouse model findings – combined with the similar effects observed in patients – suggest that HBOT causes structural changes in blood vessels, which serve to increase cerebral blood flow, reduce brain hypoxia and improve cognitive performance. They note that HBOT holds promise for the prevention of Alzheimer’s disease as it not only addresses the symptoms, but targets the core pathology and biology responsible for the advancement of the disease.

‘Most perfect graphene ever’ grows fold-free on metal foil

A team of researchers in Korea claims to have synthesized the most perfect large-area single-crystal graphene film ever by pinning down the temperature above which unwanted folds naturally develop in the carbon sheet. The new fold-free film will likely be used to make high-performance electronic and photonic devices.

Pure samples of graphene (a sheet of carbon just one atom thick) are usually grown using chemical vapour deposition (CVD). In this technique, a metal substrate is allowed to react under vacuum with volatile carbon-containing precursor chemicals, and a thin layer of pure carbon forms on its surface. While CVD is good for making high-quality graphene sheets with relatively large areas (a few centimetres on a side), the resulting samples generally contain some imperfections. These can include grain boundaries, regions with additional layers (adlayers), wrinkles and folds, all of which can degrade the material’s performance.

Compressive stress is the culprit

For scientists in search of ultrapure graphene, wrinkles and folds have proven especially tough to eliminate. These imperfections arise from the compressive stress created by the thermal contraction of the metal substrate on which the graphene is grown. As the materials cool down after the high-temperature CVD process is complete, this stress is partially released, and the resulting “de-adhesion” process produces wrinkles and folds in certain regions of the graphene.

Previously, researchers led by Rod Ruoff from the Center for Multidimensional Carbon Materials (CMCM) at the Institute for Basic Science within the Ulsan National Institute of Science and Technology (UNIST) in Korea discovered folds even in adlayer-free graphene grown on single crystalline copper metal foils. Their measurements showed that these folds are three-layered structures in the graphene sheet that vary in width from ten to hundreds of nanometres. They form parallel to each other at separations of about 50 to 100 microns, and sometimes run for centimetres in length.

While some researchers have reported obtaining graphene without folds – by, for example, using metal film substrates with a lower coefficient of thermal expansion – no group had achieved fold-free graphene films on metal foil substrates. The distinction between foil and film is important because metal foils have many advantages over conventional metal films: they are cheaper, easier to scale up to larger sizes, and many foils can be stacked in parallel in a single growth run.

Restricting the growth temperature

Ruoff and colleagues now say that they have produced large areas of single-crystal, fold-free monolayer graphene films on a metal foil substrate by restricting the temperature at which the material is grown to between 1000 K and 1030 K. To do this, they performed a series of experiments in which they used a mixture of ethylene with hydrogen in a stream of argon gas as the precursor to grow graphene on copper-nickel (Cu-Ni) foils developed in their laboratory. By repeating this process at different temperatures (including 1020 K), they determined the temperature at which folds form.

The researchers then used a technique called low-energy electron diffraction (LEED) to show that their fold-free graphene films form as single crystals over the entire growth substrate because they have a single orientation over a large area. They supplemented their LEED results with several other measurements, including transmission electron microscopy performed by UNIST graduate student Myeonggi Choe, a member of Zonghoon Lee’s group at UNIST, to confirm the presence of large area, single crystal, fold-free graphene.

Single crystal with essentially no imperfections

To test the electronic properties of the new material, the team prepared graphene field-effect transistors (G-FETs) from it and found that the devices boasted very high electron and hole mobilities (around 7 × 10³ cm²/V/s). Importantly, they observed that the G-FETs could be configured in any direction and at any location within the graphene sheet, with the mobilities of all the G-FETs remaining remarkably similar – a feat not achieved with other graphene films to date. “Such remarkable performance is possible because the fold-free graphene film is a single crystal with essentially no imperfections,” says team member and UNIST graduate student Yunqing Li, who was responsible for this part of the study.

The researchers also showed that they could grow their graphene on five foils (measuring 4 × 7 cm) at once, with each foil yielding two identical pieces of high-quality fold-free film on both sides of the foil. The graphene can then be removed, or delaminated, from the foils in about a minute using a technique called electrochemical bubbling transfer and the Cu-Ni foils can quickly be readied for a new growth/transfer cycle, says team member and UNIST graduate student Meihui Wang. Indeed, Wang found that the team’s 80:20 Cu-Ni(111) foils can be reused over and over once the graphene has been removed from them.

Study co-author Da Luo and colleagues say their high-quality material could be used to make advanced devices with improved electronic properties. “The fold-free sheets can likely also be stacked more easily – either with themselves or sheets made from other two-dimensional materials, something that could further the range of potential applications,” they conclude.

The team, which reports its work in Nature, is now undertaking further studies with this graphene. “We expect to report on these experiments after more is learned,” Ruoff tells Physics World.

Life beyond the Nobel: Brian Josephson and his interest in the mind

With a Nobel prize under your belt and unshackled by the need to “prove” yourself, it must be tempting to set off in new directions – to try your hand at topics beyond the area in which you originally made your name. One Nobel-prize-winning physicist who has perhaps veered off the conventional path more than any other is Brian Josephson, who leads the self-styled Mind-Matter Unification Project at the University of Cambridge in the UK. It aims to understand “from the viewpoint of the theoretical physicist, what may loosely be characterized as intelligent processes in nature, associated with brain function or with some other natural process”.

In other words, Josephson, 81, spends his days thinking about how the brain works, investigating topics such as language and consciousness, and pondering the fundamental connections between music and the mind. Most controversially, as far as physicists are concerned, he also carries out speculative research on paranormal phenomena, a field known as parapsychology. Josephson’s interests even touch on homeopathy and cold fusion – two areas in which few physicists would dare to dabble.

Photo of Brian Josephson

But Josephson’s interest in consciousness and the mind is nothing new. In fact, it began long before his Nobel Prize for Physics, which he won in 1973 for work done in the early 1960s during his PhD at the Cavendish Laboratory, Cambridge. Under the supervision of Brian Pippard, Josephson had predicted that a superconducting current can tunnel through an insulating junction, even when there is no voltage across it, with the current oscillating at a well-defined frequency when a voltage is applied (Phys. Lett. 7 251). Such “Josephson junctions” are central to superconducting quantum interference devices (SQUIDs), which can measure magnetic fields with exquisite sensitivity.
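The oscillation Josephson predicted obeys the AC Josephson relation f = 2eV/h: a junction biased at a DC voltage V oscillates at a frequency set only by fundamental constants. This standard result (not spelled out in the article) is easy to evaluate:

```python
# AC Josephson relation: a junction biased at DC voltage V oscillates at f = 2eV/h.
# Constants are the exact SI defining values.

ELEMENTARY_CHARGE = 1.602176634e-19  # C
PLANCK_CONSTANT = 6.62607015e-34     # J s


def josephson_frequency_hz(voltage_v: float) -> float:
    """Oscillation frequency of a Josephson junction at a given bias voltage."""
    return 2.0 * ELEMENTARY_CHARGE * voltage_v / PLANCK_CONSTANT


# A bias of just one microvolt already gives an oscillation of hundreds of MHz
print(f"{josephson_frequency_hz(1e-6) / 1e6:.1f} MHz")
```

This voltage-to-frequency conversion (about 483.6 MHz per microvolt) is what makes Josephson junctions such exquisitely sensitive circuit elements, including in the SQUIDs mentioned above.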

But almost as soon as his PhD was over, Josephson’s attention switched elsewhere. During a year-long postdoc at the University of Illinois at Urbana-Champaign, John Bardeen (still the only person ever to have won two Nobel physics prizes) tried to persuade Josephson to continue his work on superconductivity. Unconvinced, Josephson decided to work instead with Leo Kadanoff on critical phenomena. “But after that I felt that the easier and more interesting aspects of many-body theory had been done and I started trying to understand brain function,” Josephson says.

His interests were piqued further after returning to Cambridge, when Josephson got to know the mathematical geneticist George Owen, who in his spare time investigated poltergeist claims. “He talked to me about parapsychology generally and got me interested in that, particularly as I could see parallels between psi phenomena and quantum mechanics,” Josephson recalls. In 1974, shortly after picking up his Nobel prize in Stockholm, Josephson was invited by Owen to a conference on “psychokinesis” in Toronto, where he saw demonstrations of metal bending. “I did some research on this but have always regarded it as a sideline,” Josephson says.

Still, Josephson’s new line of interest was clearly established. He went on to give a course on “creative intelligence” at the Cavendish and even collaborated with Hermann Hauser – the physicist and technology entrepreneur who helped set up Acorn Computers – on a paper on the logic of developmental processes. As the years went by, Josephson began getting invited to conferences concerned with mind processes. “I then started trying to link concepts such as semiosis with quantum physics. Currently I’m working with a quantum physicist who is developing the maths side,” he says.

Looking back on his career, Josephson thinks he would have moved into studying the mind even if he had never received his Nobel prize. “Having tenure would probably have given me freedom to work on it,” he says. Still, despite the prestige that a Nobel brings, life outside the mainstream has not been easy. “The Nobel prize didn’t stop the department being very hostile,” Josephson claims, citing incidents of potential collaborators being discouraged from working with him and having their promised funding withdrawn.

He has also faced criticism from the likes of geneticist David Winter, who has accused him of suffering from “Nobel disease” – the notion that a Nobel prize gives a scientist who is an expert in one area an “unfounded confidence” to speak on subjects they know nothing about. Winter believes the affliction encourages sufferers to “spout anti-scientific rubbish”, citing the Nobel-prize-winning chemist Linus Pauling who thought that high doses of vitamin C are medicinally useful.

Such comments do not seem to deter Josephson, who believes that, on the contrary, it’s his critics who are in the dark. “It is people such as Winter who speak with unfounded confidence, on subjects they know essentially nothing about such as telepathy, or memory of water,” he insists. “In the latter case, fallacious arguments are frequently used to dismiss the possibility.”

Indeed, Josephson told Physics World he thinks his present work is “much more important than my superconductivity work” even though it has not been finalized. “The point is that recognition that mind is fundamental rather than matter will be as significant a step for physics (and for science generally) as the step from classical to quantum physics,” he says. “A number of people have asserted this in the past of course and the question is when we will reach the ‘tipping point’ when ‘the club’ will start to take note?”

Physics World‘s Nobel prize coverage is supported by Oxford Instruments Nanoscience, a leading supplier of research tools for the development of quantum technologies, advanced materials and nanoscale devices. Visit nanoscience.oxinst.com to find out more.

From order to disorder: NMR insights into ionic conduction in battery materials

The development of next-generation solid-state ion conductors hinges on an understanding of microscopic diffusion mechanisms and the identification of roadblocks along macroscopic diffusion pathways (e.g. intragrain defects and grain boundaries). At the microscopic scale, ion conduction relies on transient short-range interactions between the diffusing and framework ions, and on the connectivity of the diffusion sites, hence, on the local structure and composition.

Two common assumptions in the design of solid electrolytes are that fast ion diffusion requires: 1) the absence of planar defects in the bulk of crystalline inorganic electrolytes; and 2) rapid polymer chain rearrangements (segmental motion) in polymer electrolytes. Yet our recent work on Li- and Na-ion conducting rock salt halide electrolytes and polymeric ionic liquids demonstrates that these rules are relaxed for specific composition–structure combinations: fast ion conduction, with conductivities on the order of mS/cm, can be obtained in rock salt halides containing a significant fraction of planar defects, and in semi-crystalline polymeric ionic liquids.

Using a combination of electrochemical impedance spectroscopy (EIS), solid-state nuclear magnetic resonance (ssNMR), pulsed field gradient NMR (PFG-NMR), NMR relaxometry, and first principles calculations, we provide a multiscale understanding of ion diffusion processes and link these findings to local structure features, crystallinity and materials synthesis/processing conditions.

Raphaële Clément is an assistant professor in the materials department at the University of California, Santa Barbara (UCSB), US. She received her PhD in chemistry in 2016 from the University of Cambridge, UK, working under the supervision of Prof. Clare Grey. Her doctoral work focused on the study of layered sodium transition metal oxide cathodes for Na-ion secondary batteries. She then joined Prof. Gerbrand Ceder’s group at the University of California, Berkeley (UC Berkeley), US, focusing on cation-disordered rock salt oxyfluorides for Li-ion battery applications. She joined the UCSB faculty in 2018. Her primary research focus is the development and implementation of magnetic resonance techniques (experimental and computational) for the study of battery materials and beyond, with a strong emphasis on operando tools. She is an Associate Editor for Battery Energy, a new open-access journal by Wiley.

X-ray guidance keeps cardiac radioablation on target

Image projections and target outlines

With a single radiation treatment, a patient’s irregular heartbeat will be corrected; but there’s a catch. This particular patient inhales unevenly but exhales smoothly, presenting an uncommon challenge to the patient’s care team: as the patient breathes unsteadily, their heart moves irregularly, so healthy heart tissues will receive some unwanted radiation.

Nicholas Hindley, an Australian Fulbright scholar and PhD candidate at the University of Sydney’s ACRF Image X Institute, may have found a ready solution to this problem. He and his collaborators developed an algorithm that tracks the diaphragm – the predominant driver of internal organ motion during a radiation treatment – in real-time using low-energy X-rays. The tracking algorithm would help a patient’s care team irradiate affected regions of the heart and avoid healthy tissue without needing any specialized equipment, just a standard linear accelerator and some code.

Irregular diaphragm motion? No problem

Hindley is driven by the increasing global burden of cardiac arrhythmias – abnormal heart rhythms – and the growing demand for timely, inexpensive and non-invasive treatments like cardiac radioablation.

“There has been a recent flurry of activity in the medical physics community to develop safe and effective ways for treating abnormal electrical activity in the heart using radiotherapy,” said Hindley. “The technology required for real-time tracking of targets and organs-at-risk is available now, without the need for fancy, expensive equipment. We simply need to complement existing hardware with clever software.”

Hindley’s latest work, published in Physics in Medicine & Biology, uses simulation studies to demonstrate that the research team’s diaphragm-tracking algorithm, combined with the imaging capabilities of a standard linear accelerator-based radiotherapy system, can focus radiation to certain regions of the heart while avoiding others, all while a patient breathes.

The algorithm uses images obtained with four-dimensional computed tomography (4D-CT) to determine how far the heart moves relative to the diaphragm. This respiratory motion model is then used to track the diaphragm, and thereby the heart, in real time during treatment. The method accounts for motion of the heart’s substructures by applying a uniform margin before treatment.
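One deliberately simplified form such a motion model could take is a linear mapping from diaphragm displacement to heart displacement, fitted to the planning 4D-CT data and then applied to live diaphragm measurements. The sketch below uses invented numbers and an ordinary least-squares fit; the published algorithm is more sophisticated:

```python
# Simplified respiratory motion model: fit heart displacement as a linear
# function of diaphragm displacement from planning (4D-CT) data, then use the
# fit to infer heart position from real-time diaphragm measurements.
# All numbers are invented for illustration.


def fit_linear(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b


# Hypothetical 4D-CT training data: diaphragm vs heart displacement (mm)
diaphragm_mm = [0.0, 5.0, 10.0, 15.0, 20.0]
heart_mm = [0.0, 2.6, 5.1, 7.4, 10.0]

a, b = fit_linear(diaphragm_mm, heart_mm)


def heart_position(diaphragm_pos_mm: float) -> float:
    """Real-time estimate of heart displacement from a diaphragm measurement."""
    return a * diaphragm_pos_mm + b


print(f"estimated heart displacement: {heart_position(12.0):.2f} mm")
```

The appeal of this kind of surrogate model is that the diaphragm is far easier to see in low-energy X-ray projections than the heart substructures themselves.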

“There are many parameters within this method that may be tweaked, and the use of asymmetric margins is certainly one of them,” Hindley explains. “One could speculate that asymmetric margins may work well in an image-guided regime, since we know that the motion and deformation experienced by cardiac substructures is not uniform across the left–right, superior–inferior and anterior–posterior axes.”

Motion tracking during radioablation

Saving lives, improving quality of life

Hindley notes that their study is proof-of-concept as it uses digital phantoms developed by Paul Segars at Duke University. These digital representations of a patient, created using real patient images, are adjusted using biomechanical models so that researchers like Hindley can test their algorithms. Hindley plans to demonstrate the performance of the team’s code on real patient images using clinical trial data, anthropomorphic phantoms, or some combination of these.

One challenge Hindley foresees before the algorithm is implemented clinically is justifying the additional radiation dose the patient receives with X-ray imaging guidance. But he argues that the benefits outweigh the potential costs.

“This added radiation dose from X-ray imaging is minuscule when compared to the potential reductions in radiation with a more targeted approach,” Hindley says. “We hope tools like ours increase confidence for radiotherapy teams in treating these tricky targets. Over the long-term, we hope to facilitate the widespread adoption of cardiac radioablation, which has the potential to save or improve the quality of millions of lives every year.”

Life beyond the Nobel: how Luis Alvarez deduced the disappearance of the dinosaurs

I don’t remember the first time I heard the theory that the dinosaurs were wiped out by an asteroid crashing into the Earth. It’s a dramatic story that gets told to wide-eyed children in classrooms and natural history museums at an earlier age than many can remember, so it feels more like absorbed knowledge. What is less commonly known, however, is that one of the originators of this proposal was Luis Walter Alvarez, who won the 1968 Nobel Prize for Physics for his work on the hydrogen bubble chamber.

But it wasn’t just dinosaurs and asteroids that Alvarez got excited about. Throughout his long and varied career, Alvarez was also involved in sending particle detectors into the sky in high-altitude balloons and searching for hidden chambers inside ancient Egyptian pyramids. It appears that his innate curiosity and experimental creativity, which were so vital for winning the Nobel prize, also led him to investigate many more questions both within physics and beyond.

Born in San Francisco in 1911, Alvarez studied at the University of Chicago and gained a PhD there too after building a cosmic ray telescope with Arthur Compton. He then moved to the University of California, Berkeley, working with nuclear scientist Ernest Lawrence to obtain the first observational evidence of K-electron-capture – the process by which a proton in a nucleus can absorb an atomic electron, turning into a neutron and emitting a neutrino. He also developed a method to produce beams of very slow neutrons; and, with Felix Bloch, measured the magnetic moment of the neutron.

After war-time military research, including a stint on the Manhattan nuclear-bomb project, Alvarez returned to Berkeley, becoming an expert on particle accelerators. Most importantly, he led the development of the hydrogen bubble chamber in the 1950s, with which his team then discovered many particles and resonance states.

Luis Walter Alvarez portrait

In the years between developing the hydrogen bubble chamber and winning the Nobel prize for it, however, Alvarez started taking his expertise out of the purpose-built lab and into real-world settings. In 1964 he proposed gathering data on high-energy particle interactions by sending lab equipment to high altitudes in balloons. This might sound like a whimsical idea, but it resulted in the High Altitude Particle Physics Experiment, which paved the way for the Cosmic Background Explorer (COBE) satellite.

Perhaps the 1960s brought Alvarez a flurry of out-of-the-lab inspiration, for it was in 1965 that he suggested a new investigation of the Egyptian pyramids. As unexpected a project as this sounds for a physicist, there was a key connection with his previous work; the idea was to place a particle detector underground beneath the pyramids to measure muons – one component of the cosmic rays constantly showering Earth. This is called muon tomography, and it can indicate hollow spaces in a structure through differences in the energies of muons coming from different directions.
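The logic of muon tomography can be sketched in a few lines: a hollow chamber attenuates fewer muons than solid stone, so a direction whose line of sight passes through a void registers an excess of counts over the solid-rock prediction. All numbers below are invented for illustration, and the significance test is a crude Poisson approximation:

```python
# Toy illustration of muon tomography: flag directions where the measured muon
# count significantly exceeds the count expected for solid rock.
# Counts and threshold are invented for illustration.

import math


def excess_significance(measured: int, expected: float) -> float:
    """Approximate Poisson z-score of a count excess over the solid-rock model."""
    return (measured - expected) / math.sqrt(expected)


# (measured counts, expected counts assuming solid stone) per viewing direction
directions = {
    "north-east": (10250, 10000.0),
    "overhead":   (10050, 10000.0),
    "south-west": (11200, 10000.0),  # hypothetical void along this direction
}

for name, (measured, expected) in directions.items():
    z = excess_significance(measured, expected)
    flag = "possible void" if z > 5.0 else "consistent with solid rock"
    print(f"{name}: z = {z:.1f} ({flag})")
```

A real analysis must also model the known geometry of the structure and the angular spectrum of cosmic-ray muons, but the core signal is exactly this directional excess.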

Together with an international team of archaeologists and physicists, Alvarez spent a few years using this technique to search the Pyramid of Khafre – the second largest of the Pyramids of Giza – and the project was in full swing when he was awarded the Nobel prize. His biography published by the Nobel Committee at the time did not mention his archaeological exploits, which was perhaps no bad thing as, when the search concluded the following year, no chambers had been detected despite 19% of the pyramid having been scanned.

This null result might not sound exciting, but it was a meaningful one for archaeologists, and muon tomography has continued to be a useful tool in searching other structures. Speaking to Physics World in 2014, Arturo Menchaca, a physicist who has used muons to study the Pyramid of the Sun in Mexico, recalled once meeting Alvarez and referring to the project at the Pyramid of Khafre as having discovered nothing. “He furiously corrected me,” Menchaca said. “He had demonstrated there was nothing inside the pyramid.”

Since the pyramid project was well under way before Alvarez won the Nobel prize, it can’t have been this prestige that allowed him to pursue such a left-field idea. But perhaps the fact that he had already done such stellar (albeit more conventional) work in physics gave him the freedom and credibility to lead a team on this bold departure from the lab.

So when his son Walter, who was a geologist, told him about the mystery surrounding the dinosaur extinction, it was perhaps no surprise that Alvarez was quick to get involved. He enlisted the help of two nuclear chemists he knew at Berkeley – Frank Asaro and Helen Michel – to study the layer of sediment that represents, within multitudes of geological strata, the point in time when the extinction happened.

Perhaps the fact that he had already done such stellar (albeit more conventional) work in physics gave him the freedom and credibility to lead a team on this bold departure from the lab

The team discovered that the layer is hundreds of times richer in iridium than average, and suggested that an asteroid strike covered the Earth in the element and triggered the mass extinction event. The theory, now known as the Alvarez hypothesis, was hotly debated, and Alvarez defended it ardently right up until his death in 1988.

More evidence has accumulated over the years, not least the discovery of the huge Chicxulub impact crater under the Yucatán Peninsula in Mexico. Although there isn’t a complete consensus among geologists, the “Alvarez hypothesis” is now generally accepted as the most likely explanation for why the dinosaurs disappeared.

It’s hard to compare Alvarez’s eclectic mix of achievements with one another, for the very reason that they are in such disparate areas – he even looked into the assassination of President John F Kennedy. His bubble-chamber work will naturally be what physicists remember him for, but for the public it was his research relating to asteroids and dinosaurs that captured the imagination. How remarkable that his Nobel-prize-winning work isn’t even his most famous achievement.


Physics World‘s Nobel prize coverage is supported by Oxford Instruments Nanoscience, a leading supplier of research tools for the development of quantum technologies, advanced materials and nanoscale devices. Visit nanoscience.oxinst.com to find out more.

New optical transistor uses quasiparticle condensate to switch rapidly

A new optical transistor has been designed by researchers in Russia, Switzerland, and Germany. The team, led by Anton Zasedatelev at Skoltech in Moscow, used a combination of laser beams, an optical cavity, and a specialized organic polymer to trigger sudden switching between two distinct quantum states in their device. The transistor could be a promising step towards advanced optical computers, which have the potential to outperform their electronic counterparts.

The transistors at the heart of modern technology work by switching currents of electrons on and off. As electrons flow through circuits, they dissipate heat. Getting rid of this waste heat is a significant challenge in modern chips containing vast numbers of tightly-packed transistors.

A more energy-efficient avenue involves quantum optics technologies, which replace electrons with photons – particles that dissipate far less heat. In their study, Zasedatelev’s team developed a new concept for an optical transistor. It features an organic semiconducting polymer sandwiched between the highly reflective walls of a light-trapping microcavity. The researchers direct two pulsed laser beams at the material: a bright “pump” laser, and a far weaker “seed” laser, which delivers just a few photons per pulse.

Light–matter hybrids

As the pump pulses bounce repeatedly between the cavity walls, their intensity is boosted by a factor of up to 23,000. This results in strong coupling between the laser photons and the polymer’s organic molecules, generating groups of quasiparticles called exciton-polaritons – quantum particles that are a hybrid of light and matter.
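As a rough sketch of the numbers involved, the resonant intensity buildup in an idealized lossless, impedance-matched two-mirror cavity scales as roughly 1/(1 − R), where R is the mirror reflectivity. The reflectivity below is a hypothetical value chosen to reproduce the reported factor, not one quoted by the team.

```python
def cavity_buildup(reflectivity):
    """Approximate resonant intensity enhancement inside an idealized
    lossless, impedance-matched two-mirror cavity: higher reflectivity
    means more round trips before the light leaks out."""
    return 1.0 / (1.0 - reflectivity)

# A mirror reflectivity of about 99.9957% yields a buildup of roughly
# 23,000, comparable to the enhancement reported for the microcavity.
r = 1.0 - 1.0 / 23000.0
print(f"buildup factor: {cavity_buildup(r):.0f}")  # → 23000
```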

When the seed pulses are switched on, they stimulate the exciton-polaritons to make a sudden transition from a quantum state with the same energy as the pump beam to a Bose-Einstein condensate. The latter is an exotic state of matter comprising many identical exciton-polaritons in a collective ground state.

By measuring the difference in the number of exciton-polaritons in the ground state with and without the seed beam, Zasedatelev and colleagues could reliably detect light at the single-photon level. As a result, their device could rapidly and efficiently switch between two possible logic states – making for an ideal optical transistor.

Faster and less power

Compared with the latest electrical transistors, the device exhibits numerous advantages: it requires 10,000 times less power and operates at room temperature. In addition, it could perform some 1 trillion operations per second – at least 100 times faster than the most advanced electrical transistors available today.

Although the commercial rollout of the technology is still some way off, Zasedatelev’s team expect that further improvements could soon be made by replacing the organic polymer with perovskite crystals, which enhance the coupling between light and matter. This could enable the use of less intense pump lasers, further reducing power consumption. More broadly, the researchers hope that their transistor could become part of a growing toolkit of optical components, suitable for a new generation of vastly superior optical computers.

The research is described in Nature.

UK government sets out plan to ‘unleash’ nation’s space sector

The UK government has released a long-term vision for the country’s burgeoning space sector. The National Space Strategy includes several measures that it says will “unleash” the industry’s potential and brings together – for the first time – the UK government’s civil and defence space activities. Yet while the strategy includes many bold aims, some experts say that it recycles “old ideas”.

The report, released yesterday, notes that the global space economy is projected to grow from an estimated £270bn in 2019 to £490bn by 2030. The space sector in the UK is worth over £16.4bn per year and employs more than 45 000 people. Next year the UK aims to become the first country to launch a rocket into orbit from European soil, and by 2030 it hopes to be a leading provider of commercial small-satellite launches in Europe.
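The projection implies a steady compound annual growth rate, which can be checked directly:

```python
start, end = 270.0, 490.0   # global space economy, £bn, 2019 and projected 2030
years = 2030 - 2019

# Compound annual growth rate implied by the projection
cagr = (end / start) ** (1 / years) - 1
print(f"implied growth: {cagr:.1%} per year")  # → 5.6% per year
```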

The strategy notes that while commercial space stations are being planned and built – and space-tourism operators are taking their first customers into orbit – advances in space are also a threat to information networks. “Space is changing,” the report notes, “the UK must respond”. The strategy therefore brings together science and technology, defence, regulation and diplomacy into what it calls a single “bold national vision”.

Cash will be the deal breaker

Chris Newman

The document includes a package of measures to “unlock growth in the UK space sector”. They include allowing space businesses to access private finance through “space-oriented venture capital funds”; maintaining the UK’s role in the European Space Agency while building new relationships with other countries; as well as collaborating on the NASA-led Artemis programme to return humans to the Moon.

The strategy also commits to the delivery of the UK’s first Defence Space Portfolio, which will see the government investing an additional £1.4bn – above the £5bn already committed to enhance the military’s satellite communications.

Chris Newman, a professor of space law at Northumbria University, claims that the new UK strategy is “more of a restatement of existing ideas rather than a new strategic direction”. He adds, however, that the proposal is “much more integrated in terms of defence and civilian than anything we’ve seen previously”. Newman says that while it is “nice to see mention of things like sustainability and applications of space data to climate change, cash will be the deal breaker.”

Solar power

Alongside the space strategy, the UK government has also released a feasibility study into Space Based Solar Power (SBSP). The technique would involve placing satellites – adorned with photovoltaic panels – into geostationary orbit. The solar energy captured by the panels would then be beamed to a fixed point on Earth via radio waves.
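A geostationary orbit is chosen because its period matches Earth’s rotation, so a satellite stays fixed over one point on the ground – the receiving station. The required altitude follows from Kepler’s third law:

```python
import math

GM = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
T = 86164.1           # sidereal day (one Earth rotation), s
R_EARTH = 6.378e6     # Earth's equatorial radius, m

# Kepler's third law: T^2 = 4*pi^2 * r^3 / GM
# =>  r = (GM * T^2 / (4*pi^2))^(1/3)
r = (GM * T**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (r - R_EARTH) / 1e3
print(f"geostationary altitude: {altitude_km:.0f} km")  # → 35786 km
```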

The feasibility study, which examined two SBSP concepts – the US-led SPS Alpha and the UK-led CASSIOPeiA – states that the engineering challenges for SBSP could be overcome to allow deployment of the technology by the 2050s to meet “Net Zero” emissions pledges.

The report states that a thorough cost and economic analysis should now be undertaken. The UK government has also stated that future funding will be made available for SBSP technologies through the £1bn Net Zero Innovation Portfolio.

X-ray dark-field imaging may help diagnose lung disease

© AuntMinnieEurope.com

X-ray dark-field chest imaging – a new technique touted as the most significant advance in standard chest X-ray in 100 years – has shown for the first time that it may help diagnose lung disease in humans, according to a study published in Radiology.

German researchers designed and built a prototype and tested the system in healthy patients. They confirmed X-ray dark-field chest imaging picks up signals in the lungs that are undetected in standard chest X-rays and established its qualitative and quantitative characteristics – a crucial step toward evaluating the new system in future trials.

“These findings prove that the dark-field signal is indeed sensitive to the subject’s lung condition alone and is independent from demographic factors, highlighting its potential value for diagnosis and monitoring of respiratory diseases,” write a team led by first author Florian Gassert of the Technical University of Munich.

The researchers first investigated X-ray dark-field chest imaging in 2008. In contrast to attenuation-based conventional radiography, dark-field imaging harnesses the wave properties of X-rays. Specifically, the system detects the signal of ultrasmall-angle scattering that takes place in water-to-air transitions in the alveolar structure of the lungs.

In attenuation-based radiography, dense structures generate a high signal, while in dark-field imaging the small-angle scattering in lung tissue generates a high signal. Because dark-field X-ray imaging excludes unscattered photons, the space around the lungs – where there is no material to scatter photons – appears dark.

Since they developed the system, the researchers have conducted a number of animal studies that show the signal in X-ray dark-field chest imaging decreases in lung diseases that interfere with alveolar structure, such as emphysema, fibrosis, lung cancer and ventilation-induced lung damage.

In the current study, they sought to describe the qualitative and quantitative characteristics of X-ray dark-field images for the first time in humans.

Attenuation-based and dark-field radiographs

Between October 2018 and January 2020, the researchers enrolled 40 healthy patients who underwent chest CT as part of their diagnostic workup. Inclusion criteria were a normal chest CT scan, the ability to consent and the ability to stand upright without help.

Importantly, the researchers developed a prototype that acquired both attenuation-based and dark-field chest radiographs simultaneously. Each patient’s total dark-field signal was correlated with his or her lung volume, and the dark-field coefficient was correlated with age, sex, weight and height.

The researchers found that normal human lungs produce a high signal on dark-field chest X-ray imaging, while the surrounding osseous structures produce a low signal and soft tissue produces none. The average total dark-field signal intensity over all participants for the entire lung was (17 ± 4) × 10⁻³ m², and the total signal demonstrated a positive correlation with lung volume.

In addition, no differences were found between men and women, and age, weight and height did not influence the dark-field signal.

“Because of the nature of signal generation in dark-field imaging, bone structures and soft tissue generate only a minimal dark-field signal compared with lung tissue. This feature allows for a detailed depiction of lung tissue without impairment by surrounding structures. We found that the quantitative X-ray dark-field coefficient based on the total dark-field signal and lung size is independent from the subject’s characteristics,” the researchers write.

The team noted limitations, namely that only healthy subjects were included. However, future studies will assess abnormal changes in the lung tissue, they conclude.

In an accompanying editorial, Hiroto Hatabu and Bruno Madore of Harvard Medical School note that dark-field X-ray is still in its infancy – even results in animal models are still very preliminary – but that the work is an important step toward establishing normative values for dark-field X-ray chest imaging in humans.

Chest imaging has come a long way since it first emerged as a clinical test after the discovery of X-rays by Wilhelm Röntgen, yet throughout it all, X-ray imaging has continued to exploit only the particle aspect of X-ray photons rather than their wave aspects, they say.

“We welcome the addition of the dark-field approach to the universe of chest radiography methods available for human imaging and diagnoses,” Hatabu and Madore write.

  • This article was originally published on AuntMinnieEurope.com ©2021 by AuntMinnieEurope.com. Any copying, republication or redistribution of AuntMinnieEurope.com content is expressly prohibited without the prior written consent of AuntMinnieEurope.com.
Copyright © 2026 by IOP Publishing Ltd and individual contributors