
Seeking cosmic particles using a super-pressure balloon, the physics of babies

This episode of the Physics World Weekly podcast features an interview with Angela Olinto, who is principal investigator of the EUSO-SPB2 mission. EUSO stands for Extreme Universe Space Observatory and SPB refers to a super-pressure balloon, which will soon be hoisting the experiment to an altitude of 33 km. There it will spend about 100 days detecting neutrinos and ultra-high-energy cosmic rays.

Olinto, who is based at the University of Chicago, talks about the challenges of operating a particle-detection system floating high above Earth and what the EUSO-SPB2 collaboration hopes to observe.

Also in this episode, Physics World’s Michael Banks talks about his new book, The Secret Science of Baby, which charts the first 1000 days of human development starting at conception. Banks talks about the physics of three key phenomena related to the creation and development of a child – the swimming of sperm; the operation of the placenta; and the development of speech. These are also described in a feature article by Banks that appears in Physics World.

Defect suppression enables continuous-wave deep-UV lasing at room temperature

Researchers have succeeded in making the first room-temperature continuous-wave deep-UV laser diode, using wide-bandgap semiconductor materials. The device could find applications in novel sterilization systems and for more precise laser processing.

Invented in the 1960s, laser diodes today operate at wavelengths ranging from the infrared to the blue-violet, with applications ranging from optical communications to Blu-ray discs. Until now, however, they have not worked in the deep-UV part of the electromagnetic spectrum.

A team led by Nobel laureate Hiroshi Amano at Nagoya University’s Institute of Materials and Systems for Sustainability (IMaSS) began developing deep-UV laser diodes back in 2017, thanks to a collaboration with Asahi Kasei, the company that made the first 2-inch aluminium nitride substrates. These materials are ideal for growing aluminium gallium nitride (AlGaN) films for UV-light-emitting devices.

The first devices that the team made required input powers of 5.2 W. This was too high for continuous-wave operation because the diode quickly overheated and stopped lasing.

In their new work, Amano and colleagues overcame this problem. By improving the design of the device structure, they could suppress the heat generated during operation. In particular, they eliminated the crystal defects that occur at the laser stripe in the AlGaN and deteriorate the paths through which current propagates. They achieved this by tailoring the side walls of the laser stripe such that current was able to efficiently flow to the active region of the laser diode. In this way, they could reduce the required operating power for 274 nm laser diodes to just 1.1 W at room temperature.

“Compared to conventional deep-ultraviolet lasers, our laser is more compact and can achieve higher efficiency,” says Amano. “The device could be employed in practical applications in healthcare, including virus detection. More broadly, it could be used to detect particulates, in gas analysis and high-definition laser processing.”

“Its application to sterilization technology could be ground-breaking,” adds team member Zhang Ziyi. “Unlike the current LED sterilization methods, which are time-inefficient, lasers can disinfect large areas in a short time and over long distances.”

The Nagoya team now plans to improve the operating characteristics of their laser diode for practical use. “We also hope to realize a shorter-wavelength laser diode,” Amano tells Physics World.

The study is detailed in Applied Physics Letters.

Einstein as you’ve never seen him before

In November 1922, just over a century ago, Albert Einstein received the Nobel Prize for Physics. That much is clear. What you might not know is that the prize was officially awarded for the year 1921. The delay was due, in part, to a controversy in the physics community over Einstein’s general theory of relativity and, to a lesser extent, antisemitism among certain physicists – including the Nobel laureate Philipp Lenard, who would later support the Nazis.

Indeed, relativity went unmentioned in the citation from the Royal Swedish Academy of Sciences. It referred instead to Einstein’s “services to theoretical physics and in particular…his discovery of the law of the photoelectric effect”, which was his key contribution to quantum theory. Not that Einstein was concerned: when he gave his Nobel-prize lecture in 1923, he ignored the citation and talked about relativity, not quantum theory.


I mention the Nobel-prize episode because it is characteristic of Einstein’s unique work and personality. As he remarked in 1930: “To punish me for my contempt for authority, fate made me an authority myself.” I was therefore delighted to see a handwritten version of this immortal aphorism, with his signature, appearing in solitary splendour on the final page of Einstein: the Man and His Mind by Gary Berger and Michael DiRuggiero.

Published to mark the prize’s centenary, the book is a lavishly produced and highly appealing collection of photographs, most of which are signed by Einstein and others. Arranged chronologically, they are complemented by images of his letters, scientific manuscripts and publications. These are all rounded off with quotations from Einstein and a lively commentary and captions by Berger and DiRuggiero.

The earliest known signed photograph comes from 1896, taken to commemorate Einstein’s graduation from high school in Switzerland. Even as a teenager, he had frizzy hair with hints of its legendary adult unruliness. The book ends with what is believed to be the last photograph he signed, at home in Princeton not long before his death, aged 76, in 1955.

A physician, rather than a physicist, Berger is a retired surgeon who has written 170 medical articles and a dozen books about reproductive medicine. He first became fascinated with Einstein about 20 years ago while working in Chapel Hill, North Carolina, where he started collecting portraits of the great man, some signed by him.

With the encouragement of DiRuggiero, owner of the Manhattan Rare Book Company, Berger then began to acquire documents relating to Einstein’s scientific life. Eventually he invited DiRuggiero to help curate this material. Located in Chapel Hill as the Berger Collection, it is now probably the largest archive of Einstein imagery in private hands.

“Not being a physicist, I could appreciate his pictures, if not the complex mathematics in his writings,” writes Berger in his preface. “The photos gave me the feeling of a personal connection to Albert Einstein – the real, living man – almost as if I knew him.” It is a view shared by DiRuggiero, who describes in the epilogue how he would often find himself staring at a photo of Einstein and wondering what makes his image so powerful.

“Whether he is trying to solve a difficult scientific problem, full of despair contemplating the fate of the world, or lightheartedly playing with children, Einstein seems to be communicating his emotions directly to us,” DiRuggiero writes. “Somehow, in looking at these photographs we feel we know him, that if he walked into the room right now we could talk to him and understand each other. This is extraordinary, considering we are contemplating someone who explored realms of thought inaccessible to nearly all of us.”

Albert Einstein with his children

The book contains a foreword by the physicist Hanoch Gutfreund, academic head of the Albert Einstein Archives at the Hebrew University of Jerusalem, who feels the photos from the second half of Einstein’s life “evoke an image of a friendly non-conformist”. He thanks Berger and DiRuggiero for generously donating all royalties from the book to the extraordinary Jerusalem collection. It deserves to succeed.

It is not the first book of this kind, though: a similar large-format title was published by Ze’ev Rosenkranz and Barbara Wolff of the Einstein Archives in 2007. Entitled Albert Einstein: the Persistent Illusion of Transience, that book was both wonderfully illustrated and authoritative. Einstein: the Man and His Mind, however, is stronger in its illustrations than in its text.

Astonishingly, it mentions neither Cambridge nor Oxford universities. Cambridge was the scientific home not only of Isaac Newton (an inspiration for the young Einstein) but also of Arthur Eddington, who led the team that provided the astronomical proof of general relativity in 1919 and who was friendly with Einstein. Oxford, meanwhile, was the scientific home of the physicist Frederick Lindemann, who hosted Einstein in the city in 1931, 1932 and 1933, on the final occasion as a refugee from Nazism.

In fact, the book contains no reference to Britain’s vital role in rescuing Einstein from likely assassination by Nazi agents in Belgium in 1933. Nor is there any mention of Abraham Flexner, who founded the Institute for Advanced Study in Princeton, where Einstein settled in 1933. Other omissions include Einstein’s first wife, Mileva Marić (mother of his two children); philosopher Bertrand Russell (who organized the crucial Russell–Einstein Manifesto against nuclear weapons in 1955); and Mahatma Gandhi (whom Einstein the pacifist called “the greatest political genius of our time”).

Another person not mentioned in the book is Wolfgang Amadeus Mozart, whom Einstein (a violinist himself) regarded as his favourite composer. “Mozart’s music is so pure and beautiful,” Einstein once said, “that I see it as a reflection of the inner beauty of the universe.” It is a shame that Mozart is missing because Einstein also sought that beauty – not through music, but through mathematics, reasoning and pure thought.

  • 2022 Damiani 209pp £60.00hb

THETIS phantom detects image distortions to support MR-based treatment planning

The THETIS 3D MR Distortion Phantom helps medical physicists to quantify, as well as track over time, potential distortions that can arise in MR images used for radiotherapy treatment planning. Developed by laser and radiotherapy QA specialist LAP, the THETIS phantom enables the multidisciplinary care team to deploy MRI systems safely in a radiotherapy context and, in so doing, maximize clinical effectiveness through the precision targeting of diseased tissue.

In the treatment suite, MRI delivers multiple clinical upsides, not least its superior soft-tissue contrast (versus CT) and the ability to visualize a matrix of functional information – including diffusion processes, blood volume and oxygenation, and localized metabolic activity within tumour sites. Equally compelling is the fact that MRI interrogates the patient using non-ionizing radio waves – a major plus when treating children and in cases where serial imaging scans are needed to track tumour response through multiple radiation fractions.

“Because of those upsides, the adoption of MRI and MR-Linac systems in radiation therapy has grown massively in recent years,” explains Torsten Hartmann, director of product management for healthcare at LAP. “We developed THETIS to enable the radiation oncology team to generate MR images of the highest geometrical accuracy – detecting possible distortions of the MR images reliably and quickly.”

Such potential distortions have their origins in tiny perturbations to the uniformity of the MRI scanner’s magnetic field and the field gradients used to image the patient. By extension, when the MRI scanner’s imaging sequences are not optimized, problems can occur downstream and introduce errors in the patient’s treatment plan. “THETIS makes it easy to determine where distortions are affecting the image and whether the scanner’s magnetic field has changed over a period of time,” Hartmann adds.

Enabling independent QA

In terms of specifics, the THETIS phantom exploits a square grid of embedded silicone markers, each of which provides a strong, localized MR signal (with 258 signal sources per measurement plate). The phantom – which is aligned to the isocentre or the image centre of the MRI scanner using LAP’s MRI laser systems and its integrated levelling aids – can detect residual image distortions from gradient nonlinearities or main-magnet inhomogeneities to ensure they are within acceptable limits. In this way, the silicone markers help the medical physicist to visualize the loss of geometric fidelity with distance from the magnet isocentre, preventing a potentially inaccurate view of organs located in the outer areas of the MR image.
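LAP does not publish the phantom’s analysis algorithm, but the underlying idea – comparing detected marker positions against the known grid geometry – can be sketched as follows. The grid spacing and marker coordinates below are hypothetical, chosen only for illustration:

```python
import math

def distortion_map(nominal, measured):
    """Displacement vector and its magnitude for each detected marker,
    comparing measured positions against the known grid geometry.
    nominal, measured: lists of (x, y) positions in mm."""
    vectors = []
    for (xn, yn), (xm, ym) in zip(nominal, measured):
        dx, dy = xm - xn, ym - yn
        vectors.append(((dx, dy), math.hypot(dx, dy)))
    return vectors

# Hypothetical 25 mm grid with one marker shifted by 1 mm in x
nominal = [(0.0, 0.0), (25.0, 0.0), (0.0, 25.0)]
measured = [(0.0, 0.0), (26.0, 0.0), (0.0, 25.0)]
for vec, mag in distortion_map(nominal, measured):
    print(vec, round(mag, 2))
```

In practice the marker centroids would come from an automated search of the MR image, and the resulting map would be compared against the tolerance limits of the treatment-planning protocol.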

Torsten Hartmann

“Of course, all MRI system vendors provide methods and algorithms for distortion correction,” notes Hartmann. “While that’s as it should be, a large-field phantom like THETIS underpins those all-important independent QA checks to ensure safe deployment of MRI systems in the radiotherapy setting.” Those QA checks begin with the commissioning of a new MRI scanner and characterization of the machine’s baseline imaging performance versus manufacturer specifications. Equally important, THETIS offers streamlined workflows when it comes to systematic QA of MR image distortions over the lifetime of the MRI scanner – ensuring, for example, the geometric fidelity of MR images after major hardware and software upgrades to the imaging system.

“Clinical teams need a granular view of how such upgrades affect MR image quality,” adds Hartmann. “At the same time, THETIS supports the regular QA monitoring of MR image quality – for example, as part of the monthly or quarterly checks of image distortion and how it changes over time.”

Collaborative innovation

Operationally, the clinical and commercial release of the THETIS phantom is the outcome of an R&D collaboration between the LAP product development team and the MRI technology division at Siemens Healthineers, Germany. The latter is increasingly supplying dedicated MRI systems to the radiotherapy clinic and, as such, wants to offer a reliable and affordable image distortion phantom tailor-made for its MRI equipment portfolio.

“We moved quickly from prototyping and evaluation into product development and construction – just six months in all before we entered beta-testing,” explains Hartmann. Geographical proximity certainly helped to streamline the product innovation cycle, with LAP’s Nuremberg manufacturing facility just 20 km or so from the Siemens Healthineers MRI technology hub in Erlangen. “Key to successful delivery was being able to jointly test, iterate and optimize the THETIS phantom with our colleagues in the MRI R&D team at Siemens Healthineers,” Hartmann adds.

It is worth noting, though, that the commercial phantom is vendor-agnostic: it is optimized for Siemens Healthineers’ MRI systems but also compatible with a range of open-bore scanners from other manufacturers. In the radiotherapy clinic, meanwhile, the phantom is equally versatile. According to Hartmann, THETIS is compatible with emerging MR-only treatment planning workflows as well as the established standard-of-care in which a fused CT-MRI dataset provides the MR information needed to outline the tumour volume and organs at risk, while the CT is used for dose calculation.

Further reading

Carri K Glide-Hurst et al. 2021 AAPM Task Group 284 report. Magnetic resonance imaging simulation in radiotherapy: considerations for clinical implementation, optimization, and quality assurance Med. Phys. 48 (7) e636

THETIS in brief

The THETIS 3D MR Distortion Phantom is designed for QA of MR images in radiation therapy and diagnostic settings. Key features include:

  • Modularity: expansion stages allow optimal adaptation to different system and workflow requirements.
  • MR-safety: the phantom contains no ferromagnetic components and, as such, is ideally suited to the MRI environment.
  • Easy handling: integrated levelling aids streamline alignment, while intuitive handling of the phantom simplifies 3D examination of the entire MRI space.
  • Figures of merit: 10 plates (maximum expansion stage); three extension modules; 3 T MRI-tested; 258 signal sources per measurement plate.

Telescope with large-aperture metalens images the Moon

Telescope made with a metalens

An important step towards the practical use of optical metasurfaces has been taken by researchers in the US. The team used a common semiconductor manufacturing process to produce a large-aperture, flat metalens. Its optical performance was demonstrated by using it as the objective lens in a simple telescope that was aimed at the Moon. The telescope achieved record resolving power for this type of lens and produced clear images of the lunar surface.

Telescopes have been used to peer out into the universe for more than 400 years. In the early 1600s, Galileo Galilei used a telescope to observe the moons of Jupiter, and last year the James Webb Space Telescope began taking spectacular images of the cosmos.

The telescopes used today by professional astronomers tend to be large and bulky, which often puts limits on how and where they can be used. The size of these instruments is a result of their large apertures and often-complicated multi-element optical systems that are necessary to eliminate aberrations and to provide the desired high performance.

Engineered nanostructures

Optical metasurfaces offer a potential way to make telescopes and other optical systems smaller and simpler. These are engineered nanostructures that can be thought of as a series of artificial optical antennas (see figure). These antennas can manipulate light, changing, for example, its amplitude, phase, and polarization.

These metasurfaces can be engineered to focus light, thereby creating metalenses that can offer significant advantages over conventional optics. For example, the flat surfaces of metalenses are free of spherical aberrations and metalenses are ultrathin and low in weight when compared to conventional optics.

However, the production of metalenses is still in its infancy. Current fabrication methods are based on scanning systems such as electron-beam (e-beam) lithography and focused ion beam (FIB) techniques. These are slow, expensive, and restrict the size of metalenses to just a few millimetres. This makes large-volume production almost impossible and means that metalenses are currently expensive and too small for large-aperture applications such as telescopes.

A meta-telescope

Now, researchers at Pennsylvania State University and the NASA-Goddard Space Flight Center have come up with a much better way of making metalenses. Their process can be scaled up for large-scale production and can be used to create metalenses with large aperture sizes that are suitable for telescope applications.

The team used deep-ultraviolet (DUV) lithography, a technique commonly used in the semiconductor industry, to pattern the top of a four-inch silica wafer. Their 80-mm-diameter metalens was divided into 16 parts, which were produced by exposing the same patterns on different quadrants of the wafer. This pattern stitching, combined with wafer rotation, eliminated the need for an expensive single large mask to expose the entire surface.

Intensity profile

The performance of the metalens was characterized by measuring the intensity profile of focused laser beams over a broad wavelength range spanning 1200–1600 nm. The tests showed that the metalens can tightly focus light close to the diffraction limit over the entire range, despite being designed to operate at 1450 nm. However, diffractive dispersion did vary the focal length throughout the wavelength range – a detrimental effect called chromatic aberration.
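The focal shift described here follows the first-order behaviour of any diffractive optic: with a fixed phase profile, the focal length scales roughly as the inverse of the wavelength. A minimal sketch of that scaling is below; note the 100 mm design focal length is a made-up value for illustration, not a figure from the paper:

```python
def diffractive_focal_length(wavelength_nm, f_design_mm=100.0,
                             lambda_design_nm=1450.0):
    """Approximate focal length of a diffractive lens or metalens with a
    fixed phase profile: f scales as 1/wavelength (first-order model)."""
    return f_design_mm * lambda_design_nm / wavelength_nm

# Focal shift across the tested 1200-1600 nm band
for wl in (1200, 1450, 1600):
    print(wl, "nm ->", round(diffractive_focal_length(wl), 1), "mm")
```

Under this simple model the focus drifts by tens of millimetres across the tested band, which is the chromatic aberration the team measured.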

The resolving power of the metalens was tested by using it as an objective lens inside a telescope. The team used the telescope to successfully image various features of the Moon’s surface with a minimum resolving feature size of approximately 80 km. This is the best reported resolving power for this type of metalens so far.
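As a sanity check on these numbers, the ideal diffraction-limited feature size on the Moon for an 80 mm aperture at the 1450 nm design wavelength can be estimated from the Rayleigh criterion. The result, roughly 8.5 km, shows the reported 80 km sits about an order of magnitude above the theoretical limit, as one might expect for a single-element diffractive lens observed through the atmosphere:

```python
def rayleigh_feature_size(wavelength_m, aperture_m, distance_m):
    """Smallest resolvable feature at a given distance, using the
    Rayleigh criterion theta = 1.22 * lambda / D for a circular aperture."""
    theta = 1.22 * wavelength_m / aperture_m  # angular resolution, radians
    return theta * distance_m

# 80 mm aperture, 1450 nm light, mean Earth-Moon distance ~384,400 km
size_m = rayleigh_feature_size(1.45e-6, 0.08, 3.844e8)
print(round(size_m / 1000, 1), "km")  # ~8.5 km ideal limit
```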

Next-generation systems

Lead researcher Xingjie Ni at Pennsylvania State University believes that metasurfaces can be a game changer in optics, because their unprecedented capability for light manipulation makes them powerful candidates for next-generation optical systems. This, he says, is why his team is dedicated to advancing the capabilities of scalable, fabrication-friendly metasurfaces.

“We plan to improve our design techniques to achieve fabrication-imperfection-tolerant nanostructures. This will allow us to use high-volume manufacturing technology such as photolithography to make large scale metalenses working in the visible range and incorporate more complex nanoantenna designs, for example, freeform shaped nanoantennas, to compensate for chromatic aberration,” he tells Physics World.

Din Ping Tsai at the City University of Hong Kong was not involved in the research and he thinks that this work expands the working scenarios of metalenses and will inspire research on metalenses with large apertures. He says that DUV lithography could be used to achieve the high throughput manufacturing of low cost metalenses with reasonable resolution. This would bring the components into commercialization and make them part of our daily life in the coming years.

Tsai believes that the chromatic aberration in the Penn State metalens limits its use to monochromatic applications. He also points out that the design of large-area broadband achromatic metalenses remains a major challenge, and such lenses are in strong demand. In addition, he believes that a large mask is the preferred way to make metalenses, since it avoids stitching errors and simplifies the fabrication process.

The research is described in Nano Letters.

MRI guidance reduces side effects of prostate cancer radiotherapy

Stereotactic body radiotherapy (SBRT) is an established treatment for prostate cancer. It involves delivering large daily doses of precisely targeted radiation in five or fewer fractions, traditionally using either planar X-ray or cone-beam CT images to guide the radiation delivery.

The prostate is a highly mobile target and it’s essential to account for its motion during irradiation to maximize treatment effectiveness. This is typically achieved by creating a planning target volume (PTV) that includes a margin around the prostate to ensure adequate target dosing. However, the high-dose regions of the PTV often overlap portions of the bladder, rectum and other nearby structures, which can cause side effects such as urinary, bowel and sexual dysfunction.

The recent introduction of MRI-guided linacs could help minimize the risk of such toxicities. MRI-linacs offer high soft-tissue contrast and the ability to track intra-fraction prostate motion directly (rather than relying on fiducial markers) and control the beam in real time during treatment. These advantages should enable the use of significantly smaller margins around the prostate. To date, however, the theoretical advantages of MRI-guided radiotherapy for prostate SBRT have not been demonstrated in a randomized clinical trial.

The MIRAGE (MRI-guided stereotactic body radiotherapy for prostate cancer) trial aims to address this shortfall and determine whether MRI-guided radiotherapy offers an evident benefit for patients. The phase III randomized clinical trial, led by Amar Kishan and Michael Steinberg at the University of California, Los Angeles (UCLA), enrolled men receiving SBRT for localized prostate cancer. Between May 2020 and October 2021, the trial randomized 156 patients to receive SBRT with either CT guidance (77 patients) or MRI guidance using the MRIdian system (79 patients).

Patients were treated with 40 Gy in five fractions, using planning margins of 4 mm in the CT arm and 2 mm in the MRI arm. The researchers note that this 2 mm margin is narrower than used in any previous large study. They hoped to show that this aggressive margin reduction could reduce toxic effects following SBRT.

“MRI guidance offers several advantages over standard CT guidance, most notably the ability to dramatically reduce planning margins, providing more focused treatment with less injury to nearby normal tissues and organs,” says Kishan in a press statement. “MRI technology is more costly than CT, both in terms of upfront equipment expenses and longer treatment times, which is one reason our study set out to determine if MRI-guided technology offers tangible benefits for patients.”

Improved outcomes

The results of the trial, described in JAMA Oncology, revealed that MRI guidance led to fewer toxicities and better quality-of-life, as judged by both patients and the doctors treating them.

In 154 patients available for follow-up, the incidence of acute grade 2 or greater genitourinary (GU) toxic effects was significantly lower following MRI- than CT-guided SBRT: 24.4% in the MRI group versus 43.4% in the CT group. Patients in the MRI group also had fewer acute grade 2 or greater gastrointestinal toxic effects: 0.0% versus 10.5%, respectively. In a multivariate analysis accounting for all candidate variables, the MRI-guided arm remained associated with a 60% reduction in odds of grade 2 or greater GU toxicity.
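The 60% figure quoted above comes from a multivariate model, but a back-of-the-envelope calculation from the raw, unadjusted toxicity rates lands close to it:

```python
def odds(p):
    """Convert a probability to odds."""
    return p / (1.0 - p)

# Acute grade >= 2 GU toxicity rates reported in the trial
odds_ratio = odds(0.244) / odds(0.434)   # MRI arm vs CT arm
reduction = 1.0 - odds_ratio
print(round(odds_ratio, 2), "->", round(reduction * 100), "% lower odds")
```

The unadjusted odds ratio works out at about 0.42, i.e. roughly a 58% reduction in odds, consistent with the 60% reported after adjusting for candidate variables.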

After 100 patients reached 90 or more days post-treatment, the researchers conducted an interim analysis. At this time, the incidence of acute grade 2 or greater GU toxic effects was significantly reduced in men receiving MRI-guided SBRT compared with those receiving CT-guided SBRT (11 of 49 versus 24 of 51). They re-estimated the required sample size as 154 patients and, as 156 patients had already been treated, closed the trial to further accrual.

“This is the first large scale SBRT trial to use a dose of 40 Gy to the PTV, which we felt was an appropriate dose given the anticipated risk level of the cohort we would be treating. Because dose is associated closely with toxicity, we knew beforehand that the estimates of toxicity we used to power the trial might be underestimates,” Kishan tells Physics World. “Thus, we stipulated an interim analysis should occur after 100 patients were eligible for analysis in order for us to formally re-evaluate the power considerations for the trial.”

One unique aspect of the study was its inclusion of patient-reported outcomes. Significantly fewer patients receiving MRI-guided SBRT experienced large increases in urinary symptoms. Similarly, far more patients experienced a clinically notable decrease in bowel-related quality-of-life with CT guidance.

The researchers note that longer term follow-up is necessary to determine whether these benefits persist, whether differences in late urinary or bowel toxic effects occur and to evaluate differences in sexual outcomes. They plan to continue to monitor toxicity outcomes and perform an analysis of 2-year patient-reported outcomes.

“MRI-guided radiation has apparent theoretical benefits in this treatment scenario, and it was important to conduct a rigorous comparison,” says Steinberg. “Given the significance of the outcomes realized, we’ve evolved our prostate cancer treatment approach at UCLA to preferentially utilize MRI-guided SBRT.”

The surprising physics of babies: how we’re improving our understanding of human reproduction

Diverse group of ten babies playing

Becoming a parent or carer for the first time is a joyous, if fairly loud, occasion. When a baby enters the world covered in bodily fluids, they inflate their lungs to take a breath and let out an ear-piercing cry. It’s the first sign for bleary-eyed expectant parents that their life will never quite be the same – they will soon get to grips with constant feeds, dirty nappies and, of course, a lack of sleep. Part of the challenge for new parents is dealing with the many changes that lie ahead, not only in their own lives but also in that of the newborn, as babies develop rapidly over the coming days, months and years.

“The first thousand days” is a term commonly used by paediatricians to describe the period from conception to the child’s second birthday – a time during which many critical developments occur, right from the very moment of conception as the embryo, and then fetus, undergoes rapid daily transformations. Some nine months later, at birth, the infant’s reliance on the placenta to sustain itself in utero comes to an end. The baby must get to grips with breathing on its own and feeding at the breast or from the bottle, while also adapting to its new environment. Months later, development takes on other dimensions as the infant rolls, crawls, stands on unsteady legs and ultimately walks. If that wasn’t enough, there is also the not-so-small matter of communication: learning a language.

Given how crucial the first thousand days are, many aspects of conception, pregnancy and babyhood remain woefully understudied

It is easy to take any of these individual milestones for granted – and many parents do, through no fault of their own. After all, infants are seemingly built to take on these challenges. But considering how crucial these two-and-a-half years are, many aspects of conception, pregnancy and babyhood remain woefully understudied. Pregnancy, for example, has commonly been seen as something to be endured rather than investigated. Research on the properties and workings of the placenta, uterus and cervix lags decades behind that of other organs such as the heart, lungs and brain. One reason is the ethical complexity of studying pregnant women and newborn babies; another is that research into women’s healthcare has long been marginalized, and often overlooks key differences between men and women. Studies must be carefully designed, and various ethical procedures and guidelines need to be adhered to. That will not change; what is different today is that these topics are finally seen as worthy of investigation in the first place – a shift that has also been helped by advances in imaging and theoretical techniques.

While some may think that only biology and neuroscience can shine a light on conception, pregnancy and babyhood, physics too has the tools to provide a fresh perspective on many of these issues. Physics plays a key role in everything from how sperm navigate the complex fluids of the female reproductive system to reach the egg (see “Conception – life begins at low Reynolds number”); to the forces that support the development of the embryo; and how the placenta controls the diffusion of a wide range of solutes to and from the fetus (see “Pregnancy and the placenta: the tree of life”). Physical processes are involved in the way contractions coordinate and travel across the uterus to expel a baby; how a newborn can effortlessly extract milk from the breast; what acoustic properties of babies’ cries make them so hard to ignore; and how toddlers learn grammar so effectively (see “Babyhood – it’s good to talk”).

Today, research into these matters from a physical-science perspective is not only throwing up surprises about what the human body is capable of, but is also highlighting potential treatments – from new methods to monitor fetal movements to innovative ways to help premature babies breathe. Such endeavours are also deepening our appreciation of the processes that life has put in place to propagate itself. And there remains much more to discover.

Conception – life begins at low Reynolds number

“[Sperm] is an animalcule which mostly…swims with its head or front part in my direction. The tail, which, when swimming, it lashes like a snakelike movement, like eels in water.” So wrote the Dutch businessman and scientist Antonie van Leeuwenhoek to the Royal Society in the 1670s concerning his observations of sperm. Using his custom-built microscopes, which were more powerful than anything made before, van Leeuwenhoek was the first to peer into the microscopic realm. His devices, which were about the size of a hand, let him image objects with micrometre resolution, clearly resolving many different types of “animalcule” that reside on or in the body, including sperm.

Human egg and sperm

Despite van Leeuwenhoek’s acute observations, it took hundreds of years to get any firm idea about how sperm could propel themselves through the complex fluids that exist within the female reproductive tract. The first clues came in the late 1880s from the Irish physicist Osborne Reynolds, who worked at Owens College in England (now the University of Manchester). During that time, Reynolds conducted a series of fluid-dynamics experiments, and from them obtained a relationship between the inertial forces generated by a body moving through a liquid and the viscous forces of the medium – the Reynolds number. Roughly speaking, a large object in a liquid such as water would have a large Reynolds number, which means inertial forces created by the object are dominant. But for a microscopic body, such as sperm, it would be the viscous forces of the liquid that would have the most influence.
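As a rough illustration of the scale separation (using made-up but plausible numbers, not values from the research described here), the Reynolds number Re = ρvL/μ can be compared for a human swimmer and a sperm cell in a water-like fluid:

```python
def reynolds_number(density, speed, length, viscosity):
    """Re = rho * v * L / mu: ratio of inertial to viscous forces."""
    return density * speed * length / viscosity

# Water-like medium (illustrative values)
rho = 1000.0  # density, kg/m^3
mu = 1e-3     # dynamic viscosity, Pa*s

# A human swimmer: ~2 m long, moving at ~1 m/s
re_swimmer = reynolds_number(rho, 1.0, 2.0, mu)

# A sperm cell: ~50 micrometres long, moving at ~50 micrometres/s
re_sperm = reynolds_number(rho, 50e-6, 50e-6, mu)

print(f"Swimmer: Re ~ {re_swimmer:.0e}")  # inertia dominates (Re >> 1)
print(f"Sperm:   Re ~ {re_sperm:.0e}")    # viscosity dominates (Re << 1)
```

The six-orders-of-magnitude gap is the point: at Re far below one, a sperm cell effectively lives in a world without inertia, which is why its swimming strategy must differ so radically from ours.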

The physics explaining this strange world where viscous forces dominate was worked out by several physicists in the 1950s, including Geoffrey Taylor from the University of Cambridge. Conducting experiments using glycerine, a high-viscosity medium, he showed that at a low Reynolds number, the physics of a swimming microorganism could be explained by “oblique motion”. If you take a thin cylinder, such as a straw, and let it fall upright in a high-viscosity fluid like syrup, it will do so vertically – as you may expect. If you put the straw on its side, it will still drop vertically, but half as fast as the upright case due to increased drag. However, when you put the straw diagonally and let it fall, it does not move vertically downwards but falls in a diagonal direction – what is known as oblique motion.

This occurs because the drag along the length of the body is lower than in the perpendicular direction – meaning that the straw wants to move along its length faster than it does perpendicular, so it slips horizontally as well as falling vertically. In the early 1950s, Taylor and Geoff Hancock from the University of Manchester, UK, carried out detailed calculations about how a sperm could travel. They showed that as the sperm whips its tail, it creates oblique movements at different sections, producing viscous propulsion.
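The drag anisotropy behind oblique motion can be sketched numerically. In the sketch below (a toy resistive-force calculation, not Taylor and Hancock’s actual analysis), a thin rod sinking under a vertical force experiences roughly twice as much drag perpendicular to its axis as along it, so a tilted rod picks up a sideways velocity:

```python
import math

def settling_velocity(F, theta, c_par, c_perp):
    """Velocity of a thin rod sinking under a vertical force F in a
    very viscous fluid. theta is the angle of the rod's axis from
    vertical; drag is anisotropic (c_perp > c_par)."""
    F_par = F * math.cos(theta)    # force component along the rod
    F_perp = F * math.sin(theta)   # component perpendicular to the rod
    v_par = F_par / c_par          # drag balances force in each direction
    v_perp = F_perp / c_perp
    # Rotate back to the lab frame (x horizontal, z vertically down)
    vz = v_par * math.cos(theta) + v_perp * math.sin(theta)
    vx = v_par * math.sin(theta) - v_perp * math.cos(theta)
    return vx, vz

C_PAR, C_PERP = 1.0, 2.0  # illustrative drag coefficients (ratio ~2)

vx_up, vz_up = settling_velocity(1.0, 0.0, C_PAR, C_PERP)          # upright
vx_side, vz_side = settling_velocity(1.0, math.pi / 2, C_PAR, C_PERP)  # sideways
vx_diag, vz_diag = settling_velocity(1.0, math.pi / 4, C_PAR, C_PERP)  # diagonal
```

The upright rod falls straight down at full speed, the sideways rod falls straight down at half speed, and the diagonal rod acquires a nonzero horizontal velocity – the oblique motion that, repeated along a whipping tail, adds up to viscous propulsion.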

Today, researchers are building ever more complex models of how sperm swim. These models are not just for theoretical insight, but also have applications in assisted reproduction techniques. Mathematician David Smith from the University of Birmingham, UK – who has worked on biological fluid dynamics for over two decades – and colleagues have developed a sperm-analysis technique. Dubbed Flagella Analysis and Sperm Tracking (FAST), it can image and analyse the tail of a sperm in exquisite detail. From the images, it uses mathematical models to calculate how much force the body is applying to the fluid. The package also calculates the sperm’s swimming efficiency – how far it moves using a certain amount of energy.

The team began clinical trials with FAST in 2018, and if the technique is successful, it could help couples assess what type of assisted reproduction technique may work for them. The simulations may show, for example, that “intrauterine insemination” – in which sperm are washed and then injected into the uterus, bypassing the cervical canal – could be just as successful over several cycles as carrying out more expensive and invasive IVF procedures. Alternatively, their technique could be used to help to analyse the impact of male contraception. “This project is about harnessing 21st-century technologies to address male fertility problems,” says Smith.

Pregnancy and the placenta – the tree of life

Consisting of a network of thick purple vessels and resembling a flat cake, the placenta is the life-giving alien within. An organ unique to pregnancy, a healthy placenta at full term is around 22 centimetres in diameter, 2.5 centimetres thick, and with a mass of about 0.6 kg. It is a direct link between the mother and fetus, providing the fetus with oxygen and nutrients, and allowing it to send back waste products, such as carbon dioxide and urea, a major component of urine.

From just a collection of cells in early pregnancy, the placenta begins to form a basic structure once it intertwines with the lining of the uterus. This eventually leads to a network of fetal vessels that branch out to form villous trees – a bit like Japanese bonsais – that are bathed in maternal blood in the “intervillous space”. The placenta could be described as fifty connected bonsai trees upside down at the top of a fish tank that is full of blood, thanks to the pumping of several maternal arteries at the bottom.

The placenta

Estimated to contain around 550 km of fetal blood vessels – similar in length to the Grand Canyon – the placenta’s total surface area for gas exchange is around 13 m2. Part of the difficulty studying the placenta is due to these varying scales. The other issue is knowing how this huge network of fetal vessels, which are each about 200 μm across, ultimately affects the performance of a centimetre-scale organ.

The exchange of gases between maternal and fetal blood is via diffusion through the villous tree tissue – with the fetal vessels closest to the villous tissue thought to be doing the exchange. By combining experimental data with mathematical modelling of the intricate geometry of the fetal blood vessels, for the past decade mathematician Igor Chernyavsky from the University of Manchester and colleagues have been studying the transport of gases and other nutrients in the placenta.

The team found that despite the incredibly complex topology of the fetal vessels, there is a key dimensionless number that can explain the transport of different nutrients in the placenta. Determining the chemical state of a mixture is a complex problem – the only “reference” state being equilibrium, when all the reactions balance each other and end up in a stable composition.

In the 1930s, physical chemist Gerhard Damköhler attempted to work out a relationship for the rate of chemical reactions or diffusion in the presence of a flow. In this non-equilibrium scenario, he came up with a single number – the Damköhler number – that can be used to compare the time for the “chemistry to happen” with the flow rate in the same region.

The Damköhler number is useful when it comes to the placenta because the organ is diffusing solutes – such as oxygen, glucose and urea – in the presence of both a fetal and a maternal blood flow. Here, the Damköhler number is defined as the ratio of the diffusive exchange rate to the blood-flow rate. For a Damköhler number larger than one, diffusion dominates and occurs faster than the blood flows – a regime known as “flow limited”. For a number less than one, the flow rate is greater than the diffusion rate – known as “diffusion limited”. Chernyavsky and colleagues found that, despite the various complex arrangements of fetal capillaries in the terminal villus, the movement of different gases in and out of the fetal capillaries could be described by the Damköhler number – which he called the “unifying principle” in the placenta.
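The classification is simple enough to express directly. In the sketch below the Damköhler numbers attached to each solute are hypothetical placeholders chosen only to land in the regimes the researchers describe, not measured values:

```python
def transport_regime(da):
    """Classify placental transport by Damköhler number.
    Da = diffusive exchange rate / blood-flow rate.
    Da > 1: exchange outpaces flow ("flow limited").
    Da < 1: flow outpaces exchange ("diffusion limited")."""
    if da > 1:
        return "flow limited"
    if da < 1:
        return "diffusion limited"
    return "balanced"

# Hypothetical Da values, chosen to match the regimes described in the text
solutes = {
    "carbon monoxide": 0.1,   # diffusion limited
    "glucose": 0.5,           # diffusion limited
    "oxygen": 1.0,            # close to both regimes at once
    "carbon dioxide": 5.0,    # flow limited
    "urea": 8.0,              # flow limited
}

for name, da in solutes.items():
    print(f"{name}: Da = {da} -> {transport_regime(da)}")
```

Oxygen sitting at the boundary between the two regimes is what makes it interesting: neither diffusion nor flow is the sole bottleneck, hinting at an organ tuned for its most critical cargo.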

The researchers found, for example, that carbon monoxide and glucose in the placenta are diffusion limited, while carbon dioxide and urea are more flow limited. Carbon monoxide is thought to be efficiently exchanged by the placenta, which is why maternal smoking and air pollution can be dangerous for the baby. Intriguingly, oxygen is close to being both flow and diffusion limited, suggesting a design that is perhaps optimized for the gas – which makes sense given it is so critical to life.

It is unknown why there is such a wide range of Damköhler numbers, but one possible explanation is that the placenta must be robust, given its many different roles, which include both nourishing and protecting the baby from harm. Given the difficulty of experimentally studying the placenta both in utero and when it is delivered in the third stage of labour, there is still a lot we don’t know about this ethereal organ.

Babyhood – it’s good to talk

Toddler deciding what to say

It is difficult to express how hard, in principle, it is for infants to pick up their language – but they seem remarkably good at doing so. When a child is two to three years old, its language becomes sophisticated incredibly quickly, with toddlers being able to construct complex – and grammatically correct – sentences. This development is so rapid that it is difficult to study, and is far from being fully understood. Indeed, how infants learn language is hotly contested, with many competing theories among linguists.

Almost all human languages can be described with what is known as a context-free grammar – a set of (recursive) rules that generates a tree-like structure. The three main aspects of a context-free grammar are “non-terminal” symbols, “terminal” symbols, and “production rules”. In a language, non-terminal symbols are aspects like noun phrases or verb phrases (i.e. parts of the sentence that can be broken down into smaller parts). Terminal symbols are produced when all the operations have been carried out, such as the individual words themselves. Finally, there are the hidden production rules that determine where the terminal symbols should be placed, to produce a sentence that makes sense.

A diagram showing how language is learned

A sentence in a context-free grammar language can be visualized as a tree, with the branches being the “non-terminal” objects that the infant does not hear when learning language – such as verb phrases, and so on. The leaves of the tree, meanwhile, are the terminal symbols, or the actual words that are heard. For example, in the sentence “The bear walked into the cave”, “the bear” and “walked into the cave” can be split off to form a noun phrase (NP) and a verb phrase (VP), respectively. Those two parts can then be split further until the final result is individual words including determiners (Det) and prepositional phrases (PP) (see figure). When infants listen to people talking in fully formed sentences (that are, hopefully, grammatically correct), they are only exposed to the leaves of the tree-like network (the words and their locations in a sentence). But somehow, they also have to extract the rules of the language from the mixture of words they are hearing.
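A context-free grammar small enough to generate the example sentence can be written down in a few lines. The sketch below is an illustrative toy, with production rules invented for this one sentence:

```python
# Tiny context-free grammar: non-terminals (S, NP, VP, PP, Det, N, V, P)
# expand via production rules; terminals are the actual words.
rules = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "PP"]],
    "PP":  [["P", "NP"]],
    "Det": [["the"]],
    "N":   [["bear"], ["cave"]],
    "V":   [["walked"]],
    "P":   [["into"]],
}

def expand(symbol, choices):
    """Depth-first expansion of a symbol; `choices` supplies the index
    of the production rule to apply at each non-terminal."""
    if symbol not in rules:  # a terminal: an actual word the infant hears
        return [symbol]
    production = rules[symbol][next(choices)]
    return [word for part in production for word in expand(part, choices)]

# Eleven non-terminals are expanded; the final choice (index 1) picks
# "cave" instead of "bear" for the second noun.
picks = iter([0] * 10 + [1])
sentence = " ".join(expand("S", picks))
print(sentence)  # -> the bear walked into the cave
```

Only `sentence` – the leaves – is observable to the listener; the dictionary of rules that produced it is exactly the hidden structure the infant has to infer.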

In 2019 Eric De Giuli from Ryerson University in Canada modelled this tree-like structure using the tools of statistical physics (Phys. Rev. Lett. 122 128301). As infants listen, they continually adjust the weights of the branches of possibilities. Eventually, branches that produce nonsensical sentences acquire smaller weights – because they are never heard – as compared to information-rich branches that are given larger weights. By continuously performing this ritual of listening, the infant “prunes” the tree over time to discard random-word arrangements, while retaining those with meaningful structure. This pruning process reduces both the number of branches near the tree’s surface and those deeper down.

The fascinating aspect of this idea from a physical point of view is that when the weights are equal, language is random – which can be compared to how heat affects particles in thermodynamics. But once weights are added to the branches and adjusted to produce specific grammatical sentences, the “temperature” begins to decrease. De Giuli ran his model for 25,000 possible distinct “languages” (which included computer languages), and found universal behaviour when it came to “decreasing the temperature”. At a certain point, there is a sharp drop in what is analogous to thermodynamic entropy, or disorder, when the language goes from a body of random arrangements to one that has high-information content. Think of a bubbling pot of jumbled words that is taken off the stove to cool, until words and phrases begin to “crystallize” into a specific structure or grammar.
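The drop in disorder can be illustrated with a toy Shannon-entropy calculation. This is only a cartoon of the idea, not De Giuli’s actual model, and the branch weights below are invented:

```python
import math

def shannon_entropy(weights):
    """Entropy (in bits) of a normalized branch-weight distribution."""
    total = sum(weights)
    probs = [w / total for w in weights if w > 0]
    return -sum(p * math.log2(p) for p in probs)

# Before learning: eight candidate branches equally weighted ("hot", random)
uniform = [1.0] * 8

# After pruning: weight concentrated on the few grammatical branches ("cold")
pruned = [20.0, 10.0, 1.0, 0.1, 0.1, 0.0, 0.0, 0.0]

print(shannon_entropy(uniform))  # 3.0 bits -- maximal disorder
print(shannon_entropy(pruned))   # much lower -- structure has emerged
```

Equal weights give the maximum entropy of log2(8) = 3 bits; as listening shifts weight onto the grammatical branches, the entropy falls – the toy analogue of the sharp drop De Giuli observed as the “temperature” of the language decreases.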

This abrupt switch is also akin to a phase transition in statistical mechanics – at a certain point, the language switches from a random jumble of words to a highly structured communication system that is rich in information, containing sentences with complex structures and meanings. De Giuli thinks that this model (which he stresses is only a model and not a definitive conclusion for how infants learn language) could explain why at a certain stage of development a child learns incredibly quickly to construct grammatical sentences. There comes a point when they have listened to enough for it all to make sense to them. Language, it seems, is just child’s play.

Alex Müller: Nobel-prize-winning condensed-matter physicist dies aged 95

The Swiss condensed-matter physicist and Nobel laureate Alex Müller died on 9 January at the age of 95. Müller shot to fame in 1986 when he and his colleague Georg Bednorz discovered a material with a superconducting transition temperature far above those of so-called conventional metal superconductors. The work earned the pair the Nobel Prize for Physics the following year.

Born on 20 April 1927 in Basel, Müller received a diploma in physics and mathematics from the Swiss Federal Institute of Technology (ETH) in Zürich in 1952. After graduating, he stayed on at ETH Zurich to study paramagnetic resonance of solid-state materials, obtaining his PhD in 1958.

Following a stint as head of the magnetic-resonance group at the Battelle Memorial Institute in Geneva, he took up a position at the University of Zurich in 1962. A year later he also joined IBM Research Zurich, heading the physics department from 1971. In 1985 he left IBM but remained at the University of Zurich before retiring in 1994.

During his life, Müller published several groundbreaking papers in the field of magnetic resonance and phase transitions in ferroelectrics. At IBM and Zurich he also began working on oxide materials, especially perovskites, which would later be useful in his work in superconductivity in the 1980s.

Fighting for acceptance

It has been known for more than a century that when certain metals are cooled to extremely low temperatures, they become superconductors, conducting electrical current without resistance. But as materials had to be cooled to just a few degrees above absolute zero for this phenomenon to occur, research into the field began to stagnate.

That all changed, however, in 1986, when Müller and Bednorz discovered that a material composed of copper oxide with lanthanum and barium became superconducting at around 35 K. This was about 50% higher than the previous highest value of 23 K, which had been achieved more than a decade earlier in niobium-germanium (Nb3Ge).

Given that oxide materials are ceramics and are known to conduct poorly at higher temperatures, their finding did not convince everybody, with some researchers believing the materials were not exhibiting superconductivity.

“It was disappointing seeing their reaction,” Bednorz later told Physics World in 1988. “We had the impression that they didn’t believe what we had found out and after that experience we were convinced that we would have to fight for one, two or even more years to get our results accepted in the scientific community.”

The results, however, were soon confirmed by other researchers, with Müller and Bednorz being credited with having discovered an entirely new class of superconductors. The buzz surrounding these so-called “cuprate superconductors” reached fever pitch at the American Physical Society’s March Meeting in New York in 1987. The meeting was later dubbed “the Woodstock of physics” due to the fervent nature of the talks and discussions that went on late into the night.

In 1987 Bednorz and Müller shared the Nobel Prize for Physics “for their important break-through in the discovery of superconductivity in ceramic materials”. The finding sparked decades of research into similar cuprate materials where superconducting transition temperatures over 100 K were later discovered, which Müller followed “with great interest and commitment”.

Nobel laureates warn Ukraine crisis is bringing Europe closer to nuclear war

Some 14 Nobel laureates have signed an open petition calling for an immediate ceasefire in the Ukraine war and for those suffering its impact to receive humanitarian aid. The laureates, who include Barry Barish, Giorgio Parisi and Andre Geim, warn that the escalation in words and military action towards nuclear weapons has “brought Europe much closer” to their use.

The petition – so far signed by over 800 people – was initiated by the DESY particle physicist Hannes Jung as well as organizations such as the Science4Peace Forum and the Ukrainian Pacifist Movement. In it they say that scientists cannot remain silent with the increasing threat of nuclear war.

“Any nuclear attack from any side will create responses and retaliations from other nuclear powers and in a short time millions of people could be killed, huge areas of land and sea destroyed and contaminated,” they write. Other Nobel-prize-winning physicists to sign the letter include Takaaki Kajita, Gerardus ‘t Hooft and Michel Mayor.

The laureates and signatories call on politicians and leaders to immediately stop any verbal escalation when it comes to the use of nuclear weapons; to make sure that scientific advice is considered when making decisions; and to remember the “inferno” that occurred during the Second World War when nuclear bombs were dropped on Hiroshima and Nagasaki.

We believe that scientists must speak up and warn of the enormous risks

Hannes Jung

They also call on governments, especially nuclear powers, to publicly declare that they subscribe to the “no-first-use policy” for nuclear weapons and that they join the United Nations’ Treaty on the Prohibition of Nuclear Weapons if they are not already members.

Jung told Physics World that further escalation in the war in Ukraine increases the risk that “one of the parties sees no other way” than to use nuclear weapons.

“Even without the explicit use of nuclear weapons, there are nuclear power plants under fire, and the risk for a nuclear catastrophe is increasing the longer this war continues,” he adds. “We believe that scientists must speak up and warn of the enormous risks and we hope that our voice as scientists together with Nobel laureates will be heard and taken seriously.”

Cosmic-ray muons used to create cryptography system

The random arrival times of cosmic-ray muons at the Earth’s surface can be used to encode and decode confidential messages – according to Hiroyuki Tanaka at the University of Tokyo. He claims that the new scheme is more secure than other cryptographic systems because it does not require the sender and receiver of a message to exchange a secret key. Having confirmed important aspects of the technology in the lab, he reckons it will be commercially competitive for use over short distances in offices, data centres and private homes.

Cryptographic protocols involve generating and distributing a secret key that is used to encrypt and decrypt messages. Today, commonly used cryptography systems could be cracked by those with the ability to find the prime factors of very large numbers. This is fiendishly difficult to do using conventional computers but it should be a much easier task using quantum computers of the future.

One option for dealing with this threat is itself quantum: using Heisenberg’s uncertainty principle to ensure that any would-be eavesdropper cannot steal the key without revealing their presence in the process.

Quantum flaws

However, even this “quantum key distribution” has its flaws. Scientists have shown it is possible to exploit weaknesses in the encryption hardware, such as shining bright light on single-photon detectors to turn them into classical devices. This particular problem can be avoided by using a third party (who need not be trustworthy) to carry out the detection of key bits, but this arrangement is more expensive than straightforward two-party encryption.

Tanaka’s new proposal is designed to defeat eavesdroppers by instead turning to a natural and ever-present resource of randomness: cosmic-ray muons. Cosmic rays, which are mainly protons, rain down on the Earth from deep space and generate showers of pions and other particles when they collide with nuclei in the atmosphere. Those pions then decay into muons, which are heavy versions of the electron. These muons hit the Earth’s surface completely independently of one another and are able to pass through large amounts of solid material while only losing a small fraction of their energy by ionizing the materials.

The idea is to position the message sender and receiver close enough to one another that they are both exposed to the same cosmic-ray showers and can make their own separate detections of specific muons within a shower – namely, those particles whose trajectory crosses the detectors of both individuals. By each recording the arrival time of those muons and using the time stamps as the random data for cryptographic keys, the sender and receiver can independently generate the same secret keys – without having to send the keys to one another.
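The key-derivation step can be sketched in a few lines. COSMOCAT’s actual procedure is not detailed here, so the sketch below simply hashes the shared, clock-corrected arrival times into key bytes – an assumption about how time stamps might be turned into a key, not Tanaka’s published method:

```python
import hashlib

def key_from_timestamps(arrival_times_ns, n_bytes=16):
    """Derive a secret key from muon arrival time stamps.

    Both parties record the same (clock-corrected) time stamps for the
    muons that cross both detectors, so both can derive an identical key
    without ever transmitting it. Sorting makes the result independent
    of the order in which events were logged."""
    stamp = ",".join(str(t) for t in sorted(arrival_times_ns))
    return hashlib.sha256(stamp.encode()).digest()[:n_bytes]

# Hypothetical arrival times (ns since an agreed epoch), as logged by each side
sender_times = [120045, 340981, 870002, 1250777]
receiver_times = [340981, 120045, 1250777, 870002]  # same events, any order

assert key_from_timestamps(sender_times) == key_from_timestamps(receiver_times)
```

The security rests on the input, not the hash: the arrival times are genuinely random and are only ever measured locally, so there is no key exchange for an eavesdropper to intercept.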

Synchronized clocks

Ensuring that sender and receiver use the same muons to create the keys relies on working out the precise time delay between the two detections, which is done by knowing the distance between the detectors (muons typically travel at 99.95% of light speed) while carefully synchronizing clocks at each end. Synchronization can be achieved using a global-positioning system to coordinate the ticking of local clocks such as crystal oscillators.
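The required timing correction is a simple time-of-flight calculation. As a rough sketch (the 10 m separation is an illustrative office-scale figure, not from the experiment):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def detection_delay_ns(separation_m, beta=0.9995):
    """Expected muon arrival-time offset, in nanoseconds, between two
    detectors separated by `separation_m` along the muon's path, for a
    muon travelling at a fraction `beta` of the speed of light."""
    return separation_m / (beta * C) * 1e9

print(detection_delay_ns(10.0))  # ~33 ns over a 10 m link
```

A delay of a few tens of nanoseconds is why the local clocks need GPS-disciplined synchronization: matching up the same muon at both ends demands timing precision far beyond an ordinary wall clock.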

Tanaka calls his technique “Cosmic Coding & Transfer” (COSMOCAT) and it uses two detectors that measure muon arrival with a plastic scintillator and a photomultiplier tube. Carrying out tests on four different days in June last year, he showed that muons do indeed arrive at random points in time – the probability of observing a given number of events in a certain period following a Poissonian distribution. He also showed that the two detectors consistently produced the same, random time stamps.

However, owing to limitations in the GPS signals and the electronics used to carry out the experiment he was only able to establish common muon detections (as opposed to interception of other random particles) in about 20% of cases. Overcoming this problem involved the receiver using multiple keys to try and decode a given message and then moving on to the next message only once the receiver had signalled success.

Smart buildings

These extra steps add time to the decryption process and so slow down the rate at which data can be transmitted. Nevertheless, Tanaka says that the system would still be considerably quicker than much existing technology. Indeed, agreed detections took place at an average of around 20 Hz, implying a data transmission rate of at least 10 Mbps. This is faster than the 10 kbps that is typical of a local network system such as Bluetooth Low Energy. He reckons that this greater bandwidth should make the new scheme attractive for short-range wireless communication such as connecting sensors within “smart” buildings and securely exchanging information during the powering of future electric vehicles.

Like Tanaka, Michail Maniatakos of New York University Abu Dhabi in the United Arab Emirates has worked on developing a random number generator from cosmic muons for cryptography. But he and his colleagues found that muons do not arrive at the Earth’s surface in sufficient numbers to generate enough “entropy” in a given amount of time from a suitably small detector. “Our research concluded that muons are not a practical approach for sourcing randomness in a real system,” he says.

Tanaka acknowledges that muon detection rates place limits on the technology but insists that rates are adequate for wireless communication over distances of up to about 10 m. In his demonstration he used quite large detectors – each measuring 1 m2 – in order to maximize the bit rate. However, Tanaka reckons he could shrink the detectors to a fifth of their current size, provided he can up the key generation rate by a factor of five to compensate. As to how long it will take to perfect the technology, he says he should have a working prototype within five years.

One potential weakness in the scheme, he notes, is the possibility that an eavesdropper could position a third detector between the sender’s and receiver’s devices and record the muon strikes independently. He reckons that any such plan would be “totally impractical” but says that the system comes with a built-in safeguard – a small temporal offset compared with standard time broadcast by GPS satellites. This offset, which the communicating parties can change at any time of their choosing, causes the would-be eavesdropper to disagree on the muon arrival times – with the upshot, he says, that they “cannot steal the key to decode the message”.

The research is described in iScience.

Copyright © 2025 by IOP Publishing Ltd and individual contributors