Photonic nanostructures bring bees to flowers

Repeating nanostructures on the petals of certain flowers help to attract bees, scientists have shown. Parallel folds in each epidermal cell’s cuticle – the protective coating on the petal surface – approximate diffraction gratings, but with a degree of disorder that preferentially scatters blue–ultraviolet (UV) light.

Many bee species are particularly sensitive to light in the blue–UV range, and correspondingly coloured flowers often make especially rewarding foraging targets. Pigments responsible for this colouration are difficult to produce, however, and most flowering plants lack the necessary genetic and biochemical machinery. Now, though, researchers at the University of Cambridge, UK, led by Edwige Moyroud, along with collaborators at the Royal Botanic Gardens, propose that several diverse species of flowering plants have independently converged on photonic nanostructures as a solution.

Deliberate disorder

Some level of irregularity is to be expected in natural structures, but it was not known if this is a necessary feature or merely a side effect of biological development. To test the importance of pattern disruption, the researchers used finite-difference time-domain (FDTD) calculations to model the effect of different degrees of disorder on the optical response.

Perfectly periodic gratings produced an iridescent effect, with diffraction peaks at angles determined by the wavelength. When the pattern was disrupted to an extent similar to that seen in the natural nanostructures, however, the diffraction-induced iridescence disappeared, leaving a response dominated by scattered blue–UV light. The effect was consistent over a wide range of grating parameters, meaning that all the flowers studied showed a similar “blue halo”, despite significant differences in structure at the nanoscale.
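
To get a feel for why disorder destroys the iridescence while preserving strong scattering, consider a minimal toy model (our illustration, not the authors’ FDTD calculation): treat the grating as a row of ridges and approximate the far field by the Fourier transform of the surface profile, jittering the ridge positions to mimic disorder. All parameter values here are illustrative assumptions, not numbers from the paper.

```python
# Toy scalar model of an ordered vs disordered grating. The far field of a
# 1D surface profile is approximated by its Fourier transform; this is NOT
# the FDTD calculation used in the study, it only illustrates how positional
# jitter washes sharp diffraction orders out into a diffuse background.
import numpy as np

rng = np.random.default_rng(1)

def spectrum(jitter_nm, period_nm=1000.0, n_ridges=50, ridge_width_nm=200.0,
             n_grid=2**14, span_nm=100_000.0):
    """Far-field power spectrum of a row of ridges with jittered positions."""
    x = np.linspace(-span_nm / 2, span_nm / 2, n_grid, endpoint=False)
    profile = np.zeros(n_grid)
    centres = (np.arange(n_ridges) - n_ridges / 2) * period_nm
    centres += rng.normal(0.0, jitter_nm, n_ridges)   # positional disorder
    for c in centres:
        profile[np.abs(x - c) < ridge_width_nm / 2] = 1.0
    return np.abs(np.fft.fft(profile)) ** 2

k1 = 100   # FFT bin of the first diffraction order (= span_nm / period_nm)
for jitter in (0.0, 300.0):   # perfect grating vs petal-like disorder (assumed)
    I = spectrum(jitter)
    print(f"jitter {jitter:5.0f} nm: first-order peak = {I[k1]:10.3g}, "
          f"nearby background = {I[k1 - 20:k1 - 5].mean():10.3g}")
```

With zero jitter the first diffraction order towers over the background, which is what produces angle-dependent colour; with wavelength-scale jitter the order collapses towards the diffuse level, leaving broadband scattering of the kind the researchers observed.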

Artificial flowers

To investigate whether the effect is meaningful in the real world, the researchers made artificial flowers with a range of background pigments and nanostructures, including perfectly periodic gratings and flower-like disordered gratings. Bumblebees trained to seek a sucrose solution as a reward (in preference to an undesirable punishment solution) learned to discriminate between smooth and nanostructured surfaces, but were quicker to respond to the disordered blue-halo pattern. The bumblebees seemed able to spot a blue halo against any background colour, with yellow and black flowers discovered as efficiently as all-blue flowers, although blue haloes added to already blue flowers had little effect on behaviour.

It is not known if natural photonic structures can attract other insect pollinators, or whether other plants use the effect for different purposes.

Full details of the research are published in Nature.

Atomic cloud is squashed and strained by light

A cloud of cold atoms can be squashed by shining light on it, thanks to a new technique developed by Nir Davidson and colleagues at the Weizmann Institute of Science in Israel. The effect was used to flatten spherical clouds of atoms in much the same way that a material can be distorted by an electric field. This optomechanical strain is caused by a force that depends on the density of atoms in the cloud and offers a new way of fine-tuning forces between cold atoms.

For several decades physicists have used light in two different ways to push, pull and trap atoms. The first involves an atom absorbing and emitting photons at a specific transition frequency – with the momentum of the atom changing in the process. The second involves the atom feeling the force of the electric field associated with a beam of light. The latter is used to create an optical lattice of atoms in a standing wave of light.
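
For orientation, the two textbook single-atom light forces can be written schematically as follows (standard expressions, not taken from the new paper):

```latex
% Standard single-atom light forces (textbook expressions):
% 1. Radiation pressure: photon momentum \hbar k times the scattering rate,
F_{\mathrm{scatt}} = \hbar k \, \Gamma_{\mathrm{sc}}, \qquad \Gamma_{\mathrm{sc}} \propto \frac{I}{\Delta^{2}}
% 2. Dipole force: the gradient of the light-shift potential,
F_{\mathrm{dip}} = -\nabla U_{\mathrm{dip}}, \qquad U_{\mathrm{dip}} \propto \frac{I}{\Delta}
% where I is the light intensity and \Delta the detuning from resonance.
```

Because the scattering rate falls off as I/Δ² while the dipole potential falls only as I/Δ, detuning far from resonance suppresses photon scattering; as described below, the Weizmann team also chose its beam parameters to minimize the remaining single-atom dipole force.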

Intense pulse

Now, Davidson and colleagues have illuminated a different type of interaction between light and atoms. Their experiments begin with a spherical cloud of about one million cold rubidium-87 atoms. An intense pulse of infrared light with a frequency that is “far detuned” from atomic transitions in rubidium-87 is shone on the cloud. The parameters of the beam are chosen to minimize the force between an individual atom and the electric field of the light.

The cloud acts like a lens and deflects the light. Conservation of momentum means that the atoms feel a force pushing in the direction opposite to the deflection. Davidson and colleagues point out that this is a collective force because it acts only on atoms that are in a cloud – and the strength of the force depends on the density profile of the atoms in the cloud. They refer to this effect as electrostriction because the shape changes are similar to those that occur in materials subjected to an electric field.
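
As a rough sketch of the momentum bookkeeping (our illustration of the general principle, not the team’s derivation): each photon of wavenumber k that the atomic “lens” deflects through a small angle θ carries away transverse momentum, and the cloud recoils with the opposite momentum.

```latex
% Transverse momentum carried off by one photon deflected through a small angle \theta:
\Delta p_{\perp} = \hbar k \sin\theta \;\approx\; \hbar k\, \theta
% The cloud recoils with -\Delta p_{\perp}. Because the local deflection angle is set
% by gradients of the refractive index, and hence by the atomic density profile n(r),
% the resulting force is collective: it depends on the shape of the whole cloud.
```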

Bose–Einstein condensate

The force generated by the light pulse causes the cloud to expand in directions perpendicular to the propagation of the light. This was seen both for ultracold clouds that exist as Bose–Einstein condensates (BECs) and for higher-temperature “thermal” clouds. In the case of the BECs, the clouds remained partially condensed even when subjected to relatively intense light pulses.

The team has also developed a theory that successfully explains their observations. Writing in Physical Review Letters, the team concludes “Electrostriction has the potential to be an important tool in cold atom experiments as it effectively induces interparticle interactions, which can be optically tuned”.

Diversity can mitigate the effects of drought

Drought is rare in tropical rainforests, yet it does occur, especially during El Niño Southern Oscillation years. Climate models also predict an increase in the severity and frequency of these intermittent droughts. The impacts of intensified drought, and the resilience of ecosystems, will depend not only on the tolerances of individual species but also on those of the community as a whole.

Michael O’Brien of the Consejo Superior de Investigaciones Científicas, Spain, and his team investigated the problem by simulating intermittent drought conditions using rainfall exclusion shelters over a two-year period. They tested the effect on both monocultures and mixtures of tropical trees in Malaysian Borneo, as they report in Nature Ecology & Evolution.

The researchers found the growth rates of species grown as a monoculture to be significantly lower in drought conditions compared with wet conditions. When these same species were grown in mixed communities, there was no difference in the growth rates between drought and wet conditions.

The findings indicate that diverse communities show greater resilience to drought. The researchers think this is because a community consisting of a mixture of species experiences less intense competition for water, allowing growth rates to be maintained.

The implications are two-fold. Forests with high diversity are more resistant to drought, and drought can increase diversity by reducing the success of individuals in communities that are not very diverse. Alternatively, a feedback loop can develop: because forests with low diversity are more susceptible to drought, exposure to drought can lead to a further loss of biodiversity.

“To prevent this reverse feedback loop, it is essential to conserve biodiversity in tropical forests – both by maintaining existing diverse forests and by using diverse seed mixtures when planting new forests during restoration,” said O’Brien. “Using such diversity strategies will improve forest resistance to climate change.”

Dialogues on physics, great women who changed science, a virtual reality journey to six exoplanets

By Hamish Johnston

I spend an hour or so every day looking at physics-related websites including several dozen blogs by professional physicists. One of my favourites is Asymptotia by Clifford Johnson, a theoretical physicist at the University of Southern California. Johnson is a talented visual artist and next month he has a new graphic book out called The Dialogues. The above video gives you a taste of what to expect.

Who's who: great women who changed science. (Courtesy: Perimeter Institute)

Sticking to physics and art, the folks at the Perimeter Institute for Theoretical Physics have produced a wonderful series of posters featuring some of the top female scientists of the past 200 years. “Download them,” says the institute. “Print them. Share them. Post them in classrooms, dorm rooms, living rooms, offices, and physics departments. Talk about these women. Share their stories.” Above is a montage of the five posters – can you name all five scientists?

From graphical art to a video that is out of this world. Above is a virtual-reality tour of six real exoplanets – so strap on your virtual-reality headset and enjoy.

From photo, to 3D model, through to wound healing

A research team based at Amirkabir University of Technology in Iran has proposed a novel approach to chronic wound healing. Peyman Gholami and collaborators introduced a framework to produce biologically compatible 3D prints of deep chronic wounds. Such custom-made wound fillings could potentially aid the healing process (IEEE J. Biomed. Health Informatics doi: 10.1109/JBHI.2017.2743526).

The research group aims to provide a proof-of-concept that integrates various technologies into a unified solution. These technologies include the near-automatic identification of the wound, the automatic creation of a 3D model and the bioprinting of this model. While each technology is established in its respective field, it is their combination that provides a semi-automatic solution to chronic wound healing.

The framework used to produce 3D prints of wounds

An interdisciplinary framework

Gholami’s work begins with an image of a chronic wound on the patient’s body. After pre-processing, the user segments the wound using the LiveWire algorithm, which creates a mask of the wound with minimal user interaction. Once this mask is generated, the image is calibrated against external measurements of the wound’s height and width, matching pixels in the image to actual sizes in millimetres. Alongside these measurements, separately measured depth values are input into a software system that produces a 3D model of the wound compatible with a 3D printer.
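
The calibration and model-building step is straightforward to sketch in code. The following is a hypothetical illustration of the pixel-to-millimetre calibration and a crude flat-bottomed extrusion; the actual framework uses LiveWire segmentation and dedicated modelling software, and all names and parameters below are our own.

```python
# Hypothetical sketch of the calibration and 3D-modelling step. The real
# pipeline uses LiveWire segmentation and dedicated software; this toy only
# shows how a binary mask plus external measurements become a model in mm.
import numpy as np

def mask_to_model(mask, wound_width_mm, depth_mm, n_layers=5):
    """Turn a binary wound mask into a point cloud in millimetres.

    mask           -- 2D boolean array, True inside the segmented wound
    wound_width_mm -- externally measured width, used to calibrate pixels to mm
    depth_mm       -- externally measured wound depth
    """
    rows, cols = np.nonzero(mask)
    width_px = cols.max() - cols.min() + 1
    mm_per_px = wound_width_mm / width_px            # pixel-to-mm calibration
    # Extrude the mask down to the measured depth (flat-bottomed approximation).
    return np.array([(c * mm_per_px, r * mm_per_px, z)
                     for r, c in zip(rows, cols)
                     for z in np.linspace(0.0, depth_mm, n_layers)])

# Example: an elliptical "wound" 120 px across, measured at 30 mm wide, 5 mm deep.
yy, xx = np.mgrid[0:100, 0:150]
mask = ((xx - 75) / 60.0) ** 2 + ((yy - 50) / 35.0) ** 2 < 1.0
model = mask_to_model(mask, wound_width_mm=30.0, depth_mm=5.0)
print(model.shape)   # (n_points, 3) point cloud, ready for meshing
```

A real implementation would mesh this point cloud and shape the depth profile to the measured wound geometry rather than extruding a flat slab.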

The group chose the LiveWire algorithm after comparing it with other common segmentation techniques, because of the high values it achieved on baseline performance indicators, including an accuracy of over 97%. The other segmentation algorithms examined included region growing, active contours, level sets and texture segmentation.

Using the wound coordinates generated from the segmented image, together with the depth information, the researchers produced G-code instructions for the bioprinting robot. They printed the wound model using a bio-ink hydrogel consisting of alginate and gelatine, which was then used for cell encapsulation.
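
To show what such G-code instructions might look like, here is a toy generator that traces a wound outline layer by layer (purely illustrative: real bioprinting toolpaths also control extrusion rates, infill patterns and much more).

```python
# Toy G-code generator: extrude along a wound outline, layer by layer.
# Illustrative only; not the toolpath planning used in the actual study.
def outline_to_gcode(outline_mm, depth_mm, layer_height_mm=0.5, feed=300):
    """outline_mm: list of (x, y) points in mm tracing the wound boundary."""
    lines = ["G21 ; units in millimetres", "G90 ; absolute positioning"]
    n_layers = max(1, round(depth_mm / layer_height_mm))
    for layer in range(n_layers):
        z = (layer + 1) * layer_height_mm
        lines.append(f"G1 Z{z:.2f} F{feed} ; move up to the next layer")
        for x, y in outline_mm:
            lines.append(f"G1 X{x:.2f} Y{y:.2f} F{feed}")
    return "\n".join(lines)

square = [(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]
print(outline_to_gcode(square, depth_mm=2.0))
```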

Wounds that won’t heal

A recent study on chronic wound detection explains that dealing with chronic wounds is a challenging task for skin pathologists and dermatologists (Wound Repair Regen. 24 181). Normally, a wound evolves through a set of phases, each signifying a different healing process. Chronic wounds are cases in which the wound becomes stuck in one of these phases and does not close or heal properly. Causes include poor circulation, but chronic wounds can also be linked to other underlying conditions.

Gholami and his collaborators used wound images from patients with diabetes and burns – which cause granulation and slough – and with metabolic conditions that can cause tissue death. The proposed framework provides a promising approach for producing custom-made wound fillings.

Concerns raised over NIST security breaches

Security at America’s standards lab has been found to be woefully inadequate, according to a report by the US Government Accountability Office (GAO). Over the past two years, GAO agents used “very basic espionage techniques” to carry out 15 attempted break-ins at secure parts of the National Institute of Standards and Technology (NIST) campuses in Gaithersburg, Maryland, and Boulder, Colorado. They succeeded on every attempt.

The security breaches have raised significant questions about NIST’s lack of effective safeguards against unauthorized entry to the campuses, parts of which house dangerous chemicals and a nuclear research reactor. The concerns came despite NIST and the Department of Commerce, of which it is a part, having apparently tightened security recently in response to two illicit entries in 2015. At the Colorado campus, an unauthorized individual had found his way into a secure building while in Maryland a government police officer had caused an explosion while making methamphetamine in a partly empty laboratory.

Taking action

GAO representatives showed videos of the latest breaches last week at a closed portion of a hearing by a subcommittee of the House of Representatives Committee on Science, Space and Technology. “The evidence produced in these videos shines a light on the porous nature of NIST’s physical security, and are particularly concerning to the committee, especially in light of the fact that the July 2015 meth-lab explosion served to put NIST on notice that its physical security programme was flawed,” noted subcommittee member Darin LaHood, a Republican from Illinois.

The GAO says that the lack of communication between NIST and the Department of Commerce is a probable cause of the breaches. “Management of NIST’s physical security programme is fragmented between the Department of Commerce and NIST,” the report states. “This is inconsistent with the federal Interagency Security Committee’s physical security best practices, which encourage agencies to centrally manage physical security.” The report recommends that the Department of Commerce and NIST now develop better security co-ordination and risk-management policies.

“NIST takes its responsibility to ensure the physical security of NIST’s two campuses very seriously,” the agency’s acting director, Kent Rochford, told the House subcommittee. “NIST is working with the Office of Security to strengthen the security culture at NIST, which the GAO notes has already had some success, though there is still some work to be done.”

That job will now be down to Walter Copan, who was approved as NIST director by the US Senate a few days after the House hearing. Copan has a PhD in chemistry and has worked at Brookhaven National Laboratory and the Department of Energy as well as in the private sector.

Tale of two physicists

It’s hard to imagine a more mercurial pairing of minds in physics than that of John Archibald Wheeler and Richard Feynman. The latter went to Princeton University in 1939, expecting to work on his PhD under Eugene Wigner, but found that he had been assigned instead to Wheeler, who was less than seven years his senior. Both wildly inventive thinkers, they clicked, and each helped the other to reach the top of their game. Physicist and science writer Paul Halpern describes the duo’s work and relationship in The Quantum Labyrinth: How Richard Feynman and John Wheeler Revolutionized Time and Reality.

From Feynman came the approach to quantum electrodynamics that Wheeler christened a “sum over histories” – a way of dealing with the bothersome infinities that appeared when one tried to quantize Maxwell’s theory of electromagnetic fields, and which featured the famous Feynman diagrams. From Wheeler and his other collaborators came black holes and wormholes (terms he popularized), quantum “many worlds” and the foundations of quantum information theory encapsulated in the mantra “It From Bit”. Between them, the two physicists built a bridge between the physics of Einstein, Bohr and Pauli, and that which laid the foundations of so much of the discipline today: quantum field theory, quarks, the Standard Model, quantum and relativistic cosmology.

Whether this amounts to “revolutionizing time and reality”, as Halpern’s subtitle has it, is a matter of debate. But by documenting the relationship between the two men, his book provides a portrait of a rather neglected era in physics. Following the twin revolutions of quantum theory and relativity in the early 1900s, the 1940s to the late 1960s can appear like a time when already challenging ideas became all but incomprehensible beyond the academy. But Halpern shows that it was every bit as significant as the pre-war period that looks now to be an age almost of gods and legends.

Wheeler was astonished by Feynman, who as a student at the Massachusetts Institute of Technology had achieved the highest score by far in the daunting national Putnam Mathematics competition. “Nobody else who’s applying here at Princeton comes anywhere near so close to the absolute peak,” he said, adding, however, that “We’ve never let in anyone with scores so low in history and English.” He must be, Wheeler concluded, “a diamond in the rough”.

A working-class kid from Queens in New York, Feynman certainly was rough around the edges by the standards of the day. Wheeler’s wife Janette once scolded him for not standing up when approached by a lady, but Wheeler enjoyed the young man’s candour and humour. At Feynman’s first meeting with his new supervisor, Wheeler laid a pocket watch on the desk to keep the discussion to time; Feynman went out and bought a cheap one, which he set out at their next interview “like a countering move in a chess game”. Both young men burst out laughing; there was plenty more where that came from.

Wheeler quickly realized that he could treat Feynman as an equal in discussing physics, and that he could rely on his student to find mathematical solutions to the most challenging problems. What’s more, both of them were open to ideas as crazy as you like, provided that they didn’t obviously contravene physical laws. Some of their earliest work was on the “absorber theory” of how accelerating charged objects emitted radiation, which entailed signals travelling backwards in time. It was a fertile notion, but ultimately wrong.

As Halpern points out, the calm and elegant Wheeler was generally more wildly speculative than the flamboyant, excitable Feynman. It was Wheeler who called Feynman up one day to exclaim that he knew why all electrons are identical: they are all the same one electron, zigzagging all over time and space. That idea too came to nothing. Wheeler delighted in coining paradoxical turns of phrase – “law without law”, “charge without charge” – that set you thinking. When the Wheeler–Feynman absorber theory was resurrected, and adapted by steady-state cosmologists, it was dubbed Wheeler Without Wheeler.

For developing ideas that led to a theory of quantum electrodynamics without infinite singularities – the key problem was how to tame a point-charge electron’s interaction with its own field – Feynman won the 1965 Nobel Prize for Physics, shared with Julian Schwinger and Sin-Itiro Tomonaga. (Freeman Dyson, who showed how to weave the theories of the three men together, would surely have shared it too, says Halpern, but for the three-person limit.) Wheeler never attained that height, but his influence on modern physics was wide and deep: as well as Feynman, his students included Hugh Everett, who concocted what is now known as the “many worlds” interpretation of quantum mechanics, as well as several of the protagonists of the renaissance of general relativity, such as Jacob Bekenstein, Charlie Misner and Kip Thorne.

It’s hard to tell this story without delving into some recondite physics. Halpern makes ample and generally effective use of analogies, but there are limits to what they can do, and once you’re trying to picture rocking chairs coupled by a clothesline draped with blankets, it’s not easy to keep sight of the phenomenon it is meant to explain. Halpern occasionally ducks the deeper issues. It’s all very well to invoke the common notion that Feynman’s path-integral method in quantum electrodynamics implies that “everything that can happen does”, but what does that mean in ontological terms? To say that “reality proceeds by an awareness of all the possibilities before you arrive at an actuality” is to seek refuge in ambiguous words. It’s far from clear that Feynman himself would have been happy with an “everything happens” interpretation; pragmatic and disdainful of philosophy, he took Bohr’s view (so says Dyson) that the purpose of quantum theory is “to describe nature, not to explain nature”. It works, and that’s enough.

That no-nonsense attitude is part of Feynman’s enduring appeal. But Halpern hints that it was at least partly an intentionally constructed persona. Feynman liked to play the Everyman, but one capable of magical feats: his legendary safe-cracking at Los Alamos during the Manhattan Project was the result of much concealed study and practice. His celebrated “ask me anything” lectures at Caltech also reveal a desire to be seen as an insouciant wizard.

Those exploits make for great stories, but they suggest there was more ego and artifice in Feynman than he wanted us to see. Here too the contrast with Wheeler is illuminating. Feynman was, I suspect, the more brilliant, but despite Feynman’s bongo-playing and (slightly voyeuristic) life drawing, Wheeler was the more rounded – prepared, like his mentor Bohr, to entertain the idea that questions science can’t answer can still be worth asking.

  • Paul Halpern The Quantum Labyrinth: How Richard Feynman and John Wheeler Revolutionized Time and Reality 2017 Basic Books £25/$30hb 320pp

A natural neutron source

Follow the leader: photos of a lightning strike at 0 microseconds, 100 microseconds and 700 microseconds

The link between thunderstorms and neutron science is not an obvious one. Indeed, it took a Nobel laureate to spot it. By the time he made the connection, Willard Libby was already a highly regarded scientist thanks to his profound work on radiocarbon dating. This method – which became a standard tool for archaeologists, and earned Libby the Nobel Prize for Chemistry in 1960 – stemmed from the observation that when cosmic rays impinge on the Earth’s atmosphere, they produce a shower of particles, including neutrons. These neutrons can react with atmospheric nitrogen to create carbon-14, or radiocarbon (n + ¹⁴N → ¹⁴C + p), which enters the food chain when plants absorb the resulting radioactive carbon dioxide. Libby’s insight was to realize that in living organisms, carbon-14 is constantly refreshed together with other carbon isotopes. However, when an animal or plant dies, the ratio of radiocarbon to the stable isotopes carbon-12 and carbon-13 decays with a half-life of 5730 years – making it possible to estimate the age of objects made from formerly living matter with a high degree of accuracy.
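
The dating arithmetic follows directly from exponential decay: with a half-life of 5730 years, the age of a sample comes from the measured ratio R of radiocarbon to stable carbon, relative to its value R₀ in living matter.

```latex
% Radiocarbon decay of the ratio R from its living-matter value R_0:
R(t) = R_0 \, 2^{-t/t_{1/2}}
\quad\Longrightarrow\quad
t = \frac{t_{1/2}}{\ln 2}\,\ln\frac{R_0}{R}
% Worked example: a sample with R/R_0 = 0.25 has passed two half-lives,
% so t = 2\,t_{1/2} \approx 11\,460 years.
```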

By 1973, decades had passed since Libby published this Nobel-winning work, and he was entering the autumn of his career. He was still thinking about neutrons, though, and as he was examining tree rings with a colleague, H R Lukens, the pair noticed interesting fluctuations in the amount of radiocarbon in each ring. These fluctuations could not be explained by variations in the cosmic ray flux. Instead, Libby and Lukens found a surprising correlation with thunderstorm activity. The effect was not negligible: they estimated that thunderstorms could account for up to one percent of the neutrons produced in the atmosphere. A few years after Libby’s death in 1980, G N Shah and colleagues (1985 Nature 313 773) presented convincing measurements to back up the suggestion that lightning produces neutrons. Shah estimated that between 10 and 100 million neutrons are produced per stroke. But how?

False starts and potential sources

From laboratory studies, researchers knew that intense electrical discharges through polymer fibres could produce neutrons at 2.45 MeV, probably by deuterium fusion (²H + ²H → ³He + n). Hence, the first attempts to explain neutron production by lightning focused on fusion. Visible lightning strikes (as shown in the third panel of the figure “Follow the leader”, above) can reach temperatures of up to 30,000 K, and it was assumed that this would, in combination with natural deuterium in water vapour, do the trick. This theory dominated the literature for a long time, even though a number of independent measurements proved otherwise. The problem was that if one excludes deuterium fusion as a neutron source, one has to explain what is providing the energy to release neutrons from the nuclei of typical air molecules. For nitrogen, the binding energy is 10.5 MeV, while for oxygen it is 15.6 MeV.

Hints of a completely different (and correct) explanation came from far outside the laboratory. Since the 1960s, satellites equipped with gamma-ray detectors have monitored compliance with nuclear-test-ban treaties here on Earth; later, similar spacecraft were launched to study gamma-ray flashes from the cosmos. In 1993, however, more sensitive detectors recorded gamma-ray flashes coming from Earth that had nothing to do with weapons. These so-called terrestrial gamma-ray flashes (TGFs) are microsecond-to-millisecond-long pulses of photons with energies of up to 40 MeV (see “Gamma-ray and neutron generation” map), and in 1996 it was found that TGFs can be related to individual lightning strokes.

Gamma-ray and neutron generation: a world map of terrestrial gamma-ray flash locations (yellow dots) overlaid on regions of frequent lightning (shaded white); the two frequently overlap

Since then, research on high-energy atmospheric physics has gained momentum. Ground-, balloon- and plane-based observations measured fluxes of gamma rays with energies as high as tens of MeV – more than enough to free neutrons from nitrogen. These non-satellite observations also revealed a new phenomenon, much dimmer and of longer duration, that came to be known as gamma-ray glow and that is also accompanied by neutron release. But regardless of whether they are long and dim or short and intense, these gamma-ray phenomena pushed nuclear fusion off the table. As L P Babich argued very clearly in 2014 (JETP 118 375), the only way to create neutrons in a thunderstorm in detectable numbers is by photonuclear reaction of gamma rays with nitrogen, and, to a lesser degree, with oxygen.
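
Written out explicitly, the photonuclear channels in question are (standard nuclear reactions, consistent with the thresholds quoted earlier):

```latex
% Photonuclear neutron production on atmospheric nitrogen
% (threshold about 10.5 MeV, as quoted in the text):
\gamma + {}^{14}\mathrm{N} \;\rightarrow\; {}^{13}\mathrm{N} + n
% and, to a lesser degree, on oxygen (threshold about 15.6 MeV):
\gamma + {}^{16}\mathrm{O} \;\rightarrow\; {}^{15}\mathrm{O} + n
```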

Shifting the problem

So far, so good: we understand that neutrons can be generated in a thunderstorm by gamma rays with energies well above the 10.5 MeV photonuclear threshold of nitrogen. If you look more closely, however, you will see that this explanation merely shifts the problem. Gamma rays in our atmosphere are mostly the result of bremsstrahlung radiation, which occurs when energetic electrons and positrons (collectively known as leptons) collide with air molecules. So how does a thunderstorm generate substantial numbers of leptons with > 10 MeV energies? What role do the electric fields inside the storm play, and which fields at which stage of storm evolution are responsible?

To answer these questions, we need to dive into the current theory of lightning physics. First, we need to understand how free electrons move in a thunderstorm environment. In vacuum, electrons can easily be accelerated by external electric fields; this is how particle accelerators in the laboratory typically work, from Braun tubes (cathode-ray tubes) up to synchrotron facilities such as DESY. However, in air, electrons also lose energy in collisions with air molecules. Hence, as long as the field does not exceed a threshold of about 0.2 MV/m at standard pressure and temperature, friction from inelastic or ionizing collisions with air molecules balances the acceleration provided by the field, and electrons drift with a field-dependent velocity rather than accelerating continuously (see “Runaway electrons” graph). On the other hand, if the electric field exceeds a value of about 25 MV/m, friction is always smaller than the acceleration provided by the field, and all electrons enter a “run-away” mode.

A double logarithmic plot of the friction force on electrons as a function of electron energy

In fact, electric fields in clouds cannot remain at such a high value for long, as classical electric breakdown sets in at a field of 3 MV/m. At that point, free electrons can gain enough kinetic energy from the field to liberate more free electrons when they collide with air molecules, and hence to set off ionization avalanches and eventually create a plasma that cancels out the external electric field. In any case, the fields measured in thunderclouds do not exceed about 1 MV/m, according to balloon measurements performed a decade ago as well as recent non-intrusive measurements using cosmic particle showers as a probe and the radio telescope LOFAR as a detector (P Schellart et al. 2015 PRL 114 165001).

So can such a low field nevertheless support relativistic electrons, which have kinetic energies of 500 keV or more? To answer this question, we need to look again at the “Runaway electrons” figure, which shows the friction an electron experiences in air at standard temperature and pressure as a function of the electron energy. This friction reaches a maximum at an electron energy of about 200 eV and then decreases, before increasing again when the electrons attain MeV energies and start radiating substantial amounts of gamma rays by bremsstrahlung. The figure also indicates the acceleration force eE in an electric field E. If this force is larger than the friction, electrons gain energy, and if it is smaller, they lose energy; this is indicated by the blue arrows. For the field indicated in the figure, the electrons clearly fall into two populations: electrons with a lower initial energy approach a steady mean energy in the eV range, while electrons with higher initial energies are accelerated into the runaway regime, reach MeV energies and start to radiate gamma rays. But how do electrons get into this runaway regime within the moderate fields measured inside thunderstorms?
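
The two-population picture is easy to reproduce with a toy energy-balance model, dE/dx = eE - F(E), integrated along the field direction. The friction curve below is a schematic stand-in with roughly the right landmarks (a peak of a few tens of MV/m near 200 eV, falling to a floor near the 0.2 MV/m runaway threshold); it is not the data behind the “Runaway electrons” figure, and it ignores the bremsstrahlung rise at the highest energies.

```python
# Toy runaway-electron model: integrate dE/dx = eE - F(E) along the field.
# F(E) is a schematic friction curve (illustrative shape only), peaking near
# 200 eV at ~27 MV/m and decaying to a ~0.1 MV/m floor; the bremsstrahlung
# rise at very high energies is deliberately left out.
import numpy as np

def friction_eV_per_m(E_eV):
    bump = 2.7e7 * np.exp(-0.5 * (np.log(E_eV / 200.0) / 2.2) ** 2)
    return bump + 1.0e5

def evolve(E0_eV, field_V_per_m, dx=0.01, steps=500_000):
    """Follow one electron's kinetic energy and report its fate."""
    E = E0_eV
    for _ in range(steps):
        E += (field_V_per_m - friction_eV_per_m(E)) * dx   # eE - F(E), in eV/m
        if E < 1.0:
            return "slows to low energy"   # the low-energy, drifting population
        if E > 1.0e7:
            return "runs away (>10 MeV)"   # the gamma-ray-producing population
    return f"still drifting at {E:.3g} eV"

field = 3.0e5   # 0.3 MV/m, just above the lower runaway threshold in the text
for E0 in (1.0e2, 1.0e4, 1.0e6):
    print(f"initial energy {E0:8.0e} eV -> {evolve(E0, field)}")
```

In this toy, electrons starting below a crossover energy (a couple of hundred keV for a 0.3 MV/m field) are dragged down to low energies, while those injected above it run away to MeV energies, just as the figure describes.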

Recall that in the previous section, we mentioned that there are two different thunderstorm phenomena: terrestrial gamma-ray flashes and gamma-ray glows. These phenomena produce neutrons on different time scales, and according to our present understanding, they are associated with different physical mechanisms.

Careful measurements have found that gamma-ray glows actually occur before lightning activity starts, when large volumes of air inside a developing thunderstorm have built up an electric field exceeding the lower runaway threshold of ~0.2 MV/m. Cosmic rays shooting into the atmosphere create particle showers, including a continuous flux of relativistic electrons that gain additional energy in this electric field, and form relativistic run-away electron avalanches. This process can continue as long as the field is present and electrons are refreshed, producing a dim gamma-ray glow of long duration. Recent work by Ashot Chilingarian and colleagues, especially, shows clear evidence of thunderstorm-correlated boosts in the flux of high-energy electrons, gamma rays and neutrons on a timescale of minutes. Remarkably, these fluxes are not correlated with lightning strokes, but are actually competing with lightning as a discharge mechanism. As a result, lightning tends to snuff out gamma-ray glows.
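
Schematically, such an avalanche multiplies the steady trickle of cosmic-ray seed electrons exponentially (the standard avalanche description; the e-folding length λ is field-dependent and is not a number taken from this article).

```latex
% Relativistic runaway electron avalanche: exponential multiplication of the
% cosmic-ray seed flux N_0 over a distance z along the field,
N(z) = N_0 \, e^{z/\lambda}
% The avalanche length \lambda shrinks as the field strengthens, and the
% multiplication continues for as long as the field stays above the
% ~0.2 MV/m runaway threshold, sustaining the long-lived gamma-ray glow.
```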

Terrestrial gamma-ray flashes, on the other hand, are correlated with lightning “leaders”. While our eyes just see one lightning flash, high-speed cameras resolve how the lightning channel grows from the cloud to the ground, as shown in the “Follow the leader” images. The growing plasma channel shown in the left and middle panels is called a lightning leader. (It should be noted that leaders propagate for much longer distances inside clouds, in a more horizontal direction, before the visible cloud-to-ground lightning starts. However, during that stage they can only be detected with radio antennas, not by eye or by cameras.) Negatively charged lightning leaders propagate in steps, creating the characteristic zig-zag pattern familiar to anyone who has ever witnessed a big electrical storm. This stepping sometimes occurs through the formation of so-called “space stems” that initially have no visible connection to the glowing lightning leaders (middle panel of the figure), although they will become part of the leader somewhat later. The formation of a space stem and its integration into a leader constitutes one leader step.

It is now thought that during such a step, the very transient electric field near the leader tip could be high enough to accelerate even thermal electrons into the run-away regime in one intense burst – a so-called “cold runaway” – and thereby produce a TGF. But compared with gamma-ray glows, TGFs are much harder to study. In gamma-ray glows, the charged-particle densities are so low that they do not change the local electric field, which makes the problem linear: one can simply model the particles developing in a given external (thundercloud) field. For TGFs, the situation is very different. The process of leader stepping is increasingly well measured, but the physical mechanism behind it is not well understood. One thing is clear: it is deeply nonlinear. To propagate, the leader both enhances the ambient electric field at its tip and interacts with its streamer corona, as well as with the mysterious space stems. Leaders are dynamical structures that change the electric field, accelerate electrons into the run-away regime and interact with them in extending the plasma region of the lightning channel. Researchers in our group, led by Jannis Teunissen, have recently made important progress towards the fully three-dimensional simulations that are required to model many of these processes (arxiv.org/abs/1708.08434). Leader stepping and the associated high-energy emissions pose important challenges to current lightning research.

Neutrons as a diagnostic tool

This brings us back to neutrons. As it turns out, the very phenomenon that guided Libby to a connection between tree rings and thunderstorms could become a handy probe for building a deeper understanding of lightning leaders and the production of TGFs. The reasons are manifold. One is that neutrons are neutral and therefore do not affect the electric fields of the lightning leaders or the overall thunderstorm. Their presence is also a signature of gamma rays significantly exceeding the 10.5 MeV threshold for neutron production in air. Neutrons have a much longer lifetime and spread out more isotropically than gamma rays, which makes them much easier to detect. A nice example is a 2013 study of how neutron bursts at ground level correlate with cloud-to-ground lightning of different polarities. In this work, A A Toropov and colleagues saw that neutrons are only emitted from negative cloud-to-ground lightning, and not from the positive variety. This is consistent with satellite observations showing that TGFs are produced by negative lightning leaders when they step.

The connection between neutrons and thunderstorms has a long history, but the use of neutrons as a tool to research them is still in its infancy. Even so, it is clear that neutrons give us an interesting window to study the highly intense bursts of energetic radiation from thunderstorms known as TGFs. In the future, they might also help us understand how lightning propagates in steps. Such information could be key to protecting our increasingly vulnerable infrastructure from lightning strikes and similar (potentially destructive) processes that occur in plasma and high-voltage technology.

The physics of bread – an introduction

Nathan Myhrvold – the polymath physicist whose passions range from cosmology to cooking – is publishing a massive, five-volume book about the science of bread and bread-making. To find out what happened when Physics World columnist Robert P Crease caught up with this intellectual livewire at his Cooking Lab headquarters in Seattle, see this free-to-read article from the October 2017 issue of Physics World.

Atoms and Josephson junctions simulate 1D quantum liquid

A theory that describes how quantum particles interact with each other in 1D has been put to the test by two independent teams of physicists. In one experiment, aspects of the Tomonaga–Luttinger theory were verified using laser-trapped ultracold atoms. The other study made use of superconducting devices. Confirmation of the theory could lead to the development of new technologies based on nanowires and other 1D systems. Applications include electronics, sensing, energy harvesting and quantum information.

Tomonaga–Luttinger theory describes a 1D ensemble of interacting quantum particles in terms of a Tomonaga–Luttinger liquid (TLL). It predicts properties of 1D quantum systems such as how electrons behave in a nanowire. Testing these predictions in a systematic way has not been possible, however, because it is very difficult to control how particles interact in 1D systems such as nanowires.

Quantum simulator

One way forward is to create an analogous quantum system, such as an ensemble of trapped ultracold atoms, in which parameters such as particle interactions can be controlled. While there has been some progress in studying TLLs using quantum simulators, challenges remain. One problem is that a TLL has a homogeneous distribution of particles, whereas most quantum simulators have spatial order. Ultracold atoms, for example, are held in a 1D lattice with regular spacing.

Now, Bin Yang, Yang-Yang Chen and colleagues at the University of Science and Technology of China have overcome this inhomogeneity problem in an ultracold-atom simulator. Their experiment begins with a regular 1D array of rubidium-87 atoms trapped in an optical lattice of laser light. A laser pulse is then used to eject atoms from the central region of the trap, which sets off a density wave that moves outwards from the centre of the trap. The atomic density in the central region of the trap becomes nearly uniform, thus providing a homogeneous analogue to a TLL.

TLL parameter

By measuring the density and speed of sound in the central region, the team could work out the “TLL parameter” – which measures the level of quantum fluctuations in the system. Yang, Chen and colleagues then measured the momentum distribution in the system and confirmed that it was as predicted by the TLL model.
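
For a uniform 1D Bose gas there is a standard way of writing the Luttinger parameter in terms of exactly these measured quantities (a textbook relation, not a formula quoted from the paper):

```latex
% Luttinger parameter of a uniform 1D Bose gas with density n, atomic mass m
% and measured sound speed v_s (standard relation, not from the paper):
K = \frac{v_F}{v_s} = \frac{\hbar \pi n}{m\, v_s}
% Weak interactions correspond to large K; the strongly interacting
% Tonks-Girardeau limit has K = 1.
```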

Meanwhile at the University of New South Wales, Timothy Duty and colleagues took a very different approach. They made their quantum simulators using lines of superconducting material that are interrupted by Josephson junctions at intervals of about 1 μm. In this case, the quantum particles are the Cooper pairs of electrons that are responsible for superconductivity.

Disorder versus interactions

Josephson junctions are non-superconducting regions through which the Cooper pairs can tunnel. There is inherent disorder in the materials used to make the simulators, and this results in slight differences in the number of Cooper pairs at each junction. By studying simulators with different-sized junctions – and therefore different levels of disorder – Duty and colleagues were able to look at how disorder and particle interactions compete to determine the properties of the TLL. When interactions dominate, the system behaves as a superfluid; but when disorder prevails, it becomes glass-like, with no flow.

Both studies are described in Physical Review Letters.
