Quantum effects seen in collisions of ultracold ions and atoms

A new technique that allows researchers to cool ions to ultracold temperatures by placing the ions in contact with an ultracold atomic “buffer gas” has been developed by researchers in the Netherlands. As well as achieving the first experimental detection of quantum effects in collisions between atoms and ions, the research also opens several intriguing new possibilities, such as quantum states comprising both atoms and ions.

Ultracold atomic gases are routinely prepared in laboratories using a technique called forced evaporative cooling. These gases can then be used to cool trapped ions, and physicists had initially hoped that such ultracold buffer gases could cool ions to their quantum ground states. The problem they encountered is that the electric fields needed to trap the ions cause them to oscillate in the trap, in a process called micromotion. This kinetic energy is dispersed into the neutral atoms of the buffer gas, which heats the gas.

After laser cooling was developed in the 1980s, it became possible to cool ions to extremely low temperatures: because the ions remain isolated from neutral atoms, the technique does not suffer from micromotion-induced heating. Laser cooling has been used to cool individual ions to their ground state of motion, but unfortunately, isolating the ion sacrifices the potential to explore some intriguing aspects of quantum mechanics. “We can get very cold ions and also very cold atoms,” says atomic physicist Rene Gerritsma of the University of Amsterdam, “so why don’t we let them interact once and see what happens when they start to behave quantum mechanically?”

Heaviest and lightest

In 2012, Vladan Vuletić of the Massachusetts Institute of Technology and colleagues showed theoretically that, by increasing the mass of the ion relative to that of the neutral atom, it might be possible to enter the regime in which quantum effects become detectable. “We looked at the periodic table and decided what were the heaviest ion and the lightest atom that we dared to work with,” says Gerritsma. The researchers settled on ytterbium-171 as their ion and lithium-6 as their atom. They then started working on controlling the shapes of the electric fields in their ion trap to ensure the ion would behave exactly as they desired in its interaction with the buffer gas and consequently would be cooled as much as possible.

In new work, the researchers demonstrate buffer gas cooling of ions to approximately 100 μK. This is approximately five times lower than the Doppler limit – the coldest temperature achievable by traditional laser cooling. “People used to believe laser cooling could not cool below the Doppler limit,” explains Gerritsma, “but Bill Phillips and other people were awarded the 1997 Nobel Prize for demonstrating that, by taking the quantum mechanics properly into account, you can cool to much lower temperatures.” Gerritsma’s team has not reached the temperatures achievable using this “sub-Doppler” cooling but they have, for the first time, cooled atom-ion mixtures to temperatures at which quantum effects should be detectable.
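
For context, the Doppler limit is set by the natural linewidth Γ of the cooling transition. As a rough back-of-the-envelope estimate – assuming the commonly used 369.5 nm cooling transition of the ytterbium ion, with a linewidth of about 2π × 20 MHz (figures not quoted in the article) – the limit is

$$T_\mathrm{D} = \frac{\hbar\Gamma}{2k_\mathrm{B}} \approx \frac{(1.05\times10^{-34}\,\mathrm{J\,s})\,(2\pi\times 20\times10^{6}\,\mathrm{s^{-1}})}{2\,(1.38\times10^{-23}\,\mathrm{J\,K^{-1}})} \approx 5\times10^{-4}\,\mathrm{K} \approx 500\,\mathrm{\mu K},$$

roughly five times the ~100 μK reached with the buffer gas, consistent with the comparison above.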

To look for quantum effects, they considered predictions from theoretical physicist Michał Tomza and colleagues at the University of Warsaw in Poland regarding so-called Langevin collisions, which occur when a colliding atom and ion exchange angular momentum. In classical Langevin theory, angular momentum varies continuously and the collision rate shows no structure. However, Gerritsma’s results showed clear peaks in the collision rate – evidence that angular momentum was being exchanged in quantized amounts. “I don’t think people are so surprised to see what they see now,” says Gerritsma, “but it’s the experimental breakthrough: we’re right at the border where it becomes quantum and we believe we can get it colder still in the future.” The team is now working to use magnetic fields to create atom-ion Feshbach resonances, in which colliding pairs couple to molecular bound states.

“I think this is a significant advance,” says Vuletić, who was not involved in this latest work.  “There’s very interesting physics in studying controlled atom-ion collisions, and for that you need to be cold enough so that only one particular process contributes.  The authors have reached this regime and opening this door will be very interesting for future studies. They did everything right and now they just need to overcome some more technical barriers to reach the ultimate limit of the cooling.”

The research is published in Nature Physics.

Physicists place stringent limits on the neutron electric dipole moment

For more than 60 years, physicists have been searching for a minuscule separation of charge within the neutron known as an electric dipole moment. Finding this “neutron EDM” would be a major discovery, in part because it could help explain why matter in the universe was not simply annihilated along with antimatter at the dawn of time. For the moment this dipole remains hypothetical, but new results obtained after painstaking observations of cold neutrons give us a better idea of just how small it would have to be – lowering the previous upper limit by 40% (arXiv:2001.11966).

Although the neutron has no overall charge, a conundrum that has long baffled physicists suggests the particle nevertheless contains a region that is very slightly positive and another that is ever so slightly negative. That problem is the imbalance of matter and antimatter in the cosmos. The imbalance could be explained via the existence of processes that violate what is known as charge conjugation parity (CP) symmetry, but such violation within the Standard Model of particle physics is too limited to account for the overwhelming dominance of matter that we see around us.

Pointing the way

According to Dave Wark of the University of Oxford in the UK, who was not involved in the latest work, a neutron EDM would be “a signpost” of new physics beyond the Standard Model that might account for the matter-antimatter imbalance. That’s because the phenomenon would violate temporal (T) symmetry, and in turn CP-symmetry (the combined CPT being invariant). While a neutron’s spin magnetic moment would change when the flow of time is reversed, an EDM, if it existed, would not – meaning that the relationship between the two properties would change under such a reversal. (This is also true of other particles, and indeed some groups are hunting a possible electron EDM.)

In searching for the tiny putative effect, physicists rely on the fact that an EDM would cause a neutron’s spin axis to rotate when exposed to an electric field. To pick out this rotation they first place neutrons in a constant magnetic field, which itself sets the spin axes rotating. They then measure the particles’ precession frequency and do so again after applying a large electric field. If the neutrons really do have an EDM then their precession frequency should shift in line with the electric field strength.
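
Schematically, with the electric field E applied first parallel and then antiparallel to the magnetic field B, the neutron precession frequency in the two configurations differs by an amount proportional to the EDM d_n, which is how the quantity is extracted:

$$h\nu_{\uparrow\uparrow} = 2\,|\mu_n B + d_n E|, \qquad h\nu_{\uparrow\downarrow} = 2\,|\mu_n B - d_n E| \quad\Rightarrow\quad d_n = \frac{h\,(\nu_{\uparrow\uparrow} - \nu_{\uparrow\downarrow})}{4E}.$$

A null result therefore translates directly into an upper limit on d_n, with the sensitivity improving as the electric field, the precession time and the neutron statistics increase.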

Modern experiments use what are known as “ultra-cold” neutrons. These have such a low energy that their de Broglie wavelength is much longer than the distance between atoms, allowing them to be confined inside a material container over relatively long periods of time – typically several minutes. This increases experimental sensitivity. The most precise measurement until now had been made using neutrons from a reactor at the Institut Laue–Langevin (ILL) in Grenoble, France.
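
As a quick sanity check on the “ultra-cold” label: taking an illustrative neutron speed of about 5 m/s (typical of ultra-cold neutrons), the de Broglie wavelength is

$$\lambda = \frac{h}{m_n v} = \frac{6.63\times10^{-34}\,\mathrm{J\,s}}{(1.67\times10^{-27}\,\mathrm{kg})\,(5\,\mathrm{m\,s^{-1}})} \approx 8\times10^{-8}\,\mathrm{m} \approx 80\,\mathrm{nm},$$

a few hundred times the ~0.3 nm spacing between atoms in a typical solid, which is why such neutrons are reflected from material walls at any angle of incidence and can be bottled.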

In 2006 a group of physicists from the ILL, the University of Sussex and the Rutherford Appleton Laboratory in the UK reported an upper limit for the EDM of 2.9 × 10⁻²⁶ e cm. That result was then modified very slightly in 2015 – to 3.0 × 10⁻²⁶ e cm – after a more detailed analysis of the data by a larger group of researchers.

The latest research has been carried out by a pan-European collaboration led by Guillaume Pignol of the University of Grenoble and Philipp Schmidt-Wellenburg of the Paul Scherrer Institute near Zürich. The new group reused much of the apparatus from the earlier work – which is operated at room temperature – but this time took its neutrons from a spallation source at the Paul Scherrer Institute. It also did more to reduce systematic errors.

As was the case before, the researchers stored neutrons in a chamber also containing atoms of mercury vapour. By measuring the ratio of precession frequencies of the two types of particle they could largely cancel out the effects of any fluctuations in external magnetic fields. But just to ensure they could understand and control the magnetic field as far as possible they also installed a set of caesium magnetometers above and below the chamber. Pignol, Schmidt-Wellenburg and colleagues were able to reduce systematic errors by a factor of five compared to the 2015 result. Collecting data between 2015 and 2016, and performing a series of blind data analyses, they settled on a new, lower upper limit for the neutron EDM of 1.8 × 10⁻²⁶ e cm.

Pushing the limit

To push the limit further down, researchers have been developing a cryogenic version of the experiment. The idea is to cool neutrons by scattering them off atoms of superfluid liquid helium, so boosting the density of ultra-cold neutrons as well as allowing longer storage times and higher electric fields. Wark says that this approach could yield sensitivities hundreds of times higher than is possible now, providing, he adds, that systematic errors can be controlled “to such a spectacular level”. Wark adds that the ILL group installed such an experiment but “were unable to get all the many parts of it working simultaneously for long enough to make a sensitive measurement”.

That potentially leaves the door open to a US collaboration building a cryogenic experiment at the Oak Ridge National Laboratory in Tennessee, which may start data-taking in 2023.

The paper has been accepted for publication in Physical Review Letters.

CT traces disease progression in novel coronavirus

An ongoing investigation into the novel coronavirus (2019-nCoV) outbreak has characterized the most common CT findings associated with the virus and identified possible markers of disease progression several days after the onset of symptoms. The findings were published online 6 February (Radiology 10.1148/radiol.2020200274).

The researchers from the US and China examined the imaging findings and clinical data of 51 patients who were admitted to the Shanghai Public Health Clinical Center and confirmed via nucleic acid testing to have 2019-nCoV.

All of the patients underwent at least one CT exam and all but one had some contact with individuals from the city of Wuhan. The median age of the patients was 49 years and roughly half were women. The most common symptoms were fever (96% of patients), cough (47%) and fatigue (31%).

The group, led by Yuxin Shi, identified bilateral ground-glass opacities on the CT scans of almost 90% of the patients – by far the most common imaging finding – confirming reports from earlier studies.

To be specific, the ground-glass opacities were classified as pure in 77% of the cases, associated with interstitial and/or interlobular septal thickening in 75% of the cases, and associated with consolidation in 59% of the cases. In addition, the opacities involved the peripheral lungs in 86% of the patients and the posterior lungs in 80%. They also found that 80% of the patients’ CT scans showed bronchograms and 55% had a consolidation lesion.

Overall, the most common CT findings for 2019-nCoV coincided with the most common findings for other viruses, including the 2009 swine flu virus (H1N1). The distinguishing characteristic for 2019-nCoV was that the imaging findings tended to appear simultaneously in the same patient, with predominant distribution in the posterior and peripheral part of the lungs.

Furthermore, the researchers characterized disease progression for 2019-nCoV by comparing CT features from the first four days after symptom onset with those obtained after the fourth day. Their comparison uncovered a statistically significant increase in the total number of lung findings over time (p = 0.02).

Most prominently, the proportion of ground-glass nodules with consolidation increased from 21% of the lesions in the first four days after symptom onset to 61% after the fourth day (p < 0.001). This statistically significant increase in the percentage of lesions with consolidation was also evident in older patients (50 years or older): 45% of the lesions showed consolidation in the older patients, compared with only 23% in the younger patients (p < 0.001).

This increase in lung consolidation as the disease extended its course indicates that “consolidation lesions could [serve] as a marker of disease progression or more severe disease,” Shi and colleagues wrote.

Finally, the high prevalence of bilateral organizing pneumonia in the patient cohort points to corticosteroids as a viable option to manage 2019-nCoV pneumonia, they concluded.

  • This article was originally published on AuntMinnie.com. ©2020 by AuntMinnie.com. Any copying, republication or redistribution of AuntMinnie.com content is expressly prohibited without the prior written consent of AuntMinnie.com.

Terahertz waves create short and stable electron pulses

Terahertz radiation has been used to reduce the timing jitter of ultrashort pulses of relativistic electrons. This was achieved independently by two teams, one in China led by Dao Xiang at Shanghai Jiao Tong University and the other in the US led by Emilio Nanni at SLAC National Accelerator Laboratory. Their work could allow researchers to generate high-quality electron beams at far lower costs than current radio-frequency (RF) techniques, allowing for advanced studies of atomic-scale structures and femtosecond-scale processes.

Electron beams comprising ultrashort pulses are rapidly advancing the capabilities of today’s most cutting-edge imaging techniques. Currently, they are generated using RF techniques, which can compress bunches of electrons so that their “heads” and “tails” are separated by less than 10 fs as they move. Yet as researchers’ demands for ever shorter and brighter pulses continue to grow, the RF equipment used to generate them is barely keeping up. Not only is the apparatus bulky and expensive; it causes the distribution of pulse arrival times at a target to become spread out, increasing the timing uncertainty, or jitter, of the beam. So far, this has hindered the progress of many studies requiring the best possible ultrashort pulses.

Velocity boosts

Using similar approaches, both Nanni’s and Xiang’s teams discovered that bunches of electrons produced by a photocathode can be compressed, and their timing jitter reduced, by replacing RF signals with terahertz waves. Their setups included two terahertz sources that are linearly polarized in parallel directions. The waves interact with the electron bunches as they pass through two separate waveguides. During this interaction, each terahertz pulse imparts a velocity kick that varies along the length of the bunch, speeding up the tail more than the head and thereby compressing the bunch. In addition, since the same laser is used to drive the photocathode and both terahertz sources, each radiation-bunch interaction is highly synchronized, reducing the jitter of the beam.
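
A minimal toy model of this “velocity bunching” (purely illustrative, with made-up numbers and no attempt to model the actual terahertz waveguides or relativistic dynamics) shows how a linear velocity kick along the bunch – tail faster than head – shortens the bunch as it drifts:

```python
import numpy as np

# Toy model of velocity bunching: imprint a linear velocity "chirp" on the bunch
# so that trailing electrons move faster than leading ones, then let it drift.
rng = np.random.default_rng(0)

n = 10_000
z0 = rng.normal(0.0, 10e-6, n)   # longitudinal position within the bunch (m), ~10 um rms
k = 1e9                          # chirp strength (1/s): velocity kick per metre of offset
dv = -k * z0                     # tail (z0 < 0) receives the larger forward kick

def rms_length(t):
    """RMS bunch length (m) after drifting freely for time t (s)."""
    return np.std(z0 + dv * t)

for t_ns in (0.0, 0.5, 0.9, 1.0):   # t = 1/k = 1 ns is the ideal longitudinal focus
    print(f"t = {t_ns:.1f} ns  ->  rms bunch length = {rms_length(t_ns * 1e-9) * 1e6:.3f} um")
```

In this idealized picture the bunch collapses to a point at the focus; in reality space charge, energy spread and the curvature of the terahertz field set the minimum length, while synchronizing the kick to the same laser that drives the photocathode is what keeps the arrival-time jitter low.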

In each case, beams were compressed to lengths of under 40 fs – not quite as short as those reachable through RF structures used today for electron pulse generation. However, the jitters produced in both studies were reduced to just around 30 fs – a significant improvement on previous techniques. All the while, the apparatus necessary for this is far more affordable than today’s bulky RF-based generators.

Although the teams did not collaborate, their similar methodologies both clearly demonstrated the unprecedented timing resolution afforded by terahertz radiation. With further improvements, their approach could soon satisfy the growing demand for intense, ultrashort electron beams. Possible applications include ultrafast electron diffraction, which can be used to capture images of atomic-scale structures. It could also be used to study the femtosecond-scale processes that play out within materials including semiconductors, yielding important new insights into their physics.

The Shanghai Jiao Tong University and SLAC studies are described in separate papers in Physical Review Letters.

Queensgate: collaborating to push the boundaries of measurement science

Queensgate – a brand of Prior Scientific Instruments – offers its customers maximum precision and accuracy along with high-speed solutions for the nanopositioning challenges they face. Customer applications range from hard disk testing for companies such as Seagate (see ‘Queensgate reaches the pinnacle of nanopositioning performance’) to realizing SI units with national measurement institutes such as the National Physical Laboratory (NPL).

It is this reputation that has seen Queensgate provide high-precision nanopositioning stages for some of NPL’s atomic force microscopes (AFMs), including NPL’s extremely accurate metrological AFM.

The metrological AFM forms a key part of some of NPL’s most important work. As the National Metrology Institute for the UK, it is NPL’s responsibility to ‘realize’ the international standards of measurement – the SI units – in the country. NPL uses this AFM to calibrate transfer standards that are then passed on to other AFM users in order for them to calibrate their AFMs.

Queensgate has been working with NPL in this way for many years. But more recently, NPL came to Queensgate with a more challenging task.

Realizing the metre

The redefinition of the SI in 2019 now means all SI units are based on precise, unchanging and universal fundamental constants of nature. New definitions of the kelvin, ampere, mole and the headline-grabbing kilogram are now in effect. Perhaps less well-known are the changes made to realizing the metre.

The metre was already defined by the speed of light back in 1983. Since then, metrologists have used optical interferometers to realize the metre and make length measurements – and for most measurements this technique is extremely precise. At the nanoscale, however, metrologists must subdivide the wavelength of light, which is several hundred nanometres, making the technique prone to errors.

“We needed a bottom up approach, at an atomic scale,” explains Andrew Yacoot, principal research scientist at NPL and chair of the Working Group for Dimensional Nanometrology of the Consultative Committee for Length (one of ten Consultative Committees that oversee the SI units).

To solve this issue, Yacoot and colleagues from other metrology labs used a method originally developed for the measurement of the Avogadro constant, in which the lattice spacing of silicon is measured very accurately using a technique called X-ray interferometry. “We were able to use this technique together with the known value of silicon lattice spacing to characterize and measure errors in optical interferometers or other displacement sensors, as a technique for making traceable length measurements at the nanoscale,” he says.
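
The attraction of the method is that the silicon lattice acts as a built-in sub-nanometre ruler: the X-ray interferometer produces one fringe for every lattice plane the moving crystal traverses, so a displacement follows simply from counting fringes. As a rough illustration (the specific reflection used at NPL is not stated here), taking the silicon (220) lattice spacing of about 0.192 nm,

$$x = N\,d_{220} \approx N \times 0.192\ \mathrm{nm},$$

so a count of, say, 1000 fringes corresponds to a displacement of roughly 192 nm – traceable to an accurately known lattice parameter rather than to a subdivided optical wavelength.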

Queensgate was tasked with designing a custom nanopositioning stage and digital controller for Yacoot’s X-ray interferometry work. “The specification was challenging, because I wanted a long range of measurement (several hundred micrometres) together with picometre resolution and the stage’s payload was almost 1 kg,” he explains.  “These are competing requirements for stage design.”

To reduce the overall system noise, the Queensgate team used a digital interface to apply commands. This also allowed greater timing accuracy, as the field-programmable gate array (FPGA) could conduct all the data processing, including commanding the controller. “We provided an out of the box system that NPL can command and resolve down to 10–20 picometres, ten times smaller than the spacing between the atomic planes in a silicon crystal,” says Queensgate’s Principal Electronics Engineer. “That, I think, is impressive.”

A bi-directional partnership

Dialogue was the key element of the partnership for Yacoot and his team: “We’re very keen to have a deep understanding of how every component works and access to all the signals and control of the equipment, rather than having a black box,” he says. “We want to [use equipment] produced by a company that’s open to a dialogue with us and willing to share information – that’s certainly been the case with Queensgate.”

In fact, two-way dialogue and collaboration is the bedrock of the relationship between the two organizations. All Queensgate products are tested using interferometers and electronics, most of which are supplied by NPL. NPL acts as a third-party collaborator in verifying measurements on new products, validating their performance, which in turn has helped Queensgate secure substantial new business.

Collaboration, then, benefits both parties. NPL gets to use Queensgate’s high-precision piezo systems and gains access to industry-level expertise in designing high-precision flexure stages for their applications. Meanwhile, Queensgate gains trusted verification of the accuracy of its products. Moreover, by working in partnership with NPL’s leading experts on applications that push the limits of performance, Queensgate is able to maintain its position as an expert in producing the highest precision stages. Perhaps best of all, the collaboration benefits us all, as knowledge gained from public–private collaborations like this one leads to more accurate measurements, which in turn lead to more efficient and effective technologies and products.

Introducing the π-ton, which could be the newest known quasiparticle

A new type of quasiparticle has been predicted by Anna Kauch and colleagues at Technical University Vienna in Austria. Using computer simulations, the team concluded that the “π-ton” (pronounced pie-ton) is created by the bonding between two electron-hole pairs in semiconductor-like materials. The researchers now hope that π-tons could soon be studied in real experiments and even put to work in photovoltaics.

Quasiparticles are particle-like excitations that emerge from the collective behaviour of electrons and other entities in solids. They include polaritons, which arise from the interaction between electrons and light, and excitons, which are created when an electron in a semiconductor’s valence band is excited by a photon into the conduction band. In the electron’s place, a positively charged “hole” is left behind in the valence band, to which the electron remains strongly attracted. The electron-hole pair behaves like a particle – an exciton.

Kauch’s team initially set out to study excitons by doing computer simulations of “strongly correlated” materials that have strong interactions between electrons. Unlike semiconductors, which initially possess filled valence bands and empty conduction bands, strongly correlated materials have half-filled conduction bands. This means that their behaviour is dominated by fluctuations in “charge density waves” – periodic, standing-wave-like modulations of the electron density.

Reverse rotation

The researchers had hoped to study the characteristics of exciton formation in these materials but, to their surprise, their simulations seemed to yield an entirely new type of quasiparticle. Kauch and colleagues discovered that, when excited by a photon, two electron-hole pairs became bound together by the material’s charge density wave fluctuations, whose phase is reversed by 180°, or π radians, from one crystal lattice point to the next. This behaviour led directly to the proposal of a new quasiparticle, which the team dubbed the π-ton.

To make sure this result was not simply a quirk of their simulation, the team recreated the same conditions across multiple models. The π-ton reappeared every time, removing doubt about its existence. Kauch’s team believe that this provides a strong reason to do experiments involving real strongly correlated materials – in which π-tons could be created through photon excitation, then confirmed by the photons they re-emit as they disappear.

Already, the physicists have suggested samarium titanate as a strong candidate material for these efforts, since previous experimental data appear to suggest that it can host unusual quasiparticles. If achieved, such experiments could bring about a more in-depth understanding of the quantum interactions that take place between light and solids. They could also provide materials physicists with new research opportunities, along with innovations in technological applications including photovoltaics and semiconductors.

The research is described in Physical Review Letters.

Taking a tiger’s pulse, mending a broken heart and dancing your PhD

How do you monitor a lion’s breathing rate or take a tiger’s pulse? With great difficulty, one would imagine.

A team at the University of South Australia has now developed a way to perform these routine health checks using a high-resolution digital camera. The new approach will save the animals the stress of an anaesthetic – and presumably will greatly lower the stress of the zookeepers too.

The researchers filmed animals at Adelaide Zoo – including a giant panda, African lion, Sumatran tiger, orangutan, koala, red kangaroo and a little blue penguin – from up to 40 m away. By detecting tiny movements in the chest cavity, they could record the animals’ heart and breathing signals without needing any physical contact or disrupting the animals’ daily routine. You can hear engineer Javaan Chahl discuss the pilot study in the video above.

On this Valentine’s Day, while some may be planning celebrations, spare a thought for the broken-hearted. But help may be at hand – at least in a bioengineering sense. Heart disease has a huge impact on patients’ quality of life and researchers are continuously looking to develop new treatments, such as cardiac patches, which can help restore damaged heart tissue following a heart attack. Bioengineers from Trinity College Dublin have now fabricated a conductive cardiac patch that can endure the strains and stresses exhibited by human cardiac muscle tissue as the heart beats.

The researchers used melt electrospinning writing to make a patch that can withstand repeated stretching and showed good elasticity. They fine‐tuned the patch geometry to reflect the directionally‐dependent mechanics of the heart, and coated the patches with an electroconductive polymer to provide conductive properties close to those of human myocardium. The team says that this work “essentially takes us one step closer to a functional design that could mend a broken heart”.

Finally, today sees the announcement of the winners of the 12th annual Dance Your PhD contest. This year’s overall winner is neuroscientist Antonia Groneberg. Her zebrafish larvae-inspired video, Early life social experiences shape social avoidance kinematics in larval zebrafish, “merged dance and science for an aesthetically stunning and intellectually profound masterwork of art,” according to judge Alexa Meade.

The judges also highlighted the winner of the physics category for special recognition, for its “original rap and professional production”. The “hilarious and yet scientifically informative” video, Utilizing multispectral lidar in the detection of declined trees, is a dance about multispectral scanning of forests, created by Samuli Junttila.

Precision scaffolds tailor biomaterials to promote wound healing

Biomaterials are used in the clinic as dressings that promote healing of wounds or burns. In addition to conventional wound dressings, scientists are developing skin substitutes containing patient-derived cells, as well as biomaterials that incorporate growth factors to stimulate and facilitate the healing process.

Once implanted, such biomaterials are exposed to the body’s immune response, particularly macrophages that can be either pro-inflammatory (M1) or pro-healing (M2) in type. While an initial inflammatory state is important for healing, prolonged inflammation is detrimental to tissue regeneration. Directing the immune response after implantation is thus a major challenge in the design of new biomaterials.

One way to drive macrophage type is by controlling their physical microenvironment, for example by tailoring the biomaterial geometry. With this aim, researchers at the University Hospital of Würzburg have used melt electrowriting (MEW) to create high-precision 3D tissue scaffolds that cause human macrophages to differentiate towards the anti-inflammatory M2 type (Biofabrication 10.1088/1758-5090/ab5f4e).

“We wanted to determine the optimal scaffold geometry for the polarization of human macrophages into the M2 type,” explains senior author Jürgen Groll. “And when we found that macrophages on box-shaped scaffolds showed enhanced M2 polarization, we wanted to further determine the optimal pore size for them.”

Shape matters

Groll and colleagues used a customized MEW printer to fabricate 3D porous fibre scaffolds from the biocompatible polymer polycaprolactone (PCL). They created scaffolds with box-shaped, triangular, round and disordered geometries, and cultivated human-monocyte-derived macrophages on these scaffolds for seven days.

Cellular morphology

Scanning electron microscopy (SEM) revealed that cell morphology differed according to scaffold geometry. In particular, macrophages grown on box-shaped scaffolds developed an elongated shape and stretched across the pores.

Gene expression profiles after seven days’ growth also depended upon scaffold geometry, with box-shaped scaffolds appearing most promising for promoting macrophage differentiation towards the M2 type. Macrophages cultivated on box-shaped scaffolds showed the highest expression of the M2 marker CD163, as well as the strongest downregulation of the pro-inflammatory cytokines IL-1β and IL-8 (which are released by M1 macrophages).

Guided by these outcomes, the team fabricated further box-shaped scaffolds with pore sizes from 40 to 100 μm and seeded them with macrophages. SEM images showed that, after seven days, macrophages cultivated in these scaffolds could stretch along single fibres and bridge across pores.

Decreasing the pore size increased the number of elongated macrophages with long cellular extensions. On scaffolds with 40 μm pores, more than half of the cells were elongated and had an average length of 80 μm. For scaffolds with 100 μm pores, only 20% of the cells were elongated, with a length of roughly 50 μm.

Gene expression

To study the impact of pore size on macrophage differentiation, the researchers examined gene expression from macrophages grown for seven days on scaffolds with varying pore sizes, as well as on 2D PCL films. All of the porous scaffolds triggered a significant decrease in M1 markers compared with the 2D film, suggesting an anti-inflammatory differentiation effect of porous scaffolds over the seven days.

Immunofluorescent staining

Expression of M2-specific markers significantly increased on scaffolds with a pore size of 40–60 μm, but decreased on those with 80 or 100 μm pores. Macrophages on the 2D control showed minimal up- or down-regulation of M2 markers over the seven days.

The team also examined the macrophages’ phagocytic activity (ingestion of other cells or particles) by adding fluorescent beads to the culture medium. Phagocytic activity is important in initial healing, to act against pathogens that enter the wound. However, high activity is a characteristic of inflammatory M1 macrophages.

Fluorescence imaging showed that initial phagocytotic activity was far lower on scaffolds with smaller pores than on those with larger pores. After seven days, there was less phagocytic activity on the 3D scaffolds than on the 2D control.

“We showed that macrophages on 3D scaffolds have a higher phagocytotic activity on day one than on day seven and, therefore, would still be able to react against pathogens,” Groll tells Physics World. “However, the activity is lower on smaller pore sizes, which is beneficial because the risk of frustrated phagocytosis – where macrophages try to internalize something that is too large – is also diminished.”

Groll and co-authors conclude that scaffolds with precisely controlled pore sizes cause elongation of adherent human macrophages, along with a polarization towards M2 type – effects that were most pronounced for the smallest 40 μm pores. These findings could enable creation of pro-healing scaffolds solely through structural control, to improve biomaterials for tissue regeneration and wound healing.

The team has several follow-up studies ongoing and planned. “We are working on the development of other scaffold types that can also promote the elongation of macrophages,” says Groll, noting that this research was conducted within the ERC funded project Design2Heal. “In addition, we would like to investigate the biological mechanism for the elongation-driven polarization of macrophages. Finally, in vivo studies are planned.”

Conspiracy theories, smartphone apps for identifying skin cancer, classical time crystals

In this latest instalment of the Physics World Weekly podcast we scratch our heads over why people believe in conspiracy theories and ask what physicists can do to help avoid the propagation of bizarre and sometimes dangerous ideas.

We also look at smartphone apps that allow someone to analyse a skin lesion to decide whether to seek medical advice; and we chat about a study that looks at the physics that drives lotus plants to have differently-shaped leaves.

Finally, we delve into the mysterious world of time crystals and discover why they could be governed by purely classical physics.

An uncertain growth

When I first picked up Vaclav Smil’s latest book, Growth: From Microorganisms to Megacities, I wondered whether his main argument would be that systems as diverse as nature and demographics follow a universal growth trend. Indeed, it’s true that there are many similarities to learn from: the height gained by sunflowers during the summer, for example, follows the same growth pattern as the average area of American houses since 1990 and the adoption of mobile phones over the past two decades. But Smil – a Czech-Canadian environmental scientist and policy analyst – also takes pleasure in exploring the idiosyncrasies of these different growth processes, and the factors that influence them.

Throughout Growth he presents rigorous quantitative analyses of disparate systems, ranging from biological processes and crop cultivation to economies and changes in population. Many of these follow an S-shaped growth trajectory, in which initial incremental gains are followed by rapid expansion, and then a long tailing off as the system approaches its limit. Other processes may show more exponential growth from the start – such as the rapid adoption of the telephone at the beginning of the 20th century – but the gains always seem destined to level off over time.

Smil exploits these detailed analyses to draw important conclusions about the nature of growth. It becomes clear that even small variations or external interventions can disrupt the neat progression of an expected growth trajectory. The rapid spread of a flu epidemic, for example, can be curtailed by vaccinating just 20% of the population, while only tiny changes in rainfall or temperature can wipe out expected improvements in crop yields. Coupled with that unknown variability is the difficulty of fitting the best growth curve to observed data.

I particularly enjoyed Smil’s discussion of attempts to estimate how many musical masterpieces Mozart could have produced if he had lived beyond the age of 35. Some previous commentators had fitted an S-shaped curve to Mozart’s cumulative musical output, from which they concluded that he must have written 18 unpublished works as an infant, and that his creativity would have been 91% exhausted by the time he died. Smil puts those claims into doubt by presenting four alternative growth trajectories, all of which provide a good fit to Mozart’s back catalogue, with the number of projected compositions at age 50 ranging from fewer than 800 to more than 1300.
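
The underlying problem is easy to reproduce: different S-shaped models can fit the early portion of a growth record almost equally well, yet can project very different ceilings. Here is a small illustration using synthetic data (nothing to do with Smil’s datasets or Mozart’s catalogue), fitting logistic and Gompertz curves to the same early points with SciPy:

```python
import numpy as np
from scipy.optimize import curve_fit

# Two common S-shaped growth models
def logistic(t, K, r, t0):
    return K / (1.0 + np.exp(-r * (t - t0)))

def gompertz(t, K, r, t0):
    return K * np.exp(-np.exp(-r * (t - t0)))

# Synthetic "observations" covering only the early part of the growth
rng = np.random.default_rng(42)
t_obs = np.linspace(0, 10, 15)
y_true = logistic(t_obs, 1000.0, 0.5, 12.0)          # true curve saturates well beyond t = 10
y_obs = y_true * (1.0 + 0.05 * rng.standard_normal(t_obs.size))

# Fit both models to the same early data
p0 = [1500.0, 0.5, 12.0]
p_log, _ = curve_fit(logistic, t_obs, y_obs, p0=p0, maxfev=20000)
p_gom, _ = curve_fit(gompertz, t_obs, y_obs, p0=p0, maxfev=20000)

# Both describe the observed range comparably well, but the projected
# ceilings K (and hence any long-range forecast) can differ substantially.
for name, f, p in [("logistic", logistic, p_log), ("gompertz", gompertz, p_gom)]:
    rms = np.sqrt(np.mean((f(t_obs, *p) - y_obs) ** 2))
    print(f"{name:9s}  rms residual: {rms:6.1f}   fitted ceiling K: {p[0]:8.0f}")
```

In runs like this the two fits typically match the observed points about equally well while disagreeing on where growth will level off – exactly the trap Smil warns against.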

The lesson to be learnt, argues Smil, is that growth models are an unreliable predictor of the future. Growth curves often provide valuable insights into the evolution and development of particular systems, and offer some predictive power for repeatable processes that are well understood – such as the growth of bacteria and other living organisms. But extrapolating a trajectory from a few early data points is fraught with danger, particularly when dealing with more complex systems such as cities, economies or civilizations.

Smil is particularly critical of what he sees as wildly optimistic predictions for the development and adoption of new technologies, such as claims that we will all be driving electric vehicles by 2025. Such growth projections, he argues, often mistake initial performance improvements for the early stages of an exponential curve, while most new technologies advance in a more stepwise fashion – where periods of fast growth as new materials or designs are introduced are interspersed with frequent and sometimes long-lived plateaus.

What can seem like disruption is also often the result of previous, more incremental, advances – the recent explosion of digital technologies, for example, has been enabled by successive innovations in optical fibres and long-distance communications systems over the last 50 years or so. Even Moore’s law, which for decades has successfully predicted exponential growth in the number of transistors that can be fitted onto a silicon chip, is likely to slow down, now that transistor linewidths have reached the atomic scale – although Smil notes that computing power will continue to grow rapidly due to other improvements, such as more specialized chips and the emergence of photonic and quantum technologies.

Smil is also sceptical of the role that technological innovation plays in driving economic progress. The key test, he says, is whether a new invention improves productivity or quality of life – and in those terms the most innovative period was the half century before the First World War, when electricity, telephones and motorized vehicles all became widespread. In contrast, the digital revolution of the past few decades may have changed how we communicate and consume information, but it hasn’t delivered any measurable improvement in economic prosperity.

But Smil reserves most of his ire for the modern notion that economies are healthy only if they grow. He has long been a critic of using gross domestic product (GDP) as an economic indicator, partly because it is an aggregate metric that hides as much as it reveals. Nor does it truly reflect economic prosperity, since headline GDP growth takes no account of unpaid work – such as caring for children or the elderly – and can be achieved without any significant improvement in the quality of life. Japan, for example, enjoyed annual GDP growth of around 8% throughout the late 1950s and the 1960s, but wages remained low for most industrial workers.

Even more troubling to Smil is that the GDP metric ignores the impact of economic growth on natural resources and the environment. Alternative measures that take these factors into account – including a study by Smil himself on China’s rapid economic growth in the 1990s – suggest that gains in GDP can be negated or even reversed by the loss of so-called natural capital.

And this is the crux of Smil’s argument: that the growth of human activity and productivity is fundamentally limited by the resources available on our planet, just as the growth of sunflowers is limited by the amount of light available for photosynthesis. Those limits are already being reached, he says, with soil erosion causing crop yields to decline, irrigation depleting deep aquifers faster than they can be replenished, and urban spread and deforestation causing a loss of biodiversity. These and other environmental impacts – including the unpredictable consequences of a rapidly warming climate – are largely ignored in the modern quest for perpetual economic growth.

Smil doesn’t necessarily believe that a climate calamity is just around the corner, but he also rejects the idea that radical technologies – such as terraforming Mars – offer a viable solution for our long-term survival. Instead, he believes we need to recalibrate our expectations of success. Our progress should no longer be defined by economic growth or material consumption, but by our ability to inhabit our planet alongside other species for millennia to come. It’s hard to argue with the conclusion, but easy to question whether those in power will be persuaded to chart a different course.

  • 2019 The MIT Press 664pp £30hb