
Double-sided microfluidic blood oxygenator makes artificial placenta

Preterm births account for about 10% of all births in the US and, according to the World Health Organization, this number is increasing rapidly. The survival rate for babies with a gestational age of 28 weeks or less is lower than 50%, with respiratory distress syndrome being the second major cause of death. This is because the lungs are among the last organs to fully develop. One of the main challenges is to deliver oxygen to new-borns using external devices, such as mechanical pumps, until their lungs are fully formed – but these devices can cause serious problems in themselves.

Researchers led by P. Ravi Selvaganapathy at McMaster University in Canada and Christoph Fusch at Paracelsus Medical University in Germany have now developed a passive lung device that is pumped by the baby’s own heart, in which the arterio-venous pressure differential is between just 20 and 60 mmHg. Such a device, known as an “artificial placenta”, consists of microchannels that efficiently exchange oxygen between the blood and the outside air, and would be connected to the umbilical cord of the new-born.

Although the concept itself is not new, the design developed by Selvaganapathy and colleagues makes use of both sides of the microchannel network for gas exchange (as opposed to just one side as in previous devices). This significantly increases the surface area to volume ratio of the device.

343% better in terms of oxygen transfer

The highest-performing “double-sided single oxygenator units” (dsSOUs) that the researchers made were about 343% better in terms of oxygen transfer compared to single-sided SOUs with the same height. They used their design to make a prototype containing a gas exchange membrane with a stainless-steel reinforced thin (50-micron-thick) PDMS layer on the microchannel network. This design was based on a previous one developed in their lab (Biomicrofluidics 12 014107).

In the present work, they succeeded in incorporating a slightly thicker (150-micron) steel reinforced membrane on the other side of the blood channels to increase gas exchange. The new fabrication process means a 100% increase in the surface area for gas exchange while continuing to mimic the placenta by ensuring that the priming volume (how much blood is removed at a time for oxygenation) remains low.

“The key innovation here is developing a large-area microfluidic device,” says Selvaganapathy. “You want it to be microfluidic because a 1-kg baby, for example, might only have 100 ml of blood. You want a device to use only one-tenth of that volume at a time.”

Meeting 30% of the oxygenation needs of a preterm neonate

The team has already produced an optimized oxygenator to build a lung assist device (LAD) that could meet 30% of the oxygenation needs of a preterm neonate weighing between 1 and 2 kg. The LAD provides an oxygen uptake of 0.78-2.86 ml/min, which corresponds to an increase in oxygen saturation from around 57% to 100% in a pure oxygen environment.

Reporting on their work in Biomicrofluidics 12 044101, the researchers say that the design could be improved by coating the surface of the dsSOUs with antithrombin-heparin or polyethylene glycol to improve the anticoagulation properties of PDMS surfaces, which are in contact with blood. “Finally, new designs that have higher gas exchange can be used to provide the sufficient oxygenation in ambient air,” they write.

Jeffrey Borenstein of the Charles Stark Draper Laboratory in Cambridge, Massachusetts, who was not involved in this work says that this new study “targets an extremely exciting and promising opportunity in artificial organs research, using microfluidics technology to overcome many of the current limitations of respiratory assist devices.

“Selvaganapathy and colleagues’ advance points to microfluidics as a means to reduce the incidence of clotting, miniaturize the device, and potentially operate the system on room air to enable portable and wearable systems,” he tells Physics World.

Record-breaking entanglement uses photon polarization, position and orbital angular momentum

Physicists in China have fully entangled 18 qubits by exploiting the polarization, spatial position and orbital angular momentum of six photons. By developing very stable optical components to carry out technically demanding quantum logic operations, the researchers generated more combinations of quantum states at the same time than ever before – over a quarter of a million. They say that their research creates a “new and versatile platform” for quantum information processing.

Hans Bachor of the Australian National University in Canberra describes the work as a “true tour de force” in photon entanglement. “This team has amazing technology and patience,” he says, having been able to generate entangled pairs of photons “significantly” faster and more efficiently than in the past.

The decades-old dream of building a quantum-mechanical device that can outperform classical computers relies on the principle of superposition. Whereas classical bits exist as either a “0” or a “1” at any point in time, quantum bits, or “qubits” can take on both values simultaneously. Combining many qubits, in principle, leads to an exponential increase in computing power.

Physicists are now working on several different technologies to boost the qubit count as high as possible. Last year, researchers at the University of Maryland built a basic kind of quantum computer known as a quantum simulator consisting of 53 qubits made from trapped ytterbium ions. And in March, Google announced that it had built a 72-qubit superconducting processor based on the design of an earlier linear array of nine qubits.

Control and readout

However, quantity is not everything, according to Chao-Yang Lu of the University of Science and Technology of China (USTC) in Hefei. He says it is also crucial to individually control and read out each qubit, as well as having a way of hooking up all of the qubits together using the phenomenon of entanglement. Described by Einstein as “spooky action at a distance”, entanglement is what enables multiple qubits – each held in a superposition of two states – to yield the exponential performance boost.

Maximizing the benefits of entanglement involves not only increasing the number of entangled particles as far as possible but also raising their “degrees of freedom” – the number of properties of each particle that can be exploited to carry information. Three years ago, Lu was part of Jian-Wei Pan’s group at USTC, which entangled and teleported two degrees of freedom – spin and orbital angular momentum (OAM) – from one photon to another. In the latest work, they have gone one better by also entangling the photons’ spatial information.

The team begin by firing ultraviolet laser pulses at three non-linear crystals lined up one after another. This generates three pairs of photons entangled via polarization. They then use two polarizing beam splitters to combine the photons so each particle is entangled with every other, before sending each photon through an additional beam splitter and two spiral phase plates. This entangles the photons spatially and via their OAM, respectively. Finally, they measure each of the three degrees of freedom in turn. The last and most difficult of these measurements – the OAM – is achieved by using two consecutive controlled-NOT gates to transfer this property to the polarization, which, the researchers say, “can be conveniently and efficiently read out”.

Significant technical hurdle

Lu says that a significant technical hurdle was operating the 30 single-photon interferometers – one for the spatial measurement and four for the OAM of each photon – with sub-wavelength stability. This they did by specially designing the beam splitter and combiner of each interferometer and gluing them to a glass plate, which, says Lu, isolated the set-up from temperature fluctuations and mechanical vibrations. The researchers also had to find a way of simultaneously recording all the combinations of the 18 qubits – of which there were 262,144 (2¹⁸). To do this, they brought together 48 single-photon counters and a home-made counting system with 48 channels (2³ = 8 channels per photon).
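
Those figures follow from simple counting, as the short Python sketch below spells out (purely illustrative, using only numbers quoted in this article):

```python
# Six photons, each carrying three qubits (polarization, spatial path, OAM)
photons = 6
qubits_per_photon = 3
qubits = photons * qubits_per_photon            # 18 qubits in total

# Every qubit doubles the number of basis-state combinations to record
combinations = 2 ** qubits                      # 262,144

# Reading out three binary properties per photon needs 2**3 = 8 output
# channels per photon, hence 48 single-photon counters for six photons
channels_per_photon = 2 ** qubits_per_photon    # 8
counters = photons * channels_per_photon        # 48

print(qubits, combinations, channels_per_photon, counters)
# prints: 18 262144 8 48
```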

By successfully recording all combinations at the same time, Pan and colleagues beat the previous record of 14 fully-entangled trapped-ion qubits reported by Rainer Blatt and colleagues at the University of Innsbruck in 2011. Among possible applications of the new work, Lu says it could finally allow demonstration of the “surface code” developed in 2012 for error correction – a vital task in a practical quantum computer. Implementing this code requires very precise control of many qubits – and in particular, the ability to entangle them all.

Bachor says that the latest work represents a “great step” in showing the advantages of quantum technology over conventional computing. He believes that single-photon technology could play an important role in quantum cryptography and in ferrying data between processors within quantum computers. But he reckons that other technologies – perhaps trapped ions, semiconductor or superconducting qubits – are more likely to yield the first roughly 50-qubit computer capable of demonstrating major advantages over classical devices in executing highly-tailored algorithms. As to when that might happen, “a number of years’ time” is the most he will venture. “I won’t make a prediction,” he says.

The research is described in Physical Review Letters.

Quantum computing in the cloud

Quantum computers – devices that use the quantum mechanical superposition principle to process information – are being developed, built and studied in organizations ranging from universities and national laboratories to start-ups and large corporations such as Google, IBM, Intel and Microsoft. These devices are of great interest because they could solve certain computationally “hard” problems, such as searching large unordered lists or factorizing large numbers, much faster than any classical computer. This is because the quantum mechanical superposition principle is akin to an exponential computational parallelism – in other words, it makes it possible to explore multiple computational paths at once.

Because nature is fundamentally quantum mechanical, quantum computers also have the potential to solve problems concerning the structure and dynamics of solids, molecules, atoms, atomic nuclei or subatomic particles. Researchers have made great progress in solving such problems on classical computers, but the required computational effort typically increases exponentially as the number of particles rises. Thus, it is no surprise that scientists in these fields view quantum computers with a lot of interest.

Many different technologies are being explored as the basis for building quantum processors. These include superconductors, ion traps, optical devices, diamonds with nitrogen-vacancy centres and ultracold neutral atoms – to name just a few. The challenge in all cases is to keep quantum states coherent for long enough to execute algorithms (which requires strictly isolating the quantum processor from external perturbations or noise) while maintaining the ability to manipulate these states in a controlled way (which inevitably requires introducing couplings between the fragile quantum system and the noisy environment).

Recently, universal quantum processors with more than 50 quantum bits, or qubits, have been demonstrated – an exciting milestone because, even at this relatively low level of complexity, quantum processors are becoming too large for their operations to be simulated on all but the most powerful classical supercomputers. The utility of these 50-qubit machines to solve “hard” scientific problems is currently limited by the number of quantum-logic operations that can be performed before decoherence sets in (a few tens), and much R&D effort is focused on increasing such coherence times. Nevertheless, some problems can already be solved on such devices. The question is, how?

First, find a computer

Within the research sector, scientists have taken the first steps towards using quantum devices to solve problems in chemistry, materials science, nuclear physics and particle physics. In most cases, these problems have been studied by collaborations between scientists and the developers, owners and/or operators of the devices. However, a combination of publicly available software (such as PyQuil, QISKit and XACC) to program quantum computing processors, coupled with improved access to the devices themselves, is beginning to open the field to a much broader array of interested parties. The companies IBM and Rigetti, for instance, allow users access to their quantum computers via the IBM Q Experience and the Rigetti Forest API, respectively. These are cloud-based services: users can test and develop their programs on simulators, and run them on the quantum devices, without ever having to leave their offices.
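
To give a flavour of what such cloud programming looks like, here is a minimal sketch using the open-source Qiskit library mentioned above. It is an illustration only: the import style assumes an older (pre-1.0) Qiskit release in which Aer and execute live in the top-level package, and the circuit is just a two-qubit entangling test run on the local simulator before one would submit it to real hardware.

```python
from qiskit import QuantumCircuit, Aer, execute

# Build the simplest non-classical circuit: a two-qubit Bell state
qc = QuantumCircuit(2, 2)
qc.h(0)           # put qubit 0 into an equal superposition
qc.cx(0, 1)       # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])

# Test on the local, noise-free simulator first...
backend = Aer.get_backend('qasm_simulator')
counts = execute(qc, backend, shots=1024).result().get_counts()
print(counts)     # roughly half '00' and half '11'

# ...then swap the backend for a cloud device (requires an account token)
# to run the same circuit on actual quantum hardware.
```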

As an example, we recently used the IBM and Rigetti cloud services to compute the binding energy of the deuteron – the bound state of a proton and a neutron that forms the centre of a heavy hydrogen atom. The quantum devices we used consisted of about 20 superconducting qubits, or transmons. The fidelity of their quantum operations on single qubits exceeds 99%, and their two-qubit fidelity is around 95%. Each qubit is typically connected to 3–5 neighbours. It is expected that these specifications (number of qubits, fidelities and connectivity) will improve with time, but the near future of universal quantum computing is likely to be based on similar parameters – what John Preskill of the California Institute of Technology calls “noisy intermediate-scale quantum” (NISQ) technology.

The deuteron is the simplest atomic nucleus, and its properties are well known, making it a good test case for quantum computing. Also, because qubits are two-state quantum-mechanical systems (conveniently thought of as a “spin up” and a “spin down” state), there is a natural mapping between qubits and fermions – that is, particles with half-integer spin that obey the Pauli exclusion principle – such as the proton and neutron that make up a deuteron. Conceptually, each qubit represents an orbital (or a discretized position) that a fermion can occupy, and spin up and down correspond to zero or one fermion occupying that orbital, respectively. Based on this Jordan-Wigner mapping, a quantum chip can simulate as many fermions as it has qubits.
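
The idea behind the Jordan-Wigner mapping can be made concrete with a few lines of NumPy. This is a generic sketch, not the authors’ code: it builds the qubit operators for two orbitals and checks that they obey the fermionic anticommutation relations, which is exactly what lets a register of qubits stand in for a collection of fermions.

```python
import numpy as np

# Pauli matrices and the qubit lowering operator (X + iY)/2
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)
s_minus = (X + 1j * Y) / 2

# Jordan-Wigner on two orbitals: the operator for orbital 1 carries a
# Z "string" on orbital 0 to keep track of the fermionic sign.
a0 = np.kron(s_minus, I)
a1 = np.kron(Z, s_minus)

def anticommutator(A, B):
    return A @ B + B @ A

# {a_p, a_q^dagger} = delta_pq and {a_p, a_q} = 0, as required for fermions
print(np.allclose(anticommutator(a0, a0.conj().T), np.eye(4)))          # True
print(np.allclose(anticommutator(a0, a1.conj().T), np.zeros((4, 4))))   # True
print(np.allclose(anticommutator(a0, a1), np.zeros((4, 4))))            # True
```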

Another helpful feature of the quantum computation of the deuteron binding energy is that the calculation itself can be simplified. The translational invariance of the problem reduces the bound-state calculation of the proton and the neutron to a single-particle problem that depends only on the relative distance between the particles. Furthermore, the deuteron’s Hamiltonian becomes simpler in the limit of long wavelengths, as details of the complicated strong interaction between protons and neutrons are not resolved at low energies. These simplifications allowed us to perform our quantum computation using only two and three qubits.

Then, do your calculation

We prepared a family of entangled quantum states on the quantum processor, and calculated the deuteron’s energy on the quantum chip. The state preparation consists of a unitary operation, decomposed into a sequence of single- and two-qubit quantum logical operations, acting on an initial state. With an eye towards the relatively low two-qubit fidelities, we employed a minimum number of two-qubit CNOT (controlled-not) operations for this task. To compute the deuteron’s energy, we measured expectation values of Pauli operators in the Hamiltonian, projecting the qubit states onto classical bits. This is a stochastic process, and we collected statistics from up to 10,000 measurements for each prepared quantum state. This is about the maximum number of measurements that users can make through cloud access, but it was sufficient for us because we were limited by noise and not by statistics. More complicated physical systems employing a larger number of qubits, or demanding a higher precision, could, however, require more measurements.
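
To see why the number of measurements matters, the toy sketch below estimates the expectation value of a single Pauli-Z operator from a finite number of simulated shots. It is not the authors’ code, and the “true” expectation value is invented; the point is that 10,000 shots pin the result down to roughly the one-per-cent level, which is why shot noise was not the limiting factor here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Suppose the prepared state has a true expectation value <Z> = 0.6,
# i.e. outcome 0 occurs with probability (1 + 0.6) / 2 = 0.8
true_expval = 0.6
p0 = (1 + true_expval) / 2

shots = 10_000                       # the cloud-access limit quoted above
outcomes = rng.random(shots) < p0    # True -> measured 0, False -> measured 1

# Each shot contributes +1 (outcome 0) or -1 (outcome 1) to the average
estimate = np.mean(np.where(outcomes, 1.0, -1.0))
std_err = np.sqrt((1 - estimate**2) / shots)

print(f"<Z> = {estimate:.3f} +/- {std_err:.3f}")   # about 0.600 +/- 0.008
```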

To compute the binding energy of the deuteron, we had to find the minimum energy of all the quantum states we prepared. This minimization was done with a classical computer, using the results from the quantum chip as input. We used two versions of the deuteron’s Hamiltonian, one for two and one for three qubits. The two-qubit calculation involved only a single CNOT operation and, as a consequence, did not suffer from significant noise.
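
Schematically, that hybrid quantum-classical loop looks like the sketch below. The “energy measured on the chip” is replaced here by an invented one-parameter function plus shot noise, so the numbers mean nothing physically; what the sketch shows is the division of labour, with the classical optimizer steering the parameter of the prepared quantum state towards minimum energy.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
shots = 10_000

def measured_energy(theta):
    """Stand-in for the energy of the trial state |psi(theta)> as it would
    be estimated from repeated measurements on the quantum processor."""
    exact = -1.0 - 1.5 * np.cos(theta - 0.6)        # illustrative form only
    shot_noise = rng.normal(scale=1.0 / np.sqrt(shots))
    return exact + shot_noise

# The classical computer varies theta to minimize the measured energy,
# giving the (approximate) ground-state energy of the model Hamiltonian.
result = minimize_scalar(measured_energy, bounds=(0.0, 2 * np.pi),
                         method='bounded')
print(f"theta* = {result.x:.2f}, E_min = {result.fun:.3f}")
```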

However, the three-qubit calculation was considerably affected by noise, because the quantum circuit involved three CNOT operations. To understand the systematic effects of the noise, we inserted extra pairs of CNOT operations – equivalent to identity operators in the absence of noise – into the quantum circuits. This further increased the noise level and allowed us to measure and subtract the noise in the energy calculations. As a result, our efforts yielded the first quantum computation of an atomic nucleus, performed via the cloud.
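
The logic of that noise-subtraction step is a simple extrapolation, sketched below with made-up numbers. Extra CNOT pairs leave the logical circuit unchanged but add two-qubit error in proportion to the CNOT count, so fitting the measured energy against that count and reading off the intercept at zero CNOTs estimates the noise-free value.

```python
import numpy as np

# CNOT count: the bare circuit has 3, and inserting one or two extra
# CNOT pairs (logically the identity) raises it to 5 and 7
cnots = np.array([3, 5, 7])

# Energies measured at each noise level (illustrative values only)
energies = np.array([-1.74, -1.55, -1.38])

# Assuming each CNOT adds roughly the same error, the energy drifts
# linearly with the CNOT count; extrapolate the fit back to zero CNOTs
slope, intercept = np.polyfit(cnots, energies, 1)
print(f"zero-noise estimate: {intercept:.2f}")
```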

What next?

For our calculation, we used quantum processors alongside classical computers. However, quantum computers hold great promise for standalone applications as well. The dynamics of interacting fermions, for instance, is generated by a unitary time-evolution operator and can therefore be naturally implemented by unitary gate operations on a quantum chip.

In a separate experiment, we used the IBM quantum cloud to simulate the Schwinger model – a prototypical quantum-field theory that describes the dynamics of electrons and positrons coupled via the electromagnetic field. Our work follows that carried out by Esteban Martinez and collaborators at the University of Innsbruck, who explored the dynamics of the Schwinger model in 2016 using a highly optimized trapped-ion system as a quantum device, which permitted them to apply hundreds(!) of quantum operations. To make our simulation possible via cloud access to a NISQ device, we exploited the model’s symmetries to reduce the complexity of our quantum circuit. We then applied the circuit to an initial ground state, generating the unitary time evolution, and measured the electron-positron content as a function of time using only two qubits.

The publicly available Python APIs from IBM and Rigetti made our cloud quantum-computing experience quite easy. They allowed us to test our programs on simulators (where imperfections such as noise can be avoided) and to run the calculations on actual quantum hardware without needing to know many details about the hardware itself. However, while the software decomposed our state-preparation unitary operation into a sequence of elementary quantum-logic operations, the decomposition was not optimized for the hardware. This forced us to tinker with the quantum circuits to minimize the number of two-qubit operations. Looking into the future, and considering more complex systems, it would be great if this type of decomposition optimization could be automated.

For most of its history, quantum computing has only been experimentally available to a select few researchers with the know-how to build and operate such devices. Cloud quantum computing is set to change that. We have found it a liberating experience – a great equalizer that has the potential to bring quantum computing to many, just as the devices themselves are beginning to prove their worth.

The rise and rise of cryogenic electron microscopy

Combining physics, biology and chemistry, structural biology investigates the anatomy of biological macromolecules, and proteins in particular, improving understanding of diseases and enabling drug discovery. Speaking at the 68th Lindau Nobel Laureate meeting in Germany last week, biophysicist Joachim Frank said that the field is entering a new era with a bright future thanks to advances in cryogenic electron microscopy (cryo-EM).

Frank, a German-American Nobel Laureate in chemistry based at Columbia University, New York, traced the history of cryo-EM to the present day in his Lindau lecture. A pioneer of the technique, Frank shared the 2017 prize with fellow biophysicists Richard Henderson and Jacques Dubochet for the significant contributions he made to its advancement.

Cryo-EM rapidly freezes molecular samples in solution and images them with an electron beam to reveal the molecular structure. Today, resolutions of 3-4 Å are routinely achievable, while at its upper limits, the technique can achieve atomic resolutions of 2 Å.

The technique’s particular strength is its ability to construct models of single, unattached molecules from images of the molecules in their natural state – encompassing a range of arrangements and binding states with other molecules.

A mainstay for structure determination, X-ray crystallography, in contrast, demands molecules are prepared as crystals, a form not typically taken by biomolecules. Crystallography can, however, take advantage of the regularly-ordered molecules to obtain diffraction patterns with which their structures can be reconstructed.

In the early 1970s, researchers saw electron tomography of individual molecules as a solution. However, in a major drawback, images of intact molecules were not possible, as the electron doses required inflicted significant damage on the samples.

At around the same time, during his PhD, Frank had a new idea. It was to use computational and mathematical techniques to wrangle 2D images of hundreds of thousands of molecules in a sample. By aligning and averaging them, a single, clearer 3D picture of the molecule could be constructed, he proposed. The ribosome, a molecular machine found in large quantities in the cell cytosol that makes proteins, was to become his test object in developing the techniques.

A workbench for molecular reconstruction

To process and reconstruct the cryo-EM images, Frank developed SPIDER, a modular image processing program and the first of its kind in electron microscopy, around the turn of the 1980s. The computational equivalent of a workbench, it comprised hundreds of operations.

Key techniques developed in Frank’s lab included a correlation averaging technique to identify and average ribosomes facing in a particular direction into a single structure. In another, multivariate statistical analysis was applied to tackle the structural heterogeneity that occurs across a population of molecules in a sample, classifying and grouping similar structures together. Bringing all the techniques together, Frank and post-doc Michael Radermacher completed their first molecular reconstruction, the large subunit of the ribosome, in 1986.
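
The core idea of correlation alignment and averaging can be illustrated with a toy Python sketch. This is not Frank’s SPIDER code, and the synthetic “particle”, noise level and shift range are invented: each noisy, randomly shifted image is registered against a reference by cross-correlation and the aligned images are averaged, which beats the noise down roughly as the square root of the number of images.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_particle(shape=(64, 64)):
    """Toy 'molecule': two Gaussian blobs on an empty field."""
    y, x = np.indices(shape)
    blob = lambda cy, cx, s: np.exp(-((y - cy)**2 + (x - cx)**2) / (2 * s**2))
    return blob(28, 30, 4) + 0.7 * blob(38, 36, 3)

reference = make_particle()
shape = reference.shape
n_images = 500

aligned_sum = np.zeros(shape)
for _ in range(n_images):
    # Each "micrograph" is the particle, randomly shifted and very noisy
    dy, dx = rng.integers(-5, 6, size=2)
    image = np.roll(reference, (dy, dx), axis=(0, 1))
    image = image + rng.normal(scale=1.0, size=shape)

    # Find the shift by cross-correlating against the reference (via FFTs)
    xcorr = np.fft.ifft2(np.fft.fft2(image) * np.conj(np.fft.fft2(reference))).real
    peak = np.unravel_index(np.argmax(xcorr), shape)
    shift = tuple(-(((p + s // 2) % s) - s // 2) for p, s in zip(peak, shape))

    # Undo the shift and accumulate the aligned image
    aligned_sum += np.roll(image, shift, axis=(0, 1))

average = aligned_sum / n_images
# The average correlates far better with the reference than any single image
print(np.corrcoef(average.ravel(), reference.ravel())[0, 1])
```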

Around the same time, Dubochet and colleagues at the European Molecular Biology Laboratory in Heidelberg made an important advance. For the first time, Dubochet vitrified sample solutions into a glass-like, amorphous state by rapid freezing using liquid ethane cooled to -196°C. The technique prevents the formation of ice crystals that would otherwise damage the molecules and diffract the electron beam, resulting in unusable images.

“This now gave the method I developed a very big boost, because now we could look at molecules in their native states,” said Frank. Exploiting this, Frank and collaborators went on to reconstruct several structures for the first time in the mid-1990s. They included a calcium release channel, octopus haemocyanin and the Escherichia coli ribosome. “These were all pioneering contributions at the time.”

Revealing ribosome movement

Another landmark finding followed in 2000. Frank and post-doc Rajendra Agrawal took advantage of further improvements in resolution to reveal the structure of the Escherichia coli ribosome in unprecedented detail. Their analysis revealed a ratchet-like movement of the ribosome’s two subunits relative to one another during translation, the process in which messenger RNA is used to synthesize proteins. The movement proved critical to the ribosome’s functioning.

Despite the breakthroughs, however, Frank still saw resolution as a limiting factor in the lab’s research. In 2013, they achieved their best imaging on film after two years going through 260,000 images of the ribosome of Trypanosoma brucei, a parasite that causes African sleeping sickness. “We got stuck at 5.5 Å resolution. It was essentially a wall.” At this resolution, it was not possible to infer structures with high precision. Side chains on the molecules, for example, require a resolution of around 3 Å.

Cameras take cryo-EM to next level

The arrival of the first commercial single-electron detection cameras in 2012, however, had a decisive impact. Their detective quantum efficiency (DQE) is significantly higher than that of film. Frank’s lab used them in 2016 to resolve the structure of the ribosome in Trypanosoma cruzi, a parasite responsible for Chagas disease, at a resolution of 2.5 Å. With the aid of the cameras, his technique even allowed a single water molecule to be resolved. “I couldn’t believe it when I first saw it,” said Frank.

Trypanosoma cruzi ribosome large subunit

Combined with maximum likelihood statistical methods, cryo-EM imaging technology is now enabling the determination of multiple, co-existing structures in a given sample at near-atomic resolution. This often results in snapshots of molecules in different states. Dubbed a “story in a sample” by Frank, the information means cellular functions can be visualized indirectly. Through these capabilities, he predicts a boom in knowledge. “There’s going to be a huge expansion of [structural] databases relevant for molecular medicine.”

Joachim Frank’s full lecture, Single-Particle Cryo-EM of Biological Molecules – the Sky Is the Limit, can be seen below. (Courtesy: Lindau Nobel Laureate meetings)

Warmer world needs more protected habitat

Some time later this century, the world’s need for protected habitat will be more acute even than today.

The greatest danger to the wild vertebrates that roam the planet will not be the intruding humans, their livestock and their pesticides and herbicides. It will be human-induced global warming and climate change.

The conversion of wilderness – forest, grassland and swamp – to urban growth, agriculture and pasture has already caused losses of perhaps one species in 10 in the natural ecosystems disturbed by humankind.

But climate change driven by the profligate human burning of fossil fuels could, by 2070, overtake the damage delivered by changes in the way land is used, with catastrophic consequences for birds, reptiles, mammals and other vertebrates.

Losses could reach 20% or even 40%, according to a new study in the journal Proceedings of the Royal Society B.

And a second, separate study in another journal spells out the challenge for governments, communities and conservators: the present targets for biodiversity conservation are simply inadequate. They leave 83% of the land surface unprotected, and 90% of the oceans not effectively conserved.

There have been calls to set at least half of the globe aside for the wild animals, plants and fungi that – until human numbers began to expand – dominated the planet. But the latest study, in the journal Nature Ecology and Evolution, suggests that even a half-share for nature might not be enough to save many species from extinction.

Researchers have been warning for two decades that climate change poses a real threat to the thousands of known species of wild creature, and millions of plants and animals yet to be identified and monitored.

They have argued that climate change will damage the forests that provide a natural home for countless forms of life; that global warming already presents dangers for known species; and that climate change may already have claimed more victims than anyone has so far realized.

Natural answer

They have also, in different ways, proved again and again that rich, biodiverse habitats, especially forests, are part of the natural machinery for limiting climate change – and in any case, in simple cash terms, forests are worth more to humankind as natural forests than as plantations, or cattle ranches.

And to rub home the message, a third study in the same week, in the Journal of Animal Ecology, highlights the direct dangers of warmer sea waters to the colonies of black-browed albatross in the Southern Ocean. Meticulous monitoring since 1979 has shown that the biggest variation in population growth depends simply on sea water temperatures as the juvenile birds set off for their first year of independence over the open sea.

The cold Antarctic waters are rich in dissolved oxygen and support enormous levels of plant and tiny animal life on which the birds, fish and sea mammals depend. As waters warm, food becomes less available.

“As our oceans are projected to warm, fewer juvenile albatrosses will manage to survive and populations are expected to decline at a faster rate,” said Stéphanie Jenouvrier, of the Woods Hole Oceanographic Institution in the US.

The albatross populations of the Southern Hemisphere are already vulnerable: climate change will put them at even greater risk. And researchers have already pointed out that although great tracts of the world have been declared reserves, many of those protected territories have been systematically degraded by human encroachment.

Heavy demands

“Humanity asks a lot of the natural world. We need it to purify our water and air, to maintain our soils, and to regulate our climate,” said Martine Maron of the University of Queensland, Australia, who led the Royal Society study.

“Yet even as we increase the extent of protected areas, they don’t necessarily prevent the loss of natural systems. They’re often located in areas that might not have been lost anyway – and the current target of protecting 17% of terrestrial systems will never be enough to protect species as well as provide the benefits humanity needs.”

And her co-author James Watson, of the Wildlife Conservation Society, who is also based at the University of Queensland, said: “We need a big, bold plan.

“There is no doubt that when we add up the different environmental goals to halt biodiversity loss, stabilise runaway climate change and to ensure other critical ecosystems services such as pollination and clean water are maintained, we will need far more than 50% of the Earth’s natural systems to remain intact.”

Watching planet birth, citizen science and a quantum love song

In the latest episode of Physics World Weekly, we’re joined by the astronomer and science communicator Chris Lintott. A presenter on the BBC TV show The Sky at Night, Lintott discusses the story that broke this week that astronomers have captured the first images of a planet still in formation. PDS 70b is a gas giant 370 light-years from Earth, which was imaged using an instrument on the Very Large Telescope (VLT) in Chile.

When Lintott last featured in a Physics World podcast back in 2015, he spoke about the Zooniverse – the citizen science platform he co-founded, which enables the general public to help scientists crunch through data sets. Lintott gives an update on the Zooniverse, which has expanded significantly during the past three years and currently offers nearly 90 projects, spanning the sciences, arts and humanities.  Lintott believes the combination of human insight and machine learning is emerging as a very powerful scientific tool.

Playing out this week’s show is a song inspired by quantum mechanics, sent by one of our listeners. If you enjoy the podcast then you can subscribe via iTunes or your chosen podcast app.

 

Mantis shrimp strikes again to inspire tougher composite materials

Famous for its great strength, the mantis shrimp pummels its prey with a dactyl club that moves at 80 km/h. But how the brittle material in the club survives repeated high-velocity impacts is something that has puzzled materials scientists.

In 2017, theoretical work by US-based researchers showed that the impact energy is dissipated through the twisting of micro-cracks around spiral fibres within the club, and that this prevents fibre breakage and catastrophic damage. That work was led by Purdue University’s Pablo Zavattieri and David Kisailus of the University of California, Riverside – and their teams have now performed a combination of computational and experimental studies that back up this mechanistic theory. Their research also offers a new way of improving the toughness of composite materials.

The aerospace and other industries are keen to develop composite materials that are more resilient and lighter than those available today. Designing better composites is a challenge because there is no ideal tool for predicting the properties of such materials, and the slow process of trial and error can be the only route to success.

Some scientists are taking a different approach and examining materials that have emerged from nature’s own process of trial and error – evolution. In this search, scientists have been drawn to the strength displayed by the mantis shrimp and its smashing dactyl club.

Strengthening fibres

The skeleton of the club is made of calcium carbonate and calcium phosphate – a composition that is expected to be brittle, with properties similar to everyday ceramics. However, its strength seems to lie in the spiral (helicoidal) architecture of fibres within the club. These fibres are made of chitin, which is a long-chain polymer commonly found in the exoskeletons of crustaceans and insects. In previous studies, composites were designed with a similar spiral architecture. They were found to be much tougher than the traditional quasi-isotropic composite geometry, in which the directionality of fibres is shifted by 45° between layers.
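
The difference between the two stacking sequences is easy to see by just listing the fibre angles, as in the short sketch below; the ply count and the helicoidal pitch angle are invented for illustration and are not taken from the published composites.

```python
n_plies = 24

# Quasi-isotropic layup: the fibre direction jumps by 45 degrees per ply
quasi_isotropic = [(45 * i) % 180 for i in range(n_plies)]

# Helicoidal layup: the fibre direction rotates by a small, constant pitch
# from ply to ply, forming a gentle spiral through the thickness -- the
# architecture seen in the mantis shrimp's dactyl club
pitch = 180 / n_plies            # 7.5 degrees per ply (illustrative)
helicoidal = [round(pitch * i, 1) for i in range(n_plies)]

print(quasi_isotropic[:6])       # [0, 45, 90, 135, 0, 45]
print(helicoidal[:6])            # [0.0, 7.5, 15.0, 22.5, 30.0, 37.5]
```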

Further microscopic analysis of the shrimp’s mighty club revealed that millions of micro-cracks appeared after an assault on a prey. However, these cracks are twisted around the spiral fibres without breaking the fibres themselves. “It’s okay to have multiple micro-cracks, as long as they don’t coalesce to form one big crack that splits the material,” explains Zavattieri.

Shrimp material

Using these observations, Zavattieri, Kisailus and colleagues delved further into the mechanism of how exactly a helicoidal architecture imparts strength to a material. They proposed the twisting cracks hypothesis – suggesting that the helicoidal architecture enabled the energy from an impact to dissipate within spiral micro-cracks, so preventing catastrophic failure.

To prove this hypothesis, the scientists developed a theoretical model backed up by ancillary 3D simulations to examine the local stress intensity factors. This confirmed that as impact energy dissipates within the twisting cracks in the shrimp’s dactyl club, the local strain is reduced.

3D printing

The scientists further validated the twisting cracks hypothesis by combining computational and experimental biomimetic models. Composites with spiral architectures were made from carbon fibres and an epoxy matrix using traditional composite processes and 3D printing.

“The advantage of 3D printing is that you can make composites exactly how you want them, with different angles between layers,” says Zavattieri. “Of course there are some differences, because you’re printing fibres 1 mm in diameter and in nature it is a nanometre, but it allows you to demonstrate the mechanism.”

Pressure was applied to the top of the 3D printed composite bars to bend them. In the helicoidal materials the resultant fractures were observed to twist. Camera and digital image correlation techniques were used to examine crack shape, stress distribution and energy dissipation mechanisms. These techniques confirmed the twisting cracks mechanism and demonstrated how it improved fracture resistance in composites.

“They’ve coupled all of the essentials – from examining the microstructure of the natural material to 3D printing experiments – and so clearly demonstrated a successful bottom-up design of materials,” says a materials expert, who wished to remain anonymous.

Zavattieri is excited that this basic work can be immediately implemented to create new composites using available technology. He envisions that the extrapolation of these studies will have numerous applications. For instance, Kisailus’ and Zavattieri’s groups are currently developing lightweight fibre-reinforced composites for aerospace, automotive and civil engineering.

“We are beginning to observe that borrowing ideas from nature is very beneficial,” says Zavattieri.

The research is described in the Journal of the Mechanical Behavior of Biomedical Materials and the International Journal of Solids and Structures.

Physical versus chemical paradigms play off in nanomedicine

Packaging requires specificity

Patrick Couvreur, a professor at UMR CNRS 8612 in Paris, France, was publishing work on nanoparticle drug carriers as far back as 1979. His early work in the field included the development of biodegradable polyalkylcyanoacrylate (PACA) nanoparticles that could make doxorubicin (DOX) – a widely used anticancer drug – invisible in vivo, helping it get past the multiple-drug-resistant defences of hepatocarcinoma. Further developments in this technology led to the release of Livatag®, which showed so much promise it was fast-tracked through Food and Drug Administration (FDA) clearance. Speaking to attendees of Nanotech 2018, Couvreur pointed out that in some ways Livatag has been a victim of its own success. In phase II clinical trials, the Livatag survival curve was better than the “chemoembolization” control arm, where anticancer drugs are injected directly into the blood vessel feeding a cancerous tumour. However, in the multicentric (conducted at more than one medical centre) phase III clinical trial, the survival curve overlapped the control arm of “polychemotherapy” treatment involving several different drugs.

Couvreur has gone on to develop other medicines based on the same approach, such as a monoclonal antibody–streptavidin system for targeting Alzheimer’s disease. Despite these successes and the time elapsed since the first demonstration of PACA-encapsulated DOX, Couvreur told attendees that the number of nanomedicines on the market or even in phase III clinical trials remains very low. He suggested that reasons for this include issues with drug loading and the fact that a fraction of the drug remains at the surface of the nanoparticles, where it is not protected and its release is not controlled. “What was needed was a move from a physical to a chemical encapsulation paradigm using linkers,” said Couvreur.

Chemistry takes over

Patrick Couvreur addresses attendees at Nanotech France 2018

Couvreur described “squalenoylation” as “a new platform for nanomedicines”, giving as an example the success of the nanoassembly of the anticancer drug gemcitabine with squalene linkers – SQGem. In the case of doxorubicin linked to squalene nanoparticles, features that improve the medicine’s efficacy include the elongation of the nanoparticles by blood flow along streamlines, which gives longer activity post-injection. In addition, interactions between SQGem and lipoproteins mean that it is readily transported by them, particularly cholesterol-rich lipoproteins. Tumour cells attract cholesterol to multiply, so this transport provides an indirect targeting mechanism.

Other applications of squalenoylation include nanoparticles of squalene with cis-diamminedichloroplatinum (CDDP) to increase intracellular delivery of platin and ROS production. Couvreur and his team have also investigated the possibility of combining squalene with adenosine to treat spinal cord injury and brain ischemia. Here the blood-brain barrier can pose a challenge, but Couvreur and colleagues found that the nanoparticles interact with peripheral receptors of adenosine, relaxing the brain vessels and inducing neuroprotection of the brain microcirculation. As a result, reperfusion improves, while the nanoparticles themselves do not cross the blood-brain barrier.

Getting physical with viruses

Antibiotics – which revolutionized medicine in the 20th century – are among the most frequently administered drugs available on account of their broad efficacy against a range of infectious bacterial diseases. However, no such drug exists to combat viral infections – yet. Following on from Couvreur, Francesco Stellacci, a professor at the Ecole Polytechnique Fédérale de Lausanne (EPFL) in Switzerland, told attendees at Nanotech France 2018 about work using decorated gold nanoparticles that seem to mimic host cells. The nanoparticles lure viruses to bind to them and attempt an infection, but instead the nanoparticles apply a physical pressure that ruptures and disarms the virus.

The mechanism here is physical, which has the advantage of being non-toxic. “There are a lot of virucidal molecules out there, but they are toxic,” said Stellacci, highlighting alcohol as an example. Mild administration of alcohol may have other benefits, but its intake is not effective as a viricide.

The gold nanoparticle viricides also work at nanomolar concentrations – significant since most FDA-approved drugs are nanomolar. In addition, Stellacci and his team tested them against wild-type viruses extracted from patients and found that they are not only effective but that the effect is irreversible. This means that once the gold nanoparticle viricides have reduced viral populations in a sample by a log 2 difference, the population depletion holds even following dilution of the whole sample, something that is not true of some alternative antiviral substances such as heparin.

Stellacci and colleagues demonstrated the approach on respiratory syncytial virus (RSV), which kills half a million people each year. They have also demonstrated the efficacy of the puncturing ligands without the gold nanoparticle, attaching them to cyclodextrin instead. His team is now attempting to apply the approach to rotavirus, which causes diarrhoea – one of the main causes of death in children under five across the world.

Catalysis plays a role

Developments in catalysis may also have spin-off benefits for drug delivery. Jean-Pierre Mahy, a professor at the Université de Paris Sud in France, spoke to Nanotech France attendees about some of the progress in designing artificial enzymes that “combine the robustness of chemical catalysts with the activity of enzymes in mild conditions.”

One of the primary goals of his research has been to mimic the cytochrome P450 hemoproteins for selective oxidation – no small feat. The heme moiety of P450s has been described as “responsible for the remarkable and often exquisite catalytic prowess of these enzymes”. As natural approaches involve multiple electron transfer processes and are very fiddly to reproduce, Mahy and colleagues turned to artificial hemoproteins (hemoartzymes) with monooxygenase activity. To provide robust structure, high loading, enzyme protection and potential recycling, their recent work has focused on metal-organic frameworks (MOFs).

Mahy and colleagues have shown that microperoxidase 8 (MP8) – a heme octapeptide obtained by hydrolytic digestion of horse heart cytochrome c – can have both peroxidase-like and cytochrome P450-like activities. They combined this with an MOF from MIL 101 nanoparticles (where MIL stands for Material of Institut Lavoisier) and were able to demonstrate charge-selective oxidation activity. They have also designed an artificial reductase based on a water-soluble polyimine polymer decorated with hydrophobic groups that allows use of O2 as an oxidant. “This is the first entirely synthetic heme monooxygenase,” said Mahy.

Alongside these demonstrations of the bioactivity of metalloenzymes, the ability to compartmentalize them in MOFs has suggested potential compatibility with living cells, and it is here that possible therapeutic applications really come into play. Some of Mahy’s most recent work has demonstrated artzymes catalysing organic reactions at the surface of living cells. The cells can then enantioselectively catalyse the abiotic Diels-Alder cycloaddition reaction of cyclopentadiene and azachalcone, and as Mahy told attendees, “This could be used to activate drugs.” In addition there is potential for on site synthesis of drugs and metabolites.

Nanotech France is an annual conference that took place this year in Paris on 27–29 June.

All-optical ultrasound delivers video-rate tissue imaging

Ultrasound is one of the most common medical imaging tools, but the electronic components in ultrasound probes make it difficult to miniaturize them for endoscopic applications. Such electronic ultrasound systems are also unsuitable for use within MRI scanners.

To address these shortcomings, researchers from University College London have developed an ultrasound system that uses optical, instead of electronic, components. The team has now demonstrated the first use of an all-optical ultrasound imager for video-rate, real-time 2D imaging of biological tissue (Biomed. Opt. Express 9 3481).

“All-optical ultrasound imaging probes have the potential to revolutionize image-guided interventions,” says first author Erwin Alles. “A lack of electronics and the resulting MRI compatibility will allow for true multimodality image guidance, with probes that are potentially just a fraction of the cost of conventional electronic counterparts.”

All-optical ultrasound systems eliminate the electronic transducers in standard ultrasonic probes by using light to both transmit and receive the ultrasound. Pulsed laser light generates ultrasound waves, scanning mirrors control the transmission of the waves into tissue, and a fibre-optic sensor receives the reflected waves.

The team also developed methods to acquire and display images at video rates. “Through the combination of a new imaging paradigm, new optical ultrasound generating materials, optimized ultrasound source geometries and a highly sensitive fibre-optic ultrasound detector, we achieved image frame rates that were up to three orders of magnitude faster than the current state-of-the-art,” Alles explains.

Optical components are easily miniaturized, offering the potential to create a minimally invasive probe. The scanning mirrors built into the device enable it to acquire images in different modes, and rapidly switch between modes without needing to swap the imaging probe. In addition, the light source can be dynamically adjusted to generate either low-frequency ultrasound, which penetrates deep into tissue, or high-frequency ultrasound, which offers higher-resolution images at a shallower depth.

The team tested their prototype system by imaging a deceased zebrafish, as well as an ex vivo pig artery manipulated to emulate the dynamics of pulsing blood. The all-optical device exhibited comparable imaging capabilities to an electronic high-frequency ultrasound system, and captured the dynamics of the pulsating carotid artery. The system demonstrated a sustained frame rate of 15 Hz, a dynamic range of 30 dB, a penetration depth of at least 6 mm and a resolution of 75×100 µm.

To adapt the technology for clinical use, the researchers are developing a long, flexible imaging probe for free-hand operation, as well as miniaturized versions for endoscopic applications.

Static electric field suppresses superconductivity

A static electric field can be used to manipulate the superconducting state of metallic superconducting thin films, according to new experiments by researchers in Italy. The effect, which was first put forward by the London brothers, Fritz and Heinz, in their original formulation of superconductivity back in 1935, might be exploited in novel-concept devices such as supercurrent and Josephson field-effect transistors, as well as classical and possibly even quantum bits.

“It seems that we are realizing a novel phase of the superconducting state driven by electric fields,” says Francesco Giazotto of the Consiglio Nazionale delle Ricerche (CNR) and the Scuola Normale Superiore in Pisa, who led this research effort. “At the moment, we are unclear as to the type of phase transition we are inducing but our finding definitely represents something very intriguing from the fundamental physics point of view.”

Electrostatic fields should not affect either a metal or a superconductor

Superconductivity is the complete absence of electrical resistance in a material and is observed in many materials when they are cooled to below their superconducting transition temperature (Tc). In the Bardeen–Cooper–Schrieffer (BCS) theory of (“conventional”) low-temperature superconductivity, this occurs when electrons overcome their mutual electrical repulsion and form “Cooper pairs” that then travel unhindered through the material as a supercurrent.

According to the theory of electrostatic screening, an electrostatic field should not have any effect on either a metal or a superconductor. Giazotto and colleagues have now turned this idea on its head and have found that an intense electric field can dramatically affect the superconducting state, be used to control the supercurrent, and, at sufficiently intense fields, quench the superconductivity altogether.

Spatial deformation of the Cooper pairing parameter?

The researchers obtained their results by applying intense static electric fields of the order of 10⁸ V/m through either the side or bottom gates of all-metallic supercurrent transistors made of two different BCS superconducting thin films (a titanium- and an aluminium-based one). The devices were made using standard lithography and simple metallic thin-film deposition techniques.

Giazotto and colleagues say that the superconductivity quenching might come from a spatial deformation of the Cooper pairing parameter by the electric fields localized at the surface of the superconductor. This leads to a reduced available area through which supercurrent can flow. More experiments will be needed to confirm this hypothesis, however.

“From the basic physics point of view, our results suggest that there are still some very important aspects of conventional superconductivity that need to be understood,” says Giazotto. “In this context, I’d say that we are still rather far from understanding the microscopic mechanism driving the phenomena we have observed. We believe that such a field-effect-driven phase transition in superconductivity could represent a valuable platform for developing new theories within the BCS model.”

A whole new area of research?

As for applications, the effect might be exploited to make new-concept devices, including all-metallic superconducting field-effect transistors and advanced quantum information architectures, he tells Physics World. Other possible devices include tunable Josephson weak links or interferometers and Coulombic and coherent caloritronic structures.

Reporting their study in Nature Nanotechnology 10.1038/s41565-018-0190-3, the researchers say they are now busy trying to better understand the microscopic origin of the field effect. “From the experimental side, we are looking at this effect in a wider range of metallic superconductors and investigating the impact of electric fields on Josephson interferometers,” says Giazotto. “In principle, our work may have opened up a whole new area of research focusing on the role of intense electric fields on superconductivity. Time will reveal whether this is the case or not.”
