
Giant negative thermal expansion seen in nanomagnet

Most materials expand when they are heated up and contract when cooled down – a phenomenon known as positive thermal expansion (PTE). Over the past three decades, however, an increasing number of materials showing the opposite effect – that is, negative thermal expansion (NTE) – have been discovered. Researchers in Portugal and the US say they have now found NTE above the magnetic ordering temperature in magnetic nanoparticles for the first time. The new discovery will be important for making composites from PTE and NTE materials that have zero thermal expansion for use in a host of technology applications.
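Formally, the distinction is simply the sign of the linear thermal expansion coefficient, defined from a sample's length L and temperature T at constant pressure:

$$\alpha_L = \frac{1}{L}\left(\frac{\partial L}{\partial T}\right)_P, \qquad \alpha_L > 0 \ \text{(PTE)}, \qquad \alpha_L < 0 \ \text{(NTE)}.$$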

The researchers, led by João Pedro Araújo, André Pereira and João Belo of the University of Porto, found the NTE effect in Gd5Si1.3Ge2.7 magnetic nanogranules at low temperatures of between 90 and 150 K and, more interestingly from the point of view of applications, in the room-temperature interval of 260–340 K.

Gd5Si1.3Ge2.7 belongs to the technologically important family of materials R5SixGe4−x (where R = rare earth). These have been extensively studied since 1997, when researchers discovered the giant magnetocaloric effect in Gd5Si2Ge2. Since then they have also found giant magnetoresistance, spontaneous generation of voltage and colossal magnetostriction in this compound. These effects come from the coupling between the magnetic and structural phases in the material, which leads to magnetostructural transitions at a transition temperature TMS.

In the bulk form, these materials show PTE in both the ferromagnetic (T<TMS) and the paramagnetic states (T>TMS).

“A curious story”

“Ours is a very curious story,” says Belo. “In previous experiments on nanoscale Gd5Si1.3Ge2.7, we had already obtained synchrotron X-ray diffraction data as a function of temperature in the 90 to 340 K range in 5 K step intervals. In this early work, however, we were so focused on trying to study the structural transition that we completely overlooked the diffractograms taken at temperatures away from the temperature at which this transition takes place – that is, below 150 K and above 250 K.

“It was only about two years later when we were trying to analyse and understand the negative thermal dependence of the electrical resistivity versus temperature curve of this material that we decided to look more closely at the diffractograms of these low- and high-temperature regions. And that was when we found the NTE and observed how it correlated with the negative thermal dependence of the electrical resistivity.”

This result is particularly interesting since there are no reports of any kind of NTE in Gd5(SiGe)4 materials in their bulk form, he tells Physics World.

Nanoparticles under pressure

The researchers say this particle size-reduction effect comes from the surface pressure of the nanoparticles (which they estimate to be around 11 kbar).

“This intrinsic surface pressure is very much like what happens when a drop of water forms and is present on every nanoparticle,” explains Belo. “This pressure is inversely proportional to the diameter of a nanoparticle so we believe that NTE could also be induced by applying pressure to macroscopic samples of Gd5Si1.3Ge2.7.”
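To get a feel for the numbers, the pressure exerted on a small sphere by its own surface follows the Young–Laplace relation P = 4f/d, where f is an effective surface stress and d the particle diameter. The short sketch below uses illustrative values of f and d (assumptions chosen for the sake of the example, not figures from the paper) to show how pressures of order 10 kbar emerge naturally at the nanoscale.

```python
# Rough Young-Laplace estimate of the intrinsic surface pressure on a nanoparticle.
# The surface stress and diameter below are illustrative assumptions, not values
# reported by the Porto group.

def surface_pressure(surface_stress, diameter):
    """Young-Laplace pressure P = 4*f/d (in pascals) for a sphere of diameter d."""
    return 4.0 * surface_stress / diameter

f = 2.0      # assumed effective surface stress, J/m^2 (typical of metallic solids)
d = 7e-9     # assumed particle diameter, m

p_pa = surface_pressure(f, d)
print(f"Surface pressure ~ {p_pa / 1e8:.1f} kbar")   # 1 kbar = 1e8 Pa; ~11 kbar here
```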

The team hopes that the result will trigger a renewed interest in the study of the thermal expansion of R5(SiGe)4 and magnetic nanoparticles in general. Raman spectroscopy on these materials, for example, could help unveil what changes take place in the lattice vibrations at the PTE-to-NTE transition.

Creating composites with near-zero thermal expansion

Composites that combine NTE and PTE materials to achieve near-zero thermal expansion are particularly useful as temperature-stable thermal contacts in microelectronic and semiconductor devices undergoing thermal cycling, says Belo. They might also find use in precision instruments, where any temperature-induced volume change can cause malfunction. Other possible applications include fibre-optic and electro-optical sensors and substrate materials for mirrors in various telescope and satellite applications.

The researchers, reporting their work in Physical Review B, say they are now optimizing their procedure to synthesize controlled-size Gd5(SiGe)4 nanoparticles with different compositions and Si/Ge ratios. “We are also planning high-pressure synchrotron X-ray diffraction experiments to evaluate how size confinement and pressure affect the atomic structure of these materials and how similar these two effects are.”

Community spirit stems from London tradition

Are you a young scientist who is keen to make contacts among senior academics and industry leaders? Or perhaps you already have lots of experience in science and engineering, and would like the opportunity to meet and discuss your ideas with like-minded individuals, and also give back by nurturing the talents of students and scientists who are just starting out on their careers?

If that sounds appealing, you might consider joining a unique organization that has emerged from some of the oldest traditions in the City of London. This modern livery company – an updated version of the original livery companies that were established in medieval times to manage trade within the City’s Square Mile – has a specific focus on scientific instrumentation, but shares many of the same ideals, traditions and privileges of its ancient counterparts.

Today, some 110 livery companies are recognized by the City of London. Each one is affiliated with a specific profession, ranging from fishmongers and drapers through to gunmakers and architects, and nearly all take the formal title of “Worshipful Company”. While the last of the old-style livery companies was created in 1746, a new breed of modern livery companies started to emerge in the 1930s – and among them was the Worshipful Company of Scientific Instrument Makers (WCSIM), which was founded in 1955.

From medieval London to modern science

Whether old or modern, all livery companies follow the same traditions and pursue two main goals: to share knowledge and fellowship among their members, and to support their community through philanthropy or charitable donations. “Modern livery companies allow members to make connections with other people in the same profession, while also supporting each other and funding charitable causes that are linked to their line of work,” says Misha Hebel, who is Clerk of the WCSIM – the most senior permanent employee in the livery.

STEM day at the WCSIM

Alongside fellowship and philanthropy, members enjoy the heritage and traditions of the City of London – although Hebel points out that members of the WCSIM are located throughout the UK and in other countries too. All the livery companies also share a common lexicon that’s been passed through the generations, with the term “livery”, for example, referring to the distinctive dress that would have been worn in the past to denote that an individual belonged to a specific trade.

Anyone connected with scientific instrumentation can join the WCSIM. Current members range from school and university students through to experienced scientists and engineers from both industry and academia, which always generates plenty of lively conversation and offers lots of opportunities for younger members to meet company CEOs or senior researchers.

Members benefit from insights gained across disciplines too. “You might get a physicist sitting next to an engineer, or an academic talking to a company CEO, and it can suddenly spark a completely different idea or solution,” says Charles Holroyd, a member of the WCSIM who is currently serving as the Junior Warden.

Holroyd joined the WCSIM as a Liveryman, the most senior form of membership, which is open to those with extensive experience in scientific instrumentation. Less experienced members usually join as Freemen, with the prospect of becoming full Liverymen as their career progresses. And, despite the name, both men and women can and do join the company at any level.

Students at school and university can be invited to join as Apprentices and Scholars, respectively. They are usually promising science or engineering students who have won a prize or competition supported by the WCSIM, such as the Arkwright Scholarships that are awarded to school students who excel in science and maths. “The hope is that they will stay interested in the livery as they progress through university and into a career connected to scientific instrumentation,” says Holroyd.

Connections count

Liverymen from academic and industrial backgrounds give back to the scientific community by giving younger members help and advice, whether about science and technology, career options, or how to start a new business. They also get involved in events and activities that encourage school students to pursue careers in science and engineering. “The WCSIM has a database of interests that makes it easy to link people together,” says Holroyd.

Dinner with the WCSIM

As Clerk of the WCSIM, Hebel spends much of her time organizing events that bring together the membership. They include informal trips and gatherings, as well as formal black-tie dinners that members enjoy for the “surroundings, the good food and wine, and the chance to dress up”. She also works closely with various committees and a small group of the most senior members, called the Court, which is headed by the company’s Master. The Master is elected from the members of the Court and changes every year, and part of the Clerk’s role is to guide each new Master and help them to represent the livery at important events in the City.

The WCSIM’s current Master is Ken Grattan, a professor of scientific instrumentation and Dean of the Graduate School at City University London, and a Fellow of the Royal Academy of Engineering and the Institute of Physics. “It’s partly about the fellowship and camaraderie, but also about the opportunity to give something back to the community,” comments Grattan. “We provide support to young scientists through grants and prizes, while more experienced members can act as mentors to early-career scientists and engineers who want to broaden their horizons.”

All members can attend the WCSIM’s events and have access to its Livery Hall in central London – an impressive building that the WCSIM shares with two other livery companies. Freemen can be granted Freedom of the City of London through redemption, which today is largely symbolic but in the past would have been their license to trade in the Square Mile. “It used to be a piece of paper that was kept in a little red envelope and had to be shown on entering the City in order to trade,” explains Hebel. Liverymen must also be Free of the City before they can be admitted.

Liverymen, meanwhile, have additional privileges. Only Liverymen can serve in the Court and become a Master, and they can also vote in the City of London elections for the Lord Mayor – the most senior person in the City.

Giving back

Apart from the Apprentices and Scholars, all members of the livery company pay an annual membership fee, called “quarterage”, which varies with age, and they are also encouraged to make charitable donations on an annual basis or through a legacy. Those funds are primarily used to support the younger generation of scientists and engineers, including student prizes and the prestigious Beloe Fellowship for post-doctoral researchers. Schools are also supported, with funding recently given to provide equipment and science exercise books at a nearby primary school.

Members can choose to take part in some of these initiatives, such as judging scientific competitions or giving careers talks in schools. And the WCSIM’s charitable activities extend to their own members, with the organization’s Almoner – currently experienced scientist Diane Howse – dedicated to keeping in touch with members who are sick or may need some sort of support.

While anyone connected to scientific instrumentation can join the WCSIM, joining a livery company is considered to be a lifetime commitment. Members take an oath when they join, which says that they won’t bring the company or their profession into disrepute, and also that they are committed to the philanthropic effort. Many of the events are held in London, which makes it easier for members in the UK to enjoy the fellowship offered by the WCSIM.

  • Anyone interested in joining the WCSIM as a Freeman or Liveryman can apply online, while the Clerk is always available to talk to prospective members on the phone. There are also regular Open Evenings; please look at the website wcsim.co.uk for dates.

UK air is cleaner but challenges remain

Take a deep breath. If you’re a long-term UK-dweller the air in your lungs right now is almost certainly fresher than the air you inhaled forty years ago. Policy interventions in the UK have significantly improved air quality since the 1970s, according to a new study, and resulted in a halving of the number of deaths attributable to some of the most common air pollutants. But there are significant challenges associated with reducing secondary pollutants such as ozone; the study suggests where the UK needs to focus its clean-up act next.

Outdoor air pollution is estimated to kill 4.2 million people each year worldwide. In the UK around 30,000 deaths a year are attributed to air pollution. Long-term exposure to air pollution increases risk of cardiovascular and respiratory disease and lung cancer. Short-term exposure aggravates respiratory and cardiovascular illness and triggers asthma attacks. There is also emerging evidence of links between air pollution and diabetes, obesity, cognitive decline, dementia and adverse birth outcomes.

Like most other industrialised countries, the UK saw a rise in air pollution through much of the 20th Century, reflecting an increasing demand for energy and mobility, and a fall in air pollution in recent decades due to more stringent emissions standards. But how much impact did policy interventions really have?

Stefan Reis from the Centre for Ecology & Hydrology in Edinburgh and colleagues at the University of Edinburgh, University of Oxford and Institute of Occupational Medicine modelled changes in key air pollutant concentrations — nitrogen dioxide, sulphur dioxide, fine particulate matter and ozone — and related health effects across the UK over the last 40 years.

In order to correct for the variability caused by meteorological factors, they used a fixed meteorological year for all simulations. This meant that the modelled changes in air pollutant concentrations and related health effects were solely a function of changes in emissions, and reflected the impact of policy interventions such as phasing out specific fuels or substances, regulating use of particular chemicals, or developing cleaner and more efficient technologies.

Overall, the results show that over the 40-year period UK mortality attributable to fine particulate matter exposure declined by 56% and that attributable to nitrogen dioxide exposure dropped by 44%, while ozone-attributable respiratory mortality increased by 17% (with a slight decrease between 2000 and 2010).

Analysing the data more closely, it was clear that policy interventions and reductions in emissions were behind many of the trends. For sulphur dioxide and fine particulate matter, the researchers observed a consistent downward trend over the entire time period. Nitrogen dioxide was more complicated. Growth in vehicle numbers and miles travelled resulted in an increase in NOx emissions between 1970 and 1990, but then more stringent vehicle emissions controls kicked in and emissions began to fall. “The further growth of vehicle numbers and mileage has been offset by individual vehicles emitting less,” says Reis.

Ozone concentrations showed a steady increase over the 40-year period; ironically some of the greatest increases have occurred in areas where NOx emissions have decreased most. That’s because ozone is a secondary pollutant that forms from precursor pollutants including NOx and non-methane volatile organic compounds (NMVOCs). The ratio of nitrogen dioxide to NMVOC determines how much ozone is produced.

“Too much nitrogen dioxide leads to destruction of ozone, so you will typically not find high ozone concentrations on or near busy roads, but rather some distance away from the main sources, where the atmosphere is more mixed,” says Reis, whose findings are published in Environmental Research Letters (ERL).  “To reduce ozone further, consistent reductions of NOx and NMVOC emissions will be essential, so the current focus on reducing road transport emissions and more general fossil fuel combustion is on the right track.” Ozone can also travel long distances and so it will be important to look at sources outside the UK.
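In simplified form, the photochemistry behind this is well established: sunlight splits nitrogen dioxide, the liberated oxygen atom makes ozone, and freshly emitted nitric oxide (the other component of NOx, dominant right next to traffic) destroys ozone again,

$$\mathrm{NO_2} + h\nu \rightarrow \mathrm{NO} + \mathrm{O}, \qquad \mathrm{O} + \mathrm{O_2} + M \rightarrow \mathrm{O_3} + M, \qquad \mathrm{NO} + \mathrm{O_3} \rightarrow \mathrm{NO_2} + \mathrm{O_2},$$

while NMVOCs tip the balance towards net ozone production because their oxidation products convert NO back to NO2 without consuming ozone.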

Reis and his colleagues suggest that there needs to be a greater focus on reducing emissions in sectors that have so far escaped major scrutiny. These include agriculture, domestic wood and coal burners, shipping and non-road mobile machinery. The researchers also note that we need to better quantify the sources contributing to the pollution and focus on a truly integrated assessment of policies to avoid unintended consequences. For example, the promotion of biomass for domestic heating has helped to reduce our carbon footprint but has also led to local air pollution hotspots because most domestic appliances do not have filters and can emit large amounts of fine particulate matter.

“Integrated policies that look at air pollution, greenhouse gases and other sustainable development indicators are essential to ensure interventions are leading to overall improvements, rather than solving one problem by creating new ones,” says Reis.

Targeted treatment guides radiation directly to pancreatic tumours

Targeted alpha therapy

Pancreatic cancer remains a leading cause of cancer‑related death worldwide. Patients are usually treated with chemotherapy or radiation therapy, but these are not always effective and can have toxic side-effects due to the treatments also impacting healthy cells.

A research collaboration between Osaka University and Heidelberg University Hospital is exploring an alternative approach: targeted radionuclide therapy, in which a radioactive molecule travels through the bloodstream to the tumour to deliver radiation directly to cancer tissue (J. Nucl. Med. 10.2967/jnumed.119.233122).

“With traditional anti-cancer therapies, there’s a trade-off between efficacy against cancer cells and off‑target effects in non‑cancerous cells,” explains lead author Tadashi Watabe from Osaka University Graduate School of Medicine. “We’re focused on finding ways to re-balance this trade-off in radiotherapy, by increasing the dose of radiation delivered to cancer cells while keeping it localized to those cells as much as possible.”

The new treatment employs the alpha-particle-emitting radionuclide 225Ac. Alpha particles travel a short distance in tissue, thereby limiting their off-target effects. The researchers targeted the therapy at the fibroblast activation protein (FAP), which promotes tumour growth and progression and is found almost exclusively on stromal cells surrounding pancreatic tumours and various other cancer types. The low expression of FAP in normal tissue makes it an excellent target for this approach.

The researchers used the positron emitter 64Cu and 225Ac to label small-molecule FAP inhibitor (FAPI) probes. They employed 64Cu-FAPI-04 to evaluate tumour uptake in mice bearing human pancreatic cancer xenografts, and 225Ac-FAPI-04 to assess radionuclide therapy of the tumours.

PET scans of mice injected with 64Cu-FAPI-04 revealed mild uptake in tumours and relatively high uptake in the liver and intestine, with rapid clearance through the kidneys and slow washout from tumours. Immunohistochemical staining revealed abundant FAP expression in the stroma of tumour xenografts, while cellular uptake analysis demonstrated minimal accumulation in the tumour cells themselves.

For the targeted therapy using an alpha emitter, the researchers injected 225Ac-FAPI-04 (at a dose of 34 kBq) into the tail veins of six tumour-bearing mice. They observed significant reduction in tumour growth in these mice compared with control mice. Importantly, the treatment did not significantly change the animals’ body weights, indicating that it likely had few toxic side effects.

“We are very encouraged by these initial results,” says Watabe. “We think the approach has enormous therapeutic potential, particularly for patients with pancreatic cancer who’ve exhausted their other treatment options. What’s especially exciting is that our method of targeting the stroma can in principle work against many other types of cancer. We think this could represent a new path forward in radiation therapy.”

The physics of blood spatter

Joe Bryan, once a popular and respected high-school principal in a small Texas town, has been in prison for over 30 years. He is serving a 99-year sentence for the shooting and murder of his wife in 1985. The evidence incriminating him involved spots of the victim’s blood found on a hand-held torch. A witness, who was rated as expert in the forensic technique of blood pattern analysis (BPA), interpreted these spots as placing Bryan near his wife when she was shot – a testimony that was at the forefront of Bryan’s conviction. It overrode countervailing evidence that he was in fact at a conference 120 miles away – an alibi that made it nearly impossible for him to have shot his wife, as he would have had to leave the event, travel home, commit murder and return to the conference within a specific timeframe. Bryan maintains his innocence to this day.

Evidence like this, based on the physical behaviour of the blood generated at a crime scene, has roots in late 19th-century Europe. It became prominent in the US during the famous Sam Sheppard murder trial in 1955, and has played an important role in other murder trials since – including those of football player and actor O J Simpson (1994–1995, verdict of not guilty) and music producer Phil Spector (2007–2009, retrial verdict of guilty).

Police investigators use BPA to work backwards from blood traces at a crime scene, allowing them to reconstruct the locations and actions of the people and weapons involved. The traces include drips, smears and spatters, which are created when drops of blood radiate from the impact of a bullet or blunt instrument until they encounter a surface and stain it. But according to a startling 2009 report from the US National Academy of Sciences (NAS) that still resonates today, BPA lacks scientific rigour and valid accreditation for its practitioners. This is a serious concern because BPA results have convicted people later shown to be innocent, as many believe Bryan to be; and because lack of confidence in BPA analysis may allow the guilty to go free. As a result, it has become essential to re-assess the physics behind BPA.

Although the US leads the world in gun ownership – there are 120 guns per 100 people, and 64% of US homicides are gun-related – other countries have many shootings too. For instance, 30% of homicides in Canada involve guns, while a dozen nations, including Brazil, exceed the US in their rate of gun deaths per 100,000 people. Establishing the scientific validity of BPA could therefore have an international impact on dealing with the world’s 250,000 annual gun-related deaths by helping to categorize them as homicides or suicides – and, in the former case, potentially bringing the perpetrators to justice.

Bloody behaviour

In terms of physics, BPA reconstruction is a complicated problem in fluid mechanics that involves tracing the behaviour of blood, under various forces and ambient conditions. The challenge is made more difficult because blood is a complex fluid containing both liquid (the plasma) and solid (the blood cells) components. Furthermore, the properties of blood – such as its pH or the number of red blood cells – vary from person to person.

But this work is more than just an academic exercise. It can also have real effects, according to Alicia Carriquiry at Iowa State University in the US. As a statistician and director of the Center for Statistics and Applications in Forensic Evidence (CSAFE) – which is funded by the US National Institute of Standards and Technology (NIST) – Carriquiry has a broad view of forensic science. “BPA is one of those areas in which science has a lot to say,” she says. “As opposed to other forensic disciplines, in BPA we actually have physical and fluid-dynamical models that can help answer questions such as those having to do with trajectory, point of origin and similar.”

Fundamental science has, however, not been well applied to BPA, according to the 2009 NAS report, which was entitled Strengthening Forensic Science in the United States. Co-chaired by a distinguished US federal judge and an academic statistician, it included contributors from relevant scientific disciplines including physics. In general, except for DNA analysis, the report found deficiencies in virtually every forensic technique – including the analysis of hair, fibres, fingerprints and bite marks. “The interpretation of forensic evidence is not always based on scientific studies to determine its validity,” it stated. “This is a serious problem.” A separate report in 2016 from the US president’s Council of Advisors on Science and Technology – written by the director of the Office of Science and Technology Policy together with a panel of scientists – echoed this critique.

For BPA in particular, the NAS report noted the complexities of fluid dynamics and indicated that BPA analysts should understand the physics involved. But with no strict educational requirements for certification as a BPA expert – they’re trained only to follow packaged procedures – the report concluded that “The opinions of bloodstain pattern analysts are more subjective than scientific…The uncertainties associated with [BPA] are enormous.” In 2018 the Texas Forensic Science Commission reached similar conclusions about the Bryan case, calling the interpretation of the BPA evidence “inaccurate” and “scientifically unsupportable”.

Still, properly used, BPA can give valuable clues towards understanding the circumstances of a shooting. For example, drops of blood that strike the floor at an angle will create a set of elliptical stains, whose width-to-length ratio gives the impact angle (figure 1). BPA analysts are trained to draw straight-line trajectories that follow the long dimension of each ellipse at that impact angle. These paths converge, providing a position from which the blood originated. While this correctly gives the projection onto the floor of the location of a gunshot wound, the straight-line procedure overestimates the height of the wound, since the true paths under gravity are parabolas modified by aerodynamic drag. The error is typically large enough to wrongly place a victim as standing rather than sitting.
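The geometry lends itself to a few lines of code. The sketch below, using made-up stain measurements rather than data from any real case, recovers each impact angle from the width-to-length ratio of the stain and then applies the traditional straight-line ("stringing") reconstruction; as the text above explains, the heights it returns are systematically too high because real drops follow drag-modified parabolas.

```python
import math

# Illustrative stain measurements (made-up numbers, not case data):
# (stain width in mm, stain length in mm, horizontal distance in m from the stain
#  to the point on the floor where the straight-line trajectories converge)
stains = [
    (2.1, 4.0, 1.20),
    (1.8, 4.1, 1.55),
    (2.4, 3.9, 0.95),
]

heights = []
for width, length, d in stains:
    alpha = math.asin(width / length)     # impact angle from the ellipse axes
    heights.append(d * math.tan(alpha))   # straight-line ("stringing") height estimate
    print(f"impact angle {math.degrees(alpha):5.1f} deg -> "
          f"straight-line height {heights[-1]:.2f} m")

print(f"mean straight-line estimate: {sum(heights) / len(heights):.2f} m "
      "(an overestimate, since real drops follow drag-modified parabolas)")
```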

1 Geometry of blood spatter

Figure 1 When drops of blood hit the floor at an angle, they produce elliptical stains, where their width-to-length ratio gives that impact angle. Traditionally, practitioners of blood-pattern analysis trace a straight line from the stain at the impact angle to reveal where the blood originated. While this correctly maps the paths along the floor (grey lines), straight-line trajectories (dashed lines) overestimate the vertical height of the impact because the blood would have taken a modified parabolic path (blue) due to gravity and drag.

This is one of the established BPA methods that deeper physical analysis can improve. In 2011 physicists Christopher Varney and Fred Gittes at Washington State University put the projectile-motion equations, including gravity and drag, into a form that uses all the data inherent in a set of spatter bloodstains (Am. J. Phys. 79 838). They found that a plot of the impact angles for the stains versus the inverse of their horizontal distances from the vertical axis of impact gives a valid result for the height, provided that the launch angles for the drops are not too widely distributed. In a test that spattered a viscous blood substitute, the researchers used this approach to calculate the actual launch height of 88 cm to within 8%. For comparison, the linear trajectories overestimated the launch height by 100%.
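The drag-free limit shows why such a plot works. For a drop launched from height z0 at angle θ0 to the horizontal that lands a horizontal distance d away, elementary projectile motion gives

$$\tan\alpha = \frac{2 z_0}{d} + \tan\theta_0,$$

so the tangent of the impact angle plotted against 1/d falls on a straight line of slope 2z0, with a roughly common launch angle only shifting the intercept. Varney and Gittes' working form also folds in aerodynamic drag.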

In 2015 Nick Laan, of the University of Amsterdam and the Netherlands Forensics Institute, and colleagues instead used the fluid qualities of blood to find the height of a gunshot wound (Scientific Reports 5 11461). Earlier work had derived an equation relating the impact velocity of a liquid drop of blood to its volume and impact angle, and to the width of the dried stain it produced as determined by the known capillary and viscous behaviour of blood. To apply this method, the researchers created spatter patterns of human blood under controlled conditions. For each of 40 separate blood stains, they determined its width and impact angle, and, using a commercial 3D surface scanner, measured the volume of the stain, from which they found the volume of the original drop. These parameters yielded the impact velocity, giving enough information to solve the equations of motion under gravity with aerodynamic drag. The results for the height where each drop originated had an average value of 58.5 cm, only 8% below the true height of 63.7 cm. Meanwhile, the straight-line method gave 91.1 cm – a much larger 42% error.
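A minimal version of this kind of drag-corrected reconstruction is sketched below. It is not the authors' code: it assumes a spherical drop with a constant drag coefficient (a crude simplification, since in reality the coefficient varies with Reynolds number along the flight) and simply integrates the equations of motion backwards in time from the stain, starting from the reconstructed impact speed and angle, until the drop sits directly above the point of origin already located on the floor.

```python
import math

RHO_AIR, RHO_BLOOD, G = 1.2, 1060.0, 9.81   # kg/m^3, kg/m^3, m/s^2
CD = 0.5                                    # assumed constant drag coefficient (crude)

def origin_height(v_impact, angle_deg, drop_diameter, floor_distance, dt=1e-4):
    """Integrate a drop's trajectory backwards in time from its impact point.

    v_impact: impact speed (m/s); angle_deg: impact angle below horizontal;
    drop_diameter: reconstructed drop diameter (m); floor_distance: horizontal
    distance (m) from the stain to the point of origin on the floor.
    Returns the estimated height of origin (m).
    """
    k = 3.0 * RHO_AIR * CD / (4.0 * RHO_BLOOD * drop_diameter)  # quadratic-drag factor
    a = math.radians(angle_deg)
    x, z = 0.0, 0.0                                           # stain at the coordinate origin
    vx, vz = v_impact * math.cos(a), -v_impact * math.sin(a)  # velocity at impact
    while x > -floor_distance:                                # march backwards in time
        speed = math.hypot(vx, vz)
        ax = -k * speed * vx                                  # forward-time accelerations
        az = -G - k * speed * vz
        x, z = x - vx * dt, z - vz * dt                       # step the state back by dt
        vx, vz = vx - ax * dt, vz - az * dt
    return z

# Illustrative numbers only (not taken from the paper):
print(f"estimated origin height: {origin_height(4.0, 35.0, 1e-3, 1.0):.2f} m")
```

In practice the impact speed would itself come from the stain width and reconstructed drop volume, as in Laan and colleagues' analysis.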

A bullet’s journey

These two papers and others analyse the behaviour of blood drops after they have been formed, to enhance standard BPA. But mechanical engineers and fluid dynamicists Alexander Yarin and Patrick Comiskey from the University of Illinois at Chicago, together with Daniel Attinger at Iowa State, have gone further. They have modelled the entire process from the bullet entering the body to the final blood stain pattern.

Since 2016 these researchers have developed fluid dynamical theories for gunshot back spatter and forward spatter, where blood drops travel respectively against and with the direction of the bullet and display different characteristics. The back-spatter analysis – carried out for both regular (Phys. Rev. Fluids 1 043201) and blunt-nosed (Phys. Rev. Fluids 2 073906) bullets – is based on the well-known Rayleigh–Taylor instability. In this effect, acceleration perpendicular to the interface between two fluids of different densities – here, blood and air – creates growing turbulence and mixing between the fluids. (One remarkable example of the instability is the spectacular filaments seen in the expanding Crab Nebula, where the two fluids comprise material ejected by the Crab’s initial supernova explosion, and a plasma of relativistic charged particles powered by the Crab’s central pulsar.) Meanwhile, forward spatter, caused when the bullet exits a body after multiple disruptive encounters with blood and tissue, was treated differently. Its analysis used percolation theory, which describes available paths through randomly arranged clusters.
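For the quantitative core of that mechanism: in its textbook form (neglecting viscosity and surface tension, which in practice select the fastest-growing wavelength), a small ripple of wavenumber k on an interface accelerated at a grows exponentially at the rate

$$\sigma = \sqrt{A\,a\,k}, \qquad A = \frac{\rho_{\mathrm{blood}} - \rho_{\mathrm{air}}}{\rho_{\mathrm{blood}} + \rho_{\mathrm{air}}} \approx 1,$$

with the Atwood number A close to 1 for blood against air, which is why the impulsive acceleration at the wound so readily breaks the interface into ligaments and drops.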

For both kinds of spatter, the researchers calculated the numbers, sizes and dynamical properties of the drops of blood generated by a bullet; then determined their trajectories under gravity and aerodynamic drag. Finally, the team found the number of resulting stains, their areas, impact angles and distribution with respect to distance (figure 2).

2 Shortfalls with blood pattern analysis

This diagram demonstrates three possible trajectories that could be mapped from back-spatter blood stains depending on what phenomena are taken into account. Straight-line paths (red), which do not account for gravity and drag, overestimate the height of impact. When drag is not considered (blue) the paths fall short. BPA needs to take into account both gravity and drag (black) to get a more accurate estimate of where the bullet impacted.

These calculated results for the blood-spatter distributions agree reasonably well with data obtained by shooting bullets into sponges or plastic foam soaked with swine blood for back spatter, and through a blood-filled reservoir for forward spatter. Although the researchers note that more experiments are needed, their results are significant steps toward a real, physics-based theory of spattering. Their work also points to new directions for study, such as how air is carried along with drops of blood in flight, which influences their trajectories, and the impact of temperature on blood viscosity.

The theoretical results so far show the value of the fluid dynamics approach but also that its complexity can add uncertainties to the analysis, for example through the variable properties of blood and blood stains. Besides temperature, the viscosity of blood also depends on the percentage of red blood cells, which varies by individual and could affect approaches like that used by Laan and colleagues. What’s more, the properties of the surface that a blood drop strikes may modify how it spreads and therefore affect the stain it leaves. Confounding elements like these should be taken into account for fully valid BPA that carries weight in court, and may limit claims about what BPA can definitively show. Certainly, there is much left to do.

From lab to crime scene

As the science of BPA progresses, a parallel challenge is to convert its results into new, practical and transparent procedures for murder investigations and courtroom presentations. But BPA practitioners have not entirely welcomed these changes, which threaten to upset established field procedures, a reaction also found elsewhere in the forensics establishment. Nevertheless, says Carriquiry, for many topics in forensic science “we have managed to make important inroads and established some meaningful partnerships with forensic practitioners who see our work as a means to make theirs more objective and ‘scientific’ ”. These partners include the Houston Forensic Science Center and the Los Angeles Police Department.

Now, with support from CSAFE and other US federal agencies, researchers are working specifically to strengthen connections between the BPA and fluid dynamics communities, and to provide practitioners with useful results. For example, Attinger and co-authors have written a tutorial paper in which they discuss the forces at play in fluid dynamics and how they determine the behaviour of blood drops at a crime scene (Forensic Science International 231 375). Attinger has also published charts based on fluid dynamics that make it simple for investigators in the field to estimate the maximum distance that a blood drop has travelled (Forensic Science International 298 97).

3 Blood and bullets in a lab

Figure 3 An example of blood back spatter on a card target, produced by researchers at Iowa State University using the set-up shown in figure 4. The bullet hole can be seen in the top left of the middle image.

In another effort, Attinger’s team has published back-spatter patterns of human blood produced in the lab by gunshot (figure 3), with rigorous control of the firearms and ammunition used and the physical arrangement (Data in Brief 22 269); and a second set of blood patterns produced by blunt instruments (Data in Brief 18 648). These provide high-resolution images of blood stains generated under varied conditions, for training and research use (figure 4). In one project, Hal Stern at the University of California–Irvine, a statistician and co-director of CSAFE, is examining the images for distinctive features that practitioners could use to distinguish among possible sources for observed spatters. In other outreach, BPA researchers also present talks and training sessions at professional societies.

4 A staged scene

Figure 4 The experimental set-up used by Daniel Attinger and colleagues in Iowa to study back-spatter blood stains.

Unfortunately, widespread adoption of more rigorous BPA practice and training will not come quickly, or automatically erase past deficiencies that produced unreliable evidence and false accusations. Nor is it likely that the legal standards for acceptance of BPA evidence will change soon enough to affect Joe Bryan’s upcoming appeal for a new trial. That request was denied in 2018, but his lawyers are now preparing a last-ditch effort before the Texas Court of Criminal Appeals. However, Bryan is nearly 80. Even if a new trial is approved, it may not come in time to do him any good.

Whatever that outcome, the extensive coverage of the Bryan case along with the NAS report and other evaluations of BPA have uncovered its problems and motivated progress toward a better physics of blood patterns. This may at least ensure that future blood evidence will be more effective in identifying the true perpetrators without unjustly condemning people who are innocent.

Metallic tin ‘reduces’ limitations of perovskite solar cells

Hybrid organic-inorganic perovskites have garnered significant interest in the solar cell community in light of their excellent optoelectronic properties and low manufacturing cost. Meanwhile, in efforts to produce perovskite solar cells (PSCs) with yet higher efficiencies, considerable steps have been made in developing monolithic tandem solar cells. These incorporate both a wide band gap (often Pb-based) perovskite layer and a narrow band gap (for example, Pb-Sn based) one in order to absorb as much of the solar spectrum as possible.

However, a key species in the narrow band gap layer, Sn2+, is prone to oxidation to Sn4+. This is undesirable as it leads to short charge-carrier lifetimes, and thus short diffusion lengths in the perovskite film, which results in a less efficient solar cell.

Metallic tin to the rescue

A group of researchers led by Hairen Tan at Nanjing University demonstrated that metallic Sn could address this limitation by reducing Sn4+ back to Sn2+ when added to the precursor solution; a fine example of a comproportionation reaction (where two reactants, each containing the same element but with different oxidation numbers, yield a single product). Not only does metallic Sn readily reduce Sn4+, but it is also insoluble in the precursor solution until it does so, at which point it is itself oxidized to Sn2+ and becomes part of the perovskite lattice. The authors described this as a “tin-reduced precursor (TRP) solution strategy.”
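Written out, the chemistry is compact: metallic tin and tin(IV) meet in the middle at tin(II),

$$\mathrm{Sn} \rightarrow \mathrm{Sn}^{2+} + 2e^-, \qquad \mathrm{Sn}^{4+} + 2e^- \rightarrow \mathrm{Sn}^{2+}, \qquad \text{overall:}\ \ \mathrm{Sn} + \mathrm{Sn}^{4+} \rightarrow 2\,\mathrm{Sn}^{2+}.$$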

Characterizing the tin-reduced precursor film

Tan and his colleagues then characterized the films using optical-pump terahertz-probe spectroscopy, a time-resolved technique capable of measuring photoinduced conductivity and charge-carrier mobility. This revealed that the TRP film had an increased charge-carrier mobility and a vastly improved charge-carrier lifetime when compared with the control (non-TRP) film. These results suggest that the density of defects associated with Sn4+ had been decreased.

Subsequently, the researchers fabricated a series of narrow band gap PSCs and optimized their performance by varying the thickness of the TRP layer. Thanks to the longer diffusion length, the TRP layer could be made thicker than the control, increasing light absorption without compromising carrier collection.

When the researchers incorporated the TRP layer into an all-perovskite tandem cell – alongside a wide band gap Pb-based perovskite layer – they attained a champion efficiency of 24.8% – impressive for a low-cost, lightweight solar cell such as this. This result adds further impetus to the drive to make the commercialization of perovskite solar cells a reality.

Full details of the research are reported in Nature Energy.

Net losses: why net zero carbon targets may backfire

A “net zero” carbon emissions target has been set by the UK, amongst others, for 2050. The European Union (EU)’s version fudges the date, due to opposition by some coal-reliant countries to the 2050 date initially specified.

But whatever the date, the net zero formulation does not usually specify how net zero emissions are achieved, so in principle any project will be acceptable if it can claim to avoid, or compensate for, carbon dioxide production. These can include carbon offset and carbon removal projects, as well as renewable energy and energy efficiency schemes. Some argue that this mixes up basically conflicting policy approaches — emission avoidance and post-generation carbon removal. Emission avoidance at source is about decarbonising energy production and use, for example by switching to using renewables or by using energy more efficiently, so less carbon dioxide is produced. By contrast, carbon removal is about compensatory post-fossil-generation carbon dioxide clean-up, for example by Carbon Capture and Storage (CCS) and Negative Emission Technology (NET).

We can’t just keep sweeping emissions under the carpet

The documentation for the Green New Deal programme backed by the UK Labour Party says: “The use of a ‘net zero’ target that integrates both goals for decarbonisation and allowances for carbon removal is an unacceptably high risk strategy … falsely discounting the carbon reductions that are needed while weakening ambition and delaying progress toward a fully decarbonised economy.” As I noted in an earlier post, the party wants to focus on decarbonisation at source and says there are problems with CCS and NETs.

That view has been backed up by a study from Lancaster University, UK, which says that many of the technologies for carbon removal from the atmosphere are speculative, and may not actually be able to deliver. In his summary for Carbon Brief, Duncan McLaren says that “net-zero plans that rely on promises of future carbon removal – instead of reducing emissions now – are, therefore, placing a risky bet. If the technologies anticipated to remove huge quantities of carbon in the 2040s and 2050s fail to work as expected – or lead to rebounds in emissions from land-use change, for example – then it might not be practical to compensate for the cumulative emissions from mitigation foregone between now and then.”

McLaren looks to a “formal separation of negative emissions targets and accounting for emissions reduction, rather than combining them in a single ‘net-zero’ goal”. That would avoid the risk of carbon removal undermining the expansion of emission avoidance: carbon removal would be additional to reducing emissions, rather than a rival, although he says we may need both: “Even if emissions were brought to net-zero by 2050, the world would likely still need to achieve ‘net-negative’ emissions for a period, to reduce atmospheric carbon dioxide concentrations back to safer levels. At least some countries and sectors will need to go ‘beyond net-zero’.”

The Labour Party paper is also not opposed to some carbon removal offset options, but says that the Green New Deal must “establish a clear delineation between targets for emissions reductions and assumptions regarding negative emissions, and must limit the ‘net’ to include only those necessary emissions which can be offset through programmes such as domestic reforestation and rewilding”. So while some specific natural carbon removal projects may be condoned, artificial sequestration approaches are not. Instead, Labour’s main priority is “rapidly phasing out of fossil fuels, countering its decline with a massive programme of investment in renewable energy”, supported with help from the government.

That is also the approach adopted in the US Green New Deal proposed by presidential hopeful Bernie Sanders, which specifically excludes CCS as well as nuclear. Earlier proposed variants of this Green New Deal also specifically rejected market-based cap and trade/carbon tax approaches, in favour of government-led regulatory and intervention approaches, and that seems to be what Sanders also has in mind — though others see it differently. Carbon pricing and carbon market trading still have US adherents, including support from those that are pro-nuclear. Certainly, that might help nuclear stay in the game, as well as renewables, by pricing fossil fuel even further out of the market.

Back in the UK, the Conservative government is also still keen on carbon pricing and emission trading, as well as on nuclear. The Department for Business, Energy and Industrial Strategy (BEIS) is planning for the future of carbon pricing in a no-deal Brexit. Under these circumstances (any day now?), the UK would cease to participate in the EU Emissions Trading System (EU ETS), and BEIS would replace emissions trading in the UK with a fixed rate tax, evidently to be called the Carbon Emissions Tax. That presumably would hit unabated gas projects hard, including any using shale gas, but would stimulate support for fossil CCS projects, and maybe NET projects, as well as nuclear and renewable energy generation. So there could be some conflicts.

In addition to the conflict between emission reduction/avoidance at source and post-combustion carbon removal, there is also a more general – but similar – potential conflict between emission avoidance/reduction, sometimes in this context called “mitigation”, and “adaptation” to climate change, which aims to reduce vulnerability to its impacts. The use of these two terms can be confusing — you might say that reducing carbon emissions at source mitigates impacts, and so do adaptive lifestyle changes. However, while simple physical adaptation, for example investing in protection against sea-level rise, may reduce some local impact costs in the short term, and some emergency measures clearly will have to be taken, adaptation, unlike emission reduction/avoidance at source, does not deal with the cause of the problem — carbon emissions.

You might say we need to do both, avoid/reduce emissions wherever possible and adapt to their impacts where you can’t, but it has been argued that there is a fundamental social equity conflict: “investments in emission reduction benefit everyone while adaptation only benefits the party that undertakes it”. What’s more, given that there will be competition for funding for climate-related action, it’s claimed that adaptation should be treated as only the option of last resort, and that too much of a focus on adaptation, just like too much of a focus on carbon removal, may slow emission reduction/avoidance.

However, there are problems with getting the right priorities, especially in poorer countries. Adaptation may be cheaper and easier than emission avoidance/reduction and it may be all that poorer countries can afford. In any case, emission reduction/avoidance at source may not be able to reduce local climate-related problems quickly. So mitigation may be deferred, leaving emission reduction to others to deal with. This may be understandable but, if a focus on adaptation occurs on a wide scale globally, that could reduce overall emission reduction efforts, the latter being the only long-term way to limit climate change.

None of this means that adaptation or setting targets to cut carbon levels are bad ideas. We clearly need to try to reduce the impacts of climate change, and “net zero carbon” targets have wide support, even if they mean some interim carbon offsetting. Certainly, planting more trees is a good idea, whatever else we do. In a recent EU-wide Eurobarometer public opinion survey (see reports 490/492), 92% of respondents thought that greenhouse gas emissions should be reduced to a minimum, while offsetting the remaining emissions, to make the EU economy climate neutral by 2050. Going even further, in a recent UK opinion poll, 33% of respondents said the UK should aim to hit net zero emissions by 2025 in line with the Extinction Rebellion demand, 25 years ahead of the government’s target.

However, we do have to be careful about over-reliance on carbon removal. Like adaptation, capturing carbon and storing it somewhere may buy us some time but it is not a permanent solution — we can’t just keep sweeping emissions under the carpet. The climate problem will just get worse if we do not cut emission at source.

Proton arc therapy: the next evolution in proton delivery?

Dose distributions

The techniques used to deliver photon-based radiotherapy have advanced over the years, from 3D conformal radiotherapy, to intensity-modulated radiotherapy (IMRT) and volumetric-modulated arc therapy (VMAT). Each progression aims to confer higher conformality, faster delivery and increased plan robustness. Proton therapy technology has also evolved, from passive scattering to scanned pencil beams and intensity-modulated proton therapy (IMPT). But what’s next?

According to Yunzhou Xia from the University of Manchester, the next evolution in proton delivery is proton arc therapy (PAT). “Arc therapy is treatment with the radiation continuously on as the gantry rotates around the patient,” she explained. “It is different to IMPT, where the beam is switched off between gantry movements.” Unlike VMAT, however, PAT has only been studied since 2016 and is not yet part of routine clinical practice.

Speaking at the recent Medical Physics & Engineering Conference (MPEC), Xia described how the Bragg peak makes proton treatments sensitive to range uncertainties and set-up errors, and suggested that the greater number of control points in PAT compared with IMPT might increase the robustness of PAT treatment plans to such uncertainties.

To assess the potential benefits of this approach, Xia created IMPT and PAT plans for one brain cancer and two head-and-neck cancer cases, with equivalent prescriptions for each pair of plans. She simulated range errors in the plans, including +/-3% over/undershoot and set-up errors of up to 3 mm in all directions. She then evaluated the dose distribution of each plan.

Xia first showed the audience the dose distributions for the brain tumour case, an ependymoma, for error-free IMPT and PAT plans. “In terms of target coverage, the two plans are very similar,” she said. “In the medium dose region, PAT delivers a more conformal dose to the target. However, in the lower dose region, PAT has more coverage.”

She then presented the values of conformity index (CI) and homogeneity index (HI) – tools used to quantitatively assess treatment plan quality – for pairs of plans with various uncertainties. PAT resulted in improved CI and HI values (closer to 1), for both range and set-up errors. This finding demonstrates that, for the ependymoma case, the PAT plans were more robust to errors than respective IMPT plans.
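Exact definitions of these indices vary between centres, but a minimal sketch of one common choice, an RTOG-style conformity index (the volume receiving the prescription dose divided by the target volume) together with a D5/D95 homogeneity index, both ideal at 1, might look like the following; the dose grid here is a toy example, not data from Xia's plans.

```python
import numpy as np

def conformity_index(dose, target_mask, prescription):
    """RTOG-style CI: volume receiving the prescription dose / target volume."""
    return np.count_nonzero(dose >= prescription) / np.count_nonzero(target_mask)

def homogeneity_index(dose, target_mask):
    """HI as D5/D95 inside the target (near-maximum dose over near-minimum dose)."""
    target_dose = dose[target_mask]
    return np.percentile(target_dose, 95) / np.percentile(target_dose, 5)

# Toy dose grid (Gy): ~52 Gy inside a spherical target, falling off outside.
# Purely illustrative numbers, not real plan data.
rng = np.random.default_rng(0)
r = np.sqrt(((np.indices((40, 40, 40)) - 20) ** 2).sum(axis=0))
dose = 52.0 * np.exp(-np.clip(r - 10.0, 0.0, None) / 4.0)
dose *= rng.normal(1.0, 0.01, size=dose.shape)       # mimic small dose heterogeneity
target = r <= 10.0

print(f"CI = {conformity_index(dose, target, prescription=50.0):.2f}")   # ideal: 1
print(f"HI = {homogeneity_index(dose, target):.2f}")                     # ideal: 1
```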

For the unilateral head-and-neck case, Xia showed that with no errors present, the PAT plan exhibited higher conformality and spared parts of the parotid gland compared with the IMPT plan. For plans with range errors, the CI and HI values were better with PAT than for IMPT, but PAT performed less well in plans with set-up errors.

Finally, Xia showed a bilateral head-and-neck case, with two target volumes: the first with a higher dose prescription than the second. Again, in the error-free instance, the PAT plan showed higher conformality than the IMPT version.

For the second, lower-dose, head-and-neck target, HI and CI values were closer to 1 for PAT plans than for IMPT, under both range and set-up errors. For the higher-dose target, CI was improved in the PAT plan while HI was better in the IMPT plan.

“Proton arc therapy has the potential to improve target robustness for certain brain and head-and-neck cases,” Xia concluded. “The narrower distribution of metrics in some cases may be beneficial for robust optimization.” Xia now plans to repeat this analysis using more cases in order to draw a statistical conclusion.

Meet the pup stars

Cigarette case with image of Laika

Part art book, part popular science, Space Dogs: the Story of the Celebrated Canine Cosmonauts evolved from a private collection of objects, which turned into a photography project. Not so surprising, considering that the collector and artist in question is Martin Parr, a celebrated photographer based in Bristol, UK. But why did he spend 20 years collecting memorabilia related to the Russian space dogs of the 1950s and 1960s?

Aside from the quirkiness factor, the objects form a record of a less-well-known story from the history of space travel, which is here told by science journalist Richard Hollingham, alongside Parr’s photos of his memorabilia collection and original press photos of the doggy heroes and heroines (mostly the latter).

A night lamp decorated with porcelain figures of Belka and Strelka

From the dozens of dogs used in secret early experiments in low-pressure chambers and in suborbital flights, to the media furore surrounding the Soviet Union’s famed mongrel Laika and her successors, Hollingham packs plenty of fascinating detail into a small space. He explains why there were so many stray dogs on the streets of Moscow and the criteria for which strays were selected by Soviet scientists – including the requirement that they be light-coloured to show up well on TV cameras.

Hollingham doesn’t shy from including the sad deaths of some dogs, or the suffering that many survivors endured, though he also includes details of how well-loved they were by their team of scientists, doctors and engineers. The book devotes most of its pages to Laika, Belka and Strelka – the most famous and celebrated space dogs – but does find room for other canine characters. For example, the book talks about Ugolek and Veterok, the dogs whose mission lasted for 22 days in 1966 (they held the record for the longest spaceflight of any creature until 1971); and Tsygan and Dezik, the first dogs to experience suborbital flight, in 1951.

Parr’s photographs are equally revealing. In a Soviet Russia where individualism was discouraged and celebrity shunned, the state not only deliberately turned its animal cosmonauts into superstars, but it did so through an array of commercial products. There are clocks, stamps, plates, books, confectionery boxes, pen holders and much more. And did you know there was a Russian-made “Laika” brand of cigarettes that lasted until 1990?

This small book is stylishly designed, striking a balance between humour, pathos, historical facts and adorable photos of dogs. What more could you want?

  • 2019 Laurence King 128pp £12.99

Compact high-voltage electron microscope overcomes coherence problem

A high-voltage transmission electron microscope (HVTEM) small enough to fit inside a university lab has been built for the first time by researchers in Japan. The team, led by Takumi Sannomiya at Tokyo Institute of Technology, used radio-frequency (RF) cavities to chop and accelerate electrons into coherent beams. Their achievement comes five decades after researchers first attempted to use RF linear accelerators for electron microscopy – and could lead to a wide range of new sub-nanometre imaging applications.

TEM uses the wave-like properties of electrons to obtain images of thin samples (100 nm or thinner) with spatial resolutions better than 1 nm. TEM has allowed scientists to visualize objects as tiny as single hydrogen atoms.

Most TEMs use beams of electrons in the energy range 60–300 kV, but higher-energy electrons offer two advantages. First, higher-energy electrons have shorter wavelengths and therefore reveal smaller features. Second, higher-energy electrons can pass through thicker and denser samples, increasing the types of samples that can be imaged.
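The wavelength gain follows directly from the relativistic de Broglie relation; the quick calculation below (standard physical constants, accelerating voltages spanning those mentioned in the article) makes the point.

```python
import math

H = 6.626e-34       # Planck constant, J s
M_E = 9.109e-31     # electron rest mass, kg
E = 1.602e-19       # elementary charge, C
C = 2.998e8         # speed of light, m/s

def electron_wavelength(voltage):
    """Relativistic de Broglie wavelength (m) of electrons accelerated through `voltage` volts."""
    ke = E * voltage                                            # kinetic energy, J
    p = math.sqrt(2.0 * M_E * ke * (1.0 + ke / (2.0 * M_E * C ** 2)))
    return H / p

for kv in (100, 300, 500, 1000):
    print(f"{kv:5d} kV -> {electron_wavelength(kv * 1e3) * 1e12:.2f} pm")
# 300 kV gives ~1.97 pm; 1 MV shortens this to ~0.87 pm
```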

Much too large

One barrier to boosting the energy of TEMs is the size of conventional electrostatic electron accelerators. A room-sized implementation is limited to about 300 kV, whereas an HVTEM running at 1 MV must be housed within a building-sized space.

Since the 1970s HVTEM designers have tried to overcome this problem by using RF cavity accelerators, which in principle can deliver high-energy electrons in a much smaller space. However, the lack of coherence of the electron beams from such accelerators has made this difficult.

Synchronized chop

Sannomiya’s team have solved this problem by using a series of RF-cavity components that maintain the coherence of the beam. The beam is created at 100 kV using a standard TEM accelerator. It then passes through two RF-cavity choppers, where it is cut into pulses that are synchronized for the next stage of the journey, which is a 400 kV RF-cavity accelerator. This synchronization ensures that coherent pulses pass through the sample. The transmitted electrons are then decelerated to 200 kV using a final RF cavity so that they can be focused and detected using a standard TEM set-up.

The team’s instrument is small enough to fit comfortably in their lab, but can still accelerate electrons to 500 kV, which is around half the energy achievable at much larger facilities. Sannomiya and colleagues demonstrated the efficacy of their HVTEM by imaging sub-nanometre features within micron-thick samples – much thicker than the 100 nm limit of most TEMs.

The team now hopes to improve their microscope further using superconducting cavities, which would accelerate electron beams to even higher voltages, while making the device more compact and energy efficient. With these upgrades, the HVTEM could find a diverse range of new imaging applications, including atomic-scale tomography for biological tissues and whole cells, as well as in situ observations of liquid and gas environments.

The HVTEM is described in Physical Review Letters.
