
Ultracold atoms put high-temperature superconductors under the microscope

Physicists have deployed a Bose–Einstein condensate (BEC) as a “quantum microscope” to study phase transitions in a high-temperature superconductor. The experiment marks the first time a BEC has been used to probe such a complicated condensed-matter phenomenon, and the results – a solution to a puzzle involving transition temperatures in iron pnictide superconductors – suggest that the technique could help untangle the complex factors that enhance and inhibit high-temperature superconductivity.

A BEC is a state of matter that forms when a gas of bosons (particles with integer quantum spin) is cooled to such low temperatures that all the bosons fall into the same quantum state. Under these conditions, the bosons are highly sensitive to tiny fluctuations in the local magnetic field, which perturb their collective wavefunction and create patches of greater and lesser density in the gas. These variations in density can then be detected using optical techniques.
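
To see how a quantum-gas transducer works in principle, note that in the Thomas–Fermi regime the condensate density mirrors the local potential, n = (μ − V)/g, so a small Zeeman shift δV = m_F g_F μ_B δB appears as a density change δn = −δV/g. Below is a minimal sketch of that conversion for rubidium (the species used later in the article); the choice of the |F = 2, m_F = 2⟩ state and the example numbers are illustrative assumptions, not details from the experiment.

```python
import numpy as np

hbar = 1.054571817e-34   # reduced Planck constant, J s
mu_B = 9.2740100783e-24  # Bohr magneton, J/T
a0 = 5.29177210903e-11   # Bohr radius, m

# Rubidium-87 parameters (illustrative)
m_Rb = 1.443160648e-25        # atomic mass, kg
a_s = 100.4 * a0              # s-wave scattering length, m
g_int = 4 * np.pi * hbar**2 * a_s / m_Rb  # mean-field interaction constant

gF, mF = 0.5, 2  # assumed trapped state |F=2, m_F=2>, so V = mF*gF*mu_B*B

def field_from_density(delta_n):
    """Convert a measured condensate density modulation delta_n
    (atoms/m^3) into a magnetic-field variation (T), using the
    Thomas-Fermi relation delta_V = -g_int * delta_n and the linear
    Zeeman shift delta_V = mF*gF*mu_B * delta_B."""
    return -g_int * delta_n / (mF * gF * mu_B)

# Example: a 1% density dip on a condensate of 1e20 atoms/m^3
print(f"delta B ~ {field_from_density(-0.01 * 1e20):.1e} T")
```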

The new instrument, known as a scanning quantum cryogenic atom microscope (SQCRAMscope), puts this magnetic field sensitivity to practical use. “Our SQCRAMscope is basically like a microscope – a big lens, focusing light down on a sample, looking at the reflected light – except right at the focus we have a collection of quantum atoms that transduces the magnetic field into a light field,” explains team leader Benjamin Lev, a physicist at Stanford University in the US. “It’s a quantum gas transducer.”

Making a practical probe

The SQCRAMscope grew out of Lev’s earlier work on atom chips. In these devices, clouds of ultracold atoms are confined within a vacuum chamber to isolate them from their environment and levitated with magnetic fields generated by a miniaturized circuit, or chip.

One possible application of atom chips is to use these suspended atom clouds as sensitive, high-resolution probes of magnetic fields. But there was a problem: whenever researchers wanted to change the configuration of their chips, or bring a new sample of material near the atoms, they needed to break the vacuum, remove various optical components, take the chip out, and then put everything back. “It would take literally months to change anything,” Lev says. “It was not very good for rapid trials.”

A further complication is that running current through an atom chip causes the chip to heat up, affecting the temperature of any nearby sample in ways that are hard to predict or control. This, Lev observes, is “not ideal if you want to study phase transitions” and other temperature-dependent phenomena.

The Stanford team’s solution was to separate their atom chip from the samples they wanted to study. While the ultracold rubidium atoms in the SQCRAMscope remain inside a vacuum chamber, the chip used to magnetically trap them is located just outside it, with a few-hundred micrometre gap in between for the samples to slide in. “It’s a crazy Rube Goldberg kind of device, but it works,” Lev says.

A pnictide puzzle

After testing their SQCRAMscope on samples of gold wire, the researchers turned to a far more complex material: an iron pnictide superconductor with the chemical formula Ba(Fe₁₋ₓCoₓ)₂As₂. At room temperature, this material is a metal with a tetragonal crystal structure, but as it cools, it undergoes a transition to a nematic phase at a temperature that depends on the doping fraction x. At this point, the material’s crystalline symmetry is partially broken, and it exhibits a characteristic patchwork pattern of magnetic domains. However, the exact temperature of this nematic transition was the subject of some debate, because measurements that were sensitive to the material’s bulk properties gave one answer, while methods that focused on its surface properties suggested another.

Graph of the sample’s birefringence and magnetic-field domains showing order at temperatures below the transition temperature and disorder above it

Enter the SQCRAMscope. “These spatial patterns [in the magnetic domains] develop on the scale of a few microns, and that’s exactly matched to the spatial resolution of the SQCRAMscope,” Lev says. With help from collaborators in Stanford’s Geballe Laboratory for Advanced Materials, he and his team prepared a sample of the iron pnictide, connected it to gold wires, and used the SQCRAMscope to image magnetic field fluctuations as current flowed through it. By combining these magnetometry measurements with separate measurements of the sample’s optical birefringence, the team could study the sample’s bulk and surface magnetic properties simultaneously, and resolve the debate engendered by previous conflicting measurements.

What they found is that, contrary to hints of a discrepancy in previous studies, the material’s structural transition to a nematic phase and its electronic transition occur at the same temperature: about 135 K for an undoped sample, and 96.5 K for a sample with 2.5% doping (see image). “People are excited about multimodal probes, where you can do two distinct measurements at the same time without any sort of change to the sample or substrate,” Lev says. “We have a perfect example of that, and it’s because the atoms are transparent to all wavelengths except for those that are resonant with rubidium.”

Scope for improvements

Now that the SQCRAMscope has proved its worth as a means of studying strongly correlated materials, Lev says that he and his colleagues have “a long list of fun projects to do”. Possible follow-up experiments include using the SQCRAMscope to probe other properties of pnictides (including superconducting electron transport) and to study electron transport in 2D materials such as graphene. The team is also improving the instrument’s technical capabilities by reducing its use of cryogens, extending its range of operating temperatures and designing a new mount to make sample exchange easier.

Further details of the study appear in Nature Physics.

Rock, paper…plastic? A quiz about pop songs and bands linked to materials

1. Which of these artists has never had a song with the word “diamonds” in the title?

A. The Beatles B. David Bowie C. Paul Simon D. Rihanna

2. What metal was the title of the 2011 chart-topper by David Guetta, in which the singer Sia likens her strength to this material’s ability to be “bullet-proof” and “stone hard”?

A. Chromium B. Platinum C. Titanium D. Vanadium

3. In Spandau Ballet’s cheesy 1983 hit “Gold”, what attribute of the metal does singer Tony Hadley use to describe his lover?

A. Immutable B. Imperishable C. Incorruptible D. Indestructible

4. Which of these artists once recorded a song called “Made of Stone”?

A. Joss Stone B. Sly and the Family Stone C. The Rolling Stones D. The Stone Roses

5. Lady Gaga’s name can be spelled out using the symbols of which of the following groups of elements?

A. Lanthanum Dysprosium Gadolinium Gadolinium  B. Lawrencium Dysprosium Gadolinium Gadolinium C. Lanthanum Dysprosium Gallium Gallium D. Lawrencium Dysprosium Gallium Gallium

6. What two materials formed the title of a 1965 number-one hit by British band Unit 4 + 2?

A. “Cement and Ceramic” B. “Cement and Clay” C. “Concrete and Ceramic” D. “Concrete and Clay”

7. What was the stage name of Marianne Joan Elliot-Said, the lead singer of influential 1970s punk band X-Ray Spex?

A. Poly Methyl Methacrylate B. Poly Styrene C. Poly Thene D. Poly Vinyl Chloride

8. German pop star Drafi Deutscher – whose career famously nosedived after he urinated off a balcony – had a hit song in 1965 with what three materials in the title?

A. Gold, silver, bronze B. Concrete, cement, granite  C. Marble, stone, iron D. Rock, chalk, clay

9. Which one of the following bands with a metal in the title never, as far as we know, existed?

A. Tin Machine B. Steel Pulse C. Iron Maiden D. Brass Monkey

10. What’s the final element to be named at the very end of Tom Lehrer’s song “The Elements”?

A. Antimony B. Barium C. Sodium D. Zirconium

11. And finally, what Queen song, written by PhD physicist Brian May, is also the atomic mass of potassium?

A. “37” B. “38” C. “39” D. “40”

Stuck for the answers? We’ve listed them below.

With thanks to Lou Bailey, Tami Freeman, Hamish Johnson, Ed Jost and Curtis Zimmermann for inspiration.

Answers: 1. B (David Bowie had a song Diamond Dogs, which had “diamond” in the singular) 2. C  3. D  4. D  5. C  6. D  7. B  8. C (the German song was called Marmor, Stein und Eisen Bricht) 9. D  10. C  11. C

First report of clinical MRI-Linac treatments wins journal citations prize

A research paper describing the first clinical use of a 1.5 T MRI-Linac for MRI-guided radiotherapy has won its authors the 2020 Physics in Medicine & Biology (PMB) citations prize. This annual prize recognizes the PMB paper that received the most citations in the preceding five years.

The paper, First patients treated with a 1.5 T MRI-Linac: clinical proof of concept of a high-precision, high-field MRI guided radiotherapy treatment, was written by researchers from UMC Utrecht, where the MRI-Linac concept was originated, and Elekta, which has now commercialized the system as the Elekta Unity.

Image-guided radiotherapy provides a means to visualize a tumour target in real time, and the use of MRI for this guidance confers exceptional soft-tissue contrast without the need for ionizing radiation. The winning paper describes the first-in-man treatments using the prototype MRI-Linac, which integrates a diagnostic-quality 1.5 T MR scanner with a state-of-the-art linear accelerator.

The team treated four patients with lumbar spine bone metastases, chosen as these tumours are clearly visualized on MRI and the surrounding spinal bone can be detected on the integrated portal imager, providing independent verification of the targeting accuracy. Patients were treated with an intensity-modulated radiotherapy (IMRT) plan created while they were on the treatment table and based on the online MR images.

This study demonstrated that the MRI-Linac concept does indeed work in the clinic and that radiation can be precisely targeted using high-field MRI guidance. Absolute doses were found to be highly accurate, with deviations ranging from 0.0% to 1.7% at the isocentre. Portal imaging confirmed that the MRI-based targeting was accurate to better than 0.5 mm, with errors ranging from 0.2 mm to 0.4 mm.

First author Bas Raaymakers, who has pioneered this work on hybrid MRI radiotherapy systems since 1999, suggests that the paper’s popularity may be due to readers being pleased to see that the MRI-Linac worked with patients.

“The possibility of acquiring 1.5 T MRI data that reveal anatomical details and motion prior to treatment does resonate very well with existing efforts on image-guided radiotherapy,” he explains. “The fact that these data can also be acquired during irradiation offers all kinds of exciting opportunities to move towards real-time adaptive radiotherapy.”

Into the clinic

The paper represented the final step in demonstrating the technical feasibility of the MRI-Linac. Since its publication, the MRI-Linac has gained 510(k) clearance and CE marking, and there are now around 30 systems in clinical use around the globe. UMC Utrecht has two clinical MRI-Linacs, with a third expected in 2021.

Other advances include the move from treating vertebral bodies to soft-tissue targets, such as lymph nodes, prostate tumours, rectal tumours and others. Improvements in workflow, meanwhile, have led to an average treatment slot of 45 minutes for the majority of patients.

“Within research, we are now prototyping next-generation workflows with intra-fraction replanning to get to the real-time adaptive radiotherapy,” Raaymakers tells Physics World.

PMB authors

The PMB citations prize is marked with the presentation of the Rotblat medal, named in honour of Sir Joseph Rotblat, PMB’s second and longest-serving editor. “This award is a much appreciated recognition and motivation for the entire department and the long-standing, ongoing industrial collaborators,” says Raaymakers.

“Of course, we are intrinsically motivated to keep on improving, it is really fun and rewarding work. But at the same time it feels good that the long journey to get to the clinic is something that is followed and cited by the PMB audience.”

Quantum software company tackles big computing challenges

“My mission is to demystify quantum computing,” says Ilyas Khan, who is founder and chief executive of Cambridge Quantum Computing (CQC) – a UK-based provider of software for quantum computers. Khan is our guest in this episode of the Physics World Weekly podcast, and he explains how CQC helps its clients use quantum computers to solve big problems in the design of pharmaceuticals, machine learning and cybersecurity.

Later in the programme, Physics World editors talk about what is new in physics this week, including how to make a Bose–Einstein condensate using perovskite excitons; a new way of tackling tumours using immunotherapy and a beam of carbon ions; and clever ways of using spin impurities as quantum bits.

This podcast is sponsored by Teledyne Princeton Instruments.

US president-elect Joe Biden set to put science centre stage

One of Joe Biden’s first moves as he became US president-elect earlier this month was to appoint a new coronavirus advisory panel. Featuring senior figures such as Vivek Murthy, a former US surgeon general, and David Kessler, a former commissioner of the Food and Drug Administration, the panel brought relief to many and contrasted sharply with the anti-science rhetoric of departing US president Donald Trump. While Biden emphasized that his administration will be “built on a bedrock of science”, the election failed to deliver a convincing majority for the former vice-president, meaning that his science-based agenda could face an uphill battle in Congress.

For many in the US scientific community, however, Biden’s election represents a return to the normality that disappeared during Trump’s four years in office. “[Physicists] have expressed delight at the election result and look forward to working with the new administration,” says physicist Philip Bucksbaum from Stanford University, who is president of the American Physical Society (APS). That view is backed by Michael Moloney, chief executive of the American Institute of Physics (AIP), who says that many scientists will now be “eager” to serve in the next government.

Yet the prospective benefits of a science-friendly administration face obstacles. In the short term, Trump’s refusal to concede defeat has slowed down exchanges of information necessary for a smooth transition on 20 January, particularly when it comes to containing COVID-19. The Trump administration has also announced fresh appointments and actions intended to continue its policy of emphasizing economics over science. In mid-November, for example, the administration removed Michael Kuperberg from his post overseeing the National Climate Assessment, which provides the basis for regulations on global warming. His intended replacement is the University of Delaware geographer David Legates, who denies anthropogenic climate change.

A longer-term difficulty is the failure of a “blue wave” to emerge during the elections, which Democrats had hoped would give them control of the Senate and increase their majority in the House of Representatives. The party in fact lost seats, although not its majority, in the House, while control of the Senate now depends on run-off elections for two Senate seats in Georgia on 5 January. An unlikely Democratic win in both seats would produce a Senate divided 50–50, giving the deciding vote to vice-president-elect Kamala Harris. But even then, the administration could face problems pursuing its science-based agenda, given that much legislation requires 60 votes in the Senate to move forward.

Reversing policies

Biden will have some immediate opportunities to fulfil his vow of relying on science. He has nominated Avril Haines, who has an undergraduate degree in physics from the University of Chicago, as director of national intelligence. His administration is also expected to soon overturn three Trump administration actions by re-joining the Paris climate agreement, the World Health Organization and the Joint Comprehensive Plan of Action on Iran. Biden can achieve those aims with executive orders that do not need Senate approval and he can also renew the START treaty on nuclear-weapons reduction, which expires on 5 February. “Certainly, the physics community is interested in seeing an extension,” adds Bucksbaum.


Hope has also emerged regarding climate change, with insiders seeing signs that some Republican legislators have begun to take a nuanced view of the issue. “The scientific evidence has got stronger and the potential outcomes of climate change scarier,” says former US presidential science adviser Neal Lane, who is a senior fellow in science and technology policy at Rice University. “A lot of Republican senators realize there’s a problem here and that they must find a way to change.” Indeed, Illinois Representative Bill Foster, the only PhD physicist in Congress, adds that Republicans on the House Science Committee – with the approval of House Republican leader Kevin McCarthy – have already proposed doubling the US research budget to help develop technologies to mitigate climate change.

During Trump’s presidency, the Environmental Protection Agency (EPA) overturned several government regulations governing other aspects of the environment, such as limits on toxic chemicals in the air and soil, the commercial use of government-owned land and the efficiency of car fleets. Since it used executive orders for several of those actions, the Biden administration can counter with its own orders.

The Biden administration is also expected to quickly overturn Trump’s severe restrictions on immigration. “AIP has been on the record about how limits to immigration negatively impact the scientific enterprise as well as how increasing diversity, accessibility and belonging within the STEM workforce will lead to better scientific outcomes,” says Moloney. Those views are backed by Bucksbaum, who says that the APS is working to persuade the new administration to overturn a Trump proposal that would limit certain overseas students’ visas to two years, rather than the traditional duration sufficient to take individuals through the completion of graduate studies.

New appointments

As Biden announces nominees for his cabinet, the science community will focus attention on the new presidential science adviser to replace the incumbent, atmospheric scientist Kelvin Droegemeier. Lane would like to see a woman in the role for the first time and believes it should be a cabinet-level position. “The first science adviser needs to have a good understanding of health issues,” he says. “They should be the key point person in dealing with the pandemic on the ground.”

Another open post is NASA boss. In an interview with Aviation Week after the election, current NASA administrator Jim Bridenstine announced he would resign – even if Biden asked him to stay on. A former Oklahoma Congressman who has led the agency since April 2018, Bridenstine overcame initial suspicions about his attitude to climate change, and many think he has run NASA smoothly and efficiently, for example by advancing the schedule for human Moon landings. “He’s managed to work well with both [political] parties,” says Lane, a fellow Oklahoman. “He’s done a good job working with SpaceX and moving toward stronger partnerships with the private sector.”

Whether the rest of the government’s science-based agencies can work as effectively as NASA remains to be seen. Several political pundits assert that Biden is a natural conciliator, whose long service in Congress has gained him the respect of Republican legislators. That could help him to achieve agreement on science-friendly policies. “It will be important for both Republicans and Democrats to focus on bipartisan issues,” says Bucksbaum. “Science can then return to the historical position of being something that everybody likes.”

American Physical Society tightens rules on meeting locations

The American Physical Society (APS) has sharpened its criteria when choosing the locations in which to hold future meetings, saying it will ask the leadership of possible cities to report on local policing policies and related demographic issues.

The society will, in particular, examine statistics on the use of force by the local police; policies on police use of strangleholds and other methods of restraint; as well as the status of investigations into the deaths of individuals in police custody. Candidate cities will be expected to train all their police in de-escalation methods; have evaluations of policing performance that include shootings and deaths of unarmed individuals by police officers; as well as have “a well-defined plan and timetable for improving local policing practices”.

The policy has been introduced following the use of excessive force by the police, particularly against members of under-represented or minority groups. “We became aware that members of our community are vulnerable in ways we had never imagined, and we resolved to act,” says Susan Gardner of the University of Kentucky, who chairs the APS’s committee on scientific meetings. APS president Philip Bucksbaum of Stanford University told Physics World that the membership has been the motivation for the move. “We hear loud and clear that our members see that they shouldn’t have to go to meetings in cities where they feel unsafe and threatened,” he says.

Deep-learning model enables rapid lymphoma detection in PET/CT images

Whole-body positron emission tomography combined with computed tomography (PET/CT) is a cornerstone in the management of lymphoma (cancer in the lymphatic system). PET/CT scans are used to diagnose disease and then to monitor how well patients respond to therapy. However, accurately classifying every single lymph node in a scan as healthy or cancerous is a complex and time-consuming process. Because of this, detailed quantitative treatment monitoring is often not feasible in clinical day-to-day practice.

Researchers at the University of Wisconsin-Madison have recently developed a deep-learning model that can perform this task automatically. This could free up valuable physician time and make quantitative PET/CT treatment monitoring possible for a larger number of patients.

To acquire PET/CT scans, patients are injected with a sugar molecule marked with radioactive fluorine-18 (¹⁸F-fluorodeoxyglucose). When the fluorine atom decays, it emits a positron that instantly annihilates with an electron in its immediate vicinity. This annihilation process emits two back-to-back photons, which the scanner detects and uses to infer the location of the radioactive decay.
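
The geometry behind that inference is simple enough to sketch: each coincidence defines a line joining the two detectors that fired, and in a time-of-flight measurement the photons’ arrival-time difference locates the decay along that line. The snippet below is a generic illustration of this reconstruction step, with invented detector positions rather than any real scanner’s geometry:

```python
import numpy as np

C = 2.99792458e8  # speed of light, m/s

def annihilation_point(det_a, det_b, dt):
    """Estimate the annihilation position from one coincidence event.

    det_a, det_b : 3D positions (m) of the two detectors that fired.
    dt           : arrival time at A minus arrival time at B (s).

    The decay lies on the line A-B; if the photon reached A later,
    the source sits closer to B, displaced c*dt/2 from the midpoint.
    """
    a, b = np.asarray(det_a, float), np.asarray(det_b, float)
    u = (b - a) / np.linalg.norm(b - a)  # unit vector from A towards B
    midpoint = (a + b) / 2
    return midpoint + u * (C * dt / 2)

# Example: detectors 80 cm apart on the x-axis; photon hits A 200 ps late,
# so the decay is displaced 3 cm from the centre towards detector B.
print(annihilation_point([-0.4, 0, 0], [0.4, 0, 0], 200e-12))
```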

Because tumours grow faster than most healthy tissue, they must consume more energy. Much of the radioactive tracer will therefore be drawn towards the lymphoma lesions, making them visible in the PET/CT scan. However, other types of tissue, such as certain fatty tissues, can “light up” the scans in a similar manner, which can lead to false positives.

Neural networks: accurate and fast

In their study, published in Radiology: Artificial Intelligence, Amy Weisman and colleagues investigated lesion-identifying deep-learning models built from different configurations of convolutional neural networks (CNNs). They trained, tested and validated these models using PET/CT scans of 90 patients with Hodgkin lymphoma or diffuse large B-cell lymphoma. For this purpose, a single radiologist delineated lesions within each scan and classified each one on a scale from 1 to 5, depending on how sure they were that a lesion was malignant.

The researchers found that a model consisting of three CNNs performed best, identifying 85% of manually contoured lesions (923 of 1087, the so-called true positive rate). At the same time, it falsely identified four lesions per patient (the false positive rate). The time to evaluate a single scan was cut from 35 minutes using manual delineation to under two minutes for the model.

It is extremely difficult to classify every lymph node in a scan as cancerous or not with 100% certainty. Because of this, if two radiologists delineate lesions for the same patient, they are not likely to agree with each other completely. When a second radiologist evaluated 20 of the scans, their true positive rate was 96%, while they marked on average 3.7 malignant nodes per patient that their colleague had not. In these 20 patients, the deep-learning model had a true positive rate of 90%, at 3.7 false positives per scan – putting its performance almost within the range of variation between two human observers.
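
As a back-of-the-envelope check on those figures, the sketch below aggregates per-patient lesion counts into the two headline metrics, the true positive rate and the number of false positives per patient; the function and the single summary “patient” are illustrative, not the study’s actual evaluation code:

```python
def detection_metrics(patients):
    """Aggregate lesion-detection metrics over a cohort.

    patients: list of dicts with per-patient counts:
      'tp' - model detections matching a manually contoured lesion
      'fn' - manually contoured lesions the model missed
      'fp' - model detections with no matching manual lesion
    """
    tp = sum(p["tp"] for p in patients)
    fn = sum(p["fn"] for p in patients)
    fp = sum(p["fp"] for p in patients)
    tpr = tp / (tp + fn)                # sensitivity / true positive rate
    fp_per_patient = fp / len(patients)
    return tpr, fp_per_patient

# Cohort totals quoted in the study: 923 of 1087 lesions found, ~4 false
# positives per patient. One summary "patient" reproduces the headline numbers.
tpr, fppp = detection_metrics([{"tp": 923, "fn": 1087 - 923, "fp": 4}])
print(f"TPR = {tpr:.1%}, FP/patient = {fppp}")  # TPR = 84.9%, FP/patient = 4.0
```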

Expected, and unexpected, challenges

Often, one of the biggest hurdles in creating this type of model is that training it requires a large number of carefully delineated scans. The study authors tested how well their model performed depending upon the number of patients used for training. Interestingly, they found that a model trained on 40 patients performed just as well as one trained on 72.


According to Weisman, obtaining the detailed lesion delineations for training the models proved a more challenging task: “Physicians and radiologists don’t need to carefully segment the tumours, and they don’t need to label a lesion on a scale from 1 to 5 in their daily routine. So asking our physicians to sit down and make decisions like that was really awkward for them,” she explains.

The initial awkwardness was quickly overcome, though, says Weisman. “Because of this, Minnie (one of our physicians) and I got really close during the time she was segmenting for us – and I could just text her and say ‘What was going on with this image/lesion?’. Having a relationship like that was super helpful.”

Future research will focus on incorporating additional, and more diverse, data. Acquiring more data is always the next step for improving a model and making sure it won’t fail once it’s being used, says Weisman. At the same time, the group is working on finding the best way for clinicians to use and interact with the model in their daily work.

A matter of evidence

“This experimental evidence does not match our theoretical prediction. We shall admit its failures and attempt a new theory with neither shame nor pride.” That’s how good science should work. It’s what we strive for. But it’s not what always happens in practice. Sadly, the following has become all too familiar: “This evidence does not match our theoretical prediction. Let’s pick out the best bits and publish anyway or we’ll be accused of not doing our jobs properly and lose funding.” Outside of science you might get: “It’s a shame the evidence isn’t what we hoped for. Perhaps if we put trust in groundless hope, then it might change tomorrow.” Or, worse still: “If the evidence suits us, then use it. If it partly suits us, then twist it. If it doesn’t suit us, then either ignore or debunk without backup.” And, of course, there’s the not-too-uncommon: “To hell with evidence!”

Why is “evidence” such an emotive, even political, concept? For one thing, it’s easy to find evidence to suit your needs. For example: on 4 June 2020, the temperature in Bristol was below the long-term average for that day of the year. This is a piece of evidence that could be used to suggest that climate change is not happening. It might well be an inconvenient truth that evidence to the contrary is overwhelming, but it’s still easy to find some evidence either way.


Consider the other side to evidence: our theoretical understanding of the world. In our early years of life, we learn basic mechanics at a considerable rate, and we do so mostly independently. Before our first birthday, we learn that if you push a block on a surface, then it moves. Not much later, we experiment with rough and smooth surfaces and see moving blocks come to a halt. We are using these experiments to build rudimentary empirical models of the world and these models tend to be just good enough for what we need: “live” models in our brain to help us spear a mammoth, or push a block along an inclined plane. At school, we teach the formal theories of classical mechanics to numerically predict the motion of said blocks. At university, physics students learn about how relativity and quantum mechanics affect motion at extreme speeds or scales.
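
To make the middle, “classical” layer concrete: Newton’s laws say a block pushed to speed v₀ across a horizontal surface with friction coefficient μ decelerates at μg and stops after a distance v₀²/(2μg). A minimal sketch, with made-up values of μ:

```python
g = 9.81  # gravitational acceleration, m/s^2

def sliding_distance(v0, mu):
    """Classical-layer model: a block pushed to speed v0 (m/s) on a
    horizontal surface with kinetic friction coefficient mu decelerates
    at a = mu*g, so it stops after d = v0**2 / (2*mu*g) metres."""
    return v0**2 / (2 * mu * g)

# The toddler's empirical lesson, recovered from the formal theory:
# rougher surfaces (larger mu) stop the block sooner.
for mu in (0.1, 0.4, 0.8):
    print(f"mu = {mu}: a 1 m/s push slides {sliding_distance(1.0, mu):.2f} m")
```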

Does this succession of progressively more sophisticated modelling mean we disregard the previous, simpler methods? Of course not. Even the most enthusiastic physicists would not expect relativity to be considered when checking their car speedometers. For the mechanics case above, we have three layers of sophistication at our disposal: an empirical model for day-to-day life, a classical theoretical model for most engineering tasks, say, and relativistic/quantum theories for extreme situations. I’m sure we could subdivide these layers further, but the point is this: we use a sophistication level to match the needs of the problem at hand. Yet, this idea of simplified models of a given phenomenon causes much confusion – just as the significance of different forms of evidence does. It has a similar propensity to carve chasms between science and the wider world.

When it comes to theory and simulation, we face similar communication difficulties to those for evidence. A recent article in the Washington Post, published during the early days of the COVID-19 pandemic, presented successively sophisticated numerical simulations of the spread of the virus, subject to varying social-distancing measures. The simulations were all grossly simplistic – modelling people as spheres bouncing against each other in a confined plane – but it was a fine article that made its limitations explicit.
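
For readers curious what such a toy model looks like, a sketch in the same spirit takes only a few dozen lines: agents wander around a box, a “distancing” fraction stays put, infection spreads on close approach, and infected agents eventually recover. All parameters below are arbitrary; this is an illustration of the genre, not a reconstruction of the newspaper’s simulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n=200, distancing=0.5, steps=2000, box=1.0,
             radius=0.02, speed=0.005, recovery=300):
    """Toy epidemic: n people as points in a box-by-box square.
    A 'distancing' fraction never moves; the rest take random steps.
    Infection spreads on approach within `radius`; infected agents
    recover after `recovery` steps. Returns the peak infected count."""
    pos = rng.random((n, 2)) * box
    mobile = rng.random(n) >= distancing
    status = np.zeros(n, dtype=int)  # 0 susceptible, 1 infected, 2 recovered
    clock = np.zeros(n, dtype=int)
    status[0] = 1                    # patient zero
    peak = 1
    for _ in range(steps):
        step = rng.normal(scale=speed, size=(n, 2))
        pos[mobile] = np.clip(pos[mobile] + step[mobile], 0, box)
        infected = np.where(status == 1)[0]
        sus = np.where(status == 0)[0]
        if len(infected) and len(sus):
            # infect susceptibles within `radius` of any infected agent
            d = np.linalg.norm(pos[sus, None, :] - pos[None, infected, :], axis=2)
            status[sus[(d < radius).any(axis=1)]] = 1
        clock[status == 1] += 1
        status[(status == 1) & (clock > recovery)] = 2
        peak = max(peak, int((status == 1).sum()))
    return peak

# More distancing flattens the peak of simultaneous infections
for f in (0.0, 0.5, 0.9):
    print(f"distancing fraction {f}: peak infected = {simulate(distancing=f)}")
```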

According to the scientific method, the response might be: “The simulations show greatly reduced loss of life for the case of moderate-to-extensive distancing. The model’s assumptions are huge so we must improve it, but the general conclusions are compatible with other works so regard this as further evidence in support of social-distancing measures, subject to the results of future improved simulations, manageable financial impacts, and other considerations.” However, a well-meaning, if somewhat naive, reader might argue: “Well that was interesting, but people aren’t anything like little balls floating in space! Clearly this doesn’t prove anything.” A more hostile reader might contend: “This is ridiculous; no wonder people are sick of experts.”

If a science communicator over-simplifies evidence or theory, then they might be accused of “dumbing down” or presenting pointless material. If an attempt is made to convey the intricacies, including (heaven forbid) maths, then the articles become inaccessible to most people. If it is claimed that something is proven beyond doubt (stating with 100.0000% certainty that climate change is real and anthropogenic, for example) then the article is probably dubious itself. Yet if you state something is not technically proven in a true scientific sense, but faces overwhelming evidence and consensus among experts (climate change again, for example), then it’s “just a theory”.

So how best to communicate science? Call it a campaign problem: (a) present evidence, (b) show that a solution exists (assuming it does). The poor old harbinger of bad news who only ever does (a) faces much hard talk: “Give me solutions – or votes – not problems!” Perhaps the world shouldn’t be that way. But it is. And scientists need to accept that. Just as everyone needs to accept the evidence for climate change and the necessity of pandemic-mitigation strategies.

Magnetic perovskite gets the lead out

A new “double perovskite” material could become a more environmentally friendly platform for spintronics devices thanks to its lead-free nature. While the material in its current form is only magnetic below 30 K – too low for practical applications – developers at Linköping University in Sweden, together with colleagues in the US, the Czech Republic, Japan, Australia and China, say that their preliminary experiments are nevertheless a promising step towards making rapid and energy-efficient information storage devices from this novel optoelectronic material.

Halide perovskites in general have an ABX₃ structure, where A is caesium, methylammonium (MA) or formamidinium (FA); B is lead or tin; and X is chlorine, bromine or iodine. Materials of this type absorb light over a broad range of solar-spectrum wavelengths thanks to their tunable bandgaps, and the electrons and holes within them diffuse quickly over long distances (that is, they have high charge-carrier mobilities and lifetimes). These properties make them attractive building blocks for high-performance optoelectronic devices such as solar cells, light-emitting diodes, lasers and photodetectors.

Spin-related properties

Recently, researchers discovered that lead halide perovskites also boast interesting spin properties thanks to lead’s strong spin-orbit coupling. This coupling links the motion of an electron to its quantum spin, and its strength determines how much the intrinsic spin of an electron will interact with the magnetic field induced as the electron moves through the material. Such a coupling is therefore important not only for the magnetic properties of a material, but also for the performance of any spintronics devices – that is, devices that exploit the spin of an electron as well as its charge – that are made from it.

Until now, lead-based halide perovskites were thought to be the only materials in their class to possess this desirable magnetic property. Because lead is toxic for humans, animals and the wider environment, its presence has limited the materials’ development.

Preliminary experiments

Now, however, a team led by Feng Gao has created a new perovskite material that retains the magnetic properties of its lead-based cousin but contains paramagnetic iron ions (Fe³⁺) instead of lead. The team created this material by incorporating the iron ions into a perovskite made of caesium, silver, bismuth and bromine, with the chemical formula Cs₂AgBiBr₆. In a series of measurements made using near-edge X-ray absorption fine structure (NEXAFS) and solid-state nuclear magnetic resonance (ssNMR) techniques, Gao and colleagues showed that Fe³⁺ replaces Bi³⁺ in this iron-alloyed double perovskite and forms FeBr₆ clusters that are evenly distributed throughout the material’s crystal structure.

According to combined SQUID (superconducting quantum interference device) and ESR (electron spin resonance) measurements, the new perovskite material is magnetic at temperatures below 30 K. While Gao acknowledges that this temperature is too low for practical applications, he also points out that the material is still at a very early stage in its development. The researchers add that they are not even completely sure what is causing its magnetic response, although their results suggest that it is probably due to a weak ferromagnetic or anti-ferromagnetic response from localized regions in the material.

“If so, we have a whole new class of halide double perovskite alloys that can potentially be used for spintronics applications,” Gao says. “But more research is needed, not least to obtain the magnetic properties at higher temperatures.”

The researchers, who report their work in Science Advances, say they now plan to repeat their experiments at higher pressures and use chemical co-doping and alloying to try to retain the material’s magnetic properties at higher temperatures. “We will also be focusing on the structure-property relationships of magnetic double perovskites to better understand how to design these materials,” Gao tells Physics World. “We hope that our work will encourage future efforts in exploring spintronic double perovskites for rapid and energy-efficient information storage.”

Medicare Radiation Oncology Alternative Payment Model: what you need to know

Want to learn more on this subject?

Join our webinar with Shawn Prince and Awais Mirza from Accuray, who will provide an overview of the finalized Medicare Radiation Oncology Alternative Payment Model (RO-APM). This payment model is designed to test whether bundled, prospective, site-neutral, modality-agnostic, episode-based payments to physician group practices, hospital outpatient departments and freestanding radiation therapy centers for radiotherapy episodes of care reduce Medicare expenditures, while preserving or enhancing the quality of care for Medicare beneficiaries.

Key topics covered in this webinar will include:

  • A detailed review of the RO-APM.
  • RO-APM model implementation timelines and processes.
  • RO-APM billing requirements and timelines.


Shawn Prince has an undergraduate degree in Business Administration from The Ohio State University and a Master of Business Administration from Arizona State University. Shawn is a senior director of patient access at Accuray. His previous professional experience includes reimbursement and sales roles at various pharmaceutical and biotechnology companies.

Awais Mirza is the senior manager of patient access with Accuray. Within the patient access department, Awais works to secure appropriate coding, reimbursement coverage and payment from both Medicare and commercial insurers. He has an expansive background in clinical and administrative radiation oncology and has worked for a number of hospitals as a licensed radiation therapist and administrative manager/leader prior to joining Accuray.

Helen Berman: the crystallographer who pioneered the Protein Data Bank


When Helen Berman was working as an X-ray crystallographer at the Institute for Cancer Research (ICR) in Philadelphia in 1969, she and a handful of colleagues realized that the field was about to be inundated. Until then, determining the structure of a protein – describing the positions of its individual atoms – was a time-consuming process, requiring complex diffraction data to be produced and interpreted by hand. Most proteins have tens of thousands of atoms, and back then barely a dozen or so protein structures had been determined.

But Berman, who was an ICR research associate, realized that improved instrumentation and new computational methods would soon send the number of protein structures skyrocketing. A proper database able to archive voluminous numbers of structures in a standard form was urgently needed, she felt. Accomplishing that goal would require extensive computing capabilities and state-of-the-art computer graphics.

And so it was that at a 1971 conference at the Cold Spring Harbor Laboratory on Long Island, New York, Berman and her colleagues broached the subject with Walter Hamilton, a renowned crystallographer from Brookhaven National Laboratory, inspiring him to create the Protein Data Bank (PDB) that year. Berman’s intuition was correct: entries in the PDB grew exponentially from seven in 1971 to 100 by 1982, 1000 in 1993 and 10,000 in 1999. Today the PDB holds almost 170,000 entries, making it the most important open-access, digital-data resource in biology.
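
That open access is very concrete: any entry can be fetched from the RCSB PDB archive by its four-character identifier, and the fixed-column ATOM records of the legacy PDB format parsed in a few lines. A minimal sketch (the classic myoglobin entry 1MBN is just an example):

```python
from urllib.request import urlopen

def fetch_atoms(pdb_id):
    """Download a PDB entry from the RCSB archive and return a list of
    (atom name, x, y, z) tuples parsed from its fixed-column ATOM records."""
    url = f"https://files.rcsb.org/download/{pdb_id}.pdb"
    atoms = []
    with urlopen(url) as fh:
        for raw in fh:
            line = raw.decode("ascii", errors="replace")
            if line.startswith("ATOM"):
                name = line[12:16].strip()             # atom name, cols 13-16
                x, y, z = (float(line[c:c + 8])        # coordinates in angstroms,
                           for c in (30, 38, 46))      # cols 31-38, 39-46, 47-54
                atoms.append((name, x, y, z))
    return atoms

# 1MBN is the classic sperm-whale myoglobin structure
atoms = fetch_atoms("1MBN")
print(f"{len(atoms)} atoms; first: {atoms[0]}")
```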

Off-the-scale problems

But in 1998, when Berman moved the PDB to Rutgers University, she sensed another imminent crisis. While initially X-ray crystallography was the main tool for determining protein structures, it was soon joined by nuclear magnetic resonance spectroscopy (NMR) and cryogenic electron microscopy (cryo-EM). Researchers were therefore increasingly having to turn to “integrative” structure determination based on data from multiple experimental methods, not just cryo-EM and NMR but also chemical crosslinking, small angle scattering, Förster resonance energy transfer and others.

Berman thought this pointlessly hampered research, since users had to navigate different data-management practices. Seeking a way to allow these differently determined protein models to be entered into a single repository, she quickly realized that each community had not only created different ways to store and manage data, but had also developed a different language, different standards and criteria for acceptability, and different limits on the possible positions of individual atoms in a protein.

For a while, Berman overcame this obstacle by fitting integrative structures into the PDB case-by-case, but it was gruelling work. As she told me, she realized, “No way this is gonna scale!” Berman tried for a while to expand the PDB standards and give them to the different communities involved in integrative modelling, but it didn’t go well. Each community worked differently, making sense of data using procedures and standards appropriate to their research; their members feared that new standards imposed from outside would interfere with creativity and success.



Reactions ranged from polite lack of interest to outright hostility: the day after one presentation, for example, a researcher sent Berman an offensive message insisting he had no intention of doing what “Madame President” wanted. Her first thought was: “How can I get this guy on my side?” Figuring that out wasn’t easy. Or as Berman politely puts it: “Scientists are not usually skilled at social engineering.”

She therefore decided to leave the standards to the communities themselves, befriending members, tracking down the experts, and explaining how success would benefit their communities. Most importantly, she listened to why they felt certain things might be impossible in practice. “That way, the leaders of each community remained the leaders,” says Andrej Sali, a structural biologist at the University of California, San Francisco, who works with Berman. “They kept taking care of their corner of the universe.”

However, the communities needed to figure out ways to exchange data and communicate with each other. The result was PDB-Dev – a flexible test platform that can accommodate different types of integrative structures before they are eventually archived in the PDB. It succeeded in bringing structural biologists and the different experimental communities together, with two committees – one to archive models and exchange data, and the other to validate different models. PDB-Dev released its first structure in 2016.

Berman stepped down as the head of the PDB in 2014, but continues her involvement in it, including leading a workshop last year. “It’s a work in progress,” she admits. “As each structure comes in, we find a new set of problems that we hadn’t thought of before.” PDB-Dev now has 61 structures and aims to fold into the PDB over the next five years.

The critical point

Berman attributes some of her success to her own family background: her father was a surgeon and professor at a medical school, while her mother was a community health organizer in poor neighbourhoods of Brooklyn. “Her talent was to get people to talk together,” Berman said. “The next step is to show people that there’s a problem.” In much the same way, she feels, her skill lies in encouraging scientists to speak outside their technical vocabularies in language those in other communities can understand – including why some things could not work.

After one of Berman’s presentations, an audience member told her that what she was doing sounded like the work of the American political economist Elinor Ostrom (1933–2012). As someone who specialized in describing the principles of managing resources in diverse communities, Ostrom found that the wrong approach was to seek a “theory” for managing combined practices of different groups; the key is to get them to work together first. Or, as Berman puts it: “Science is alive, a fluid thing. Making it go is community work.”
