
Molecular shapeshifter: material that gets thicker when stretched

Most materials get thinner when stretched. Take a rubber band, stretch it along its length, and it will shrink in the other two directions, getting narrower and thinner as you pull. But there are types of materials known as ‘auxetics’ that do the opposite: getting thicker when stretched. Human tendons, cat skin and certain types of metal are examples of natural auxetics, which form tiny voids when stretched, lowering the overall density of the material.

Recently, however, a PhD student at the University of Leeds, UK, discovered a new type of auxetic that displays auxetic behaviour at the molecular scale. Watch the video to find out why this liquid-crystal elastomer (LCE) could be more durable than existing auxetics, making it suitable for body armour, car windows and solar cells. For a more detailed scientific description of this new material, read this feature by Helen Gleeson, the physicist who supervised this research.

#BlackInPhysics week set to celebrate Black physicists

What are the aims of #BlackinPhysics week?

Charles Brown (CB): #BlackInPhysics week, which runs from 25 to 31 October, is dedicated to celebrating the contributions that Black physicists make to science and to revealing a more inclusive picture of what a physicist looks like. The aim of the week is to strengthen intra- and inter-generational connections between Black physicists, encourage long-lasting collaborations and further push for supportive environments where current and future Black physicists everywhere can thrive.

How did the idea come about?

Xandria Quichocho (XQ): There are 12 organizers of #BlackInPhysics week and it has been a team effort. The organizing team was inspired by initiatives such as Black Birders Week, #BlackinAstro and #BlackintheIvory, which gave voice to the aggressions, racism and sexism faced by Black academics across the world. Each of these movements lit the kindling that is growing into a larger fire that is saying “I am Black, I am a scientist and my life matters.” We are all living in the middle of a global revolution that is shouting and telling the world that Black Lives Matter. As people who identify as Black, we have been told time and time again to be neutral and objective in our professional workplaces and places of study. But as the news is saturated with the constant injustices and brutality that are happening to Black, Indigenous and people of colour, it is impossible to pretend that we can go into our classrooms, into our labs and ignore what is going on in the outside world. For us, it is not just the “outside” – it is our everyday lives.

What inspired you to organize #BlackinPhysics?

Eileen Gonzales (EG): I wanted to be part of something that focused on connecting members of our community, especially helping to link students who may be the only Black physicist at their institution. We all wanted to focus on all aspects of being Black in physics and not just our science. We found there is a lack of these things being addressed specifically for Black physicists in our community. It is either you focus on being Black in physics and showcase the science, or you focus on a workshop targeting, say, impostor syndrome, but it will lack the perspective of a Black person. We wanted to provide a space for conversations like what it is like to deal with impostor syndrome, mental health and advocating for yourself – all through the lens of a Black physicist.

What are the most pressing barriers facing Black physicists?

XQ: “Representation, representation, representation!” is a phrase we’ve all been hearing recently. It is talked about regarding movie casting, television programmes, in board rooms and even children’s programming. A lack of representation for Black, Indigenous and people of colour isn’t just a problem in popular media; it is an issue that persists and exists in physics departments and labs. According to data collected by the American Physical Society from 2006 to 2016, only 4% of BSc physics degrees were earned by students who self-identified as Black or African American. Unfortunately, this number does not include students who identified as Black mixed-race, more than one race including Black, or Afro-Latinx and Afro-Indigenous.

Are the barriers simply a problem with underrepresentation?

XQ: It’s just the beginning of the problem. Academia itself is a barrier to Black physicists. Traditional STEM programmes operate under the flawed myth of the meritocracy – and completely ignore the biases, racism and sexism that Black, Indigenous and people of colour combat every day. The structures in academia and in physics intentionally bar Black students from gaining access to the field or remaining in it once they have broken the glass ceiling. For example, about 1800 physics PhDs were awarded in the US in 2017 alone but this year will see only the 100th Black American woman ever to achieve that feat. However, there has been a shift in these past few months to change this. Universities, national labs and individuals that took part in #ShutDownSTEM and the #Strike4BlackLives over the summer made the first steps towards creating a more equitable and inclusive physics field. The American Institute of Physics recently published its extensive and detailed TEAM-UP Report that not only highlights the barriers for Black students in physics but includes real, long-lasting strategies that departments can take to create a safer environment for Black students to thrive in.


What events are planned during #BlackInPhysics week?

EG: We will run both professional and social events targeted at Black physics students, postdocs, faculty, industry and the general public. There are panel discussions aimed at specific career stages such as dealing with impostor syndrome and mental health as well as a three-minute thesis competition for PhD students to showcase their work. We will also host six days of social events including an “ask-a-scientist” session for the general public to engage in conversations with Black physicists, a special Halloween Murder Mystery as well as other events to provide a place for the community to network, relax and socialize. Each day will also feature an article by a Black physicist regarding different aspects of our identities [published jointly on the Physics World and Physics Today websites].

What was the reason for splitting up the week in terms of different areas of physics?

CB: Dedicating an entire day to a single area not only allows physicists within that area to learn about each other and form a community, but also to form a community with varied research interests – often a recipe for innovation and breakthrough. Now, if someone wants to learn about the work of Black atomic physicists, they can simply search the hashtag #BlackinAMO on Twitter. Splitting the week by area has the added benefit of helping employers identify, and potentially hire, candidates for open positions. It also allows us to clearly demonstrate the excellent and wide-ranging work that Black physicists do in their respective areas of physics.

How will you know that the week has been a success or had an impact?

CB: We will judge the success by how effective it has been to celebrate the work of Black physicists and whether it has helped to build an engaged community. It is crucial for our success that this work is done in a way that honours Black physicists’ rich set of identities. Black people are not a monolith, and so neither are Black physicists. We are men, women, non-binary, LGBTQ+, disabled, and in these identities we all enrich the physics discipline. We want to see younger Black physicists discovering their unknown peers and near-peers and we want them to interact with, and be inspired by, more senior Black physicists. We want the senior physicists to learn about the young and talented Black physicists for whom they cleared the path. We want Black physicists of all ranks to learn and hone important skills and tools to help them flourish. And, importantly, we want Black physicists spanning different generations to socialize and have fun with each other at the week’s social events. We also want non-scientists and non-physicists to interact with as many Black physicists as possible.

And what about the benefits to the public?

CB: #BlackinPhysics week offers a unique opportunity for the public and physicists to engage with each other in a way that is atypical, yet beneficial and empowering for all parties involved. For example, the public will be engaged with physics content on social media and will have the ability to speak directly to, and learn from, physicists, thereby increasing scientific literacy. Physicists will also have myriad opportunities to practise presenting their research in concise yet engaging ways to physicists and non-physicists alike.

Do you plan to keep the initiative going?

EG: Yes, it will continue. Part of our mission is to strengthen intergenerational connections between physicists. We hope to encourage the connections made during #BlackInPhysics week and to build new links through our website and on Twitter with additional future events. A longer-term goal we have is to create a database of all Black physicists that can be used for job hiring, presentation invitations and so on.

Do you think the initiative could be mirrored in other countries?

XQ: #BlackInPhysics is a growing international collaboration. While the current organizing team is made of 12 individuals, our nationalities and histories span the globe. One of the amazing things about living in this age of social media is that we can extend the invitation to this event to a global audience. We aren’t using the hashtag, “US Black physicists” or “Black physicists in America” for a reason. #BlackInPhysics is meant for the people of the African diaspora, for immigrants, for Black and African physicists across the world to connect with each other, to build these connections and communities across oceans. #BlackInPhysics week is a part of a global movement of Black scientists coming together to say, “We are here!”

Elekta offers ongoing innovation in precision radiation medicine

From software launches to new radiation therapy equipment, Elekta continues to innovate in the field of cancer-treatment technologies. The company has released, acquired and gained regulatory approval for a range of new radiotherapy products over the last 12 months.

What’s new?

In December 2019, Elekta received 510(k) pre-market notification from the US Food and Drug Administration (FDA) for the use of diffusion-weighted imaging (DWI) datasets obtained with the Elekta Unity MR-linac. This approval expanded the clinical utility of Elekta Unity to include biologic assessment of tumour response during therapy, allowing treatment adaptation based, not just on gross anatomic changes, but also on early biologic changes at the cellular level.

DWI works by creating a map of the diffusion of water molecules at the cellular level, which can be processed to generate the apparent diffusion coefficient (ADC). A growing body of evidence shows that changes in ADC within a tumour can provide important insights into an individual’s tumour response. Such insights, which were previously unavailable during radiation treatment, can support further personalization of the radiotherapy regimen by allowing more tailored dose adaptation. Overall, DWI can improve a clinician’s ability to deliver the right dose to the right part of the tumour.
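As a rough illustration of the arithmetic behind an ADC map: the standard mono-exponential diffusion model, S(b) = S₀·exp(−b·ADC), can be inverted from signals acquired at two b-values. This is a generic textbook sketch, not Elekta Unity's actual processing pipeline, and the numbers below are invented for illustration.

```python
import math

def apparent_diffusion_coefficient(s0, sb, b):
    """Estimate the ADC (in mm^2/s) from two diffusion-weighted signals.

    s0: signal with no diffusion weighting (b = 0)
    sb: signal at diffusion weighting b (in s/mm^2)
    Assumes the mono-exponential model S(b) = s0 * exp(-b * ADC).
    """
    return math.log(s0 / sb) / b

# Invented example values: a voxel whose signal drops from 1000 to ~449
# at b = 800 s/mm^2 has an ADC of about 1.0e-3 mm^2/s, typical of soft tissue.
adc = apparent_diffusion_coefficient(s0=1000.0, sb=449.3, b=800.0)
print(f"ADC ≈ {adc:.2e} mm^2/s")
```

In practice a falling ADC during treatment can indicate increasing cellularity, while a rising ADC can indicate cell death, which is why the trend is clinically informative.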

To learn more, check out the business line update from Lionel Hadjabjeda, president, MR-Linac solutions, on Wednesday 28 October at 12.00 p.m. ET.

MOSAIQ Plaza

Elekta’s MOSAIQ Plaza* is a powerful, comprehensive set of digital tools that connects users to their colleagues, data and patients, throughout the entire patient workflow. This fully integrated software suite helps to drive efficiencies, standardize daily practice and deliver value-based healthcare.

For example, Elekta recently acquired and continued development of the technology behind MOSAIQ Voice, which enables users to work at the speed of their voice. The tool allows clinical tasks in MOSAIQ to be completed via dictation, streamlining documentation and reducing workload.

Another game-changing workflow advancement is MOSAIQ SmartClinic. With a wide range of professionals involved in cancer patient care, MOSAIQ SmartClinic provides a transparent approach to care management, connecting users to the information that they need, whenever and wherever.

To learn more, check out the business line update from Andrew Wilson, president, oncology informatics solutions, on Tuesday 27 October at 12.00 p.m. ET.

Geneva brachytherapy applicator

Cervical cancer screening, via Pap and HPV tests, has successfully reduced the incidence of cervical cancer through early detection. But for women still confronted with this disease, treatments such as brachytherapy offer a proven way to improve overall survival. In April 2020, Elekta introduced the Geneva brachytherapy applicator. Designed for cervical cancer up to stage IIB, Geneva could help up to 75% of patients with locally advanced cervical cancer.

In mid-July, Geneva received FDA 510(k) clearance. With a comprehensive range of ovoid (13–40 mm) and intrauterine tube (30–80 mm) sizes, in addition to a new interstitial intrauterine tube to expand treatment options after hysterectomy, Geneva can accommodate most female anatomies. Clinicians can now take advantage of a single gynaecological applicator to treat most cervical cancer patients via intracavitary and/or interstitial brachytherapy.

To learn more, check out the business line update from John Lapré, president, brachytherapy solutions, on Tuesday 27 October at 2.00 p.m. ET.

Leksell Gamma Knife Lightning

On 12 June, Elekta introduced Leksell Gamma Knife Lightning**, a treatment optimizer designed to boost Gamma Knife radiosurgery workflows and improve plan quality. Lightning, which has been available commercially since the end of June, enables clinicians to create plans for multiple targets in less than a minute and to include beam-on time as a constraint, greatly reducing overall treatment time.

Gamma Knife Lightning addresses the challenge of increasing automation and speed in the stereotactic radiosurgery workflow, while also ensuring a personalized plan tailored to each patient’s needs. The ability to create multiple plans in less than 60 s will allow physicians to compare plans and select the best option. And by controlling beam-on time, they can develop plans with the most efficient delivery parameters – cutting beam-on time in half compared with manual forward planning.

To learn more, check out the business line update from Verena Schiller, president, neuroscience solutions, on Monday 26 October at 2.00 p.m. ET.

ProKnow

To expand its offering of cloud-based solutions for advanced radiotherapy, in late-August 2019 Elekta acquired ProKnow*, offering customers access to high-quality, cloud-based, treatment planning analytics. ProKnow can store, navigate and retrieve data in a scalable cloud-based framework that works across all imaging, planning and treatment modalities. A wealth of insight is contained within these data; but without simple, intuitive solutions, it remains locked away. By centralizing data in a secure web-based repository, ProKnow can unlock it and make it accessible, enabling providers to connect to clinical teams anywhere, any time.

To learn more, check out the breakout session presented by Ben Nelms, founder of ProKnow, on Friday 23 October at 2.00 p.m. ET.

The Kaiku Health app

In May 2020, Elekta announced its acquisition of Kaiku Health*, a Finnish company best known for its app that monitors patient-reported outcomes. The Kaiku Health app provides intelligent symptom tracking and management for healthcare providers in routine oncology care and clinical studies. The app screens for patients’ symptoms, notifies the care team on their development and provides personalized support for patients. It is easily implemented into existing hospital information systems and can be integrated with Elekta’s MOSAIQ oncology information system.

Including intelligent patient-monitoring software in the Elekta portfolio supports the company’s oncology informatics strategy. This is a concrete step toward expanding its digital portfolio to further digitally connect customers and their patients.

To learn more, check out the breakout session presented by Vesa Kataja, chief medical officer, Kaiku Health, on Friday 23 October at noon ET.

Virtual presence

With industry events such as the ASTRO Annual Meeting, the SROA Annual Meeting and the ASRT 2020 radiation therapy conference going completely virtual this year, so has Elekta.

For starters, Elekta is partnering with ASTRO for the very first virtual ASTRO experience. Alongside hosting a booth in the virtual exhibit hall, showcasing the abovementioned products, Elekta is running its virtual User Meeting 2020 on 23 October. Attendees at this event will hear from peers, industry experts and Elekta product specialists in a day of learning, networking and interactive product demonstrations.

The User Meeting will conclude with the session, Championing Women & Diversity in Radiation Oncology—A Panel Discussion. This virtual discussion will include five trailblazing professionals: Laura Cervino from Memorial Sloan Kettering Cancer Center; Sandra Hayden from the University of Texas Southwestern Medical Center; Lisa Kachnic from Columbia University Medical Center and the Herbert Irving Comprehensive Cancer Center; Toral Patel from the University of Texas Southwestern Medical Center; and Crystal Seldon from the University of Miami/Jackson Memorial Hospital.

In the afternoon breakout sessions, radiation oncologists and medical physicists will share their experiences of treating patients on the Elekta Unity and implementing an MR-guided radiotherapy programme. In the field of stereotactic radiotherapy, speakers will discuss linac-based stereotactic treatments and strategies for maximizing SRS efficacy and efficiency with the Leksell Gamma Knife Icon. Attendees can also hear presentations on skin brachytherapy, Elekta’s latest digital solutions, and its recent acquisitions: Kaiku Health and ProKnow. There’s also a discussion on the role of education and training in the medical technology industry.

Elekta is also attending the Virtual Annual Meeting of the Society of Radiation Oncology Administrators (SROA). During the meeting, Elekta will participate in a payer trend roundtable, on 27 October at 3.00 p.m. ET, followed by a “Brown bag” webinar at 4.00 p.m. ET, to share how it is innovating personalized care solutions on a global scale.

As a long-time partner of the American Society of Radiologic Technologists (ASRT), Elekta is participating in the educational sessions at the ASRT 2020 radiation therapy conference and will be hosting a virtual booth at the ASRT expo.


*MOSAIQ Plaza, ProKnow and Kaiku Health are not available in all markets.

**Leksell Gamma Knife Lightning is not yet commercially available.

Conductive hydrogel could repair damaged peripheral nerves

Conducting polymer hydrogel

Researchers in China have developed a stretchable and conductive hydrogel that they claim could one day be used to repair peripheral nerves – delicate tissues that transmit bioelectrical signals from the brain to the rest of the body in real time. The hydrogel, which has been tested in rats with sciatic nerve injuries, remains electrically conducting when elongated and its conductivity improves when it is illuminated with near-infrared light. These two properties mean that it could be used to treat serious peripheral nerve injury, especially when the missing nerve length exceeds 10 mm.

Flexible electronics has come along in leaps and bounds in the last few decades, allowing bioelectronic materials to be used as artificial tissue in vivo. Hydrogels – 3D polymer networks that can hold a large amount of water – are similar in structure to nerve tissue, and interfacing these materials to living tissue is one of the most important topics in bioelectronics today.

Repairing injured peripheral nerves

Peripheral nerve injury – for example, when a peripheral nerve has been completely severed in an accident – can result in chronic pain, neurological disorders, paralysis and even disability. Such injuries, however, are difficult to treat.

One of the main techniques used to repair injured peripheral nerves is autologous nerve transplantation. This involves removing a section of peripheral nerve from elsewhere in the body and “sewing” it onto the ends of the severed nerve. There are nevertheless some shortcomings associated with this approach, including the fact that the surgery doesn’t always restore nerve function and that multiple follow-up procedures are sometimes required. There is also the risk of painful neuroma (benign growth of nerve tissue) after the operation.

Another technique relies on tissue engineering to restore and repair motor and sensory function of neuronal cells. This method makes use of natural or synthetic materials that can be grafted onto nerve cells together with “supporting” cells, such as mesenchymal stem cells. The problem with this approach is that the grafted nerves recover slowly.

Hydrogel conducts bioelectric signals

The research team – led by Qun-Dong Shen of Nanjing University, Chang-Chun Wang of the Nanjing Institute of Technology and Ze-Zhang Zhu of the Affiliated Drum Tower Hospital of Nanjing University – has now developed an alternative technique. In their work, the researchers made use of a mechanically tough but stretchable conductive hydrogel containing biocompatible polymers: polyacrylamide and conductive polyaniline. These crosslinked polymers boast a 3D microporous network that, once implanted, allows nerve cells to enter and adhere, helping to restore lost tissue.

SEM image

The hydrogel conducts bioelectric signals – something that the team proved by replacing a damaged sciatic nerve from a toad with the material and measuring the signals through it. They also implanted the hydrogel into rats with sciatic nerve injuries. In these experiments, they observed that the rats’ nerves recovered their bioelectrical properties – as measured by electromyography one to eight weeks following the operation – and that their walking improved compared with rats that hadn’t been treated with the hydrogel.

Another advantage of the hydrogel is that the current flowing through it increases from 1.95 nA to 2.3 nA when it is irradiated with near-infrared light, which can penetrate deep into biological tissue. This proves that the hydrogel is relatively sensitive to light of this wavelength, say the researchers, and that the bioelectrical signal through the material can be increased in this way, so allowing for improved nerve conduction and recovery.

And that is not all: when elongated mechanically, the material remains conducting – just like biological nerve tissue. This means that it can accommodate the large strains produced by sutured nerve tissue in motion.

Full details of the research are published in ACS Nano.

‘Quintuple point’ material defies 150-year-old thermodynamics rule

Five different phases of a colloid-polymer mixture can co-exist at the same time, in defiance of the 150-year-old Gibbs phase rule, which states that only three simultaneous phases are possible. The result, which researchers in France and the Netherlands obtained using an algebraic model for the thermodynamics of binary rod-polymer mixtures, could help advance our understanding of phase transitions in complex systems, with possible industrial applications in areas such as food processing and paint manufacture.

The American physicist Josiah Willard Gibbs is an acknowledged founder of modern thermodynamics and physical chemistry. His phase rule, which he derived in the 1870s, sets out the maximum number of different phases that can simultaneously exist in a substance or mixture of substances. For pure substances, Gibbs’ phase rule predicts a maximum of three phases. One well-known example is water, which can co-exist as a liquid, solid and gas at its so-called triple point.
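For readers who want the counting behind these claims, the phase rule in its standard textbook form (not taken from the new paper) is:

```latex
% Gibbs phase rule: F = degrees of freedom, C = components, P = phases
F = C - P + 2
% Pure substance (C = 1): F = 0 when P = 3, the triple point.
% Binary mixture (C = 2): F = 0 when P = 4, so a fifth co-existing
% phase requires extra variables beyond temperature and pressure --
% here, the particle shape ratios the researchers describe below.
```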

Clustering effect

In the new work, a team led by Remco Tuinier of the Eindhoven University of Technology simulated the behaviour of a colloidal mixture of two particle types – rods and polymers – dispersed in a background solvent. In their computations, they represented rods as hard spherocylinders and the polymers as spheres that freely overlap with each other.

“The system can increase the space available for the polymer chains by clustering the rods together,” Tuinier explains. “This results in a phase separation in the mixture into two (or more) phases containing a phase where the rods are enriched and another area that mainly contains polymers.”

Once this clustering occurs, the heavier rods sink to the bottom of the mixture, leading to segregation. Eventually, the lower part of the mixture becomes so crowded that the rods take up preferential positions so that they are “less in each other’s way”, Tuinier tells Physics World. The rods thus end up neatly arranged next to each other.

A quintuple point emerges

Building on previous models for dispersions of pure rods and disk-polymer mixtures, the researchers developed a quantitative theory to map out a complete phase diagram for their two-component rod-polymer mixtures. According to the calculations of team member Vincent Peters, up to five different phases appear in the system under a specific condition (see image). At this “quintuple point”, the possibilities are an isotropic gas phase with unaligned rods at the top; a nematic liquid crystal phase with rods pointing in roughly the same direction; a smectic liquid crystal phase with rods lying in different layers; and two solid phases with “ordinary” crystals at the bottom.

This five-phase system represents “the first time that the famous Gibbs rule has been broken,” team member Mark Vis says. The profusion of phases is possible, Tuinier adds, because of the shape of the particles (particularly their length and diameter), which Gibbs did not consider. “In addition to the known variables of temperature and pressure, you get two additional variables: the length of the particle in relation to its diameter, and the diameter of the particle in relation to the diameter of other particles in the solution,” he explains.

Serendipitous result

As sometimes happens in science, the result was in part a stroke of luck, since the researchers weren’t initially looking for more than three phases in their simulations. While studying plate-shaped particles and polymers, however, team members Álvaro González García and Vincent Peters observed a four-phase equilibrium. “Álvaro came to me one day and asked me what had gone wrong, because four phases just couldn’t be right,” Tuinier says.

While the team obtained its results using simulations, members say that a real version of their system could easily be produced in the laboratory, and the results tested in experiments. According to Vis, the team’s findings could help advance our understanding of phase transitions in such systems and predict more precisely when phase transitions occur – knowledge that could come in useful for applications such as manufacturing complex colloidal mixtures like mayonnaise or paint.

Liquid crystals in displays could benefit too, Vis adds. “Most industries choose to work with a single-phase system, where there is no segregation,” he says. “But if the exact transitions are clearly described, then the industry can actually use those different phases instead of avoiding them.”

The research is detailed in Physical Review Letters.

Optical microscopy – how small can it go?

“Adorn’d with a curiously polish’d suit of sable [black] Armour” and “multitudes of sharp pinns, shap’d almost like Porcupine’s Quills”. While not quite how people would usually think of a flea, this was Robert Hooke’s description of the creature in his 1665 bestseller Micrographia. The book contained images of flora and fauna minutiae drawn in mesmerizing detail, revealing familiar objects with unfamiliar features and structures that were not just inferred but actually seen with the help of a microscope. Hooke’s work represented a gear change in the practice of science.

Microscopy has come a long way since the compound microscopes of the 17th century that Hooke used, where two convex lenses produced a magnified image. As well as optical microscopy, which uses primarily visible light, we now have a host of other imaging techniques based on electrons, X-rays, atomic forces and other approaches besides. Many of these achieve far greater resolutions than optical microscopy, so can this traditional technique ever catch up? What will then limit its scope, and why even bother trying to improve something so old-fashioned?

Even as other techniques were resolving atoms, optical microscopy retained a fan base because, in some ways, you could still see more optically. When an object is illuminated with a pulse of light it can do a number of things with the energy. The object can scatter, transmit or absorb the light, making molecules vibrate in different ways, exciting electrons into different orbitals or causing them to resonate in unison. Spectral maps of what light does at different wavelengths therefore give researchers vital information about the chemical and structural composition of a sample, and its environment. Other techniques might give a level of energy-dependent response, but optical spectra are especially rich.

Optical microscopy also means you don’t need to freeze samples, keep them in a vacuum or zap them with electrons and a massive electric field. It’s therefore perfect for viewing living cells and other delicate samples.

Unfortunately, optical microscopy can only get you so far. A virus like HIV is only 140 nm in size, but for a long time anything smaller than a few hundred nanometres was considered beyond the scope of optical microscopy. That meant you couldn’t use it to image, say, the distribution of proteins around a neuron or virus, leaving you without any insight into how these cells function, or how to stop them. This assumption was not based on any practical limitations of the day’s microscope technology, but a fundamental physical restriction that limited the resolution of any optical microscope made from lenses.

Antonie van Leeuwenhoek used this instrument to see micro-organisms for the first time in 1673.

Lenses and limits

When the light from two separate points passes through a convex lens, it refracts – the ray paths bend toward each other. This means that when the light hits your retina it’s as if the points were further away from each other. To the mind’s eye the distance between them is magnified. Armed with just such a simple single-lens based instrument – as well as a keen eye, pedantry for lighting and extraordinary patience – the Dutch drapers’ son Antonie van Leeuwenhoek famously saw micro-organisms for the first time.

In letters published in the Philosophical Transactions of the Royal Society from 1673, van Leeuwenhoek reported seeing what were in fact bacteria and similar-sized micro-organisms, typically 0.5–5 μm across (so still an order of magnitude or more larger than a virus). When Hooke corroborated these observations, he used his more elaborate compound microscope – an instrument with an additional “eyepiece” lens that magnifies the already magnified image of the object produced by the first “objective” lens. Compound microscopes can be powerful instruments and are widely used today, but they are still a long way off from resolving a virus.

The snag is the diffraction that occurs whenever light passes around an object or through an aperture. Straight "planar" wave fronts are turned into curves that propagate like the rings round a pebble dropped in a pond. Where these waves overlap they interfere – doubling up in peaks of light intensity or cancelling each other out in troughs. A finite distance between resolvable objects emerges – any closer and the peaks overlap so they are indistinguishable. In 1873 Ernst Abbe famously defined this "diffraction-limited" resolvable distance, d, in a relation now carved in stone on his memorial in Jena, Germany: d = λ/(2n sin ϑ), where λ is the wavelength of light and n sin ϑ – known as the numerical aperture – is the product of the medium's refractive index and the sine of the half-angle of the maximum cone of light that can enter or exit the lens system.
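Abbe's relation d = λ/(2n sin ϑ) is simple to evaluate numerically. The sketch below plugs in typical values – green light and a high-end oil-immersion objective – which are illustrative assumptions rather than figures from the article:

```python
# Numerical illustration of Abbe's diffraction limit, d = lambda / (2 n sin(theta)).
# The wavelength and objective parameters are typical assumed values.
import math

wavelength = 550e-9          # green light, in metres
n = 1.52                     # refractive index of immersion oil
theta = math.radians(67)     # half-angle of the light cone accepted by the lens

numerical_aperture = n * math.sin(theta)
d = wavelength / (2 * numerical_aperture)

print(f"NA = {numerical_aperture:.2f}")
print(f"diffraction-limited resolution ~ {d * 1e9:.0f} nm")
```

The result, roughly 200 nm, is the few-hundred-nanometre wall that kept viruses out of reach of lens-based microscopes.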

In the decades that followed Abbe's definition of the diffraction limit, the speed of light was found to be a constant, X-rays and radioactivity were discovered, energy and matter proved equivalent, and the quantum hypothesis muddied not just the distinction between waves and particles but also the certainty with which time and energy, and position and momentum, could be measured. All the while the diffraction limit has held – or at least it has for "far-field" light.

Scanning near-field optical microscopy (SNOM) image of a fixed endothelial HLMVEC cell.

Beyond Abbe’s limit

Light travels to our retinas as propagating electromagnetic waves – electric and magnetic fields leapfrogging each other through space. This light is described as "far-field" by virtue of its having travelled far afield. But every object that scatters or emits far-field light also has "near-field" light clinging to its surface – non-propagating components that carry the finest, sub-wavelength spatial detail and diminish to nothing within around a wavelength of the surface.

In 1928 the Irish physicist Edward Hutchinson Synge suggested that a device with an aperture placed within roughly a wavelength of an illuminated surface could detect the near-field light and generate images unrestricted by the diffraction limit. The location of the near-field interaction would be defined by the position of the aperture, and the resolution limited only by the aperture's size. Another 44 years were to pass, however, before Eric Ash and George Nicholls at University College London in the UK beat the diffraction limit in this way. They used microwaves with a wavelength of 3 cm, so their centimetre-scale resolution still broke the diffraction limit. But it would be another decade before anyone achieved near-field optical microscopy at visible wavelengths.

It was the 1980s and Dieter Pohl, a physicist who had worked on one of the first lasers in Europe as a graduate student, was employed at IBM. At that time his colleagues Gerd Binnig and Heini Rohrer had invented the scanning tunnelling microscope, which brought atomic-size features into view and stole much of the limelight from optical microscopy. “It bugged me a little that optical techniques were now discarded – at least at IBM – because of their limited resolution,” Pohl recalls, despite having joined Binnig and Rohrer’s team.

While at IBM, Pohl conceived a scanning near-field optical microscope (SNOM), realizing it with his then student Winfried Denk. They did this by pressing a corner of a transparent but metal-coated quartz crystal – which served as the aperture in Synge's proposal – against a glass plate until a faint light transmitted from the near-field was detected. Painstakingly manoeuvring it within nanometres of the sample (a test object with fine-line structures), Pohl and his colleagues produced a SNOM image at visible wavelengths for the first time. The resolution was 20 nm – far smaller than many viruses.

Since then, SNOM has evolved from apertures in sheets to metal-coated probes that have an aperture at the tip (figure 1a). Light can travel down a SNOM tip and out of a nanoscale aperture at the end to illuminate a sample with near-field light, which is then reflected off or transmitted through the sample. Once scattered, the light is detected as far-field. Although diffraction limited, you know where this far-field light was scattered from (the tip end) so can still achieve nanometre-scale resolution. In some systems the scattered light travels back up the tip to be detected there, and in this mode the original illuminating light can come from a separate source.
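The logic of aperture-based SNOM – assign each detected intensity to the known tip position, so that the aperture size rather than the wavelength sets the resolution – can be caricatured in a one-dimensional toy model. The striped sample and the aperture widths below are invented for illustration:

```python
# Toy 1D raster scan: the detected signal at each tip position is the sample's
# transmission averaged over the aperture, so image sharpness is set by the
# aperture size, not by the wavelength of the light.
import numpy as np

x = np.arange(1001)                                       # positions in nm
sample = (np.sin(2 * np.pi * x / 100) > 0).astype(float)  # 100 nm stripes

def scan(aperture_nm):
    """Image produced by an aperture of the given width, in nm."""
    kernel = np.ones(aperture_nm) / aperture_nm
    return np.convolve(sample, kernel, mode="same")

def contrast(image):
    core = image[200:800]          # ignore convolution edge effects
    return core.max() - core.min()

sharp, blurred = scan(20), scan(400)
print(f"contrast with 20 nm aperture : {contrast(sharp):.2f}")
print(f"contrast with 400 nm aperture: {contrast(blurred):.2f}")
```

A 20 nm aperture resolves the 100 nm stripes at full contrast, while a far-field-sized 400 nm spot averages them into a featureless grey.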

There are now also apertureless tips (figure 1b). In these SNOM systems, an atomically sharp metal tip – much like those used in atomic force microscopy – is illuminated by a far-field light source and scanned within nanometres of a sample surface, causing the near-field light to scatter and therefore be detected as far-field light. This method has the benefit of enhancing the field at the end of the tip. The light causes resonant oscillations of the electrons on the probe surface – “plasmons” – which concentrate and amplify the electromagnetic field in a highly localized area at the tip. When this enhanced near-field scatters from the sample, the interactions are intensified, producing a much greater signal that allows you to image more clearly.

Scanning near-field optical microscopy.

Anatoly Zayats, a nano-optical physicist at King’s College London in the UK, also thinks “tipless” SNOM is possible. He works on structuring light beams with phenomena like “superoscillations” and “photonic skyrmions”, and suggests these may provide a tip-free alternative that gets around one of the technique’s main bugbears – the indispensable yet pretty much irreproducible field enhancements from the tip that greatly depend on the individual tip’s shape and size. “Even minute differences in tip size and shape might have a significant impact on the resolution,” says Zayats.

SNOM has become the workhorse for nanoscale chemical characterization, where optical spectra really give the edge over other techniques. The field enhancements that help SNOM work have spawned progress in sensing, lithography and catalysis too. Given the success you might wonder why it took 50 years for anyone to do anything with Synge’s idea from 1928, especially as his paper contained a detailed description of how to realize the instrument. Nanoscale fabrication and manipulation posed obvious challenges in the 1920s, but Zayats suggests that perhaps the most significant gap in available technology was signal collection and processing. Not only do electronics and computers now collect, store and represent image data point by point with ease, but machine learning and artificial intelligence are pushing image-processing capabilities further still. In addition, as Pohl suggests, 1928 may just not have been the right time for SNOM to take off. “The frontlines of physics were on quantum theory and the theory of general relativity,” he says, “not on such practical goals as a super-resolution imaging technique.”

Going deep

SNOM works well on surfaces, but what about a virus deep in a tissue sample? Scientists have innovated with lens and focusing systems, and found ways to exploit the shorter wavelengths of X-rays, but even as late as the 1990s the diffraction limit still held for depth imaging.

Around this time, studies of fluorescing molecules were providing molecular biologists with new tools for imaging, albeit at diffraction-limited resolutions. This set the scene for the development of a game-changing technique exploiting them. “What I realized is that it’s very hard to do anything about the focusing process itself,” says Stefan Hell, one of the directors of the Max Planck Institute for Biophysical Chemistry in Göttingen, Germany. “But resolution is about making molecules discernible. So the key to overcoming the barrier was held by the fluorescent molecule itself.”

In 1994 Hell proposed an approach for beating the diffraction limit in depth imaging based on fluorophores – molecules that can be excited to fluoresce by light at a specific wavelength but can also be suppressed from doing so with a different wavelength. Using a standard round beam to excite molecules, overlaid with a doughnut-shaped beam to de-excite them, only those molecules at the very centre, where the intensity of the doughnut beam drops low enough, will actually fluoresce. And because the doughnut beam’s intensity drops gradually towards the centre, that central region can have sub-diffraction-limit dimensions.
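The benefit of the doughnut beam can be quantified with the standard STED resolution scaling, in which Abbe's limit gains a square-root factor of the depletion intensity I over the fluorophore's saturation intensity Is. The wavelength and numerical aperture below are illustrative assumptions:

```python
# Standard STED resolution scaling: Abbe's limit divided by sqrt(1 + I/Is),
# where I is the peak intensity of the depletion doughnut and Is the
# fluorophore's saturation intensity. Wavelength and NA are assumed values.
import math

wavelength = 640e-9   # depletion-beam wavelength in metres (assumed)
na = 1.4              # numerical aperture of the objective (assumed)

def sted_resolution(saturation_factor):
    """Effective resolvable distance for a given I/Is ratio."""
    return wavelength / (2 * na * math.sqrt(1 + saturation_factor))

for s in (0, 10, 100):
    print(f"I/Is = {s:>3}: d = {sted_resolution(s) * 1e9:5.1f} nm")
```

Raising the doughnut intensity a hundredfold over saturation shrinks the resolvable distance from over 200 nm to a few tens of nanometres – in principle without limit.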

Hell’s experiments demonstrating “stimulated emission depletion (STED) microscopy”, which he reported in 1999, soon inspired others. Keeping some of the molecules non-fluorescent, or working with fluorescence on–off switching using lasers with different wavelengths, turned out to be the key to “fluorescence nanoscopy”. By 2006 reports of photo-activated localization microscopy (PALM) and stochastic optical reconstruction microscopy (STORM) were also raising eyebrows. PALM was developed by Eric Betzig at the Howard Hughes Medical Institute in the US, while STORM came from Xiaowei Zhuang’s group at Harvard University, also in the US. Both are stochastic approaches (meaning they work with the probabilistic emission behaviour of fluorophores), but they differ in the types of fluorophores used. Each floods the field of view with illumination at just enough intensity to switch on a sparse scattering of fluorophores while keeping the rest dark. You then find each emitting fluorophore with a camera and identify the centre of the molecule from the diffraction-limited intensity profile recorded. With no neighbouring molecules excited, there are no overlapping intensity profiles to confuse where the centre is.
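The statistical heart of this localization step – that the centre of a diffraction-limited blob can be pinned down far more precisely than the blob is wide – is easy to demonstrate. The sketch below uses an invented molecule position and spot width; the precision improves roughly as the spot width divided by the square root of the photon count:

```python
# Localizing a single switched-on fluorophore: simulate N photon detections
# drawn from a diffraction-limited Gaussian spot, then estimate the molecule's
# position as their centroid. Position and spot width are invented values.
import random
import statistics

random.seed(1)
true_centre = 250.0   # nm, assumed molecule position
sigma = 125.0         # nm, width of the diffraction-limited spot

def localize(n_photons):
    """Centroid estimate of the molecule position from n detected photons."""
    hits = [random.gauss(true_centre, sigma) for _ in range(n_photons)]
    return statistics.fmean(hits)

estimate = localize(5000)
print(f"spot width {sigma:.0f} nm, estimated centre {estimate:.1f} nm")
```

With 5000 photons the centroid lands within a couple of nanometres of the true position, even though the spot itself is over 100 nm wide.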

In 2014 Hell, Betzig and Moerner were awarded the Nobel Prize for Chemistry for “the development of super-resolved fluorescence microscopy”. However, although the techniques can resolve down to a few nanometres in theory, in practice the best they manage is a few tens of nanometres. The problem is the need for more photons when you try to crank up the resolving power, whether that’s by getting as many photons as possible from the emitting fluorophore in PALM/STORM to increase the signal, or enlarging the region switching molecules off in STED. “That is limiting in terms of bleaching,” explains Hell, referring to the process whereby overexposed fluorophores can no longer fluoresce.

Recognizing the problem, Hell combined the strengths of both STED and PALM/STORM in an approach termed MINFLUX, which tracks fluorophores with a doughnut beam that excites – rather than de-excites – fluorescence. The technique homes in on an off-centre switched-on molecule by gauging its position from the measured intensity and the expected intensity profile. It uses far fewer photons and generates images with 1–5 nm resolution in just tens of milliseconds, making it possible to create movies following dynamic processes. “I think it will open up a new field for microscopy,” says Hell.

A 3D MINFLUX recording of a mitochondrion, labelling the Mic60 (orange) and ATPB (blue) proteins.

Pocket near-field optics

Imaging a structure like a virus with a resolution under 20 nm is now pretty routine in a lab setting, but optical microscopes remain bulky, complex devices. In January 2017, however, the EU launched a four-year project entitled ChipScope to design something less cumbersome.

The idea combines some of the advantages of SNOM with lensless optical microscopy, an existing technique that uses data analysis to generate images with a vastly expanded field of view based on multiple contributing images. Conventional lensless microscopes hold a sample directly under the detectors and illuminate it from a sufficient distance, which optimizes the field of view. In ChipScope, however, the sample is so close to an array of LEDs that non-diffracted near-field light illuminates it. The intensity of light transmitted through the sample is then captured by a camera as each LED is lit, to build up a shadow of the sample pixel by pixel, where each LED denotes a pixel. Although the collected light is diffraction-limited, its origin is the lit LED, and that position is known. As a result, the LED size, not the diffraction limit, determines the resolution. Before ChipScope, the state of the art for tiny LEDs was around 100 μm. Thanks to work at the University of Technology in Braunschweig, Germany, ChipScope has already demonstrated the approach with LEDs measuring 5 μm, and the plan is to push the resolution lower with 200 nm LEDs.
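The pixel-by-pixel principle can be sketched in a few lines: light one LED at a time, record the transmitted intensity, and assign the reading to that LED's known position. The array size and the absorbing "sample" below are invented for illustration:

```python
# Toy version of the ChipScope scheme: with only one tiny LED lit at a time,
# the detector reading samples the transmission at that LED's position, so
# each LED becomes one image pixel. The 8x8 array and sample are invented.
import numpy as np

transmission = np.ones((8, 8))       # how much light each spot lets through
transmission[2:6, 3:5] = 0.1         # an absorbing object on the LED array

led_brightness = 1.0
image = np.zeros_like(transmission)
for i in range(8):
    for j in range(8):
        # with only LED (i, j) lit, the detector sees light through one spot
        image[i, j] = led_brightness * transmission[i, j]

print("recovered shadow (1 = dark pixel):")
print((image < 0.5).astype(int))
```

The recovered image reproduces the sample's shadow at the LED pitch – which is why shrinking the LEDs from 5 μm to 200 nm directly improves the resolution.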

ChipScope’s new compact optical microscope relies on tiny LEDs that illuminate the sample and determine the resolution.

Even with 200 nm LEDs ChipScope won’t break resolution records, and so far the device is capturing intensity alone, which means it cannot be used for spectroscopy purposes. It is also limited to very thin samples – a few hundred nanometres or less – so that enough light from a 200 nm LED can pass through and is not diffracted over the thickness of the sample. But by stripping near-field optical imaging back to the bare bones, ChipScope makes huge gains in device size. Angel Dieguez, from the University of Barcelona in Spain, who is ChipScope’s project co-ordinator, describes the device based on 5 μm LEDs as proof the concept works, at least in the far-field imaging regime. “The whole microscope is half the size of a phone,” he highlights, “making it two orders of magnitude more compact than a conventional microscope.” By using 200 nm LEDs, the aim is to fit the device onto a chip that can slot into a mobile phone.

Etching 200 nm LEDs into a smooth array is no mean feat, however, and each LED needs its own wire for switching. There is also the problem of how best to operate the microscope – either by moving the sample across the LEDs with microfluidics or moving the LEDs under the sample with microelectromechanical systems (MEMS) technology. Then, somehow, you have to assemble the disparate components into a working device. Earlier this year it seemed as if the researchers had built an array of 200 nm LEDs, but the COVID-19 pandemic led to the device gathering dust in a lab as lockdown in Spain suspended further experiments. Many will be watching developments when the lab is up and running again. “It’s interesting,” says Hell, who is not involved in the ChipScope project himself. “Worth pursuing for sure.”

In the almost 350 years since Leeuwenhoek discovered bacteria with a single lens, countless ingenious scientific developments have shone a light on structures even smaller than viruses – in depth and even in motion. So could a fascination with phone-based nanoscopy finally subvert today’s selfie craze? Who knows, but optical microscopy’s attraction is sure to endure.

Rippling graphene harvests thermal energy

The rippling thermal motion of a tiny piece of graphene has been harnessed by a special circuit that delivers low-voltage electrical energy. The system was created by researchers in the US and Spain, who say that if it could be duplicated enough times on a chip, it could deliver “clean, limitless, low-voltage power for small devices”.

Brownian motion is the random movement of a tiny particle that is buffeted by atoms or molecules in a liquid or gas – and the idea of harnessing this motion to do useful work has a long and chequered history.  In the early 1960s, the Nobel laureate Richard Feynman popularized a thought experiment known as the “Brownian ratchet”, which had been conceived in 1912 by the Polish physicist Marian Smoluchowski. This involves a paddle wheel that is connected by an axle to a ratcheted gear. Both the paddle wheel and the ratchet are immersed in fluids. The system is imagined as being small enough so that the impact of a single molecule is sufficient to turn the paddle. Because of the ratchet, the paddle can only turn in one direction and therefore it appears that the Brownian motion of the paddle can be harnessed to do the work of turning the axle.

However, Feynman showed that if the two fluids were at the same temperature, collisions throughout the system would prevent this from happening. The only way work could be done, he argued, is if the fluids are at different temperatures – making the Brownian ratchet a heat engine.

Freestanding graphene

In their new study, University of Arkansas physicist Paul Thibado and colleagues replaced the paddle with a freestanding sheet of graphene – a single layer of carbon atoms. In a 2014 study, the team used scanning tunnelling microscopy to discover that graphene ripples back and forth at room temperature like a wave on the surface of the ocean. Indeed, these ripples provide the sheets with the stability they need to exist.

The team’s energy harvesting circuit features a graphene sheet that ripples next to an electrode. As the sheet ripples from concave to convex – alternately getting closer to and further away from the electrode – the pair behave as a variable capacitor that produces an alternating current.

In their new circuit design, the team combined this variable capacitor with two opposing diodes wired in parallel. This created two separate paths for the current as it flows in each direction. In this way, one of the paths can be used to charge up a storage capacitor that can later be emptied to perform work, such as in lighting up a bulb or powering a similar component (see video).
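A back-of-envelope estimate shows the scale of current such a variable capacitor could source, assuming a fixed bias and a sinusoidal capacitance ripple. All the numbers below are illustrative guesses, not values from the study:

```python
# Back-of-envelope current from a graphene/electrode variable capacitor held
# at a fixed bias: q = C(t) V, so i = V dC/dt. For C(t) = C0 + dC sin(2 pi f t)
# the peak current is V * dC * 2 pi f. All numbers are illustrative guesses.
import math

V = 0.1        # volts of bias across the capacitor (assumed)
C0 = 1e-15     # farads, mean capacitance of the tiny structure (assumed)
dC = 0.1 * C0  # capacitance swing caused by the rippling (assumed)
f = 1e5        # hertz, assumed dominant ripple frequency

peak_current = V * dC * 2 * math.pi * f
print(f"peak current ~ {peak_current:.2e} A")
```

A picoampere-scale output per capacitor is why the researchers talk about duplicating the circuit millions of times on a chip before it can replace a battery.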

The researchers report that their dual-diode system serves to boost the power: “We also found that the on-off, switch-like behaviour of the diodes actually amplifies the power delivered, rather than reducing it, as previously thought,” explained Thibado. “The rate of change in resistance provided by the diodes adds an extra factor to the power.”

“Symbiotic” relationship

But how does this setup work when the Brownian ratchet fails? The researchers explain that success lies in how the graphene and the circuit share a “symbiotic” relationship. Even though the circuit allows the thermal environment to do work on the load resistor, the circuit and the graphene operate at the same temperature, meaning that no heat flows between the two.

“This means that the second law of thermodynamics is not violated, nor is there any need to argue that ‘Maxwell’s Demon’ is separating hot and cold electrons,” Thibado explained.

He points out that the operation of the new device is not based on an old notion that a single diode could be used in such a circuit to allow high energy electrons to flow by while blocking low energy ones. This idea was dismissed in the 1950s by the French physicist Léon Brillouin because it would cause one side of the diode to heat up. This would lead to particles flowing from cold to hot, violating the second law of thermodynamics.

Rerouted current

“People may think that current flowing in a resistor causes it to heat up, but the Brownian current does not. In fact, if no current were flowing, the resistor would cool down,” Thibado added. “What we did was reroute the current in the circuit and transform it into something useful.”

“An energy-harvesting circuit based on graphene could be incorporated into a chip to provide clean, limitless, low-voltage power for small devices or sensors,” he adds.

With their initial study complete, the researchers are now working to store enough of the DC current produced by the energy-harvesting circuit within a capacitor for later use – a goal that will require the miniaturization of the circuit and its patterning on a silicon wafer or chip. Should it prove possible to duplicate the circuit millions of times over on a one square millimetre chip, Thibado says, “this can be a battery replacement”.

The research is described in the journal Physical Review E.

Flipping the switch: how a hybrid journal went open access

Piera Demma Cara

Tell us about Materials Research Express (MRX)?

MRX is an open-access journal that focuses on interdisciplinary and multidisciplinary research. Published by IOP Publishing, which publishes Physics World, it is devoted to publishing new experimental and theoretical research in the properties, characterization, design and fabrication of all classes of materials including biomaterials, nanomaterials, polymers, smart materials, electronics, thin films and more. The journal, which offers rapid peer review, has an international editorial board led by the journal’s editor-in-chief, Meyya Meyyappan from NASA’s Ames Research Center in the US.

What was the original publishing model for MRX and what is it now?

MRX launched in 2013 as a hybrid open-access journal, which gave authors the choice to make their article immediately and openly accessible at the point of publication. From 1 October 2019, MRX changed to a fully open-access journal model, so that papers published from that date were made immediately and permanently free to access under a Creative Commons attribution licence (CC BY).

Can you explain these different publishing models?

Hybrid open access or subscription-only journals typically require the reader to pay to access some or all of the content. In a fully open-access journal, the final published version of record of every article is made immediately and permanently free to access and released under a licence that permits free reuse, such as a CC BY licence. This licence grants anyone – both authors of the article and other researchers – the rights to share the article or reuse parts or all of the original content from the article for any purpose, providing that there is clear attribution to the original author and their work. To support the costs of managing peer review and publication in a fully open-access journal, we apply an article publication charge (APC) to published articles, but other funding mechanisms are also used to fund open-access publication models.

What were the reasons for the switch?

IOP Publishing has long been supportive of open access. We wanted to show that support by flipping one of our larger journals to open access. The move also allowed us to understand the impacts on the journal in terms of submissions and published articles.

And what are the benefits of doing this?

Through the move, we have made important materials science research freely accessible. We’ve seen increased average downloads per article with some being downloaded and read over 2000 times – over 900 times in the first three months of publication alone – putting them in the top 50 of most downloaded articles since 2013.

What were some of the challenges switching MRX to an open-access model?

MRX is a journal of reference for the materials research community and has always been widely inclusive. Retaining the same author demographic after introducing an APC was probably the biggest challenge for us but to make this transition as fair and smooth as possible, we supported authors wherever possible. We set an APC price of £1100 and for the first month after we switched, IOP Publishing covered this APC for all submitted articles. Following that, and throughout 2020, we have offered authors a reduced APC of £825 – a 25% discount – and a further discount on this for authors from lower-income countries.

Did this have an effect?

The move has been well received by the community. But although we have received submissions from over 70 countries, we have seen a substantial reduction in their total number: in 2020 we will publish about 1500 articles, against almost 4500 in 2019. There has been an especially stark reduction in submissions from authors in India, which is a useful reminder of the considerable geographical differences in the level of financial support for open-access publishing.

Has the switch impacted the peer review and publication process of the journal?

Not at all. IOP Publishing is committed to providing authors with a high-level author service throughout the whole publication process. This means that both authors’ experience and journal-quality standards will not be affected by the switch to open access and that the journal will also continue to prioritise excellence and rigour in the publishing service it provides to the whole materials science community, including authors, reviewers and readers.

Is it possible that other IOP Publishing journals may follow MRX’s lead?

We are always looking at new opportunities to support open-science practices and expand access to research. We are committed to annually reviewing the opportunities to transition our hybrid open-access journals to being fully open access. However, converting the model of a journal that serves an active, global community of researchers requires careful consideration and our priority must always be centred around the needs of those research communities.

Data-driven oncology: machine learning and RayIntelligence

Fredrik Löfman, head of machine learning at RaySearch Laboratories in Stockholm, discusses machine learning and provides an introduction to RaySearch’s latest innovation, RayIntelligence, an oncology analytics system.

Fredrik Löfman is head of machine learning at RaySearch Laboratories AB. He has an MSc in engineering physics from Chalmers University of Technology, Gothenburg, Sweden, and Imperial College London, UK, and a PhD in applied mathematics from the Royal Institute of Technology, Stockholm, Sweden.

Since 2017, Fredrik has established a machine-learning department at RaySearch focusing on data-driven oncology and machine-learning applications to automate and support the process of improving future cancer treatments. The department is responsible for prototype development, research projects and product development of machine-learning applications and analytics software in oncology.

Supercooled water is stable in two different forms

Supercooled water – that is, water that remains liquid far below its normal freezing point – does not have a uniform structure, but instead takes on two distinct forms. This discovery, which was made by researchers at Pacific Northwest National Laboratory (PNNL) in the US using infrared spectroscopy, provides long-sought-after experimental evidence that could help explain some of the anomalous properties of water at extremely cold temperatures.

Water is an unusual liquid, but its ubiquity means that we often forget just how unusual it is. Unlike most other liquids, at ambient pressure it is denser than the ice it forms when it freezes. It also expands rather than contracts as it cools below 4 °C (a phenomenon known as negative thermal expansion); becomes less viscous when compressed; and exists in no fewer than 17 different crystalline phases.

This atypical behaviour extends to water’s supercooled state, which occurs naturally in high-altitude clouds in the Earth’s atmosphere and in space as well as under carefully-controlled laboratory conditions. Many of the so-called “mixture” models that have been developed to explain the oddities of supercooled water predict that it undergoes a phase transition at low temperatures and high pressures, transforming from a high-density liquid phase to a low-density one. However, it is difficult to determine which of these models is correct because data on the behaviour of liquid water between 160 K and 235 K are so sparse.

Rapid crystallization

In this temperature range – the “no man’s land” of water’s complex phase diagram – the supercooled liquid rapidly crystallizes, making measurements difficult. According to Bruce Kay and Greg Kimmel, who led the PNNL team responsible for the latest study, it was previously an open question whether this rapid crystallization is just an experimental obstacle, or a fundamental problem stemming from some instability in water before it crystallizes.

By demonstrating that liquid water at extremely cold temperatures is relatively stable, and that it exists in two structural forms, the PNNL team’s results come down squarely in favour of the first option. “The findings explain a long-standing controversy over whether or not deeply supercooled water crystallizes before it can equilibrate,” Kimmel says. “The answer is: no.”

Infrared spectroscopy experiments

In their experiments, Kay, Kimmel and their colleagues Lori Kringle and Wyatt Thornley used infrared (IR) spectroscopy to study the structural transformations that occur when thin films of supercooled water are heated from a base temperature of 70 K (–203 °C) towards 273 K (0 °C) at a rate of around 10^10 K/s, before being cooled again at rates of about 5 × 10^9 K/s. These rates are achieved using nanosecond laser pulses and are orders of magnitude faster than in other techniques – a key factor in the team’s success, Kimmel says. For each heat pulse, the films spend around 3 ns near the maximum temperature before rapidly cooling back to the base temperature.

By analysing how the IR spectra of water’s O-H bonds evolved during these cycles, the researchers found that supercooled water can condense into a high-density, liquid-like structure. This higher-density form coexists with a low-density structure that has physical properties more in line with the typical bonding expected for water.

Backing up “mixture” models

The proportion of the high-density phase decreases rapidly as the temperature falls from 245 K to 190 K. This observation agrees with the predictions of “mixture” models for supercooled water, and Kringle, who did most of the experimental work, adds that the structural changes they observed were reversible and reproducible.
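This is the qualitative trend a schematic two-state "mixture" model predicts if the two liquid forms are populated according to a Boltzmann ratio. The free-energy parametrization below is entirely hypothetical, chosen only to reproduce the direction of the effect:

```python
# Schematic two-state "mixture" model: high- and low-density liquid separated
# by a free-energy difference dG(T); the high-density fraction follows a
# Boltzmann ratio. The linear dG parametrization is hypothetical, chosen only
# to reproduce the reported trend (less high-density liquid as T falls).
import math

R = 8.314  # gas constant, J/(mol K)

def high_density_fraction(T, dG0=-3000.0, dS=150.0):
    """Fraction of high-density liquid at temperature T (hypothetical model)."""
    dG = dG0 + dS * (245.0 - T)   # J/mol, invented linear temperature dependence
    return 1.0 / (1.0 + math.exp(dG / (R * T)))

for T in (245, 220, 190):
    print(f"T = {T} K: high-density fraction = {high_density_fraction(T):.2f}")
```

In this toy model the high-density fraction collapses from around 80% at 245 K to a few per cent at 190 K – the same direction of travel as the PNNL measurements, though the actual numbers here are invented.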

As well as furthering our understanding of supercooled water, the new finding, which is detailed in Science, might help explain how liquid water could exist on very cold planets (such as Jupiter, Saturn, Uranus, Neptune and beyond), and how supercooled water vapour creates the trails seen behind comets.

Copyright © 2025 by IOP Publishing Ltd and individual contributors