Frequency combs shape the future of light

This year marks the 20th anniversary of the first time an optical-frequency comb was used to measure the atomic hydrogen 1S-2S optical transition frequency, which was achieved at the Max-Planck-Institut für Quantenoptik (MPQ) in Garching, Germany. Menlo Systems, which was founded soon afterwards as a spin-off from MPQ, has been commercializing and pioneering the technology ever since.

Today, optical frequency combs (OFCs) are routinely employed in applications as diverse as time and frequency metrology, spectroscopy, telecommunications, and fundamental physics. The German company’s fibre-based systems, and its proprietary “figure 9” laser mode-locking technology, have set the precedent for the most stable, reliable, robust, and compact optical frequency combs available on the market today.

An optical frequency comb exploits laser light that comprises up to 10⁶ equidistant, phase-stable frequencies to measure other unknown frequencies with exquisite precision, and with absolute traceability when compared against a radiofrequency standard. The most common and versatile approach to create an OFC is to stabilize an ultrafast mode-locked laser, in which pulses of light bounce back and forth in an optical cavity. The frequency spectrum of the resulting pulse train is a series of very sharp peaks that are evenly spaced in frequency, like the teeth of a comb.

Through the so-called self-referencing technique, the offset of the whole comb – the carrier-envelope offset frequency – is fixed at a known position. When the spacing of the comb teeth is referenced to a known frequency, such as the radiofrequency generated by a caesium atomic clock or a hydrogen maser, the absolute frequency of a light source can be measured with great accuracy by interfering it with the nearest comb tooth. The device thus provides a way of making very accurate spectroscopic measurements of atomic and molecular transitions, and offers a versatile and unique way of comparing atomic clocks.
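In practice, each comb tooth sits at a frequency f_n = f_ceo + n × f_rep, where f_ceo is the carrier-envelope offset frequency and f_rep the repetition rate, both of which are radiofrequencies that can be counted electronically. A minimal sketch of the bookkeeping, using illustrative numbers rather than Menlo specifications:

```python
# Illustrative sketch of how a comb measures an unknown optical frequency.
# All numbers are made up for illustration; they are not Menlo specifications.

f_rep = 250e6       # repetition rate (Hz), locked to an RF reference such as a maser
f_ceo = 20e6        # carrier-envelope offset frequency (Hz), fixed by self-referencing
n = 772_000         # index of the comb tooth nearest the unknown laser
f_beat = 12.3e6     # measured beat note between the laser and that tooth (Hz)

# Every tooth sits at f_n = f_ceo + n * f_rep
f_tooth = f_ceo + n * f_rep

# The unknown frequency lies one beat note above or below the nearest tooth;
# the sign is resolved experimentally, for example by nudging f_rep.
f_laser = f_tooth + f_beat   # assuming the '+' sign here

print(f"Comb tooth n = {n}: {f_tooth / 1e12:.6f} THz")
print(f"Unknown laser:      {f_laser / 1e12:.6f} THz")
```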

Nobel connections

The OFC was invented by Theodor Hänsch, who together with John Hall was awarded the 2005 Nobel prize in physics for its development. Hänsch, together with two of his students, Ronald Holzwarth and Michael Mei, and Alex Cable, founder and president of Thorlabs, set up Menlo Systems in 2001 to establish the technology as a turn-key device that could be used for a range of applications. The company soon received orders for OFC systems from two laboratories in Austria and Italy, and today all major metrology institutes around the globe own one or more Menlo Systems OFCs.

The company has continued to lead the way in terms of innovation, quality and performance, with its products used in ground-breaking research as well as in emerging industrial applications. Its flagship FC1500-ULNplus, the world’s most precise optical frequency comb, is now a key technology in the development of optical clocks – which are expected one day to replace the caesium atomic clock as the current standard for defining the second.

“Menlo has recently made leaps that would have seemed impossible a few years ago for researchers in quantum optics, atomic and molecular physics,” says Michele Giunta, project leader at Menlo and a doctoral candidate in Hänsch’s group at MPQ. “We have conceived and developed frequency combs synthesizing optical frequencies that mimic the sub-hertz linewidth of our ORS ultrastable laser, and the technology is even ready to support future sub-mHz linewidth lasers. In this way, every frequency generated by the OFC is phase-coherent with the optical reference and exhibits the same linewidth as the reference with negligible additive noise.”

This technology is exploited in the FC1500-ULNplus, allowing the optical coherence of the reference to be transferred to all lasers locked to the comb, even for the narrowest-linewidth lasers demonstrated so far. “This product can be used to compare two or more different optical clocks, having the comb keep pace with the best one while at the same time not hindering the comparison against the others,” explains Giunta. “It is also used for precision spectroscopy of hydrogen – a long-standing research effort led by Theodor Hänsch – to test quantum electrodynamics and to determine fundamental constants such as the Rydberg constant and the proton charge radius (Science 358 79).”

The role of the optical reference system, which comprises a continuous-wave (CW) laser locked to a stabilized high-finesse optical cavity, is to allow the comb to reduce the noise of each optical tooth down to the sub-hertz level. “By doing this we control the two degrees of freedom of the comb with very high bandwidth and unrivalled coherence,” explains Giunta. “In fact, we have recently demonstrated and reported the lowest-noise synthesis for both optical and microwave carriers using such a system.”

Record breaker

In 2016, in a collaboration with academic partners at SYRTE, the National Metrology Institute of France – Yann Le Coq and Giorgio Santarelli, who is now at LP2N in Bordeaux – Menlo demonstrated that this technology can be used to synthesize microwaves with the highest spectral purity demonstrated to date.

“Menlo’s technology allows us to transfer the frequency stability of the reference laser to the timing of the comb laser pulse train, and hence to a microwave frequency that is detected as the pulse repetition rate,” says Giunta. “The sub-hertz optical linewidth of the reference laser is translated into a microhertz linewidth in the microwave carrier, which is typically a harmonic of the pulse repetition rate.”
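Detecting the pulse train on a fast photodiode yields microwave tones at harmonics of the repetition rate, so the optical carrier is effectively divided down to the microwave domain and the phase fluctuations are divided with it. A back-of-the-envelope sketch with illustrative values (not measured Menlo figures):

```python
# Back-of-the-envelope sketch of optical-to-microwave frequency division.
# All values are illustrative assumptions, not measured Menlo figures.
import math

f_optical = 194e12               # cavity-stabilized optical reference (Hz), ~1545 nm
f_rep = 250e6                    # comb repetition rate (Hz)
harmonic = 40                    # detected harmonic of the repetition rate
f_microwave = harmonic * f_rep   # 10 GHz carrier from the photodetected pulse train

N = f_optical / f_microwave      # frequency-division factor

# Ideal division by N scales phase fluctuations by 1/N,
# i.e. phase-noise power by 1/N**2 (expressed here in dB).
phase_noise_reduction_db = 20 * math.log10(N)

print(f"Microwave carrier: {f_microwave / 1e9:.1f} GHz")
print(f"Division factor N: {N:.0f}")
print(f"Ideal phase-noise reduction: {phase_noise_reduction_db:.0f} dB")
```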

This advance in ultralow-noise microwave generation could have a direct impact on applications such as high-speed data transmission, high-stability atomic clocks, synchronization for very-long-baseline interferometry (VLBI) in radio astronomy, and high-accuracy radar for surveillance and navigation systems.

“In Doppler radar and moving traffic indicator systems, for example, oscillator phase noise causes large stationary clutter to spread through the Doppler spectrum,” explains Giunta. “Consequently, the performance of radar systems is strongly affected by the phase noise characteristics of the microwave oscillator. Low close-in phase noise, at offset frequencies of 1 Hz to 100 kHz, is of critical importance for high-fidelity radar systems.”

Emerging applications

Other high-end applications of frequency comb technology include high-speed precision distance measurements – such as co-ordinating the positioning of satellite swarms – as well as telecommunications and the calibration of astronomical spectrographs, adds Richard Zeltner, head of business development at Menlo.

“We work very closely with research and metrology institutes to find out where different fields are heading and how we can serve them,” he says. “As today’s research can turn into tomorrow’s products, we are striving to develop reliable technology that fulfils the needs of upcoming high-end applications.”

Zeltner also points out that Menlo continues to make its established products more reliable and user-friendly. “Following more than 15 years of development we recently succeeded in reducing the footprint of an OFC while improving its ease of use,” he says. “The SmartComb is a simple and fully autonomous comb system that, together with our ORS ultrastable laser, provides end-users with a rack-mountable solution for synthesizing sub-hertz linewidth laser light and extremely pure microwaves.” The SmartComb can easily be used by non-experts, which makes it accessible to a broader clientele.

But the company is aiming even higher, adds Zeltner. Since precise timekeeping and synchronization are both crucial for global navigation satellite systems (GNSS), there are ongoing efforts to qualify optical-clock technology for space applications. “As OFCs are an integral part of any optical clock, Menlo is undertaking R&D efforts in this direction as well,” he says.

A SmartComb was recently operated successfully onboard a high-altitude research aircraft, while a prototype comb was tested under micro-gravity conditions on a sounding rocket that was part of the DLR Texus programme. “We are not quite at a readiness level for in-orbit verification satellite missions or the ISS, but we are making continuous progress towards that goal,” says Zeltner.

Looking to the future

Several other potential applications are also appearing on the horizon, such as comb-assisted coherent control of multiple lasers. “This could be important for future ion- or neutral-atom-based quantum computers,” says Giunta. “These emerging technologies could prove useful in a multitude of areas and, again, the OFC will be of central importance for such an information technology revolution.”

OFCs are Menlo’s star products, with more than 300 systems currently in operation around the world, but its product portfolio also includes femtosecond lasers, ultrastable CW lasers and THz technology. The company today employs more than 100 people and has offices in Germany, the US and China.

Further reading

Menlo Systems has published a number of scientific papers on photonic microwave generation, the FC1500-ULNplus, and applications in the space arena:

M Giunta et al. 2019 “Real-time phase tracking for wide-band optical frequency measurements at the 20th decimal place” Nat. Photon.

E Oelker et al. 2019 “Demonstration of 4.8 × 10−17 stability at 1 s for two independent optical clocks” Nat. Photon. 13 714

X Xie et al. 2016 “Photonic microwave signals with zeptosecond-level absolute timing noise” Nat. Photon. 11 44

J W Zobel et al. 2019 “Comparison of Optical Frequency Comb and Sapphire Loaded Cavity Microwave Oscillators” IEEE Photonics Technol. Lett. 31 1323

M Lezius et al. 2016 “Space-borne frequency comb metrology” Optica 3 1381

Electrons passing over nanophotonic materials could create synchrotron-like light

Vacuum fluctuations just a few nanometres from the surface of a material can cause a passing beam of relativistic electrons to emit X-rays and other high-frequency electromagnetic radiation — according to calculations done by scientists in the US, Israel and Singapore. If confirmed experimentally, the effect could be used to create compact and tunable sources of X-rays and even gamma rays.

Compact sources of high-quality electromagnetic radiation at X-ray and higher frequencies are difficult to make because electrons in materials cannot react quickly enough to electromagnetic field oscillations at these high frequencies. Instead, production of such radiation relies on strong external fields that accelerate beams of high-energy electrons in devices such as synchrotrons and free-electron lasers. While this approach has been extremely successful, it requires huge and expensive magnets or lasers, and can only be done at a handful of dedicated sites.

Now, Nicholas Rivera and colleagues at the Massachusetts Institute of Technology, the Technion and Nanyang Technological University in Singapore have proposed a new way of generating high-frequency light that does not require strong external fields. It involves a two-photon process whereby a relativistic free electron passing a few nanometres from the surface of a material spontaneously emits a high-energy photon and a polariton. The latter is a quasiparticle that forms as a result of the interaction and mixing of light with dipoles in a medium. This spontaneous emission is caused by vacuum fluctuations – a well-known phenomenon that arises because the vacuum is not empty space but has a non-zero energy density of the electromagnetic field.

Famous effects

“The most famous effects arising from vacuum fluctuations are the Casimir and van der Waals forces between neutral objects, the latter of which explains the ability of geckos to walk on ceilings”, adds Rivera. These so-called vacuum forces are especially strong in the nanoscale vicinity of nanophotonic structures.

Rivera and colleagues calculated this effect in optical materials such as graphene, thin films of gold and silver, and silicon carbide. In these materials polaritons are emitted at infrared or visible frequencies. Using doped graphene increases the strength of the van der Waals force, which extracts more emission from the electron. The calculated photon emission is more broadband than in synchrotrons, spanning from soft ultraviolet to hard X-ray frequencies, explains Ido Kaminer of the Technion. Furthermore, the team calculated that the total emitted power is comparable to the power that equal-energy electrons would emit as synchrotron light induced by a 1 T magnetic field – a striking theoretical prediction.

The concept could be used to develop compact and tunable sources of high-frequency light, based on vacuum fluctuations near and inside nanophotonic materials. But what material will provide the desirable radiation characteristics?

Higher coherence and shorter pulses

Kaminer says: “We expect certain periodic patterns, as is now often done in metasurfaces, to result in higher coherence and shorter pulses”. Gigaelectronvolt electron beams could be delivered by a conventional linear accelerator, or perhaps in the future by a more compact laser wakefield accelerator. The framework could be extended to include more complex electron-beam configurations such as moving dipoles or bunched electrons – the latter is used in free-electron lasers.

Ultimately, by using the tools of nanophotonics, controlling high-frequency light emission may lead to creation of synthetic active nonlinearities at X-ray and even gamma-ray frequencies. Such nonlinearities could be used, for example, to create entangled pairs of X-ray photons via parametric down conversion.

Full details of the research are reported in Nature Physics, where the team also makes several suggestions for how the effect could be studied in the lab.

A 2030 UK energy plan

At this year’s Labour party conference held just outside London in Croydon in October, a motion was adopted that called on the party to work towards “a path to net zero carbon emissions by 2030”. Labour then asked a group of independent energy-industry experts to identify a pathway to decarbonize the UK energy system by 2030. The resulting report, which was published in late October, is very detailed and quite radical. It identifies four overarching goals to transform the UK’s energy supply and use: reducing energy waste in buildings and industry; decarbonizing heat; boosting renewable and low-carbon electricity generation; and balancing the UK’s supply and demand.

Thirty recommendations were made to meet those goals. They include installing eight million heat pumps as well as upgrading every home in the UK with energy-saving measures such as insulation and double glazing, focusing first on damp homes and areas with fuel poverty. The report also calls for the installation of 7000 offshore and 2000 onshore wind turbines, as well as solar panels covering an area equivalent to 22 000 football pitches, so tripling the UK’s current solar capacity.

“Delivering these thirty recommendations would make the UK a pioneer. No other industrialized country has plans of a similar scale,” the report notes. “The scope and pace of change would bring challenges, but also first-mover advantages and would avoid costly high-carbon lock-in for the country’s infrastructure.”

Keeping on track

Specific recommendations for early action include a vast expansion of offshore wind to 52 GW while onshore wind would increase to 30 GW and solar energy to 35 GW – all contributing to the 137 GW boost in renewable capacity. The report also calls for an urgent UK-wide programme to upgrade existing buildings to “significantly reduce energy wastage and a shift to low-carbon heat”. All new buildings would have to be net zero-carbon.

The plan is a maximalist programme of renewable expansion and energy-efficiency upgrades in all sectors. On the demand side, it aims to reduce the need for energy across the UK by a minimum of 20% for heat and a minimum of 11% for electricity, relative to current levels. On the supply side, offshore wind would be supplying 172 TWh by 2030, onshore wind would contribute 69 TWh and photovoltaic solar 37 TWh. But there is also 63 TWh from nuclear — with 9 GW assumed to be in place by 2030 — as well as 32 TWh from gas, with 40 GW of power plants in use.
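Adding up the supply-side figures quoted above gives a rough picture of the proposed 2030 mix. A quick tally of just those numbers (other sources in the report, such as marine energy and biomass, are not included here):

```python
# Quick tally of the 2030 supply-side figures quoted above (TWh).
# Other sources mentioned in the report (marine, biomass, etc.) are omitted.
supply_twh = {
    "offshore wind": 172,
    "onshore wind": 69,
    "solar PV": 37,
    "nuclear": 63,
    "gas": 32,
}

total = sum(supply_twh.values())
for source, twh in supply_twh.items():
    print(f"{source:>13}: {twh:4d} TWh ({100 * twh / total:4.1f}%)")
print(f"{'total':>13}: {total:4d} TWh")
# Of the sources listed, gas comes to under 10% of generation, broadly in line
# with the report's 90% low-carbon mix discussed below.
```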

For the longer term, there would be significant investment in research and development for marine energy — up to 3 GW of tidal — and renewable or low-carbon hydrogen for heating and energy storage, along with carbon capture and storage for some heavy industries (2.5 GW). The aim is that by the late 2020s “these emerging technologies can be deployed, alongside current technologies such as nuclear, to the appropriate scale”.

That part could be a hint of support for small modular nuclear to keep nuclear at its current level, and also for fossil-gas steam methane reforming for hydrogen production. But the report also mentions the electrolysis “power to gas” route: using green power to make hydrogen. Interestingly, it sees solar providing about 6% of UK heating, with biomass contributing 5%. Yet it recommends not expanding solid biomass use for large-scale electricity generation. So, no more Drax-type imported wood-pellet plants.

Based on the proposed programme, the report claims that the UK could be on track to deliver a 77% reduction in energy emissions by 2030 compared with 2010 levels. It adds that, looking beyond that date, zero-carbon electricity “could potentially be anticipated as early as 2034-2040, and zero-carbon heating [from] 2036-2040”.

Social benefits

The report says that there would also be substantial social benefits from the plan. By 2030, its recommended investment in the energy sector would lead to a net benefit to the economy of £800bn and create 850 000 new skilled jobs in the green industry. “The UK would build a unique skill and knowledge base supporting the kind of transition that many other countries will need to go through, providing a huge opportunity for the UK to demonstrate industrial leadership,” the report says.

The report adds that upgrading the housing stock has the potential to end fuel poverty that is currently affecting 2.5 million UK households. By 2030, these measures could also mean 565 000 fewer cases of asthma by helping to alleviate damp. Replacing fossil fuels with renewable energy could also improve air quality resulting in 6200 fewer respiratory-related deaths each year by 2030. Overall, the report says, benefits to public health could potentially save the UK’s National Health Service £400m each year.

The proposals received a clean bill of health as technically credible from a range of academics and experts in the field, and the sense of urgency was welcomed by Greenpeace. The detailed plan certainly does look interesting. However, there are some issues. It doesn’t back district heating networks (DHNs), at least not yet. That is a change from traditional Labour stances, which have seen urban DHNs — along with combined heat and power plants — as important elements of energy policy.

The report says the deployment of DHNs would be constrained by the proposed building retrofit programme, which will result in a “drop in heat demand by on average 25% which will further reduce the already marginal returns for DHN operators”. Yet DHNs are a good, flexible infrastructure investment, able to use any heat source that comes along. And it has been claimed that for high rises, for example, plugging into a DHN can offer lower-cost carbon savings than often tricky-to-install and potentially risky retrofitted insulation.

Nuclear retention

The report assumes that, in its 90% low-carbon mix for 2030, nuclear capacity stays at the current level. But it also says it is “entirely possible to meet the 90% target without any new nuclear capacity”, though that would be “more challenging” due to the loss of low-carbon base-load and the increased use of variable power. So, the report notes, more grid balancing would be needed via storage, interconnection, demand management or fossil-fuel back-up. Though, it adds, “the system will also benefit from cheaper generation technology such as wind & solar”.

The retention of nuclear is controversial. For example, far from helping to balance variable renewables, having nuclear on the grid may make the balancing problem harder to deal with, since it is inflexible. By contrast, it can be argued that an increase in variable renewable capacity could reduce the balancing problem. There would be more green power available more often to meet peaks, reducing balancing needs, and an increased amount of surplus power at times, expanding the potential for power-to-gas electrolytic conversion to hydrogen. That hydrogen could be stored and used to generate power again during lulls in renewable output.

This is a good report that faces up to many of the issues, nuclear apart. Yet the new set of proposals avoids detailed programme costing. That will be up to the Labour party to deal with if it adopts this plan. Rebecca Long Bailey, Labour’s shadow business and energy secretary, welcomed the report, saying that it is “a major contribution to Labour’s plans to kickstart a Green Industrial Revolution”. Given that an election manifesto is now imminent, we will have to wait and see what is included in it.

Female chemists hit by ‘significant disadvantage’ when publishing their research

Gender biases exist at “each step” of the publication process in chemistry publishing. That is according to a 30-page report — Is Publishing in the Chemical Sciences Gender Biased? — released on 5 November by the Royal Society of Chemistry (RSC). The report, which examines the diversity of authors, referees and editors of RSC journals, finds that while these biases appear minor in isolation, their combined effect puts women at a significant disadvantage when publishing their research.

The RSC publishes more than 40 peer-reviewed journals, corresponding to around 35 000 articles each year that, while focused on chemistry, also cover fields such as biology, materials and physics. The study looked at the gender of authors in over 717 000 manuscript submissions between 2014 and 2018, as well as over 141 000 citations between RSC journals from 2011 to 2018. The gender of authors was assigned to names using the same method as the UK Intellectual Property Office used in its report on gender in UK patenting.
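Name-based gender assignment of this kind typically works from a reference table of first names and only assigns a gender when a name is strongly associated with one. The sketch below is purely illustrative and is not the Intellectual Property Office's actual method; the lookup table and threshold are hypothetical.

```python
# Purely illustrative sketch of name-based gender assignment.
# The lookup table and threshold are hypothetical; the RSC study used the
# UK Intellectual Property Office's method, not this toy example.

# Hypothetical reference data: first name -> probability the name belongs to a woman.
NAME_TABLE = {"emma": 0.98, "maria": 0.97, "alex": 0.45, "wei": 0.50, "john": 0.01}

def assign_gender(first_name, threshold=0.9):
    """Return 'female', 'male' or 'unassigned' for a given first name."""
    p_female = NAME_TABLE.get(first_name.lower())
    if p_female is None:
        return "unassigned"              # name absent from the reference data
    if p_female >= threshold:
        return "female"
    if p_female <= 1 - threshold:
        return "male"
    return "unassigned"                  # ambiguous names are left out of the analysis

print(assign_gender("Emma"))   # female
print(assign_gender("Alex"))   # unassigned
```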

Call to action

The analysis found that women are less likely than men to submit their work to journals that have a high impact factor, and are more likely to have an article rejected without review. The report notes that women are also less likely to hold positions towards the end of the author list, in particular that of corresponding author. Indeed, while around 36% of RSC authors are female, only 24% of submissions have a woman as corresponding author.

When it comes to peer review, the report finds that women are under-represented as reviewers but are more likely to be chosen to review articles that have female corresponding authors. The report also states that women cite fewer research papers than men overall, and men are less likely than women to cite papers authored by women. Indeed, only 18% of cited papers have a corresponding author who is a woman.

To tackle such biases, the report offers four areas for action. These include publishing an annual analysis of authors, reviewers and editorial decision-makers in each subdiscipline, as well as recruiting reviewers, journal board members and associate editors to match the current gender balance in chemistry — with a target of 36% being women by 2022. The report also calls for more editorial training to eliminate biases and for the RSC to collaborate with other publishers to boost diversity and inclusion in the industry.

“We were surprised by some of the findings, which included a number of cases where women said they felt less confident submitting to a journal because they feel they might not meet the criteria for publication, while men may be more likely to take the risk,” says RSC publishing director Emma Wilson. “While these issues don’t just apply to the chemical sciences, as an organisation there is absolutely no point telling others they need to change unless you’re willing to do so yourself. In analyzing our journal peer review processes, committing to increase female representation within the publication process and annually reporting on our progress toward gender equality, we are aiming to raise the bar.”

Douglas Trumbull: a mutual appreciation between scientists and moviemakers

Douglas Trumbull has spent more than 50 years at the technological cutting edge of moviemaking – from his iconic special effects for 2001: A Space Odyssey (1968), The Andromeda Strain (1971), Close Encounters of the Third Kind (1977), Star Trek: the Motion Picture (1979), Blade Runner (1982) and The Tree of Life (2011), to his directorial work on the cult classics Silent Running (1972) and Brainstorm (1983). Now 77, the moviemaking legend admits he is “a complete outlier and weirdo relative to the entrenched motion-picture industry”. Indeed, he says he feels “much better understood by scientists and mathematicians than by studio executives”.

Trumbull’s career began on the short film To the Moon and Beyond (1964), which transports the viewer from Earth out to the entire universe before zooming back down to the atomic scale. Recorded at 18 frames per second using a fish-eye wide-angle lens on 70 mm film – a technique dubbed Cinerama 360° – the movie was projected onto a domed exhibit at the New York World’s Fair in 1964. Impressed by the special effects, director Stanley Kubrick hired Trumbull to work on 2001.

Since the 1980s he’s been based in rural Massachusetts, where Trumbull Studios is experimenting with new ways of making and showing films. These include a prototype 70-seat “pod” featuring advanced digital-projection technology as well as a slightly curved, torus-shaped cinema screen. “The work that I’m doing is predicated upon a belief that there’s an intrinsically, vitally important link between the medium itself and the movie experience you can deliver to audiences,” Trumbull says. “Kubrick was very conscious of this, he talked to me about it a lot. That has stayed with me forever.”

Trumbull’s entire career has been a learning curve. Or, as he puts it, “a hybrid of science, technology, and the drama and art form of movies”. It has also given him a unique perspective on the worlds of filmmaking and science, and the way that practitioners in one area can learn from those in the other. He cites, for example, his work on Terrence Malick’s epic The Tree of Life, which includes a 17-minute “creation” sequence showing the birth of stars and evolution of life on Earth.

“Terry is very scientific and well studied,” Trumbull says, pointing to Malick’s ability to understand, say, the fluid dynamics of two galaxies colliding. “He would go to a supercomputing lab at a major university and see if they had something like that. And sure enough, there were experimental movies made by weighting each star with a certain gravitational pull and having them interact in a scientifically valid way.”

Unfortunately, those simulations were, says Trumbull, “weirdly underwhelming, because they were only as good as the mathematical algorithms”, which prompted him and Malick to try something new. “By using real fluids in a real liquid, or real gases and explosive lights – and filming that with high-speed cameras at a thousand frames a second – we would find much more intuitively natural-looking effects than anything we could create with a computer.”

That ability to turn abstract thoughts into visual form is a skill that filmmakers share with scientists. For instance, Trumbull is about to start making a film about the engineer and inventor Nikola Tesla. “One of the aspects of Tesla’s nature was that he saw everything in his mind long before he manifested it in reality,” says Trumbull. “He could see things so vividly that, when he was developing some idea, he didn’t draw or build anything until his mind had completed the project – pre-visualized it. And this enabled him to understand more completely how one magnetic field would interact with another to create the Tesla coil, for example.”

At the same time, Tesla’s story provides a cautionary note about the limits of visualization, Trumbull warns. “In the later years of his life Tesla made some significant mistakes by believing that what was imagined in his mind was always accurate. He believed he could scale up a Tesla coil to the Wardenclyffe Tower [an experimental wireless station in New York] and transmit energy around the world, which I don’t think was ever going to work.”

One intriguing collision between the make-believe world of movies and real life is Blade Runner. Trumbull’s visual effects helped give the film its famously futuristic feel, but the story’s setting – Los Angeles, November 2019 – is no longer the distant future; it’s now. While not everything predicted in that film – flying cars, human-like robots and 3D holographic billboards – has been realized, “a lot of Blade Runner has come true but in a different form”, Trumbull says, pointing to the growing impact of artificial intelligence (AI). “AI is one of the most compelling topics of discussion in our society – and it’s evolved over time in such a way you don’t notice.”

Significantly, Blade Runner predicted a shift in our relationship with technology. “The resistance exerted by the AI beings – against the limit in their lifespan – is very much like 2001,” says Trumbull, pointing to the fictional crew’s decision in that movie to shut down the HAL computer. “The dying of HAL was really tragic. The fact that the audience could empathize with HAL more than they could empathize with the human characters in the movie is really telling. I think we’re going to see more and more of that. You’re going to feel bad when your vacuum cleaner dies.”

Can conventional X-ray tubes deliver FLASH dose rates?

The curative potential of radiation therapy is limited by normal tissue toxicity, which restricts the dose that can be delivered to the tumour without harming nearby non-target tissue. Recently, interest in ultrahigh dose rate radiotherapy has been rekindled, following preclinical studies showing increased normal tissue tolerance at high dose rates. This approach – known as FLASH radiotherapy – employs dose rates exceeding 40 Gy/s and can improve the therapeutic ratio by increasing the differential response between normal and tumour tissues.

The expectation is that FLASH could one day provide tumour ablation in a single sub-second treatment, while substantially reducing radiation-induced side-effects. The underlying mechanism, however, is not yet understood and requires further research. And to date, FLASH dose rates typically require specialized electron sources or substantial modifications to clinical linacs.

“Access to FLASH beamlines is limited and the progress on understanding of the FLASH mechanism is quite slow,” explains Magdalena Bazalova-Carter from the University of Victoria’s XCITE Lab. To remove this constraint, Bazalova-Carter and PhD student Nolan Esplen are investigating the feasibility of using a conventional high-powered X-ray tube for FLASH radiotherapy (Med. Phys. 10.1002/mp.13858).

“Our results will hopefully inspire researchers without access to high-power electron sources or proton beamlines to perform FLASH in vitro, and possibly some limited in vivo experiments, with a standard X-ray tube,” says Bazalova-Carter.

The researchers used Monte Carlo (MC) modelling to evaluate the maximum dose rates achievable by two conventional X-ray tubes: the 3 kW MXR-160/22 (which was being validated in the XCITE lab) and the 6 kW MXR-165, which benefits from a short distance from the focal spot to the tube surface. For both tubes operating at maximum power, they simulated the output of an unfiltered 160 kV beam and calculated the dose deposited in a water phantom placed against the tube surface. They then converted the MC-calculated dose to dose rate.
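Monte Carlo codes typically report dose per simulated source electron, so turning that into a dose rate amounts to multiplying by the number of electrons the tube delivers per second at full power. A minimal sketch of that bookkeeping (the dose-per-history value is a placeholder, and this is not necessarily the authors' exact pipeline):

```python
# Sketch of converting a Monte Carlo dose-per-history into a dose rate.
# Assumes dose is scored per source electron and the tube runs at maximum power;
# the dose-per-history value below is a placeholder, not a result from the paper.

E_CHARGE = 1.602e-19   # electron charge (C)

def dose_rate(dose_per_history_gy, tube_power_w, tube_voltage_v):
    """Dose rate (Gy/s) from the MC dose per source electron (Gy)."""
    beam_current = tube_power_w / tube_voltage_v       # A; 3000 W / 160 kV ~ 18.8 mA
    electrons_per_second = beam_current / E_CHARGE
    return dose_per_history_gy * electrons_per_second

d_per_history = 1.0e-15    # Gy per source electron at the phantom surface (illustrative)
print(f"{dose_rate(d_per_history, 3000, 160e3):.0f} Gy/s")
```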

The simulations revealed that both X-ray tubes were FLASH-capable, with maximum phantom surface dose-rates of 114.3 and 160.0 Gy/s, for the MXR-160/22 and MXR-165, respectively. Dose non-uniformity due to the heel effect – an inherent directional variation in the X-ray intensity emitted by the anode – was seen in both cases. For a 1 cm diameter region-of-interest within the high-dose region, dose rates were 110.6 Gy/s for the MXR-160/22 and 151.9 Gy/s for the MXR-165.

Plotting dose rate versus depth revealed a rapid fall-off for both 160 kV X-ray beams. At 2 mm depth, for example, dose rates decreased to 23% and 28% of that at the surface, for the MXR-160/22 and MXR-165, respectively. The dose rate remained FLASH-capable at depths of up to 1.4 and 2.1 mm, for the MXR-160/22 and MXR-165, respectively.

To validate their MC models, Bazalova-Carter and Esplen measured the dose in a plastic water phantom irradiated with a 120 kV beam from the MXR-160/22. They placed Gafchromic EBT3 films at 15 and 18 mm depth in the phantom and compared the measured 2D dose profiles with those from MC simulations of the 120 kV beam.

Dose distributions

In the region not affected by the heel effect, the simulations agreed well with the film measurements. The mean x-profile differences between experiments and simulations were 1.5% and 3.2%, at 15 and 18 mm, respectively; the mean y-profile differences were 1.5% and 3.5%, at 15 and 18 mm. Agreement in the heel-effect region was poorer, however, with a mean difference of up to 17.8% along the x-profile.

This validation lends confidence to the simulations, which indicate that conventional X-ray tubes can deliver FLASH dose rates. The researchers suggest that these particular tubes could be suitable for FLASH skin irradiations, in vitro experiments or testing the dose-rate dependence of small-field dosimeters.

They are now working to further tailor the X-ray tubes for FLASH applications. “We are currently building a shutter mechanism that will be inset in the X-ray tube, which will further increase the dose rate,” Bazalova-Carter tells Physics World. “We are also designing experiments to test cell survival for FLASH irradiations with and without gold nanoparticles.”

Wearable MEG scanner used with children for the first time

The human brain undergoes significant functional and structural changes during the first decades of life, as the fundamental building blocks of human cognition are established. However, relatively little is known about maturation of brain function during these critical times. Non-invasive imaging techniques can provide information on brain structure and function, but brain scanners tend to be optimized for adult head-sizes. Traditional fixed scanners also require patients to stay completely still, which can be highly challenging for children.

A UK research collaboration aims to solve these problems by creating a wearable magnetoencephalography (MEG) system that allows natural movement during scanning. They have now used the wearable MEG for the first time in a study with young children (Nature Commun. 10.1038/s41467-019-12486-x).

The researchers, from the University of Nottingham, the University of Oxford and University College London, developed a lightweight ‘bike helmet’ style MEG scanner and used it to measure brain activity in children performing everyday activities. As well as enabling studies of neurodevelopment in childhood, this system should allow investigation of neurological and mental health conditions in children, such as epilepsy and autism.

MEG measures the small magnetic fields generated at the scalp by neural current flow, allowing direct imaging of brain activity with high spatiotemporal precision. Traditional MEG systems use an array of cryogenically-cooled sensors in a one-size-fits-all helmet. Such systems are bulky and highly sensitive to any head movement.

To address these issues, the team is using optically pumped magnetometers (OPMs) to measure the magnetic fields generated by the brain. These small, lightweight sensors can be positioned on a 500 g helmet that can adapt to any head shape or size. The OPMs can also be placed far closer to the head than conventional sensors, increasing their sensitivity. The researchers also employed an array of electromagnetic coils to null the residual static magnetic field inside the magnetically shielded room, allowing individuals to be scanned whilst they move freely.
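The field-nulling idea can be reduced to a simple control problem: measure the residual field at a set of reference sensors and choose coil currents that cancel it as well as possible. A generic least-squares sketch of that idea is shown below; random numbers stand in for the real coil geometry and field map, and this is not the group's actual controller.

```python
# Generic least-squares sketch of nulling a residual static field with a coil array.
# Illustrative only; not the actual coil geometry or controller used in the study.
import numpy as np

rng = np.random.default_rng(0)

n_sensors, n_coils = 12, 6
# Coupling matrix: field (nT) produced at each reference sensor per unit coil current (A).
# In practice this comes from the coil geometry (Biot-Savart) or a calibration scan.
A = rng.normal(size=(n_sensors, n_coils))

# Residual static field measured at the reference sensors before nulling (nT).
b_residual = rng.normal(scale=20.0, size=n_sensors)

# Choose coil currents that cancel the residual field in the least-squares sense.
currents, *_ = np.linalg.lstsq(A, -b_residual, rcond=None)

b_after = b_residual + A @ currents
print(f"RMS field before nulling: {np.sqrt(np.mean(b_residual**2)):.1f} nT")
print(f"RMS field after nulling:  {np.sqrt(np.mean(b_after**2)):.1f} nT")
```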

“The initial prototype scanner was a 3D printed helmet that was bespoke – in other words only one person could use it. It was very heavy and quite scary to look at,” explains PhD researcher Ryan Hill who led this latest study. “Here, we wanted to adapt it for use with children, which meant we had to design something much lighter and more comfortable but that still allowed good enough contact with the quantum sensors to pick up signals from the brain.”

The researchers designed and built the new bike helmet style scanner and used it to successfully analyse the brain activity of a two-year-old (typically the hardest age to scan without sedation) and a five-year-old watching TV whilst their hands were being stroked by their mother. The children were able to move around and act naturally throughout.

To show that the MEG system is equally applicable to older children, the researchers also used it with a larger helmet to scan a teenager playing a computer game. Finally, they used the new scanner to examine brain activity in an adult learning to play a sequence of chords on a ukulele. Despite the substantial head and arm movement required to complete this task, clear electrophysiological responses were observed.

“This study is a hugely important step towards getting MEG closer to being used in a clinical setting, showing it has real potential for use in children,” says Matthew Brookes, who leads the MEG research at the University of Nottingham. “The challenge now is to expand this further, realising the theoretical benefits such as high sensitivity and spatial resolution, and refining the system design and fabrication, taking it away from the laboratory and towards a commercial product.”

The researchers conclude that their study demonstrates that the OPM-based MEG system can generate high quality data, even in a 2-year-old child, and can be used to measure brain activity during naturalistic motor paradigms. “OPM-MEG, with generic helmet design, paves the way for a new approach to neurodevelopmental research,” they write.

Air-quality regulations shown to lower traces of airborne transition metals

When taking a deep breath, you draw a range of gases into your lungs, from oxygen and nitrogen to carbon dioxide and argon, along with traces of water vapour. But that same breath could also contain microscopic amounts of copper, iron, zinc and even chromium. Despite making up a tiny fraction of the pollutants in the air, transition metals have some of the most damaging impacts on our health.

A study has now assessed the abundance of various airborne transition metals in urban areas across the US. While overall particulate pollution has decreased over the past two decades, particularly in urban areas, some cities have seen a rise in the amount of airborne transition metals. By studying the trends, researchers are beginning to pinpoint the likely sources of various metals and how their emissions can be better controlled.

Diluting concentrations

Clean air is a staggeringly good investment. Since 1990, the US has spent an estimated $65bn on implementing the 1990 Clean Air Act but gained $2 trillion in benefits. According to the Environmental Protection Agency, this year alone the act has prevented around 230 000 early deaths, avoided 120 000 emergency room visits, and stopped 5.4 million sick-days in schools and 17 million sick-days at work.

Particulate pollution is responsible for some of the worst health and economic impacts of air pollution, with transition metals believed to be more damaging than other compounds. That is because the metals act as catalysts and help to produce oxidants, which can lead to oxidative stress. “Oxidative stress has been linked with the genesis and progression of many different diseases – it’s why there is so much research and marketing for foods that contain antioxidants,” says Christopher Hennigan, an environmental engineer at the University of Maryland, Baltimore County.

Hennigan and colleagues analysed seven different transition metals over the period 2001 to 2016, across ten different urban areas — Atlanta, Baltimore, Chicago, Dallas, Denver, Los Angeles, New York City, Seattle, St Louis and Phoenix. They found that around a decade ago concentrations of nickel and vanadium in port cities were around five times higher than in non-port cities, but that the difference has now all but disappeared. “The reductions in port cities were most likely from regulations on marine fuel sulphur content,” explains Hennigan.

The strong downward trend in vanadium across all urban areas clearly matched the introduction of diesel-fuel regulations in 2006. Copper, meanwhile, has stayed stubbornly constant in most areas. “Our results suggest that vehicle brake-lining dust is a major source of copper,” says Hennigan. The team also found higher concentrations of iron in western cities (by around a factor of two) than in cities in the east, most likely because soil and dust are major sources of iron and prevailing winds cross more land and carry more dust to western cities.
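The trend analysis behind statements like these amounts to fitting annual mean concentrations against time, city by city and metal by metal, and comparing the fitted slopes with the timing of regulations. A minimal sketch with synthetic numbers (not the study's data):

```python
# Minimal sketch of quantifying a trend in annual metal concentrations.
# The series below is synthetic; it is not the study's data.
import numpy as np

years = np.arange(2001, 2017)
rng = np.random.default_rng(1)
# Synthetic vanadium-like record (ng/m^3): roughly flat, then lower after the 2006 fuel rules.
conc = np.where(years < 2007, 6.0, 2.0) + rng.normal(0, 0.3, years.size)

slope, intercept = np.polyfit(years, conc, 1)
print(f"Linear trend: {slope:+.2f} ng/m^3 per year")
print(f"Mean before 2007: {conc[years < 2007].mean():.1f} ng/m^3")
print(f"Mean from 2007:   {conc[years >= 2007].mean():.1f} ng/m^3")
```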

One puzzle, however, was chromium, which increased in cities in the east and midwest, with a distinct spike in 2013. “We don’t have a good explanation for this which indicates a gap in our understanding of chromium sources and their magnitude,” says Hennigan.

The findings confirm how beneficial air-quality legislation has been for the US. They also make a strong case for continuing to improve air quality, with Hennigan believing there are still big gains to be made. “There is a strong body of scientific research that shows that a transition to more sustainable energy sources will have co-benefits in air quality,” he says.

The research is published in Environmental Research Letters.

Daniel Radcliffe: VFX tricks and wizardry

Jess Wade: You have been in a bunch of films that use VFX in the most progressive and creative ways. What was it like starting your acting career with the extraordinary VFX in the Harry Potter films [2001–2011]?

Daniel Radcliffe: For some of the experienced actors on Potter, it was their first time working with VFX on that kind of scale. It was different for us kids. Telling us that “the dragon is this tennis ball on the end of the stick” is a little different from giving an older actor that instruction – we’d never known anything different. And we were all kids, so using our imagination was something that we were doing a lot anyway.

JW: Has VFX changed how you act?

DR: I don’t think so – it’s always been a big part of my career. I enjoy the challenge of it. I think I’m weirdly good at following numbered cues now. I remember when they shot all the audience reactions during the Tri-Wizard Tournament [in the fourth film, Harry Potter and the Goblet of Fire (2005)], and there would basically be a bunch of the cast and background artists on a big stand – sometimes on a green screen, depending on what the backdrop was. Assistant directors would hang big numbers around the studio and just say, for example, “1” so everyone would turn to the same eye line at the same time.

JW: The Harry Potter films ended eight years ago, and you’ve done some really exciting things with VFX since then. Has it changed a lot?

DR: Potter came at a time when people were leaning heavily toward visual effects and away from “practical” make-up or special effects. Even though, of course, we had plenty of them too. In the last couple of years, we’ve reached a nice balance – where big franchises like Star Wars and Mad Max use a lot of practical stuff in creature effects and stunt work. People see the value of having practical, on-set effects, but VFX are so good. It can also make stunt work safer because you don’t have to put a human being through what you can get VFX to do.

But certainly, VFX is improving at an extraordinary rate. If you were to look at the difference between the first and last Potter films [in 2001 and 2011] – they get exponentially better over time.

JW: How does working with all that VFX compare to stage acting?

DR: I think that’s the joy of my job – I’ll do some films where there’s almost no VFX whatsoever, then I’ll do films like Swiss Army Man [2016] where it’s a crazy mix of VFX and old-school practical stuff such as camera tricks. There was one scene in that film where my character gets punched in the mouth, then swallows the hand that punches him… and punches himself in the stomach to make the hand that’s in his mouth get forced back out. I wondered “how are we going to do that?”. There was no VFX involved – it was entirely clever camera angles and a bit of make-up on the arm to make it look like it was covered in spit. It’s wonderful to be able to flit between those things – the very low-fi and the highly sophisticated ways of solving problems on film.

JW: Do you ever get involved with VFX? Do you go and see what they’re doing?

DR: The closest you get on set is when the film’s big enough to do previs [previsualization] sequences – like an animated storyboard that no-one else ever sees. For example, when there was a big quidditch sequence on Potter, they’d have that all mapped out on a visual storyboard first, and we’d try and stick to that when we filmed. But the majority of the time, the VFX is in post-production, when the actors aren’t around.

JW: But sometimes you go in to do that funny thing – what’s it called – ADR?

DR: Yeah, ADR – additional dialogue recording. At that point you might see some sequences with half-finished VFX – and that’s always cool; it’s always fun to see it in a primitive phase. For someone who is interested in how films get put together it’s kind of fascinating. In this rough cut of the film there will be shots like, if you did a driving sequence on a green screen, they’ll just show the shot on a green screen with a little caption saying “VFX needed”.

When films started using huge sets that were just entirely blue screen and VFX, I think actors were a bit whiney about it – there’s something about being on a bright blue or green screen that can drive you slightly insane. At first it was something to be remarked upon, but now it is so much part of the industry – I don’t think anyone sees it as a novel thing anymore.

JW: What’s your favourite example of VFX that you’ve worked with?

DR: That’s really hard. There are some amazing sequences in Potter – there is some really beautiful stuff. The Hall of Prophecy in [the fifth film, Harry Potter and the] Order of the Phoenix [2007] was almost entirely green screen if I remember rightly.

And then in Horns [2013], when my on-screen brother took some hallucinogenic drugs and had this really visual trip – that’s a really good mix of practical prosthetics, VFX and tricks the designers built into the sets.

There’s also the other side of VFX, which is less glamorous but even more useful. Like driving sequences – when you’re filming in a place where you can’t shut down roads, you have to do it on green screens. Then there’s patching up a prosthetic. Sometimes things look fantastic when they’ve been put on at 9 a.m., but when you’ve been wearing it for 10 or 11 hours, visual effects can be helpful for polishing up that stuff.

JW: What has been the most ridiculous thing that you had to work with?

DR: None of it feels too ridiculous at the time. The hippogriff [a magical creature that’s part eagle, part horse] in [the third film, Harry Potter and the] Prisoner of Azkaban [2004] – the reality of the hippogriff and the flight of it was quite funny. If you imagine a limbless, headless bucking bronco…

JW: [descends into laughter] Like…a mechanical thing?

DR: Yeah, a mechanical bucking bronco on hydraulics. Just a grey torso with no texture, filmed on a blue screen and a green screen with a motion control camera.

JW: [can’t stop laughing] But you were all kids! I imagine when one 14-year-old starts laughing, everyone starts laughing.

DR: Sure, there would be an element of that. Thankfully, for the hippogriff sequence I was on my own at the start – so I’d got used to it. Of course, it also feels slightly strange when you mark it through for the first time if you’re acting alongside something like a tennis ball, but you get used to it.

JW: Is it weird to watch yourself after you’ve been VFX-d?

DR: It’s not weird so much as it is cool! It’s satisfying and really fascinating to see the finished product all put together, after having seen it at its most basic stages.

JW: Have you had experience with any cool VFX technologies?

DR: On Potter there was something called cyber-scanning. You’d stand in the middle of around 30 cameras and a computer would make a 3D map of you. And you know, as a kid, I had to be very still for a long time. They also had to keep doing it for every film because us kids were growing up.

JW: What did they use that for?

DR: If there’s a scene where you’re being thrown around in a crazy way – or you’re falling from a broom or something – and they didn’t want to do it with a stunt man. They use the cyber scan to recreate a digital version of you.

JW: It’s kind of cool but also intimidating. I think I’d hate to have 30 cameras pointing at me from all different angles.

DR: Yeah, for sure, it’s weird. You don’t just sit there either – you sometimes have to make expressions. There will be six or seven “first do a neutral face, then do smiling, then smiling with teeth, then surprised, then scared…” – so you have to make slightly caricatured versions of facial expressions. It’s one of the weirder parts of my job – but I enjoy all of those parts of my job!

JW: Does it feel like there’s a movement in the film industry to go back to more old-school techniques, away from VFX?

DR: Maybe a little bit. If you go to one of J J Abrams’ sets for the new Star Wars films there are lots of practical prosthetics, make-up effects and creatures – it’s really cool. It’s one of the things people love about the films that he has made.

The directors of Swiss Army Man, Daniel Kwan and Daniel Scheinert, love doing stuff practically. There are sequences in the film where we’re attacked by a bear, and there is no safe or practical way of doing that really, and we didn’t have the money that The Revenant [2015] had to do a bear attack. But Dan Kwan has a VFX/animation background and knew how to film things to make the VFX easy – there are tricks.

People used to say they didn’t want movies to look like video games – but video games look incredible at this point in time, so it’s not really a valid criticism anyway anymore. I don’t think we’ll ever get to a point where we completely do away with human actors and have entirely VFX movies – though there is a place for those movies right now, and they’re awesome.

You see how people respond to films like Mad Max: Fury Road [2015], which had a lot of practical stunts, the crazy cars – that was all real. But it was coupled with a tonne of VFX – removing wires, stunt harnesses. I think the industry has got to a point where we realize the value of both and find a compromise between the two.

JW: When you think about your career – of course you think about acting, but increasingly producing and directing – do you see yourself getting more involved with VFX?

DR: Depending on what level of VFX is in the film, VFX teams work very closely with the director. I think it’s really important to work with people you get on with and who understand the vision of the film. I cannot overstate how important that relationship is – the VFX team can really bail you out of stuff. On Guns Akimbo [2019] there was a lot of VFX, and we had a very chill, cool VFX co-ordinator called Tony [Kock] – and whenever there was a problem on set we’d say, “Hey Tony, can you fix that?” and he’d be like, “Yeah, that’s fine.”

JW: When you find someone like that do you not just want to ask them a tonne of questions about the technical parts of it?

DR: I do, but it’s like when I ask you about physics – I can only understand so much.

JW: Talking of physics, it’s not often we have a film star in Physics World. If you played a physicist who would you be?

DR: I will reverse the question: who would you cast me as?

JW: Paul Dirac would be great. Remember we read that great book about him [Graham Farmelo’s The Strangest Man]. But I want to know more about whether you like physics?

DR: I was always excited by space but there was way too much maths in it for me to ever feel truly at home. I’m interested in it now though – absolutely. You know I always watch science shows and listen to podcasts. I guess I’d say I’m an enthusiast but I’m not informed. Maybe I got it from my teachers at school and my tutors on set. Even though I wasn’t great, they got me interested. But I think pretty much across the board, every subject I didn’t think I was good at when I was at school, I’m fascinated by now. I’m fascinated by mathematics. I don’t understand anything about mathematics, but I love hearing people talk about it. It blows my mind.

Clingfish inspires suction cups for underwater robots

By mimicking how a tiny fish clings to rocks and other objects, researchers in California have made suction cups that adhere to rough surfaces in air and underwater. The team also showed how a robotic arm fitted with such a suction device can manipulate delicate objects such as a strawberry and a raw egg. They hope that their design could be used by deep-sea remotely operated vehicles (ROVs) for the recovery of fragile archaeological specimens and brittle marine samples.

Suction cups work well on smooth surfaces such as car windscreens, where the pressure difference they rely on can be maintained for a very long time. Rough surfaces are much more challenging because creating an effective seal is difficult.

Evolution has solved this problem for sea creatures that use natural suction cups to cling onto rugged rocks both above and beneath the waves. Trying to mimic these natural structures has long been an active area of research in soft robotics — however, scientists have only recently begun investigating the passive mechanism by which the northern clingfish avoids being tossed about by intertidal surges.

Michael Tolley’s soft robotics group at the University of California, San Diego, began to look at clingfish adhesion when PhD student and ROV pilot Jessica Sandoval shared her frustrations about gathering objects underwater.

Softness built in

“[ROVs] have rigid manipulators that don’t have much fine tune control,” said Tolley. “It started us thinking, can we do manipulation underwater with some softness built into the system to delicately handle things?”

Tolley struck up a collaboration with Dimitri Deheyn, a marine biologist at the Scripps Institution of Oceanography in nearby La Jolla. With Deheyn’s guidance, Sandoval dissected clingfish specimens from the Scripps’ extensive fish collection, and some fresh specimens collected along the San Diego coastline. They examined the structure of the clingfish’s suction cup using various optical techniques, identifying four core features likely to be involved in the adhesive strategies employed by the clingfish.

The team then studied how these features — slits, a soft sealing layer, microfibrils and the shape of the cup — impacted adhesion. They fabricated 25 mm diameter silicone suction cups with different combinations of these features and tested them in different scenarios.

Secret to suction

The tests involved applying a relatively small force to attach a cup onto surfaces of varying roughness, both underwater and in the air. Then the force needed to remove the cup was recorded.
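For cups of this size, the available pressure difference puts a hard ceiling on the pull-off force, since the force is simply the pressure difference multiplied by the sealed area (F = ΔP × A). A quick estimate under the assumption of a perfect seal at one atmosphere:

```python
# Upper-bound estimate of the pull-off force for a 25 mm suction cup,
# assuming a perfect seal and one atmosphere of pressure difference.
import math

diameter = 0.025                 # m, cup diameter used in the study
area = math.pi * (diameter / 2) ** 2

delta_p = 101_325                # Pa, at most one atmosphere in air
f_max = delta_p * area           # F = delta_P * A

print(f"Sealed area: {area * 1e4:.1f} cm^2")
print(f"Maximum holding force in air: {f_max:.0f} N (about {f_max / 9.81:.1f} kg)")
# Underwater the ambient pressure, and so the achievable pressure difference, grows
# with depth, but only if the seal holds on the rough surface, which is exactly
# what the soft sealing layer is for.
```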

“The commercial suction cup always did better on flat surfaces, but with any sort of roughness our prototypes did much better. It’s an exciting start, but we’ve not yet reached the actual performance of the clingfish suction disc,” explains Tolley.

Different combinations of shape and slits performed best in water and in air, but the soft sealing layer proved essential for adhesion to rough surfaces both below and above water.

Another feature thought to help clingfish maintain a seal on challenging rough surfaces is the dense bed of microfibrils, or “micropapillae” – tiny soft protuberances that line the cup perimeter. The team mimicked these micropapillae by adding silicone micropillars to their cups.

Somewhat surprised

“We were somewhat surprised the microstructures weren’t an improvement on the soft sealing layer,” said Tolley. “But we looked only at structure and material properties, while the clingfish has other features, like mucus secretion, that could affect papillae adhesion.”

The team followed up these investigations by examining the impact of curved surfaces and analysing how slits in the suction cup enabled it to conform to concave surfaces.

The researchers then turned their attention to demonstrating what a suction cup could do by attaching it to the end of a robotic arm and handling a variety of delicate fresh foods, such as tomatoes and strawberries, in air. Tests were also done underwater, where the arm picked up several objects including a crab and a knobbly vase. Finally, with her pilot hat on, Sandoval used an ROV arm with a suction-cup attachment to handle a raw egg without breaking it.

High pressure

Nicola Pugno at Italy’s University of Trento, who was not involved in the study, praised the team’s extensive investigations into suction cup performance in different scenarios. Pugno adds that he is intrigued to see how the suction cup, which relies on establishing a pressure differential for suction, would perform when subjected to the high pressures ROVs experience on the ocean floor.

The team is keen to perform further underwater tests and plans to study live clingfish to find out how suction cups are actively altered according to the environment.

“I see these types of adhesive components as being a very specific piece of the puzzle that fits into a larger soft robotic system,” said Tolley, who hopes to combine adhesion with other work in his team on pneumatics and smart muscles, to create robots with greater utility.

The research is described in Bioinspiration & Biomimetics.
