Sodium nanofluid helps extract heavy oil

Zhifeng Ren

With the world’s reserves of light oil diminishing, oil companies are increasingly turning to heavier varieties – which make up 70% of global oil reserves – to meet rising energy demands. Existing extraction technologies for heavy oil are, however, inefficient, expensive and damaging to the environment in ways that extend beyond the CO2 emissions produced when the oil is eventually burned. A team of researchers from the University of Houston in the US have now developed an alternative method that uses an inexpensive, non-toxic sodium-based nanofluid to extract heavy oil from reservoirs, with an efficiency of 80% in laboratory tests.

Reserves of heavy oil – so-called because it has a viscosity of more than 100 centipoise, or cP – and extra-heavy oil, with a viscosity of more than 10 000 cP, are currently being exploited in several areas, notably Canada and Venezuela. The high viscosities of these grades of oils make them difficult to recover, and two main technologies are currently used to do it. The first relies on extracting oil from surface sands using hot water and air bubbles, then diluting it with solvents such as n-pentane or n-heptane. This method has been used for decades, but it requires large quantities of water and is thus not suitable for all locations.

Since most heavy oil resources are located below the surface, the second, in situ, recovery method is more common. In recent years, researchers have developed both thermal and non-thermal variants of these in situ techniques. The latter, which include cold production with sands, vapour extraction and chemical injection, can be used for thin layers of oil formations. The problem is that these techniques are limited to shallow layers and relatively light (< 200 cP) viscous oils.

Alternative techniques sought

While thermal methods like in-situ combustion and steam flooding can recover oil with higher viscosities (with recovery efficiencies potentially exceeding 70%), they are only economically viable when oil formations are thick. Producing the steam they require also consumes fuel such as natural gas, leading to additional CO2 emissions.

To overcome these challenges, researchers have begun to explore alternative techniques, including some that use nanomaterials. Such approaches are not yet mainstream, and so far the nanomaterials developed have only played a secondary role – for example, modifying the flow behaviour (rheology) of crude oil or being used as catalysts to upgrade crude oil during steaming processes.

80% recovery efficiency

A team led by Zhifeng Ren, who is director of the Texas Center for Superconductivity at the University of Houston (TcSUH), has now used a sodium (Na) nanofluid to recover over 80% of extra-heavy oil (with a viscosity of over 400 000 cP) from sand packs in a laboratory experiment. The researchers made this nanofluid by using a hand-held blender to mix commercially-available bulk sodium chunks with silicone oil. This non-reactive solvent is necessary because (as we all remember from chemistry classes at school) sodium reacts very strongly when it comes into contact with water.

When the researchers do add water to their sodium nanofluid, it reacts according to the formula 2Na + 2H2O → 2NaOH + H2. The substantial amount of heat produced in this reaction generates steam, which then reduces the viscosity of the heavy oil in a way that is similar to older steam flooding techniques – with the advantage that the sodium reaction is relatively easy to control and can be kick-started in situ simply by adding water. The sodium nanomaterials also dissipate once the reaction is complete, which means that the oil is not damaged by adsorbing the sodium. Finally, the technique does not use any fossil fuels to generate steam or hot water, thereby reducing the direct CO2 emissions from the oil-extraction process.
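To get a sense of scale for the heat and steam involved, a back-of-the-envelope estimate is possible using textbook thermochemical values (these numbers are assumptions for illustration, not figures from the paper):

```python
# Rough energetics of 2Na + 2H2O -> 2NaOH + H2.
# All values below are assumed textbook figures, not data from the study.
dH_per_mol_Na = 184e3   # J released per mole of sodium (~184 kJ/mol, assumed)
M_Na = 0.023            # kg/mol, molar mass of sodium
L_vap = 2.26e6          # J/kg, latent heat of vaporization of water at 100 °C

heat_per_kg_Na = dH_per_mol_Na / M_Na      # heat released per kg of sodium
steam_per_kg_Na = heat_per_kg_Na / L_vap   # upper bound: ignores heating the water to 100 °C

print(f"{heat_per_kg_Na / 1e6:.1f} MJ/kg Na, "
      f"up to {steam_per_kg_Na:.1f} kg of steam per kg Na")
```

Under these assumed values, each kilogram of sodium releases roughly 8 MJ, enough to boil off about 3.5 kg of water that is already at 100 °C – a useful intuition for why a small amount of dispersed sodium can generate significant in situ steam.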

“Based on these advantages, we anticipate that the sodium nanofluid could become a game-changing technology for recovery of oil of any viscosity, as well as a milestone in using nanotechnology to solve oil-recovery problems in the petroleum industry,” Ren and colleagues say.

Side benefits

The researchers note that mixing their sodium nanoparticles into silicone oil allowed the particles to disperse throughout the simulated reservoir before coming into contact with water. This dispersion has the effect of triggering smaller chemical reactions across a larger area. The team says it may also be possible to disperse the sodium nanoparticles in other solvents, such as pentane and kerosene, or even to mix them with polymers or surfactants for a higher oil recovery rate.

As a final benefit, the researchers note that one of their reaction’s products, NaOH (sodium hydroxide), is routinely employed in oil fields to spark a different reaction that also reduces oil viscosity – a technique known as alkaline flooding. The other reaction product, H2, could be used for gas flooding – another common oil recovery technique – as well as for upgrading the heavy oil via a hydrogenation reaction.

The new technique is detailed in Materials Today Physics.

Very high-energy electrons could treat tumours deep within the body

Very high-energy electrons (VHEEs), typically defined as those above 40 MeV, provide a potential new radiotherapy modality with dosimetric advantages. Beams of such electrons penetrate deep into the patient, enabling treatment of deep-seated tumours that conventional, lower-energy electron beams cannot reach.

Speaking at the Medical Physics & Engineering Conference (MPEC), Louie Hancock from the University of Manchester described the recent resurgence in interest in VHEE radiotherapy. “Over the last couple of decades, new linac designs mean that it’s now possible to produce roughly 200 MeV electrons in about two to three metres, whereas before it might have taken 20 metres or so,” he explained. “This has spurred interest in using these VHEEs for treating deep-seated tumours.”

Hancock noted that a linac-based VHEE treatment system should be compact enough to fit into a hospital bunker. “It’s much cheaper to put a machine in an existing bunker than to build a new building,” he pointed out. “I expect that VHEEs will probably be more expensive than photons to produce, but cheaper than protons.”

While there are currently no clinical systems available, there are electron accelerators for research use, such as the high-gradient X-band linac at CERN’s CLEAR facility and the CLARA electron accelerator at Daresbury Laboratory. In the meantime, Monte Carlo simulations can provide insight into VHEE treatments without having to actually build a machine.

Depth–dose curves show that, as well as delivering dose deep inside a patient, VHEEs should also be extremely resilient to changes in patient geometry. To confirm this, Hancock performed a back-of-the-envelope calculation for a simple block of water containing a cylindrical 5 cm tumour in its centre. He simulated treatment of the tumour by rotating a beam around the target, and then introduced a cavity above the tumour (a 5 cm sphere of bone or air) to calculate its impact on the tumour dose.

For treatments simulated using 1.3 MeV X-rays, the introduction of an air bubble resulted in hot spots of about 15%, while an unexpected bony region led to cold spots of about 10%. For 250 MeV VHEEs, while hot and cold spots were seen, they were only of about 2%. “In this simple situation, VHEE appears to be nearly an order of magnitude more resilient to unexpected inhomogeneities when compared with X-rays,” said Hancock.

But will this advantage translate to patients? To find out, Hancock and colleagues examined a clinical case, comparing treatment plans for VHEE and volumetric-modulated arc therapy (VMAT).

VHEE treatment planning is a highly complex process, with millions of variables creating a huge optimization problem. As such, the team at Manchester has developed an open-source treatment planning system for VHEEs that incorporates various tools to create a full planning workflow. This includes organ identification using Slicer 3D software, Monte Carlo dose calculations using Geant4, and then optimization of the generated dose profiles with Python software written by Hancock.
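The optimization step can be illustrated with a toy version of the inverse-planning problem. The sketch below is not the Manchester group’s code: the 3-voxel, 3-beamlet dose-influence matrix and prescription are invented, and the solver is a simple projected-gradient loop that keeps the beamlet weights non-negative.

```python
# Toy inverse planning: choose non-negative beamlet weights w so that the
# delivered dose (D applied to w) matches a prescribed dose per voxel.
# D and the prescription are hypothetical; real plans have millions of variables.
D = [  # dose to voxel i per unit weight of beamlet j (made-up numbers)
    [1.0, 0.2, 0.0],
    [0.3, 1.0, 0.3],
    [0.0, 0.2, 1.0],
]
prescription = [2.0, 2.0, 2.0]  # target dose per voxel, in Gy

w = [0.0, 0.0, 0.0]  # beamlet weights, constrained to be >= 0
lr = 0.1             # gradient-descent step size
for _ in range(2000):
    # residual = D w - prescription
    residual = [sum(D[i][j] * w[j] for j in range(3)) - prescription[i]
                for i in range(3)]
    # gradient of 0.5 * ||residual||^2 with respect to w is D^T residual
    gradient = [sum(D[i][j] * residual[i] for i in range(3)) for j in range(3)]
    # gradient step, then project back onto the feasible set w >= 0
    w = [max(0.0, w[j] - lr * gradient[j]) for j in range(3)]

dose = [sum(D[i][j] * w[j] for j in range(3)) for i in range(3)]
print([round(d, 3) for d in dose])
```

Real planning systems layer organ-at-risk penalties and dose-volume constraints on top of this basic least-squares core, which is what makes the full problem so large.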

Using their VHEE code, the researchers created a treatment plan for a patient with cervical cancer, which included two large target sites to irradiate and many nearby organs to avoid. They compared this with a VMAT plan created using Monaco. The two plans delivered identical dose to the target, while nearby organs (the sigmoid colon, bowel and bladder) received slightly less dose from VHEE than VMAT. For organs further from the tumour, VHEE conferred significant dosimetric advantages, particularly to the femoral heads where it reduced the delivered dose from 35 to 15 Gy.

“We achieved coverage of all of the tumour with VHEEs, so can say for sure that these electrons are capable of treating a large tumour deep inside the patient,” said Hancock. “We could also see that the low-dose background was reduced compared with the VMAT plan.”

To examine the impact of geometric changes, the team re-simulated the treatment plans with the rectal cavity filled with air instead of water. The error between the two VHEE plans was roughly 0.15 Gy, while for the X-ray plans it was about 0.7 Gy. “In the simple water phantom, we saw a dose error nearly an order of magnitude lower with VHEEs, now we’ve seen this effect in silico in an actual patient case,” explained Hancock.

The next step, he said, will be to repeat the analysis with lung and brain cases, and “move towards putting real things in real beams”. The real thing is the MARVIN human head-and-neck phantom, while the real beams will be delivered by CLARA at Daresbury and CLEAR at CERN. These two facilities provide a wide span of electron energies and will enable VHEE measurements in realistic environments.

Preliminary simulations for planned experiments using a 45 MeV electron beam at CLARA to irradiate MARVIN demonstrated a near-uniform dose over a large area inside the phantom. Hancock noted that the study is currently on hold due to the pandemic — although there is a potential to perform these experiments in Q1 2021.

Hancock concluded that the clinical test case, using the team’s VHEE treatment planning software, demonstrates that VHEE radiotherapy has the ability to treat tumours that are both deep and large. “VHEE might well also be more insensitive to inhomogeneities and changes in patient geometry than photons, which I think is likely to be clinically beneficial,” he added.

Excitement grows over mysterious signal in dark-matter detector

In June this year, physicists working on the XENON1T dark-matter detector announced the measurement of a curious signal in their experiment – which comprises two tonnes of ultrapure xenon. The signal had a statistical significance of 3.5σ or less, which is well below the 5σ level that is usually required for a discovery in particle physics.

In a preprint published at the time, the team suggested that the signal – an excess in low-energy electron recoil events – could have three explanations. The most mundane is that it was caused by contamination of the ultrapure xenon by radioactive tritium.

A more intriguing explanation, they said, is that the signal is the detection of hypothetical particles called axions that could be emitted by the Sun. The third possibility is that the excess is caused by neutrinos interacting with the xenon in an unexpected way – which would also be very interesting.

Now, the original preprint has been published in Physical Review D and over in Physical Review Letters, five theoretical papers put forth a range of tantalizing explanations for the excess.

Axionlike particle

Working in Japan, Fuminobu Takahashi, Masaki Yamada and Wen Yin say that the signal could be related to a hypothetical axion-like particle (ALP) with a mass of a few keV/c2 that interacts with electrons. As well as explaining the XENON1T signal, such an ALP could be a constituent of dark matter and its existence could explain an anomaly in the observed cooling of white dwarf and red giant stars.

Meanwhile, in Germany, Andreas Bally, Sudip Jana and Andreas Trautner reckon the mystery signal could be the work of a hypothetical gauge boson that mediates a new interaction between solar neutrinos and electrons.

A paper written by Nicole Bell et al. makes the case for a “relatively low-mass luminous dark-matter candidate” as the source of the excess. They suggest that this dark-matter particle could enter the detector in a “light state” and be scattered into a “heavy state” that would decay by emitting a photon. This photon would then interact with an electron in the detector to create the observed signal.

Galactic boost

Another dark-matter proposal comes from Bartosz Fornal and colleagues who suggest that otherwise sluggish cold-dark-matter particles could get a boost of energy from the galactic centre and collide with XENON1T electrons.

The fifth idea comes from Joseph Bramante and Ningqiang Song in Canada, who argue that the signal could come from the scattering of a type of dark matter that is a thermal relic from the early universe.

They can’t all be right, and it will be very interesting to see if a similar signal shows up in future dark-matter experiments.

Quantum spin liquid candidate becomes a superconductor under pressure

Researchers in China report that they have observed both superconductivity and an insulator-to-metal transition in sodium ytterbium (III) selenide (NaYbSe2) simply by applying pressure to it. This inorganic substance, which is also a quantum spin liquid (QSL) candidate, could therefore become a new platform for investigating superconductivity in compounds that have f-electrons in their orbitals, and for exploring the mechanisms of unconventional superconductivity in these materials.

Quantum spin liquids (QSLs) are solid magnetic materials that cannot arrange their magnetic moments (or spins) into a regular and stable pattern. This behaviour contrasts with that of ordinary ferromagnets, in which all the spins point in the same direction, or antiferromagnets, in which the spins point in alternating directions. Instead, the spins in a QSL constantly change direction in a fluid-like way – even at ultracold temperatures near absolute zero – and are thus said to be “frustrated”.

The late physicist and Nobel laureate Philip W Anderson proposed the existence of QSLs in the early 1970s, when he was studying the ground state of antiferromagnetically interacting spins on a triangular crystal lattice. Although Anderson did not follow up the idea at the time, he returned to it in 1986 after the discovery of high-temperature superconductivity in copper oxides (cuprates). A year later, his work bore fruit when he uncovered a potentially crucial link between QSL theory and “unconventional” high-temperature superconductivity. He described this link in the so-called resonant valence bond theory.

“Parent states”

Today, QSLs are thought to be the “parent state” for high-temperature unconventional superconductivity in cuprates. This class of superconductors could have applications in many areas, including energy grids, levitating transport and even quantum computing, but the physics underlying them is still not very well understood. Studying QSLs is thus important for condensed-matter physicists if these applications are to see the light of day.

A team of researchers jointly led by Run-Ze Yu and Chang-Qing Jin of the Institute of Physics at the Chinese Academy of Sciences, and He-chang Lei of Renmin University of China in Beijing, used a diamond anvil cell to measure the electronic conductivity of NaYbSe2 under varying high pressures (up to 126 GPa) and a high magnetic field. This crystalline material features a triangular lattice of 4f-orbital Yb3+ ions bonded to six equivalent Se2- atoms to form YbSe6 octahedra. These YbSe6 octahedra share corners with six equivalent NaSe6 octahedra, while sharing edges with six equivalent YbSe6 octahedra.

The researchers found that their sample acts as a paramagnetic insulator (a material with a permanent magnetic dipole moment) at applied pressures below 8 GPa. As the pressure increases from 8 GPa to 50 GPa, the material remains an insulator, but its resistance decreases by nearly eight orders of magnitude. A metallic phase is eventually observed at about 60 GPa and further increases in pressure lead to the emergence of a superconducting state at 103 GPa, with a superconducting transition temperature of 8 K.

Mott transition

This transition from insulator to conductor is known as a Mott transition. Because the material contains exotic spin excitations carrying fractional quantum numbers at low energies, its metallic state behaves like a non-Fermi liquid – meaning that its electrical resistivity depends linearly on temperature. “The origin of the superconductivity should therefore be exotic,” Yu explains. “This implies that the relationship to the resonant valence bond theory proposed by Anderson for copper oxide superconductors needs to be further investigated in experiments.”

The fact that the researchers observed such a transition by simply applying pressure means that it is an intrinsic, physical property of the material, free from chemical transformations that are usually introduced by doping. And since Yb3+ ions contain f-electrons, Yu adds that the NaYbSe2 system provides a new platform to investigate unconventional superconductivity mechanisms in such compounds. “The work will certainly help researchers gain a deeper understanding of the interplay between the QSL and unconventional superconductivity,” he tells Physics World.

Members of the team, who report their work in Chinese Physics Letters, say they will now study the nature of the insulator-to-metal transition in NaYbSe2 in depth, as well as the intrinsic physical properties of superconductivity in this QSL. “We will also be focusing on the possibility of ‘heavy fermion’ phenomena in this compound,” Yu adds. Such phenomena, he explains, arise when the conduction electrons, which are fermions, move as if they were hundreds of times more massive than electrons in conventional metals, like copper. This large “effective mass” comes about thanks to strong electron-electron interactions, which are also thought to play an important role in high-temperature superconductors.

Comprehensive machine targeting and image quality QA with the QUASAR Penta-Guide system

The QUASAR™ Penta-Guide Phantom is recognized globally as the preferred tool for commissioning and daily testing of Image-Guided Radiotherapy (IGRT) systems. Modus QA is excited to reveal Penta-Guide 2.0 software, a comprehensive Daily QA solution that is free for all existing and new Penta-Guide users.

This webinar, presented by Rocco Flores, will highlight the advanced features of the Penta-Guide Phantom, review the efficient daily QA user workflow and reveal the advanced utility of the included Penta-Guide 2.0 software.

Rocco Flores MRT(T) is a product manager with Modus QA. He has worked at several clinics over 20+ years as a medical radiation therapist with a focus on treatment delivery, planning and research. Since joining Modus QA in 2019, Rocco has been primarily involved in new product development and clinical application support.

Pathways to scalable quantum technologies

This episode of the Physics World Weekly podcast features interviews with two leaders in the race to build practical quantum computers.

Michelle Simmons is director of Australia’s Centre of Excellence for Quantum Computation and Communication Technology. She talks about how her early work on fabricating solar cells kindled a passion for building electronic devices that she now pursues by leading a research group at the University of New South Wales that is building solid-state quantum computing devices at the atomic scale.

Jelena Vučković is Professor of Electrical Engineering and, by courtesy, Professor of Applied Physics at Stanford University. There, she leads the Nanoscale and Quantum Photonics Lab and focuses on using impurities in diamond to create quantum devices. Vučković talks about the challenges involved in creating scalable quantum computers and also reflects on the roles that engineers and physicists play in the development of quantum technologies.

Vučković and Simmons are plenary speakers at the Quantum 2020 Virtual Conference, which is being held on 19–22 October.

Air-breathing rocket engines: the future of space flight

The pursuit, exploration and utilization of the space environment can be misinterpreted as a luxury. History portrays space as an exclusive domain for global powers looking to demonstrate their prowess through technological marvels, or the stage for far-off exploration and scientific endeavour with little impact on daily life. However, the benefits of space are already woven into our everyday routines and provide utilities and resources on which society has grown dependent. If these were suddenly to disappear and the world were to experience just “a day without space”, the consequences would be evident to all.

Our biggest space-based resource is the satellites we’ve put in orbit around the Earth. Communication satellites provide global connectivity and the means to transmit live television around the world. We can remotely watch international events and sporting spectacles in real time, thanks to this space-borne infrastructure. Earth observation is also of growing importance, letting us monitor and assess our natural habitat and climate, which in turn enables us to optimize agricultural land use, to predict and intervene in natural disasters, to organize relief efforts, and more besides.

Position, navigation and timing (PNT) satellites provide valuable location data for drivers and outdoor enthusiasts, while their timing signals are also used to timestamp and co-ordinate global cash withdrawals and financial transactions. Satellites provide platforms for science and technology, and they can give universities access to orbital experimentation. Companies are learning to rapidly develop technology through prototypes placed in orbit, while research into space weather using in-orbit instruments provides improved understanding and forecasting to protect our Earth-bound power grids from solar storms.

The utilization of space is set to become more important still. A new vision for the future is starting to emerge that will feature even more innovative uses of space, ranging from space-based manufacturing and energy production to global Internet connectivity. Space-debris management is also receiving greater focus alongside lunar and Martian exploration, and even space tourism.

While some of these new innovations may sound like they are confined to the realm of science fiction, there are already companies furthering the technology to turn them into reality.

A concept image of Reaction Engines’ Synergetic Air Breathing Rocket Engine (SABRE).

Affordable space access

One of the fundamental challenges associated with exploring and working in space is that whatever is up there, starts down here. Escaping Earth’s gravity and achieving orbit is technically difficult, operationally complex and financially exclusive.

The tide is turning, however, and the world is in the midst of a transformational era in the history of spaceflight. An activity once limited to governments and national space agencies is now witnessing a flourish of innovation led by entrepreneurial private companies such as Rocket Lab, Virgin Orbit and SpaceX, the latter of which was in the global spotlight earlier this year as the world watched the US Commercial Crew Programme launch American astronauts to the International Space Station and then return them safely. By optimizing conventional rocket technology, these firms are rapidly reducing the cost of space access and generating launch capacity. This in turn is creating opportunities for new space operators and is beginning to initiate a virtuous circle, in which reduced launch costs and increased flight rates create additional commercial opportunities and more demand for launch services. Indeed, the investment bank Morgan Stanley estimates that the global space market will be worth $1 trillion per year by 2040.

While this cycle could significantly reduce launch costs even further, it will be limited if the global space-access industry remains focused on optimizing conventional rocket technology, which was first used in the mid-20th century. That’s why we here at Reaction Engines in the UK (see box below) are developing the Synergetic Air Breathing Rocket Engine (SABRE) – what we think will be the next generation of space-propulsion technology. Our aim is to enable horizontally launched reusable space vehicles that are affordable, reliable and responsive, and can be launched at a high and regular frequency.

SABRE versus conventional rockets

Conventional rocket vehicles are propelled by a fuel (liquid hydrogen, kerosene or methane) and an oxidizer (liquid oxygen) carried within the vehicle body. When the fuel and oxidizer combust, mass is projected out of the back of the rocket, creating thrust. However, this approach – and especially the use of heavy on-board liquid oxygen – is constrained by Tsiolkovsky’s rocket equation. It basically tells us that everything carried on board a vehicle incurs a penalty in the form of the additional propellant and structural mass needed to get it off the ground. In other words, this approach hampers mission performance, mission payload and mission time.
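In its standard form, the rocket equation relates the achievable change in velocity Δv to the effective exhaust velocity and the vehicle’s initial (fuelled) and final (empty) masses:

```latex
\Delta v = v_{\mathrm{e}}\,\ln\!\left(\frac{m_0}{m_f}\right)
```

Because the mass ratio m0/mf sits inside a logarithm, every extra kilogram of oxidizer or structure demands disproportionately more propellant to reach the same Δv – which is why shedding the on-board oxygen for part of the ascent is such a powerful lever.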

SABRE, on the other hand, is a hybrid air-breathing rocket engine. During the atmospheric segment of its ascent, it will use oxygen from the atmosphere instead of carrying it inside the vehicle, before switching to on-board oxygen upon leaving the atmosphere. A SABRE-powered launch vehicle will therefore have lower mass for a given payload than a conventional rocket vehicle. This mass benefit can be traded for systems that will enable reusability and aircraft-like traits, such as wings, undercarriage and thermal-protection systems – all the features needed to fly the same vehicle over and over again, achieving hundreds of launches.

Reusability will not only drive down the cost of the launch. SABRE-powered launch vehicles will also take off like aircraft instead of launching vertically like conventional rocket vehicles. As a result, they will introduce faster turnaround times, higher vehicle utilization rates and more responsive launches. They will be able to undertake safe abort and return-to-base scenarios. The design of SABRE-powered launch vehicles will also allow more rapid set-up and simpler launch facilities compared with current vehicle designs.

At Reaction Engines, we think these characteristics are far beyond what we see from even the most advanced expendable systems currently available – and that they therefore will unlock the full virtuous cycle for space access and the greater potential of the space economy.

The precooler

The key element within SABRE is its unique, high-performance thermal-management system, which relies on fundamental thermodynamics to extract, redirect and utilize the enthalpy of a hypersonic (Mach 5) airstream as it enters the engine. One of the most important parts of this system is the precooler, which was therefore one of the first pieces of SABRE technology to be developed.

The speed of existing air-breathing engines is limited by their ability to handle and exploit the energy contained within high-Mach air streams. To create thrust from an air-breathing engine you have to increase the speed of the air that passes through it. Counterintuitively, you also have to slow the air down when you reach high speeds so that the internal machinery can do work on the airflow before accelerating it out of the back of the engine.

However, when fast-moving air slows down, it rapidly heats up as kinetic energy is converted into thermal energy. For example, these temperatures can reach over 1000 °C when slowing a Mach 5 air stream. At such high temperatures, it is not possible to maintain the integrity of conventional engine components – quite simply, they melt.

SABRE heat exchanger

Together, SABRE’s thermal management system and the precooler provide a solution to this problem. When high-Mach air enters SABRE, it is first slowed down by the intake of the engine through a series of shockwaves created by the geometry of the engine’s components. As this happens the air rapidly heats up, but is then passed into the precooler where its temperature is reduced to manageable levels. The precooler has been designed to create heat transfer between the air stream and an internal fluid medium (cryogenic helium). Geometry, fluid properties, and thermal and mechanical effects have all been considered so as to maximize the extraction of heat from the air stream.

With the air stream suitably cooled, it can now enter the heart of the engine, where it goes through a cycle involving compression, combustion, regeneration and, ultimately, expansion through the engine’s nozzle thereby creating propulsive force. The air stream’s thermal energy, which has been transferred to the precooler fluid, is also used to drive the internal components of the engine.

The precooler can cool high mass-flow airstreams from temperatures above 1000 °C to ambient in less than 50 milliseconds, within a compact and low-weight design. It is formed from over 42 km of tubing, the walls of which are thinner than a human hair – thereby providing an enormous area for heat-transfer between the air and cooling medium.
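Those numbers imply a very large heat-exchange surface. As a rough order-of-magnitude check (the 42 km of tubing is from the article, but the ~1 mm tube diameter is an assumption for illustration; only the wall thickness, “thinner than a human hair”, is actually stated):

```python
import math

tube_length = 42_000.0  # m of tubing (figure from the article)
tube_diameter = 1e-3    # m, assumed outer diameter (hypothetical value)

# Lateral surface area of a cylinder: A = pi * d * L
area = math.pi * tube_diameter * tube_length
print(f"~{area:.0f} m^2 of heat-transfer surface")  # ~132 m^2 under these assumptions
```

Packing on the order of a hundred square metres of surface into a compact, lightweight unit is what lets the precooler dump so much heat in tens of milliseconds.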

Putting it to the test

In 2012 Reaction Engines manufactured a fully operational precooler and tested it more than 700 times, verifying the ability to take ambient air down to cryogenic temperatures. This prototype unit accumulated more test time than an operational SABRE precooler would be expected to see, and performed impeccably throughout.

Next came the Hot Heat Exchanger (HTX) test campaign, which was designed to subject the precooler to a range of high-temperature conditions representative of high-Mach flight. Carried out in 2019, the testing took place at a specially constructed facility at the Colorado Air and Space Port near Denver in the US. The test set-up featured a conventional fighter aircraft engine running on full afterburner in order to create the high-mass air flow and high-temperature conditions a SABRE precooler will experience in flight after stagnating (or slowing down) a Mach 5 airstream.

Three hot test campaigns were conducted, with each achieving a higher equivalent Mach number. The final round of tests saw both the precooler and test equipment pushed to their limits, and successfully deliver the ultimate Mach 5 test objective – it demonstrated that the precooler could quench airflow temperatures in excess of 1000 °C in less than 50 milliseconds.

The tests showed the precooler’s ability to successfully cool airflow at speeds much higher than the operational limit of any jet-engine-powered aircraft in history. It ran at over twice the operational conditions of Concorde and over one-and-a-half times those of the SR-71 Blackbird. This remarkable precooler technology is, therefore, not only key to the SABRE engine but also offers propulsion solutions for high-Mach and hypersonic aircraft that remain within the atmosphere.

The precooler was put through a series of high-temperature tests using an engine from a conventional fighter aircraft.

What comes next?

Since the successful HTX test campaign, the next phase of SABRE testing is now under way in its “core engine campaign”, which aims to validate the performance of the air-breathing core. This part of SABRE is responsible for recycling the thermal energy extracted through the precooler into the engine’s internal components. It is also where the incoming air is compressed, mixed with hydrogen fuel and then burnt within the engine’s combustion systems before being expanded through the nozzle. This stage will prove the viability of SABRE’s entire thermodynamic cycle, and its first demonstration will be a landmark moment for the programme.

While the precise timing of this latest phase of the test campaign has been hit by the COVID-19 restrictions imposed in the UK, an extensive design process has already been conducted in conjunction with the UK and European space agencies, which will enable the programme to progress swiftly once the restrictions are eased. And while the COVID-19 pandemic has undoubtedly affected both the global aerospace sector and the operation of manufacturing and R&D facilities in the UK, demand in the commercial space-launch sector remains robust. Reaction Engines already sees a number of ways that SABRE-class engines could be used in launch-vehicle architectures, and we are working with partners across the space industry to understand how these capabilities can be brought to the fore.

Technology spinout

Despite being originally conceived as an engine for enabling low-cost space access, it’s now clear that SABRE technology could provide benefits beyond the space industry. That’s why Reaction Engines is also developing spin-out applications of the technology with other industry partners.

In the aerospace sector, SABRE’s thermal-management capability and heat-exchanger technology could boost the efficiency of next-generation commercial aircraft engines and systems, as well as high-Mach and hypersonic aircraft. At a time when the aerospace industry must demonstrate resilience in the face of COVID-19, technological paradigm shifts and challenging decarbonization targets, SABRE technology could help by providing a range of efficiency improvements and cost savings for current and future aircraft concepts.

Reaction Engines is investigating how SABRE technology could benefit other sectors too. Motorsport, industrial processes and the energy industry are all areas where intelligent thermal management, such as that at the heart of SABRE, could bring about significant change. Reaction Engines is keen to adapt and deploy its technology into these industries to make them more efficient, more sustainable, and better for the environment.

It is clear that the space industry is going through a period of significant innovation and rapid development. Innovative new commercial entrants have lowered the cost of space access, which has in turn opened up further commercial opportunities in space.

As the industry develops and the demand for ultra-low-cost access to space increases, we believe that there will be a need for the kind of revolutionary leap forward in propulsion that SABRE and its thermal-management systems represent.

Reaction Engines

Reaction Engines is a UK company founded in 1989 by the propulsion engineers Alan Bond, Richard Varvill and John Scott-Scott. They had previously worked together on the RB545 engine, which was destined for use on the Horizontal Takeoff and Landing (HOTOL) system – a space plane concept developed by British Aerospace and Rolls-Royce – in the late 1980s. After HOTOL was cancelled, the team formed Reaction Engines to evolve the HOTOL and RB545 concepts into the SABRE engine class, which the company continues to develop.

After years of fundamental technology development, a UK government grant of £60m in 2015 – coupled with investment from BAE Systems, Rolls-Royce and Boeing HorizonX – has enabled Reaction Engines to grow and transition from research into design and demonstration. Based at the Culham Science Centre in Oxfordshire, the company is working alongside the UK Space Agency, European Space Agency and other organizations in the UK and Europe, to develop SABRE.

Why physics-based firms benefit from green-strings-attached economic-recovery measures

Climate change is a “defining factor” in the long-term prospects of businesses. So said Larry Fink – boss of leading asset-management firm BlackRock – in an open letter he wrote to chief executives around the world in January, shortly before COVID-19 hit pandemic levels. What’s interesting is that as the world emerges from lockdown and governments look to restart their economies with stimulus packages, many are taking the opportunity to attach “green strings” to their support.

The world’s greenest pandemic bailout initiative so far is the European Commission’s €750bn recovery package. Unlike many national COVID-19 support schemes, it requires EU member states that want funds to show they will use the money in line with Europe’s Green Deal to eliminate net greenhouse-gas emissions. The UK has its own Green Recovery Programme, which seeks to drive investment in low-carbon innovation, infrastructure and industries, supporting sectors that increase job creation and decarbonization.

Such programmes essentially involve providing financial support with green caveats attached, which is great news for the “clean technology” sector. However, any such clean-tech improvements will not happen overnight and may take several years to get to market. What’s more, falling profits and reduced business confidence following COVID-19 will lead to companies cutting their spending on research and development, unless appropriate support and encouragement are in place.

That’s why I was delighted to see, here in the UK, chancellor Rishi Sunak increase the Research and Development Expenditure Credit (RDEC) from 12% to 13% to help businesses invest in R&D. More significantly, the UK government is currently holding a consultation on how to refine the scheme to make it more effective at stimulating growth and investment – something that I and others on the Business Innovation and Growth (BIG) group of the Institute of Physics (IOP) have been pushing since it was set up in 2018.

The BIG group and the IOP’s policy team will be responding to the consultation by reminding the UK government that physics-based firms can take a lot longer to develop, especially as scaling up prototype products and processes is risky and expensive. But, once established, such firms have a sustained competitive advantage and often export globally. They also encourage basic science, develop patents, employ talented people and boost manufacturing.

Innovation in action

Given the emphasis on green technology in the post-COVID future, I was therefore pleased that several of this year’s IOP business award winners, which I was involved in selecting, are exactly the kind of clean-tech companies we need.

They included Hirst Magnetic Instruments, a research-led company founded in 1938 that develops and makes equipment to test, measure and produce magnets. Headed by John Dudding, it focuses on the growing market for magnetic-material characterization, which is vital for improving the efficiency of motors in electric vehicles. Hirst’s measurement equipment and magnetizers support materials testing in China and the production lines of companies in the electric-vehicle supply chain.

Meanwhile, one company to win an IOP business start-up award this year was QLM Technology, which is developing low-power tunable-diode LIDAR gas-imaging systems based on infrared single-photon detection. Its prototypes have produced some amazing camera images that show parts-per-million levels of methane in the atmosphere measured at distances of up to 200 m. Methane is the second most important greenhouse gas – being roughly 30 times more potent at trapping heat than carbon dioxide – and there are over a million oil- and gas-well pads around the world, many of which are leaking far more than they should.

QLM’s low-cost, accurate and robust devices are exactly what’s needed for widespread emissions monitoring and meaningful regulation. Indeed, the company is already developing products for industry leaders such as Ametek, BP and National Grid, with plans to deliver industry-ready products early next year.

Perhaps the most intriguing of the IOP’s start-up winners was FeTu, a company set up in 2016 by chief executive Jonathan Fenton, whose Fenton Turbine seems the closest we have got so far to the ideal, closed-cycle reversible heat engine first imagined by thermodynamics pioneer Nicolas Carnot in 1824. The turbine, the firm claims, could replace compressors, air conditioners, fridges, vacuum pumps and heat pumps with efficiency savings across the board.

It sounds like a “too-good-to-be-true” technology, but Fenton has sensibly set out to prove that’s not the case, with some remarkable results. The turbine is complex to describe, but the first version – a “bare-shaft” unit – cuts the energy cost of compressing gases like air by 25%, a result proven in independent tests carried out by researchers at the University of Bath. That’s promising for FeTu given that 10% of all electricity used by companies in Europe goes on compressing air – roughly 80 terawatt-hours of consumption per year.
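A quick back-of-the-envelope check of what that saving would imply at European scale, using only the figures quoted above and assuming the 25% reduction applied across the board:

```python
# Figures quoted in the article
eu_compressed_air_twh = 80.0  # annual European industrial electricity use for air compression
saving_fraction = 0.25        # claimed reduction in compression energy from the bare-shaft unit

# Hypothetical upper bound: every compressor replaced
saved_twh = eu_compressed_air_twh * saving_fraction
print(f"Potential saving: {saved_twh:.0f} TWh per year")  # -> Potential saving: 20 TWh per year
```

Twenty terawatt-hours a year is an upper bound, of course, but it illustrates why even single-digit percentage gains in compression efficiency attract industrial interest.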

What’s more, an “open-cycle” variant of the company’s turbine can, when configured to act as a heat pump, be as efficient as air-conditioning units and refrigerators – despite using just air as the working fluid. It therefore has great environmental potential given that conventional “phase-change” refrigerants are some 2500 times more potent than carbon dioxide as greenhouse gases. FeTu says it is making good progress with customers where the energy saving and environmental advantages are clear. It is also developing a “closed-cycle” variant of the same machine with even greater efficiency improvements in store.

As many of the IOP’s award winners illustrate, plenty of these clean-tech innovations are firmly rooted in physics. I believe that research and development will be critical to ensure both economic and social recovery from the impacts of COVID-19, enabling us to build a greener, healthier and more resilient world. The IOP’s business award winners are therefore well placed to contribute to that endeavour.

Combining elemental/chemical analysis: when micro-XRF meets Raman microscopy

Combining elemental and molecular characterization enables better and faster research in many real-world applications. Fields such as pharmaceuticals, environmental science and geochemistry can benefit from this complementarity of techniques. Micro-XRF maps elemental distributions over large areas without any compromise on sample preparation, while Raman microscopy can map molecular heterogeneity under the same conditions.

In this webinar, Thibault Brulé and Jocelyne Marciano will demonstrate how the combination of these two unique techniques can fully characterize the organic and inorganic layout of samples such as tablets, rocks or pollutant particles on filters.

Thibault Brulé is a Raman application scientist at HORIBA France, working in the Demonstration Centre at the HORIBA Laboratory in Palaiseau. He is responsible for providing Raman-spectroscopy applications support to key customers from various industries, as well as contributing to HORIBA’s application strategies. Prior to joining HORIBA in 2017, he conducted research on the characterization of blood proteins using dynamic surface-enhanced Raman spectroscopy, later applying the technique to cell-secretion monitoring. Thibault holds an MSc from the University of Technology of Troyes, completed his PhD at the University of Burgundy and followed on with a postdoctoral fellowship at the University of Montreal.

Jocelyne Marciano is an application scientist at HORIBA France, working in the Demonstration Centre at the HORIBA Laboratory in Palaiseau. She is responsible for testing and demonstrating the different types of elemental analysers (EMIA-EMGA, SLFA, MESA and XGT) that are part of the HORIBA Japan product portfolio. Prior to joining HORIBA in 2008, Jocelyne spent 17 years working for Saint-Gobain as a key specialist in glass-product analysis. She has in-depth experience in X-ray fluorescence and gas analysers, as well as in XPS, SEM-EDX and µ-probe techniques.

Proton-coupled electron transfer in electrochemistry

Proton-coupled electron transfer (PCET) reactions play a vital role in a wide range of electrochemical processes. This talk will focus on theoretical studies of molecular and heterogeneous electrocatalysis, highlighting the interplay between theory and experiment. A general theory of PCET that enables the calculation of rate constants, current densities, Tafel slopes, and kinetic isotope effects has been developed.
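As a reference point for the quantities named above, the Tafel slope b connects overpotential to current density. The form below is the standard high-overpotential limit of Butler–Volmer kinetics from textbook electrochemistry, not the full PCET rate expression developed in the talk:

```latex
% Tafel relation in the high-overpotential limit of Butler-Volmer kinetics
\eta = b \, \log_{10}\!\left(\frac{j}{j_0}\right),
\qquad
b = \frac{2.303\, R T}{\alpha F}
\approx 118~\mathrm{mV/decade} \quad (\alpha = 0.5,\ T = 298~\mathrm{K})
```

Deviations from this classical slope, and their dependence on isotope substitution, are among the experimental observables that a microscopic PCET theory must reproduce.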

The application of this PCET theory to molecular electrocatalysts designed for water splitting illustrates the use of multi-proton relays to transport protons along hydrogen-bonded networks, as well as the ability to tune redox potentials through substituents and other molecular design strategies.

The application of this PCET theory to proton discharge on a gold electrode provides insight into the fundamental principles underlying this process in acidic and alkaline aqueous solution as well as acetonitrile, and explains experimentally observed potential dependent kinetic isotope effects.

Recent periodic density functional theory calculations of electric fields at electrochemical interfaces and PCET at graphite-conjugated acids elucidate the impact of the applied potential on the electronic structure and interfacial fields.

The insights from these theoretical studies are useful for the design of both molecular and heterogeneous electrocatalysts to control the movement and coupling of electrons and protons for energy-conversion processes.

Sharon Hammes-Schiffer received her BA in chemistry from Princeton University in 1988 and her PhD in chemistry from Stanford University in 1993, followed by two years at AT&T Bell Laboratories. She was the Clare Boothe Luce Assistant Professor at the University of Notre Dame from 1995 to 2000, then the Eberly Professor of Biotechnology at Pennsylvania State University until 2012, when she became the Swanlund Professor of Chemistry at the University of Illinois Urbana-Champaign. Since 2018, she has been the John Gamble Kirkwood Professor of Chemistry at Yale University.

Her research centres on the investigation of charge transfer reactions, proton-coupled electron transfer, nonadiabatic dynamics, and quantum mechanical effects in chemical, biological, and interfacial processes. Her work encompasses the development of analytical theories and computational methods and applications to experimentally relevant systems. She is a Fellow of the American Physical Society, American Chemical Society, American Association for the Advancement of Science, and the Biophysical Society. She is a member of the American Academy of Arts and Sciences, the US National Academy of Sciences, and the International Academy of Quantum Molecular Science. She has received the American Chemical Society Award in Theoretical Chemistry, the Royal Society of Chemistry Bourke Award, and the Joseph O Hirschfelder Prize in Theoretical Chemistry.

She was the deputy editor of The Journal of Physical Chemistry B and is currently the editor-in-chief of Chemical Reviews. She is on the Board of Reviewing Editors for Science and has served as Chair of the Physical Division and the Theoretical Subdivision of the American Chemical Society. She has more than 285 publications, is co-author of a textbook entitled Physical Chemistry for the Biological Sciences, and has given more than 415 invited lectures, including 24 named lectureships.



Copyright © 2025 by IOP Publishing Ltd and individual contributors