Quantum jobs fair for kick-starting graduate careers

Earlier this month, I attended the Careers in Quantum fair at the University of Bristol, organized by the Quantum Engineering Centre for Doctoral Training (CDT). There was an enthusiastic buzz in the air — perhaps unsurprising considering that last year the UK government announced £2.5bn funding for developing quantum technologies as part of the government’s National Quantum Strategy, which includes a plan to double the number of quantum CDTs. The event was led by students in the CDT and featured stalls from nearly 30 quantum companies as well as a programme of talks and discussions.

The day was kicked off by Winfried Hensinger, a researcher at the University of Sussex and co-founder of Universal Quantum. The company’s goal is to build a million-qubit quantum computer, with Hensinger saying he wanted Universal Quantum to be “the AWS [Amazon Web Services] of quantum computing”. The challenge of scaling up quantum computers would come up again and again, and Hensinger’s ambitious talk – in which he revealed he’d wanted to build a quantum computer ever since his PhD – set the tone for the event.

The first panel discussion examined starting a quantum company. Immediately noticing the all-male line-up, the panellists made a point of encouraging female founders, and thankfully the gender imbalance was not reflected in the rest of the day. Two panellists – Josh Silverstone of Qontrol and Dominic Sulway of Light Trace Photonics – had spun out their firms from their PhDs.

Although a doctorate is great for developing your technical knowledge and skills, the consensus was that a PhD is not always necessary for success in the sector. That notion would later be echoed in the last talk of the day by Harry Bromley from Aegiq, who spoke about his decision to pursue a career in quantum straight out of his master’s degree.

The importance of communication skills for pitching to investors and describing technical concepts in an accessible way was also highlighted – I was partly there to deliver a short presentation promoting the Physics World PhD contributor network, so I hope this is something that attendees took to heart.

The afternoon began with another panel discussion, this time on the near-term applications of quantum technology. Andrew Weld of QLM predicted that in 10 years, “no-one will be talking about quantum”, because the technology will be so widespread that “quantum technology” will become a meaningless phrase.

The event was dominated by start-ups, with lots of discussion of the challenges of recruitment, attracting investment and finding customers. However, one participant on this panel was Zoe Davidson, a research specialist in optical networks for British Telecom (BT) – which employs more than 100,000 people. She spoke about the company’s projects and what it’s like to deploy cutting-edge technology in a big organization.

The rest of the talks included several University of Bristol alumni (and previous conference organizers) who had returned as speakers to discuss their work at ORCA Computing and Wave Photonics. Most of the conference was UK-centric, but there were also talks from Sofie Lindskov Hansen of the Danish firm Sparrow Quantum and Jonas Philips of the Germany-based Quix Quantum – with Philips emphasizing the need for a European quantum supply chain.

Doing a PhD in quantum seems like a safe bet in terms of landing a job, but students still have a lot of choices to make about their futures. Based on the number of speakers who had spun out a company from their research, it’s likely that some of the students wandering around the hall, chatting to companies, and picking up free pens might be sitting on the next big quantum start-up.

When it comes to fish dynamics, three’s a school

How many fish make up a school? It sounds like one of those trick questions, but physicists at Heinrich Heine University Düsseldorf and the University of Bristol have now found an answer.

To do so, they fitted a “bowl-shaped” aquarium at the University of Bristol with cameras to track the three-dimensional trajectories of zebrafish, studying group sizes of two, three, four and 50 fish (Nature Comms 15 2591).

The researchers then used methods from statistical physics to analyse swimming patterns and deduce the minimum group size where individual movements change and become coordinated group patterns.
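The paper’s exact analysis isn’t reproduced here, but a standard statistical-physics way to quantify the crossover from individual to collective motion is to compute an order parameter – such as the group polarization – from the tracked velocities. A minimal sketch, using invented trajectory snapshots rather than the study’s data, might look like this:

```python
import numpy as np

def polarization(velocities):
    """Group polarization: 1 = all fish swim in the same direction, ~0 = uncorrelated headings.
    velocities: array of shape (n_fish, 3) holding instantaneous 3D velocity vectors."""
    headings = velocities / np.linalg.norm(velocities, axis=1, keepdims=True)
    return np.linalg.norm(headings.mean(axis=0))

# Toy example: three fish swimming roughly side by side (aligned headings)
aligned = np.array([[1.0, 0.1, 0.0], [1.0, -0.1, 0.0], [1.0, 0.0, 0.1]])
# Two fish following one another along a curved path (less aligned at this instant)
following = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])

print(polarization(aligned))    # close to 1
print(polarization(following))  # ~0.7 for this snapshot
```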

They found that an isolated pair of fish prefer to move one after the other, but in groups of three the zebrafish swim next to each other – a characteristic of a large school of fish.

When the researchers then marked small sub-groups of three fish within a larger school, they found that the group of three moved within the school in a similar way to an isolated group of three. So, while three fish form a school, two are not enough.

The team now aim to apply their findings to the behaviour of other animals and how groups of people behave at parties or mass gatherings.

“We will see whether the simple limit of the number three also applies,” says Düsseldorf physicist Hartmut Löwen.

Molecular imaging technique could improve breast cancer screening

Mammography is a widely employed and effective tool for early detection of breast cancer, but dense breasts pose a significant challenge in cancer screening. Not only does dense breast tissue increase the risk of developing breast cancer, the high proportion of fibrous and glandular tissue can mask the presence of a tumour on a screening mammogram.

As a result, supplemental breast imaging modalities are often advised for women with dense breasts. Such tests, breast MRI in particular, add significantly to the cost of cancer screening. This is especially problematic considering that about 40% of the screening population have heterogeneously dense breasts and about 10% of women have extremely dense breasts.

Low-dose positron emission mammography (PEM) is a novel molecular breast imaging technique that could potentially replace or supplement mammography. With this in mind, researchers in Canada have compared the performance of PEM and breast MRI in identifying breast cancer and determining its local extent in 25 women recently diagnosed with breast cancer. They report the findings of their clinical study in Radiology: Imaging Cancer.

Radialis PET Imager

Historically, molecular breast imaging has not been used for clinical breast imaging due to the high radiation dose that it delivers to the breasts and surrounding organs. The use of an organ-targeted PET system – the Radialis PET Imager – to perform PEM could eliminate this concern. The Radialis uses coincidence detection of emitted gamma photons, eliminating the need for collimation (required for gamma camera-based molecular breast imaging) and enabling the use of radiation doses comparable to those of mammography.
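Coincidence detection is conceptually straightforward: each positron annihilation produces two back-to-back 511 keV photons, so an event is only counted when two opposing detector panels register hits within a very short time window, localizing the source without a physical collimator. The toy sketch below – with an assumed 4 ns window and invented hit times, not the Radialis electronics – illustrates the idea:

```python
# Toy illustration of coincidence detection: pair up hits recorded by two
# opposing detector panels when they arrive within a short timing window.
# The 4 ns window and the hit lists are illustrative values only.

COINCIDENCE_WINDOW_NS = 4.0

def find_coincidences(panel_a_hits, panel_b_hits, window=COINCIDENCE_WINDOW_NS):
    """Return (t_a, t_b) pairs of hit times that fall within the coincidence window."""
    pairs = []
    for t_a in sorted(panel_a_hits):
        for t_b in sorted(panel_b_hits):
            if abs(t_a - t_b) <= window:
                pairs.append((t_a, t_b))
    return pairs

# Hit times in nanoseconds (invented for the example)
panel_a = [10.0, 55.2, 103.7]
panel_b = [11.5, 80.0, 104.1]

print(find_coincidences(panel_a, panel_b))  # [(10.0, 11.5), (103.7, 104.1)]
```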

PEM technology offers the high sensitivity of breast MRI but at lower cost. Its effective radiation dose is comparable to that of traditional digital mammography and lower than that of digital breast tomosynthesis. In addition, PEM overcomes the tumour-masking issues associated with high breast density, delivers fewer false positives than mammography, and doesn’t require breast compression during the exam.

Principal investigator Vivianne Freitas, from the University of Toronto’s University Health Network, Sinai Health and Women’s College Hospital, and colleagues imaged the study participants 1 and 4 h after injection of 37, 74 or 185 MBq of the radiotracer 18F-fluorodeoxyglucose (18F-FDG). Similar to mammography, they acquired PEM images in standard craniocaudal and mediolateral oblique views.

Two breast radiologists blinded to the cancer location performed a per-lesion visual assessment of the acquired images, recording the morphology of any observed lesions. Low-dose PEM identified 24 of the 25 known malignant lesions (determined by histopathology), missing only a single 38-mm lobular cancer, whereas MRI identified all 25. MRI found 13 additional lesions, eight of which were false positives, while PEM detected six additional lesions, one of which was a false positive – giving PEM a lower false-positive rate of 16% versus 62% for MRI.

The researchers note that PEM’s low injected activity of 37–185 MBq produced diagnostic-quality images, corresponding to a radiation exposure of 0.62–0.71 mSv to 1.24–1.42 mSv. At its lowest, the PEM dose approached the mean total effective dose of two-view bilateral full-field digital mammography (about 0.44 mSv), was similar to that of contrast-enhanced mammography (0.58 mSv) and was less than that of mammography combined with digital breast tomosynthesis (0.88 mSv).

“For screening, PEM’s ability to perform effectively regardless of breast density potentially addresses a significant shortcoming of mammography, particularly in detecting cancers in dense breasts where lesions may be obscured,” says Freitas. “It also presents a viable option for patients at high risk who are claustrophobic or who have contraindications for MRI.”

Freitas notes that while the full integration of PEM into clinical practice is yet to be confirmed, these preliminary findings are promising, particularly as they demonstrate PEM’s ability to detect invasive breast cancer with low 18F-FDG doses. “This marks a critical first step in its potential future implementation in clinical practice,” she says.

The researchers have now started a pilot study to evaluate whether liquid biopsy findings can be matched to images obtained by PEM in women at high risk of breast cancer. Participants have blood taken for a liquid biopsy test and a PEM exam following injection of 74 MBq of 18F-FDG, before undergoing an MRI-guided biopsy for a suspicious breast lesion.

The team will evaluate data from the two exams to determine whether any novel findings on tumour fragment size and patterns, mutational signatures, variants or epigenetic changes identified from liquid biopsy correlate with the characteristics of the PEM images. If correlations between the two are identified, the researchers plan to conduct additional studies to evaluate whether these techniques can help refine screening investigations and reduce unnecessary biopsies.

Climate change will affect how time is corrected using ‘negative leap seconds’

Today, official time is kept by atomic clocks – and technologies such as the Internet, positioning systems and mobile-phone networks depend on the clocks’ extraordinarily accurate time signals.

These atomic clocks define the second in terms of the frequency of light involved in a specific transition in atomic caesium. The definition was chosen so that 86,400 atomic seconds correspond very closely to the length of a day on Earth – reflecting the traditional definition of the second as 1/86,400 of a day.

However, the correspondence is not exact. Between 1970 and 2020, the average length of a day on Earth (the period of Earth’s rotation) was about 1–2 ms longer than 86,400 s. This means that every few years, a second-long discrepancy builds up between time as measured by Earth’s rotation and time measured by an atomic clock.
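To see why the discrepancy builds up over a few years rather than months or decades, a quick back-of-the-envelope check helps (simple arithmetic, not from the article):

```python
# How long does a 1-2 ms/day excess in day length take to add up to one second?
for excess_ms_per_day in (1.0, 2.0):
    days = 1000.0 / excess_ms_per_day  # 1 s = 1000 ms
    print(f"{excess_ms_per_day} ms/day -> ~{days:.0f} days (~{days/365:.1f} years) per accumulated second")
```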

Since 1972 this deviation has been corrected by the insertion of 27 leap seconds into Coordinated Universal Time (UTC).

Complicated process

This correction process is complicated by the fact that various factors cause Earth’s period to vary on a number of different time scales. So leap seconds are inserted when needed – not according to a regular schedule like leap years. Nine leap seconds were inserted in 1972–1979, for example, but none have been inserted since 2016.

Indeed, since about 2020 Earth’s average period has dipped below 86,400 s. In other words, Earth’s rotation appears to be speeding up. This bucks the long-term trend of the rotation slowing, and is probably related to interactions deep within the Earth. As a result, metrologists face the unprecedented prospect of “negative leap seconds” – which could be even more disruptive to computer systems than leap seconds.

But now, Duncan Agnew of the Scripps Institution of Oceanography and the University of California, San Diego has identified a new process that may be countering this increase in rotational speed – something that could postpone the need for negative leap seconds.

Writing in Nature, he shows that the increased melting of ice in Greenland and Antarctica is decreasing the Earth’s angular velocity. This is because water from the poles is being redistributed throughout the oceans, thereby changing our planet’s moment of inertia. Because angular momentum is conserved, this change results in a decrease in angular velocity – think of a spinning ice skater who slows down by extending their arms.
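In the language of introductory mechanics, the argument is simply conservation of angular momentum: moving mass from the poles towards the equator increases Earth’s moment of inertia, so its spin rate must drop to compensate.

```latex
L = I\omega = \text{constant}
\quad\Rightarrow\quad
\frac{\Delta\omega}{\omega} \approx -\frac{\Delta I}{I}
```

A small fractional increase in the moment of inertia therefore produces an equal and opposite fractional decrease in the rotation rate, lengthening the day.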

Agnew reckons that this will postpone the need for a negative leap second by three years. A negative leap second could be needed in 2029, but it could be one of the last because metrologists have voted to get rid of the leap-second correction in 2035.

Researchers use machine learning to improve the taste of Belgian beers

Machine learning and artificial intelligence are finding applications in many different areas but scientists from Belgium have now raised the, er, bar somewhat.

They have used machine-learning algorithms to predict the taste and quality of beer and what compounds brewers could use to improve the flavour of certain tipples.

Kevin Verstrepen from KU Leuven and colleagues spent five years characterizing over 200 chemical properties from 250 Belgian commercial beers across 22 beer styles, such as Blond and Tripel beers. They also gathered tasting notes from a panel of 15 people and from the RateBeer online beer review database.

They then trained a machine-learning model on the data, finding it could predict the flavour and score of the beers using just the beers’ chemical profile.
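The study’s own modelling pipeline is more elaborate, but the core idea – regress tasting scores on a vector of measured chemical concentrations – can be sketched in a few lines of scikit-learn. Everything below (the feature values, the synthetic “appreciation score” and the choice of gradient boosting) is a placeholder illustration, not the paper’s actual data or model:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

# Placeholder data: rows = beers, columns = measured chemical properties
# (esters, higher alcohols, bitterness units, ...); values are random stand-ins.
rng = np.random.default_rng(0)
X = rng.random((250, 200))                          # 250 beers x 200 chemical features
y = 2.5 + 2.0 * X[:, 0] + rng.normal(0, 0.3, 250)   # synthetic "appreciation score"

model = GradientBoostingRegressor()
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f}")
```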

By adding certain aromas predicted by the model, the researchers were even able to boost the quality – as determined by blind tasting – of existing commercial Belgian ale.

The team hopes the findings could be used to improve alcohol-free beer. Yet KU Leuven researcher Michiel Schreurs admits that they did celebrate the work “with the alcohol-containing variants”.

Why you shouldn’t be worried about talk of a ‘quantum winter’

For politicians, funders and investors, science isn’t always on their radar. But from time to time, certain fields do capture wider attention. We saw that in the 1950s with nuclear power, which some thought would one day be “too cheap to meter”. Later, nanotechnology and graphene rose to the fore. More recently, artificial intelligence and quantum technology seem to be on everyone’s mind.

Quantum tech is no longer just about semiconductors, quantum dots, electron microscopes and lasers, which are “first-generation” technologies. Instead, the focus is on harnessing superposition, uncertainty and entanglement to develop “second-generation” quantum technologies.

Such “quantum 2.0” technology could revolutionize everything from computing and measurement to sensing, timing and imaging. Whether it’s engineering, transport, navigation, finance, defence or aerospace, quantum tech could disrupt the economy too. In fact, most leading nations these days have a quantum strategy, though China is out in front in terms of financial muscle, having ploughed $25bn into quantum tech up to 2021.

However, investment in quantum tech could be taking a potentially worrying turn, according to the recent report State of Quantum 2024. Published in January by the Finland-based firm IQM Quantum Computers, venture capitalists OpenOcean, European tech investors Lakestar and The Quantum Insider (TQI), the report warned that investment in the sector around the world has fallen by 50% since a high in 2022.

State of Quantum 2024 says that global investment in quantum technology, which peaked at $2.2bn in 2022, had dropped to $1.2bn the following year. The steepest fall was in the US, where investment plummeted by 80%, while Asia-Pacific investment fell by 17%. The situation in Europe, the Middle East and Africa (EMEA) was slightly better, with investment growing marginally, by 3%.

The drop in investment has led some commentators to suggest we are heading for a “quantum winter”. That sounds dramatic, but it perhaps just reflects investors becoming more tuned in to emerging markets and recognizing that practical applications of quantum computing could still be many years away. It’s also worth remembering that, for all the hype, quantum technology remains a niche sector, accounting for less than 1% of total venture-capital funding globally.

The hype cycle

As I pointed out last year, many technologies follow a graph devised in 1995 by Jackie Fenn, an analyst from US tech consultants Gartner Inc. Now known as the “Gartner hype cycle”, it shows how the expectation surrounding a particular technology develops over time. Having lived through a fair few technology cycles myself, I can safely say that the graph is pretty accurate.

We start with a “technology trigger”, when everyone notices something big going on. Interest rises and money flows in until we reach a “peak of inflated expectations”. Then, as people realize things are harder and trickier than imagined, we hit a “trough of disillusionment”. Later, activity picks up again via a “slope of enlightenment” until we reach a “plateau of productivity”, where firms – finally – realize what works and know what customers want.

If the hype cycle is true of quantum tech – and I have no reason to think it isn’t – then the current fall in investment is probably a combination of being around the top of the cycle and global economies slowing down. That’s not necessarily a bad thing. I am sure that more informed investors are selecting the stronger technologies and companies as market opportunities and the timings of applications are becoming clearer.

In any case, there are still big sums involved, with investment in EMEA continuing to rise. And as The Quantum Insider report points out, lots of great progress is going on at quantum research centres across the globe. One example is the UK’s National Quantum Computing Centre (NQCC), which aims to give UK businesses early access to quantum computing as part of the country’s national strategy.

The most talked-about quantum 2.0 tech is, of course, quantum computing. It grabs all the headlines and excitement, which is not surprising given that the demand for such machines is potentially huge. According to a report published last year by Markets and Markets, the quantum-computing sector could be worth a staggering $4.4bn by 2028.

Much of that demand is being driven by the need for large machines with up to 10,000 quantum bits (qubits). Such devices would be used to decrypt data that have been stored with relatively low amounts of encryption (although, ironically, such data will probably be old and not that valuable). Still, if powerful quantum computers become a reality, their ability to crack historical encryption algorithms will compromise the security of the Internet and damage global security.

However, there are likely to be lots of other, more immediate applications of quantum computers with fewer qubits. Many companies are making headway in this area, with one stand-out being ORCA Computing, which in 2020 won a business start-up award from the Institute of Physics (IOP). Richard Murray, ORCA’s chief executive who is a trained physicist, recently told Forbes that the firm had sold five of its systems, with four already installed at three different sites around the world.

ORCA’s machine (an error-corrected photonic quantum computer) is the first of its kind that can work at room temperature. The latest sale involves the supply of a system to the UK’s publicly funded NQCC, which is based on the Harwell Innovation Campus in Oxfordshire. ORCA’s system will provide a testbed facility for machine learning, combining quantum and neural-network technologies.

But as Murray pointed out in another Forbes article, the challenge is working out what quantum computers can best be used for. That’s not easy, with many potential customers not understanding the benefits until they see working systems that can solve their problems. What is clear, though, is that quantum computers will be particularly good at tackling certain problems that are difficult or even impossible for classical computers to solve.

The race is on in this high-stakes, high-reward sector, with IBM recently announcing a 1000-qubit computer. The company has been following a quantum-computing roadmap that roughly doubles the number of qubits every year: in 2021 IBM had a 127-qubit device and in 2022 one with 433 qubits. Its latest chip, called Condor, has 1121 superconducting qubits. Rather tellingly, however, the company says it will now shift gears and focus on making its machines more error-resistant, as this will clearly affect the ability of potential customers to use them in the near term.

But in terms of applications of quantum 2.0 tech, I believe there is lots of promise in quantum clocks, quantum sensors and quantum-imaging technology, especially once they reach certain usability criteria outside the lab. In this regard, the work of the UK Quantum Technology Hub on sensors and timing, led by the University of Birmingham, is world class, as are some of the companies associated with it.

Take Cerca Magnetics, which won an IOP business-innovation award for developing the world’s first wearable magnetoencephalography scanner. Its optically pumped, room-temperature magnetometers are each no larger than a LEGO brick, yet the system can measure human brain function with a sensitivity rivalling that of cryogenic superconducting devices.

The company, which has also won the inaugural prize from the IOP’s Quantum Business and Innovation Group (qBIG), has already built a lightweight 3D-printed head-mounted scanner cap and has been installing systems around the world in magnetically shielded rooms. Its device can measure human brain function with what the company says is unprecedented accuracy. Most importantly, it can take data even as a patient moves, rather than requiring them to keep still with their head inside a large scanner.

Another interesting quantum sensing firm is Bristol-based QLM, which recently demonstrated its latest methane gas quantum lidar camera in an event at the IOP attended by the Duke of Edinburgh. This camera is smaller, more robust and more integrated, allowing it to be used in the field for methane leak detection by customers who are trialling it globally. It was impressive to see how the company had progressed since it won its 2020 IOP business award.

One of my favourite applications of quantum 2.0 technology is quantum gravity sensing, where the aim is to get devices that can be deployed out in the field. Imagine how an understanding of what lies beneath the ground can impact our daily lives. Rather than road workers having to dig up huge stretches of tarmac to mend an underground pipe, they could do rapid, targeted “microsurgery”-style repairs.

Accurate mapping through quantum gravity sensors will really help here, and the University of Birmingham spin-out Delta.g has already raised £1.5m in 2023 to build its quantum gravity sensors for underground mapping. Pete Stirling, the company’s co-founder and chief executive, says its goal is to shrink the technology so that it can be deployed for real-life applications such as looking for hidden infrastructure and carrying out repairs.

“Given the broad applications across myriad industries, we’re hugely excited about our ability to use quantum gravity gradiometry to achieve vast cost-savings and pick up the pace of mapping work in a way that improves everyday life,” Stirling says. Imagine how useful it would be to have a zoomable and searchable “Google maps” database of all the myriad pipes, tunnels and cables that lie hidden under our feet.

So, despite the drop in venture-capital funding and fears of a “quantum winter”, the quantum-tech sector is still going strong and indeed maturing. The amount of funding is therefore not the only number we should focus on. In fact, more traditional metrics such as field trials and product or service revenues are more appropriate.

While it may take many years for quantum computers to reach their full potential, there are nearer-term applications, particularly in sensing and timing, where the end goals are clear. Product commercialization is in full swing for what, as far as I’m concerned, is one of the most exciting branches of physics.

Brazil becomes first Latin American country to join CERN

Brazil has become the first country from the Americas to join the CERN particle-physics lab near Geneva. It is now an associate member of the lab after an agreement signed in March 2022 was ratified by the country’s legislature, with Brazil officially joining on 13 March. Brazil first started co-operating with CERN more than 30 years ago.

As an associate member, Brazilian nationals can now apply for staff positions and graduate programmes, while firms in Brazil can bid for CERN contracts. But unlike CERN’s 23 full member states, the country will not be represented on CERN Council or contribute to lab funding. Brazil is now the eighth associate member of CERN, with Chile and Ireland in the early stages of applying too.

Close collaboration

Formal co-operation between CERN and Brazil began in 1990, when scientists from the country started taking part in the DELPHI experiment at CERN’s Large Electron–Positron Collider (LEP) – the predecessor to the Large Hadron Collider (LHC). Since then Brazil’s experimental particle-physics community has doubled in size to some 200 scientists.

Researchers, engineers and students from Brazil now collaborate on CERN experiments such as the four main LHC detectors – ALICE, ATLAS, CMS and LHCb – as well as on the ALPHA antimatter experiment and the Isotope mass Separator On-Line facility, which produces and studies radioactive nuclei.

As well as particle-physics research, since December 2020 CERN and Brazil’s National Centre for Research in Energy and Materials have been formally cooperating on accelerator technology R&D.

Combining electrochemistry and neutron reflectometry in situ

The unique properties of neutrons make them unmatched probes for materials science. Although they are particles, they have a wave-like nature, with wavelengths in the same range as X-ray wavelengths, and readily pass through solid matter. Like X-rays, they refract, diffract, and interfere constructively and destructively with each other. Unlike X-rays, which interact mainly with the electron clouds around the atoms of a material, through electromagnetic forces, neutrons interact mainly with atomic nuclei, through the strong nuclear force; consequently, they can also be used to determine a material’s composition, by both element and isotope. Neutrons interact with light elements as strongly as with heavy elements, sometimes even more strongly, and many possibilities exist for isotope labelling experiments. For example, neutron scattering techniques readily detect hydrogen, which is a difficult or impossible target for most other analytical methods. They also distinguish hydrogen from deuterium like night from day.
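The claim that neutron wavelengths sit in the X-ray range follows directly from the de Broglie relation λ = h/(mv): for a thermal neutron travelling at about 2200 m/s, the wavelength comes out near 1.8 Å, comparable to the atomic spacings probed by X-ray diffraction. A quick back-of-the-envelope check (standard constants, not from the webinar material):

```python
# de Broglie wavelength of a thermal neutron (~2200 m/s)
h = 6.62607015e-34       # Planck constant, J s
m_n = 1.67492749804e-27  # neutron mass, kg
v = 2200.0               # typical thermal-neutron speed, m/s

wavelength = h / (m_n * v)
print(f"{wavelength * 1e10:.2f} angstrom")  # ~1.80 angstrom
```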

This webinar describes the combination of electrochemical measurements with neutron reflectometry, a surface analytical technique, to provide sub-nanometre scale compositional and structural information on electrode surfaces under electrochemical control that is unobtainable by any other means. Illustrative examples using neutron reflectometry to monitor oxide film growth and hydrogen absorption on titanium, zirconium and copper are given.

An interactive Q&A session follows the presentation.

James (Jamie) Noël is an electrochemist and corrosion scientist whose research includes studies of the degradation of nuclear fuel and container materials for the permanent disposal of nuclear fuel waste. He leads a large and diverse research group that conducts experimental research on many aspects of the corrosion of copper, carbon steel, uranium dioxide, stainless steels, nickel alloys, and other materials. His industry research partners include the Nuclear Waste Management Organization (NWMO, Toronto), the Swedish Nuclear Fuel and Waste Management Company (SKB, Sweden), the National Cooperative for the Disposal of Radioactive Waste (Nagra, Switzerland), the Nuclear Waste Management Organization of Japan (NUMO, Tokyo), and Canadian Nuclear Laboratories (CNL, Chalk River). Jamie earned BS and MS degrees in chemistry from the University of Guelph and a PhD in chemistry from the University of Manitoba. He teaches ECS Short Courses on the Fundamentals of Electrochemistry at the Society’s biannual meetings, has been heavily involved with the ECS Corrosion Division and ECS Education Committee, and is an Associate Editor of CORROSION Journal. His awards include the ECS R.C. Jacobsen and Lash Miller Awards, Fellow of ECS, and the Western University Distinguished Research Professorship, Faculty Scholar Award, and Florence Bucke Science Prize. He has co-authored more than 130 refereed journal articles, 50 refereed conference proceedings papers, 20 commercial reports, and five book chapters – and scattered a few neutrons along the way.

Functional ultrasound imaging provides real-time feedback during spinal surgery

Damage to the spinal cord, whether by injury or disease, can have devastating impacts on health, including loss of motor or sensory functions, or chronic back pain, which affects an estimated 540 million people at any given time. A US-based research team has now used functional ultrasound imaging (fUSI) to visualize the spinal cord and map its response to electrical stimulation in real time, an approach that could improve treatments of chronic back pain.

Despite playing a central role in sensory, motor and autonomic functions, little is known about the functional architecture of the human spinal cord. Traditional neuroimaging techniques, such as functional MRI (fMRI), are impeded by strong motion artefacts generated by heart pulsation and breathing.

In contrast, fUSI is less affected by motion artefacts and can image the spinal cord with high spatiotemporal resolution (roughly 100 µm and up to 100 ms) and high sensitivity to slow-flowing blood during surgery. It works by emitting ultrasonic waves into an area of interest and detecting the echoed signal from blood cells flowing in that region (the power Doppler signal). Another advantage is that the fUSI scanner is mobile, eliminating the extensive infrastructure required for fMRI systems.

“The spinal cord houses the neural circuitry that controls and modulates some of the most important functions of life, such as breathing, swallowing and micturition. However, it has been frequently neglected in the study of neural function,” explains lead contact Vasileios Christopoulos from the University of California Riverside. “Functional ultrasound imaging overcomes the limitations of traditional neuroimaging technologies and can monitor the activity of the spinal cord with higher spatiotemporal resolution and sensitivity than fMRI.”

Previous research demonstrated that fUSI can measure brain activity in animals and human patients, including one study showing that low-frequency fluctuations in the power Doppler signal are strongly correlated with neuronal activity. More recently, researchers used fUSI to image spinal cord responses to electrical stimulation in animals.

In this latest work, Christopoulos and colleagues – also from the USC Neurorestoration Center at the Keck School of Medicine – used fUSI to characterize haemodynamic activity (changes in blood flow) in the spinal cord in response to epidural electrical spinal cord stimulation (ESCS) – a neuromodulation tool employed to treat pain conditions that don’t respond to traditional therapies.

In a first in-human study, the team monitored haemodynamic activity in six patients undergoing implantation of a therapeutic ESCS device to treat chronic back pain, reporting the findings in Neuron.

Utilizing a similar mechanism to fMRI, fUSI relies on the neurovascular coupling phenomenon, in which increased neural activity causes localized changes in blood flow to meet the metabolic demands of active neurons. The team used a miniaturized 15-MHz linear-array transducer to perform fUSI, inserting it surgically onto the spinal cord at the tenth thoracic vertebra (T10), with the stimulation electrodes placed to span the T8–9 spinal segments. The recorded images had a spatial resolution of 100 × 100 µm, a slice thickness of about 400 µm and a 12.8 × 10 mm field of view.

Four patients received 10 ON–OFF cycles of low-current (3.0 mA) stimulation, comprising 30 s with stimulation then 30 s without. Stimulation caused regional changes in spinal cord haemodynamics, with some regions exhibiting significant increases in blood flow and others showing significant decreases. Once the stimulation was switched off, blood flow returned to the initial condition.
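The published analysis is more sophisticated, but the core comparison can be pictured as averaging the power Doppler signal over the ON and OFF halves of each 60 s cycle and reporting the percentage change. The sketch below uses simulated numbers with an assumed 15% flow increase, not the patients’ recordings:

```python
import numpy as np

# Simulated power Doppler time series for one pixel: 10 ON-OFF cycles,
# 30 s ON then 30 s OFF, sampled once per second (values are invented).
n_cycles, seconds_on, seconds_off = 10, 30, 30
rng = np.random.default_rng(1)

baseline = 100.0
cycle = np.concatenate([np.full(seconds_on, baseline * 1.15),  # assumed +15% flow during stimulation
                        np.full(seconds_off, baseline)])
signal = np.tile(cycle, n_cycles) + rng.normal(0, 2.0, n_cycles * (seconds_on + seconds_off))

# Average the signal over all ON and all OFF epochs and report the percentage change
epochs = signal.reshape(n_cycles, seconds_on + seconds_off)
on_mean = epochs[:, :seconds_on].mean()
off_mean = epochs[:, seconds_on:].mean()
print(f"stimulation-evoked change: {100 * (on_mean - off_mean) / off_mean:.1f}%")
```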

To assess whether fUSI can detect haemodynamic changes associated with different stimulation protocols, the remaining two patients received five ON–OFF cycles of 3.0 mA stimulation followed by five cycles of 4.5 mA stimulation, with a 3-min pause between the two. The researchers found that increasing the current amplitude from 3.0 to 4.5 mA did not change the spatial distribution of the activated spinal cord regions. However, high-current stimulation induced stronger haemodynamic changes on the spinal cord.

This ability of fUSI to differentiate haemodynamic responses evoked by different ESCS currents is an important step towards developing an ultrasound-based clinical monitoring system to optimize stimulation parameters. Christopoulos explains that because patients are anaesthetized during spinal cord surgery, they cannot report whether the applied electrical stimulation protocol actually reduces pain. As such, the neurosurgeon cannot accurately assess the effects of neuromodulation in real time.

“Our study provides a first proof-of-concept that fUSI technology can be used to develop closed-loop clinical neuromodulation systems, allowing neurosurgeons to adjust stimulation parameters (pulse width, pulse shape, frequency, current amplitude, location of stimulation, etc) during surgery,” he tells Physics World.

In future, the team hopes to establish fUSI as a platform for investigating spinal cord function and developing real-time closed-loop clinical neuromodulation systems. “We recently submitted for publication a clinical study demonstrating that fUSI is capable of detecting networks in the human spinal cord where activity is strongly correlated with bladder pressure,” says Christopoulos. “This finding opens new avenues for the development of spinal cord machine interface technologies to restore bladder control in patients with urinary incontinence, such as those with spinal cord injury.”

Why error correction is quantum computing’s defining challenge

“There are no persuasive arguments indicating that commercially viable applications will be found that do not use quantum error-correcting codes and fault-tolerant quantum computing.” So stated the Caltech physicist John Preskill during a talk at the end of 2023 at the Q2B23 meeting in California. Quite simply, anyone who wants to build a practical quantum computer will need to find a way to deal with errors.

Quantum computers are getting ever more powerful, but their fundamental building blocks – quantum bits, or qubits – are highly error prone, limiting their widespread use. It is not enough to simply build quantum computers with more and better qubits. Unlocking the full potential of quantum-computing applications will require new hardware and software tools that can control inherently unstable qubits and comprehensively correct system errors 10 billion times or more per second.

Preskill’s words essentially announced the dawn of the so-called quantum error correction (QEC) era. QEC is not a new idea, and firms have for many years been developing technologies to protect the information stored in qubits from errors and decoherence caused by noise. What is new, however, is giving up on the idea that today’s noisy intermediate-scale quantum (NISQ) devices could outperform classical supercomputers and run applications that are currently impossible.

Sure, NISQ – a term that was coined by Preskill – was an important stepping stone on the journey to fault tolerance. But the quantum industry, investors and governments must now realize that error correction is quantum computing’s defining challenge.

A matter of time

QEC has already seen unprecedented progress in the last year alone. In 2023 Google demonstrated that a 17-qubit system could recover from a single error and a 49-qubit system from two errors (Nature 614 676). Amazon released a chip that suppressed errors 100-fold, while IBM scientists discovered a new error-correction scheme that works with 10 times fewer qubits (arXiv:2308.07915). Then, at the end of the year, Harvard University’s quantum spin-out QuEra produced the largest number of error-corrected qubits yet.

Decoding, which turns many unreliable physical qubits into one or more reliable “logical” qubits, is a core QEC technology. That’s because large-scale quantum computers will generate terabytes of data every second that have to be decoded as fast as they are acquired to stop errors propagating and rendering calculations useless. If we don’t decode fast enough, we will be faced with an exponentially growing backlog of data.
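As a toy illustration of what a decoder does – far simpler than the surface-code decoders used in practice – consider the three-qubit repetition code: the logical bit is stored as three copies, parity checks between neighbouring qubits flag where a flip happened, and the decoder infers the most likely error from that syndrome. This sketch is purely pedagogical, not Riverlane’s decoder:

```python
# Toy decoder for the three-qubit repetition code (bit-flip errors only).
# Real QEC decoders work on far larger codes and must keep pace with the
# measurement rate, but the logic - infer the most likely error from
# parity-check outcomes - is the same in spirit.

def syndrome(bits):
    """Parity checks between neighbouring qubits: (q0 xor q1, q1 xor q2)."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def decode(bits):
    """Return the corrected codeword, assuming at most one bit flip."""
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(bits))  # which qubit to flip, if any
    corrected = list(bits)
    if flip is not None:
        corrected[flip] ^= 1
    return corrected

print(decode([0, 0, 0]))  # no error                       -> [0, 0, 0]
print(decode([0, 1, 0]))  # flip on the middle qubit        -> [0, 0, 0]
print(decode([1, 1, 0]))  # flip on the third qubit of '1'  -> [1, 1, 1]
```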

My own company – Riverlane – last year introduced the world’s most powerful quantum decoder. Our decoder is solving this backlog issue but there’s still a lot more work to do. The company is currently developing “streaming decoders” that can process continuous streams of measurement results as they arrive, not after an experiment is finished. Once we’ve hit that target, there’s more work to do. And decoders are just one aspect of QEC – we also need high-accuracy, high-speed “control systems” to read and write the qubits.

As quantum computers continue to scale, these decoder and control systems must work together to produce error-free logical qubits and, by 2026, Riverlane aims to have built an adaptive, or real-time, decoder. Today’s machines are only capable of a few hundred error-free operations but future developments will work with quantum computers capable of processing a million error-free quantum operations (known as a MegaQuOp).

Riverlane is not alone in such endeavours and other quantum companies are now prioritising QEC. IBM has not previously worked on QEC technology, focusing instead on more and better qubits. But the firm’s 2033 quantum roadmap states that IBM aims to build a 1000-qubit machine by the end of the decade that is capable of useful computations – such as simulating the workings of catalyst molecules.

QuEra, meanwhile, recently unveiled its own roadmap that also prioritizes QEC, while the UK’s National Quantum Strategy aims to build quantum computers capable of running a trillion error-free operations (TeraQuOps) by 2035. Other nations have published similar plans, and a 2035 target feels achievable, partly because the quantum-computing community is starting to aim for smaller, incremental – but just as ambitious – goals.

What really excites me about the UK’s National Quantum Strategy is the goal to have a MegaQuOp machine by 2028. Again, this is a realistic target – in fact, I’d even argue that we’ll reach the MegaQuOp regime sooner, which is why Riverlane’s QEC solution, Deltaflow, will be ready to work with these MegaQuOp machines by 2026. We don’t need any radically new physics to build a MegaQuOp quantum computer – and such a machine will help us better understand and profile quantum errors.

Once we understand these errors, we can start to fix them and proceed toward TeraQuOp machines. The TeraQuOp is also a floating target – one where improvements both in QEC and elsewhere could result in the 2035 target being delivered a few years earlier.

It is only a matter of time before quantum computers are useful for society. And now that we have a co-ordinated focus on quantum error correction, we will reach that tipping point sooner rather than later.

Copyright © 2024 by IOP Publishing Ltd and individual contributors