This week, the Economist hosted the “Commercialising Quantum Global” conference in the UK and I was very pleased to attend in person on Wednesday. The meeting was held in the heart of the City of London, one of the world’s great financial centres. This was no coincidence, because this was not a conference primarily about science, or even technology – business was at the centre of most discussions.
The conference centre was in a part of the City called Houndsditch, which is just outside of what had been London’s medieval wall. I’m probably making too much of the symbolism of this location, but it seemed appropriate for the upstart quantum industry to be camped just outside of a citadel of commerce, plotting its entry.
After the first few talks at the conference, it became clear to me that most people there believed that quantum computing and other quantum technologies could bring great business opportunities as well as threats. As I scanned the speaker list for the day, I decided that one way of getting a broad understanding of how quantum could affect business was to attend two talks by people in the insurance industry.
Optimizing reinsurance
Those two speakers were Roland Scharrer, who is group chief data and emerging technology officer at AXA, and Andreas Nawroth, who is a leading expert in artificial intelligence at Munich Re.
Scharrer says that AXA started exploring quantum technologies in 2020. Indeed, many of the speakers at the conference said that their companies have been investigating quantum computing for about two to three years. And like many other companies, one of AXA’s main interests in quantum computing is using it for optimization.
For Scharrer, a primary interest is using quantum algorithms to minimize the risk, and maximize the profit, associated with AXA’s use of reinsurance. Reinsurance is a product that one insurance company buys from another insurance company to cover losses in certain circumstances. This allows an insurance company to share risk with others, and it is often used to cover so-called “black swan” events. These are very rare events that are extremely difficult to predict and can be very costly for insurers.
Heuristic approach
Striking a balance between using reinsurance and insuring risk internally is a classic optimization problem that is very important for an insurance company to get right. Getting things wrong, even by a tiny bit, can be very costly. Scharrer explains that optimization is currently done using a heuristic approach that relies on human expertise.
Reinsurance optimization could, in principle, be done more rigorously on a conventional computer, but Scharrer says that the calculations would take decades. That is where a quantum computer could come in handy, because some quantum computers are predicted to be very good at solving certain optimization problems that could be relevant to reinsurance. But like a lot of the technology being discussed at the conference, such a quantum computer does not yet exist.
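To give a flavour of the kind of problem Scharrer describes, here is a deliberately tiny retain-versus-cede model. The numbers and the quadratic risk penalty are my own illustrative assumptions, not AXA’s actual formulation, but the penalty couples the decisions together, giving the problem the binary-quadratic (QUBO) shape that quantum optimizers target:

```python
from itertools import product

# Toy retain-vs-cede model (hypothetical numbers, not AXA's actual
# formulation). For each of four risk lines, the insurer either
# retains the risk or cedes it to a reinsurer for a fee. A quadratic
# penalty on the total retained expected loss stands in for risk
# aversion; it couples the lines, so the problem is no longer
# separable and has the binary-quadratic (QUBO) shape that quantum
# annealers and some gate-based algorithms aim at.
premiums = [10.0, 8.0, 12.0, 6.0]   # premium income per line
exp_losses = [7.0, 9.5, 11.0, 2.0]  # expected claims if retained
fees = [4.0, 5.0, 6.0, 3.0]         # cost of ceding the line
RISK_AVERSION = 0.05                # weight of the quadratic penalty

def profit(choice):
    """choice[i] == 1 means line i is ceded to the reinsurer."""
    retained_loss = sum(l for l, c in zip(exp_losses, choice) if not c)
    payout = sum(f if c else l for f, l, c in zip(fees, exp_losses, choice))
    return sum(premiums) - payout - RISK_AVERSION * retained_loss ** 2

# Brute force is fine for 4 lines (16 cases); with hundreds of lines
# the 2^n search space is what motivates today's heuristics -- and,
# perhaps one day, quantum optimizers.
best = max(product([0, 1], repeat=4), key=profit)
print(best, round(profit(best), 2))
```

The point is the scaling: with four lines you can enumerate every strategy, but a real reinsurance programme has far too many coupled decisions for that, which is why insurers currently fall back on human-guided heuristics.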
In his talk, Munich Re’s Nawroth talked about how insurers could use quantum computers to do simulations that could help them better understand a wide range of phenomena that affect risk. These include climate change, green technologies, financial markets, pandemics, cyber security and so on.
Insuring for quantum effects
But for me, the most interesting thing that he spoke about was the need for insurers to understand the risks associated with the peculiar nature of quantum computing itself. This is because their customers will want to insure against these risks. One of these risks is associated with the no-cloning theorem of quantum mechanics, which states that it is impossible to create an exact copy of an arbitrary unknown quantum state. This, says Nawroth, would make it difficult for a quantum information system to recover after a cyber attack.
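The no-cloning theorem can be seen in a small calculation. This is the standard textbook demonstration rather than anything from Nawroth’s talk: a CNOT gate copies the basis states |0⟩ and |1⟩ into a blank register perfectly, but the same circuit fails on the superposition |+⟩, producing an entangled state instead of two independent copies:

```python
import numpy as np

# Standard textbook illustration of no-cloning (not from Nawroth's
# talk). A CNOT gate "copies" the basis states |0> and |1> into a
# blank ancilla qubit, but applied to the superposition
# |+> = (|0> + |1>)/sqrt(2) it does NOT produce the product state
# |+>|+>; it produces an entangled Bell state instead.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])
plus = (zero + one) / np.sqrt(2)

def try_to_clone(psi):
    """Apply CNOT to psi (control) and a blank |0> ancilla (target)."""
    return CNOT @ np.kron(psi, zero)

# Basis states copy perfectly...
assert np.allclose(try_to_clone(zero), np.kron(zero, zero))
assert np.allclose(try_to_clone(one), np.kron(one, one))

# ...but the superposition does not.
cloned = try_to_clone(plus)
ideal = np.kron(plus, plus)
print(np.round(cloned, 3))          # (|00> + |11>)/sqrt(2), a Bell state
print(np.allclose(cloned, ideal))   # False
```

No fixed circuit can copy both the basis states and their superpositions, which is exactly why, as Nawroth notes, you cannot simply keep backups of quantum data the way you back up classical data.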
Another risk is that quantum algorithms are currently poorly understood, so it is difficult to insure against risks associated with their use. Finally, Nawroth pointed out that a move to quantum computing would mark a shift from deterministic to probabilistic algorithms – which again poses new challenges when it comes to insurance.
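This deterministic-to-probabilistic shift is easy to illustrate. In the toy sketch below (my own illustration, not from Nawroth’s talk), a classical routine returns the same answer every time, whereas measuring a qubit in the superposition |+⟩ yields 0 or 1 at random, so users, and their insurers, must reason about distributions of outcomes rather than single answers:

```python
import random
from collections import Counter

# Toy illustration (mine, not from Nawroth's talk) of the shift from
# deterministic to probabilistic computation.

def classical_parity(bits):
    """Deterministic: identical input always gives identical output."""
    return sum(bits) % 2

def measure_plus_state():
    """Measuring |+> = (|0> + |1>)/sqrt(2) yields 0 or 1, each with
    probability |1/sqrt(2)|^2 = 0.5 (simulated classically here)."""
    return 0 if random.random() < 0.5 else 1

# The classical routine is perfectly repeatable...
assert all(classical_parity([1, 0, 1]) == 0 for _ in range(100))

# ...while the "quantum" measurement gives a distribution of outcomes.
shots = Counter(measure_plus_state() for _ in range(10_000))
print(shots)  # roughly 5000 of each outcome, never exactly repeatable
```

An insurer pricing the risk of a quantum-powered service therefore cannot ask “what will the computation return?” but only “how often will it return an acceptable answer?”.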
The simple fact that I was able to attend two talks on insurance and quantum computing makes it clear that discussions around quantum technology have “moved beyond physics”. Indeed, I would say that this was an overriding theme of the conference. While I understand why progressing from basic science is a milestone in the commercialization of any technology, I’m not convinced that quantum computing is quite there yet.
Artificial comparison
For example, several speakers compared quantum computing to artificial intelligence (AI) in terms of its potential disruptive effects on business and on society. While it’s tempting to draw parallels between the two, I think it’s important to keep in mind that AI is a fully fledged technology that is already seeing widespread commercial use. And, in the case of ChatGPT, AI can be accessed from any smartphone. In contrast, quantum computing is a much more nascent technology that is only now seeing a few green shoots of commercial application.
Jay Gambetta, who leads IBM’s quantum computing initiative, is one who embraced this idea of moving on. He said that we are beyond the “quantum is cool” phase and have moved into the “utility” phase in the development of quantum computers. IBM’s 2023 generation of quantum processors will have 100–1000 quantum bits, or qubits, and the company intends to scale this up to 100,000 qubits in the next decade – creating machines that could address a range of practical computing problems. While much of this effort will be focused on engineering, I’m sure physicists will play important roles in making this happen – so perhaps it is a bit early to say that the industry has moved away from physics and into a truly commercial world.