What the storming of the US Capitol tells us about science

The Apotheosis of Washington

If you stand in the Great Rotunda in the neoclassical US Capitol Building and look up, you’ll see, high above, a concave fresco entitled The Apotheosis of Washington. Painted in 1865 by the Greek–Italian artist Constantino Brumidi, it shows the first US president surrounded by six allegorical scenes. The details are hard to make out from 50 m below, but with binoculars – or Google – you can spot George Washington gesturing towards a scene representing science.

The central figure in that particular scene is the Greek goddess Athena. Neither looks at the other; Washington has other things on his mind, while Athena is teaching people – including Benjamin Franklin, Samuel Morse and the steamboat pioneer Robert Fulton – about a spark gap. Brumidi knew that Washington, like America’s other founding figures, believed in an association between effective democratic institutions and science.

I thought of Brumidi’s fresco on 6 January this year as I watched live TV footage of domestic terrorists in the Great Rotunda assaulting police, smashing artefacts and splashing blood on a sculpture (Brumidi’s painting, high up in the oculus of the dome, was unharmed). The carnage was incited by leaders who (amplified by social media) were warring against both democratic institutions and science. I wondered about the connection between the two wars.

The three Cs

Each war is associated with a “grand story”. The grand story of the war on democracy is that the 2020 US presidential election was fraudulent; the grand story of the war on science is that the evidence for things like climate change, the pandemic and vaccines is false. Each story provides justifications for rejecting contrary evidence, with the key elements being that the evidence has been faked, that a group of people plotted that fakery, and that attacking it is moral and just. I think of these elements as the “three Cs”: conviction, conspiracy and community.

Let’s start with the first C, that adherents are firmly convinced of their beliefs. Such convictions insulate belief against the doubt that might inspire further inquiry; contrary evidence must be wrong or manipulated.

“If my friends lose the election, ballots were stolen,” say believers of the first story; “Scientific evidence against my view was faked,” say believers of the other. Alternative “experts” are found to reassure believers. The Capitol invaders, for instance, swear by certain disaffected politicians, while science deniers turn to the likes of Bjørn Lomborg (for his views on climate change), Peter Duesberg (AIDS) and Andrew Wakefield (anti-vaccination).

If contrary evidence persists, the reason must be a conspiracy – an organized effort to produce falsehoods. In one grand story, the conspirators are socialists, political opponents, those of other races, and the “deep state”; in the other, foreigners and the medical and scientific establishment. Conspiracies explain contradictory evidence and strengthen buffers against it.

The trouble with conspiracies is that they’re non-falsifiable, because any evidence against them is dismissed as manufactured by the conspirators. Conspiracies are also comforting, as they tell believers that the truth is not difficult and that they already know what’s really happening. Believe in a conspiracy theory and you don’t need to understand, say, climatology, epidemiology, demographics, physics or voting machine technology.

The third element bolstering grand stories is that they make believers feel spiritually and morally uplifted. Grand stories provide an apparent moral clarity, dividing the world into a blameless “us” and a wicked “them”, with the former representing the community as a whole and the latter a malevolent minority. To keep the group from splintering into sub-tribes with different views and aims, grand stories maintain unity through pageantry and entertainment.

I’ve seen anti-nuclear protests against research reactors that were picnics, with folk singers and dancers and people dressed as mushroom clouds and skeletons, while pro-science groups also have slogans and symbols. The Capitol’s invaders shared a mix of anger and celebration. Some painted their faces in patriotic red, white and blue and dressed as bald eagles or Revolutionary War figures, while others carried iconography of racism and antisemitism such as Confederate flags.

The three Cs reinforce each other in a way that makes them propagate easily. Wouldn’t it be great if you didn’t need to investigate complex issues involving your health and welfare? Which would you rather watch: a parade of invaders smashing the halls of government, or broadcasts of a legislative session or scientific conference? Don’t you wish truth and moral clarity were easier?

Grand stories aim to spread enough distrust so that the most persuasively and vividly presented position seems the truest. This is why commonly suggested antidotes such as “better communication”, “science literacy” or “more dialogue” are ineffective; the messier and more difficult truth is harder to explain.

Democracies have ways of tolerating grand stories without suppressing them or letting their members dominate headlines, affect decisions or invade buildings. These ways involve a sifting process in which experts and institutions exercise judgment by weighing evidence, consulting one another and conducting repeated inquiry.

This is not elitism, but democracy trying to make itself work. In the US at least, this process broke down well before 6 January. There’s a long-term danger if we allow grand storytelling to metastasize in social life and become normalized in politics, disconnecting beliefs from reality and blurring the distinction between fact and fiction.

The critical point

I have no idea what George Washington and Athena would have thought about the rampage taking place beneath them. The battle under way was not one that either of them had had to fight. To keep it from recurring will involve rebuilding trust and creating an even grander and still more uplifting story whose key elements are periodically rechecked facts, discerningly chosen experts, respect for the irritations of doubt, and a messier truth and moral vision. This is painstaking, frustrating and never-ending work, but it is the price of effective democracy.

Molecular qubits stick around for longer

Researchers in China have shown that the spin of a molecular quantum bit (qubit) can remain coherent for more than 1 millisecond – long enough to perform 145 000 basic logic operations. This number, known as the qubit “figure of merit”, is 40 times higher than previously reported for this molecule, raising the chances that such qubits could be used in quantum computing applications as well as in biomedical imaging and quantum sensing.

Quantum computers can, in principle, solve certain problems much faster than classical computers because they exploit a quantum particle’s ability to be in a superposition of two or more states at the same time (as opposed to classical bits that have only 0 and 1 states). Promising candidates for qubits include superconducting circuits, trapped ions, defects in solid materials and quantum dots.

Electron spins in magnetic molecules as qubits

Recently, electron spins in magnetic molecules have emerged as another qubit possibility. Compared with other physical systems, these molecular qubits have several advantages. For one, researchers can easily tailor their structure by changing their chemical makeup. It is also relatively straightforward to fabricate lots of identical molecular qubits and deposit them in regular arrays to create circuits.

Like all qubits, however, the superposed states in molecular qubits are fragile and easily disrupted by noise in the environment. This noise destroys the quantum information stored in the states, in a process known as decoherence. While various methods exist for overcoming decoherence in molecular qubits (including diluting the qubits in a diamagnetic matrix, enhancing the rigidity of the molecules and isotopic purification), the longest coherence time measured for a molecular qubit to date has been less than a millisecond.

Dynamical decoupling technique

A team at the University of Science and Technology of China in Hefei has now improved on that figure by applying microwave pulses to “flip” the quantum state of molecular qubits – a method known as dynamical decoupling. As team member Xing Rong explains, repeatedly inverting the state of the molecule’s electron spin effectively averages out the coupling, or interaction, between the qubit and its environment, so extending the qubit’s coherence time.

Rong and colleagues made their molecular qubits from the transition-metal complex (PPh4)2[Cu(mnt)2]. Using a modified commercial X-band pulsed electron paramagnetic resonance spectrometer, they measured a coherence time of 1.4 milliseconds for the system. The previous best value for the material was just 6.8 μs.
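
To see why flipping the spin buys back coherence, it helps to simulate a toy version of the experiment: an ensemble of spins dephasing under slowly fluctuating frequency noise, with equally spaced π-pulses refocusing the accumulated phase. The sketch below is our own illustration, not the USTC group’s analysis; the noise strength and correlation time are assumed values chosen only to make the effect visible.

```python
# Toy model of dynamical decoupling: an ensemble of spins dephases under
# slowly varying (Ornstein-Uhlenbeck) frequency noise; pi-pulses flip the
# sign of subsequent phase accumulation, cancelling the slow components.
import numpy as np

rng = np.random.default_rng(0)
n_spins, n_steps = 2000, 2000
t_total = 1.0e-3              # evolve for 1 ms, the timescale in the experiment
dt = t_total / n_steps
tau_c = 50e-6                 # assumed noise correlation time, s
sigma = 2.0e4                 # assumed rms frequency noise, rad/s

# generate one noise trace per spin
noise = np.empty((n_spins, n_steps))
x = rng.normal(0.0, sigma, n_spins)
for k in range(n_steps):
    x += -x * (dt / tau_c) + sigma * np.sqrt(2 * dt / tau_c) * rng.normal(size=n_spins)
    noise[:, k] = x

def coherence(n_pulses):
    """Ensemble coherence |<exp(i*phase)>| after n equally spaced pi-pulses."""
    sign = np.ones(n_steps)
    for i in range(1, n_pulses + 1):
        sign[i * n_steps // (n_pulses + 1):] *= -1   # each pulse flips the sign
    phase = (noise * sign).sum(axis=1) * dt
    return np.abs(np.mean(np.exp(1j * phase)))

for n in (0, 1, 16, 128):
    print(f"{n:3d} pi-pulses -> coherence {coherence(n):.2f}")
```

The output shows the expected hierarchy: free evolution dephases completely, a single echo barely helps because the noise fluctuates during the evolution time, and dense pulse trains recover most of the coherence.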

Applications and future challenges

“The dynamical decoupling method described in our work does not require us to specially modify the molecule, in contrast to other approaches,” Rong tells Physics World. “The longer coherence time we measured could enable the molecular qubit to be widely used — not only in the field of quantum computation, but also in magnetic biomedical imaging and quantum sensing.”

The researchers, who report their work in Chinese Physics Letters, say they now plan to steer the spin coherence of their molecular qubit at the single-molecule level. “This is a key step for a molecular qubit in quantum information processing and is very challenging because the signal from a single molecular qubit is very weak,” Rong says.

Why Hawaii has the best rainbows, ancient carpet still dazzles thanks to fermented wool

For several years, Physics World headquarters had large windows with a northern exposure, and that coupled with showery weather in Bristol meant that we often saw spectacular rainbows. It turns out, however, that Hawaii – not Bristol – is the best place in the world to view rainbows, at least according to Steven Businger of (you guessed it) the University of Hawaii.

Indeed, rainbows are so plentiful in the US state that the Hawaiian language has many different words for rainbow-related phenomena. According to Businger, “There are words for Earth-clinging rainbows (uakoko), standing rainbow shafts (kāhili), barely visible rainbows (punakea), and moonbows (ānuenue kau pō), among others”.

Hawaii has lots of rain showers punctuated by sunshine, and Businger points out three factors that make this happen. Hawaii’s mountains create showers, as does the combination of warm sea surfaces and cold air in the morning. Later in the day, the hot sun drives convection-related precipitation. He also says that the relatively clean air of Hawaii makes its rainbows appear crisp and clear.

Businger sets out the case for Hawaii’s pre-eminence in “The secrets of the best rainbows on Earth”.

Not fade away

Staying on the subject of brilliant colours, the Pazyryk carpet is the oldest known knotted-pile carpet in the world, dating back to 400 BCE when it is believed to have been made in central Asia. Now in the State Hermitage Museum in Saint Petersburg, Russia, the carpet’s still-brilliant colours have long puzzled textile experts – particularly because the carpet lay buried in extreme conditions for 2500 years.

Now, researchers at Friedrich-Alexander University Erlangen-Nürnberg (FAU) in Germany have used high-resolution X-ray fluorescence microscopy to conclude that the wool used to make the carpet was fermented before it was dyed. This process allowed more of the dye to penetrate deeper into the wool fibres, which prevented the colour from fading. By fermenting and then dyeing some wool themselves, the team was able to confirm its hypothesis.

You can read more in “X-ray microscopy reveals the outstanding craftsmanship of Siberian Iron Age textile dyers”.

Mysterious moons Phobos and Deimos formed from the same body, say researchers

A new theory for the mysterious origins of Mars’ two tiny moons Phobos and Deimos has been developed by Amirhossein Bagheri and colleagues at ETH Zurich in Switzerland and the US Naval Observatory. The team used a combination of data and modelling to conclude that both moons may have come from the same body (a “protomoon”), which then broke apart. Their model could soon be improved with the help of Japan’s upcoming Martian Moons Exploration mission, due for launch in 2024.

Of all the moons in the solar system, Phobos and Deimos are among the most enigmatic in origin. With their irregular shapes, cratered surfaces, and diameters of just 22 km and 13 km respectively, the pair are commonly believed to be asteroids pulled in by Mars’ gravitational field. However, they also follow highly circular orbital paths, which lie almost perfectly in Mars’ equatorial plane. These paths would be highly unlikely for randomly captured asteroids, suggesting instead that Phobos and Deimos formed alongside Mars.

Yet a further aspect of their orbits means that this theory is also problematic. Notably, Phobos orbits well beneath Mars’ synchronous radius – the distance at which a moon’s orbital period perfectly matches the rotation of its host planet. Meanwhile, Deimos orbits well beyond this point. These factors make it difficult for astronomers to explain how Phobos and Deimos could have formed at the same time.

Tidal interactions

Bagheri’s team explored the problem from a new perspective, this time accounting for the energy dissipated during tidal interactions between the three bodies. To determine how these interactions would affect the orbits of Phobos and Deimos, the researchers used the latest geophysical data for Mars, gathered by NASA’s InSight mission. They also combined these data with both lab-based and theoretical models describing Mars’ tidal deformation.

Based on their results, Bagheri and colleagues conclude that Phobos and Deimos may have both originated from a protomoon – which formed alongside Mars at a roughly synchronous radius, before breaking into two parts. To explore this idea, the researchers used simulations to turn back the clock on the orbits of both moons.

This revealed that after breaking away from Deimos, Phobos would have initially had an elliptical orbit, which became increasingly circular due to tidal energy dissipation. As this happened, Phobos would have moved beneath Mars’ synchronous radius, reaching its current path after roughly 2.7 billion years. Meanwhile, Deimos would have had a more circular orbit from the start, meaning not enough energy was tidally dissipated to drive the moon under the synchronous radius.

This theory offers key clues about the futures of both bodies. While Deimos will likely continue to recede from Mars, Bagheri’s team predicts that Phobos will continue on its inward spiral, eventually either crashing into Mars or tidally disintegrating into a ring in roughly 39 million years. The proposal is still far from complete, but crucial new insights could soon be gathered by the Martian Moons Exploration mission – which will collect samples directly from Phobos and make flyby observations of Deimos.
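
The divergence of the two orbits is easy to reproduce in a toy model of tidal evolution. The sketch below is our own illustration, not the team’s simulation: it integrates the standard constant-lag rate law da/dt ∝ sign(a − a_sync) a^(−11/2), with a single assumed prefactor K calibrated so that Phobos reaches Mars on roughly the 39-million-year timescale quoted above (a real calculation would also weight each moon’s mass and Mars’ frequency-dependent dissipation).

```python
# Toy tidal evolution: a moon inside the synchronous radius spirals in,
# one outside recedes. The prefactor K is an assumed, calibrated value.
import numpy as np

A_SYNC = 20_428.0      # Mars' synchronous orbital radius, km
R_MARS = 3_396.0       # Mars' mean radius, km
K = 2.6e23             # toy prefactor (km^6.5 per Myr), tuned to ~39 Myr

def evolve(a_km, t_myr, dt=0.001):
    """Integrate da/dt = sign(a - A_SYNC) * K * a**(-11/2) for up to t_myr.
    Returns (final semi-major axis, elapsed time); stops on surface impact."""
    t = 0.0
    while t < t_myr and a_km > R_MARS:
        a_km += np.sign(a_km - A_SYNC) * K * a_km ** -5.5 * dt
        t += dt
    return a_km, t

a, t = evolve(9_376.0, 100.0)      # Phobos' present semi-major axis, km
print(f"toy Phobos reaches Mars' surface after ~{t:.0f} Myr")
a, t = evolve(23_463.0, 100.0)     # Deimos' present semi-major axis, km
print(f"toy Deimos drifts slowly outwards to {a:.0f} km after {t:.0f} Myr")
```

With the prefactor recalibrated for each moon’s mass and Mars’ measured dissipation, this is roughly the kind of clock the authors run backwards to reach the protomoon.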

The research is described in Nature Astronomy.

Mass-produced spheroids line up for tissue repair

Spheroids – three-dimensional “balls” of cultured cells – are widely used within medical research and increasingly employed as building blocks for tissue engineering applications. Current methods for creating these spheroids, however, are time-consuming, require large amounts of reagents and have high running costs.

A research team in Austria has now developed a versatile and cost-effective technique for high-throughput spheroid production, using compartmented cell-culture dishes. Writing in Biofabrication, the team explains how the new system considerably reduces production costs and time, and can create spheroids with potential for use in tissue regeneration.

The researchers used laser engraving to quickly and easily create grids on standard 60 mm cell culture dishes. Each dish can produce approximately 200 spheroids, with one spheroid growing in each compartment.

“In previous work, we observed that growth surface restriction can cause focal cell aggregations. Realizing the potential for 3D culture generation, we developed a system based on compartmented culture plates,” explains Sylvia Nürnberger from the Medical University of Vienna. “On these surfaces, mesenchymal stromal cells and chondrocytes not only aggregated, but formed complete spheroids that eventually detached from the surface. This method of spheroid formation is completely new.”

Sylvia Nürnberger and Marian Fürsatz

The novel production method requires less handling time than standard pellet culture approaches, in which spheroids are grown in a 96-well plate, and uses 10 times less media, significantly reducing reagent costs. “Our system also allows for additional early read-outs, since the spheroid formation phase by itself provides additional information on the cellular fitness,” adds first author Marian Fürsatz.

Spheroid growth kinetics

Nürnberger and colleagues studied spheroid formation using human adipose-derived stem cells (ASC/TERT1) and human articular chondrocytes (hAC) – cells with potential for use in cartilage repair.

When seeded in the grid plate, both cell types adhered only to the compartment surface (not to the laser incisions), where they initially formed a 2D cell monolayer. Spheroid formation started with contraction of this monolayer at the compartment edges, followed by rolling up of the cell layer, typically leaving a few points anchored to the plate surface. Conversion into spheroid form involved rapid loss of an anchor point, contraction and then full condensation into a sphere.

The stem cells formed free-floating spheroids after approximately three weeks, the chondrocytes after two months or more. By changing the grid size, the researchers could control the final spheroid size. Growth on 1 and 3 mm grids resulted in ASC/TERT1 spheroids with mean diameters of 134 and 340 µm, respectively.

As the stem cells created spheroids faster than the chondrocytes, the researchers tested whether adding ASC/TERT1 to hAC cultures could speed up their formation. Surprisingly, they found that the co-cultures formed spheroids even faster than ASC/TERT1 cells alone.

Assessing different ASC/TERT1:hAC ratios (80:20, 50:50 and 20:80) revealed that co-cultures with 50:50 and 20:80 ratios formed spheroids in roughly seven and six days, respectively, significantly faster than pure ASC/TERT1 cultures, which took around 21 days. The co-cultures created spheroids with slightly lower diameter, due to their faster formation, with higher hAC ratios causing greater size variation and more small spheroids.

Next, the researchers compared the differentiation capacity of spheroids generated by grid plates and via standard pellet culture. The relative gene expression and differentiation index were comparable between grid-plate and pellet culture spheroids. Both production methods created spheroids with similar circularity and roundness.

Comparing grid-plate spheroids with standard pellet cultures revealed clear differences in their internal structure. In particular, grid-plate ASC/TERT1 spheroids exhibited internal strands of dense matrix. These compact matrix strands might provide a denser and more cartilage-like environment for the cells and a higher stiffness, which could be an advantage for in vivo application.

Tissue regeneration potential

The researchers next assessed the suitability of grid-plate spheroids for cartilage repair, by embedding ASC/TERT1 and 50:50 hAC:ASC/TERT1 spheroids into a fibrin hydrogel. Co-cultures showed strong cellular outgrowth in all directions, reaching an approximate radius of 0.5 mm after 14 days. ASC/TERT1 spheroids showed higher outgrowth towards other spheroids, but slower outgrowth in other directions. Staining revealed that co-cultures induced greater matrix deposition around the spheroids.

Grid-plate spheroids

This strong outgrowth and matrix deposition suggests that the spheroids would likely also grow if delivered in vivo and thus could be used for cartilage repair. “This is one of the intended applications,” Nürnberger tells Physics World. “Spheroids could be implanted directly into the defect or be used as building blocks for bio-printing prior to implantation.”

The team is now investigating strategies to use the spheroids for cartilage defect regeneration, as well as in drug screening, since spheroid formation speed is altered by influences such as cytokines and drugs. “However, we are always thinking of new possibilities to apply our spheroid formation approach to new applications,” adds Nürnberger.

Meteorite hunters find fireball fragments in England, CERN collider has discovered 59 new hadrons

In this podcast episode we talk to Áine O’Brien of the University of Glasgow who is part of a team of meteorite experts who have gathered up remnants of a 100 kg carbonaceous chondrite meteoroid that exploded over southern England on the last day of February. She explains how a network of cameras and clever mathematics allowed scientists to work out where the fragments landed, and what it was like being out in the field looking for them. O’Brien also talks about how studying the meteorites could shed light on the conditions in which the solar system formed.

This week’s podcast also features particle physicist Tim Gershon of the University of Warwick, who gives us a flavour of the 59 new hadrons that have been discovered by CERN’s Large Hadron Collider (LHC) since it switched on in 2010. Gershon works on the LHCb experiment, which itself has just discovered four new exotic hadrons called tetraquarks. He explains what discovering new hadrons could tell us about the strong force and talks about how LHCb is being completely rebuilt so it can detect even more new particles when the LHC restarts – which could be as soon as next year.

Programmable photonic chip lights up quantum computing

A photonic chip balances on a person's finger

Computers are made of chips, and in the future, some of those chips might use light as their main ingredient. Scientists from the Ontario, Canada-based quantum computing firm Xanadu and the US National Institute of Standards and Technology have taken a big step towards that future by building a light-based chip that can be programmed through cloud access.

While conventional computers use electricity to create the ones and zeros that are their lifeblood, quantum computing experts have multiple options when developing their quantum bits (qubits). Some rely on superconductors, some start with extremely cold atoms, and some, like the researchers at Xanadu, use light.

But not just any light. The light that travels through the thumbnail-sized Xanadu chip, or circuit, has been “squeezed” – that is, its quantum uncertainty has been minimized. Squeezing is possible because of the Heisenberg uncertainty relation: reducing the uncertainty of a microscopic object in one property is like squashing a piece of clay – the narrower it gets in one direction, the more it bulges in another. Squeezing light produces precisely shaped photonic states that can be used for very accurate measurements in optical physics. Xanadu researchers, however, had other ideas: they used these squeezed states as qubits.
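
Numerically, squeezing is easy to picture in terms of the two “quadratures” of the light field. The sketch below is our own illustration (the squeezing parameter r is an arbitrary assumed value): it samples Gaussian quadratures of a squeezed vacuum state and confirms that one uncertainty dips below the vacuum level while its conjugate bulges, keeping the product at the Heisenberg bound.

```python
# Squeezed vacuum in quadrature language: var(x) shrinks by e^(-2r) while
# var(p) grows by e^(+2r), with the uncertainty product pinned at 1/2.
import numpy as np

rng = np.random.default_rng(0)
r = 1.2                    # assumed squeezing parameter (arbitrary)
n = 100_000                # number of quadrature samples

# vacuum: both quadratures have variance 1/2 (hbar = 1 convention)
x_vac = rng.normal(0.0, np.sqrt(0.5), n)

# squeezed vacuum: variance e^(-2r)/2 in x, e^(+2r)/2 in p
x_sq = rng.normal(0.0, np.sqrt(0.5) * np.exp(-r), n)
p_sq = rng.normal(0.0, np.sqrt(0.5) * np.exp(+r), n)

print(f"vacuum   var(x) = {x_vac.var():.3f}")   # ~0.500
print(f"squeezed var(x) = {x_sq.var():.3f}")    # ~0.045: quieter than vacuum
print(f"squeezed var(p) = {p_sq.var():.3f}")    # ~5.5: the clay bulges here
print(f"dx * dp = {np.sqrt(x_sq.var() * p_sq.var()):.3f} (Heisenberg bound 0.5)")
```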

Optical computations

Xanadu’s chip works in three stages. First, laser light is fed into four microring resonators – tiny circular tracks in which light loops around and changes shape as it, in effect, catches its own tail. These resonators act as “squeezers” that smush many photons into a single squeezed state.

Next, a network of optical elements manipulates the photons’ properties in a way that is analogous to changing their direction by bouncing them off a mirror or changing their colour by passing them through a filter. Sequences of these light manipulations are the equivalent of computer code. Whenever the network bounces or rotates light, it executes operations similar to adding ones and zeroes in a classical computer.

In the final stage, the light enters a detector that counts how many photons are within each squeezed state. The result of the computer’s calculation lies in these photon numbers. “Some particular integer pattern of photon counts for a particular circuit that you dialled in will tell you something about the problem that you encoded in the device,” says Zachary Vernon, a physicist at Xanadu and a co-author on the study.
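
Those three stages map naturally onto a short gate sequence. As an illustrative sketch – our own construction using Xanadu’s open-source Strawberry Fields library (version-dependent), not the program run on the actual chip – the squeeze–interfere–count pipeline can be simulated locally like this:

```python
# Minimal local simulation of the squeeze -> interfere -> count pipeline.
# Gate parameters are arbitrary illustrative values.
import strawberryfields as sf
from strawberryfields import ops

prog = sf.Program(2)
with prog.context as q:
    ops.Sgate(0.8) | q[0]                # stage 1: squeeze each mode
    ops.Sgate(0.8) | q[1]
    ops.BSgate(0.5, 0.0) | (q[0], q[1])  # stage 2: interfere - the "program"
    ops.MeasureFock() | q                # stage 3: count photons per mode

eng = sf.Engine("gaussian")              # classical simulation of the chip
result = eng.run(prog)
print(result.samples)                    # one sample of photon counts, e.g. [[0 2]]
# repeat eng.run(prog) to build up the photon-count statistics that
# encode the result of the computation
```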

Vernon explains that this approach makes it possible to perform some computations that are new even to other quantum computers. “It lets you access a space of problems which are different than the ones that are accessible by matter-based qubit devices,” he says. In one particularly novel calculation, squeezed states encoded the shape of two graphs. The photon numbers detected at the end of the computation reflected how much structure those graphs had in common. This graph similarity analysis would not be easy to implement on any other quantum computer, Vernon says.

The small size of the Xanadu chip is another key advantage. According to Shuntaro Takeda, a physicist at the University of Tokyo, Japan, who was not involved with the study, previous squeezed-light experiments required large tables full of bulky optical elements like mirrors and lenses. In Takeda’s view, on-chip integration technology like Xanadu’s will be indispensable for building large-scale, general-purpose optical quantum computers in the future.

Being able to perform more than one calculation is already a leap forward for light-based quantum computing, says Zheshen Zhang, a quantum information researcher at the University of Arizona in the US who was also not part of the study. He notes that similar devices could, in the past, execute only one type of code, and could not be programmed to perform different tasks for different users. The Xanadu chip’s accessibility through a cloud service is a further benefit, he says.

Effects of photon loss

To make their devices useful for a broad base of future quantum programmers, Xanadu’s scientists still need to overcome some scientific and engineering challenges. In the current setup, for example, many photons are lost as they travel through the chip due to small flaws in the chip’s structure. Engineering more perfect chips and developing codes that take photon loss into account could be important for future generations of these devices, Zhang says. Future chips will also have to handle more information – and thus more light – before they can outperform classical computers.

One example of a problem where a classical and an optical quantum computer could go head-to-head would involve simulating the behaviour of many molecules. “Can you show that the classical algorithm of simulating such a problem becomes intractable whereas the quantum algorithm would still allow you to actually get the answer?” Zhang asks.

The Xanadu team says that addressing this question is the next item on their agenda. The team has, however, already measured the quantum-ness of the device by demonstrating that approximating its mechanisms by some classical model would be extremely difficult. “If everything else stays the same, and you scale the [chip] system up, it will still be very quantum,” Vernon says. “Of course, a lot of things have to come together to make that work.”

The team reports the work in Nature.

Quantum conference offers business insight

The first quantum revolution, in which research physicists conceived novel experiments to probe and manipulate quantum states, has paved the way for a new era of engineering quantum systems for real-world applications. Quantum technologies are already being explored for improving the security of communications networks and developing more precise sensors, while quantum computing offers the potential to speed up drug discovery, reveal the secrets of protein folding, enable new approaches to machine learning and artificial intelligence – and much more besides.

Critical to the success of such real-world applications will be the development of a commercial ecosystem, in which technology suppliers work alongside research teams to develop and deliver key elements of a practical quantum system. Reflecting this need is a new conference and industry event, Quantum Business Europe, which aims to forge new collaborations and provide the business community with the knowledge, skills and connections they need to embark on the quantum revolution. The fully digital conference will run online on 16–17 March 2021, with all sessions live-streamed and then available to watch on-demand for two months after the event.

The conference organizers hope to provide a forum that will bring together all the key players in Europe’s rapidly growing quantum sector. Delegates will be able to explore the latest advances and business applications of quantum technologies, exchange knowledge and ideas, and better understand the challenges and opportunities offered by quantum technologies.

A high-level conference programme will feature 50 expert speakers, who will offer a strategic view on the future development of the quantum sector and highlight some of the emerging use-cases for quantum technologies. The opening panel session, for example, will include Paula Forteza, a member of the French National Assembly, and Tommaso Calarco, chair of the European Quantum Community Network, and will discuss how Europe is preparing for a quantum future – with more than a billion Euros earmarked for the development of quantum technologies over the next decade.

Other keynote speakers include Accenture’s Matthias Ziegler, who will offer an analysis of the emerging quantum computing ecosystem, and Alexia Auffèves, Head of Quantum Engineering Grenoble, who will discuss the potential of quantum computation to cut energy use and reduce our digital footprint. Parallel sessions in the afternoon will focus on business applications of quantum technologies, ranging from finance and insurance to quantum communications, quantum sensing and quantum computing in the automotive and pharmaceutical industries.

Alongside the conference will run a series of more than 30 technical demonstrations by leading research teams and technology vendors. Intel will be showcasing recent advances in qubit design and control, while Atos will reveal how quantum computing can be used for combinatorial optimization. Cryogenics specialist Bluefors will offer a demo of its Cryogenic Wafer Prober, described in more detail below, while a virtual trade show will feature 20 companies eager to discuss the latest innovations that will provide the building blocks of next-generation quantum systems.

If you would like to take part in the event, visit the Quantum Business Europe website to register for a full conference pass or secure free access to the virtual exhibition and demo sessions.

Cryogenic technology enables quantum progress

The Cryogenic Wafer Prober, developed by Bluefors and Afore

Finnish company Bluefors has perfected a series of commercial cryogenic systems that make it easier to assemble and test a quantum system in ultracold conditions. One recent addition to the portfolio is the Cryogenic Wafer Prober, which enables automated wafer-level testing at temperatures well below 4 K. Developed in partnership with Afore, which specializes in developing application-specific test solutions for semiconductor chips, the automatic testing solution offers fast sample characterization – with a throughput up to 100 times faster than conventional cryogenic chambers – as well as the ability to probe an entire 300 mm wafer.

The Cryogenic Wafer Prober has recently been acquired by CEA-Leti, the technology research institute of the French Alternative Energies and Atomic Energy Commission, to characterize silicon-based qubits at low temperatures. “This unique testing solution will become an essential part of the R&D and ramp-up to future commercial production of quantum and superconducting devices,” commented Bluefors’ Vitaly Emets.

The wafer prober features an active alignment system that can automatically locate and contact devices anywhere on the wafer, while an intuitive user interface provides direct control and full overview of the testing process. In addition, the load-lock system has been designed to allow fast wafer change at cryogenic temperatures.

The Cryogenic Wafer Prober is just one of many innovations that Bluefors has introduced for making quantum experiments quicker and easier to set up. Last year the company introduced the option of high-density wiring, which has become increasingly important as scientists seek to increase the number of qubits in their quantum computing systems. This high-density interface allows more than 1000 high-frequency control lines to be installed in a single system, and has been designed to allow the wires to be installed in blocks of 12.

The high-density interface exploits standard connectors and coaxial cables for the wiring, and the attenuators have been embedded in a single block that fits into Bluefors’ modular cryogenics system. This modular form factor also allows the use of custom components with multiple high-density channels, such as amplifiers, filters and attenuators.

  • For more information about Bluefors’ cryogenics technology, read the Physics World article Cool technology enables quantum computing. On 17 March, you can also tune into the company’s technical demonstration of the Cryogenic Wafer Prober at Quantum Business Europe.

Physicists measure smallest gravitational field yet

Physicists in Austria have measured the gravitational field from the smallest ever object: a gold sphere with a diameter of just 2 mm. Carried out using a miniature torsion balance, the measurement paves the way to even more sensitive gravitational probes that could reveal gravity’s quantum nature.

For years, Einstein’s general theory of relativity and Newton’s universal law of gravitation have been subjected to ever more stringent tests. These tests have involved both astronomical observations and laboratory experiments. Usually, the masses that provide the gravitational field in the latter are large objects of several kilograms or more, such is the need to compensate for gravity’s inherent weakness.

The latest work, in contrast, uses a gold sphere with a mass of just 92 mg as its source. Markus Aspelmeyer and Tobias Westphal of the Institute for Quantum Optics and Quantum Information in Vienna and colleagues positioned this mass a few millimetres away from another tiny gold sphere with about the same mass located at one end of a 4 cm-long glass rod. The rod was suspended at its centre via a silica fibre, while a third sphere at the far end of the rod acted as a counterbalance.

Such “torsion balances” have been used for more than 200 years to make precise measurements of gravity. The idea is that the source mass pulls the near end of the bar towards itself, causing the suspending fibre or wire to rotate. By measuring this rotation and balancing it against the stiffness of the wire, the strength of the gravitational interaction can be calculated. The fact that the bar moves horizontally means it is less exposed to the far larger gravitational field of the Earth.

Noise-reduction strategies

A major challenge with such experiments is screening out noise. Aspelmeyer and colleagues did this by placing the balance in a vacuum to limit acoustic and thermal interference, while also grounding the source mass and placing a Faraday shield between it and the test mass to reduce electromagnetic interactions. In addition, they mainly collected data at night to minimize ambient sources of gravity. This is important because the gravitational attraction of the source mass is equivalent to the pull of a person standing 2.5 m from the experiment or a Vienna tram 50 m away.
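
A quick back-of-envelope check, using our own assumed sphere separation and person mass (only the 92 mg figure comes from the experiment), shows just how feeble the signal is and reproduces that comparison:

```python
# Back-of-envelope numbers for the Vienna experiment (assumed geometry).
G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2

m_src = 92e-6            # 92 mg source sphere (from the experiment), kg
m_test = 92e-6           # test sphere of about the same mass, kg
d = 2.5e-3               # assumed centre-to-centre separation, m

F = G * m_src * m_test / d**2
print(f"force on the test mass: {F:.1e} N")    # ~9e-14 N, a tenth of a piconewton

# acceleration of the test mass due to the source ...
a_src = G * m_src / d**2
# ... versus a 75 kg person (assumed mass) standing 2.5 m away
a_person = G * 75.0 / 2.5**2
print(f"a(source) = {a_src:.1e} m/s^2")        # ~1e-9 m/s^2
print(f"a(person) = {a_person:.1e} m/s^2")     # comparable, as the article notes
```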

A gold sphere resting on a 1 euro cent coin that has been digitally altered to appear warped by the sphere's gravity

To generate signals above the remaining noise, the researchers used a bending piezoelectric device to cyclically move the source towards and away from the test mass. Doing this at a fixed frequency (12.7 mHz) allowed them to look for a corresponding variation in the rotation of the balance – which they measured by bouncing a laser beam off a mirror below the silica fibre.

After repeating this process hundreds of times over a 13.5-hour period and then converting the time-series data into a frequency spectrum, Aspelmeyer and colleagues identified two clear signals above the background. These were the principal oscillation at 12.7 mHz and, at 25.4 mHz, the second harmonic generated by the gravitational field’s nonlinear variation in space. As the researchers point out, both harmonics were well above the resonant frequency of the oscillating balance and below the frequencies of readout noise.

A Newtonian result – for now

By using a camera to record the changing distance between source and test mass, the physicists also plotted how the gravitational force varied in space. They say that their data – a smooth curve dropping off as the inverse square of the distance – provide unambiguous evidence of Newtonian gravity. What’s more, they also calculated their own value for the gravitational constant, G. This quantity remains a headache for metrologists, given the very precise but mutually inconsistent measurements of it made by different groups. The group’s result – a weighted mean based on 29 measurements during the seismically quiet Christmas period in 2019 – is unlikely to resolve those disputes, being around 9.5% smaller than the official CODATA value of 6.674 × 10⁻¹¹ m³ kg⁻¹ s⁻². However, the researchers note that this margin is within the roughly 10% uncertainty they obtain by totting up all the known sources of systematic error in their experiment.

Looking ahead, Aspelmeyer and colleagues argue that their experimental approach could in principle be extended to still smaller source masses. In particular, they say it should be possible to significantly reduce thermal noise by increasing the fibre’s quality factor. Raising the current value of about 5 to more than 20,000 could allow for source masses below the Planck mass of 22 μg – thereby raising the prospect of probing quantum gravity.

Getting to that point will, they caution, require mitigating other sources of noise. However, they reckon that these problems are solvable. Low-frequency noise from human sources, for example, could be reduced by transferring the experiment to a suitably remote location. Casimir forces, meanwhile, could be limited through electromagnetic shielding and signal modulation.

Andrew Geraci of Northwestern University in the US agrees that the work could lead to quantum-based investigations. He explains that placing very small objects into a quantum superposition would allow scientists to determine whether gravity plays a role in the entanglement of quantum systems. “While there is still a long way to go before this can be achieved,” he says, “I consider the work to be exciting progress in this direction.”

The research is published in Nature.

Integrated system offers easy and scalable quantum control

A quantum chip may measure just a few millimetres across, but the equipment needed to cool, control and measure such delicate quantum systems can fill an entire physics lab. Quite apart from the large cryostat that maintains the qubits at ultracold temperatures, a typical experimental set-up incorporates dozens of discrete electronic instruments that enable the quantum processor to perform logic operations.

Since a quantum processor is essentially an analogue system, electronics equipment is needed to convert digital commands from a conventional computer into high-frequency electric pulses that alter the state of the qubit. Data acquisition systems are then used to measure the result of the quantum operation and relay it back to the PC.

Qblox Cluster with CEO Niels Bultink and CTO Jules van Oven

“Most quantum labs patch together a solution using separate pieces of general-purpose lab equipment,” says Niels Bultink, an experimental quantum physicist and co-founder of start-up company Qblox. “These instruments have not been optimized for the specific needs of quantum systems, and it takes a lot of time and money to set up the experiments.”

Such complex experimental set-ups not only occupy a lot of lab space, but they are also prone to errors and connectivity issues that can be difficult to locate and resolve. The hardware challenge is difficult enough when controlling quantum processors containing just a few qubits, but such piecemeal installations for quantum control and measurement will become unsustainable as researchers scale up their quantum processors – from tens of qubits today to hundreds and even thousands of quantum bits in the future.

That hardware bottleneck has been apparent for some time, and in 2015 Bultink was a PhD researcher working on an IARPA-funded project with professor Leonardo DiCarlo at QuTech – a quantum research centre in Delft, the Netherlands – to develop more scalable control electronics. Bultink saw commercial potential in the instrumentation he was working on, and in 2018 he joined forces with fellow physicist Jules van Oven to establish Qblox and bring fully-integrated control electronics to the growing quantum market.

Using the technology developed at QuTech as a springboard, the Qblox team has fundamentally reimagined the architecture of quantum control to create a single integrated system, called the Cluster, that provides all the functionality needed to manipulate and measure quantum computers.

“With general-purpose lab equipment it can take weeks or even months to fine-tune all the parameters for a multi-qubit device,” comments Bultink. “The Qblox architecture can speed up these routines by orders of magnitude, saving research teams significant amounts of time and money.”

The Cluster is a modular system that has been designed with scalability in mind: a single unit fits inside a standard 19″ rack mount and provides control of systems containing a maximum of 20 qubits, while additional modules can be connected together to operate quantum processors with hundreds of quantum bits.

Each module in the Cluster system contains all the instrumentation needed to control and read out a quantum computer, including waveform generators, frequency up- and down-conversion and data-acquisition tools. As well as miniaturizing the electronic components and integrating them together, each of the components has been optimized for use with quantum systems. “We have created an entirely new architecture tailored to the peculiar requirements of qubits, reducing size by a factor of 100,” says Bultink.

One key focus for the Qblox team was to minimize the noise in the instrumentation, since quantum systems are extremely sensitive to noise. “The noise from the control system directly induces errors in quantum computation,” explains Bultink. “We developed a new class of waveform generator that operates at noise levels four times lower than the best alternative on the market.”

Careful attention has also been paid to reducing drift in the instrumentation, which is essential to ensure that measurements are stable and reproducible. Gain and offset drifts have been reduced by a factor of 10, to just a few ppm/K, while automated calibration reduces the time needed to set up the equipment from weeks to just a few hours.

Along with developing the hardware, Qblox has created open-source control software called Quantify (with co-development partner Orange Quantum Systems). “Quantify contains all the basic functionality to control experiments,” explains Bultink. “It also has a novel scheduler featuring a unique hybrid control model that allows quantum gate- and pulse-level descriptions to be combined in a clearly defined and hardware-agnostic way.”

Qblox Pulsar assembly

As well as optimizing the performance of the electronics in each individual module, one of the big challenges for the Qblox team was to ensure that multiple modules connected together would work effectively as a single, larger unit. This requires precise timing control, so that the control and read-out tasks performed by each module are synchronized. To achieve this, Qblox has exploited its proprietary SYNQ protocol, which ensures that all outgoing signals have a fixed and stable timing relation with respect to each other, down to the nanosecond.

The other important parameter is latency, a measure of the time taken for the control system to send signals to the quantum system and record the result. Low latency is essential for experimental protocols that require feedback mechanisms, such as quantum error correction (QEC), where the control of one qubit depends on the measurement of another qubit just a few hundred nanoseconds earlier. “QEC is an emerging research area that seeks to remove errors from a quantum system,” explains Bultink. “Qubits are faulty by nature, and the ability to correct for these errors becomes increasingly important as more qubits are added to the system.”

Bultink explains that QEC requires the time between a measurement and a subsequent operation to be short compared with the timescale over which qubits can retain their information – a requirement that becomes more challenging as the control system is scaled up.

“For this, we have created a massively scalable infrastructure to share qubit measurement outcomes between the modules,” continues Bultink. The LINQ protocol developed by Qblox distributes measurement outcomes to all modules in less than 200 ns. “Doing this for a handful of qubits may sound difficult, but solving this for hundreds of qubits is one of the coolest challenges we have ever worked on,” he says.
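
The arithmetic behind the requirement is simple, even if the engineering is not. Here is a toy latency budget – all numbers are our own illustrative assumptions except the 200 ns LINQ figure quoted above:

```python
# Toy feedback-latency budget for measurement-conditioned control.
t_readout    = 300e-9   # assumed qubit readout duration, s
t_distribute = 200e-9   # LINQ measurement-sharing latency (from the article)
t_cond_gate  = 50e-9    # assumed conditional-gate duration, s
t_coherence  = 100e-6   # assumed qubit coherence time, s

t_loop = t_readout + t_distribute + t_cond_gate
print(f"feedback loop: {t_loop * 1e9:.0f} ns, "
      f"{t_loop / t_coherence:.2%} of the coherence time")
# Error correction stays viable only while this fraction remains small -
# which is what gets harder as more qubits and modules join the system.
```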

The Cluster system can be used to control just about any experimental implementation of a quantum processor. Existing customers are using the system to control quantum computers based on superconducting qubits and quantum dots, for which the Cluster provides a single plug-and-play solution that can directly generate signals ranging from ultrastable DC to frequencies up to 18.5 GHz. For some quantum systems, such as those based on trapped ions, cold atoms, or nitrogen-vacancy (NV) centres in diamond, additional laser systems are needed for conversion into the optical regime, although all the necessary interfaces are provided.

Indeed, the Qblox team is currently installing a Cluster system for use with a quantum processor based on diamond NV centres. “We want researchers to challenge us with their experimental requirements,” says Bultink. “Our goal is to solve their control stack problems.”

While the Cluster systems can handle set-ups with hundreds of qubits, Bultink is well aware that quantum computers are likely to need thousands of qubits to offer a realistic alternative to classical computation. It may be too early to reveal the next milestone on the Qblox roadmap, but the company is part of an EU-funded project (part of Horizon 2020) that aims to produce the hardware needed to control systems with more than a thousand qubits.

More generally, Bultink sees Qblox as a vital part of a growing commercial ecosystem for the development of practical quantum computers with real-world applications. “It is really exciting to be part of the birth of the quantum industry,” he says. “More companies are providing solutions for different elements of a quantum computing build – not necessarily competing, but producing systems that can be integrated together to enable the first applications of quantum computing.”
