Meteorite hunters find fireball fragments in England, CERN collider discovers 59 new hadrons

In this podcast episode we talk to Áine O’Brien of the University of Glasgow who is part of a team of meteorite experts who have gathered up remnants of a 100 kg carbonaceous chondrite meteoroid that exploded over southern England on the last day of February. She explains how a network of cameras and clever mathematics allowed scientists to work out where the fragments landed, and what it was like being out in the field looking for them. O’Brien also talks about how studying the meteorites could shed light on the conditions in which the solar system formed.

This week’s podcast also features particle physicist Tim Gershon of the University of Warwick, who gives us a flavour of the 59 new hadrons that have been discovered by CERN’s Large Hadron Collider (LHC) since it switched on in 2010. Gershon works on the LHCb experiment, which itself has just discovered four new exotic hadrons called tetraquarks. He explains what discovering new hadrons could tell us about the strong force and talks about how LHCb is being completely rebuilt so it can detect even more new particles when the LHC restarts – which could be as soon as next year.

Programmable photonic chip lights up quantum computing

A photonic chip balances on a person's finger

Computers are made of chips, and in the future, some of those chips might use light as their main ingredient. Scientists from the Ontario, Canada-based quantum computing firm Xanadu and the US National Institute of Standards and Technology have taken a big step towards that future by building a light-based chip that can be programmed through cloud access.

While conventional computers use electricity to create the ones and zeros that are their lifeblood, quantum computing experts have multiple options when developing their quantum bits (qubits). Some rely on superconductors, some start with extremely cold atoms, and some, like the researchers at Xanadu, use light.

But not just any light. The light that travels through the thumbnail-sized Xanadu chip, or circuit, has been “squeezed” – that is, its quantum uncertainty has been minimized in one variable. Squeezing is possible because the Heisenberg uncertainty relation makes a microscopic object behave like a piece of clay: the narrower you squash it in one direction, the more it bulges in another. Squeezing light produces precisely shaped photonic states that can be used for very accurate measurements in optical physics. Xanadu researchers, however, had other ideas: they used these squeezed states as qubits.
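For readers who want the relation behind the clay analogy, the textbook statement – in one standard quantum-optics convention, not notation taken from the paper itself – is that a squeezed vacuum state with squeezing parameter r narrows one quadrature below the vacuum level while stretching the other:

    \Delta X_1 \,\Delta X_2 \;\ge\; \tfrac{1}{4}, \qquad \Delta X_1 = \tfrac{1}{2}e^{-r}, \quad \Delta X_2 = \tfrac{1}{2}e^{+r}

The product still saturates the Heisenberg bound: squeezing does not remove uncertainty, it only redistributes it between the two quadratures.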

Optical computations

Xanadu’s chip works in three stages. First, laser light is fed into four microring resonators – tiny circular tracks in which light loops around and changes shape as it, in effect, catches its own tail. These resonators act as “squeezers” that smush many photons into a single squeezed state.

Next, a network of optical elements manipulates the photons’ properties in a way that is analogous to changing their direction by bouncing them off a mirror or changing their colour by passing them through a filter. Sequences of these light manipulations are the equivalent of computer code. Whenever the network bounces or rotates light, it executes operations similar to adding ones and zeroes in a classical computer.

In the final stage, the light enters a detector that counts how many photons are within each squeezed state. The result of the computer’s calculation lies in these photon numbers. “Some particular integer pattern of photon counts for a particular circuit that you dialled in will tell you something about the problem that you encoded in the device,” says Zachary Vernon, a physicist at Xanadu and a co-author on the study.
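To make the three stages concrete, here is a minimal sketch of the same pipeline written with Strawberry Fields, Xanadu’s open-source photonics library. The squeezing values and beam-splitter angles below are illustrative assumptions, not the parameters of the actual chip.

    import strawberryfields as sf
    from strawberryfields import ops

    prog = sf.Program(4)  # four optical modes, mirroring the four microring squeezers

    with prog.context as q:
        for mode in q:
            ops.Sgate(0.6) | mode            # stage 1: squeeze each mode
        ops.BSgate(0.7, 0.1) | (q[0], q[1])  # stage 2: a network of beam
        ops.BSgate(0.5, 0.2) | (q[2], q[3])  # splitters and phase shifts
        ops.Rgate(0.4) | q[1]                # manipulates the modes
        ops.BSgate(0.9, 0.0) | (q[1], q[2])
        ops.MeasureFock() | q                # stage 3: count photons in every mode

    eng = sf.Engine(backend="gaussian")
    print(eng.run(prog).samples[0])          # e.g. [0, 2, 1, 1]

Each run returns one integer per mode – exactly the kind of photon-count pattern that, in Vernon’s words, encodes the answer to the problem dialled into the device.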

Vernon explains that this approach makes it possible to perform some computations that are new even to other quantum computers. “It lets you access a space of problems which are different than the ones that are accessible by matter-based qubit devices,” he says. In one particularly novel calculation, squeezed states encoded the shape of two graphs. The photon numbers detected at the end of the computation reflected how much structure those graphs had in common. This graph similarity analysis would not be easy to implement on any other quantum computer, Vernon says.
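For the graph problems Vernon mentions, Strawberry Fields also provides a GraphEmbed operation that decomposes a graph’s adjacency matrix into just such a squeezer-plus-interferometer circuit. The four-node graph below is a made-up example; repeated sampling builds up the photon-count statistics from which graph similarity can be estimated.

    import numpy as np
    import strawberryfields as sf
    from strawberryfields import ops

    # Adjacency matrix of a made-up four-node graph
    A = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 1],
                  [1, 1, 0, 1],
                  [0, 1, 1, 0]])

    prog = sf.Program(4)
    with prog.context as q:
        ops.GraphEmbed(A) | q    # squeezers + interferometer that encode the graph
        ops.MeasureFock() | q

    eng = sf.Engine(backend="gaussian")
    print(eng.run(prog).samples[0])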

The small size of the Xanadu chip is another key advantage. According to Shuntaro Takeda, a physicist at the University of Tokyo, Japan, who was not involved with the study, previous squeezed-light experiments required large tables full of bulky optical elements like mirrors and lenses. In Takeda’s view, on-chip integration technology like Xanadu’s will be indispensable for building large-scale, general-purpose optical quantum computers in the future.

Being able to perform more than one calculation is already a leap forward for light-based quantum computing, says Zheshen Zhang, a quantum information researcher at the University of Arizona in the US who was also not part of the study. He notes that similar devices could, in the past, execute only one type of code, and could not be programmed to perform different tasks for different users. The Xanadu chip’s accessibility through a cloud service is a further benefit, he says.

Effects of photon loss

To make their devices useful for a broad base of future quantum programmers, Xanadu’s scientists still need to overcome some scientific and engineering challenges. In the current setup, for example, many photons are lost as they travel through the chip due to small flaws in the chip’s structure. Engineering chips with fewer flaws and developing codes that take photon loss into account could be important for future generations of these devices, Zhang says. Future chips will also have to handle more information – and thus more light – before they can outperform classical computers.

One example of a problem where a classical and an optical quantum computer could go head-to-head would involve simulating the behaviour of many molecules. “Can you show that the classical algorithm of simulating such a problem becomes intractable whereas the quantum algorithm would still allow you to actually get the answer?” Zhang asks.

The Xanadu team says that addressing this question is the next item on their agenda. The team has, however, already measured the quantum-ness of the device by demonstrating that approximating its mechanisms by some classical model would be extremely difficult. “If everything else stays the same, and you scale the [chip] system up, it will still be very quantum,” Vernon says. “Of course, a lot of things have to come together to make that work.”

The team reports the work in Nature.

Quantum conference offers business insight

The first quantum revolution, in which research physicists conceived novel experiments to probe and manipulate quantum states, has paved the way for a new era of engineering quantum systems for real-world applications. Quantum technologies are already being explored for improving the security of communications networks and developing more precise sensors, while quantum computing offers the potential to speed up drug discovery, reveal the secrets of protein folding, enable new approaches to machine learning and artificial intelligence – and much more besides.

Critical to the success of such real-world applications will be the development of a commercial ecosystem, in which technology suppliers work alongside research teams to develop and deliver key elements of a practical quantum system. Reflecting this need is a new conference and industry event, Quantum Business Europe, which aims to forge new collaborations and provide the business community with the knowledge, skills and connections they need to embark on the quantum revolution. The fully digital conference will run online on 16–17 March 2021, with all sessions live-streamed and then available to watch on-demand for two months after the event.

The conference organizers hope to provide a forum that will bring together all the key players in Europe’s rapidly growing quantum sector. Delegates will be able to explore the latest advances and business applications of quantum technologies, exchange knowledge and ideas, and better understand the challenges and opportunities offered by quantum technologies.

A high-level conference programme will feature 50 expert speakers, who will offer a strategic view on the future development of the quantum sector and highlight some of the emerging use-cases for quantum technologies. The opening panel session, for example, will include Paula Forteza, a member of the French National Assembly, and Tommaso Calarco, chair of the European Quantum Community Network, and will discuss how Europe is preparing for a quantum future – with more than a billion euros earmarked for the development of quantum technologies over the next decade.

Other keynote speakers include Accenture’s Matthias Ziegler, who will offer an analysis of the emerging quantum computing ecosystem, and Alexia Auffèves, Head of Quantum Engineering Grenoble, who will discuss the potential of quantum computation to cut energy use and reduce our digital footprint. Parallel sessions in the afternoon will focus on business applications of quantum technologies, ranging from finance and insurance to quantum communications, quantum sensing and quantum computing in the automotive and pharmaceutical industries.

Alongside the conference will run a series of more than 30 technical demonstrations by leading research teams and technology vendors. Intel will be showcasing recent advances in qubit design and control, while Atos will reveal how quantum computing can be used for combinatorial optimization. Cryogenics specialist Bluefors will offer a demo of its Cryogenic Wafer Prober, described in more detail below, while a virtual trade show will feature 20 companies eager to discuss the latest innovations that will provide the building blocks of next-generation quantum systems.

If you would like to take part in the event, visit the Quantum Business Europe website to register for a full conference pass or secure free access to the virtual exhibition and demo sessions.

Cryogenic technology enables quantum progress

The Cryogenic Wafer Prober, developed by Bluefors in partnership with Afore

Finnish company Bluefors has perfected a series of commercial cryogenic systems that make it easier to assemble and test a quantum system in ultracold conditions. One recent addition to the portfolio is the Cryogenic Wafer Prober, which enables automated wafer-level testing at temperatures well below 4 K. Developed in partnership with Afore, which specializes in developing application-specific test solutions for semiconductor chips, the automatic testing solution offers fast sample characterization – with a throughput up to 100 times faster than conventional cryogenic chambers – as well as the ability to probe an entire 300 mm wafer.

The Cryogenic Wafer Prober has recently been acquired by CEA-Leti, the technology research institute of the French Alternative Energies and Atomic Energy Commission, to characterize silicon-based qubits at low temperatures. “This unique testing solution will become an essential part of the R&D and ramp-up to future commercial production of quantum and superconducting devices,” commented Bluefors’ Vitaly Emets.

The wafer prober features an active alignment system that can automatically locate and contact devices anywhere on the wafer, while an intuitive user interface provides direct control and full overview of the testing process. In addition, the load-lock system has been designed to allow fast wafer change at cryogenic temperatures.

The Cryogenic Wafer Prober is just one of many innovations that Bluefors has introduced for making quantum experiments quicker and easier to set up. Last year the company introduced the option of high-density wiring, which has become increasingly important as scientists seek to increase the number of qubits in their quantum computing systems. This high-density interface allows more than 1000 high-frequency control lines to be installed in a single system, and has been designed to allow the wires to be installed in blocks of 12.

The high-density interface exploits standard connectors and coaxial cables for the wiring, and the attenuators have been embedded in a single block that fits into Bluefors’ modular cryogenics system. This modular form factor also allows the use of custom components with multiple high-density channels, such as amplifiers, filters and attenuators.

  • For more information about Bluefors’ cryogenics technology, read the Physics World article Cool technology enables quantum computing. On 17 March, you can also tune into the company’s technical demonstration of the Cryogenic Wafer Prober at Quantum Business Europe.

Physicists measure smallest gravitational field yet

Physicists in Austria have measured the gravitational field from the smallest ever object: a gold sphere with a diameter of just 2 mm. Carried out using a miniature torsion balance, the measurement paves the way to even more sensitive gravitational probes that could reveal gravity’s quantum nature.

For years, Einstein’s general theory of relativity and Newton’s universal law of gravitation have been subjected to ever more stringent tests. These tests have involved both astronomical observations and laboratory experiments. Usually, the masses that provide the gravitational field in the latter are large objects of several kilograms or more, such is the need to compensate for gravity’s inherent weakness.

The latest work, in contrast, uses a gold sphere with a mass of just 92 mg as its source. Markus Aspelmeyer and Tobias Westphal of the Institute for Quantum Optics and Quantum Information in Vienna and colleagues positioned this mass a few millimetres away from another tiny gold sphere with about the same mass located at one end of a 4 cm-long glass rod. The rod was suspended at its centre via a silica fibre, while a third sphere at the far end of the rod acted as a counterbalance.

Such “torsion balances” have been used for more than 200 years to make precise measurements of gravity. The idea is that the source mass pulls the near end of the bar towards itself, causing the suspending fibre or wire to rotate. By measuring this rotation and balancing it against the stiffness of the wire, the strength of the gravitational interaction can be calculated. The fact that the bar moves horizontally means it is less exposed to the far larger gravitational field of the Earth.
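Schematically – the notation here is ours, not the paper’s – the equilibrium twist angle θ follows from balancing the gravitational torque against the restoring torque of the fibre, whose torsional stiffness is κ:

    \theta \;=\; \frac{G\,m M}{\kappa\, d^{2}}\, L_{\mathrm{arm}}

where m and M are the test and source masses, d is their separation and L_arm ≈ 2 cm is the lever arm from the suspension fibre to the test sphere. With κ calibrated, measuring θ yields the gravitational force.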

Noise-reduction strategies

A major challenge with such experiments is screening out noise. Aspelmeyer and colleagues did this by placing the balance in a vacuum to limit acoustic and thermal interference, while also grounding the source mass and placing a Faraday shield between it and the test mass to reduce electromagnetic interactions. In addition, they mainly collected data at night to minimize ambient sources of gravity. This is important because the gravitational attraction of the source mass is equivalent to the pull of a person standing 2.5 m from the experiment or a Vienna tram 50 m away.
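That equivalence is easy to check with a back-of-envelope calculation; the 75 kg bystander mass and the few-millimetre sphere separation below are our assumed values, not figures from the paper.

    G = 6.674e-11                  # gravitational constant, m^3 kg^-1 s^-2

    m_test = 92e-6                 # 92 mg test mass, kg
    m_src  = 92e-6                 # 92 mg source mass, kg
    d_src  = 2.5e-3                # assumed sphere separation, m

    m_person = 75.0                # assumed bystander mass, kg
    d_person = 2.5                 # distance quoted above, m

    F_src    = G * m_src * m_test / d_src**2
    F_person = G * m_person * m_test / d_person**2

    print(f"source-mass pull: {F_src:.1e} N")     # ~9e-14 N
    print(f"person at 2.5 m:  {F_person:.1e} N")  # comparable magnitude

Both forces come out at roughly 10⁻¹³ N, within a factor of two of each other.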

A gold sphere resting on a 1 euro cent coin that has been digitally altered to appear warped by the sphere's gravity

To generate signals above the remaining noise, the researchers used a bending piezoelectric device to cyclically move the source towards and away from the test mass. Doing this at a fixed frequency (12.7 mHz) allowed them to look for a corresponding variation in the rotation of the balance – which they measured by bouncing a laser beam off a mirror below the silica fibre.

After repeating this process hundreds of times over a 13.5-hour period and then converting the time-series data into a frequency spectrum, Aspelmeyer and colleagues identified two clear signals above the background. These were the principal oscillation at 12.7 mHz and, at 25.4 mHz, the second harmonic generated by the gravitational field’s nonlinear variation in space. As the researchers point out, both harmonics were well above the resonant frequency of the oscillating balance and below the frequencies of readout noise.
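A toy model shows why the second harmonic appears: modulating the gap in a 1/d² force law is nonlinear, so the force picks up overtones of the drive. The mean gap, modulation amplitude and run duration below are assumed values chosen for a clean spectrum, not the experiment’s.

    import numpy as np

    G, m, M = 6.674e-11, 9.2e-5, 9.2e-5
    d0, A, f = 2.5e-3, 1.0e-3, 12.7e-3        # mean gap (m), modulation (m), drive (Hz)
    t = np.arange(0, 10_000, 1.0)             # duration chosen so f lands exactly on an FFT bin
    force = G * m * M / (d0 + A * np.sin(2 * np.pi * f * t))**2

    spectrum = np.abs(np.fft.rfft(force - force.mean()))
    freqs = np.fft.rfftfreq(t.size, d=1.0)
    for k in sorted(np.argsort(spectrum)[-2:]):
        print(f"peak at {freqs[k] * 1e3:.1f} mHz")   # 12.7 mHz and 25.4 mHz

The two dominant peaks fall at the drive frequency and its second harmonic, just as in the measured spectrum.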

A Newtonian result – for now

By using a camera to record the changing distance between source and test mass, the physicists also plotted how the gravitational force varied in space. They say that their data – a smooth curve dropping off as the square of the distance – provide unambiguous evidence of Newtonian gravity. What’s more, they also calculated their own value for the gravitational constant, G. This quantity remains a headache for metrologists, given the very precise but mutually inconsistent measurements of it made by different groups. The group’s result – a weighted mean based on 29 measurements during the seismically quiet Christmas period in 2019 – is unlikely to resolve those disputes, being around 9.5% smaller than the official CODATA value of 6.674 × 10⁻¹¹ m³ kg⁻¹ s⁻². However, the researchers note that this margin is within the roughly 10% uncertainty they obtain by totting up all the known sources of systematic error in their experiment.

Looking ahead, Aspelmeyer and colleagues argue that their experimental approach could in principle be extended to still smaller source masses. In particular, they say it should be possible to significantly reduce thermal noise by increasing the fibre’s quality factor. Raising the current value of about 5 to more than 20,000 could allow for source masses below the Planck mass of 22 μg – thereby raising the prospect of probing quantum gravity.

Getting to that point will, they caution, require mitigating other sources of noise. However, they reckon that these problems are solvable. Low-frequency noise from human sources, for example, could be reduced by transferring the experiment to a suitably remote location. Casimir forces, meanwhile, could be limited through electromagnetic shielding and signal modulation.

Andrew Geraci of Northwestern University in the US agrees that the work could lead to quantum-based investigations. He explains that placing very small objects into a quantum superposition would allow scientists to determine whether gravity plays a role in the entanglement of quantum systems. “While there is still a long way to go before this can be achieved,” he says, “I consider the work to be exciting progress in this direction.”

The research is published in Nature.

Integrated system offers easy and scalable quantum control

A quantum chip may measure just a few millimetres across, but the equipment needed to cool, control and measure such delicate quantum systems can fill an entire physics lab. Quite apart from the large cryostat that maintains the qubits at ultracold temperatures, a typical experimental set-up incorporates dozens of discrete electronic instruments that enable the quantum processor to perform logic operations.

Since a quantum processor is essentially an analogue system, electronics equipment is needed to convert digital commands from a conventional computer into high-frequency electric pulses that alter the state of the qubit. Data acquisition systems are then used to measure the result of the quantum operation and relay it back to the PC.

Qblox Cluster with CEO Niels Bultink and CTO Jules van Oven

“Most quantum labs patch together a solution using separate pieces of general-purpose lab equipment,” says Niels Bultink, an experimental quantum physicist and co-founder of start-up company Qblox. “These instruments have not been optimized for the specific needs of quantum systems, and it takes a lot of time and money to set up the experiments.”

Such complex experimental set-ups not only occupy a lot of lab space, but they are also prone to errors and connectivity issues that can be difficult to locate and resolve. The hardware challenge is difficult enough when controlling quantum processors containing just a few qubits, but such piecemeal installations for quantum control and measurement will become unsustainable as researchers scale up their quantum processors – from tens of qubits today to hundreds and even thousands of quantum bits in the future.

That hardware bottleneck has been apparent for some time, and in 2015 Bultink was a PhD researcher working on an IARPA-funded project with professor Leonardo DiCarlo at QuTech – a quantum research centre in Delft, the Netherlands – to develop more scalable control electronics. Bultink saw commercial potential in the instrumentation he was working on, and in 2018 he joined forces with fellow physicist Jules van Oven to establish Qblox and bring fully-integrated control electronics to the growing quantum market.

Using the technology developed at QuTech as a springboard, the Qblox team has fundamentally reimagined the architecture of quantum control to create a single integrated system, called the Cluster, that provides all the functionality needed to manipulate and measure quantum computers.

“With general-purpose lab equipment it can take weeks or even months to fine-tune all the parameters for a multi-qubit device,” comments Bultink. “The Qblox architecture can speed up these routines by orders of magnitude, saving research teams significant amounts of time and money.”

The Cluster is a modular system that has been designed with scalability in mind: a single unit fits inside a standard 19″ rack mount and provides control of systems containing a maximum of 20 qubits, while additional modules can be connected together to operate quantum processors with hundreds of quantum bits.

We have created an entirely new architecture tailored to the peculiar requirements of qubits, reducing size by a factor of 100.

Niels Bultink, Qblox co-founder

Each module in the Cluster system contains all the instrumentation needed to control and read out a quantum computer, including waveform generators, frequency up- and down-conversion and data-acquisition tools. As well as miniaturizing the electronic components and integrating them together, each of the components has been optimized for use with quantum systems. “We have created an entirely new architecture tailored to the peculiar requirements of qubits, reducing size by a factor of 100,” says Bultink.

One key focus for the Qblox team was to minimize the noise in the instrumentation, since quantum systems are extremely sensitive to noise. “The noise from the control system directly induces errors in quantum computation,” explains Bultink. “We developed a new class of waveform generator that operates at noise levels four times lower than the best alternative on the market.”

Careful attention was also paid to reducing drift in the instrumentation, which is essential to ensure that measurements are stable and reproducible. Gain and offset drifts have been reduced by a factor of 10, to just a few ppm/K, while automated calibration reduces the time needed to set up the equipment from weeks to just a few hours.

Along with developing the hardware, Qblox has created open-source control software called Quantify (with co-development partner Orange Quantum Systems). “Quantify contains all the basic functionality to control experiments,” explains Bultink. “It also has a novel scheduler featuring a unique hybrid control model that allows quantum gate- and pulse-level descriptions to be combined in a clearly defined and hardware-agnostic way.”

Qblox Pulsar assembly

As well as optimizing the performance of the electronics in each individual module, one of the big challenges for the Qblox team was to ensure that multiple modules connected together work effectively as a single, larger unit. This requires precise timing control so that the control and read-out tasks performed by each module are synchronized. To achieve this, Qblox has developed its proprietary SYNQ protocol, which ensures that all outgoing signals have a fixed and stable timing relationship with respect to each other, down to the nanosecond.

The other important parameter is latency, a measure of the time taken for the control system to send signals to the quantum system and record the result. Low latency is essential for experimental protocols that require feedback mechanisms, such as quantum error correction (QEC), where the control of one qubit depends on the measurement of another qubit just a few hundred nanoseconds earlier. “QEC is an emerging research area that seeks to remove errors from a quantum system,” explains Bultink. “Qubits are faulty by nature, and the ability to correct for these errors becomes increasingly important as more qubits are added to the system.”

Bultink explains that QEC requires the time between a measurement and a subsequent operation to be short compared with the timescale over which qubits can retain their information, which becomes more challenging as the control system is scaled up.

“For this, we have created a massively scalable infrastructure to share qubit measurement outcomes between the modules,” continues Bultink. The LINQ protocol developed by Qblox distributes measurement outcomes to all modules in less than 200 ns. “Doing this for a handful of qubits may sound difficult, but solving this for hundreds of qubits is one of the coolest challenges we have ever worked on,” he says.

The Cluster system can be used to control just about any experimental implementation of a quantum processor. Existing customers are using the system to control quantum computers based on superconducting qubits and quantum dots, for which the Cluster provides a single plug-and-play solution that can directly generate signals ranging from ultrastable DC to frequencies up to 18.5 GHz. For some quantum systems, such as those based on trapped ions, cold atoms, or nitrogen-vacancy (NV) centres in diamond, additional laser systems are needed for conversion into the optical regime, although all the necessary interfaces are provided.

We want researchers to challenge us with their experimental requirements. Our goal is to solve their control stack problems.

Niels Bultink, Qblox co-founder

Indeed, the Qblox team is currently installing a Cluster system for use with a quantum processor based on diamond NV centres. “We want researchers to challenge us with their experimental requirements,” says Bultink. “Our goal is to solve their control stack problems.”

While the Cluster systems can handle set-ups with hundreds of qubits, Bultink is well aware that quantum computers are likely to need thousands of qubits to offer a realistic alternative to classical computation. It may be too early to reveal the next milestone on the Qblox roadmap, but the company is part of an EU-funded project (part of Horizon 2020) that aims to produce the hardware needed to control systems with more than a thousand qubits.

More generally, Bultink sees Qblox as a vital part of a growing commercial ecosystem for the development of practical quantum computers with real-world applications. “It is really exciting to be part of the birth of the quantum industry,” he says. “More companies are providing solutions for different elements of a quantum computing build – not necessarily competing, but producing systems that can be integrated together to enable the first applications of quantum computing.”

Global audience tunes into online APS March Meeting

What a difference a year makes. In 2020 the APS March Meeting was one of the first casualties of the worsening COVID-19 pandemic, with the event cancelled just hours before it was due to get under way. This year, like all other major scientific conferences, the March Meeting will be convened online – enabling physicists in all parts of the world to explore the latest research breakthroughs and technical innovations.

The APS is expecting more than 11,000 scientists and students to log on for the online event. The main draw will be the scientific programme, with parallel sessions running from Monday 15 March to Friday 19 March, which will be complemented by pre-meeting tutorials and short courses, a series of Industry Days, and events designed specifically for undergraduate students to present their research, learn about career options, and connect with the scientific community.

A virtual exhibit will run from Monday through to Thursday, with company representatives available to discuss their products from 12.00 p.m. to 3.00 p.m. (Central Time). A few of the new product innovations that will be presented at the exhibit are highlighted below.

Superconducting magnet system offers more efficient cooling

The DryMag superconducting magnet system from Lake Shore Cryotronics

The Janis DryMag 1.5 K superconducting magnet system from Lake Shore Cryotronics offers a more cost-effective solution for low-temperature material research. Providing a continuous temperature range of 1.5 K to 420 K – even with the magnet operating at full field – the cryogen-free system now enables an easy shift between in-plane and out-of-plane measurements by providing both a 2D vector field magnet configuration and 0–90° precision sample rotation.

With an initial cooldown time of less than 24 hours (with the 9 T magnet), the system offers accurate and simultaneous temperature control of the sample mount and the surrounding helium exchange gas. The sample is located in static helium exchange gas for efficient cooling, regardless of sample material or shape.

In addition, the system is available in optical and non-optical geometries, and with a sample-in-vacuum configuration. An optional electrical transport measurement package is available, which includes Lake Shore’s M91 FastHall measurement controller for much faster, more precise Hall measurements. This controller offers measurement times up to 100 times faster than typical Hall systems, particularly when measuring low-mobility materials.

For more information about the DryMag system, visit lakeshore.com/DryMag.

Integrated unit simplifies the generation of complex waveforms

Tabor P90612B

The new Proteus RF arbitrary waveform generators/transceivers from Tabor Electronics allow complex pulse shapes and phases to be easily created at frequencies up to 8 GHz, eliminating the need for cumbersome IQ modulator/oscillator set-ups. With an optional high-speed digitizer, the Proteus can be converted into an arbitrary waveform transceiver to provide closed-loop measurement capability within a single unit and programming environment.

A variety of interpolators, IQ modulators and numerically controlled oscillators are integrated into each channel, allowing complex RF signals to be generated directly from the Proteus instrument. This integrated approach eliminates the limitations of external IQ modulators and mixers, such as IQ mismatch and in-band carrier feed-through. Each channel is coherent and fully independent, making it possible to produce multichannel time/phase aligned signals, or signals of different frequencies and characteristics, from a single unit.

The Proteus RF AWG exploits a high-speed PCI Express interface that provides up to 64 Gb/s of data transfer speed. It offers 16 GS of memory to allow even long and complex waveform sequences to be downloaded quickly, while reducing the waveform size through interpolation by up to a factor of eight makes it possible to save even more experimental set-up time. For applications requiring more memory or real-time changes to the waveform, the Proteus arbitrary waveform transceiver allows waveforms to be streamed from disk to instrument at speeds of up to 6 GS/s.

Three form-factor versions are available to suit the needs of the experiment: the modular PXIe format offers unlimited time-aligned channels with the fastest data transfer rates, while the desk and bench versions offer up to 12 fully independent channels.

Integrated platform delivers quantum control

Quantum Machines hardware

New from Quantum Machines (QM) is the Quantum Orchestration Platform (QOP), a complete hardware and software solution that allows even the most complex quantum algorithms to be run on any quantum processor. Unlike classical processors, where the computer logic is embedded within the processor itself, quantum processors have no built-in logic. Logic operations are instead performed by sending high-frequency pulses to the quantum processor from tailor-made classical hardware.

To address this challenge, QM has developed an integrated solution for controlling quantum systems. At the core of the solution is the OPX, the hardware element of the QOP, which incorporates multiple waveform generators, digitizers and processing units that are all integrated on a single FPGA with a unique and scalable design. Inside the OPX is a dedicated pulse processor that allows for advanced multi-qubit manipulation, quantum error correction, and full system scaling.

The OPX is designed to be easily programmed using QUA, a powerful yet intuitive programming language, which enables the most complex experiments and algorithms to be run quickly and easily. With seamless compatibility and powerful capabilities such as ultra-low feedback latency and general control flow, the platform delivers real-time processing to speed up experiments by as much as an order of magnitude.

Founded by leading quantum researchers, Quantum Machines partners with development teams at the forefront of quantum computing, including multinational corporations, start-ups, government laboratories, and academic institutions, to help advance the future of quantum computing.

Novel AFM mode enables electrochemical mapping for battery research

Park Systems

The NX range of atomic force microscopes (AFMs) from Park Systems now supports scanning electrochemical cell microscopy (SECCM), a new pipette-based nanoelectrochemical scanning probe technique for investigating the local electrochemical properties of electrode surfaces. For the first time, this technique allows scientists studying electrocatalysis and energy storage to correlate electrochemical activity with the nanostructure of electrochemical interfaces.

SECCM works by inserting a quasi-reference counter electrode (QRCE) into a nanopipette filled with an electroactive species. The Z-scanner of the AFM is used to lower the nanopipette onto the contact surface, creating a meniscus that allows a tiny droplet, or nanoelectrochemical cell, to form. The electroactive species in the droplet reacts when a bias is applied between the QRCE and an electrode placed on the XY scanner, making it possible to create an electrochemical current map at various positions across the sample surface.

SECCM allows researchers to perform thousands of confined nanoelectrochemical measurements on a single surface, with droplet sizes ranging from a few hundred nanometres to a few microns. Researchers can easily alter the chemical system by swapping in a new pipette filled with a different electroactive species, and there is little need for special sample preparation – making the technique both simple and cost-effective.

In one recent study, the SECCM mode of the Park NX12 system was used to study an electrochemically reversible redox process at a highly oriented pyrolytic graphite (HOPG) surface. Localized nanoscopic cyclic voltammetry measurements were taken each time the meniscus touched the surface, providing a spatially resolved map of the surface electroactivity of HOPG at the micro- and nanoscale.

The results indicate that the technique is robust and reproducible, with a current detection limit as low as a few picoamperes. According to application scientists at Park Systems, this capability could also facilitate the rational design of functional electromaterials for use in energy storage studies and for corrosion research.

Cryostat supports quantum computing scale-up

The ProteoxLX from Oxford Instruments

Oxford Instruments NanoScience will be showcasing its latest innovation in cryogen-free dilution refrigerator technology for quantum computing scale-up, the ProteoxLX, at the APS March Meeting. The ProteoxLX is part of Oxford Instruments’ family of next-generation dilution refrigerators, which all share the same modular layout to provide cross-compatibility and added flexibility for cryogenic installations.

Optimized for scaling up quantum computing systems, the LX system supports maximum qubit counts, with a large sample space and ample coaxial wiring capacity. Low vibration features reduce noise and support long qubit coherence times, while the system provides full integration of signal conditioning components.

The LX also offers two fully customizable secondary inserts for an optimized layout of cold electronics, as well as high-capacity input and output lines that are fully compatible and interchangeable across the Proteox family.

Photonic crystal lasers deliver optimal performance for lidar sensing and laser processing

PCSEL

High-performance lasers are central to both lidar sensing and laser processing – technologies that enable a range of smart mobility and smart manufacturing applications. Lidar, for example, plays an essential role within autonomous vehicles, construction machines and automated factory robots, while laser processing is used for fabrication of electronics, automobiles and solar cells.

Currently, however, the lasers used for these applications – semiconductor lasers, CO2 lasers and fibre lasers – all come with limitations. Speaking at the recent Photonics West LASE conference, Susumu Noda from Kyoto University described the problem.

To generate high power from a conventional semiconductor laser, such as a broad-area Fabry–Pérot laser, the cavity width must be increased, which results in a low-quality, divergent output beam. “For lidar sensing, a complicated lens system and fine adjustments are required to reform the beam shape,” Noda explained. “And for laser processing, the output beam cannot be focused sufficiently and thus cannot be used directly for manufacturing.”

Laser processing is also performed using CO2 and fibre lasers. But CO2 lasers are extremely large and have low efficiency. Fibre lasers, meanwhile, contain hundreds of laser diodes that are combined into an amplification fibre. As such, fibre lasers suffer from a complex configuration and a substantial size, as well as limited efficiency.

What’s needed is a totally different laser technology that addresses these issues. And according to Noda, this is the photonic crystal surface-emitting laser, or PCSEL.

PCSELs are a new type of semiconductor laser containing a photonic crystal integrated on top of the active layer. Photonic crystals are nanostructured materials in which a periodic variation of the dielectric constant (formed, for example, by a lattice of holes) creates a photonic band-gap. The resulting PCSELs emit a high-quality, symmetric beam with narrow divergence.

Susumu Noda

“For lidar applications, lens-free, adjustment-free operation is possible. And for laser processing, due to their high brightness, we can use PCSELs directly for materials processing,” said Noda. “Therefore, photonic crystal lasers are expected to contribute to the development of smart mobility and smart manufacturing.”

Noda and his research team have been working on PCSELs since 1999, when they first established 2D surface-emitting coherent oscillation. They went on to demonstrate control of polarization and beam shape by tailoring the photonic crystal structure, expansion into blue–violet wavelengths and 1D beam steering. In 2013, 0.2 W PCSELs became commercially available, and the team has since demonstrated watt-class operation and above.

Lidar applications

To generate the high-brightness operation required for lidar applications, Noda’s team created a double-lattice PCSEL, containing two sets of square lattice structures with different air hole depths and sizes. The 500 µm-diameter device exhibited a slope efficiency of 0.8 W/A and extremely narrow beam divergence of 0.1°.

The PCSEL demonstrated superior temperature performance to conventional semiconductor lasers, operating between −40 and +100 °C, with temperature dependences of the output power and lasing wavelength of −0.36%/K and 0.08 nm/K, respectively. Examining lens-free light propagation from these PCSELs revealed that they could create small beam spots (5 cm) at distances of up to 30 m. Conventional Fabry–Pérot lasers could not even deliver a beam at distances of 10 m or more.
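Those spot sizes are consistent with a simple small-angle estimate that ignores the initial beam waist:

    import math

    divergence = math.radians(0.1)        # 0.1 degree full-angle divergence
    for distance in (10, 30):             # metres
        print(f"{distance} m: spot ~ {distance * divergence * 100:.1f} cm")

which gives roughly 1.7 cm at 10 m and 5.2 cm at 30 m – in line with the 5 cm spots reported.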

PCSEL-based lidar

The team used this double-lattice PCSEL to construct a lidar system. Noda pointed out that while conventional lasers used in lidar require complicated lens systems and mechanical mirrors to steer the beam, PCSELs offer lens-free, adjustment-free beam control.

Noda shared a video showing the ability of PCSEL-based lidar to measure the distance of objects – in this case, two researchers. As they walked towards the lidar system, it accurately detected their movement in real time. It could also track smaller changes in distance, such as swinging of arms or hand movements, demonstrating successful high-resolution lidar operation.

The team has also recently demonstrated 2D electronic beam scanning, based on a chip design with dually modulated PCSELs integrated in a 10 × 10 array.

Laser processing

For laser processing applications, the PCSELs need to offer miniaturization and high efficiency, to address the limitations of CO2 and fibre lasers. Using ultracompact PCSELs, “the realization of handheld laser processing systems can be possible,” said Noda.

Three steps are required to achieve this, he explained: creating 1 mm, 10 W PCSELs with high beam quality; enlarging the devices to 3 mm to reach 100 W output power; and developing associated packaging and cooling technologies.

To create 10 W PCSELs, the team optimized the double-lattice structures by adjusting the size and shape of the air holes. The fabricated device successfully achieved 10 W continuous-wave operation, with very narrow beam divergence. Irradiating a metal surface with the focused PCSEL beam formed extremely fine holes. “We believe this is the first successful processing of a metal surface by a single-chip semiconductor laser,” said Noda.

Next, the team fabricated and packaged a 3 mm-diameter PCSEL. Initial tests on this device under pulsed conditions showed that 150 W output was obtained at only six times the threshold current. Noda predicts that by further increasing the device size to 1 cm, kilowatt-class operation could be achieved.

“For the future, key devices for lidar sensing and laser processing can be made using PCSELs,” Noda concluded.

Quantum gravity could soon be tested using ultracold atoms

Quantum gravity might soon be tested in the lab, thanks to a new analysis from physicists in the UK, France and Hong Kong. Drawing on advances in quantum information science, the researchers have found that if gravity is fundamentally quantum rather than classical it must generate a signature known as non-Gaussianity. To look for that signature, they propose probing an ultracold gas of several billion caesium atoms existing in a state known as a Bose–Einstein condensate (BEC).

Attempts to unify general relativity and quantum mechanics usually involve quantizing gravity to create a theory of quantum gravity such as string theory or loop quantum gravity. However, with little or no empirical data to support such theories, some physicists have developed alternative unifying theories in which matter is quantized but gravity itself remains a fundamentally classical variable.

Previously, most scientists thought that distinguishing between these two types of theory in the laboratory would be impossible given the scale at which space-time should become quantized. That “Planck length” – a mere 1.6 × 10⁻³⁵ m – could only be probed directly by colliding particles using an accelerator about the size of the Milky Way.

Manageable Planck mass

However, insights from quantum information science suggest that tests of quantum gravity could be done at the far more manageable scale of the “Planck mass” – which is about 22 µg. The challenge is creating a real system that can remain in a coherent quantum state at this macroscopic scale, and the solution could rely on techniques developed for the construction of quantum computers and other quantum technologies.

One such proposal, put forward by Vlatko Vedral at the University of Oxford and other researchers in 2017, involves the observation of quantum entanglement between two microspheres, each of which is placed into a superposition of two spatial locations. By blocking all other possible interactions between the spheres, any entanglement must occur via a gravitational interaction. But as Richard Howl, until recently at the University of Nottingham in the UK, and Vedral point out in their new research, classical non-local effects could conceivably lead to such entanglement.

To carry out a more unambiguous test of quantum gravity, the new work instead relies on non-Gaussianity. This property – which is also needed to carry out universal quantum computation – belongs to quantum systems whose time-evolution operator is not just a linear or quadratic function of quantum variables. A matter particle emitting a graviton, for example, would be non-Gaussian because the Feynman diagram representing the interaction would involve three quantum operators (as opposed to two in the classical case).
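In schematic terms – our notation, not the researchers’ – Gaussian dynamics are generated by Hamiltonians that are at most quadratic in the mode operators, whereas a matter–graviton vertex brings in a third operator:

    H_{\mathrm{Gaussian}} = \omega\, a^{\dagger}a + \lambda\,(a^{2} + a^{\dagger 2}), \qquad H_{\mathrm{cubic}} \;\sim\; g\, a^{\dagger}a\,(b + b^{\dagger})

where a annihilates a matter excitation and b a graviton mode. It is cubic (or higher) terms like the one on the right that make the time evolution non-Gaussian.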

Billions of atoms

Howl and colleagues have shown theoretically that if a system displays non-Gaussianity then its gravitational interaction must be quantum mechanical. What is more, they have identified a quantum system that could be scrutinized for this characteristic, and which could be set up using existing technology. That system is a BEC, a state of matter in which all atoms are cooled to such a low temperature that they end up sharing the same quantum state. More specifically, the researchers suggest a 0.2 mm-diameter condensate of 4 billion atoms of caesium-133 held in a spherical optical trap for around 2 s.

There are several ways that this system could be scrutinized. One option involves releasing the condensate from its trap and then sending it through a matter-wave beam-splitter. By measuring the number of atoms in the two outgoing beams and repeating the process many times over, the difference between those numbers should follow a non-Gaussian distribution if gravity is indeed quantum mechanical.
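As a purely illustrative analysis – the numbers below are synthetic, and this is not the team’s actual protocol – non-Gaussianity in the measured atom-number differences could be flagged by their higher cumulants:

    import numpy as np
    from scipy.stats import skew, kurtosis

    rng = np.random.default_rng(1)
    gaussian_runs = rng.normal(0.0, 40.0, size=10_000)             # toy Gaussian outcomes
    distorted_runs = gaussian_runs + 0.05 * gaussian_runs**2 / 40  # toy non-Gaussian distortion

    for label, runs in (("Gaussian", gaussian_runs), ("distorted", distorted_runs)):
        print(f"{label}: skewness = {skew(runs):+.3f}, excess kurtosis = {kurtosis(runs):+.3f}")

A Gaussian distribution has zero skewness and zero excess kurtosis; a statistically significant deviation across many repetitions would be the sought-after signature.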

The physicists maintain that this set-up has several advantages over microspheres. It involves just a single quantum system in a single location, for example, and the team argues that a BEC lends itself naturally to eliminating the electromagnetic interactions that would also display non-Gaussianity and therefore potentially generate a false positive signal.

Feshbach resonances

As they point out, the microsphere experiment would minimize electromagnetic forces simply by placing the spheres far enough apart – at a distance where the objects’ mutual gravity is stronger than the van der Waals force. Doing so, however, also reduces the gravitational interaction. With the condensate, by contrast, Feshbach resonances allow the overall strength of electromagnetic interactions between the constituent atoms to go to zero when the system is exposed to a suitable magnetic field or laser beam.

Carrying out the experiment in the lab will involve overcoming a number of technical hurdles, including how to put the atoms in their initial quantum state. BECs have previously been placed into massive non-classical states but not into the kind of state needed here. That is a macroscopic squeezed state, which exploits Heisenberg’s uncertainty principle to reduce noise in the measured variable – either the BEC’s position or momentum. This comes at the expense of increasing noise in the other variable, which does not affect the measurement.

Generating this state will be tricky since it will require transforming squeezed states of spin and atom number – which has never been done for such a large BEC. Alternatively, says Howl, it might be possible to use a much smaller squeezed state. But that in turn would mean increasing both the number of atoms in the condensate and the number of experimental trials by a couple of orders of magnitude.

Gerard Milburn at the University of Queensland in Australia is enthusiastic about the principle behind the new work – describing the switch to non-Gaussian signatures as “a very good idea”. But he cautions that putting that idea into practice will not be easy, given quantum noise arising from non-Gaussian dynamics in the condensate itself. “My guess is that these effects will be at least as large as the non-Gaussian effects coming from quantum gravity,” he says.

The research is reported in PRX Quantum.

DNA’s remarkable physical properties

DNA is not just genetic material. It is also an advanced polymer that is inspiring a new field of research that treats DNA as a soft material. As well as developing our fundamental understanding of life processes, this research could also lead to applications such as smart drug carriers or new methods for regenerating tissues.

Find out more in the article ‘Make or break: building soft materials with DNA’, by physicist Davide Michieletto.

Accuracy, feasibility and reliability of linac-based VMAT technique for total body irradiation (TBI)

Radiation therapy in the form of total body irradiation (TBI) continues to be an important part of conditioning regimens for patients with haematological malignancies undergoing bone-marrow transplantation.

During this webinar, Assoc. Prof. Bora Tas will speak about the feasibility, accuracy and reliability of volumetric modulated arc therapy (VMAT)-based TBI treatment in patients.

Assoc. Prof. Bora Tas is director and head of MR-Linac (Unity) & Linac Business Lines at Elekta in Istanbul, Turkey. He has more than 10 years of clinical experience managing the physics and dosimetry for a high-volume, hospital-based radiation oncology department. He is skilled in X-rays, electrons, MRI, particle therapy, medical devices, dosimetry and oncology management, and has extensive experience of working with VMAT, IMRT, SRS and SBRT techniques. He is an editorial board member and reviewer of scientific journals.
