
Application of electrochemical impedance spectroscopy in lithium-ion batteries


Whether the goal is improving energy density, extending cycle life or reducing cost, understanding the failure modes of batteries non-destructively is critical during the design, product development and manufacturing of lithium-ion batteries.

Electrochemical impedance spectroscopy (EIS) provides the ability to access and decouple the failure modes based on the processes’ timescale. Analysis of recorded EIS can be done either through phenomenological modelling or equivalent circuit modelling, with each having its own pros and cons.
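As a concrete illustration of the equivalent-circuit approach (a minimal sketch of a textbook Randles circuit, not material from the webinar itself; the component values are arbitrary placeholders), the short Python snippet below computes an impedance spectrum over a typical battery EIS frequency sweep:

```python
import numpy as np

def randles_impedance(freq_hz, R_s=0.02, R_ct=0.05, C_dl=1.0, sigma_w=0.01):
    """Impedance of a simple Randles equivalent circuit (illustrative values).

    R_s     : ohmic (electrolyte/contact) resistance, ohm
    R_ct    : charge-transfer resistance, ohm
    C_dl    : double-layer capacitance, farad
    sigma_w : Warburg coefficient, ohm * s^-0.5
    """
    omega = 2 * np.pi * np.asarray(freq_hz, dtype=float)
    # Warburg element for semi-infinite diffusion: Z_W = sigma * (1 - j) / sqrt(omega)
    z_warburg = sigma_w * (1 - 1j) / np.sqrt(omega)
    # Faradaic branch (charge transfer + diffusion) in parallel with the double layer
    z_faradaic = R_ct + z_warburg
    z_parallel = 1.0 / (1j * omega * C_dl + 1.0 / z_faradaic)
    return R_s + z_parallel

# Sweep from 10 mHz to 10 kHz, covering the timescales of the different processes
freqs = np.logspace(-2, 4, 61)
Z = randles_impedance(freqs)
for f, z in zip(freqs[::10], Z[::10]):
    print(f"{f:10.3f} Hz   Re(Z) = {z.real*1e3:7.2f} mOhm   -Im(Z) = {-z.imag*1e3:7.2f} mOhm")
```

Each element in such a circuit responds on its own timescale (ohmic drop, double-layer charging, charge transfer, diffusion), which is why fitting the measured spectrum lets the different processes be decoupled.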

This webinar reviews the basics of applying EIS for understanding the phenomena in lithium-ion batteries, the experimental details and protocols, and the types of models with a few case studies.


Chockkalingam (Chock) Karuppaiah has more than 23 years of experience in electrochemical and energy product development. Products he developed include polymer electrolyte fuel cells, flow batteries and solid oxide fuel cells. He was involved in the design, development and manufacturing scale-up of seven different products. His professional experience includes being vice-president of engineering, Bloom Energy; founder, EC Labs; research professor, Case Western Reserve University, US; manager of the Fundamentals Team, Plug Power; and research staff, Los Alamos National Lab.

Dr Karuppaiah received his BS in chemical and electrochemical engineering from the Central Electrochemical Research Institute, India (1993), and PhD in electrochemistry and fuel cells from Rensselaer Polytechnic Institute, US (1997). He has authored 21 published patents.


Novel stereotactic QA with film-class resolution: First clinical experience with myQA SRS


This presentation has been submitted for approval by CAMPEP for 1 MPCEC hour and is accredited by EBAMP as a CPD event for Medical Physicists at EQF Level 7 and awarded 9 CPD credit points.

Learn about the physics characteristics and clinical performance of the new, recently released myQA SRS solution. This detector is optimized for your SRS/SBRT Patient QA and features a novel 0.4 mm film-class resolution CMOS technology array with 105,000 measurement points. The solution includes a dedicated phantom to support your efficient QA for different stereotactic delivery methods. First clinical results will be presented.

We are pleased to feature guest speaker Yun Yang, PhD, DABR, Medical Physicist at Rhode Island Hospital, USA. Dr Yang will present his findings from extensive testing of the myQA SRS at Rhode Island Hospital. The presentation will conclude with answers to your live questions.


Yun Yang, PhD, DABR, is a Medical Physicist at Rhode Island Hospital, USA. He has 10 years’ experience working as a Medical Physicist and holds a PhD in medical physics, an MSc in radiological sciences and protection, and a BE in electronic science and technology. Yun Yang specializes in the area of medical physics, has had numerous research articles published and is a member of AAPM, ASTRO and ACR. In addition to his passion for his field, he also shares his expertise through teaching as an Assistant Professor.

Muons and streetlights: the six-decade quest to pinpoint the value of g–2

I’m sure you know the tale of the police officer who spots a drunk person looking for their wallet beneath a streetlight. The officer asks if the drunkard is sure that’s where it was lost. “No, I’m not,” the drunkard replies. “It’s just the only place I can see.” Psychologists refer to this observational bias – studying something where it is convenient to look – as the “streetlight effect”. For experimental high-energy physicists, however, it’s all they can ever do.

Consider the muon – a “fatter” version of the electron – to which huge resources have been devoted to measure the way it wobbles. Physicists have been pursuing this quest almost continuously since the 1950s, when they began building the theoretical edifice known as quantum electrodynamics (QED). In QED, which describes how light interacts with matter, particles are conceived as spinning magnets, with the ratio of their magnetic moment to spin being a value called g.

The difference between g and 2 is an indication of whether QED is comprehensive enough – or whether there is physics “beyond” it

In QED’s basic versions, g is exactly equal to 2. But when muons whirl around in a magnetic field, they encounter traces of all particles and forces “out there” in nature, and how they wobble depends on the total value of such traces. The experimental value of the difference between g and 2, called the anomalous magnetic moment or g–2, is therefore an indication of whether QED is comprehensive enough – whether there is physics “beyond” it.

To determine a value for g–2, physicists soon realized they’d have to align the spin axes of muons, send them through a magnetic field, and then see how they scatter. Unfortunately, that proved too difficult and it was not until the discovery of “parity violation” in the weak interaction in 1957 that things radically changed. Nature, it turned out, had bestowed a wonderful gift upon particle physicists. Among other things, parity violation meant that muons emit their decay products – electrons – only in certain directions with respect to their spin axis. Complex and difficult polarization and scattering procedures were no longer needed to determine the frequency of muon wobble. Instead, you could just study patterns of electrons.
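For readers who like the quantities spelled out, the relations below are the standard textbook forms (SI units), not expressions quoted in this article: the anomaly a_μ, the anomalous precession frequency ω_a it produces in a magnetic field B, and the counting rate of decay electrons above an energy threshold, whose oscillation reveals ω_a.

```latex
a_\mu \equiv \frac{g-2}{2}, \qquad
\omega_a = a_\mu \,\frac{eB}{m_\mu}, \qquad
N(t) = N_0\, e^{-t/(\gamma\tau_\mu)}\,\bigl[\,1 + A\cos(\omega_a t + \phi)\,\bigr]
```

Here γτ_μ is the time-dilated muon lifetime and A is the parity-violating decay asymmetry that makes the wobble visible in the electron counts.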

Changing times

The first “g–2 experiment” began at CERN in 1959. Apart from potentially revealing that QED might be defective, experimentalists hoped that the project might reveal something about the difference between muons and electrons. The experiment also provided an important use for CERN’s first accelerator, the synchrocyclotron, whose value was already in doubt given the new generation of machines called synchrotrons.

Running in spurts for several years, with a final report issued in 1965, that first g–2 experiment was ultimately disappointing. It revealed no breakdown of QED, and heralded nothing new about the muon. As far as one could tell, the QED edifice was solid.

But then CERN’s new accelerator, the Proton Synchrotron, came into operation. Various experimental developments, coupled with a more precise theoretical value, suddenly made another experiment worthwhile. The second CERN g–2 experiment, which began in 1966, arrived at a measurement 25 times more precise than before. This result disagreed with theory by 1.7σ, a sign of a defect in the QED edifice jarring enough to inspire more work from both theorists and experimentalists.

By the time the second CERN g–2 experiment ended, yet more ways to beat back systematic errors had been uncovered, leading to a third version at CERN starting in 1969. One interesting feature was that this experiment was also a highly sensitive test of relativistic time dilation (the slowing down of a moving clock with respect to an observer). Debate was still ongoing about the physical reality of the effect, and the measurement of the dilated muon lifetime in the storage ring ended it. The results, which confirmed the QED prediction to a precision of 0.0007%, were published in 1979.

By that year, what was becoming known as the Standard Model of particle physics had come together, linking the known particles and almost all the known forces in a single theoretical package. Predicted particles were discovered, and measurements of various processes turned out to be in accord with theory. The QED edifice looked sounder than ever. Strangely, this attracted renewed attention to g–2, for now physicists sought some way – any way – to look for defects.

Another experiment was duly embarked upon, this time at the Brookhaven National Laboratory in the US. Using a 15 m-diameter storage ring fitted with superconducting magnets providing a vertical 1.45 T magnetic field, it sought to push the measurement down from seven parts per million to just one part per million, testing the limits of the Standard Model. With data collection complete in 2001, the result was published in 2004 and disagreed with theory by around 2.5σ, with an accuracy of 0.5 parts per million in the anomaly (Phys. Rev. Lett. 92 161802).

Science is not a simple sequence of theoretical test and experimental confirmation or refutation

This was a suggestive, but not definitive, indication of physics beyond the Standard Model. A fifth g–2 experiment duly began at Fermilab in 2013, after Brookhaven bequeathed its magnet to the Illinois lab. Now consisting of about 200 people, the experiment announced its latest findings in April showing a discrepancy of 4.2σ (Phys. Rev. Lett. 126 141801). Despite being a little shy of the 5σ now considered necessary to achieve consensus for a claim, the new result was derived from only the first of several runs.

The critical point

The sequence of five g–2 experiments is an intriguing lesson in the history of physics. Each was undertaken for a different motive, each involved a different set of technologies, and each gave rise to results that had different implications. The story shows that science is not a simple sequence of theoretical test and experimental confirmation or refutation. Rather, many different theoretical and experimental factors come into play in making such a time-consuming and expensive experiment worthwhile.

Ultimately, experimentalists may very well find what they are looking for under the streetlight after all.

Diamond microparticles enable simultaneous MRI and optical imaging

A team of US-based researchers has created an innovative technique that uses diamond microparticles to enable simultaneous optical and MR imaging – a major breakthrough that could pave the way for faster and deeper medical imaging. So, how does the new technique work in practice? What are its advantages over existing imaging approaches? And what are the key research and clinical applications?

Deeper high-resolution images

When researchers or clinicians want to look closely at living tissue, they face a trade-off between the depth and clarity of the images that they can capture. This is because light-based, or optical, microscopes provide detailed, high-resolution images – but only up to depths of around a millimetre. Conversely, MRI uses radio frequencies capable of reaching all parts of the body, but can only capture low-resolution images.

In an effort to overcome these limitations, a research team headed up at the University of California, Berkeley, has demonstrated how microscopic diamond particles can be used to capture information from MRI and optical fluorescence imaging at the same time, potentially enabling observers to obtain high-resolution images up to a centimetre beneath the surface of tissue – 10 times deeper than existing approaches using just light. The researchers describe their new technique in the Proceedings of the National Academy of Sciences.

As author Ashok Ajoy explains, the new method makes use of both optical imaging and hyperpolarized MRI – the first ever implementation of such a combination. The technique uses diamond microparticles with nitrogen-vacancy defects that optically fluoresce and allow the nuclei of carbon-13 (13C) atoms in the surrounding lattice to be spin-hyperpolarized, which makes the diamond particles light up in MRI.

“In combination, these two imaging modes have a lot of complementary features that make them attractive. Especially powerful is the fact that the imaging in the two modes occurs in Fourier reciprocal spaces,” says Ajoy.

Key applications

The research team envision using the technique primarily for cellular studies and examining tissue samples outside the body. Other likely applications range from helping with the identification of chemical markers of disease in blood to physiological studies in animals.

“While there is wide literature showing that diamond is non-toxic, we don’t envision using these particles inside the body natively,” Ajoy says.

That said, he adds, methods in which the diamonds are used to spin-hyperpolarize other analytes – water, for example – that can then be injected into the body might open exciting new background-free avenues for MRI in angiography.

According to Ajoy, a key advantage of the combined approach is that, by using two modes of observation, it could allow faster imaging. Diamond tracers are also low cost and comparatively simple to work with in research and clinical settings – potentially broadening access to inexpensive nuclear magnetic resonance (NMR) techniques in the future. Other advantages of the new method stem from the fact it provides better imaging in scattering or optically dense media.

“Moreover, these two modes – optics and MRI – sample the image in two Fourier reciprocal spaces, known as x- and k- space. It’s like seeing the same object simultaneously in two conjugate modes; this can yield interesting new ways to significantly speed up image acquisition,” Ajoy says.
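To make the “Fourier reciprocal spaces” remark concrete, the toy NumPy snippet below (purely illustrative, not the authors’ reconstruction code) shows the general relation being invoked: a real-space (x-space) image and its k-space representation are linked by a Fourier transform, so sampling either space captures the same information.

```python
import numpy as np

# Toy 2D "image": a small bright square in real (x) space
image = np.zeros((64, 64))
image[28:36, 28:36] = 1.0

# MRI-style acquisition samples k-space, i.e. the spatial Fourier transform
k_space = np.fft.fftshift(np.fft.fft2(image))

# Reconstructing from k-space recovers the real-space image (up to numerics)
reconstruction = np.fft.ifft2(np.fft.ifftshift(k_space)).real
print("max reconstruction error:", np.max(np.abs(reconstruction - image)))
```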

Moving forward, Ajoy confirms that the research team has already embarked on the next phase of research, which will focus on enhancing the functionality of the diamond particles. “We are attempting to endow the particles with chemical sensing capabilities so that they can provide information on their local chemical environment via their optical or MRI signatures,” he adds.

Unveiling the minute nanoscale magnetic realm with the Qnami ProteusQ


In recent years, the need for novel materials to boost storage and computation capabilities to keep up with the ongoing quantum revolution has increased tremendously. Nanoscale magnetism plays a crucial role in the identification of ideal candidate materials for such a challenge. Standard magnetic imaging techniques, however, cannot reveal magnetic properties at the nanoscale without invasively perturbing the materials’ magnetic configuration.

Here enters Qnami ProteusQ: the first patented and complete quantum microscope system. It is the first scanning NV (nitrogen-vacancy) magnetometer for the analysis of magnetic materials at the atomic scale. Its proprietary quantum technology provides high-precision images that allow scientists and engineers to see directly the most subtle properties of their samples and the effect of microscopic changes in their design or fabrication processes.
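As background, the basic field-to-frequency conversion that any NV-based magnetometer relies on can be sketched in a few lines of Python; this is the generic relation for a single NV centre (zero-field splitting ~2.87 GHz, gyromagnetic ratio ~28 GHz/T), not anything specific to the ProteusQ.

```python
# Generic single-NV magnetometry relation (illustrative; not ProteusQ-specific).
# The two ground-state ODMR resonances sit at roughly D +/- gamma * B_parallel,
# so their splitting measures the field component along the NV axis.
D_ZFS_HZ = 2.870e9          # zero-field splitting, ~2.87 GHz
GAMMA_NV_HZ_PER_T = 28.0e9  # NV gyromagnetic ratio, ~28 GHz/T

def field_from_odmr_splitting(f_minus_hz, f_plus_hz):
    """Magnetic field along the NV axis from the two ODMR resonance frequencies."""
    return (f_plus_hz - f_minus_hz) / (2.0 * GAMMA_NV_HZ_PER_T)

# Example: resonances at D -/+ 28 MHz correspond to ~1 mT along the NV axis
print(field_from_odmr_splitting(D_ZFS_HZ - 28.0e6, D_ZFS_HZ + 28.0e6))  # ~0.001 T
```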

In this talk, we will show how quantum sensing enables us to measure magnetic fields that have never been measurable before. We will provide an overview of the different magnetic imaging modalities of ProteusQ and put this in context of cutting-edge materials science research.


Alexander Stark is a co-founder and the CIO of Qnami. He received his PhD from the Technical University of Denmark where he developed measurement protocols for quantum sensing applications. Alexander has extensive know-how at the interface between hardware, software and quantum technologies, and is responsible for the development of Qnami’s flagship, the Qnami ProteusQ. He is also co-founder of Qudi, an open-source framework for quantum engineers that now constitutes the backbone of LabQ, the software operating Qnami’s Quantum Microscope.

Peter Rickhaus is the application scientist of Qnami. He has broad knowledge of nanoscience, ranging from biology and chemistry to physics. During his PhD at Basel University and his postdoc at ETH Zurich he acquired a deep understanding of quantum physics and also extended his capabilities towards numerous fabrication and processing techniques. He has contributed to more than 30 research publications and has rich hands-on knowledge of investigative and analytical techniques.

 

Tomography technique could help in the fight against nuclear terrorism

Physicists at the Royal Institute of Technology in Stockholm, Sweden, have developed a new technique to rapidly detect and characterize so-called special nuclear materials like plutonium and enriched uranium. The technique, dubbed neutron-gamma emission tomography, works by measuring the “coincidences” of particles emitted in nuclear fission.

Special nuclear materials are a double-edged sword. As fuel for power stations and reactors, they have enabled great technological advances, but they can damage cities and even threaten human civilization if employed as weapons of mass destruction. They also pose a long-term contamination hazard, from accidents and from potential acts of nuclear terrorism using radiotoxic dispersion devices. Being able to identify, localize and characterize such materials quickly is therefore critical for national security, as well as for detecting radiation leaks and mapping radioactive contamination.

The problem is that the radiation portal monitors commonly used in settings such as airports and seaports are unable to do these things. Instead, they are simply designed to measure the radiation flux as people, vehicles, parcels and other objects pass through them, and set off an alarm if the flux exceeds predefined thresholds. The radiation flux they measure consists primarily of neutrons and gamma photons, both of which are produced during nuclear fission – the decay process by which the nucleus of an atom splits into two or more smaller, lighter “daughter” nuclei.

“Coincidences” of neutron and gamma-ray emissions

In contrast, the new neutron-gamma emission tomography (NGET) technique developed by Bo Cederwall and colleagues can determine the location of special nuclear materials with high precision. It works by measuring the time of arrival of neutrons and gamma photons at specially-designed detector assemblies. The system then looks for “coincidences” – that is, events in which neutrons and gamma rays are detected one after the other – and uses the time-of-arrival information to pinpoint the particles’ source in real time.

“In physics, fast coincidences mean that particles have arrived within a very short time interval, in this case within a couple of hundred nanoseconds or so,” Cederwall explains. “These particles are, in the majority of cases, correlated from the same fission event, or from other types of reactions like alpha-particle-induced reactions in the material.”
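A minimal sketch of the coincidence-finding step described above (an illustration of the general idea in Python, not the group’s actual analysis code): timestamps from the gamma and neutron channels are paired whenever they fall within a fixed window, and the time differences of the pairs carry the position information that NGET exploits.

```python
def find_coincidences(gamma_times_ns, neutron_times_ns, window_ns=200.0):
    """Pair gamma and neutron arrival times that fall within a coincidence window.

    Both input lists are assumed to be sorted in ascending order (nanoseconds).
    Returns (gamma_time, neutron_time, time_difference) tuples.
    """
    pairs = []
    j = 0
    for t_gamma in gamma_times_ns:
        # advance the neutron pointer past events far earlier than this gamma
        while j < len(neutron_times_ns) and neutron_times_ns[j] < t_gamma - window_ns:
            j += 1
        k = j
        while k < len(neutron_times_ns) and neutron_times_ns[k] <= t_gamma + window_ns:
            pairs.append((t_gamma, neutron_times_ns[k], neutron_times_ns[k] - t_gamma))
            k += 1
    return pairs

# Example: two correlated gamma-neutron pairs and one uncorrelated neutron
print(find_coincidences([1000.0, 5000.0], [1080.0, 3000.0, 5150.0]))
```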

Test source

The team members demonstrated their new technique using a prototype radiation portal monitor they developed in their laboratory. This system consists of an array of eight 127-mm-diameter by 127-mm-length cylindrical liquid organic scintillator cells arranged in two detector assemblies 1 metre apart. The researchers carried out their tests using a radioactive source of californium-252 (Cf-252) with a mass of 3.2 × 10−9 g, encapsulated in a 4.6-mm × 6-mm cylindrical ceramic casing.

Cf-252 undergoes spontaneous fission, producing an average of 3.76 neutrons per fission event. The source’s total fission rate of roughly 1900 events per second is thus equivalent to that produced by around 100 g of weapons-grade plutonium (93% plutonium-239 and 7% plutonium-240), which would correspond to an object about 1 cm in size.
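A back-of-the-envelope check of the quoted fission rate (using approximate literature values for Cf-252 nuclear data; the only figure taken from the article is the 3.2 ng source mass):

```python
import math

# Approximate literature values for Cf-252
HALF_LIFE_S = 2.645 * 3.156e7       # ~2.645 years, in seconds
SF_BRANCHING = 0.0309               # ~3.09% of decays are spontaneous fission
MASS_G = 3.2e-9                     # source mass quoted in the article
AVOGADRO = 6.022e23
MOLAR_MASS = 252.0                  # g/mol

atoms = MASS_G / MOLAR_MASS * AVOGADRO
decay_rate = atoms * math.log(2) / HALF_LIFE_S    # total decays per second
fission_rate = decay_rate * SF_BRANCHING          # spontaneous fissions per second
print(f"{fission_rate:.0f} fissions per second")  # ~1900, consistent with the quoted rate
```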

Not yet optimized

Although Cederwall and colleagues stress that they have not yet optimized their detector for efficiency, nor designed it for imaging, they were nevertheless able to identify the position of their relatively weak test source within an uncertainty of just 4.2 cm. Using a set of more uniformly distributed detectors or smaller detector cells would, they say, substantially improve the detector’s spatial resolution. What is more, while the current study focused on measuring coincidences from a stationary source, the researchers say the method could readily be adapted to moving objects with the aid of an optical tracking system.

The researchers, who report their work in Science Advances, say they now plan to try out the NGET technique on different configurations and geometries of portal monitors, including some that might be used for vehicles and freight containers rather than pedestrians. They have also begun a project to analyse the contents of radioactive waste containers. “There is a large global stockpile of temporarily stored radioactive waste – for example, from civil and military nuclear research – which is quite often of unknown detailed composition and origin,” Cederwall tells Physics World. “Such materials need careful characterization before they are disposed of to ensure public safety.”

Frequency and distance of human travel follow universal pattern, mobile-phone data reveal

Patterns describing how far and how frequently people travel to different locations within cities are surprisingly universal across the world, according to a new study by an international team of researchers. While it may seem obvious that people will travel to closer locations more often, this aspect of human mobility had not been analysed in detail before and the results could help city planners achieve a wide range of goals from improving public transport to controlling disease.

How people move from one place to another is of fundamental importance. It dictates social, economic and cultural exchanges; how cities develop and grow; traffic congestion and pollution; and the spread of contagious diseases. But despite this, our understanding of human movements is incomplete. Existing models tend to focus on the number of people that travel between different locations, with little consideration of the frequencies of recurring visits by individuals.

Gravity and radiation

The gravity law of human migration says that movement between two urban areas decreases with distance and increases with the population size or importance of the areas. The radiation model adds the effect of other places where people could stop on their journeys: the fewer places there are to stop – to go shopping or to work, for example – the further people will travel.
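In their commonly quoted forms (standard expressions from the mobility-modelling literature, not equations reproduced from this study), the two models read:

```latex
% Gravity law: flux between locations i and j with populations P_i and P_j,
% separated by distance d_ij (beta is a fitted exponent)
T_{ij} \;\propto\; \frac{P_i\, P_j}{d_{ij}^{\,\beta}}

% Radiation model: s_ij is the population inside a circle of radius d_ij
% centred on i, excluding P_i and P_j
T_{ij} \;=\; T_i\,\frac{P_i\, P_j}{(P_i + s_{ij})\,(P_i + P_j + s_{ij})}
```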

In the latest research, the team analysed anonymized mobility data from millions of mobile-phone users in seven cities around the world: Abidjan, Ivory Coast; Dakar, Senegal; Boston, US; Singapore; and Braga, Lisbon and Porto in Portugal. The data were collected during different periods between 2006 and 2013. Using these data, the researchers were able to estimate where each phone user lived and the places they travelled to.

“Analysing not only visitation distance, but also frequency, allows us to gain a deeper understanding of urban mobility patterns, and to develop more accurate models of how people interact with the physical space surrounding them,” team member Paolo Santi of the Senseable City Laboratory at the Massachusetts Institute of Technology says. “These models could be used, for instance, to estimate energy demand in the transition towards more pervasive electric mobility.”

Predictable and universal

Santi and colleagues found that flows to all locations in a city follow a predictable and universal pattern, revealing a simple and robust law that they call the universal visitation law of human mobility. According to this law, the number of visitors to any location decreases as the inverse square of the product of their visiting frequency and travel distance. Put more simply, people rarely make long journeys frequently.
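Written out (as a paraphrase of the published scaling law rather than a direct quotation), the statement is that the density of visitors who live a distance r away and visit f times per unit time falls off as:

```latex
\rho(r, f) \;\propto\; \frac{1}{(r f)^{2}}
```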

This scaling law is remarkably consistent across urban areas around the world, according to the researchers, who found that the number of individuals who visited different locations was highly consistent across the different cities. They also found that the number of visitors decreases in a predictable pattern for all locations in a given city, with respect to the frequency of visits and distance travelled. High-density areas were filled with people who had travelled shorter distances.

“We believe that the striking similarity observed across different cities might be caused by some common, fundamental mechanism that drives people’s mobility choices,” Santi told Physics World. He explains that one possibility is that people alternate between returning to already visited locations and exploring new locations. But exploration choices are based on popularity: people are more likely to explore popular locations. “We have shown that this basic mechanism can generate a pattern of visitation distance and frequencies that replicates the visitation law observed in the data,” Santi adds.

Missing component

The research is described in a paper in Nature, and in an accompanying commentary, Laura Alessandretti and Sune Lehmann, at the Technical University of Denmark, write that Santi and colleagues “have identified a key component that was missing from existing theoretical frameworks of human mobility: visitor frequency”. They add that the discovery of the law paves “the way for studies that could deepen our theoretical understanding of how individual and collective mobility patterns are connected”.

Santi says: “We would like to explore more the applications of the visitation law, for example in the field of urban infrastructure design and planning. Also, we are applying the visitation law to study epidemics, trying to understand the effect of restrictions on distance and frequency of visitation on the size of the infected population.”

Welcome to the first Physics World Quantum Week

It’s an exciting time for anyone involved in quantum science and technology, with fields such as quantum computing, quantum communication and quantum cryptography all moving from a physicist’s dream to commercial reality.

So in the wake of last year’s hugely successful Quantum 2020 online conference – hosted by the Institute of Physics (IOP) and IOP Publishing, which publishes Physics World – we couldn’t resist following it up with what we’re dubbing Quantum Week.

Running on 14–18 June 2021, it showcases and celebrates some of the exciting work going on around the world in the burgeoning field of quantum science and technology. Here’s a rundown of what you can enjoy this week.

  • “Quantum computational advantage and beyond” presented by Chao-Yang Lu (University of Science and Technology of China, 14 June)
  • “Tales of a not-quite-probability distribution” presented by Nicole Yunger Halpern (Harvard University, US, 15 June)
  • “Building quantum processors and networks atom by atom” presented by Hannes Bernien (University of Chicago, US, 15 June)
  • “Nitride quantum light sources” presented by Rachel Oliver (University of Cambridge, UK, 16 June)
  • “A roadmap for the quantum internet” presented by Tracy Northup and Harold Ollivier (University of Innsbruck, Austria, and INRIA, France, 17 June)
  • “A quantum future of computing” presented by Matthias Troyer (ETH Zürich, 18 June)

  • A curated selection of our in-depth quantum coverage including interviews with key figures from the quantum world, and features about everything from quantum theory to practical application of quantum technology
  • Two episodes of the Physics World Weekly podcast on 11 June and 18 June, as well as this month’s Physics World Stories podcast, will all have a quantum theme
  • More quantum research stories than ever, including a number from PhD students who are part of the newly formed Physics World student-contributor network in quantum physics.
  • A fun quiz created by staff at the Harwell Science and Innovation Campus in the UK.

And to keep the quantum theme going, we’re about to start up a new bi-monthly newsletter in quantum physics so you won’t miss any future Physics World quantum coverage. You can sign up to this newsletter now in your Physics World account. In the meantime, do enjoy Quantum Week!

Quasiprobabilities shed light on quantum advantage

Quantum advantage is a hot topic, with multiple experiments approaching (and some even surpassing) the point at which a quantum technology performs better than its classical predecessor. But how often does quantum advantage arise, and under what circumstances is it possible? These are some of the questions that David Arvidsson-Shukur, Jacob Chevalier Drori and Nicole Yunger Halpern explored in a recent paper in Journal of Physics A, which (like Physics World) is published by IOP Publishing.

Here, Yunger Halpern and Arvidsson-Shukur describe their research, their goals and their plans to test their predictions experimentally.

What was the motivation for your research?

Probabilities govern many aspects of our world, from university admissions to earthquakes to family poker games. These everyday probabilities have exotic cousins known as quasiprobabilities that are used to describe quantum observables such as an electron’s position and momentum (which cannot be described using a joint probability distribution because the observables are incompatible, being represented by operators that do not commute). Quasiprobabilities resemble probabilities in that they sum to one. However, they can assume negative and nonreal values as well as positive ones. Such values are called “nonclassical,” as they are inaccessible to the probabilities that govern the classical world.

Photo of Nicole Yunger Halpern

Nonclassical values of a particular quasiprobability have recently been shown to underlie some types of “quantum advantage” – that is, the ability of some quantum technologies to outperform their classical counterparts in computation, measurement, and thermodynamics. This special quasiprobability has an awkward name – it’s called the Kirkwood–Dirac quasiprobability, in honour of two 20th-century physicists – but considering the surnames of all three coauthors on this paper, we can’t criticize.
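To make this less abstract, here is a minimal toy example (our own illustration in Python, not code from the paper) of a Kirkwood–Dirac quasiprobability for a single qubit: it is built from a state and two incompatible bases, its entries sum to one, and some entries come out negative or complex.

```python
import numpy as np

def kirkwood_dirac(rho, basis_a, basis_b):
    """Kirkwood-Dirac quasiprobability Q(a, b) = <b|a><a|rho|b>.

    basis_a, basis_b : lists of orthonormal column vectors (numpy arrays).
    The entries always sum to Tr(rho) = 1, but they may be negative or
    complex when the two bases are incompatible (do not commute).
    """
    Q = np.zeros((len(basis_a), len(basis_b)), dtype=complex)
    for i, a in enumerate(basis_a):
        for j, b in enumerate(basis_b):
            Q[i, j] = (b.conj() @ a) * (a.conj() @ rho @ b)
    return Q

# Qubit example: Z eigenbasis, X eigenbasis, and a pure state |psi>
z_basis = [np.array([1, 0], complex), np.array([0, 1], complex)]
x_basis = [np.array([1, 1], complex) / np.sqrt(2), np.array([1, -1], complex) / np.sqrt(2)]
psi = np.array([np.cos(0.3), 1j * np.sin(0.3)], complex)
rho = np.outer(psi, psi.conj())

Q = kirkwood_dirac(rho, z_basis, x_basis)
print(Q)          # some entries are complex / have negative real part
print(Q.sum())    # ~ (1+0j): the quasiprobability sums to one
```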

Given the importance of the Kirkwood–Dirac quasiprobability’s nonclassical values, two natural questions arise: Under what conditions does this quasiprobability behave anomalously? And how anomalous can its behaviour get? That’s what we wanted to explore.

What did you do in the paper?

We pinned down conditions under which the Kirkwood–Dirac quasiprobability assumes nonclassical values. Using these conditions, one can calculate which experiments can exhibit certain types of quantum advantages. We also put a “ceiling” on how much nonclassicality one Kirkwood–Dirac quasiprobability distribution can contain.

What was the most interesting or important finding?

Nonclassical Kirkwood–Dirac quasiprobabilities (and thus the quantum advantages achievable with them) turn out to be rarer than was previously expected. The Kirkwood–Dirac quasiprobability is defined in terms of observables such as position and momentum, or components of spin. Researchers previously believed that, whenever these observables failed to commute, the quasiprobability would assume nonclassical values. But our research shows that nonclassical Kirkwood–Dirac quasiprobabilities are more outlandish than quantum uncertainty.

Why is this research significant?

Photo of David Arvidsson-Shukur

For intertwined practical and fundamental reasons. We’re in the midst of the second quantum revolution, in which quantum physics is being applied to outperform everyday technologies in information processing, security, measurement, communication and more. Nonclassical Kirkwood–Dirac quasiprobabilities have been proven to underlie some of these quantum advantages.

Our work reveals conditions under which this quasiprobability becomes nonclassical, and thus also conditions under which quantum physics can bring advantages to technologies and protocols. Our results can be used to design experiments that leverage quantum resources. Furthermore, pinpointing what empowers quantum resources helps reveal how the quantum world differs from the classical in a fundamental sense.

What will you do next?

We’re collaborating with Aephraim Steinberg’s lab at the University of Toronto, Canada. Our experimental collaborators there are using photons to measure a property of a crystal, and they’re also measuring a Kirkwood–Dirac quasiprobability that describes their experiment. The crystal property they’re studying can be inferred most efficiently when this quasiprobability is negative or nonreal. Hence, the experiment signals that a nonclassical Kirkwood–Dirac quasiprobability underlies quantum resources’ ability to enhance our measurement abilities. We hope that this proof-of-principle experiment will lead to more uses of our work in sensing and other quantum technologies.

  • Find out more about quasiprobabilities in a webinar delivered by Nicole Yunger Halpern as part of Physics World’s Quantum Week.

Optical cryostat proves a game-changer in quantum communication studies

Twin-track innovations in cryogenic cooling and optical table design are “creating the space” for fundamental scientific breakthroughs in quantum communications, allowing researchers to optimize the performance of secure, long-distance quantum key distribution (QKD) using engineered single-photon-emitting light sources.

In a proof-of-concept study last year, Tobias Heindel and colleagues in the Institute of Solid State Physics at the Technische Universität (TU) Berlin, Germany, implemented a basic QKD testbed in their laboratory. The experimental set-up uses a semiconductor quantum-dot emitter to send single-photon pulses along an optical fibre to a four-port receiver that analyses the polarization state of the transmitted qubits.

The attoDRY800 represents a game-changer for our quantum-emitter physics experiments.

Tobias Heindel, TU Berlin

Significant progress is evident on several fronts, including the use of temporal filtering to minimize quantum bit-error rates and to maximize the achievable secure key rate and tolerable channel losses. What’s more, evaluation of the emitter’s photon statistics during key generation demonstrates real-time security monitoring to counter any eavesdropping.
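To give a feel for why driving down the quantum bit-error rate (QBER) matters so much, the sketch below evaluates the textbook asymptotic secure-key fraction for ideal single-photon BB84 (the Shor–Preskill bound, assuming error correction at the Shannon limit); it is a generic illustration of the trade-off, not the TU Berlin group’s security analysis.

```python
import math

def binary_entropy(p):
    """Shannon binary entropy h(p) in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bb84_key_fraction(qber):
    """Asymptotic secure-key fraction for ideal single-photon BB84:
    r = 1 - 2*h(QBER). Illustrative only - real systems must also account
    for finite-size effects, imperfect error correction and source flaws."""
    return max(0.0, 1.0 - 2.0 * binary_entropy(qber))

for e in (0.01, 0.03, 0.05, 0.11):
    print(f"QBER = {e:4.2f}  ->  key fraction = {bb84_key_fraction(e):.3f}")
```

The output shows the key fraction collapsing towards zero as the QBER approaches the familiar 11% threshold, which is why temporal filtering of the single-photon pulses pays off directly in secure key rate and tolerable channel loss.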

Cool optics

At the heart of the TU Berlin testbed is the attoDRY800 optical cryostat from attocube, a German manufacturer of specialist nanotechnology solutions for research and industry. Put simply, this closed-cycle cryostat (which requires no liquid cryogens) consists of an ultralow-vibration (ULV) cold breadboard platform that’s fully integrated into an optical table. What’s more, the cryocooler assembly is located in the otherwise unused space underneath the table – a unique design that ensures unobstructed access to the cold sample from all directions on the optical table.

Photo of the attoDRY800

“The attoDRY800 represents a game-changer for our quantum-emitter physics experiments,” explains Heindel. “The integrated set-up means everything that usually gets in the way is now located out of sight beneath the experimental table.”

As such, Heindel and his team can freely design their experiments around the so-called “sample shroud”, an integrated vacuum enclosure surrounding the cold plate and sample (in this case, a cryogenically cooled semiconductor quantum dot measuring just a few tens of nm across).

If required, the cryostat’s automated temperature control (between 3.8 and 320 K) also supports lengthy, unattended measurement cycles in which the sample is kept cold for weeks or months at a time. “Ultimately,” Heindel adds, “the ease-of-use and flexibility of this cryostat allow us, in large part, to focus on our science and not the enabling experimental technologies.”

Efficiency gains

Being a closed-cycle cryostat, the attoDRY800 provides a convenient replacement for traditional optical helium-flow cryostat set-ups. The absence of liquid cryogens represents a win-win, ensuring reduced running costs and significant workflow efficiencies – i.e. no need for researchers to spend time replacing empty helium gas cylinders every few days. “Like a good referee in football, you very quickly stop thinking about the cryostat – it’s just an integral part of the experiment,” notes Heindel.

Those efficiency gains extend to the optical testbed, where the ULV performance of the attoDRY800 ensures long-term stable coupling of single photons from the quantum-dot emitter into a single-mode fibre. (It is also worth noting that the cryostat’s ULV capability has been demonstrated separately by attocube engineers in an in-house study of wide-field imaging in cryomicroscopy, with their results showing no detectable influence of vibrations on image resolution, down to the diffraction limit, whether the cryostat was on or off.)

Like a good referee in football, you very quickly stop thinking about the cryostat – it’s just an integral part of the experiment.

Tobias Heindel, TU Berlin

To further optimize photon collection efficiency, users can specify high-numerical-aperture apochromatic objective lenses (NA = 0.81–0.95) integrated into the cryostat, into the vacuum shroud, or located on the outside of the shroud but in close proximity to the optical windows. For the TU Berlin team, however, a high-NA objective inside the cryostat is a must-have to guarantee extremely low relative spatial and vibrational drift. “This is a big plus,” notes Heindel, “because the optics for collecting photons are at the same cryogenic temperature as the quantum-dot emitter.”
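To put rough numbers on the benefit of a high-NA collection objective, the short estimate below uses the usual solid-angle argument for an isotropic emitter (a generic back-of-the-envelope calculation, not an attocube specification; real quantum-dot collection efficiencies are lower because the emitter sits inside a high-index semiconductor).

```python
import math

def collection_fraction(numerical_aperture, n_medium=1.0):
    """Fraction of isotropically emitted photons inside the collection cone.

    Simple geometric estimate for an emitter in a medium of refractive index
    n_medium; it ignores refraction and total internal reflection at the
    semiconductor surface, which reduce real collection efficiencies.
    """
    sin_theta = numerical_aperture / n_medium
    cos_theta = math.sqrt(1.0 - sin_theta ** 2)
    return (1.0 - cos_theta) / 2.0

for na in (0.81, 0.95):
    print(f"NA = {na}:  ~{collection_fraction(na):.0%} of emitted photons collected")
```

Under these simplifying assumptions, moving from NA = 0.81 to NA = 0.95 increases the collected fraction from roughly 21% to 34%, illustrating why the in-cryostat high-NA objective is such a priority for the TU Berlin team.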

According to attocube, the long-term stability of the attoDRY800 at base temperature – measured over 40 hours as part of the same in-house study highlighted above – shows no measurable drift, while drifts as a function of temperature are up to a factor of 20 better than alternative solutions thanks to the integrated low-temperature objectives.

As for precise positioning and manipulation of the quantum-dot source, the cold breadboard space is designed to host several of attocube’s patented nanopositioners. “We have a stack of three slip-stick piezo actuators that move the sample with respect to the optical access,” explains Heindel. “That could be an optical fibre integrated within the cryostat or an objective lens located inside or outside the cryostat.”

The quantum roadmap

On the commercial front, TU Berlin is emerging as a “lighthouse customer” for the attoDRY800, with attocube now having installed three of the cryostats (plus optical tables) in the university’s Institute of Solid State Physics. Heindel, for his part, has two more attoDRY800 systems on order, which will be integrated in one large “double-table” configuration, part of an experimental testbed that will see two quantum-light sources used for QKD via a common relay station.

“The testbed will be an early-stage demonstrator of a star-like topology that, we hope, will be scalable for a future metro-area quantum network,” he explains. “We are also deploying fibres and free-space optical links to other buildings here at TU Berlin to evaluate options for a quantum local-area network architecture.”

The direction of travel is clear, with the TU Berlin team moving beyond point-to-point quantum connections to address more complex issues than the sharing of a secure key between two parties. “We are working together with Freie Universität Berlin, Humboldt Universität zu Berlin and a number of non-university research institutions in Berlin and Brandenburg to develop a Berlin-wide quantum network,” concludes Heindel.

Just last month, the same consortium was confirmed as the first so-called Einstein Research Unit of the Berlin University Alliance. This cross-disciplinary initiative brings together expertise in theoretical and experimental physics, applied mathematics, computer science and machine learning to explore aspects of “quantum digital transformation”, including near-term quantum computational devices and quantum processors. The unit will be funded with €2m annually for an initial three years.

Sponsored by attocube

Quantum innovation: part of the attocube DNA

Photo of Florian Otto

The product portfolio at attocube covers a lot of bases – from precision motion components (such as nanopositioners and displacement-measuring interferometers) to measurement platforms (such as scanning probe microscopes), from closed-cycle cryostats for scientific applications to turnkey solutions for materials science and nanospectroscopy. Those products are also put to work across a range of operating conditions – from ambient to ultralow temperatures, from high magnetic fields to ultrahigh vacuum.

“Think one-stop shop, think vertical integration,” explains Florian Otto, head of product management at attocube. “We are facilitators of cutting-edge quantum research and that starts and ends with a granular understanding of the customer’s evolving requirements.”

As such, it’s essential for Otto and colleagues to see “first-hand what users need to maximize their scientific impact” – a key driver of attocube’s engagement with several pan-European R&D collaborations and a diverse network of research and industry partners.

A case in point is ASTERIQS. This three-year, €10 million initiative is funded by the European Union’s Quantum Flagship programme to develop next-generation diamond sensors for the measurement of a range of parameters (temperature, pressure, magnetic and electric fields) and the investigation of molecular-scale spintronic devices.

On a parallel track, attocube is also a participant in another Quantum Flagship project called SQUARE. This three-year effort, funded to the tune of €3 million, is seeking to “establish individually addressable rare-earth ions as a fundamental building block of a quantum computer, and to overcome the main roadblocks on the way towards scalable quantum hardware”.

Closer to market, meanwhile, quantum technology collaborations are under way with various industry partners, including millikelvin solutions for scanning probe microscopy (with Bluefors, Leiden Cryogenics and Oxford Instruments) and a cryogenic scanning microwave impedance microscope (with Primenano).

“Our mission at attocube is the same today as it’s always been,” says Otto. “To provide enabling technologies that will unlock the creativity, ingenuity and imagination of our users.”

 

Copyright © 2026 by IOP Publishing Ltd and individual contributors