A passive vacuum pump that uses 3D-printed surfaces to better absorb gas molecules has been unveiled by researchers in the UK. It removes gas nearly four times faster than a similar system with a flat surface. The pump could make it easier to design quantum sensors that require high-vacuum conditions.
Cold atoms are at the heart of many quantum-sensing technologies. For example, atom interferometry is used to measure tiny deviations in local gravity – which can be used to map underground infrastructure.
Cold-atom systems must operate at high vacuum and most vacuum pumps are mechanical or electrical in nature. The size of these active pumps and the energy that they consume makes it difficult to operate sensors in remote or mobile scenarios – particularly on satellites. As a result, researchers who are designing quantum sensors are keen on reducing or even eliminating their reliance on active pumps.
One solution is the use of passive pumps, which have surfaces made from materials that absorb large numbers of gas molecules. Now, Lucia Hackermueller and colleagues at the University of Nottingham, Torr Scientific and Metamorphic Additive Manufacturing have created two new textured surfaces that accelerate passive pumping.
Bounce optimization
One of their surfaces is a hexagonal array of tapered pockets that resembles a honeycomb. The other surface is a hexagonal array of conical protrusions. They chose their designs after doing Monte Carlo computer simulations of how gas molecules behave near textured surfaces. When a molecule collides with a flat surface it will either be absorbed or bounce off the surface and escape. However, if the surface has 3D structures on it, a molecule may ricochet back and forth several times between structures before it escapes. Each collision increases the chance that the molecule will be absorbed by the surface. So, the researchers sought to optimize the number of bounces in their simulations.
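The advantage of extra bounces can be captured in a toy Monte Carlo model. The sketch below is not the team's simulation code – the sticking probability of 0.3 per collision is purely illustrative – but it shows why the capture probability rises steeply with the number of wall collisions a texture forces on each molecule.

```python
import random

def capture_probability(sticking_prob, n_bounces, n_molecules=100_000):
    """Toy Monte Carlo: each wall collision absorbs the molecule with a
    fixed probability; a molecule escapes only if it survives them all."""
    captured = 0
    for _ in range(n_molecules):
        for _ in range(n_bounces):
            if random.random() < sticking_prob:  # molecule sticks to the getter
                captured += 1
                break
    return captured / n_molecules

# A flat surface offers a single collision; a textured surface several.
for n in (1, 2, 4, 8):
    print(f"{n} bounce(s): capture probability ≈ {capture_probability(0.3, n):.2f}")
```

With these toy numbers, forcing four bounces instead of one raises the capture probability from 0.30 to about 0.76 – the same qualitative gain the Nottingham team engineered geometrically.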
They then used 3D printing to create the two promising designs in a titanium alloy, on hockey-puck-sized flanges that could be installed in a conventional high-vacuum system. The final step in the fabrication process was to coat the surfaces with a non-evaporable getter – a material designed specifically to absorb gas molecules in a vacuum system.
The team found that their hexagonal-pocket design pumped gas 3.8 times faster than a flat surface – and the hexagonal-protrusion design achieved a performance that is nearly as good.
Team member Ben Hopton at the University of Nottingham says, “What’s exciting about this work is that relatively simple surface engineering can have a surprisingly large effect. By shifting some of the burden from active pumping to passive surface-based pumping, this approach has the potential to significantly reduce, or even remove, the need for bulky pumps in some vacuum systems, allowing quantum technologies to be far more portable.”
A new way of creating arrays of ultracold neutral atoms could make it possible to build quantum computers with more than 100,000 quantum bits (qubits) – two orders of magnitude higher than today’s best machines. The approach, which was demonstrated by physicists at Columbia University in the US, uses optical metasurfaces to generate the forces required to trap and manipulate the atoms. According to its developers, this method is much more scalable than traditional techniques for generating arrays of atomic qubits.
“Neutral atom arrays have become a leading quantum technology, notably for quantum computing, where single atoms serve as qubits,” explains atomic physicist Sebastian Will, who co-led the study with his Columbia colleague Nanfang Yu. “However, the technology available so far to make these arrays limits array sizes to about 10,000 traps, which corresponds to a maximum of 10,000 atomic qubits.”
Building on a well-established technique
In common with standard ways of constructing atomic qubit arrays, the new method relies on a well-established technique known as optical tweezing. The principle of optical tweezing is that highly focused laser beams generate forces at their focal points that are strong enough to trap individual objects – in this case, atoms.
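The strength of that trapping can be written down explicitly. In the standard two-level-atom treatment of optical dipole traps – a textbook result, quoted here for context rather than taken from the Columbia paper – the trapping potential follows the local laser intensity:

```latex
U_{\mathrm{dip}}(\mathbf{r}) \;=\; \frac{3\pi c^{2}}{2\,\omega_{0}^{3}}\,\frac{\Gamma}{\Delta}\,I(\mathbf{r}),
```

where ω₀ is the atomic transition frequency, Γ its natural linewidth, Δ = ω_laser − ω₀ the detuning and I(r) the intensity profile. For red detuning (Δ < 0) the potential is negative wherever the light is bright, so each focal point of the beam becomes a microscopic trap for a single atom.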
To create many such trapping sites while maintaining tight control of the laser’s light field, scientists typically use devices called spatial light modulators (SLMs) and acousto-optic deflectors (AODs) to split a single moderately intense laser beam into many lower-intensity ones. Such arrays have previously been used to trap thousands of atoms at once. In 2025, for example, researchers at the California Institute of Technology in the US created arrays containing up to 6100 trapped atoms – a feat that Will describes as “an amazing achievement”.
A superposition of tens of thousands of flat lenses
In the new work, which is detailed in Nature, Will, Yu and colleagues replaced these SLMs and AODs with flat optical surfaces made up of two-dimensional arrays of nanometre-sized “pixels”. These so-called metasurfaces can be thought of as a superposition of tens of thousands of flat lenses: when a laser beam strikes one, it is split into tens of thousands of focal points arranged in a unique pattern. And because the pixels in the Columbia team’s metasurfaces are smaller than the wavelength of light they manipulate (300 nm compared to 520 nm), Yu explains, the metasurfaces can generate tweezer arrays directly, without the need for additional bulky and expensive equipment.
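That “superposition of lenses” picture can be sketched numerically. The snippet below illustrates the general phase-superposition idea rather than the Columbia team's actual design procedure; apart from the 300 nm pixel pitch and 520 nm wavelength quoted above, every parameter is an assumption.

```python
import numpy as np

wavelength = 520e-9   # trapping light (from the article)
pitch = 300e-9        # sub-wavelength pixel size (from the article)
focal_length = 5e-3   # assumed focal distance
n = 1000              # assumed number of pixels per side

x = (np.arange(n) - n / 2) * pitch
X, Y = np.meshgrid(x, x)

def lens_phase(x0, y0):
    """Phase profile of a single flat lens focusing light at (x0, y0)."""
    r2 = (X - x0) ** 2 + (Y - y0) ** 2
    return -2 * np.pi / wavelength * (np.sqrt(r2 + focal_length**2) - focal_length)

# Superpose lenses aimed at a 5 x 5 grid of focal spots, then keep only
# the phase of the sum -- the pattern a phase-only metasurface would encode.
spots = [(i * 5e-6, j * 5e-6) for i in range(-2, 3) for j in range(-2, 3)]
field = sum(np.exp(1j * lens_phase(x0, y0)) for x0, y0 in spots)
metasurface_phase = np.angle(field)   # one phase value per pixel
```

A single incoming beam reflected or transmitted through such a phase map produces all the focal spots at once, which is why no beam-splitting electronics are needed.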
The Columbia researchers demonstrated this by trapping atoms in several highly uniform two-dimensional (2D) patterns, including a square lattice with 1024 trapping sites; patterns shaped like quasicrystals and the Statue of Liberty with hundreds of sites; and a circle made up of atoms spaced less than 1.5 microns apart. They also created a 3.5 mm diameter metasurface that contains more than 100 million pixels and used it to generate a 600 × 600 array of trapping sites. “This is two orders of magnitude beyond the capabilities of current technologies,” Yu says.
Another advantage of using metasurfaces, Will adds, is that they are “extremely resilient” to high laser intensities. “This is what is needed to trap hundreds of thousands of neutral atom qubits,” he explains. “Metasurfaces’ laser power handling capabilities go several orders of magnitude beyond the state of the art with SLMs and AODs.”
Laying the groundwork
For arrays of up to 1000 focal points, the researchers showed that their metasurface-generated arrays can trap single atoms with a high level of control and precision and with high single-atom detection fidelity. This is essential, they say, because it demonstrates that the arrays’ quality is high enough to be useful for quantum computing.
While they are not there yet, Will says that the metasurface atomic tweezer arrays they developed “lay the critical groundwork for realizing neutral-atom quantum computers that operate with more than 100,000 qubits”. These high numbers, he adds, will be essential for realizing quantum computers that can achieve “quantum advantage” by outperforming classical computers. “The large number of qubits also allows for more ‘redundancy’ in the system to realize highly-efficient quantum error correction codes, which can make quantum computing – which is usually fragile – more resilient,” he says.
The Columbia team is now working on further improving the quality of their metasurfaces. “On the atomic arrays side, we will now try to actually fill such arrays with more than 100 000 atoms,” Will tells Physics World. “Doing this will require a much more powerful laser than we currently have, but it’s in a realistic range.”
This episode of the Physics World Weekly podcast features Amanda Randles, who is a computer scientist and biomedical engineer at Duke University in the US. In a conversation with Physics World’s Margaret Harris, Randles explains how she uses physics-based, computationally intensive simulations to develop new ways to diagnose and treat human disease. She has also investigated how data from wearable devices such as smartwatches can be used to identify signs of heart disease.
In 2024, the Association for Computing Machinery awarded Randles its ACM Prize in Computing for her groundbreaking work. Harris caught up with Randles at the 2025 Heidelberg Laureate Forum, which brings prizewinning researchers and early-career researchers in computer science and mathematics to Heidelberg, Germany for a week of talks and networking.
Randles began her career as a physicist and she explains why she was drawn to the multidisciplinary research that she does today. Randles talks about her enduring love of computer coding and also reflects on what she might have done differently when starting out in her career.
I hear it all the time: physics students have only the haziest idea of what they can do with a physics degree. Staying in academia is the obvious option but they’re often not sure what else is out there. With hefty student debts to pay off, getting a well-paid job in finance seems to top many physicists’ wish lists these days. But there are lots of other options, from healthcare, green energy and computing to education, aviation and construction.
Some of the many things you can do with a physics degree are covered in the latest edition of Physics World Careers, which is out now. This bumper, 96-page digital guide contains profiles of physicists working across a variety of fields, along with career-development advice and a directory of employers looking to hire physicists. Now in its 10th year, the guide has become an indispensable source of careers information for physicists setting out in the world of work.
The 2026 edition of Physics World Careers includes, for example, an article featuring two leaders from the UK’s intelligence agency GCHQ, a spotlight on the many jobs in nuclear energy, as well as careers tips from a recent Physics World Live panel. Remember that if you’re ready to start your job search, you can find all the latest opportunities on the Physics World Jobs portal, which has vacancies in physics and engineering for people at all career stages.
A great example of where a physics degree can take you is Rob Farr, a theoretical physicist who’s spent more than 25 years in the food industry. He’s a wonderful illustration of a physicist doing something you might not expect, in his case going from the chilly depths of ice cream science to the dark arts of coffee production and brewing. But that’s the beauty of a physics degree – it provides skills, knowledge and insight that can be applied to very different areas.
High-level backing “Quantum Metrology: From Foundations to the Future” was held at NPL as part of the global celebrations for the UNESCO International Year of Quantum Science and Technology. Above: Lord Vallance, UK Minister for Science, Innovation, Research and Nuclear, opens the workshop with the official launch of the NMI-Q collaboration, an international metrology initiative that aims to accelerate the adoption of quantum technologies and applications. (Courtesy: NPL)
The UNESCO International Year of Quantum Science and Technology (IYQ) ends on an exotic flourish this month, with the official closing ceremony – which will be live-streamed from Accra, Ghana – looking back on what’s been a global celebration “observed through activities at all levels aimed at increasing public awareness of the importance of quantum science and applications”.
The timing of IYQ has proved apposite, mirroring as it does a notable inflection point within the quantum technology sector. Advances in fundamental quantum science and applied R&D are accelerating on a global scale, harnessing the exotic properties of quantum mechanics – entanglement, tunnelling, superposition and the like – to underpin practical applications in quantum computing and quantum communications.
Quantum metrology, meanwhile, has progressed from its roots in fundamental physics to become a cornerstone of technology innovation, yielding breakthroughs in fields such as precision timing, navigation, cryptography and advanced imaging – and that’s just for starters.
Collaborate to accelerate
Notwithstanding all this forward motion, IYQ has also highlighted significant challenges when it comes to scaling quantum systems, achieving fault tolerance and ensuring reproducible performance. Enter NMI-Q, an international initiative that leverages the combined expertise of the world’s leading National Metrology Institutes (NMIs) – from the G7 countries and Australia – to accelerate the adoption of foundational hardware and software technologies for quantum computing systems and the quantum internet.
Cyrus Larijani “We want NMI-Q to blossom into something much bigger than the individual NMIs.” (Courtesy: NPL)
The NMI-Q partnership was officially launched in November last year at the IYQ conference “Quantum Metrology: From Foundations to the Future”, an event hosted by NPL. Together, the respective NMIs will conduct collaborative pre-standardization research; develop a set of “best measurement practices” needed by industry to fast-track quantum innovation; and, ultimately, shape the global standardization effort in quantum technologies.
“NMI-Q has an ambitious and broad-scope brief, but it’s very much a joined-up effort when it comes to the division of labour,” says Cyrus Larijani, NPL’s head of quantum programme. The rationale is that no one country can do it all when it comes to the performance metrics, benchmarks and standards needed to take quantum breakthroughs out of the laboratory and into the commercial mainstream.
Post-launch, NMI-Q has received a collective “uptick” from the quantum community, with the establishment of internationally recognized standards and trusted benchmarks seen as core building blocks for the at-scale uptake and interoperability of quantum technologies. “What’s more,” adds Larijani, “there’s a clear consensus for collaboration over competition [between the NMIs], supported by shared development roadmaps and open-access platforms to avoid fragmentation and geopolitical barriers.”
Follow the money
In terms of technology push, the scale of investment – both public and private sector – in all things quantum means that the nascent supply chain is evolving at pace, linking component manufacturers, subsystem developers and full-stack quantum computing companies. That’s reinforced by plenty of downstream pull: all sorts of industries – from finance to healthcare, telecoms to energy generation – are seeking to understand the commercial upsides of quantum technologies, but don’t yet have the necessary domain knowledge and skill sets to take full advantage of the opportunities.
Given that context, the onus is on NMI-Q to pool its world-leading expertise in quantum metrology to inform evidence-based decision-making among key stakeholders in the “quantum ecosystem”: investors, policy-makers, manufacturers and, ultimately, the end-users of quantum applications. “Our task is to make sure that quantum technologies are built on reliable, scalable and interoperable foundations,” notes Larijani. “That’s the crux of where we’re going with NMI-Q.”
Made to measure NMI-Q leverages the combined expertise of NMIs from the G7 countries and Australia to shape the global standardization effort in quantum science and technology. Above: NMI-Q representatives gathered at NPL in November for the collaboration’s official launch, announced by UK science minister Lord Vallance (front row, third from right). (Courtesy: NPL)
Right now, NPL and its partner NMIs are busy shaping NMI-Q’s work programme and deliverables for 2026 and beyond, with the benchmarking of quantum computers very much front-and-centre. Their challenge lies in the diversity of quantum hardware platforms in the mix; also the emergence of two different approaches to quantum computing – one being a gate-based framework for universal quantum computation, the other an analogue approach tailored to outperforming classical computers on specific tasks.
“In this start-up phase, it’s all about bringing everyone together to define and assign the granular NMI-Q work packages and associated timelines,” says Larijani. Operational and strategic alignment is also mandatory across the member NMIs, so that each laboratory (and its parent government) is fully on board with the collaboration’s desired outcomes. “It’s going very well so far in terms of aligning members’ national interests versus NMI-Q’s direction of travel,” adds Larijani. “This emphasis on ‘science diplomacy’, if you like, will remain crucial to our success.”
Long term, NMI-Q’s development of widely applicable performance metrics, benchmarks and standards will, it is hoped, enable the quantum technology industry to achieve critical mass on the supply side, with those economies of scale driving down prices and increasing demand.
“Ultimately, though, we want NMI-Q to blossom into something much bigger than the individual NMIs, spanning out to engage the supply chains of member countries,” says Larijani. “It’s really important for NPL and the NMI-Q partners to help quantum companies scale their offerings, advance their technology readiness level and, sooner than later, get innovative products and services into the market.”
That systematic support for innovation and technology translation is evident on the domestic front as well. The UK Quantum Standards Network Pilot – which is being led by NPL – brings together representatives from industry (developers and end-users), academia and government to work on all aspects of standards development and ensure that UK quantum technology companies have access to global supply chains and markets.
Quantum impact
So what does success look like for Larijani in 2026? “We’re really motivated to work with as many quantum companies as we can – to help these organizations launch new quantum products and applications,” he explains. Another aspiration is to encourage industry partners to co-locate their R&D and innovation activities within NPL’s Institute for Quantum Standards and Technology.
“There are moves to establish a quantum technology cluster at NPL to enable UK and overseas companies to access our specialist know-how and unique measurement capability,” Larijani concludes. “Equally, as a centre-of-excellence in quantum science, we can help to scale the UK quantum workforce as well as encourage our own spin-out ventures in quantum metrology.”
Quantum futures: inclusive, ethical, sustainable
“Quantum Metrology: From Foundations to the Future” was held at NPL as part of UNESCO’s IYQ global celebrations. Organized by a steering committee of NMI-Q members, the conference explored quantum metrology and standards as enablers of technology innovation; also their role as “a cornerstone for trust, interoperability, and societal benefit in quantum innovation and adoption”.
The commitments below – articulated as formal recommendations for UNESCO – reflect the collective vision of conference delegates for an inclusive, ethical and sustainable quantum future…
Governance and ethics: attendees emphasized the need for robust governance and ethical oversight in quantum technologies. They called for the establishment of neutral international bodies, ideally under UN leadership, to ensure fair and transparent governance. Inclusivity was highlighted as essential, with a strong focus on extending benefits to developing nations and maintaining open dialogue. Concerns were raised about risks linked to scalability, security and potential misuse by non-state actors, underscoring the importance of proactive monitoring.
Standards and infrastructure: participants advocated for sustained funding to develop international standards and benchmarking frameworks. They also stressed the value of shared fabrication facilities and testbeds to democratize access and accelerate innovation globally.
Education and talent: education and talent development emerged as a priority, with recommendations to launch fully funded MSc programmes, practical placements and mentoring networks. Strengthening links between industry and academia, alongside outreach to schools, is seen as vital for early engagement and long-term skills development.
Societal impact: delegates urged that societal impact remain central to quantum initiatives. Applications in healthcare, climate modelling and sustainability should be a priority; also arts and cultural integration efforts to foster public understanding and ethical reflection.
Massively quantum: The University of Vienna’s Multi-Scale Cluster Interference Experiment (MUSCLE), where researchers detected quantum interference in massive nanoparticles. (Courtesy: S Pedalino / Uni Wien)
Classical mechanics describes our everyday world of macroscopic objects very well. Quantum mechanics is similarly good at describing physics on the atomic scale. The boundary between these two regimes, however, is still poorly understood. Where, exactly, does the quantum world stop and the classical world begin?
Researchers in Austria and Germany have now pushed the line further towards the macroscopic regime by showing that metal nanoparticles made up of thousands of atoms clustered together continue to obey the rules of quantum mechanics in a double-slit-type experiment. At over 170 000 atomic mass units, these nanoparticles are heavier than some viroids and proteins – a fact that study leader Sebastian Pedalino, a PhD student at the University of Vienna, says demonstrates that quantum mechanics remains valid at this scale and alternative models are not required.
Multiscale cluster interference
According to the rules of quantum mechanics, even large objects behave as delocalized waves. However, we do not observe this behaviour in our daily lives because the characteristic length over which this behaviour extends – the de Broglie wavelength λdB = h/mv, where h is Planck’s constant, m is the object’s mass and v is its velocity – is generally much smaller than the object itself.
In the new work, a team led by Vienna’s Markus Arndt and Stefan Gerlich, in collaboration with Klaus Hornberger at the University of Duisburg-Essen, created clusters of sodium atoms in a helium-argon mixture at 77 K in an ultrahigh vacuum. The clusters each contained between 5000 and 10 000 atoms and travelled at velocities of around 160 m s⁻¹, giving them de Broglie wavelengths of between 10 and 22 femtometres (1 fm = 10⁻¹⁵ m).
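These numbers are easy to cross-check with the formula above. The short script below is just a back-of-envelope check, not the team's analysis; it takes sodium at 23 atomic mass units per atom and the quoted velocity of 160 m s⁻¹.

```python
h = 6.62607015e-34     # Planck's constant (J s)
amu = 1.66053907e-27   # atomic mass unit (kg)

def de_broglie_fm(mass_amu, velocity):
    """lambda_dB = h / (m v), returned in femtometres."""
    return h / (mass_amu * amu * velocity) / 1e-15

for n_atoms in (5000, 10_000):
    lam = de_broglie_fm(23 * n_atoms, 160.0)
    print(f"{n_atoms:>6} sodium atoms -> lambda_dB ≈ {lam:.0f} fm")
# prints roughly 22 fm and 11 fm, matching the quoted 10-22 fm range
```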
To observe matter-wave interference in objects with such ultra-short de Broglie wavelengths, the team used an interferometer containing three diffraction gratings constructed with deep ultraviolet laser beams in a so-called Talbot–Lau configuration. The first grating channels the clusters through narrow gaps, from which their wave function expands. This wave is then modulated by the second grating, resulting in interference that produces a measurable striped pattern at the third grating.
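The Talbot–Lau geometry is what makes such extraordinarily short wavelengths workable. The grating separation is set not by λdB itself but by the near-field Talbot length – a standard result quoted here for context, since the grating period of the Vienna set-up is not given in this article:

```latex
L_{\mathrm{T}} \;=\; \frac{d^{2}}{\lambda_{\mathrm{dB}}},
```

where d is the grating period. Because an optical grating’s period (hundreds of nanometres) is roughly seven orders of magnitude larger than a 10–22 fm matter wavelength, L_T comes out on the scale of a metre – a grating separation that fits in a laboratory apparatus.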
This result implies that a cluster’s location is not fixed as it propagates through the apparatus. Instead, its wave function is spread over a span dozens of times larger than the cluster itself, meaning that it is in a superposition of locations rather than occupying a fixed position in space. This is known as a Schrödinger cat state, in reference to the famous thought experiment in which the physicist Erwin Schrödinger imagined a cat in a sealed box that is both dead and alive at once.
Pushing the boundaries for quantum experiments
The Vienna-Duisburg-Essen researchers characterized their experiment by calculating a quantity known as macroscopicity, which combines the duration of the quantum state (its coherence time), the mass of the object in that state and the degree of separation between states. In this work, which they detail in Nature, the macroscopicity reached a value of 15.5 – an order of magnitude higher than the best previously reported measurement of this kind.
Arndt explains that this milestone was reached thanks to a long-term research programme that aims to push quantum experiments to ever higher masses and complexity. “The motivation is simply that we do not yet know if quantum mechanics is the ultimate theory or if it requires any modification at some mass limit,” he tells Physics World. While several speculative theories predict some degree of modification, he says, “as experimentalists our task is to be agnostic and see what happens”.
Arndt notes that the team’s machine is very sensitive to small forces, which can generate notable deflections of the interference fringes. In the future, he thinks this effect could be exploited to characterize the properties of materials. In the longer term, this force-sensing capability could even be used to search for new particles.
Interpretations and adventures
While Arndt says he is “impressed” that these mesoscopic objects – which are in principle easy to see and even to localize under a scattering microscope – can be delocalized on a scale more than 10 times their size if they are isolated and non-interacting, he is not entirely surprised. The challenge, he says, lies in understanding what it means. “The interpretation of this phenomenon, the duality between this delocalization and the apparently local nature in the act of measurement, is still an open conundrum,” he says.
Looking ahead, the researchers say they would now like to extend their research to higher mass objects, longer coherence times, higher force sensitivity and different materials, including nanobiological materials as well as other metals and dielectrics. “We still have a lot of work to do on sources, beam splitters, detectors, vibration isolation and cooling,” says Arndt. “This is a big experimental adventure for us.”
Using artificial intelligence (AI) increases scientists’ productivity and impact but collectively leads to a shrinking of research focus. That is according to an analysis of more than 41 million research papers by scientists in China and the US, which finds that scientists who produce AI-augmented research also progress faster in their careers than colleagues who do not (Nature 649 1237).
The study was carried out by James Evans, a sociologist at the University of Chicago, and his colleagues who analysed 41.3 million papers listed in the OpenAlex dataset published between 1980 and 2025. They looked at papers in physics and five other disciplines – biology, chemistry, geology, materials science and medicine.
Using an AI language model to identify AI-assisted work, the team picked out almost 310,000 AI-augmented papers from the dataset. They found that AI-supported publications receive more citations than papers produced without AI, while also being more impactful across multiple indicators and having a higher prevalence in high-impact journals.
Individual researchers who adopt AI publish, on average, three times as many papers and get almost five times as many citations as those not using AI. In physics, researchers who use AI tools garner 183 citations every year, on average, while those who do not use AI get only 51 annually.
AI also boosts career trajectories. Based on an analysis of more than two million scientists in the dataset, the study finds that junior researchers who adopt AI are more likely to become established scientists. They also gain project leadership roles almost one-and-a-half years earlier, on average, than those who do not use AI.
Fundamental questions
But when the researchers examined the knowledge spread of a random sample of 10,000 papers, half of which used AI, they found that AI-produced work shrinks the range of topics covered by almost 5%. The finding is consistent across all six disciplines. Furthermore, AI papers are more clustered than non-AI papers, suggesting a tendency to concentrate on specific problems.
AI tools, in other words, appear to funnel research towards areas rich in data and help to automate established fields rather than exploring new topics. Evans and colleagues think this AI-induced convergence could drive science away from foundational questions and towards data-rich operational topics.
AI could, however, help combat this trend. “We need to reimagine AI systems that expand not only cognitive capacity but also sensory and experimental capacity,” they say. “[This could] enable and incentivize scientists to search, select and gather new types of data from previously inaccessible domains rather than merely optimizing analysis of standing data.”
Meanwhile, a new report by the AI company OpenAI has found that the number of ChatGPT messages on advanced topics in science and mathematics has grown by nearly 50% over the last year, to almost 8.4 million per week. The firm says its generative AI chatbot is being used to advance research across scientific fields, from experiment planning and literature synthesis to mathematical reasoning and data analysis.
Hints of non-gravitational interactions between dark matter and “relic” neutrinos in the early universe have emerged in a study of astronomical data from different periods of cosmic history. The study was carried out by cosmologists in Poland, the UK and China, and team leader Sebastian Trojanowski of Poland’s NCBJ and NCAC PAS notes that future telescope observations could verify or disprove these hints of a deep connection between dark matter and neutrinos.
Dark matter and neutrinos play major roles in the evolution of cosmic structures, but they are among the universe’s least-understood components. Dark matter is thought to make up over 25% of the universe’s mass, but it has never been detected directly; instead, its existence is inferred from its gravitational interactions. Neutrinos, for their part, are fundamental subatomic particles that have a very low mass and interact only rarely with normal matter.
Analysing data from different epochs
According to the standard (ΛCDM) model of cosmology, dark matter and neutrinos do not interact with each other. The work of Trojanowski and colleagues challenges this model by proposing that dark matter and neutrinos may have interacted in the past, when the universe was younger and contained many more neutrinos than it does today.
This proposal, they say, was partly inspired by a longstanding cosmic conundrum. Measurements of the early universe suggest that structures such as galaxies should have grown more rapidly than ΛCDM predicts. At the same time, observations of today’s universe indicate that matter is slightly less densely packed than expected. This suggests a slight mismatch between early and late measurements.
To explore the impact that dark matter-neutrino interactions (νDM) would have on this mismatch, a team led by Trojanowski’s colleague Lei Zu analysed data from different epochs of the universe’s evolution. Data from the young (high-redshift) universe came from two instruments – the ground-based Atacama Cosmology Telescope and the space-based Planck Telescope, which the European Space Agency operated from 2009 to 2013 – that were designed to study the afterglow of the Big Bang, which is known as the cosmic microwave background (CMB). Data from the older (low-redshift, or z < 3.5) universe, meanwhile, came from a variety of sources, including galaxy maps from the Sloan Digital Sky Survey and weak gravitational lensing data from the Dark Energy Survey (DES) conducted with the Dark Energy Camera on the Victor M Blanco Telescope in Chile.
“New insight into how structure formed in the universe”
Drawing on these data, the team calculated that an interaction strength u ≈ 10⁻⁴ between dark matter and neutrinos would be enough to resolve the discrepancy. The statistical significance of this result is nearly 3σ, which team member Sming Tsai Yue-Lin of the Purple Mountain Observatory in Nanjing, China, says was “largely achieved by incorporating the high-precision weak lensing data from the DES with the weak lensing component”.
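Here u is the dimensionless coupling conventionally used in the νDM literature, defined by scaling the dark matter–neutrino cross-section to the Thomson cross-section. This is the standard parametrization and is given here as context; the paper's exact definition is not reproduced in this article:

```latex
u \;\equiv\; \frac{\sigma_{\nu\mathrm{DM}}}{\sigma_{\mathrm{Th}}}\left(\frac{m_{\mathrm{DM}}}{100\ \mathrm{GeV}}\right)^{-1},
```

where σνDM is the dark matter–neutrino scattering cross-section, σTh ≈ 6.65 × 10⁻²⁵ cm² is the Thomson cross-section and mDM is the dark-matter mass.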
While this is not high enough to definitively disprove the ΛCDM model, the researchers say it does show that the model is incomplete and requires further investigation. “Our study shows that interactions between dark matter and neutrinos could help explain this difference, offering new insight into how structure formed in the universe,” explains team member Eleonora Di Valentino, a senior research fellow at Sheffield University, UK.
Trojanowski adds that the ΛCDM model has been under growing pressure in recent years, while the Standard Model of particle physics cannot explain the nature of dark matter. “These two theories need to be extended to resolve these problems, and studying dark matter-neutrino interactions is a promising way to achieve this goal,” he says.
The team’s result, he continues, adds to the “massive amount of data” suggesting that we are reaching the limits of the standard cosmological model and may be at the dawn of understanding physics beyond it. “We illustrate that we likely need to bridge cosmological data and fundamental particle physics to describe the universe across different scales and so resolve current anomalies,” he says.
Two worlds
One of the challenges of doing this, Trojanowski adds, is that the two fields involved – cosmological data analysis and theoretical astroparticle physics – are very different. “Each field has its own approach to problem-solving and even its own jargon,” he says. “Fortunately, we had a great team and working together was really fun.”
The researchers say that data from future telescope observations, such as those from the Simonyi Survey Telescope at the Vera C Rubin Observatory (formerly known as the Large Synoptic Survey Telescope, LSST) and the China Space Station Telescope (CSST), could subject their hypothesis to more stringent tests. Data from CMB experiments and weak lensing surveys, which map the distribution of mass in the universe by analysing how intervening matter distorts light from distant galaxies, could also come in useful.
Quantum entanglement is a link between particles that makes their properties inseparable. It underlies the power of many quantum technologies, from secure communication to quantum computing, by enabling correlations that are impossible in classical physics.
Entanglement nevertheless remains poorly understood, and it is therefore the subject of intense research in both quantum technologies and fundamental physics.
In this context, separability refers to a composite quantum state that can be written as a simple product (or a mixture of products) of the states of its individual parts. Such a state contains no entanglement between the parts; to create entanglement, a global transformation is needed.
A system that remains completely free of entanglement, even after any possible global invertible transformation is applied, is called absolutely separable. In other words, it can never become entangled under the action of quantum gates.
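In symbols, for a bipartite system the two definitions read as follows. This is the standard formulation, with the "global invertible transformations" of interest here being the unitaries that quantum gates implement:

```latex
% Separable: a mixture of product states
\rho_{AB} \;=\; \sum_{i} p_i\, \rho_A^{(i)} \otimes \rho_B^{(i)},
\qquad p_i \ge 0, \quad \sum_i p_i = 1.

% Absolutely separable: separability survives every global unitary
U \rho_{AB}\, U^{\dagger}\ \text{is separable for every unitary } U.
```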
Separable, absolutely separable and entangled sets: It is impossible to make absolutely separable states entangled with a global transformation. (Courtesy: J Abellanet Vidal and A Sanpera Trigueros)
Necessary and sufficient conditions to ensure separability exist only in the simplest cases or for highly restricted families of states. In fact, verifying and quantifying entanglement is known to be, in general, an NP-hard problem.
Recent research by a team from Spain and Poland has tackled this problem head-on. By introducing new analytical tools, such as linear maps and their inverses, the researchers were able to identify when a quantum state is guaranteed to be absolutely separable.
These tools work in any number of dimensions and allow the authors to pinpoint specific states that are on the border of being absolutely separable or not (mathematically speaking, ones that lie on the boundary of the set). They also show how different criteria for absolute separability, which may not always agree with each other, can be combined and refined using convex geometry optimisation.
Being able to more easily and accurately determine whether a quantum state is absolutely separable will be invaluable in quantum computation and communication.
The team’s results for multipartite systems (systems with more than two parts) also reveal how little we currently understand about the entanglement properties of mixed, noisy states. This knowledge gap suggests that much more research is needed in this area.
When we interact with everyday objects, we take for granted that physical systems naturally settle into stable, predictable states. A cup of coffee cools down. A playground swing slows down after being pushed. Quantum systems, however, behave very differently.
These systems can exist in multiple states at once, and their evolution is governed by probabilities rather than certainties. Nevertheless, even these strange systems do eventually relax and settle down, losing information about their earlier state. The speed at which this happens is called the relaxation rate.
Relaxation rates tell us how fast a quantum system forgets its past, how quickly it thermalises, reaches equilibrium, decoheres, or dissipates energy. These rates are important not just for theorists but also for experimentalists, who can measure them directly in the lab.
Recently, researchers discovered that these rates obey a surprisingly universal rule. For a broad class of quantum processes (those described by what physicists call Markovian semigroups) the fastest possible relaxation rate cannot exceed a certain limit. Specifically, it must be no larger than the sum of all relaxation rates divided by the system’s dimension. This constraint, originally a conjecture, was first proven using tools from classical mathematics known as Lyapunov theory.
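Written out for a d-level system, with Γk denoting the relaxation rates (the negative real parts of the non-zero eigenvalues of the semigroup's generator), the bound reads:

```latex
\Gamma_{\max} \;\le\; \frac{1}{d} \sum_{k=1}^{d^{2}-1} \Gamma_{k}\,.
```

In other words, no single decay channel can outstrip the sum of all the rates by more than a factor set by the system's dimension.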
In a recently published paper, an international team of researchers provided a new, more direct algebraic proof of this universal bound. The new proof has several advantages over the older one – notably, it can be generalised more easily – but that's not all.
The very surprising outcome of their work is that the rule doesn't require complete positivity. Instead, a weaker condition – two-positivity – is enough. The distinction between these two requirements is crucial.
Essentially, both are conditions that guarantee a quantum evolution is well behaved, protecting it from producing nonsensical results such as negative probabilities. The difference is that two-positivity is slightly less stringent and therefore more general, which makes it useful for many real-world applications.
The fact that the new proof only requires two-positivity means that this universal bound on relaxation rates can be applied to many more scenarios.
What’s more, a slightly softer version of the universal constraint still holds when the conditions are weakened even further. This shows that the structure behind these bounds is richer and more subtle than previously understood.