
Single photon can encode quantum information in 10 dimensions

A simple way of encoding a large amount of information in a single photon has been unveiled by an international team of physicists led by Michael Kues and Christian Reimer of INRS-EMT in Montreal, Canada.

The technique involves firing a laser pulse into a tiny ring that traps light at a set of equally spaced frequencies called a frequency comb. This microring resonator is made from a material with nonlinear optical properties, which allows two photons of the laser light to combine to create two photons with frequencies that are distributed across the frequency comb.

Loss tolerant

In this experiment, each photon is produced with 10 possible frequencies. Just as the polarization state of a photon allows it to carry a bit of quantum information that can be zero or one, these photons can be encoded in terms of their frequencies to carry much larger amounts of quantum information. This could be very beneficial because quantum information is lost when photons are absorbed randomly in components such as optical fibres. If fewer photons are required to transmit larger amounts of information, then losses would be less significant.
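As a rough illustration (the function below is ours, not from the paper), the information capacity of a single d-level quantum system, or "qudit", scales as log₂(d), so a photon with 10 distinguishable frequencies carries more than three times the information of a polarization qubit:

```python
import math

def bits_per_photon(d):
    """Information capacity, in bits, of a single d-level quantum system (qudit)."""
    return math.log2(d)

# A polarization qubit (d = 2) carries one bit per photon.
print(bits_per_photon(2))             # 1.0

# A 10-frequency qudit, as in this experiment, carries log2(10) bits.
print(round(bits_per_photon(10), 2))  # 3.32
```

On this simple counting argument, a lossy channel needs roughly a third as many 10-level photons as polarization qubits to deliver the same amount of information.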

The microring resonator can be integrated within an optical chip and the team was able to manipulate the encoded photons using off-the-shelf components used in optical-communications networks. Another benefit of the scheme is that the pairs of photons are produced in an entangled state, which makes them useful for a wide range of quantum-information applications. The research is described in Nature.

Qatar blockade highlights fragility of helium supply

The world is not running out of helium. It never will. For one thing, there’s 3.8 billion tonnes of the stuff bearing down on us. The Earth’s atmosphere contains helium at a concentration of 5.2204±0.0041 ppmv, and as far as we can tell, it’s at equilibrium: there is no sign of anthropogenic helium emissions, and the rate of helium generated from radioactive decay in the Earth’s crust seems to balance out the flow of helium ions from the upper atmosphere into space.
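A back-of-the-envelope check, using round textbook values for the mass and mean molar mass of the atmosphere (our assumed inputs, not figures from the article), shows that a 5.22 ppmv mole fraction is indeed consistent with a few billion tonnes of atmospheric helium:

```python
# Assumed round values, not from the article:
M_atm = 5.15e18      # mass of Earth's atmosphere, kg
M_air = 0.029        # mean molar mass of air, kg/mol
M_he  = 0.004        # molar mass of helium, kg/mol
x_he  = 5.2204e-6    # helium mole (volume) fraction from the text, ppmv

n_air = M_atm / M_air        # total moles of air in the atmosphere
m_he  = n_air * x_he * M_he  # helium mass, kg

# 1 billion tonnes = 1e12 kg
print(round(m_he / 1e12, 1))  # 3.7
```

The result, about 3.7 billion tonnes, agrees with the figure quoted above to within rounding.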

So, with all that helium up there, should we worry about helium supplies? Unfortunately, the answer is “probably”. The helium used in laboratories, hospitals and semiconductor-fabrication facilities around the world arrives there via a precarious, finely tuned cryogenic supply chain. The various parts of this chain are vulnerable to disruption in a number of ways, and if you start taking any of them away, the result is a game of helium Jenga: although there is a certain amount of resilience, each loss reduces the system’s stability, and there is always a chance that removing one more piece will bring the tower crashing down, regionally if not globally.

Rationed supply

The latest candidate for that crucial piece is Qatar. The tiny Gulf state produces more than 25% of the world’s helium, but earlier this month, a dispute with its neighbours meant that it could not export it. If this political dispute is not resolved, and the signs are not encouraging, low-temperature physics labs and other critical helium users in Europe and East Asia could catch a cold. No one wants to be “on allocation” – legal jargon for a rationed supply – but scientific or medical users could be hit particularly hard, especially since there is no special pleading or prioritization system: reductions are applied as contracts allow, regardless of the user’s relative importance. During the last major helium crunch, in 2013, some facilities had their supply cut by half or more.

To understand how this situation came about, it helps to know how helium is produced, and how the market for it developed. As already noted, helium is plentiful in the Earth’s atmosphere, but if we had to extract it from there, the associated monetary and energy costs would make low-temperature physics as we know it impractical. Fortunately for physicists – as well as patients in MRI scanners, aerospace engineers (helium is used to propel hydrogen rocket fuel), data centres (their hard drives are filled with helium to boost storage capacity) and many others – certain strata in the Earth’s crust contain giant accumulations of natural gas, nitrogen and helium, all just waiting to be found by geologists.

Scientific or medical users could be hit particularly hard, especially since there is no special pleading or prioritization system

The problem is that helium is the Cinderella of industrial gases. Although it critically underpins 21st century technology – the materials scientist Mark Miodownik dubbed it a “super element” in a 2017 BBC television programme – it is nevertheless habitually neglected, vented and wasted. In short, the decision to recover and purify helium from a particular gas field depends entirely on the price of natural gas, the size of the field and the concentration of helium within it.

Airships and rockets

This has not always been the case. A century ago, helium was a cutting-edge military technology, a potential replacement for flammable hydrogen in airships. Realizing this, the US government started recovering the small-but-useful percentage of helium found in certain gas fields across the Great Plains, notably at Cliffside in the Texas panhandle. By the 1960s, the emphasis had shifted from airships to rockets, and a huge stockpile of helium was being accumulated for use in the Apollo programme: the Saturn V rocket that launched astronauts toward the Moon in July 1969 used around 65 tonnes of helium to purge the cryogenic tanks and lines, and to propel its liquid-hydrogen fuel. At that time, the only other significant helium source was in Russia, and so helium became a Cold War gas, prized by both sides.

Indirectly, however, the space race also led to the development and refinement of low-temperature cryogenics technology, and this opened up new possibilities for helium. While Cold War adversaries were sending up astronauts and cosmonauts, natural gas was beginning to be liquefied on an industrial scale for export in special tankers – chiefly to Japan, an emerging but energy-poor nation. One of the first liquefied natural gas (LNG) plants was built at Arzew, Algeria, and it yielded an unexpected bonus. Although Algeria’s giant Hassi-R’mel gas field contains just 0.17% helium, officials at the Arzew LNG plant discovered that the non-condensable residue or “purge gas” from the main condenser was around 50% helium. Initially, this gas was simply vented, but through patient negotiation, a team from the firm Air Products persuaded the Algerians to form a joint-venture company to build a helium refinery and liquefier.

The development of this plant made it possible to export pure liquid helium in special, super-insulated tanks to Japan, Europe and beyond (previously, nearly all of Europe’s helium had come from the US). The first helium shipment left Arzew in 1995, and since then, the idea of recovering helium during the liquefaction of natural gas has caught on as the LNG market has expanded. Helium is now routinely recovered from LNG plants in Algeria, Australia and, of course, Qatar, where a project called Helium 3 was due to boost the country’s share of the global helium market from 25% to more than a third upon the completion of a third helium liquefier in early 2018.

The latest weak link

That, at least, was the plan before Qatar’s neighbours imposed sanctions on it. Ironically, up to that point, the helium market had been returning to some kind of equilibrium after a period of tremendous upset triggered by the US Helium Privatization Act of 1996. The sole purpose of this act was to sell off America’s helium reserve, and thus to recover the cost of creating and maintaining the country’s helium stockpile. At a stroke, it effectively made the US government the world’s biggest player and price setter in the helium market. Many smaller refiners dropped out, unable to compete against the flood of helium emerging from the Cliffside field. By 2013 the Act’s work was done, and plans were duly made to shut down production at Cliffside. Only intense lobbying persuaded the US Congress not to cut off the world’s helium supply overnight, and at the last minute, the Helium Stewardship Act of 2013 averted a total shutdown.

The knock-on costs to a scientific or medical facility of being without helium far exceed the costs of the helium itself

In some ways, this was a wake-up call for the industry. Today, there are around 20 significant helium refiners in the world. But even though only a few per cent of the world’s supply is auctioned in Texas (and the Cliffside gas field will be depleted by 2021), the US still effectively sets the price. This is not very satisfactory, but no one has yet come up with a better idea of how to price helium.

Hopefully, Qatar’s absence from this market will be temporary. But both sides in the dispute seem to be digging in for the long haul, and the US – which has long acted as an intermediary in the region – is being ambivalent. Even if the dispute is resolved within a few weeks, the knock-on effects will be considerable. Qatar is still shipping LNG to customers via carriers that berth at special facilities near Ras Laffan Industrial City at the northern end of the country. However, Qatar’s containerized imports and exports, including cryogenic helium tanks, are normally sent by road through Saudi Arabia and on to the major port of Jebel Ali in the United Arab Emirates (UAE). The decision of the UAE and Saudi Arabia (with Egypt and Bahrain) to impose sanctions meant that this route is no longer open.

Warm containers

With exports thus cut off, the two existing helium liquefiers at Ras Laffan have been shut down. Restarting them will take time, and the hiatus also means that many helium containers will be in the wrong place. Worse, the containers will have warmed up, and re-conditioning them will take time and care – as the low-temperature science community knows from long experience.

In response to this crisis, helium suppliers in Europe and East Asia have begun running down their existing stocks of purified helium. If and when these are exhausted, the dreaded allocations will start, and at that point managers of scientific facilities will be in a tight spot. Most large-scale physics experiments need a certain minimum, long-term helium supply rate. They can get by for a week or two but they will soon be asking how long this blockade in Qatar is going to last. The knock-on costs to a scientific or medical facility of being without helium far exceed the costs of the helium itself.

Creative solutions

Regardless of what happens in Qatar, the overall instability of the helium market means that anxiety about supplies has understandably become the norm, even though terrestrial reserves show no sign of reaching “peak helium” production. Some users have responded with creative solutions. Helium recycling systems such as the one recently installed at the UK’s Rutherford Appleton Laboratory (RAL) reduce both supply anxiety and cost; the RAL system should pay for itself within two years. Another useful strategy is to follow MRI manufacturers in recovering helium refrigerant gas during the decommissioning of old machinery. Minimizing leaks is also crucial, especially for fusion-energy applications. However, if fusion is ever to become a technically (and perhaps economically) viable energy source, engineers need to ensure that the trend of reducing leaks by 3% each year (the so-called “leak learning rate”), achieved at the Joint European Torus and elsewhere, is replicated at ITER and in DEMO, the first fusion power plant.
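To see what a steady 3% annual improvement implies, the compound-decay sketch below (illustrative arithmetic only, not data from the article) shows the fraction of the initial leak rate that remains after a given number of years:

```python
def leak_fraction(years, rate=0.03):
    """Fraction of the initial leak rate remaining after `years` of
    steady `rate` (e.g. 3%) annual reductions."""
    return (1 - rate) ** years

print(round(leak_fraction(10), 2))  # 0.74 -> about a quarter of leakage eliminated
print(round(leak_fraction(25), 2))  # 0.47 -> leakage roughly halved
```

Sustained over the decades-long lifetime of a machine like ITER, even this modest annual rate compounds into a substantial saving of helium inventory.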

The real challenge, however, is for the physics community to wean itself off the abundant but intermittent terrestrial helium supply. A confluence of needs – for ultra-high cryogenic system reliability, compactness, low helium inventory and, in space systems, for minimal weight and power requirements – is pushing the community in the direction of alternative technologies such as cryocoolers and pulsed-tube refrigerators. These high-precision heat engines can produce very low temperatures without the need for liquid-helium refills, and their efficiency and reliability are improving continuously. The amount of helium used per watt of cooling power is thus falling dramatically; even though helium is still the refrigerant gas, the inventory and leakage rates of new systems could be met economically through the recovery of helium from the atmosphere.

The helium supply and helium user communities have been innovating for a century now, and this shows no sign of petering out. Both the importance of helium and the anxiety about its supply drive ingenuity. Necessity is, once again, the mother of invention.

Krypton nuclei coexist in two distorted shapes

The first spectroscopic studies of krypton-98 and krypton-100 suggest that these nuclei exist in a quantum superposition of two distorted shapes. The study also provides evidence that neutron-rich krypton nuclei undergo a much slower transition from spherical to distorted shapes than do neighbouring isotopes of rubidium, strontium and zirconium.

Nuclei with mass numbers of about 100 and neutron numbers of about 60 tend to undergo rapid changes in shape as the number of neutrons in the nucleus increases by just one or two. Understanding how and why these changes occur could provide important information about how nucleons (protons and neutrons) in the nucleus interact with each other – something that is not well understood. Previous measurements at the REX-ISOLDE facility at CERN on krypton-94 and krypton-96 (which has 60 neutrons) had puzzled physicists because they hinted that the transition from a spherical to a deformed shape occurs much more gradually than in other atomic species.

Gamma-ray spectra

This latest work was done by researchers at Institut de physique nucléaire Orsay in France and RIKEN in Japan, who used RIKEN’s Radioactive Isotope Beam Factory. They fired uranium nuclei at a succession of targets to produce a number of different species of nuclei. The species were then separated and studied by measuring the gamma rays that they give off as they make transitions between internal quantum energy states.

These spectra suggest that the deformed state of krypton above 60 neutrons is a coexistence of two different shapes – a sphere flattened at the poles (oblate) and a sphere stretched at the poles (prolate). The data, which are presented in Physical Review Letters, also confirm that above 60 neutrons the distortion increases gradually.

Delivering on a quantum promise

Unlike modern-day politicians, scientists have great authority regarding the pursuit of truth. As a consequence, many members of the public – as well as political bodies and institutions – tend to uncritically take science as a matter of fact. Yet there are some occasions where the benefits and risks of science need to be communicated carefully. A striking example is nuclear energy and the bold claims made by some that nuclear-fission technology is “safe beyond doubt”. One only has to look at the nuclear power plants at Three Mile Island, Chernobyl and Fukushima to know that such claims can prove spurious.

In a similar fashion, I believe that the alleged applications of quantum physics are, in some cases, being oversold to the public. The “quantum mechanics is magic” tour – expressed through European physicists’ “quantum manifesto” – has resulted in the European Commission launching a €1bn quantum-technologies flagship initiative. The campaign promises to deliver nothing less than a “second quantum revolution”.

Feasibility of goals

To me, the initiative, along with other framework and flagship programmes, resembles Soviet-style five-year plans – a bureaucrat’s delight. I have no doubt that €1bn spent on quantum physics is being wisely invested and that something worthy will come out of it. What worries me is the deceptive and potentially harmful way that this and similar quantum-related initiatives are promoted. While many of the quantum manifesto’s short- and medium-term goals appear feasible, some of the long-term goals might not be achievable even in principle. And when it comes to quantum random-number generators and quantum cryptography, certain goals are impossible, as I outlined recently in Ethics in Science and Environmental Politics (16 25).

Take, for example, the manifesto’s call to “build a universal quantum computer able to demonstrate the resolution of a problem that, with current techniques on a supercomputer, would take longer than the age of the universe”. I am at a loss to imagine what that could be, given the rather sober situation regarding the capacity of quantum computers. Although the Quantum Algorithm Zoo – a catalogue of quantum algorithms compiled by the US National Institute of Standards and Technology – showcases a growing number of potential speed-ups by utilizing quantum computers, no substantial “killer apps” have been suggested in the last few years. Indeed, there is not even a consensus about what exactly could make quantum computation better than classical computation.

Most physicists seem to agree that one advantage might be “quantum parallelism”. This is based on coherently superposing classically distinct and mutually exclusive computational states and pushing all of them through a quantum computer simultaneously. It seems, however, that this strategy is applicable only in particular instances. It is also unclear whether quantum computation is scalable – that is, whether the number of quantum bits can be increased without excessive, possibly exponential, overhead in the resources needed to create and maintain them. Another genuine quantum application is the use of quantum entanglement for communication. While exponential speed-ups have been proposed, there is again no common understanding of the issues involved.

Regarding quantum random-number generators, the situation is confused, to say the least. Indeed, it is not even clear where exactly quantum randomness resides. It cannot originate from elements such as lossless beam splitters because these are merely “permuting” the quantum state. If measurements were the source of randomness, then any such randomness would be tied to the notorious quantum-measurement problem – how or whether wave-function collapse occurs. Moreover, because of “incompleteness theorems”, any statements regarding the indeterminism, let alone randomness, of empirical bit sequences are unprovable. Thus regardless of what we may be inclined to believe, and whatever authoritative certificates are issued, such claims remain metaphysical and conjectural.

Security considerations

Finally, contrary to publicized claims, quantum cryptography is insecure and can be successfully cryptanalysed through man-in-the-middle attacks. As a consequence, to be safe, such quantum-cryptographic protocols require both an uncompromised classical channel and an uncompromised quantum communication channel. With these provisos, one may ask: what exactly is the advantage and what is the “added security”? Is quantum cryptography presenting itself as the solution to a problem while at the same time requiring the absence of the threat it purports to resolve? If you push the experts with such questions, they respond that, rather than generating a key out of the blue, within certain error bounds, they could “enlarge” an existing key. This is the type of confidence that is implied by “unconditional security” in many of these papers.

As the quantum fairy is about to deliver €1bn, the question is whether we should allow such “fairy tales” to be marketed to the public and politicians. Should those conveying the most sentimental and overstated promises prevail? Maybe this is unavoidable, but it is not without consequences. One might also ask why we are not funding other initiatives to provide solutions to the upcoming energy crisis, as well as alleviate our dependencies on crude oil. One option is thermonuclear fusion – a yet-utopian “solar” energy source – that might be sustainable at moderate operating costs and perils. This will require a much greater, “whatever it takes” level of investment: much has been done already, but much more is needed.

UK will fund Joint European Torus beyond Brexit

The UK government will continue to fund the Joint European Torus (JET) nuclear-fusion experiment until at least 2020, despite the country’s intention to leave the European Union (EU) in March 2019. JET is located in Oxfordshire and is run by the European Consortium for the Development of Fusion Energy, which gets about half of its funding from the EU’s Euratom Horizon 2020 programme.

JET is operated by the Culham Centre for Fusion Energy (CCFE) – the UK’s national fusion research laboratory – under a four-year €283m contract that expires in 2018. About 88% of the running costs of JET are paid for by the EU, causing some to worry about the fate of the lab after Brexit.

Fair share

In a bid to renew the contract, the UK Department for Business, Energy & Industrial Strategy says that it will continue to pay its “fair share” of JET running costs until 2020. Industry Secretary Greg Clark says: “JET is a prized facility at the centre of the UK’s global leadership in nuclear fusion research.”

JET is a magnetic-confinement plasma-physics experiment and is used to study how nuclei could be made to fuse together to unleash large amounts of clean energy. It is the precursor to the ITER fusion-energy demonstrator, which is currently being built in France. JET supports 1300 jobs in the UK, 600 of them held by highly skilled scientists and engineers.

‘Nightmare’ geometry of Uranus creates tumbling magnetosphere

Uranus has a “switch-like” magnetosphere that opens and closes due to the planet’s complicated magnetic and rotational geometry. This is the conclusion of Xin Cao and Carol Paty from the Georgia Institute of Technology in the US, who have used data from Voyager 2 to model the icy planet’s unusual magnetic properties.

Most planets are surrounded by a magnetosphere – a plasma structure created by the planet’s magnetic fields controlling the flow of cosmic charged particles. For Earth, the magnetic axis is almost aligned with the planet’s spin axis, and the constant bombardment of electrically charged particles released by the Sun (known as the solar wind) means the resulting magnetosphere has a bullet-like shape.

Realignment switch

Usually, this structure is in a “closed” state – charged particles in the solar wind are directed away from Earth. However, it can “open” because of magnetic reconnection – a phenomenon that happens within plasmas, like the solar wind, because they have their own set of magnetic fields. When the magnetic-field lines inside a plasma get close to other field lines, the entire pattern changes and everything realigns into a new configuration. The process converts the stored energy of the magnetic field into heat and kinetic energy, which sends particles out along the field lines. This is how solar flares are created.

Magnetic reconnection can happen in Earth’s magnetosphere when the solar wind changes direction due to solar storms. The realignment can open up gaps in the protective magnetosphere, allowing highly energetic cosmic particles to enter the atmosphere. This is one mechanism that creates spectacular auroras and disrupts satellites.

Complex seventh planet

In contrast to Earth’s relatively simple arrangement, Uranus has a particularly complex magnetic and rotational geometry that generates a dynamic magnetosphere. “Uranus is a geometric nightmare,” says Paty. The ice giant has the most extreme axial tilt of all planets in the solar system, spinning on a near-horizontal axis. Furthermore, its magnetic axis is at a 59° angle to the spin axis and – adding insult to injury – does not go through the geometric centre of the planet.

Uranus is a geometric nightmare
Carol Paty, Georgia Institute of Technology

To study the effects of this unusual arrangement on Uranus’s magnetosphere, Cao and Paty created a simulation using data from the only spacecraft to fly by Uranus, NASA’s Voyager 2. The mission sped past the planet over 30 years ago, collecting only five days’ worth of data. The researchers found that the lopsided magnetic field tumbles asymmetrically during the planet’s 17.24 h full rotation – causing rapid rotational change in field strength and orientation. “When the magnetized solar wind meets this tumbling field in the right way, it can reconnect, and Uranus’s magnetosphere goes from open to closed to open on a daily basis,” explains Paty. As well as this “switch-like” mechanism, Cao and Paty also suggest that solar-wind reconnection (as on Earth) occurs upstream of Uranus’s magnetosphere and in its twisted tail.

Again unlike Earth, magnetic reconnection on Uranus can cause auroras at a range of latitudes. However, these are difficult to observe because the planet is nearly two billion miles away – even the Hubble Space Telescope only gets a faint view on occasion. Therefore, Cao and Paty used their numerical models to predict favourable locations for reconnection.

Icy exoplanet clues

The pair say that learning more about Uranus will help with understanding the ever increasing number of exoplanets that astronomers are discovering. “The majority of exoplanets that have been discovered appear to also be ice giants in size,” says Cao. “Perhaps what we see on Uranus and Neptune is the norm for planets: very unique magnetospheres and less-aligned magnetic fields. Understanding how these complex magnetospheres shield exoplanets from stellar radiation is of key importance for studying the habitability of these newly discovered worlds.”

The study is presented in the Journal of Geophysical Research: Space Physics.

Four new beamlines to be built at The European Synchrotron

Four new beamlines will be built at The European Synchrotron (ESRF) X-ray source in Grenoble, France. One of the new beamlines will be used for serial macromolecular crystallography, which will allow scientists to work out the structures of proteins from measurements made on tiny samples measuring just 1 μm across. Hard X-ray diffraction microscopy will be done on another new beamline and will allow researchers to study the structures of materials on length scales from millimetres to nanometres. This could prove very useful for understanding how some materials undergo structural failure.

The third new beamline will be dedicated to the use of coherent X-rays to study dynamical processes in samples as varied as heart muscle, teeth and smartphone displays. Last but not least, high-throughput X-ray tomography will be the focus of the fourth new beamline. This new facility will provide images of the interiors of large objects at sub-micron resolution and could be used in a wide range of applications, from characterizing aeroplane components to studying the insides of archaeological mummies.

Extremely brilliant

Work on the new beamlines will begin next year and should be completed by 2022. This will allow users to take advantage of the current €150m upgrade to ESRF – called the Extremely Brilliant Source – which is also scheduled for completion in 2022.

CERN welcomes Lithuania as associate member

Lithuania is on course to become an associate member of CERN, pending final approval by the Lithuanian parliament. Associate membership will allow representatives of the Baltic country to take part in meetings of the CERN Council, which oversees the Geneva-based physics lab. Lithuanian citizens will be eligible to work in certain roles at CERN and companies based in Lithuania will be able to bid on CERN contracts.

The relationship between Lithuania and CERN began in 2004, when the country signed an agreement of co-operation with the lab. Since then, Lithuanian physicists have become involved with the CMS experiment on the Large Hadron Collider (LHC), with a focus on data mining and data-quality analysis. Researchers from Lithuania are also involved in developing new detector technologies for the LHC.

Impetus for growth

“Co-operation with CERN gives a new impetus for economic growth, provides an opportunity for us to take part in global research and opens a wide horizon for our youth,” said Lithuanian president Dalia Grybauskaitė at a signing ceremony in Vilnius that was attended by CERN’s director-general Fabiola Gianotti. Twenty-two countries are full members of CERN, which also has six associate members – two of which are on track to full membership.

Dark-matter constraints tightened after LUX no-shows

The Large Underground Xenon (LUX) collaboration has set new constraints on hypothetical dark-matter particles called WIMPs – weakly interacting massive particles. The LUX experiment is a dark-matter detector at the Sanford Underground Research Facility in the US. Buried 1500 m under radiation-shielding rock, it consists of a 2 m-tall titanium tank filled with 370 kg of liquid xenon cooled to –108 °C.

The detector relies upon the assumption that WIMPs should occasionally collide with the xenon atoms. If this occurs, the recoiling atom will create light and some free electrons. The electrons are accelerated by an electric field such that they create more light when they reach a layer of xenon gas at the top of the tank. Light signals from the collision point and the top of the tank are collected by extremely sensitive detectors and the energy of the collision can be deduced from the brightness. Requiring two signals from each event makes it easier to discriminate against light created by background radiation.

Two runs

The experiment has undergone two data runs searching for WIMPs – one for three months in 2013 and another between October 2014 and May 2016. Neither run saw evidence of WIMPs.

However, the two null results have allowed researchers to set a new upper limit on the spin-dependent WIMP–nucleon elastic cross-sections. The value for proton collisions is 5 × 10⁻⁴⁰ cm², while for WIMP–neutron interactions the cross-section is 1.6 × 10⁻⁴¹ cm² – the most sensitive constraint to date.

The result, presented in Physical Review Letters, agrees with recent findings from the PICO-60 detector in Canada, which have also now been published in Physical Review Letters.

How to make topological plasmons in graphene

Patterned graphene should be an ideal material for creating infrared topological plasmons, according to calculations by physicists in the US and China. Dafei Jin, Thomas Christensen and colleagues at the University of California, Berkeley, Massachusetts Institute of Technology and the Institute of Physics of the Chinese Academy of Sciences came to this conclusion after calculating the electronic properties of graphene – a sheet of carbon just one atom thick – patterned with a triangular lattice of circular holes (see figure).

Plasmons are particle-like collective oscillations of conduction electrons that can be created when light is shone on a material. The team’s calculations suggest that when a magnetic field is applied perpendicular to the patterned graphene sheet, plasmons created by infrared light are able to propagate in one direction along the edge of the sheet. However, the plasmons do not propagate into the interior of the sheet – and this behaviour is a hallmark of a topological material.

Writing in Physical Review Letters, the team suggests that infrared topological plasmons could be useful for creating practical devices that combine optics with ultrafast electronics.

Copyright © 2026 by IOP Publishing Ltd and individual contributors