
Could radon and one of its radioactive isotopes reliably predict an earthquake?

A combined analysis of the concentrations of radon and one of its radioactive isotopes, “thoron”, may allow impending earthquakes to be predicted without interference from other environmental processes, according to new work by researchers in Korea. The team monitored the concentrations of both isotopes for about a year and observed unusually large peaks in the thoron concentration only in February 2011, preceding the Tohoku earthquake in Japan, while large radon peaks were observed both in February and during the summer. Based on their analyses, the researchers suggest that the anomalous peaks observed that month were precursory signals of the earthquake that struck the following month.

Earthquake prediction remains the holy grail of geophysics, and an oft-proposed but highly contested method for quake forecasting revolves around the detection of abnormal quantities of certain gaseous tracers in soil and groundwater. These are believed to be released through pre-seismic stress and the micro-fracturing of rock in the period immediately before an earthquake.

Cloudy with a chance of tremors?

While a number of such precursors have been proposed – including radon, chloride and sulphate – their application to earthquake forecasting has not been realized. The problem here lies in how abnormal concentrations of these tracers can also occur through other environmental processes. For example, signals from radon (222Rn) – an easy-to-detect radioactive gas whose short half-life of 3.82 days makes it highly sensitive to short-term fluctuations – can be disrupted by meteorological phenomena and tidal forces. Radon has no stable isotopes, but has a host of radioactive isotopes including a very short-lived isotope called thoron (220Rn, half-life = 55.6 s).

In a new study, Guebuem Kim and Yong Hwa Oh of Seoul National University propose that an underground, dual-tracer analysis – using both radon and thoron – might be able to overcome these limitations. With its half-life of only 56 seconds, measured thoron activity in the stagnant air of a cave should typically be very low if the recording detector is placed sufficiently far (0.2 m) from the cave floor. “Thoron – through diffusive flows – decays away before it reaches the detector,” explains Kim. “Thus, at an optimum position, only advective flows of thoron – earthquake precursors – reach the detector.”
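The contrast between the two isotopes can be illustrated with a rough survival calculation. The sketch below is not from the study: it assumes a typical trace-gas diffusivity in still air of about 10^–5 m²/s and uses the crude estimate t ≈ L²/2D for the diffusive transit time, purely to show why thoron carried only by diffusion should never reach a detector 0.2 m above the floor:

```python
import math

def surviving_fraction(half_life_s, transit_s):
    """Fraction of a radioactive tracer remaining after a given transit time."""
    return math.exp(-math.log(2) * transit_s / half_life_s)

# Assumed diffusivity of a trace gas in still air (illustrative, not from the study)
D = 1.0e-5                 # m^2/s
L = 0.2                    # detector height above the cave floor, m
transit = L**2 / (2 * D)   # crude diffusive transit-time estimate, ~2000 s

radon = surviving_fraction(3.82 * 24 * 3600, transit)   # 222Rn, half-life 3.82 d
thoron = surviving_fraction(55.6, transit)              # 220Rn, half-life 55.6 s

print(f"transit ~ {transit:.0f} s")
print(f"radon surviving:  {radon:.3f}")   # essentially all of it
print(f"thoron surviving: {thoron:.1e}")  # essentially none
```

Under these assumptions virtually all the radon survives the trip while the thoron decays away entirely, so any thoron that does register must have been delivered by bulk (advective) flow – the basis of the dual-tracer idea.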

To test this concept, the researchers took hourly measurements of the radon and thoron concentration in the Seongryu Cave, in eastern Korea’s Seonyu Mountain, over a period of 13 months. The cave – which formed around 250 million years ago – is around 330 m long and varies from 1 to 13 metres in height. Recordings were taken in a part of the cave that is isolated from the air flow from the outside, preventing any thoron anomalies that may arise from a wind-induced surface flow along the cave floor.

Unexpected peaks

An unusually large peak in thoron concentration – above those caused by seasonal variations or daily temperature fluctuations, and not explainable by a precipitation event – was recorded in February 2011, a month before the magnitude-9.0 Tohoku earthquake struck Japan, 1200 km away. In contrast, radon peaks were observed not only during February but also in the preceding summer, when atmospheric stratification is believed to trap radon more effectively within the cave system. While the thoron measurements alone are capable of recording earthquake signals, Kim says, the anomalous peaks were clearer when plotted in tandem with radon activity.

The single station used in the study would not be able to localise or assess the magnitude of an impending earthquake, but the team suggest this may be done using a large network of such detectors. Though the researchers undertook their measurements in a natural limestone cave system, they report that the principle could also be applied to man-made caverns, as the method does not depend on a particular rock lithology.

Heiko Woith, a hydrogeologist at the Helmholtz-Zentrum Potsdam in Germany who was not involved in the Korean team’s work, is sceptical about the new method. “The length of the time series is too short to judge the reliability of a precursor,” he says, cautioning that a non-tectonic origin for the thoron anomaly still cannot be ruled out. “Certainly, the radon–thoron approach is interesting to follow in future studies, but it is premature and misleading to call it a new ‘reliable earthquake precursor’ at this stage,” he concludes.

With this initial study complete, the researchers are now looking to further explore the potential of their radon–thoron technique by setting up a remote monitoring system within an artificial cave, powered by a solar panel on the surface. Ultimately, Kim suggests, these systems might be deployed on a larger scale.

The research is described in Scientific Reports.

Keep it brief

By James Dacey

“Brevity is a great charm of eloquence,” said the great Roman orator Cicero. A new study published today suggests that researchers would be wise to follow Cicero’s advice when it comes to choosing a title for their next academic paper. Data scientists at the University of Warwick in the UK analysed 140,000 papers and found that those with shorter titles tend to receive more citations.

Similar studies have been carried out in the past, with contradictory results. But Adrian Letchford and his colleagues have used two orders of magnitude more data than previous investigations, looking at the 20,000 most cited papers published each year between 2007 and 2013 in the Scopus online database. Publishing their findings in Royal Society Open Science, Letchford’s group reports that papers with shorter titles garnered more citations every year. Titles ranged from 6 to 680 characters including spaces and punctuation.
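The kind of relationship the Warwick team reports can be probed with a rank correlation. The sketch below uses made-up numbers (not the Scopus data) and a stdlib-only Kendall tau to show how one would test whether shorter titles go with more citations:

```python
def kendall_tau(x, y):
    """Kendall rank correlation: +1 for fully concordant pairs, -1 for discordant."""
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

# Hypothetical (title length in characters, citation count) pairs
title_len = [35, 48, 62, 80, 95, 120, 150, 210]
citations = [310, 240, 260, 180, 150, 90, 110, 40]

tau = kendall_tau(title_len, citations)
print(f"tau = {tau:.2f}")  # negative: longer titles, fewer citations
```

A strongly negative tau on real data would support the reported trend; the Warwick study of course also had to control for journal and year effects, which this toy example ignores.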


Single photons see the light

A new trick that enables photons to interact with one another has been developed by physicists in Canada. Using an ultracold gas of rubidium atoms, the researchers have shown that a single photon can have a measurable effect on the state of a separate photon beam. They say that the result marks an important step on the road to developing quantum computers that encode information using light, rather than matter.

In many ways, photons are the ideal data carrier for quantum information systems, but, given their lack of charge, they do not interact with each other. This is a particular problem in the development of quantum computers – the logic gates used in classical or quantum computers require the entities that encode bits to interact with one another.

Photonic interplay

To get round this problem, matter could be used as an intermediary – a material’s atomic state could be altered by one photon, and that in turn would change the state of a second photon. But this effect is very weak – only since the development of the laser in 1960 has it been possible to produce high-intensity light beams that have a measurable effect on each other. Unfortunately, quantum technology requires that data be encoded in individual photons, rather than in beams.

In the latest work, Aephraim Steinberg and colleagues at the University of Toronto have shown how a single photon – a “signal” beam – fired into a gas of rubidium-85 atoms cooled down to a few micro-kelvins can alter the state of a “probe” beam that travels through the gas in the opposite direction. To do so, they tune the frequency of the probe to equal that of one of rubidium’s principal transitions. They then use a third laser to etch out a very narrow “transparency window” within the absorption line so that the probe can travel unimpeded through the gas. The job of the signal photons is then to very slightly modify the rubidium’s resonant frequency, which allows the probe photons to be momentarily absorbed and re-emitted, so changing the probe’s phase and delaying it slightly.

Clicks and shifts

While the interaction of a single photon with a beam has been studied in the past, no experiment had actually used a true single “signal” photon – multi-photon contributions were instead corrected for in later calculations. While Steinberg and colleagues also do not generate true single-photon beams, they reduce the intensity of the signal to the point where it contains either one photon or none, and use a detector that “clicks” only if the signal contains a photon. By continuously measuring the phase of the probe after it has crossed the gas, they can establish whether or not the clicks and phase shifts go hand in hand.

Repeating this process many times, the researchers did observe such a correspondence – they found that the single photons, on average, rotate the phase of the probe beam by about one thousandth of a degree. “This dependence arises because before we measure the signal, we have entanglement between that beam, with an uncertain photon number, and the probe beam, which thereby picked up an uncertain phase,” says Steinberg.

Going from this result to a working all-optical quantum logic gate will require much additional work, however. For one thing, the effect needs to be much bigger. According to Steinberg, a phase shift of a few degrees per single photon might be enough for a working computer. While the team’s existing technique probably won’t allow that, increasing the phase shift by upping the density of the rubidium atoms may help.

Practical considerations

Another challenge will be generating interactions between two sets of single photons, rather than an individual photon and a beam of photons. The solution there, says Steinberg, might be to use the probe beam as a “quantum bus” (used to store or transfer information between independent qubits), which can interact with multiple signal beams and thereby set up connections.

Steinberg acknowledges that physicists have long been trying to generate phase shifts of 180°. A rival technique may succeed – it involves firing pairs of photons into a cloud of rubidium atoms such that the energy of one of the photons is shared with several of the atoms, and this “Rydberg state” then changes the gas’s refractive index for the other. But he points out that, in 2006, Jeffrey Shapiro at the Massachusetts Institute of Technology calculated that such a large phase change would introduce enough noise into the photon–atom system to destroy the delicate quantum state. “It remains an open question as to whether one can exploit loopholes in his theorem,” says Steinberg.

The research is published in Nature Physics.

Inside the particle pyramid

Archaeologists believe Teotihuacan was established around 100 BC, before growing to become one of the largest settlements of ancient times and home to an estimated 125,000 people. The Teotihuacanos were contemporaries of the Mayans. But while it is clear that the Teotihuacanos were aware of the Mayans, very little else is known about this mysterious civilization. One of the big mysteries surrounds the city’s leaders – who were they and where were they buried? Just as the ancient Egyptians entombed their pharaohs in pyramids, perhaps the Teotihuacano leaders lie hidden within the Pyramid of the Sun.

To answer this question, archaeologists turned to a group of physicists led by Arturo Menchaca Rocha of the National Autonomous University of Mexico (UNAM). Menchaca’s team has tried to peer inside the pyramid using muons – charged elementary particles that rain down on the Earth’s surface, produced when cosmic rays interact with the atmosphere. Being roughly 200 times heavier than electrons, muons are able to penetrate dense materials such as rock, but in doing so they lose energy and their paths can be deflected. By placing a detector in a tunnel beneath the pyramid, Menchaca’s team has been searching for cavities that could be secluded chambers.

During their recent visit to Mexico, our reporters were invited to scramble along the dark tunnel beneath the pyramid to see the detector for themselves on the final day before the equipment was dismantled. Listen to the podcast to find out what they experienced and whether or not Menchaca has discovered any hidden chambers. You can also see Menchaca talking about the particle-pyramid project in this video interview recorded on the day of the visit.

This podcast was produced in conjunction with a Physics World special report on Mexico to be published in September. A free-to-read digital version of that issue will be available from the beginning of September via the Physics World app, available from the App Store and Google Play. That issue contains a feature about how Menchaca has now teamed up with geoscientists to apply a variation of this muon technique to look inside Mexico’s most famous volcano, Popocatépetl. This fiery mountain has woken up in recent years and poses a big threat to the vast urban areas of Mexico City and Puebla. The Mexican authorities are keen to have a system in place at the volcano that is capable of predicting when it is likely to erupt and how dangerous those events might be.

Joint quantum-computing venture is a first for China

The global effort to develop practical quantum computers got a boost this month with the inauguration of a dedicated laboratory in Shanghai, China. The new lab – a joint venture between the Chinese Academy of Sciences (CAS) and the Chinese online retail giant Alibaba – aims to develop a general-purpose prototype quantum computer by 2030.

The new CAS–Alibaba Quantum Computing Laboratory’s interim goals include the coherent manipulation of 30 quantum bits (qubits) by 2020, and quantum simulation with calculation speeds equivalent to those achieved by today’s fastest supercomputers by 2025. This ambitious series of five-year plans will be supported by an annual injection of $5m from Alibaba’s cloud-computing subsidiary, Aliyun, over the next 15 years.

Novel partnership

Chaoyang Lu, a member of the new lab, which is located in the centre of Shanghai, acknowledged that this goal is “extremely challenging”, citing the massive technical difficulties involved in making quantum computing a reality. Lu, who is also a quantum physicist at the University of Science and Technology of China (USTC), noted that this type of partnership is new in China, calling it the first “large-scale investment in fundamental science” by a privately run business.

The lab’s director and chief scientist, Jianwei Pan, told physicsworld.com that the money from Aliyun will be earmarked for recruitment, while annual operating costs (which he estimates at a few tens of millions of US dollars) will come from government agencies, including CAS. Eventually, he said, the lab could become home to “some 100 scientists” from around the globe.

“The CAS–Alibaba Quantum Computing Laboratory will undertake frontier research on systems that appear the most promising in realizing the practical applications of quantum computing. The laboratory will combine the technical advantages of Aliyun in classical calculation algorithms, structures and cloud computing with those of CAS in quantum computing, quantum analogue computing and quantum artificial intelligence, so as to break the bottlenecks of Moore’s Law and classical computing,” says Pan.

Big data

A co-operation agreement was signed by the CAS president, Chunli Bai, and Alibaba’s chief technical officer, Jian Wang, at the lab’s official inauguration on 30 July. According to Wang, the company’s investment in advancing quantum computing and its related technologies “reflects the scale and clarity of [Alibaba’s] long-term vision to collaborate with partners in an ecosystem modelled towards the sustained development of the economy and society. New discoveries in information security and computing capacity based on quantum computing could be as significant in the future as big data technologies are today.” Managers at the new facility will report directly to Aliyun, and will also have the option to buy any intellectual property that stems from the lab’s research.

“As an Internet company, we have been paying close attention to upcoming computing technologies,” says Shuanlin Liu, the chief architect of Alibaba Infrastructure Service at Aliyun. “With CAS’s prowess in quantum physics and our strength in cloud computing, we are pretty confident that the lab’s five-year plans will work out in time.”

Mimicking the Martian surface to test space devices

Mars has become an important target for planetary exploration, in part because there are several theories that claim Martian conditions are ideal for prebiotic life. The question of whether life currently exists on Mars, or has existed in the past, is therefore of direct relevance to the origin of life on Earth – and a question that is still very much open.

Since the first successful Mars “fly-by” by NASA’s Mariner 4 mission 50 years ago, there have been more than 40 spaceflights to our planetary neighbour. Although many have ended in failure, these missions have changed our view of Mars. There are several spacecraft currently orbiting the red planet and collecting imaging and spectroscopic data in order to survey the Martian geology and radiation environment. The past decade has also seen several landers and rovers delivered safely to the surface of Mars, which has opened up the potential for further exploration. NASA’s Spirit and Opportunity rovers in particular have sent back stunning pictures of the dusty Martian landscape and collected valuable information about the planet’s potential for supporting life.

Instruments under pressure

The high price of space missions means that it is vital that instruments perform reliably on the Martian surface. Electronic and mechanical devices that operate under the pressures and temperatures found on Earth will not necessarily work on Mars, where the atmospheric pressure is about 100 times lower and the atmosphere comprises 95% carbon dioxide. The thin atmosphere also means there are very high levels of UV radiation at the Martian surface, where the temperature varies from 20 °C to –150 °C depending on the latitude, season and time of day. Instruments destined for Mars must therefore be calibrated under Martian atmospheric conditions as far as possible.

Planetary simulation chambers have become vital for optimizing the functions of onboard space instruments. At the Spanish Astrobiology Centre (CAB, CSIC-INTA) in Madrid, we have recently developed an environmental simulation chamber for testing new electromechanical devices and instruments that could be used on missions to Mars and elsewhere in space. Called MARTE, it is an advanced vacuum vessel designed to regulate surface and environment temperatures, solar radiation, total pressure and atmospheric composition.

Having these capabilities in the same experimental environment gives MARTE several advantages when compared with other chambers – the most important being versatility. MARTE has a modular design that allows its total volume and shape to be modified in order to test instrumentation and samples of different types and sizes. Its pressure ranges from 1000 mbar to 10^–6 mbar, while the temperature can range between 108 K and 423 K. The device allows users to simulate solar illumination at different azimuths and UV-radiation levels, while a quadrupole mass spectrometer enables precise control of the gas composition at different pressures.

Testing conditions

A vacuum chamber

MARTE was first conceived in 2009 and took two years to build. Until now, the chamber has been used primarily to test environmental sensors onboard NASA’s Mars Curiosity rover under real working conditions. This involves calibrating the pressure sensor to provide accurate readings even when sudden pressure variations occur during Martian storms.

MARTE can also simulate dust deposition using a custom-built vibration system. Dust suspended in the Martian atmosphere is one of the most critical meteorological phenomena affecting surface instrumentation. Spacecraft and rovers that have been sent to Mars so far, including Curiosity, have been severely affected by dust accumulating on solar panels and optical instrumentation. To address this problem, we have installed a mechanical system in MARTE that simulates such conditions using dust that has the same colour, chemical composition and density as dust found on Mars. This allowed us to measure the resulting attenuation in the output of UV sensors, for instance, and to evaluate the performance of the sensors as they were operating.

With Mars missions focusing increasingly on the search for extraterrestrial life, we have teamed up with colleagues at CAB to develop the Signs Of Life Detector (SOLID). This instrument analyses soil samples to look for the presence of life based on antibody microarray technology that can detect traces of microorganisms or other biological supra-molecular structures. The detector was placed in MARTE to investigate how it would perform at typical Martian pressures and in a carbon-dioxide-rich atmosphere.

Payload-ready

Among the parameters optimized to ensure that SOLID will work on future space missions are its electronics, its heat-dissipation structures and its vacuum-compatible materials, which must withstand extreme temperature variations. These tests have been of great relevance for understanding the behaviour of SOLID’s ultrasonic and fluidic systems, for example, and helped us identify the type of pumps and valves required. Thanks to these tests, SOLID is now at an advanced stage that makes it a competitive instrument as a payload for future life-detection missions to Mars.

MARTE will also perform tests for the Mars Environmental Dynamics Analyzer (MEDA) meteorological station, which integrates pressure, wind and humidity sensors and is one of the instruments planned for NASA’s Mars 2020 mission. The Mars 2020 rover will continue with the objective of its predecessor Curiosity to explore the Martian environment for signs of life, and MARTE is an essential platform for validating MEDA instrumentation.

Indeed, one year after its first trials, MARTE’s unique capability to simultaneously control very different atmospheric parameters and to be adapted to different set-ups is proving a crucial tool for all researchers interested in sending instrumentation to the red planet.

Vacuum environments aid nano exploration

It is 100 years since father-and-son team William and Lawrence Bragg won the Nobel Prize for Physics for their discovery of X-ray crystallography. This powerful technique exploits the fact that X-rays produce a characteristic diffraction pattern when they pass through a crystalline sample. From these patterns researchers can infer the atomic structure of matter – ranging from new materials to biological molecules.

Pioneering crystallographers relied on low-intensity X-ray sources similar to those found in hospitals to determine the structure of simple crystals such as salt and, later, the famous double helix of DNA. Since the 1980s, however, researchers have had much more powerful synchrotron X-ray light sources at their disposal. These enable studies not only of the basic crystalline structure of materials, but also their assembly, interfaces, defects and transformations. More than 30 major synchrotron light sources are currently in operation worldwide and are used by tens of thousands of researchers each year, spanning numerous scientific disciplines.

The extremely high brightness and excellent stability of synchrotron radiation, combined with optical elements that can focus X-ray beams to spot sizes just tens of nanometres across, allow researchers to probe isolated nanostructures or interrogate small volumes within complicated heterogeneous systems such as semiconductor devices. The creation of coherent X-ray beams with well-defined wavefronts is a crucial aspect of the technique, and recent advances in coherent X-ray scattering with smaller spot sizes are providing new insight into nanoscale materials.

Serious challenges

Applying these emerging X-ray techniques in controlled sample environments – particularly at elevated temperatures under vacuum or controlled gas environments – provides even deeper insight into processes relevant to materials synthesis. It allows researchers to study critical processes, such as the atom-by-atom growth of thin films for electronic and optics devices, under conditions relevant to manufacturing processes. The technique also allows users to study a material’s intermediate structures, which could be unstable at room temperature or under atmospheric pressure. Without isolating samples under controlled vacuum environments it is impossible to probe these subtle physical and chemical processes, but the use of X-ray nanobeams also presents serious challenges for vacuum technology.

Applying X-ray nanobeam techniques in non-ambient conditions is hindered mainly by mechanical constraints. Nanobeam diffraction requires creating instruments that are very robust against displacements caused by ambient acoustic or mechanical noise and also robust against angular instabilities. The relative displacement between the sample and focusing optics must be much less than the beam size (typically 100 nm today), while the sample’s orientation must be stable to within 1 mdeg. The latter is equivalent to the angle subtended by a penny observed at a distance of 1 km.
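That penny comparison is easy to check. Assuming a coin roughly 20 mm across (the text does not specify which penny), the small-angle arithmetic does indeed give about 1 mdeg:

```python
import math

coin_diameter = 0.020   # m, assumed ~20 mm coin
distance = 1000.0       # m

angle_rad = math.atan(coin_diameter / distance)   # small angle: ~d/L
angle_mdeg = math.degrees(angle_rad) * 1000
print(f"{angle_mdeg:.2f} mdeg")   # ~1 mdeg, as stated
```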

X-ray diffraction experiments also involve repeated measurements in the same small region of the sample, which requires the position of optical elements to be stable to within a few microns per hour. Finally, the much shorter wavelength of X-rays compared with visible light means that the working distances of X-ray optics are comparatively large, such that the focusing optics can be placed up to a few centimetres away from the sample. Even so, that means squeezing the vacuum environment into a space far smaller than the typically 0.5–1 m diameter of traditional vacuum chambers used for materials research.

Integrated sample environments

Researchers from the University of Wisconsin in the US and the European Synchrotron Radiation Facility (ESRF) in Grenoble, France, have recently developed a new approach with which to integrate precise vacuum sample environments with X-ray nanobeam experiments. The result is a series of high- and ultrahigh-vacuum (UHV) sample environments that are compact and light enough to be integrated with nanopositioners and X-ray nanobeam optics at modern synchrotrons.

A high-vacuum sample chamber

The initial device – a UHV environment – was integrated with a hexapod nanopositioner that enabled the collection of maps and diffraction patterns without degrading resolution or precision. The set‑up contains small sputter-ion vacuum pumps that have no moving parts and thus exhibit reduced vibrations and better positional and orientation stability compared with turbomolecular pumps. The UHV environment is maintained in a stainless-steel chamber with metal seals and welded thin beryllium windows, and the device weighs just 1.4 kg. A series of demonstration experiments probed the motion of gold atoms on a silicon surface, resulting in the formation of large, 100 nm-scale gold crystals (Rev. Sci. Instrum. 84 113903). Similar processes of atomic transport on surfaces are relevant for developing next-generation semiconductors such as 2D chalcogenides and graphene.

A true UHV environment, however, is not required for all experiments and in many cases a high-vacuum environment is sufficient. Examples include aging processes of devices for which the critical parts are often not exposed at the surface, and thin-film phase transitions for which sub-monolayer-scale control of the surface composition is not required. The team at the ESRF’s ID01 beamline has thus subsequently developed a compact aluminium high-vacuum chamber evacuated with a turbomolecular pump and equipped with dome-shaped X-ray windows made either from beryllium metal or from polymers. Requiring only high vacuum reduces the mass of the system further, allowing users to combine the cell with piezoelectric positioners that allow samples to be moved with high precision.

Scanning X-ray diffraction microscopy at elevated temperatures in vacuum has become an important method of investigation for materials research and condensed-matter physics at ESRF’s dedicated X-ray nanodiffraction beamline. For example, the ESRF team’s high-vacuum environment has recently been used to study interfacial processes that can distort silicon’s crystal lattice (Appl. Phys. Lett. 106 141905). Integrating silicon with an ever-increasing spectrum of materials has been instrumental in advancing integrated-circuit technology. The combination of vacuum environments and X-ray nanobeam techniques allows the chemical reactions and atomic motion that determine the properties of these interfaces to be studied at the small length scales relevant to devices.

Fast-paced expansion

The next several years will see an increasingly fast pace of development of X-ray light sources, with orders-of-magnitude improvements in brightness provided by new synchrotrons under construction. Experimental facilities based in the target areas of these sources will vastly extend the application of X-ray nanobeams and expand the scientific community employing these techniques. The creation of vacuum environments with even lower mass and greater stability has an important role in enabling the materials-research community to apply these advances to challenging nanoscale structural phenomena.

Hydrogen sulphide is warmest ever superconductor at 203 K

Hydrogen sulphide becomes a superconductor at the surprisingly high temperature of 203 K (–70 °C), when under a pressure of 1.5 million bar, according to recent work done by physicists in Germany. This smashes the previous record for conventional superconductivity and takes it above the lowest temperature directly recorded at ground level on Earth (–89.2 °C or 184 K) for the first time. The researchers say the discovery could be a major step towards room-temperature superconductivity.

Superconductors conduct electricity with zero resistance below a critical temperature. A second key characteristic is that below the critical temperature they expel magnetic fields – this is dubbed the Meissner effect. The ultimate goal is a superconductor that works at room temperature. This would dramatically improve the efficiency of electricity generation and transmission, and make current uses of superconductivity, such as superconducting magnets in particle accelerators, much simpler.

Conventional or not?

“There is theoretically no limit for the transition temperature of conventional superconductors, and our experiments give reason to hope that superconductivity can even occur at room temperature,” says Mikhail Eremets of the Max Planck Institute for Chemistry in Mainz, Germany, who led the research together with physicists at the Johannes Gutenberg University Mainz.

In conventional superconductivity, vibrations in a material’s crystal lattice bind electrons together in pairs that can flow without resistance. Lighter elements are thought to be better because their atoms can vibrate at higher frequencies, facilitating superconductivity at higher temperatures. Although superconductors have been found by looking at such materials, the highest critical temperature achieved so far is 39 K in magnesium diboride. Superconductivity has been achieved at 164 K at high pressure in copper-oxide systems, but these are not conventional superconductors. Also, as the mechanism of superconductivity is not fully understood, achieving higher critical temperatures is difficult. Calculations have shown that hydrogen, the lightest element, should be a conventional superconductor at room temperature. But superconductivity in pure hydrogen has proved elusive, leading people to look at hydrogen-rich materials instead.

Under pressure

Eremets and his colleagues focused on hydrogen sulphide (H2S) because it is relatively easy to handle and is predicted to become a superconductor at around 80 K under high pressure. They found that when samples of hydrogen sulphide were placed under extreme pressure – around 1.5 million bar (150 gigapascals) – in a diamond-anvil cell and cooled below 203.5 K, they had zero electrical resistance and their magnetization decreased sharply, confirming the superconducting state.
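In practice, the transition shows up as an abrupt drop of the resistance to zero on cooling. A toy sketch of how such a transition temperature might be read off – using synthetic data, not the actual measurements from the paper:

```python
# Synthetic cooling sweep: resistance is finite above the transition and
# drops to zero below it. The data here are made up for illustration.
temps = [250.0 - i for i in range(101)]               # temperatures in K
resist = [1.0 if t > 203.5 else 0.0 for t in temps]   # idealised step

# The transition temperature is the first zero-resistance point on cooling.
tc = next(t for t, r in zip(temps, resist) if r == 0.0)
print(tc)  # 203.0 for this idealised sweep
```

Real data are noisier, and the magnetization measurement provides the independent check that the drop really is superconductivity rather than, say, a short circuit.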

The researchers believe that, under pressure, hydrogen sulphide decomposes and changes from H2S to H3S. They propose that this high-pressure hydrogen sulphide is a conventional superconductor, with the superconductivity originating in the crystal lattice. “Our research into hydrogen sulphide has shown that many hydrogen-rich materials can have a high transition temperature,” says Eremets.

Writing for Nature News & Views, Igor Mazin, of the Naval Research Laboratory in Washington DC, described the discovery as “the holy grail of superconductors”. Damian Hampshire, head of the superconductivity group at Durham University, who was not involved in the work, agrees that it is an exciting experimental result. “It points out the types of materials that might provide room-temperature superconductivity. But mostly, it reminds us all how little science we understand and how much more there is to discover,” he says.

Not everyone is convinced, however, by the conventional superconductivity interpretation. “I don’t believe that this is conventional superconductivity arising from high-frequency vibrations of hydrogen,” says Jorge Hirsch, a theoretical physicist at the University of California, San Diego, who was also not involved in Eremets’ work. “If it is superconductivity, I believe it is unconventional superconductivity arising from holes conducting through sulphur anions.” He adds that, to find other such high-temperature superconductors, researchers should look at sulphur-containing compounds under very high pressures.

The research is published in Nature.

Queer in STEM, an astronomy rumpus and the heat from a fan


By Matin Durrani

Our eyes were drawn this week to the results of the first national US survey of the experiences of lesbian, gay, bisexual, transgender, queer or asexual (LGBTQA) people working in science, technology, engineering and mathematics (STEM) subjects. Entitled Queer in STEM, the study was carried out by Jeremy Yoder, a plant-biology postdoc at the University of Minnesota, and Alison Mattheis, who is on the faculty of the College of Education at California State University Los Angeles.


Physicists create a magnetic wormhole in the lab

Wormholes are normally the stuff of science fiction, but new research carried out by a group of physicists in Spain has shown that it is possible to build the magnetic equivalent of a wormhole that is capable of transporting a magnetic field from one point in space to another. The team has built a spherical wormhole, made from ferromagnetic and superconducting components, that is magnetically cloaked. The researchers say that the work could prove particularly useful in magnetic resonance imaging.

Wormholes are hypothetical “topological features” or tunnels that, in theory, would connect two distant regions of space–time, via higher dimensions. They are predicted by certain solutions of general relativity, as a result of massive objects severely distorting space–time, but have never been observed in nature and would be exceedingly difficult to recreate in the lab.

Manipulated magnets

In the latest work, Alvaro Sanchez and colleagues at the Autonomous University of Barcelona have instead designed and constructed a wormhole for a magnetic field. The 9 cm-diameter sphere guides and cloaks a magnetic field from a dipole source placed on one side of it, such that field lines appear to emanate from a monopole on the other side of the sphere. Indeed, it seems as if the field takes an invisible shortcut through the intervening space. “You see the apparatus with your eyes,” says Sanchez, “but magnetically it is undetectable. It is like the field lines have gone through another spatial dimension.”

The sphere has three parts. Running through it is a tube made from a thin sheet of a ferromagnetic nickel–iron alloy wound into a spiral, which transports the applied field. This “magnetic hose” is needed because magnetic fields decay rapidly with distance; Sanchez and co-workers realized an alternative design for such a hose two years ago.
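The need for guiding follows from how quickly a free dipole field falls off: as the cube of the distance. A minimal sketch using the standard on-axis point-dipole formula – the function and the numbers are hypothetical, not taken from the paper:

```python
from math import pi

MU0 = 4e-7 * pi  # vacuum permeability, in T*m/A

def dipole_b_on_axis(moment: float, r: float) -> float:
    """On-axis field (tesla) of a point dipole of moment `moment` (A*m^2)
    at distance r (m): B = mu0 * m / (2 * pi * r**3)."""
    return MU0 * moment / (2 * pi * r ** 3)

# Over the 9 cm span of the sphere, an unguided dipole field at 9 cm is
# 9**3 = 729 times weaker than at 1 cm -- hence the ferromagnetic "hose".
ratio = dipole_b_on_axis(1.0, 0.01) / dipole_b_on_axis(1.0, 0.09)
print(ratio)
```

The hose carries the field across that span essentially undiminished, while the shell hides the hose from the outside world.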

The other two parts then make up the spherical shell: a layer of superconducting tape surrounded by an array of (high-permeability) plates of the alloy. The superconductor repels external magnetic fields, thereby magnetically isolating the hose within; while the array compensates for the fact that the superconductor distorts any external field as it repels it. Designing the array so that it precisely cancels the magnetic signature of the superconductor required extensive computer modelling, notes research co-author Jordi Prat-Camps.

To put its wormhole to the test, the team placed the device in an external magnetic field created by a pair of Helmholtz coils and then inserted magnetic probes at two points – at the exit of the wormhole and alongside it. As intended, the first probe revealed a monopole-like field. The second probe, when moved back and forth, showed no distortion of the external field (which was not the case when either of the two shell layers was removed).

The work builds on a theoretical proposal put forward by Allan Greenleaf of the University of Rochester in the US and colleagues in 2007. Greenleaf’s group actually outlined the design of a wormhole that would apply to electromagnetic waves in general, including light, and not just static magnetic fields. But Sanchez explains that actually building such a device would be extraordinarily difficult – it would require extreme values of magnetic permeability, as well as harder-to-manipulate electrical permittivity. Sanchez points out that the new device is not the world’s first magnetic cloak, but he believes it is the first such cloak to work in three dimensions, rather than two. He also reckons that it is the first ever artificial wormhole of any variety.

Unfortunately, Sanchez does not think that it can help improve our understanding of putative cosmological wormholes. Those objects would rely on the distortion of space–time itself, rather than on a particular field within space–time, and as such would require enormous sources of gravity. “Scientists are very good at manipulating magnetic fields,” he says, “but we don’t have the same mastery over gravitation.”

Manifold magnetic applications

However, the ability to isolate magnetic fields from one another could prove useful in applications such as MRI – potentially allowing simultaneous imaging of different parts of the body. “Our main motivation was scientific,” says Sanchez, “but since magnetic fields are used in so many different things, our device could have many potential applications.”


Tie Jun Cui, an electrical engineer at Southeast University in China who was not involved in the work, agrees with Sanchez, and believes that it might now be possible to make analogous devices for sound or heat, for example. “Considering the various acoustic-cloak designs proposed in recent years, it should not be difficult to design an invisible acoustic tunnel,” he says.

John Pendry of Imperial College in London, whose technique of transformation optics has been used to develop cloaking devices, notes that there is nothing, in principle, to prevent the magnetic wormhole from working as advertised, but would like further details from the Spanish researchers before endorsing their work. “They don’t give too many details of their design, nor are there any computer simulations to refer to,” he says.

The wormhole is described in Scientific Reports.

Copyright © 2025 by IOP Publishing Ltd and individual contributors