
The blue fog

The secret of the blue fog
Resolution to long-standing mystery could lead to new display devices

    New research has shed light on a long-standing mystery that has perplexed physicists and chemists for over a century. Solving the secret of the “blue fog” proved to be an intellectual tour de force – and one that could lead to new types of display devices. Find out more by reading this feature article from the April issue of Physics World, written by Oliver Henrich and Davide Marenduzzo, who were involved in the latest work.

    Flash Physics: Antiprotons from dark matter, lab visits do not inspire, CERN’s new linac, Australia at ESO

    Antiproton excess linked to dark matter

An unexplained excess in the number of antiprotons detected by the Alpha Magnetic Spectrometer (AMS) is related to the annihilation of dark-matter particles, according to two independent studies. Dark matter is a mysterious substance that appears to account for most of the matter in the universe. While its existence can be inferred indirectly from a number of different astronomical phenomena, dark-matter particles have never been detected directly. Writing in Physical Review Letters, Alessandro Cuoco and colleagues at RWTH Aachen University in Germany describe how they analysed antiproton, proton and helium cosmic-ray detection rates by AMS – which is located on the International Space Station – and other experiments. They found that the creation of antiprotons by the annihilation of dark-matter particles with masses of about 80 GeV/c² provided the best explanation for why AMS has detected more antiprotons than expected from conventional astrophysical processes. In the same issue of the journal, Ming-Yang Cui of the Chinese Academy of Sciences and colleagues describe an independent analysis of the antiproton excess, which suggests that it is the result of annihilating dark-matter particles with masses in the 40–60 GeV/c² range.

    Students not choosing science despite extra activities

Science-related extracurricular activities do not encourage students to study science, technology, engineering and mathematics (STEM) subjects at high school, according to a study by Pallavi Amitava Banerjee from the University of Exeter in the UK. Banerjee tracked the educational progress of 600,000 teenagers from the start of secondary school (age 11–12) to A-level examinations (age 18). Using data from the National Pupil Database and activity providers, she examined whether students were more likely to choose STEM subjects for their A-levels if they had taken part in engagement activities such as trips to labs, special practical lessons or visits to STEM centres. Presenting the results in Review of Education, Banerjee highlights that there is little evidence linking the two. For example, the proportion of students taking physics A-level was 5% among those who had taken part in enrichment activities, compared with 4.3% among those who had not. On the other hand, extra activities were slightly more beneficial for children aged 11–14 than for those aged 14–16. “Of course there are many factors which can affect the decisions young people make about the subjects they choose to continue studying at age 16,” says Banerjee. “It is essential for policymakers to consider whether, if these schemes are not working, perhaps the money could be spent elsewhere. Given the range of schemes being run it is also crucial to understand if any work better than others. Knowing the answer to this could help ensure money is spent on only the highest quality activities.”

    CERN completes new linear accelerator

    Photograph of Linac 4 at CERN

The CERN particle-physics lab near Geneva has built its first accelerator since the completion of the Large Hadron Collider (LHC) in 2008. Linear Accelerator 4 (Linac 4), which is around 90 m long and took a decade to construct, will be used to accelerate beams of negative hydrogen ions to 160 MeV. When Linac 4 is connected to CERN’s accelerator complex at the end of 2019, the 160 MeV beam will be sent to the Proton Synchrotron Booster, which will strip away the electrons and accelerate the resulting protons before they enter the Proton Synchrotron, the Super Proton Synchrotron and finally the LHC. Linac 4 will now undergo “extensive” commissioning and is expected to replace Linac 2, which has been in operation since 1978. The new accelerator will be part of the High-Luminosity LHC upgrade, which will see the LHC’s luminosity increase five-fold by 2025.

    Australia gains access to ESO telescopes in Chile

Astronomers in Australia will gain access to European Southern Observatory (ESO) telescopes in Chile in 2018 under a new agreement involving an A$26m payment to the ESO. Australia has also committed to ongoing funding of the telescopes until 2028, at an average annual rate of A$12m, and Australian astronomers and companies will be involved in developing new technologies for the telescopes. Chris Tinney at the University of New South Wales Sydney says: “Australian astronomers have been seeking access to ESO for the past two decades.” Lisa Kewley, who chairs the Australian Academy of Science National Committee for Astronomy, adds: “This is great news for the future of Australian astronomy.” Nobel laureate and Australian National University vice-chancellor Brian Schmidt says access to ESO’s facilities and other infrastructure such as the next-generation Giant Magellan Telescope (GMT) and Square Kilometre Array (SKA) radio telescope is critical to the future of Australian astronomy. Tim de Zeeuw, the ESO’s director general, says: “The ESO community is well aware of Australia’s outstanding instrumentation capability, including advanced adaptive optics and fibre-optic technology.” He adds: “Australia’s expertise is ideally matched to ESO’s instrumentation programme, and ESO Member State institutions would be excited to collaborate with Australian institutions and their industrial partners in consortia developing the next generation of instruments.”


    • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on how time can be reversed in emulsions.

    Flash Physics: Diamonds emit randomly polarized photons, LHCb reveals flawed model, SETI names first fellows

    Diamond defects create randomly polarized photons

The first practical source of a randomly polarized stream of single photons has been created by physicists in Japan. The source is based on a negatively charged defect in diamond in which two adjacent carbon atoms are replaced by a nitrogen atom and a vacant lattice site. These “NV centres” have several properties that could make them useful for creating quantum-information systems – including the ability to emit single photons on demand. To date, work on single-photon sources has focused on supplying photons that are in specific polarization states. This is because quantum information can be encoded and transmitted in such polarization states. There are certain applications, however, that would benefit from a stream of photons in which the polarizations of successive photons are truly random and uncorrelated. Now, Keiichi Edamatsu, Naofumi Abe and colleagues at Tohoku University have shown that NV centres with a certain orientation with respect to the diamond lattice will emit randomly polarized photons. Writing in Scientific Reports, the team says that its source could find use as a random-number generator and also for performing tests on fundamental aspects of quantum mechanics.

    J/ψ measurement reveals flaw in collision simulations

    Photograph of the LHCb collaboration at CERN

    The production of J/ψ mesons in proton collisions in the Large Hadron Collider (LHC) at CERN does not agree with predictions made by a widely used computer simulation. That is the conclusion of physicists working on CERN’s LHCb experiment who have studied the jets of hadrons that are created when protons collide at 13 TeV. These jets contain large numbers of J/ψ mesons, which comprise a charm quark and a charm anti-quark. The LHCb team was able to measure the ratio of the momentum carried by the J/ψ mesons to the momentum carried by the entire jet. It was also able to discriminate between J/ψ mesons that were created promptly by the collision and J/ψ mesons that were created after the collision by the decay of other particles. Analysis of the data reveals that PYTHIA – a Monte Carlo simulation used to model high-energy particle collisions – does a poor job at predicting the momentum carried by prompt J/ψ mesons. The possibility of such a discrepancy had already been identified in theoretical work and has now been confirmed experimentally. The apparent shortcomings of PYTHIA could have a significant effect on how particle physics is done because the simulation is used both in the design of collider detectors and also to determine which measurements are most likely to reveal information about physics beyond the Standard Model of particle physics. The measurement is described in Physical Review Letters.
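
In case it helps to see the measured quantity explicitly, the momentum fraction used in such jet studies can be written in a commonly used form (our notation, not necessarily the collaboration’s exact definition) as

$$z \equiv \frac{p_{\mathrm{T}}(J/\psi)}{p_{\mathrm{T}}(\mathrm{jet})}, \qquad 0 < z \leq 1,$$

where $p_{\mathrm{T}}$ is the momentum transverse to the beam; it is PYTHIA’s prediction for the distribution of $z$ for prompt J/ψ mesons that disagrees with the LHCb data.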

    SETI Institute honours its first fellows

The Search for Extraterrestrial Intelligence (SETI) Institute has named its first fellows. Seth Shostak, Mark Showalter and Edna DeVore were honoured at SETI’s first annual gala fundraiser for their contributions to scientific research and outreach. Shostak has been with SETI for 26 years as its senior astronomer, overseeing the radio-observing programmes. He also hosts SETI’s radio show and podcast, and is the editor of the institute’s magazine Explorer. Senior scientist Showalter specializes in planetary rings and moons. Over the course of his 12 years at SETI, he has discovered three planetary rings and six moons, including Saturn’s Pan and Pluto’s Kerberos and Styx. DeVore is the institute’s director of education and during her 25 years at SETI she also served as acting chief executive for two years. She has led outreach and education projects for NASA missions, including SOFIA and Kepler, and oversees the Research Experience for Undergraduates programme, funded by the National Science Foundation. “Mark, Edna and Seth have distinguished themselves throughout their careers through groundbreaking work and an uncompromising commitment to excellence and innovation,” says William Diamond, who heads the SETI Institute.


    • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics.

    How will Brexit affect science in the rest of the EU?

    Brexit panel: left to right are Rolf Tarrach, Ole Petersen, Mark Ferguson and Gail Cardew

    By Hamish Johnston

    Here in the UK it’s easy to forget that our exit from the EU could have significant unintended consequences for scientists in the remaining 27 member nations.

    Yesterday, I was at a public forum called “Brexit: the scientific impact”, which was held at the Royal Institution in London. While there was much discussion about domestic challenges, the second session – “Brexit: the scientific impact on the EU-27” – provided a fascinating insight into the challenges facing the UK’s neighbours.


    Flash Physics: A fridge for quantum computers, mimicking nature’s colours, NASA launches quick-fire RAISE

    A nano-fridge for quantum computers

A nanoscale “refrigerator” that could cool quantum computers has been developed by scientists in Finland. The team from Aalto University has cooled down a qubit-like superconducting resonator by tunnelling single electrons through a 2 nm-thick insulator. The electrons are supplied with too little energy to tunnel directly, so they capture the remaining energy they need from the nearby quantum device, and this loss of energy cools the device. To switch off the quantum-circuit refrigerator, the external voltage is simply turned off, as the device being cooled cannot by itself provide enough energy to push an electron through the insulator. “I have worked on this gadget for five years and it finally works!” says team member Kuan Yen Tan. Next, the team hopes to apply its refrigerator to qubits, which suffer from unwanted switching between states when they become too hot. The researchers also want to lower the minimum temperature and increase the rate at which the cooling can be switched on and off. The work is presented in Nature Communications.
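
A rough energy-balance sketch of why this cools (our illustration, using a generic barrier energy $E_{\mathrm{b}}$ rather than the paper’s parameters): an electron can only tunnel if the bias voltage $V$ plus a quantum of energy absorbed from the resonator covers the cost,

$$eV + \hbar\omega \gtrsim E_{\mathrm{b}},$$

so when $eV$ is set just below $E_{\mathrm{b}}$, each tunnelling event removes one quantum $\hbar\omega$ from the resonator – and with $V = 0$ the process, and hence the refrigerator, switches off.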

    Mimicking nature’s vivid colours with transparent particles

    A simulation of the silica-coated black substrate

Scientists have long known that certain birds and butterflies get their vivid colours from structures in their wings and feathers that control how light is scattered and reflected, with the “structural colour” often changing depending on the angle at which the animal is viewed. However, Steller’s jay – a bright blue bird – has, underneath the light-scattering structures, a layer of black particles that absorbs any wavelengths scattered towards it, which makes the bird appear blue at all angles. Now, a team led by Yukikazu Takeoka of Nagoya University in Japan has recreated this layering effect. The researchers covered a black plate with layers of transparent, 190 nm silica particles that scatter and reflect the light. By controlling the thickness of the silica, they were able to control the colour intensity – if too thin, the coating was transparent, but if too thick, it became white. They found that a 1–2 μm-thick layer created a bright blue on a black background, while on glass it gave a much less vivid colour. Furthermore, Takeoka and his team tested different-sized silica particles, which scatter light to different degrees: they were able to create green using 260 nm particles and purple using 300 nm particles. The artificial structural colours, presented in Advanced Materials, could be useful for applications where light control is important, such as solar cells or adaptive camouflage.

    NASA launches quick-fire solar imager


NASA has successfully launched a mission to study the split-second changes that occur at the Sun’s most active regions. The Rapid Acquisition Imaging Spectrograph Experiment (RAISE) was launched on a sounding rocket on 5 May from New Mexico. The rocket climbed to an altitude of around 300 km, during which time the RAISE instrument took images every 0.2 s for five minutes. While several missions continuously study the Sun – such as NASA’s Solar Dynamics Observatory – some rapidly changing regions require dedicated observation. After taking some 1500 images, the RAISE payload parachuted back to Earth, where it is now being recovered. This was RAISE’s third flight, following launches in 2010 and 2014.


    • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics.

    Simulating the universe: solving Einstein’s equations of general relativity in a cosmological setting

    A visualization of a curved space–time “sea”

From the Genesis story in the Old Testament to the Greek tale of Gaia (Mother Earth) emerging from chaos and giving birth to Uranus (the god of the sky), people have always wondered about the universe and woven creation myths to explain why it looks the way it does. One hundred years ago, however, Albert Einstein gave us a different way to ask that question. Newton’s law of universal gravitation, which was until then our best theory of gravity, describes how objects in the universe interact. But in Einstein’s general theory of relativity, space–time (the marriage of space and time) itself evolves together with its contents. And so cosmology, which studies the universe and its evolution, became, at least in principle, a modern science – amenable to precise description by mathematical equations, able to make firm predictions, and open to observational tests that could falsify those predictions.

    Our understanding of the mathematics of the universe has advanced alongside observations of ever-increasing precision, leading us to an astonishing contemporary picture. We live in an expanding universe in which the ordinary material of our everyday lives – protons, neutrons and electrons – makes up only about 5% of the contents of the universe. Roughly 25% is in the form of “dark matter” – material that behaves like ordinary matter as far as gravity is concerned, but is so far invisible except through its gravitational pull. The other 70% of the universe is something completely different, whose gravity pushes things apart rather than pulling them together, causing the expansion of the universe to accelerate over the last few billion years. Naming this unknown substance “dark energy” teaches us nothing about its true nature.

    Simulating the universe
    How to model the cosmos with Einstein’s equations

Now, a century into its work, cosmology is brimming with existential questions. If there is dark matter, what is it and how can we find it? Is dark energy the energy of empty space – also known as vacuum energy or the cosmological constant, Λ, first introduced by Einstein in 1917? He added the constant after mistakenly thinking it would stop the universe from expanding or contracting, and so – in what he later called his “greatest blunder” – failed to predict the expansion of the universe, which was discovered a dozen years later. Or are one or both of these invisible substances figments of the cosmologist’s imagination, so that it is general relativity itself that must be changed?

At the same time as being faced with these fundamental questions, cosmologists are testing their currently accepted model of the universe – dubbed ΛCDM – to greater and greater precision observationally. (CDM indicates the dark-matter particles are cold because they must move slowly, like the molecules in a cold drink, so as not to evaporate from the galaxies they help bind together.) And yet, while we can use general relativity to describe how the universe expanded throughout its history, we are only just starting to use the full theory to model the specific details and observations of how galaxies, clusters of galaxies and superclusters form and evolve. How this happens is simple – the equations of general relativity aren’t.

      Horribly complex

      While they fit neatly onto a T-shirt or a coffee mug (see below), Einstein’s field equations are horrible to solve even using a computer. The equations involve 10 separate functions of the four dimensions of space and time, which characterize the curvature of space–time in each location, along with 40 functions describing how those 10 functions change, as well as 100 further functions describing how those 40 changes change, all multiplied and added together in complicated ways. Exact solutions exist only in highly simplified approximations to the real universe. So for decades cosmologists have used those idealized solutions and taken the departures from them to be small perturbations – reckoning, in particular, that any departures from homogeneity can be treated independently from the homogeneous part and from one another.
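
For reference, the equations that fit so neatly on the mug read, in standard notation,

$$G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu},$$

where the Einstein tensor $G_{\mu\nu}$ hides the complexity: the 10 functions are the independent components of the symmetric metric $g_{\mu\nu}$, the 40 are its first derivatives $\partial_\alpha g_{\mu\nu}$, and the 100 its second derivatives $\partial_\alpha \partial_\beta g_{\mu\nu}$.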

      This “first-order perturbation theory” has taught us a lot about the early development of cosmic structures – galaxies, clusters of galaxies and superclusters – from barely perceptible concentrations of matter and dark matter in the early universe. The theory also has the advantage that we can do much of the analysis by hand, and follow the rest on computer. But to track the development of galaxies and other structures from after they were formed to the present day, we’ve mostly reverted to Newton’s theory of gravity, which is probably a good approximation.
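
Concretely, “first order” means writing space–time as a smooth expanding background plus a small ripple, and discarding anything quadratic in the ripple. For scalar perturbations this is often written as

$$ds^2 = -(1 + 2\Phi)\,dt^2 + a^2(t)\,(1 - 2\Psi)\,\delta_{ij}\,dx^i\,dx^j, \qquad |\Phi|, |\Psi| \ll 1,$$

where $a(t)$ is the scale factor describing the average expansion and the small potentials $\Phi$ and $\Psi$ carry the nascent cosmic structures.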

      Einstein’s equations of general relativity on a coffee mug

      To make progress, we will need to improve on first-order perturbation theory, which treats cosmic structures as independent entities that are affected by the average expansion of the universe, but neither alter the average expansion themselves, nor influence one another. Unfortunately, higher-order perturbation theory is much more complicated – everything affects everything else. Indeed, it’s not clear there is anything to gain from using these higher-order approximations rather than “just solving” the full equations of general relativity instead.
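
Schematically, the perturbative expansion is

$$g_{\mu\nu} = g^{(0)}_{\mu\nu} + \epsilon\, g^{(1)}_{\mu\nu} + \epsilon^2\, g^{(2)}_{\mu\nu} + \dots,$$

and while each Fourier mode of $g^{(1)}$ evolves independently, the equations for $g^{(2)}$ are sourced by products of first-order modes – every mode talks to every other mode, which is why the bookkeeping explodes at higher orders.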

      Improving the precision of our calculations – how well we think we know the answer – is one thing, as discussed above. But the complexity of Einstein’s equations has made us wonder just how accurate the perturbative description really is. In other words, it might give us answers, but are they the right ones? Nonlinear equations, after all, can have surprising features that appear unexpectedly when you solve them in their full glory, and it is hard to predict surprises. Some leading cosmologists, for example, claim that the accelerating expansion of the universe, which dark energy was invented to explain, is caused instead by the collective effects of cosmic structures in the universe acting through the magic of general relativity. Other cosmologists argue this is nonsense.

      Computers are finally becoming fast enough that modelling the universe using the full power of general relativity – without the traditional approximations – is not such a crazy prospect

      The only way to be sure is to use the full equations of general relativity. And the good news is that computers are finally becoming fast enough that modelling the universe using the full power of general relativity – without the traditional approximations – is not such a crazy prospect. With some hard work, it may finally be feasible over the next decade.

      Computers to the rescue

      Numerical general relativity itself is not new. As far back as the late 1950s, Richard Arnowitt, Stanley Deser and Charles Misner – together known as ADM – laid out a basic framework in which space–time could be carefully separated into space and time – a vital first step in solving general relativity with a computer. Other researchers also got in on the act, including Thomas Baumgarte, Stuart Shapiro, Masaru Shibata and Takashi Nakamura, who made important improvements to the numerical properties of the ADM system in the 1980s and 1990s so that the dynamics of systems could be followed accurately over long enough times to be interesting.
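
In the ADM form, the 4D line element is rebuilt from a lapse $N$, a shift $N^i$ and a spatial metric $\gamma_{ij}$ defined on slices of constant time:

$$ds^2 = -N^2\,dt^2 + \gamma_{ij}\,(dx^i + N^i\,dt)(dx^j + N^j\,dt).$$

Einstein’s equations then become evolution equations for $\gamma_{ij}$ (plus constraint equations) – exactly the kind of initial-value problem a computer can march forward in time.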

      Other techniques for obtaining such long-time stability were also developed, including one imported from fluid mechanics. Known as adaptive mesh refinement, it allowed scarce computer memory resources to be focused only on those parts of problems where they were needed most. Such advances have allowed numerical relativists to simulate with great precision what happens when two black holes merge and create gravitational waves – ripples in space–time. The resulting images are more than eye candy; they were essential in allowing members of the US-based Laser Interferometer Gravitational-Wave Observatory (LIGO) collaboration to announce last year that they had directly detected gravitational waves for the first time.
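
The idea behind adaptive mesh refinement is easy to caricature in a few lines of Python (a toy sketch of the flagging step only – real numerical-relativity codes refine recursively in three dimensions):

```python
import numpy as np

def flag_cells_for_refinement(field, threshold):
    """Return a mask that is True where the solution varies rapidly.

    An AMR code overlays finer grids on the flagged regions, so scarce
    memory is spent only where the physics demands it.
    """
    gradient = np.abs(np.gradient(field))  # local cell-to-cell variation
    return gradient > threshold

# Example: a steep feature on an otherwise smooth 1D grid
x = np.linspace(0.0, 1.0, 200)
field = np.tanh((x - 0.5) / 0.01)  # sharp transition near x = 0.5
mask = flag_cells_for_refinement(field, threshold=0.1)
print(f"{mask.sum()} of {mask.size} cells flagged for refinement")
```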

      By modelling many different possible configurations of pairs of black holes – different masses, different spins and different orbits – LIGO’s numerical relativists produced a template of the gravitational-wave signal that would result in each case. Other researchers then compared those simulations over and over again to what the experiment had been measuring, until the moment came when a signal was found that matched one of the templates. The signal in question was coming to us from a pair of black holes a billion light-years away spiralling into one another and merging to form a single larger black hole.
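
At its core, that comparison is a matched filter: slide each template along the data stream, correlate, and look for a peak. A stripped-down illustration in Python (LIGO’s real pipelines work in the frequency domain and weight by the detector noise, which this sketch ignores):

```python
import numpy as np

def matched_filter(data, template):
    """Correlation of the data with a unit-norm template at every offset."""
    template = (template - template.mean()) / np.linalg.norm(template)
    return np.correlate(data, template, mode="valid")

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 4096)
template = np.sin(2 * np.pi * (30.0 + 60.0 * t) * t)  # toy chirp waveform
data = rng.standard_normal(16384)                     # detector-like noise
data[5000:5000 + template.size] += 0.2 * template     # buried signal
snr = matched_filter(data, template)
print("loudest match at offset", int(np.argmax(snr)))  # close to 5000
```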

      General relativity offers at least one big advantage over Newtonian gravity – it is local

      Using numerical relativity to model cosmology has its own challenges compared to simulating black-hole mergers, which are just single astrophysical events. Some qualitative cosmological questions can be answered by reasonably small-scale simulations, and there are state-of-the-art “N-body” simulations that use Newtonian gravity to follow trillions of independent masses over billions of years to see where gravity takes them. But general relativity offers at least one big advantage over Newtonian gravity – it is local.

      The difficulty with calculating the gravity experienced by any particular mass in a Newtonian simulation is that you need to add up the effects of all the other masses. Even Isaac Newton himself regarded this “action at a distance” as a failing of his model, since it means that information travels from one side of the simulated universe to the other instantly, violating the speed-of-light limit. In general relativity, however, all the equations are “local”, which means that to determine the gravity at any time or location you only need to know what the gravity and matter distribution were nearby just moments before. This should, in other words, simplify the numerical calculations.
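
The difference is easy to see in code. In this schematic sketch (ours, not any production code), the Newtonian update for one particle sums over every other mass in the box, while a local field update – the situation in general relativity – touches only nearest neighbours from the previous time step:

```python
import numpy as np

def newtonian_accel(pos, mass, i, G=1.0, soft=1e-3):
    """Acceleration of particle i: a GLOBAL sum over all other masses."""
    d = pos - pos[i]                              # vector to every particle
    r2 = (d * d).sum(axis=1) + soft**2
    r2[i] = np.inf                                # exclude self-interaction
    return G * (mass[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)

def local_wave_step(u, u_prev, c=1.0, dt=0.1, dx=1.0):
    """Leapfrog step of a 1D wave equation: each point is updated from
    its immediate neighbours only -- a LOCAL law, like Einstein's."""
    lap = np.roll(u, 1) - 2.0 * u + np.roll(u, -1)  # nearest-neighbour stencil
    return 2.0 * u - u_prev + (c * dt / dx) ** 2 * lap
```

The global sum is what makes direct Newtonian N-body codes expensive and, conceptually, instantaneous; the local stencil respects the speed-of-light limit by construction.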

Recently, the three of us at Kenyon College and Case Western Reserve University showed that the cosmological problem is finally becoming tractable (Phys. Rev. Lett. 116 251301 and Phys. Rev. D 93 124059). Just days after our paper appeared, Eloisa Bentivegna at the University of Catania in Italy and Marco Bruni at the University of Portsmouth, UK, had similar success (Phys. Rev. Lett. 116 251302). The two groups each presented the results of low-resolution simulations, where grid points are separated by 40 million light-years, with only long-wavelength perturbations. The simulations followed the universe for only a short time by cosmic standards – long enough only for the universe to somewhat more than double in size – but both tracked the evolution of these perturbations in full general relativity with no simplifications or approximations whatsoever. As the eminent Italian cosmologist Sabino Matarrese wrote in Nature Physics, “the era of general relativistic numerical simulations in cosmology ha[s] begun”.

      Illustration of a simulation of “co-ordinate invariance”

      These preliminary studies are still a long way from competing with modern N-body simulations for resolution, duration or dynamic range. To do so will require advances in the software so that the code can run on much larger computer clusters. We will also need to make the code more stable numerically so that it can model much longer periods of cosmic expansion. The long-term goal is for our numerical simulations to match as far as possible the actual evolution of the universe and its contents, which means using the full theory of general relativity. But given that our existing simulations using full general relativity have revealed no fluctuations driving the accelerated expansion of the universe, it appears instead that accelerated expansion will need new physics – whether dark energy or a modified gravitational theory.

      Both groups also observe what appear to be small corrections to the dynamics of space–time when compared with simple perturbation theory. Bentivegna and Bruni studied the collapse of structures in the early universe and suggested that they appear to coalesce somewhat more quickly than in the standard simplified theory.

      Future perfect

Drawing specific conclusions about simulations is a subtle matter in general relativity. At the mathematical heart of the theory is the principle of “co-ordinate invariance”, which essentially says that the laws of physics should be the same no matter what set of labels you use for the locations and times of events. We are all familiar with milder versions of this symmetry: we wouldn’t expect the equations governing basic scientific laws to depend on whether we measure our positions in, say, New York or London, and we don’t need new versions of science textbooks whenever we switch from standard time to daylight saving time and back. Co-ordinate invariance in the context of general relativity is just a more extreme version of that, but it means we must ensure that any information we extract from our simulations does not depend on how we label the points in our simulations.

      Our Ohio group has taken particular care with this subtlety by sending simulated beams of light from distant points in the distant past at the speed of light through space–time to arrive at the here and now. We then use those beams to simulate observations of the expansion history of our universe. The universe that emerges exhibits an average behaviour that agrees with a corresponding smooth, homogeneous model, but with inhomogeneous structures on top. These additional structures contribute to deviations in observable quantities across the simulated observer’s sky that should soon be accessible to real observers.
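
Tracing those beams amounts to integrating the null geodesic equation through the simulated space–time,

$$\frac{d^2 x^\mu}{d\lambda^2} + \Gamma^\mu_{\alpha\beta}\,\frac{dx^\alpha}{d\lambda}\,\frac{dx^\beta}{d\lambda} = 0, \qquad g_{\mu\nu}\,\frac{dx^\mu}{d\lambda}\,\frac{dx^\nu}{d\lambda} = 0,$$

where the Christoffel symbols $\Gamma^\mu_{\alpha\beta}$ encode the simulated curvature and the second condition keeps each beam travelling at the speed of light; the redshift and deflection accumulated along each beam are what become the simulated “observations”.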

      Creating codes that are accurate and sensitive enough to make realistic predictions will require us to study larger volumes of space

This work is therefore just the start of a journey. Creating codes that are accurate and sensitive enough to make realistic predictions for future observational programmes – such as the all-sky surveys to be carried out by the Large Synoptic Survey Telescope or the Euclid satellite – will require us to study larger volumes of space. These studies will also have to incorporate ultra-large-scale structures some hundreds of millions of light-years across as well as much smaller-scale structures, such as galaxies and clusters of galaxies. They will also have to follow these volumes for longer stretches of time than is currently possible.

      All this will require us to introduce some of the same refinements that made it possible to predict the gravitational-wave ripples produced by a merging black hole, such as adaptive mesh refinement to resolve the smaller structures like galaxies, and N-body simulations to allow matter to flow naturally across these structures. These refinements will let us characterize more precisely and more accurately the statistical properties of galaxies and clusters of galaxies – as well as the observations we make of them – taking general relativity fully into account. Doing so will, however, require clusters of computers with millions of cores, rather than the hundreds we use now.

      These improvements to code will take time, effort and collaboration. Groups around the world – in addition to the two mentioned – are likely to make important contributions. Numerical general-relativistic cosmology is still in its infancy, but the next decade will see huge strides to make the best use of the new generation of cosmological surveys that are being designed and built today. This work will either give us increased confidence in our own scientific genesis story – ΛCDM – or teach us that we still have a lot more thinking to do about how the universe got itself to where it is today.

      Cat-chy quantum song, science TV resurrected, $800,000 textbook, desk traffic lights

      By Sarah Tesh 

I never realized it until now, but my life was missing a song about Schrödinger’s cat. Well, theoretical physicist, science writer and now singer-songwriter Sabine Hossenfelder has come to the rescue with a song about quantum states. This is her second music video made in collaboration with artists Apostolos Vasilidis and Timo Alho. The rather cat-chy tune not only includes lyrics about quantum entanglement, Boltzmann brains and the multiverse, but also fits in references to Star Trek and The Matrix. In her BackReaction blog, Hossenfelder says, “If you think this one’s heavy on the nerdism, wait for the next.” We’re looking forward to it!


      Triatomic molecules cooled with lasers

      Molecules containing three atoms have been laser cooled to ultracold temperatures for the first time. The feat was achieved by John Doyle and colleagues at Harvard University in the US, who used a technique called Sisyphus cooling to chill an ensemble of about a million strontium-monohydroxide molecules to 750 μK. The team says the work opens the door to a range of applications, including quantum simulation and precision measurements.

First developed in the late 1970s, the laser cooling of atomic gases to ultracold temperatures has revolutionized the study of the quantum states of matter. Important milestones include the creation of the first-ever Bose–Einstein condensate in the lab in 1995 and the first fermionic condensate in 2003. The technique relies on the fact that photons carry small amounts of momentum and – under certain conditions – the repeated absorption and re-emission of photons by an atom can reduce its random motion and hence its temperature.
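
The momentum kick per photon is tiny. A photon of wavelength $\lambda$ carries momentum $p = h/\lambda$, so each absorption changes the velocity of an atom of mass $m$ by the recoil velocity

$$\Delta v = \frac{h}{m\lambda},$$

which for, say, a sodium atom absorbing 589 nm light is about 3 cm/s – hence the many thousands of absorption–emission cycles needed to slow an atom from thermal speeds of hundreds of metres per second.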

      Degrees of freedom

      Laser cooling of molecules – rather than atoms – is complicated by their rotational and vibrational degrees of freedom, which affect how they absorb and emit photons. As a result, the absorption and emission of photons can put the molecules into “dark states” that no longer take part in the cooling process. Despite this and other challenges, however, David DeMille and colleagues at Yale University managed to laser-cool a collection of strontium-fluoride diatomic molecules in 2014.

In this latest work, John Doyle and colleagues at Harvard University have now cooled triatomic strontium-monohydroxide molecules using a method named after the doomed Greek hero Sisyphus, who was forced to push a boulder up a hill, only for it to roll down to the bottom so that he had to repeat the task for eternity. Sisyphus cooling involves molecules losing kinetic energy by having to “climb” a hill of potential energy created by a standing wave of laser light.

The molecules reach the “peak” when they spontaneously transition to a state that no longer interacts with the light. At this point, an applied magnetic field puts the molecules back into the original state – ready to climb again. This process is repeated many times, with each cycle reducing the molecules’ kinetic energy – and thus their random motion and temperature too.
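
As a toy energy ledger (our illustration with made-up numbers, not the experiment’s parameters): if each round trip converts roughly one optical-potential “hill” of kinetic energy into scattered light, the temperature ratchets down linearly with the number of cycles.

```python
# Toy Sisyphus-cooling ledger: kinetic energy in units of one hill height.
# Illustrative numbers only -- not the Harvard experiment's parameters.
kinetic, hill = 200.0, 1.0
cycles = 0
while kinetic > hill:
    kinetic -= hill   # climb the hill; photon + magnetic field reset the state
    cycles += 1
print(f"kinetic energy exhausted after {cycles} cycles")  # ~200 cycles
```

That a couple of hundred cycles suffice in this cartoon is at least consistent with the roughly 200 photons per molecule quoted below.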

      Rapid cooling

Key to the success of Doyle’s team is that the cooling was achieved very rapidly – in 100 μs – and involved only about 200 photons interacting with each molecule. This speed is critical because it makes the molecules less likely to fall into dark states before the cooling finishes.

      Writing in Physical Review Letters, Doyle and colleagues say that their technique could also be used to cool larger and more complicated strontium-based polyatomic molecules – for example by replacing the hydroxide with a methyl group. If the technique could be further extended to chiral molecules, it could also be used to investigate why some biological processes favour right- or left-handed molecules.

      Flash Physics: Matter-wave tractor beams, WiFi routers make holograms, nuclear-industry’s Brexit plans

      Tractor beams could be made from matter waves

      It should be possible to create a matter-wave tractor beam that grabs hold of an object by firing particles at it – according to calculations by an international team of physicists. Tractor beams work by firing cone-like “Bessel beams” of light or sound at an object. Under the right conditions, the light or sound waves will bounce off the object in such a way that the object experiences a force in the opposite direction to that of the beam. If this force is greater than the outward pressure of the beam, the object will be pulled inwards. Now, Andrey Novitsky and colleagues at Belarusian State University, ITMO University in St Petersburg and the Technical University of Denmark have done calculations that show that beams of particles can also function as tractor beams. Quantum mechanics dictates that these particles also behave as waves and the team found that cone-like beams of matter waves should also be able to grab hold of objects. There is, however, an important difference regarding the nature of the interaction between the particles and the object. Novitsky and colleagues found that if the scattering is defined by the Coulomb interaction between charged particles, then it is not possible to create a matter-wave tractor beam. However, tractor beams are possible if the scattering is defined by a Yukawa potential, which is used to describe interactions between some subatomic particles. The calculations are described in Physical Review Letters.
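
For the two interactions in question, the standard potentials are

$$V_{\mathrm{Coulomb}}(r) \propto \frac{1}{r}, \qquad V_{\mathrm{Yukawa}}(r) \propto \frac{e^{-r/a}}{r},$$

identical at short range but with the Yukawa form screened beyond the length scale $a$; according to the calculations, it is this short-ranged case that permits a matter-wave tractor beam.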

      3D holograms produced from WiFi routers

Photograph of a cross made of aluminium foil placed between the detection antenna and the WiFi router, with an inset showing the resulting hologram image

Household WiFi routers can be used to produce 3D holograms of rooms. The futuristic imaging process has been developed by Philipp Holl and Friedemann Reinhard of the Technical University of Munich in Germany. Using one fixed and one movable antenna, they measure the distortions in the router’s microwave signal caused by it reflecting off and travelling through objects. The data are then fed through reconstruction algorithms, enabling the researchers to produce 3D images of the environment surrounding the router with centimetre precision. The technique is simpler than optical holography, which relies upon elaborate laser equipment, and its resolution will improve as future WiFi technology gains speed and bandwidth. The research has, however, raised concerns about privacy. Addressing these worries, Reinhard says: “It is rather unlikely that this process will be used for the view into foreign bedrooms in the near future. For that, you would need to go around the building with a large antenna, which would hardly go unnoticed.” The method is also limited because microwaves come from so many devices and from multiple directions. Instead, Holl and Reinhard hope the technology, presented in Physical Review Letters, will be applied to finding victims buried under collapsed buildings or avalanches. Unlike conventional methods, it could provide a spatial representation of the structures surrounding victims, allowing swifter and safer rescues.
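
The reconstruction step is, in essence, standard scalar holography: numerically backpropagate the complex field sampled on the scan plane to a series of depths and stack the slices into a 3D image. A minimal angular-spectrum sketch in Python (our illustration, not the Munich group’s actual pipeline; the geometry is hypothetical):

```python
import numpy as np

def backpropagate(field, wavelength, dx, z):
    """Angular-spectrum propagation of a sampled complex field by distance z."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)                 # spatial frequencies (1/m)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    kz2 = (1.0 / wavelength) ** 2 - FX ** 2 - FY ** 2
    kernel = np.where(kz2 > 0,                    # drop evanescent components
                      np.exp(2j * np.pi * z * np.sqrt(np.abs(kz2))), 0)
    return np.fft.ifft2(np.fft.fft2(field) * kernel)

# 2.4 GHz WiFi has a ~12.5 cm wavelength; sample the scan plane every 2 cm
# and refocus one metre away.
measured = np.ones((64, 64), dtype=complex)       # stand-in for measured field
image_slice = backpropagate(measured, wavelength=0.125, dx=0.02, z=1.0)
```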

      UK nuclear industry outlines “Brexit” priorities

The UK Nuclear Industry Association (NIA) has called on the UK government to work closely with the nuclear industry to avoid a “cliff-edge” scenario after the country leaves the European Atomic Energy Community (Euratom). In its report – Exiting Euratom – the trade association for the UK’s civil nuclear industry, which represents more than 260 companies, outlines six priority areas for negotiations with the European Commission as part of the “Brexit” talks. These include agreeing a new funding arrangement for the UK’s involvement in Fusion for Energy, which is responsible for providing Europe’s contribution to the ITER fusion reactor in France, as well as setting out the process for the movement of nuclear material, goods, people and services post-Brexit. The NIA also says that if a new Euratom deal is not agreed by the time the UK leaves the European Union in 2019, then the existing arrangement should continue until a new one is implemented.


      • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on the laser cooling of triatomic molecules.

      New cosmic messengers, and what they can tell us

By Margaret Harris

      Immediately after last year’s announcement that the Laser Interferometer Gravitational-Wave Observatory (LIGO) had seen its first gravitational waves, a lot of the discussion centred on what the discovery meant for general relativity.  This was understandable: getting further confirmation of Einstein’s century-old theory was (and is) a big deal.  But in the longer term, and as the LIGO detectors notch up a few more observations (they’re currently crunching data on six new candidates), the emphasis will shift away from the waves themselves, and towards what they can tell us about the universe.

      The key thing to realize here is that gravitational waves are fundamentally different from other, better-studied cosmic “messengers” that travel to Earth from distant reaches of the universe.  Unlike photons, gravitational waves are not impeded by clouds of gas or dust; unlike cosmic rays, they are not deflected by electromagnetic fields. In addition, some of the most dramatic astrophysical events, such as the merger of two black holes in empty space, are “dark” or “silent” to other messengers: these events produce gravitational waves in copious quantities, but not, as far as we know, anything else.

