
The cutting edge of quantum physics

An artist's impression of binary data streaming through a diamond, with an inset ball-and-stick model of the NV defect: a lattice of carbon atoms surrounding a single nitrogen atom adjacent to a vacancy

In the 20th century many aspects of quantum physics were harnessed into world-changing technologies, including semiconductors, lasers and other now-ubiquitous devices. Throughout this first quantum revolution, however, one key aspect of quantum physics – superposition – has largely remained in the laboratory, a fundamental curiosity rather than a promising feature to be exploited.

However, this is about to change, thanks to several significant initiatives that aim to bring about a second quantum revolution. The key to this revolution’s success will be the ability to “easily” engineer and control quantum bits. We use the word “easily” with caution, because initializing a quantum state and keeping it in a superposition for significant lengths of time is a difficult undertaking. Scientists are exploring many different approaches, using materials as varied as superconductors, synthetic diamonds, cold atoms and quantum dots, and the race is currently wide open. But diamond does have some intriguing advantages, both for quantum computation and for other applications such as magnetic-field sensing. The challenge for our organization, the industrial diamond firm Element Six, has been to support research in this area while also staying true to our core business interests in materials applications.

A useful flaw

The type of diamond that attracts would-be quantum revolutionaries has a defect in its otherwise uniform lattice of carbon atoms. This defect consists of a single nitrogen atom adjacent to a missing carbon atom, or vacancy. The nitrogen-vacancy (NV) centre has unique optical absorption and emission properties – among other effects, it gives diamond a red-to-pink colouration – and these properties have long been the focus of fundamental research on crystal structures.

In addition to its unusual optical properties, the negative charge state of the NV centre also has an electronic spin, S = 1, in its ground state. Remarkably, the state of this electronic spin can be controlled and read out at room temperature. The reason is that, unlike in most materials, the crystal lattice in diamond forms a low-noise environment, so fragile quantum properties are not lost and information can be stored and probed for longer periods. The spin state can be read out by measuring the intensity of light given off by an NV centre as the system is excited by microwave radiation. At the NV centre’s resonance frequency of 2.88 GHz, the spin state will flip from 0 to +1 or –1, causing a dip in the intensity of red light emitted.

The robustness of this spin state, and the ease of reading it out, make NV diamond a very promising platform for a wide range of quantum technologies, with potential applications in secure communications, computing, imaging and sensing. A recent focus area for the diamond community is the use of NV defects to measure magnetic fields. Thanks to the Zeeman interaction, the gap between the frequencies of the 0 → +1 and 0 → –1 microwave transitions in NV diamond increases as a function of magnetic field. Hence, in the simplest case, one can estimate the magnitude of the magnetic field by exposing the NV centre to a range of microwave frequencies and measuring the separation between the two dips in intensity. Remarkably, a basic measurement of this type can be performed using a single NV centre at room temperature. With multiple NV centres, the geometry of the diamond lattice means that one can make extremely sensitive measurements of the field’s direction as well as its magnitude.
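
In this simplest case, turning the two dip frequencies into a field estimate is elementary arithmetic, as the short sketch below shows. This is our own illustration, not Element Six code, and it assumes the field lies along the NV axis so that the dips sit at f = D ± γB.

```python
# Minimal sketch (our illustration): estimate an axial magnetic field from
# the two ODMR dip frequencies of an NV centre, which sit at f = D +/- gamma*B.

D = 2.88e9        # zero-field resonance frequency (Hz)
GAMMA = 28.0e9    # NV gyromagnetic ratio: ~28 GHz per tesla (2.8 MHz per gauss)

def field_from_dips(f_lower: float, f_upper: float) -> float:
    """Axial magnetic field (tesla) from the two dip frequencies (Hz)."""
    centre = 0.5 * (f_lower + f_upper)
    assert abs(centre - D) < 50e6, "dips should straddle the zero-field line"
    return (f_upper - f_lower) / (2.0 * GAMMA)

# Example: dips observed at 2.82 GHz and 2.94 GHz imply a field of ~2.1 mT
print(f"B = {field_from_dips(2.82e9, 2.94e9) * 1e3:.2f} mT")
```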

Raw materials

Of course, numerous technologies for estimating magnetic field already exist. These include superconducting quantum interference devices (SQUIDs), vapour cells, flux-gate sensors and the Hall-effect sensors that constitute the compass in modern smartphones. However, SQUID-based magnetometers must be cryogenically cooled, making them relatively bulky and costly to run, while other sensor technologies require frequent recalibration and offer limited frequency bandwidth for measuring changing magnetic fields. In contrast, NV diamond-based sensors do not need to be recalibrated, have a broad bandwidth and could be incorporated into a lightweight, low-powered device. Critically, NV centres can also be used to construct maps of magnetic field across a surface, thanks to the high spatial resolution provided by a microscopic probe. For these reasons, diamond-based magnetometers have strong potential both as replacements for existing technologies and as the enablers of applications where competing technologies do not yet exist.

For these applications to become a reality, though, we need a ready supply of high-quality NV diamonds. NV centres are rare in natural diamonds, and it is difficult to do much research if you are limited to working with a single sample. At Element Six we have developed methods for growing NV diamond synthetically using chemical vapour deposition (CVD). This process involves filling a microwave chamber with a mixture of hydrogen, methane and nitrogen gas, and heating it to 2500–3000 K to create a plasma. Diamond “seeds” placed in the chamber become the nuclei for new diamonds as carbon atoms from the plasma deposit onto their surfaces layer by layer. The hydrogen stabilizes the surface against forming graphite instead of diamond, while the nitrogen acts as a dopant, making it possible for NV centres to form.

This process is the result of more than 15 years of intensive R&D and it enables us to grow diamond in a controlled and scalable fashion, with a purity far exceeding that of natural diamonds. It also makes it possible to control the number of NV centres. Under high-purity conditions, small numbers of NV centres are produced via the chemistry of the growth process. These isolated centres can be probed individually in an experiment, so this type of NV diamond is well-suited for quantum-computation applications. Magnetic-sensing applications require higher numbers of NV centres, and we achieve this by increasing the nitrogen concentration during synthesis and then bombarding the crystal with high-energy electrons to create additional vacancies. Heating the diamond to 800 °C causes these vacancies to migrate through the crystal lattice until they encounter nitrogen atoms; at that point, the structure stabilizes, since the NV centre has a lower potential energy than a separate nitrogen and vacancy.

The value chain

Over the past decade advances in our diamond-making capabilities, coupled with a deepening understanding of the physics of quantum spins in NV diamond, have opened up a wide range of potential applications. Element Six has supported this nascent field by supplying state-of-the-art diamond samples and diamond engineering expertise to external partners, while focusing internally on making further improvements to the material. In recent years, however, we have also become more active in supporting commercial start-ups to allow them to incubate the technology and in helping larger companies assess the applicability of our diamonds to various market opportunities.

The breadth and depth of knowledge needed to appreciate these opportunities is significant. It requires one to consider an entire value chain: a material; a device made from that material; the package surrounding that device; the subsystems and systems the device fits into; and finally the user. As is often the case, the commercial value of this chain is concentrated at the subsystem and system level. But Element Six is a materials company, and we have grown by developing novel materials that address problems across multiple markets and industries. Making devices, let alone complete systems for end users, is not really our speciality. So how can we access the value at the other end of the chain?

Rather than changing our strategic focus, we have instead sought to exploit diamond quantum devices by communicating their “value proposition” to end users. A basic demonstration of the NV centre’s ability to measure magnetic field is not difficult, and a prototype device can be made using remarkably simple components such as off-the-shelf diode lasers and photodiodes, and coils of wire to deliver the microwaves to the sample. Packaging all of this together into a robust unit is less trivial, of course; ultimately, the performance of a diamond sensor will depend not only on the material itself and Element Six’s expertise, but also on the stability of surrounding components and the data-processing algorithms used to transform raw measurements of light intensity into an accurate and highly sensitive map of the vector magnetic field. Nevertheless, it is always much easier to convince people of a device’s potential with a demo than with PowerPoint slides.
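
As a flavour of that data-processing step, the sketch below fits two Lorentzian dips to a simulated intensity spectrum and reports their separation. It is a toy example under our own assumptions (Lorentzian line shapes, synthetic noise), not the algorithm inside any actual device.

```python
# Toy example (our assumptions, not a production algorithm): fit two
# Lorentzian dips to an ODMR-like intensity spectrum, then report the
# dip separation that a magnetometer would convert into a field value.
import numpy as np
from scipy.optimize import curve_fit

def two_dips(f, i0, a1, f1, a2, f2, w):
    """Baseline intensity i0 minus two Lorentzian dips of half-width w."""
    lor = lambda fc, a: a * w**2 / ((f - fc)**2 + w**2)
    return i0 - lor(f1, a1) - lor(f2, a2)

# Synthetic data standing in for photodiode readings
f = np.linspace(2.80e9, 2.96e9, 400)
clean = two_dips(f, 1.0, 0.1, 2.82e9, 0.1, 2.94e9, 3e6)
data = clean + np.random.default_rng(0).normal(0.0, 0.005, f.size)

guess = [1.0, 0.05, 2.83e9, 0.05, 2.93e9, 5e6]
popt, _ = curve_fit(two_dips, f, data, p0=guess)
print(f"Dip separation: {abs(popt[4] - popt[2]) / 1e6:.1f} MHz")  # ~120 MHz
```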

Another component of our strategy has been to partner with university researchers who are developing diamond-quantum-device technology. This has enabled us to secure some intellectual property (IP) on the physics needed to make working devices – although, crucially, we actively avoided filing patents for the actual applications because we wanted to leave third parties free to develop their own. Our university partners have also been an important bridge between us and potential end users. Making a diamond-based quantum device (or indeed any quantum device) requires knowledge of quantum physics, and since this is an emerging industry most organizations do not yet have that expertise. Combining our IP and materials know-how with their quantum-physics expertise enabled us to start talking to organizations that were actually in a position to develop this technology. In addition, many of the academic groups we work with have produced spin-out companies. We have supported these companies with materials sales and knowledge-sharing, and we anticipate that the applications they develop will be a growth area for Element Six over the coming years.

Potential gems

Diamond quantum technologies are extremely promising, with many applications already at the proof-of-concept stage. These include materials-characterization applications, such as nanoscale imaging of the write heads for next-generation magnetic hard drives, as well as biological imaging. New sensing methods for pressure and temperature, plus the alluring possibility of diamond-based quantum computing, make this an exciting and productive area.

We foresee that diamond will continue to be used as a tool to aid our understanding of the quantum world. However, the real excitement concerns the possible technologies that this understanding will enable. In late 2016 a group of researchers led by Ron Walsworth at Harvard University in the US used NV centres in diamond to study neuron activity in marine worms, measuring the tiny magnetic pulses from single neurons with high spatial resolution. No other existing technology can perform measurements at such high sensitivity and resolution: the best spatial resolution of standard MRI scans is about 1 mm³, whereas diamond-based magnetic-field sensing could, in theory, give us cellular-level images of chemical processes. We expect that this proof-of-principle experiment will be followed by breakthroughs in our understanding of how the brain works, as well as new diagnostic methods and treatments.

Entry denied

Galileo Galilei, the controversial Italian astronomer, was recently questioned by a US Immigration and Customs Enforcement (ICE) officer. I managed to obtain a transcript of the encounter.

ICE Passport, please.
Galileo Eccolo.
ICE [Flicks through document.] Ah, so you’re a scientist? I’ve heard you scientists are doing work that threatens American interests.
G Which work? I’ve upset a few astronomical apple carts, but what makes politicians go ballistic these days is meteorology.
ICE Look, your particular academic cranny doesn’t matter. What matters is if your science threatens American jobs, economy and values.
G It can’t. Science doesn’t hurt a nation’s interests. It can only stimulate a country’s activity – make it grow.
ICE That’s not what the politicians say.
G Well, they’re idiots.
ICE Do you know who you’re accusing?
G Powerful idiots.
ICE No, they’re duly elected senators, representatives and members of the executive branch, sworn to defend the country.
G Not all elected politicians understand how best to defend the country. If they did, they wouldn’t argue so much.
ICE Yes, but they are the law of the land, the ultimate authorities.
G You reckon? So why not ask politicians to repair your car, fix your computer or cut out your appendix? It’s because that’s not their skill. Also, it’s way beneath their dignity. Leave that stuff to geeks like me!
ICE Well, politicians make the laws. The American constitution says nothing about restricting their authority, or sharing it with science.
G That constitution was written almost 250 years ago, at a particular historical moment and for a different audience. Its authors knew that times would change and wrote it flexibly, so it could be adapted to changing realities. Modern politicians should pay attention to the words of founding fathers like Benjamin Franklin, John Adams, Thomas Jefferson and James Madison, who weren’t just amateur scientists but realized that using science to understand nature is essential to effective democracy. In determining the real, they thought, science allows politics to craft the possible. As one of my political friends likes to say, “The founding fathers showed how to create legislation, not to legislate creation!”
ICE The safest course is to adhere to the constitution’s literal language.
G Oh, really? What about article 1, section 2, paragraph 3, which treats a slave as three-fifths of a person? If today you believed the literal truth of that, you’d be un-American! The constitution only works today if you adapt it to current reality.
ICE I’m recording you as denying that the constitution is the supreme law of the land.
G What I’m saying is that there are two constitutions, the American constitution and nature’s constitution. Politicians are the authorities for the former, while scientists are the authorities for the latter. These two constitutions – political and natural – cannot conflict. If they appear to conflict, somebody is overstepping their authority. Those who overstep their authority are being un-American. They are violating their oaths of office and endangering the country. If you want to purge America of its enemies, eject the science-deniers.
ICE Sorry, Signor Galileo, this country isn’t yet ready for you and your views. Entry denied!

The critical point

Four centuries ago, when science and society were not yet coupled, investigators into nature like Galileo had to develop clever arguments and rhetorical strategies to justify the value of their work and to defend science as being in the national interest in the face of powerful opposition. Today, equally powerful forces are at work seeking to decouple science and society. It is valuable to revisit the original arguments and strategies that Galileo and his colleagues used to see if they can be recast in modern terms.

Galileo relied on a variety of tactics, including drawing attention to key distinctions, exposing concepts that his enemies had used in empty and abstract ways, turning the arguments of his accusers right back at them, and appealing to authority. Galileo also did not refrain from sarcasm, insults and ridicule, nor from declaring his own piety and patriotism while accusing his enemies of lacking the same.

To create the above conversation I started with Galileo’s famous Letter to Christina of 1615. Nominally addressed to the mother of his patron Cosimo II, who had hosted a gathering at which Galileo’s piety had been questioned, it was meant as an open letter to political and theological authorities to lay to rest issues raised by his astronomical work. I kept the basic structure of Galileo’s response, but replaced words like “Bible” or “theologians” with words like “constitution” and “politicians”. Where he wrote some version of “pious”, I wrote some version of “American”. The result forges an argument that remains clear and powerful.

Truth be told, Galileo’s defence didn’t keep him out of trouble, but it was effective in the long run. No single counter-measure will blunt the forces promoting science denial, and all of Galileo’s rich toolkit of tactics will have to be used. A good starting point to find them is to read his original Letter to Christina.

The easy way to make emulsions

A new technique for making emulsions that does not require the ingredients to be intensively mixed has been unveiled by researchers in Bulgaria and the UK. The gentle nature of the emulsion-making process could make it useful in a number of practical applications involving fragile ingredients such as pharmaceuticals.

Emulsions such as mayonnaise, paint and cosmetic creams are dispersed mixtures of tiny droplets of different fluids that will not mix if simply added together. Oil and water, for example, only become an emulsion if they are mixed vigorously together, and they quickly separate again when the mixing stops. That is a challenge for manufacturers, who also have to control the size of the droplets, which affects the visual appearance, consistency, texture and even taste of an emulsion.

Too hot to handle

Intensive mixing, which is the conventional industrial process for making emulsions, relies on mechanical shear to break up droplets until they reach the desired size. The problem, according to Stoyan Smoukov of the University of Cambridge, is that this process is extremely inefficient.

As little as 0.1% of the mixing energy goes on creating smaller droplets; most of the rest simply heats up the mixture. While heating is fine for some emulsions, it can destroy temperature-sensitive materials such as proteins and other biological materials that are increasingly being used in pharmaceutical emulsions.
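
A quick back-of-envelope estimate shows why the fraction is so small: the interfacial energy needed to disperse even a millilitre of oil into micrometre-sized droplets is tiny compared with typical mechanical input. The sketch below is our own illustration, with assumed values for the interfacial tension and droplet size (not figures from the paper).

```python
# Back-of-envelope estimate (our illustration, with assumed values): the
# interfacial energy created when 1 mL of oil is dispersed into 1 um
# droplets. N droplets of radius r have total area N*4*pi*r^2 = 3V/r.
SIGMA = 0.01   # assumed oil-water interfacial tension with surfactant (N/m)
V = 1e-6       # oil volume: 1 mL in m^3
r = 0.5e-6     # droplet radius for 1 um diameter, in m

area = 3 * V / r            # total droplet surface area (m^2), here 6 m^2
energy = SIGMA * area       # interfacial energy (J), here ~0.06 J
print(f"Surface energy created: {energy:.2f} J")
# A kitchen blender delivers hundreds of watts, so after a minute of mixing
# only a tiny fraction of the input has ended up as droplet surface energy.
```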

Although several techniques for “self-emulsification” without mixing have been developed, none are particularly suited for temperature-sensitive materials. Now, working with researchers at Sofia University, Smoukov has developed a new self-emulsification process that takes advantage of a phase transition that occurs in droplets as the temperature of the mixture changes by only a few degrees around room temperature.

Simple mixture

The team studied a simple mixture comprising water, oil and soap – the latter acting as a “surfactant” that lowers the surface tension between oil and water. The researchers found that when the temperature of the mixture is raised by several degrees, energy from thermal fluctuations causes oil droplets to spontaneously break apart to form smaller droplets. By putting the material through several cycles of heating and cooling, they found that the size of the droplets could be reduced progressively.

Because the process is irreversible, it could provide a new way of creating emulsions from temperature-sensitive ingredients. More fundamentally, Smoukov believes that the system could provide a simple model for understanding how much more complex non-equilibrium systems – including some living organisms – can harness energy from temperature fluctuations.

The research is described in Nature Communications.

The blue fog

New research has shed light on a long-standing mystery that has perplexed physicists and chemists for over a century. Solving the secret of the “blue fog” proved to be an intellectual tour de force – and one that could lead to new types of display devices. Find out more by reading this feature article from the April issue of Physics World, written by Oliver Henrich and Davide Marenduzzo, who were involved in the latest work.

Flash Physics: Antiprotons from dark matter, lab visits do not inspire, CERN’s new linac, Australia at ESO

Antiproton excess linked to dark matter

An unexplained excess in the number of antiprotons detected by the Alpha Magnetic Spectrometer (AMS) is related to the annihilation of dark-matter particles, according to two independent studies. Dark matter is a mysterious substance that appears to account for most of the matter in the universe. While its existence can be inferred indirectly from a number of different astronomical phenomena, dark-matter particles have never been detected directly. Writing in Physical Review Letters, Alessandro Cuoco and colleagues at RWTH Aachen University in Germany describe how they analysed antiproton, proton and helium cosmic-ray detection rates by AMS – which is located on the International Space Station – and other experiments. They found that the creation of antiprotons by the annihilation of dark-matter particles with masses of about 80 GeV/c² provided the best explanation for why AMS has detected more antiprotons than would be expected from conventional astrophysical processes. In the same issue of the journal, Ming-Yang Cui of the Chinese Academy of Sciences and colleagues describe an independent analysis of the antiproton excess, which suggests that it is the result of annihilating dark-matter particles with masses in the 40–60 GeV/c² range.

Students not choosing science despite extra activities

Science-related extracurricular activities do not encourage students to study science, technology, engineering and mathematics (STEM) subjects at high school, according to a study by Pallavi Amitava Banerjee from the University of Exeter in the UK. Banerjee tracked the educational progress of 600,000 teenagers from the start of secondary school (age 11–12) to A-level examinations (age 18). By using data from the National Pupil Database and activity providers, she examined whether students were more likely to choose STEM subjects for their A-levels if they had taken part in engagement activities such as trips to labs, special practical lessons or visits to STEM centres. Writing in Review of Education, Banerjee highlights that there is little evidence linking the two. For example, the proportion of students taking physics A-level was 5% among those who had taken part in enrichment activities, compared with 4.3% among those who had not. On the other hand, extra activities were slightly more beneficial for children aged 11–14 than for those aged 14–16. “Of course there are many factors which can affect the decisions young people make about the subjects they choose to continue studying at age 16,” says Banerjee. “It is essential for policymakers to consider whether, if these schemes are not working, the money could be spent elsewhere. Given the range of schemes being run it is also crucial to understand if any work better than others. Knowing the answer to this could help ensure money is spent on only the highest quality activities.”

CERN completes new linear accelerator

Photograph of Linac 4 at CERN

The CERN particle-physics lab near Geneva has built its first accelerator since the completion of the Large Hadron Collider (LHC) in 2008. Linear Accelerator 4 (Linac 4), which is around 90 m long and took a decade to construct, will be used to accelerate beams of negative hydrogen ions to 160 MeV. When Linac 4 is connected to CERN’s accelerator complex at the end of 2019, the 160 MeV beam will then be sent to the Proton Synchrotron Booster, which will accelerate the ions and strip the electrons away, before the resulting protons enter the Proton Synchrotron, the Super Proton Synchrotron and finally the LHC. Linac 4 will now undergo “extensive” commissioning and is expected to replace Linac 2, which has been in operation since 1978. The new accelerator will be part of CERN’s High Luminosity Upgrade, which will see the LHC’s luminosity increase five-fold by 2025.

Australia gains access to ESO telescopes in Chile

Astronomers in Australia will gain access to European Southern Observatory (ESO) telescopes in Chile in 2018 under a new agreement involving an A$26m payment to the ESO. Australia has also committed to the ongoing funding of the telescopes until 2028 at an average annual rate of A$12m and Australian astronomers and companies will be involved in developing new technologies for the telescopes. Chris Tinney at the University of New South Wales Sydney says: “Australian astronomers have been seeking access to ESO for the past two decades.” Lisa Kewley, who chairs the Australian Academy of Science National Committee for Astronomy, adds: “This is great news for the future of Australian astronomy.” Nobel laureate and Australian National University vice-chancellor Brian Schmidt says access to ESO’s facilities and other infrastructure such as the next-generation Giant Magellan Telescope (GMT) and Square Kilometre Array (SKA) radio telescope is critical to the future of Australian astronomy. Tim de Zeeuw, the ESO’s director general, says: “The ESO community is well aware of Australia’s outstanding instrumentation capability, including advanced adaptive optics and fibre-optic technology.” He adds: “Australia’s expertise is ideally matched to ESO’s instrumentation programme, and ESO Member State institutions would be excited to collaborate with Australian institutions and their industrial partners in consortia developing the next generation of instruments.”

 


Flash Physics: Diamonds emit randomly polarized photons, LHCb reveals flawed model, SETI names first fellows

Diamond defects create randomly polarized photons

The first practical source of a randomly polarized stream of single photons has been created by physicists in Japan. The source is based on a negatively charged defect in diamond in which two adjacent carbon atoms are replaced by a nitrogen atom and a vacant lattice site. These “NV centres” have several properties that could make them useful for creating quantum-information systems – including the ability to emit single photons on demand. To date, work on single-photon sources has focused on supplying photons that are in specific polarization states, because quantum information can be encoded and transmitted in such polarization states. There are certain applications, however, that would benefit from a stream of photons in which the polarizations of successive photons are truly random and uncorrelated. Now, Keiichi Edamatsu, Naofumi Abe and colleagues at Tohoku University have shown that NV centres with a certain orientation with respect to the diamond lattice will emit randomly polarized photons. Writing in Scientific Reports, the team says that its source could find use as a random-number generator and also for performing tests on fundamental aspects of quantum mechanics.

J/ψ measurement reveals flaw in collision simulations

Photograph of the LHCb collaboration at CERN

The production of J/ψ mesons in proton collisions in the Large Hadron Collider (LHC) at CERN does not agree with predictions made by a widely used computer simulation. That is the conclusion of physicists working on CERN’s LHCb experiment who have studied the jets of hadrons that are created when protons collide at 13 TeV. These jets contain large numbers of J/ψ mesons, which comprise a charm quark and a charm anti-quark. The LHCb team was able to measure the ratio of the momentum carried by the J/ψ mesons to the momentum carried by the entire jet. It was also able to discriminate between J/ψ mesons that were created promptly by the collision and J/ψ mesons that were created after the collision by the decay of other particles. Analysis of the data reveals that PYTHIA – a Monte Carlo simulation used to model high-energy particle collisions – does a poor job at predicting the momentum carried by prompt J/ψ mesons. The possibility of such a discrepancy had already been identified in theoretical work and has now been confirmed experimentally. The apparent shortcomings of PYTHIA could have a significant effect on how particle physics is done because the simulation is used both in the design of collider detectors and also to determine which measurements are most likely to reveal information about physics beyond the Standard Model of particle physics. The measurement is described in Physical Review Letters.

SETI Institute honours its first fellows

The Search for Extraterrestrial Intelligence (SETI) Institute has named its first fellows. Seth Shostak, Mark Showalter and Edna DeVore were honoured at SETI’s first annual gala fundraiser for their contributions to scientific research and outreach. Shostak has been with SETI for 26 years as its senior astronomer, overseeing the radio-observing programmes. He also hosts SETI’s radio show and podcast, and is the editor of the institute’s magazine Explorer. Senior scientist Showalter specializes in planetary rings and moons. Over the course of his 12 years at SETI, he has discovered three planetary rings and six moons, including Saturn’s Pan and Pluto’s Kerberos and Styx. DeVore is the institute’s director of education and during her 25 years at SETI she also served as acting chief executive for two years. She has led outreach and education projects for NASA missions, including SOFIA and Kepler, and oversees the Research Experience for Undergraduates programme, funded by the National Science Foundation. “Mark, Edna and Seth have distinguished themselves throughout their careers through groundbreaking work and an uncompromising commitment to excellence and innovation,” says William Diamond, who heads the SETI Institute.

 


How will Brexit affect science in the rest of the EU?

Brexit panel: left to right are Rolf Tarrach, Ole Petersen, Mark Ferguson and Gail Cardew

By Hamish Johnston

Here in the UK it’s easy to forget that our exit from the EU could have significant unintended consequences for scientists in the remaining 27 member nations.

Yesterday, I was at a public forum called “Brexit: the scientific impact”, which was held at the Royal Institution in London. While there was much discussion about domestic challenges, the second session – “Brexit: the scientific impact on the EU-27” – provided a fascinating insight into the challenges facing the UK’s neighbours.


Flash Physics: A fridge for quantum computers, mimicking nature’s colours, NASA launches quick-fire RAISE

A nano-fridge for quantum computers

A nanoscale “refrigerator” that could cool quantum computers has been developed by scientists in Finland. The team from Aalto University has cooled a qubit-like superconducting resonator by tunnelling single electrons through a 2 nm-thick insulator. The electrons are given slightly too little energy to tunnel directly, so each one captures the remaining energy it needs from the nearby quantum device, thereby cooling it. To switch the quantum-circuit refrigerator off, the external voltage is simply turned off, as the device being cooled cannot on its own provide enough energy to push an electron through the insulator. “I have worked on this gadget for five years and it finally works!” says team member Kuan Yen Tan. Next, the team hopes to apply its refrigerator to qubits, which suffer unwanted changes of state when they become too hot. The researchers also want to lower the minimum temperature and increase the rate at which the cooling can be switched on and off. The work is presented in Nature Communications.

Mimicking nature’s vivid colours with transparent particles

A simulation of the silica-coated black substrate

Scientists have long known that certain birds and butterflies get their vivid colours from structures in their wings and feathers that control how light is scattered and reflected, with the “structural colour” often changing depending on the angle from which the animal is viewed. The Steller’s jay – a bright blue bird – has, underneath its light-scattering structures, a layer of black particles that absorbs any wavelengths scattered towards it, which makes the bird appear blue at all angles. Now, a team led by Yukikazu Takeoka of Nagoya University in Japan has recreated this layering effect. The researchers covered a black plate with layers of transparent, 190 nm silica particles that scatter and reflect the light. By controlling the thickness of the silica, they were able to control the colour intensity – too thin and the coating was transparent, too thick and it became white. They found that a 1–2 μm-thick layer created a bright blue on a black background, while on glass the colour was much less vivid. Furthermore, Takeoka and his team tested silica particles of different sizes, which scatter light to different degrees: they created green using 260 nm particles and purple using 300 nm particles. The artificial structural colours, presented in Advanced Materials, could be useful for applications where light control is important, such as solar cells or adaptive camouflage.

NASA launches quick-fire solar imager

 

NASA has successfully launched a mission to study the split-second changes that occur at the Sun’s most active regions. The Rapid Acquisition Imaging Spectrograph Experiment (RAISE) was launched on a sounding rocket on 5 May from New Mexico. The rocket travelled to an altitude of around 300 km, during which time the RAISE instrument took images every 0.2 s for five minutes. While several missions – such as NASA’s Solar Dynamics Observatory – study the Sun continuously, some rapidly changing regions require dedicated observation. After taking some 1500 images, the RAISE payload parachuted back to Earth, where it is now being recovered. This is RAISE’s third flight, following launches in 2010 and 2014.

 


Simulating the universe: solving Einstein’s equations of general relativity in a cosmological setting

A visualization of a curved space–time “sea”

From the Genesis story in the Old Testament to the Greek tale of Gaia (Mother Earth) emerging from chaos and giving birth to Uranus (the god of the sky), people have always wondered about the universe and woven creation myths to explain why it looks the way it does. One hundred years ago, however, Albert Einstein gave us a different way to ask that question. Newton’s law of universal gravitation, which was until then our best theory of gravity, describes how objects in the universe interact. But in Einstein’s general theory of relativity, space–time (the marriage of space and time) itself evolves together with its contents. And so cosmology, which studies the universe and its evolution, became at least in principle a modern science – amenable to precise description by mathematical equations, able to make firm predictions, and open to observational tests that could falsify those predictions.

Our understanding of the mathematics of the universe has advanced alongside observations of ever-increasing precision, leading us to an astonishing contemporary picture. We live in an expanding universe in which the ordinary material of our everyday lives – protons, neutrons and electrons – makes up only about 5% of the contents of the universe. Roughly 25% is in the form of “dark matter” – material that behaves like ordinary matter as far as gravity is concerned, but is so far invisible except through its gravitational pull. The other 70% of the universe is something completely different, whose gravity pushes things apart rather than pulling them together, causing the expansion of the universe to accelerate over the last few billion years. Naming this unknown substance “dark energy” teaches us nothing about its true nature.

Now, a century into its work, cosmology is brimming with existential questions. If there is dark matter, what is it and how can we find it? Is dark energy the energy of empty space, also known as vacuum energy, or is it the cosmological constant, Λ, first suggested by Einstein in 1917? He introduced the constant after mistakenly thinking it would stop the universe from expanding or contracting, and so – in what he later called his “greatest blunder” – failed to predict the expansion of the universe, which was discovered a dozen years later. Or are one or both of these invisible substances figments of the cosmologist’s imagination, so that it is general relativity itself that must be changed?

At the same time as facing these fundamental questions, cosmologists are testing their currently accepted model of the universe – dubbed ΛCDM – to greater and greater precision observationally. (CDM indicates that the dark-matter particles are cold: they must move slowly, like the molecules in a cold drink, so as not to evaporate from the galaxies they help bind together.) And yet, while we can use general relativity to describe how the universe expanded throughout its history, we are only just starting to use the full theory to model the specific details and observations of how galaxies, clusters of galaxies and superclusters formed. How this happens is simple – the equations of general relativity aren’t.

Horribly complex

While they fit neatly onto a T-shirt or a coffee mug (see below), Einstein’s field equations are horrible to solve even using a computer. The equations involve 10 separate functions of the four dimensions of space and time, which characterize the curvature of space–time in each location, along with 40 functions describing how those 10 functions change, as well as 100 further functions describing how those 40 changes change, all multiplied and added together in complicated ways. Exact solutions exist only in highly simplified approximations to the real universe. So for decades cosmologists have used those idealized solutions and taken the departures from them to be small perturbations – reckoning, in particular, that any departures from homogeneity can be treated independently from the homogeneous part and from one another.
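
For reference, the equations in question, and the bookkeeping behind the counts quoted above, can be written compactly; this is standard textbook material rather than anything specific to the simulations discussed here.

```latex
% Einstein's field equations in their usual compact form:
\begin{equation}
  G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}
\end{equation}
% Counting the functions quoted in the text: the symmetric 4x4 metric
% g_{\mu\nu} has 10 independent components; its first derivatives
% \partial_\lambda g_{\mu\nu} give 10 x 4 = 40 functions; and its second
% derivatives \partial_\kappa \partial_\lambda g_{\mu\nu} give
% 10 x 10 = 100, since the symmetric pair (\kappa, \lambda) has 10
% independent combinations.
```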

This “first-order perturbation theory” has taught us a lot about the early development of cosmic structures – galaxies, clusters of galaxies and superclusters – from barely perceptible concentrations of matter and dark matter in the early universe. The theory also has the advantage that we can do much of the analysis by hand, and follow the rest on computer. But to track the development of galaxies and other structures from after they were formed to the present day, we’ve mostly reverted to Newton’s theory of gravity, which is probably a good approximation.

Einstein’s equations of general relativity on a coffee mug

To make progress, we will need to improve on first-order perturbation theory, which treats cosmic structures as independent entities that are affected by the average expansion of the universe, but neither alter the average expansion themselves, nor influence one another. Unfortunately, higher-order perturbation theory is much more complicated – everything affects everything else. Indeed, it’s not clear there is anything to gain from using these higher-order approximations rather than “just solving” the full equations of general relativity instead.
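
Schematically, the perturbative approach splits the metric into a homogeneous background plus a small correction; the summary below uses conventional notation and is our gloss on the standard treatment.

```latex
% First-order (linear) perturbation theory in schematic form:
\begin{equation}
  g_{\mu\nu}(t,\vec{x}) = \bar{g}_{\mu\nu}(t) + h_{\mu\nu}(t,\vec{x}),
  \qquad |h_{\mu\nu}| \ll 1,
\end{equation}
% where \bar{g}_{\mu\nu} is the homogeneous expanding background. Keeping
% only terms linear in h_{\mu\nu}, each perturbation evolves independently
% on the fixed background; the quadratic and higher terms dropped here are
% exactly the ones through which "everything affects everything else".
```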

Improving the precision of our calculations – how well we think we know the answer – is one thing, as discussed above. But the complexity of Einstein’s equations has made us wonder just how accurate the perturbative description really is. In other words, it might give us answers, but are they the right ones? Nonlinear equations, after all, can have surprising features that appear unexpectedly when you solve them in their full glory, and it is hard to predict surprises. Some leading cosmologists, for example, claim that the accelerating expansion of the universe, which dark energy was invented to explain, is caused instead by the collective effects of cosmic structures in the universe acting through the magic of general relativity. Other cosmologists argue this is nonsense.


The only way to be sure is to use the full equations of general relativity. And the good news is that computers are finally becoming fast enough that modelling the universe using the full power of general relativity – without the traditional approximations – is not such a crazy prospect. With some hard work, it may finally be feasible over the next decade.

Computers to the rescue

Numerical general relativity itself is not new. As far back as the late 1950s, Richard Arnowitt, Stanley Deser and Charles Misner – together known as ADM – laid out a basic framework in which space–time could be carefully separated into space and time – a vital first step in solving general relativity with a computer. Other researchers also got in on the act, including Thomas Baumgarte, Stuart Shapiro, Masaru Shibata and Takashi Nakamura, who made important improvements to the numerical properties of the ADM system in the 1980s and 1990s so that the dynamics of systems could be followed accurately over long enough times to be interesting.
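
In standard notation, the ADM split expresses the space–time interval in terms of a spatial metric evolving in time; the form below is textbook material, not a quotation from the original papers.

```latex
% The ADM "3+1" decomposition: spatial slices of constant t, stacked using
% the lapse N and shift N^i:
\begin{equation}
  \mathrm{d}s^2 = -N^2\,\mathrm{d}t^2
    + \gamma_{ij}\,\bigl(\mathrm{d}x^i + N^i\,\mathrm{d}t\bigr)
                  \bigl(\mathrm{d}x^j + N^j\,\mathrm{d}t\bigr)
\end{equation}
% The spatial metric \gamma_{ij} and its conjugate momentum are evolved
% forward slice by slice, which is the form a computer can integrate.
```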

Other techniques for obtaining such long-time stability were also developed, including one imported from fluid mechanics. Known as adaptive mesh refinement, it allowed scarce computer memory resources to be focused only on those parts of problems where they were needed most. Such advances have allowed numerical relativists to simulate with great precision what happens when two black holes merge and create gravitational waves – ripples in space–time. The resulting images are more than eye candy; they were essential in allowing members of the US-based Laser Interferometer Gravitational-Wave Observatory (LIGO) collaboration to announce last year that they had directly detected gravitational waves for the first time.

By modelling many different possible configurations of pairs of black holes – different masses, different spins and different orbits – LIGO’s numerical relativists produced a template of the gravitational-wave signal that would result in each case. Other researchers then compared those simulations over and over again to what the experiment had been measuring, until the moment came when a signal was found that matched one of the templates. The signal in question was coming to us from a pair of black holes a billion light-years away spiralling into one another and merging to form a single larger black hole.


Using numerical relativity to model cosmology has its own challenges compared to simulating black-hole mergers, which are just single astrophysical events. Some qualitative cosmological questions can be answered by reasonably small-scale simulations, and there are state-of-the-art “N-body” simulations that use Newtonian gravity to follow trillions of independent masses over billions of years to see where gravity takes them. But general relativity offers at least one big advantage over Newtonian gravity – it is local.

The difficulty with calculating the gravity experienced by any particular mass in a Newtonian simulation is that you need to add up the effects of all the other masses. Even Isaac Newton himself regarded this “action at a distance” as a failing of his model, since it means that information travels from one side of the simulated universe to the other instantly, violating the speed-of-light limit. In general relativity, however, all the equations are “local”, which means that to determine the gravity at any time or location you only need to know what the gravity and matter distribution were nearby just moments before. This should, in other words, simplify the numerical calculations.
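
The contrast can be made concrete in a few lines of code. The toy sketch below, entirely our own illustration, compares a Newtonian acceleration sum, where every mass needs information about every other mass at once, with a local field update in which each grid point needs only its immediate neighbours (a scalar wave equation stands in for the full Einstein equations).

```python
# Toy illustration (our own, not from either research group) of locality:
# Newtonian gravity sums over every other mass, O(N^2) and instantaneous,
# while a relativistic-style field update is local, O(N) per step.
import numpy as np

def newtonian_accel(pos, masses, G=1.0, eps=1e-3):
    """O(N^2): each mass needs the positions of all the others at once."""
    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        d = pos - pos[i]                              # vectors to every mass
        r3 = (np.sum(d**2, axis=1) + eps**2) ** 1.5   # softened distances cubed
        acc[i] = G * np.sum(masses[:, None] * d / r3[:, None], axis=0)
    return acc

def local_field_step(phi, phi_prev, c=1.0, dx=1.0, dt=0.5):
    """O(N): each grid point is updated from its nearest neighbours only."""
    lap = (np.roll(phi, 1) - 2 * phi + np.roll(phi, -1)) / dx**2
    return 2 * phi - phi_prev + (c * dt)**2 * lap
```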

Recently, the three of us at Kenyon College and Case Western Reserve University showed that the cosmological problem is finally becoming tractable (Phys. Rev. Lett. 116 251301 and Phys. Rev. D 93 124059). Just days after our paper appeared, Eloisa Bentivegna at the University of Catania in Italy and Marco Bruni at the University of Portsmouth, UK, had similar success (Phys. Rev. Lett. 116 251302). The two groups each presented the results of low-resolution simulations, where grid points are separated by 40 million light-years, with only long-wavelength perturbations. The simulations followed the universe for only a short time by cosmic standards – long enough only for the universe to somewhat more than double in size – but both tracked the evolution of these perturbations in full general relativity with no simplifications or approximations whatsoever. As the eminent Italian cosmologist Sabino Matarrese wrote in Nature Physics, “the era of general relativistic numerical simulations in cosmology ha[s] begun”.

Illustration of a simulation of “co-ordinate invariance”

These preliminary studies are still a long way from competing with modern N-body simulations for resolution, duration or dynamic range. To do so will require advances in the software so that the code can run on much larger computer clusters. We will also need to make the code more stable numerically so that it can model much longer periods of cosmic expansion. The long-term goal is for our numerical simulations to match as far as possible the actual evolution of the universe and its contents, which means using the full theory of general relativity. But given that our existing simulations using full general relativity have revealed no fluctuations driving the accelerated expansion of the universe, it appears instead that accelerated expansion will need new physics – whether dark energy or a modified gravitational theory.

Both groups also observe what appear to be small corrections to the dynamics of space–time when compared with simple perturbation theory. Bentivegna and Bruni studied the collapse of structures in the early universe and suggested that they appear to coalesce somewhat more quickly than in the standard simplified theory.

Future perfect

Drawing specific conclusions about simulations is a subtle matter in general relativity. At the mathematical heart of the theory is the principle of “co-ordinate invariance”, which essentially says that the laws of physics should be the same no matter what set of labels you use for the locations and times of events. We are all familiar with milder versions of this symmetry: we wouldn’t expect the equations governing basic scientific laws to depend on whether we measure our positions in, say, New York or London, and we don’t need new versions of science textbooks whenever we switch from standard time to daylight saving time and back. Co-ordinate invariance in the context of general relativity is just a more extreme version of that, but it means we must ensure that any information we extract from our simulations does not depend on how we label the points in our simulations.

Our Ohio group has taken particular care with this subtlety by sending simulated beams of light from distant points in the distant past at the speed of light through space–time to arrive at the here and now. We then use those beams to simulate observations of the expansion history of our universe. The universe that emerges exhibits an average behaviour that agrees with a corresponding smooth, homogeneous model, but with inhomogeneous structures on top. These additional structures contribute to deviations in observable quantities across the simulated observer’s sky that should soon be accessible to real observers.


This work is therefore just the start of a journey. Creating codes that are accurate and sensitive enough to make realistic predictions for future observational programmes – such as the all-sky surveys to be carried out by the Large Synoptic Survey Telescope or the Euclid satellite – will require us to study larger volumes of space. These studies will also have to incorporate ultra-large-scale structures some hundreds of millions of light-years across as well as much smaller-scale structures, such as galaxies and clusters of galaxies. They will also have to follow these volumes for longer stretches of time than is currently possible.

All this will require us to introduce some of the same refinements that made it possible to predict the gravitational-wave ripples produced by a merging black hole, such as adaptive mesh refinement to resolve the smaller structures like galaxies, and N-body simulations to allow matter to flow naturally across these structures. These refinements will let us characterize more precisely and more accurately the statistical properties of galaxies and clusters of galaxies – as well as the observations we make of them – taking general relativity fully into account. Doing so will, however, require clusters of computers with millions of cores, rather than the hundreds we use now.

These improvements to code will take time, effort and collaboration. Groups around the world – in addition to the two mentioned – are likely to make important contributions. Numerical general-relativistic cosmology is still in its infancy, but the next decade will see huge strides to make the best use of the new generation of cosmological surveys that are being designed and built today. This work will either give us increased confidence in our own scientific genesis story – ΛCDM – or teach us that we still have a lot more thinking to do about how the universe got itself to where it is today.

Cat-chy quantum song, science TV resurrected, $800,000 textbook, desk traffic lights

By Sarah Tesh 

I never realized it until now, but my life was missing a song about Schrödinger’s cat. Well, theoretical physicist, science writer and now singer/songwriter Sabine Hossenfelder has come to the rescue with a song about quantum states. This is her second music video made in collaboration with artists Apostolos Vasilidis and Timo Alho. The rather cat-chy tune not only includes lyrics about quantum entanglement, Boltzmann brains and the multiverse, but also fits in references to Star Trek and The Matrix. In her BackReaction blog, Hossenfelder says, “If you think this one’s heavy on the nerdism, wait for the next.” We’re looking forward to it!

