
Atom interferometry heats up with warm-vapour device

An atom interferometer that does not have to be cooled to cryogenic temperatures has been created by physicists in the US. The new device instead employs a cell of warm vapour. The absence of bulky cooling equipment means the device could potentially feature in simple atomic sensors designed for a range of applications – including measuring accelerations with great precision.

Atom interferometers rely on the fact that particles of matter have wave-like properties. Like optical interferometers, they measure the interference fringes produced when the two halves of a split beam are sent along different paths and then recombined. But rather than using components made of matter to split and reflect beams of light, they do the reverse – typically using laser beams to manipulate beams of matter.

Atom interferometers are more sensitive than their optical counterparts because the matter waves they measure travel more slowly than light does, so their phase evolves over longer periods of time. This makes them ideal for high-precision measurements, such as looking for variations in the fine-structure constant or testing the equivalence principle. They are also used in inertial sensors to make very accurate measurements of position or rotation, for example.

Large and fiddly

So-called light-pulse atom interferometry involves cooling large collections of atoms to temperatures as low as a few millionths of a kelvin. The chilly conditions are needed to reduce the atoms’ range of velocities, so as to increase the signal at the interferometer’s output and keep the atoms closer together to maximize precision. But the lasers and ultrahigh-vacuum chambers required to do this are large – the smallest (transportable) systems have a volume of about 1 m³. They are also tricky to operate because they require fine-tuning and must be kept stable.

In the latest work, Grant Biedermann and colleagues at Sandia National Laboratories in New Mexico adopt a different approach involving a vapour of rubidium atoms held at 39 °C inside a 10 cm-long cell. The idea is to reduce the atoms’ velocity spread not by limiting their thermal energy as a whole, but instead by selecting two subsets of atoms with very precise velocities. The researchers did this using two counter-propagating Raman lasers, which first excite the subsets with opposite velocities and then “kick” them along different trajectories to create the interferometer.
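
For readers who want the selection mechanism made explicit, the sketch below gives the standard two-photon resonance condition for counter-propagating Raman beams; the formula and symbol names are textbook conventions, not something quoted from the Sandia paper.

```latex
% Doppler-sensitive two-photon (Raman) resonance condition
% (standard expression; not taken from the paper itself):
\delta = (\omega_1 - \omega_2) - \omega_{\mathrm{HF}}
         - \vec{k}_{\mathrm{eff}} \cdot \vec{v} - \omega_{\mathrm{rec}}
% \omega_1, \omega_2:      the two Raman laser frequencies
% \omega_{\mathrm{HF}}:    the atomic hyperfine splitting
% \vec{k}_{\mathrm{eff}}:  the effective two-photon wavevector
% \omega_{\mathrm{rec}}:   the photon-recoil shift
% Only atoms whose velocity \vec{v} makes \delta \approx 0 are driven,
% so each pair of laser frequencies addresses a narrow velocity class
% within the broad thermal distribution of the warm vapour.
```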

Writing in a commentary that accompanies the Sandia group’s paper in Physical Review Letters, Carlos Garrido Alzar of the Paris Observatory in France draws an analogy with optical interferometry. Existing atomic devices, he says, operate like a laser – a coherent source of light – whereas using a warm vapour is like “searching for interferometric effects using the white incoherent light from a common light bulb”.

Flipping spins

To carry out their experiment, the researchers had to overcome a number of technical hurdles. One was how to prepare the atomic states inside their vapour cell. The atoms need to be spin polarized if they are to interfere properly, but their spin can be flipped when they bounce off the cell walls owing to electromagnetic fields created at the surface. To overcome this problem, the researchers covered the walls with a special coating.

Another major challenge was aligning the weak laser beams that were used to detect the interference fringes with the more powerful lasers used to create the interferometer, such that atoms within the two velocity subsets overlapped properly. “Since this really takes place in 3D, angle is critical,” says Biedermann. “So it was a matter of developing optical alignment tricks.”

The scheme’s sensitivity to the phase difference between matter waves travelling along the two arms of the interferometer is limited by the short time it takes thermal atoms to cross the Raman laser beams. With a transit time of just 29 μs, acceleration sensitivity is roughly five orders of magnitude or more below that possible with the best cold-atom interferometers today. However, according to Garrido Alzar, the new scheme does offer “two important advantages” compared with conventional devices. One, he says, is the fact that it can acquire data about 10,000 times more quickly. Another is its ability to measure a broader range of accelerations.
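
To see why the transit time matters so much, it helps to recall the phase response of a standard three-pulse light-pulse atom interferometer; this textbook scaling is not given in the article but underlies the comparison above.

```latex
% Acceleration-induced phase shift of a three-pulse (Mach-Zehnder)
% light-pulse atom interferometer (standard textbook result):
\Delta\phi = k_{\mathrm{eff}} \, a \, T^{2}
% k_{\mathrm{eff}}: effective two-photon wavevector of the Raman beams
% a:                acceleration along the beams
% T:                time between the interferometer pulses
% Because the phase grows as T^2, limiting T to the ~29 us transit time
% costs sensitivity quadratically relative to cold-atom devices, where
% T can reach tens or hundreds of milliseconds.
```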

Speedy operation

Mark Kasevich of Stanford University in the US says that the high laser power needed for the interferometer “may be challenging to achieve” in practical devices. But he nevertheless thinks that the scheme’s speedy operation could prove attractive for inertial sensors used to guide cars, for example.

Guglielmo Tino of the University of Florence in Italy also believes that the new research holds promise. “The published results are still very preliminary and the achieved sensitivity is rather low but it will be interesting to see where this method can lead if optimized,” he says. “It might simplify the atomic sensors for several applications.”

  • There is much more about using atom interferometry to test the equivalence principle in “The descent of mass”.

Cassini's emotional countdown, Steve the light show, shooting hoops 'granny style'

 

By Sarah Tesh

This week has seen the beginning of Cassini’s Grand Finale. The rather dramatically named final mission for the NASA spacecraft involves 22 dives between Saturn and its surrounding rings. Once complete, Cassini will crash into the planet’s atmosphere in what the scientists hope will be a flurry of data gathering. The spacecraft has already sent back stunning images of storms in Saturn’s atmosphere from its first dive on 26 April. Twenty years after its launch, the mission to Saturn’s system has been a masterclass in space exploration, and NASA highlights the best bits in this theatrical video. The short film, reminiscent of Star Trek, could be considered a bit cheesy, but it’s hard not to form an emotional attachment to NASA’s loyal Cassini as you join in the countdown to its demise.


Maintaining high standards: the importance of materials-science research to the nuclear industry

Fuel rods being loaded into a reactor core at a nuclear power station

For a few weeks in the autumn of 2016, France had to do without around a third of its nuclear reactors. Several of the closures were due to routine maintenance, but the ones that hit the headlines had a rather more serious cause: quality-control problems in the manufacture of various critical reactor components.

Suspicions – and eyebrows – had been raised among industry insiders more than a year earlier when, in April 2015, the French nuclear regulator released information concerning the discovery of compositional variances in the steel used at Flamanville 3, an Areva-designed European pressurized-water reactor (EPR) on the country’s northern coast. These variances were in the head and bottom of the reactor pressure vessel (RPV), and they arose from an increase in carbon content compared with the design standard. Following this report, further concerns were raised over other components, such as steam generators, that had been manufactured at the same facility at a similar time. These components were then examined as part of a large shutdown programme aimed at determining whether zones of high carbon concentration could have made the components less mechanically tough than they should have been – perhaps decreasing their resistance to crack propagation.

Clearly, cracks in reactor vessels are unwelcome. However, to understand fully why the French shutdown occurred, it is first necessary to consider the conditions within the core of a reactor like the EPR. Generally, within a pressurized water reactor (PWR) such as the EPRs being built at Flamanville (and, soon, at Hinkley Point C in the UK), the water temperature is about 320 °C, while pressures of around 150 bar ensure the water remains liquid and does not boil. This high-temperature water is then used to heat a secondary water circuit, creating the steam that drives the turbines. Such conditions would, by their very nature, be considered extreme, even before one includes the added complexity arising from radiation.

In such an environment, there are multiple ways for cracks to begin forming. One significant cause is a pre-existing or “locked in” point of stress or strain, which may have formed during manufacture of the component itself or, as in this case, during the fabrication of the material that makes up the component. Cracking reduces such stress/strain, lowering the overall energy of the system and thus stabilizing the material. Another important crack-formation mechanism, stress corrosion, also reduces the stress/strain on the material, but in this case the process occurs near the surface of the material.

Both of these mechanisms constitute classical materials behaviour, and are not specific to nuclear applications. Stress corrosion can, however, be accelerated through radiolysis of the circulating water, which occurs when high-energy photons are released during fission/decay and go on to create radicals. A third important crack-formation mechanism, neutron-induced damage, is directly related to radiation. In some cases, radiation can cause alloying elements or impurities to segregate in certain areas of the material, weakening it. This type of damage is a key limiting factor in the lifetime of nuclear materials, and it affects every component used within the core – whether it be fuel, cladding or, as at Flamanville 3, the RPV.

Predicting behaviour

If materials are placed under great strain within the core, what is the best way of predicting their behaviour? In many cases, the level of predictability required depends on the material’s location. For example, to predict the behaviour of the fuel-cladding interface, one must have a very good understanding of how both fuel and cladding behave during operation. Such behaviour is directly linked to fuel changes arising from fission-induced damage and thermal-induced expansion in both the fuel and cladding, coupled with fission-induced expansion of the fuel pellets. In essence, predicting this behaviour relies on taking classical materials science and augmenting it with radiation-induced changes.


To say that it is difficult to routinely monitor components during their operational use within a core is something of an understatement. The levels of radiation during operation are extreme, and in many cases gaining access to components can be a challenge. For this reason, analytical codes have been developed to model component behaviour. These codes take into account a wide range of material properties (such as hardness, toughness and ductility) and use them to predict long-term behaviour, including crack propagation.

The UK’s Office for Nuclear Regulation (ONR) requires that for every reactor it approves, there must be an accepted method for predicting its expected behaviour over time, and this method must be both reliable and valid. For example, codes known as R5 and R6 are used to predict the behaviour of the advanced gas-cooled reactor (AGR) fleet. These codes are continuously updated and tested against the behaviour of real materials, thanks to a programme of monitoring and assessment that is incorporated into the maintenance and refuelling programme for each reactor. They are thus reliable models that we can use with confidence to predict how a material will behave.

The next generation

As we move into the next generation of reactors and beyond, however, the situation changes somewhat. For traditional light-water reactor designs such as a PWR, there is a breadth of experience worldwide that can help us predict long-term material behaviour in new PWRs. However, the next generation of reactors – the so-called generation-IV family – are based on very different technologies, and feature innovations such as very high temperatures (up to 850–950 °C with helium-gas coolant) or molten salt- and liquid metal-based cores. Data on such systems is sparse. Next-generation designs also have longer planned lifetimes than the current fleet, and can thus be expected to suffer higher levels of radiation damage before they are decommissioned.


Under these constraints, understanding and predicting how materials will behave over extended periods of time is of paramount importance. Before we discuss how to handle such challenges, we need to define a unit that is both simple and powerful in its use: displacements per atom, or dpa. This, as the name implies, is the average number of atomic displacements within a material sample; a value of 1 dpa might imply that in a system of 100 atoms, each atom has moved once, or it could mean that one atom has moved 100 times. As such, it is a useful measure of the average level of damage within a system.

For a light-water reactor, the expected level of damage within the core is about 1 dpa per year of operational life. Hence, in the expected 60–70 year lifetime of reactors currently under construction, we can anticipate reaching values of 60–70 dpa. However, generation-IV designs can have damage levels greater than 200 dpa – a significant increase that dramatically complicates the information required to develop future assessment codes. Before we can trust models predicting behaviour at such high damage levels, we need to test those models against sample materials that have been irradiated to similar levels. This type of damage can be induced using a range of methods, such as bombardment with neutrons or ions. The sample must then be analysed and any changes to its properties (such as the fracture toughness) determined. This behaviour can then be included in the relevant codes.
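
As a back-of-envelope check on those numbers, the hypothetical snippet below simply multiplies the damage rate quoted above by a lifetime; it is purely illustrative and no substitute for assessment codes such as R5 and R6.

```python
# Illustrative dpa arithmetic using the figures quoted in the text.
# This is a toy calculation, not an engineering assessment code.

def lifetime_dose_dpa(dpa_per_year: float, lifetime_years: float) -> float:
    """Cumulative displacement damage accrued over a reactor's lifetime."""
    return dpa_per_year * lifetime_years

# Light-water reactor: ~1 dpa per year over a 60-70 year lifetime
print(lifetime_dose_dpa(1.0, 60), "-", lifetime_dose_dpa(1.0, 70), "dpa")
# Generation-IV designs are instead expected to exceed 200 dpa,
# well beyond the regime in which current codes have been validated.
```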

Quality assessment

So how does this discussion of next-generation technologies relate to the increased carbon content in the RPV steel at Flamanville? The answer is that the increased carbon content took the material into a regime that was outside the design and regulatory specifications, introducing property changes that were not included in the original analysis and predictions. As a consequence, these predictions became unreliable. Further testing, requested by the French nuclear regulator, should show whether the models’ predictions are still valid, and thus whether they can be used to make future predictions for the vessels being monitored.

This incident therefore highlights the need for materials and components to be manufactured to a high degree of reliability and accuracy – not only in their shape, but also in their composition, as even small changes can lead to components falling outside of specification. Once a component is shown to be outside of specification, the regulator is well within its rights to initiate a shutdown of the reactor, and to require further testing/data before restart. Such was the case with the French reactors in 2016, when increased carbon content in the RPV steel led to a shutdown of a large portion of the nation’s fleet.

Flash Physics: Cosmic-ray balloon, Tokamak Energy plasma, ripples in cosmic web, APS in open access scheme

Cosmic-ray balloon launches in New Zealand

A 532,000 m³ super-pressure balloon to study ultra-high-energy cosmic rays has been launched in New Zealand by NASA. The balloon’s international Extreme Universe Space Observatory (EUSO) instrument will observe a broad swathe of the Earth’s atmosphere to detect the ultraviolet fluorescence produced as cosmic rays hit the atmosphere. The instrument will aim to detect cosmic rays that have an energy greater than 10¹⁸ eV. The balloon will operate for around 100 days and is expected to circle the planet two or three times. If the mission is a success then it could pave the way for an EUSO instrument to be installed on the International Space Station that could then observe a greater area of the Earth’s atmosphere.

Tokamak Energy achieves first plasma

Photograph of the ST40 compact tokamak

The UK-based company Tokamak Energy has created the first plasma in its ST40 tokamak reactor. The firm will now complete the commissioning and installation of a full set of magnetic coils for the device, which will provide greater control over the plasma. The company plans to achieve a plasma temperature of 15 million degrees by autumn 2017 and have the plasma at 100 million degrees in 2018. At this temperature it should be possible for hydrogen nuclei in the plasma to fuse together, releasing large amounts of energy. Tokamak Energy has ambitious plans to create a fusion reactor capable of generating electricity by 2025 and to have a commercially viable source of fusion power by 2030. Unlike the much larger ITER tokamak fusion reactor that is being built in France, the ST40 is a compact device that can run at a much higher plasma pressure. This, according to Tokamak Energy, should make it more efficient at achieving fusion. Creating a dense plasma will require very strong magnetic fields, which the firm plans to generate using superconducting magnets. Some critics, however, are sceptical that such magnetic fields can be achieved inside a tokamak. The firm’s chief executive David Kingham describes the ST40 as “the first world-class controlled fusion device to have been designed, built and operated by a private venture”. However, he concedes that “we will still need significant investment, many academic and industrial collaborations, dedicated and creative engineers and scientists, and an excellent supply chain” for the company to achieve its goals.

Cosmos ripples with Big Bang information

Tiny ripples have been observed in the haze of hydrogen left over from the Big Bang. The gas makes up a vast network of tangled filamentary structures, stretching out over billions of light-years. This “cosmic web” accounts for the majority of atoms in the universe, despite there being only one atom per cubic metre in the most barren parts. While the cosmic web does not emit light itself, it is possible to study it indirectly by looking at how it absorbs light from distant quasars – hugely energetic and luminous active galactic nuclei. Using exceedingly rare pairs of quasars, Alberto Rorai from the University of Cambridge in the UK and colleagues were able to measure the subtle differences in the absorption along two sightlines. The region of the cosmic web they observed was nearly 11 billion light-years away, but the team detected variations in the web’s structure on scales 100,000 times smaller than that distance – comparable to the size of a single galaxy (which is tiny relative to the web’s size). The team found that the ripples fitted with simulations of cosmic structures from the Big Bang to now. “One reason why these small-scale fluctuations are so interesting is that they encode information about the temperature of gas in the cosmic web just a few billion years after the Big Bang,” explains Joseph Hennawi of the University of California, Santa Barbara in the US. The findings are reported in Science.

APS joins high-energy physics open-access initiative

The American Physical Society has signed an agreement with the CERN particle-physics lab to join the SCOAP³ initiative that provides open access to journal articles written by particle physicists. SCOAP³ began in 2014 and encourages the “gold” model of open access, whereby published papers can be read free of charge on the internet and authors pay an article-processing charge to the publisher. Since it began, the initiative has made about 15,000 high-energy physics papers by more than 20,000 scientists from 100 countries accessible to anyone. The agreement with the APS now means that starting on 1 January 2018 Physical Review Letters, Physical Review D and Physical Review C will join eight other journals in SCOAP³. Under the deal with the APS, authors of articles that have a primary designation in the “high-energy physics” category on the arXiv preprint server will not have to pay to make their article open access when publishing in one of the above APS journals.

  • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics.

The search is on for elusive particle decay

A US experiment to search for neutrinoless double-beta decay has got the green light to start operations. The Majorana Demonstrator, located at the Sanford Underground Research Facility in South Dakota, received “Critical Decision 4” from the Department of Energy (DOE) in March. The decision certifies that the experiment met its “performance parameters”, including the need for ultra-low background measurements.

“The DOE held us to the line and made sure we built something that we can use to do good science,” says Lawrence Berkeley National Laboratory physicist Alan Poon, the Majorana Demonstrator’s detector group leader. “Because we know we have met all these basic requirements, now we start doing physics and trying to improve on the instruments and try to discover new signs.”

The Majorana Demonstrator has been gathering data with one operating module since June 2015 and presented data on background levels last year at the Neutrino 2016 conference in London. The experiment’s second module was installed late last year, and the final, outermost polyethylene layer was added to the detectors’ copper–lead shielding in March, so construction of the experiment is now fully complete.

Annihilating neutrinos

To search for neutrinoless double-beta decay signals, the experiment will use 30 kg of enriched germanium-76 detectors. In ordinary double-beta decay, two neutrons simultaneously decay into two protons, emitting two electrons and two antineutrinos. If the neutrino is a Majorana particle – its own antiparticle – then the two antineutrinos would annihilate each other before leaving the nucleus, hence neutrinoless double-beta decay.
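
Written out in standard notation (the article describes the processes only in words), the two modes look like this for a nucleus of mass number A and charge Z:

```latex
% Ordinary (two-neutrino) double-beta decay:
(A, Z) \;\to\; (A, Z+2) + 2e^{-} + 2\bar{\nu}_{e}
% Neutrinoless double-beta decay, possible only if the
% neutrino is a Majorana particle (its own antiparticle):
(A, Z) \;\to\; (A, Z+2) + 2e^{-}
% For the Majorana Demonstrator the parent nucleus is
% germanium-76, which would decay to selenium-76.
```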

Poon and colleagues have also teamed up with the GERDA experiment at the Gran Sasso National Laboratory in Italy to create the Large Enriched Germanium Experiment for Neutrinoless ββ Decay (LEGEND) alliance. Members from GERDA were involved with the Majorana Demonstrator’s design and construction and LEGEND will co-ordinate efforts to search for neutrinoless double-beta decay.

Bernhard Schwingenheuer from the Max Planck Institute for Nuclear Physics in Heidelberg, who is co-spokesperson for the GERDA experiment, says that the different shields on the two experiments will help them establish the critically low background levels needed for detecting neutrinoless double-beta decay. GERDA’s liquid-argon shield gives off tell-tale flashes of light when background radiation passes through it, while the Majorana shield’s inner copper layer has a high purity, which is critical for precluding misleading signals. Schwingenheuer and Poon are confident that LEGEND will help usher in a new stage of collaboration, which includes raising the amount of enriched germanium from 65 kg to 100 kg by 2019, with a longer-term goal of reaching 200 kg.

Bigger than Higgs

If the LEGEND team does manage to discover neutrinoless double-beta decay, it would indicate lepton-number violation – where the number of leptons minus the number of antileptons is not conserved – which some theorists believe could explain why there is more matter than antimatter in the universe. “If the lepton number is violated in this neutrinoless double-beta decay, that would be a major breakthrough, bigger than the Higgs boson discovery,” says Schwingenheuer. “If you find this decay you will have only a handful of events [and] you want to have an extremely low background so you want to be extremely sure that it is not something else.”

Writing in Physical Review Letters, Poon and colleagues have analysed data from the first module and have been able to exclude four proposals of exotic physics beyond the Standard Model at a confidence level of 90%. These are the existence of bosonic dark matter; the coupling of solar axions to matter; electronic transitions that violate the Pauli exclusion principle; and the decay of the electron.

Between the lines

Vermin of the sky

Mention the word asteroid and thoughts of doom and destruction seem to spring to mind. Whether you are commiserating with the now-extinct dinosaurs or thinking of an explosive Bruce Willis in Armageddon, these “vermin of the sky” (as they were once described by astronomer Edmund Weiss) seem to scare many of us. But California Institute of Technology scientist and author Carrie Nugent claims to be “obsessed” with these rocks, according to her new book Asteroid Hunters. Part of the TED book series, this concise volume is pretty high impact, pun intended. Written informally (much in the style of a TED talk) and from a rather personal point of view, the book swiftly takes the reader across the solar system, talking about where most asteroids are found, what they are made of, the chance of being hit by one (she briefly tells the tale of Ann Hodges, the only person in recent memory to be hit by a meteorite), the various times large impacts have occurred and, finally, the science behind asteroid hunting today, in an effort to be aware of any potential large intruders. Nugent is part of NASA’s NEOWISE mission – its telescope has tracked more than 158,000 near-Earth asteroids and discovered more than 30,000. The book also contains some wonderful illustrations by Mike Lemanski in the form of stylized infographics. Pick up Asteroid Hunters to get a crash course in asteroid tracking and planetary defence.

  • 2017 Oxford University Press 144pp £7.99pb

An atmospheric tale

Regular readers of Between the Lines will recall last month’s review of a book on gravity, part of the Very Short Introductions book series, written by experts in a field but aimed at a general audience and covering a large range of topics. The latest science addition to the series is The Atmosphere, written by atmospheric scientist Paul Palmer from the University of Edinburgh. This book covers most of the topics that one would expect it to – from our planet’s complex atmosphere to how it interacts with the planetary surface and the Sun. Palmer does a good job of contextualizing the subject with advances in science: for example, he talks about the different kinds of exoplanetary atmosphere that could be found, deals with modern estimates of air pollution and ozone loss, and ends with the future challenges facing atmospheric scientists. While the figures in the book are more suited to a journal article and Palmer’s writing is occasionally dry, the book is a useful resource and provides a good overview of the topic.

  • 2017 Oxford University Press 152pp £7.99pb

So you want to know about the dark universe?

Photo of Catherine Heymans, University of Edinburgh

By Matin Durrani

It never ceases to amaze me that we know almost nothing about 95% of the universe. Sure, the consensus is that 25% is dark matter and the rest is something dubbed “dark energy”, but beyond that our knowledge is wafer thin.

The flip side, though, is that there’s plenty for physicists to get stuck into. And if you want to get up to speed with the field and find out more about some of its challenges, do check out a new free-to-read Physics World Discovery ebook by Catherine Heymans from the Royal Observatory, University of Edinburgh, UK.

Available in ePub, Kindle and PDF formats, The Dark Universe explains the dark enigma and examines “the cosmologist’s toolkit of observations and techniques that allow us to confront different theories on the dark universe”. And to get you in the mood for all things dark, I asked Heymans some questions about her life as a research scientist. Here’s what she had to say.


Flash Physics: LISA Pathfinder beats static electricity, nanodiamonds enhance MRI, quantum pioneer bags prize

LISA Pathfinder overcomes electrostatic forces

Researchers have successfully minimized the electrostatic forces affecting test masses on the LISA Pathfinder spacecraft. The LISA Pathfinder mission aims to demonstrate technology for the Laser Interferometer Space Antenna (LISA) – a space-based gravitational-wave observatory that will comprise three spacecraft. As part of a huge detector, each spacecraft will contain test masses located at a “Lagrangian point” between the Sun and Earth. It is important that the masses are completely isolated from any external influences because LISA will use lasers to very precisely measure the distances between them. Any passing gravitational waves will then be detected as they cause tiny displacements of the masses. In June 2016, the European Space Agency (ESA) announced that a 2 kg test mass had been successfully isolated within the shell-like spacecraft of LISA Pathfinder. A second mass, located 33 cm away, monitors the test-mass motion and the spacecraft uses thrusters to make sure that the test mass does not bang into the wall of its surrounding shell. Now, researchers have developed methods to reduce the charge-induced electrostatic forces exerted on the test masses. These forces originate from high-energy cosmic rays and solar energetic particles penetrating the spacecraft and shielding, depositing charge on the test mass via secondary emission or by stopping directly. The methods and technology, presented in Physical Review Letters, mark a major milestone in the development of LISA and the next generation of gravitational-wave experiments.

Nanodiamonds enhance magnetic resonance imaging

MRI scans of vials containing nanodiamonds

A new technique using tiny diamonds and magnetic resonance imaging (MRI) could ensure that cancer drugs and other pharmaceuticals reach the right parts of the body. That’s the claim of a team of researchers including David Waddington at the University of Sydney in Australia and Matthew Rosen and Ronald Walsworth of Harvard University in the US. The group used the “Overhauser effect” to boost the MRI signal from pieces of diamond just 18 nm in size. This involves firing radio-frequency pulses at nanodiamonds mixed with water. This causes the transfer of electron spin polarization from impurities on the surface of the diamond to the nuclear spins of hydrogen in surrounding water molecules. This greatly enhances the MRI signal from the hydrogen nuclei, which should reveal the location of nanodiamonds as they move through the body. A cancer drug could be tagged with nanodiamonds, for example, and then MRI could be used to confirm that it reaches a tumour. Other applications include studying how drugs are transported across the blood–brain barrier and also detecting cancer by using nanodiamonds to tag a pharmaceutical that tends to accumulate in specific types of tumour. Unlike conventional MRI scans, which require strong magnetic fields created by expensive room-sized superconducting magnets, the new technique employs low magnetic fields. This, says Rosen, “opens up a number of new opportunities” beyond the current nanodiamond imaging application. The research is described in Nature Communications.

Quantum-computing pioneer bags mathematical-physics prize

 

Raymond Laflamme has won the 2017 CAP-CRM Prize for Theoretical and Mathematical Physics “for his groundbreaking contributions on quantum information”. Awarded jointly by the Canadian Association of Physicists (CAP) and Canada’s Centre de Recherches Mathématiques, the prize will be presented on 1 June at a ceremony at CAP’s annual congress in Kingston, Ontario. Laflamme began his career in cosmology, completing a PhD in 1988 with Stephen Hawking at the University of Cambridge. In the mid-1990s he shifted his attention to quantum information and helped develop linear optical quantum computing and other important concepts of quantum-information processing. In 2001 Laflamme joined the University of Waterloo as the founding director of the Institute for Quantum Computing (IQC). He is also a founding member of the Perimeter Institute for Theoretical Physics in Waterloo, Ontario, and in 2011 he co-founded the quantum-technology company Universal Quantum Devices.

 

  • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read our extensive news story on the latest in the search for neutrinoless double-beta decay.

Web life: Backyard Worlds: Planet 9

So what is the site about?

Much as its name suggests, Backyard Worlds: Planet 9 focuses on the hunt for a ninth planet in our solar system, along with other possible “rogue” planets that astronomers now believe may abound in the galaxy. The idea is to look through data from NASA’s Wide-field Infrared Survey Explorer (WISE) mission and distinguish certain features – following in the vein of a number of other celestial citizen-science projects. The data in this case are in the form of animated images of the sky, taken at different times. As a participant, your job is to pick out moving celestial bodies – mainly ultracool brown dwarfs and other rogue planets – from artefacts in the data. As the site suggests: “There are too many images for us to search through by ourselves. So come join the search, and you might find a rogue world that’s nearer to the Sun than Proxima Centauri – or even the elusive Planet Nine.”

Who is behind it?

It should come as no surprise that Backyard Worlds is part of the Zooniverse family. In case you haven’t come across it before, Zooniverse claims to be the “world’s largest and most popular platform for people-powered research”. Its science programmes involve everything from spotting distant galaxies to counting animals in the wild. The idea is to tap into people’s interest in science, whether or not they have a science degree, and use their help to pick out details in large data-sets – a task that computers are still much slower at than the average person.

The Backyard Worlds team is made up of researchers from the American Museum of Natural History, the Space Telescope Science Institute, NASA, the University of California, Berkeley and Arizona State University.

Can I get involved?

Yes of course – that is the aim of the game. At the time of writing, the site had 26,383 registered volunteers who had completed 2,314,451 classifications, but that isn’t even halfway to the goal, so there is plenty more help you can offer. Your main task as a volunteer is to look through sets of false-colour images, taken at four different times. You use a marking tool to point out objects that are moving through these images, either hopping and jumping across the set of images (a “mover”) or appearing as pairs of varying bright and dark spots (a “dipole”). If you think you have spotted a possible dipole or mover, you report it via the chat function by providing the object’s celestial coordinates (simply called Talk, this section also allows you to chat with other users as well as the scientists involved, making it a great open discussion platform).

The next step is to cross-reference your discovery against a database of known astronomical objects. Dubbed the “Set of Identifications, Measurements, and Bibliography for Astronomical Data” or SIMBAD, this database is used by professional astronomers. If your coordinates do not align with an existing object, you get to fill out an exciting “Think you’ve got one?” form with details of your find. At this point, the professionals take over as they first research the object to see what we already know about it, before following up with observations of the most promising candidates. “We need to apply for telescope time to follow up the most interesting objects to take their spectra,” explains the site, adding that “The spectra will allow us to figure out their spectral types and their temperatures, and find out if what we’re looking at really is a new brown dwarf or planet. That whole process will take several months.”
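
Volunteers do this cross-check through the site’s own forms, but for the curious, here is a minimal programmatic sketch of the same SIMBAD lookup, assuming the astropy and astroquery Python packages are installed; the coordinates are made up for illustration.

```python
# Minimal sketch of cross-referencing a candidate's coordinates
# against SIMBAD, assuming the astropy and astroquery packages.
# The coordinates below are hypothetical, for illustration only.
from astropy.coordinates import SkyCoord
import astropy.units as u
from astroquery.simbad import Simbad

# Celestial coordinates of a candidate "mover" read off the images
candidate = SkyCoord(ra=133.786 * u.deg, dec=-7.245 * u.deg)

# Ask SIMBAD for known objects within a small radius of the candidate
result = Simbad.query_region(candidate, radius=2 * u.arcmin)

if result is None or len(result) == 0:
    # Nothing catalogued here: worth reporting via the site's form
    print("No known object at these coordinates")
else:
    print(result)  # table of catalogued objects near the candidate
```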

Who is it aimed at?

To some extent, the site is aimed at anyone who would like to hunt for new planets. But Backyard Worlds needs a bit more time and attention than some of the other Zooniverse projects. While looking through the data and marking artefacts is simple, some users may be thrown by having to determine the celestial coordinates and then use the somewhat complicated SIMBAD database to find more data on their discoveries. That said, there are detailed “how to” guides and blog posts on each of these topics and the Talk feature allows you to ask for help if you need it. Ultimately, the hard work will pay off for all volunteers as everyone will be credited with any potential discoveries. And really, how many people can say they helped to find a planet?

The universe through a glass darkly

Browse through the Hubble Space Telescope’s “Top 100 Images” catalogue and you will undoubtedly be amazed by pictures of stellar clusters. Each image shows vast swathes of inky black embedded with hundreds of thousands of stars, varying in size and brightness from a tiny pinprick to a dazzling splash. Many, if not most, of these celestial objects are logged and classified in online databases such as the Set of Identifications, Measurements, and Bibliography for Astronomical Data (SIMBAD). As of this year, 9,099,070 objects beyond our solar system are logged in SIMBAD. In this age of big data and fast computing, these numbers do not seem that unwieldy.

But imagine that such online resources or even a mechanical computer were not available. This was the conundrum faced by astronomers in the mid-19th century, as modern astrophysics took off thanks to the advent of photographic techniques and devices such as large refractor telescopes. US astronomer and director of Harvard College Observatory (from 1877) Edward Charles Pickering decided to hire women as “human calculators” to process the ever-growing amounts of astronomical data.

Pickering employed up to 80 women during his 42-year stint as director, at a time when women still did not even have the right to vote. This team of gifted and talented women went on to make huge contributions to astronomy as we know it today, and author Dava Sobel tells their untold tale in her latest book – The Glass Universe: How the Ladies of the Harvard Observatory Took the Measure of the Stars.

Sobel is a seasoned science writer and a former New York Times science reporter. Her 1995 book Longitude was a bestseller and won numerous awards. She followed it up with a variety of historical-science books including Galileo’s Daughter and, most recently, A More Perfect Heaven. The Glass Universe opens in 1882 at a glitzy dinner party hosted by American doctor and amateur astronomer Henry Draper (a pioneer of astrophotography) and his wife Anna. The inside of the Drapers’ Madison Avenue mansion was lit by incandescent lamps – something that not even the White House could boast at the time. Thomas Edison (a personal friend of the Drapers) was himself in attendance along with a bevy of science luminaries, who were all in town for a meeting of the National Academy of Sciences. This opening description sets the scene for the Drapers’ love of science, though to call Henry Draper an amateur astronomer is something of a misnomer. He was a doctor who quit his medical practice in 1873 to pursue astronomy and had taken one of the early images of a stellar spectrum that showed absorption lines.

Anna used to assist her husband in his observations – from calling out the exact 165 seconds of totality during the 1878 solar eclipse to spending nights helping him with the photographic plates that captured stellar spectra. Five days after their celebrity-studded party, however, Henry Draper died from double pleurisy at the age of 45. Anna resolved to carry on with Henry’s work herself, more specifically his spectral imaging of some of the brightest stars in the night sky. It was this research that led Pickering to Anna, as he offered to analyse the spectral patterns using specialized equipment at Harvard.

Upon a visit to Harvard to hand over Henry’s plates, she was surprised to find that six of the computers at Harvard were women. Some were the wives, sisters and daughters of resident astronomers; others were recent graduates from new women’s colleges such as Vassar and Wellesley; others still had simply shown mathematical promise.

Pickering always struggled to keep the observatory financially afloat, and one way of doing this was to hire women as computers, as he could take on many more women on the same budget. But this was not the only reason – according to Sobel, Pickering advocated for women’s rights and felt that hiring women who had just graduated from the new colleges was a good way to silence naysayers who felt that higher education for women was a waste.

In 1886, two years after her first visit, Anna Draper decided to fund Pickering’s project of creating a photographic stellar spectra catalogue at Harvard, setting up the Henry Draper Memorial. Her funding enabled the work of many female computers and scientists, and would lead to the Henry Draper Catalogue (published between 1918 and 1924), which spectroscopically classified 225,300 stars and helped build the observatory’s collection of half a million glass photographic plates, which survives to this day.

An early talent noticed by Pickering was a Scottish woman by the name of Williamina Fleming, who was originally hired as his maid in 1879. The story goes that Pickering was often frustrated with the abilities of his all-male computing group and would complain that his “Scottish maid could do better”. This turned out to be a bit of an understatement: Fleming began as a part-time computer at Harvard and ultimately went on to identify 10 novae and more than 300 variable stars, discover the Horsehead nebula in 1888 and classify most of the stars in the Draper catalogue.

Fleming also developed an early stellar classification system based on a star’s hydrogen content. This was later improved upon by Annie Jump Cannon, another female computer who became part of Pickering’s group – or “Pickering’s harem”, as it was referred to at the time – in 1886. Cannon developed the basic version of the temperature-dependent stellar classification system still in use today. It is clear from the book that Cannon was a particular favourite of Sobel, who said as much in an interview with the Atlantic – Sobel delved into the archive of diaries that Cannon meticulously maintained throughout her life, which reveal Cannon’s unpretentiousness and wit.

Many other stalwarts in the field were to become part of the “Harvard Computers”. They ranged from Antonia Maury – best known for her spectroscopic analysis of the binary star Beta Lyrae – to Henrietta Swan Leavitt, who discovered the relationship between the luminosity and the period of Cepheid variable stars. This link in turn helped astronomers to measure the distance between Earth and distant galaxies, and eventually allowed Edwin Hubble to determine that our universe is expanding. The observatory also enabled Cecilia Helena Payne-Gaposchkin to become one of the first women to receive a PhD in astronomy (for studying stellar atmospheres). She also became the first ever woman professor of astronomy at Harvard in 1956 and, eventually, the university’s first female department chair.

Sobel is an extremely adept historical writer and researcher, and has scoured vast amounts of letters, diaries, memoirs and all of the archival material that Harvard had to offer in penning this book, which spans from the mid-1800s to just after the Second World War. She has woven in many details about the lives, loves, fears, frustrations and achievements of each of the main characters, as well as describing life at the observatory at that time. But the book focuses squarely on the people working at or with the Harvard Observatory, and although it follows its characters as they travel to exotic destinations such as Peru, you may be left wondering about the state of women in astronomy elsewhere in the US and beyond. Another gripe is that Sobel introduces many people in each chapter, but most get only a fleeting mention. As someone who struggles to remember names, I found myself flipping back and forth to place a particular individual if they were not one of the main characters.

Nevertheless, The Glass Universe is a fascinating story and Sobel an admirable writer. The book does an excellent job of setting the stage for two other recent books in a similar vein – Nathalia Holt’s Rise of the Rocket Girls and Margot Lee Shetterly’s Hidden Figures, which has also been adapted into an Oscar-nominated film and tells the story of the women computers at NASA in the mid-1900s.

As more and more stories of overlooked contributions of women in science come to light, I can’t help but wonder how many such tales will never get told. I also can’t help but hope that today’s female researchers get their due and never need to be postscripted into history.

  • 2017 Fourth Estate 336pp £16.99hb