
Flash Physics: Too radioactive even for robots, IBM to build 50-qubit computer, seeing through opaque materials

Fukushima too radioactive even for robots

Better robots are needed for investigating the Fukushima Daiichi nuclear plant after current designs failed due to radiation levels and debris obstacles. At a recent news conference, Naohiro Masuda, president of decommissioning at Fukushima Daiichi, spoke about the need for more creative robot design after repeated failures. In 2011, multiple reactors at the Fukushima nuclear plant went into meltdown after a severe earthquake and tsunami. To safely decommission the damaged plant, its operator Tokyo Electric Power Company (TEPCO) must know exactly where the melted fuel is and the extent of structural damage to the surrounding buildings. The radiation levels, however, would kill a human within seconds, so TEPCO is reliant upon remote-controlled robotic probes. Yet early robots have come across unexpected challenges. In February, TEPCO sent two robots to investigate the damaged reactor inside Unit 2 of the facility. The first was a cleaner robot designed to clear the way for the second, a “scorpion” robot that would assess damage and measure radiation and temperature. Unfortunately, the cleaning robot had to be withdrawn after only two hours of its planned 10-hour mission because its cameras began to malfunction due to high radiation levels. The scorpion-shaped robot then had to be abandoned before reaching its target location because it had difficulty moving and became stuck while crawling over rubble. It is unclear whether this failure was due to debris or radiation levels. The Associated Press reports that Masuda called for more creative thinking when developing future robots. “We should think out of the box so we can examine the bottom of the core and how melted fuel debris spread out,” explains Masuda. The data collected and the robot failures imply that the clean-up and decommissioning of Fukushima will be more challenging than previously predicted. It is thought that the process will take decades to complete.

IBM to build 50-qubit quantum computers

Photograph of IBM researchers working on quantum technologies

IBM says it will build a new generation of universal quantum computers that will be available for commercial use via the IBM Cloud platform. The IBM Q systems will have about 50 quantum bits (qubits). This will make them 10 times larger than IBM’s five-qubit quantum computer, which is already available on IBM Cloud and has attracted about 40,000 users. According to the US-based firm, increasing the number of qubits will be one step towards boosting the “quantum volume” – or computing power – of its quantum systems. Efforts will also focus on improving connectivity between qubits, boosting the reliability of quantum-logic operations and creating systems that are capable of highly parallel computations. The universal nature of the proposed computer should make it useful for solving a range of problems that are too complex for conventional computers. These include calculating the properties of molecules used to create new drugs and materials, finding optimal processes for supply chains and logistics and creating artificial intelligence systems. “To create knowledge from much greater depths of complexity, we need a quantum computer,” says Tom Rosamilia of IBM Systems. “We envision IBM Q systems working in concert with our portfolio of classical high-performance systems to address problems that are currently unsolvable, but hold tremendous untapped value.”

Very few photons needed to see through opaque material

An optical image of a region within a nearly opaque medium can be obtained using a surprisingly small number of photons. That is the conclusion of Mooseok Jang and Changhuei Yang at Caltech in the US and Ivo Vellekoop of the University of Twente in the Netherlands, who have shown that an established technique called optical phase conjugation (OPC) can be extended for use when very little light makes it out of the medium. OPC involves illuminating a point of interest in a nearly opaque medium with light beams from opposite directions. The first beam provides information about how light is scattered in the medium. This information is then used to cause the second beam to undergo the exact reverse scattering as it travels to the point of interest – illuminating that point. By scanning the beams around the sample, an image is built up. However, in very opaque materials scientists had thought that not enough light emerges to provide useful information about the scattering. Applying the technique to a sample of highly opaque opal, the trio showed that it worked when as few as 1000 photons were detected emerging from the sample – which is far fewer than the number of pixels in the detector used to measure the signal. The discovery is reported in Physical Review Letters and could be used to improve the optical imaging of opaque biological tissues such as brain matter.

 

  • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on a new quasiparticle called the angulon.

Anger over Trump’s travel curbs

The scientific community has reacted with dismay at US President Donald Trump’s executive order to temporarily ban travellers from seven predominantly Islamic countries from entering the US. The concerns, which have been shared among academic institutions, hi-tech firms and scientific societies worldwide, remain even after a court of appeals upheld a federal judge’s decision to block the ban. As Physics World went to press, the Trump administration insisted that it will find a way to overturn those judgments and reinstate a ban similar – if not identical – to the original.

The executive order closed US entry to immigrants from Iran, Iraq, Libya, Somalia, Sudan and Yemen for 90 days, suspended the entry of refugees from anywhere in the world for 120 days and permanently banned Syrian citizens from entry. President Trump claimed that the order protected the country from incursion by “radical Islamic terrorists”. His opponents, meanwhile, assert that no individuals from the seven named nations have killed any Americans in terrorist attacks over the past four decades.

Issued a week after the new president’s inauguration, the ban prevented several scientists, doctors and members of technology companies from visiting or even returning to the US. Because the original executive order lacked detail, customs officers in some US airports initially refused entry to individuals from the targeted nations who possessed “green cards” that allow them to remain in the country with all the privileges of US citizens except the right to vote. “There really are science issues at stake, because you can’t do good science if you don’t have freedom of collaboration and a diversity of perspectives in research,” says Rush Holt, the physicist and former Congressman who is chief executive of the American Association for the Advancement of Science (AAAS).

One prominent scientist to be affected by Trump’s ban is Iranian researcher Samira Asgari, who was initially prevented from flying to the US to take up a two-year contract at Harvard University to study the effect of the human genome on susceptibility to tuberculosis. She was later allowed to make the trip. Others simply decided not to fulfil their travel plans. Mohamed Hassan, a dual citizen of Sudan and Italy who is interim director of the World Academy of Sciences, cancelled a visit to the AAAS annual meeting in Boston last month. So did Sudanese electronic engineer Rania Abdelhameed Mokhtar, despite being scheduled to collect an award, which was given in absentia.

“The executive order signed by the US president is profoundly disruptive. It will immediately have a negative effect on scientific research and the essential scientific processes of exchanging information and ideas,” Hassan told Research Fortnight. “In the long run the order will erode trust in the US and undermine the sense that the US is a reliable partner for scientific research. This is very disturbing both for scientists from the developing world and for our colleagues in North America and Europe.”

Scientific societies outside the US have lamented the barring of individuals from specific countries from entering the US. The International Astronomical Union (IAU) noted in a statement issued before the ban was temporarily overturned that it “considers that mobility restrictions can have a direct impact on the astronomical communities of countries at both ends of the ban, as well as astronomy as a whole”. IAU general secretary Piero Benvenuti told Physics World that the IAU will “always denounce the possible damage that such decisions may cause to science”, adding that the IAU has no plans to stop activities in the US because of the ban. “If anything, we will try to facilitate the participation in our activities, scientific and educational, by any world citizen,” he adds. However, G2 Massive Stars – one of the IAU’s 35 commissions, which plan activities in various sub-fields of astronomy – announced in early February that it will not hold any meetings in the US while any such ban remains in place.

National scientific organizations have also added their concerns. A statement by a group of German scientific societies describes the order as “a sweeping discrimination of human beings based on their ethnicity and consequently also an act of aggression against the fundamental values of science”. And according to the UK’s Royal Astronomical Society, “The ban hinders researchers from sharing their work with their peers, a fundamental tenet of scientific endeavour. The restrictions threaten to damage collaboration between the US and nations around the world.”

Scientific progress depends on openness, transparency, and the free flow of ideas and people

Within the US, a group of 171 scientific, engineering and educational societies, national associations and universities – among them the American Physical Society and the American Institute of Physics – issued a statement before the ban was overturned urging the administration to rescind the order. The statement expresses deep concern that it will “have a negative impact on the ability of scientists and engineers in industry and academia” to travel freely.

“Scientific progress depends on openness, transparency, and the free flow of ideas and people, and these principles have helped the US attract and richly benefit from international scientific talent,” the statement says. “The executive order will discourage many of the best and brightest international students, scholars, engineers and scientists from studying and working, attending academic and scientific conferences, or seeking to build new businesses in the US. Implementation of this policy will compromise the United States’ ability to attract international scientific talent and maintain scientific and economic leadership.”

Slow out of the box

The travel ban is not the only issue that has concerned scientists as they come to terms with a new approach to business at the White House. With the Senate having approved the administration’s nominees, government efforts to counter global warming appear all but certain to be reduced, although perhaps more slowly than some administration advocates have suggested. Scott Pruitt, the lawyer whom the Senate approved as the new head of the Environmental Protection Agency (EPA) late last month, reportedly plans to cut the agency’s staff, close some of its regional offices, repeal recent regulations on battling climate change and weaken its regulations on environmental matters. Intriguingly, a predecessor of Pruitt’s, Anne Gorsuch, carried out a similar downsizing agenda as Ronald Reagan’s first EPA administrator in the early 1980s. The Trump administration has nominated her son, Neil Gorsuch, as a Supreme Court justice.

Rumours also emerged last month that the Princeton University physicist William Happer could become Trump’s scientific adviser. In the past Happer has said that researchers working on climate change resemble a “glassy-eyed chanting cult”, adding that climate change was a “so-called” science. The physicist apparently met Trump in January to discuss taking the role and has since said that if he was offered the job, he would accept. Another individual tipped as possible science adviser – computer scientist David Gelernter from Yale University – has said that he is “unconvinced” by evidence of human contribution to climate change. He has also criticized the “intellectualism” of modern academia.

The House of Representatives Science, Space and Technology Committee has also resumed efforts it began two years ago to restrict the ways in which government agencies use scientific results in their development of policies. Lamar Smith, the Texas Republican who heads the committee, has continued to question the findings of government scientists. In a recent hearing, Smith called on the AAAS publication Science Advances to retract a paper on “data biases in global warming” by National Oceanic and Atmospheric Administration (NOAA) researchers. He says that a former NOAA scientist had questioned the team’s scientific integrity. In testimony, Rush Holt stated that the objection was to the way the data was archived rather than the paper’s findings, which have been replicated. “Policy-makers should never dictate the conclusions of a scientific study and they should base policy on a review of relevant research and the provision of relevant statutes,” Holt told the committee. “In other words, the integrity of the process must be upheld.”

Trump is also likely to relax long-held policies on the process of approving new pharmaceutical drugs. The administration has raised the possibility of a presidential commission to study the safety of vaccines, including a purported connection between vaccines and autism that the medical profession has discredited. “What will become of the major government agencies of scientific research, the National Institutes of Health and the National Science Foundation?” asks Bard College president Leon Botstein in a comment in the New York Times. “Will their research agendas be manipulated to fit Trump’s view of reality? Will there be a continuing erosion of support for basic research as opposed to research that contributes to some commercial product?”

An open letter issued by 39 European science organizations warns against the executive order and also indications that the US government is paying too much attention to views not based on fact and sound scientific evidence, especially in areas such as climate science and the safety of vaccines. It also highlights the danger of the administration stopping scientists from speaking to the media without first seeking permission. “All of these are at odds with the principles of transparency, open communication, [and the] mobility of scholars and scientists, which are vital to scientific progress and to the benefit our societies, economies and cultures derive from it,” the statement says. “Restrictions on research, scientists and research centres in inconvenient areas have no place in science…Our colleagues working in the US will suffer, the United States and US citizens will pay a price, as will Europe and Europeans, and countries and people all across the globe.”

One US group has gone beyond just issuing statements. On 22 April thousands of scientists are expected to participate in a March for Science in Washington DC as well as in several other cities around the world. The event is intended to “champion publicly funded and publicly communicated science as a pillar of human freedom and prosperity”. While much of the scientific community approves of the event, some members worry that it could be counterproductive, by politicizing science. “[The march] will make my job more difficult and increase polarization,” Robert Young, a Western Carolina University geologist who studies the effect of rising sea levels on coastlines, wrote in the New York Times.

Holt, meanwhile, says that the US scientific community is anxious that science will suffer from government neglect. “The administration and transition team have been silent about scientific issues. Many scientists think that it’s been an ominous silence,” he says. “There is no science adviser appointed and essentially no new appointments of trained scientists and engineers to any positions. If this is to be a science-friendly administration they’re pretty slow out of the box.”

Exoplanet christening, physics on the catwalk, ultrasonic wine

Whiskey aging barrels

By Sarah Tesh

Last week NASA announced the major discovery of seven Earth-like exoplanets orbiting a nearby dwarf star. The news that at least three of the seven could possibly support life was reported far and wide. Yet, as with most astronomical finds, the planets do not have the most imaginative names. Simply named after the star they orbit, they are currently called TRAPPIST-1b to TRAPPIST-1h. So NASA took to Twitter with the request #7NamesFor7NewPlanets and the public delivered. Suggestions have included the names of lost astronauts, famous composers and ancient deities. But naturally, there were also some less sensible contributions, including the seven dwarfs, many Harry Potter references, dedications to Pluto and, obviously, Planet McPlanetface 1 to 7.


Complex ultrasound signals created by light

A new way of creating specially shaped pulses of ultrasound using light and a 3D printer has been unveiled by Michael Brown and colleagues at University College London. The pulses, which are created using the photoacoustic effect in a 3D-printed material, could be tailored to perform a range of tasks including manipulating biological cells and delivering drugs to specific parts of the body.

Renowned for its ability to let us see inside the body, ultrasound refers to acoustic waves at frequencies above about 20 kHz. Such waves can also be used for medical treatment, industrial product imaging and chemistry. Researchers have also recently developed acoustic tractor beams and tweezers for the non-contact manipulation of small objects.

Ultrasound is usually generated by applying an electrical signal to a piezoelectric transducer. Complicated ultrasound signals can be created using arrays of transducers, but the ability to create certain very precise waveforms would require many tiny components – making such ultrasound generators expensive.
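The cost argument can be made concrete with the standard delay-and-sum rule for steering a beam with a linear array, in which each transducer fires slightly later than its neighbour so that the individual wavefronts add up along a chosen direction. This is a generic textbook sketch rather than anything from the UCL work, and the element count, spacing and steering angle below are illustrative assumptions:

```python
import math

def steering_delays(n_elements, pitch, angle_deg, c=1480.0):
    """Per-element firing delays (in seconds) that steer a linear
    transducer array's beam by angle_deg off-axis.

    pitch is the element spacing in metres and c the speed of sound
    in water (m/s). Element n fires n * pitch * sin(theta) / c later
    than element 0 so that all wavefronts align along the beam.
    """
    theta = math.radians(angle_deg)
    raw = [n * pitch * math.sin(theta) / c for n in range(n_elements)]
    t0 = min(raw)  # shift so the earliest element fires at t = 0
    return [t - t0 for t in raw]

# Eight elements at 0.3 mm pitch, steered 20 degrees off-axis
for n, d in enumerate(steering_delays(8, 0.3e-3, 20.0)):
    print(f"element {n}: fire after {d * 1e9:.1f} ns")
```

Every extra degree of waveform control multiplies the number of such independently timed channels, which is why fully electronic generators of arbitrary ultrasound fields quickly become expensive.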

Heating up

Brown and colleagues’ technique to create specific ultrasound waveforms involves using a light signal, such as a laser pulse, to heat part of an object so it locally expands. This triggers vibrations that travel out from the surface of the material as sound waves. The precise nature of the ultrasound wave is defined by the 3D shape of the surface of the photoacoustic material.

To create surfaces that output specific ultrasound signals, the team developed an algorithm that calculates the 3D surface profile required to create a desired ultrasound signal. “Our algorithm allows for precise control of the intensity of sound at different locations and the time at which the sound arrives, making it quick and easy to design surfaces or ‘lenses’ for a desired application,” says Brown.
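The core geometric idea behind such a surface can be sketched in a few lines: an element of the surface raised by a height h sits closer to the target plane, so its contribution arrives roughly h/c earlier. The team's actual algorithm optimizes the full profile, accounting for diffraction and propagation inside the printed material; the function below is only a simplified one-way-geometry illustration, and all values in it are made up:

```python
def surface_heights(desired_delays, c_water=1480.0):
    """Convert per-element desired arrival-time delays (s) into
    surface-height offsets (m), under a crude one-way approximation:
    raising an element by h makes its sound arrive h / c_water sooner.

    Offsetting against the largest delay keeps every height
    non-negative, i.e. printable as a relief on the flat base.
    """
    t_max = max(desired_delays)
    return [(t_max - t) * c_water for t in desired_delays]

# The earliest element (100 ns ahead of the latest) is raised ~148 um
print(surface_heights([0.0, 50e-9, 100e-9]))
```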

At the heart of their ultrasound generator is a 3D-printed cylinder of transparent material. One end of the cylinder is flat, while the other has a 3D pattern picked specifically to create an ultrasound wave in the shape of the numeral “7” (see figure). The patterned surface is then coated with black paint, which makes it a good absorber of light.

To create an ultrasound pulse, a laser pulse is fired at the flat end of the cylinder. The light travels through the cylinder and strikes the paint at the opposite end, where ultrasound waves are emitted from the surface of the cylinder into a tank of water containing an ultrasound detector.

Using this set-up, Brown and colleagues were able to create and detect ultrasound waves shaped like a “7”. But as well as creating pulses with complicated shapes, the technique could also be used to create intense ultrasound pulses. “One useful feature of the photoacoustic effect is that the initial shape of the sound that’s generated is determined by where the light is absorbed,” explains Brown. “This can be used to create tightly focused intense points of sound just by depositing an optical absorber on a concave surface, which acts like a lens.”

Tiny bubbles

One possible application of the ultrasound generator is to create acoustic tweezers that can manipulate living cells and other delicate objects without any physical contact. Another possible use, according to Brown, is the targeted delivery of drugs. This would involve encapsulating the drugs in tiny bubbles that burst only when exposed to an ultrasound signal at, say, the site of a tumour.

The technique could also be used to correct for distortions to ultrasound signals as they travel through tissue or other materials. If the structure of the material is known beforehand, an ultrasound generator that compensates for the distortions can be made. So far, limitations in laser power have restricted the team to using pulsed lasers, but Brown says that the team is also interested in generating ultrasound using continuous-wave optical signals.

The new technique is described in Applied Physics Letters.

Flash Physics: Intricate crystals made with DNA, laser harnesses Josephson effect, borophene has Dirac cones

Intricate gold crystals made with DNA

The most complex synthetic nanoparticle crystal ever made has been created using DNA and gold. Researchers have used gold nanoparticles and DNA “smart glue” to assemble intricate clathrate-crystal structures. There are many aspects of nature that scientists struggle to emulate in a laboratory. This includes a huge array of complex crystal structures such as clathrates. These cage-like lattices comprise polyhedral clusters and pores that can house small molecules. Such structures are useful for environmental applications where pollutants can be held within the pores. Recreating clathrates using nanoparticles is difficult because it relies upon precise nanoparticle shapes and dimensions. Yet a group of experimentalists and computer simulators has been able to both make and model the exact structure and assembly process when using gold nanoparticles and DNA glue. For more than a decade, Chad Mirkin from Northwestern University in the US and colleagues have pioneered the application of synthetic DNA strands to bond nanoparticles into programmable designs. For the study reported in Science, the researchers used gold-nanoparticle bipyramids. These look like two flattened tetrahedrons joined at their bases. The dimensions and angles of the nanoparticles created were ideal for forming clathrates, but the group found that if the DNA was too short, the bipyramids would assemble into disordered structures. Electron microscopy was used to image the resulting crystals and once clathrates were formed, the simulation team led by Sharon Glotzer of the University of Michigan in the US was able to accurately identify and model the crystal assembly. The resulting nanoparticle clathrates possess the cavities seen in natural systems, meaning they could be useful for environmental and medical diagnostic applications. Furthermore, as the dimensions of the nanoparticles are similar to visible-light wavelengths, the crystals may have potential in light-controlling devices such as new lenses, lasers and cloaking materials.

Microwave laser harnesses Josephson effect

Diagram showing how the new microwave laser works

A new type of microwave laser that is based on a Josephson junction has been unveiled by Leo Kouwenhoven and colleagues at the Delft University of Technology in the Netherlands. The device is similar to components already being used to develop superconductor-based quantum logic devices and could therefore play an important role in future quantum computers. A Josephson junction is a tiny gap in a superconducting circuit across which electron pairs can tunnel. When a voltage is applied across the barrier, the tunnelling pair can emit a microwave photon at a specific wavelength. The team’s Josephson junction is contained within a microwave cavity that is tuned to maximize the emission of microwave photons. These photons then bounce back and forth in the cavity, which stimulates the emission of even more photons. The cavity fills up with a coherent field of microwaves and some of the radiation can be removed as a coherent microwave laser beam. Unlike other microwave lasers, the device operates at extremely cold temperatures – making it suitable for use in superconductor-based quantum computers. The new laser is described in Science.
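The “specific wavelength” follows from the standard AC Josephson relation, a textbook result rather than a detail of the Delft paper: a Cooper pair of charge 2e tunnelling across a junction biased at voltage V releases its energy as a single photon, so

```latex
hf = 2eV \quad\Longrightarrow\quad f = \frac{2e}{h}\,V \approx 483.6\ \mathrm{GHz}\ \text{per mV of bias}
```

A bias of a few tens of microvolts therefore puts the emission in the few-gigahertz band used by superconducting qubits.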

Borophene has Dirac cones after all

Illustration of borophene's Dirac cones

Borophene – a layer of boron just one atom thick – is more like graphene than previously thought, according to calculations and experiments done by an international team led by Iwao Matsuda at the University of Tokyo. Also just one atom thick, graphene has a unique set of electronic properties that arise from the fact that its atoms are arranged in a hexagonal lattice. Its valence and conduction bands are described by “Dirac cones” that touch, which means that graphene electrons behave as Dirac fermions that can travel at very high speeds through the material. In contrast, borophene can assume several different structures that do not have a perfect hexagonal lattice. As a result, some physicists had thought that borophene would not harbour Dirac fermions. Matsuda and colleagues calculated the electronic properties of a specific type of borophene – called β12 – that forms when boron atoms are deposited on a silver substrate. This work suggested that this borophene should have Dirac cones and this was then backed up by angle-resolved photoemission spectroscopy experiments, which revealed that the borophene cones are actually split into pairs (see figure). Writing in Physical Review Letters, Matsuda and colleagues suggest that like graphene, borophene could be used to create high-speed electronic devices.

 

  • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on a new source of ultrasound.

Insects inspire water-repellent material

Hydrophilic and hydrophobic cones

Surfaces inspired by cicada wings and mosquito eyes have anti-fogging properties. A team of scientists from France and the US has created surfaces that mimic these natural systems in an attempt to replicate their anti-fogging mechanisms.

When water comes into contact with a surface, its behaviour can vary from beading into tiny droplets to spreading evenly over the surface. For water-loving, hydrophilic materials, the droplet will spread out and maximize contact with the surface. In contrast, for water-repelling, hydrophobic materials, the water forms beads. This behaviour is characterized by the angle between the droplet’s edge and the surface directly underneath – known as the contact angle. When the contact angle is more than 90°, the surface is considered hydrophobic. If it is higher than 160°, the material is superhydrophobic and strongly repels water, even causing it to bounce off. While superhydrophobicity is dependent on surface chemistry, the texture of the surface also plays a vital role.
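The classification above amounts to simple thresholds on the contact angle. As a minimal sketch using the 90° and 160° boundaries quoted here (the function name is an illustrative choice):

```python
def wettability(contact_angle_deg):
    """Classify a surface by its water contact angle.

    Follows the thresholds in the text: up to 90 degrees water spreads
    (hydrophilic), above 90 degrees it beads (hydrophobic), and above
    160 degrees it is strongly repelled (superhydrophobic).
    """
    if contact_angle_deg > 160:
        return "superhydrophobic"
    if contact_angle_deg > 90:
        return "hydrophobic"
    return "hydrophilic"

print(wettability(30), wettability(120), wettability(165))
# → hydrophilic hydrophobic superhydrophobic
```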

“Many textured materials can repel water, with millimetre-size water drops bouncing off their surfaces,” explains team member Charles Black of Brookhaven National Laboratory in the US. In these cases, the water droplets are too big to penetrate the surface features and remain on top of the texture, floating on a cushion of air. Therefore, the surface remains dry. “But many of these surfaces fail when exposed to foggy or humid conditions,” Black continues. Under these conditions, moisture condenses as microdroplets that are comparable in size to the surface features. The fine droplets nucleate and grow among the features. This water remains stuck as dew accumulates, and any larger droplets on the surface also become attached. The result is a wet material.

Survival features

While man-made superhydrophobic materials often suffer this limitation, a solution can be found in nature – specifically, on the surfaces of insects. Bugs have tiny nanoscale features on their surfaces. These give the insects properties that are essential for survival. For example, moth eyes have low light reflectance, springtail carapaces repel oil and palmtree bugs are ultra-adhesive. Meanwhile, anti-reflective mosquito eyes and cicada wings display anti-fogging properties and self-cleaning capability.

The team from École Polytechnique, ESPCI Paris Tech and the Thales Group in France alongside Brookhaven scientists used cicada wings as inspiration. These have a textured structure comprising tiny nanocones. Water droplets can spontaneously jump off these surfaces because of the efficient conversion of surface energy to kinetic energy when two droplets combine. Timothée Mouterde and team investigated the underlying anti-fogging mechanisms of man-made structures mimicking the cicada wings and the effect of feature size and shape.

To make the textured samples, the researchers used a method initially developed at Brookhaven. The technique uses block copolymers (chains of two linked, distinct molecules) that can self-assemble into ordered patterns with nanoscale dimensions. The researchers covered etched silicon with these hydrophobic molecules to create the desired textured surface.

Drip feeding

Foggy atmospheres were created by drip feeding hot water onto the cold surface. This causes some of the water to evaporate and then condense as microdroplets within the underlying texture. By tilting the substrate, the scientists measured the mass of the droplet when it became detached at the bottom of the slope. The mass reveals how much of the condensed microdroplets adhered to the larger droplet – a measure of the water repellency. Using an optical microscope, the team was also able to watch the droplet formation.

The study, presented in Nature Materials, reports two key results – smaller features mean greater anti-fogging ability and cones are better than cylinders. “We show that water condensing on cold nanocones gets expelled with an unprecedented efficiency,” explains David Quéré of École Polytechnique and ESPCI. The scientists observed that all textures are initially covered in many microdroplets, but over time the cylinders become covered in water. Meanwhile the cones “dry” themselves in a manner similar to the cicada wings. As the droplets are so lightly adhered to the surface, when two join together, they gain enough energy to jump off the surface. “Droplets as small as 1.5 μm were repelled from the surface,” continues Quéré, “while it is said generally in the literature that this should not occur for drops smaller than 10 μm.”

Hot water

The next stage for the research is to look at the velocity at which droplets depart from the surface. “It is a very interesting question at the crossroads of fluid mechanics and interface physics,” Quéré says. “We are also studying the impact of hot water on our textures to see whether water repellency is maintained when water is hot.”

Anti-fogging materials could be useful for vehicle windshields and mirrors, corrosion-prone materials and other fog-prone surfaces. Whether these man-made structures can achieve the performance of natural materials remains to be seen, however, as mosquito eyes and cicada wings are too small for similar experiments to be performed on them.

Introducing Physics World Discovery

Image of the first five Physics World Discovery ebooks

By Matin Durrani

What better way to celebrate World Book Day than by checking out Physics World’s new series of free-to-read, short-form ebooks? Entitled Physics World Discovery, they are short introductions to some of the hottest topics in physics, written by leading voices in the physics community.

Available online here, these short-form ebooks share all the attributes of feature articles in Physics World magazine – they are well written, accessible, timely and authoritative. But as ebooks, they allow authors to go into more detail than a standard Physics World feature and to include plenty of graphs, diagrams and pictures too.

Being short, each title is an ideal starting point for physicists at all stages of their careers to get quickly up to speed with an evolving field of physics.

We’ve published five Physics World Discovery texts so far, with more in the pipeline. You can read them in PDF, ePUB or Kindle format, making them perfect for those wanting intellectual stimulation on a train or plane journey.

(more…)

Flash Physics: Lensing study backs cold dark matter, cancer detected by Raman spectroscopy, black-hole burps

New gravitational lensing study backs cold dark matter

A new high-resolution map of dark matter – an invisible substance that appears to have a profound gravitational effect on galaxies and other large-scale structures in the cosmos – has been produced by an international team of astronomers using the Hubble Space Telescope. The map focuses on three galaxy clusters that act as cosmic telescopes by magnifying images of the more distant universe through gravitational lensing. The degree to which this magnification occurs gives an extremely precise measurement of the dark matter within the clusters. “We have mapped all of the clumps of dark matter that the data permit us to detect, and have produced the most detailed topological map of the dark-matter landscape to date,” explains Priyamvada Natarajan of Yale University in the US, who led the team. An important feature of the map is that it is in close agreement with computer simulations of how cold dark matter (CDM) – a popular theoretical description of dark matter – is expected to be distributed within the galaxy clusters. The map is described in the Monthly Notices of the Royal Astronomical Society.

Hard-to-detect skin cancer imaged using Raman spectroscopy

A hard-to-detect pigment in melanoma skin cancer can be imaged using a laser-based technique. A team at Massachusetts General Hospital’s Wellman Center for Photomedicine in the US has used a form of Raman spectroscopy to identify the pheomelanin molecule. Melanoma is the deadliest form of skin cancer, and people with fair skin have a higher probability of developing a hard-to-detect variation of the disease called amelanotic melanoma. This is linked to the fact that fair skin contains a higher concentration of pheomelanin – a pigment, or melanin, within the skin. While the black-brown pigment found in most melanomas is easily observed, pheomelanin is essentially invisible. To detect the pigment, the team, led by Conor Evans, turned to a form of Raman spectroscopy called coherent anti-Stokes Raman scattering (CARS) microscopy. Raman spectroscopy is a well-known technique that uses lasers to measure the unique chemical vibrations within molecules and hence identify them. CARS microscopy, meanwhile, is a high-resolution imaging technique: it focuses two lasers on a sample and “tunes” their energy difference to specific molecular vibrations, allowing a high-resolution image to be generated. Using CARS, the researchers successfully imaged the usually invisible pheomelanin by looking for its unique chemical structure. The method could be incorporated into a brand-new tool for early cancer diagnosis. The work will be presented at the OSA Biophotonics Congress: Optics in the Life Sciences meeting on 2–5 April in San Diego, US. It has also been described in Scientific Reports.

Black hole caught “burping” by space telescopes

Artist's impression of a supermassive black hole

A connection between the sudden outflows of gas from a supermassive black hole and X-ray bursts has been made by astronomers using two space telescopes – NASA’s NuSTAR and the European Space Agency’s XMM-Newton. Gas outflows are common features of supermassive black holes, which sit at the centre of large galaxies. These objects ingest vast amounts of material and the dynamics of this accretion process can lead to the ejection of gas in a burp-like ultrafast wind. The team trained the instruments on an outflow from the black hole at the centre of galaxy IRAS 13224-3809 and observed that the temperature of an outflow was changing much more rapidly than had previously been seen in other events – on a timescale of less than 1 h. According to team member Erin Kara of the University of Maryland, these fluctuations provide important clues about where the outflow was created. “Because we saw such rapid variability in the winds, we know that the emission is coming from very close to the black hole itself, and because we observed that the wind was also changing on rapid timescales, it must also be coming from very close to the black hole.” The observations were made over several days and revealed that the temperature fluctuations were a response to changes in the intensity of X-rays emitted by the black hole. This information could provide important clues about where the X-rays and outflows are produced. The research is described in Nature.

 

  • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read our extensive news story on mimicking the anti-fogging properties of insect wings.

A quantum boost for machine learning

The blue room is dense with concentration. At a table in the centre sit two opponents staring at a board of black and white stones that are placed in silent turns. Finally, the player on the right resigns. It is 9-dan Go master Lee Sedol. On the left sits software developer Aja Huang, who gets his instructions from AlphaGo, a computer program developed by Google’s DeepMind project. It is March 2016 and AlphaGo has just beaten one of the world’s best players in four out of five matches of the popular board game Go.

The success of AlphaGo has been widely perceived as a milestone in artificial intelligence research. Go is a much more complex game than chess, at which a computer first beat a world champion in 1997. In Go, exploring all strategies by brute force – in which all possible moves are evaluated to decide the best move to make – is not an option: there are more possible board positions than there are atoms in the universe, and the 2200 computer processors delivering the power for the game are lightweight compared with today’s supercomputers. The secret of AlphaGo’s success lies instead in a strict training regime with a special sparring partner, namely the software itself. To become a worthy training partner, AlphaGo’s “deep neural networks” – computer algorithms inspired by our brain’s architecture – initially learnt how to master the game by consulting a database of around 30 million professional moves.

Machine learning can be understood as the data side of artificial intelligence, where one often deals with large amounts of information, or “big data”. Similarly to human learning, machine learning involves feeding very many instances of a problem into a computer that has been programmed to use patterns in the data to solve a previously unseen instance. For example, a computer could be fed a lot of images of a particular person, and then given a new image before being asked whether it is the same person as before. The crux is that we do not know how we link the visual stimulus to the concept of recognizing a person in the image. In other words, there is no simple correlation between the pixel at, say, position (1334, 192) being red and the picture containing our friend Sivu that we could programme the computer to exploit. Machine-learning research therefore has to come up with generic ways to find complicated patterns in data, and as Facebook’s automatic tagging function shows, this is done with ever-increasing success.

What does this have to do with physics, and more precisely, with quantum physics? The computer that executed the AlphaGo software is based on classical physics. Information is processed by microelectronic circuits that manipulate signals of zeroes and ones, and these circuits follow the laws of classical electrodynamics. But for two decades, physicists have been rethinking the concept of a computer right from scratch. What if we built a computer based on the laws of quantum theory? Would such a device fundamentally change the limits of what is computable? The answer, it turns out, is not so easy to find, although we seem to understand the question a lot better today. Despite the fact that we haven’t yet been able to build quantum computers large enough to solve realistic problems, several powerful languages have been developed to formulate and study “quantum algorithms”, the software for quantum computers, from a theoretical perspective. This research effort has now left the borders of purely academic interest and is pursued in the labs of large IT companies such as Google and IBM. As its realization seems more and more certain, the pressure to find “killer apps” for quantum computing grows. This is where machine learning comes in.

Since we know the language quantum computers will speak one day, we can already start thinking about what impact they will have on the frontiers of machine learning. This approach is called quantum-enhanced machine learning and is part of the larger research field of quantum machine learning (which also investigates the opposite approach of using traditional machine learning to analyse data from quantum experiments). To get an idea of quantum-enhanced machine learning, one first has to understand how machine learning works, and the “black art” involved in using it to greatest advantage.

Machine learning

A quick way to access the concept of machine learning is through data fitting, an exercise that most scientists have come across during their undergraduate studies and forms one of many methods used to recover patterns or trends in data. Imagine you run an experiment that generates data points (x, y) for setting a control parameter x and measuring the result y. As a physicist you would like to obtain a model that can explain these measurement results. In other words, you want to find a relationship y = f(x) that, up to some degree of noise, produced the data. This can be done by feeding the experimental data into a computer and using numerical software to find the best fit of a parameter-dependent function f(x) (figure 1). Mathematically speaking, this is an optimization problem.
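To make this concrete, the fitting step can be sketched in a few lines of Python. The data here are invented (a noisy sine with made-up parameters), and scipy's curve_fit stands in for the numerical software that solves the optimization problem:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical experiment: control parameter x, noisy measurement y
rng = np.random.default_rng(0)
x = np.linspace(0, 6, 20)
y = 2.0 * np.sin(x) + 0.5 + rng.normal(0, 0.1, x.size)

# Parameter-dependent model f(x); the fit finds the best a and b
def f(x, a, b):
    return a * np.sin(x) + b

popt, pcov = curve_fit(f, x, y)
a, b = popt  # best-fit parameters, close to the "true" 2.0 and 0.5
```

The best-fit model y = f(x) can then predict y for control parameters that were never actually measured.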

Figure 1: Eight data points (black crosses) plotted as y against x. Model 1 (blue dashed line) passes through every point and has several maxima and minima; model 2 (red dashed line) passes between the points with just one maximum and two minima. For an unseen data point (circled black cross), model 1’s prediction sits far above the true value, while model 2’s prediction sits close to it.

Solving the optimization problem is half the job done in machine learning: one can now use the best model function to predict the measurement outcomes for new control parameters without performing the actual experiment. Of course, in most machine-learning applications one is less interested in physical experiments than in tasks that traditionally require human experience. For example, x could represent a set of macroeconomic variables and y stand for the oil price development in the next week. If we derive a model y = f(x) from the data, we can use it to predict tomorrow’s oil price. Alternatively, the inputs could be the pixels of images and the output a yes-or-no answer to whether your friend Sivu is in the picture, in which case the machine-learning software is used for image recognition. One thing most applications have in common is that they allow us to answer questions about complex relationships where the answers are worth a lot of money.

So far this sounds pretty straightforward. All you have to do is solve an optimization problem to find the best predictive model. But machine learning usually deals with very difficult types of optimization problems that are avoided by even the more adventurous mathematicians. Think, for example, of an optimization landscape like the Himalayan mountain range, where you want to find the deepest valley on foot and without a map (figure 2). The real “black art” lies in the subtleties of formulating the optimization problem. In the data-fitting case of figure 1, for example, if we define the best model to be the one where the prediction f(x) is closest to the real value y for all data points, the more flexible model function (blue) would win, because the model function goes through all data points. But when we introduce a new data point, it is clear that the “rougher fit” (red) gives a much better prediction. For our hiker, the optimization landscape that corresponds to the more flexible model is not very helpful, because even if they find the deepest valley it does not necessarily lead to a good model. A useful optimization landscape leads to optimal models that generalize from the underlying pattern to unseen data, even if they do not predict the seen data perfectly well. Formulating effective optimization problems requires a lot of intuition and hands-on experience, which are key to harnessing the power of machine learning.
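The trade-off between fitting the seen data and generalizing to unseen data is easy to reproduce numerically. This Python sketch, using invented data in the spirit of figure 1, compares a polynomial flexible enough to thread every training point with a smoother, lower-degree fit:

```python
import numpy as np

# Invented noisy samples of an underlying smooth trend
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 8)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, x.size)

# "Model 1": degree-7 polynomial, flexible enough to thread every point
flexible = np.polyfit(x, y, deg=7)
# "Model 2": degree-3 polynomial, a rougher fit through the scatter
rough = np.polyfit(x, y, deg=3)

# The flexible model reproduces the training data (almost) exactly...
train_err_flexible = np.max(np.abs(np.polyval(flexible, x) - y))
# ...while the rough model leaves a visible residual
train_err_rough = np.max(np.abs(np.polyval(rough, x) - y))
```

The flexible model fits the training data essentially perfectly, yet between the data points it can swing wildly – exactly the behaviour that makes its optimization landscape unhelpful for prediction.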

Figure 2: An undulating optimization landscape resembling a mountain range, with a dashed path meandering from a flag at the foot of the hills to a flag at the top.

A quantum boost

The most common approach to enhancing machine learning with quantum computing is to outsource the hard optimization problems to a quantum computer, either the small-scale devices available in the labs today, or the full-blown versions that we hope to have access to in the future. An entire toolbox of algorithms for this purpose has been developed by the quantum-information-processing community. The continuing challenge is to combine, adapt and extend these tools with the aim of improving number crunching on conventional computers. Three different approaches to solving optimization problems using quantum computers are explained in more detail in the box opposite. Although we know by now that most hard computational problems tend to remain hard even if quantum effects are exploited, modest speed-ups can still prove crucial for today’s big-data applications.

There is one important caveat in outsourcing, however. For this approach to work, one needs to encode the data that shape the optimization landscape into the quantum system. One way to do this is to represent a black-and-white image with a lattice of spins pointing up (white pixel) or down (black pixel), as in figure 3. Using quantum superposition – where a physical system is in two or more quantum states at the same time – allows us to store many images in a single quantum system. Other encoding strategies are more involved, but all of them require in practical terms that we prepare the initial state of the quantum system (or, in some cases, the interactions) to represent the values of the dataset. For an experimental physicist, preparing a microscopic physical system so that it encodes billions of pixels from an image dataset, to a high precision, must sound like a nightmare. Data encoding is therefore a crucial bottleneck of quantum algorithms for machine learning and a challenge with no direct equivalent in classical computing.
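As a toy illustration of the spin encoding described above (in the spirit of figure 3), the following Python snippet maps a hand-drawn 5 × 5 black-and-white image to a lattice of up and down spins:

```python
import numpy as np

# Toy 5x5 black-and-white image of the letter "A" (1 = dark pixel)
image = np.array([
    [0, 1, 1, 1, 0],
    [1, 0, 0, 0, 1],
    [1, 1, 1, 1, 1],
    [1, 0, 0, 0, 1],
    [1, 0, 0, 0, 1],
])

# Map pixels to spins: white pixel -> spin up (+1), black -> spin down (-1)
spins = np.where(image == 0, +1, -1)
```

A quantum version would go further, storing many such spin configurations in superposition within a single system – which is precisely the step that makes data encoding hard in practice.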

Figure 3: A five-by-five grid of pixels spelling out the letter A, with an upward arrow (spin up) in each pale-grey square and a downward arrow (spin down) in each dark-grey square.

Towards a quantum AlphaGo?

Without question, there is a long road to walk before future generations of AlphaGo and its companions can run on quantum hardware. First we need robust large-scale quantum computers to run the software developed. We need to design an interface between classical data and quantum systems in order to encode the problems in these devices. We also need better quantum tools for optimization, especially when the landscapes are complex.

More than anything, we need to learn the “black art” of machine learning from those who have been practising it for decades. Instead of merely outsourcing optimization tasks formulated for classical computers, we should begin to formulate problems for quantum computing right from the start. An early generation of quantum devices is waiting in the labs for practical implementations. The question is, what type of optimization problems do these devices allow us to solve, and can the answer to this question be used to define new machine-learning methods? Could there be specific physics-research problems that quantum-enhanced machine learning could tackle? Can we use genuine “quantum models” for these tasks? And can the way we think in quantum computing give rise to innovation for conventional strategies in machine learning?

In summary, the emerging discipline of quantum-enhanced machine learning has to be relocated from the playgrounds of quantum computing and must become a truly interdisciplinary project. This requires a fair bit of communication and translation effort. However, the languages might be less far apart than we expect: both quantum theory and machine learning deal with the statistics of observations. Maybe we do not need to take the detour via digital bits of zeros and ones in the first place. All in all, we do not know yet if some decades into the future, a quantum computer will calculate AlphaGo’s decisions. But asking the question gives us a lot to think about.

Approaches to quantum-enhanced machine learning

Quantum search
In the mid-1990s, computer scientist Lov Grover showed that a future quantum computer can search an unsorted database – such as telephone numbers in a phone directory – faster than classical computers can. This method can be adapted to find the k “closest” entries, or the k phone numbers that have the most digits in common with a given number. Finding closest data points to a new input is an important task in machine learning, for example in a method called “k-nearest neighbour” that chooses the new y-value according to the neighbours’ y-values. Maybe the most straightforward approach to machine learning with quantum computers is therefore to reformulate search problems in the language of quantum computing and apply Grover’s well-studied algorithm.
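For reference, here is a purely classical k-nearest-neighbour sketch in Python, with invented one-dimensional data; the distance search in the middle is the step a Grover-type quantum search would accelerate:

```python
import numpy as np

def k_nearest_neighbour(x_train, y_train, x_new, k=3):
    """Predict y for x_new as the mean y of the k closest training points."""
    distances = np.abs(x_train - x_new)   # distance to every stored entry
    nearest = np.argsort(distances)[:k]   # indices of the k closest entries
    return y_train[nearest].mean()

# Invented training data
x_train = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y_train = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
prediction = k_nearest_neighbour(x_train, y_train, 2.1, k=3)
```

Here the three training points closest to x = 2.1 have y-values 4, 6 and 2, so the prediction is their mean, 4.0.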

Linear algebra
A small quantum system can have a large number, N, of different configurations or measurement outcomes. Quantum theory describes the probability that one of these possible outcomes is measured, and it is largely based on the mathematical language of linear algebra. In 2009 Aram Harrow, Avinatan Hassidim and Seth Lloyd of the Massachusetts Institute of Technology proposed a quantum algorithm that uses these properties in a clever way to solve systems of linear equations, which can, under very specific circumstances, be done incredibly fast. Likewise, many machine-learning optimization problems can be mathematically formulated as linear systems of equations in which the number of unknowns depends on the size of the dataset. For big-data applications, numerical solutions can consume a lot of computational resources, making them excellent candidates for the quantum linear-systems algorithm.
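To see the sort of linear system that arises, consider regularized least-squares ("ridge") regression, where the optimal weights w solve (XᵀX + λI)w = Xᵀy. This classical Python sketch, with synthetic data, is what a quantum linear-systems algorithm would aim to speed up for very large datasets:

```python
import numpy as np

# Synthetic dataset: 100 data points with 3 features each
rng = np.random.default_rng(2)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])       # hidden "true" weights
y = X @ w_true + rng.normal(0, 0.01, 100) # measurements with small noise

# Ridge regression reduces to one linear system for the weights w
lam = 1e-3
A = X.T @ X + lam * np.eye(3)
b = X.T @ y
w = np.linalg.solve(A, b)                 # classical direct solve
```

The classical solve scales badly as the number of unknowns grows with the dataset, which is exactly the regime the quantum algorithm targets.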

Finding the ground state
A third type of optimization problem minimizes an overall energy function to find an optimal sequence of bits. A popular numerical method for such “combinatorial optimization” is “simulated annealing”, which mimics the thermodynamic process in which a system cools down until it reaches its ground state. In the quantum equivalent, “quantum annealing”, an energy landscape is similarly minimized; however, the algorithm can additionally use quantum-mechanical tunnelling to pass through energy barriers – rather than having to “climb” over them – meaning it may find the lowest valley more quickly. Quantum-annealing devices already exist, such as that announced in 2010 by the Canadian firm D-Wave as the world’s first commercial quantum computer. These devices have been shown to solve (admittedly rather exotic) problems in which they find a global minimum 100 million times faster than a conventional computer running a simulated-annealing algorithm.
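The classical half of this comparison – simulated annealing – can be sketched in plain Python. The energy function here is invented for illustration (it counts bits disagreeing with a hidden target string); the Metropolis accept/reject rule and the geometric cooling schedule are the parts that quantum annealing supplements with tunnelling:

```python
import math
import random

def simulated_annealing(energy, n_bits, steps=5000, t0=2.0):
    """Minimize an energy function over bit strings by simulated annealing."""
    rng = random.Random(0)
    state = [rng.randint(0, 1) for _ in range(n_bits)]
    e = energy(state)
    for step in range(steps):
        t = t0 * 0.999 ** step                 # geometric cooling schedule
        i = rng.randrange(n_bits)
        state[i] ^= 1                          # propose a single-bit flip
        e_new = energy(state)
        # Metropolis rule: always accept downhill, sometimes accept uphill
        if e_new <= e or rng.random() < math.exp(-(e_new - e) / t):
            e = e_new
        else:
            state[i] ^= 1                      # reject: flip the bit back
    return state, e

# Toy energy: number of bits disagreeing with a hidden target string
target = [1, 0, 1, 1, 0, 0, 1, 0]
energy = lambda s: sum(a != b for a, b in zip(s, target))
best, e_best = simulated_annealing(energy, len(target))
```

On this simple landscape the walk reliably cools into the global minimum; real combinatorial problems have many local minima, which is where tunnelling through barriers could help.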

Coding and computing: the March 2017 issue of Physics World is out now

By Louise Mayor

Physics these days wouldn’t succeed without software. Whether those lines of code are used to control new apparatus, make sense of fresh experimental data or simulate physical phenomena based on the latest theories, software is essential for understanding the world. The latest issue of Physics World, which is now live in the Physics World app for mobile and desktop, shines a light on how some physicists are exploiting software in new ways, while others are reinventing the hardware of a computer itself – binary isn’t the only way to go.

Sometimes there are so many data that software collaboration is the best way forward. In the issue, physicists Martin White and Pat Scott describe how the GAMBIT Collaboration is creating a new, open-source software tool that can test how theories of dark matter stack up against the wealth of data from experiments such as direct searches for dark matter and the Large Hadron Collider. And with software development being so essential to physics research, data scientist Arfon Smith argues that we need better ways of recognizing those who contribute to this largely unrewarded activity. Columnist Robert Crease explores the other extreme: whether software can be patented.

Meanwhile, in an emerging field straddling both coding and computing, researcher Maria Schuld explains how quantum computers could enhance an already powerful software approach known as machine learning. (You can also read her article on physicsworld.com here.) Further into the realm of raw computing, physicist Jessamyn Fairfield describes the quest to develop a new kind of hardware that is physically, and functionally, similar to the computers inside our very own heads. As for how our brains process information, don’t miss a glimpse into the mind of physicist Jess Wade who has created a doodle based on the work Fairfield describes.

(more…)

Copyright © 2026 by IOP Publishing Ltd and individual contributors