Flash Physics is our daily pick of the latest need-to-know developments from the global physics community selected by Physics World's team of editors and reporters
Fukushima too radioactive even for robots
Better robots are needed to investigate the Fukushima Daiichi nuclear plant after current designs failed due to high radiation levels and debris obstacles. At a recent news conference, Naohiro Masuda, president of Fukushima Daiichi decommissioning, spoke about the need for more creative robot design after repeated failures. In 2011, multiple reactors at the Fukushima nuclear plant went into meltdown after a severe earthquake and tsunami. To safely decommission the damaged plant, its operator Tokyo Electric Power Company (TEPCO) must know exactly where the melted fuel is and the extent of structural damage to the surrounding buildings. The radiation levels, however, would kill a human within seconds, so TEPCO is reliant upon remote-controlled robotic probes. Yet early robots have come across unexpected challenges. In February, TEPCO sent two robots to investigate the damaged reactor inside Unit 2 of the facility. The first was a cleaner robot designed to clear the way for the second, “scorpion” robot, which would assess damage and measure radiation and temperature. Unfortunately, the cleaning robot had to be withdrawn after only two hours of its planned 10-hour mission because its cameras began to malfunction due to the high radiation levels. The scorpion-shaped robot then had to be abandoned before reaching its target location because it began to have difficulty moving and became stuck while crawling over rubble. It is unclear whether this failure was due to debris or radiation levels. The Associated Press reports that Masuda called for more creative thinking when developing future robots. “We should think out of the box so we can examine the bottom of the core and how melted fuel debris spread out,” explains Masuda. The data collected and the robot failures suggest that the clean-up and decommissioning of Fukushima will be more challenging than previously predicted. It is thought that the process will take decades to complete.
IBM to build 50-qubit quantum computers
IBM says it will build a new generation of universal quantum computers that will be available for commercial use via the IBM Cloud platform. The IBM Q systems will have about 50 quantum bits (qubits). This will make them 10 times larger than IBM’s five-qubit quantum computer, which is already available on IBM Cloud and has attracted about 40,000 users. According to the US-based firm, increasing the number of qubits will be one step towards boosting the “quantum volume” – or computing power – of its quantum systems. Efforts will also focus on improving connectivity between qubits, boosting the reliability of quantum-logic operations and creating systems that are capable of highly parallel computations. The universal nature of the proposed computer should make it useful for solving a range of problems that are too complex for conventional computers. These include calculating the properties of molecules used to create new drugs and materials, finding optimal processes for supply chains and logistics, and creating artificial-intelligence systems. “To create knowledge from much greater depths of complexity, we need a quantum computer,” says Tom Rosamilia of IBM Systems. “We envision IBM Q systems working in concert with our portfolio of classical high-performance systems to address problems that are currently unsolvable, but hold tremendous untapped value.”
Very few photons needed to see through opaque material
An optical image of a region within a nearly opaque medium can be obtained using a surprisingly small number of photons. That is the conclusion of Mooseok Jang and Changhuei Yang at Caltech in the US and Ivo Vellekoop of the University of Twente in the Netherlands, who have shown that an established technique called optical phase conjugation (OPC) can be extended for use when very little light makes it out of the medium. OPC involves illuminating a point of interest in a nearly opaque medium with light beams from opposite directions. The first beam provides information about how light is scattered in the medium. This information is then used to make the second beam undergo the exact reverse of that scattering as it travels to the point of interest – illuminating that point. By scanning the beams around the sample, an image is built up. In very opaque materials, however, scientists had thought that not enough light emerges to provide useful information about the scattering. Applying the technique to a sample of highly opaque opal, the trio showed that it worked when as few as 1000 photons were detected emerging from the sample – far fewer than the number of pixels in the detector used to measure the signal. The discovery is reported in Physical Review Letters and could be used to improve the optical imaging of opaque biological tissues such as brain matter.
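The refocusing idea behind OPC can be illustrated numerically. In the toy model below – a sketch of the general principle, not the experiment reported in the paper – multiple scattering in the medium is represented by a hypothetical random transmission matrix, the field emerging from a point of interest is recorded, and its phase conjugate is sent back through the medium (back-propagation modelled, by reciprocity, with the transposed matrix). The returned light concentrates on the original point, with an intensity enhancement roughly equal to the number of modes.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64  # number of optical modes (hypothetical, chosen for illustration)

# A random complex Gaussian matrix stands in for multiple scattering.
T = (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))) / np.sqrt(2 * n)

# Step 1: light leaving the point of interest scatters through the
# medium; record the speckle field that emerges on the far side.
point = np.zeros(n, dtype=complex)
point[n // 2] = 1.0  # the point we want to illuminate
emerging = T @ point

# Step 2: send the phase conjugate of the recorded field back through
# the medium (reverse propagation is T transposed, by reciprocity).
returned = T.T @ np.conj(emerging)

# The conjugate beam retraces the scattering paths and refocuses:
intensity = np.abs(returned) ** 2
focus = int(np.argmax(intensity))
enhancement = intensity[focus] / np.mean(np.delete(intensity, focus))
```

Running this, `focus` coincides with the chosen point and `enhancement` is of order `n`, which is why useful refocusing survives even when only a weak signal is measured – the relevant comparison is the recorded information against the number of modes, not the raw photon count.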