A new solar-powered system that can extract water from air in arid regions has been unveiled by researchers in the US and Saudi Arabia. Led by Omar Yaghi of the University of California, Berkeley, and Evelyn Wang of the Massachusetts Institute of Technology, the team created the device using a metal-organic framework (MOF). Powered by heat from sunlight, the device can harvest 2.8 litres of liquid water per kilogram of MOF per day at relative humidity levels of 20–30%, which are common in such regions. MOFs combine metals with organic molecules to create rigid, porous structures that are ideal for storing gases and liquids. The system comprises a kilogram of compressed MOF crystals that sits below a solar absorber and above a condenser plate. Ambient air diffuses through the porous MOF, where water molecules preferentially attach to the interior surfaces. Sunlight heats the MOF and drives the bound water toward the condenser, where the vapour condenses and drips into a collector. “This work offers a new way to harvest water from air that does not require high relative humidity conditions and is much more energy efficient than other existing technologies,” says Wang. Yaghi adds: “There is a lot of potential for scaling up the amount of water that is being harvested. It is just a matter of further engineering now.” The system is described in Science.
Material glows in response to stress
Forced to shine: stress-sensing molecules cause polymer to glow. (Courtesy: OIST)
A material that repeatedly lights up in response to mechanical forces has been developed by researchers at Okinawa Institute of Science and Technology Graduate University in Japan. To create the material, Georgy Filonenko and Julia Khusnutdinova incorporated stress-sensing molecules called photoluminescent mechanophores into the common polymer, polyurethane. While mechanophores are not new, they are typically one-use only. They emit light when a strong force breaks a specific chemical bond between atoms or pulls apart two molecular patterns. The radical change in structure causes a shift in the wavelength of light emitted (the glow), but it is difficult to return the molecule to its original, “off” state. Therefore, Filonenko and Khusnutdinova developed a molecule that relies upon dynamic rather than structural changes. Their phosphorescent copper complexes move rapidly when the host polymer is in a relaxed state, and the motion suppresses light emission. Yet when a mechanical force is applied, the movement of the polymer chains, and hence the mechanophores, slows, and consequently the complexes are able to luminesce. The light emitted is visible to the naked eye when the material is bathed in UV light and becomes brighter with increasing force. However, unlike previous stress-reacting materials, Filonenko and Khusnutdinova’s can revert to its original, non-luminescent state as no chemical bonds have been broken. The new mechanophores are described in Advanced Materials and could be used to assess stress and dynamics in soft materials.
Physicist bags prestigious economics award
The physicist-turned-economist Dave Donaldson has won the John Bates Clark Medal of the American Economic Association (AEA). The medal is given to an “economist under the age of 40 who is judged to have made the most significant contribution to economic thought and knowledge”. Described by the AEA as “the most exciting economist in the area of empirical trade,” Donaldson has studied topics as diverse as the economic impact of railways in 19th-century India and the consequences of climate change on agricultural markets. Donaldson, 38, is associate professor of economics at Stanford University in California and is a dual citizen of Canada and the UK. He studied physics at the University of Oxford before doing an MSc and PhD in economics at the London School of Economics.
You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on rainfall during tropical storms.
Cancer is a complex disease to treat, and yet the operating principle of many current treatments is simply to kill healthy cells a little slower than cancerous ones. In response, scientists at the University of Electronic Science and Technology of China have developed a sophisticated nanoparticle-based treatment. Their theranostic nanoparticles carry an anti-cancer drug cargo and showcase multiple cutting-edge nanomedicine technologies to enhance the drug’s efficacy, including selective drug delivery, photoactive agents and even signal-jamming genetic material.
The researchers have designed each individual nanoparticle to be a toolbox for cancer therapy, able to passively and actively target tumours (Biomater. Sci. 2017 Advance Article). The nanoparticles can act as contrast agents for both magnetic resonance imaging and X-ray imaging, they deliver a concentrated dose of anti-cancer drugs, and they also thwart the cancer’s attempts at developing resistance to the drug. They even deliver a photosensitizer that can be used to specifically weaken cancerous tissue by photodynamic treatment.
Yiyao Liu and colleagues demonstrated the efficacy of their nanodevices in vitro and in vivo on a range of cell lines and on tumours in living mice. They found that their nanoparticle drug-delivery technique effectively stopped tumour growth, whereas tumours in mice treated with the drug alone grew at a rate half that of a control group that had not been treated.
The nanoparticles are complex, many-layered spheres. Protected by a jacket of natural polymer is a nugget of silica, holey like a sponge and soaked in doxorubicin, a common anti-cancer drug, along with the photosensitizer. The polymer jacket is pH sensitive so that it falls off in the acidic microenvironment of the tumour, only then releasing the active cargo.
Doxorubicin has two flaws. Firstly, it works by slotting in between DNA base pairs to stop the replication process needed for cells to divide. This kills cells that need to duplicate quickly, such as cancerous cells, but harms many healthy cell types too. Secondly, it triggers the body’s natural defences, causing cells to overexpress P-glycoprotein, a molecular pump that removes toxic molecules like doxorubicin from cells, making the drug less and less effective against the cancer.
The scientists at the University of Electronic Science and Technology of China countered both of these flaws. Healthy cell exposure is reduced by the polymer jacket, which makes sure the drug is only released under the conditions expected in a tumour. The jacket itself is covered in signal-jamming RNA to inhibit the expression of the cellular pumps, keeping the doxorubicin trapped inside the cells to allow the drug to work for longer. This impressive display of multifunctional nanoparticle design and synthesis demonstrates the power of nanomedicine for producing synergistic effects, offering new solutions to previously insurmountable problems.
What’s your overall feeling about the Donald Trump administration?
I wish I could answer your question. It’s mostly uncertainty in the science community. There have been troubling signs for years about the role of science in society and policy-making – whether opinion is crowding out evidence in public debate. And that kind of concern has risen to a very high level right now: whether people understand evidence and facts or even want to distinguish between opinion and scientifically validated evidence.
What do you think Trump thinks about science?
He has not said very much about science at all. During the campaign he touched on a few politically difficult issues, questioning the safety of vaccination and challenging climate science, [which] he went so far as to say was a Chinese hoax. So that has made people very suspicious of whether he appreciates science. In past years, when scientists said they were uneasy about things, they meant funding for research. But now it’s a deeper unease about whether people understand the value, nature and importance of science. That’s the overall concern right now.
How do you think funding will fare under Trump?
Congress still [controls] that – not the president – and the Congress has not changed as dramatically as the presidential administration has changed. So I think we will be – and Congress has been pretty clear about this – on a very austere budget for all non-defence discretionary activities. Even though in recent times science and research have done a little bit better than their share, there is no guarantee that will continue. And even if it does, it will be tightly constrained.
What would you say if you were in a one-to-one meeting with the president?
[I would say] we would like him to get a science adviser for himself and for each of his agencies and make sure all agencies and cabinet departments are well salted with people who understand science. I would say to the president: “You’ll have a crisis next month and the month after that. I can’t tell you what it will be. It might be an oil-well blowout. It might be an emerging disease. It might be a radiation leak. And you don’t want to get up to speed then.” I would [also] try to dispel the idea, which I believe he has, that a science adviser is simply a “plant” of the science community who is there to represent the interests of people who wear lab coats. Rather, that person is his best defence against erroneous policies that just won’t hold up under the tests of reality.
What do you think about Rick Perry – the new head of the Department of Energy (DOE)?
The DOE is one of the largest funders of physics research in the US. A lot of people think it only deals with oil wells and wind turbines. And yes it does a little bit of that, but it also supports high-energy physics and even some biological sciences. When he was running for president [in 2012], Perry said he wanted to abolish the DOE, but in his confirmation testimony before the senate he said he’s learnt a lot about the good work done in and by the DOE. So that’s a promising sign. Of course, he does not control his own budget – that is imposed on each cabinet department. Nevertheless, it shows that some of the doubts that scientists had about the incoming administration are subject to correction. And also that maybe Perry now has a much better understanding of what it takes to maintain a good high-energy physics programme or plasma-science programme and so forth.
So might some areas of physics benefit from the new administration?
The other thing I would say [to Trump] if I received the requested audience – we’ve asked but so far heard nothing – is to get him to understand that investment in infrastructure, something he says he wants to do, can and should also include investment in human infrastructure as well as research and development. When President Barack Obama came into office and the economy was failing badly, he proposed an economic stimulus plan in which science was initially nowhere. After we made the case to Obama, there were tens of billions of dollars in new money for R&D. So maybe this president can be convinced that infrastructure also includes laboratories and equipment and grad students.
What do you think about Trump’s attempt to ban people from certain nations from travelling to the US?
What scientists are most concerned about is not funding levels, though they should be concerned about them, but the climate in which science can be practised. The ban on travel for people from designated countries [Iran, Libya, Somalia, Sudan, Syria and Yemen] raises humanitarian, political, diplomatic and security concerns. Constraints on communication and on travel by scientists – limiting the freedom to choose collaborations and build teams that are diverse – [can hamper] the progress of science. The immigration ban has shaken scientists’ confidence that science will be allowed to thrive and has caused a very large reaction in the science community. Other constraints on communication by government scientists – or threatened constraints – have added to [those concerns].
What might those constraints be?
When the transition team came to the DOE, someone in that team asked for information about any DOE employee who had visited a climate-change conference. The only reason they would ask for that would be to give them a hard time. So that seemed to be a constraint on the practice of good science.
What future do you foresee for the national labs under Trump?
The new president appears to be influenced by reports that have come out of one of the conservative think tanks [the Heritage Foundation] that have included calls for the privatization of the national labs. So it’s possible that this administration will make some big changes, or try to make some big changes. But I don’t think Congress would implement a call for turning national labs into corporate subsidiaries. It wouldn’t work since a lot of their work is defence-related and classified and quite varied. I’m not sure a corporation would want to undertake that – they’d have trouble seeing how they’d make a profit from running a national lab.
Who do you think’s in the running for presidential science adviser?
There have been rumours that someone who has actually met President Trump is [77-year-old Princeton University atomic and optical physicist] Will Happer. He is a well-known physicist and he has idiosyncratic ideas about climate change.
Would you be happy with him in that role?
Happer is a friend. He is a very smart person. I’d be surprised if he took it. I haven’t talked to him about it. I think he is so far from the mainstream on climate change, which is one of the big issues in science and policy-making today, that it would be difficult for him. It would be difficult for anyone who had ideas that far out of the mainstream.
Would you fancy the job?
There’s no reasonable chance that I would be selected. But if the president asks you to do a job for the sake of the country, most people would say yes. It would be hard to turn down. But I left Congress two years ago and then got this terrific job heading up the AAAS. It is important and satisfying work and because of my political record in Congress, this president would not choose me and probably would not want me at his right hand on science issues. The AAAS has offered the president help vetting candidates to look at their science accomplishments and stature. But for an adviser, he needs to choose someone who’s relatively close to his way of thinking and who he could trust.
So how do you see science panning out under Trump?
There are potential big changes but a lot of uncertainty. Meanwhile, science and physics in America are going along quite well. There is great research coming out of the universities and laboratories. In some ways we’ve never seen a better time for science. But in other ways, I guess like Charles Dickens said, we live in the best of times and the worst of times.
Quantum interference involving three photons has been measured by two independent teams of physicists. Seeing the effect requires the ability to deliver three indistinguishable photons to the same place at the same time and also to ensure that much more common single-photon and two-photon interference effects are eliminated from the measurements. As well as providing deep insights into the fundamentals of quantum mechanics, three-photon interference could also be used in quantum cryptography and quantum simulators.
When a stream of single photons travels through a double slit it will build up an interference pattern on a detector behind the slits – an example of single-photon interference. An example of two-photon interference is the Hong–Ou–Mandel (HOM) effect, which involves two photons entering a beam splitter with two exit ports. If the photons are identical and arrive at the same time, they will interfere and will always exit the same port of the beam splitter. If these two criteria are not met, there is a 50% chance of each photon exiting either port.
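This photon “bunching” follows from a short, textbook calculation, sketched here for orientation (it is not taken from either paper). Writing creation operators for the two input ports and the two output ports of a 50:50 beam splitter:

```latex
% Standard Hong–Ou–Mandel calculation for a 50:50 beam splitter
\hat a^\dagger \rightarrow \tfrac{1}{\sqrt{2}}\,(\hat c^\dagger + \hat d^\dagger),
\qquad
\hat b^\dagger \rightarrow \tfrac{1}{\sqrt{2}}\,(\hat c^\dagger - \hat d^\dagger)
\;\;\Longrightarrow\;\;
\hat a^\dagger \hat b^\dagger \lvert 0\rangle \;\rightarrow\;
\tfrac{1}{2}\bigl(\hat c^{\dagger\,2} - \hat d^{\dagger\,2}\bigr)\lvert 0\rangle .
```

The cross term $\hat c^\dagger \hat d^\dagger$ cancels, so two identical photons arriving simultaneously always leave through the same output port; any distinguishability restores the cancelled term and with it the coincidences.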
Trio in a tritter
Now, a three-photon version of the HOM effect has been created by a team led by Ian Walmsley of the University of Oxford in the UK. Their experiment begins with the creation of three independent photons in three different sources. These are sent to a fibre-optic interferometer called a tritter, which has three inputs and three outputs. The team looked at the probability that all three photons exited the same output port. To isolate the effects of single- and two-photon interference, they controlled something called the “triad phase” of the three photons. This is non-zero only if the photons are partially – but not fully – distinguishable. They were able to show that the probability of three photons emerging from one port varied with the triad phase, just as expected for three-photon interference. Crucially, single- and two-photon effects remained constant.
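For orientation, the triad phase in this kind of experiment is the collective phase of the pairwise overlaps between the photons’ internal states; the notation below is illustrative rather than quoted from the paper:

```latex
% Triad (collective) phase of three partially distinguishable photons
\varphi = \arg\bigl(\langle\psi_1|\psi_2\rangle\,\langle\psi_2|\psi_3\rangle\,\langle\psi_3|\psi_1\rangle\bigr)
```

Because it depends on the product of all three overlaps, it is a genuinely three-photon quantity: the magnitudes of the pairwise overlaps alone do not determine it.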
Meanwhile at the University of Waterloo in Canada, Thomas Jennewein and colleagues did their experiment using a photon source that emits three photons in an entangled quantum state. The trios are created by firing a single photon into a series of nonlinear crystals, each of which is able to convert one photon into a pair of entangled photons. Very occasionally an entangled trio emerges and is then sent into an interferometer that has two output ports. By changing the relative phases of the three photons, Jennewein’s team saw the probability of three photons emerging from one port vary as expected from three-photon interference. The probability of two photons emerging from the same port remained the same, however, suggesting that the team was observing genuine three-photon interference.
One possible application of the three-photon interference created in the experiments is three-photon sharing. This involves a secret quantum key that is shared by three parties, but can only be used by all three parties together. Three-photon interferometry could find use in quantum-sensing applications and also in a quantum-computing technique called boson sampling.
Science has taken motor racing to a whole new, extremely small level with the NanoCar Race. The competition on 28 April will see nanoscale molecular machines “speed” around a gold racetrack for 38 hours. As the tiny-molecule cars are not visible to the naked eye, the race will take place inside a scanning tunnelling microscope (STM) at the Center for the Development of Materials and Structural Studies (CEMES), part of the National Center for Scientific Research (CNRS) in France. The teams behind the NanoCars control their vehicles using electric pulses but are not allowed to push them mechanically. Details about the cars and their teams can be found on the NanoCar Race website, where you will also be able to watch the race later this month.
A group of physicists in China has taken the lead in the race to couple together increasing numbers of superconducting qubits. The researchers have shown that they can entangle 10 qubits connected to one another via a central resonator – so beating the previous record by one qubit – and say that their result paves the way to quantum simulators that can calculate the behaviour of small molecules and other quantum-mechanical systems much more efficiently than even the most powerful conventional computers.
Superconducting circuits create qubits by superimposing two electrical currents, and hold the promise of being able to fabricate many qubits on a single chip through the exploitation of silicon-based manufacturing technology. In the latest work, a multi-institutional group led by Jian-Wei Pan of the University of Science and Technology of China in Hefei built a circuit consisting of 10 qubits, each half a millimetre across and made from slivers of aluminium laid on to a sapphire substrate. The qubits, which act as non-linear LC oscillators, are arranged in a circle around a component known as a bus resonator.
Initially, the qubits are put into a superposition state of two oscillating currents with different amplitudes by supplying each of them with a very low-energy microwave pulse. To avoid interference at this stage, each qubit is set to a different oscillation frequency. However, for the qubits to interact with one another, they need to have the same frequency. This is where the bus comes in. It allows the qubits to exchange energy with one another, but does not absorb any of that energy itself.
“Magical interaction”
The end result of this process, says team member Haohua Wang of Zhejiang University, is entanglement, or, as he puts it, “some kind of magical interaction”. To establish just how entangled their qubits were, the researchers used what is known as quantum tomography to find out the probability of detecting each of the thousands of possible states that this entanglement could generate. The outcome: their measured probability distribution yielded the correct state on average about two thirds of the time. The fact that this “fidelity” was above 50%, says Wang, meant that their qubits were “entangled for sure”.
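The 50% threshold Wang mentions corresponds to the standard fidelity witness for genuine multipartite entanglement of GHZ-type states – a general criterion, sketched here, rather than the team’s specific analysis:

```latex
% Fidelity witness for genuine N-qubit GHZ-type entanglement
F = \langle \mathrm{GHZ}_N |\, \rho \,| \mathrm{GHZ}_N \rangle > \tfrac{1}{2}
\quad\Longleftrightarrow\quad
\operatorname{Tr}(W\rho) < 0,
\qquad
W = \tfrac{1}{2}\,\mathbb{1} - |\mathrm{GHZ}_N\rangle\langle \mathrm{GHZ}_N| .
```

A measured fidelity of roughly two thirds therefore sits comfortably above the bound that rules out any biseparable (not genuinely entangled) state.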
According to Shibiao Zheng of Fuzhou University, who designed the entangling protocol, the key ingredient in this set-up is the bus. This, he says, allows them to generate entanglement “very quickly”.
The previous record of nine for the number of entangled qubits in a superconducting circuit was held by John Martinis and colleagues at the University of California, Santa Barbara and Google. That group uses a different architecture for their system; rather than linking qubits via a central hub they instead lay them out in a row and connect each to its nearest neighbour. Doing so allows them to use an error-correction scheme that they developed known as surface code.
High fidelity
Error correction will be vital for the functioning of any large-scale quantum computer in order to overcome decoherence – the destruction of delicate quantum states by outside interference. Involving the addition of qubits to provide cross-checking, error correction relies on each gate operation introducing very little error. Otherwise, errors would simply spiral out of control. In 2015, Martinis and co-workers showed that superconducting quantum computers could in principle be scaled up, when they built two-qubit gates with a fidelity above that required by surface code – introducing errors less than 1% of the time.
Martinis praises Pan and colleagues for their “nicely done experiment”, in particular for their speedy entangling and “good single-qubit operation”. But it is hard to know how much of an advance they have really made, he argues, until they fully measure the fidelity of their single-qubit gates or their entangling gate. “The hard thing is to scale up with good gate fidelity,” he says.
Wang says that the Chinese collaboration is working on an error-correction scheme for their bus-centred architecture. But he argues that in addition to exceeding the error thresholds for individual gates, it is also important to demonstrate the precise operation of many highly entangled qubits. “We have a global coupling between qubits,” he says. “And that turns out to be very useful.”
Quantum simulator
Wang acknowledges that construction of a universal quantum computer – one that would perform any quantum algorithm far quicker than conventional computers could – is not realistic for the foreseeable future given the many millions of qubits such a device is likely to need. For the moment, Wang and his colleagues have a more modest aim in mind: the development of a “quantum simulator” consisting of perhaps 50 qubits, which could outperform classical computers when it comes to simulating the behaviour of small molecules and other quantum systems.
Xiaobo Zhu of the University of Science and Technology of China, who was in charge of fabricating the 10-qubit device, says that the collaboration aims to build the simulator within the next “5–10 years”, noting that this is similar to the timescale quoted by other groups, including that of Martinis. “We are trying to catch up with the best groups in the world,” he says.
A lizard can change its spots, and now scientists have shown that the process is a living version of a popular computing algorithm. The ocellated lizard begins life with brown skin that is peppered with white spots. As the creature ages, the skin transforms to a labyrinthine pattern in which each individual scale is either green or black. Now, a team of scientists in Switzerland and Russia has shown that this transformation is governed by a process that they call a “living cellular automaton”. In the field of computing, a cellular automaton consists of connected cells whose individual behaviour depends on the states of neighbouring cells. Computational cellular automata have been used in a wide range of applications from simulating biological systems to creating computer games. In this latest work, Liana Manukyan of the University of Geneva and colleagues used a high-resolution system of robotic cameras to image the skin of three male lizards as they grew to adulthood. This allowed them to watch about 1500 scales on the back of each lizard change colour over a period of four years. In many other animals, colour patterns are determined by interactions within individual biological cells; here, by contrast, the results suggest that the colour of each cell of the automaton – an individual lizard scale – is determined by the colours of its neighbours. The team says that this is the first time that a cellular automaton has been seen in a living organism. Further studies into the chemical and biological processes involved in the skin transformation could provide important insights into why patterns emerge in living systems. The study is described in Nature.
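For readers unfamiliar with the computing concept, the sketch below is a toy cellular automaton in Python. The square grid, four-cell neighbourhood and switching rule are invented for illustration only; they are not the model fitted to the lizard data in the Nature paper.

```python
import random

SIZE = 20        # toy grid of SIZE x SIZE "scales"
STEPS = 50       # number of synchronous update sweeps
P_SWITCH = 0.1   # assumed probability of switching when outnumbered (hypothetical)

def neighbours(grid, i, j):
    """States of the four nearest neighbours (edges wrap around)."""
    n = len(grid)
    return [grid[(i - 1) % n][j], grid[(i + 1) % n][j],
            grid[i][(j - 1) % n], grid[i][(j + 1) % n]]

def step(grid):
    """One update: a scale tends to switch colour if most of its neighbours differ."""
    new = [row[:] for row in grid]
    for i in range(len(grid)):
        for j in range(len(grid)):
            same = sum(1 for s in neighbours(grid, i, j) if s == grid[i][j])
            if same < 2 and random.random() < P_SWITCH:
                new[i][j] = 1 - grid[i][j]   # flip between green (1) and black (0)
    return new

# Start from a random green/black pattern and let it evolve
grid = [[random.randint(0, 1) for _ in range(SIZE)] for _ in range(SIZE)]
for _ in range(STEPS):
    grid = step(grid)
print("\n".join("".join("#" if s else "." for s in row) for row in grid))
```

The point of the sketch is simply that each cell’s next state is a function of its neighbours’ current states, which is what the team argues the lizard scales are doing.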
Graphene keeps a lid on liquid analysis
Carbon caps: graphene sheet keeps liquid samples in channels during high vacuum PEEM analysis. (Courtesy: A Strelkov / NIST)
Graphene lids allow liquids to be analysed with a technique usually limited to solid samples. Andrei Kolmakov, from the National Institute of Standards and Technology (NIST) in the US, and colleagues have developed a carbon-capped liquid sample array to extend the capability of photoemission electron microscopy (PEEM). The analysis tool involves bombarding a sample with ultraviolet light or X-rays. The photons transfer energy to electrons within the sample, allowing them to escape the material if they are near to the surface. The energy of an emitted electron is specific to the atom it came from, and therefore by using a series of electric lenses and detection systems, PEEM can create an image of the sample’s chemical make-up. Although a popular and powerful tool, PEEM is usually restricted to solid surfaces as liquid samples evaporate and create sparks under the required high vacuum. Kolmakov and team used graphene – an atomically thin sheet of carbon – to seal liquid or gas samples within a multi-channel array. Once the system is under vacuum, the samples are constrained to their channels and remain at atmospheric pressure, while photons and electrons can pass through the graphene virtually unimpeded. The simple solution, described in Nano Letters, allows researchers to analyse liquid interfaces and the surface of nanometre-scale objects immersed within liquid, potentially leading to the advancement of batteries and chemical catalysts.
Electron pulses outrun atoms
Electron pulses so short that atoms have no chance to move as the pulses pass through them have been produced by scientists working on the Pegasus radiation facility at the University of California, Los Angeles. The pulses are less than 10 fs in duration and could be used for time-resolved electron-microscopy studies, such as following the motions of individual atoms in materials undergoing structural reorganization. Jared Maxson and colleagues created the pulses using an electron source similar to those used to deliver electron bunches to synchrotron storage rings. The process begins with firing a 100 fs laser pulse at a cathode, which ejects a pulse of electrons. This relatively long pulse is then sent down a linear accelerator, where it is compressed in time to 10 fs. The final energy of the pulses – which each contain about 500,000 electrons – is several MeV, which is higher than the 50–300 keV used in conventional electron microscopes. While using higher-energy pulses does introduce several challenges, it also offers advantages that arise from relativistic effects. The research is described in Physical Review Letters.
You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on a device that entangles 10 qubits.
The environment is a shared resource, and in recent years the question of how to moderate the impacts of human activity on the air, water and land has become increasingly important. One of the characteristics that has received the most attention in this effort is the “greenness” of our energy supply. Yet the term “green” seems not to be well understood and it is not consistently applied. Some sources define a “green” energy source as one with a low environmental impact, but that merely shifts the question towards defining impact. There is currently a lot of focus on greenhouse gases (GHGs), and many definitions seem to view “low impact” and “low-GHG emissions” as synonymous. Alternative definitions, however, include other types of environmental effects, such as particulate air pollution, use of water or the generation of waste products.
This confusion and multiplicity of definitions has particular implications for considering what the term “green” might mean for the nuclear-energy industry. Nuclear power is sometimes characterized as producing no GHGs, and while this is not completely true, it is certainly accurate to say that GHG emissions from nuclear power generation are much lower than those of fossil-fuel-based power sources.
Another characteristic of interest is the sustainability of different energy sources – that is, whether the supply could be “used up” over time. Simplistically, it would appear that both fossil fuels and uranium for nuclear power, being mined from the ground, are finite and will eventually be used up, while “renewable” resources such as wind and sunlight are effectively infinite. However, this viewpoint ignores the fact that the systems required to extract energy from sunlight and wind use mined materials as well.
Likewise, some consider the fact that nuclear power produces long-lived waste as proof that it is not “green”, yet a similar argument could be applied to the wastes generated in producing the components required for solar or wind power generation. All of these arguments – plus others relating to as-yet-untapped possibilities, such as reprocessing used nuclear fuel to extract more energy from it; using a thorium fuel cycle instead of (or in addition to) uranium; or even extracting uranium from unconventional resources, such as sea water – must be taken into account when deciding whether nuclear energy counts as “green”.
Cradle to grave
Unlike fossil fuels, which are all carbon-based and thus produce carbon dioxide when burned, “burning” uranium fuel produces no GHGs. However, other parts of the nuclear fuel cycle, including mining, extraction and enrichment of uranium, do produce some GHGs. This fact has been recognized in many analyses. What is less well recognized is that generating power from wind and sunlight also produces some GHGs.
1 Start to finish The life cycle of an energy-producing system can be split into three stages: the ‘front end’ (that is, steps before energy generation), ‘operations’ and the ‘back end’ after the system has reached the end of its useful life. With fossil fuels, most of the greenhouse gases (GHGs) come from combustion during the ‘operations’ phase, while for nuclear power and renewable energy sources, most of the GHG emissions come from the front end. For renewable energy sources, component manufacturing and facility construction produce the bulk of GHGs, while for nuclear power, enrichment accounts for more than half of the total GHG emissions.
This needs to be considered in any analysis of “greenness”, but often, it is not. The US Environmental Protection Agency (EPA) is one of many organizations to make this mistake. The EPA’s definition of green power is “electricity produced from solar, wind, geothermal, biogas, eligible biomass and low-impact small hydroelectric sources”. Its website accordingly states: “Although nuclear power generation emits no greenhouse gases during power generation, it does require mining, extraction and long-term radioactive waste storage.”
The EPA does not make a similar statement for solar or wind power. Yet, although solar and wind power – like nuclear power – emit no greenhouse gases during power operation, they also require raw materials to be mined, extracted and processed. In fact, because of the diffuse nature of wind and solar energy, more materials are required to construct and manufacture the structures and components for power production, per unit of energy generated, than are required for other energy sources. In addition, wind turbines use rare-earth metals, which are in limited supply, and the manufacture of solar photovoltaics involves the use of highly toxic materials.
Instead of simply identifying specific energy systems as “green” or “not green”, a better way to assess the “greenness” of energy sources is to examine the full set of environmental impacts from “cradle to grave”. This life-cycle assessment gives a more accurate idea of the total GHGs from any source. It makes it clear that all energy sources generate some GHGs, although the parts of the fuel cycle responsible for the emissions are different for each energy source (see “Start to finish”, above).
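The arithmetic behind such a cradle-to-grave tally is simple: sum the emissions attributed to each life-cycle stage per unit of electricity delivered. The Python sketch below shows only the bookkeeping; the numbers are placeholders, not real emission factors.

```python
# Illustrative life-cycle greenhouse-gas tally (placeholder numbers only,
# NOT real emission factors; see the NREL and IPCC figures cited in the article).
stages = {
    "front end (mining, enrichment, manufacturing, construction)": 10.0,  # gCO2e/kWh (hypothetical)
    "operations (running the plant, burning fuel if any)": 2.0,           # gCO2e/kWh (hypothetical)
    "back end (decommissioning, waste management)": 1.0,                  # gCO2e/kWh (hypothetical)
}

for stage, grams in stages.items():
    print(f"{stage}: {grams:.1f} gCO2e/kWh")
print(f"life-cycle total: {sum(stages.values()):.1f} gCO2e/kWh")
```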
Another issue that must be addressed in evaluating GHG emissions is that different researchers have computed different levels of emissions. In 2013 the US National Renewable Energy Laboratory (NREL) reviewed a large number of studies and determined that the variation can be attributed to such factors as differences in the exact choice of designs assumed for each study, different operating assumptions and different evaluation methods. Despite these variations, however, all evaluations of the total life-cycle emissions show coal having significantly greater emissions than any other energy source (see “Measuring up”, below). Natural gas is the best of the fossil fuels in this respect, but it still produces significant emissions. Solar, wind, geothermal, hydro and nuclear power all generate only a small fraction of the GHGs of the larger emitters, with nuclear power usually ranking among the lowest emitters. Hence, if the measure of greenness is based on the emissions over the whole life cycle, nuclear power should be categorized as being similar to wind and solar power.
Waste not
In terms of non-GHG impacts, nuclear power and the renewable energy sources do not generate the particulates that are associated with coal burning, nor do they generate some of the other emissions associated with fossil fuels, such as methane, sulphur and nitrogen oxides, organic compounds or toxic heavy metals. The nuclear-energy industry does, however, produce nuclear waste that must be sequestered for thousands of years before the radioactivity decays. The industry is often criticized for this, and its “green” credentials questioned on this basis. However, the volumes involved are small and waste repositories are being developed to ensure that the waste remains sequestered from the environment.
It is also worth noting that other energy sources are not waste-free. Coal produces large volumes of solid waste, in addition to the airborne emissions, but even renewable-energy sources produce waste. Frequently, the toxicity of this waste is an issue, but in some cases even the sheer volume of waste creates challenges. For example, solar energy requires large arrays of solar panels, which degrade over time and must be replaced. For a small country such as Japan, the question of what to do with all these spent panels is already becoming a problem.
2 Measuring up A comparison of direct greenhouse-gas emissions (red bars) and full-life-cycle emissions (blue bars) produced by different energy technologies. Although biomass is essentially a carbon-based fuel, and thus generates large quantities of carbon dioxide when it is burned, it also absorbs carbon as it grows. After including the environmental impacts of mining, extraction and enrichment, the greenhouse-gas emissions from the full nuclear fuel cycle are revealed to be on a par with those of sources like wind and solar power, while all three are much less than those of fossil fuels. (Source: Annex III: Technology-cost and performance parameters. In Climate Change 2014: Mitigation of Climate Change. Contribution of Working Group III to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change.)
The environmental impacts of mining are often mentioned as a consequence of using fossil or nuclear fuels. However, the volume of the resource needed for nuclear power operations is much less than that needed to burn fossil fuels, and this translates to a lower level of environmental impacts from extraction. Moreover, as already noted, renewable energy sources require the mining of large volumes of structural materials, as well as the mining of toxic materials. Although the environmental impacts are a lot smaller than for coal, there are some measurable impacts associated with these mining operations.
A detailed assessment of other types of environmental impact is beyond the scope of this article. A full comparison of energy sources would include such factors as land use, which is greater for solar power and wind than for other sources (although some solar and wind sites can be used for other purposes, such as agriculture in the case of wind farms); impacts on birds and bats (a concern for wind farms); accidents (serious accidents at nuclear power plants can contaminate surrounding areas); earthquakes (from hydroelectric dams and fracking for oil and gas); and impacts associated with disposing of wastes from the various energy sources (building and operating disposal facilities, transporting wastes, and so on).
No free lunch
The truth is that no energy source is completely green. Perhaps it is more accurate to say that there are shades of green. By that measure, nuclear power is very close to the same shade of green as most renewables.
But evaluating energy supply options is an incredibly complex and multi-faceted exercise, and while greenness is important, it must be viewed in the context of other considerations. Decision-makers must weigh the GHG and other emissions from each energy source against such measures as cost; short- and long-term resource availability; the reliability of the overall energy supply 24 hours a day; and the security of the energy supply against interruptions by weather or by foreign suppliers.
These evaluations are not static. Resource discoveries and actions by foreign governments can affect supply, while technology developments and government decisions can affect a host of other factors, including costs, availability and emissions. This article has considered only current energy-supply technologies, but a breakthrough in some areas could alter the comparative data significantly. One example of such a breakthrough would be “clean coal” technology – that is, some way of extracting the GHGs from coal emissions before they are released to the environment, and sequestering them from the environment permanently and reliably. Since this type of development can’t be counted on at the current time, it is not considered in the comparative data used in this article. If it later proves to be technically feasible and economic, it could alter the discussion radically. So, too, could the development of fusion energy, or cheap energy storage for renewables.
At present, though, no single energy source excels in all measures. Each has some pros and cons, and most rational national policies seek to diversify their energy portfolios in order to take advantage of the benefits different energy-supply technologies offer and to ameliorate any disadvantages. Although nuclear power has some challenges – notably waste disposal – it appears to be one of the most attractive sources in terms of a small environmental footprint, reliable energy generation, security of the energy supply, and other important measures. Hence, the short answer to the question raised in this article’s title is that nuclear energy is indeed green, and it offers several other advantages as well. It should, therefore, be considered in this light in decision-making on future energy-supply options.
Scientists studied this 250-year-old violin in an X-ray beamline at Italy’s Elettra synchrotron. (Courtesy: Elettra)
You can hardly blame the Norwegian musician Peter Herresthal for never letting anyone else play his 250-year-old violin. He bought it at an auction, with the help of a sponsor, for an eye-watering half a million dollars. “I don’t even let it out of my sight,” Herresthal told me. So for him to hand over the violin in October 2010 to a team of scientists at the Elettra synchrotron in Trieste, Italy – and then let them enclose it inside an X-ray beamline for two days – was a stupendous display of trust in science.
Operational since 1993, the Elettra facility has 27 different X-ray beamlines that are put to different uses. One beamline, known as SYRMEP (SYnchrotron Radiation for MEdical Physics), is often used by scientists to non-invasively examine archaeological specimens and ancient artefacts, including flutes and paper-pipe organs. These studies inspired a team of Elettra scientists to image a working violin – a cheap student model – with synchrotron light.
The resulting 3D images of the instrument – along with detailed information about its composition and manufacture – emboldened the researchers to seek a historically significant violin to image. Their thoughts immediately turned to the legendary “Cannon” violin that once belonged to the virtuoso Niccolò Paganini (1782–1840). One of the most famous instruments in the world, it was named after the explosive sounds Paganini could create with it.
Paganini’s Cannon is an Italian national treasure and is displayed in an earthquake-proof case in Genoa’s Palazzo Tursi; on the rare occasions it’s moved, it is ferried around in an armoured car. When the Elettra scientists approached its curator Alberto Giordano, he unsurprisingly didn’t feel comfortable lending it. Giordano, however, thought of his friend Herresthal, who had recently bought another historic violin, made in 1753 by the Piacenza instrument builder Giovanni Battista Guadagnini.
Herresthal was using the fine sound of the antique instrument to cement his reputation as a leading exponent of contemporary violin music, and he too appeared reluctant to lend it. Although Giordano explained the procedure and assured Herresthal that the X-rays wouldn’t damage or alter the wood, Herresthal had other concerns too. “The violin’s wood is old and fragile,” he told me. “But the climate where I live in Norway is dry, and when you take the violin to a place with a different climate it affects the response.”
Herresthal eventually consented to the project but only after the Trieste scientists agreed to build a specially created environmental system that could control a sample’s temperature and humidity. And so it was that in October 2010 Herresthal and his wife put the violin in the back of their car and drove the 160 km down the autostrada from Venice, where he’d been teaching, to Trieste. Once there, a team of half a dozen scientists showed him the synchrotron and described how it works.
“They explained the technique, but I don’t know if I can explain it back!” Herresthal recalls. Staff then showed him the two-stage climate monitoring and control system they’d built. The first stage contained a humidifier and air conditioner designed to bring the climate to the target range of 55% relative humidity and 25 °C. The second consisted of a more precisely controlled environment inside a 50 × 50 × 130 cm Plexiglas box, equipped with an alarm that would ring in the control room should the humidity or temperature change.
Still, Herresthal’s first glimpse of Elettra’s experimental hall was a shock, crammed as it was with silver-foil-wrapped equipment and serviced by an overhead crane. “It looked like a scene from a James Bond movie just before everything blows up,” he says. “I remember how excited the scientists all were, but I was still worried.” Indeed, when Giordano asked Herresthal to remove his violin’s fittings to improve the imaging, Herresthal refused. He also said no when Giordano asked if the strings could be removed. “Hands off!” Herresthal recalls saying, worried about an upcoming performance in Kentucky. “If you remove the strings, the bridge can fall down.”
The scientists compromised, re-thinking their imaging plans. They mounted the violin on a support, closed off the experimental area from direct view, and began work. Two days later, they had a 3D digital image of the violin, resolving details down to 50 µm, which they presented in a way Herresthal could understand. “The images were remarkable,” he admits. “You could see all the repairs; I even saw a drop of glue.” When I asked Herresthal if the imaging changed his views about the violin, he agreed it had, saying it made him more confident. “It showed me there were no cracks and the repairs had been done well,” he says. “Sometimes you are tempted to open a violin to try to improve it; I’ll keep mine closed.” He also reckons violin imaging will change the market. “It won’t be possible to hide that a historic violin has cracks, or is heavily restored, or is made with composite materials.”
The critical point
One moment in this episode encapsulates what made the trust between Herresthal and the scientists possible, an encounter in which his respect for the scientists, and theirs for him, became most transparent. It took place after the first day, when the scientists had to stop to adjust the violin’s position. They asked Herresthal if he wanted to test it before they carried on.
Herresthal, still worried, said yes. If anything were awry he would refuse to let them continue. He picked up the violin, raised the bow, and started to play. Music filled the experimental hall – some scales, followed by snippets of a piece by a Danish composer that Herresthal was planning to perform in Kentucky.
Satisfied with the sound, Herresthal smiled. He handed them back the violin and said: “Go ahead.”
Adding nanoparticles to the surface of tumour cells could make them more susceptible to treatment with particular cancer drugs, according to new research at MIT. The study showed that nanoparticles tethered to the cell surface can increase the effects of forces exerted on the tumour cells by physiological fluids flowing within the body, which makes the cells much more vulnerable to attack by certain therapeutics.
Scientists have recently been exploring the physical properties of tumours and their microenvironment, with research showing that tumours can exploit the forces in their surroundings to enhance their survival and promote the progression of the cancer. But researchers at MIT believe that increasing the forces exerted on tumour cells can make certain therapeutics more effective in killing cells and controlling the cancer.
The study, which was led by Robert Langer, used an experimental drug known as TRAIL (TNF-related apoptosis-inducing ligand), which exerts a cytotoxic effect on tumour cells without damaging healthy cells. TRAIL also avoids many of the debilitating effects of more commonly used therapies.
The researchers found that when used to target tumour cells, TRAIL was more successful in killing cancerous cells once they had been exposed to the shear forces generated by physiological fluids such as blood flowing in the body (Nat. Commun. 8 14179). The MIT team set out to optimize the forces required for cell death, and they found that increasing the force on the cells made them more susceptible to TRAIL.
Nanoparticles strengthen forces
To produce these forces, Langer and his colleagues have pioneered the use of nanoparticles made from a biodegradable polymer known as PLGA (poly(lactic-co-glycolic acid)). When injected into the bloodstream, the nanoparticles attach to the tumour cell surface and increase the force exerted on the cell by the flow of physiological fluids.
Members of the MIT research team: Michael Mitchell, Amanda Chung, and Pedro Guimaraes
The nanoparticles are coated with PEG (polyethylene glycol), which is tagged with a specific ligand that interacts with proteins found on the surface of the tumour cells. These ligands, and therefore the nanoparticles, are attached to the tumour cell like a ball tied to a string. The shear forces from the flow of physiological fluids cause the nanoparticles to bump and bash the tumour cells, causing them to become more susceptible to the effects of the therapeutics.
The MIT team found that attaching the nanoparticles to tumour cells prior to treatment with TRAIL killed metastatic tumours and reduced the progression of tumours in mice. The researchers also found that the treatment appeared to be specific to tumour cells and that healthy cells remained unaffected. Nanoparticle size and quantity were also found to have an effect on cell death. Larger particles, around one micrometre across, and a higher quantity of particles were found to have the most positive effect.
The researchers believe that the mechanism the nanoparticles induce on the tumour cells may cause the molecules surrounding the tumour cells to compress, enabling the therapeutics to interact more efficiently with receptors on the cell surface.
Langer and his research team are now exploring the possibilities of using this technique in combination with other drugs. This is a key strategy for preventing drug resistance in cancer treatment, since tumours often regrow and become unaffected by drugs that had previously been effective. The MIT team is particularly interested in drug combinations that induce cytokines – signalling chemicals that trigger an immune response at the tumour site, helping to destroy it.