Internationally, women represent only 25% of researchers in the physical sciences, compared with 40% in the health and life sciences. This is the finding of a new study published by Elsevier that examines gender trends across 27 research areas in 12 countries during the periods 1996–2000 and 2011–2015. According to the work, while the past 20 years have seen a significant increase in the percentage of women in scientific research, with nine of the 12 countries now above 40%, the physical sciences are still dominated by men. In physics and astronomy in both the UK and US, around 22% of researchers were women in 2011–2015. Although this is an increase from 15% during 1996–2000, women are still under-represented. Portugal, meanwhile, has the best ratio of women to men in physics and astronomy, with 37% of researchers being female. The team behind the study hopes that the empirical evidence will help governments, funders and institutions worldwide as they develop gender-balance initiatives. “[The report] will enable us to explore ideas about the causes of gender inequality in science,” explains Uta Frith of University College London in the UK, who provided guidance for the report. The study, which is freely available online, used high-quality data sources including Elsevier’s SciVal and Scopus, and the World Intellectual Property Organization (WIPO).
Mystery of drying paint cracked by new calculations
Watching paint dry: diagram showing how larger particles eschew the air interface as paint dries. (Courtesy: J Zhou et al. / Phys. Rev. Lett.)
Last year, physicists made the surprising discovery that smaller particles in a layer of drying paint tend to move towards the air–paint interface, whereas larger particles move towards the surface being painted. This upended conventional wisdom, which held that smaller particles (which experience more random motion than larger particles) are more likely to diffuse away from the air interface, driven by the increase in particle concentration near that interface caused by evaporation. Now, Jiajia Zhou, Ying Jiang and Masao Doi at Beihang University in Beijing have come up with an explanation for why particles stratify in the opposite way. Writing in Physical Review Letters, they describe a model system that contains particles of two different sizes. The system is governed by the standard diffusion equation together with an interaction between particles of different sizes. Their calculations suggest that the large particles do not tend to move towards the air interface because their size makes it difficult for them to push their way through the region of high particle concentration. This restricted mobility also features in a popular explanation of the “Brazil-nut effect”, whereby larger nuts in a shaken tin of mixed nuts tend to congregate at the top of the tin. The research could lead to the development of new techniques for creating layered structures – and better paint.
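The random motion referred to above can be made quantitative with the textbook Stokes–Einstein relation (a standard result, not a formula quoted from the new paper) for the diffusion coefficient of a spherical particle of radius r in a solvent of viscosity η at temperature T:

```latex
D = \frac{k_{\mathrm{B}} T}{6 \pi \eta r}
```

Because D scales as 1/r, smaller particles were expected to respond more readily to the concentration gradient set up by evaporation and diffuse away from the interface, which is why the observed small-on-top stratification came as a surprise.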
Gravitational-wave pioneer Ronald Drever dies
Gravitational wave pioneer Ronald Drever has died. (Courtesy: American Physical Society)
The Scottish physicist Ronald Drever, a key figure behind the direct detection of gravitational waves, has died at the age of 85. Drever was born in 1931 in Bishopton, Scotland, and after completing a BSc in physics at the University of Glasgow, he gained a PhD from the same institution in 1958. Drever continued to work at Glasgow, setting up a research group on gravitational-wave physics and building a prototype detector. In 1979 Drever moved to the California Institute of Technology, where he worked on a gravitational-wave programme with the theorist Kip Thorne. Together with Rainer Weiss of the Massachusetts Institute of Technology, Drever and Thorne co-founded the Laser Interferometer Gravitational-wave Observatory (LIGO), which is located in Hanford, Washington and Livingston, Louisiana. Drever retired in 2002 and his death comes just over a year after LIGO announced the first direct detection of gravitational waves.
You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on how a time crystal has been created in the lab.
I was born in 1971, by which time astronaut John Glenn had orbited the Earth and Neil Armstrong had walked on the Moon. My parents were witness to these monumental achievements, but sadly they knew nothing of a team of phenomenal black women who quietly played a pivotal part in making these significant moments possible. My family didn’t know their amazing story or how it would relate to me when I started dreaming about becoming an astronaut. As an African American woman, a physicist and a current employee of the National Aeronautics and Space Administration (NASA), I was both excited and moved by Margot Lee Shetterly’s book, Hidden Figures: the Untold Story of the African American Women Who Helped Win the Space Race.
I was eager to delve into the untold true story of the African American female mathematicians who came to work for NASA’s predecessor, the National Advisory Committee for Aeronautics, at the Langley Field campus in Hampton, Virginia, amid the labour shortages of the Second World War. Part of the segregated West Area Computers division, these human “coloured computers”, who had previously worked as underpaid maths teachers in segregated public schools, stayed on at Langley after the war ended. They became a crucial part of America’s race into space during the Cold War, calculating the flight paths that would send Armstrong to the Moon.
Before I had even known of the book, in July 2016 I was given the opportunity to watch the trailer for the upcoming film Hidden Figures, based on Shetterly’s book. I recall having goose bumps down my arms and back while watching the excerpts, alongside cast members Janelle Monáe and Aldis Hodge, together with other NASA employees, including astronaut Victor Glover. I felt an overwhelming connection to the women I watched on-screen and I knew it was my duty to read the book and discover more about their story that was omitted from the history of space that I had been told.
I was ready to learn about the sacrifices made by these women who played a key role in integrating NASA and providing a pathway for me to follow. The book mainly outlines the contributions of four women: Dorothy Vaughan, Mary Jackson, Katherine Johnson and Christine Darden. We learn about their personal lives, their careers and their contributions to NASA. There are two passages in particular that really resonated with me, and I have a feeling that many other African Americans in science, technology, engineering and mathematics will agree. The first was the bit that mentioned the “West Computers” having to “prove themselves equal or better, having internalized the Negro theorem of needing to be twice as good to get half as far”. Also, the fact that “not everyone could take the long hours and high stakes of working at Langley, but most of the women in West Area Computers felt that if they didn’t stand up to the pressure, they’d forfeit their opportunity and maybe opportunity for the women who would come after them” really drew me into parts of the story. I personally identified the most with Johnson who, like me, has three children. As the first black graduate from the University of Alabama with a concentration in physics, I also connected with Jackson who was NASA’s first black female engineer.
Although I was fascinated by the story being told, I unfortunately found the book somewhat difficult to read. The depth with which Shetterly chronicles the women’s lives clearly shows her deep personal connection to the story, as well as her own history with NASA – her father was a research scientist at Langley. But at the same time, the book was written in a more distant way than I was expecting – it often pans out to address the wider historical context of the time instead of staying with the women’s own stories. The prologue, in which the author describes her personal experience of the subject and gives a short account of her view of the story and her dedication to unveiling its history, is beautifully written and was one of my favourite parts of the book.
Since the release of the film I have watched it eight times, and for three of those I had the honour of viewing it with Johnson’s granddaughter, Katherine Michelle Sanders. Each time she was present, Sanders provided little quips on what was true or false and even expanded on a part of the history that the film omitted. This film moved me to tears each time I watched it and I still sometimes find myself angry that women and minorities are currently dealing with the same issue of bias that Johnson and her colleagues had to face in their day. In my current role as president of the National Society of Black Physicists (NSBP), so many people have approached me and felt compelled to share stories of similar segregation, being left out of important meetings or generally treated differently due to their gender or race.
Hidden Figures is an inspiring story that outlines the significant and remarkable impact these intelligent and brave African American women had on some of NASA’s greatest hits. The film and book should be seen and read by all young African American girls, as it not only proves to them that they can be black, female and top-notch mathematicians and engineers, but also shows them that the pathway has already been laid for them and that their participation in science will go some way to lessening the large disparity that exists when it comes to race and gender in science. For me, the book and film provided insight into the women who preceded me at NASA and provided context to those “giants” on whose shoulders I stand. I can only imagine the dreams I might have dreamed if I had known this story before now.
Hidden Figures: the Untold Story of the African American Women Who Helped Win the Space Race by Margot Lee Shetterly (2016 William Collins 368pp £16.99 hb)
Hidden Figures directed by Theodore Melfi (2016 Fox 2000 Pictures, Chernin Entertainment, Levantine Films, TSG Entertainment)
What would you say are the core “products” of academic research? Most people, when asked this question, talk about research papers, trained scientists, books and perhaps even data. But this list misses a critical component of much of the research being done today: software.
We all know that much of modern physics research relies on the development of specialist software, whether it’s for experiments that create a huge amount of data such as the Large Hadron Collider, or for supercomputer simulations modelling the distribution of dark matter in the early universe. More than 90% of UK academics use software, according to a survey of Russell Group universities (Hettrick et al. 2014 UK Research Software Survey doi:10.7488/ds/253). About 70% say their research would be impractical without it and more than half develop their own. Why, then, is software in physics not as visible as it arguably deserves to be?
Part of the problem here is that the research paper is becoming an increasingly unsatisfactory way of describing modern, data-intensive research. Academic publishing hasn’t changed substantially since the first communications in the journal Philosophical Transactions in 1665. Academics writing down their thoughts and sharing results with their peers in a journal-based system (paper or electronic) is the same solution we’ve had for more than 300 years.
Yet the full spectrum of activities in modern physics (and many other disciplines) simply can’t be completely described with text, a few equations and the occasional plot or figure. To completely describe the origin of any individual result, researchers need to share not only their ideas and results (perhaps in the form of a paper) but also the data they collected and the analyses they carried out to reach their conclusions.
This idea of sharing more than just a paper isn’t new. In 1995 statisticians Jonathan Buckheit and David Donoho wrote “An article about a computational result is advertising, not scholarship. The actual scholarship is the full software environment, code and data, that produced the result.” Buckheit and Donoho argued that papers about “computational science” (the same argument also holds for physics) aren’t sufficiently complete descriptions of the work. They’re simply “adverts” for the research that we place in journals. For a third party to properly understand the research, they would need to be able to see all of the components that resulted in the paper (Wavelets and Statistics, New York: Springer, pp55–81).
The publishing problem
With dependencies on software woven into the fabric of modern research, finding ways for researchers to share this work seems like it should be a high priority.
On the face of it, asking researchers to share a more complete description of their research is hard to argue against. In reality though, there are a number of factors limiting progress. Probably the biggest impediment is that for many researchers, especially those in the early stages of their career, the pressure to publish as many papers as possible trumps almost every other activity. Publishing anything in addition to a peer-reviewed paper requires additional time and effort that most researchers simply cannot afford.
But as we move towards a future where a growing fraction of research output is described by data and software, it becomes increasingly urgent to find ways of at least capturing references to software in papers and ideally establishing community norms for the publishing of these tools.
There are a number of challenges to actually doing this. First, it’s not completely obvious how a researcher should cite software in a paper. Unlike a paper, which is a static “snapshot” of a research idea, popular software packages often have lifetimes of many years and are released multiple times, with different version numbers and often with subtly different behaviour. As a result, capturing the software’s name, its location (i.e. where to find it) and the version used is considered by many to be the minimal useful citation. Another obstacle is that even if an author wants to cite a piece of software, many journals don’t let them cite anything other than papers in their bibliographies. Finally, most academic fields lack cultural norms, such as dedicated journals, for publishing research software, which in turn means that spending time doing so generally isn’t recognized as a creditable research output. Put bluntly, why would anyone spend time publishing anything other than papers if it doesn’t contribute substantially to their career?
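To make the idea concrete, a citation capturing those three elements – name, location and version – might look something like the BibTeX-style entry below. The package name, authors, URL and DOI are invented for illustration, and journals and archives differ in the exact fields they expect:

```bibtex
% Hypothetical example: the package, authors, URL and DOI are placeholders
@misc{analysis_toolkit_2_1_0,
  author       = {A. Researcher and B. Collaborator},
  title        = {analysis-toolkit: routines for detector data reduction},
  year         = {2017},
  note         = {Version 2.1.0},
  howpublished = {\url{https://example.org/analysis-toolkit}},
  doi          = {10.5281/zenodo.0000000}
}
```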
Change has been slow since the first scientific journal was published in 1665 (top left) and it is rare for journals to publish papers about software despite its increasing importance. A recent exception is The Journal of Open Source Software (bottom left). In the meantime, large science collaborations have found other methods of sharing their software and data, such as online portals from CERN (top right) and LIGO (bottom right).
Large physics collaborations are one area where all research outputs, including software and data, are shared well. This is probably down to a number of factors. First, the collaborations are so large that there are individuals who devote most of their time to authoring software for data analysis and reduction, and who therefore “go the extra mile” in publishing their code. Second, the results from these experiments have such a high impact in science that the community’s expectations for publishing all the research products (code, data, papers) are higher. Third, the community interested in reproducing these big results is large, so it’s more efficient for everyone if the project releases software tools that enable others to check the data analyses.
A good recent example of a large collaboration publishing its research products well came in February 2016, when members of the Laser Interferometer Gravitational-Wave Observatory (LIGO) collaboration announced they had made the first detection of a gravitational wave. When announcing their results, they published not only a paper describing the detection but also all of the software used to analyse the data. The collaboration also created a complete online analysis environment, the LIGO Open Science Center, which made this software usable in an interactive online setting. Publishing all of the constituent parts of their work meant that anyone with the time and interest could dive into the analysis carried out by the LIGO team, thereby increasing the community’s confidence in this groundbreaking result.
Looking outside of academia
Over the past few decades, there has been a major shift in the cultural norms of developing and sharing software that affects individuals, businesses and parts of academia. The reliance of businesses on closed-source, proprietary software has given way to open-source software development, with even the biggest stalwarts of proprietary software, such as Microsoft, embracing open source as the future of technology development.
The term “open source” is often used to describe more than one thing. Strictly speaking, open-source software is software that has been shared publicly under one of a number of approved licences that describe the conditions under which the code can be modified, reused and shared with others. What can be done with the code varies depending on the licence, but all of them permit the use of the software for any purpose. This is in contrast to, for example, image-usage licences, which can specify that an image may not be used for commercial purposes.
In addition to being a collection of licences and legalese, the term “open source” is often used to describe the culture of open-source projects, in which there is an emphasis on working in an open and collaborative way, a focus on transparency and an effort to engage the community. As such, many of the principles of open source are well aligned with the core tenets of the open-science movement and academia more generally.
The success of open source relies not only on the goodwill of software developers and businesses to share their work free of charge, but also on an organically developed “ecosystem” that relies on a variety of factors (see box below). If open-source software is to flourish in the field of physics, physicists should consider adopting some of these key ingredients of success.
Data-science brain drain
Many of the problems we solve in academia, especially in data- and computer-intensive sciences, are, at least functionally, very similar to those in data-rich industries. This has led to a growing overlap in the skills required to be successful in both sectors. Often described with the catch-all term of “data scientist”, an individual capable of collecting, analysing and creating knowledge from data is highly employable in any large, data-driven organization. They might also be a good physicist.
In his 2013 blog post “The big data brain drain: why science is in trouble”, University of Washington data-science researcher Jake VanderPlas captures the essence of the problem facing academia. “The skills required to be a successful scientific researcher,” he writes, “are increasingly indistinguishable from the skills required to be successful in industry.”
VanderPlas is an astronomer and a prolific contributor to open-source tools that are used both in academia, for his research, and in industry, by data scientists. In his blog post, he describes a number of factors that should worry anyone who cares about the long-term health of our universities. First, the individuals most likely to be suffering a career penalty from spending time working on (open-source) software are some of the most employable people outside of academia. Second, the work these individuals contribute to open source is highly visible, and discoverable, because of the significance of these tools in industry. Third, with jobs in industry often paying two or three times more than postdoctoral-level salaries, many of the best and brightest young academics are leaving academia for industry.
One could argue that this “brain drain” is the university system working well for our economy – training a skilled workforce for industry. Unfortunately though, much of modern research is highly data-driven and needs individuals with these skills to make the best use of the voluminous data streams from modern experiments.
An imbalance of incentives
As things currently stand, most academic fields rely on a one-dimensional credit model in which the academic paper is the dominant factor. Incentives to publish other parts of the research cycle, such as software and data, do exist, but not yet at the level of the individual researcher.
Papers that are accompanied by well-described data and analysis routines should be easier to understand and reproduce, which in turn should lead to greater confidence in any new result. Without this increased level of transparency, many fields run the risk of placing too much trust in “black box” methods, whereby data are fed into analysis routines and results are published with little critical review; physics has so far been left relatively unscathed compared with some other disciplines. In recent years a number of high-profile retractions of novel results, especially in the biosciences – a situation described by some as a reproducibility “crisis” – have led scientific and medical publishers such as PLOS to require that submitting authors make software and data available when publishing a paper.
In physics and astronomy, publishers have so far been slower to adopt such requirements. But change is afoot: the American Astronomical Society journals The Astronomical Journal and The Astrophysical Journal, for example – published by IOP Publishing, which also publishes Physics World – now allow software papers describing research software with an astrophysics application. The Elsevier-published journal Astronomy and Computing, meanwhile, is dedicated to topics spanning astronomy, computer science and information technology.
In addition, there is a growing list of journals designed specifically for publishing software papers, such as the Journal of Open Research Software, SoftwareX and The Journal of Open Source Software – for which I led the development and continue to act as editor-in-chief. While these solutions are not the same as an academic ecosystem that rewards all of the constituent parts of the research output, they are a step in the right direction.
Physics experiments are only getting bigger and their data sets more complex. As such, much of modern research depends upon the availability of high-quality software and data products for community use. If we are to continue to make the best use of these experiments then we’re going to need to train – and retain – a workforce with a broad range of skills including data analysis, visualization and theory. To achieve this is going to require us to rethink what “counts” as an academic contribution.
Authorship and reputation in open source
Authorship is an important potential signal of trust in open source. When choosing an open-source project to solve a problem, knowing who the main authors of a project are is critical for evaluating the potential quality of a package. With platforms such as GitHub, Bitbucket and GitLab, contributions of individuals to the open-source ecosystem are placed front-and-centre on user profiles and easily discovered when viewing a project.
Another core tenet of open source is the reuse of existing tools. There are tens of millions of open-source packages hosted on a variety of platforms, and most of these packages “depend” on other open-source tools. Services such as Libraries.io aggregate these packages and provide rich metrics tracking their usage and the inter-project dependencies throughout the ecosystem. Understanding the “rank” of a project – that is, which projects are most reused by others – is analogous to a citation count in the academic literature and is a strong reputation signal for the community.
A laboratory for cooling an atomic gas to just a billionth of a degree above absolute zero will soon be sent up to the International Space Station (ISS) by physicists working at NASA’s Jet Propulsion Laboratory. The goal of the Cold Atom Lab (CAL) mission is to create long-lived Bose–Einstein condensates (BECs) that could lead to better sensors and atomic clocks for use on spacecraft. The BECs could even provide important insights into the nature of dark energy, according to the researchers.
First created in 1995, a BEC is made by trapping and cooling an atomic gas to an extremely low temperature so the atoms fall into the same low-energy quantum state. Instead of behaving like a collection of individual atoms, a BEC is essentially a large quantum object. This makes it very sensitive to disturbances such as stray magnetic fields and accelerations, and therefore BECs can be used to create extremely good sensors.
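For context (a textbook result for a uniform, non-interacting Bose gas, not a figure from the CAL team), condensation sets in below a critical temperature that depends on the atomic mass m and the number density n of the gas:

```latex
T_{\mathrm{c}} = \frac{2\pi\hbar^{2}}{m k_{\mathrm{B}}}
\left(\frac{n}{\zeta(3/2)}\right)^{2/3},
\qquad \zeta(3/2) \approx 2.612
```

For the dilute alkali-metal gases used in such experiments this works out at well below a microkelvin, which is why the apparatus must reach temperatures within a billionth of a degree of absolute zero.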
Falling down
Here on Earth, gravity puts an upper limit on the lifetime of a BEC – the atoms fall down and after a fraction of a second the BEC has dropped out of view of the experiment. In the microgravity environment of the ISS, however, NASA’s Robert Thompson and colleagues reckon that their BECs should be observable for 5–10 s. As well as allowing physicists to make more precise measurements of the quantum properties of BECs, the longer lifetime should also make the BECs better sensors. With further development, the team believes that BECs in space could endure for hundreds of seconds.
Five scientific teams will do experiments using Cold Atom Lab, including one led by Eric Cornell of the University of Colorado – who shared the 2001 Nobel Prize for Physics for creating the first BECs.
As well as creating BECs, CAL will also cool fermionic atoms to create degenerate Fermi gases. These systems can be made to mimic the behaviour of electrons in solids and could provide important insights into phenomena such as superconductivity. Physicists will also study ultracold mixtures of bosonic and fermionic atoms. Other planned experiments include atom interferometry and very precise measurements of gravity itself.
Pervasive forces
“Studying these hyper-cold atoms could reshape our understanding of matter and the fundamental nature of gravity,” says Thompson. “The experiments we’ll do with the Cold Atom Lab will give us insight into gravity and dark energy – some of the most pervasive forces in the universe.”
CAL will be contained within a package about the size of an “ice box”. This will contain a vacuum chamber, lasers and electronics. It will also include an electromagnetic “knife”, which will be used to cool the atoms. The lab is currently in the final stages of assembly and will be launched in August on a SpaceX CRS-12 rocket.
The world’s smallest astronomical satellites have identified the largest “stellar heartbeat” to date. Using the nanosatellite mission BRITE-Constellation (BRIght Target Explorer), a group of astronomers has observed the pulse and tidal events of the Iota Orionis binary star system for the first time. The BRITE-Constellation project comprises five tiny satellites – cubes measuring 20 cm across – in low-Earth orbits. The first-ever nanosatellite astronomy mission, it is used to investigate the structure and evolution of the brightest stars using high-precision photometry. One such system is Iota Orionis, which is the brightest star in the constellation Orion’s Sword and easily visible with the naked eye. Iota Orionis is dominated by a massive blue-giant star that is in a 29-day orbit with a main-sequence class B star. While light from the system is relatively stable 90% of the time, the team, led by Herbert Pablo of the University of Montreal, observed a repetitive rapid dip and sharp spike. “The variations look strikingly similar to an electrocardiogram showing the sinus rhythms of the heart, and are known as heartbeat systems,” says Pablo, who is also a member of the Centre for Research in Astrophysics of Quebec (CRAQ). The phenomenon is caused by the stars coming closer together for a short time during their elliptical orbit. At closest approach, the gravitational forces between the stars become so strong that their shapes distort, their light is seen to pulse and quakes are triggered in the stars. This is the first time that a heartbeat and induced quakes have been observed in such a massive system (35 times the mass of the Sun). The findings, presented in Monthly Notices of the Royal Astronomical Society, could provide new clues as to how massive stars evolve.
Floating spectrometer could detect oil spills
Spectrometer ahoy: the new sensor detects fluorescence from oil (left), allowing it to identify the type of oil. Components of the device are shown centre and right. (Courtesy: Óscar Sampedro / University of Vigo)
A floating sensor that can detect an oil spill in water and identify the type of oil present has been launched by Óscar Sampedro and José Salgueiro of the University of Vigo in Spain. The device uses the fact that crude or refined oil absorbs ultraviolet (UV) light and emits fluorescent light. Different types of oil emit different spectra of fluorescent light, and the type of oil can be determined by comparing the detected spectrum to a database of known oil types. While fluorescence is usually measured using delicate and expensive equipment, Sampedro and Salgueiro have built a low-cost and robust spectrometer based on four photodiode detectors. Each detector is covered by a cellophane film that filters out a different colour of light and the fluorescence is stimulated using UV light from inexpensive LEDs. “The four signals proved to be enough to build a specific fingerprint for every oil type used in our study, letting us identify the different types of oil,” explains Salgueiro. “This approach dramatically reduces the cost of the instrument and simplifies contamination testing.” The sensor also includes a low-cost microcontroller and a radio module that allows the device to send data and receive commands. The prototype sensor is about 30 cm in size and could be placed in a buoy. It is described in Applied Optics.
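The Applied Optics paper’s exact matching procedure isn’t reproduced here, but the idea of comparing a four-channel fluorescence fingerprint against a database of known oils can be sketched as a simple nearest-neighbour lookup. The oil names and signal values below are invented purely for illustration:

```python
import numpy as np

# Hypothetical reference fingerprints: normalized responses of the four
# filtered photodiodes for oils characterized in the lab beforehand.
reference = {
    "crude A":   np.array([0.62, 0.21, 0.10, 0.07]),
    "diesel":    np.array([0.35, 0.40, 0.15, 0.10]),
    "lubricant": np.array([0.20, 0.30, 0.30, 0.20]),
}

def identify_oil(signals):
    """Return the reference oil whose fingerprint is closest to the measurement."""
    s = np.asarray(signals, dtype=float)
    s = s / s.sum()  # normalize so only the spectral shape matters
    best, best_dist = None, np.inf
    for name, fingerprint in reference.items():
        dist = np.linalg.norm(s - fingerprint)
        if dist < best_dist:
            best, best_dist = name, dist
    return best, best_dist

# Example measurement from the four photodiodes (arbitrary units)
print(identify_oil([1.3, 0.45, 0.20, 0.15]))
```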
Gluons contribute 50% to proton spin
Half of the spin of the proton is associated with gluons, according to a state-of-the-art calculation done by Yi-Bo Yang and Terrence Draper of the University of Kentucky and colleagues in the XQCD collaboration. Quantum chromodynamics (QCD) describes the proton as comprising three quarks that are bound together by gluons – bosons that mediate the strong force. Both quarks and gluons have intrinsic angular momentum – or spin – and physicists are keen to understand how these spins combine to give the proton its well-known spin of 1/2. Accelerator-based experiments done over the past three decades suggest that the quarks contribute about 30% of the spin, with the rest unaccounted for. As well as coming from the gluons, some of the remaining spin could be orbital angular momentum. There are also other effects that could screen the quark contribution or make some of it invisible to experiments. Calculating proton spin is extremely difficult because of the enormous strength of the strong nuclear force and the fact that calculations must consider large numbers of virtual quark–antiquark pairs that pop into and out of existence. The XQCD team is the first to use the computational technique of lattice QCD to calculate the contribution of the gluons to the proton’s spin, which the researchers found to be about 50%. The calculations also suggest that screening of the quark spin is not significant, which means that the remaining 20% of the proton’s spin is probably related to orbital angular momentum and a topological effect that makes some of the quark spin invisible. The calculations are described in Physical Review Letters.
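The spin budget being discussed can be written, in one commonly used decomposition (the Jaffe–Manohar sum rule, quoted here as a standard result rather than from the paper itself), as a sum of quark-spin, gluon-spin and orbital contributions:

```latex
\frac{1}{2} \;=\; \frac{1}{2}\,\Delta\Sigma \;+\; \Delta G \;+\; L_{q} \;+\; L_{g}
```

In these terms, the experiments mentioned above pin the quark-spin term at roughly 30% of the total, the new lattice result puts the gluon-spin term at roughly 50%, and the orbital terms are left to account for most of the remainder.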
You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on a plan to put ultracold atoms into space.
An organic retinal implant designed in Italy can stimulate retinal neurons and send signals to the brain, restoring near-normal vision indefinitely to rats with degenerative blindness without causing apparent damage to the rats’ eyes. That’s the claim of the researchers who developed it, who believe it could potentially lead to treatments for a major cause of blindness in humans. Other researchers, however, are more cautious.
Retinitis pigmentosa describes multiple genetic disorders that cause the photoreceptors on the retina to die. These disorders lead to blindness, even though the other neurons concerned with signal processing and the optic nerve remain functional. There is currently no effective clinical treatment for the condition, but several groups are developing ways to effectively replace the lost photoreceptors by stimulating the retinal neurons artificially. While this could one day restore a patient’s vision, these approaches face severe difficulties. For example, most of the implants require a power supply, and wiring into the eyeball is extremely tricky. One solution is a photovoltaic cell that generates a voltage using only the incoming light, but this faces two principal problems. First, previous researchers have found the intensity of ambient light insufficient to stimulate the neurons. Second, silicon is much stiffer than nervous tissue: “In the long term, [silicon] can induce a reaction by the tissue,” says neuroscientist Fabio Benfenati of the Center for Synaptic Neuroscience and Technology in Genoa, “leading to encapsulation, [scarring] and things like that.”
Silken substrate
In the new research, materials scientist Guglielmo Lanzani of the Center for Nano Science and Technology in Milan and colleagues designed a more flexible, organic retinal implant based on a polymer solar cell. They deposited a thin layer of conductive polymer onto a silk-based substrate and covered it with a semiconducting polymer. When the semiconductor absorbs a photon, it creates an electron–hole pair called an exciton. The positive holes are drawn into the conducting polymer, whereas the electrons remain in the semiconductor, causing a negative charge.
Surgeons led by ophthalmologist Grazia Pertile of Sacro Cuore Hospital near Verona implanted the devices underneath the retinas of Royal College of Surgeons (RCS) rats – a strain of rats that reliably develop retinitis pigmentosa owing to a genetic mutation also found in some human cases of the disease. They placed the implants such that the semiconducting polymer was in contact with the retinal neurons, so absorption of light would apply a negative voltage to the cells. After 30 days, when the swelling from the surgery had completely subsided, Benfenati’s group compared the rats’ vision with both untreated RCS rats and healthy rats.
They first tested the pupil’s contraction in response to light, finding that although it was significantly impaired in untreated RCS rats, it was near normal in rats with the implant. In further tests using an electrode in the primary visual cortex, the researchers showed that the implanted rats’ light sensitivity and visual acuity were substantially better than those of untreated RCS rats, and positron emission tomography showed that the metabolism of their primary visual cortices was higher. Furthermore, the rats – which naturally prefer dark environments – avoided light more effectively.
No ageing
The researchers tested the rats again later, both after 180 days and after 300 days: they found that, although the quality of the implanted rats’ vision declined, it declined no faster than that of the other animals. “There is a generalized decrease in [the rats’] sight with age,” explains Benfenati. The recovery of the rats’ vision appears greater than can be explained by simple photovoltaics, so the researchers suspect other effects are involved, although precisely what these are remains unclear.
After dissecting the rats, the researchers tested prostheses removed from their eyes and showed that they worked similarly to prostheses stored in sterile conditions. The researchers are now testing an adapted implant in pigs’ eyes: “We believe, based on these data, we could probably attempt the first [human] implant…within the next two years,” says Benfenati.
“The article is indeed interesting,” says ophthalmologist Mark Humayun of the University of Southern California in Los Angeles. He is impressed by the simplicity of using light to stimulate the implant, although he cautions: “The RCS rat retina is known to be much easier to stimulate. When it comes to a patient with longstanding retinal degeneration, we have found that ambient light intensity is insufficient and it requires intensified light – often multiple Suns.”
Bright lights
Daniel Palanker of Stanford University is more sceptical, noting that “their RCS rats responded to every visual test, indicating that they still have photoreceptors”. He also points out that sub-retinal surgery is known to help preserve photoreceptors in RCS rats, which would itself give these rats better vision. And in their test of the implant, he notes, the researchers used light six million times more intense than the light levels to which the rats responded. “This indicates that the visual response has nothing to do with the photovoltaic response of the polymer,” he concludes. The researchers attempt to rule out this explanation by showing that a silk implant without the photovoltaic coating does not work, but Palanker is unconvinced: “The difference between the photovoltaic polymer and the passive control could be due to electrochemical reactions, which might help preserve photoreceptors better,” he says. “I’m not sure, but given other major problems, it’s not the central issue here.”
As recently reported on this website, the VENu app can be used with virtual reality (VR) headsets, enabling users to observe particle tracks inside the detector and enjoy tutorials about the nature of neutrinos. It can be downloaded free of charge from the App Store and Google Play.
Developed by an international team of physicists, the app also has a game element whereby users can search for neutrino signals. In the podcast, Glester asks the developers why they believe it is important for professional physicists to develop outreach tools such as VENu to inspire public interest in their work. Not one to rest on his laurels, the app’s chief developer Marco Del Tutto is already considering ways in which the group can further develop the app. Eventually, such an app could even be used as a citizen science tool in which the public can help particle physicists to identify neutrino detections amid large data sets.
As Glester mentions in the podcast, VENu is not the only immersive video experience that might be of interest to physicists. CMS-cardboard is a VR visualization of the CMS detector at CERN’s Large Hadron Collider (LHC). Meanwhile, NASA has created a 360-degree artist’s impression of the surface of one of the seven planets recently discovered around the TRAPPIST-1 star.
Synchrotron X-ray analysis may provide the answer to how dinosaurs became giants. A collection of dinosaur specimens, ranging from eggs to juveniles, has been analysed at the European Synchrotron Radiation Facility (ESRF) in France. It is the first time such a range of ages of the same species has been studied at the ESRF, and it could provide answers about dinosaur growth and evolution. Found in a reproductive colony in central Patagonia, the specimens are of the prosauropod Mussaurus patagonicus – a primitive herbivorous dinosaur that lived 200 million years ago during the Late Triassic period. Prosauropods are believed to be ancestors of the giant sauropod dinosaurs common to the Jurassic period. However, how they evolved from intermediate-sized creatures to massive giants remains a mystery. Now, palaeontologist Diego Pol of the National Scientific and Technical Research Council (CONICET), based at the Museum of Palaeontology Egidio Feruglio in Argentina, hopes that high-resolution X-ray analysis will provide some answers. The study involves 30 eggs, a baby and a juvenile Mussaurus patagonicus. After being unable to obtain sufficiently detailed data in his home laboratory, Pol turned to Vincent Fernandez at the ESRF, which has been performing palaeontology studies for around a decade. By using high-energy X-rays, 3D anatomical models can be built without the need to damage specimens, and the high resolution makes it possible to analyse the bone-growth patterns of the dinosaurs. While the data collected remain to be fully processed, they could provide key answers about Mussaurus patagonicus growth and the origin of giant dinosaurs.
Coffee-ring effect could make better solar cells
Spilled semiconductor: researchers have discovered that crystallization behaviour can be controlled locally, creating regions with different crystal patterns. (Courtesy: KAUST)
A chance observation that a semiconductor solution behaves like spilled coffee could result in better electronic devices, according to researchers at the King Abdullah University of Science and Technology (KAUST) in Saudi Arabia. When coffee dries on a surface, the coffee solids are pushed towards the edge of the puddle to create a familiar coffee ring. This effect occurs in many liquids that contain tiny particles, and now Aram Amassian, Liyang Yu and colleagues have harnessed it to improve how thin-film semiconductors form when they are deposited from solution onto a non-crystalline substrate. These semiconductors normally form polycrystalline films of tiny crystallites that are randomly oriented. While this is fine for some electronic devices, performance could be boosted by gaining more control over the crystal structure. Yu noticed that an organic semiconductor solution formed a coffee ring as it crystallized. Oddly, the thickest parts of the ring crystallized first, which is the opposite of what was expected. This led Yu to discover that the depth of the solution had an important effect on how the crystals were forming. Using this insight, the team exploited the local thickness of the semiconductor solution to create patterned semiconductor films in which the locations and orientations of crystallites can be controlled. “We can now make customized polycrystals on demand,” explains Amassian. The team hopes its discovery will lead to improvements in a wide range of devices, including solar cells, and describes the research in Science Advances.
AAAS’s Rush Holt responds to new Trump travel ban
Immigration concern: AAAS chief executive Rush Holt. (Courtesy: Chet Susslin / National Journal)
“We are concerned that the executive order announced 6 March may be implemented in a manner that will continue to restrict travel to the US and negatively impact students and scientists who seek to work and collaborate with their peers in the US,” says physicist Rush Holt, who is chief executive of the American Association for the Advancement of Science (AAAS). Holt was responding to a new executive order from US president Donald Trump that limits travel from six Muslim-majority countries in the Middle East and Africa. “Scientific progress depends on openness, transparency, and the free flow of ideas; these principles have helped the US attract and benefit from international scientific talent,” adds Holt. “Impacts to US leadership in science, technology and innovation should be considered in development of immigration and visa policy.”
Cash boost for South African radio telescope
The Hydrogen Epoch of Reionization Array (HERA) observatory, located at Losberg near Carnarvon in South Africa, has received $5.8m from the Gordon and Betty Moore Foundation in the US. The telescope array is currently under construction and consists of 35 radio dishes, each 14 m in diameter. Last year, the US National Science Foundation announced it would invest $9.5m in HERA to boost the number of dishes to 240 by 2018. The new money from the Gordon and Betty Moore Foundation will increase that number further, to 350. The larger number of dishes will allow astronomers to explore the large-scale structures that formed during and prior to the epoch of reionization – the period, stretching from a few hundred million years to about a billion years after the Big Bang, during which light from the first stars and galaxies reionized the neutral hydrogen that filled the universe. HERA is a precursor to the upcoming Square Kilometre Array, which will be built in southern Africa and Australasia in the coming decade.
You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on a new type of eye implant.
The quasiparticle concept allows physicists to describe complex, many-body interactions in terms of the behaviour of a single particle-like entity. Usually these particles turn up in condensed-matter systems such as semiconductors, but a new type of quasiparticle known as an angulon has been proposed to describe the rotation of an atomic or molecular impurity within a solvent. First proposed theoretically two years ago, angulons have now been shown to explain the curious behaviour of a range of different molecules rotating within liquid helium.
Physicists have been studying quasiparticles since at least the 1940s, when Lev Landau and Solomon Pekar put forward the idea of the polaron to describe the behaviour of an electron travelling through a crystal lattice. As the electron moves forward it disturbs the surrounding atoms and so polarizes that region of the crystal. Describing the process completely would involve calculating the changing interaction between the electron and vast numbers of atoms, but Landau realized that it could be approximated by regarding the electron and the associated polarizations as a single particle that acts like a more massive electron travelling through free space.
In the latest work, Mikhail Lemeshko of the Institute of Science and Technology Austria, just outside Vienna, has looked at the collective motion of a rotating molecule interacting with the many atoms inside a drop of superfluid helium. Such drops allow scientists to hold single molecules at a fraction of a degree above absolute zero and record their spectra without distortions, which is particularly useful for studying very reactive molecules such as free radicals.
Not enough atoms
The system can be analysed semi-classically by assuming that the trapped molecule creates a shell of non-superfluid helium around itself as it rotates, so slowing it down. But superfluid helium is a fundamentally quantum-mechanical material that is described by Bose–Einstein, as opposed to classical Boltzmann, statistics. Physicists have carried out brute-force numerical simulations of the system in recent years, but the complexity of the many-body interactions has limited the number of helium atoms in those simulations to around 100. The droplets used in experiments, in contrast, tend to contain more than 1000 atoms.
Lemeshko has found that he can simplify the problem enormously by using the concept of the angulon. Just as a polaron consists of an electron plus the deformations in the surrounding lattice, so an angulon is made up of the rotating molecule plus the disturbances it creates in the surrounding helium. And whereas a polaron is in effect a free-moving but more massive version of the electron, an angulon acts like an un-trapped version of the molecule in question but with a larger moment of inertia.
Having put forward the theory of angulons with Richard Schmidt of the Harvard-Smithsonian Center for Astrophysics in the US in 2015, Lemeshko has now compared that theory against 20 years of experimental results. For each of 25 different molecules, Lemeshko calculates the effect of the surrounding helium atoms on the molecule’s rotational constant – which is inversely proportional to its moment of inertia – and then compares the modified constant to the value obtained experimentally.
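For reference (this is standard rotational spectroscopy rather than anything specific to the angulon papers), the rotational constant sets the spacing of a linear molecule’s rotational energy levels and, expressed in energy units, is inversely proportional to the moment of inertia I:

```latex
E_J = B\,J(J+1), \qquad B = \frac{\hbar^{2}}{2I}
```

An angulon, with its larger effective moment of inertia, therefore shows up experimentally as a smaller effective rotational constant than the same molecule would have in free space.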
Two regimes
This was not a straightforward one-size-fits-all comparison, however. To obtain simple analytic expressions for molecular rotation, Lemeshko solved the angulon problem in two “regimes”. One regime, mainly applicable to heavy molecules such as those containing atoms of sulphur, involves molecules with significant coupling to the helium (a high potential energy) but with little kinetic energy. Conversely, the other regime, relevant to lighter molecules such as water, entails greater amounts of kinetic energy but weak coupling.
Although not all the predictions within the strong-coupling regime ended up within the experimental uncertainty, Lemeshko considers that for most heavy molecules he achieved “a good agreement with experiment”. He did even better in the weak-coupling regime, getting to within 2% of the experimental values for most light molecules. With some of the medium-sized molecules, however, he struggled, being unable to accurately predict their modified rotational constants within either the strong- or weak-coupling regimes. He says that an “intermediate-coupling” theory for angulons could in principle make accurate predictions here, but adds that rough estimates can be achieved in the meantime by splitting the difference between the strong- and weak-coupling predictions.
Despite the problems, Lemeshko concludes that the results of his study “provide strong evidence” that molecules rotating within superfluid helium do indeed form angulons. “An angulon is not a real physical entity in the sense that a fundamental particle such as an electron is,” he says. “But it is as real as any other quasiparticle.”
Electron angulons
Lemeshko is now looking to apply his theory beyond molecules within liquid helium. For example, he is investigating whether angulons could be used to represent electrons exchanging their orbital angular momentum with a crystal lattice. Doing so, he says, might aid the development of ultrafast switching and advanced data storage, but he cautions that this research is “very preliminary”.
Better robots are needed for investigating the Fukushima Daiichi nuclear plant after current designs failed due to radiation levels and debris obstacles. At a recent news conference, Naohiro Masuda, TEPCO’s head of decommissioning at Fukushima Daiichi, spoke about the need for more creative robot design after repeated failures. In 2011, multiple reactors at the Fukushima nuclear plant went into meltdown after a severe earthquake and tsunami. To safely decommission the damaged plant, its operator Tokyo Electric Power Company (TEPCO) must know exactly where the melted fuel is and the extent of structural damage to the surrounding buildings. The radiation levels, however, would kill a human within seconds, so TEPCO is reliant upon remote-controlled robotic probes. Yet early robots have come across unexpected challenges. In February, TEPCO sent in two robots to investigate the damaged reactor inside Unit 2 of the facility. The first was a cleaner robot designed to clear the way for a second, “scorpion” robot that would assess damage and measure radiation and temperature. Unfortunately, the cleaning robot had to be withdrawn after only two hours of the planned 10-hour mission because its cameras began to malfunction due to high radiation levels. The scorpion-shaped robot then had to be abandoned before reaching its target location because it began to have difficulty moving and became stuck while crawling over rubble; it is unclear whether this failure was due to debris or radiation levels. The Associated Press reports that Masuda called for more creative thinking when developing future robots. “We should think out of the box so we can examine the bottom of the core and how melted fuel debris spread out,” he explains. The data collected and the robot failures imply that the clean-up and decommissioning of Fukushima will be more challenging than previously predicted; it is thought that the process will take decades to complete.
IBM to build 50 qubit quantum computers
Quantum development: IBM scientists Hanhee Paik (left) and Sarah Sheldon examine the hardware inside an open dilution fridge. (Courtesy: IBM)
IBM says it will build a new generation of universal quantum computers that will be available for commercial use via the IBM Cloud platform. The IBM Q systems will have about 50 quantum bits (qubits), making them roughly 10 times larger than IBM’s existing five-qubit quantum computer, which is already available on IBM Cloud and has attracted about 40,000 users. According to the US-based firm, increasing the number of qubits will be one step towards boosting the “quantum volume” – or computing power – of its quantum systems. Efforts will also focus on improving connectivity between qubits, boosting the reliability of quantum-logic operations and creating systems that are capable of highly parallel computations. The universal nature of the proposed computer should make it useful for solving a range of problems that are too complex for conventional computers. These include calculating the properties of molecules used to create new drugs and materials, finding optimal processes for supply chains and logistics, and creating artificial-intelligence systems. “To create knowledge from much greater depths of complexity, we need a quantum computer,” says Tom Rosamilia of IBM Systems. “We envision IBM Q systems working in concert with our portfolio of classical high-performance systems to address problems that are currently unsolvable, but hold tremendous untapped value.”
Very few photons needed to see through opaque material
An optical image of a region within a nearly opaque medium can be obtained using a surprisingly small number of photons. That is the conclusion of Mooseok Jang and Changhuei Yang at Caltech in the US and Ivo Vellekoop of the University of Twente in the Netherlands, who have shown that an established technique called optical phase conjugation (OPC) can be extended for use when very little light makes it out of the medium. OPC involves illuminating a point of interest in a nearly opaque medium with light beams from opposite directions. The first beam provides information about how light is scattered in the medium. This information is then used to cause the second beam to undergo the exact reverse scattering as it travels to the point of interest – illuminating that point. By scanning the beams around the sample, an image is built up. However, in very opaque materials scientists had thought that not enough light emerges to provide useful information about the scattering. Applying the technique to a sample of highly opaque opal, the trio showed that it worked when as few as 1000 photons were detected emerging from the sample – which is far fewer than the number of pixels in the detector used to measure the signal. The discovery is reported in Physical Review Letters and could be used to improve the optical imaging of opaque biological tissues such as brain matter.
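A minimal numerical sketch of the principle behind OPC (not the trio’s actual method – the scattering is modelled crudely here as a random complex matrix and the fields are handled digitally) shows how phase-conjugating the transmitted field sends light back to the original point:

```python
import numpy as np

rng = np.random.default_rng(1)

n_in, n_out = 256, 2048  # points inside the medium, detector pixels outside
# Random complex matrix standing in for multiple scattering through the medium
T = (rng.normal(size=(n_out, n_in)) + 1j * rng.normal(size=(n_out, n_in))) / np.sqrt(2 * n_out)

# Step 1: light leaving the point of interest (mode k) scatters out through
# the medium and is recorded as a speckle field
k = 100
E_out = T[:, k]

# Step 2: the recorded field is phase-conjugated and sent back; for a
# reciprocal medium the return trip is described by the transpose of T
E_back = T.T @ np.conj(E_out)

intensity = np.abs(E_back) ** 2
print("brightest point inside the medium:", int(np.argmax(intensity)))  # -> 100
print("enhancement over background:",
      intensity[k] / np.mean(np.delete(intensity, k)))  # ~ n_out, the number of detected modes
```

In this toy picture the enhancement simply scales with the number of detected output modes; the real experiment has the additional complication of photon shot noise when only a tiny amount of light emerges from the sample.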
You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on a new quasiparticle called the angulon.