Entangled photon pairs have been separated and sent to cities in China more than 1200 km apart. This is about 10 times further than had been achieved previously. The feat was performed using pairs produced on board a Chinese satellite and could lead to the development of long-distance quantum cryptography.
In August 2016, China launched the world’s first satellite dedicated to testing the fundamentals of quantum communication in space. On board the $100m Quantum Experiments at Space Scale (QUESS) spacecraft is a “Sagnac” interferometer that is used to generate pairs of entangled infrared photons by shining an ultraviolet laser on a nonlinear optical crystal. Now, a team led by Jian-Wei Pan of the University of Science and Technology of China in Hefei has used the photon source to distribute entangled photons to pairs of ground stations chosen from three sites in China – each pair up to 1200 km apart.
Increasing distances
Entanglement is a purely quantum-mechanical phenomenon whereby two or more particles can have a closer relationship than is allowed by classical physics. Entanglement plays an important role in quantum technologies such as quantum cryptography, quantum teleportation and networks for distributing quantum information. Over the past decade, physicists have been able to transmit pairs of entangled photons over increasing distances, both in the air and along optical fibres.
The distance record in both media had been about 100 km. Photon loss increases exponentially with distance travelled, so linking distant cities using fibre would be extremely difficult. While quantum repeaters could be used to boost the transmission distances, practical devices are proving difficult to create. Satellite distribution is an attractive solution because much of the photon’s journey is through parts of the atmosphere with very low air pressure – and therefore low photon loss.
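To see why fibres struggle at these scales, here is a minimal Python sketch of the link budget, assuming a typical low-loss telecom-fibre attenuation of 0.2 dB/km (an illustrative figure, not one from the experiment):

def fibre_transmission(distance_km, attenuation_db_per_km=0.2):
    """Fraction of photons surviving a fibre link of the given length."""
    return 10 ** (-attenuation_db_per_km * distance_km / 10)

for d in (100, 500, 1200):
    print(f"{d:>4} km of fibre: transmission ~ {fibre_transmission(d):.0e}")

# ~1e-02 at 100 km, ~1e-10 at 500 km and ~1e-24 at 1200 km: at
# continental distances essentially no photons survive the fibre,
# which is why a link through the near-vacuum of space is attractive.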
Polarization entangled
The source on board QUESS produces nearly six million entangled photon pairs per second. Each pair is entangled in terms of the horizontal and vertical components of the photons’ polarization. The pairs are split and individual photons are directed at two different receiving stations – covering distances of 500–2000 km. To minimize the angular spread of the photons over long distances, Cassegrain telescopes are used to focus the light into a beam with a divergence of about 10 μrad.
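For a rough sense of what a 10 μrad divergence means over these path lengths, the far-field beam footprint is roughly the divergence times the distance (a small-angle estimate, for illustration only):

# Small-angle estimate of the beam footprint at the ground station
divergence_rad = 10e-6

for path_km in (500, 1200, 2000):
    footprint_m = divergence_rad * path_km * 1e3
    print(f"{path_km:>4} km path: beam roughly {footprint_m:.0f} m across")

# Even a tightly collimated beam spreads to 5-20 m over these paths, so
# the 1.2-1.8 m receiving telescopes catch only a small fraction of the
# photons -- though still vastly more than a fibre would deliver.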
The photons are received on Earth using telescopes with diameters of 1.2–1.8 m and their polarizations are measured to verify entanglement. This is done by performing a “Bell test”, which determines whether correlations between the photons are stronger than those allowed by classical physics. The test confirmed entanglement at a statistical confidence of four standard deviations.
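Bell tests of this kind are typically of the Clauser–Horne–Shimony–Holt (CHSH) form; schematically (the textbook inequality, not details specific to this experiment):

\[
S = \left| E(a,b) - E(a,b') + E(a',b) + E(a',b') \right| \le 2 \ \text{(classical)}, \qquad S \le 2\sqrt{2} \ \text{(quantum)},
\]

where \(E\) is the correlation between polarization measurements made with settings \(a, a'\) at one station and \(b, b'\) at the other. Confirming entanglement at four standard deviations means the measured \(S\) exceeded the classical bound of 2 by more than four times its statistical uncertainty.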
High fidelity
The team was able to detect entangled photons at a rate of about one pair every second. The quality of the entanglement – the state fidelity – was about 0.87, with perfect fidelity being 1. This represents an efficiency that is 10¹² times greater than is possible using special optical fibres and 10¹⁷ times greater than is possible using commercial fibres.
The researchers say that satellite-based entanglement distribution could be used to implement quantum key distribution (QKD) on a global scale. QKD uses the laws of quantum mechanics to ensure that two parties can securely exchange cryptography keys and is already being used over short distances by banks.
Today, in our “post-truth” era, these sorts of statements have become commonplace. A type of politics has entered the mainstream that rejects the claims of “experts” and pitches itself against what it perceives as the intellectual and political elite. This sometimes includes scientists and the scientific consensus on issues such as climate change. One factor in the rise of this brand of populist politics is a perceived failure of professionals to predict significant events such as the global economic crash and high-profile election results. Levitan – who used to write for FactCheck.org – discusses the types of tactics deployed by populist politicians in relation to science, and he emphasizes that his book is not exclusively an attack on the Republican Party.
Of course, these issues don’t just affect the US. The podcast also features the British scientists Tara Shears and Alice Roberts, who share their concerns about the current lack of evidence-based debate in the UK. This was particularly apparent during the campaign ahead of the 2016 referendum on the UK’s membership of the European Union, in which spurious claims were made on both sides of the argument. One of the defining statements of the campaign came from vote-leave campaigner Michael Gove, who said “The people of this country have had enough of experts from organizations with acronyms saying they know what is best and getting it consistently wrong.” In the Physics World podcast, Glester and his contributors explore how and why this sort of sentiment can hold such wide appeal among voters.
Music lovers will remember 1967 as the year the Beatles released Sgt. Pepper’s Lonely Hearts Club Band. For sports fans it was the year when Celtic became the first British team to win football’s European Cup. As for scientists, 1967 will go down in history as the year in which the first human heart transplant took place and the first radio pulsars were detected by Jocelyn Bell Burnell, Antony Hewish and others at the University of Cambridge, UK.
There was, though, another scientific event that went under the radar at the time. On Thursday 15 June 1967 physicists moved into an office west of Chicago to begin work on a new scientific facility. Their goal was to build a giant proton accelerator 30 miles away on rural prairie land as the centrepiece of America’s new National Accelerator Laboratory. The lab’s most famous machine, the Tevatron collider, eventually opened in 1983 in what had by then been renamed Fermilab in honour of Enrico Fermi, who created the first controlled, self-sustaining nuclear chain reaction in Chicago.
Cathedral of physics: Fermilab’s main building is named after Robert Wilson. (Courtesy: Fermilab)
Fermilab went on to become one of the most iconic and famous physics labs in the world, not least because of its cathedral-like main building that rises above the flat Illinois prairie. Its researchers’ successes include discovering the bottom quark in 1977, the top quark in 1995 and the tau neutrino in 2000. These and other feats are being celebrated all year by Fermilab. Although the Tevatron stopped running in 2011, the lab is busy reinventing itself as a world-class facility for neutrino physics – so who knows what breakthroughs are yet to come from Fermilab?
China has launched the country’s first dedicated X-ray telescope to study the radiation produced by black holes and neutron stars as well as detect gamma-ray bursts. The 1bn RMB Hard X-ray Modulation Telescope (HXMT) was launched today at 11:00 local time from the Jiuquan Satellite Launch Center in north-western China’s Gobi Desert. It will now be put into low-Earth orbit at an altitude of around 550 km.
First proposed in 1994 and approved in March 2011, HXMT has been developed jointly by the China Academy of Space Technology, Tsinghua University and the Institute of High Energy Physics (IHEP) of the Chinese Academy of Sciences. The 2700 kg probe carries three instruments that together will detect X-rays in the 1–250 keV range. The high-energy X-ray instrument has a total collecting area of 5000 cm² and will work between 20 and 250 keV. The instrument will also be able to detect gamma-ray bursts at energies up to 3 MeV. “We expect to monitor about 200 gamma-ray bursts every year,” says IHEP physicist Shuangnan Zhang, who is the principal investigator of the satellite. The medium-energy X-ray instrument will operate between 5 and 30 keV, while the low-energy X-ray instrument will work from 1 to 15 keV.
Due to atmospheric absorption, cosmic X-rays can only be observed from space; black holes and neutron stars are the two main sources of X-rays in the universe. By combining sky-survey data with single-point observations, HXMT will develop a high-precision hard X-ray sky map, looking for new sources and studying the temporal properties of known sources in greater detail to significantly improve our knowledge of the X-ray sky.
Modulation detection
Dedicated X-ray astronomy began in 1970 with the launch of NASA’s Uhuru and since then there have been more than 50 missions, including NASA’s Chandra and the European Space Agency’s XMM-Newton. But instead of using focusing optics to identify X-ray sources, HXMT adopts a unique “modulation” technique to detect X-rays. “We use collimators to filter the incoming light so that only radiation travelling in a specific direction is allowed through,” says Zhang. By swinging the detector in various directions, astronomers can then reconstruct a specific source and eventually render a map of the entire X-ray sky.
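As a toy illustration of the principle (a deliberately simplified 1D sketch in Python, not HXMT’s actual demodulation pipeline), a collimator with a triangular response can be scanned across a model sky and a hypothetical point source recovered from the modulated counts:

import numpy as np

rng = np.random.default_rng(1)
sky = np.zeros(360)                     # model sky brightness in 1-degree bins
sky[137] = 1000.0                       # hypothetical point source

fov = 5                                 # assumed collimator half-width (degrees)
offsets = np.arange(-fov, fov + 1)
response = 1.0 - np.abs(offsets) / fov  # triangular collimator response

blurred = np.convolve(sky, response, mode="same")  # expected counts per pointing
counts = rng.poisson(blurred + 5.0)                # Poisson noise plus background

# Demodulate by correlating the scan with the known collimator response
recovered = int(np.argmax(np.correlate(counts, response, mode="same")))
print(f"true source at 137 deg, recovered at {recovered} deg")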
Paolo Giommi, an astronomer at the Italian Space Agency, says that the modulation technique is “clever” because it does not require “complex and costly X-ray mirrors”. That view is backed by astronomer Jonathan Grindlay from Harvard University, who says that the strength of the modulation technique is its simplicity. “With HXMT’s broad energy band coverage for wide-field imaging, it should obtain better spectral energy distribution than what is now being done with [NASA’s Monitor of All-sky X-ray Image],” he adds.
Joint observations
However, HXMT’s sensitivity will be limited by this approach, so Zhang’s team is planning to carry out joint observations with other missions such as NASA’s NuSTAR, which was launched in 2012.
Data from the HXMT will also be shared with international collaborators including those at the University of Tübingen’s Institute of Astronomy and Astrophysics. Zhang and colleagues from Tübingen are also working on the proposal for a successor to HXMT – the enhanced X-ray Timing and Polarimetry mission. If approved, it will be an international project involving more than 20 nations and led by China with a launch date as early as 2024/2025.
After I did my first degree and my PhD in the physics department here at the University of Strathclyde, UK, I had an industry job lined up that had nothing to do with biology whatsoever. But this was around the time of the September 11th terrorist attacks, and the company panicked and rescinded its job offer, leaving me high and dry. At that point my PhD supervisor said, “Well, I’ve got three months’ worth of post-doc salary for work in this newly created ‘Centre for Biophotonics’ – would that interest you?” During my PhD I had spent a lot of time developing lasers using nonlinear optics, and when we wrote papers we’d always include a paragraph mentioning possible biomedical optics applications. But we never actually did any of that future application work, so I thought it might be nice to try.
That three-month post opened my eyes to the possibilities, and then I took a two-year appointment where, as well as engaging in technology development, I was also working with some of the biologists. They would bring their specimen preparations – biological cells and tissue – and I would help them to do some confocal imaging or wide-field epifluorescence imaging. This was a baptism of fire for me because I had never done any biology – I’d never really even done any chemistry, and I had never looked down a microscope until very late in my doctoral studies. But it was also a kind of gateway to another world, and I started to appreciate both what was driving biological and biomedical research, and some of the limitations of commercial imaging technologies. That gave me a sense of how I could use my knowledge in physics to develop better technology for biomedical imaging.
Although I come from a laser-development background, as time has progressed I’ve started to appreciate that the laser is potentially just one component within an optical imaging system, and there are other technologies that also need to be developed. These could be optical technologies, addressing questions such as “Can we redesign the microscope objective lens to give more information about the specimen?” But there are also questions like “How can we prepare the biological tissue or specimen in a way that gives more meaningful or revealing information about biological function?” As a consequence, in the last five years we have really developed into more of a biophysics group rather than a physics group who happen to be working in the life sciences.
What are some of the major challenges in imaging biological samples?
That very much depends on the biological question that you are trying to answer. This is an extreme example, but behavioural scientists working at the border between biology and neuroscience are interested in animal behaviour, perhaps in response to a reward or some pharmacological intervention. What they need is a video or camera-based system that can track the movement of the whole animal, and that would obviously require quite a different type of optical imaging solution from the type of systems we develop, which are more about understanding sub-cellular activity or structural morphology within individual cells. Biology and biomedical challenges are so diverse that it’s not a case where one solution fits all. That’s why physicists are constantly developing new types of imaging systems. As biologists are using more advanced instruments they are learning more, and as soon as they reach a technological threshold they want to image deeper, they want to collect images more rapidly, they want to see more spatial detail and they want to use the latest fluorophores and photoproteins. At the same time, increases in computational power are making storage and analysis of big data sets possible – it’s a cyclical process that drives the technology development.
Could you give us an example of how this has worked in one of your projects?
The development of the so-called “Mesolens” is a good example. This project was initiated in the mid-1980s by Brad Amos, one of the developers of the point-scanning confocal microscope. Although this instrument can produce 3D images of cells with sub-cellular resolution, and without needing to section the tissue mechanically, he noticed that whenever biologists gave him a large volume of tissue, they were inevitably disappointed with the resolution. So he set about trying to find an optical solution to this problem – how could we accommodate large volumes of tissue without compromising the spatial resolution? I started working on the project around 2009–2010. A prototype of what we now call the Mesolens had been funded by the Medical Research Council (MRC), and Brad had done some preliminary tests, but he was approaching retirement and the MRC rules are that when you retire, you no longer have lab space. After some negotiations, we were able to move the whole project to Strathclyde, and since then we have developed a confocal version of the Mesolens that makes it possible to image up to 120 mm³ of tissue with sub-cellular detail throughout.
This has taken us in directions that we never expected because it’s such a departure from a standard confocal microscope. Initially we thought that the Mouse Atlas community (which is developing a 3D computer model of successive stages of mouse embryo development) would be a key user, and that drove some of the lens design parameters. For example, a mouse embryo at day 12.5 is around 6 mm long and that is why the Mesolens has a 6 mm image field. But while the Mouse Atlas community is engaging with us, we are also seeing applications emerging that we did not expect.
I understand there’s now a Mesolens spin-out company.
Yes, but I have made a conscious decision to not become a part of the company at this time. This is partly because I don’t want to compromise the prospect of academic funding routes, but mostly because I want to focus on the scientific development rather than get diverted into financial management and other technical aspects of the business. I’m a bit of a card-carrying academic, and because my basic training is in physics, I also have a remaining curiosity around what new physics can emerge while we are studying these biomedical processes.
What’s the most frustrating problem in developing the Mesolens?
One of the difficulties we had to overcome was mirror jitter, which wasn’t something we expected to be a show-stopper.
If you take an image with your mobile phone you have maybe 10 megapixels, but for full Nyquist sampling – using the Mesolens to obtain all the available information – we need a 400 megapixel camera. At the moment we have the next best thing, which is a 260 megapixel effective camera where we obtain images by moving the sensor within the body of the camera. But what we really want is to scan a laser beam across 20,000 by 20,000 pixels, and that means we need to accurately sweep the laser spot across many points within our 6 mm image field. At Photonics West we met a company called Nutfield Technology that sells scanning mirrors to the digital video industry (particularly Pixar) so they can project digital images onto a large cinema screen while ensuring there are no gaps.
We were able to work with them and now we have mirrors that can scan our laser spot as accurately as we need to get reproducible data.
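The 400 megapixel figure follows directly from Nyquist sampling of the 6 mm field. A quick sketch, assuming a smallest resolvable feature of about 0.6 μm (an illustrative value consistent with the quoted pixel counts, not a published specification):

field_um = 6000.0                 # the Mesolens's 6 mm image field
feature_um = 0.6                  # assumed smallest resolvable feature
pixel_um = feature_um / 2         # Nyquist: at least two pixels per feature

pixels_per_side = field_um / pixel_um
megapixels = pixels_per_side ** 2 / 1e6
print(f"{pixels_per_side:.0f} x {pixels_per_side:.0f} pixels = {megapixels:.0f} MP")
# -> 20000 x 20000 pixels = 400 MP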
Where do you see the field going in the future?
Biomedical research is generating vast data sets. We have excellent microscopes and the biologists are desperately keen to use them to see ever more detail. This might mean using super-resolution techniques to resolve very fine spatial detail, or making very fast recordings of physiological behaviour in intact organs. Either way, it inevitably leads to giant data volumes. So we can generate the data, but are we adequately placed to analyse it? Can we retrieve meaningful measurements and perform appropriate statistical analysis of these data sets that we can now generate? I see a great drive towards every biomedical imaging centre needing a computational biology expert around to help unpack these data volumes so they can provide the biomedical research community with the information that they actually want, rather than what the physicists think they want.
In some ways this is a “luxury” problem because we have these fabulous new tools, and we just need to make sure we are using them to the best of our ability. For example, we were using the Mesolens to image some mouse lung tissue recently, and that generated a 200 GB data file. We can reconstruct that dataset in 3D, but if, for example, we want to count the number of cells within the tissue, that takes quite a lot of computational power. We are starting to become smarter as a community about using computational methods to retrieve meaningful information from these very large datasets, but the datasets are only going to get bigger. I also think that biologists are becoming more keen on quantitative information. Rather than just generating a pretty image, they now want numerical values, and they want to make measurements from within their datasets. This is great for physicists. We can help, through collaboration and interdisciplinary research, to push it along.
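For a flavour of what such analysis involves, here is a minimal sketch of one common way to count cell nuclei in a 3D fluorescence volume: smooth, threshold, then label connected components. The array and parameters are illustrative stand-ins; a real ~200 GB Mesolens volume would be processed in chunks rather than loaded whole.

import numpy as np
from scipy import ndimage

volume = np.random.rand(64, 256, 256)  # stand-in for a (z, y, x) image stack

smoothed = ndimage.gaussian_filter(volume, sigma=2)
mask = smoothed > smoothed.mean() + 2 * smoothed.std()  # crude global threshold
labels, n_cells = ndimage.label(mask)                   # connected components
print(f"counted {n_cells} candidate cells")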
Politicians have long misused, misquoted and misinterpreted science to suit their agenda. From Ronald Reagan, the 40th president of the US, first uttering the infamous words “I’m not a scientist and I don’t know the figures, but I have a suspicion…” in the 1980s, to today’s catchphrase of “alternative facts”, scientific results are often used and abused. In the current political climate, you may find that author and science journalist Dave Levitan’s Not A Scientist: How Politicians Mistake, Misrepresent and Utterly Mangle Science attracts you with the same compulsion that pulls moths towards flames.
The book’s friendly yellow cover promises “an eye-opening tour of the political tricks that subvert scientific progress”. Although Levitan mostly focuses on politics in the US, you will find the ploys he describes are universal. Some will horrify you with their familiarity and relevance – Reagan’s quote above, taken from a speech that concerned acid rain, is a case in point. His suspicion that “one little mountain out there [Mount St Helens] has probably released more sulphur dioxide into the atmosphere of the world than has been released in the last 10 years of automobile driving” disregarded facts and scientific research in one cavalier swoop. As Levitan points out, “simply saying you’re not an expert is not an introduction for trying to act like one”. When a president rates gut feeling higher than evidence, confidence in science is eroded, scientists are marginalized and populism grows. At this point you may be uncomfortably reminded of then UK secretary of state for justice Michael Gove’s infamous comment that “people in this country have had enough of experts”, in the run-up to last year’s referendum on the UK’s membership of the European Union.
Levitan begins by categorizing the many rhetorical tricks that politicians use to spread misinformation about science. Some are subtle – oversimplifying a concept and getting it wrong, cherry-picking a fact and ignoring the bigger picture, or even ignoring more recent data and findings. Some tricks are more insidious: sourcing blogs or reports of dubious provenance instead of peer-reviewed science; ridiculing, dismissing or ignoring scientific research; wilfully misinterpreting the notion of uncertainty. Global warming seems a particular target. We read about a US senator producing snowballs in the Senate, claiming their mere existence as proof that global warming can’t exist. Another senator deliberately interprets climate data out of context to tell “global warming alarmists…the satellite data show it ain’t happening”. Another deploys their own version of statistics to argue against measures to combat climate change.
But certain other tricks are outrageous. In an “impressive example of fearmongering”, Levitan describes how Alabama congressman Mo Brooks declared that “our kids just aren’t prepared for a lot of the diseases that come in and are borne by illegal aliens”. Most countries in central and southern America have higher vaccination rates than the US, but that didn’t stop Brooks spotting the potential political capital to be gained by criticizing immigrants. Levitan draws a link between misrepresentation of scientific evidence, a fear of immigrants and the diseases they carry, and policies that exploit this fear such as current US president Donald Trump’s proposed border wall.
If you’re not somewhat depressed by this point in the book, you haven’t met the trick of straight-up fabrication and “the unverifiable story” that crops up in anti-vaccination debates in order to sway public opinion. You might dismiss Trump as an uninformed source (“a beautiful child went to have the vaccine, and came back, and a week later got a tremendous fever. Got very, very sick, now is autistic”), but it’s more troubling to read of Senator Rand Paul (“an actual doctor”) stating “I have heard of many tragic cases of walking, talking, normal children who wound up with profound mental disorders after vaccines.” There is no scientific basis for these statements.
With fake news and alternative facts seemingly everywhere, it is vital that this type of political manipulation be recognized. Luckily, Levitan is on hand to help identify and combat these tricks. Each illustration of shady political practice is accompanied by the relevant science to place it in context. Each chapter ends with advice on how to recognize a rhetorical technique and combat it. You might think this isn’t rocket science, but some techniques are remarkably sly and subtle, and require homework to prepare for them, and a good overview of the relevant literature to refute them.
You may read much of this book with your head in your hands, praying for the future of humanity, but don’t worry. Levitan is an entertaining guide whose language is lively (“if this isn’t enough wrongness for you, it gets worse”), liberally sprinkled with italics for even more emphasis, and often a little leading (“let’s claim a new term – climate TOADS – Those who Oppose Action/Deniers/Skeptics”). There are limitations to his treatment, however. Levitan does not analyse political motivations to misuse science or examine the wider use of these techniques in debate. He also does not consider the depressing idea that there may be deeper anti-science trends in society.
There is also not enough space for Levitan to fully describe some of the complex areas of scientific research used as examples (although he backs up his statements with an extensive set of notes) – it’s just not that sort of book. Instead, it is a snappy catalogue of a selection of ways in which science is misused for political gain. Levitan states at the start that this is the “unfortunate reality” of American political life, rather than a political bias. Whether it is or not, the lessons are valid everywhere and it won’t take you long to think of examples in UK political life.
If it amazes you that Levitan manages to remain so upbeat and positive through the book, read his more measured and urgent foreword. Trump’s election, after the book was written, is a “terrifying state of affairs”, and his misuses of science would form part of the collection “if only he were better at making them”. Levitan writes that our best antidote to “misinformation, deception and backwardness” is vigilance. Read this book. It’s not just timely, it could save your future.
An octopus-inspired adhesive surface that works in wet and dry conditions has been created by researchers in South Korea. They believe their suction cups could find a range of uses including the manufacture of silicon chips and wound healing. The research also provides a better understanding of why the suction cups of the common octopus are so good at grabbing hold of surfaces.
The limbs of octopuses are covered with suction cups, which the animals use to pull themselves along and catch prey. The suction cups of the common octopus are particularly interesting, containing a central bump whose function had previously been unknown.
Adhesives are required in all sorts of different applications, from keeping packages shut to closing wounds after major surgery. However, glue can lose its stickiness after a while, and can leave chemical contamination on surfaces. Imaginative alternatives using charged polymers, nanoparticle solutions and systems inspired by various living creatures have been tested, but they can be difficult to manufacture. As a result, creating contamination-free, easily manufactured adhesives that stick repeatedly to wet and dry surfaces of variable geometries remains a challenge.
Polymer moulds
In the new research, Changhyun Pang and colleagues at Sungkyunkwan University in South Korea fabricated polymer surfaces covered with thousands of miniature replicas of the octopus suction cups – complete with central bumps. The team made several versions of the surfaces with suction cups of different diameters in the 15–500 μm range. The structures are easily produced by pouring a polymer into a mould.
The researchers tested their film’s adhesiveness to a silicon wafer – and compared the results with the adhesiveness of films covered with similarly sized cylindrical holes, solid cylindrical pillars and hollow cylindrical pillars. They found the octopus-like structures were significantly more effective than the others in damp conditions or when fully submerged in water. Indeed, strong adhesion was even achieved in silicone oil, which is used as a lubricant.
When one of the suction cups is gently pressed into a surface in the presence of liquid, it contracts, pushing some of the fluid out. When the cup expands again, the pressure inside is lower than outside, creating suction. If more force is applied, the octopus-inspired dome-like protuberance in the middle of the suction cup flattens, expanding sideways until it touches the edges of the cup. This divides the cup into upper and lower chambers. Capillary forces draw the liquid into the upper chamber. When the pressure is released, cohesive forces between the liquid molecules in the upper chamber keep the protuberance flattened, holding the upper chamber shut and creating extremely low pressure in the lower chamber.
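For a feel of the forces involved, the pull of a single cup is bounded by the pressure difference across it multiplied by its area. A rough estimate with illustrative numbers (a 100 μm-diameter cup and a near-vacuum lower chamber; these are not measured values from the study):

\[
F \approx \Delta P \,\pi r^{2} \approx (10^{5}\ \mathrm{Pa}) \times \pi \times (50\times10^{-6}\ \mathrm{m})^{2} \approx 0.8\ \mathrm{mN},
\]

so a film carrying thousands of such cups per square centimetre could, in this idealized limit, support forces of the order of newtons per square centimetre. Real adhesion is lower because the chambers never reach a full vacuum and not every cup seals.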
Fluorescent water
This two-stage suction process could explain why the octopus-inspired surface attaches to the wafer much more strongly if the two surfaces are pressed together harder, whereas surfaces covered with the other structures often show little or no increase in suction. The researchers confirmed their model by using confocal microscopy to study the movement of fluorescently labelled water within the suction cups.
The ability to hold a silicon wafer could be useful in the semiconductor industry, where wafers have to be transported precisely and repeatedly in both dry and wet conditions – an expensive process that can also damage the wafers. Pang and colleagues have shown that their adhesive surface can do the job in a repetitive industrial setting because it shows no loss of suction after more than 10,000 cycles of attachment and detachment. They also showed that saline-loaded patches can stick to mouse skin and assist wound healing, although less effectively than the standard current medical dressing: “Our group is currently investigating stem-cell- and drug-loading approaches to improve these patches’ practical utility,” says Pang.
Rough surfaces
Solid and structural mechanics expert Nicola Pugno of the University of Trento in Italy believes that the work is significant. He points out that while the adhesiveness of a large conventional suction cup is comparable to, or slightly higher than, that of the new surface, large suction cups cannot attach to very small or rough surfaces.
He also says that producing a film embedded with microscopic conventional suction cups would be very difficult: “A suction cup seems to be simple but, as a geometry, it’s quite complex: it’s tapered, for example.” “Perhaps the most interesting aspect is that they were able to produce in large area these microscopic, strange suction cups with the peculiar geometry of the octopus suction cups. When you understand all the physics and optimise everything, perhaps you can achieve an order of magnitude increase in adhesion strength.”
The Advanced Research Projects Agency-Energy (ARPA-E) has the ability to make significant contributions to energy research but must be allowed time to do so, according to a report by the US National Academies of Sciences, Engineering and Medicine. The academies’ report – An Assessment of ARPA-E – says that the agency is making progress towards its goals but cannot be expected to have fulfilled them yet, given that new energy technologies require decades of effort. Operated by the US Department of Energy, ARPA-E was created in 2009 to fund high-risk, high-reward research. The report finds that a quarter of supported projects have received follow-on funding, while around half have published their results in peer-reviewed journals, with 13% receiving patents. The 18-strong committee that wrote the report recommends that ARPA-E management now develop a way of measuring and assessing the agency’s impact to demonstrate its value. The report comes just weeks after US president Donald Trump included no money for the agency in his administration’s 2018 budget request.
New evidence says the Sun had a runaway twin
Infrared image from the Hubble Space Telescope of the Perseus molecular cloud. (Courtesy: NASA, ESA and J Muzerolle, STScI)
The Sun likely had a non-identical twin that escaped into the depths of the Milky Way a long time ago. This is according to Sarah Sadavoy of the Harvard-Smithsonian Center for Astrophysics in the US and Steven Stahler from the University of California, Berkeley, who have theorized that all Sun-like stars are born in pairs. The idea that the Sun once had a twin is not new. The long-lost sibling is called Nemesis and may have kicked out the asteroid that exterminated the dinosaurs. However, astronomers have struggled to find evidence to support the existence of Nemesis. Now, Sadavoy and Stahler have run a series of statistical models to try to find one that fits the populations of single and binary young stars seen within the Perseus giant molecular cloud during a radio survey. “The only model that could reproduce the data was one in which all stars form initially as wide binaries,” says Stahler, where “wide” refers to a separation of at least 500 AU – 17 times the distance between the Sun and Neptune. “These systems then either shrink or break apart within a million years.” In Sadavoy and Stahler’s model, stars with masses similar to the Sun start as wide binaries within egg-shaped star-forming cores. “As the egg contracts, the densest part will be toward the middle, and that forms two concentrations of density along the middle axis,” Stahler explains. “These centres of higher density at some point collapse in on themselves because of their self-gravity to form stars.” In 60% of cases the stars then separate, but in the remaining 40% the pairs go on to form tight binaries. The work is presented in Monthly Notices of the Royal Astronomical Society.
Laser lift-off boosts chemical analysis of fingerprints
Image showing a laser spot on a fingerprint
A new way to analyse the chemical composition of fingerprints has been developed by researchers at Louisiana State University in the US. The system uses an infrared laser to vaporize material that is left behind when a person touches a surface. While this might sound like a violent process, the laser wavelength was carefully selected to heat up water molecules while minimizing damage to biological molecules such as DNA. Material is removed from a region about 0.3 mm in diameter – leaving most of the fingerprint intact – and the vapour is sucked into a filter that captures the molecules of interest. The contents of the filter can then be analysed using a number of different techniques, including mass spectrometry. The system was developed by Fabrizio Donnarumma, Eden Camp and colleagues, who say that it could also be used to work out if a person had been handling explosive materials. Indeed, the technique was able to identify a range of substances including caffeine, antiseptic cream and TNT using mass spectrometry. The team is now working with industry and police agencies to come up with ways to use the portable system to analyse chemical signatures left at crime scenes. The research is described in the Journal of the American Society for Mass Spectrometry.
Water waves can gain energy when they scatter from a whirlpool-like vortex. That is the conclusion of physicists in Brazil, Canada and the UK, who are the first to observe a phenomenon called “rotational superradiant scattering”. The team says that the effect could be used to study black-hole physics.
Silke Weinfurtner and colleagues at the University of Nottingham, Universidade Federal do ABC and University of British Columbia observed the effect using a rectangular water tank 1.5 m long and 3 m wide. Water is pumped continuously into one corner of the tank and allowed to drain through a 4 cm-diameter hole in the middle of the tank – creating a familiar draining vortex.
Controlling the flow so that the water is always about 6.25 cm deep, the team then generated plane waves from one side of the tank with excitation frequencies in the 2.9–4.1 Hz range. An “absorption beach” is located on the opposite side of the tank to minimize the effect of the waves reflecting back into the tank.
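For context on the wave regime being probed, the linear dispersion relation ω² = gk tanh(kh) fixes the wavelengths at these frequencies. A quick numerical check in Python (illustrative, using the quoted 6.25 cm depth):

import math

g, h = 9.81, 0.0625  # gravity (m/s^2) and quoted water depth (m)

def wavelength_m(freq_hz):
    w = 2 * math.pi * freq_hz
    k = w ** 2 / g                           # deep-water starting guess
    for _ in range(100):                     # fixed-point iteration on the
        k = w ** 2 / (g * math.tanh(k * h))  # dispersion relation
    return 2 * math.pi / k

for f in (2.9, 3.7, 4.1):
    print(f"{f} Hz: wavelength ~ {wavelength_m(f) * 100:.0f} cm")
# ~18 cm at 2.9 Hz down to ~9 cm at 4.1 Hz -- centimetre-scale waves,
# a few times the size of the 4 cm drain hole.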
Simple bathtub
Using a special 3D sensor developed by the team, the researchers were able to precisely measure the height of the waves passing through the swirling vortex at the centre of the tank. According to the team, images of the waves at various excitation frequencies agree with “simple bathtub flow models”. Furthermore, the plane waves undergo an angular phase shift upon scattering, analogous to the Aharonov–Bohm effect in quantum mechanics.
The team also looked at how the waves interact with the vortex by expressing the incident waves in terms of azimuthal components relative to the rotational direction of the water. Azimuthal wave components rotating in the opposite direction to the vortex were found to have lower energy upon scattering, which the team says is the result of energy being absorbed by the vortex hole. However, azimuthal wave components rotating in the same direction as the vortex gained energy as a result of superradiant scattering, with a maximum amplification of about 14% for waves at 3.7 Hz.
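This asymmetry is the hallmark of rotational superradiance. Schematically, for a wave of angular frequency ω and azimuthal number m scattering off a flow rotating with angular velocity Ω, the textbook condition for amplification (not a detail specific to this experiment) is

\[
0 < \omega < m\,\Omega \quad \Longrightarrow \quad |R|^{2} > 1,
\]

where \(|R|^{2}\) is the ratio of reflected to incident wave energy. The same condition governs the amplification of waves by rotating black holes.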
The team points out that exactly how energy is lost to the vortex is not known and this could be the subject of future experiments that try to measure the amount of wave energy that goes down the drain. These experiments could provide insights into black holes, which can be modelled in terms of waves interacting with water swirling out of a tank.
Glass eels have a magnetic compass linked to the tides
European eels could navigate tricky coastal waters using the Earth’s magnetic field, according to researchers in the US and Norway. The eels begin their lives in the Sargasso Sea and then migrate thousands of kilometres to the coasts of Europe and North Africa, where they enter rivers and spend their adult lives in fresh and brackish water. Scientists believe that much of this journey is made in the larval stage by hitching a ride with the Gulf Stream current that travels from the Sargasso Sea to Northern Europe. When they reach the European continental shelf, the larvae become “glass eels”, which peel off the current and head for the coast. European eels have recently suffered a large decline in numbers and how the glass eels find their final destinations while being buffeted by tides and currents is of great scientific interest. Now, researchers at the University of Miami and the Norwegian Institute of Marine Research have shown that the eels use Earth’s magnetic field to determine which direction to swim. In a Norwegian fjord, they placed locally caught glass eels in a mesh container that was suspended from a surface float. A camera recorded the behaviour of the eels and the team found that they tended to swim in a southerly direction at ebb tides. The same glass eels were then placed in a specially designed tank in which they were shielded from external stimuli such as daylight. Magnetic fields were applied to the tank, which effectively rotated the magnetic north–south axis by 90°. During periods of ebb tides, the team found that the eels oriented themselves in a southerly direction as defined by the applied magnetic field. This led them to conclude that the glass eels navigate using Earth’s magnetic field in a manner that is linked to the local cycle of tides. This, they believe, could help the eels to use the tides to reach freshwater in rivers. “It is incredible that these small transparent glass eels can detect the Earth’s magnetic field,” says Miami’s Alessandro Cresci. “The use of a magnetic compass could be a key component underlying the amazing migration of these animals,” he adds. “It is also the first observation of glass eels keeping a compass as they swim in shelf waters, and that alone is an exciting discovery.” The study is described in Science Advances.
Ancient alignment of giant galaxies puzzles astronomers
Mysterious alignment: massive galaxies have been aligned with surroundings for at least 10 billion years. (Courtesy: ESA / Hubble, NASA, HST Frontier Fields)
Massive galaxies became aligned with their surroundings before the universe reached a third of its current age, astronomers report. The orientation of a galaxy is one of the easiest properties to observe. While most galaxies are randomly orientated with respect to their surroundings, some have a preferred alignment – this particularly applies to giant elliptical galaxies at the centres of rich galaxy clusters. Why these massive galaxies have a preferred orientation, however, remains a mystery to astronomers. Michael West of Lowell Observatory in the US and collaborators therefore used the Hubble Space Telescope to observe 65 distant galaxy clusters. The team found that the galaxies have been aligned for at least 10 billion years – the earliest the phenomenon has been seen. “It’s an important new piece of the puzzle,” says West, “because it says that whatever caused these alignments happened early.” However, it is still unclear how the alignment occurred, and the team suggest several possible theories in their paper in Nature Astronomy. One theory is that the galaxies aligned with the surrounding matter at the time of formation, while another attributes the alignment to the gravitational pull of neighbouring galaxies slowly orientating the largest.
Research productivity rises in Spain despite resource decline
Universities in Spain have seen a rise in scientific productivity and impact despite a decrease in resources and researchers. This is according to a report by the IUNE Observatory, run by the 4U Alliance. The report considers the research activity at 79 public and private Spanish universities between 2006 and 2015. IUNE found that funding for research and development diminished by 19% between 2008 and 2014. The number of scientific researchers also declined, by 9.1% between 2010 and 2015, although the number of professors slightly increased. Despite these falling resources, the average number of publications per professor per year increased from 0.46 in 2006 to 0.83 in 2015. IUNE also reports an increase in research visibility and impact: the percentage of publications in the top 25% of journals based on impact factor rose from 49% to 53%. International scientific collaboration also saw a rise, with the US, UK and Germany being the top collaborators. The fields of physics and astronomy are cited as the main areas where international collaboration has increased, due to projects such as those at CERN.