Peritoneal cancer metastasis is a common and severe disease. Fortunately, advances in chemotherapy have resulted in encouraging treatment outcomes. However, these therapies still present major limitations such as poor tissue penetration, and current preclinical models fail to reproduce the disease conditions.
Thus, preclinical models that faithfully mimic peritoneal metastasis are needed to assess the efficacy of new therapies. To this end, Olivier De Wever and his team at Ghent University have developed a novel 3D model that combines 3D printing, multicellular cell culture and in vivo implantation. This model replicates peritoneal metastasis more faithfully than current models (Biomaterials 158 95).
Metastasis model fabrication
Printing a tumour model
One limitation of current models is their failure to reproduce the dimensions and mechanical properties of the tumour. With 3D printing, however, followed by surface modifications such as collagen coating and UV treatment, the properties of the printed scaffold can be tuned as required. In particular, the researchers 3D printed polylactic acid (PLA) scaffolds and modified their surfaces with plasma, gelatin and UV treatments.
Using this process, they created a scaffold with the appropriate size, porosity, and mechanical and biochemical properties to replicate metastasis conditions and promote a population of relevant cancer and interacting cells. This resulted in a more clinically relevant model than achieved in previous studies.
Cell viability and spheroid formation
Replicating tumour complexity
Solid tumours include different cell populations that interact among themselves. Such interactions play a crucial role in the fate of the tumour. Thus, they must be present to achieve a relevant model. In this study, the researchers combined cancer-associated fibroblasts (CAF) with tumour cell lines. The combination of these cell populations enabled the formation of tumour structures in vitro (spheroids), demonstrating the importance of the presence of CAF in forming a representative tumour model and providing an additional advantage over previous models.
To further assess whether these modelled constructs can reproduce metastasis conditions in vivo and serve as a representative model, the researchers implanted them in the peritoneal cavity of mice. After 11 weeks, they observed that the implanted models presented a heterogeneous cell population, including the seeded CAF and proliferating cancer cells, as well as host cells such as immune, adipose and epithelial cells.
Interestingly, these constructs also promoted the formation of blood vessels. This is thought to be dependent on the CAF-cancer cell interaction, which is able to produce specific signals that promote neovascularization. In addition, comparing the implanted constructs, a traditional metastasis model and a patient-extracted tumour revealed that the immune response and histological organization of the construct was similar to that found in the patient tumour, in contrast to the traditional model.
A step closer
Overall, the research team succeeded in developing a more representative preclinical model of peritoneal metastasis, by replicating more closely characteristics such as size, heterogeneous cell population, blood vessel formation, immune response and histological resemblance to human tumours. This study represents a step forward in the development of preclinical models to assess drug efficacy and penetrability, a major limitation in the field of peritoneal metastases. Furthermore, it provides a basis to advance screening technologies for the development of new therapies.
Histology comparison
Coral reefs have always lived near the edge. Now, thanks to global warming, life there is five times more precarious.
Forty years ago, the world’s coral reefs faced a known risk: every 25 or 30 years, ocean temperatures would rise to intolerable levels.
Corals would minimise the risk of death by evicting the algae with which they lived in symbiotic partnership: that is, the reef animals would avoid death by getting rid of the algae, deliberately weakening themselves.
This response is known as bleaching, and it can have a catastrophic effect on other life on the reef. In the Pacific such episodes were sometimes linked to cycles of ocean warming known as an El Niño event.
By 2018 the odds had altered. Coral reefs now face this hazard every six years. That is, in four decades of global warming and climate change, the risks have multiplied fivefold.
Bleaching breaks out
“Before the 1980s, mass bleaching of corals was unheard of, even during strong El Niño conditions, but now repeated bouts of regional-scale bleaching and mass mortality of corals has become the new normal around the world as temperatures continue to rise,” said Terry Hughes, who directs Australia’s Centre of Excellence for Coral Reef Studies at James Cook University.
He and colleagues report in the journal Science that they analysed data from bleaching events at 100 locations around the planet between 1980 and 2016. Bleaching events are a fact of life for corals: these little creatures tend to live best in temperatures near the upper limit of their tolerance levels, and respond to extreme events by rejecting the algae that normally provide the nutrients they need.
But as global air temperatures have increased, in response to profligate burning of fossil fuels that increase greenhouse gas levels in the atmosphere, so have sea temperatures. And Professor Hughes and his team report that in the last two years more than a third of all bleaching events have been “severe,” extending over hundreds of kilometres.
When they measured the growth of risk over the decades, they found that the bleaching hazard had increased by 4% per year since 1980.
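As a rough consistency check (not a calculation from the study itself), compounding the reported 4% annual increase over the 36 years of the record gives roughly the four-to-fivefold change in risk described above:

```python
# Rough consistency check using figures quoted above (illustrative only).
annual_growth = 0.04           # reported ~4% per year rise in bleaching risk
years = 2016 - 1980            # span of the analysed record

risk_multiplier = (1 + annual_growth) ** years
print(f"Risk multiplier over {years} years: {risk_multiplier:.1f}x")    # ~4.1x

# If risk scales inversely with the return interval between bleaching events:
interval_1980s = 27.5          # midpoint of the 25-30 year interval cited
print(f"Implied return interval today: {interval_1980s / risk_multiplier:.0f} years")
```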
The finding should be no surprise. In 2015, during a severe El Niño event, scientists began to record cases of coral death. In 2016, they observed that 93% of Australia’s Great Barrier Reef had been affected. In 2017 they found that reefs in the western Pacific and Indian Oceans had been damaged beyond repair, and a separate set of calculations has warned that by 2100, up to 99% of the world’s coral colonies could be at risk of bleaching every year.
Reefs can recover, but this recovery can take as long as a decade. Coral reefs are among the planet’s richest habitats, and the death of a reef puts many ocean species at risk: it also damages local commercial fish catches and local tourist industries.
“Reefs have entered a distinctive human-dominated era – the Anthropocene,” said Mark Eakin of the US National Oceanic and Atmospheric Administration, a co-author. “The climate has warmed rapidly in the past 50 years, first making El Niños dangerous for corals, and now we’re seeing the emergence of bleaching in every hot summer.”
And Professor Hughes said: “We hope our stark results will help spur on the stronger action needed to reduce greenhouse gases in Australia, the United States and elsewhere.” – Climate News Network
For more than 20 years, physicists have been unable to explain why two types of experiment yield different values for the lifetime of the neutron. One or more unknown systematic errors biasing the results is a possibility. But now a pair of particle theorists in the US have come up with an alternative explanation: that occasionally neutrons decay to a previously unknown particle which might account for the universe’s dark matter. They say that such a particle could leave a very distinctive signature in nuclear physics detectors.
When isolated, neutrons decay in around 15 min. They do so via beta decay, which involves a neutron transforming into a proton, an electron and an electron antineutrino. Conservation of energy, charge, angular momentum and other quantum numbers dictates that this is the only way that neutrons can decay within the Standard Model of particle physics.
To measure the average neutron lifetime precisely, physicists employ two basic techniques. One is to house neutrons within a container, known as a bottle, and simply count how many of them remain after a fixed interval of time. The other approach is to fire a neutron beam with a known intensity through an electromagnetic trap and measure how many protons emerge in a given time.
Growing disagreement
Ongoing since the early 1990s, the two types of experiment yield results that remain at odds with one another. While the bottle method tells us that neutrons decay after about 880 s on average, the beam tests put the figure around eight seconds higher – at 888 s. The difference is significant because it can’t be accounted for through either statistical or known systematic uncertainties. Until 2013, the discrepancy amounted to 2.9σ. Then, following improvements to the world’s leading beam experiment, the mismatch hardened – rising to 3.8σ.
In the latest work, Bartosz Fornal and Benjamin Grinstein at the University of California, San Diego propose that the anomaly could be a sign of dark matter. The idea is that while most neutrons disappear via beta decay, a small fraction (about 1%) would instead decay to a “dark sector” particle – a process that would violate the conservation of baryon number. While bottle experiments would measure both beta and dark decay, beam experiments can only detect beta decay. As a result beam experiments would overestimate the neutron’s lifetime.
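The roughly 1% branching fraction follows directly from the two lifetimes quoted above; a minimal sketch using the rounded values of 880 s and 888 s:

```python
# Estimating the dark-decay branching fraction from the two measured lifetimes.
# Bottle experiments see the total decay rate; beam experiments see only the
# beta (proton-producing) channel.
tau_bottle = 880.0   # s, total neutron lifetime
tau_beam   = 888.0   # s, lifetime inferred from beta decays alone

rate_total = 1.0 / tau_bottle
rate_beta  = 1.0 / tau_beam
branching_dark = 1.0 - rate_beta / rate_total   # fraction of decays to the dark channel

print(f"Dark branching fraction ~ {branching_dark:.1%}")   # ~0.9%, i.e. about 1%
```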
The new proposal does not feature a unique dark particle with specific properties. But Fornal and Grinstein have shown that several candidate particles are consistent with existing experimental results. They have also shown that some of the possible decay routes generate clear experimental signatures. These include the neutron decaying into a dark particle, plus either an electron–positron pair or a photon, with the energy available for these accompanying particles limited by the narrow range of allowed masses for the dark particle.
Lighter than a proton
The dark particle’s mass must exceed that of beryllium-9 minus that of beryllium-8: were it any lighter, a neutron inside beryllium-9 could undergo the dark decay, turning the nucleus into beryllium-8 plus the dark particle – yet beryllium-9 is known to be stable. At the same time, the mystery particle must be lighter than the neutron if the neutron is to decay into it. In fact, if the particle is indeed the dark matter that has shaped the evolution of the universe then it must obey a slightly tighter constraint – it must be lighter than the proton and electron combined, otherwise dark matter would be unstable, which observations tell us it isn’t.
These constraints imply a dark-matter particle with a mass between 937.9 and 938.8 MeV. Given that the neutron weighs in at 939.6 MeV, an accompanying photon would have to have an energy between about 0.8 and 1.7 MeV.
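Those numbers are simple mass differences; the sketch below reproduces them from standard (rounded) particle masses in MeV.

```python
# Reproducing the quoted mass window and photon energy range (rounded values, MeV).
m_neutron        = 939.6
m_proton         = 938.3
m_electron       = 0.511
m_Be9_minus_Be8  = 937.9   # lower bound set by the stability of beryllium-9

m_dark_min = m_Be9_minus_Be8         # any lighter and beryllium-9 could decay
m_dark_max = m_proton + m_electron   # any heavier and dark matter would be unstable
print(f"Allowed dark-particle mass: {m_dark_min:.1f}-{m_dark_max:.1f} MeV")

# For the decay n -> dark particle + photon, the photon carries the mass difference
print(f"Photon energy: {m_neutron - m_dark_max:.1f}-{m_neutron - m_dark_min:.1f} MeV")
```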
Fornal says that such photons could potentially be seen in many nuclear physics experiments currently operating, but would only be visible once the immense amounts of background noise are filtered out (the photon having no detectable particle companion). He adds, however, that some experimentalists are devising data analysis techniques to try to remove the noise.
Striking experimental signatures
For Susan Gardner, a theorist at the University of Kentucky in the US, the newly postulated dark matter is “feasible” and cannot be excluded by any existing data. “It is particularly exciting,” she adds, “that some of the suggested scenarios have striking experimental signatures.”
In fact, two collaborations at the Los Alamos National Laboratory in New Mexico – UCNA and UCNtau – are currently searching for the photon (gamma-ray) and electron–positron signals within data from neutron decays. “Data are in hand and analyses are under way,” says UCNA team member Peter Geltenbort of the Institut Laue-Langevin in France.
Ben Rybolt of Kennesaw State University in the US describes the latest work as “a reasonable approach” to resolving the neutron anomaly, having himself worked on possible experimental signatures of a rival exotic solution – that ordinary neutrons can sometimes oscillate into non-Standard Model “mirror neutrons”. If upcoming, improved measurements of the neutron lifetime – involving magnetic bottles and monitoring of electron emission from a neutron beam – fail to uncover any hidden systematic errors, he says “there will be even more reason to look for exotic solutions”.
A preprint describing the research is on the arXiv server.
Increasingly, radiation oncology clinics are favouring small, mobile kilovoltage X-ray tubes over radioactive sources to deliver intra-operative radiotherapy. The devices are used to irradiate surgical cavities following the excision of tumours from sites such as the breast. Short-range radiation minimizes the exposure of nearby healthy tissue, enabling large doses to be delivered in a single shot.
Although such so-called electronic brachytherapy sources were introduced into clinical practice within the last 20 years, their dosimetry is still under development. Currently, manufacturers typically use their own protocols, making comparisons of doses delivered by different devices problematic.
“Having an accurate methodology for independently verifying the dosimetry of these devices is crucial for performing dose comparison studies, or for planning treatments that combine miniature X-ray sources with other radiation modalities,” says Peter Watson, a PhD candidate at McGill University in Montréal, Canada.
In a new study, Watson and co-authors Marija Popovic and Jan Seuntjens developed a formalism for the absolute dosimetry of therapeutic kilovoltage sources, applying it to the INTRABEAM system manufactured by Carl Zeiss Meditec (Phys. Med. Biol. 63 015016).
Calibration check
The INTRABEAM system produces an isotropic distribution of 50 kVp bremsstrahlung and characteristic X-rays using a gold target. The manufacturers provide a factory calibration that enables reproducible doses to be delivered for treatments, correcting for fluctuations in output using daily quality assurance (QA) measurements. System output is also checked annually by the manufacturers.
“This dose is ‘correct’ in that it has been shown to be empirically safe and effective in treatments,” explained Watson. “However, exactly how closely this represents the actual physical dose, by definition of absorbed dose to water, is an open question.”
As an optional QA tool, dose delivered by the INTRABEAM system can be measured using a manufacturer-supplied water phantom and a thin window parallel plate ionization chamber. A depth-dependent correction function supplied by the system manufacturer enables comparison of the doses with system calibration data.
However, information on the correction function is “limited”, and it is unclear whether, or how, the factory measurements or those made with the water phantom are traceable to an absorbed-dose standard. Consequently, Watson and his co-authors set out to assess doses measured by the QA tool against their formalism.
In the new approach, absolute dose to water in the clinical beam is calculated using CQ, a depth-dependent ionization chamber conversion factor. The factor converts the chamber air kerma calibration coefficient, measured in a reference beam, to an absorbed dose to water coefficient for the INTRABEAM photon spectrum in water.
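In general terms, a conversion factor of this kind is applied multiplicatively: the corrected chamber reading and its air-kerma calibration coefficient yield dose to water once CQ is known. The sketch below illustrates that relationship with hypothetical numbers; it is a simplified illustration, not the authors' full formalism.

```python
# Minimal sketch of applying a depth-dependent chamber conversion factor.
# All numerical values are hypothetical placeholders, not data from the study.
def dose_to_water(reading_C, N_K_Gy_per_C, C_Q):
    """Absorbed dose to water = corrected chamber reading x air-kerma
    calibration coefficient x conversion factor C_Q for the clinical beam."""
    return reading_C * N_K_Gy_per_C * C_Q

M   = 2.0e-9    # corrected chamber reading in coulombs (hypothetical)
N_K = 1.1e9     # air-kerma calibration coefficient in Gy/C (hypothetical)
C_Q = 1.05      # depth-dependent conversion factor (hypothetical)

print(f"Dose to water: {dose_to_water(M, N_K, C_Q):.2f} Gy")
```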
To calculate CQ, the researchers developed a Monte Carlo model of the INTRABEAM source and parallel plate chamber based on the EGSnrc particle transport code. By using direct calculations, the approach does not rely on cavity theory and its associated assumptions. The model was validated by comparing the half value layers (HVL) and depth doses it calculated with measurements, revealing good agreement between the two.
Model validation
The simulations did, however, reveal significant differences between CQ and the equivalent parameter reported by the manufacturer, corresponding to discrepancies in absolute dose of up to 23%. The model consistently produced a greater CQ value. Differences were seen across the quoted tolerance in chamber plate separation and two corresponding effective points of measurement, as well as over a range of depths. Uncertainty in the plate separation alone accounted for variations in CQ of up to 15%. The results highlight a need for investigations using another chamber with more precisely known dimensions, Watson told medicalphysicsweb.
CQ comparison
In ongoing work, the authors have compared dose calculations using the formalism with the INTRABEAM manufacturer’s QA approach and prescribed doses derived from the factory calibration, as well as measurements using GafChromic film.
Research such as the ongoing TARGIT-B breast cancer trial could benefit from the greater certainty in doses provided by the formalism. In the multi-centre trial, researchers are comparing cancer control provided by whole-breast external-beam radiotherapy combined with either intra-operative or external beam boosts to the tumour bed.
Testing the effect of new drugs on heart tissue could become more straightforward, thanks to new research by a team at Harvard University’s Wyss Institute. The researchers have developed a faster method for manufacturing a “heart-on-a-chip” that can be used to test the reaction of heart tissue to external stimuli (Biofabrication 10 025004).
“One of the major challenges in drug discovery and development is failure at the clinical testing stage due to cardiac toxicity,” explained co-lead author Lisa Scudder. “A way to overcome this is to develop new preclinical tests for new drugs using engineered tissues that mimic the native organs of the human body such as the heart.”
Scudder and colleagues are working to build improved platforms to test the safety and efficacy of new drugs, by engineering functional units of human organs using human tissue. For the heart-on-a-chip, they aim to build a tissue that is as contractile and organized as the human heart, enabling measurement of contractile force – a major determinant of heart pumping performance.
“The chip we’ve made is an extension of our previous work, which established a new platform using the miniaturized structural formation of cardiac muscle on a cantilever of hydrogel,” said co-lead author Janna Nawroth. “We use hydrogel because its mechanical properties are similar to the extracellular matrix of the heart. This pre-patterned substrate results in an organized growth of cardiac muscle. The end goal is to put these tissue structures into a microfluidic environment, where we can closely regulate and monitor the flow of the new drug being tested.”
For use in drug development and biomedical research within industry, the chips need to be amenable to mass manufacturing. The existing way of engineering cardiac tissues involves using photomasks, stamping, and manually moulding gelatin to create patterns in the hydrogel. This approach takes too long and is not practical for large-scale chip manufacturing.
“Our new heart-on-a-chip fabrication method uses a UV laser to pattern the hydrogel, employing riboflavin to sensitize the gel for optical ablation,” explained Scudder. “This patterning method then allows the cardiac cells to align into organized laminar tissue structures like in the native heart.”
This UV micropatterning method creates features on the gel much faster than traditional moulding techniques, but with the same resolution and reproducibility. It can generate aligned cardiac tissue and muscular thin films, which beat and contract in response to external stimuli like electrical pacing. As well as being scalable, the new fabrication scheme doesn’t alter the properties of the hydrogel and is up to 60% faster than the old process.
This method could also be used to manufacture other organs-on-a-chip, such as a brain or skeletal muscle-on-a-chip. “In the future, we hope to expand on this fabrication method to mimic disease states like fibrosis, or to create more complex three-dimensional tissue structures that could provide new tools for the preclinical drug development process,” said Scudder.
Every age has its great teachers, and in the early 18th century one of them was Nicholas Saunderson. From 1711 until his death in 1739 at the age of 57, Saunderson held the Lucasian Chair of Mathematics at the University of Cambridge in the UK – a post held just a few years previously by Isaac Newton, and in recent times by Stephen Hawking. He would lecture to packed halls for at least eight hours a day on subjects ranging from mechanics and hydrostatics to optics and astronomy. He was said to have a tremendous feel for his subject. Literally, as it turns out: he was blind.
Times were certainly tough 300 years ago for the sightless. Having lost his eyes to smallpox as a baby, Saunderson is said to have taught himself to read by tracing out the letters on gravestones. Yet he was luckier with his situation than most. Saunderson’s father and his friends supplemented his school education by reading to him at length, and in his teens he made a learned acquaintance who brought him up to speed with the latest developments in mathematics. His ultimate accession to the Lucasian professorship was doubly impressive, for the university’s decision went against the wishes of Newton himself – and there are not many known instances of that.
But is it any easier to be a blind physicist today? Inclusive education policies, leaps in technology and a better awareness of the needs of the disabled all suggest a positive answer, yet blind physicists seem to be few and far between. Perhaps the most famous living example is the US astronomer Kent Cullers, whose involvement in NASA’s Search for Extra-terrestrial Intelligence (SETI) programme was fictionalized in the 1997 blockbuster Contact. “We’re in a sighted person’s world,” says Aqil Sajjad, a blind theoretical physicist who lives in Boston, US.
Sajjad knows how determined a blind person must be to have a career in science. Aged 10, he was playing with his cousins one day at home in Islamabad, Pakistan, when a large green blob entered his vision on one side. Doctors diagnosed it as a detached retina, and before long he was blind in that eye. Six years later, his sight left the other eye too – and with it, his dream of being a scientist. “The education authorities back in Pakistan basically told me that I couldn’t be a scientist,” he recalls.
Two senses are better than none: John Gardner uses audio-tactile to explore the periodic table. (Courtesy: ViewPlus)
To avoid missing out on education altogether, Sajjad switched his studies to the arts and humanities, and went to business school. Like Saunderson he benefited from having materials read to him, in this case by his mother, who had a maths degree herself. But to pursue his passion for science he applied to universities in the US, where he anticipated more encouraging attitudes. From several offers, he chose Oregon State University in Corvallis.
Hidden problems
Being blind poses all sorts of challenges in day-to-day life, but one additional difficulty is finding learning materials that are either audible, as in text-to-speech software, or tactile, as in braille. A particular problem with physics, however, is its fundamental reliance on maths, which doesn’t always convert easily to these non-visual media. Within English-speaking countries there are at least two versions of mathematical braille, which means that students have to learn the version suited to where most of their publications come from; and even then, the vast majority of publications are not available in braille in the first place, and can take months or years to translate. Meanwhile, traditional text-to-speech software – though rapidly improving – still often garbles mathematical formulae.
At Oregon State University, Sajjad rejected mathematical braille, which he says was “too slow”, and so staff instead provided specialist text-to-speech software called Triangle. Based on a simplified version of LaTeX – the widely used code to prepare documents containing mathematical formulae for publication – Triangle can read scientific learning materials out loud in a sequential manner, equations included. If an equation contains a fraction, for example, the software announces the numerator before reading through it, and then tells the user that the numerator is over before moving on to the denominator.
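As a rough illustration of the kind of linearization described above (this is not the Triangle software itself), a few lines of code can turn a simple, non-nested LaTeX fraction into a spoken-style word sequence:

```python
import re

# Toy linearizer for a simple, non-nested LaTeX fraction; real mathematics-to-speech
# software such as Triangle handles far more of the language.
def speak_fraction(latex: str) -> str:
    pattern = r"\\frac\{([^{}]*)\}\{([^{}]*)\}"
    spoken = r"begin fraction, numerator \1, over, denominator \2, end fraction"
    return re.sub(pattern, spoken, latex)

print(speak_fraction(r"x = \frac{a + b}{c}"))
# -> x = begin fraction, numerator a + b, over, denominator c, end fraction
```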
After his degree, Sajjad did a PhD in particle physics phenomenology at Harvard University in Cambridge, US, where he was given a graduate student to oversee the accessibility of his materials. The student wrote a new program to convert LaTeX documents directly into Triangle format, thereby making available a huge portion of the scientific literature: many of the papers on the popular arXiv server exist in LaTeX format, for example. Technical books are often also prepared in LaTeX format, although Sajjad was disappointed to find that the publishers were unwilling to share those versions, for protection of copyright.
His PhD complete, Sajjad is now in the enviable position of deciding which area of theoretical physics he wants to pursue next. He admits that he was wise to take his undergraduate studies at Oregon State University, knowing that it had already primed itself for blind students thanks to earlier work by John Gardner, a leading materials scientist at the institution. In 1988, at the age of 48, Gardner had surgery to correct vision that had long been deteriorating due to glaucoma. Instead of improving his sight, however, the surgery suddenly stole it away. “The doctor told me I needed this tiny operation to keep me from going slowly blind. And it did,” Gardner jokes now, nearly 30 years later.
Gardner was “pretty sick” for several months after his operation, he says, yet he couldn’t afford to be absent long. “I had a large research group to manage. From the hospital bed, I was fighting with the NSF [US National Science Foundation] to keep funding for a postdoc.” His established position in academia was an advantage: colleagues were almost all supportive, while the university itself “bent over backwards” to help because of the grant money he brought in. But returning to his actual research was hard, as much of it involved fitting theoretical models to experimental data. “That was the difficult bit. Everything else had a solution, but that didn’t, except using the eyes of other people.”
Learning curve: There are several different versions of mathematical braille. (Courtesy: iStock/Olga Raktiva)
The magic touch
Before long, Gardner found himself brainstorming ways to overcome the hurdles he faced. Triangle – the mathematics-to-speech software to which Sajjad would turn years later – was in fact one of his inventions. But the real test was how to deal with graphs, diagrams and other graphics. In the early 1990s studies of accessibility had identified “audio-tactile” as an effective way of presenting graphical information to the blind. The method involves an embossed hard copy of a graphic that is linked to an audio system, which has been taught (e.g. by a sighted colleague) to describe different parts of the graphic when those parts are touched. In the dark, an embossed bicycle might feel like a bunch of random squiggles, for example, but it quickly comes to light with touch-sensitive audio descriptions of “wheel”, “crossbar” and “pedal”. “I knew audio-tactile was the right method,” says Gardner, “but there was no software or hardware to support it.”
The key was an embosser – a printer that prints a raised surface – with sufficient resolution. One type of embosser already available at the time used a viscous polymer ink, but Gardner found that the raised images it produced were too shallow and too subtle for his fingers to discern. When he made the embosser use more of the ink, the shapes simply peeled off the surface. His student finally came up with the answer: an electronic device that looks like the face of a miniature meat tenderizer, with an array of tiny pyramids. With a spacing of just 0.127 mm, these independently moving pyramids could stamp a distinct image onto paper with enough resolution for even the most sensitive finger.
Despite the innovative solution, no existing braille companies wanted to license the patent, so Gardner started a company, ViewPlus, to manufacture the embosser himself. Today, ViewPlus sells an entire solution for blind students and academics: a range of embossers, touch pads on which to place the embossed pages, and software to interpret the signals and describe what a user is feeling. Business has been steady, though not as much as Gardner would like. “The hope is that a student can be given an embosser by their university, to take back to the lab,” he says. “It will save the university money, because the students can do things themselves.”
For Gardner, though, the main stumbling block for blind physicists today is not technology but prejudice. One year after the US Department of Energy labelled one of his research projects “world class”, the body stripped funding for it. “The review panel was not convinced that a blind person could continue to do physics, even though I had been doing it as a blind person for a long time,” he says.
But overt discrimination is not the only kind that blind physicists face. Sajjad borrows a phrase used in gender studies, “death by a thousand cuts”, to describe the small incidents that can gradually deter the blind from science academia. These could be unconscious expressions of prejudice, or day-to-day obstacles that the sighted get over without thinking twice, such as recognizing a face across a crowd. Sajjad laughs that his inability to recognize faces has also led to embarrassing instances, when he has said something awkward in the presence of someone he didn’t even know was there. “Sometimes it’s not one bullet that kills you,” he says. “It’s those various little things.”
Into the deep end
Michael Whapples, a blind physics graduate based in the UK, struggled to find employment after studying at the University of Nottingham, and was not helped by poorly educated employers. “I couldn’t prove that they were discriminating,” he says. “But there were some who didn’t seem to have enough awareness.” In fact, Whapples’ frustrations began as a child. Having lost his sight young due to glaucoma and other complications, he attended a small, specialist secondary school for the visually impaired in Worcester, but had a shock once he enrolled at Nottingham. “The classes were big,” he says. “The tutors didn’t even know about braille.”
That is not to say the university didn’t offer help. Whapples was given an assistant to take notes in lectures, he had weekly tutorials, and one member of the teaching staff provided dedicated support. But he had the impression that the administration was improvising as it went along: for example, the conversion of text and maths to braille was sent to a central disability assistant, who did not have a scientific background. “Sometimes it would work, sometimes it wouldn’t,” he says.
Based on his experience, Whapples thinks it would be better if universities thought about potential accessibility issues in advance of blind students matriculating; he also thinks the particular needs of science, technology, engineering and mathematics (STEM) subjects should be better integrated into central disability support services. “I don’t want to deter anyone, but I would say be prepared to be actively involved in finding the solutions to some of the issues you will encounter. It may seem like hard work at the time, and it probably is harder than studying some other subjects, but if you really are interested in the subject then hopefully it will be worth it in the end.”
Worth it indeed – for in a subject that can appear to delve further from everyday reality with every passing decade, Whapples even wonders whether blind physicists could have an advantage over their sighted counterparts. “When people are taught, for example, about the 10 or so dimensions in string theory, a lot of them ask how they ought to imagine those dimensions. But me, I don’t feel such a need to think about them in a visual sense. They’re just more sets of co-ordinates.”
Whapples isn’t the first scientist to believe sight could, in some instances, be regarded as an encumbrance. Bernard Morin, a blind French mathematician who is now retired, once remarked that he could determine the sign of a variable “by feeling the weight of the thing, by pondering it”. The non-visual senses of Saunderson were heightened, too: reportedly he could recognize where he was, and the size of a room, by hearing alone.
Seeing the unseeable
Daniel Hajas, who graduated from the University of Sussex in the UK last year, believes modern physics ought to be suited to the blind. “Aside from light, no-one can actually see electromagnetic waves,” he says. “No-one can see gravitational waves. No-one can see quantum physics in action.” Of course, all these phenomena are probed by instruments – instruments whose outputs are primarily visual.
Hajas’s plunge into darkness was as much of a jolt as it could be: he lost his sight just one year before he left for university in the UK, away from his native Hungary. From the start he was determined to make his subject more accessible. In the second and third years of his course he designed a refreshable electromechanical display for tactile images: a 100 by 100 grid of tightly spaced pins, whose vertical movements were controlled with low-cost actuators. Developing such actuators proved to be too ambitious on a short timescale, so he and six collaborators have since begun another project, for immediate impact: a website that he calls Grapheel IRIS. The idea is that enough sighted scientists sign up to the website as volunteers, so that when a blind person uploads an image, at least one of the volunteers can spare a moment to describe what that image is.
Over just a few months from the end of 2016 to the beginning of 2017, Hajas and colleagues got more than 50 volunteers to sign up, from 14 different countries. He is now developing a version of Grapheel IRIS for testing. Ideally, he wants users to be able to install a Grapheel IRIS extension button on Internet browsers, so that they need only click on an image for it to be returned from the IRIS community with an audio description.
A service like that would certainly make life easier for blind physicists. Hajas also applauds developments in policy such as the Institute of Physics (IOP) Diversity Programme, which aims to encourage disabled students into physics with evidence-based models of good practice. The IOP, which publishes Physics World, is also a member of a collaborative group known as the STEMM Disability Advisory Committee, which supports the education and employment of disabled people in STEM subjects and medicine.
Yet with all this focus on the needs of the visually impaired, Hajas wonders whether sighted physicists couldn’t relax their sensory preferences at the same time. “I don’t see why we should be using just sight,” he says. “We could all be using a multi-sensory approach.”
A new system for delivering targeted cancer therapies has been developed by researchers in China and Sweden. The biodegradable drug-delivery system uses an external light source to degrade black phosphorus (BP) hydrogel nanostructures containing drugs for cancer therapies. As the structure degrades, it releases the drugs at tumour sites.
Targeted drug delivery is an attractive option for cancer therapy because it delivers treatment directly at the tumour site. However, localized approaches such as injections of chemotherapy drugs are invasive and often painful. Polymer-based drug-delivery systems that are inserted into the body and degrade to release drugs are encouraging, but their release often cannot be controlled. The result can be ineffective therapy, which increases the risk of resistance in cancer cells.
The team, led by Han Zhang at Shenzhen University, China, and Yihai Cao at the Karolinska Institutet, Sweden, trialled the use of black phosphorus. The newly discovered material boasts high photothermal conversion efficiency, easy fabrication, and excellent biocompatibility and biodegradability. Their BP@Hydrogel system comprises BP nanosheets (BPNSs) and a hydrogel (a hydrophilic network of polymer chains), and releases its cargo when exposed to a near-infrared light source. The black phosphorus photothermal transducing agents convert light to thermal energy, which raises the temperature of the hydrogel matrix. The agarose hydrogel then softens, releasing the drug from the black phosphorus scaffold into the body.
The process of softening is reversible, meaning the release can be controlled in bursts, so that the drug is discharged where it will be most effective. At the end of the treatment, enhanced laser power can be used to melt the hydrogel completely, degrading it into oligomers that are excreted through the urine.
Highly controllable drug release
The researchers loaded the hydrogel with doxorubicin (DOX) – a commonly used chemotherapy drug – to measure the light-controlled drug release. When the device was irradiated with a near-infrared source, the concentration of released DOX increased dramatically compared with a non-irradiated control. The researchers also showed that both light intensity and exposure duration could be used to accurately control the release.
Irradiation of 1 W/cm² applied to the BP@Hydrogel was sufficient for drug release: the temperature increase softens the device owing to hydrolysis of its cross-links. Complete melting occurs at 2 W/cm² irradiation, at which point the BPNSs are no longer encased by the hydrogel and degrade rapidly.
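As a toy illustration of threshold-gated release (the irradiance thresholds follow the figures above, but the release rates are invented and this is not the authors' model), the behaviour can be sketched as follows:

```python
# Toy model of light-gated drug release from a thermoresponsive hydrogel.
SOFTEN_W_PER_CM2 = 1.0   # gel softens and begins to leak its cargo
MELT_W_PER_CM2   = 2.0   # gel melts completely, freeing whatever remains

def release_fraction_per_minute(irradiance):
    if irradiance >= MELT_W_PER_CM2:
        return 1.0      # complete release
    if irradiance >= SOFTEN_W_PER_CM2:
        return 0.05     # hypothetical 5% of remaining cargo per minute
    return 0.0          # gel stays stiff, cargo stays encapsulated

remaining = 1.0
for irradiance in [0.0, 1.2, 1.2, 0.0, 2.0]:    # a pulsed "burst" schedule
    released = remaining * release_fraction_per_minute(irradiance)
    remaining -= released
    print(f"{irradiance:.1f} W/cm2: released {released:.2f}, remaining {remaining:.2f}")
```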
Safely degrades and shrinks tumours
The waste products were found to be non-toxic. Studies on mice found that tumours treated with a DOX-loaded BP@Hydrogel device, which were irradiated with an NIR light source, were notably smaller, demonstrating an excellent tumour ablation effect in vivo.
The BP@Hydrogel system can be loaded with other drugs to target a wide range of tumour types and other diseases, while minimizing unpleasant side effects. The authors believe a design such as this has the potential to help millions of patients suffering from cancer.
Nanostructuring a thin layer of the semiconductor germanium using a technique called nanoimprint lithography can greatly boost the amount of light it absorbs across visible to near-infrared wavelengths. The broadband absorption comes from the strong interplay between Brewster and photonic crystal modes in the material, and the effect could benefit optoelectronics applications such as photovoltaics and telecommunications.
Designing ultrathin semiconducting materials that absorb light over a broad range of wavelengths is crucial for making improved optoelectronics devices that more efficiently convert light into active electrons. One way of achieving this is to increase the thickness of the semiconductor layer so that it can capture the maximum number of photons across the optical spectrum.
Researchers at the Institut de Ciència de Materials de Barcelona in Spain have taken a different approach, employing light-trapping strategies so that a much smaller amount of semiconductor can still absorb light strongly. In the photonic metastructure they studied, incoming light is coupled to different types of resonant modes: Brewster modes and hybrid photonic-plasmonic modes. It is these resonances that concentrate the light’s electric field in small volumes and so allow the material to absorb light over the visible to NIR range (400–1500 nm).
Two effective thicknesses of germanium for light to interact with
A Brewster mode is a photonic mode in which there is no light reflected from the surface, explains team leader Agustin Mihi. “A thin layer of semiconductor, like the one we used in our study, on a noble metal substrate sustains this type of mode thanks to the high refractive index of the semiconductor and the non-ideal behaviour of noble metals in the visible part of the optical spectrum. In this mode, light is strongly confined in the thin film at wavelengths determined by its thickness.
“Because of the nanostructuring of the germanium (Ge) film in our sample, there are two effective thicknesses of Ge that light can interact with: the thinner one with the Brewster mode and the thicker one with the plasmonic-photonic mode.”
The metastructure made by the researchers comprises a thin Ge film patterned with a 2D square array of cylindrical holes on top of a gold film, fabricated using soft lithography – a scalable technique that has the advantage of being compatible with mass-production processes such as roll-to-roll.
Metasurface exhibits a series of photonic resonances
“It is challenging to achieve broadband light absorption through photonic resonances alone because each resonance amplifying light absorption acts only over a specific wavelength range,” says Mihi. “In our experiments, we make a metasurface that exhibits a series of photonic resonances from the NIR to the visible, increasing the light absorption of the Ge layer at all energies above its electronic bandgap.”
The broadband absorption in fact comes from the simultaneous excitation of the different resonances throughout the entire absorption spectrum of the Ge layer, he tells nanotechweb.org. “In the visible, our photonic architecture sustains a broad Fabry–Perot resonance, which is enhanced by coupling with a Brewster mode, as mentioned. In the NIR, there are multiple absorption peaks coming from the plasmonic-photonic modes excited in the photonic crystal fabricated on top of the metal substrate.
Applications in photovoltaics and telecommunications
“The photonic crystal provides ways to couple the light in the in-plane direction of the ultrathin Ge film, allowing it to confine long wavelength photons (up to 1400 nm),” he adds. “We also carefully design the metasurface to couple this light to slow light modes, which combine photonic crystal and plasmonic effects, resulting in strong absorption peaks.”
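For a square-lattice photonic crystal illuminated at normal incidence, the wavelengths at which light is diffracted into in-plane modes follow the standard grating condition λ(i,j) ≈ a·n_eff/√(i² + j²). The lattice period and effective index used below are assumptions for illustration; neither value is given in the article.

```python
# Standard grating-coupling condition for a square lattice at normal incidence:
#   lambda(i, j) ~ a * n_eff / sqrt(i**2 + j**2)
# The period and effective index are assumed values, not taken from the paper.
from math import sqrt

a     = 600.0   # lattice period in nm (assumed)
n_eff = 2.2     # effective index of the guided/plasmonic mode (assumed)

for i, j in [(1, 0), (1, 1), (2, 0)]:
    wavelength = a * n_eff / sqrt(i**2 + j**2)
    print(f"({i},{j}) order couples near {wavelength:.0f} nm")
```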
Such strong broadband absorption could be useful for making more efficient photovoltaics devices and the NIR absorption in particular (which reaches 100% over the important telecommunications window) could benefit applications such as photodetectors, he says.
The team, reporting its work in Advanced Materials DOI: 10.1002/adma.201705876, is now busy designing different optoelectronics devices using its nanostructured Ge. “These include third-generation solar cells (based on perovskite materials). We hope to improve their efficiency and make them competitive with established silicon technology.”
Air pollution is a major global issue. In the city of London, for example, more than 9000 deaths each year are attributed to air pollution, with around 3500 of those associated with long-term exposure to PM2.5. Now a study has shown that globally PM2.5 concentrations have increased by more than one-third since 1960, and that the number of deaths attributable to long-term exposure to PM2.5 has increased by nearly 90%.
Pollution particles are inhaled deep into the lungs, with smaller particles (PM2.5) penetrating the furthest and having the most serious effect. Those exposed to higher levels of particulate pollution are more likely to suffer from respiratory and cardiovascular diseases, and have a shorter average life expectancy.
Today, nearly 90% of the global population lives in areas exceeding the World Health Organisation’s air-quality guidelines for annual mean PM2.5. A number of studies have investigated the impact of present-day air pollution on health, but few have looked at how air pollution has affected health over the last few decades, a period when air quality changed rapidly.
To rectify this, Edward Butt from the University of Leeds, UK, and his colleagues used the HadGEM3-UKCA coupled chemistry-climate model, along with demographic and disease data, to estimate the changes in global and regional air pollution PM2.5 and the attributable health burden over the period 1960 to 2009.
The team found that global mean population-weighted PM2.5 concentrations increased by 38%, dominated by increases in China and India. The global attributable deaths from particulate pollution, meanwhile, rose by 89% to 124% over the same period, again dominated by large increases in China and India. These large increases were not only a result of regional growth in particulate air pollution levels, but were heavily influenced by rapid population growth and ageing.
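Here, “population-weighted” means that concentrations are averaged using each region’s population as the weight, so changes where most people live dominate the global figure. A minimal sketch with invented numbers:

```python
# Population-weighted mean PM2.5: sum(pop * conc) / sum(pop).
# Regions and values below are invented purely to illustrate the weighting.
regions = {
    # name: (population in millions, annual mean PM2.5 in micrograms per m3)
    "region A": (1300, 55.0),
    "region B": (1200, 45.0),
    "region C": (500, 12.0),
    "region D": (330, 9.0),
}

total_pop = sum(pop for pop, _ in regions.values())
weighted_mean = sum(pop * conc for pop, conc in regions.values()) / total_pop
print(f"Population-weighted mean PM2.5: {weighted_mean:.1f} ug/m3")
```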
“These changes resulted in significant increases in attributable deaths in China and India year-on-year because more people were being exposed to high levels of air pollution, which is further exacerbated by an ageing population, who are more vulnerable to the effects of particulate air pollution,” said Butt.
In contrast, the study showed that air-quality regulation and emission controls in Europe and the US reduced air pollution PM2.5 concentrations over the same time period. There was a significant drop in the number of attributable deaths – by 65.7% in Europe and 47.9% in the US. In 1960 the US and Europe accounted for 27% of the global attributable deaths; this fell to around 1% by 2009.
Butt and his colleagues hope that understanding the changes in particulate air pollution deaths over the past few decades can help policy makers make sound decisions on future air quality policy. They also note that projected demographic changes in Asia will pose a challenge for policy makers aiming to reduce the total number of deaths due to particulate air pollution in the near future.
“We hope that the study will give policymakers, utilities and solar businesses the ability to fine-tune their offers so that the costs of PV [photovoltaics] more closely match its benefits,” Parth Vaishnav and colleagues Nathaniel Horner and Inês Azevedo of Carnegie Mellon University told environmentalresearchweb. “We’d also like to highlight the geographical variation that exists in the costs of systems and the benefits they produce.”
It’s estimated that 2.5 GW of distributed PV installations were added across the US in 2015, followed by another 3.4 GW in 2016. Driving this capacity growth is a reduction in system prices, plus a range of federal, state and local subsidies as well as net metering programmes offered by some utility providers.
Looking at the financial benefits, data from 2014 point to a much wider range of the population receiving incentives for adopting rooftop solar than back in 2006. But the study concludes that PV subsidies still flow disproportionately to areas with higher incomes.
Other observations include the importance of generous net metering policies for the financial viability of rooftop systems. In US states where customers can sell excess power to the grid at retail prices, the private benefits of solar PV typically exceed private costs. However, if marginal pricing is applied, then customers need to be in locations with abundant sunshine such as California, Nevada or Texas to make the numbers add up. But that’s not the only factor, as the calculation is also sensitive to the financing terms of purchasing a rooftop system – even in sunny states.
While the study estimates that the total upfront subsidy per kilowatt of installed capacity has fallen from, on average, $5200 in 2006 to $1400 in 2014, the absolute magnitude has soared as the number of rooftop systems in operation has rocketed. This public investment has helped to grow jobs – the number of installers in the dataset rose from 514 in 2006 to 2900 in 2015 – as well as stimulate a regulatory environment for firms to operate in.
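A rough back-of-the-envelope combination of the figures quoted in this article shows why the total outlay keeps rising even as the per-kilowatt subsidy falls; treat the result as an order-of-magnitude illustration, not an estimate from the study.

```python
# Order-of-magnitude illustration combining figures quoted above
# (not a result reported by the study).
subsidy_per_kw_2006 = 5200.0     # USD per kW of installed capacity
subsidy_per_kw_2014 = 1400.0     # USD per kW

capacity_added_2015_kw = 2.5e6   # 2.5 GW of distributed PV added in 2015

drop = 1 - subsidy_per_kw_2014 / subsidy_per_kw_2006
total_2015 = capacity_added_2015_kw * subsidy_per_kw_2014
print(f"Per-kW subsidy fell by about {drop:.0%}")
print(f"Yet 2.5 GW at the 2014 rate implies roughly ${total_2015 / 1e9:.1f} bn upfront")
```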
The scientists touch on other plus points too, adding that the increase in installed capacity also contributes to understanding how best to integrate distributed PV into the electric grid.
However, when considering the public benefit in terms of pollution cuts, such as avoided carbon dioxide emissions and improvements in air quality, the group recognizes the difficulty in determining appropriate monetary values. As a result, the researchers caution against defining public benefits too narrowly.
Renewable energy sources such as solar and wind have a crucial role to play in reaching climate goals, and success will be felt on a global scale. It follows that the benefits of technologies such as rooftop solar PV need to be priced accordingly.