
Holes reveal first fractional quantization

Advances in the epitaxial growth of high-quality, low-dimensional strained germanium semiconductor structures in recent years have opened up studies of these systems that were previously impossible. Maksym Myronov, who led these fabrication advances at Warwick University, Michael Pepper, and colleagues in the UK have seized the opportunity to study the transport behaviour of holes loosely confined to one dimension (1D). To their great surprise, they observed, for the first time ever, fractional quantized conductance in the absence of strong magnetic fields.

Quantum hole wire interview

“We’d been looking at electrons for a long time so we thought let’s see if there’s any difference between holes and electrons,” says Pepper, a researcher at University College London (UCL) who pioneered the study of low-dimensional electron gas systems and associated quantum effects. He had been studying electron transport in two-dimensional (2D) systems as they were gradually confined to 1D for several years with Sanjeev Kumar, also at UCL, before turning his attention to the transport of holes – corresponding positive charges where an electron is absent.

At the same time their colleague Maksym Myronov at Warwick University had been perfecting an epitaxial growth technique to create high-quality semiconductor materials, including germanium, that had previously been out of reach. Myronov was working with Stuart Holmes of Toshiba Research Europe at the Cambridge Research Laboratory to examine the potential of holes in germanium for developing new spintronics systems.

As their interests aligned, Pepper, Kumar, Myronov and Holmes joined forces alongside Yilmaz Gul, who was studying for his PhD with Holmes and Pepper. “We thought we’d see the regular quantization, but we hadn’t expected to see any fractions, because they had not been seen before in any system, unless of course you apply a very strong magnetic field,” explains Pepper.

System architecture

The researchers studied epitaxial germanium grown on silicon, which strains the germanium layer. They used split gates to confine hole transport to 1D, then gradually relaxed this confinement to allow some lateral drift, so that a zigzag developed as opposed to a straight line of transport. With high confinement from the split gates, they measured hole conductance values in regular integer multiples of the quantum unit 2e²/h – where e is the charge of an electron or hole, and h is Planck’s constant. This matched the behaviour they had observed in studies of 1D electron conductance in other semiconductors such as GaAs. However, to their great surprise, as the confinement was weakened they found conductance values at 1/2, 1/4 and 1/32 of 2e²/h.
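To put numbers on the plateau spacing, here is a rough back-of-envelope sketch (an illustration, not part of the study itself) that evaluates the conductance quantum 2e²/h from the standard SI values of e and h and lists the integer and fractional plateaus described above:

```python
# Illustrative sketch: the conductance quantum G0 = 2e^2/h and the
# plateau values reported for the germanium hole wire, in siemens.
# The constants are the exact 2019 SI values; the fractions come from
# the article, everything else is illustrative.
E_CHARGE = 1.602176634e-19  # elementary charge e, in coulombs
PLANCK_H = 6.62607015e-34   # Planck's constant h, in joule-seconds

G0 = 2 * E_CHARGE**2 / PLANCK_H  # ~7.75e-5 S, i.e. ~77.5 microsiemens

# Strong confinement: plateaus at integer multiples of G0.
integer_plateaus = [n * G0 for n in (1, 2, 3)]

# Weak confinement: the surprising fractional plateaus.
fractional_plateaus = {f: f * G0 for f in (1/2, 1/4, 1/32)}
```

A plateau at 1/32 of G0 thus corresponds to a conductance of only a few microsiemens, which gives a sense of the measurement sensitivity involved.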

Transconductance showing fractional quantization

Quantized conductance – fractionalized

The first observation of fractional quantized conductance dates back to 1982, with the fractional quantum Hall effect. However, that effect occurs only in the presence of a very high magnetic field, and fractional quantized conductance had not been observed outside it. In the experiments by Pepper and colleagues there is no large applied magnetic field, and the cause of the fractional quantization remains a mystery. The researchers are liaising with theorists to find an explanation.

“It is interesting to note that, so far, the fractional quantum Hall effect has been observed only in 2D systems made of semiconductor and oxide materials,” says Myronov. He adds, referring to work reported in one of his recent papers, “Just recently, in 2015, I and some colleagues observed this effect in superior quality strained germanium that I had grown epitaxially, which outlined the pathway to our latest discovery.”

As for explanations of the effect, Pepper describes some of the approaches to explore. “One is to look at extensions to the theory of the fractional quantum Hall effect, but without the magnetic field; others start from the formation of the zigzag and then seeing what happens when you add various interactions into that – but so far no explanation has emerged.”

Gul, for whom the observations formed the basis of his PhD thesis, laughs at how in some ways it was an easy thesis to defend. “I didn’t have to learn any theories because there weren’t any.”

Applications and further work

The effect may find applications in quantum computing, where the silicon substrate beneath the germanium hole wire could offer advantages for integration with silicon electronics processes and systems. High-purity germanium is important for observing the quantized conductance, although, as Holmes points out: “The states that we observe are naturally protected from the environment, so things like disorder and impurity scattering may not be important. That would mean that you could have slightly impure materials that you might find more in industry – companies like Toshiba would be interested in developing that kind of technology.”

Myronov adds that although in the fractional quantum Hall effect there is no direct observation of a quarter charge, there is indirect evidence that it exists, and theory suggests that it can be very strongly coherent and difficult to decohere. This means that if it were carrying quantum information it would be much more difficult to corrupt than in other types of qubit. “There is of course a very long way to go but if this effect can be exploited in quantum information, then because it is a semiconductor it can be miniaturized with a very high density and germanium can be grown on silicon so the control circuitry can be integrated with the qubits,” says Myronov.

In addition, manipulating qubits with an applied electric field is more manageable than using the large magnetic fields required for the fractional quantum Hall effect, although, as Holmes emphasizes, it may be a little premature to talk of qubits before the state itself is properly understood.

Alongside discussions with theorists to try to explain the effects, the researchers’ next steps include looking for further fractions of the conductance in germanium, studying the system at lower temperatures, and experiments to measure the actual charge itself. “It’s a completely new system,” says Pepper. “So we expect more surprises.”

Full details are reported in Journal of Physics: Condensed Matter.

Composite polymer electrolyte helps improve Li-ion batteries

Solid-state electrolytes, such as those made from ceramic-polymer composites, are promising alternatives to liquid electrolytes for safer, more stable next-generation lithium-ion batteries. One problem, however, is that the ceramic nanofillers in these composites agglomerate at high weight ratios, which degrades their ion conductivity. A new 3D interconnected framework of ceramic nanofillers, made using a nanostructured hydrogel template, does not aggregate and could help overcome this problem.

“Ceramic-polymer composite electrolytes comprising ceramic nanofillers and a polymer matrix have improved mechanical and thermal stability as well as ionic conductivities that are better than those of pure polymer electrolytes,” explains Guihua Yu of the University of Texas at Austin, who led this study. “However, as these composites agglomerate at high weight ratios, the ion conductivity pathways in these materials become discontinuous. This is because the percolating network in the nanofiller deteriorates.

“In this new work, we fabricated 3D interconnected frameworks consisting of ceramic nanofillers using a simple nanostructured-hydrogel template technique. The framework provides a continuous ion conduction path that prevents nanofiller aggregation. It also provides excellent conductivity and electrochemical stability.”

The 3D LLTO framework

Yu and colleagues made their percolating network of ceramic nanoparticles by nanoscale phase separation of a polymer (polyethylene oxide), water, and ceramic precursors (Li0.35La0.55TiO3, or LLTO). The nanosized ceramic particles are interconnected to form a free-standing continuous, 3D porous structure.

“Once the self-supporting ceramic framework has formed, we fill the void pores with the polymer to prepare a polymer electrolyte,” Yu tells nanotechweb.org. “Unlike traditional composites in which particles are dispersed and separated in the polymer matrix, the nanostructured framework acts as a 3D nanofiller that can then form a continuous interphase between the polymer and the ceramic. The interaction between the filler and the polymer results in improved lithium ion conductivity, as mentioned, thanks to the formation of a continuous interphase, but also in sufficient mechanical integrity for use in a variety of flexible electronic devices.”

According to Yu, the interphase between the ceramic filler and the polymer is responsible for faster lithium ion conduction thanks to the high dielectric constant and the abundant surface defects in the LLTO.

The team, reporting its work in Angewandte Chemie DOI: 10.1002/anie.201710841, says that it would now like to apply its structural design strategy to other energy applications, such as composite electrodes. “To this end, we need to better understand the interaction between the ceramic filler and the polymer matrix,” explains Yu.

Population grows in US floodplains

Maps that the US Federal Emergency Management Agency (FEMA) relies upon to determine flood risk underestimate the potentially affected population by a factor of three. A new model that covers the 48 contiguous states of the US (excluding Alaska and Hawaii) at 30-metre resolution places 41 million Americans in a 100-year floodplain, meaning the annual probability of flooding is 1%. FEMA's maps put that population figure at just 13 million.
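The "100-year" label is easy to misread as a once-a-century guarantee. A small illustrative calculation (my own, not from the study) shows what a 1% annual chance implies over longer horizons, assuming each year floods independently:

```python
def flood_probability(annual_p: float, years: int) -> float:
    """Chance of at least one flood over `years`, assuming each year is
    independent with the same annual probability of flooding."""
    return 1 - (1 - annual_p) ** years

# A "100-year" floodplain has a 1% annual chance of flooding:
p30 = flood_probability(0.01, 30)    # ~26% over a typical 30-year mortgage
p100 = flood_probability(0.01, 100)  # ~63% over a full century
```

So a household in a 100-year floodplain faces roughly a one-in-four chance of flooding over a 30-year mortgage, which is why the discrepancy between 13 million and 41 million people matters.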

The reason for the discrepancy, according to Oliver Wing of the University of Bristol, UK, is that FEMA's maps cover only about 60% of the land area, and "they tend to just focus on the primary large streams in those catchments".

Wing, who presented his findings at the American Geophysical Union's (AGU) Fall Meeting in New Orleans in December, said that FEMA "will tend to have not modeled the smaller headwater tributaries, and ... when you add those up across the country, that's a lot of risk that is therefore missed".

Wing and his colleagues combined data from their expanded floodplain maps with population and land-use projections from the US Environmental Protection Agency (EPA), to estimate how the affected population might grow this century. Currently, 13.3% of the US population lives in a 100-year floodplain, they found. By 2050, that will increase to around 15.7%, and by 2100, it could reach as high as 16.8%.

The researchers also evaluated the amount of developed land within flood zones. Currently about 150,000 sq. km, this could increase by anywhere from 37% to 72% by 2100, depending on the growth scenarios. In addition, the value of assets located on floodplains, currently $5.5 trillion, will double during the rest of the century.
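As a quick arithmetic check on those projections (an illustration of the quoted figures, not additional data from the study):

```python
# Projected growth of developed land in US flood zones, from the figures
# quoted above: ~150,000 sq. km today, growing 37-72% by 2100.
current_land_km2 = 150_000
land_2100_low = current_land_km2 * 1.37   # 205,500 sq. km
land_2100_high = current_land_km2 * 1.72  # 258,000 sq. km

# Floodplain asset value, currently $5.5 trillion, expected to double.
assets_now_usd = 5.5e12
assets_2100_usd = 2 * assets_now_usd      # $11 trillion
```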

"We're not even accounting for what climate change may do to flooding," Wing said at AGU. "We are simply looking at the intensification of floodplain development, and due to that alone, we see that risk is increasing quite dramatically."

Asked at a press conference whether FEMA would be making use of the data presented at the AGU meeting, Wing said his team would be happy to work with FEMA, if asked. He emphasized that the cutting-edge methodology employed in the study, involving "big data" and the enhanced computational power required to generate more detailed new maps, had only become available recently. He said he would be "very much surprised" if FEMA did not adopt the new methodology within the next five years.

The study included inputs from academia, the commercial world, the nonprofit sector, and government agencies, Wing said. At the time, the paper was under review for publication in Environmental Research Letters (ERL) – it has now gone live.

NPL launches Metrology for Medical Physics Centre

The National Physical Laboratory (NPL), the UK's National Measurement Institute, has launched the Metrology for Medical Physics Centre (MEMPHYS). The aim of MEMPHYS is to accelerate the development and implementation of innovative diagnostic and therapeutic technologies.

The Centre will provide a range of consultancy and research services, and offer quality assurance and commissioning tools that help get new equipment into routine use, and enable more efficient use of technologies already in the clinic. "MEMPHYS brings together all of the medical physics activities happening at NPL under a single banner and helps us to grow," explained Rebecca Nutbrown, NPL's head of metrology for medical physics. "NPL has been expanding its work in this area over the last year."

One area of medical physics in which NPL already has extensive experience is radiotherapy dosimetry, such as testing the calibration of linacs. “We need to check that the patient treatment is delivered as expected,” said Nutbrown, speaking at the MEMPHYS launch event. “So we devise phantoms that mimic the properties of the human body, embed dosimeters, and then treat them as we would a patient and check the radiation is delivered to the right place.”

For example, as hypofractionated radiation treatments become increasingly prevalent, NPL – in collaboration with the Royal Surrey County Hospital and the Radiotherapy Trials QA (RTTQA) group – used a specialized CIRS head phantom (STEEV) to ensure that all centres using stereotactic radiosurgery (SRS) are treating patients in the same way, and treating accurately. Another development is a dosimetry audit for spinal stereotactic body radiotherapy (SBRT) – a treatment that necessitates exceptionally precise beam targeting.

NPL has also created a graphite calorimeter that measures radiation dose extremely accurately. The calorimeter is used to perform measurements on one of NPL's two linacs; then a hospital will send their instrument to NPL for comparison, thereby ensuring the accuracy of the instruments used for clinical dosimetry.

More recently, NPL has been working on a small portable calorimeter for reference dosimetry of proton therapy. The aim here is to standardize treatments across the UK's forthcoming proton therapy centres, and to ensure that proton therapy is delivered with the same level of accuracy as conventional radiotherapy.

Lab launch
MEMPHYS also sees the introduction of a new nuclear medicine laboratory. Last December, NPL installed a unique clinical SPECT/PET/CT camera that is directly calibrated against primary standards of radioactivity. As nuclear medicine continues to transition from qualitative to quantitative imaging, the goal is to provide primary standards for emerging quantitative approaches.

The SPECT/PET/CT scanner

The camera will be used to image phantoms, which will then be taken to a clinic for imaging on its scanners. The two data sets, both of which will be processed by NPL to provide quantified measurement uncertainties, are then compared to assess scanner accuracy. To help tailor this process for specific applications, MEMPHYS has also set up a new Rapid Phantom Prototyping Laboratory to produce and optimize medical phantoms.

The lab applies 3D printing to rapidly develop complex phantoms. Examples include anthropomorphic phantoms for medical imaging and dosimetry, and mouse phantoms for calibrating equipment employed in small-animal and radiobiology experiments. The phantoms are fillable with radioactive solutions, though future approaches will involve integrating radioactivity directly into the printing material.

3D printed complex phantoms

Device development
Finally, NPL's ultrasound group has developed a new scanner for breast imaging. Ultrasound can be employed as an adjunct to X-ray mammography, and generates more reliable images when breast tissue is dense. The team is developing a novel phase-insensitive ultrasound sensor, originally conceived for a measurement application, to create a system with potentially improved image quality and far fewer artefacts than a conventional ultrasound scanner.

The team has now built a prototype of the phase-insensitive ultrasound CT (piUCT) device. The system employs 14 parallel ultrasound emitter elements, plus one large sensor, all housed in a water tank. The elements can be moved vertically to select the scan plane. During scanning, the patient lies prone and the breast is imaged without requiring compression, making the procedure much more comfortable than mammography.

Prototype phase-insensitive ultrasound CT device

The system is currently being evaluated at NPL using a breast phantom, and images will be compared in terms of quality and spatial resolution with X-ray CT and MRI scans. The next step is to test the system on a number of volunteers. For this, the piUCT will shortly be moved to Southmead Hospital in Bristol, for trials on 20-30 patients. Looking further ahead, NPL hopes to partner with a commercial manufacturer to translate this novel system into a clinical device.

Ultimately, MEMPHYS will function as an international centre for excellence, fostering interdisciplinary and inter-sector research to inspire cutting-edge innovations. MEMPHYS will work closely with the NHS, academia and industry to enable the rapid and widespread implementation of a host of new diagnostic and therapeutic technologies.

The March 2018 issue of Physics World is now out

Image of the cover of the March 2018 issue of Physics World

I hope you’ve been enjoying the new-look Physics World website, which we launched earlier this week. I’m sure you’ll agree it’s a vast improvement on what went before.

And as if the excitement of the new site wasn’t enough, I’m pleased to say that the March 2018 issue of Physics World is also now out.

As always, selected articles from the magazine will appear on this website over the course of the month, but if you’re a member of the Institute of Physics, which publishes Physics World, you can read the entire March issue right now in digital format.

We also look at the career opportunities for physicists in environmental science, the research that goes on at US tape and adhesives giant 3M, as well as developments in technology that can help disabled people in everyday life.

Plus there are all the usual sections, including Reviews, Careers, Transactions and the ever-popular Lateral Thoughts.

Let us know what you think about the issue on Twitter, Facebook or by e-mailing us at pwld@iop.org.

For the record, here’s a run-down of what else is in the issue.

  • Revitalizing Japanese physics – Japan is trying to boost its declining international competitiveness in science by attracting top foreign researchers. But as Matin Durrani finds out, working in Japan can be challenging to outsiders
  • Science in a changing world – Japan has traditionally been strong in science and technology, but Tateo Arimoto calls for the country to reform if it wants to stay ahead
  • Lighting the way – In the second of his new columns about physics in industry, James McKenzie looks at the lessons we can learn from the humble light bulb
  • The pioneer princess – Robert P Crease celebrates a woman who transformed how learned societies should be run
  • My invisible battle – The stigma around mental illness is slowly crumbling, with society becoming increasingly aware of the problem that affects an estimated one in four adults, from any walk of life. But what about attitudes within the academic community? A theoretical physicist tells their story of battling mental illness while pursuing a research career, raising the question of whether the community does enough to help
  • The Cybathlon challenge – Science and engineering are vital for developing “assistive technology” to help disabled people perform everyday tasks. But as Rachel Brazil finds out, a sporting contest called the Cybathlon has proved invaluable for discovering if the devices are fit for purpose
  • A sticky wonderland – Adhesives are everywhere, from the aerospace industry to the simple but infamous Post-it Note. Alaina G Levine visits adhesive-giant 3M Company’s main US innovation centre to find out more about the physics involved
  • Race to space and beyond – Tim Gregory reviews Ad Astra: an Illustrated Guide to Leaving the Planet by Dallas Campbell
  • Tale of the atom tamers – Tushna Commissariat reviews the new documentary film Let There Be Light: the 100 Year Journey to Fusion, directed by Mila Aung-Thwin
  • From physics to environmental science: a natural evolution? – Physics and environmental research are more compatible than you might first think. Kate Ravilious talks to three leading physicists-turned-environmental researchers, to find out about their journey
  • In the pursuit of inspiration – Martijn Boerkamp from Dutch start-up firm Inkless on what sparked his interest in physics

And don’t forget, if you have any thoughts on the issue do let us know on Twitter, Facebook or by e-mailing us at physics.world@iop.org.

Ancient hydrogen reveals clues to dark matter’s identity

A potentially huge breakthrough in the study of dark matter has come from an unlikely source: radio emissions detected from hydrogen gas that existed just 180 million years after the Big Bang.

Dark matter makes up 26.8% of the total mass and energy in the universe, yet it remains elusive. Astronomers have only been able to detect it through its gravity, while terrestrial experiments designed to identify dark matter particles have found nothing. However, new observations of the ancient universe suggest that the reason for this is that our experiments have been looking in the wrong place.

A team led by Judd Bowman of Arizona State University has used the EDGES all-sky radio antenna in western Australia to search for the faint signature of primordial hydrogen in the very early universe. When the first stars lit up, their ultraviolet radiation was absorbed by atoms of hydrogen, the most common and most ubiquitous element in the universe. This absorption caused the single electron in each hydrogen atom to undergo a small jump between two hyperfine energy levels, a transition associated with radio emission at 21 cm (1420 MHz).

Ancient history

EDGES was able to detect this emission, redshifted to a frequency of just 78 MHz by cosmic expansion. The redshift corresponds to an era just 180 million years after the Big Bang. However, there was a surprise. The amplitude of the signal was twice as large as had been predicted, and Bowman thinks this could be explained “if the gas in the early universe was colder than expected,” he tells Physics World.
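The quoted era follows from simple redshift arithmetic on the two frequencies. A one-line sketch (the 1420.4 MHz rest frequency is the standard value for the hydrogen hyperfine line; the observed frequency is from the article):

```python
# Redshift of the 21 cm hydrogen line as seen by EDGES:
# observed frequency = rest frequency / (1 + z).
REST_FREQ_MHZ = 1420.4  # standard rest frequency of the 21 cm line
OBS_FREQ_MHZ = 78.0     # frequency of the signal detected by EDGES

z = REST_FREQ_MHZ / OBS_FREQ_MHZ - 1  # redshift ~17.2
```

A redshift of about 17 is what places the signal roughly 180 million years after the Big Bang in standard cosmology.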

The temperature of the gas 180 million years after the Big Bang had been expected to be about six degrees above absolute zero, but the strength of the signal detected by EDGES implied a temperature of just 3 K. The first implication of this observation is that it constrains when quasars switched on and began pouring out X-rays, since the X-rays would have heated the gas, but it is the second result, if confirmed, that could have the greatest repercussions for science.

Chilling effect

Rennan Barkana, of Tel Aviv University, suggests that dark matter is responsible for the hydrogen’s low temperature. At that point in time, the only matter that should have been colder than the hydrogen was dark matter, and if the dark matter particles and hydrogen atoms were scattering off one another, this would remove heat from the hydrogen atoms.

“If the dark matter interpretation of the new 21 cm signal is correct, then it is the first direct observational indication of a non-gravitational interaction,” says Barkana.

Although in the modern universe dark matter does not seem able to interact with ordinary matter through anything other than gravity, in the past this may have been different. The cold, low velocity conditions unique to that era may have permitted some interactions. For the scattering process to take place, the individual masses of the dark matter particles must be less than 4.3 GeV, which is equivalent to the mass of just a few protons. This is far removed from the expected region of 100 GeV, and could explain why experiments trying to directly detect dark matter particles are failing to find anything, because they are looking in the wrong energy range.

Dark matter did something 

Although cautious about the results, Richard Massey of Durham University in the UK, who was not involved in the research, says: “If this interpretation is correct, it would be a breakthrough that transforms the field: the first time in 40 years that dark matter is seen as doing something rather than nothing.”

Independent confirmation could come soon. EDGES measured the hydrogen emission by taking an average across the entire sky, but the Hydrogen Epoch of Reionization Array (HERA) and the upcoming Square Kilometre Array, both in South Africa, will be capable of taking more intricate measurements. Since the dark matter hypothesis predicts a very specific pattern in the radio waves, Barkana says “the real test will come from these more detailed measurements”.

Beyond that, “people will think very hard now about the consequences of the dark matter interpretation and perhaps some new tests will be identified,” says Bowman. If Barkana and Bowman are proven correct, then it will mean that dark matter’s days of remaining hidden from us will be numbered.

The observations and theoretical calculations are described in two papers in Nature: Bowman et al and Barkana.

Welcome to a new look for Physics World

After months of activity behind the scenes, we are incredibly excited to reveal our stunning new website. We hope you like it as much as we do.

Our key motivation throughout the relaunch project has been to make it easier for you to discover the content of most interest to you. The striking new design can be enjoyed on any device you happen to be using, while improved navigation places renewed emphasis on the applied and interdisciplinary fields that are driving progress in such mission-critical areas as healthcare, the environment, and commercial and industrial development.

To help you explore our content, we have introduced 15 subject topics that span both core physics disciplines and emerging interdisciplinary fields. Whether your interest lies in condensed matter or biophysics and bioengineering, science careers or business innovation, you’ll find all the relevant news and commentary in a single place on our website.

We have also launched a series of special collections that allow you to delve deeper into some of the most exciting and thought-provoking topics from across the scientific spectrum. Two of these collections are already available – one on the emerging field of multimessenger astronomy, and the other on the opportunities and challenges faced by physicists in their working lives – and more will be added in the coming weeks and months.

One thing hasn’t changed: our commitment to providing daily coverage of the most exciting research breakthroughs and technology innovations, as well as the key developments that affect the global scientific community. Through the site you can also enjoy a rich programme of multimedia and longer form articles, including:

  • Videos and webinars that offer new ways to explain and explore the latest science. Our podcast series will also be available very shortly
  • Analysis and feature articles that take a deeper look at the stories behind the headlines
  • Careers articles that offer a unique insight into the experiences of scientists working in both industry and academia
  • Expert opinions and reviews of the best new science books and cultural events

Tell us what you think

We’ll be continuing to update the site over the next few months, and we’ll keep you posted on all the new changes as they are rolled out.

As part of that continuing development, we’re really keen to know what you think about the new site. You can use the rating tool to provide an instant reaction, or please contact us at pwld@iop.org to let us know your views.

The ‘Big Science’ marketplace

What do a particle accelerator, a space agency, a free-electron laser lab and an experimental fusion reactor have in common?

The answer – or, at least, the most relevant answer for the inaugural Big Science Business Forum (BSBF) here in Copenhagen, Denmark – is the network of companies that supply them with equipment and services. Although the scientific goals of CERN, ESA, the European XFEL and the ITER fusion project are very different, when you get down to the nuts and bolts of everyday operations, the technologies required to probe the fundamental nature of the universe and to explore our little corner of it are strikingly similar. And if your company can make instruments that can withstand the harsh environment of space, then maybe you ought to consider building some that can operate in a fusion reactor, too.

That was one of the messages at the BSBF plenary session on “Big Science as a Market”, which featured speakers from several companies where big science is also big business. Among them was Gaizka Murga-Llano, astronomy business manager at the Bilbao-based engineering and construction firm IDOM. The firm’s involvement in the big-science market dates back to 2005, when IDOM took part in a design contest for the dome of the Extremely Large Telescope. At the time, the instrument known as the E-ELT (it’s now just the ELT, in a rare example of an inverse relationship between acronym complexity and time) was intended to be 100 m in diameter, and a traditional dome was not considered feasible. Over the subsequent decade, however, both budget and technical constraints forced managers at the European Southern Observatory (ESO) to scale back their plans. Throughout that period, Murga-Llano explained, IDOM worked with the ESO to develop designs that met the instrument’s evolving needs; today, construction on a 39 m ELT is under way at the ESO site in Cerro Armazones, Chile, with first light expected in 2024.

The story of IDOM and the ELT nicely illustrates some of the challenges that companies – especially small and medium-sized enterprises (SMEs) – face in working with big-science facilities. “If you are in the development phase, and every three months, you have to tell your bosses or your shareholders that the interesting commercial returns will come in five years, it is not very easy,” said Kurt Ebbinghaus, who is now an industry liaison officer at Fusion for Energy (the umbrella organization for ITER) but was previously managing director at the industrial-services firm Bilfinger Noell. Another of the session’s speakers, Jens William Larsen, made a similar point, explaining that when his small coatings firm, Polyteknik AS, works with industrial customers, they measure timelines in hours, days and weeks. With their big-science customers, however, they have to talk about months, years and decades.

In some cases, there are cultural barriers, too. Big-science facilities are interested in scientific discoveries and grand challenges, Larsen observed, but “we are just practical guys who want to run a business, who want to make some products. How do we combine those perspectives?” Although the rewards of interacting are significant for both sides – contracts for the companies; expertise and a more robust supplier base for the scientific organization – the overwhelming message from the speakers is that these interactions don’t happen on their own.

“We are just practical guys who want to run a business, who want to make some products. How do we combine those perspectives?”

Jens William Larsen, Polyteknik AS

Toward the end of the session, one of the BSBF co-organizers, Juliette Forneris, offered some concrete suggestions for improving the state of the big-science market. Forneris is the Industrial Liaison Officer (ILO) for both CERN and the ESO in Denmark, where the national government set up an umbrella group several years ago to help Danish firms engage with all of the Big Science organizations mentioned above, plus the European Synchrotron Radiation Facility (ESRF) and the European Spallation Source under construction across the Øresund in Sweden. In her view, one of the biggest barriers, especially for small firms, is the maze of different procurement procedures in operation at the different scientific facilities. In many cases, these procedures are written into the facility’s founding documents, so changing them would be next to impossible. However, Forneris believes that facilities could do a better job of communicating their rules to companies – especially those that haven’t worked with big science organizations before, and need an entry point.

That last part is important, because as IDOM’s example shows, one big-science project often leads to another. Barely a decade after they submitted their first ELT design to the ESO, the firm now has a client portfolio that reads like a who’s who of major scientific facilities. “Our previous work in the astronomy field, and especially the ELT, made it easier to work on the ITER programme,” said Murga-Llano. “If you are thinking of doing something, my advice is, go for it. It’s fascinating.”

Mechanical splitter divides cell-sized liposomes

Liposomes, aqueous vesicles with a lipid-bilayer membrane much like our own biological cells, can be split into two stable daughter liposomes using a mechanical splitter. The division process is highly symmetrical, that is, it produces two roughly equally sized “daughters”, and occurs in just milliseconds. The new technique might not only provide a way to mimic the growth-division cycle of cells, it might also be used to divide liposomes in an exponential manner to make them smaller and smaller, which may be important for drug-delivery applications. It might even help shed more light on how life itself originated on Earth.

Liposomes are routinely employed as objects for both fundamental and applied studies. For example, they can be used as drug-delivery vehicles in medicine. They can also be filled with biomolecules to mimic biological cells.

Cell division, which produces one or more “daughter” cells from a “mother” cell with transfer of genetic material to the daughter(s), is one of the fundamental characteristics of living cells. However, reproducing this process in the lab is no easy task.

Floppy mother liposomes

“We have now demonstrated for the first time that we can mechanically split liposomes into two daughter liposomes,” says Cees Dekker of the Kavli Institute of Nanoscience Delft, who led this research effort. “We did this in a very intuitive way by running them onto a mechanical splitter that sliced the liposomes in two. While this sounds very logical and simple, it required some fine control of the surface-to-volume ratio of the liposomes since we needed to make the mother liposomes very ‘floppy’ so that the daughter liposomes could survive and did not burst.”

Dekker and colleagues made their liposomes using a technique that they recently developed called octanol-assisted liposome assembly (OLA) to produce liposomes on a chip.

Bubble blowing

“This technique is similar to bubble-blowing,” explains team member Siddharth Deshpande, lead author of the study. “As mentioned, we make the liposomes floppy as they are produced so as to have enough excess surface area to make division possible. We then run these liposomes against a Y-shaped splitter located on the same chip and split them symmetrically into two daughters.”
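The “floppiness” requirement follows from simple sphere geometry: two daughters holding the mother’s volume between them need about 26% more total membrane area than the mother. A back-of-the-envelope sketch (my own illustration, not a calculation from the paper):

```python
import math

def sphere_area_from_volume(v):
    """Surface area of a sphere of volume v."""
    r = (3 * v / (4 * math.pi)) ** (1 / 3)
    return 4 * math.pi * r ** 2

mother_volume = 1.0  # arbitrary units
mother_area = sphere_area_from_volume(mother_volume)

# Two equal daughters share the mother's volume...
total_daughter_area = 2 * sphere_area_from_volume(mother_volume / 2)

# ...but together need 2**(1/3) ≈ 1.26 times the mother's membrane area,
# so the mother must carry ~26% excess area ("floppiness") to divide
# without bursting.
area_ratio = total_daughter_area / mother_area
print(f"required area ratio: {area_ratio:.3f}")  # 1.260
```

This is why the OLA liposomes are made floppy at production: a taut spherical mother simply has no spare membrane to form two intact daughters.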

The process produces equal-sized daughter cells and occurs in just milliseconds, he tells nanotechweb.org. “It does not require any sophisticated protein machinery (unlike biological cells) and there is only limited leakage – that is, only a tiny amount of material inside the liposomes is lost to the surroundings.

Applications in synthetic biology

“We also have a good degree of control over the division process – for example, depending upon the liposome size, we can predict whether it will split, burst like a balloon or snake. Bursting often occurs in bigger liposomes and snaking in ones that are too small (they simply pass through the Y-branch).”

From a synthetic biology point of view, the fact that we can simply divide liposomes in such a brute-force manner is very interesting, he says. “One of our goals is to mimic the growth-division cycle of cells and such division could be one half of the cycle. Such repeated division cycles might also be used to produce ever smaller liposomes, which might be important for drug delivery applications.

Fundamental questions

“What is more, the technique may even help us better understand how life originated on Earth,” he adds. “Might primitive cells have relied on such simple ways of dividing themselves at the very beginning?”

The team, reporting its work in ACS Nano DOI: 10.1021/acsnano.7b08411, will now be trying to couple this division process to the growth of liposomes. “We are currently working on new techniques to grow liposomes and it will be quite an achievement to combine these two modules to produce the complete ‘life cycle’ of these aqueous vesicles.”

Superconductivity – pairing up with nanotechnology

The fundamental requirement for superconductivity is the coupling of fermionic electrons into Cooper pairs. Theory paints a neat picture of how the resulting bosonic behaviour allows occupation of the same energy levels and leads to a host of exotic effects – zero electrical resistance and the expulsion of magnetic flux lines, which lets superconducting objects levitate above magnets, to name two. Where the picture grows fuzzy is in extrapolating from there what specific aspects a material system needs to become superconducting at a given temperature. While design principles for fabricating a room-temperature superconductor remain elusive, a lot has been learnt in the chase, bringing applications of superconductors in sectors ranging from imaging and testing to quantum cryptography ever closer.

2D materials

Among the material systems where unusual electronic behaviour akin to Cooper pairing might be likely is the interface between perovskite oxides – in particular, LaAlO3 and SrTiO3 – where there is a discontinuity in the polarity of the crystalline lattice. Following the initial discovery of a highly mobile “2D electron gas” at the interface in 2004, Jochen Mannhart and colleagues identified superconducting properties there in 2007, confined to a layer just 20 nm thick. The transition temperature was a chilly 200 millikelvin, and the exact origins of the effect were unclear, but oxide interfaces remain a hotbed for exploring electronic and spintronic behaviour.

Since then several 2D structures have revealed superconducting behaviour where it does not exist in the bulk, an example being “grey” tin. The form of tin usually considered most useful is “white” tin, which has a conventional metal crystallographic structure and was among the first superconducting materials to attract study. At low temperatures, however, white tin will gradually transform into grey tin, which has a diamond cubic structure – a degradation sometimes described as “tin pest”. To their surprise, Qi-Kun Xue, Ding Zhang and colleagues at Tsinghua University in China found that when they reduced the dimensions of tin to 2D stanene of just 2–20 layers, they could observe superconducting properties in grey tin too. Going even thinner, to monolayers, resulted in insulating properties.

“What we found is that the grey tin can be scientifically quite interesting,” Zhang told nanotechweb.org. As well as the fundamental science the discovery opens up, it also offers the opportunity to produce circuits from a single material, with superconducting wires of few-layer stanene separated by insulating monolayers.

Structure of stanene

Triplet vs singlet superconductors

Graphene – the mother of today’s explosion of 2D material research – has also demonstrated superconducting properties. In most superconductors electrons pair up with opposite spins to give a singlet spin state with isotropic s-wave symmetry, in line with the Bardeen–Cooper–Schrieffer (BCS) theory of superconductivity. However, other types of superconductivity are possible in which the spins align in parallel, giving a triplet state with anisotropic p-wave symmetry or chiral d-wave symmetry.

According to theory, p-wave triplet superconductors are very sensitive to defects, so there is no chance of observing the state unless the crystals have very high purity. In addition, the superconducting transition temperature hovers at a frosty 1.5 K, below even the reach of liquid-helium refrigerants. When it comes to graphene, however, theory gives a more positive outlook for observing p-wave superconductivity. As well as the possibility of doping graphene to achieve superconducting effects, placing single-layer graphene on a superconductor should enhance intrinsic electron pairing with p- or d-wave symmetry to the point that a superconducting state is triggered in the graphene at temperatures above 4.2 K, the boiling point of liquid helium. This kind of intrinsic superconductivity by proximity was observed with s-wave symmetry on the s-wave superconductor rhenium in 2013, and at the beginning of 2017 researchers in the UK, Israel and Norway reported superconductivity by proximity with p-wave symmetry on the electron-doped cuprate superconductor Pr2−xCexCuO4 (PCCO) at 4 K.

Moving down the dimensions

While electronics may seem the obvious sector for exploiting zero-resistance phenomena, as the field becomes increasingly preoccupied with shrinking feature sizes the next question is: how low can you go before a wire loses its superconducting effects? In 2000 A. Bezryadin, C. N. Lau and M. Tinkham at Harvard in the US tackled the issue with measurements of ultrathin superconducting nanowires made from carbon nanotubes coated in a superconducting Mo–Ge alloy. At high temperatures thermal excitations give rise to phase slips, which disrupt the processes that cause superconductivity. Bezryadin, Lau and Tinkham argued that quantum tunnelling of phase slips could also effectively localize Cooper pairs, suppressing superconductivity if the wire were thin enough that its normal-state resistance exceeded the quantum resistance for Cooper pairs.
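That resistance criterion can be checked against fundamental constants. Here is a minimal sketch (my own illustration, not code from the study) evaluating the quantum resistance for Cooper pairs, R_Q = h/(2e)^2:

```python
# Quantum resistance for Cooper pairs (charge 2e): R_Q = h / (2e)^2
h = 6.62607015e-34   # Planck constant, J s
e = 1.602176634e-19  # elementary charge, C

R_Q = h / (2 * e) ** 2
print(f"R_Q ≈ {R_Q:.0f} ohm")  # ≈ 6453 ohm

# Bezryadin, Lau and Tinkham's criterion: quantum phase slips suppress
# superconductivity when the wire's normal-state resistance exceeds R_Q.
def phase_slips_dominate(normal_state_resistance_ohm):
    return normal_state_resistance_ohm > R_Q

print(phase_slips_dominate(8000))  # True: wire too resistive to superconduct
print(phase_slips_dominate(2000))  # False
```

So the crossover sits at roughly 6.5 kΩ of normal-state resistance, independent of the wire material.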

Nanowires have proven to be fruitful for exploring some of the exotic phenomena found alongside superconductivity too. Majorana fermions are particles that are their own antiparticle. Predicted by Ettore Majorana in 1937, they could be useful in quantum computing but remained unobserved for the next 75 years. In 2012 a team of researchers led by Leo Kouwenhoven at Delft University of Technology and Eindhoven University of Technology identified what they believed were Majorana fermions in a semiconductor wire wrapped in a superconductor. Although some had suggested that the signature of the Majorana fermions observed could be attributed to scattering, recent experiments by the same group with atomically smooth interfaces have ruled this out, providing a strong indication that the systems do harbour Majorana fermions.

Majorana measurement team

High-temperature-superconductor nanostructures

As well as carbon nanotubes, superconducting properties have been conferred on other nanostructures such as anodized alumina arrays by coating in superconducting materials. A popular choice of coating is YBa2Cu3Ox. Discovered by Paul Chu at the University of Houston in 1987, the superconducting transition temperature of YBa2Cu3Ox is a balmy 93 Kelvin. Other cuprates have demonstrated “high-temperature” superconducting properties since. While the transition temperature of YBa2Cu3Ox is still way below freezing, what made Chu’s discovery so significant is that above 77 K liquid nitrogen suffices as a coolant, and this is much easier to handle than liquid helium.
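The practical significance of crossing the 77 K line can be captured in a few lines (an illustrative sketch using standard cryogen boiling points, not figures from the article):

```python
# Boiling points of common cryogens at atmospheric pressure (K) --
# standard values, supplied here for illustration
CRYOGENS = {"liquid helium": 4.2, "liquid nitrogen": 77.4}

def usable_coolants(tc_kelvin):
    """Cryogens whose boiling point lies below the transition temperature."""
    return [name for name, bp in CRYOGENS.items() if bp < tc_kelvin]

print(usable_coolants(93))   # YBa2Cu3Ox: both cryogens work
print(usable_coolants(0.2))  # LaAlO3/SrTiO3 interface: neither does
```

At 93 K, cheap and easily handled liquid nitrogen suffices; below 77 K, only liquid helium (or closed-cycle coolers) will do.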

As well as coating nanotemplates, researchers have also produced YBa2Cu3Ox nanostructures through micromachining and electrospinning, but the final product usually requires heat treatment for superconducting properties to appear. While the initial powder provides a degree of versatility for coating arbitrary structures, the heat treatment makes the material brittle. William Rieken, Atit Bhargava and colleagues at Nara Institute of Science and Technology showed that they could produce a powder of YBa2Cu3Ox nanorods through a solution-processing approach, which is not only simple but does not require any further heat treatments. As a result they could paint the powder on structures without leaving them brittle. “Our materials don’t represent a new material, but a new way of thinking about superconductors and opens a route to designer superconductors, tailoring them for specific uses,” says Rieken.

Josephson junctions and photon detection

Many of the applications of superconductivity – from quantum key distribution in cryptography, long-range 3D infrared depth imaging and integrated-circuit testing to fundamental tests of quantum mechanics – rely on exploiting the phenomenon for photon detection. Superconducting wires are very sensitive to incident photons, as these can break up the Cooper pairs and thus suppress superconductivity. Devices based on this effect have successfully detected single photons in the visible and infrared regions of the spectrum, but have been less sensitive to lower-energy photons. A graphene sheet contacted at both ends by a superconductor has demonstrated the ability to detect microwave photons as well – an important region of the electromagnetic spectrum for astronomers seeking to detect the cosmic background radiation and determine how galaxies form.
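Why lower-energy photons are harder to detect follows from the BCS gap: a photon can only break a Cooper pair if its energy hf exceeds roughly 2Δ ≈ 3.5 k_B Tc. A rough sketch (my own illustration, using the standard BCS estimate rather than figures from the article):

```python
# BCS estimate of the pair-breaking threshold: photon energy h*f > 2*Delta,
# with 2*Delta ≈ 3.5 * k_B * Tc
h = 6.62607015e-34   # Planck constant, J s
k_B = 1.380649e-23   # Boltzmann constant, J/K

def min_pair_breaking_frequency_hz(tc_kelvin):
    """Lowest photon frequency that can directly break a Cooper pair."""
    return 3.5 * k_B * tc_kelvin / h

# For a Tc ~ 1 K superconductor the cutoff sits near 73 GHz, so microwave
# photons in the few-GHz range cannot be detected by pair breaking alone --
# one motivation for graphene-based detection schemes.
print(f"{min_pair_breaking_frequency_hz(1.0) / 1e9:.0f} GHz")  # 73 GHz
```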

The basic structure for these devices – a non-superconductor or insulator with superconducting contacts at either end – is described as a Josephson junction. The material between the superconductors is a weak link, which provides a channel for a supercurrent – a current that could in theory run forever with no applied voltage. Observations of the effect had been dismissed as breaches in the insulator between the superconducting contacts until 1962, when Brian David Josephson explained the phenomenon, predicting the supercurrent from the mathematical relationship between the current and the voltage.
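The current–voltage relationship Josephson derived has a striking consequence: a DC voltage across the junction makes the supercurrent oscillate at a frequency set only by fundamental constants, f = 2eV/h. A minimal sketch (my own illustration, not from the article):

```python
# AC Josephson relation: a DC voltage V drives a supercurrent oscillating
# at f = (2e/h) * V, roughly 483.6 MHz per microvolt
h = 6.62607015e-34   # Planck constant, J s
e = 1.602176634e-19  # elementary charge, C

def josephson_frequency_hz(voltage_volts):
    """Oscillation frequency of the supercurrent at a given junction voltage."""
    return 2 * e * voltage_volts / h

# One microvolt across the junction gives ~483.6 MHz -- the exactness of
# this relation is why Josephson junctions underpin voltage metrology.
print(f"{josephson_frequency_hz(1e-6) / 1e6:.1f} MHz")  # 483.6 MHz
```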

As well as photon detectors, researchers at NIST in the US have used Josephson junctions to build artificial synapses. Connections in the brain respond to the history of signals that have passed through them, which gives rise to learning. Neuromorphic-electronics researchers are keen to emulate synaptic connections to produce electronics with added functionality. Michael Schneider and colleagues built Josephson junctions consisting of two layers of superconducting material separated by an insulating silicon matrix embedded with nanoscale clusters of manganese. When an applied electric current exceeds a critical level, voltage spikes are produced that mimic the action potentials – the signal spikes – that neurons generate.

“These artificial synapses are in fact better than their biological counterparts,” Schneider told nanotechweb.org. “They can fire much faster – 1 billion times per second compared to a brain cell’s 50 times per second using just one ten-thousandth as much energy.”
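Taking the quoted figures at face value, the arithmetic works out as follows (using only the numbers in the quote):

```python
# Figures quoted by Schneider for the Josephson-junction synapse
artificial_rate_hz = 1e9   # fires up to 1 billion times per second
biological_rate_hz = 50    # a brain cell fires ~50 times per second
energy_fraction = 1e-4     # one ten-thousandth the energy per firing

# The speed advantage alone is 20-million-fold
speed_advantage = artificial_rate_hz / biological_rate_hz
print(f"{speed_advantage:.0e} times faster")             # 2e+07 times faster
print(f"{energy_fraction:.0e} of the energy per event")  # 1e-04 of the energy per event
```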

Superconducting devices out and about

Despite the great potential, superconducting devices are often confined to the lab because of all the bulky cooling equipment required. Progress has been made here too. Robert Hadfield at the University of Glasgow and colleagues at the STFC Rutherford Appleton Laboratory in the UK, Single Quantum B. V. in the Netherlands and KTH Royal Institute of Technology in Sweden miniaturized a platform for superconducting photon detectors that operate at 4 K. As the researchers point out in their report, “Although the need for liquid cryogens has been eliminated by the use of practical closed-cycle cryocoolers, such bulky, power hungry systems are not truly capable of mobile operation.”

The researchers go on to describe how they have developed a fully closed-cycle miniaturized cooling platform, based on Stirling and Joule–Thomson (J–T) cycles, that can reach a base temperature of 4.2 K and is the size of a desktop printer. A cooler of this design launched aboard an Ariane 5 rocket in 2009 as part of the Planck mission, where it operated “flawlessly for the entire mission duration of nearly 4.5 years (>39 k h)”. The design has now been adapted to house superconducting nanowire single-photon detectors.

A miniaturized 4 K platform for superconducting infrared photon counting detectors

SQUIDs

Take two Josephson junctions in parallel and you get a superconducting quantum interference device (SQUID) – a highly sensitive magnetic-flux detector. In a SQUID the current splits between the two Josephson junctions, but the presence of a magnetic flux generates a screening current that opposes it. This screening current loops around both Josephson junctions, adding to the current in one arm and subtracting from it in the other. As the flux through the loop increases, the screening current changes direction so as to either increase or decrease the total flux to an exact integer number of flux quanta.
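The quantization the screening current enforces comes in units of the magnetic flux quantum, Φ0 = h/2e. A short sketch (my own illustration; the rounding function is a simplification of the loop’s behaviour):

```python
# Magnetic flux quantum: Phi_0 = h / (2e) ≈ 2.07e-15 Wb
h = 6.62607015e-34   # Planck constant, J s
e = 1.602176634e-19  # elementary charge, C
PHI_0 = h / (2 * e)

def settled_flux_quanta(applied_flux_wb):
    """Integer number of flux quanta the loop settles on; the screening
    current supplies the difference, reversing direction each half quantum."""
    return round(applied_flux_wb / PHI_0)

print(f"Phi_0 = {PHI_0:.3e} Wb")           # 2.068e-15 Wb
print(settled_flux_quanta(3.4 * PHI_0))    # 3: screening cancels the excess
print(settled_flux_quanta(3.6 * PHI_0))    # 4: screening adds flux instead
```

The periodic response of the SQUID’s critical current as the applied flux sweeps through these half-quantum crossovers is what makes the device such a sensitive magnetometer.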

A key application of SQUIDs is in biosensing. Their high sensitivity enables magnetoencephalography, magnetocardiography, fetal magnetocardiography and biomagnetic liver susceptometry, which measures iron concentrations in body organs. SQUIDs can also detect injected magnetic nanoparticles that are functionalized to target specific proteins, cells or antigens to track, image and diagnose disease. Keiji Enpuku, Yuya Tsujita, Kota Nakamura, Teruyoshi Sasayama and Takashi Yoshida at Kyushu University in Japan reviewed recent progress in magnetic biosensing techniques using SQUIDs in the Superconductor Science and Technology focus collection on SQUIDs in biomagnetism.

Unleashing the full potency of this biosensing tool means detecting magnetic nanoparticles inside the human body with SQUIDs, and this raises additional challenges. “For in vivo human measurements the system must be non-invasive and conform to the anatomic restrictions requiring sensitive detectors and dedicated setups,” explain Oswaldo Baffa and colleagues at Universidade de São Paulo and Universidade Federal de Goiás in Brazil in a report in the same focus collection. They go on to describe a system based on an a.c. biosusceptometer to induce magnetism in manganese ferrite-based magnetic nanoparticles surface-coated with citric acid, which they detect using a second-order axial gradiometer coupled to a radio frequency SQUID. They achieved limits of detection of 8–11 × 10^9 at distances of 1.1–2.5 cm. While noting a number of possible improvements to the system they conclude, “The results found show that the bio-susceptometric technique has good sensitivity and temporal resolution since measurements take around 30 s, and might have interesting applications in the real-time in vivo detection of nanoparticles after systemic injection.”

Although studies of nanostructures have made huge contributions to advancing our understanding of superconductivity, many aspects of the phenomenon remain a marvellous mystery and a great stimulant for further research. Equally fascinating is the creativity in applying superconductors across such a diverse array of fields, and here without a doubt the tiny world of nanostructures has had a huge impact.

Copyright © 2026 by IOP Publishing Ltd and individual contributors