
Could the molecules of life supplant Moore's law?

By Hamish Johnston at the AAAS Annual Meeting, Vancouver, Canada

Depending on how you express it, Moore’s law has held up remarkably well over the past 40 years. In particular, chipmakers have been able to double the number of transistors that can be squeezed onto a chip every two years. This explains why the mobile phone in your pocket is more powerful than the most advanced “supercomputers” of the early 1970s.
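To put that doubling in numbers, here is a rough back-of-the-envelope sketch in Python. The starting figure – roughly 2300 transistors for Intel's 4004 of 1971 – is an approximate, illustrative value, not a claim from the text above:

```python
# Rough illustration of Moore's-law scaling: transistor count doubling
# every two years. The starting count is an approximate historical figure
# used purely for illustration.

def transistor_count(start_count, years, doubling_period=2):
    """Project a transistor count after `years` of exponential growth."""
    return start_count * 2 ** (years / doubling_period)

# Intel's 4004 (1971) had roughly 2300 transistors
projected = transistor_count(2300, years=40)
print(f"Projected count after 40 years: {projected:.2e}")
# 40 years of doubling every 2 years is a factor of 2**20 -- about a million
```

A million-fold increase in 40 years is why a pocket phone now outclasses a 1970s supercomputer.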

Around 2004, however, one aspect of this exponential growth hit the buffers: it became very difficult to remove the vast amount of heat produced by all these tightly packed circuits. As a result, not all of a chip's circuits can be run at their full potential – a problem with the ominous moniker of "dark silicon".

Looking into the not-so-distant future, another problem is expected to crop up when the size of insulating structures in circuits drops below about 4 nm. At this point, electron tunnelling between circuits is expected to put an end to Moore’s law.

One way round this, according to Ralph Cavin of the Semiconductor Research Corporation, is to make the electrons heavier and therefore less likely to tunnel. While this might sound crazy, an electron in a solid has an effective mass that is often greater than its mass in free space. What I don't understand about this solution is that the speed at which a transistor operates is related to the effective mass of its electrons: the heavier the electron, the slower the switching, which seems at odds with sustaining Moore's law.
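Some rough numbers show why a heavier effective mass suppresses tunnelling so strongly. This is a generic WKB estimate for a rectangular barrier – an illustration of the physics, not the SRC's actual analysis – with an assumed 1 eV barrier at the 4 nm scale mentioned above:

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J s
M_E = 9.1093837015e-31   # electron rest mass, kg
EV = 1.602176634e-19     # one electronvolt in joules

def tunnel_probability(barrier_nm, barrier_eV, mass_ratio):
    """WKB estimate T ~ exp(-2*kappa*d) for a rectangular barrier,
    with effective mass m* = mass_ratio * m_e."""
    d = barrier_nm * 1e-9
    kappa = math.sqrt(2 * mass_ratio * M_E * barrier_eV * EV) / HBAR
    return math.exp(-2 * kappa * d)

# A 4 nm, 1 eV barrier: a heavier effective mass suppresses tunnelling
for m in (0.2, 1.0, 2.0):
    print(f"m*/m_e = {m}: T ~ {tunnel_probability(4.0, 1.0, m):.2e}")
```

Because the effective mass sits under a square root inside an exponential, even a modest mass increase cuts the tunnelling probability by many orders of magnitude.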

Ultimately, engineers will have to look beyond Moore’s law, and that was the topic of a session where Cavin spoke at the American Association for the Advancement of Science (AAAS) meeting here in Vancouver.

Cavin is keen on in carbo electronics – devices that are based on the remarkably efficient information processing done by living organisms. The benefits, according to Cavin, are many. For one thing, biological molecules such as DNA can store data at much higher densities than the ultimate upper limit of semiconductor devices. Living systems are also highly parallel and extremely energy efficient. On the downside, living circuits are much slower than silicon.

I’m not sure when your mobile phone will contain in carbo devices, if ever, but work has begun in that direction.

Graphene-based composites could cool electronics

 

Researchers from the University of California at Riverside, US, say that they have developed a new “thermal interface material” (TIM) that could efficiently remove unwanted heat from electronic components such as computer chips or light-emitting diodes. The material is a composite of graphene and multilayer graphene.

Unwanted heat is a big problem in modern electronic systems that are based on conventional silicon circuits – and the problem is getting worse as devices become ever smaller and more sophisticated. TIMs are positioned between a heat source – such as a computer chip – and a heat sink, and they play a crucial role in cooling devices. Conventional TIMs are generally filled with thermally conducting metal particles and have thermal conductivities in the range 1–5 Wm–1 K–1 at room temperature. A high volume fraction (of more than 50%) of filler particles is usually needed to achieve such conductivities.

Cooling composition

Graphene could be ideal for use as a filler in TIMs for carrying away heat because pure graphene has a large intrinsic room-temperature thermal conductivity that lies in the range 2000–5000 Wm–1 K–1. These values are higher than those of diamond, the best bulk-crystal heat conductor known.

Ideally, for practical applications, researchers would like to make TIMs with thermal conductivities of about 25 Wm–1 K–1. Such materials could be used not only to cool digital electronic components efficiently, but also in energy applications – for example, to prevent solar cells from overheating – and in next-generation high-power-density communication devices.

Alexander Balandin and colleagues proposed the use of few-layer graphene as a TIM in 2010. Now, they have succeeded in increasing the thermal conductivity of a routinely employed industrial epoxy-resin-based TIM, or “grease” as it is better known in the industry, from about 5.8 Wm–1 K–1 to a record 14 Wm–1 K–1. The filler particles in this case consist of an optimized mixture of graphene and few-layer graphene, and the volume fraction of the carbon-based material in the epoxy is very low at just 2%.
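A quick back-of-the-envelope check on those numbers. A common figure of merit for filler materials is the relative conductivity enhancement per unit volume fraction of filler; the calculation below is generic, using only the values quoted above:

```python
# Back-of-the-envelope figure of merit for a TIM filler: the relative
# thermal-conductivity enhancement divided by the filler volume fraction.
# Input values are those quoted in the article.

def enhancement_per_volume_fraction(k_composite, k_matrix, volume_fraction):
    """Relative conductivity increase per unit filler loading."""
    return (k_composite - k_matrix) / k_matrix / volume_fraction

# 5.8 -> 14 W m^-1 K^-1 at just 2% graphene loading
eta = enhancement_per_volume_fraction(14.0, 5.8, 0.02)
print(f"Enhancement per unit volume fraction: {eta:.0f}")
```

An enhancement factor of roughly 70 per unit volume fraction is what makes the 2% loading remarkable, given that conventional TIMs need more than 50% filler to reach far lower conductivities.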

The researchers prepared their own graphene and few-layer graphene using an inexpensive and simple liquid-phase exfoliation technique. This is a high-yield method that can easily be scaled up to industrial levels.

Reduced resistance

According to the team, it is the presence of single and bilayer graphene together with thicker graphitic multilayers that enhances the thermal conductivity of the composite to the values observed. “The excellent performance of graphene in this respect – compared with, say, carbon nanotubes, for instance – probably comes about thanks to the smaller ‘Kapitza’ thermal-interface resistance between graphene and the base matrix material,” says Balandin. “Graphene simply couples to the matrix material better.”

The experiments show that graphene and few-layer graphene flakes are more efficient filler materials for increasing the thermal conductivity of TIMs than conventionally used fillers, such as alumina particles. The new graphene-based fillers also differ from previously tested materials, such as carbon nanotubes and graphitic nanoplatelets, which couple only weakly to the matrix.

Exploiting nanoscale effects

Balandin says that he has been studying the thermal properties of nanostructures – including extremely thin films and nanowires – for nearly 15 years. “My motivation was to exploit nanoscale effects to control the propagation of phonons – vibrations of the crystal lattice responsible for heat conduction in many materials,” he explains.

The team now plans to work with industry engineers to develop the next generation of TIMs, which could well be based on graphene. “These would have to meet the specific requirements of different applications,” says Balandin.

The work was reported in Nano Letters.

Tying qubits in knots

Photo of a knot

By Hamish Johnston at the AAAS Annual Meeting, Vancouver, Canada

Some of the world’s leading experts on quantum computing are here in Vancouver for the American Association for the Advancement of Science (AAAS) annual meeting – and it’s been great to hear them speak and to also interview some of them.

One topic that has come up several times is the idea of topological quantum computing. A major challenge for those trying to build practical quantum computers is how to protect the “quantumness” of their fragile devices from the destructive effects of environmental noise and heat.

One approach is to take advantage of the topological nature of some quantum states. One example involves quasiparticles called anyons, which are predicted to exist in 2D semiconductor systems. Because anyons cannot overlap as they move around one another, their paths through space and time wind together into quantum states called "braids" that criss-cross each other.

A key feature of the braids is that they are robust to noise and heat. Indeed, to destroy such a state it must be unravelled much like untying a knot – a process that takes time and effort. This is unlike a more conventional quantum state such as the spin of an electron, which can be destroyed by a simple nudge from a random magnetic field.

Michael Freedman of Microsoft Station Q in Santa Barbara is one of the pioneers in developing the theory of topological computing, and he spoke at the conference. He left the audience with this vision for the future: “There is a serious prospect that quantum computing will change the face of computation.”

Other speakers had a complementary take on this. Scott Aaronson of the Massachusetts Institute of Technology believes that quantum computing and the emergence of quantum computers will give physicists new insights into quantum physics. “Quantum computing has opened a two-way street between physics and the science of computation,” he said.

The essential guide to topological computing can, of course, be found in Physics World.

A soggy afternoon at TRIUMF

Ariel hardhat


Building ARIEL

By Hamish Johnston at the AAAS Annual Meeting, Vancouver, Canada

One of the pleasures of my job is that I get to talk to people who are passionate about physics. But nothing prepared me for Lia Merminga, who has to be the most enthusiastic physicist I have ever met. Merminga is head of the accelerator division at TRIUMF in Vancouver – which started as a particle-physics facility back in 1968 but has since branched out into nuclear, medical, biological and condensed-matter physics.

I was at the lab yesterday, dodging the puddles as we toured the campus under leaden skies. The highlight of the tour was getting a close-up look at the cyclotron, which was shut down for maintenance.

You can see the photos I took during the tour on our Flickr page.

I also spoke to Merminga about the Advanced Rare Isotope Laboratory (ARIEL) electron accelerator facility that is currently being built at TRIUMF. Indeed, I suspect much of her enthusiasm comes from the fact that she has what must be a dream job for an accelerator physicist – she’s in charge of building a brand-new accelerator!

I spoke with Merminga about many aspects of ARIEL, so look out for an interview sometime in the future on physicsworld.com.

Inside the box at D-Wave

Geordie Rose

By Hamish Johnston, reporting from Vancouver, Canada

Yesterday I took a cab out to nearby Burnaby to have a chat with Geordie Rose, a physicist who is co-founder of the quantum-computer maker D-Wave Systems. That's Geordie on the right, standing next to one of the firm's famous black boxes.

What’s inside the box? I had a look. The box itself is a shield that protects its contents from electromagnetic fields that would wreak havoc with D-Wave’s quantum bits (qubits) – which are superconducting flux qubits. In simple terms, each qubit is a little magnet that could easily be perturbed by stray fields.

Also in the box is a dilution refrigerator, which cools the chips to near absolute zero. D-Wave uses “dry” fridges that don’t need to be topped up with costly liquid helium. Indeed, Rose says that the firm has played an important role in the development of dry fridges.

The fridge cools an integrated circuit that contains hundreds of flux qubits. They are arranged in a 2D array where each is coupled with its nearest neighbour, creating an Ising model on a chip.
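For readers unfamiliar with the Ising model, the energy of a spin configuration has the form E = −Σ J_ij s_i s_j − Σ h_i s_i, summed over coupled pairs and individual spins. Here is a minimal sketch on a tiny 2×2 grid – purely illustrative, and not D-Wave's actual architecture or programming interface:

```python
# Minimal Ising-model sketch: spins s_i = +/-1 on a small grid with
# nearest-neighbour couplings J and local fields h. Illustrative only --
# not D-Wave's hardware layout or API.

import itertools

def ising_energy(spins, J, h):
    """spins: dict node -> +/-1; J: dict (node_a, node_b) -> coupling;
    h: dict node -> local field strength."""
    e = -sum(Jij * spins[a] * spins[b] for (a, b), Jij in J.items())
    e -= sum(hi * spins[n] for n, hi in h.items())
    return e

# 2x2 grid with ferromagnetic couplings: aligned spins minimise the energy
nodes = [(i, j) for i in range(2) for j in range(2)]
J = {((0, 0), (0, 1)): 1.0, ((1, 0), (1, 1)): 1.0,
     ((0, 0), (1, 0)): 1.0, ((0, 1), (1, 1)): 1.0}
h = {n: 0.0 for n in nodes}

# Brute-force search over all 2^4 configurations for a ground state
best = min(itertools.product([-1, 1], repeat=4),
           key=lambda s: ising_energy(dict(zip(nodes, s)), J, h))
print(best)  # with zero fields, all-up and all-down are both ground states
```

A brute-force search like this scales exponentially with the number of spins, which is precisely the kind of optimization problem D-Wave's hardware is designed to tackle physically.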

I also stopped by to chat to D-Wave’s Suzanne Gildert (below), who is developing the firm’s Developer Portal, where programmers can learn about how to write code for the systems. The portal is in a beta version at the moment but a full-blown portal will soon be available to all.

Suzanne Gildert

Coming away from D-Wave, you can’t help thinking that the company has cracked the challenge of creating a viable and scalable quantum computer. Indeed, you can even buy one, if you want. But Rose admits there are lots of challenges ahead before quantum computing goes mainstream – and he thinks the best way forward is to keep building and keep improving the systems.

You can see more photos from my visit on our Flickr page.

Quantum weirdness


By Matin Durrani

Quantum physics is notoriously counter-intuitive and difficult to grasp, which is perhaps why the subject is often invoked to explain other seemingly counter-intuitive and difficult-to-grasp areas of life.

But that doesn’t mean that wheeling out the subject necessarily makes any sense.

So here’s to the former Bishop of Southwark, Thomas Frederick ‘Tom’ Butler, whose comments on the Thought for the Day slot on BBC Radio 4 this morning (at about 1.47) are a classic of the genre.

The bishop begins, for some reason, by talking about the search for the Higgs boson (or Higgs “bos’un” as he irritatingly puts it) at CERN’s Large Hadron Collider, which, he reveals to the world, “will shortly be reactivated”. (I presume he means restarted after the scheduled winter shut-down as the scientists on the machine have happily been taking data for more than two years now.)

Having brought up the LHC, Butler then jumps suddenly into quantum physics, bemoaning his lack of understanding of the subject. But that’s okay, we’re told, because Niels Bohr once said that “those who aren’t shocked when they first come across quantum theory can’t possibly have understood it”. (Quote a heavyweight from science – that’ll impress the listeners.)

And so to the heart of the matter: fundamental entities can be both particles and waves, which is, er, a bit like religion really. “Paradoxically it [quantum mechanics] has made some of the traditional problems of the nature of God easier to understand,” says the bishop, pointing out, for example, that Jesus is both human and divine.

Butler admits he found it hard, when he was younger, to come to terms with this paradox. “I found this both/and faith world difficult to grasp,” he says. “Surely this paradox couldn’t be right?”

Ah, but all’s well now, thanks to quantum physics. “It tells us the world is paradox. The fundamental nature of existence is both/and.”

But if you have any lingering doubts over the validity of quantum physics, don’t worry. “Hopefully we’ll soon have the Higgs boson to give the theory the stamp of approval.”

You can relive the whole item here – jump to about 1.47 minutes.

Good morning from Vancouver

View from Vancouver


Snowy mountains and the sea

By Hamish Johnston at the AAAS Annual Meeting, Vancouver, Canada

I have just registered for the American Association for the Advancement of Science (AAAS) meeting in Vancouver. Above is what you see from the convention centre – nice view!

More from Vancouver later, I’m off to D-Wave to talk about quantum computers.

DNA nanorobot delivers drugs

Researchers in the US have developed a new nanorobotic device based on DNA that can deliver “cargo”, such as drugs, to individual biological cells. The technology might one day be used to treat various diseases by directly programming the immune response of cells.

The nanorobot, developed by Shawn Douglas and colleagues of the Wyss Institute for Biologically Inspired Engineering at Harvard University, US, is in the form of a hexagonal DNA “hinged” barrel that can be opened and closed. The device measures 35 × 35 × 45 nm and can hold various types of cargo – such as metal nanoparticles – in its interior. It is kept closed by two “locks” that are encoded with aptamers – artificial nucleic-acid receptor molecules that bind to specific target molecules, like some antigens – and can be opened like an oyster shell when it interacts with the right combination of antigen “keys”. These keys can be proteins on biological cell surfaces and can be made to include specific disease markers, explains Douglas.

Logical locks

The nanorobot can be programmed to open when exposed to a single type of key by using the same aptamer sequence on both lock sites. Another possibility is to encode different aptamer sequences in the locks so that they recognize two inputs. Both locks need to be opened at the same time to activate the device, and it remains firmly shut if only one of the two locks is opened. The lock mechanism thus functions as a logical AND gate: a cell-surface antigen either binds ("1") or does not bind ("0") to each aptamer lock.
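The two-lock AND-gate behaviour can be sketched in a few lines of code. This is a toy illustration of the logic described above; the function names and inputs are mine, not the authors' encoding:

```python
# Toy sketch of the two-aptamer AND-gate lock logic. Names and structure
# are illustrative, not the authors' implementation.

def barrel_opens(antigen_a_bound, antigen_b_bound):
    """Each aptamer lock is released only when its target antigen binds;
    the DNA barrel opens only when BOTH locks are released (logical AND)."""
    return antigen_a_bound and antigen_b_bound

def barrel_opens_single_key(antigen_bound):
    """Single-key mode: the same aptamer sequence on both lock sites,
    so one antigen supplies both inputs."""
    return barrel_opens(antigen_bound, antigen_bound)

print(barrel_opens(True, True))    # both keys present: barrel opens
print(barrel_opens(True, False))   # one lock still shut: stays closed
```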

The barrel was made using “DNA origami” in which complex 3D shapes and objects are constructed by folding strands of DNA. Such a technique has already been used to make nano-sized boxes, which are also capable of carrying cargo, with lids that can be locked and unlocked.

“When the nanorobot opens, its previously sequestered biologically active payload can then interact with nearby cells,” explains Douglas. “By designing the lock to open when bound to a particular antigen available on a cell surface, the device can thus be targeted to induce cell-signalling ‘instructions’ in certain cell populations that express the antigen. At the same time, the cargo is prevented from interacting with other cells [that do not express the antigen] in the same environment.”

Instructing cancer cells

The researchers have already used their device to deliver instructions, encoded in antibody fragments, to two different types of cancer cells – those responsible for leukaemia and lymphoma. “The instructions were different for both types of cancer cell and contained different antibody combinations,” explains Douglas, “and in each case, the message was to activate the cells’ ‘suicide switch’ – a standard feature that allows ageing or abnormal cells to destroy themselves.”

The device is the first DNA-origami-based system that employs antibody fragments to convey molecular instructions, something that allows for a completely controlled and programmable way to replicate an immune response, says team leader, George Church. “We are now finally able to integrate sensing and logical computing functions via complex, yet predictable nanostructures,” he states. “These are some of the first hybrids of structural DNA, antibodies, aptamers and metal atomic clusters aimed at useful, very specific targeting of human cancers.”

The team now plans to test its device on rodents before considering human clinical trials. “Its applications are possibly not just restricted to smart therapeutics but may also be used in diagnostics and even nonmedical applications,” adds Church.

The work was published in Science 10.1126/science.1214081.

Uterus contractions caused by electrical coupling

New research could help explain one of the “miracles” of childbirth – how the uterus contracts to push babies into the world. Computer models, developed by researchers in India and France, reveal that as the moment of birth approaches, the cells in the uterus become more electrically connected, enabling them to behave in synchrony. To date, it had not been clear how the cells in the uterus could act together to generate a large-scale contraction during labour.

The uterus is composed of active muscle cells plus electrically passive cells that connect and support them, with muscle cells outnumbering the passive cells. In a general sense it is believed that contractions are caused by positive ions flowing into a muscle cell via small protein complexes called gap junctions. These gap junctions electrically couple the two types of cell, and these junctions are known to multiply near the end of a pregnancy.

Earlier experiments on rats demonstrated that at the time of delivery, the electrical conductivity of the uterus was about nine times greater than it was two to three days before, which suggests that the number of gap junctions rose from about 50 to 450 per muscle cell. Structural studies have shown that gap-junction conductance could increase as much as 100-fold. However, the link between increasing the number of gap junctions and the emergence of synchronized contractions was not fully understood.

Modelling the uterus

This link is the subject of new research by a team including Sitabhra Sinha of the Institute of Mathematical Sciences, Chennai, India, working alongside colleagues at the Ecole Normale Supérieure in Lyon, France. In order to explore how increased electrical connectivity could affect the electrical oscillations of the cells, Sinha and colleagues used a computer simulation to model a network of excitable and passive elements with varying coupling strength.

The results showed that with relatively few connections between neighbours, the oscillations in the system were initially all out of synch. As the coupling increased, clusters of excitable elements began to oscillate with the same frequency. With further connections, the clusters merged, until all oscillating elements shared a frequency, with a few inactive regions. Finally, all the excitable elements oscillated in a single wave, similar to the waves observed experimentally in the uterus of a guinea pig.
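The qualitative picture – weakly coupled oscillators drifting independently, strongly coupled ones locking into a common rhythm – can be illustrated with a generic Kuramoto-style model. This is a stand-in sketch with all-to-all coupling, not the authors' lattice of excitable and passive elements:

```python
# Generic Kuramoto sketch of coupling-driven synchronisation.
# All parameter values are illustrative; this is not the uterus model
# from the paper.

import math
import random

def order_parameter(phases):
    """Kuramoto order parameter r: 0 = incoherent, 1 = fully in phase."""
    n = len(phases)
    re = sum(math.cos(p) for p in phases) / n
    im = sum(math.sin(p) for p in phases) / n
    return math.hypot(re, im)

def simulate(coupling, n=30, steps=600, dt=0.05, seed=0):
    """Euler-integrate n Kuramoto oscillators with all-to-all coupling."""
    rng = random.Random(seed)
    omega = [rng.gauss(1.0, 0.1) for _ in range(n)]          # natural frequencies
    theta = [rng.uniform(0, 2 * math.pi) for _ in range(n)]  # random initial phases
    for _ in range(steps):
        pull = [sum(math.sin(tj - ti) for tj in theta) / n for ti in theta]
        theta = [ti + dt * (w + coupling * p)
                 for ti, w, p in zip(theta, omega, pull)]
    return order_parameter(theta)

# Stronger coupling pulls the oscillators into a common rhythm
for K in (0.0, 0.5, 2.0):
    print(f"K = {K}: r = {simulate(K):.2f}")
```

As the coupling constant rises past a threshold, the order parameter jumps towards 1 – a rough analogue of gap junctions multiplying until the whole uterus contracts as one.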

“There is some evidence that in the uterus, the frequency of contractions increases as the time of delivery approaches,” says Sinha. “Our results suggest an intriguing possibility of how such increased oscillation frequencies can come about as a result of enhanced coupling between cells.”

Robert Garfield of St Joseph Hospital and Medical Center in Phoenix, Arizona, calls the study a “wonderful contribution” that supports the theory that burgeoning gap junctions among the muscular cells can bring on the “synchronous and rhythmic activity of the uterus required for delivery of the fetus”.

Excited and passive cells

But Chi Keung Chan of the Academia Sinica, Taipei, Taiwan points out that the model does not quite reflect the real electrical potentials of the active and passive cells in the uterus. “Real muscle cells in the uterus have resting potentials below that of the passive cells, but their excited potentials are above that of passive cells,” he says. In the model, the excitable cells always have a potential below that of the passive cells, whether they are resting or excited. “It is like connecting lots of batteries to the excitable cells, and it is not surprising that the system will fire continuously,” Chan adds.

According to Pik-Yin Lai of the National Central University in Jhongli City, Taiwan, the transition within uterus cells to synchronized behaviour is likely to involve other biological factors. "The onset of collective periodic contraction is most likely an external event caused by hormonal secretions," says Lai, pointing out that an oxytocin injection can induce contractions in about 10 min.

Sinha agrees that the role of hormones must be included, and that this is another current research direction for his collaboration. “Our preliminary simulations show that increased excitability of the excitable cells results in a transition to coherent activity,” he says, and he notes that when the gap junctions between cells were dissolved in other experiments, a hormone injection could no longer induce collective contractions.

This research is published in Physical Review Letters.

Optical fibres with integrated semiconductor junctions developed

An international team of researchers has integrated a semiconductor junction into an optical fibre for the first time. The device, which works at gigahertz frequencies, is the first step in creating an all-fibre optical-communications network where light is generated, modulated and detected within a fibre itself without the need for integration with electronic chips. Its range of applications could run from improved telecommunication systems and laser technology to more-accurate remote-sensing devices.

The research team, led by John Badding, a chemist at Penn State University, includes other chemists at Penn State and a team of physicists at the Optoelectronics Research Centre at Southampton University in the UK.

Fibre to wire, and back

One of today’s crucial technological challenges involves developing a way to seamlessly and rapidly exchange information between optical systems and electronic ones. Badding explains that when it comes to existing technology, the merging of a round optical fibre with the minute components on a flat electronic chip is difficult to achieve. But integration is crucial because silicon-based integrated circuits are the building blocks for most electronic devices.

“For example, light is transmitted from London to New York via fibre-optic cables when two people set up a video call on their computers. But the computer screens and associated electronic devices have to take that light and convert it to an image, which is an electrical process. Light and electricity are working in concert in a process called an OEO conversion, or an optical–electrical–optical conversion,” says Badding. Instead, Badding and other researchers around the world are trying to find an “all-fibre solution” where the electrical components can be directly integrated into an optical fibre such that the light never needs to leave the fibre.

But that is easier said than done. In addition to the challenge of shaping a connection between flat chips and round fibres, a major problem is the size of the objects involved. “An optical fibre is 10 times smaller than the width of a human hair. On top of that, there are light-guiding pathways that are built onto chips that are even smaller than the fibres by as much as 100 times, so imagine trying to line those two devices up,” Badding explains.

Novel merger methods

Instead, the researchers have come up with a novel way to integrate the two components: they built a new kind of optical fibre that contains its own integrated electronic component. They used a high-pressure chemistry technique known as “chemical vapour deposition” to deposit layers of semiconducting materials directly into specially fabricated tiny pores in an optical fibre. By doing so, they effectively create a semiconductor junction inside a fibre. “We had to rethink basic semiconductor fabrication, only within an optical core,” says Badding.

The researchers began with an empty pore in an optical fibre and then added layers of platinum and doped silicon, creating the junction. Badding points out that an added benefit of the technique is that it uses silicon – “the classic material of modern electronics”. The main challenge was to layer the semiconductor materials evenly inside such a long, thin hole. “People were surprised that our method worked because the holes are so long and thin…they kept asking us why the hole did not clog,” says Badding. He explains that the team used extremely high pressure – more than 300 times atmospheric pressure – to deposit one layer, which is then heated before another layer is painted onto the first, creating the junction. “It’s like trying to evenly and smoothly paint a garden hose that runs from, say, London to Southampton,” he says.

All-fibre networks

In the end the team succeeded, producing an integrated optoelectronic fibre for the first time. The method is also cost-efficient, as it does not require the multimillion-dollar clean-room facilities and equipment that conventional chip fabrication demands. Another benefit is that the technique could work equally well with other semiconducting materials.

In the future, one of the key goals of research in this field is creating an all-fibre network. “If the signal never leaves the fibre, then it is a faster, cheaper and more efficient technology,” says Pier Sazio from the University of Southampton, who is one of the team’s leaders. “Moving technology off the chip and directly onto the fibre, which is the more natural place for light, opens up the potential for embedded semiconductors to carry optoelectronic applications to the next level. At present, you still have electrical switching at both ends of the optical fibre. If we can actually generate signals inside a fibre, a whole range of optoelectronic applications becomes possible.”

The work is published in Nature Photonics.

Copyright © 2026 by IOP Publishing Ltd and individual contributors