Engineers in the US and Taiwan have carried out an experiment that they say exposes a serious flaw in our understanding of how transistors work. The research finds that as transistors shrink, the amplitude of electronic “noise” in these devices grows even more than standard theory says it should. The researchers warn that unless our understanding of noise is revised, the development of next-generation laptops, mobile phones and other low-power devices could be hampered.

Transistors perform an essential role in electronic devices by amplifying and switching signals, but in order to do this reliably they must be made from highly purified materials. Defects in these materials can — like rocks in a stream — impede the flow of current and cause a transistor to malfunction. As a result, the transistor may fluctuate rapidly between its “on” and “off” states in an effect known as “random telegraph noise”.
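The on/off switching described above can be pictured as a two-state process: a single defect randomly captures and releases charge, toggling the current between two levels. The sketch below simulates this as a simple Markov chain; all rates and current levels are illustrative values chosen for the example, not measurements from the study.

```python
import random

def telegraph_noise(n_steps, p_capture=0.05, p_emit=0.10,
                    i_on=1.0, i_off=0.7, seed=42):
    """Toy simulation of random telegraph noise.

    A single defect toggles between empty and occupied states; each
    capture drops the drain current from i_on to i_off. Probabilities
    and current levels are illustrative, not physical values.
    """
    rng = random.Random(seed)
    occupied = False
    trace = []
    for _ in range(n_steps):
        if occupied:
            if rng.random() < p_emit:     # defect releases the electron
                occupied = False
        else:
            if rng.random() < p_capture:  # defect captures an electron
                occupied = True
        trace.append(i_off if occupied else i_on)
    return trace

trace = telegraph_noise(1000)
print(sorted(set(trace)))  # the current jumps between exactly two levels
```

The telltale signature of this kind of noise is that the current trace takes exactly two discrete values rather than fluctuating continuously.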

What’s all that noise about?

For decades, engineers have been guided by a standard theory that says these fluctuations should become larger as transistors shrink, spelling bad news for low-power devices. Recent findings from Kin Cheung and colleagues at the National Institute of Standards and Technology (NIST) in Gaithersburg, Maryland, have shown that the fluctuations may be somewhat larger than predicted and, more importantly, that the frequency of their occurrence is inconsistent with conventional noise theories.
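Why should shrinking make the fluctuations larger? A common textbook estimate (not necessarily the exact model the NIST team tested) is that a single trapped electron shifts the threshold voltage by roughly q/(Cox·W·L), so the shift grows as the gate area W·L shrinks. The arithmetic below compares the device size quoted later in this article with a device ten times larger in each dimension, assuming a 2 nm silicon-dioxide gate oxide (an assumption made for the example, not a figure from the study).

```python
# Single-electron threshold-voltage shift: dVth = q / (Cox * W * L)
Q_E = 1.602e-19           # elementary charge (C)
EPS_OX = 3.9 * 8.854e-12  # permittivity of SiO2 (F/m)

def single_electron_shift(width_m, length_m, t_ox_m):
    c_ox = EPS_OX / t_ox_m                    # oxide capacitance per area (F/m^2)
    return Q_E / (c_ox * width_m * length_m)  # threshold shift in volts

# 0.085 um x 0.055 um device vs one 10x larger in each dimension,
# both with an assumed 2 nm gate oxide:
small = single_electron_shift(0.085e-6, 0.055e-6, 2e-9)
large = single_electron_shift(0.85e-6, 0.55e-6, 2e-9)
print(f"{small * 1e3:.2f} mV vs {large * 1e3:.4f} mV")
# the smaller device sees a 100x larger shift from one trapped electron
```

This inverse-area scaling is what makes the standard theory such bad news for tiny transistors: the disturbance from a single defect no longer averages away.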

These researchers looked specifically at the most common transistor in both digital and analogue circuits — the MOSFET, or metal–oxide–semiconductor field-effect transistor. Surprisingly, they found that even in nanoscale transistors just 0.085 micrometres wide and 0.055 micrometres long, the frequency at which the device fluctuates between its on and off states does not differ much from that of larger transistors.

Whilst there have been previous criticisms of the standard noise model, no-one has been able to prove unequivocally that it is flawed. Cheung and his team now say their results are the most convincing refutation yet because they were obtained using an “ultra-thin” transistor. With a gate dielectric only a few molecules thick, they claim they can rule out other potential sources of noise, making this the first “absolute test” of the standard theory. “We have now used our data to examine all the alternative models and found that, to first order, none of them work,” Cheung told physicsworld.com.

It’s good to talk

If the current model of noise is indeed wrong, this could have a significant impact on the design of low-power technologies. The hope is that consumers will see benefits like mobile phones that can run for a week on a single charge or pacemakers that operate for a decade without requiring a change of batteries. Both would require very small and reliable transistors. “We have to understand the problem before we can fix it — and troublingly, we don’t know what’s actually happening,” said Jason Campbell, another of the NIST researchers.

Asen Asenov, an electronics researcher at the University of Glasgow in the UK, believes this research addresses a pressing issue in electronics. “RTN has become dramatically important and is a main show stopper to the Flash memory scaling,” he said. Asenov is concerned, however, that the researchers do not take into account that transistors occasionally capture single electrons. “[Electron capture] creates localized depletion regions in the semiconductor changing the relative position of the energy level and the conduction band.”

Although Cheung and his team have made careful measurements, further research will be needed to confirm what is really going on in ultra-small transistors.