
Dots deliver full-colour display

Researchers in South Korea and the UK say that they have produced the first large-area, full-colour display based on red, green and blue quantum dots. The technology could spur the launch of colour TV screens combining a vast colour range with an incredibly small pixel size.

Both attributes stem from the intrinsic properties of the quantum dots, which despite being just a few nanometres in diameter comprise several thousand atoms that form tiny compound-semiconductor crystals. When electrons inside the quantum dots recombine with their positively charged counterparts, known as holes, it can result in the emission of a narrow band of light.

Making a colour display with the dots requires their deposition onto a substrate in a well-controlled manner. Monochrome displays can be made by spin-coating – dropping a dot-containing solution onto a substrate and spinning this around to yield a thin film of material. This approach is unsuitable for making a full-colour display, however, because it would cross-contaminate red, green and blue pixels.

Patterned rubber stamps

In this new work, a team led by Tae-Ho Kim at the Samsung Advanced Institute of Technology in South Korea overcame this issue by spin-coating red, green and blue dots onto separate “donor” substrates, before transferring them in turn to the display with a patterned rubber stamp.

To make a 4-inch diameter, 320 × 240 pixel display, a pair of hole-transporting polymers was deposited onto a piece of glass coated in indium tin oxide. Red, green and blue dots were stamped onto this structure, which was then coated in titanium dioxide, a material with good electron-transporting properties.

Adding a thin-film transistor array allowed a different voltage to be applied to each of the 46 × 96 µm pixels. Increasing this voltage increases the brightness of the pixel, because more electrons and holes are driven into the dots, where they recombine to emit light.

Higher-resolution displays could be possible by reducing pixel size. “We showed an array of narrow quantum dot stripes of 400 nm width [in our paper], which indicates the feasibility of nano-printing quantum dots with extremely high resolution,” says Byoung Lyong Choi, one of the Samsung researchers. This suggests that the Korean team’s technology is more than capable of producing displays at the highest resolution that is practical for the naked eye, which can resolve pixel sizes down to about 50 µm.
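That naked-eye figure can be checked with a back-of-envelope calculation. The one-arcminute angular resolution and the viewing distances below are textbook assumptions for this sketch, not numbers from the researchers:

```python
import math

# Assumed angular resolution of the human eye: about one arcminute
theta = math.radians(1 / 60)      # radians

near_point = 0.25                 # m, a comfortable close-viewing distance
feature = near_point * theta      # smallest resolvable feature at 25 cm

print(f"{feature * 1e6:.0f} µm")  # ~73 µm at 25 cm
```

At 25 cm the eye resolves features of roughly 70 µm; hold the display closer, around 17 cm, and the same one-arcminute angle subtends about 50 µm, consistent with the quoted limit.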

Improving efficiencies

One downside of the Korean display is its low efficiency – just a few lumens per watt, which is roughly half that of an incandescent bulb. But Choi says that far higher efficiencies should be possible by modifying their quantum dots. Samsung will continue to develop the technology, which it is trying to patent, before deciding whether to manufacture displays with this approach. “Transfer-printing can be scaled up to roll-to-roll systems for huge size printing onto flat or curved surfaces, such as rolled plastic sheets,” explains Choi.

John Rogers, a researcher at the University of Illinois, Urbana Champaign, is very impressed by the Korean effort: “It is, by far, the most complete demonstration of this technology.” However, Rogers also believes that the technology will face stiff opposition in the commercial market. “The entrenched technology – backlit liquid-crystal displays – continues to get better and better, and cheaper and cheaper.”

The Korean team reports its work in the latest edition of Nature Photonics.

Will the LHC find supersymmetry?

The first results on supersymmetry from the Large Hadron Collider (LHC) have been analysed by physicists and some are suggesting that the theory may be in trouble. Data from proton collisions in both the Compact Muon Solenoid (CMS) and ATLAS experiments have shown no evidence for supersymmetric particles – or sparticles – that are predicted by this extension to the Standard Model of particle physics.

Supersymmetry (or SUSY) is an attractive concept because it offers a solution to the “hierarchy problem” of particle physics, provides a way of unifying the strong and electroweak forces, and even contains a dark-matter particle. An important result of the theory is that every known particle has at least one superpartner particle – or “sparticle”. The familiar neutrino, for example, is partnered with the yet-to-be-discovered sneutrino. These sparticles are expected to have masses of about one teraelectronvolt (TeV), which means that they should be created at the LHC.

In January the CMS collaboration reported its search for the superpartners of quarks and gluons, called squarks and gluinos, in the detector. If these heavy sparticles are produced in the proton–proton collisions, they are expected to decay to quarks and gluons as well as a relatively light, stable neutralino.

SUSY’s answer to dark matter

The quarks and gluons carry off the energy that was bound up in the sparticle’s mass, creating a cascade of other particles that form jets in the detector. But neutralinos are supersymmetry’s answer to the universe’s invisible mass, called dark matter. They escape the detector unseen, their presence deduced only through the “missing energy” they leave behind.

CMS physicists went hunting for SUSY in their collision data by looking for two or more of these jets that coincide with missing energy. Unfortunately, the number of collisions that met these conditions was no greater than expected with Standard Model physics alone. As a result, the collaboration could only report new limits on a variation of SUSY called constrained minimal supersymmetric standard model (CMSSM) with minimal supergravity (mSUGRA).
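The “missing energy” in such a search can be illustrated with a toy calculation: it is inferred as the magnitude of the negative vector sum of all the visible transverse momenta. The jet values below are invented for illustration, not taken from CMS data:

```python
import math

# Toy event: (transverse momentum in GeV, azimuthal angle in rad) for each
# visible jet; the values are invented for illustration
jets = [(120.0, 0.3), (85.0, 2.8)]

# Sum the visible transverse momentum components
px = sum(pt * math.cos(phi) for pt, phi in jets)
py = sum(pt * math.sin(phi) for pt, phi in jets)

# Missing transverse energy: magnitude of the negative vector sum
met = math.hypot(px, py)
print(f"MET = {met:.1f} GeV")
```

If the visible jets balanced each other exactly, the missing energy would be zero; a large imbalance, as here, is what would hint at an invisible particle such as a neutralino escaping the detector.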

ATLAS collaborators chose a different possible decay for the hypothetical sparticles; they searched for an electron or its heavier cousin, the muon, appearing at the same time as a jet and missing energy. ATLAS researchers saw fewer events matching their search criteria and so could set more stringent limits, ruling out gluino masses below 700 GeV, assuming a CMSSM and mSUGRA model in which the squark and gluino masses are equal.

Good or bad omens?

Many believe that these limits are not bad omens for SUSY. The most general versions of the theory have more than a hundred variables, so these subtheories simplify the idea to a point where it can make predictions about particle interactions. “It’s just a way to compare with the previous experiments,” says CMS physicist Roberto Rossin of the University of California, Santa Barbara. “No-one really believes that this is the model that nature chose.”

ATLAS collaborator Amir Farbin, of the University of Texas, Arlington, calls these first results an “appetiser” for the SUSY searches to be discussed at the March Moriond conferences in La Thuile, Italy. “At this point, we’re not really ruling out any theories,” he says.


Still, CMS scientists Tommaso Dorigo of the National Institute of Nuclear Physics in Padova, Italy, and Alessandro Strumia of the National Institute of Chemical Physics and Biophysics in Tallinn, Estonia, say that there is some cause for concern. Supersymmetry must “break”, making the sparticles much heavier than their partners. It stands to reason that this should happen at the same energy as electroweak symmetry breaking – the point where the weak force carriers become massive while the photon stays massless.

This is thought to occur in the vicinity of 250 GeV. “But the LHC results now tell us that supersymmetric particles must be somehow above the weak scale,” says Strumia.

Dorigo notes that although SUSY can allow for high sparticle masses, its main benefit of solving the hierarchy problem is more “natural” for masses near the electroweak scale. The hierarchy problem involves virtual particles driving up the mass of the Higgs boson. While supersymmetric particles can cancel this effect, the models become very complex if the sparticles are too massive.

John Ellis of CERN and King’s College London disagrees that the LHC results cause any new problems for supersymmetry. Because the LHC collides strongly interacting quarks and gluons inside the protons, it can most easily produce their strongly interacting counterparts, the squarks and gluinos. However, in many models the supersymmetric partners of the electrons, muons and photons are lighter, and their masses could still be near the electroweak scale, he says.

Benchmark searches

CMS collaborator Konstantin Matchev of the University of Florida, Gainesville, explains that new physics was expected between 1 and 3 TeV – a range that the LHC experiments have hardly begun to explore. In particular, he notes that of the 14 “benchmark” searches for supersymmetry laid out by CMS collaborators, these early data have only tested the first two.

“In three years, if we have covered all these benchmark points, then we can say the prospect doesn’t look good anymore. For now it’s just the beginning,” says Matchev.

But not everyone is optimistic about discovering SUSY. “We will get in a crisis, I think, in a few years,” Dorigo predicts, sceptical of the theory because it introduces so many new particles of which data presently show “no hints”. However, even though he would lose a $1000 bet, he says that he would still be among the first celebrating if the LHC does turn up sparticles.

The CMS and ATLAS results are available on arXiv.

Dating the universe

Carl Sagan famously said that if you wanted to make an apple pie from scratch, you first had to create the universe. The deceptively simple title of David Weintraub’s latest book invokes a very similar philosophy: if you really want to know the age of the universe, then you too have to start from scratch. How Old is the Universe? places the question in its proper historical context and explains what has gone into answering it. Although other astronomy books have explained some of the methods here, Weintraub’s book brings everything together into one narrative. Such an approach is sorely needed, as the universe’s age lies at the heart of modern cosmology.

In the first chapter, some main sources of evidence – such as meteorite samples, globular clusters, Cepheid variables and white dwarfs – are introduced and briefly explained. This gives a nice overview, before each item is discussed in depth later.

Weintraub’s narrative begins in the 17th century with James Ussher, an Irish bishop who calculated the age of the Earth using biblical chronology (concluding that it began in 4004 BC). Today, his methods are often ridiculed, particularly with the resurgence of young-Earth creationism. However, Weintraub shows that he was neither the first nor the last person to use the Bible to date the Earth. Moreover, Ussher did the best that he could with the information available to him. Surprisingly, perhaps, the Copernican revolution a century before had contributed little to the 17th-century understanding of the universe’s age. But by attempting to calculate the Earth’s age, Ussher and his contemporaries were at least on the right lines: if you could know the age of the Earth with precision, it would serve as a vital “stepping stone”, a lower bound on the age of the universe.

The next figure to make an impact was Johannes Kepler, who, in his mathematically rigorous fashion, proposed a more astrophysical approach. But even he came up with a figure of 3993 BC. Surely Sir Isaac Newton could do better? No. Newton’s approach was unscientific, and the figure that he arrived at was close to those put forward by scientists, bishops, rabbis and other great thinkers of the time. In this entertaining way, Weintraub shows that scholarly consensus does not always equate to fact – an important lesson for all scientists. It was not until the discovery of radioactivity at the end of the 19th century that the Earth’s age could be estimated accurately.

The Sun provided the next stumbling-block. Weintraub shows how scientists in the 19th and early 20th century did their best to try to explain the Sun’s age, especially in terms of the Earth. Geological and evolutionary evidence suggested that the Earth had been around for at least a billion years, perhaps longer. But if the Earth was so old, countered the physicists, how could the Sun have remained luminous for so long without consuming all of its fuel? The big gap in their knowledge was nuclear fusion, and the “old Earth, young Sun” paradox could not be resolved until that particular breakthrough in physics had occurred.

The major headaches, however, were stars and star clusters. As Weintraub says, “Not all stars are the same.” He does a good job of conveying the interrelated problems of estimating a star’s apparent brightness, distance and luminosity. Some scientists found that their calculations were erroneous – again, because “not all stars are the same”. But some of these dissimilarities led to opportunities: the intrinsic brightness of Cepheid variables, for example, was found to relate to their pulsation period. Hence, by observing Cepheid variables in a galaxy and combining their inferred distances with redshifted spectra, astronomers could measure values for the Hubble constant, the universe’s expansion rate. The rate of white-dwarf formation was also calculated, so that observations of their total number gave another way of placing an age on the universe. The book describes all of this in detail but, curiously, the proof copy that I read never mentions the terms “standard candle” or “distance ladder” when discussing objects with well-established intrinsic brightnesses that are used to measure the universe’s size (and hence its age). These two terms crop up time and time again in astrophysics, but possibly their omission will be remedied in the final version.
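The Cepheid method can be sketched in a few lines: the period–luminosity relation gives the star’s absolute magnitude, the distance modulus converts that to a distance, and Hubble’s law relates distance to recession velocity. All the numbers below are illustrative, not taken from the book:

```python
# Hypothetical Cepheid in a distant galaxy (illustrative values)
apparent_mag = 25.0    # m, measured from the star's apparent brightness
absolute_mag = -5.0    # M, inferred from the period-luminosity relation

# Distance modulus: m - M = 5 log10(d / 10 pc), solved for d
d_pc = 10 ** ((apparent_mag - absolute_mag + 5) / 5)
d_mpc = d_pc / 1e6

# Recession velocity from the galaxy's redshifted spectrum (illustrative)
v = 700.0              # km/s

# Hubble's law: v = H0 * d, so H0 follows from one distance and one velocity
H0 = v / d_mpc         # km/s/Mpc
print(f"d = {d_mpc:.0f} Mpc, H0 = {H0:.0f} km/s/Mpc")
```

In practice astronomers average over many such galaxies, since the scatter in any single measurement is large; an accurate Hubble constant in turn constrains the universe’s expansion history and age.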

The next challenge came with galaxies. As late as the early 20th century, no-one knew what they were. It would take the efforts of many scientists to work out their dynamics, structure, composition and nature. This work paid off: investigation of galactic spectra would produce a paradigm shift in our view of the universe, leading scientists to conclude that it is expanding – and far bigger and older than previously thought.

As an astrophysics graduate and as someone who writes astronomy articles, I found How Old is the Universe? to be a satisfying, necessary and timely book. It should appeal to anyone wanting to learn about cosmology and astronomy in its broad context, but it would be especially good for astrophysics undergraduates because it assumes some physics knowledge, and has a good smattering of graphs, spectrograms, diagrams and images. University departments should ensure that they have some copies to hand.

In the book’s final section, Weintraub brings the journey right up to date by discussing supernovae, the cosmic microwave background, dark matter, dark energy, the Big Bang, inflation and quantum physics. He pulls together all of the relevant facets of scientific investigation from a variety of different fields, including geology, palaeontology, astronomy and physics, to ultimately arrive at the current best estimate for the universe’s age: 13.7 billion years (give or take 100 million years or so). We often take this figure for granted, along with the fact that it is known to within 1%. What this book shows is how deduction, dedication, care and persistence in many fields have led to the figure we have today. It is the story of a scientific triumph.

Uncertainty hits SESAME project

A major scientific project designed to foster collaboration between countries in the Middle East is undergoing a period of difficulty following growing unrest in the region. The Synchrotron-light for Experimental Science and Applications in the Middle East (SESAME) is currently under construction and due to start up in 2015, but the toppling of governments across the region is putting a strain on the ability to guarantee funding to complete the synchrotron.

SESAME is a project that aims to create the region’s first major international research centre by building a synchrotron light source in Jordan. The founding members of SESAME are Bahrain, Cyprus, Egypt, Iran, Israel, Jordan, Pakistan, the Palestinian Authority (PA) and Turkey. The facility would produce X-rays that can be used to study materials in a range of disciplines from biology to condensed-matter physics.

The revolution in Egypt, which led to its president, Hosni Mubarak, stepping down on 11 February, and growing anti-government protests in Iran and Bahrain have put the project on an uncertain footing. “In the short term it is very worrying,” Chris Llewellyn Smith, president of the SESAME council, told physicsworld.com.

‘Moment of great uncertainty’

Although Llewellyn Smith says the unrest has yet to have a direct impact on the project, the former director-general of CERN is working with the SESAME members to put together a financial package that would guarantee the roughly $35m that is needed to complete and open the facility by 2015. “[The package] is now in jeopardy as ministers of member countries are changing,” says Llewellyn Smith. “It is a moment of great uncertainty for the project.”

Llewellyn Smith says that he was discussing SESAME with the Egyptian science minister, Hany Helal, only on Saturday, but yesterday [Tuesday] Helal was removed from office by the military-led government. Egypt was expected to contribute about $5m of the $35m gap, but no-one can now be sure what a new government’s attitude to SESAME will be. Similarly, just last week the PA government resigned en masse, and the growing unrest in Bahrain is adding to the uncertainty among SESAME members.

However, Llewellyn Smith, who is currently in Washington, DC to discuss a possible US contribution to SESAME, says that he is “optimistic” that in the long term the facility will still be able to open by 2015. He also adds that the situation in the region could even turn out to be positive for the project and science in the region. “With more democratic governments, maybe we can get renewed and greater support for SESAME,” he says.

Think Canada




By Michael Banks in Washington, DC

The temperatures have been mild here in Washington, DC for the 2011 American Association for the Advancement of Science (AAAS) meeting. But according to the latest forecast, snow is on its way just as delegates are heading off home.

The AAAS meeting was jam-packed with interesting talks. We had sessions on the search for exoplanets, storing antimatter, first physics at the Large Hadron Collider, an outline of the MESSENGER mission to Mercury, detecting traces of nuclear materials, the effect a nuclear war could have on the climate, and adaptive optics. Even this breathless list represents only a tiny fraction of the complete programme of the 2011 AAAS conference.

The thing that caught my attention when I first entered the Washington Convention Center, which hosted the 2011 AAAS meeting, was the big red “Think Canada” badges some people were wearing. I was slightly confused at first, but it quickly became apparent that their purpose was to publicise the next AAAS conference.

That meeting will be in Vancouver, Canada, from 16 to 20 February 2012, so see you there (together with my pair of big red mittens).

Global challenges for science

Chris Llewellyn Smith speaking to delegates

By Michael Banks in Washington, DC

The 2011 American Association for the Advancement of Science meeting in Washington, DC had a slight winding-down feel to it today as the placards were being removed and the exhibitors packed their stalls.

But there was still a morning of talks to be had. So I headed to a session entitled “Can global science solve global challenges?” where Chris Llewellyn Smith spoke about past and future global science projects. He is an ideal speaker for the topic, given that he has been director-general of the CERN particle-physics lab and also served as chairman of the ITER council – the experimental fusion facility currently being constructed in Cadarache, France.

Llewellyn Smith went through some of the successes of global collaboration and consensus such as the eradication of smallpox in 1979 and the banning of CFCs in 1987, which successfully reduced the ozone hole.

The particle physicist also named a few examples of global collaborations that he felt had failed. These included the case of scientists who had warned that a tsunami could occur in the Indian Ocean: when one struck in 2004, killing some 230,000 people, lives could have been saved, Llewellyn Smith says, if the warnings from scientists around the world had been heeded. He also cites the communication of climate change as a challenging area that was damaged by scientists “not keeping objectivity and turning to advocacy”.

Llewellyn Smith is now calling for a global endeavour to apply carbon capture and storage (CCS) to coal-fired power stations, which would include working out whether the technique is feasible at all and, if so, the best way to store carbon dioxide underground. “CCS is going to be crucial if we don’t stop burning coal,” he says.

Indeed, Llewellyn Smith is involved with a Royal Society report into global science, which will be released on 29 March. He didn’t want to give the report’s conclusions away but says the report will concern “where science is happening and who is working with who”. There will be no specific recommendations made in the report, but “we hope that it will start a debate”, he says.

To solidify, just add water

Scientists in Germany have shown that a suspension of particles can be transformed from a viscous fluid to an elastic gel by adding a small quantity of a second liquid – as long as the second liquid does not mix with the bulk fluid. They say that the second liquid binds the particles more tightly together, and found that this enhanced binding takes place even when the liquid itself adheres poorly to the particles. Applications of this work, say the researchers, include lighter and cheaper foams as well as improved manufacturing of paints and other suspensions.

Being able to control the flow of suspensions – small, solid particles dispersed in a fluid – is important in the manufacture of many commercial products, such as coatings and foodstuffs. For example, it is better if paint is less viscous when it is being mixed during production, but more viscous when in its finished state so that it sticks to walls and does not drip.

In the latest research Erin Koos and Norbert Willenbacher of the Karlsruhe Institute of Technology have demonstrated a new and practical method for adjusting the viscosity of a suspension. In their experiment, they first dispersed hydrophilic (or water-attracting) glass beads, each about 25 µm in diameter, into an organic solvent. Then they added water to this suspension so that it made up just 1% of the suspension by weight. When they stirred, the initially viscous fluid transformed into a gel-like material.

Water builds bridges

This transformation has been known about for many years, and occurs because the water wets the particles – in other words, it tends to adhere to the surface of the hydrophilic particles more readily than does the organic liquid and so forms a thin film around them. When two particles get close enough, the water’s surface tension then dictates that it becomes energetically favourable for the coatings of water to join up and form a bridge, so binding the particles together and creating a network that makes the suspension more rigid.
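An order-of-magnitude sketch shows why such bridges are so effective. The standard small-bridge estimate F ≈ 2πγR cos θ is an assumption of this illustration, not the authors’ own analysis:

```python
import math

# Capillary bridge force between two equal spheres, small-bridge limit:
# F ≈ 2 * pi * gamma * R * cos(theta)
gamma = 0.072          # N/m, surface tension of water
R = 12.5e-6            # m, bead radius (25 µm diameter beads)
theta = 0.0            # rad, perfect wetting assumed for simplicity

F_cap = 2 * math.pi * gamma * R * math.cos(theta)

# Compare with a single bead's weight (glass density ~2500 kg/m^3)
rho, g = 2500, 9.81
W = rho * (4 / 3) * math.pi * R**3 * g

print(f"capillary force ≈ {F_cap:.1e} N, weight ≈ {W:.1e} N")
print(f"ratio ≈ {F_cap / W:.0f}")
```

Under these assumptions the bridge force exceeds the bead’s weight by roughly four orders of magnitude, which is why even a 1% admixture of bridging liquid can turn a free-flowing suspension into a rigid network.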

What the researchers also found, however, was that the reverse can happen. When they dispersed hydrophobic (water-repelling) glass beads into the solvent and then added 1% water, they discovered that the initially viscous suspension again became gel-like upon stirring. In other words, adding a small quantity of a substance that wets the particles less well than the solvent does, rather than better, also binds the beads together to form a more-or-less rigid network.

This process has also been observed before. For example, adding water to melted chocolate causes the latter to solidify even though the solvent in the chocolate – cocoa butter – wets the cocoa particles more readily. What Koos and Willenbacher have done is to demonstrate that this is a general phenomenon, having subsequently found that the process occurs in a wide variety of fluid and particle combinations. They were also able to pin down the mechanism responsible, and show that it is essentially the same process as occurs when introducing a superior wetter, but in reverse.

Rotating plates

In their experiment, the researchers measured the amount of force needed to set two parallel metal plates rotating relative to one another when a suspension is placed between them. They found that the force required increased markedly as the second, non-mixable fluid was added to the suspension – in other words the “yield point” of the suspension increased – and they found that this increase is just what would be expected were the particle binding determined by surface-tension, or “capillary”, effects. In this case, however, the water does not coat the particles but instead seeks to minimize its total area of contact with the particles and the bulk fluid, which leaves the water enclosed by particles (see figure).

Koos says that this effect can also be seen when building sandcastles. Adding a little water makes sand firmer because it creates bridges between the sand particles. The inter-particle binding increases as more water is added but when the sand finally becomes saturated with water, and there is no longer any air present, the sand particles simply slosh around in the water. But adding back in a small amount of air – the inferior wetting fluid – provides nuclei around which the sand particles can agglomerate.

According to Koos, this ability to enhance rigidity by adding small quantities of either a superior or inferior wetter could improve the industrial manufacture of suspensions. For example, she says, they found that foam can be made by dispersing PVC particles in water and then adding a tiny amount of oil. This requires less PVC than the traditional approach of dispersing PVC directly into oil, meaning that the process is cheaper and the resulting foam is lighter, which could make it attractive in the manufacture of lightweight building materials and insulating foams.

Wilson Poon at the University of Edinburgh in the UK points out that while other groups are studying the effects of adding a small amount of a second, unmixable liquid to suspensions, this latest work provides a striking demonstration of the potential of such an approach. And he agrees that the results obtained by Koos and Willenbacher indicate that capillary forces play a dominant role. “Since it is very difficult to eliminate moisture entirely from oil-based systems, or oily impurities from aqueous systems, it is possible that the effects presented here are widespread, but not widely recognized as such,” he says. “It is always tempting to think that the odd 0.3% impurity doesn’t really matter. In fact, it may matter, and matter hugely, either for good or for ill.”

The work is described in Science 331 897.

Freebies galore

Conference collectables

By Michael Banks in Washington, DC

No conference trip is complete without hoarding freebies from exhibitor stands.

So above is the result of my 30-minute sweep through the exhibition hall at the 2011 American Association for the Advancement of Science (AAAS) meeting here in Washington, DC.

Kudos to the Nanyang Technological University in Singapore, which was providing USB hubs to conference-goers (bottom-left item in the image above). No expense spared there.

The AAAS yo-yo was a particular hit with delegates, with many people walking through the exhibition doing yo-yo tricks. The strangest item has to be the EurekAlert! sticky brain – not sure what I am going to do with that.

My favourite freebie has to be the big red “Canada” mittens. Next year’s AAAS conference is in Vancouver, Canada, so they just might come in handy then.

Gateway to conference freebies

Carbon concerns

How much carbon is coming out?

By Michael Banks in Washington, DC

“Carbon is the most important element, but we are deeply ignorant of its effect on the Earth,” says Robert Hazen from the Carnegie Institution of Washington.

Hazen is the principal investigator of the Deep Carbon Observatory – a 10-year programme funded by the Alfred P. Sloan Foundation to better understand the Earth’s carbon cycle.

It’s a wide-ranging study and speaking at the 2011 American Association for the Advancement of Science meeting in Washington, DC, Hazen spelled out the many questions that remain unanswered about carbon. These include how much of the element is stored in the Earth, especially in the core, and how much of the material is released when a volcano erupts.

In the case of a volcanic eruption, Hazen says some scientists conclude that carbon makes up around 2% of the ejected material, while others say it is more like 75% – a big discrepancy that the programme hopes to narrow.

The programme only started in 2009 so Hazen is issuing a call to arms for scientists of different backgrounds to come together and join the project.

You will have to be quick as proposals for research activities must be submitted by 11 March.

Read more about the programme here.

Eye-catching exhibits

Science on a sphere

By Margaret Harris in Washington, DC

No trip to the AAAS meeting would be complete without a tour of the exhibit hall, which for the past two days has been buzzing with visitors to “Family Science Days”, a public outreach-oriented event running in parallel with the more technical seminars.

One of the most eye-catching exhibits was the National Oceanic and Atmospheric Administration’s Science on a Sphere, which pretty much does what it says on the tin. The Sphere is the brainchild of Alexander MacDonald, director of NOAA’s Earth Systems Research Laboratory, and there are now over 250 datasets that can be displayed on it. In this photo, it’s illustrating the shock waves that spread around the globe after the Boxing Day tsunami of 2004, but I also saw depictions of ocean currents, aeroplane flight paths, global temperatures and the past week’s weather. According to exhibitor Jana Goldman, there’s even one in a science fiction museum in Seattle, Washington that displays the (hypothetical) features of a (fictional) alien planet – so it’s definitely a versatile beast!

Another exhibit that got a lot of traffic was the US Department of Energy’s set of bicycle-powered light bulbs, which is designed to teach kids (and maybe some adults) about the differences between voltage and current, and to demonstrate in a very physical way how much power it takes to light up an incandescent 50 W bulb compared with fluorescent and LED bulbs. The young gentleman in this photo, for example, was having real trouble getting the incandescent bulb to give off any light, but despite being a little too short for the pedals, he managed the LED bulb just fine.

Bicycle-powered light bulbs

For the bigger kids, exhibitor Steve Eckstrand keeps a 12 V, 300 W hairdryer on hand. “They can usually get the 50 W bulb working just fine, and one girl did manage to pedal hard enough to get a faint glow out of the 100 W bulb,” he says. “But nobody can do more than get the hairdryer sort of gently warm.”
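The physics behind the exhibit can be sketched with typical textbook luminous efficacies (the values below are assumptions for illustration, not figures from the demonstration):

```python
# Pedalling power needed to match the light output of a 50 W incandescent
# bulb, for different bulb technologies
target_lumens = 700            # roughly what a 50 W incandescent emits

efficacy = {                   # lumens per watt, typical assumed values
    "incandescent": 14,
    "fluorescent": 60,
    "LED": 90,
}

for bulb, lm_per_w in efficacy.items():
    watts = target_lumens / lm_per_w
    print(f"{bulb:>12}: {watts:.0f} W at the pedals")
```

Under these assumptions the LED needs less than a fifth of the incandescent’s power for the same light, which matches what the young cyclist found; and since a fit adult sustains only around 100–200 W, the 300 W hairdryer is out of reach no matter how hard anyone pedals.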

Copyright © 2025 by IOP Publishing Ltd and individual contributors