
Collaborations drive innovation at JEOL UK

When you sell a microscope that focuses electrons into a 0.1 nm diameter beam — which then must be held steady on a single column of atoms — you don’t just drop the instrument at your customer’s door and let them get on with it. Indeed, long before the system is packed for shipping, customer and supplier will have already collaborated on preparing an appropriate location where the instrument will perform at its best. Some of the customer’s personnel will have even been trained in how to use the instrument.

But according to Mike Hepburn, who is managing director of the UK subsidiary of the Japanese microscope maker JEOL, some of the firm’s customers actually help the company to develop microscopes at its factory in Japan.

Take Angus Kirkland, a materials scientist at the University of Oxford in the UK. JEOL is working with Kirkland’s research group to develop ultrahigh resolution transmission electron microscopes (TEMs). This has involved development staff from the firm’s manufacturing facility in Japan visiting Kirkland’s lab for several months at a time.

Hepburn told physicsworld.com that both parties gain from this type of interaction, with the Oxford group even publishing a number of research papers on the development of “super resolution” techniques for TEM. For its part, JEOL can now offer technology developed in collaboration with Oxford on the TEMs that it sells to other customers, says Hepburn.

Making new science possible

According to Hepburn, a key achievement of the collaboration is that the TEM — which was originally specified to have a 0.12 nm resolution — is now operating at 0.07 nm. “That doesn’t seem like a huge difference numerically”, he says, “but in terms of the new science it makes possible, it really is a big difference”.

Other collaborations between the firm and UK universities focus more on developing new applications for JEOL instruments. This includes the company’s current collaboration with Pratibha Gai, who is JEOL Professor of Electron Microscopy at the University of York in the UK. Gai is one of the leading authorities on environmental TEM and co-director of the York JEOL Nanocentre, a £5.5m facility that opened in April 2007 and was partially funded by JEOL.

According to Hepburn, the centre is home to one of the world’s most powerful electron microscopes. This is a “double aberration-corrected microscope”, which has both imaging (TEM) and probe (scanning TEM) aberration-correctors fitted. It can obtain images at 0.1 nm resolution and resolve columns of atoms at a resolution better than 0.2 nm.

The York researchers are using the instrument to develop new techniques for environmental TEM, in which a sample is placed in a small cell that does not need to be held under a high vacuum as with an ordinary electron microscope but can operate at ambient pressures. This technique allows researchers to see, for example, chemical reactions occurring in real time. “We help our customers through advanced training to understand the instrument so they can develop new applications,” explains Hepburn.

Elsewhere in the UK, JEOL maintains an ongoing relationship with Peter Goodhew, a materials engineer at the University of Liverpool and director of the SuperSTEM project at the nearby Daresbury Laboratory. SuperSTEM is developing two aberration-corrected TEMs — one an existing TEM that has been retro-fitted with an aberration-correction system, the other delivered earlier this year by the US-based firm NION.

Corrected columns

While JEOL is not involved directly with the SuperSTEM project, Hepburn says that it is important for the company to be associated with one of the key groups working in this area. “[Liverpool] is one of the leading groups for using corrected columns”, he explains. “We are very excited to be working with them”.

JEOL UK also collaborates with Tony Cullis, a semiconductor physicist at the University of Sheffield. The Sheffield group has recently ordered an ultra-high resolution TEM from JEOL, which is currently developing the instrument and expects to deliver it by mid-2009. To ensure that Cullis’s team will be able to use the new instrument when it arrives, JEOL is currently installing an interim instrument that will be used to train the staff.

“Our collaborations often begin with the development of leading-edge instrumentation and establish a close working relationship between the user and our development team”, says Hepburn. He believes that this delivers advantages to both parties. “We get feedback from the researchers back into the factory for future design and development and they benefit from getting advanced information about new developments,” he says. “In some cases we arrange for researchers to visit our factory in Japan to have discussions about design. It is a genuine win-win situation for both sides.”

Selling instruments

While collaborations with high-profile researchers help the company to develop new technologies, Hepburn emphasizes that JEOL is in the business of selling instruments to a wide range of users. He believes that there is much more to a successful sale than simply delivering the instrument. “It is one thing to encourage an institute to buy your instrument, but if it is not sited correctly, or their staff are not trained, then they are going to be disappointed”.

As a result, the company works very closely with the customer to ensure that its microscope is installed in the best possible location. “We get involved in the very beginning by having our engineers attend the customer’s site meetings”, explains Hepburn. JEOL engineers will even talk to architects and builders regarding alterations to existing buildings and the design of new facilities. This is important because environmental factors such as vibrations can have a detrimental effect on the operation of an electron microscope.

“We have been lucky enough at Oxford to have a new purpose-built building, but that is the exception rather than the rule”, he explains. “Normally we have to work within the confines of an existing building, which means that sometimes a hole has to be dug, then insulated from its surrounding area and filled with a solid concrete block” in order to minimize vibrations.

“Within JEOL UK we have several people who project manage the installation process from a technical point of view, and our engineers have many years’ experience of creating the best location for an instrument,” says Hepburn. Although he admits that knowing what the correct environment should be is “a little bit of a black art”, Hepburn insists that the company will always get the right answer.

40th anniversary

JEOL UK has about 40 employees, while its parent company, JEOL Ltd of Japan, employs some 3250 people worldwide. JEOL Ltd was established in 1949, and JEOL UK was launched 40 years ago this year. According to Hepburn, the firm has installed more than 1000 electron microscopes in the UK — along with several hundred of the NMR instruments and mass spectrometers that the company also makes.

The vast majority of the company’s sales and service staff have scientific or engineering backgrounds, and many have been students or technicians in university labs that use JEOL instruments. This includes Hepburn, who worked in the Materials Department at the University of Surrey where he used a JEOL TEM.

Hepburn says that the firm is seeking to boost its ranks by hiring people at the postdoc level, who will help the firm expand its user-training and applications-development programmes. The company is also looking to sponsor the research of up-and-coming scientists who use its equipment. In doing so, the firm hopes to encourage the next generation of microscope users, as well as benefit from the drive and enthusiasm of less established researchers.

DAFNE enjoys a particle boost

Scientists at the Italian particle accelerator DAFNE are looking forward to an era of new physics, having implemented an upgrade that is boosting the rate at which the machine can collide electrons and positrons. Their success paves the way for a new accelerator that is more than 10 times as big and has 10 times the energy.

The upgrade, which was completed five months ago, has already doubled the luminosity of DAFNE’s electron–positron collisions. Over the coming months the scientists think that it should be possible to make the luminosity three to six times higher.

“It’s increasing as we speak,” says Pantaleo Raimondi, project leader and researcher at the National Institute of Nuclear Physics, during a phone interview. “More luminosity means a higher collision rate, more particles and therefore more physics.”
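For context, the luminosity quoted here is the standard accelerator-physics quantity that converts a process’s cross-section into an event rate; the relation below is general, not specific to DAFNE:

R = \sigma L

where R is the number of events per second and \sigma is the cross-section for the process in question, so doubling the luminosity doubles the number of phi-mesons produced per second.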


DAFNE is a compact, circular accelerator based at the Frascati National Laboratories, near Rome. It has two storage rings, each 100 m long, that accelerate electron and positron beams to energies of 0.5 GeV. At one point in the rings the electrons and positrons are extracted and then smashed into each other, before what remains of the beams is returned to the rings.

The collisions produce phi-mesons, short-lived particles that contain a strange quark bound to a strange anti-quark. By studying the decays of these mesons — particularly into lighter kaons — scientists at DAFNE have been able to spend the best part of a decade studying aspects of particle physics such as quantum chromodynamics and CP violation.

‘As we hoped’

When it was built in 1997, DAFNE was deemed the world’s first particle “factory” because of its copious output of phi-mesons. The aim of the upgrade, which was implemented over the latter half of last year, was to see whether this factory could be made far more productive for minimal cost. If the upgrade was a success, it would prove the concept for a “super factory” called SuperB — a larger circular accelerator that will investigate the decays of heavier B-mesons. “Everything is performing exactly as we hoped, as we expected,” says Raimondi.

The upgrade has altered the geometry of the electron and positron beams so that they collide at more of an angle. Although intuition suggests that head-on collisions would give the highest luminosity, this arrangement causes so much disruption that only a small fraction of the uncollided beams can be salvaged and returned to the rings.

An oblique collision reduces the disruption, but it comes with its own side-effect: it deters the beams from colliding at the point where the accelerator’s magnets force them to be thinnest. Raimondi and his team came up with the idea of giving the beams a “crabbed waist” — essentially tilting and rotating them — to make this thin point accessible again.

The upgrade has required around 60% of DAFNE’s rings to be modified at a cost of just €1.5m, which is modest considering the €150m value of DAFNE itself. The engineers are currently improving the accelerator’s detectors to exploit the large volumes of phi-mesons so that their decay processes can be measured more accurately.

‘Stepping stone’

With the upgrade already providing significant increases to DAFNE’s luminosity, the INFN and other institutions can concentrate on the design of SuperB, which will be constructed nearby. “It’s fantastic news,” says Tim Gershon at the University of Warwick, UK. “It’s a very important development in accelerator physics. It has immediate applications at DAFNE that could revitalize the programme. And it’s an important stepping stone for the SuperB factory.”


The measurements of B-meson decay processes will be important in testing particle physics beyond the current Standard Model. The behaviour of B-mesons is influenced by virtual particles that pop in and out of existence by borrowing energy from the vacuum, giving researchers a handle on particles that are too heavy to be produced directly (see this month’s Physics World feature: “A taste of LHC physics”).

There are already experiments looking into B-meson decay processes, including the Belle experiment at the KEK laboratory in Japan and the BaBar experiment — which was recently shut down — at the Stanford Linear Accelerator Center in the US. There is also the LHCb experiment, which will start when the European laboratory CERN brings the Large Hadron Collider (LHC) online later this year.

SuperB, however, will be able to look for physics that would otherwise be out of reach. For example, it will be able to measure the decay of a B+-meson into a tau-lepton and a tau-neutrino — a process that the LHCb is not sensitive enough to measure.

In general, the B-meson factories take a different approach from direct particle searches such as those at the LHC’s ATLAS and CMS detectors. “By putting all of these various things together we can get sets of measurements that complement those at the LHC,” says Gershon.

Force-microscopy firm finds success in 3D

When physicists Roger Proksch and Jason Cleveland founded Asylum Research in 1999, little did they know that nine years later one of their atomic force microscopes (AFMs) would be featured in one of the most successful US television programmes of all time — Crime Scene Investigation (CSI).

The company’s Molecular Force Probe-3D was used to produce a high-resolution image of a drug capsule found during a murder investigation in the episode “Rock and a hard place” of CSI: Miami. While the real-life applications of the firm’s systems are perhaps not as sexy as an appearance on CSI, Proksch told physicsworld.com that he continues to be surprised by the growing number of ways that researchers are using Asylum’s equipment.

Before founding the company, Proksch and Cleveland worked at US-based Veeco. However, the firm was becoming increasingly focused on making metrology equipment (including AFMs) for the semiconductor industry — something that didn’t appeal to the two physicists.

Dabbled in biophysics

“We were interested in making instruments for biophysics at the time,” recalls Proksch. Although both did their PhDs on the use of AFMs to study magnetic materials (Cleveland at the University of California at Santa Barbara and Proksch at the University of Minnesota), the pair had a growing interest in biophysics, having both “dabbled” in the field as postdocs. So they joined forces with lawyer Dick Clark to found Asylum Research, which has its headquarters in Santa Barbara, California. The firm now has about 60 employees, including many with PhDs in physics.

The firm’s first instrument was not an AFM, but rather a system that allowed biophysicists to measure the forces that hold large molecules together. This is done by attaching one end of the molecule to a substrate and the other to the tip of a tiny cantilever. The cantilever is then used to carefully pull the molecule apart, while the forces required to do so are determined by monitoring minute deflections of the cantilever.
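As a rough guide to the physics involved (this is the textbook spring relation, not an Asylum specification), the cantilever behaves as a Hooke’s-law spring, so the force follows directly from the measured deflection:

F = k \, \Delta x

For a soft cantilever with a spring constant of order 10 pN/nm, a deflection of 1 nm therefore corresponds to a force of about 10 pN, in the range relevant to single-molecule experiments.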

This technique is called single-molecule force spectroscopy and can be used to study the structural properties of a wide range of large molecules such as DNA or polymers. Smaller molecules can also sometimes be studied by engineering them into larger molecules — using the larger molecules as “handles” to pull on the smaller molecules.

In 1999 the technique was only a few years old, recalls Proksch: “For us it was a nice niche, it allowed us to do something that was related to our experience but new enough that there was no competition”. The company ended up selling about 70 force spectrometers, the vast majority going to university biophysics research labs. Asylum had hoped that its instrument would become a diagnostic tool in the pharmaceutical industry. “But that just hasn’t happened and I’m not sure it will ever happen for this technology”, says Proksch.

Customer requests

By about 2002, sales to universities were beginning to slow down and Asylum saw its next opportunity in a request that it was getting from many of its customers — the ability to precisely scan the tip in the x–y direction, as well as up and down. Although the height of the cantilever above the substrate could be positioned with nanometre accuracy, the instrument had relatively low resolution in the x–y plane. This meant that the user couldn’t know for certain exactly where on the molecule they were pushing or pulling.

“Our customers were asking us to make a system that could create an image of the molecule before you started poking at it — and that is basically an AFM”, says Proksch.

Fortunately, Asylum could use some of the same technology it had developed for the accurate positioning of the height of the cantilever to scan it in the x–y direction across the sample. This led to the development of the firm’s current “flagship” instrument, the MFP-3D. “One of the strengths of this instrument is that you can take these wonderful images of the sample and then you can go back and examine the other properties at specific locations”, says Proksch.

Proksch puts much of the firm’s success down to the linear variable differential transformer (LVDT) position sensors used in its instruments. An LVDT comprises a primary coil with a current flowing through it to create a magnetic field. Some of the magnetic flux passes through two secondary coils, inducing currents in these coils. One contact of the position sensor is connected to the primary coil and the other contact to the two secondary coils. If the distance between the contacts changes, so do the relative positions of the coils, and this change can be determined by measuring and comparing the induced currents in the secondary coils.
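Schematically (this is the textbook LVDT read-out, not a description of Asylum’s proprietary electronics), the two secondary coils are wired so that their induced voltages subtract, and near the centre of travel the difference is proportional to the displacement x between the contacts:

V_{\text{out}} = V_1 - V_2 \propto x

with the sign of V_out indicating the direction of the movement.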

This approach sets Asylum apart from its competitors, which use sensors based on capacitive plates to determine the position of the cantilever, if they use a sensor at all. The main difference between Asylum’s LVDTs — which use mutual inductance between magnetic coils — and low-noise capacitive sensors is that the latter require plates that are very large and held in a perfectly parallel planar configuration, which is very difficult to achieve, according to Proksch. “Magnetic coils are much more forgiving in terms of geometry”, he explains.

Low-noise amplifiers

Most LVDTs use a primary coil with a ferromagnetic core, which increases its magnetic field and thereby boosts the sensitivity of the sensor. However, the sensitivity of this structure is limited by “Barkhausen noise” in the core, which results in sudden random changes in the magnetic field. To get around this problem, Asylum does not use a core in its patented LVDTs, but instead boosts the sensitivity of the secondary coils by using extremely low-noise amplifier technology.

As a result, the sensors have a noise level of about 0.1 nm in a 1 kHz bandwidth, which means that the instrument can make one position measurement per millisecond with an uncertainty of 0.1 nm.

While capacitive sensors claim to offer similar noise levels, Proksch says that they are much more difficult to integrate with an AFM. LVDTs, he says, can be used in a number of different geometries, giving Asylum considerable flexibility in designing its instruments — something that can be used to improve their overall performance. “We are the only AFM maker that I know of that takes this magnetic approach”, says Proksch, adding: “I believe we are the only company that makes its own position sensors and this gives us the expertise to put our sensors just about anywhere”.

Growth in materials science

By creating instruments that scan in 3D, the firm has expanded its customer base beyond biophysics — which today is about 40% of its business — and has enjoyed significant growth in materials science.

The study of the piezoelectric properties of materials is one area where Asylum has seen significant growth. Piezoelectric materials generate an electric potential when squeezed or stretched — and such materials squeeze or stretch themselves when an electric potential is applied.

Although physicists have long known that many common materials are piezoelectric — including bone, wood, and many minerals — it had been very difficult to make detailed studies of the relationships between the microscopic structure of a material and its piezoelectric properties. This is particularly important to researchers trying to develop ferroelectric memories, in which data are stored in bits defined by the electric polarization of tiny domains in a ferroelectric material.

Today’s ferroelectric memories have a relatively low density and to improve this, researchers must be able to reduce the size of ferroelectric domains in the material.

An AFM can be used to study the piezoelectric properties of a material by bringing the tip in contact with the surface, applying an electric potential between the tip and the material, and then measuring any movement in the material by watching for a deflection in the position of the tip. This technique is called piezo force microscopy or PFM.
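The scale of the motion being detected can be estimated from the converse piezoelectric effect (the coefficient below is a typical order of magnitude for ferroelectric ceramics, not a figure quoted by Asylum):

\Delta z \approx d_{33} V

With d_33 of order 100 pm/V and a few volts applied between tip and sample, the surface moves by only a few hundred picometres, which is why the sub-nanometre deflection sensitivity of the AFM is essential.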

According to Proksch, the MFP-3D can be used in piezoelectric scanning mode to image ferroelectric domains at resolutions down to several nanometres and garner additional structural information about ferroelectrics that could lead to denser ferroelectric memories.

Understanding ‘bioelectricity’

Beyond the development of ferroelectric memories, Proksch believes that the piezoelectric response of biological materials is a new and exciting area of growth for Asylum’s AFM. The piezoelectric properties of bone, for example, could be related to healing processes, whereby structural changes in damaged bone send out electrical signals telling surrounding cells to increase bone mass. As a result, such studies could lead to a better understanding of the role of “bioelectricity” in living organisms.

Proksch told physicsworld.com that the company is also currently working on reducing the time that it takes to obtain an image, which is typically about five minutes. This can be done by increasing the oscillation frequency of the tip, which in practical terms means making the cantilever smaller. While this is straightforward in principle, it means that the rest of the hardware and software on the instrument must also operate much faster — a significant challenge that the firm hopes to meet.

Microscopical advances redefine science

Mark Rainforth is professor of engineering materials at the University of Sheffield in the UK and has been involved with the UK-based Royal Microscopical Society (RMS) since he was a PhD student. His research involves using electron microscopy to study the interfaces and surfaces in metals, ceramics and coatings.

As president of the RMS since 2006, Rainforth oversees an organization that promotes every aspect of microscopy and allows all microscopists — users and instrument developers alike — to interact. The society has over 1000 members including not just physicists, but also materials scientists, life scientists, dentists and even archaeologists.

What have been the most exciting recent breakthroughs in electron microscopy?

There are so many that it is difficult to include them all. I remember going to a lecture a few years ago at which the speaker predicted that there was nothing exciting left to happen in microscopy and that all the changes would be incremental. In reality, over the past five years, the microscopy community has gone through a renaissance.

Perhaps the biggest advance for me has been the ability to correct for the spherical aberrations that result from having round lenses. The resolution of a microscope has always been limited by such aberrations. People have known for a long time that the problem could be corrected by measuring the spherical aberrations and using special multipole lenses to put in equal and opposite aberrations. However, this requires many lenses and lots of image processing, so it has only really become possible with advances in computing power.

Before this it was only possible to produce an image that was a “projection” of atoms, rather than the actual individual columns of atoms in a material. Such a projection is made by recombining a number of diffracted electron beams to produce an interference image, which only under very carefully defined conditions gives direct information about the atomic structure. Now, with aberration-corrected transmission electron microscopes, we can obtain true atomic images and get an idea of the physics of individual bonds between atoms.

In 1959 Richard Feynman famously said that, “It would be very easy to make an analysis of any complicated substance; all one would have to do would be to look at it and see where the atoms are…I put this out as a challenge: Is there no way to make the electron microscope more powerful?” Nearly 50 years later that challenge has been answered.

What other developments have been interesting?

The structures of many materials depend on how their atoms bond, something that must be studied at the angstrom (0.1 nm) length scale. But to understand how bonding affects what we can see with the naked eye, we must be able to make such measurements across much larger regions of a sample.

Electron back-scatter diffraction (EBSD) addresses this challenge by using a scanning electron microscope (SEM) to determine the crystal structure and crystal orientation of the atoms in a sample at the nanometre scale. The really exciting thing is that you can scan a wide area with SEM so you can cover a square centimetre of a material’s surface at nanometre-scale resolution.

This process is very computer-intensive and has only recently become possible with advances in computer power and digital cameras. Some of these latest techniques will be presented later this month in London at the Microscience 2008 conference.

Are these developments driving new science and technology?

Yes, but this is difficult to summarize because there are so many interesting cases. Indeed, it is almost like this renaissance has allowed us to revisit all known science.

Nanoscience and nanotechnology are widely recognized as having huge potential benefits. However, the technology cannot develop in a logical manner without the ability to probe atomic structure directly. This does not just mean the physical location of the atom, but also its atomic number and charge as well as the type of bonding with, and proximity to, surrounding atoms. Developments in microscopy are beginning to make this possible.

Microscopy also lets us manipulate objects on the nanoscale. For example, microscopes can be used to look at how a carbon nanotube can be moved into a specific position on a sample. In electronic components, we can now look at the interfaces between thin films and their substrates and understand the chemical distribution within the film. This understanding helps to further miniaturize the components leading to smaller mobile phones, for example.

Breakthroughs in microscopy have also boosted our ability to understand the effects of intentional or unavoidable trace elements in a material, which can dramatically alter its properties. Until the advent of the latest transmission electron microscopy (TEM) techniques, determining the locations of trace elements within the structure of a material was extremely difficult. Now, such microscopy techniques will, for example, bring new understanding of the degradation mechanisms in solid oxide fuel cells. Similarly, the development of new high-strength, lightweight steels will also rely on the latest high-resolution techniques.

How is this renaissance in microscopy affecting your field of research?

My research focuses on the understanding of surfaces and interfaces. Microscopy underpins and explains all our work in materials science, whether the materials are metals, ceramics, coatings or biomaterials.

I use a range of techniques and sometimes need to combine more than one technique. For example, I use focused ion-beam microscopy to section surfaces for subsequent high-resolution TEM investigation. One situation where this is used is in examining the dynamic changes that occur in hip joint prosthetics while in the body.

What are today’s challenges in microscopy and how are they being addressed?

The current challenges cross many length scales, from achieving true atomic resolution, through to being able to find key features at the microscopic level, embedded randomly in a large object.

For atomic resolution, the first generation of aberration correctors has provided wonderful advances in resolution and therefore understanding. The next generation of (fifth-order) aberration correctors and chromatic aberration correctors, along with new electron-gun designs, will in a few years’ time take spatial resolution to a new level. Single-atom electron spectroscopy should become almost routine. Such technology will also be built into SEMs, bringing major advances in the understanding of surfaces.

At the other end of the length scale, automatic stages coupled with advanced image analysis and related techniques, such as high-throughput electron backscatter diffraction, will bring major advances in the ability to resolve fine-scale structure in a statistically meaningful way. Such techniques will benefit engineering firms that make large items with properties that are determined by fine-scale structure.

The rate of change is quite remarkable in every aspect. I am amazed by the number of new techniques coming out and I fully expect to continue to be amazed in the future. What better time to be a microscopist?

Scanning electron microscopy rises to the challenges of the 21st century

Since the first commercial instrument debuted in 1964, the scanning electron microscope (SEM) has become an established tool for characterizing materials in the physical and life sciences. SEMs are commonplace in the semiconductor industry, where they are used to create and characterize extremely small features, and the instruments are a key driver in the emerging business of nanotechnology.

The SEM is often seen as being less exciting than its counterpart, the transmission electron microscope (TEM), which can resolve individual atoms. Instead, SEM has a reputation as a workhorse instrument that is reliable, easy to use and ideal for characterizing bulk materials.

However, there has been a quiet revolution in the world of the SEM, and slowly but surely its capabilities are expanding. The instrument can now be used to study the surface of just about any bulk material at nanometre resolution, regardless of whether it is clean or dirty, wet or dry, hot or cold, conducting or insulating. Under the most favourable conditions, sub-nanometre resolution has been achieved — especially for thin specimens imaged in transmission mode.

An SEM consists of a source of electrons that are focused by lenses into a tight beam that strikes the surface of a sample. The resultant signals from the sample — including backscattered electrons, secondary electrons and X-rays — are picked up by detectors.

Specimen as active component

Radical improvements in SEM capabilities have been brought about by new technologies such as field emission electron sources, magnetic immersion lenses, more efficient and sensitive detectors and specially adapted specimen chambers and stages. In addition, there is a growing awareness in the SEM community that performance can be boosted by treating the specimen as an integral, active component of the system.

So where does all this extra power get us? Well, it opens up a world of possibilities for tuning the experiment to match the specimen; to tease out the required information or to reveal the unexpected; and to use an SEM to fabricate structures at increasingly small scales.

One strategy to improve the performance of an SEM is to apply a negative bias voltage to the specimen, which slows down the incident (primary-beam) electrons as they arrive at the specimen surface. These lower-energy electrons do not penetrate as deeply into the sample as a higher-energy primary beam would, and are therefore a more sensitive probe of the surface. The advantage of keeping the primary energy high is that the beam can be focused much more tightly, so the combination delivers good spatial resolution as well as surface sensitivity.

Figure 1 shows this concept schematically, along with an image where the landing energy is a mere 50 eV, compared to the primary-beam energy of 2 keV. The reduced penetration of primary electrons into the material gives a much greater surface-sensitivity and, with the right detector, enables high-quality, low-voltage backscattered electron imaging (figure 2). This technique is unique in that it can provide high-resolution compositional and topographical information at the same time.
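The landing energy follows from simple energy bookkeeping; the bias value below is inferred from the quoted numbers rather than stated explicitly in the text:

E_{\text{landing}} = E_{\text{primary}} - e V_{\text{bias}}

so a 2 keV primary beam reaching a specimen biased to about -1.95 kV arrives with roughly 50 eV, as in the example above.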

Another way of boosting the capability of an SEM is to position a detector beneath a thin specimen in order to collect electrons that are transmitted through the sample — similar to what is done in a scanning TEM (STEM). This “STEM-in-SEM” technique takes advantage of the comparatively low primary-beam energies of the SEM (typically less than 30 keV), which means that the primary electrons scatter more often as they pass through the specimen.

This increased scattering makes it easier to study materials containing lighter atoms such as carbon, which are not very efficient at scattering electrons. This technique is particularly useful for studying polymers, carbon nanotubes (figure 3) and other organic matter.

Another exciting innovation is the environmental SEM (ESEM), which can also be used in STEM mode. “Environmental” refers to the instrument’s ability to image a sample in its “native state” rather than first being desiccated, coated in gold and held under high vacuum. The ESEM and microscopes using similar technologies are well suited to the study of electrically insulating bulk materials such as oxides, ceramics, glasses and polymers. Such materials would normally become electrically charged by the primary beam, making analysis very difficult.

The technique works by introducing an “imaging gas” such as water vapour to the sample chamber. When the primary beam strikes the sample, it produces “secondary electrons”, which ionize the gas, creating more secondary electrons as well as positive ions. The additional electrons serve to amplify the secondary-electron signal, while the positive ions are attracted to the sample, where they compensate for the negative charge deposited by primary electrons.

Wetting and drying cycles

If water vapour is used as the imaging gas, the sample chamber becomes a suitable environment for stabilizing liquid-containing specimens and performing in situ experiments — for example, a sample can be studied as it is put through cycles of wetting and drying. Specimens can also be heated, cooled, stretched, compressed and otherwise manipulated, in association with a range of gases appropriate to the experiment. Recent advances in detector design mean that experiments can be carried out at chamber pressures up to around 4 kPa. While this is not quite atmospheric pressure (101 kPa), the pressure in a conventional SEM chamber is typically about a million times lower still.

The SEM is also playing an increasingly valuable role in the fabrication of nanometre-scale devices for electronics and other applications. For example, interactions of the electron beam with specific gases can be used to achieve the controlled deposition of materials such as metals or different forms of carbon onto a substrate. This technique can be used to build 3D nanostructures or to create connections between nanocomponents.

Another method for making extremely small features using an SEM is electron-beam lithography. This process uses the electron beam to “draw” circuit patterns onto a substrate that is coated with a polymer resist. The surface is then subjected to chemical etching, which removes the polymer that has not been exposed to the electron beam. Deposition and etching techniques can also be combined in a gas-mediated environment, as in the ESEM, leading to new and important techniques for creating nanostructures.

SEM analysis can be enhanced by the integration of a focused ion beam (FIB), which is scanned across the sample much like the electron beam. The FIB can be used to remove material from the surface of the sample (a process called milling), which allows the electron beam to probe deeper into the material. This sequence is repeated a number of times to obtain a series of 2D “slices” through a material, which can later be volume-rendered to create a 3D image. Slices can range in thickness from nanometres to micrometres, depending on the size distribution of features present in the material. These abilities can be coupled with traditional analytical techniques such as electron backscatter diffraction (EBSD) and X-ray microanalysis (EDS) to build up 3D images of the structural and chemical properties of the sample.
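As an illustration of the final reconstruction step, the short Python sketch below stacks a series of aligned slice images into a 3D volume; the file pattern and voxel dimensions are placeholders rather than values from any particular instrument.

import glob
import numpy as np
from imageio.v2 import imread

# Hypothetical file names and voxel sizes; real slice-and-view data would
# come from the microscope's own acquisition software.
slice_files = sorted(glob.glob("slices/slice_*.tif"))
pixel_size_nm = 5.0        # in-plane pixel size (assumed)
slice_thickness_nm = 20.0  # spacing between successive FIB cuts (assumed)

# Stack the aligned 2D slices into a 3D array: axis 0 runs through the
# material in the milling direction, axes 1 and 2 span the image plane.
volume = np.stack([imread(f) for f in slice_files], axis=0)

# The slice spacing is usually coarser than the pixel size, so record the
# anisotropic voxel dimensions for any later rendering or measurement.
voxel_size_nm = (slice_thickness_nm, pixel_size_nm, pixel_size_nm)
print(volume.shape, voxel_size_nm)

A volume assembled in this way can then be handed to standard 3D visualization tools for rendering.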

This brief survey has shown how the modern SEM can take us from the advanced characterization of hard and soft materials, even liquids, to 3D visualization, dynamic experiments and the fabrication of nanostructures. Much of this is possible thanks to the extremely high level of control offered by the SEM — an ability that has evolved through decades of technology development coupled with increased computing power and dedicated software. In a cutting-edge world, the not-so-humble SEM is rising magnificently to the challenges of the 21st century.

Fastest supercomputer overtaken by Roadrunner

A supercomputer designed by scientists at Los Alamos National Laboratory in the US has broken the “petaflop barrier”, making it the most powerful supercomputer in the world. Performing over one thousand trillion calculations per second, the $100m “Roadrunner” supercomputer is more than twice as fast as the previous record holder, Blue Gene/L, based at Lawrence Livermore National Laboratory.

Computer performance has been rising exponentially over the past two decades in accordance with Moore’s law, which states that the number of transistors on computer chips doubles every 18 months. However, engineers have found it increasingly difficult to stick to this prediction because of side-effects associated with shrinking components. For example, if the internal clock speed of 65-nanometre transistors — the current standard in high-end microprocessors — is doubled, the amount of heat produced rises by eight times.
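One back-of-the-envelope way to arrive at that factor of eight (assuming, as is common, that the supply voltage is scaled in proportion to the clock frequency) starts from the dynamic power dissipated by CMOS logic:

P \propto C V^2 f

With V \propto f, the power scales as f^3, so doubling the clock frequency multiplies the heat output by 2^3 = 8.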

With Roadrunner, the Los Alamos scientists have got around such problems by employing a hybrid design. The main structure contains a cluster of microprocessors built by the US manufacturer AMD. But each of these chips is boosted by up to 25 times by a special type of microprocessor, designed by Sony, Toshiba and IBM, known as the “Cell”. This chip has already been used in Sony’s PlayStation 3 — and, in fact, at least one scientist has built his own small supercomputer by wiring several of the games consoles together.

The Roadrunner, which was named after New Mexico’s state bird, will run scientific applications for four to six months. These will likely include cosmology, astrophysics and climate models. After that, it will be used by military scientists to simulate the explosions of nuclear weapons.

Should we scrap science news?

By Jon Cartwright

Here’s a statistic for you, taken from a website called Sense About Science. It claims that there are over a million scientific papers published every year. If that’s right, there must be something in the region of 20,000 published a week. Even if physics accounts for only a small fraction of the sciences, that still means we’re looking at several hundred every day. (I could dig out a reliable figure, but it’s probably not far wrong.)

There’s no way we at Physics World can even hope to keep you up to date with that many papers. Nor would you want us to — let’s face it, most papers deal with very minor developments that would only interest those working in exactly the same field.

So, I would like to raise a question: should we bother to comb the journals for the biggest developments, or should we give up reporting research news altogether?

Actually, I’m not the first to raise it. I discovered the idea nestled at the bottom of an article written last week in the Guardian by Peter Wilby. He had been haranguing the Daily Mail for the way they report “breakthrough” developments in health research. (It’s the same old score: this week they tell you a glass of wine a day will increase your life expectancy; next week they will tell you the opposite.) Wilby proposes that, instead of mindlessly regurgitating seesawing opinions from the medical community, the media should offer “state of knowledge” features that give an overview of where the present scientific consensus is headed.

Would this feature-only approach benefit physics too? Conclusions seen in physics papers are usually straightforward to interpret — particularly compared with, say, the vagaries of health surveys — which would suggest the answer is no. However, there are still many difficulties.

One is that small developments in research are seen as being less newsworthy than those that go against prevailing opinion. In the latter instance, unless there is enough context to show how the research fits into the grand scheme of things, a news story can be misleading. Another, as I showed in my recent article on the use of embargoes in science publishing, is that much (if not most) science news is artificially timed to fit in with publishers’ agendas; in a sense, the news is not “new” at all. A feature-only approach would avoid both of these.

The main point I can see in favour of science news is that there are certain developments that deserve to be brought to people’s attention immediately. Think of the recent claims by the DAMA experiment team in Italy that they had detected dark matter on Earth. Or the discovery by Japanese physicists of a new class of high-temperature superconductors based on iron. Should we only report on such critical research? If so, where do we draw the line?

Let’s hear your thoughts. But bear in mind that if we do decide to scrap science news, I’ll be out of a job.

Multi-particle entanglement in solid is a first

An international team of physicists has entangled three diamond nuclei for the first time. The development elevates solid-state systems to the ranks of quantum systems, such as ions and photons, in which entanglement of more than two particles has been achieved.

Entanglement lies at the heart of fields such as quantum computation and quantum teleportation. At its most basic level, if two particles are entangled a measurement of the state of one reveals something about the state of the other, regardless of the distance separating them.

But entanglement is difficult to achieve. It requires quantum states to be manipulated while preventing them from interacting with their environment, which tends to degrade the quantum system into a classical state. Physicists have had some successes, having entangled up to eight calcium ions and up to five photons. So far, however, solid-state systems have proved trickier.

Now, a team led by Jörg Wrachtrup of the University of Stuttgart, Germany, has demonstrated that two or three diamond nuclei can be entangled (Science 320 1326). “If we compare the quality of entanglement in our experiments with those [of ions and photons], our results compare favourably,” says Wrachtrup. His team includes researchers from the University of Tsukuba, the National Institute of Advanced Industrial Science and Technology and the Nanotechnology Research Institute in Japan, as well as Texas A&M University in the US.

Method ‘not new’

The researchers’ system is a piece of synthetic diamond containing a large proportion of carbon-13 isotopes. At one point in the lattice they place a nitrogen atom, which leaves a defect containing a single electron.

Because this electron interacts with the neighbouring carbon nuclei, Wrachtrup’s team can shine laser light onto it to put some of the nuclei into a certain quantum state. Then, by applying radio-frequency pulses of a magnetic field, they can drive the spin of the nuclei so that they become entangled with one another.

“The method itself is not new,” says Wrachtrup, who adds that equivalent magnetic pulses are also used in nuclear magnetic resonance (NMR) spectroscopy. “It is the system itself which we discovered to be addressable as a single quantum system almost a decade ago. Meanwhile we can engineer this defect to such a degree that we do have excellent control.”

Diamond nuclei are an attractive option as a quantum system for computation. They can be kept in a coherent state for a long time and are easier to control than other systems, which will become vital for minimizing errors. However, because they currently have to be manipulated using the defect electron as an intermediary, it might be difficult to entangle many of them. Wrachtrup says that his team is now working on scaling up the system, and believes that in future it should be possible to control up to five or six nuclei per electron spin.

What’s your Wu index?

Ed Witten has once again been ranked as the world’s number one physicist, according to a new index that rates scientists in terms of the citations generated by their published papers. The new measure, called the w-index, has been developed by Qiang Wu from the University of Science and Technology of China in Hefei.

Wu’s index is similar to, but subtly different from, the h-index that was developed in 2005 by physicist Jorge Hirsch at the University of California at San Diego, which quantifies the published output of a scientist.

According to Hirsch’s criterion, an h-index of, say, 9 indicates that a researcher has published at least 9 papers, each of which has been cited 9 or more times. The w-index, on the other hand, indicates that a researcher has published w papers with at least 10w citations each. A researcher with a w-index of 24, for example, has 24 papers with at least 240 citations each.

Wu says in his paper that the index is a significant improvement on the h-index, as it “more accurately reflects the influence of a scientist’s top papers,” even though he concedes that the index could “be called the 10h-index” (arXiv:0805.4650v1).

Tops by both measures

Wu has calculated the w-index for some physicists who also have high h-indexes. Theoretical physicist Ed Witten from the Institute for Advanced Study in Princeton, who has the highest h-index, also comes top in the w-index ranking with a score of 41. Witten is followed by condensed-matter theorist Philip Anderson at Princeton University, with a w-index of 26, while cosmologist Stephen Hawking at Cambridge University comes third with a w-index of 24. Particle theorist Frank Wilczek (Massachusetts Institute of Technology) and Marvin Cohen (University of California, Berkeley) are joint fourth with a score of 23.

While Witten, Anderson and Wilczek also took three of the top five slots in the h-index ranking, the big winner under the new criterion is Hawking, who has a relatively modest h-index of just 62, compared to Witten’s score of 110.

According to Wu, a researcher with a w-index of 1 or 2 is someone who “has learned the rudiments of a subject”. A w-index of 3 or 4 characterizes a researcher who has mastered “the art of scientific activity”, while “outstanding individuals” are those with a w-index of 10. Wu reserves the accolade of “top scientists” to those with a w-index of 15 after 20 years or 20 after 30 years.

The w-index is easy to work out using a publication database such as ISI Web of Knowledge from Thomson Reuters, Scopus from Elsevier or Google Scholar. It can be determined in the same way as the h-index by simply searching for a researcher’s name in a database and then listing all their papers according to citations, with the highest first.
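As an illustration of that procedure, the short Python sketch below computes both indices from a list of per-paper citation counts; the counts shown are invented for the example and would in practice be exported from one of the databases mentioned above.

def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, cites in enumerate(ranked, start=1) if cites >= rank)

def w_index(citations):
    """Largest w such that w papers each have at least 10*w citations."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, cites in enumerate(ranked, start=1) if cites >= 10 * rank)

# Invented citation counts for a hypothetical researcher.
counts = [950, 480, 310, 260, 240, 120, 95, 40, 12, 3]
print(h_index(counts), w_index(counts))  # prints: 9 7

For these example counts the researcher would have an h-index of 9 but a w-index of only 7, reflecting the heavier weight the w-index places on a scientist’s most highly cited papers.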

The hunt for ‘God’s particle’?!

By Jon Cartwright

We have Leon Lederman to blame. For the “God particle”, that is. Since he published his 1993 book, The God Particle: If the Universe Is the Answer, What Is the Question?, the layperson might be forgiven for believing the Large Hadron Collider (LHC) is not searching for a particle called the Higgs boson, but a path to spiritual enlightenment.

Many physicists hate referring to Him. For some particle physicists, the “God particle” belittles the hordes of other theoretical particles that might be detected at the LHC. They say it reveals little of the particle’s function, and is savoured by writers with little rhetoric. For some non-particle physicists, the God particle epitomizes the hype that surrounds particle physics. Then there are those who think divine connotations are always a bad idea.

Are they, though? When a furore about the use of “God particle” began bouncing around the blogosphere last August, mostly in response to an article written by Dennis Overbye of the New York Times in which he defended the term, several commentators agreed that religious metaphors should be an acceptable part of our language. Einstein used them all the time (e.g. “Quantum mechanics…yields much, but it hardly brings us close to the secrets of the Old One”), yet historians tend to conclude he was not a theist. Even when I began writing this blog entry I thought I might be clever and refer to the Higgs as the light at the end of the LHC’s tunnel — before I reminded myself that the Higgs is not the only particle of import they expect to find.

As Sean Carroll noted on the Cosmic Variance blog, it is a fear of pandering to the religious right that is driving the expulsion of religious metaphors. If certain atheists succeed, religious metaphors will go the way of the dodo. The God particle is not one of the best, but it might be one of the last.

Which brings me to the point of this entry (not that Earth-shattering, I’ll warn you now). This morning I was looking at the news links posted on the Interactions website, only to find one from the Guardian newspaper headlined “The hunt for God’s particle”. That’s right — you read it right the first time. “God’s particle”? Where’s the metaphor in that? Have we now come full circle, having compared the search for the Higgs boson to the path to spiritual enlightenment, only to reduce it to another of God’s creations?

Poor old Lederman must wonder what he started.
