
Bringing exoplanets into sharper focus

Earlier this week, astronomers reported the observation of the smallest known planet orbiting a star other than our own. We now know of about 300 such “exoplanets”, most of them gas giants like Jupiter. Now, a new imaging technique using optical fibres could help planet-hunters towards an astronomer’s holy grail: the direct observation and characterization of an Earth-like planet.

The problem with detecting such planets is one of contrast — light coming from them is incredibly faint compared with light from the parent star. A team of researchers from France and Australia suggests overcoming this problem by feeding the light through single-mode optical fibres borrowed from the telecommunications industry. The new technique significantly improves the resolution of existing adaptive optics (AO), a method for reducing the blurring of astronomical images, say the researchers.

“Planet-hunters have started to specialize in recent years; like predators in an ecosystem,” says Peter Tuthill, one of the researchers, from the University of Sydney. “In the long run, we are aiming to characterize and study the planets themselves, not just stamp-collect them in discovery catalogues.”

Exoplanet suite

Astronomers first began detecting exoplanets back in the mid-1990s, and since then detection methods have changed surprisingly little. The majority of these discoveries have resulted from two methods: either looking for a star’s “wobble” caused by the gravity of a planet as it orbits; or looking for the dimming of a star as an orbiting planet sweeps in front and partially blocks the starlight.
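To get a feel for the size of this wobble, here is a quick back-of-envelope sketch of my own (not from the article), using the Sun–Jupiter system: the star’s reflex speed is just the planet’s orbital speed scaled down by the planet-to-star mass ratio.

```python
# Back-of-envelope "wobble": star and planet orbit their common centre of
# mass, so the star's reflex speed is the planet's orbital speed scaled by
# the mass ratio. Values below are standard textbook numbers.
M_sun = 1.989e30   # kg
M_jup = 1.898e27   # kg
v_jup = 13.1e3     # m/s, Jupiter's mean orbital speed

v_star = (M_jup / M_sun) * v_jup
print(f"Sun's reflex speed due to Jupiter: {v_star:.1f} m/s")  # ~12.5 m/s
```

A walking-pace wobble from a Jupiter is hard enough to measure; an Earth twin would tug its star at only about 0.1 m/s, which is why the wobble method has mostly netted massive planets.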


While these techniques can reveal the orbits and masses of exoplanets, they tell us very little about the atmospheres of planets within a solar system’s “habitable zone”. Although space-borne telescopes bypass the image-blurring of the Earth’s atmosphere, it is still very difficult to resolve a planet’s light from that of its much brighter parent star. Contrasts are typically one in 100 million for a Jupiter-sized planet and one in 10 billion for an Earth-sized one.

Despite these difficulties, two separate teams of astronomers did finally report the first “bona fide” direct images of exoplanets last November, using the wide 8–10 m apertures of the ground-based Gemini and Keck telescopes. They also employed an imaging technique known as adaptive optics (AO) to “clear up” the images. AO systems work by including a deformable mirror that can be calibrated — using light from bright “guide stars” near the astronomers’ target zone — to improve the resolution of images.

“AO’s main advantage is that it removes the effect of phase aberrations, seen as ‘speckles’, via spatial filtering and careful mapping,” says Christian Marois of the Herzberg Institute of Astrophysics in Canada. One drawback of AO, however, is that calibration is notoriously difficult because it depends strongly on the angular separation between star and planet.

Adaptive optics fine-tuned

With this latest research, Tuthill and colleagues make calibration easier by building on an established technique known as “aperture masking”. In standard aperture masking, light is passed through small holes, causing the beams to diffract; observers then infer the presence — or absence — of exoplanets from the resulting interference pattern. Until now, however, this technique has been hampered by imperfections in the interferometry equipment.

The researchers have improved aperture masking by placing an array of 36 single-mode optical fibres behind the aperture disk. By feeding the incoming light through the fibres, the wavefronts retain their relative phases, improving both the resolution and the reliability of the data (arXiv:0901.2165). “You can think of aperture masking as an afterburner for an AO system, because we will still use an AO system to do the first pass at correcting the starlight for aberrations,” said Tuthill.

Although the “pupil remapping” approach is yet to be tested on real astronomical images, Tuthill told physicsworld.com that the technology is “hot off the development bench” in the photonics industry.

Community reaction

Wesley Traub, chief scientist for the NASA Navigator Program, told physicsworld.com, “The new method is similar to pupil masking, but different in that all of the pupil can now be used.”

“The really interesting new feature is that they use phase closure to look for a non-symmetric distribution of light on the sky, as would be produced by a star with a planet nearby,” he added.
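For readers curious what phase closure buys you, here is a minimal numerical sketch (my own illustration, not the team’s code): summing the measured fringe phases around a closed triangle of apertures cancels the unknown per-aperture phase errors exactly, leaving only the source’s intrinsic signal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Intrinsic visibility phases (radians) on the three baselines of a triangle
# of apertures. An asymmetric source, e.g. a star with a faint companion,
# produces a non-zero closure phase; values here are made up for illustration.
phi_true = np.array([0.30, -0.10, 0.25])   # baselines (1,2), (2,3), (3,1)

# Large random "piston" phase errors at each aperture (atmosphere, optics),
# drawn independently for 10,000 measurement frames.
err = rng.normal(0.0, 2.0, size=(10_000, 3))

# A baseline (i,j) measures: true phase + (error at j - error at i).
phi_12 = phi_true[0] + err[:, 1] - err[:, 0]
phi_23 = phi_true[1] + err[:, 2] - err[:, 1]
phi_31 = phi_true[2] + err[:, 0] - err[:, 2]

# Closure phase: the per-aperture errors cancel identically around the loop.
closure = phi_12 + phi_23 + phi_31
print(closure.std())                    # ~1e-16: immune to the piston noise
print(closure.mean(), phi_true.sum())   # recovers the intrinsic closure phase
```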

Tuthill and his colleagues now plan to test their device on real images of space. “In the longer term, we hope there might be potential for the next generation of giant telescopes,” he said. “As yet nobody knows how to build the AO systems to make these giants function properly, but our team thinks that instruments like the ones described here will be a key component to getting the best science from the new generation of behemoth telescopes.”

Adaptive optics expert Denis Brousseau of Laval University in Canada told physicsworld.com, “The technique holds the potential to improve aperture masking at major world observatories.”

PAMELA paper dampens dark-matter claim

Did the PAMELA experiment see dark matter or didn’t it? That was one of the biggest physics mysteries of 2008, when preliminary results from the space-borne detector suggested that an anomaly in the ratio of high-energy positrons to electrons reaching Earth could be caused by annihilating dark matter. If true, the finding would be the most direct view yet of the elusive stuff that is thought to make up about 22% of the mass of the universe.

Others argued that the anomaly could have a more mundane explanation: it is simply caused by positrons emitted by nearby pulsars. Now, however, the PAMELA team has published an analysis of different data — the antiproton/proton ratio — which may give added weight to the pulsar explanation.

Launched in June 2006, the PAMELA (Payload for Antimatter/Matter Exploration and Light-nuclei Astrophysics) satellite was designed by institutions in Italy, Russia, Germany and Sweden to examine the nature of antiparticles in cosmic rays. In November 2008 the PAMELA team reported preliminary results that suggested that the ratio of positrons to electrons at energies above 10 GeV is greater than predicted by theories that only take into account the production of positrons from interactions between cosmic rays and interstellar gas.

Pulsars or dark matter?

Although the PAMELA collaboration said at the time that the excess positrons could come from a nearby pulsar, they also suggested that it could be due to dark-matter particles annihilating.

The PAMELA team has now published a paper on the antiproton/proton ratio in the cosmic-ray flux, a quantity that should also be affected by dark matter annihilating to create antiprotons. PAMELA measured this ratio at energies between 1 and 100 GeV during 500 days of data collection (Phys. Rev. Lett. 102 051101).

The team found that the antiproton/proton ratio rose from nearly zero at 1 GeV to a maximum of about 0.0002 at around 10 GeV, before dropping back down again. According to the team, the data suggest that the antiprotons are produced when cosmic rays collide with interstellar gas — with no need for dark-matter annihilation.

An open question

So where does this leave the positron/electron results? PAMELA team member Wolfgang Menn of the University of Siegen in Germany told physicsworld.com that this remains an open question — and that dark-matter annihilation remains a possible explanation, along with pulsars.

Polarized electrons pumped at GHz frequencies

A device that emits exactly one spin-polarized electron every billionth of a second has been unveiled by physicists in Germany. Based on a tiny piece of semiconductor called a quantum dot, the device is one of the fastest single electron pumps ever built.

The researchers believe that the device could, with some improvements, be used as a very precise source of electrical current that would allow physicists to redefine SI units in terms of fundamental quantities such as the charge of the electron — a discipline called quantum metrology.

On a more practical level, as the spins of the emitted electrons appear to all point in the same direction, the device could be used as a source of spin-polarized electrons in “spintronic” devices, which exploit both the spin and charge of the electron. In principle, this would allow such devices to operate at gigahertz clock speeds.

Tunnelling devices

The new device has been built by physicists at the PTB standards lab in Braunschweig, Germany. They are among a number of groups at standards labs around the world developing precise single-electron current sources. Many of these sources take advantage of the quantum-mechanical effect of tunnelling — whereby an electron has a probability of spontaneously crossing an insulating barrier between two tiny pieces of metal (Appl. Phys. Lett. 94 012106).

Early devices employed a series of tiny metal pieces and barriers. However, tunnelling takes a finite amount of time, which means that there is a gap of about one ten-millionth of a second between the emission of successive electrons. In other words, the source operates at 10 MHz, and this relatively low frequency results in an electrical current that is too small to be of any practical use as a precise current source for quantum metrology.
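The arithmetic behind that limitation is straightforward: a pump that transfers exactly one electron per cycle delivers a current I = ef, so the operating frequency sets the current directly. A quick sketch of my own:

```python
# Current from a single-electron pump: I = e * f (one electron per cycle).
e = 1.602176634e-19   # elementary charge in coulombs

for f_hz in (10e6, 1e9, 3e9):   # 10 MHz metal pumps vs GHz semiconductor pumps
    print(f"{f_hz:8.0e} Hz -> {e * f_hz * 1e12:7.2f} pA")
# 10 MHz gives only ~1.6 pA; pushing to GHz frequencies brings the current
# into the hundreds of picoamps, a far more practical level for metrology.
```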

Much faster single-electron pumps that operate at gigahertz frequencies have been made using surface acoustic waves (SAWs): high-frequency sound waves that travel across a semiconductor and drive single electrons across an insulating barrier. However, these pumps are much less precise than those based on tunnelling and are therefore unlikely to be of much use for quantum metrology.

In 2007 an international team of researchers including Mark Blumenthal at Cambridge University, Bernd Kaestner at the PTB national metrology lab in Germany and J T Janssen at the UK’s National Physical Laboratory worked out a new way to make a much faster single electron source based on oscillating tunnelling barriers.

Comprising a tiny piece of semiconductor called a quantum dot, the device was initially operated at frequencies up to about 3 GHz. However, it was nowhere near precise enough for quantum metrology because it failed to spit out exactly one electron about once in every 10,000 cycles. To be of practical use, a GHz pump could only skip a beat about once every 10 million cycles.

In December last year, Janssen and colleagues in Cambridge and New York showed that the performance of such an electron pump could be improved greatly by placing it in a magnetic field as high as 3 T. Now, Kaestner along with Hans Schumacher and colleagues at PTB have boosted this magnetic field to about 10 T and discovered that the performance is improved even more. What’s more, at such a high field, the pumped electrons are almost certainly spin polarized — according to the researchers.

Oscillating voltage

The PTB device is based on a quantum dot 250 nm in diameter. Opposite sides of the quantum dot are connected to two tiny metal wires (700 nm wide) via two thin insulating layers (about 100 nm thick). An electrode carrying an oscillating voltage is placed onto one insulating layer and an electrode carrying a constant voltage is placed onto the other.

An electron can move from a wire to the dot by tunnelling through the insulating layer under the oscillating-voltage electrode. This is much more likely to occur at a certain point in the oscillation, when the tunnelling barrier is reduced by the applied voltage. Once tunnelling has occurred, the quantum dot has an extra electron, which then tunnels out into the other wire. Electrical repulsion between electrons means that only one electron can squeeze through the tunnel barrier at a time.

Schumacher told physicsworld.com that such devices can operate at gigahertz frequencies because the applied voltages reduce the tunnelling barriers, allowing tunnelling to occur much faster than in the metal–insulator devices.

The team found that by applying a high magnetic field to the dot, the precision of the pump went from about one in 10,000 to about one in one million. However, it is not clear exactly why the magnetic field gives such a significant boost to the precision.

Schumacher believes that the magnetic field changes how the electron tunnels through the insulator. The electron is deflected by the magnetic field and therefore follows longer, curved trajectories through the barriers — which could make it more likely that exactly one electron tunnels in and out per cycle. Janssen adds that the magnetic field also has the effect of confining the electrons to a smaller region of the quantum dot — which sharpens the divisions between electron energy levels, again making it more likely that exactly one electron moves in and out.

‘Interesting’ technology

Jukka Pekola, who studies single electron sources at the Helsinki University of Technology, described the source as “interesting” because the magnetic field gives physicists another parameter to tune its performance — and also because the electrons appear to be spin polarized.

The PTB team is now trying to further boost the precision of the source to about one in 100 million by trying to find the optimum magnetic field, device shape, operating frequency and waveform of the oscillating voltage.

While the team has not actually measured the spin polarization of the electrons, Schumacher is confident that the 10 T field aligns all the spins in the same direction. But just to be sure, the team plans to measure the spin polarization sometime in the future.

Schumacher adds that the source can easily be adjusted to produce two spin-polarized electrons at a time. Because the pair is created in the same quantum dot, they would be entangled and therefore could be used — at least in principle — in a quantum computer.

Weird analogy of the week

The physicsworld.com supercomputer

by James Dacey

Ever wondered how many “men with calculators” it takes to match a day’s worth of IBM supercomputing?

According to The Times newspaper, it’s 120 billion of them, working for 50 years.

Confused?

Well, it all began yesterday at a press conference in San Francisco…

IBM revealed plans to build a supercomputer twenty times more powerful than today’s record holder.

The company’s new baby, called Sequoia, will be ready for action by 2012, when it takes up residence at the Lawrence Livermore National Laboratory, California.

Building and running costs will be covered by the US Department of Energy, which is employing Sequoia to model the decay of the US nuclear-weapons arsenal.

So what is this thing?

IBM’s geekspeak tells us that Sequoia will run at 20 petaflops: “peta” being the prefix for a quadrillion (10^15) and “flops” standing for floating-point operations per second.

The company will build it around its “Blue Gene” chip, which makes it different from a lot of other supercomputers that work by stacking up a whole load of servers.

Now, I’m no supercomputer aficionado, but this machine seems substantially more powerful than its predecessors and, for that reason, worth reporting. The British mainstream press thought so too: the story appears on the web pages of The Guardian, The Telegraph and The Times, along with a host of smaller sites.

Science journalists are always looking for “world-firsts” and “coo-wow” factor when it comes to new technologies, so it’s no real surprise that this humungous lump of American computer has received this widespread coverage.

And journalists are also committed to presenting facts in understandable, every-day terms. So it was interesting to see how the national papers would describe Sequoia’s processing power.

The Guardian went straight for the coo-wow, describing it as “the equivalent of more than 2m laptops”.

The Telegraph were a bit more conservative, focusing on the specific new development: “one order of magnitude quicker than its predecessor”.

And then there’s The Times. Their description is – quite frankly – bizarre:

“Given an entire day, the Sequoia could match the output that 120 billion men with calculators might achieve in 50 years.”

What!!

Who are these men?
What are they calculating?
What type of calculator are they using?
Are they allowed bathroom breaks??

Ok, I’m being a little bit silly, but it is a weird analogy. Quite creepy too, when you really think about it. And more than a little bit arbitrary.
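For what it’s worth, The Times’ numbers do roughly stack up, provided each man punches in about nine calculations per second for the full 50 years. Here’s my own quick check:

```python
# Sanity check of The Times' analogy (my own back-of-envelope, not theirs).
flops = 20e15                    # Sequoia: 20 petaflops
ops_in_a_day = flops * 86_400    # operations Sequoia performs in one day

men = 120e9                                  # 120 billion men
man_seconds = men * 50 * 365.25 * 86_400     # 50 years of non-stop work

print(f"{ops_in_a_day / man_seconds:.1f} calculations per second per man")
# ~9.1 - so each man needs a very fast calculator and no bathroom breaks
```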

So, creative physicsworld.com readers, I throw this out to you – how would you describe the computing power of IBM’s new monster machine?

Astronomers find ‘super Earth’

Astronomers using the CoRoT space telescope have found the smallest extrasolar planet to date. The planet, dubbed CoRoT-Exo-7b, is less than twice the radius of the Earth. It has a surface temperature of over 1000 °C because it orbits extremely close to its parent star, completing one orbit in just 20.5 hours.

Most of the 330 exoplanets discovered so far are gas-giant planets that resemble Jupiter. Very few with masses comparable to Earth’s have been found because they are difficult to detect. CoRoT managed to identify such a small object because the transit method it uses is sensitive to a planet’s size rather than just its mass, which is the quantity probed by methods that look for the wobble of a star caused by an orbiting planet. CoRoT also orbits 900 km above the Earth and can detect changes in star brightness as small as 0.01%, about 10 times better than the best ground-based telescopes.

The exoplanet circles a star about 400 light-years from Earth and was detected by measuring the slight dimming of the star each time the planet passes in front of it. Although the density of CoRoT-Exo-7b has not yet been determined, the scientists believe it might be a rocky object like Earth, covered with molten rock, and therefore unlikely to harbour life as we know it.
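To see why CoRoT’s 0.01% photometric sensitivity matters, here is a rough estimate of the transit depth (my own sketch, assuming a roughly Sun-sized host star and taking 1.8 Earth radii for the planet):

```python
# Transit depth is roughly the ratio of the two discs: (R_planet / R_star)^2.
R_EARTH = 6.371e6   # m
R_SUN = 6.957e8     # m

r_planet = 1.8 * R_EARTH          # "less than twice" Earth's radius (assumed)
depth = (r_planet / R_SUN) ** 2
print(f"transit depth ~ {depth:.1e} = {depth * 100:.3f}%")
# ~0.027%: only a few times CoRoT's 0.01% sensitivity, and an order of
# magnitude too shallow for typical ground-based photometry.
```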

Important milestone for planet hunters

The new result is an important milestone for planet hunters, according to Jean Schneider of the Laboratory Universe and Theories at the Paris Observatory, because recent measurements had hinted at the existence of low-mass planets, but their sizes had not been determined.

The discovery was also backed up with numerous follow-up measurements from the ground, using instruments such as ESO’s Very Large Telescope at Paranal, HARPS at La Silla and the Canada-France-Hawaii Telescope on Mauna Kea. Although scientists detected CoRoT-Exo-7b a year ago, they waited for the results of these complementary measurements before announcing their findings.

CoRoT was developed by the French Space Agency (CNES) at the Laboratory for Space Studies and Astrophysics Instrumentation (Paris Observatory), the Marseille Astrophysics Lab, the Institute of Space Astrophysics, Orsay (University of Paris 11) and the Midi Pyrenees Observatory, Toulouse. International partners included teams from Austria, Belgium, the European Space Agency, Germany, Spain and Brazil.

Star-shaking mission

CoRoT stands for “convection, rotation and planetary transits” and its goal is to search for exoplanets, particularly those similar to Earth. It also detects and analyses star vibrations to determine stellar composition (a field known as stellar seismology).

The work was presented at the first symposium dedicated to CoRoT, held in Paris from 2 to 5 February 2009. The work will be reported in an upcoming special issue of the journal Astronomy and Astrophysics.

CoRoT is the first step towards finding Earth-like exoplanets and is a relatively small project that cost just €140m. It will be succeeded in time by Kepler, a much more ambitious US mission with the same goals.

Doppler effect reversed by metamaterial

Physicists have generated a lot of excitement in recent years by dreaming up specially structured materials with novel applications such as invisibility cloaks. What’s more, some of these “metamaterials” have been built in the laboratory and shown to work over a narrow range of electromagnetic wavelengths. Now, a group of researchers from Korea and China has created an acoustic metamaterial that exhibits a bizarre reverse Doppler effect. This is an important stepping stone towards an acoustic cloak, according to the researchers.

As every physicist is taught in school, the Doppler effect is what causes a pedestrian to hear a high-pitched siren as a police car speeds towards them, and a falling pitch as it races away. Surprisingly, the new material defies the physics textbooks by reversing this effect.

Chul Koo Kim of Yonsei University and his colleagues have achieved this feat by creating an elastic tube that transmits sound with a negative phase velocity. “We have successfully fabricated an acoustic metamaterial whose acoustic refractive index can be controlled; the theoretical models can now be implemented to realize acoustic cloaking as well as other applications,” Kim told physicsworld.com.

Witchcraft and wizardry

In 2006 a group at Duke University, North Carolina, captured public interest when it demonstrated a trick previously confined to the pages of Harry Potter books. Led by David Smith, the researchers created a cylinder from artificial “metamaterials” capable of hiding an object from microwave radiation — the waves were “steered” around the object as if it wasn’t there.

Another bizarre optical effect to be demonstrated in the past few years is “negative refraction”: light passing between two media, including a metamaterial, is bent in the opposite direction to classic refraction. This effect is most pronounced when the metamaterial is “double negative”, possessing both negative electric permittivity and magnetic permeability.


This latest research takes the principles of negative electromagnetic refraction and applies them to acoustic vibrations. Here the parameters to be made negative are the material’s density and modulus, the latter relating to its elasticity. Until now, engineers had only created metamaterials with one of these properties negative, but Kim and colleagues have successfully combined them to create the world’s first “double-negative” acoustic metamaterial (arXiv:0901.2772v2). Their acoustic tube is constructed from thin membranes under tension, fed by a carefully controlled air flow, and this creates a negative phase velocity for sound travelling through it.

Sound is passed into the tube from a moving source via holes pierced periodically along the device. Inside the tube, a fixed detector receives the sound before sending an electrical signal to a loudspeaker. According to Kim, the major engineering hurdle was to develop effective absorbers at each end of the tube. “This enabled us to prevent reflections and ensure the quality of the data,” he said.

New sound

Kim and colleagues tested the apparatus using sound at 350 Hz, with the source moving at 5 m/s towards and then away from the direction of wave propagation. Contrary to classic Doppler experiments, they found that the frequency was down-shifted as the source moved towards the receiver and up-shifted as it moved away. In other words, the experimenters heard a falling pitch as the source approached the detector and a rising one as it receded.
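For scale, here is the ordinary Doppler shift for those experimental numbers (my own sketch, using the speed of sound in air; inside the metamaterial the size of the shift depends on the tube’s own phase velocity, so only the sign reversal carries over):

```python
# Ordinary Doppler shift for a source approaching a fixed detector:
#   f_obs = f0 * c / (c - v)
f0 = 350.0   # Hz, source frequency used in the experiment
v = 5.0      # m/s, source speed
c = 343.0    # m/s, speed of sound in air (an assumption for this sketch)

f_approach = f0 * c / (c - v)
print(f"normal medium:   {f_approach:.1f} Hz while approaching (pitch rises)")
print("reversed medium: the shift flips sign, so the pitch falls instead")
```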

Kim told physicsworld.com that the next stage of this research is to translate their “1D design” into various types of 2D and 3D acoustic metamaterials. “These developments may find uses in medicine and industry. Also easy control of acoustic refractive index will spur new research directions in fiber acoustics,” he said.

Jose Sanchez Dehesa, a metamaterials researcher at the University of Valencia, said, “This research is an important breakthrough; if we can now shift this structure to 2D and 3D, it could be used to achieve things like sub-wavelength resolution in ultrasonic imaging and many other interesting devices.”

Science in Colour

By Hamish Johnston

‘Three-particle distribution function’ (2007/2009) by Frédérique Swist
An artistic interpretation of the distribution function of a third particle around two fixed particles in a two-dimensional colloidal liquid. The image is based on a mathematical model used to calculate the probability of finding each of the three particles at a given position in space.

If you happen to be in Bristol over the next few weeks, why not pop into Café-At-Bristol at the Harbourside to see an exhibition of art inspired by the often beautiful forms that are created when scientific data are visualized.

The artist is IOP Publishing’s very own Frédérique Swist, and her show starts today and runs until 27 February.

Fred is Senior Graphic Designer here at Dirac House and she tells me that much of the inspiration for her art comes from her work designing brochures and other literature for IOP Publishing and the Institute of Physics.

She says that her work can be divided into three categories. The first includes images in which she has maintained the core scientific meaning of data, usually used in promotional materials for specific physics journals.

The second includes pieces in which she has made significant changes to the original data, usually used in more general corporate literature. ‘Three-particle distribution function’ is an example of such a work and it appeared on our 2008 Christmas card.

Finally, there are the pieces that are inspired by physics, but have been created artistically by Fred. An example is ‘Split-ring resonator’ — a work in which many physicists will recognize the iconic split rings used to make metamaterials, and others will appreciate for its artistic merit.

Indeed, Fred sums up her work: “Each piece can be appreciated on different levels; from pure abstraction to material inspired by the most advanced physics research, it provides viewers with the opportunity to form their own interpretations, and to choose ways to engage visually and/or intellectually with the imagery.”


Science in Colour
Tuesday 3 – Friday 27 February 2009
Café-At-Bristol, Anchor Road, Harbourside, Bristol BS1 5DB
Opening time: 10am to 5pm daily

Axions hint at a return

Evidence for axions is once again mounting as researchers claim the hypothetical particles can explain how very high-energy photons travel unimpeded through the cosmos.

Such photons and other neutral cosmic rays (aside from neutrinos) should be unable to travel intergalactic distances because they are absorbed by the universe’s opaque background of microwaves — yet they are still detected on Earth.


Now a group led by Malcolm Fairbairn of King’s College, London, has found a correlation between where detected cosmic rays originate and where photons and axions are more likely to “mix”. The result implies that cosmic-ray photons could reach Earth from distant galaxies by temporarily converting into axions, which can bypass the microwave background without absorption.

“If this could be confirmed, it would be an enormously important discovery,” says Dan Hooper, a physicist from Fermilab in the US who performed a similar study last year.

Old problem

Axions were first proposed in the late 1970s to solve an issue in particle physics known as the strong-CP problem. Theory suggested the elementary particles would be very light and would interact very weakly with matter — so weakly, in fact, that no one has yet managed to detect them.

Fairbairn’s group — which includes Timur Rashba of the Max Planck Institute for Solar System Research and Sergey Troitsky of the Russian Academy of Sciences — has been looking for an astrophysical equivalent of an effect being sought in laboratory axion experiments. In these “shining light through the wall” experiments, a laser is shone onto a wall in the presence of a magnetic field. If some of the laser’s photons convert into axions, they could travel freely through the wall, revert to photons and then be detected on the other side.

The hope of Fairbairn and colleagues is that, on a cosmic scale, the “wall” could be provided by the microwave background, and the magnetic field could be provided by galaxies.
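To get a feel for why galaxies can act as the “magnet”, here is a rough order-of-magnitude sketch of my own, using the standard photon-axion conversion probability in the massless, small-mixing limit, P ≈ (gBL/2)², with illustrative field and distance values rather than the paper’s:

```python
# Photon-to-axion conversion probability, small-mixing massless limit:
#   P ~ (g * B * L / 2)^2   in natural units (hbar = c = 1)
# Conversion factors to natural units:
T_TO_EV2 = 195.35       # 1 tesla ~ 195.35 eV^2
M_TO_INV_EV = 5.068e6   # 1 metre ~ 5.068e6 eV^-1

g = 1e-10 * 1e-9             # coupling 10^-10 GeV^-1, converted to eV^-1
B = 3e-10 * T_TO_EV2         # ~3 microgauss galactic field, in eV^2
L = 3.086e19 * M_TO_INV_EV   # ~1 kpc field coherence length, in eV^-1

P = (g * B * L / 2) ** 2
print(f"P(photon -> axion) ~ {P:.2f}")
# ~0.2: with a coupling this strong, kiloparsecs of microgauss field give an
# appreciable conversion probability. (The formula assumes g*B*L/2 << 1,
# so treat this strictly as an order-of-magnitude estimate.)
```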

To see if this “shining light through the universe” effect exists, Fairbairn’s group has performed a statistical analysis of neutral cosmic rays with energies above 10^18 eV recorded by the High Resolution Fly’s Eye (HiRes) detector in Utah. Previous studies have already highlighted a correlation between the arrival directions of these cosmic rays and the known locations of highly luminous “active” nuclei of distant galaxies. But Fairbairn’s group has shown that there is an additional correlation with the likely profile of our own galaxy’s magnetic field, which determines the probability of photon-to-axion conversions. The researchers say the likelihood of the correlation occurring by coincidence is just 2.4% (arXiv:0901.4085).

Proof is ‘far away’

Fairbairn told physicsworld.com that the result is definitely new evidence in favour of axions, but warns that it needs to be corroborated by more data. “It’s still far away from proving anything,” he adds. “Very, very far.”

If the axion does indeed exist, it would be particularly light (less than 10^-7 eV) and have a particularly weak coupling with photons (an inverse coupling of 10^10 GeV). This latter property rules it out as a solution to the decades-old strong-CP problem by several orders of magnitude. However, there is still a question as to whether it could fulfil the other possible role of axions — that of the cold dark matter that generates most of the universe’s gravity.

Theorists expect dark-matter particles to be heavier than 10^-7 eV, because heavier particles can be generated sooner after the Big Bang and then spread out to the low densities implied by experimental cosmology. But Aaron Chou, a spokesperson for the GammeV axion experiment at Fermilab, suggests that we might just live in a region of the universe where dark matter happens to be less dense. “In short,” he explains, “this model could also produce axion-like dark matter, but just not in a generic fashion.”

Nonetheless, the study by Fairbairn’s group does back up work by Hooper, Pasquale Serpico of CERN and Melanie Simet of the University of Chicago, who found a similar correlation for lower-energy photons last year (Phys. Rev. D 77 063001; arXiv:0712.2825). Hooper and Serpico point out — and Fairbairn admits — that there is no way of knowing how many of the neutral cosmic rays detected by HiRes are actually photons, as all it can detect are the showers produced when the rays collide with the Earth’s atmosphere. Moreover, Serpico has doubts about the effectiveness of the photon-to-axion conversion mechanism at such high energies.

Previous mistakes

It would not be the first time that axions have tempted physicists into believing in their existence. In 2006 there was hope of experimental proof from an Italian experiment called PVLAS, which registered a slight change in the polarization of a laser beam as it passed through a magnetic field in a vacuum. The PVLAS researchers thought the change could have resulted from some of the laser beam’s photons converting into axions in the magnetic field. However, they later found the signal to be an experimental artefact.

The only course of action for Fairbairn’s group now is to await more data from other cosmic ray searches, such as the Pierre Auger Observatory in Argentina. In the meantime, they can still hope a lab-based experiment will see evidence for the same type of axion.

Konstantin Zioutas of CERN is spokesperson for CAST, the only existing axion experiment with the potential sensitivity to search for Fairbairn and colleagues’ axion. He says that to begin looking, CAST would require an upgrade, which is “within reach” in a few years. “If this idea is definitely confirmed, it will be a breakthrough for cosmology and particle physics alike,” he says.

Another department bites the dust?

By Hamish Johnston

If you’re not a regular reader of the University of Idaho Argonaut student newspaper, you may have missed this article about the possibility that the university’s undergraduate physics programme may be axed.

It seems that physics is one of 41 programmes identified for a possible chop by the university’s Program Prioritization Process (PPP) — which was initiated in 2005 to “increase the overall financial and academic efficiency of the university”, according to the Argonaut.

Bizarrely, the article suggests that the PPP plan involves getting rid of undergraduate physics in order to strengthen the graduate physics offering. I’m guessing that this is a way of trying to hold on to physics faculty members once the axe has fallen on undergraduates.

The Argonaut quotes physics undergraduate Alex Natale as saying “I don’t know how they could cut physics from the College of Science and still be the College of Science”.

I don’t know either…ironically, the University of Idaho’s sports teams are called the Vandals — and it seems they are not the only ones causing damage in Idaho.

The science of fine art

When I first saw the painting of an Elizabethan woman — thought possibly to be a portrait of Queen Elizabeth I herself — it was split completely down the middle, with paint flakes hanging off like an outcrop and its two halves curved like a shield. The heating system in the National Trust-owned house in the UK where it was on display behaved erratically in the winter, and the relative humidity had dropped dramatically. The 5 mm thick painted wood panel responded by warping so severely that its frame eventually restrained it, forcing the panel to crack under the pressure.

Conservators and conservation scientists play a key role in physically preserving important parts of our cultural heritage. With the Elizabethan painting, our team’s remit here at the Courtauld Institute of Art in London was to repair the split, find out more about the painting’s provenance, understand its environmental response and provide a suitable mount to protect it from future damage. To do this, we carefully realigned the two halves of the panel and rejoined them with a polyvinyl-acetate adhesive, taking care not to lose the flakes of paint clinging precariously to each side. A surface fill of chalk and gelatine covered the join, which was then retouched using a hydrocarbon compound and dry pigments, before finally being varnished.

We monitored the movement of the panel by simply marking out its profile on graph paper and found it responded almost immediately to small changes (5%) in relative humidity. We were able to slow down this response by applying a coating of ethylene vinyl acetate to the back of the painting, building a flexible support for it and placing it in a sealed, glazed frame. It has now been returned to Trerice in Cornwall, where it is again on display.

Bringing art and physics together

I was first attracted to conservation science by the opportunity to work hands-on with fascinating and often beautiful objects of cultural heritage — each presenting a plethora of interesting and demanding problems for scientists as well as art historians or curators. The complexity and individuality of each object requires us to understand and integrate ideas from many fields. To conserve an object, many different factors must be considered, including its aesthetic, provenance and history; the artist’s intent, choice of materials and original technique; and the object’s physical condition.

I studied both art and science at A-level, and I always assumed they were facets of the same universe. I chose to study physics at Imperial College London because I liked the philosophical as well as the mathematical aspects of the subject. After I finished my undergraduate degree, I decided to do a Master’s degree in applied optics at Reading University because it brought together many of my interests, from the theory of colour and vision to the creativity of designing experiments and specialized lenses.

After I left university and started working on optical-systems design at the Rutherford Appleton Laboratory and then at Chelsea Instruments, I saw an advert for a postgraduate conservation course, and really became aware that I could combine science and art. However, funding was not available at the time so instead I started a PhD in mechanical engineering at Imperial, researching techniques used to look for defects in ceramic tiles. I then realized these same techniques could be applied to non-destructive testing of works of art, so I contacted the scientific departments of the National Gallery in London and the Tate to find out more about the problems encountered in paintings. Seeing the work they did made me decide that this was the area I wanted to work in, so I persuaded my supervisor to let me change my PhD topic to cover the physical properties of canvas paintings.

I soon found out that there are numerous ways in which to apply my physics knowledge to this field. For the past 16 years I have been using a technique called electronic speckle pattern interferometry, which employs lasers and interferometric imaging to measure the strain induced in paintings. I have also experimented with other methods like pulsed thermography and infrared optical coherence tomography to identify subsurface features and adhesion between layers in works of art, and used multispectral imaging to understand artists’ materials and techniques.

Patterns at work

I joined The Courtauld, which consists of the Courtauld Gallery and the Institute of Art, in 2000. The Gallery houses a famous collection of Impressionist and Post-Impressionist paintings (including works by Manet, Monet, Cezanne and Renoir) and the Institute of Art is a college of the University of London that specializes in art history and conservation. My research interest is in non-invasive techniques for measuring the physical condition of paintings, and developing methods for structural conservation treatments.

Working within an academic environment means that the flow of students, lectures and exams provides an overall structure to the year, but beyond that my day-to-day activities vary quite a lot. Our postgraduate students treat paintings starting in their first year, which requires a lot of studio supervision. So on an average afternoon, I might be recording an infrared image to check for any drawings underneath the paint, using ultraviolet light to identify retouching and varnish, or using a technique called energy-dispersive X-ray spectroscopy (EDX) to identify chemical elements in the paint layers. Equally, I could be removing a painting from its wooden support, mending tears, designing a mount for a panel painting, or undertaking an environmental survey at a historic house where paintings — like the Elizabethan portrait — are displayed.

Careers in conservation

Conservation science is a relatively small field, and there is a lot of collaboration between institutions and individuals. In general, paintings conservators work directly on an object; so depending on what is required, they may remove a degraded varnish, repair a tear in the canvas, fill and retouch where paint has flaked off, or, as with the Elizabethan painting, rejoin a wooden panel that has split. Conservation scientists, in contrast, usually take a more indirect approach. For example, we may analyse paintings using X-rays (see “Underneath the surface”), monitor the movement of a painting due to environmental changes with optical or mechanical techniques, or use small original samples or replicas to investigate new cleaning methods in a laboratory setting.

Research and practice are more closely interrelated than in many scientific fields, which makes the work very satisfying. I keep abreast of emerging techniques in applied physics and engineering, and I have long-term collaborations with conservators, scientists and engineers from several institutions, including the National Physical Laboratory, Imperial College London, the Tate and National galleries, and the Museum of Modern Art (MoMA) in New York.

For a physicist, there are many ways to work in the field either as a conservator or as a conservation scientist, or as both. The principal employers of conservation scientists are the scientific departments or preventive conservation sections of major museums and public collections in Europe and the US. Many come into the profession as I did, through doing a PhD at a university with external links with a museum. Scientists are sometimes employed at a junior level directly after completing undergraduate or postgraduate courses, without prior training in a conservation-related field; they then learn on the job.

There are also sometimes research-assistant posts in conservation-science research projects. At The Courtauld, we have recently had two projects: one investigating artists’ materials and techniques using microscopy, Raman spectroscopy and EDX spectroscopy; the other working with ultraviolet lasers to investigate their suitability for cleaning 19th-century paintings.

For those wanting to work hands-on with objects as conservators, a postgraduate training course is the recognized route. There are two- and three-year postgraduate courses available in many areas — including easel paintings, wall paintings, paper, objects, stained glass, preventive conservation and archaeology — for which a first degree in physics is appropriate. Usually, after finishing their postgraduate course, students work on short-term contracts to build up experience. Conservators are employed in museums and galleries or work privately in many countries. Finding permanent posts at institutions is more difficult, but most conservation-trained scientists do eventually find full-time work.

For those wanting to see if conservation might be for them, a good start would be to read the technical bulletins published by the National Gallery, British Museum or similar institutions outside the UK. Several museums and galleries also maintain good conservation webpages, and two conservation-community websites, icon.org.uk and iiconservation.org, contain a number of useful resources. The most important step, however, is to go to museums, galleries and cultural-heritage sites to look at things and find out what really interests you.
