
Subverting science

By Matin Durrani in Perth, Australia

Donna Franklin’s Fibre reactive hybrid dress, 2004–2008

If you think Australia is remote from the rest of the world, well, the city of Perth is even more cut-off, being a five-hour flight from Sydney and 1000 km from the next main centre of population.

That remoteness has engendered a kind of “wild west” spirit, where people have the time and space to think up radical ideas that might be more quickly dismissed in less isolated places.

That, at least, was the view of the Nobel-prize-winning microbiologist Barry Marshall from the University of Western Australia (UWA) – best known for proving that most ulcers are caused by certain bacteria called Helicobacter pylori. Marshall was talking to me over lunch at the university on the latest leg of my week-long fact-finding tour of Australian science that I’m on with three fellow European science journalists.

The subversive spirit was echoed after lunch by Ionat Zurr, an Israeli-born artist working at UWA within SymbioticA – a self-styled “artistic laboratory” within the School of Anatomy and Human Biology that describes itself as being “dedicated to the research, learning, critique and hands-on engagement with the life sciences”.

Unlike most art–science projects, which involve artists reinterpreting scientific ideas in an artistic form, the people at SymbioticA are getting their hands dirty by learning various experimental scientific techniques to create works of art.

Projects include making loudspeakers from bones, growing edible steak from artificial tissue, and (well before Lady Gaga had a similar idea) creating wearable dresses from fungus leaves (pictured above).

SymbioticA, which was founded in 2000 after fighting off a rival bid to buy a boring old confocal microscope, seeks to question the very nature of science, art and even life itself. It also wants to demystify and “democratize” the scientific laboratory.

The delightfully garrulous Zurr admitted that not everyone understands, or even approves of, what she and her colleagues are trying to do. What, you may ask, is the point of designing jewellery made from pig wings grown from bone-marrow stem cells?

But the strong reaction of some scientists to SymbioticA surely shows that she and her fellow artists must be doing something right: after all, isn’t science itself about challenging orthodox thinking? Perhaps scientists are happy to be subversive but don’t like being subverted themselves.

As one of my fellow science journalists complained, shaking his head in derision as we left: “They should have bought that microscope.”

The real impact of impact

“Cool as a mountain stream” was the advertising slogan for a certain brand of cigarettes from my childhood. It was written and paid for by companies that probably had a pretty good idea what smoking actually did to their customers. Such advertising was also supplemented by intense government lobbying – itself based on highly selective “research” – to oppose possible moves to restrict the sale and consumption of tobacco. The reason for all this? To preserve the companies’ bottom lines and secure the salaries and share options of their management. But that – ranging all the way from slightly disingenuous image presentation to almost outright deceit – is, after all, what advertisers do.

Scientists, of course, are different. In our laboratories we sift data to differentiate between rival hypotheses. We seek to push forward the boundaries of what is known and understood. We publish our findings so that results can be compared and, where discrepancies appear, further hypotheses can be advanced. And so the cycle continues with everyone constantly striving to increase our understanding of the universe around us.

Scientists are trained not to be selective about data (at least not without good reason). Deleting or moving an “outlier” in a data set crosses an ethical boundary that the vast majority of scientists hold sacrosanct. There have been some exceptions – Jan Hendrik Schön, who was found guilty of 16 out of 24 charges of scientific misconduct in 2002, being a notable example in physics. However, once challenged, the scientific community tends to deal with the few cases of fraud rapidly and robustly.

The new problem, though, is that scientists – particularly those working in universities – are being increasingly urged to amplify the “impact” of their work. In many countries, including the UK, research funding is now becoming conditional on demonstrating impact criteria such as maximizing knowledge transfer to industry (to “secure our future prosperity”) as well as encouraging students to specialize in science. Giving an inspiring public lecture or publishing a paper reporting a genuine advance – the essence of research and teaching – is no longer enough. That everything now also has to have “impact” means that scientific results are distorted, the significance of outcomes is overblown, and the presentation of the research becomes at least as important as the science itself.

Energy and funding

Impact has manifested itself in the “publish or perish” culture. The increasingly competitive marketplace of academic publication has placed paramount importance on journals’ “impact factor” – the average number of citations received in a particular year by the papers that appeared in the journal over the previous two years. To enhance this, many journals have an unwritten policy of preferring papers that are believed likely to attract more citations, and papers are also carefully written to inflate their broad significance.
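As a back-of-envelope illustration (with entirely made-up numbers), the two-year impact factor is just a ratio:

```python
# Illustrative sketch of the two-year impact factor: the citations a
# journal receives in the target year to papers it published in the
# previous two years, divided by the number of those papers.
# All figures below are hypothetical.

def impact_factor(citations_in_year, papers_prev_two_years):
    """Average citations per paper over the two-year window."""
    return citations_in_year / papers_prev_two_years

# Hypothetical journal: 200 papers published over 2009-2010,
# cited 700 times in 2011.
ifactor = impact_factor(700, 200)
print(f"2011 impact factor: {ifactor:.1f}")  # 3.5
```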

But even this should be the least of our concerns. In the best journals, peer review is still at a high standard, and is normally performed objectively and honestly. Far more worrying is the funding environment in which all parties – the funding agencies, scientists and publishers – are complicit in cloaking scientific research with a far greater significance than can be objectively justified.

This is most obvious in the current focus on “energy” and the real danger of climate change. Governments have been persuaded to divert funding into energy research because it is a relatively inexpensive way of appearing to tackle the problem – at least in comparison with actually doing anything about it. Funding agencies then support it because the more funds they have to administer, the larger their operations become. Scientists also like it because we can use it to fund our research – most of which is only tenuously relevant to the reduction of energy consumption or its generation. Just as it has become possible to magically “offset” the carbon emissions associated with taking a flight or hiring a car, as a society we can offset our inactivity in actually reducing pollution by funding research that can be promoted as promising a greener future.

Of course, few research proposals actually promise perpetual-motion machines or teleportation devices. But they do promise to improve “efficiency” – the magic word for demonstrating the potential impact of such research. From new alloys for jet-engine turbine blades to LEDs for lighting, as less energy is required for a particular human activity, the new research is predicted to mean that a certain number of power stations can be shut down or so many million tonnes of carbon dioxide can be saved. But there is a catch. To governments and businesses, improved efficiency does not mean reducing use but rather the opposite – it increases the practicality and/or affordability of consumption. In the end, such developments undermine the stated impact of the research: to reduce energy consumption.

Flatter the better?

Take flat-screen televisions as an example (liquid crystal, plasma, organic LED or whatever). These devices are probably intrinsically more efficient than the cathode-ray tube (CRT) that they have superseded – provided one compares the power consumption for the same screen size. However, one only has to visit an electronics retailer to realize that “efficiency” in this context is a meaningless term – modern televisions are simply enormous in comparison with their older counterparts. A quick visit to my local electronics store revealed screen sizes of up to 60 inches, and the power consumption of these mega-screens is correspondingly huge – more than 500 W in several cases, compared with a CRT television that requires about 50 W.

The key technological advance from the point of view of the manufacturers is not energy consumption but rather screen thickness. The depth of the enclosure of the cathode-ray tube had to scale with the width of the screen because the electron beam can only be deflected through a limited angle. However, a flat-screen television has no such restrictions and could cover most of a living room wall while still leaving room for the sofa. So, if we are being really honest, an accurate summary of the “impact” of the physics-led development of display technologies is that it has enabled televisions to have a much larger power consumption – an outcome that is unlikely ever to have appeared in a funding impact statement.

Another frequently trumpeted outcome of research is that a technology could become cheaper to manufacture. In the case of televisions, they are now cheap by historical standards, which has enabled people to purchase multiple sets and install them in more rooms in their homes, increasing power consumption still further. Indeed, this argument can be replicated across technology sectors: from colossally power-hungry data centres that support green-sounding “cloud computing” to cars where improvements in engine efficiency have been largely cancelled out by the increasing size and mass of vehicles. So much for tackling climate change.

The moral high ground

It is about time that the scientific community applied the same intellectual rigour to the context in which it operates as it is supposed to apply to its own research. In other words, we should be able to justify objectively not just the chain of reasoning that has led to a scientific advance, but also any statements made regarding the potential impact of the result.

However, scientists simply do not have the commercial expertise or political insight to be able to do this. The recent allegations of fraud in the presentation of climate data illustrate the grave dangers inherent in trying to maximize the political “impact” of research. In reality, science can only directly contribute to tackling the major problems confronting our countries such as climate change and economic decline through educating the next generation about the science underlying current and future problems. Only then will they be able to take informed decisions themselves. As scientists, we will lose what remains of our authority and the moral high ground in that education debate if we continue to abandon scientific standards of proof in favour of spin and deceit in chasing funding.

Flipping spins, one proton at a time

In a bid to better understand the inner workings of the proton, researchers in Germany have, for the first time, directly measured magnetic spin transitions of a single trapped proton – an important step towards pinning down the proton’s magnetic properties. The technique could also be used to measure the spin of an antiproton, which could help us understand why the universe has more matter than antimatter.

The proton has an intrinsic angular momentum, or spin, and behaves like a tiny bar magnet that can point up or down. The spin of a single proton had not been measured until now because the magnetic moment of the proton is much smaller, and hence more difficult to detect, than that of the electron or the positron. Previous measurements were made on clouds of protons – an approach that cannot be repeated on the much scarcer antiprotons. The new method instead measures the spin of a single proton produced and held in a special trap, which is capable of storing protons for months.

The researchers, based at Johannes Gutenberg University and the Helmholtz Institute in Mainz, Germany, together with colleagues from the Max Planck Institute for Nuclear Physics in Heidelberg and the GSI Helmholtz Center for Heavy Ion Research in Darmstadt, Germany, spent close to five years working on their experiment. One of the main developments was a specially designed miniature Penning trap – a vacuum trap that uses electric and magnetic fields to hold particles. “Such an experiment is challenging and has to be set up with extreme care,” explains Stefan Ulmer, a team member from the Helmholtz Institute. “In the first two years we designed the cryogenic apparatus, the Penning traps and the highly sensitive superconducting detection systems. In the third year we got the apparatus running and succeeded in preparing a single proton. In the following two years we had to improve the apparatus and redesign some parts. Finally we succeeded after four and a half years in observing a single proton spin flip,” he said.

Wobbling protons

A proton in a Penning trap aligns its spin with the trap’s magnetic field. The team introduced an additional magnetic field, which creates a non-uniformity, and then switched on a radio-frequency (RF) signal that causes the spin to precess, or “wobble”, like a spinning top. The non-uniform field makes the frequency of the wobble depend on the direction of the spin. As a result, a spin flip shows up as a small, unobtrusively detectable change in that frequency, which can then be used to calculate the proton’s magnetic moment.
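For a rough sense of scale, the RF drive must match the proton’s spin-precession (Larmor) frequency, which scales linearly with the magnetic field. The field strength below is an assumed, illustrative value, not a figure from the paper:

```python
# Order-of-magnitude sketch of the proton's spin-precession (Larmor)
# frequency that the RF signal must match to drive a spin flip.
# The trap field value is an assumption chosen for illustration.

GAMMA_P = 42.577e6  # proton gyromagnetic ratio / 2*pi, in Hz per tesla

def larmor_frequency(b_field_tesla):
    """Spin-precession frequency of a proton in a field B."""
    return GAMMA_P * b_field_tesla

b = 1.9  # tesla -- assumed, Penning-trap-scale field
print(f"Larmor frequency at {b} T: {larmor_frequency(b)/1e6:.1f} MHz")
```

Even a small fractional change in this frequency is therefore a measurable signature of the spin direction.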

Because the proton’s magnetic moment is so small, the researchers have so far measured it only to a precision of one part in 10^4. “The aim is the measurement of the magnetic moment of the single proton with a precision of 10^–9, at least. We are right now working on the improvement of the apparatus to reach the 10^–9 level,” explains Ulmer.

Looking towards antimatter

In the near future, the researchers would like to apply their methods to measuring the magnetic moment of antiprotons – the antimatter counterparts of the proton. This would be carried out at research facilities where low-energy antiprotons (5.3 MeV) are produced, such as the CERN Antiproton Decelerator (AD). “If the proton is measured, it will be possible to measure the antiproton. Currently there is only one facility worldwide which delivers antiprotons – CERN AD. Lots of groups want to have access, so it is a question of how it can be organized to get access. Such antiprotons are needed to perform high-precision low-energy experiments with antimatter. The antiprotons from CERN AD would be decelerated and stored in our Penning trap,” says Ulmer. Another facility in Germany, the Facility for Low-Energy Antiproton and Ion Research (FLAIR), has been set up to provide low-energy antiprotons, but it will be quite some time before FLAIR is operational.

Jeffrey Hangst, who works on the ALPHA experiment at the CERN AD, says: “This is a very difficult and elegant experiment; I am very pleased that they have achieved this important milestone. We should put matter and antimatter under the microscope when we have a chance to do a beautiful experiment like this one.” Even so, making magnetic measurements of the antiproton will be difficult, as trapping times are not yet long enough. Currently, the magnetic moment of the antiproton is known only to three decimal places. The team hopes that its method will change this and allow high-precision comparisons of the fundamental properties of particles and antiparticles, making it possible to determine accurately whether CPT symmetry actually holds – and maybe provide the basis for theories that extend beyond the Standard Model.

The research was published in Physical Review Letters.

UK research chief defends ‘blacklisting’

The head of the UK’s biggest science funding agency has defended its controversial decision to restrict how often scientists can apply for grants from it. David Delpy, chief executive of the Engineering and Physical Sciences Research Council (EPSRC), says “there is no evidence” that the restriction has led to less creative grant applications or that it is penalizing young researchers.

Delpy was speaking in an exclusive video interview with Physics World, in which he also criticizes the UK government’s recent tightening of the rules on who can work and study in the country, saying it would “damage the UK’s standing in research”. Delpy, a medical physicist who has been EPSRC boss for almost four years, was last month reappointed by the UK science minister David Willetts until March 2014.

The EPSRC introduced its new restrictions in 2009 as a way of coping with the fact that the council was receiving so many more grant applications than it could afford to fund that in some disciplines up to 85% were getting rejected. The low success rate was also demoralizing for researchers, many of whom found that applications that had been rated highly by peer-review panels were nevertheless getting turned down. Delpy adds that the deluge of applications was even making it hard for the council to find enough referees to conduct the peer review.

Tightening up

As well as preventing researchers from resubmitting grants that have previously been turned down, the new rules now dictate that anyone with three rejected and below-par applications in any two-year period can make only one further application in the following year. “Having imposed this on the community, success rates have gone up to 30% because the number of applications has dropped [and] we are seeing an increase in the number of referee responses,” claims Delpy.

The EPSRC boss also denies that forcing applicants to explain the “impact” of their work in grant applications is hampering genuine breakthroughs. “We’re not asking academics to predict the full detailed outcome of their research,” insists Delpy. “What we’re saying is start thinking about how you would maximize the impact of that research, whether it’s within our discipline or in the form of policy advice to government or a product that could be marketed – [and to] ask in your application for the resources to do that.” Indeed, Delpy thinks that UK taxpayers have “the right” to know from scientists how the £800m that the EPSRC spends on research each year benefits the UK financially and socially.

“Unfair” changes

However, Philip Moriarty, a condensed-matter physicist at the University of Nottingham, thinks the EPSRC’s “blacklisting” rules are unfair. “I’m an ardent supporter of peer review but the idea that there’s a direct correlation between proposal quality and where it’s ranked on a panel prioritization list is nonsense,” he says. “If there are 40 grants for review by a panel, the top five and the bottom five will probably be relatively easy to define, but the remaining 30 in many cases may as well be ranked on the basis of throwing them up in the air and seeing where they land.”

Moriarty says he wants the EPSRC to run what he calls “a simple experiment” to test the council’s assumption that grants falling in the bottom half of a ranked list are necessarily of poor quality. “They should take the same set of proposals, send them out to different referees, and then give them to five different panels [and] look at the correlation in the ranked lists,” he says. “If they are so confident that the principle underlying the blacklisting process is robust, then why not do this experiment? It would silence me and all the other critics of the scheme.”

Looking for leaders

On the UK’s tightening of visa rules, which was heavily criticized for potentially stopping talented researchers from moving to the country, Delpy says he is “very pleased” that the government responded by making it easier for scientists with PhDs to meet the tougher requirements. But he warns that any barrier to the free movement of scientists is a bad idea. “Research is an internationally competitive game [and] if the UK wants to maintain its [leading] position any restriction of movement across international boundaries will work to the detriment of that – not just in physics but in all areas of research.”

Elsewhere in the interview, Delpy says the EPSRC will be encouraging young researchers with bright ideas by focusing over the next four years on “scientific leaders” who are either already running large teams or are prominent in their fields. “[We will] provide them with funding not just in the form of a one-off grant or fellowship but by providing funding across the whole of their research career – to provide mentorship and additional training in the skills that you need to be a research leader and internationally competitive,” he says.

However, Moriarty regards the EPSRC’s decision to focus on “leaders” as “fundamentally flawed”. “In just what sense is focusing on leadership supporting new or young lecturers?” he asks. “One of the great aspects of the UK funding system for many years – and certainly something that kept me here when I had the opportunity to go back to Dublin in the boom years of Science Foundation Ireland – was the ability to establish an independent research career much more quickly than other places in Europe.”


Miracle or no miracle?

By Matin Durrani, Sydney, Australia


Australia is a big country but, as far as science is concerned, it does just about as much as one might expect for a country with a population of just 23 million.

But according to Thomas Barlow, a former academic and journalist who is now a kind of freelance policy wonk and science adviser, Australians are far too pessimistic about their scientific future. The received wisdom is that while Australians are an inherently inventive people, they are less good – or so the thinking goes – at capitalizing on their smart ideas.

I met Barlow yesterday during my visit to Sydney at the home of Peter Pockley – a veteran Australian science journalist and broadcaster who also regularly reports on Australian science for Physics World.

Barlow has outlined his thoughts in his 2006 book The Australian Miracle, which offers an honest, well written and sober perspective on Australian science.

It’s worth reading if you’re at all interested in the country’s science and Barlow has as good a perspective as any – he’s married to Michelle Simmons, who’s a leading quantum-computing physicist at the University of New South Wales and who directs a successful national Centre for Quantum Computer Technology funded by the Australian Research Council.

Still, I just can’t get away from the nagging feeling that Australia, being physically so far removed from the US, Europe, China, Japan and other centres of power in global science, is destined to always remain one step behind the rest of the world.

In his book, Barlow denies that there is a brain drain of talent from Australia, which may be true. But unless there is a steady flow of people and ideas in and out of the country, true innovation may struggle. And being so far away, that flow is simply hard to sustain.

MINOS confirms muon-to-electron neutrino oscillation

Neutrino mixing angles from T2K and MINOS

By Hamish Johnston

Less than a fortnight ago we brought you news that the T2K experiment in Japan has caught the first glimpse of muon neutrinos changing (or oscillating) into electron neutrinos as they travel 300 km under Japan.

Now researchers on the MINOS experiment in the US have seen the same neutrino oscillation. The MINOS physicists sent a beam of neutrinos more than 700 km through the Earth from Fermilab near Chicago to the Soudan Underground Laboratory in Minnesota. At Soudan, the team detected a total of 62 electron neutrinos – 13 more than it would have seen if no muon neutrinos had changed into electron neutrinos.

Neutrinos exist in three “flavours” – muon, electron and tau – and change, or “oscillate”, from one flavour to another as they travel through space. The strength of the oscillation between different types of neutrino is characterized by three “mixing angles” – known as theta-12, theta-23 and theta-13. Theta-12 and theta-23 have already been measured, but pinning down theta-13 requires observing the muon-to-electron neutrino oscillation.
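The role of the mixing angle can be sketched with the standard two-flavour approximation for the appearance probability; the parameter values below are purely illustrative, not the measured T2K or MINOS results:

```python
import math

# Minimal two-flavour sketch of the appearance probability that such
# experiments fit to extract theta-13. All parameter values here are
# assumed for illustration only.

def appearance_probability(sin2_2theta13, dm2_ev2, length_km, energy_gev):
    """P(nu_mu -> nu_e) in the two-flavour approximation:
    sin^2(2*theta13) * sin^2(1.27 * dm^2 * L / E),
    with dm^2 in eV^2, L in km and E in GeV."""
    phase = 1.27 * dm2_ev2 * length_km / energy_gev
    return sin2_2theta13 * math.sin(phase) ** 2

# Assumed T2K-like baseline and beam energy, with sin^2(2*theta13) = 0.1
p = appearance_probability(0.1, 2.4e-3, 295, 0.6)
print(f"Oscillation probability: {p:.3f}")
```

With these assumed inputs the appearance probability comes out at roughly 10%, which shows why large event samples are needed before theta-13 can be pinned down.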

More data are required before theta-13 can be nailed down – you can see the uncertainties in the T2K and MINOS results in the diagram. Then physicists will try to measure the same quantity for anti-muon and anti-electron neutrinos. Comparing the two angles could help physicists understand why there is much more matter than antimatter in the universe.

You can read Fermilab’s announcement here.

Los Alamos National Lab narrowly avoids fire


Los Alamos National Laboratory (LANL) appears to have escaped unscathed after a large wildfire threatened it yesterday. At one point the blaze triggered a small “spot fire” within the lab’s boundaries, which has since been controlled by emergency services.

The Las Conchas fire began on Sunday afternoon in the Jemez Mountains approximately 19 km south-west of the boundary of the lab in New Mexico. By Sunday evening, as the fire began to spread, LANL announced that all facilities would be closed on Monday, and non-essential employees were directed to remain off-site.

Late on Sunday evening, the fire was reported to be less than 1.6 km from the lab’s south-western boundaries, as LANL emergency crews teamed up with Los Alamos County and federal fire crews to tackle the blaze. “I’m asking all our employees to stay clear of the lab so the fire crews can do their jobs,” said lab director Charles McMillan on Sunday night. “And please keep those crews in your thoughts tonight.”

The situation appeared to be escalating by mid-afternoon on Monday as a one-acre spot fire had been identified within a technical area on the lab’s south-western boundary. Reports from the field said that the fire had jumped north across New Mexico Route 4, and Los Alamos County conducted a mandatory evacuation of the townsite. LANL emergency officials announced that the lab would remain closed today (Tuesday) as they continued to fight the fire.

‘No facilities face immediate threat’

But by 16:45 local time (23:45 GMT) officials were able to breathe a heavy sigh of relief as they announced that the spot fire was under control after air crews had dumped water at the site. “About one acre burned and the lab has detected no off-site releases of contamination,” read the update from the LANL Emergency Operations Center. “No other fires are currently burning on lab property, no facilities face immediate threat, and all nuclear and hazardous materials are accounted for and protected.”

There had been fears because the area under threat in this latest fire – Technical Area 49 – had been the site of underground hydronuclear experiments in the early 1960s. But subsequent testing has revealed that no contamination exists today at points of public access.

The lab’s latest update at 22:00 stated that important lessons had been learned from the Cerro Grande fire of 2000, which caused damage to lab buildings and employees’ homes. “Our efforts in recent years to thin ground fuels around the laboratory, coupled with the reduction in fuels caused by historic fires in the area, are helping protect the Laboratory and townsite,” said McMillan.

Having been established in 1943 to develop the first nuclear weapons, LANL now has more than 2000 individual facilities covering a wide variety of research, including energy and the environment. The 93-square-kilometre site, owned by the US Department of Energy, has nearly 12,000 employees and its operating costs for 2010 were $2bn.

Famous black hole confirmed after 40 years

Using a vast array of radio telescopes, astronomers in North America have made the first direct measurement of the distance to Cygnus X-1, allowing them to conclude that the mass of its dark star is so great it can only be a black hole. They have also discovered that the black hole spins faster than most of its peers.

“There’s no doubt about its distance now, and there’s not much uncertainty anymore about its mass,” says Mark Reid of the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts. “It’s definitely a black hole.”

A black hole is a star that has run out of fuel and died, collapsing into a small body with such enormous gravity that nothing escapes its grip. First identified as harbouring a possible black hole in 1971, Cygnus X-1 was one of the first sources of X-rays discovered by astronomers. It is found in the constellation Cygnus the Swan, also known as the Northern Cross, and is one of the most studied objects in the sky. It even inspired a 10-minute song by the Canadian rock band Rush about how the stars of the Northern Cross were “in mourning for their sister’s loss”.

A neutron star instead?

However, some scientists were sceptical of its black-hole status, and in 1974 Stephen Hawking bet Caltech’s Kip Thorne that Cygnus X-1 did not harbour a black hole. Instead, the dark object might be a neutron star, a less extreme type of dead star. The key to the controversy was a mundane quantity: the system’s distance from Earth.

The dark star in Cygnus X-1 orbits a hot blue star every 5.6 days. But without knowing the system’s distance from us, no-one could say how much light the blue star emits. The closer Cygnus X-1 is to us, the less powerful this star must be, and therefore the less massive. And the less massive the blue star, the less massive the dark companion whose gravity tugs on it. If the dark star has less than about three times the Sun’s mass, it could be a neutron star rather than a black hole.

Recent distance estimates have favoured a higher mass – Hawking conceded defeat two decades ago – but these have been indirect. The best way to measure distance is through parallax – the small shift in a star’s apparent position that results as we view it from different perspectives while Earth goes around the Sun. But Cygnus X-1 is so distant that optical astronomers can’t measure its tiny parallax.
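The parallax method itself amounts to a one-line calculation: the distance in parsecs is the reciprocal of the parallax angle in arcseconds. A minimal sketch, using a parallax value chosen to be consistent with a roughly 6050-light-year distance rather than a figure taken from the paper:

```python
# Hedged sketch of the parallax-distance relation that the VLBA
# measurement relies on. The parallax value below is an assumption,
# back-derived from a ~6050-light-year distance.

LY_PER_PARSEC = 3.2616  # light-years in one parsec

def distance_light_years(parallax_arcsec):
    """Distance implied by an annual parallax angle:
    d [pc] = 1 / p [arcsec], converted to light-years."""
    return (1.0 / parallax_arcsec) * LY_PER_PARSEC

p_mas = 0.539  # milliarcseconds (assumed)
d = distance_light_years(p_mas / 1000.0)
print(f"Distance: {d:.0f} light-years")
```

The sub-milliarcsecond angle involved shows why only a continent-spanning radio array, and not an optical telescope, could make the measurement.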

Huge array of telescopes

Fortunately, Cygnus X-1 emits radio waves, so Reid and his colleagues took aim at the object with the Very Long Baseline Array (VLBA), which consists of ten 25 m radio telescopes scattered from New England and the Virgin Islands to California and Hawaii. This huge array measures positions 100 times more precisely than the Hubble Space Telescope.

“Cygnus X-1 produced beautiful data,” says Reid, “and we were able to get a very accurate distance.” The parallax indicates that Cygnus X-1 is 6050 light-years from Earth, with an uncertainty of just 400 light-years. From this the astronomers deduce that the dark star is 14.8 times more massive than the Sun; the uncertainty is just one solar mass, so the object is far above the dividing line between neutron stars and black holes. The blue star it orbits is even more massive, at about 19 solar masses.

“The radio estimate of the parallax is a wonderful achievement,” says Douglas Gies, an astronomer at Georgia State University in Atlanta who is not affiliated with the research team. “It is an extraordinary result.”

Spinning rapidly

The researchers also found that the black hole spins at 97% of its maximum possible speed. They deduce this by observing X-rays from a disc of hot gas that whirls around the black hole – gas that the black hole has torn from its unfortunate partner.

The general theory of relativity says that the faster a black hole spins, the closer an object can circle it on a stable orbit. The part of the gaseous disc closest to the black hole is also the hottest. For Cygnus X-1, the inner edge is so hot that it must lie very close to the black hole, so the black hole must be spinning fast. The gas at the disc’s inner edge revolves at half the speed of light, completing 670 orbits every second.
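As a flat-space sanity check (the real analysis is fully relativistic), those two numbers pin down the orbital radius, which can be compared with the innermost stable circular orbit (ISCO) of a non-spinning black hole of the same mass:

```python
import math

# Back-of-envelope check using flat-space arithmetic only: a speed of
# c/2 at 670 orbits per second implies an orbital radius r = v/(2*pi*f),
# which we compare with the zero-spin ISCO, 6GM/c^2, for 14.8 solar
# masses. Numbers are rounded, order-of-magnitude values.

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8       # speed of light, m/s
M_SUN = 1.989e30  # solar mass, kg

m = 14.8 * M_SUN
r_orbit = (C / 2) / (2 * math.pi * 670)  # implied orbital radius
r_isco_schw = 6 * G * m / C**2           # ISCO for a non-spinning hole

print(f"Implied orbit radius: {r_orbit/1000:.0f} km")
print(f"Zero-spin ISCO:       {r_isco_schw/1000:.0f} km")
```

The implied orbit (a few tens of kilometres) lies well inside the zero-spin ISCO (about 130 km), which is only possible if the hole spins rapidly: for a near-maximally spinning black hole the prograde ISCO shrinks towards GM/c².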

The astronomers have submitted three papers to The Astrophysical Journal, one on the distance, one on the mass, and one on the spin. Preprints are available on arXiv.

Avoiding the grump

Phil Diamond

By Matin Durrani, Marsfield, Australia

There’s someone who says he’s going to be “in a grump” if the Square Kilometre Array (SKA) is not built in Australia. That’s Phil Diamond (above), head of astronomy and space science at the country’s Commonwealth Scientific and Industrial Research Organisation (CSIRO).

Diamond took up his post last year after moving from the University of Manchester in the UK, where he was director of the Jodrell Bank Centre for Astrophysics and co-ordinator of PrepSKA – the preparatory study for SKA itself.

I caught up with Diamond earlier today at CSIRO’s radiophysics laboratory at Marsfield, about 20 km north-west of Sydney, where I’m on a fact-finding tour of Australian science with three other European science journalists. The newly appointed CSIRO chief is obviously keen for SKA to be built in Australia, having upped sticks from the UK, but unfortunately the Australian plan is faced with a rival bid from various nations in southern Africa.

Both bidders are planning to construct an array of some 3000 radio telescopes. In the Australian case, about a third of the dishes would be located in an area about 5 km across in the remote outback in the west of the country, with about half distributed over another 180 km and the remainder spread over several thousand kilometres (including some as far away as New Zealand).

Given that the smallest angle a telescope can resolve is inversely proportional to its diameter, spreading lots of dishes far apart means that SKA will have a really high “resolving power”.
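That scaling is just the diffraction limit, theta ≈ lambda/D. A hedged sketch with illustrative numbers – the 21 cm hydrogen line over a several-thousand-kilometre baseline, not actual SKA specifications:

```python
import math

# Rough sketch of the diffraction-limit argument: the smallest
# resolvable angle is about wavelength / baseline. The wavelength
# (21 cm hydrogen line) and 3000 km baseline are assumed values
# chosen only to illustrate the scale.

ARCSEC_PER_RAD = 180.0 / math.pi * 3600.0

def resolution_arcsec(wavelength_m, baseline_m):
    """Diffraction-limited angular resolution, theta ~ lambda / D."""
    return (wavelength_m / baseline_m) * ARCSEC_PER_RAD

theta = resolution_arcsec(0.21, 3.0e6)  # 21 cm over a 3000 km baseline
print(f"Resolution: {theta*1000:.1f} milliarcseconds")
```

A resolution of a few tens of milliarcseconds from dishes thousands of kilometres apart is the payoff for spreading the array so widely.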

And the big advantage of locating SKA in the outback is that there will be hardly any radio interference from mobile phones, power lines or other effects of modern civilization. That’s because almost no-one lives there: the shire of Murchison, where the central SKA core will be located, has a population of just 110 spread over an area that’s 20% bigger than the whole of the Netherlands. And the lack of interference is essential given that the radio emissions that SKA’s interested in are so weak that, says Diamond, it’s like having to detect the signal from an airport radar located 50 light-years away from Earth.

So the Australians think they have quite a strong case, but no doubt the Africans do too and the final decision will be made on 29 February 2012 by the international astronomy community spearheaded by the SKA project office, which is based in Diamond’s old haunt of Jodrell Bank. Not that anyone is suggesting any bias of course.

One thing both bids are having to deal with is the huge amount of information spewing out from the array – with 2 terabits of data per second from each dish, we’re talking the equivalent of a kilometre-high stack of CDs every minute. That information has to be filtered and then sent down high-performance optical cables to a central data centre.

And the point of the project? Oh, just the small matter of finding out how the first black holes and stars formed, how galaxies evolve, the nature of dark energy, the origin of cosmic magnetic fields, the nature of gravity under extreme conditions and possibly even whether we are alone in the universe.

Plus whoever wins will have the world’s astronomers knocking on their doors. So you can see why Diamond will be in a grump if it doesn’t work out for Australia.

Copyright © 2026 by IOP Publishing Ltd and individual contributors