
The strange case of Lord Monckton

By Matin Durrani in Perth, Australia


If there’s one issue dominating Australian politics right now, it’s the proposed tax on emissions of greenhouse gases.

It seems a genuine attempt to encourage Australia – probably the world’s highest per capita emitter of carbon dioxide – to slow down or halt its growth in emissions.

Unfortunately, the climate-change debate in Australia is lagging well behind that in the rest of the world, with the media giving way too much attention to those “climate sceptics” who remain unconvinced that rising levels of greenhouse gases in the atmosphere are changing the Earth’s climate.

We’re not talking about providing airtime to the relatively small band of genuine scientists who are questioning particular aspects of the scientific evidence for climate change, based on a thorough knowledge of the relevant research.

Instead, the Australian media seems to be focusing on one character in particular: a certain Lord Monckton, deputy leader of the UK Independence Party.

This is the man who, apart from claiming that global warming stopped in 2001, likened the Australian federal government’s chief climate-change adviser Ross Garnaut of the University of Melbourne to a Nazi for his views on global warming, a below-the-belt accusation for which he was forced to apologize earlier this week.

Yesterday, however, as I was nearing the end of my week-long fact-finding tour of Australian science, who else should turn up in Perth but Monckton himself.

He was invited to deliver the Lang Hancock Lecture at Notre Dame University in Fremantle, just south of Perth, on Thursday night. Unfortunately, the university press office declined a request to attend the event put in by one of the other journalists on my tour.

Monckton’s visit had already caused a fair bit of noise, including a formal complaint from more than 50 Australian scientists, who called for the lecture to be cancelled. Despite the protests, the lecture went ahead as planned, as did a separate talk at the Association of Mining and Exploration Companies annual convention, entitled “Math Lessons for Climate-Crazed Lawmakers”.

To me there’s no point calling for Monckton’s views to be stifled: doing so only adds to his martyr status and makes it appear that climate scientists have something to hide and are too scared to see the topic debated out in the open.

What’s needed instead is a careful unpicking of his main points, such as those offered here.

That was certainly the view taken a few years ago in the UK by the likes of former science adviser David King. It’s the path that Australia needs to go down too.

But the controversy has not yet run its course. Monckton is also due to speak on 4 July in the chemistry department at the University of Western Australia. However, UWA vice-chancellor Alan Robson, whom I met for lunch today, insisted that the talk has not been endorsed by the university but was organized by a local community group that merely chose to use the department as a venue.

The row looks set to go on.

What’s going on with the Sun?

Earlier this month a lot of column inches were devoted to the news that the Sun continues to behave in a peculiar manner – and that solar activity could be about to enter a period of extended calm. The story emerged after three groups of researchers presented independent studies at the annual meeting of the Solar Physics Division of the American Astronomical Society, which appear to support this theory. But are the new findings really that clear-cut and what implications do they have for the climate here on Earth? Physicsworld.com addresses some of the issues.

Why the recent interest in the Sun’s activities?

Solar physicists agree that the Sun has been acting strangely of late. The strangeness relates to apparent abnormalities in the solar cycle – an approximately 11-year period during which the Sun’s magnetic activity oscillates from low strength to high strength, then back again. When the Sun’s magnetic activity is low, during a solar minimum, its surface remains relatively quiet, which leads to fewer sunspots. Then, as magnetic activity begins to increase, the surface becomes more dynamic and sunspot numbers begin to rise in the lead-up to a solar maximum.

But following the last solar minimum in 2006, solar physicists were surprised to observe that sunspot numbers were unusually slow to pick up. This led some to suggest that the next solar maximum, due in 2013, could be late and weaker than usual. Some see this as a sign that solar magnetic activity is slowing down and the Sun may be about to head into a prolonged period of magnetic weakness. Some have speculated that a weakened Sun could offset some of the effects of man-made global warming, or even counteract it entirely.

What was presented at the recent AAS meeting in New Mexico?

In one paper, Frank Hill of the National Solar Observatory (NSO) and his colleagues argue that because a specific plasma flow beneath the surface of the Sun has failed to appear during the present solar cycle, the next cycle could be delayed. Hill and his colleagues identified the east–west flow – known as “torsional oscillation” – using data from the Global Oscillation Network Group (GONG). They believe that the migration of this flow from mid-latitudes to the equator is a precursor to new sunspot formation in each cycle. Because the flow has yet to appear during the present cycle, the researchers argue that the next cycle could be postponed to 2021 or 2022, or may not happen at all.

In a second paper, Richard Altrock of the US Air Force Research Laboratory describes how a process known as the “rush to the poles” appears to be slowing down. In this phenomenon, older magnetic activity is pushed to higher latitudes during each new cycle as fresh magnetic activity emerges at about 70 degrees latitude. Using data from the NSO’s 40 cm coronagraphic telescope, Altrock has observed that this rush has been more like a crawl during the present cycle. For this reason, he believes that we’ll see a very weak solar maximum in 2013 – and if the rush to the poles fails to complete, it is not clear how the Sun will respond.

In a final paper, Matt Penn and William Livingston of the NSO in Tucson look more specifically at the nature of sunspots during the two most recent cycles. The magnetic field associated with sunspots is typically 2500–3500 gauss, but Penn and Livingston believe that the field strength has been declining of late. Using more than 13 years of data collected at the McMath–Pierce Telescope at Kitt Peak in Arizona, the researchers found that the average field strength dropped by roughly 50 gauss per year during the previous cycle, a trend that has continued into the present one.
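To see why that trend caused a stir, a back-of-envelope extrapolation helps. The sketch below assumes a present-day mean sunspot field of about 2000 gauss and a threshold of about 1500 gauss below which spots reportedly fail to darken – both figures are illustrative assumptions, not numbers taken from the paper:

```python
# Back-of-envelope extrapolation of the declining sunspot field strength.
# MEAN_FIELD_G and THRESHOLD_G are assumed, illustrative values; only the
# ~50 gauss-per-year decline comes from the reported trend.

MEAN_FIELD_G = 2000.0      # assumed present-day average sunspot field (gauss)
DECLINE_G_PER_YR = 50.0    # reported average decline (gauss per year)
THRESHOLD_G = 1500.0       # assumed threshold below which spots fade (gauss)

years_left = (MEAN_FIELD_G - THRESHOLD_G) / DECLINE_G_PER_YR
print(f"Sunspots would fade in roughly {years_left:.0f} years "
      "if the linear trend continued.")   # ~10 years
```

On these assumed numbers, sunspots would all but vanish within about a decade – which explains the attention the result has received.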

Has the Sun gone through quiet spells before?

The solar cycle itself was recognized in the mid-19th century, and scientists have been able to reconstruct solar cycles back to the beginning of the 17th century based on historic observations of sunspot numbers. (Some researchers have even attempted to catalogue earlier solar cycles based on indirect evidence of sunspot activity.) The first thing to say is that although solar activity has consistently oscillated over an approximately 11-year period, the timings and characteristics of each cycle are far from exact, and new cycles have been late to arrive in the past.

Solar physicists do agree, however, that there was a 70-year stretch beginning in 1645 when the Sun remained in an extended period of calm, referred to as the Maunder minimum. This period coincided with the “Little Ice Age”, during which parts of the world, including Europe and North America, experienced colder winters and increased glaciation. There was another, shorter minimum from about 1790 to 1830, known as the Dalton minimum.

So could we be heading for another Little Ice Age?

There are many uncertainties surrounding this question. Firstly, as explained in the previous answer, it is far from clear whether the Sun is headed for another period of calm. Recent research in the UK puts the chance of a return to Maunder-minimum conditions over the next 40 years at 8%, based on the Sun’s behaviour over the last 9000 years.

Secondly, there are still debates over the details of the Little Ice Age and the role played by the Maunder minimum. In Europe there were considerably more cold winters in this interval, but they were not unrelentingly cold as they would be in a true ice age. Also, the Earth’s climate is evidently a highly complicated system, involving interconnected feedbacks, so it is difficult to disentangle causes and effects. For instance, several recent studies have suggested that solar-induced changes to the jet stream in the northern hemisphere may cause colder winters in Europe, but that these would be offset by milder winters in Greenland.

Finally, even if the Sun were to head into a quiet period, others argue that the reduction in solar irradiance on Earth would still be small compared with the heating caused by man-made global warming. Mike Lockwood, a researcher at the University of Reading, estimates that the change in climate radiative forcing since the Maunder minimum is about one tenth of the change caused by man-made trace greenhouse gases.

Subverting science

By Matin Durrani in Perth, Australia

Donna Franklin’s Fibre reactive hybrid dress, 2004–2008

If you think Australia is remote from the rest of the world, well, the city of Perth is even more cut-off, being a five-hour flight from Sydney and 1000 km from the next main centre of population.

That remoteness has engendered a kind of “wild west” spirit, where people have the time and space to think up radical ideas that might be more quickly dismissed in less isolated places.

That, at least, was the view of the Nobel-prize-winning microbiologist Barry Marshall of the University of Western Australia (UWA) – best known for proving that most ulcers are caused by the bacterium Helicobacter pylori. Marshall was talking to me over lunch at the university on the latest leg of the week-long fact-finding tour of Australian science that I’m on with three fellow European science journalists.

The subversive spirit was echoed after lunch by Ionat Zurr, an Israeli-born artist working at UWA within SymbioticA – a self-styled “artistic laboratory” within the School of Anatomy and Human Biology that describes itself as being “dedicated to the research, learning, critique and hands-on engagement with the life sciences”.

Unlike most art–science projects, which involve artists reinterpreting scientific ideas in an artistic form, the people at SymbioticA are getting their hands dirty by learning various experimental scientific techniques to create works of art.

Projects include making loudspeakers from bones, growing edible steak from artificial tissue, and (well before Lady Gaga had a similar idea) creating wearable dresses from fungus leaves (pictured above).

SymbioticA, which was founded in 2000 after fighting off a rival bid to buy a boring old confocal microscope, seeks to question the very nature of science, art and even life itself. It also wants to demystify and “democratize” the scientific laboratory.

The delightfully garrulous Zurr admitted that not everyone understands, or even approves of, what she and her colleagues are trying to do. What, you may ask, is the point of designing jewellery made from pig wings grown from bone-marrow stem cells?

But the strong reaction of some scientists to SymbioticA surely shows that she and her fellow artists must be doing something right: after all, isn’t science itself about challenging orthodox thinking? Perhaps scientists are happy to be subversive but don’t like being subverted themselves.

As one of my fellow science journalists complained, shaking his head in derision as we left: “They should have bought that microscope.”

The real impact of impact

“Cool as a mountain stream” was the advertising slogan for a certain brand of cigarettes from my childhood. It was written and paid for by companies that probably had a pretty good idea what smoking actually did to their customers. Such advertising was also supplemented by intense government lobbying – itself based on highly selective “research” – to oppose possible moves to restrict the sale and consumption of tobacco. The reason for all this? To preserve the companies’ bottom lines and secure the salaries and share options of their management. But that – ranging all the way from slightly disingenuous image presentation to almost outright deceit – is, after all, what advertisers do.

Scientists, of course, are different. In our laboratories we sift data to differentiate between rival hypotheses. We seek to push forward the boundaries of what is known and understood. We publish our findings so that results can be compared and, where discrepancies appear, further hypotheses can be advanced. And so the cycle continues with everyone constantly striving to increase our understanding of the universe around us.

Scientists are trained not to be selective about data (at least not without good reason). Deleting or moving an “outlier” in a data set crosses an ethical boundary that the vast majority of scientists hold sacrosanct. There have been exceptions – Jan Hendrik Schön, who was found guilty of 16 of 24 charges of scientific misconduct in 2002, being a notable example in physics – but once challenged, the scientific community tends to deal with the few cases of fraud rapidly and robustly.

The new problem, though, is that scientists – particularly those working in universities – are being increasingly urged to amplify the “impact” of their work. In many countries, including the UK, research funding is becoming conditional on demonstrating impact criteria such as maximizing knowledge transfer to industry (to “secure our future prosperity”) and encouraging students to specialize in science. Giving an inspiring public lecture or publishing a paper reporting a genuine advance – the essence of research and teaching – is no longer enough. Because everything now has to have “impact”, scientific results are distorted, the significance of outcomes is overblown, and the presentation of the research becomes at least as important as the science itself.

Energy and funding

Impact has manifested itself in the “publish or perish” culture. The increasingly competitive marketplace of academic publication has placed paramount importance on journals’ “impact factor” – the average number of citations received in a particular year by papers that appeared in the journal over the previous two years. To enhance this, many journals have an unwritten policy of preferring papers that are believed likely to attract more citations, and papers are also carefully written to inflate their broad significance.
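For readers unfamiliar with the metric, the arithmetic behind the headline number is simple. A minimal sketch, using invented citation counts purely for illustration:

```python
# Minimal sketch of a journal impact factor calculation.
# All numbers are hypothetical, for illustration only.
# Year-Y impact factor = citations received in year Y by items published in
# years Y-1 and Y-2, divided by the number of citable items in those years.

citations_received = 4200    # hypothetical: year-Y citations to recent papers
citable_items = 1500         # hypothetical: papers published in years Y-1, Y-2

impact_factor = citations_received / citable_items
print(f"Impact factor: {impact_factor:.2f}")   # 2.80
```

Because the window is only two years, the metric rewards work that is cited quickly – one reason journals court papers they expect to make an immediate splash.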

But even this should be the least of our concerns. In the best journals, peer review is still at a high standard, and is normally performed objectively and honestly. Far more worrying is the funding environment in which all parties – the funding agencies, scientists and publishers – are complicit in cloaking scientific research with a far greater significance than can be objectively justified.

This is most obvious in the current focus on “energy” and the real danger of climate change. Governments have been persuaded to divert funding into energy research because it is a relatively inexpensive way of appearing to tackle the problem – at least in comparison with actually doing anything about it. Funding agencies then support it because the more funds they have to administer, the larger their operations become. Scientists also like it because we can use it to fund our research – most of which is only tenuously relevant to the reduction of energy consumption or its generation. Just as it has become possible to magically “offset” the carbon emissions associated with taking a flight or hiring a car, as a society we can offset our inactivity in actually reducing pollution by funding research that can be promoted as promising a greener future.

Of course, few research proposals actually promise perpetual-motion machines or teleportation devices. But they do promise to improve “efficiency” – the magic word for demonstrating the potential impact of such research. From new alloys for jet-engine turbine blades to LEDs for lighting, as less energy is required for a particular human activity, the new research is predicted to allow a certain number of power stations to be shut down or so many million tonnes of carbon dioxide to be saved. But there is a catch. To governments and businesses, improved efficiency does not mean reduced use but rather the opposite – it increases the practicality and/or affordability of consumption. In the end, such developments undermine the stated impact of the research: reducing energy consumption.

Flatter the better?

Take flat-screen televisions as an example (liquid crystal, plasma, organic LED or whatever). These devices are probably intrinsically more efficient than the cathode-ray tube (CRT) that they have superseded – provided one compares the power consumption for the same screen size. However, one only has to visit an electronics retailer to realize that “efficiency” in this context is a meaningless term – modern televisions are simply enormous in comparison with their older counterparts. A quick visit to my local electronics store revealed screen sizes of up to 60 inches, and the power consumption of these mega-screens is correspondingly huge – more than 500 W in several cases, compared with a CRT television that requires about 50 W.
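To put those numbers in perspective, here is a rough annual-energy comparison; the four hours of daily viewing is an assumption for illustration:

```python
# Rough annual energy use of a 50 W CRT versus a 500 W flat-screen TV.
# HOURS_PER_DAY is an assumed, illustrative viewing figure.

HOURS_PER_DAY = 4
DAYS_PER_YEAR = 365

def annual_kwh(power_watts: float) -> float:
    """Annual energy consumption in kilowatt-hours for a given power draw."""
    return power_watts * HOURS_PER_DAY * DAYS_PER_YEAR / 1000

for label, watts in [("50 W CRT", 50), ("500 W flat screen", 500)]:
    print(f"{label}: about {annual_kwh(watts):.0f} kWh per year")
# Output: ~73 kWh vs ~730 kWh - the tenfold gap holds for any viewing time.
```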

The key technological advance from the point of view of the manufacturers is not energy consumption but rather screen thickness. The depth of a cathode-ray tube had to scale with the width of the screen because the electron beam can be deflected only through a limited angle. A flat-screen television has no such restriction and could cover most of a living-room wall while still leaving room for the sofa. So, if we are being really honest, an accurate summary of the “impact” of the physics-led development of display technologies is that it has enabled televisions to have a much larger power consumption – an outcome that is unlikely ever to have appeared in a funding impact statement.
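The geometry is easy to verify. A minimal sketch, assuming an illustrative maximum beam half-angle of 55 degrees (real tubes varied):

```python
import math

# Why CRT depth scaled with screen width: the electron gun sits behind the
# screen and the beam can be deflected only up to some maximum half-angle.
# MAX_HALF_ANGLE_DEG is an assumed, illustrative figure.

MAX_HALF_ANGLE_DEG = 55.0

def tube_depth_m(screen_width_m: float) -> float:
    """Approximate depth needed behind a screen of the given width."""
    return (screen_width_m / 2) / math.tan(math.radians(MAX_HALF_ANGLE_DEG))

for width in (0.4, 0.8, 1.2):   # screen widths in metres
    print(f"{width:.1f} m wide screen needs ~{tube_depth_m(width):.2f} m of depth")
```

Double the width and the required depth doubles too, whereas a flat panel stays a few centimetres thick regardless of size.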

Another frequently trumpeted outcome of research is that a technology could become cheaper to manufacture. Televisions are indeed now cheap by historical standards, which has enabled people to purchase multiple sets and install them in more rooms of their homes, increasing power consumption still further. Indeed, this argument can be replicated across technology sectors: from colossally power-hungry data centres that support green-sounding “cloud computing” to cars where improvements in engine efficiency have been largely cancelled out by the increasing size and mass of vehicles. So much for tackling climate change.

The moral high ground

It is about time that the scientific community applied the same intellectual rigour to the context in which it operates as it is supposed to apply to its own research. In other words, we should be able to justify objectively not just the chain of reasoning that has led to a scientific advance, but also any statements made regarding the potential impact of the result.

However, scientists simply do not have the commercial expertise or political insight to be able to do this. The recent allegations of fraud in the presentation of climate data illustrate the grave dangers inherent in trying to maximize the political “impact” of research. In reality, science can only directly contribute to tackling the major problems confronting our countries – such as climate change and economic decline – by educating the next generation about the science underlying current and future problems. Only then will they be able to take informed decisions themselves. As scientists, we will lose what remains of our authority and the moral high ground in that education debate if we continue to abandon scientific standards of proof in favour of spin and deceit in chasing funding.

Flipping spins, one proton at a time

In a bid to better understand the inner workings of the proton, researchers in Germany have, for the first time, directly measured the magnetic spin transitions of a single trapped proton. The work is an important step towards pinning down the proton’s magnetic properties, and the technique could also be used to measure the spin of an antiproton – which could in turn help us understand why the universe contains more matter than antimatter.

The proton has an intrinsic angular momentum, or spin, and behaves like a tiny bar magnet that can point up or down. The spin of a single proton had not been measured until now because the proton’s magnetic moment is much smaller, and hence more difficult to detect, than that of the electron or the positron. Previous measurements were made on clouds of protons – an approach that cannot be repeated with the much scarcer antiproton. The new method instead measures the spin of a single proton produced and held in a special trap, which is capable of storing protons for months.

The researchers, based at Johannes Gutenberg University and the Helmholtz Institute in Mainz, Germany, together with colleagues from the Max Planck Institute for Nuclear Physics in Heidelberg and the GSI Helmholtz Centre for Heavy Ion Research in Darmstadt, spent close to five years working on their experiment. One of the main developments was a specially designed miniature Penning trap – a vacuum trap that uses electric and magnetic fields to hold particles. “Such an experiment is challenging and has to be set up with extreme care,” explains Stefan Ulmer, a team member from the Helmholtz Institute. “In the first two years we designed the cryogenic apparatus, the Penning traps and the highly sensitive superconducting detection systems. In the third year we got the apparatus running and succeeded in preparing a single proton. In the following two years we had to improve the apparatus and redesign some parts. Finally we succeeded after four and a half years in observing a single proton spin flip,” he said.

Wobbling protons

A proton in a Penning trap aligns its spin with the trap’s magnetic field. The team introduced an additional magnetic field, which creates a non-uniformity, and then switched on a radio-frequency signal that causes the spin to precess, or “wobble”, like a spinning top. The non-uniform magnetic field makes the frequency of the wobble depend on the direction of the spin, so a spin flip can be detected unobtrusively by measuring a small change in that frequency. This can then be used to calculate the proton’s magnetic moment.
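For a sense of the frequencies involved: the precession rate is the Larmor frequency, proportional to the particle’s magnetic moment and the field it sits in. A minimal sketch, with the trap field chosen purely for illustration rather than taken from the Mainz set-up:

```python
import math

# Larmor (spin-precession) frequency of a proton: nu = gamma_p * B / (2*pi).
# GAMMA_P is the proton gyromagnetic ratio; B_TRAP is an assumed field value.

GAMMA_P = 2.6752e8   # proton gyromagnetic ratio (rad s^-1 T^-1)
B_TRAP = 2.0         # assumed trap magnetic field (tesla)

nu_larmor = GAMMA_P * B_TRAP / (2 * math.pi)
print(f"Spin-precession frequency: {nu_larmor / 1e6:.1f} MHz")   # ~85 MHz
```

The flip itself shows up only as a tiny change on top of the trapped proton’s motion, which is what makes the measurement so delicate.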

Because the proton’s magnetic moment is so small, the researchers have so far measured it only to a precision of 10⁻⁴. “The aim is the measurement of the magnetic moment of the single proton with a precision of 10⁻⁹, at least. We are right now working on the improvement of the apparatus to reach the 10⁻⁹ level,” explains Ulmer.

Looking towards antimatter

In the near future, the researchers would like to apply their methods to measuring the magnetic moment of the antiproton – the antimatter counterpart of the proton. This would be carried out at research facilities where low-energy antiprotons (5.3 MeV) are produced, such as the CERN Antiproton Decelerator (AD). “If the proton is measured, it will be possible to measure the antiproton. Currently there is only one facility worldwide which delivers antiprotons – CERN AD. Lots of groups want to have access, so it is a question of how it can be organized to get access. Such antiprotons are needed to perform high-precision low-energy experiments with antimatter. The antiprotons from CERN AD would be decelerated and stored in our Penning trap,” says Ulmer. Another facility in Germany, the Facility for Low-Energy Antiproton and Ion Research (FLAIR), has been set up to provide low-energy antiprotons, but it will be quite some time before FLAIR is operational.

Jeffrey Hangst, who works on the ALPHA experiment at the CERN AD, says: “This is a very difficult and elegant experiment; I am very pleased that they have achieved this important milestone. We should put matter and antimatter under the microscope when we have a chance to do a beautiful experiment like this one.” Even so, it will be difficult to make magnetic measurements of the antiproton, whose magnetic moment is currently known only to three decimal places. The team hopes that its method will change this and allow high-precision comparisons of the fundamental properties of particles and antiparticles, making it possible to determine accurately whether CPT symmetry actually holds – and maybe provide the basis for theories that extend beyond the Standard Model.

The research was published in Physical Review Letters.

UK research chief defends ‘blacklisting’


      The head of the UK’s biggest science funding agency has defended its controversial decision to restrict how often scientists can apply for grants from it. David Delpy, chief executive of the Engineering and Physical Sciences Research Council (EPSRC), says “there is no evidence” that the restriction has led to less creative grant applications or that it is penalizing young researchers.

      Delpy was speaking in an exclusive video interview with Physics World, in which he also criticizes the UK government’s recent tightening of the rules on who can work and study in the country, saying it would “damage the UK’s standing in research”. Delpy, a medical physicist who has been EPSRC boss for almost four years, was last month reappointed by the UK science minister David Willetts until March 2014.

The EPSRC introduced the restrictions in 2009 to cope with the fact that the council was receiving far more grant applications than it could afford to fund – in some disciplines up to 85% were being rejected. The low success rate was also demoralizing for researchers, many of whom found that applications rated highly by peer-review panels were nevertheless being turned down. Delpy adds that the deluge of applications was even making it hard for the council to find enough referees to conduct the peer review.

      Tightening up

As well as preventing researchers from resubmitting applications that have previously been turned down, the new rules dictate that anyone with three rejected and below-par applications in any two-year period can make only one further application in the following year. “Having imposed this on the community, success rates have gone up to 30% because the number of applications has dropped [and] we are seeing an increase in the number of referee responses,” claims Delpy.

      The EPSRC boss also denies that forcing applicants to explain the “impact” of their work in grant applications is hampering genuine breakthroughs. “We’re not asking academics to predict the full detailed outcome of their research,” insists Delpy. “What we’re saying is start thinking about how you would maximize the impact of that research, whether it’s within our discipline or in the form of policy advice to government or a product that could be marketed – [and to] ask in your application for the resources to do that.” Indeed, Delpy thinks that UK taxpayers have “the right” to know from scientists how the £800m that the EPSRC spends on research each year benefits the UK financially and socially.

      “Unfair” changes

      However, Philip Moriarty, a condensed-matter physicist from Nottingham University, thinks the EPSRC’s “blacklisting” rules are unfair. “I’m an ardent supporter of peer review but the idea that there’s a direct correlation between proposal quality and where it’s ranked on a panel prioritization list is nonsense,” he says. “If there are 40 grants for review by a panel, the top five and the bottom five will probably be relatively easy to define, but the remaining 30 in many cases may as well be ranked on the basis of throwing them up in the air and seeing where they land.”

      Moriarty says he wants the EPSRC to run what he calls “a simple experiment” to test the council’s assumption that grants falling in the bottom half of a ranked list are necessarily of poor quality. “They should take the same set of proposals, send them out to different referees, and then give them to five different panels [and] look at the correlation in the ranked lists,” he says. “If they are so confident that the principle underlying the blacklisting process is robust, then why not do this experiment? It would silence me and all the other critics of the scheme.”
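In statistical terms, what Moriarty is proposing is a rank-correlation test between panels. A minimal sketch of how such an analysis might look, using invented rankings and SciPy’s Spearman correlation:

```python
# Sketch of Moriarty's proposed test: would independent panels rank the same
# proposals consistently? The rankings below are invented for illustration.
from itertools import combinations
from scipy.stats import spearmanr

panel_rankings = [
    [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],   # hypothetical panel A
    [2, 1, 5, 3, 4, 8, 6, 10, 7, 9],   # hypothetical panel B
    [1, 3, 2, 6, 5, 4, 9, 7, 10, 8],   # hypothetical panel C
]

# Compare every pair of panels on the same ten proposals
for (i, a), (j, b) in combinations(enumerate(panel_rankings, start=1), 2):
    rho, p_value = spearmanr(a, b)
    print(f"Panels {i} vs {j}: Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```

High correlation across the whole list would support the EPSRC’s assumption; strong agreement at the extremes but noise in the middle would support Moriarty’s.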

      Looking for leaders

On the UK’s tightening of visa rules, which was heavily criticized for potentially stopping talented researchers from moving to the country, Delpy says he is “very pleased” that the government responded by making it easier for scientists with PhDs to meet the tougher requirements. But he warns that any barrier to the free movement of scientists is a bad idea. “Research is an internationally competitive game [and] if the UK wants to maintain its [leading] position any restriction of movement across international boundaries will work to the detriment of that – not just in physics but in all areas of research.”

Elsewhere in the interview, Delpy says the EPSRC will be encouraging young researchers with bright ideas by focusing over the next four years on “scientific leaders” who are either already running large teams or are prominent in their fields. “[We will] provide them with funding not just in the form of a one-off grant or fellowship but by providing funding across the whole of their research career – to provide mentorship and additional training in the skills that you need to be a research leader and internationally competitive,” he says.

      However, Moriarty regards the EPSRC’s decision to focus on “leaders” as “fundamentally flawed”. “In just what sense is focusing on leadership supporting new or young lecturers?” he asks. “One of the great aspects of the UK funding system for many years – and certainly something that kept me here when I had the opportunity to go back to Dublin in the boom years of Science Foundation Ireland – was the ability to establish an independent research career much more quickly than other places in Europe.”


      Miracle or no miracle?

      By Matin Durrani, Sydney, Australia


Australia is a big country but, as far as science is concerned, it does just about as much as one might expect for a country with a population of just 23 million.

But according to Thomas Barlow, a former academic and journalist who is now a kind of freelance policy wonk and science adviser, Australians are far too pessimistic about their scientific future. The received wisdom – which Barlow challenges – is that while Australians are an inherently inventive people, they are less good at capitalizing on their smart ideas.

      I met Barlow yesterday during my visit to Sydney at the home of Peter Pockley – a veteran Australian science journalist and broadcaster who also regularly reports on Australian science for Physics World.

Barlow has outlined his thoughts in his 2006 book The Australian Miracle, which offers an honest, well-written and sober perspective on Australian science.

      It’s worth reading if you’re at all interested in the country’s science and Barlow has as good a perspective as any – he’s married to Michelle Simmons, who’s a leading quantum-computing physicist at the University of New South Wales and who directs a successful national Centre for Quantum Computer Technology funded by the Australian Research Council.

      Still, I just can’t get away from the nagging feeling that Australia, being physically so far removed from the US, Europe, China, Japan and other centres of power in global science, is destined to always remain one step behind the rest of the world.

      In his book, Barlow denies that there is a brain drain of talent from Australia, which may be true. But unless there is a steady flow of people and ideas in and out of the country, true innovation may struggle. And being so far away, that flow is simply hard to sustain.

      MINOS confirms muon-to-electron neutrino oscillation

Neutrino mixing angles from T2K and MINOS

      By Hamish Johnston

      Less than a fortnight ago we brought you news that the T2K experiment in Japan has caught the first glimpse of muon neutrinos changing (or oscillating) into electron neutrinos as they travel 300 km under Japan.

Now researchers at the MINOS experiment in the US have seen signs of the same neutrino oscillation. The MINOS physicists sent a beam of neutrinos more than 700 km underground from Fermilab near Chicago to the Soudan Underground Laboratory in Minnesota. At Soudan, the team detected a total of 62 electron neutrinos – 13 more than it should have seen if no muon neutrinos had changed into electron neutrinos.
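As a rough check on how significant that excess is, one can treat the no-oscillation expectation as a Poisson background, for which the statistical error is about the square root of the expected count. A minimal sketch using only the numbers quoted above:

```python
import math

# Rough significance of the MINOS excess: 62 electron-neutrino events seen,
# 13 more than the 62 - 13 = 49 expected without mu -> e oscillation.
# For a Poisson count, sigma ~ sqrt(expected).

observed = 62
expected = observed - 13   # 49 events expected with no oscillation

significance = (observed - expected) / math.sqrt(expected)
print(f"Excess is roughly {significance:.1f} sigma above background")  # ~1.9
```

At around two standard deviations, the excess is suggestive rather than conclusive – hence the call for more data below.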

Neutrinos exist in three “flavours” – muon, electron and tau – and change, or “oscillate”, from one to another as they travel through space. The oscillation strength between different types of neutrino is characterized by three “mixing angles”, known as theta-12, theta-23 and theta-13. Theta-12 and theta-23 have already been measured, but pinning down theta-13 requires observing the muon-to-electron neutrino oscillation.
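The quantity the experiments actually probe is the appearance probability, which at leading order (in vacuum, ignoring matter effects) is P(νμ→νe) ≈ sin²θ₂₃ · sin²(2θ₁₃) · sin²(1.27 Δm² L/E), with Δm² in eV², the baseline L in km and the neutrino energy E in GeV. A minimal sketch with illustrative parameter values – the angles and Δm² below are assumptions for demonstration, not fitted results:

```python
import math

# Leading-order mu -> e appearance probability (vacuum, no matter effects):
# P = sin^2(theta23) * sin^2(2*theta13) * sin^2(1.27 * dm2 * L / E)
# with dm2 in eV^2, L in km, E in GeV. All parameter values are illustrative.

def p_mu_to_e(theta23_deg, theta13_deg, dm2_ev2, baseline_km, energy_gev):
    t23 = math.radians(theta23_deg)
    t13 = math.radians(theta13_deg)
    phase = 1.27 * dm2_ev2 * baseline_km / energy_gev
    return math.sin(t23)**2 * math.sin(2 * t13)**2 * math.sin(phase)**2

# MINOS-like baseline (735 km) and a few-GeV beam, with assumed angles
print(f"P(mu -> e) ~ {p_mu_to_e(45, 9, 2.4e-3, 735, 3):.3f}")   # ~2%
```

The smallness of this probability explains why only a handful of electron-neutrino events show up above background.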

      More data are required before theta-13 can be nailed down – you can see the uncertainties in the T2K and MINOS results in the diagram. Then physicists will try to measure the same quantity for anti-muon and anti-electron neutrinos. Comparing the two angles could help physicists understand why there is much more matter than antimatter in the universe.

      You can read Fermilab’s announcement here.

      Los Alamos National Lab narrowly avoids fire

       

      Los Alamos National Laboratory (LANL) appears to have escaped unscathed after a large wildfire threatened it yesterday. At one point the blaze triggered a small “spot fire” within the lab’s boundaries, which has since been controlled by emergency services.

      The Las Conchas fire began on Sunday afternoon in the Jemez Mountains approximately 19 km south-west of the boundary of the lab in New Mexico. By Sunday evening, as the fire began to spread, LANL announced that all facilities would be closed on Monday, and non-essential employees were directed to remain off-site.

      Late on Sunday evening, the fire was reported to be less than 1.6 km from the lab’s south-western boundaries, as LANL emergency crews teamed up with Los Alamos County and federal fire crews to tackle the blaze. “I’m asking all our employees to stay clear of the lab so the fire crews can do their jobs,” said lab director Charles McMillan on Sunday night. “And please keep those crews in your thoughts tonight.”

The situation appeared to be escalating by mid-afternoon on Monday as a one-acre spot fire was identified within a technical area on the lab’s south-western boundary. Reports from the field said that the fire had jumped north across New Mexico Route 4, and Los Alamos County conducted a forced evacuation of the town site. LANL emergency officials announced that the lab would remain closed today (Tuesday) as they continued to fight the fire.

      ‘No facilities face immediate threat’

      But by 16:45 local time (23:45 GMT) officials were able to breathe a heavy sigh of relief as they announced that the spot fire was under control after air crews had dumped water at the site. “About one acre burned and the lab has detected no off-site releases of contamination,” read the update from the LANL Emergency Operations Center. “No other fires are currently burning on lab property, no facilities face immediate threat, and all nuclear and hazardous materials are accounted for and protected.”

      There had been fears because the area under threat in this latest fire – Technical Area 49 – had been the site of underground hydronuclear experiments in the early 1960s. But subsequent testing has revealed that no contamination exists today at points of public access.

      The lab’s latest update at 22:00 stated that important lessons had been learned from the Cerro Grande fire of 2000, which caused damage to lab buildings and employees’ homes. “Our efforts in recent years to thin ground fuels around the laboratory, coupled with the reduction in fuels caused by historic fires in the area, are helping protect the Laboratory and townsite,” said McMillan.

      Having been established in 1943 to develop the first nuclear weapons, LANL now has more than 2000 individual facilities covering a wide variety of research including energy and environment. The 93 square kilometre site, owned by the US Department of Energy, has nearly 12,000 employees and its operating costs for 2010 were $2bn.
