
US sanctions on Russia hit ITER council

The ITER fusion experiment has had to bow to the impact of US sanctions against Russia and move the venue of its council meeting, scheduled for 18–19 June, from St Petersburg to the project headquarters in Cadarache, France. In a letter to council members on 30 April, ITER director-general Osamu Motojima had already warned of the impact of “international tension” on the $15bn project. He had said he was “very much concerned about the current international tension and its possible political impact on the ITER project”, adding that the project “should remain neutral, staying outside of the world political loops”.

Although council chair Robert Iotti says that council members have been working “quite harmoniously” to resolve a number of problems, including the venue for the council meeting, on 15 May a change of venue was announced. ITER spokesperson Michel Claessens says that because of the difficulty of US delegates travelling to Russia, all seven member delegations – China, the EU, India, Japan, Russia, Korea and the US – have agreed to the move. He adds that the Russian organizers were disappointed but agreed that they could not have the council meeting without the US taking part.

Money problems

Regardless of the venue, delegates face a daunting task on many fronts, not least that the US Congress is threatening to cut funding to ITER amid concerns over the project’s escalating cost. ITER has already had to make do with far less money from the US in recent years than the $350m per year originally planned by the Department of Energy (DOE). The project has received only $200m this year and the administration has proposed just $150m for 2015. However, this cap pushes spending further into the future, which increases ITER’s total cost.
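The arithmetic behind that last point is worth spelling out. The toy calculation below uses entirely made-up numbers (they are not DOE figures): when part of each year’s budget goes on fixed running costs, a lower annual cap stretches the schedule and drives up the total bill.

```python
# Toy illustration with hypothetical numbers (not DOE figures): when part of
# each year's budget covers fixed overheads, capping annual spending
# stretches the schedule and increases the total cost.

def years_and_total(work_remaining, annual_cap, overhead_per_year):
    build_rate = annual_cap - overhead_per_year    # $m of real work per year
    years = work_remaining / build_rate
    return years, years * annual_cap               # total spent over the run

for cap in (350, 200, 150):                        # $m per year
    years, total = years_and_total(2000, cap, 50)  # assume $2bn of work left
    print(f"cap ${cap}m/yr -> {years:4.1f} years, total ${total / 1000:.2f}bn")
```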

I’m really beginning to believe that our involvement in ITER is not practical, that we will not gain what we hope to gain from it, and instead this money could much better be spent elsewhere

US senator Dianne Feinstein

The DOE has declined for several years to give a figure for ITER’s estimated total cost, but in April Ned Sauthoff, head of the US ITER contingent, gave his latest projections to the DOE’s Fusion Energy Sciences Advisory Committee, raising the predicted price tag from $1.1bn to $3.9bn. Reaction in Congress was harsh. “I’m really beginning to believe that our involvement in ITER is not practical, that we will not gain what we hope to gain from it, and instead this money could much better be spent elsewhere,” senator Dianne Feinstein, energy and water subcommittee chair, said at a hearing on the same day. Press reports suggest that in its mark-up of the proposed budget, the Senate might suggest more savage cuts to ITER or even withdrawal, although any changes must be agreed by the House and the administration.

One of Congress’s demands of ITER is that the council must enact the 11 recommendations made in a recent scathing management assessment, which include moving ahead with replacing Motojima as director-general. Critics, including those in Congress, will be watching closely to see how the council acts.

Laser mimics biological neurons using light

A tiny laser built up from thin layers of semiconductor can behave just like a biological neuron, according to a group of physicists in France. The team has shown that its “micropillar” laser can be made to fire only when its input shifts by some minimum amount, just like a neuron. Successive firings of the device must also be well separated in time – another crucial feature of biological neurons.

The human brain consists of around 100 billion neurons, each of which receives electrical signals from other neurons via thousands of tiny junctions known as synapses. When the sum of the signals across its synapses exceeds some threshold value, a neuron “fires” by sending a series of voltage spikes to large numbers of other neurons. As such, neurons are “excitable”: below a certain input threshold the system’s output is very small and linear, but above the threshold it becomes large and nonlinear.
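That threshold rule is simple enough to capture in a few lines of code. The sketch below is a minimal leaky integrate-and-fire model with illustrative parameters (not a biological fit): input is accumulated until it crosses a threshold, at which point the model neuron “fires” and resets.

```python
# Minimal leaky integrate-and-fire sketch of the threshold behaviour
# described above; all parameters are illustrative, not biological values.
dt, tau, v_th, v_reset = 0.1, 10.0, 1.0, 0.0  # step (ms), decay (ms), a.u.

def simulate(input_current, steps=1000):
    v, spike_times = 0.0, []
    for t in range(steps):
        v += dt * (-v / tau + input_current)  # leaky integration of input
        if v >= v_th:                         # summed input crosses threshold
            spike_times.append(t * dt)        # ...so the neuron "fires"
            v = v_reset                       # ...and resets
    return spike_times

print(len(simulate(0.05)))  # sub-threshold input: no spikes
print(len(simulate(0.50)))  # supra-threshold input: repeated firing
```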

Recreating the brain

Scientists have long tried to build artificial neurons that can recreate the enormous processing power of the brain, which has a capacity for understanding that has no parallel in existing digital computers. Much of this effort has concentrated on silicon circuitry, while a few groups have explored more novel approaches, such as using superconducting devices known as Josephson junctions (see “Superconductors could simulate the brain”).

The latest work does away with electronics altogether and instead relies on optics. Sylvain Barbay and colleagues at the CNRS Laboratory for Photonics and Nanostructures outside Paris use what is known as a micropillar laser. Measuring just 10 µm high and a few microns across, the cylinder-shaped device consists of alternating layers of semiconductor materials grown on a substrate. These layers create a lasing medium bounded by two parallel mirrors, and a region that absorbs low-intensity light while transmitting light of higher intensities.

Quick firing

To demonstrate its neuron-like qualities, the researchers optically pumped the device with a 794 nm diode-array laser and then excited it further with an 800 nm titanium-sapphire laser. Using a single pulse from the latter, they were able to demonstrate excitability, with the device firing only when exposed to pulses of some tens of nanojoules. The firing occurs on timescales of just 200 picoseconds, making the artificial neuron much quicker than either its biological or electronic counterparts, which have response times of the order of milliseconds.

Using pairs of pulses from the titanium-sapphire laser, Barbay and colleagues were then able to demonstrate a second fundamental attribute of their imitation neuron: a minimum gap in time between firings. Without this gap, explains Barbay, a neuron’s activity could become disordered, with noise triggering spurious pulses. They found that the device fired only once when subjected to two input pulses spaced less than 150 ps apart – that time interval being known as the “absolute refractory period”. The researchers also found that the device has a “relative refractory period”, which occurs between 150 and 350 ps after the first pulse is delivered. During this period the firing is weaker and requires a stronger trigger than is needed after 350 ps, when the device fires just as it does in response to an initial pulse.
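The paper models the device with laser rate equations, but both behaviours appear in any excitable system. The sketch below uses the generic FitzHugh–Nagumo model – a qualitative stand-in with illustrative parameters, not the authors’ equations – to reproduce the two key features: a firing threshold, and a refractory period during which a second input cannot trigger a second full spike.

```python
import numpy as np

# Generic FitzHugh-Nagumo excitable model (a qualitative stand-in for the
# laser's rate equations): weak kicks decay, strong kicks trigger a full
# spike, and a second kick during the refractory period is ignored.
eps, a, b, dt = 0.08, 0.7, 0.8, 0.01

def count_spikes(kicks, t_end=100.0, thresh=1.0):
    """Integrate the model with short input kicks given as (time, strength)."""
    v, w = -1.2, -0.62                  # resting state
    spikes, above = 0, False
    for i in range(int(t_end / dt)):
        t = i * dt
        I = sum(s for (tk, s) in kicks if tk <= t < tk + 0.5)  # 0.5-long kicks
        v += dt * (v - v**3 / 3 - w + I)
        w += dt * eps * (v + a - b * w)
        if v > thresh and not above:    # count upward threshold crossings
            spikes += 1
        above = v > thresh
    return spikes

print(count_spikes([(5, 0.4)]))             # weak kick: 0 spikes
print(count_spikes([(5, 2.0)]))             # strong kick: 1 spike
print(count_spikes([(5, 2.0), (8, 2.0)]))   # second kick too soon: still 1
print(count_spikes([(5, 2.0), (55, 2.0)]))  # well separated: 2 spikes
```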

System has ‘memory’

“This relative refractory period has never been seen before in optical systems,” says Barbay. “Its observation is interesting since it enforces the analogy with biological neurons and because it shows that the system has a ‘memory’ of its previous state.”

Barbay points out that scientists are still “very far” from building a computer that is able to mimic the brain, because, he says, it is not possible to reduce all neurons to a single model and because the number of neurons and connections in the brain is so far beyond current technological capabilities. But he adds that his group’s device has the advantage of being both small and easily coupled, potentially allowing the construction of small networks of neurons.

The research is described in Physical Review Letters.

Dispute arises over rejected climate-science paper

A row has broken out after climate scientist Lennart Bengtsson told The Times that a recent paper he submitted to the journal Environmental Research Letters (ERL) was rejected because of what the newspaper refers to as his “dissenting views on climate science”. IOP Publishing, which publishes physicsworld.com and ERL, has responded by stating that the paper “contained errors, in our view did not provide a significant advancement in the field, and therefore could not be published in the journal”.

Review process

The paper submitted by Bengtsson, who is a researcher at the University of Reading in the UK, discusses the issues surrounding “climate sensitivity”, temperature-rise effects and the uncertainties in the data therein, looking at reports AR4 and AR5 from the Intergovernmental Panel on Climate Change (IPCC), along with another study published in Nature. The paper was submitted to ERL, which is a fully peer-reviewed journal, in February this year and was rejected in mid-March on the basis of two referee reports that said it did not meet the journal’s requirement for papers to “significantly advance knowledge of the field”.

Bengtsson and colleagues then requested that their paper be published in the journal as a shorter “Perspective” piece. This request was rejected in early April by members of the journal’s editorial board, who review all “Perspective” submissions, on the basis that it also contained errors and was well in excess of the usual limit for that type of article. The authors were then told that they could address the concerns raised by the referees regarding their work and resubmit another full-length research paper with a new analysis of the data, which they have so far not done.

The story in today’s Times, which appeared on the front page, includes a partial quote from one of the two referees’ reports obtained as part of the peer-review process. It states that the paper’s results are “inconsistent”, are “less than helpful” and “harmful as it opens the door for oversimplified claims of ‘errors’ and worse from the climate sceptics media side”. Bengtsson, who has previously published in ERL, labelled the comments as “utterly unacceptable”.

Editorial standards

IOP Publishing then released a statement, responding to the Times story, together with the entire report from the referee who was quoted by the newspaper. “The referees selected to review this paper were of the highest calibre and are respected members of the international science community,” says Nicola Gulley, editorial director at IOP Publishing. “The comments taken from the referee reports were taken out of context and therefore, in the interests of transparency, we are working with the reviewers to make the full reports available.” Gulley adds that the paper’s rejection “had absolutely nothing to do with any ‘activism’ on the part of the reviewers or the journal, as suggested in the article in The Times”, saying that it was turned down “solely based on the content of the paper not meeting the journal’s high editorial standards”.

The referee’s report says that the paper “does not make any significant attempt at explaining or understanding the differences [in the data from the various reports], it rather puts out a very simplistic negative message giving at least the implicit impression of ‘errors’ being made within and between these assessments, e.g. by emphasising the overlap of authors on two of the three studies. What a paper with this message should have done instead is recognising and explaining a series of ‘reasons’ and ‘causes’ for the differences.”

The referee concludes their report by saying “I have rated the potential impact in the field as high, but I have to emphasise that this would be a strongly negative impact, as it does not clarify anything but puts up the (false) claim of some big inconsistency, where no consistency was to be expected in the first place. And I can’t see an honest attempt of constructive explanation in the manuscript. Thus I would strongly advise rejecting the manuscript in its current form.” IOP Publishing is currently working on getting permission from the other referees of this paper to make all the reports available as soon as possible.

Microwave pockets, space station strife and dreaming of Mars

The physics of how the contents of a microwaved pastry can become “hotter than the Sun” is the subject of an entertaining and informative blog entry by Ethan Siegel. He looks at the physics of heating “microwave pockets”, those roof-of-your-mouth-scalding savoury treats that appeared on shelves in the 1980s. He explains why the outer portion of a pocket can be extremely hot, while the interior remains frozen – and why pockets often explode when heated through.

Siegel’s been a bit cheeky and republished this entry from 2009, but I suppose it’s timeless and I’m sure you can still buy microwave pockets somewhere! His blog is called Starts With a Bang and the entry is entitled “Throwback Thursday: The physics of hot pockets”.

As the crisis in Ukraine drags on, scientists are beginning to worry about the effect it could have on scientific collaborations involving Russia and the West. Several websites are reporting that Russia is threatening to bar US astronauts from the Soyuz spacecraft that ferry crews to the International Space Station (ISS). Indeed, the Independent quotes Russia’s deputy prime minister Dmitry Rogozin as saying that it would be possible for Russia to independently operate its portion of the ISS, while the US would not be able to do so. After all, both toilets on the ISS are Russian, so it could get very messy up there!


How to make a quantum random-number generator from a mobile phone

Do you feel nervous when you make a credit-card transaction using your mobile phone? Your worries could soon be a thing of the past, thanks to a low-cost device that could bring powerful cryptography to portable devices. That’s the aim of Bruno Sanguinetti and colleagues at the University of Geneva in Switzerland, who have created a quantum random-number generator (QRNG) that uses low-cost electronic components including a mobile-phone camera.

Modern cryptographic protocols require the rapid generation of sequences of truly random numbers. These are used to create the “keys” that allow individuals to encrypt and decrypt sensitive information such as passwords and bank details. Coming up with these numbers is a significant technological challenge because computers are completely deterministic and are therefore not capable of creating truly random numbers. Cryptography systems tend to rely instead on pseudo-random-number generators, which output sequences of numbers that are nearly random. While some of these generators are very good, a cryptography system based on pseudo-random numbers is easier to hack than one that uses truly random numbers.
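That determinism is easy to demonstrate: reseeding a pseudo-random generator reproduces its output exactly, which is precisely what an attacker who learns or guesses the seed can do. A minimal illustration:

```python
import random

# The same seed always reproduces the same "random" sequence, so anyone
# who knows the seed knows every number that follows.
random.seed(1234)
first = [random.randint(0, 9) for _ in range(10)]
random.seed(1234)
second = [random.randint(0, 9) for _ in range(10)]
print(first == second)  # True: the output was never random at all
```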

Truly random numbers can be generated by making measurements on physical systems that are inherently random – such as the radioactive decay of nuclei or noise in an electronic circuit. However, existing measurement techniques tend to be either very expensive or too slow to be of practical use. Securing your mobile phone, for example, needs a generation rate of about 1 kbit/s.

Counting photons

Now Sanguinetti and colleagues Anthony Martin, Hugo Zbinden and Nicolas Gisin have used an eight-megapixel camera from a Nokia N9 smartphone to create a device that can deliver random numbers at 1.25 Gbit/s. The system exploits the fact that the camera is so sensitive that it can be used to count the number of photons that impinge on each of its individual pixels. The light is supplied by a conventional LED, in which electrons and holes combine to create photons. This is a quantum-mechanical process, so the number of photons produced in a given period of time is not fixed, but random.

The camera and LED are adjusted so that each pixel detects about 400 photons in a short exposure time. The photon numbers of all the camera pixels are then combined in an “extractor” algorithm, which outputs the 1.25 Gbit/s stream of random numbers.
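The sketch below is a toy version of that pipeline, not the authors’ implementation: their extractor is a carefully analysed hashing scheme, for which the simulated Poisson photon source and SHA-256 hash here are illustrative stand-ins.

```python
import hashlib
import numpy as np

# Toy photon-counting QRNG: Poisson-distributed pixel counts simulate the
# camera (about 400 photons per pixel, as in the article), and a hash
# condenses their quantum fluctuations into near-uniform random bytes.
rng = np.random.default_rng()  # stand-in for the physical camera + LED

def raw_frame(pixels=10_000, mean_photons=400):
    """One simulated exposure: photon counts fluctuate pixel to pixel."""
    return rng.poisson(mean_photons, size=pixels).astype(np.uint16)

def extract(counts, out_bytes=32):
    """Condense a frame into random bytes (SHA-256 is a simple, conservative
    substitute for the extractor described in the article)."""
    return hashlib.sha256(counts.tobytes()).digest()[:out_bytes]

print(extract(raw_frame()).hex())  # 256 fresh random bits per exposure
```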

One worry about any random-number generator is that the numbers could be influenced in a predictable way by non-quantum (classical) effects in the system. This could lead to a measurement bias, for example, which could favour certain numbers over others. If a potential eavesdropper knows everything about the generator, they could in principle predict the classical component of its output. This would make it easier to crack the system.

Mindbogglingly random

However, when such biases are factored in, the team reckons that a user would have to generate a mindboggling 10^118 random numbers before they would notice a deviation from a perfectly random sequence.

Sanguinetti told physicsworld.com that all of the components of his team’s QRNG could be integrated on a chip costing a few dollars and built easily into portable electronic devices, including mobile phones. “If there is a quantum technology that everyone will soon have, this is it,” he says. Sanguinetti also works for the technology company ID Quantique, which was co-founded in 2001 by Gisin and makes equipment for quantum and classical encryption systems. He says that the company is looking at commercializing the QRNG.

Anthony Laing of the University of Bristol described the technique as a “nice way to generate random numbers as it makes use of technology already embedded in the phone”. However, he cautioned, “It might be possible to design certain hacks based on quantum states of light that have different noise characteristics, and the authors of this work will no doubt have methods for dealing with these.”

Laing also believes that the technology could be used in quantum cryptography systems, which in principle are unbreakable: “A QRNG can also be a key component for quantum key distribution protocols, where the communicating parties must be careful to choose their measurements in a genuinely random way.”

The QRNG is described in a preprint on arXiv.

So, do you fancy winning $3m?

An e-mail arrived in my inbox this morning from Rob Meyer, who describes himself as “administrator” of the Fundamental Physics Prize Foundation, seeking nominations for the Breakthrough Prize, which is worth a tasty $3m, and for the $100,000 New Horizons Prize, which is aimed at “young researchers”.

In case you’ve forgotten, the foundation was funded by the Russian investor Yuri Milner, who did a degree in physics at Moscow State University before making squillions investing in start-up companies such as Facebook and Twitter.


Fast solar wind boosts lightning rates

The discovery of a link between fast solar-wind streams and lightning could improve forecasting of hazardous weather. Scientists from the University of Reading in the UK have found that the arrival of high-speed solar-wind streams at the Earth boosts lightning rates for around 40 days.

“Very energetic particles known as galactic cosmic rays, generated by distant supernova explosions, have long been thought to influence the electrical properties of the air as they fall into our atmosphere, triggering lightning,” team member Chris Scott explains. “What we have done is to show that energetic particles generated locally in the solar wind can also influence the electrical properties of our atmosphere, despite being much lower in energy.”

Lightning forms in convective clouds that have become charged. Although the energetic particles in fast solar winds do not create the conditions necessary for lightning, Scott explains, they appear to boost the rate or magnitude of lightning that transports charge between the cloud and the ground.

Advance warning

“Since the energetic particles in our study are associated with high-speed solar-wind streams rotating with the Sun, we know that such streams will wash past the Earth roughly every 27 days,” says Scott. “This gives us a potentially useful method of forecasting the severity of storms some weeks in advance.”
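In its simplest form, such a recurrence forecast is just date arithmetic: once a stream’s arrival at Earth is observed, the same solar feature should sweep past again roughly 27 days later. A minimal sketch with illustrative dates:

```python
from datetime import date, timedelta

# Recurrence forecasting at its crudest (illustrative dates): project the
# observed arrival of a high-speed stream forward in 27-day steps.
SOLAR_ROTATION = timedelta(days=27)

def forecast(last_arrival, passes=3):
    return [last_arrival + (n + 1) * SOLAR_ROTATION for n in range(passes)]

for d in forecast(date(2014, 5, 15)):
    print(d.isoformat())  # dates when enhanced lightning rates might recur
```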

The team found that just before the arrival of high-speed solar streams, the total solar irradiance drops, while the sunspot number and Mg II emissions – the latter being a spectroscopic measurement related to solar activity – increase. This is consistent with the source of the stream being on the eastern solar limb and rotating with the 27-day period of the Sun, the researchers say. The arrival of the solar-wind stream at Earth also coincided with a 1% drop in galactic cosmic-ray flux and a 6% increase in lower-energy solar-energetic protons at tropospheric altitudes.

To carry out the study, Scott and colleagues used solar-wind data for 2000–2005 from NASA’s Advanced Composition Explorer spacecraft. Lightning data, meanwhile, came from the UK Met Office’s Arrival Time Difference system, which employs radio receivers across Western Europe to detect “sferics”, which are broadband radio emissions associated with lightning. Timing the arrival of sferics at different stations can locate lightning over the UK with an accuracy of 5 km.
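The arrival-time-difference principle is straightforward to sketch: the difference in a sferic’s arrival time at two stations confines its source to a hyperbola, and several stations together pin down the strike. The toy example below uses fictitious station coordinates and a coarse grid search rather than the Met Office’s real solver, but it recovers a synthetic strike location in the same way.

```python
import numpy as np

# Toy arrival-time-difference location (fictitious geometry): sferics travel
# at roughly the speed of light, so timing differences between stations
# constrain the source; a grid search finds the best-fitting position.
C = 3.0e5  # propagation speed in km/s

stations = np.array([[0, 0], [600, 0], [0, 600], [600, 600]])  # km (made up)
true_strike = np.array([220.0, 350.0])
arrivals = np.linalg.norm(stations - true_strike, axis=1) / C  # seconds

def locate(arrivals, step=2.0):
    best, best_err = None, np.inf
    for x in np.arange(0, 600, step):
        for y in np.arange(0, 600, step):
            d = np.linalg.norm(stations - np.array([x, y]), axis=1) / C
            # compare time *differences*, so the unknown strike time cancels
            err = np.sum(((d - d[0]) - (arrivals - arrivals[0])) ** 2)
            if err < best_err:
                best, best_err = (x, y), err
    return best

print(locate(arrivals))  # recovers (220.0, 350.0) to within the grid step
```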

Focus on central England

To keep their measurements uniform, the researchers studied lightning that took place within 500 km of central England. They also employed records of thunderstorm activity from observing sites across the UK.

After 2005, the Arrival Time Difference system was expanded to form ATDnet, which can detect smaller sferics. The number of lightning strikes detected increased by an order of magnitude and the solar-wind effects were no longer apparent. The researchers believe this may be because, prior to 2005, the fast solar wind increased the magnitude of individual lightning strikes above the system’s detection threshold, thus creating the appearance of a rise in the rate of lightning.

“While we have shown that lightning rates can be altered by solar-wind conditions, we have a lot of work to do in investigating the exact mechanism by which this process takes place, and the magnitude and global extent of this effect,” says Scott. “Our study concentrated on lightning over Europe.”

Scott and colleagues reported their findings in Environmental Research Letters (ERL).

Tiny pretty things

Nanoscience is a fascinating and diverse field, one that uses tools from chemistry, biology and physics to investigate objects that are bigger than atoms, but smaller than most living organisms. This nanoworld can be stunningly beautiful, and the book Nanoscience: Giants of the Infinitesimal demonstrates this with more than 100 gorgeous images drawn from research across the discipline.

These images illustrate how nanoscience has evolved since 1959, when Richard Feynman proposed that devices such as circuits and motors could be made smaller, citing the nano-machinery of living cells as an example. Feynman’s ideas about miniaturization were soon adopted by the computer industry, with impressive results. However, the authors of Nanoscience, Peter Forbes and Tom Grimsey, suggest that nanoscale biomimicry could have even more far-reaching consequences. The photo above shows a type of sea slug, Elysia chlorotica, that adopts the photosynthetic apparatus of the algae it eats. Once it has done this, the slug can generate energy from sunlight directly, like a plant does. Might such a process form the basis of a new (literally) “green” energy source?

A piece of aerogel is held above the flame of a Bunsen burner. A delicate red flower sits unharmed on top of the aerogel thanks to the gel's super-insulating properties

Forbes (a science writer) and Grimsey (a scientifically minded artist) are similarly keen on the potential of existing nano-inventions. Consider aerogels, which they define as “foams from which water has been withdrawn, leaving the structure intact and replacing the water with air”. These materials are the world’s most efficient insulators, as the photo above illustrates. If aerogels could be made more cheaply, Forbes and Grimsey write, “they would revolutionize insulation technology”.

But nanoscience is about more than just new technologies and materials. According to Forbes and Grimsey, research in the field is also “bringing us to the point where life starts: the point where precisely nanostructured chemicals take on the properties of life”. Consider the circuit shown below, which “assembled itself” via a series of random collisions as individual blocks dotted with solder were jumbled in warm water. Although this is not a nanoscale device (the circuit is several millimetres across), nature employs similar self-assembly processes on much smaller scales. For example, a virus known as a T4 phage can reconstitute itself by physico-chemical means after being smashed to pieces in a blender – a feat the authors compare to “assembling a 747 in a gale”. Many scientists suspect that the earliest forms of life may have self-assembled in this way.

A tiny electrical circuit made from a stack of blocks that assembled itself after being jumbled in warm water. The circuit sits on top of a US penny

Despite all the pretty pictures, this is not a book for beginners. Readers will need a good scientific vocabulary to understand the text, and the authors are not as helpful as one might wish. For example, after a passing reference to “femtosecond pump-probe spectroscopy”, they obligingly explain what a femtosecond is, but leave the far more complicated meaning of “pump-probe spectroscopy” unexplored. However, the book could be helpful for undergraduates considering their next career move, as it provides a good overview of the work that individual nanoscientists are doing to explore this tiny, mesmerizingly beautiful world.

The problem of missile defence

Black-and-white photo of a scientist, Russell White, standing on a desk trying to read a very long computer printout at the RAND think tank in the 1950s

The idea of building a missile system to defend a nation from the horrors of nuclear attack first entered the public consciousness in the 1980s, when US president Ronald Reagan – backed by prominent (and controversial) scientific advisers such as the physicist Edward Teller – promoted the Strategic Defense Initiative as a supposedly impenetrable shield against the Soviet Union’s nuclear arsenal. At the time, debate about the technical feasibility of this so-called “Star Wars” system centred on questions about whether it was possible to detect a fast incoming missile, distinguish it from decoys or hit it with another missile or laser.

But while the physics problems associated with hitting incoming missiles are probably soluble (given sufficient resources), the computing problems are rather more challenging. Both types of problem had been hotly debated in the 1950s and 1960s as missile technology, software and computers evolved, and in 1963 the physicist Herbert York gave evidence to Congress that offensive missiles would always have the advantage over defensive ones. For a “missile shield” to be effective, it had to work quickly and precisely. It needed to work correctly the first time even though it would be impossible to test it in a realistic way. It needed to cope instantly with a whole range of (possibly unknown) countermeasures. And crucially, all of its multiple technologies and layers had to fit together seamlessly in order to pass flight data between them within minutes and without errors. Creating such a system would be like designing a computer that could beat Garry Kasparov at chess, first time and every time, in matches where the pieces and board were constantly changing.

These and similar arguments prevailed for many years, but when missile defence re-emerged as a hot topic in the 1980s, they were initially disregarded. Only in 1985, when a very senior software engineer resigned from the US programme, citing the risk of catastrophic failure, did these arguments begin to register with the Reagan administration.

The question of why it took so long for certain key issues, such as computing problems, to attract attention is explored in Rebecca Slayton’s book Arguments That Count: Physics, Computing and Missile Defence, 1949–2012. Slayton, an expert in technology policy, is interested in why some scientific arguments “stick” and others do not, and her book offers some fascinating insights into how people from various disciplines involved in missile defence presented their arguments and counter-arguments. However, Slayton’s book is far more than just a historical account of various missile-defence programmes. It also includes a whole series of thoughts and conclusions about the nature of scientific advice and decision-making in our highly technological world.

One of Slayton’s conclusions is that scientific advice cannot be separated from politics. Missile-defence systems are a good example. Because their primary goal is to reduce the effectiveness of attacking missiles, one of the cheapest responses for an adversary is simply to deploy more missiles. The better a shield becomes, the more missiles the adversary must deploy to counter it. A complex system such as “Star Wars” could also be used to destroy space-based sensors, thus rendering an opponent electronically blind and facilitating a possible surprise attack. Both are prime examples of how scientific improvements can have politically destabilizing consequences – and indeed, they contributed to the breakdown of US–Soviet strategic arms-reduction negotiations in 1986.

Slayton also argues that “experts” are never just technical experts; rather, they are also experts in persuading and presenting. In that sense she regards scientific advice as being staged and dramatized to gain maximum impact. Again, the history of missile defence offers supporting examples. Slayton points out that many supposedly objective field tests of anti-missile systems have, in fact, been heavily biased in ways that make it easier for systems to find and hit incoming “warheads” and distinguish them from decoys (some of which have helpfully been made much larger). During the 1991 Gulf War, the US Patriot missile system failed numerous times to shoot down incoming Iraqi Scud missiles (despite initial claims to the contrary), and in the 2003 Iraq War it tragically shot down an Allied Tornado jet.

Slayton uses the term “disciplinary repertoire” to describe how experts use a set of rules, knowledge and habits to rhetorically distinguish the subjective (and politically controversial) aspects of a problem from its (putatively objective) technical parts. This definition may seem controversial to some, but in my view it is a useful way of stripping away the pretence that science can operate in an objective way when it is dealing with highly political, human-related issues.

From my UK perspective, it seems a pity that Slayton’s analysis is solely based on US sources. During the 1980s a whole series of British-based analysts wrote books and articles opposing the Strategic Defense Initiative on political, strategic and technical grounds, and Slayton’s software-related arguments were also made in the UK by groups of computer professionals such as Electronics and Computing for Peace. These groups later merged into organizations such as Scientists for Global Responsibility, which I now chair, that continue to make similar points today. Another factor that Slayton does not address is the role of funding, lobbying and finance in decision-making, which in my view is also a key issue.

Despite these limitations, however, her conclusions certainly have implications far beyond missile defence. For example, we rely upon “expert” opinion as to how far we should adopt genetic-modification technologies in our food products. But is it really possible to extricate such evidence from the fact that a handful of large corporations control the world’s food supply? On the other hand, we have an unprecedented consensus among experts regarding climate change, but they still struggle to get governments to take sufficient action. I would argue that the problem lies partly with these experts’ “disciplinary repertoire”, but also with the existence of a strong, politically and financially motivated opposing lobby.

In her subtle and understated style, Slayton concludes that we must “recognize that the risks we face can only partly be addressed by the physical ingenuity of America’s top scientists and engineers”. She adds that all “complex technological systems…can never be only physical, but…are simultaneously social and political to the core”. I agree with her, while adding that I wish scientists and engineers of all nations could be more usefully engaged in less complex technological solutions to problems such as climate change, rather than dedicating their skills to conflict and destruction.

  • 2013 MIT Press £24.95/$35.00 hb 272pp

What makes an equation beautiful?

By James Dacey

Earlier this year I wrote about a psychology experiment that revealed that mathematicians appreciate beautiful equations in the same way that people experience great works of art. In the experiment, which conjures up a slightly comical scene, mathematicians were hooked up to a functional magnetic resonance imaging (fMRI) machine and asked to view a series of equations. When the subjects looked at equations they had previously rated as beautiful, it triggered activity in a part of the emotional brain associated with the experience of visual and musical beauty. The formula most commonly rated as beautiful in the study, in both the initial survey and the brain scan, was Euler’s equation, e^(iπ) + 1 = 0.
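For anyone who wants the one-line derivation behind the identity (a standard textbook argument, not something taken from the study), it follows directly from Euler’s formula:

```latex
% Euler's formula links the exponential to the trigonometric functions:
\[ e^{ix} = \cos x + i\sin x \]
% Setting x = \pi, with \cos\pi = -1 and \sin\pi = 0, gives the identity:
\[ e^{i\pi} = -1 \quad\Longrightarrow\quad e^{i\pi} + 1 = 0 \]
```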

Inspired by this study, we have put together this infographic to dissect the Euler identity and try to understand why so many mathematicians are enamoured with this little equation. Let us know what you think of the infographic and what you think are the most beautiful equations. Either post a comment below this article, or let us know on Twitter using the hashtag #BeautifulEquations.

