Mathematics and computation

Let’s talk about information

10 Feb 2004

Information: The New Language of Science
Hans Christian von Baeyer
2003 Weidenfeld and Nicolson 272pp £16.99hb

The year 1900 is a singularly significant year in science, celebrating as it does the birth of both quantum mechanics and molecular biology. The key papers of Max Planck will, of course, be familiar to most physicists. In biology, however, it took three researchers working independently in Amsterdam, Tübingen and Vienna to rediscover Mendel's laws of heredity, which he had originally published in 1866 in the transactions of the natural history society of the city of Brünn (now Brno, in the present-day Czech Republic).

A century later, on 26 June 2000, at a ceremony at the White House, a working draft of the human genome was announced to the world. Written in an alphabet of just four chemical letters, the human genome nevertheless carries an enormous amount of information that would have been unimaginable in former times and will sustain decades of future work. What lies behind this remarkable achievement for humanity, which took place just a century after the birth of molecular biology?

This new book by Hans Christian von Baeyer – a physicist at the College of William and Mary in Virginia, US – provides a superb account of the influence of information on humanity, and on science in particular. Written by an author who displays a thorough understanding of his subject, the book offers the bold thesis that information is the essence of science. It is a remarkable notion that will be widely accepted by scientists and by the general public. Although the book is aimed at a broad readership – there are no equations – it will still be valuable for physicists, as it examines the role that information plays in many areas of science.

The book begins with a thorough survey of the nature of information – ranging from an initial description of the concept to the late Claude Shannon's "operational" way of measuring information. This involves translating any message into binary digits and counting how many ones and zeroes are needed to encode it. Shannon's tremendous insight that digital information can be extracted from analogue signals helped to pave the way for the information-technology revolution of the past decade.
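Shannon's measure can be illustrated with a few lines of code. The sketch below (the function name is my own, not from the book) computes the Shannon entropy of a message – the average number of bits per symbol needed to encode it – from the observed symbol frequencies:

```python
import math
from collections import Counter

def shannon_entropy_bits(message: str) -> float:
    """Average information per symbol, in bits: -sum(p * log2(p))."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A fair coin flip carries exactly 1 bit per symbol:
print(shannon_entropy_bits("HTHTHTHT"))  # 1.0
# A four-letter alphabet, uniformly used (as in an idealized
# genome sequence), carries 2 bits per symbol:
print(shannon_entropy_bits("ACGT"))  # 2.0
```

On this view, a genome of some three billion letters drawn from a four-letter alphabet corresponds to at most about six billion bits – one concrete sense in which the "enormous amount of information" in the genome can be quantified.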

The rest of the book is divided into two parts: classical information and quantum information. The former offers a detailed insight into the nature of “information” as we know it today, including entropy, randomness and noise. It also looks at how information is stored and transmitted. In the section on quantum information, the book offers some basic insights into the key points of the subject, which is now a hot topic. A huge amount of knowledge has been obtained in a few short years on various aspects of the field, particularly about the nature of the related physical processes.

These research efforts, supported by the governments of many industrialized nations, have yielded many findings about quantum physics and information theory, all in classical forms. However, the book warns readers to be wary of some of the grandiose claims that have been made concerning potential applications of quantum information. These words of caution provide a welcome balance to some of these claims. Quantum information, it should also be noted, is still in an analogue form; whether it too can be made digital remains to be seen.

Despite having been invented almost 2000 years ago, paper remains our primary information-storage medium – apart from the human brain. However, the transformation to electronic communication brings both risks and rewards. One of the dangers of being able to store, retrieve and transmit information electronically is that the quantity of available information increases enormously. I believe that this book could have become a best-seller if it had tackled the psychology of information and the influence of instant access to information on society. These issues will soon become pressing for our civilization, although I admit that they are beyond what one might expect a fellow physicist, such as von Baeyer, to address.

As for his view that information will reunite science, it is a nice idea, but whether it will happen is uncertain. If anything, the growth in information has caused science to splinter into ever smaller branches. A good example is the German physicist and mathematician Georg Simon Ohm, who received his doctorate from the University of Erlangen in 1811 and went on to formulate the law of electrical conduction that bears his name. What most people do not know is that he also developed a theory of physiological acoustics, although it was later proven faulty. In his day, scientists like Ohm were not labelled by their "discipline".

Nowadays, however, it takes much longer for young scientists to grasp the basic information in their field. Indeed, it is rare to find anyone who can make significant contributions to more than one sub-field of science. I therefore doubt if more detailed scientific information will unite all of science. Nonetheless, we can unite the fundamentals, such as causality, basic scientific methods, and the most basic pictures of life and the universe. Such a unified view – known to even just a part of humanity – would be a tremendous achievement.

Copyright © 2025 by IOP Publishing Ltd and individual contributors