Many search engines take account of the number of links to a web page when they return the results of a search. "There is a widespread belief that search engines create a vicious cycle by making well known pages more and more popular at the expense of new ones," says Fortunato. "This presumed phenomenon, which is sometimes called 'googlearchy', has been widely discussed in the computer, social and political science communities. Our findings contradict this picture."

The Indiana-Bielefeld scientists measured the traffic and incoming links for 28,164 web sites using the Alexa, Google and Yahoo search engines, and then developed a new theoretical model that combines the various factors at play in web searching. These factors include: the queries users submit; the way that search engines retrieve and rank results; and the way that people use the results obtained.
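The interplay of these factors can be illustrated with a toy simulation; everything below (the page sample, the 1/rank attention decay, the click counts) is an illustrative assumption, not the authors' actual model. A hypothetical "search engine" ranks pages by in-link count, and simulated users click the result at rank r with probability proportional to 1/r:

```python
import random

random.seed(42)

# Hypothetical toy model, not the authors' actual formulation:
# ten pages with assumed in-link counts.
in_links = [100, 50, 20, 10, 5, 2, 1, 1, 1, 1]

# The "search engine" ranks pages by number of incoming links.
ranking = sorted(range(len(in_links)), key=lambda p: -in_links[p])

# Assumed user behaviour: the result at rank r (0-indexed) is clicked
# with probability proportional to 1/(r + 1).
weights = [1.0 / (r + 1) for r in range(len(ranking))]

clicks = [0] * len(in_links)
for _ in range(10_000):
    page = random.choices(ranking, weights=weights)[0]
    clicks[page] += 1

# Traffic as the fraction of all user clicks received by each page.
traffic = [c / sum(clicks) for c in clicks]
for page in ranking[:3]:
    print(f"page {page}: in-links={in_links[page]}, traffic={traffic[page]:.3f}")
```

Under these assumptions, traffic concentrates on the top-ranked pages; how strongly it concentrates depends on the assumed attention decay, which is the kind of factor the theoretical model has to pin down empirically.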

To derive a scaling relation between the two quantities, Fortunato and co-workers plotted the traffic to a web page, measured as the fraction of all user clicks over a three-month period, against the number of incoming hyperlinks for that page. Each page in their sample appeared as a point on this graph. Existing models failed to match the data: whereas previous models had predicted that traffic and the number of incoming links are related by a power law, Fortunato and co-workers found a simple linear relationship between the two quantities instead (figure 2).
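The distinction between the two candidate relations can be checked by fitting the exponent alpha in t ~ k^alpha on log-log axes: alpha close to 1 means traffic grows linearly with in-links, while alpha far from 1 would signal a nontrivial power law. A minimal sketch with synthetic data (the distributions and noise level are assumptions standing in for the measured sample, not the authors' data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic sample: heavy-tailed in-link counts, with
# traffic generated proportional to in-links (the linear relation the
# article reports) plus multiplicative noise.
links = np.round(rng.pareto(1.1, 5000) + 1).astype(int)
traffic = links * rng.lognormal(0.0, 0.3, links.size)
traffic = traffic / traffic.sum()  # fraction of all user clicks

# Fit t ~ k^alpha by linear regression in log-log coordinates.
alpha, log_c = np.polyfit(np.log(links), np.log(traffic), 1)
print(f"fitted exponent alpha = {alpha:.2f}")
```

On data generated this way the fitted exponent comes out near 1, which is how a linear relation shows up in this kind of scaling analysis.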

"Search engines are the interface between society and the main commodity of the 21st century - information," says Fortunato. "Our findings are broadly relevant to scientists who model the structure of the web, search engine designers, computer scientists who forecast traffic patterns, marketing experts who try to predict the effect of Web advertising campaigns and, finally, social scientists who are interested in the impact of the Web on knowledge discovery and propagation in the information society."