
‘Anaconda’ could be the best way to harness wave power

Amid spiralling oil costs and public fears of nuclear power, one promising alternative source of energy — water waves — is still almost completely unexploited. But the tide may soon turn for wave power, as physicists are developing a device that promises to be the cheapest and lowest-maintenance option yet.

Invented by Rod Rainey of Atkins engineering and design consultancy, UK, and Francis Farley, a retired scientist now based in France, the “Anaconda” is a huge rubber tube that can be moored offshore so that it floats just beneath the sea surface. When a wave approaches one end of the Anaconda, it creates a bulge at that end of the tube.

Like a ripple travelling down a length of rope, the bulge moves through the tube. But unlike a ripple in rope, which decays until it disappears, the bulge in the Anaconda is amplified by the surrounding waves, so that it grows. When it reaches the far end, the bulge drives a turbine, producing electricity.

“I’ve been working for years on the fringes of [wave power],” says John Chaplin at the University of Southampton who, in conjunction with colleague Grant Hearn, is researching the physics of the Anaconda. “I’ve always been rather sceptical, but I do actually think this is rather promising.”

‘Very encouraging’

The most electricity is produced when the Anaconda is in resonance with the surrounding waves. Although sea waves come in many different wave periods, there is usually a dominant period, which roughly relates to the size of the body of water the waves are travelling in. This means that, for example, the dominant wave period in the Pacific is longer than that in the Atlantic. The Anaconda’s dimensions can be tailored to suit these different bodies of water. Moreover, certain parameters — such as the pressure of water inside the tube — can be altered to tune its resonance in situ.
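As a rough illustration of the scales involved (the periods below are illustrative assumptions, not figures from the Anaconda team), standard linear wave theory gives the deep-water wavelength for a dominant period T as L = gT²/2π:

```python
import math

def deepwater_wavelength(period_s: float) -> float:
    """Deep-water dispersion relation: wavelength L = g * T^2 / (2 * pi)."""
    g = 9.81  # gravitational acceleration, m/s^2
    return g * period_s ** 2 / (2 * math.pi)

# Illustrative dominant swell periods for two ocean basins
for label, period in [("Atlantic swell, T = 8 s", 8.0), ("Pacific swell, T = 12 s", 12.0)]:
    print(f"{label}: wavelength ~ {deepwater_wavelength(period):.0f} m")
```

An 8 s swell corresponds to a wavelength of roughly 100 m and a 12 s swell to roughly 225 m. Resonance in the Anaconda actually involves matching the speed of the bulge wave in the tube to that of the water wave, but the wavelength gives a feel for why full-scale designs are of order 100 m long.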

Funded by the UK’s Engineering and Physical Sciences Research Council (EPSRC), Chaplin and Hearn are currently investigating prototypes that are about 8 m long. However, they expect full-scale versions to be as long as 150 m, producing as much as 1 MW of power, or enough for 2000 houses. They estimate this would cost £2m, which would make the Anaconda about a third of the cost of its main rival wave-power converter. The rubber construction also makes the device almost maintenance-free.

Tom Roach, the managing director of Avon Fabrications, is refining the design for a proof-of-concept Anaconda, and expects one to be at sea within three years. But it is a monster undertaking — at 200 tonnes, the Anaconda will be the largest rubber device ever made. “The results are very encouraging,” he says.

WIMPs and DAMA: The debate continues

By Hamish Johnston

Did they see WIMPs, or didn’t they?

The question of whether the DAMA and DAMA/LIBRA experiments in Italy have seen evidence of dark matter in the form of “weakly interacting massive particles” (WIMPs) has hung over the dark matter community for the past eight years.

The DAMA team has measured a very strong annual modulation in the signal from their detectors — which they ascribe to the motion of the Earth through our galaxy’s halo of dark matter.

However, other dark-matter searches have failed to confirm the result. The latest null result comes from researchers using a new detector buried somewhere under Chicago, who have posted their findings on the arXiv preprint server.

Physics World’s Italy correspondent Edwin Cartlidge will be delving deep into the controversy surrounding the DAMA result in the next issue of the magazine. Stay tuned for much more.

Thin-film dyes boost solar cells

Scientists in the US have shown how to multiply the power output of photovoltaic (solar) cells by up to ten times using organic dyes to concentrate sunlight. They say that their work could be scaled up to make solar cells competitive with fossil-fuel power generation.

Conventional photovoltaic concentrators are designed to increase the power output from a solar cell by collecting sunlight over a large surface area and then focusing it down, using large moving mirrors to track the Sun. Unfortunately the mirrors are expensive to set up and maintain, while the solar cells themselves must be cooled.

An alternative approach is to use “luminescent solar concentrators”. These generally consist of a block of transparent plastic containing dye molecules with solar cells placed at the ends of the block. The dye molecules absorb incident light and then reemit the light at longer wavelengths, leaving most of the light trapped inside the block through total internal reflection (in this way the block acts as a waveguide). The light is then collected by the solar cells.
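The trapping step can be quantified with a standard textbook result (general waveguide optics, not specific to this work): for isotropic re-emission inside a flat slab of refractive index n, only light within the escape cone of either face gets out, and the rest is guided to the edges.

```python
import math

def trapped_fraction(n: float) -> float:
    """Fraction of isotropically re-emitted light trapped by total
    internal reflection in a flat slab of refractive index n.
    Light within the escape cone (angle < asin(1/n) from the normal)
    of either face leaves the slab; the rest is guided to the edges."""
    theta_c = math.asin(1.0 / n)  # critical angle
    return math.cos(theta_c)      # equals sqrt(1 - 1/n^2)

print(f"n = 1.5 (typical glass or plastic): {trapped_fraction(1.5):.1%} trapped")
```

For a typical glass or plastic slab with n ≈ 1.5, roughly three-quarters of the re-emitted light is trapped and guided towards the cells at the edges.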

However, in devices built to date much of the extra light collected has been lost via absorption in the dye. The “geometric gain” of such devices is the ratio of the areas of the concentrator and solar cell — but according to Marc Baldo of the Massachusetts Institute of Technology in the US, researchers have struggled to achieve an overall gain (or “flux gain”) greater than 1, rendering the technology pretty much useless.

Thin films

Baldo and colleagues have therefore taken a slightly different approach, by depositing thin films of two different organic dyes — one called Pt(TPBP) and the other DCJTB — onto glass. Along with a number of other technical innovations, this has allowed them to achieve higher gains (Science 321 226).

“In previous devices the dye absorbed too much of its own radiation,” explained Baldo. “This means that there was a limit in the size of the concentrator — light simply could not propagate far enough to reach the edges of larger concentrators,” he said.

The team overcame this problem by borrowing an idea from lasers known as a “four-level system”, in which the absorption of the concentrator is separated in energy from the emission.

These higher gains rely on higher concentrator efficiencies — the ratio of electrical power out to solar power in. The MIT researchers say that previous demonstrations of luminescent concentrators have yielded efficiencies of the order of 1%, whereas they calculate that their Pt(TPBP) device operates at 4.1% while the DCJTB device runs at 5.9%. Furthermore, they believe these figures could be increased by creating “tandem” concentrators, which are able to use a wider spectral range of the incoming light. They calculate that Pt(TPBP) combined with DCJTB would yield 6.8%, while concentrators coupled with additional solar cells could push the efficiency up to over 20%.

In the demonstrations thus far, Baldo’s group has built concentrators with a geometric gain of 45. Taking into account both the efficiency of the concentrator and that of the solar cell, they calculate that their DCJTB concentrator generates a flux gain of 11. However, this would still leave solar power far more expensive than fossil fuels. Baldo points out that to be competitive solar must come down to about a dollar per peak watt, but that the solar cells used in concentrators produce power at about $50 per peak watt. Concentrators must therefore yield flux gains of around 50.
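The cost argument is simple arithmetic, sketched below using the article’s figures ($50 per peak watt for the cells, a demonstrated flux gain of 11 and a target of around 50). The function name is mine, and the assumption that the concentrator itself adds negligible cost is an idealization:

```python
def cost_per_peak_watt(cell_cost: float, flux_gain: float) -> float:
    """Effective cost per peak watt if the concentrator adds negligible
    cost: each cell delivers flux_gain times more power than it would alone."""
    return cell_cost / flux_gain

CELL_COST = 50.0  # $/peak watt for the solar cells used in concentrators

print(f"flux gain 11: ${cost_per_peak_watt(CELL_COST, 11):.2f}/Wp")  # demonstrated
print(f"flux gain 50: ${cost_per_peak_watt(CELL_COST, 50):.2f}/Wp")  # competitive target
```

The demonstrated flux gain of 11 brings the effective cost down to about $4.50 per peak watt; a flux gain of 50 would hit the dollar-per-peak-watt target.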

Very cheap to make

Fortunately, the MIT researchers predict that Pt(TPBP) could yield flux gains of between 30 and 60 for geometric gains of 630. Baldo adds that the concentrators themselves will be very cheap to make, and therefore should not add significantly to the price. “The paint industry alone makes thousands of tons per year of these kinds of dyes at very low cost,” said Baldo.

However, Martin Green, who carries out research on solar cells at the University of New South Wales in Australia, believes that the MIT work is still some way from producing useful devices. “The projected 6.8% efficiency is not yet high enough for major commercial impact,” he said.

Japanese particle-physics leader dies

Yoji Totsuka, former director general of the KEK particle-physics lab in Japan, died yesterday at the age of 66. Totsuka, whose research interests were in the field of neutrino physics, served as KEK boss for three years from April 2003. After retiring in 2006, Totsuka became a professor emeritus at KEK and the University of Tokyo.

As KEK boss, Totsuka oversaw the Belle B-factory, in which collisions between electrons and positrons generate copious quantities of B-mesons, which are used to probe the differences between matter and antimatter. He also oversaw the K2K neutrino-oscillation experiment, in which a beam of neutrinos is sent 250 km to the vast underground SuperKamiokande detector at the Kamioka Observatory.

Before moving to KEK in October 2002, Totsuka was head of the SuperKamiokande experiment. Under Totsuka’s leadership, the experiment recorded the first ever evidence for neutrinos changing, or “oscillating”, from one type to another, which indicated that these particles had mass. Until then, neutrinos were believed to be massless. Totsuka also oversaw the rebuilding of SuperKamiokande after most of its 11,000 photomultiplier tubes blew up in November 2001.

Born on 6 March 1942, Totsuka received his doctorate in 1972 from the University of Tokyo for research into cosmic-ray muons and then studied electron-positron collisions at the DESY lab in Hamburg, Germany. He joined the Kamiokande experiment — the forerunner of SuperKamiokande — in 1981 and eventually became director of the Institute for Cosmic Ray Research at the University of Tokyo. The institute runs a range of astrophysics experiments, including the neutrino detectors at Kamioka.

During his lifetime, Totsuka received numerous awards including the Order of Culture of Japan and the Bruno Pontecorvo Prize in 2004. Current KEK boss Atsuto Suzuki paid tribute to Totsuka, calling him an “outstanding physicist” and “a prominent leader both in non-accelerator and accelerator neutrino experiments”. Although Suzuki admitted that the pair had had “many discussions and sometimes quarrels”, he praised Totsuka for being “very proactive in pushing forward international cooperation as members of many large particle laboratories”.

Dave Wark from Imperial College London, who is involved in plans to build an upgraded version of the K2K experiment known as T2K, also underlined Totsuka’s importance to neutrino physics, adding that Totsuka could well have won a Nobel prize had he lived longer.

“His tenacity in getting SuperKamiokande built and then rebuilt transformed our field, providing its premier detector and determining its direction,” said Wark. “It is hard to overestimate the importance of SuperKamiokande to neutrino physics. Those of us who are involved with T2K will continue to owe our opportunity to make new world-leading measurements to his legacy. Without Totsuka the world of neutrino physics would be a very different and much less productive place. We have lost one of our best”.

Happy birthday, ISS!


By Belle Dumé

The International Space Station celebrates its 10th anniversary later this year: its first module was launched in November 1998.

To commemorate the historic event, a symposium was held at UNESCO Headquarters in Paris this week — and in my capacity as physicsworld.com’s Paris correspondent, I was there rubbing shoulders with space industry and agency leaders from around the world.

The symposium, organized by the International Astronautical Federation (IAF) and the European Space Agency (ESA), began by looking at the history of the cooperation between 16 nations and how it all began in the 1980s — at the height of the cold war. Agreements were signed and collaborations forged thanks to the efforts of early ISS negotiators, including Mac Evans of the Canadian Space Agency, Margaret Finarelli of NASA and Fredrik Engstrom of the ESA (pictured above left to right), all now retired. The symposium also described the present, difficult construction period and future ISS plans.

ISS crew members Jean-Francois Clervoy, Leopold Eyharts, Satoshi Furukawa, Sergei Krikalev and Michael Lopez-Alegria also explained what it is like to actually live in space for long periods of time. Such experience will be invaluable for future human space travel.

ISS is first and foremost a giant space laboratory, with experiments being carried out in areas as diverse as medicine and life sciences, microgravity, materials science and physics, and Earth observation. An amazing technical achievement, the ISS is the world’s biggest scientific collaboration to date.

Construction of the $60 billion space station, which weighs over 500 tons, began at the end of 1998 with Zarya, the Russian control module. It has been assembled over the last decade through more than 80 spaceflights carrying different modules up to the orbiting outpost. This year the Columbus research module from ESA, the Kibo Laboratory from Japan and the Dextre Robot from Canada were installed, which means that all of the ISS partners now have their major elements assembled in orbit.

ISS assembly should be complete by 2010, by which time it should be the size of a football field from “port” to “starboard”.

The ISS is crucial in our quest to expand the boundaries of space exploration and research, and will be a real “stepping stone” to other planets in our solar system, like Mars, and beyond. It is also a magnificent testament to what can be achieved when nations collaborate for the advancement of humankind.

Quote me fairly, I’m a scientist!

Have you ever been contacted by a journalist? If you have, a survey suggests, it was more than likely a pleasant experience.

The survey, which was based on the responses of over 1350 scientists in the US, UK, Japan, Germany and France, indicates that interaction between researchers and the media is more frequent and “smooth” than previously thought (Science 321 204).

“The media work according to different rules, it’s not like talking to a colleague” – Hans Peter Peters, Forschungszentrum Jülich

Although the scientists surveyed were involved in either epidemiology or stem-cell research, there was little difference between the two disciplines, which implies that the same good relations might be present in the physical sciences. “We would expect some differences,” Hans Peter Peters, the lead author of the study who is based at the Forschungszentrum Jülich research centre in Germany, told physicsworld.com. “[But] I would expect the result to be rather general.”

‘Overcome that fear’

Peters — who worked with researchers at the University of Wisconsin–Madison in the US, the social-science institution EHESS in France, University College London in the UK and Kansai University in Japan — thinks the survey is important because it goes against the common notion that science and the media are “like oil and water”.

This notion comes about because of the different ways in which science and the media operate. The media, which work on short timescales with a constant drive for “new” stories, are at odds with the gradual, formal progression of science. Peters even points out that the criterion for accuracy differs between journalists and researchers — in journalism, for example, it is sometimes most important to get the right social or political message across.

“Naturally these relationships are difficult,” explains Peters. “There is a fear [among scientists] to lose control of their knowledge.” He notes that many scientists ask to be able to check media articles prior to publication, even though this goes against journalists’ instinct to remain independent. “Scientists can overcome that fear,” he adds.

Positive impact

The survey showed that the overwhelming majority of scientists chose to speak with the media to achieve either “a more positive public attitude toward research” or “a better-educated general public”. Overall, 46% of scientists rated the impact of media stories as “mostly positive”, while just 3% rated them as “mostly negative”.

Japanese scientists were the least pleased with the media impact. Among Western nations, German and US scientists were more pleased than their UK or French counterparts.

However, there are still concerns among scientists about how they are represented by the media. Nine in 10 thought there was a risk of being misquoted, while eight in 10 were wary of journalists’ “unpredictability”.

Peters now plans to perform a comprehensive survey for a range of other research fields, including those in physics. In the meantime, though, does he have any advice for scientists when dealing with the media? “You need to be aware that the media work according to different rules, and try not to assume too much — it’s not like talking to a colleague,” he says.

Peters explains that scientists often do not appreciate that journalists need to create a “picture” of the world. “[Scientists] should have the chance to take part in this journalistic story,” he adds.

Social networking for physicists

By Hamish Johnston
Yesterday I was at a conference in London put on by the Specialised Information Publishers Association, or SIPA.

Although SIPA members publish both print and electronic products, old-fashioned paper was barely ever mentioned and the focus was on the Internet.

A great deal of discussion was devoted to building social networking sites for professionals — and we were treated to several success stories.

Some speakers argued that social networking is the next big thing for professionals, and instead of happening on massive public sites like Facebook or Bebo, professionals will interact on much smaller and much more exclusive “niche” sites.

An example cited by several speakers is SERMO, which claims to have over 65,000 doctors as members.

But what about physicists?

In September, the Institute of Physics (which owns the company that publishes physicsworld.com) plans to launch a social networking site for its members. The network has the working title MyIOP and according to the IOP’s Karen Bayless, “The network will enhance members’ ability to interact with each other more widely”.

Do you use social networking sites as a physicist?

Fitting food for physics

Culinary xenophobes take note: your favourite national dish is unlikely to be pushed off the menu by exotic recipes from afar. That’s the conclusion of physicists in Brazil who have applied statistical methods to cookery books from different countries and different eras in the name of interdisciplinary science. The study, published today in New Journal of Physics, also suggests that the average number of ingredients per recipe is similar between countries (about 7–10) and has remained constant over time.

The outcome may surprise foodies in the UK, where the Indian-themed dish “chicken tikka masala” has reportedly displaced the humble “fish ‘n’ chips” as the nation’s favourite dish. Then again, the analysis could be a bit overseasoned.

“Having observed that culinary ingredients and recipes constitute a bipartite network, just like actors and films do, we realized they can be modelled using tools from mathematics and physics,” team-leader Antonio Roque of the University of Sao Paulo told physicsworld.com. “Besides, food is an unusual topic among the physics community but one of high social and cultural importance.”

Rank ingredients

Roque and co-workers collected data from four classic cookbooks: Pleyn Delit, a collection of medieval recipes; the UK’s New Penguin Cookery Book; Larousse Gastronomique from France; and Dona Benta from Brazil. For each they ranked ingredients according to how often they appeared, and then plotted the number of recipes in which each ingredient appears as a function of decreasing rank (New J Phys 10 073020).

The distributions for each book had long “tails”, reflecting the fact that there are fewer recipes containing low-rank ingredients, and the team was able to fit each one using the same power-law. This suggests that cuisines evolve independently of culture, although with just one book per country and no error bars present it is difficult to judge just how similar this evolution really is.
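The rank-frequency analysis is easy to reproduce in miniature. The sketch below uses a made-up toy cookbook (the recipes and ingredients are hypothetical, not the paper’s data) and estimates the power-law exponent with a least-squares fit on log-log axes:

```python
import math
from collections import Counter

# Toy "cookbook": each recipe is a list of ingredients (hypothetical data,
# standing in for the books analysed in the paper).
recipes = [
    ["flour", "butter", "sugar", "egg"],
    ["flour", "butter", "milk"],
    ["flour", "egg", "salt"],
    ["butter", "salt", "onion"],
    ["flour", "sugar", "milk", "vanilla"],
]

# Rank ingredients by the number of recipes they appear in.
counts = Counter(ing for r in recipes for ing in set(r))
ranked = counts.most_common()

# Least-squares slope on log-log axes estimates the power-law exponent.
xs = [math.log(rank) for rank, _ in enumerate(ranked, start=1)]
ys = [math.log(c) for _, c in ranked]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
print(f"fitted exponent: {slope:.2f}")  # negative: frequency falls with rank
```

With real cookbook data the interesting question is whether the fitted exponent is the same across books, which is what the Brazilian team reports.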

The researchers also found the same power-law behaviour for three different editions of Dona Benta from 1946, 1969 and 2004, hinting that the structure of Brazilian recipes is not only similar to those in France and the UK but has not changed in the last 50 years — despite the shift from a regional to a more globalized consumer profile during that period.

So what?

“The traditional culinary ingredients and recipes of a given culture are strongly resistant to replacement pressures from external cultures” – Antonio Roque, University of Sao Paulo

Clearly it is satisfying for physicists to be able to describe complex behaviour in terms of simple formulae, but at the end of the day these are just statistical fits. Such “scale-invariant” (i.e. power-law) behaviour pops up in countless social systems, such as linguistics, not to mention numerous physical situations such as coastlines and drainage networks. The conclusion of such studies is often a statement of the obvious.

What makes the cookbook analysis interesting is that the Brazilian team was able to reproduce the universal power-law behaviour using a “copy-mutate” algorithm, which describes culinary evolution by a branching process similar to that in biology. The model suggests that idiosyncratic ingredients, for example the edible plant chayote in Central and South American diets, are preserved in cultures just like certain genetic patterns persist across generations. The classic French pot-au-feu and other stews are good examples of “mother recipes” that generate several daughter-recipes by a copy-mutate mechanism.
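A minimal sketch of a copy-mutate process of this kind might look as follows. The parameters and ingredient names are illustrative, and this is a toy version of the idea, not the authors’ actual algorithm:

```python
import random

def copy_mutate(pantry, n_recipes=200, recipe_size=8, mutate_prob=0.3, seed=1):
    """Toy copy-mutate process: each new recipe copies an existing one
    (its "mother recipe"), then swaps some ingredients at random."""
    rng = random.Random(seed)
    recipes = [rng.sample(pantry, recipe_size)]  # seed recipe
    while len(recipes) < n_recipes:
        child = list(rng.choice(recipes))        # copy a mother recipe
        for i in range(len(child)):
            if rng.random() < mutate_prob:       # mutate some ingredients
                child[i] = rng.choice(pantry)
        recipes.append(child)
    return recipes

pantry = [f"ingredient_{i}" for i in range(100)]
recipes = copy_mutate(pantry)
print(len(recipes))
```

Because children inherit most of a parent’s ingredients, early ingredients tend to persist across the whole population, which is the mechanism behind the claimed resistance to replacement.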

“Because the system does not forget the initial conditions, the traditional culinary ingredients and recipes of a given culture are strongly resistant to replacement pressures from external cultures,” says Roque. “The model may help biologists model speciation, for example by making an analogy between recipes/ingredients and species/phenotypes.”

“It is very difficult to define ingredients – is sugar an ingredient, for example?” – Hervé This, College de France

Hervé This, a physical chemist at the College de France in Paris who cofounded the science of molecular gastronomy, thinks the idea and methods of the work are good but finds its conclusions hard to swallow. “It is very difficult to define ingredients – is sugar an ingredient, for example, and what about the carrots used to make the stock for a recipe?” He also says we should recognize cookery books as literature, not necessarily as reliable gastronomic databases. That’s as clear as consommé to anyone who has attempted to prepare an evening meal from the Larousse.

The team now plans to consider other cookery books and to apply its evolutionary algorithm to different problems.

Method could cut number of vaccinations by half

Physicists in the US and Israel claim to have come up with a method that reduces the number of vaccinations required to immunize a population by as much as a half.

Vaccines are often limited or expensive and it is therefore important for authorities to make immunization as efficient as possible. One approach is to target individuals who come into contact with the greatest number of people, so that they cannot spread disease quickly.

Now, however, Yiping Chen of Boston University in the US and colleagues have used statistical physics to improve on this strategy. They split a population up into a number of equal-sized clusters that are connected with one another through “separator groups”. Individuals within a cluster are only in contact with each other, while people within separator groups are connected to individuals in more than one cluster. By immunizing only those within the separator groups, a new outbreak can be contained inside the cluster in which it originated. The idea is to arrange the members of the separator groups so that only a few of them — and vaccines — are required (Physical Review Letters, in press).
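The separator idea is easy to demonstrate on a toy contact network. The graph below is hypothetical and this is only a sketch of the concept, not the authors’ partitioning algorithm: immunizing the separator node disconnects the clusters, so an outbreak cannot spread beyond the cluster where it starts.

```python
from collections import deque

def reachable(adj, start, removed):
    """Breadth-first search that ignores immunized (removed) nodes,
    returning the set of people an outbreak at `start` could reach."""
    if start in removed:
        return set()
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nb in adj[node]:
            if nb not in removed and nb not in seen:
                seen.add(nb)
                queue.append(nb)
    return seen

# Hypothetical contact network: two clusters (nodes 0-3 and 5-8) linked
# only through separator node 4.
edges = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4), (4, 5), (5, 6), (5, 7), (6, 8), (7, 8)]
adj = {n: set() for n in range(9)}
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

# Immunizing the single separator confines an outbreak at node 0 to its cluster.
outbreak = reachable(adj, 0, removed={4})
print(sorted(outbreak))  # only the first cluster is at risk
```

Here one dose protects the whole second cluster; the published strategy generalizes this by choosing separator groups that are as small as possible.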

Real data

Chen and colleagues examined the effectiveness of their “equal-graph partitioning” strategy by plotting the fraction of a population at risk of infection against the number of doses required. They found, using different types of theoretical network, that their strategy required fewer doses than other strategies for any given level of immunization.

For example, compared with targeting well-connected individuals, the new strategy would need between 30% and 50% fewer doses. When compared against more sophisticated “adaptive” targeting, which involves recalculating how connected individuals are after successive vaccinations, they would need between 5% and 10% fewer doses.

Working with Fredrik Liljeros, a sociologist at Stockholm University, Chen’s group applied its strategy to data from workplaces in Sweden. They found that this kind of network, which has been shown to be important in the transmission of flu, would need 15–30% fewer doses for complete immunization.

The new strategy could even work for immunizing computer networks. Chen’s group showed that completely immunizing the Internet would need anti-virus software for only half the number of servers used in alternative strategies.

No waste

Chen believes that the success of his group’s approach stems from its more global perspective. He points out that in the case of a highly connected small cluster of individuals and a loosely connected larger cluster, the conventional targeted strategy will first of all immunize people in the small cluster, which means in effect wasting resources on a small fraction of the population. “This will not happen in our algorithm,” he says.

Even the most effective immunization strategy depends on good quality data, however. Chen admits that data on social relationships can be hard to come by but he points out that social networking websites, such as Facebook, can help.

The final outcome

By Michael Banks

Many physicists in the UK have spent the past six months fuming after the Science and Technology Facilities Council (STFC) announced an £80m shortfall in its budget late last year. The STFC said it would deal with the shortfall by pulling out of plans for the International Linear Collider and withdrawing from the Gemini telescopes in Hawaii and Chile. Numerous experiments in particle physics and astronomy also faced the axe or severe cuts.

Stunned by the reaction from the community, the STFC quickly set up a consultation with physicists, the final outcome of which was due to be announced at a meeting with physicists and the media at the Royal Society in London yesterday. In fact, the STFC told the media of its plans last week, although I still decided to attend the meeting as I knew that STFC chief executive Keith Mason, science-board chair Peter Knight and John Womersley, the STFC’s director of science programmes, would all be there.

But what could have turned out to be a lively debate about how the £80m black hole in the STFC’s budget occurred ended up as a rather drab affair. Generally, members of the physics community were pleased with how the consultation process went and had even accepted that some projects would sadly have to face the axe. Indeed, almost all physicists who asked a question began by praising the STFC for how it conducted the consultation.

Any chance of a surprise announcement at the meeting had of course vanished due to last week’s unveiling of the final programmatic outcome. The STFC had decided that some projects that had faced the funding axe would now be saved, including the e-MERLIN project, based at the Jodrell Bank observatory. In other words, the community and the media already knew which projects were going to be funded.

What was bizarre for me and other members of the media is that we were prevented from asking any questions in the open session because there was going to be time to do so afterwards in a separate session for journalists. However, this session never materialised, and when I went and asked an STFC spokesperson what happened to the media session I was told that as the outcome had been made public last week, there was no need for one!

It was a shame that the media were not given the chance to ask questions with the community present. Instead we had to approach Mason, Knight and Womersley to quiz them at the end of the meeting as they were preparing to leave.

I myself had wanted to ask Mason about the Gemini project. I am still intrigued to know how, and by whom, the decision was made in early March to pull UK involvement in the project completely. Not only would this have been a scientific loss, but also a financial one, as the UK has put around £35m into the 8 m-class telescopes based in Hawaii and Chile. Additionally, the UK would have incurred a financial penalty for pulling out.

Both these points must have been known to the management of STFC. In the final outcome, the UK has now been reinstated as a full member in the Gemini consortium, but will sell 50% of its observing time to other members.

There was a change of tune within the community from the hostilities earlier in the year, with both parties probably aware that the dispute is damaging the image of physics not only in the UK but internationally as well.

Possibly one of the most outspoken critics of the STFC, Michael Rowan-Robinson from Imperial College London, a former president of the Royal Astronomical Society, was pleased with how the STFC had listened to the community.

But, he added, “I am still saddened that we had to go through all that in the last eight months and damaged our international reputation. I still don’t understand why we went through it.”

The first words that came from Mason in response were, rather discouragingly, “I don’t either.”

Copyright © 2025 by IOP Publishing Ltd and individual contributors