Why now is the time to change bad exam habits

Physics is a versatile subject and the problem-solving skills you learn during a degree can be used in countless other career paths. During the first week of my physics undergraduate course at the University of Nottingham, UK, we were told that to become a good physicist we must use mathematical tools to approach a problem creatively and find a solution through critical thinking. Learning content for the sake of memorization for an exam was not encouraged.

While this problem-solving approach was central to my degree, I began to wonder why this was not assessed at an earlier age. If we are to become adept critical thinkers, then why should we only be evaluated on our ability to saturate our brains with equations and explanations, only to regurgitate them over a two-hour exam? Should we not instead train physics students at school to rely on problem-solving skills that have been developed over months and years? If nothing else, it would better prepare future science and technology students for the style of assessment they will face at university.

Cheat-sheet exams would allow for a greater focus on the application of content and problem solving rather than the recall of information

At a time when libraries of information can be carried around in a pocket, it seems counterproductive to test students on their ability to commit so much content to memory, not to mention the added anxiety of having to do so. Yet in the UK, school physics exams stubbornly remain “closed book”, with students given no resources beyond what is provided on the paper. With this year’s exams cancelled due to the global COVID-19 pandemic and replaced by teacher assessments, there may never be a better time to re-evaluate our assessment systems nationwide to better foster critical thinking and problem solving in students.

Semi-open book?

I have never particularly struggled with the “standard” style of examination, probably helped by years of script learning for my performing-arts hobby. Yet as a result of the pandemic, my third-year summer exams were changed to “open book” to accommodate students working from home. An open-book exam is where students are given any number of resources, including textbooks, handwritten notes or even Internet access. This has been shown to have several advantages, including reducing test anxiety and allowing students to utilize more critical and creative thinking when equipped with as many facts as possible.

However, despite these advantages, I found that my engagement with the content and level of preparation for the exam was reduced. This could have been because the open-book exam simply consisted of the same questions as the closed-book exam would have had. But if the questions had been specifically geared towards open-book examination, it might have made me apply, rather than recall, key physical concepts. Indeed, numerous studies have found that students tend to prepare less for open-book exams and as a result spend considerable time in the exam looking up answers in the resources provided. While open-book assessments benefit students’ mental health and lend themselves to more critical thinking-based questions, they are not without their flaws.

If we are to revamp how we test students in the physical sciences, the solution may come from taking the best attributes from both exam styles in the form of the open-notebook – or so-called “cheat sheet” – exam. Here, students can prepare a limited number of notes – usually a side of A4 paper or similar – to bring into the exam. There are several advantages in a move to cheat-sheet exams. For example, student engagement and preparation would remain at the current levels or improve as students are forced to fully review the content, organize the information and summarize the key concepts on a single sheet. There would be reduced test anxiety, again owing to the student’s ability to prepare for the aspects that most challenged them, and, crucially, cheat-sheet exams would allow for a greater focus on the application of content and problem solving rather than the recall of information. This would reward those who have put time into developing the key attributes of a good physicist.

Creative thinking

Closed-book exams can reward the memorization skills of some students, but a self-prepared cheat sheet would allow students to spend less time agonizing over basic descriptions of phenomena. Instead, they would be rewarded for probing creative uses of mathematical tools and considering their application to new challenges.

Given the upheaval of education during the COVID-19 pandemic, there can be no better time than now to change how we test physics students in schools and a move to cheat-sheet-style exams could be part of that solution. If we train our budding physicists to develop those critical thinking skills from a younger age, we may all see the benefits as they move onto university in the years to come.

Synthetic ivory can be 3D printed

Claimed to be highly realistic and elephant-friendly, a new alternative to ivory has been developed by researchers in Austria. Led by Jürgen Stampfl at the Vienna University of Technology, the team used stereolithography to 3D print a replica material called “Digory”, which they claim closely mimics both the mechanical and optical properties of real ivory. Their approach could make it far easier for conservators to restore historical ivory artefacts.

Renowned for its aesthetic appearance, durability, and ease of sculpting, ivory has been used for centuries to create practical and artistic objects. However, its use has extracted a terrible toll on elephant populations and its global trade was banned completely in 1989. Today, conservators use synthetic replica materials to restore damaged ivory artefacts – yet none of these are fully able to recreate characteristics like the colour, translucency, and surface gloss of the real thing.

Stampfl’s team aimed to recreate these properties using stereolithography: a 3D printing technique that uses heated, light-sensitive resins to construct polymer materials layer by layer. Previously, the researchers used this approach to create ceramic materials, and even artificial teeth.

UV polymerization

To make artificial ivory they used a dimethacrylic resin in a liquid state, in which they embedded fine particles of calcium phosphate. On exposure to an ultraviolet laser, molecules in the resin polymerize to form long, rigid chains, resulting in a solid material with calcium phosphate particles trapped inside. By fine-tuning the fractional volume of these particles, the team could adjust the translucency, density, and hardness of the material to make it resemble ivory. The result was an advanced new synthetic material, which they named Digory.
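The density tuning the team describes follows a simple rule-of-mixtures logic: the more calcium phosphate in the resin, the denser (and more ivory-like) the composite. The sketch below is purely illustrative – the densities are assumed literature ballparks, not the actual Digory formulation:

```python
# Toy rule-of-mixtures estimate of the density of a particle-filled resin.
# All values are assumptions for illustration, not the Digory recipe:
# calcium phosphate ~3.14 g/cm^3, a generic methacrylate resin ~1.2 g/cm^3.

def composite_density(phi_filler, rho_filler=3.14, rho_resin=1.2):
    """Density (g/cm^3) of a two-phase composite, volume-weighted average."""
    return phi_filler * rho_filler + (1 - phi_filler) * rho_resin

# Sweep the filler volume fraction
for phi in (0.0, 0.2, 0.3, 0.4):
    print(f"filler fraction {phi:.1f} -> density {composite_density(phi):.2f} g/cm^3")
```

On these assumed numbers, a filler fraction of roughly 30% lands near the density of natural ivory (around 1.8 g/cm³), though the real material must be tuned for translucency and hardness as well.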

In collaboration with the Archdiocese of Vienna, Stampfl and colleagues used their technique to restore a 17th century casket from a local church. The artefact had been decorated with ivory ornaments, but some of these had been lost over the centuries. The team’s additive manufacturing approach enabled them to recreate the complex and delicate features of the ornaments from digital models. After printing, they then treated their replicas with pigments to match the colour of the original ornaments. With further staining and polishing, the appearance of the Digory material was barely distinguishable from the genuine ivory on the casket.

Where previous restoration processes relied on time-consuming and error-prone carving techniques, the team’s approach will now allow conservators to print Digory replicas in just a few hours, with no material waste. Overall, the project clearly demonstrates the ability of stereolithography to reproduce the many desirable properties of ivory, without posing any further threat to wild elephants.

The research is described in Applied Materials Today.

Melting glaciers have been shifting the Earth’s poles since 1995, new study suggests

The rotation of the Earth has been affected over the past 25 years by the rapid melting of glaciers caused by climate change, according to a study done by scientists in China and Denmark. Using satellite data and modelling, the team has shown that the melting of glaciers has caused an eastward shift in the position of the true North Pole and South Pole that began in 1995.

Scientists already knew that since 2005 glacier melting has affected the location of the poles and this latest study suggests that the trend began a decade earlier. The technique developed by the team could be used to look further back in time to study the relationship between polar shifts and changes in terrestrial water distribution.

Mass distribution

The Earth spins on an axis that runs through the geographic, or true, North and South Poles. However, the precise locations where the axis intersects the Earth’s surface change with time. This is because the Earth is not rigid, and changes in the distribution of mass – both inside the Earth and on the surface – can shift the position of the poles in a phenomenon called polar drift, or true polar wander.

Part of this movement is an oscillation on a timescale of about a year – caused by short-term fluctuations such as changes in ocean currents and atmospheric pressure. Scientists have also measured a slow steady shift in the poles that is believed to be associated with the rebound of parts of the Earth that were once covered by glaciers. Convection in the mantle is also believed to contribute to this long-term drift.

Amazing GRACE

In 2013, Jianli Chen and colleagues at the University of Texas, Austin showed that an eastward drift of the poles that began in 2005 is linked to melting glaciers and the associated sea-level rise. They used observations of terrestrial water storage (which includes groundwater) made by the Gravity Recovery and Climate Experiment (GRACE) mission, which was launched in 2002 by NASA and the German Aerospace Center.

GRACE comprised two satellites that followed the same orbit, separated by about 200 km. As the spacecraft passed over a feature on the Earth’s surface such as a mountain or glacier, the satellites experienced local changes in the Earth’s gravitational field caused by the mass of the feature. This caused tiny changes in the separation of the satellites, which were measured. This allowed GRACE to determine the shape of the Earth and monitor changes in sea level, glaciers and groundwater.

In the mid-1990s scientists spotted a significant eastward shift of the position of the poles. However, without GRACE data researchers had not been able to make the link to glacier melting. Now, Suxia Liu and colleagues at the Chinese Academy of Sciences and the Technical University of Denmark have developed a technique that uses modelling and observations from GRACE and its 2018 successor GRACE-FO to predict shifts in terrestrial water back to 1981. Other observations of glacier melting, and estimations of groundwater extraction were also used in the analysis.

Glaciers and groundwater

After accounting for known influences on polar drift, the team concluded that the main cause of the polar drift that started in 1995 is the melting of glaciers in the polar regions. However, the size of the drift cannot be explained by glacier melting alone and the team believe that there is also a contribution from the extraction of groundwater at middle latitudes – in places like California, Texas, the region around Beijing and northern India.

The research suggests that the average speed of the eastward drift of the poles in 1995–2020 is about 3 mm/year – which is about 17 times faster than the average speed observed in 1981–1995. While significant, the shift is too small to be noticed in our daily lives. Indeed, Vincent Humphrey of the University of Zurich says it would change the length of a day by only milliseconds.
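The quoted speeds can be sanity-checked with quick arithmetic, using only the rounded figures given in this article:

```python
# Back-of-envelope check of the drift speeds quoted above.
# Inputs are the article's rounded figures, not the study's raw data.
recent_speed_mm_per_yr = 3.0        # average eastward drift, 1995-2020
speedup_factor = 17                 # quoted ratio versus 1981-1995

earlier_speed = recent_speed_mm_per_yr / speedup_factor   # implied 1981-1995 speed
total_recent_drift = recent_speed_mm_per_yr * (2020 - 1995)

print(f"implied 1981-1995 speed: {earlier_speed:.2f} mm/yr")
print(f"total drift over 1995-2020: {total_recent_drift:.0f} mm")
```

That works out to an implied pre-1995 speed of under 0.2 mm per year, and a total eastward shift of only about 7.5 cm over the whole 25-year period – consistent with the point that the effect, while significant, is imperceptible in daily life.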

Scientists have been monitoring polar drift for 176 years and Liu believes that her team’s analysis techniques could be adapted to peer further back in time and determine groundwater extraction patterns in the 20th century.

The study is described in Geophysical Research Letters.

Surprising physics that we depend on for existence

We humans are not blessed with a reliable intuition for the laws of nature. Many of today’s scientific axioms have had to pass through what the British scientist John Haldane once described as the four stages of acceptance for scientific ideas: from nonsense, to interesting but perverse, to true but unimportant, to “I always said so.” In Seven Pillars of Science: the Incredible Lightness of Ice and Other Scientific Surprises, bestselling science author John Gribbin describes seven important founts of scientific wisdom that have passed through these stages of acceptance. Ironically, given the initial scepticism they faced, these “pillars”, as he dubs them, have turned out not only to be true, but also crucial for our own existence, and maybe even life elsewhere in the universe.

Gribbin, a veteran science writer who won a lifetime achievement award from the Association of British Science Writers in 2009, digs out lots of good stories about scientists and their discoveries, some of which I haven’t come across in decades of reading. I particularly enjoyed the description of how Cecilia Payne moved from the UK to America in 1925 because, not being a man, she was not allowed to take a degree at Newnham College in Cambridge. Payne triumphed in the US and became the first woman to be awarded a PhD by Radcliffe College, near Boston, for her work in astrophysics at Harvard College Observatory.

For her thesis research Payne found that in a study of 18 elements in several stars there was overwhelmingly more hydrogen and helium than everything else. Asked by her supervisor Harlow Shapley to review it, Henry Norris Russell, a senior astronomer at Princeton University, said the result was “clearly impossible” since it contradicted Henry Rowland’s prediction that the Sun’s composition was very similar to the Earth’s. On Shapley’s advice, Payne included a sentence in her thesis that the apparent overwhelming abundance of hydrogen and helium in the stellar atmospheres was “almost certainly not real”.

Many scientific axioms have had to pass through four stages of acceptance, from nonsense to “I always said so”

Later, Russell did his own study of the solar spectrum and concluded that “the great abundance of hydrogen can hardly be doubted”. He gave full credit to Payne, but his prominence left him with most of the recognition – and meant that yet another woman was overlooked for a Nobel prize.

While Richard Feynman famously said that if there was only one piece of knowledge we could pass on to future generations, it would be “that all things are made of atoms”, some of Gribbin’s pillars remind us it’s a good thing that we can pass on more. Solid things, for example, are mostly empty space. Stars are suns and we know what they are made of. There is also no life force. This pillar is important right now, as scientific facts continue to struggle against powerful religions, conspiracy theories and people with outlandish ideas who can’t think critically even if their lives are at stake (and with COVID-19, they might well be). Even Louis Pasteur argued for vitalism, the idea that living things hold some non-physical something that inanimate objects do not.

In “The Milky Way is a warehouse stocked with the raw ingredients of life”, Gribbin covers the famous experiments of Stanley Miller as a lead-in to the history of organic chemistry. When he was a graduate student, Miller – under his adviser Harold Urey – mixed methane, ammonia, water vapour and hydrogen to mimic, in miniature, what Haldane dubbed the “primordial soup” of the early Earth, adding electric sparks to stand in for lightning. Astonishingly, the experiment produced 13 different amino acids, the building blocks of proteins, in just the first week.

Since then, astronomers have discovered a few hundred interstellar molecules, including amino acids. As Gribbin writes, the young Earth was almost surely seeded with the raw materials of life via such molecules sticking to ice-covered dust grains from comets. After the tumultuous Late Heavy Bombardment of comets and rocks that ended about four billion years ago, protein- and nucleic-acid-based life was established on Earth in only 200 million years, just 700 million years after Earth’s formation. It may be the best argument for the ubiquitous presence of at least simple life throughout the cosmos.

The chapter titled “The carbon coincidence”, meanwhile, includes the not-so-well known story of Fred Hoyle’s prediction of the famous 7.65 MeV nuclear resonance state in carbon-12 that allows stellar nucleosynthesis to proceed past beryllium-8. Gribbin was a graduate student under Hoyle, and he tells this story well, including how Hoyle was egregiously overlooked for a Nobel prize in favour of a teammate, the experimentalist William Fowler. The long-serving Nature editor John Maddox once called the neglect “shameful”.

Incidentally, I found the photograph of Hoyle in this pillar’s chapter to be striking and beautiful. It is a simple headshot of him in a heavy suit and tie, but I felt I could see into Hoyle, almost into his very being and intelligence. It makes up for the next pillar on the “Book of life, written in three-letter words”, which presents a picture of Raymond Gosling but not Rosalind Franklin, the crystallographers whose work was instrumental in James Watson and Francis Crick’s discovery of the structure of DNA.

Gribbin’s final pillar, “The incredible lightness of ice”, is insightful and well written and could stand alone as a primer on the seemingly trivial fact – unusual among substances – that solid water is less dense than liquid water, and on the importance of the hydrogen bond. I would quibble though with some of the statistics in this section of the book. In particular, the modern dimensions for the Milky Way galaxy are generally given as 185,000 light-years across and 2000 light-years thick (nearly twice the values he quotes), while the Earth is close to the inner edge of the Sun’s habitable zone, not in the middle.

This pillar goes big, discussing the importance of past Snowball Earth climates and supervolcanoes and the planet Jupiter. At the very end Gribbin gives his own conclusion about the existence of intelligent life like us elsewhere in the heavens. I won’t spoil his answer to this grand question here, but I will say that, perhaps in keeping with the four stages of acceptance, I was left quite surprised.

  • 2020 Icon Books £9.99hb 160pp

BrainGate: untangling the brain–computer interface

Mind reading has long been relegated to the realm of science fiction. But, with the power of electrodes, researchers are able to detect and monitor neurological signals to gain insight into the activity of the brain.

One of the most exciting applications of this technology is the brain–computer interface (BCI). BCIs that aim to digitize thought patterns – such as the intention to move – could help individuals with loss of motor function, including patients with severe neurological disease or spinal cord injury. Applying BCIs to control assistive technology could restore lost function, even for those who are left “locked-in”, providing them with an avenue to communicate and interact with the world around them.

Wired versus wireless BCIs

One of the major obstacles inhibiting the use of BCIs outside of a laboratory setting is the cumbersome system of wires needed to transfer the large amount of data collected from the brain to a computer.

Enter BrainGate, an interdisciplinary research team involving Brown University, Massachusetts General Hospital, Stanford University, Case Western Reserve University and Providence VA Medical Center. With a focus on practical applications and reliability, the team aims to develop assistive BCI technology to restore independence and communication in individuals with impaired movement abilities.

The team’s most recent work, published in IEEE Transactions on Biomedical Engineering, details a clinical trial of a new wireless BCI called the Brown Wireless Device (BWD). The two study participants, aged 63 and 35, suffer from tetraplegia (paralysis of all four limbs) caused by spinal cord injuries.

The BWD connects to two 96-channel silicon microelectrode arrays via the same ports used for studies with wired BCI systems. The intracortically-implanted electrodes detect neural activity from a part of the frontal lobe – specifically, a brain region responsible for motor control. The signal is then amplified and filtered to identify when the patient thinks about moving a limb, which can be used to trigger an action. In this trial, the participants moved a cursor on a tablet computer to type and navigate through applications.
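The detection step described above – band-limit the recorded activity, then flag when it crosses a level that signals a movement intention – can be caricatured in a few lines. This is a loose toy sketch, not the BrainGate decoding algorithm; the sample rate, burst amplitude, smoothing window and threshold are all arbitrary assumptions:

```python
import numpy as np

# Toy intent detector: inject a burst of activity into synthetic noise,
# smooth the rectified trace, and flag threshold crossings. Every number
# here is an assumption for illustration only.
rng = np.random.default_rng(0)
fs = 1000                                  # assumed sample rate, Hz
t = np.arange(0, 2.0, 1 / fs)
signal = rng.normal(0, 1, t.size)          # background "neural" noise
signal[1000:1200] += 4.0                   # injected burst = "intent to move"

# Crude moving-average smoothing stands in for a proper band-pass filter
window = 50
smoothed = np.convolve(np.abs(signal), np.ones(window) / window, mode="same")

threshold = 2.0                            # arbitrary detection level
active = smoothed > threshold
print("intent detected:", active.any())
print("first detection near t =", t[np.argmax(active)], "s")
```

In a real system the flagged events would drive an output such as cursor movement, and the thresholding would be replaced by a trained decoder.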

Previous wireless BCIs have not been able to match the fidelity of their wired counterparts, a limitation that is overcome by BrainGate’s high-bandwidth transmission protocol. The patients achieved comparable point-and-click accuracy and typing speeds with the BWD as when using a wired BCI.

By minimizing power consumption, the BWD battery lasts up to 36 hr, enabling a participant’s brain activity to be recorded in their own home for 24 hr continuously.

“With this system, we’re able to look at brain activity, at home, over long periods in a way that was nearly impossible before. This will help us to design decoding algorithms that provide for the seamless, intuitive, reliable restoration of communication and mobility for people with paralysis,” says Leigh Hochberg, leader of the BrainGate clinical trial, in a recent press release.

The researchers have developed a fully implantable version of the BWD and validated it in primates. They are currently preparing the device for regulatory approval prior to human clinical trials.

Pulsed lasers probe beyond titanium dioxide’s surface

A collaboration involving experimental and computational chemical physicists has revealed new clues as to how electrons in titanium dioxide interact with light. The researchers used two-photon photoemission (2PPE), a technique utilizing ultrafast laser pulses, to distinguish electrons localized at surface defects from electrons in the bulk. The findings could have a lasting impact on the design of semiconductor devices with photocatalytic applications, improving the efficiency at which sunlight is converted into more usable forms of energy.

Titanium dioxide is an important material for solar technologies: it can be used to generate hydrogen via solar water splitting (where sunlight dissociates hydrogen and oxygen from water molecules) or as an electron transport layer in next-generation solar cells. It is also well-suited to experiments under highly controlled ultra-high vacuum conditions, leading to titanium dioxide being considered a “model surface” for the study of metal oxides.

However, the role that defects in the crystal structure – and electrons bound to them – play in the photocatalytic performance of titanium dioxide in “real life” systems remains to be fully determined.

Protected from chemical reactions

“Discerning the individual, light-driven processes of photocatalysis is challenging,” Alex Tanner, a PhD student in Geoff Thornton’s group at University College London and the lead author of a paper describing the latest research, explains to Physics World. “But those defects in the bulk are more abundant and are protected from chemical reactions compared to those at the surface, making them important to understand.”

With this motivation, Tanner and colleagues performed 2PPE measurements using ultrashort laser pulses, lasting only femtoseconds (10⁻¹⁵ s), to improve our understanding of the nature of electrons in titanium dioxide. The technique is underpinned by the quantum mechanical view of light as comprising discrete packets of energy called photons. When the laser beam is focussed onto the sample, an initial photon, known as the pump, can be absorbed by an electron, increasing the electron’s energy. A second photon (the probe) then further excites the electron, ejecting it from the material into a vacuum system, where the electron’s kinetic energy is measured.
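The energy bookkeeping behind 2PPE is simple: the ejected electron keeps whatever photon energy is left over after paying the cost of escaping the solid. The numbers below are illustrative assumptions (typical literature ballparks, not values from this study):

```python
# Illustrative 2PPE energy balance. All values are assumed for the sketch:
# two ~400 nm photons (~3.1 eV each), a typical TiO2 work function, and a
# hypothetical defect-state binding energy below the Fermi level.
h_nu_pump = 3.1       # eV, pump photon (assumed)
h_nu_probe = 3.1      # eV, probe photon (assumed)
work_function = 4.7   # eV, rough literature value for rutile TiO2
binding_energy = 0.8  # eV, hypothetical defect-state binding

kinetic_energy = h_nu_pump + h_nu_probe - work_function - binding_energy
print(f"ejected electron kinetic energy: {kinetic_energy:.1f} eV")
```

Measuring that kinetic energy therefore reveals how tightly the electron was bound before excitation – which is how the technique distinguishes surface-trapped electrons from those in the bulk.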

Revealing excitation pathways

Because of the photon energies involved in this work, the depth from which the electrons were ejected was up to around 5 nm, which was greater than in previous studies.

The results suggest that electrons localized at the titanium dioxide surface are bound more strongly to defects than those a few atomic layers deeper into the material. This indicates an alternative excitation pathway for electrons in the bulk. To assist with this interpretation, the group of Annabella Selloni at Princeton University used computational methods. Tanner credits his collaborators for further confirming the findings: “[Princeton chemist] Bo Wen created precise models of our system that replicated our experiments. This allowed us to confirm our experimental intuition and provided a level of detail not possible from experiment alone.”

By showing how electrons localized at the bulk defects may play a role in photocatalysis, the work represents an important step towards control over the properties of titanium dioxide and other solar materials.

Full details of the research are reported in Physical Review B.

Graphene beam splitter gives electron quantum optics a boost

A graphene-based “beam splitter” for electronic currents has been built by researchers in France, South Korea, and Japan. Created by Preden Roulleau at the University of Paris and colleagues, the tuneable device’s operation is directly comparable to that of an optical interferometer. The technology could soon allow electron interferometry to be used in nanotechnology and quantum computing.

An optical interferometer splits a beam of light in two, sending each beam along a different path before recombining the beams at a detector. The measured interference of the beams at the detector can be used to detect tiny differences in the lengths of the two paths. Recently, physicists have become interested in doing a similar thing with currents of electrons in solid-state devices, taking advantage of the fact that electrons behave as waves in the quantum world.

Graphene is a sheet of carbon just one atom thick and is widely considered to be the best material for realizing such “electron quantum optics”. Indeed, researchers have already used the material to make simple electron interferometers. Now, Roulleau’s team has created a fully-adjustable electron beam splitter that could be used to build more sophisticated devices. It exploits the quantum Hall effect, whereby the application of a strong magnetic field perpendicular to a sheet of graphene will cause an electron current to flow around the edge of the sheet.

Graphene p-n junction

The team’s interferometer design features a graphene p-n junction, which comprises a nanoscale flake of graphene where one side is p-doped and the other is n-doped by applying two different electric fields.

The device operates by first injecting an electron current at one corner of the n side, which causes two current loops to form on either side of the p-n boundary, flowing in opposite directions. While the edge of the p side carries a clockwise loop of spin-up electrons, the n side has two separate channels flowing anti-clockwise, each containing opposite electron spins.

At the point where the two loops first meet at the boundary, the effect of quantum tunnelling means that a certain proportion of spin-up electrons on the n side will transfer to the p side. By applying a varying voltage to this point through a tuneable gate, Roulleau and colleagues discovered that they could tightly control this tunnelling fraction: an ability directly comparable to that of a tuneable beam splitter used in optical interferometers.

Recombined electrons

At the other end of the boundary, the researchers then applied a second voltage to act as a “reverse” beam splitter, forcing electrons that had tunnelled to recombine with the current on the n side. To determine the influence of these tunnelling electrons, Roulleau’s team measured an output current on the p side, at the opposite corner to the current injection point.

From their observations, the team spotted characteristic oscillations in this output current, which varied depending on the voltage and magnetic field strength applied to the first beam splitter. In analogy with the interference patterns seen in recombined interferometer beams, these oscillations indicated the phase differences between the recombined n-side currents. The team will now aim to make their graphene flake design more compact, potentially leading to new, highly advanced capabilities in both nanotechnology and quantum computing.
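Such oscillations are exactly what a simple two-beam-splitter (Mach–Zehnder-style) model predicts. In the toy expression below – a textbook sketch, not a model of the device physics – the output varies sinusoidally with the phase difference between the two paths, with the fringe contrast set by the splitting ratio T:

```python
import numpy as np

# Toy Mach-Zehnder interference: two identical beam splitters with
# transmission T, and a phase difference phi between the two arms.
# Output at one port is 2*T*(1-T)*(1 + cos(phi)) -- a standard textbook
# result, used here only as an analogy for the electronic device.
def output_current(phi, T=0.5):
    """Normalized output versus path phase difference phi (radians)."""
    return 2 * T * (1 - T) * (1 + np.cos(phi))

phi = np.linspace(0, 2 * np.pi, 5)          # sweep one full fringe
print(np.round(output_current(phi), 2))      # peaks at 0 and 2*pi, null at pi
print(np.round(output_current(phi, T=0.3), 2))  # unbalanced splitter: lower contrast
```

Sweeping the gate voltage or magnetic field in the real device plays the role of sweeping φ here, which is why the output current oscillates.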

The device is described in Physical Review Letters.

China’s Beijing Graphene Institute looks to accelerate the ‘graphene era’

Zhongfan Liu

What is the role of the Beijing Graphene Institute (BGI) and why was it created?

Officially opened in 2018, the BGI is committed to building a world-leading graphene R&D cluster and developing the worldwide graphene industry. Graphene is an emerging strategic material and while the roadmap to industrialization is long and full of challenges, the BGI is aiming to accelerate it through a collaboration between government, industry and academia. We believe we are on the eve of a “graphene era”.

Who funds the institute?

The BGI is jointly funded by the Beijing municipal government and social capital funds with 320m RMB ($50m) as registered capital and about 200m RMB invested each year. The institute comprises two legal entities – a research centre that receives funding from local governments and a company focusing on business development.

We believe the best is yet to come. Graphene has a big potential in many applications

How many people work at the institute and is that set to grow in the future?

We currently have around 250 staff, a number that is growing by about 100 every year. Despite its short history, the BGI has been successful in attracting people and we aim to reach a thousand staff in the coming decade.

Why graphene?

Carbon-based materials have proved promising and have given rise to various industrial applications such as carbon fibre, graphite and activated carbon. Graphene, another revolutionary carbon material, has shown promise due to its unique strength and its electronic and thermal characteristics, while being lightweight and transparent. The future for the industrialization of graphene is bright because of its extraordinary structural, physical and chemical properties.

What attracted you to graphene?

I started researching graphene in 2008. Before then I worked at Peking University on carbon nanotubes. However, after publishing almost 600 research papers, I started to realize the limitations of fundamental research and to think about doing something beyond it. Then in 2010 Andre Geim and Konstantin Novoselov shared the Nobel Prize for Physics for discovering graphene.

And how did that transform work on graphene?

Researchers around the world then began clamouring for ways to use this remarkable “supermaterial”. China also fell into “graphene fever” and began to create various graphene industrial parks and companies. Initially, much of this focus was irrational and concerning, so in 2016 I decided to set up the BGI to put graphene research on the right path and pave the way for further industrialization. I hope the BGI can follow the example of the Japanese company Toray, which led the exploration of carbon-fibre materials and nurtured a worldwide carbon-fibre industry.

What applications of graphene are you working on at the institute?

We are currently focusing on materials such as A3-sized super-clean graphene film, 4-inch single-crystal graphene wafers, graphene-coated glass fibres and 30 × 30 cm² super graphene glass. These materials can be used in light-emitting diode devices, fibre-optic sensors, ultrafast lasers, high-performance heaters and other areas. We aim to develop green, mass-production techniques to make low-cost graphene materials, and to explore applications of graphene in traditional and high-tech industries.

How will these be scaled-up for applications?

We hope to build mass-production lines for the best-quality materials, including manufacturing and characterization equipment as well as a strong commercialization team, which can move these materials from the lab to the marketplace.

What do you think are the most exciting potential applications of graphene?

We believe the best is yet to come. Graphene has great potential in many applications. But like carbon fibre, which was initially used in fishing poles and golf clubs before it found use in civil aviation, it takes time to find the ultimate application of graphene. It will need joint efforts from researchers and entrepreneurs who are devoting themselves to the commercialization of materials.

What challenges – both technological and commercial – does graphene have to overcome to make it in these applications?

According to a 2018 report, most of the companies worldwide claiming to produce “graphene” are in fact selling material containing less than 10% graphene (Adv. Mater. 30 1803784). This lack of properly characterized, high-quality material has been stalling the development of applications that depend fundamentally on graphene, such as advanced coatings and composites, high-performance batteries, sensors, and electronic and optoelectronic devices. Another challenge we are facing is to find an ultimate application for graphene and then nurture that market.

Do you work with international collaborators or companies?

The BGI has been working with many academic institutes in Europe, providing graphene and single-crystal graphene wafers for LED lighting devices. No commercial collaborations have begun yet.

What would mark success for the institute over the next 5-10 years?

The BGI is aiming to be a leading player in a graphene industry that is set to be worth over $100bn. We also want to be an integrated enterprise that incubates multiple subsidiaries in different areas.

Transistor-like device controls graphene’s electronic properties

Researchers in Germany and Spain have created a transistor-like device that uses a small voltage to control the strength and frequency of electronic signals transmitted through graphene. The feat, which is detailed in Science Advances, marks an important step towards using graphene in electronic devices such as terahertz frequency converters, mixers and modulators.

Graphene – a honeycomb-like lattice of carbon just one atom thick – has several unique electronic properties. Many of them stem from the fact that it is a semimetal with no energy gap between its valence and conduction bands. In the region where these two bands meet, the relationship between the energy and momentum of charge carriers (electrons and holes) in graphene is described by the Dirac equation, rather than the standard Schrödinger equation as is the case for most crystalline materials.
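Near these band-touching points the energy–momentum relation is linear rather than parabolic. In the standard textbook form (with the Fermi velocity vF ≈ 10⁶ m/s, a well-known literature value not quoted in this article):

```latex
E(\mathbf{k}) = \pm \hbar v_{\mathrm{F}} |\mathbf{k}|
```

An ordinary semiconductor instead has a parabolic band, E(k) ≈ ħ²k²/2m*, whose curvature defines an effective mass m*; the linear Dirac dispersion has no such curvature, which is why graphene’s carriers act as if they were massless.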

High electronic conductivity and massless behaviour

The presence of these unusual band structures (known as Dirac cones) enables the charge carriers in graphene to behave like massless particles. This effective masslessness gives the electrons in graphene a very high mobility – up to 200,000 cm²/Vs at room temperature, compared with only about 1400 cm²/Vs in silicon. Such extremely high mobility means that graphene-based transistors and other electronic devices could be faster and more energy efficient than any that exist today.
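To put those mobility figures in perspective, a minimal sketch: drift velocity scales as v = μE, so at the same applied field (the field value below is an illustrative assumption, not from the article) graphene’s electrons drift roughly 140 times faster than silicon’s:

```python
# Drift velocity v = mu * E. The mobilities are the figures quoted
# in the text; the field strength is an illustrative assumption.
mu_graphene = 200_000.0  # cm^2/(V s), room-temperature graphene
mu_silicon = 1_400.0     # cm^2/(V s), silicon
E_field = 100.0          # V/cm, assumed low-field value

v_graphene = mu_graphene * E_field  # cm/s
v_silicon = mu_silicon * E_field    # cm/s
print(f"graphene/silicon drift-velocity ratio: {v_graphene / v_silicon:.0f}")
```

The ratio is simply the ratio of mobilities, so it holds for any (small) field where the linear relation v = μE applies.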

Researchers recently discovered that when an electric current (or a light wave) passes through graphene, the material’s high electron conductivity and the effectively massless behaviour of its electrons change the frequency of the current. This type of nonlinear behaviour is one of the most basic functionalities in modern electronic devices, crucial for switching and processing electrical signals.
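As a toy illustration of such nonlinear frequency conversion (not the researchers’ actual model of graphene), driving any odd, saturating response with a sine wave generates odd harmonics of the input frequency; the tanh response below is a stand-in assumption:

```python
import numpy as np

# A sinusoidal field driving a saturating, odd-symmetric "current"
# response j = tanh(a*E) produces odd harmonics (f0, 3*f0, ...) --
# the basic mechanism behind frequency conversion. The tanh form is
# an illustrative assumption, not graphene's measured response.
t = np.linspace(0.0, 1.0, 4096, endpoint=False)
f0 = 8                              # drive frequency, arbitrary units
E = np.sin(2 * np.pi * f0 * t)      # input signal
j = np.tanh(3.0 * E)                # saturating nonlinear response

spectrum = np.abs(np.fft.rfft(j))   # frequency bins are integers here
strongest = sorted(np.argsort(spectrum)[-2:])
print(strongest)                    # dominant components: f0 and 3*f0
```

Because tanh is an odd function, even harmonics cancel exactly; the output spectrum contains the fundamental plus the third, fifth, … harmonics, which is why the two strongest bins are f0 and 3f0.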

Graphene’s nonlinearity is by far the strongest of all electronic materials, notes Dmitry Turchinovich of Bielefeld University, who co-led the latest study with Michael Gensch of the German Aerospace Center (DLR) Institute of Optical Sensor Systems and the Technical University of Berlin. The material also remains highly nonlinear even at high frequencies, extending into the technologically important terahertz (THz) range where most conventional electronic materials fail.

Tight control

While this behaviour is important for integrating graphene into electronic devices, researchers need to be able to control it first. Gensch, Turchinovich and colleagues have now demonstrated that such control is possible. In the new work, they fabricated a transistor-like device to which they could apply a gate (control) voltage via electrical contacts. They then used the device to transmit ultrahigh frequency THz signals and analysed how the frequency of these signals transformed as a function of the applied voltage.

At a certain applied voltage, the researchers observed that graphene’s normally strong nonlinear response nearly vanished. By slightly increasing or decreasing the control voltage from this critical value by just a few volts, they found they could make the material strongly nonlinear again. Once they determined the optimal gating voltage, they showed that they could alter the strength and the frequency components of the transmitted and reemitted THz electronic signals by as much as two orders of magnitude.

A missing link

Being able to control graphene’s nonlinearity in such a simple way is the “missing link” for using the material in electrical signal processing and signal modulation applications, Turchinovich says. “With this work, we have reached an important milestone on the path towards using graphene as an extremely efficient nonlinear functional quantum material in devices like THz frequency converters, mixers, and modulators,” Gensch adds.

Gensch goes on to explain that graphene is also perfectly compatible with existing electronic ultrahigh-frequency semiconductor technology such as CMOS or Bi-CMOS. It is therefore possible to envision hybrid devices in which the initial electric signal is generated at lower frequency using existing semiconductor technology, and is then very efficiently up-converted to much higher THz frequencies using graphene – all in a fully controllable and predictable manner.

The team, which also includes researchers from the Helmholtz Center Dresden-Rossendorf, the Max Planck Institute for Polymer Research and the University of Duisburg-Essen in Germany and the Catalan Institute of Nanoscience and Nanotechnology (ICN2) and the Institute of Photonic Sciences (ICFO) in Spain, says it is now working on integrating graphene into SiGe HBT/Bi-CMOS chip technology.

Laser paints a mini masterpiece, counting bubbles in a glass of beer, Jane Austen written in oligomers

Vividly coloured paintings have been created by researchers in Russia by using a laser to heat the surface of a metal until it begins to evaporate. Developed by Vadim Veiko, Yaroslava Andreeva and colleagues at ITMO University in Saint Petersburg, the technique makes colours by creating oxide layers on the metal surface. A palette of nine basic colours can be created, erased and changed using the laser, and the team used the technique to make a 7 × 5 cm reproduction of Vincent van Gogh’s The Starry Night in just a few minutes.

The team now hope to incorporate the laser-painting technology into a handheld tool and describe their research in Optica.

Have you ever wondered how many bubbles there are in a glass of beer? Gérard Liger-Belair and Clara Cilindre did, and now they have an answer. The duo, who had previously counted bubbles in a champagne flute, first measured the carbon-dioxide content of a freshly poured glass of lager at 5 °C. They then calculated how many bubbles would form at defects in the glass that are more than 1.4 microns wide. High-speed photographs revealed how bubbles grew as they rose in the glass, removing more carbon dioxide from the beer.

Glass imperfections

Putting all of this together, they reckon that between about 200,000 and nearly two million bubbles are created in a gently poured glass of lager before it goes flat. Interestingly, they discovered that beer and champagne bubbles form differently in a glass, with larger imperfections leading to more bubbles in beer but not in champagne. The researchers, based at the University of Reims Champagne-Ardenne, describe their study in ACS Omega.
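A back-of-envelope version of that estimate can be sketched as follows. Every number here – pour size, dissolved CO₂ concentration, mean bubble size and the fraction of CO₂ that escapes via bubbles rather than through the beer’s surface – is an illustrative assumption, not the researchers’ measured data:

```python
from math import pi

# Rough estimate: total CO2 released as gas, divided by the volume
# of one bubble, scaled by the fraction escaping via bubbles.
# All parameter values are illustrative assumptions.
glass_volume_L = 0.25              # a 250 ml pour
co2_conc_g_per_L = 5.0             # assumed dissolved CO2 in lager
co2_molar_mass = 44.0              # g/mol
R = 0.08206                        # L atm/(mol K)
T = 278.15                         # 5 degrees C in kelvin
P = 1.0                            # atm
bubble_diameter_mm = 0.5           # assumed mean diameter at the surface
fraction_via_bubbles = 0.2         # most CO2 degasses through the surface

n_mol = glass_volume_L * co2_conc_g_per_L / co2_molar_mass
gas_volume_L = n_mol * R * T / P                  # ideal-gas volume of the CO2
r_cm = bubble_diameter_mm / 20.0                  # mm diameter -> cm radius
bubble_volume_L = (4.0 / 3.0) * pi * r_cm**3 / 1000.0
n_bubbles = fraction_via_bubbles * gas_volume_L / bubble_volume_L
print(f"~{n_bubbles:,.0f} bubbles")
```

With these assumed values the count lands near the upper end of the published 200,000 to two-million range; shrinking the escaping fraction or enlarging the bubbles moves it toward the lower end.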

Austenites have a new way of reading their favourite author now that a passage from Jane Austen’s novel Mansfield Park has been encoded in a series of oligomer molecules. Eric Anslyn and colleagues at the University of Texas at Austin used a new molecular-data-storage technique to encode the quote, “If one scheme of happiness fails, human nature turns to another; if the first calculation is wrong, we make a second better: we find comfort somewhere”.

According to the team, the words of wisdom can be read back without prior knowledge of the structures that encoded the passage. You can read more about the encoding technique in Cell Reports Physical Science.


Copyright © 2026 by IOP Publishing Ltd and individual contributors