
Quantum technology 2.0

By Michael Banks

Are we on the verge of a “quantum 2.0” revolution?

That was a question raised yesterday at an event that I attended at HP Labs in north Bristol, which was organized by the University of Bristol.

The day-long meeting featured a series of talks from industry about how quantum technologies are affecting business.


Flash Physics: Room-temperature superconductor, well-known fundamental constants, photosynthesis in action

Superconducting transition spotted well above room temperature

An abrupt transition in the electrical resistance of graphite at 350 K could be a signature of superconductivity occurring well above room temperature (293 K). That is the claim of Pablo Esquinazi and colleagues at the University of Leipzig in Germany, together with collaborators in Brazil and Australia. The effect was spotted in samples of natural graphite that came from a mine in Brazil. While claims of room-temperature superconductivity in graphite have been made several times over the past 40 years, this is the first time that the transition temperature has been measured, according to Esquinazi. The team found that the transition went away when the graphite was exposed to a magnetic field – behaviour that is indicative of superconductivity. The team believes that individual grains within its samples are tiny superconductors and that the gaps between the grains act as Josephson junctions, allowing supercurrents to flow from one grain to another. X-ray diffraction studies suggest that the grains have atomic structures that could support superconductivity, says Esquinazi. The research is described in New Journal of Physics.

Measurements of fundamental constants are good enough to revamp SI units

The fundamental physical constants have been measured with sufficient precision for their values to be used to redefine the International System of Units (SI), according to scientists at NIST in Gaithersburg, Maryland. These constants include the speed of light, the Planck and Boltzmann constants and the electrical charge of the electron. Metrologists are in the process of creating a completely new way of defining SI units – such as the metre, kilogram and second – in terms of the fundamental constants. This is unlike the current definitions, which rely in part on artefacts such as the standard kilogram stored in Paris, which is losing mass over time. In the new system, the Planck constant – now known to 12 parts per billion – would be used to define the kilogram. Other planned changes involve using the Boltzmann constant (known to 6 parts in 10 million) to define the kelvin, which is currently defined using the triple point of water. “These now ultra-small uncertainties in the constants will allow the General Conference on Weights and Measures to revise the International System of Units so that the seven base units will be exactly defined in terms of fundamental constants,” says NIST’s Donald Burgess.
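
To make the logic of the redefinition concrete, here is a rough numerical sketch (my own illustration, not part of the NIST statement) of how a fixed Planck constant lets a Kibble balance realize the kilogram; all the numbers below are made up for the example.

```python
# Rough illustration (my own sketch, not from the NIST statement) of how a
# fixed Planck constant underpins the new kilogram. In a Kibble (watt)
# balance, electrical power equals mechanical power, U * I = m * g * v, and
# the voltage U and current I are measured via the Josephson and quantum Hall
# effects, whose constants K_J = 2e/h and R_K = h/e^2 tie them directly to h.

h = 6.62607015e-34   # J s; fixed exactly in the revised SI (enters through U and I)
g = 9.80665          # m/s^2, local gravitational acceleration (assumed known)
v = 2.0e-3           # m/s, coil velocity in the balance's moving phase (illustrative)
U = 1.0              # V, induced voltage, traceable to h via the Josephson effect
I = 5.0e-3           # A, weighing-phase current, traceable to h via the quantum Hall effect

m = U * I / (g * v)  # mass realized by the balance
print(f"Realized mass: {m:.4f} kg")   # ~0.2549 kg for these made-up numbers
```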

X-ray laser reveals key steps in photosynthesis

Photograph of researchers preparing samples of photosystem II

An X-ray free-electron laser at SLAC in the US has been used to observe two important steps in photosynthesis in which water molecules are split to liberate oxygen atoms. The work was done by an international team of scientists that used X-ray pulses just 40 fs in duration to determine the structure of a protein complex called “photosystem II”, which is involved in water splitting. Unlike previous studies of the process, which were done using frozen samples, the measurement was made at room temperature. The team was able to observe two steps of the four-step cycle by first firing pulses of green laser light at the liquid sample to initiate the splitting, and then using the X-ray pulses to measure the structure of photosystem II as the splitting proceeded. The team hopes its measurements will shed light on how water is split by a complex in photosystem II that contains manganese and calcium. “Learning how exactly this water-splitting process works will be a breakthrough in our understanding, and it can help in the development of solar fuels and renewable energy,” explains team member Vittal Yachandra of Berkeley Lab. The research is reported in Nature.


  • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on a new type of neutrino detector.

Doing physics by ear

When not pondering the underlying principles of the physical universe, Sajjad is a strong advocate of improving accessibility for people with visual impairments. He makes the case that everybody stands to benefit from diversifying the ways in which physics is taught. In his spare time Sajjad is also a keen baseball player and often travels to other US cities with his team, the Boston Renegades. At heart, though, Sajjad admits that he prefers cricket!

Web life: Ice Flows

So what is the site about?

Computer and video gamers of a certain vintage will have fond memories of Lemmings, a game in which players must shepherd pixelated, suicidal rodents around a series of obstacles to reach safety. At first glance, the Ice Flows game is strikingly similar. Your task is to guide hungry penguins to their feeding grounds off the Antarctic coast, ensuring that they leap into the water in the precise place where the fish are, and the leopard seals aren’t. But that’s where the likeness ends. In Lemmings, any resemblance between the playing environment and actual lemming habitat was purely accidental (and in any case, real-world lemmings don’t walk off cliffs). Ice Flows, by contrast, is grounded solidly in science. The design of each game level is based on actual locations, and to steer your penguins to the right place, you must manipulate the rate of snowfall onto their icy breeding grounds and the temperature of the waters beyond them. What’s more, the effects of changing snowfall and water temperature are based on detailed data and computer models of real ice-shelf behaviour.

Who is behind it?

Ice Flows is the brainchild of Anne Le Brocq, a physical geographer at the University of Exeter, UK, who studies ice sheet modelling and the sub-glacial hydrology of ice sheets. She created the game with two game developers, Inhouse Visuals and Questionable Quality, and with support from the British Antarctic Survey and researchers at several other institutions around the UK and Europe. The whole effort was funded by the Natural Environment Research Council.

What’s it like to play?

The developers have crafted a cute and visually appealing game around the flowing-ice concept, and striking the right balance between snowfall and water temperature is pleasingly tricky without ever tipping into frustration. An initial tutorial level is followed by more difficult scenarios based on the Ronne and Filchner ice shelves. In some of these later levels, the complex interplay between the ice and the shape of the bedrock below it means that even a small increase in water temperature will cause your entire ice shelf to break up into icebergs, sending your penguins straight into the jaws of the waiting leopard seals. Such behaviour is physically realistic. According to in-game explanations, when the bedrock slopes downwards inland, it is known as a “reverse slope” and “there is a theory that once an ice sheet starts to retreat down a reverse slope, it is harder to stop the retreat”.

Who is it aimed at?

The game’s visuals and its reward element (when you perform well you accumulate credits that you can use to “buy” fancier types of penguins) seem tailor-made for children in late primary/early secondary school. With a couple of exceptions, most of the levels look and behave very similarly (presumably because there isn’t that much variation in the real locations), so in pure gameplay terms, the scientific realism of Ice Flows is actually something of a hindrance. That said, the game is certainly good enough to use as part of a lesson on climate change, and the “about” page has links to a number of useful resources for teachers. Finally, as Le Brocq points out in the game’s blog, video snippets from the game make excellent visual aids in public lectures about ice science.

Is the universe a sponge?

An image of the universe's large-scale topology. The image is split into two sides and each side is a complex pattern of red, green and blue areas with many holes

Does the large-scale universe look more like meatballs, like Swiss cheese or like a sponge? A meatball universe would be composed of isolated, disconnected regions of high density embedded in a connected low-density background. The Swiss cheese universe would be precisely the opposite: low-density isolated voids embedded in a high-density connected background. A sponge is neither of the above or, if you prefer, a compromise between the two. In a sponge both the low-density and high-density regions are each connected, and ideally both the sponge and its “complement” (the network of holes) are identical in character, at least from a topological point of view.

The differences between these various types of universe – and how we can use ideas from topology to quantify and statistically analyse them – are described in detail in J Richard Gott’s The Cosmic Web: Mysterious Architecture of the Universe. In essence, Gott’s book is the story of how our current understanding of the universe’s large-scale structure emerged over the past 100 years. Note that by “large-scale” Gott means scales somewhere between the entire observable universe (which comprises a sphere with an approximate radius of 15 billion parsecs, or about 45 billion light-years) and a distance of approximately 20 million parsecs. Below that scale the universe is at present very clumpy and, at least for the purposes of this book, uninteresting.

Gott begins his story around 1918, when Harlow Shapley dethroned the Sun from its supposed position at the centre of the Milky Way by mapping the distribution of globular clusters. These compact, spherically shaped, gravitationally bound objects comprise thousands of stars and are predominantly situated in the halo of our galaxy, above its dust-obscured disc (part of which we see in the night sky as the Milky Way). Shapley observed that the density of globular clusters depends not only on the angle with respect to the disc; it is also higher in the direction of the galactic centre, which lies roughly in the constellation Sagittarius. Shapley’s analysis showed that if one regards the Milky Way as a sort of saucer, we live somewhere near the edge.

From there, Gott’s story follows two parallel but intertwined threads. The first concerns progress on the observational front. Over the past century or so, we have seen ever-bigger telescopes coming online, and in parallel instrumentalists have developed ever-better ways of capturing and processing images. Once CCD images replaced photographic plates analysed by eye, the linearity and reproducibility of CCD measurement made it possible to carry out accurate photometry – a prerequisite for constructing precision quantitative surveys.

Thanks to these advances, catalogues of galaxies increased both in size and in depth, in a manner much resembling Moore’s law, and the frontiers of the universe were progressively pushed back.

Indeed, one could describe these developments as a march outward in distance and backward in time, and ultimately to the limits of our “light cone”. Our past light cone consists of those events (characterized by a position and time) from which a signal can travel to us today along some path without the velocity ever exceeding the velocity of light. Today the terra incognita of the universe has all been mapped out, at least at some rudimentary resolution. We cannot look back farther, because doing so would require signals to travel acausally, or faster than the speed of light – which, as Einstein taught us, is not possible. There remain, however, some gaps to be filled in at higher resolution, such as the epoch of the formation of the first stars and quasars. Although some basic clues exist concerning this epoch, a precise mapping must await more powerful infrared telescopes, and in particular the Square Kilometre Array and its precursors.

The other thread of this story concerns the theoretical front. Improved maps duly inspired a wide range of theories that could explain the origin of the structures being mapped out. Since the quality of the data improved only slowly, many of these theories – often with little in common – remained tenable for significant periods of time, and deciding which was correct (or at least still viable) had to await better data. In the meantime, the various possibilities gave rise to lively debates.

Gott’s book combines necessary background material presented in a clear and pedagogical manner with a personal narrative emphasizing those aspects of the subject to which he and his collaborators have made original contributions. Many of the anecdotes make fun reading, emphasizing the human aspect of the endeavour. Gott explains how as a high-school student, he was fascinated by the question of how to fill space using regular polyhedra. A project on this topic won him second place in the prestigious Westinghouse Science Talent Search, and this same idea was the subject of his first published paper, in the American Mathematical Monthly. More recently, this old interest inspired work on a quantitative measurement known as the “genus statistic”, which can be used to analyse the topology of large-scale structures (such as the meatball, Swiss cheese and sponge universes previously described) and thus decide between competing notions of structure formation.

To appreciate Gott’s work on the genus statistic, some background is required. We start with the fact that the large-scale structure of the universe is often explained as arising from an initial state in which matter was distributed uniformly with a certain average density ρ, with low-amplitude density irregularities δρ(x) superimposed. In this model, the total density field is given by ρ(x) = ρ + δρ(x). Mathematically, the simplest way to set up a stochastic model for the primordial cosmological perturbations is by means of a Gaussian stochastic process. The density perturbation field δρ(x) is expanded into Fourier modes and the Fourier coefficient for each mode is chosen randomly and independently according to a Gaussian probability distribution whose variance depends only on the wavelength of the mode. This variance as a function of wavelength is known as the power spectrum of the Gaussian stochastic process and suffices to characterize the process completely.
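
For readers who like to see the recipe spelled out, here is a minimal numerical sketch (my own illustration, not taken from the book) of how such a Gaussian random field can be generated: draw independent Gaussian Fourier coefficients and weight them by the square root of an assumed power spectrum P(k).

```python
import numpy as np

# Minimal sketch (not from Gott's book): generate a 3D Gaussian random density
# perturbation field delta_rho(x) whose Fourier coefficients are independent
# Gaussian variables with variance set by an assumed power spectrum P(k).

n = 64                                     # grid points per side
k = np.fft.fftfreq(n)                      # frequencies along one axis
kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
kmag = np.sqrt(kx**2 + ky**2 + kz**2)
kmag[0, 0, 0] = np.inf                     # avoid dividing by zero at the k = 0 mode

power = kmag**-3.0                         # toy power spectrum P(k) ~ k^-3 (assumed)

rng = np.random.default_rng(42)
white = rng.standard_normal((n, n, n))     # white Gaussian noise in real space
delta_k = np.fft.fftn(white) * np.sqrt(power)   # "colour" it by sqrt(P(k))
delta_rho = np.real(np.fft.ifftn(delta_k))      # taking the real part keeps the field real

print(delta_rho.std())   # amplitude is set by the (arbitrary) normalization of P(k)
```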

Because of the importance of Gaussian stochastic processes, most studies of large-scale structure focus on measuring the primordial power spectrum. This would be the end of the story if the primordial perturbations were indeed Gaussian. But Gaussianity should be tested by observation rather than decreed by fiat, and doing so requires new statistics beyond the power spectrum, because meatball-like or Swiss-cheese-like stochastic processes can be constructed with identical power spectra but appearances that are radically different to the eye. This is where the genus statistic comes in. It starts with a 3D catalogue of galaxies. The positions of these galaxies are used to create a smoothed galaxy density field ρ_galaxy(x, y, z), which is a sort of blurry approximation to the underlying density field ρ(x, y, z). Then the topology of the surfaces where ρ_galaxy(x, y, z) is constant is analysed; for example, the median density ρ_median (defined as the density at which half the volume is more dense and the other half less dense) defines a 2D surface ρ_galaxy(x, y, z) = ρ_median. In a Gaussian theory this surface has the topology of a sponge, as shown in the figure above, which was produced using data from the Sloan Digital Sky Survey.
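
As a rough illustration of the idea in practice (my own sketch, not Gott’s actual pipeline), one can extract the median-density isosurface of a smoothed field and read its topology off the Euler characteristic of the triangulated surface.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage import measure

# Rough sketch of the genus-statistic idea (not Gott's actual pipeline):
# smooth a field, take the isodensity surface at the median density, and
# characterize its topology via the Euler characteristic of the mesh.

rng = np.random.default_rng(0)
field = gaussian_filter(rng.standard_normal((64, 64, 64)), sigma=3)
# 'field' stands in for the smoothed galaxy density field rho_galaxy(x, y, z)

rho_median = np.median(field)
verts, faces, _, _ = measure.marching_cubes(field, level=rho_median)

# Euler characteristic chi = V - E + F of the triangulated surface; for a
# closed orientable surface the genus is g = (2 - chi) / 2. (Here the surface
# is clipped by the box edges, so treat the number as purely illustrative.)
edges = set()
for a, b, c in faces:
    edges.update({tuple(sorted(p)) for p in [(a, b), (b, c), (a, c)]})
chi = len(verts) - len(edges) + len(faces)
print("Euler characteristic:", chi, "  genus:", (2 - chi) / 2)
```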

Today, the received wisdom is that the initial perturbations in the early universe were very nearly Gaussian random fields. The simplest inflationary models predict this, and recent extremely sensitive probes for primordial non-Gaussianity (particularly those using cosmic microwave background data) have failed to turn up any statistically significant evidence favouring a non-Gaussian stochastic process for the origin of the initial seeds of large-scale structure. But not so long ago the theoretical field was wide open, and models predicting different types of topology were able to explain the available data at the time. A Swiss-cheese universe, for example, might be explained as resulting from explosions, which would clear matter out of the voids.

This story, and the confusion it created within the cosmology community, is beautifully explained in Gott’s book. Rather than offering a sort of “Cosmology 101” pseudo-history, in which observations are cherry-picked to suggest a linear path from past ignorance to current wisdom, Gott provides a complement to that more conventional account, artfully recounting the excitement, debates and false directions that led to our current “best bet” theoretical description of the universe.

  • 2016 Princeton University Press £22.95/$29.95hb 272pp

Flash Physics: Huge Martian ice deposit, Trump sets back climate science, ultra-transparent metamaterial

Martian ice deposit holds as much water as Lake Superior

An underground deposit of water ice on Mars covering an area larger than Poland and containing more water than Lake Superior – the largest of the North American Great Lakes – has been discovered by Cassie Stuurman of the University of Texas at Austin and colleagues. The deposit was found in a mid-latitude region called Utopia Planitia using SHARAD – a ground-penetrating radar instrument aboard NASA’s Mars Reconnaissance Orbiter. Its composition is 50–85% frozen water, mixed with dust and larger rocky particles. The ice is buried about 1–10 m under a layer of soil, which the researchers believe stops the water from subliming into the atmosphere. “This deposit probably formed as snowfall accumulating into an ice sheet mixed with dust during a period in Mars history when the planet’s axis was more tilted than it is today,” explains Stuurman. The study is described in Geophysical Research Letters and was inspired by work done by Gordon Osinski of the University of Western Ontario, who noticed that patterns on the surface of Utopia Planitia are similar to those seen in regions of ground ice in the Canadian Arctic. Water close to the surface of Mars could be used as a resource for future human colonization of the planet. “Sampling and using this ice with a future mission could help keep astronauts alive, while also helping them unlock the secrets of Martian ice ages,” says Joe Levy of the University of Texas.

Trump victory sets the clock back, say climate scientists

“Devastating”, “embarrassed”, “worried” and “set the clock back” were just some of the words used by environmental scientists when asked for their views on the election of climate-change sceptic Donald Trump as US president – according to an article in environmentalresearchweb called “The climate after Trump”. In January, Trump will become the only leader of a major industrialized country to deny the existence of human-caused climate change, and he looks set to renege on the Paris climate agreement that came into force on 4 November. “It took the US two decades to go from climate obstructionist to climate leader, and one ugly season to throw it away,” lamented Dan Kammen of the University of California, Berkeley. Other researchers believe that the scientific community was naive to think that better information and more knowledge are enough to convince wider society of the importance of tackling environmental degradation. “We must leave our castles in the sky; we must get out and listen deeply and with empathy, we must stop preaching to the converted,” says Hallie Eakin of Arizona State University.

Metamaterial is transparent at all angles of incidence

Image of microwaves travelling through an ultra-transparent metamaterial

An “ultra-transparent” medium that transmits electromagnetic radiation incident from all angles has been unveiled by physicists at Soochow University in China and the Hong Kong University of Science and Technology. The new metamaterial comprises a regular array of alumina bars. These are arranged so that the refractive index within the array is identical to that of free space, regardless of the incident angle of the radiation. This is unlike materials such as glass, whose refractive index is greater than that of air. The mismatch means that some incident light is reflected at the surface of the glass and that rays are bent when they travel between the two media. Both effects can distort light moving through an optical system, causing, for example, blurred images. The metamaterial created by Zhi Hong Hang, Yun Lai and colleagues works at microwave frequencies, but writing in Physical Review Letters, the team says that it should be possible to create ultra-transparent metamaterials that work for visible light.
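
To put a number on the reflection losses described above – this is textbook optics rather than anything from the paper – the Fresnel equations show how much light an ordinary air–glass interface reflects.

```python
import numpy as np

# General-optics illustration (not from the paper): Fresnel reflectance at an
# air-glass interface shows why an index mismatch always reflects some light.
n1, n2 = 1.0, 1.5          # refractive indices of air and typical glass

def reflectance(theta_i_deg):
    """Unpolarized reflectance for light incident at theta_i (degrees)."""
    ti = np.radians(theta_i_deg)
    tt = np.arcsin(np.clip(n1 * np.sin(ti) / n2, -1, 1))   # Snell's law
    rs = ((n1*np.cos(ti) - n2*np.cos(tt)) / (n1*np.cos(ti) + n2*np.cos(tt)))**2
    rp = ((n1*np.cos(tt) - n2*np.cos(ti)) / (n1*np.cos(tt) + n2*np.cos(ti)))**2
    return 0.5 * (rs + rp)

print(reflectance(0))    # ~0.04: about 4% of the light is reflected at normal incidence
print(reflectance(60))   # reflection grows at oblique angles (~9% here)
```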


  • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics.

Gravity measured using a Bose–Einstein condensate on a chip

A new sensor that measures the local acceleration due to gravity using a Bose–Einstein condensate (BEC) of ultracold atoms has been made by physicists in Germany, the US and Canada. While the prototype device is not as accurate as commercial gravimeters, its makers say it could be made much smaller and much more accurate than existing devices.

Atoms can be used to measure the acceleration due to gravity by cooling a gas of them to near absolute zero and then dropping them along two different paths in an interferometer. The quantum interference that occurs when the paths converge at a detector provides a very good measure of gravity, with commercial atom interferometers able to measure the acceleration to within one part in 10⁸. Such measurements are invaluable for geological exploration because the presence of certain minerals can be spotted by seeking tiny variations in gravity at the Earth’s surface.
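
The underlying relation is the standard one for light-pulse atom interferometers (textbook physics, not a detail taken from this particular experiment): the phase difference between the two paths is Δφ = k_eff g T², so measuring the phase yields g. A minimal sketch with purely illustrative numbers:

```python
import numpy as np

# Textbook Mach-Zehnder light-pulse atom interferometer relation (not specific
# to this device): delta_phi = k_eff * g * T**2, where k_eff is the effective
# wavevector of the interferometer beams and T the time between pulses.

wavelength = 780e-9                      # m, rubidium D2 line (typical choice)
k_eff = 2 * (2 * np.pi / wavelength)     # two-photon process: k_eff ~ 2k
T = 10e-3                                # s, pulse separation (illustrative)

g_true = 9.812                           # m/s^2, value we pretend to measure
delta_phi = k_eff * g_true * T**2        # phase the interferometer would read

g_inferred = delta_phi / (k_eff * T**2)
print(f"phase = {delta_phi:.0f} rad, inferred g = {g_inferred:.4f} m/s^2")

# A 1 mrad phase resolution then corresponds to a fractional uncertainty in g of
print(f"dg/g for 1 mrad resolution: {1e-3 / delta_phi:.1e}")   # ~6e-8, of order one part in 10^8
```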

While these ultracold atom gravimeters are on a par with conventional absolute gravimeters based on macroscopic falling masses, their accuracy could be improved a lot by using a BEC. In a conventional atomic gravimeter, the ultracold atoms form a diffuse gas roughly a millimetre in size and a major cause of uncertainty is that the laser pulses used to control the atoms are not spatially uniform on that length scale. A BEC – formed by cooling a gas of atoms with integer spin until they condense into a single quantum state – reduces this uncertainty because it squeezes the atoms into a region that is about 100 times smaller.

Tiny vacuum chamber

Created by Ernst Rasel of Leibniz University of Hannover and colleagues, the BEC gravimeter has at its heart a chip a few centimetres in size that contains a tiny vacuum chamber. Lasers and magnetic fields trap about 15,000 rubidium atoms at the top of the chip, where they are cooled to temperatures of a few nanokelvin to create a BEC.

The BEC is then allowed to fall about 1 cm through the chamber and, as it does so, a series of laser pulses is fired at the BEC, deflecting the atoms into different paths to create an interferometer. It takes about 10 ms for the atoms to complete their free fall, but the sensitivity can be improved if this time is extended using a laser pulse to bounce the atoms back up when they reach the bottom of the chip. The atoms are then allowed to fall back down again, increasing the free-fall time by a factor of five.

Rasel and colleagues were able to measure the acceleration of the atoms with an accuracy of about one part in 10⁷ – at least an order of magnitude worse than commercial gravimeters. Writing in Physical Review Letters, the researchers point out that their gravimeter was operated in a “rough environment without access to any vibrational shielding”, which caused much of the measurement uncertainty. By reducing vibrations and making other improvements, Rasel and colleagues reckon the gravimeter could operate with an uncertainty of less than one part in 10⁹ and fit inside a backpack.

What is spintronics?

The word “spintronics” is a portmanteau of “spin” and “electronics”. That’s because this emerging field of applied physics offers an alternative to conventional electronics – using an electron’s spin instead of its charge to carry information. In this short video, Christoph Boehme – a condensed-matter researcher at the University of Utah in the US – introduces spintronics to the uninitiated. Boehme takes it back to basics by explaining what physicists mean by an electron’s spin and how this can be exploited to encode information within circuitry. He also introduces some of the possible applications and beneficiaries of spintronics, such as websites dealing with big data sets.

This video is part of our 100 Second Science series, in which researchers give concise presentations covering the spectrum of physics.

Flash Physics: Graphene-based loudspeaker, new schedule for ITER, India joins CERN

“Consumer-ready” loudspeaker is made from graphene oxide

The first “consumer-ready” loudspeaker made from graphene oxide has been unveiled by the Canada-based company ORA. Researchers at the start-up company made their loudspeaker from a graphene-oxide-based composite material dubbed GrapheneQ. The material has properties similar to pristine graphene, which is a sheet of carbon just one atom thick. GrapheneQ is made by reducing graphene oxide and then adding a proprietary blend of cross-linkers to make the composite. The material is designed to have a low “Q resonance”, which means that it requires less damping (especially at low frequencies) than commercial devices to prevent unwanted frequency responses. This means that the device can operate at just a few nanoamps and so uses much less power than conventional speakers – a real advantage if it were to be employed in portable devices. Less damping at lower frequencies also means that the bass and treble response are both extended, which drastically improves the fidelity of the music being reproduced. “GrapheneQ is also very inexpensive to produce (membranes cost as little as $0.02 in raw materials), is easily shaped into 3D forms, and is scalable,” explains ORA co-founder Xavier Cauchy. A longer version of this article first appeared on nanotechweb.org.

ITER council endorses new “baseline” schedule

The ITER Council has approved an updated schedule for the huge experimental fusion facility that is currently being built in Cadarache, France. At a meeting held on 16–17 November, the council approved the plan that was proposed by the ITER organization earlier this year, with first plasma set for 2025 – a delay of five years – and ITER only moving on to deuterium–tritium fuel in 2035. ITER is a collaboration between China, the EU, India, Japan, Russia, South Korea and the US that aims to demonstrate that nuclear fusion can generate useful energy. It will involve a giant doughnut-shaped chamber, known as a tokamak, which will use strong magnetic fields to contain a heated plasma of deuterium and tritium at a temperature of tens of millions of degrees so that atomic nuclei collide and fuse. In theory, the reactor will produce 10 times the power it takes to heat it. The slip in the schedule was initially announced in June, after French nuclear physicist Bernard Bigot, former head of France’s Alternative Energies and Atomic Energy Commission, was brought in as ITER director general in 2015 to shake up the organization and draft a credible schedule. At its November meeting, the ITER Council reported that all 19 project milestones for this year had been completed on time and on budget.

India will become associate member state of CERN

Photograph of dignitaries signing the CERN-India agreement

An agreement that will see India join CERN as an associate member state has been signed by CERN director general Fabiola Gianotti and Sekhar Basu, who is secretary of the Department of Atomic Energy (DAE) of the Indian government. Associate membership will allow India to take part in meetings of the CERN Council and Indian physicists will be eligible for staff appointments at the lab. Indian companies will also be able to bid on CERN contracts. Indian physicists have been involved with the Geneva-based particle-physics lab for more than 50 years, and in 1996 the Indian Atomic Energy Commission contributed to the construction of the Large Hadron Collider (LHC) as well as the CMS and ALICE experiments. The country also hosts Tier 2 centres of the LHC Computing Grid in Mumbai and Kolkata. India was granted observer status at CERN in 2002 and the DAE has since been involved with a number of facilities and experiments at CERN including the ISOLDE radioactive ion-beam facility. “Becoming associate member of CERN will enhance participation of young scientists and engineers in various CERN projects and bring back knowledge for deployment in the domestic programmes,” says Basu. India’s membership will begin once the agreement gains final approval from the Indian government.


  • You can find all our daily Flash Physics posts in the website’s news section, as well as on Twitter and Facebook using #FlashPhysics. Tune in to physicsworld.com later today to read today’s extensive news story on an atomic gravimeter on a chip.

China forges ahead in global research

China is performing “outstanding” research in a number of emerging scientific topics, putting the country’s output on a par with the UK but still behind the US. That is the conclusion of a new study by the Chinese Academy of Sciences (CAS) and the scientific data company Clarivate Analytics. The Research Fronts 2016 annual report identifies 100 “hot” and 80 “emerging” research areas based on citation analysis of papers published in 2015.

The research areas – divided into various fields of science – reflect global interest in specific topics that have resulted in “core” journal articles. These articles are defined by an algorithm that takes into account, among other things, the time of publication and how frequently an article is cited by other papers in the same area. In physics, for instance, the hottest research pursuits last year included the detection of dark matter and experiments that measure neutrino oscillations. Research into the properties and applications of black phosphorus – a 2D material also called phosphorene because of its similarity to graphene – was identified too, as was the study of topological materials called Weyl semimetals.

China has a significant gap with the US, and fierce competition with the UK
Research Fronts 2016

Six countries – China, France, Germany, Japan, the UK and the US – made the greatest contributions in the 180 research areas, according to the report. The US retained its leadership, with its researchers publishing core papers in 152 of the 180 areas, ranging from the hunt for dark matter to the health impact of electronic cigarettes. The UK, meanwhile, contributed core papers in 90 research topics, covering more areas than China’s 68. However, China had top-cited papers among the core papers in 30 research areas – more than twice the UK’s tally. “China has a significant gap with the US, and fierce competition with the UK,” the report says, adding that it was likely China would soon overtake the UK.

Spending growth

One factor in China’s success is that it invested more than $400bn in R&D last year, second only to the US. In 2015 more than 2% of China’s gross domestic product (GDP), adjusted for purchasing power, was spent on research. The rapid increase in government investment has spurred the construction of many large facilities, including the world’s largest single-aperture telescope, the longest quantum-communication network and the first quantum satellite. CAS president Chunli Bai says that the Chinese government is expected to support many of the identified areas over the next five to 10 years. “These breakthroughs may change the future pattern of the world,” says Bai.
