
The funny and sinister sides of machine learning, how to make nanoparticles by the tonne

Would you use a pick-up line composed by a machine-learning algorithm? Perhaps something like You Look Like a Thing and I Love You, which is a machine’s attempt at an amorous icebreaker and the title of a book by Janelle Shane.

Shane is an optical engineer with a blog that chronicles the absurdities that are sometimes generated by artificial intelligence (AI) systems – and is interviewed in this episode by Physics World’s Margaret Harris. While pick-up lines and recipes generated by AI can be hilarious, Shane cautions that there is also a sinister side to AI because it can become very good at mimicking human prejudices.

Have you ever wondered how large quantities of nanoparticles can be manufactured to very high precision? Ed Lester, founder of UK-based Promethean Particles, explains how the firm’s continuous-flow hydrothermal synthesis process does the job using a much more sophisticated version of a method familiar to anyone who grew a “crystal garden” in science class.

Lester spun the company out of the University of Nottingham, where he developed the technology. He looks back at the challenges of developing and commercializing the process and talks about some exciting new nanoparticle applications such as anti-icing coatings for aeroplanes and artificial bone for medical applications.

Artificial spider web gets an ionic boost

An artificial material with the same elastic, adhesive, self-cleaning, sensing and tensile properties as natural spider silk has been created by researchers in South Korea. The synthetic web, which is made from a semi-solid stretchy gel, works using electrostatics and might have applications in artificial muscles, grippers and self-cleaning wall-climbing devices.

Spider silk has a tensile strength five times higher than that of steel, and its stretchable threads boast an adhesive coating that enables spiders to capture and trap prey in their webs. This adhesive coating has a downside, however: it attracts contaminants from the environment, causing the webs' capturing efficiency to deteriorate.

To overcome this problem, spiders have evolved several strategies for reducing and eliminating dirt and debris from their webs. One such strategy is to build a minimalist web structure and then wait for prey to fly or crawl into it. As its prey struggles with the sticky threads, the spider senses the web vibrating, springs into action and wraps its as-yet weakly-bound prey with additional threads. This traps the prey once and for all, without the need for a high-surface-area web to which contaminants could more easily adhere. Another strategy is for the spiders to pull on their webs and then rapidly release them, causing contaminants to bounce off in a catapult-like fashion as the webs vibrate.

Mimicking spider actuation and sensing

Scientists have long been fascinated by spider biology, and many have attempted to mimic either the spiders’ actuation and sensing or the structural, self-cleaning properties of their silk. Reproducing the behaviour of spider webs in the laboratory is no simple task, however, and previous attempts to do so have mainly focused on recreating the way in which a spider spins its natural web.

A team led by Jeong-Yun Sun and Ho-Young Kim of Seoul National University has now taken a different approach. Their method involves applying static electricity to fibre-like strands of an ionically conducting and stretchable organogel, which is a semi-solid material made of gelling molecules in an organic solvent (in this case, covalently cross-linked polyacrylamide chains in ethylene glycol with dissolved lithium chloride). The strands of this organogel are then encapsulated with silicone rubber and coated with a hydrophobic perfluorinated compound to reduce their surface energy (and thus surface tension).

The researchers, who report their work in Science Robotics, wove these composite fibres into structures that resemble natural spider webs. They found they could make the web strands electro-adhere to target objects made from metals, ceramics and polymers, thanks to the static electric field that arises between adjacent web strands when a high voltage is applied.
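For a rough sense of scale (this is an illustrative estimate, not a calculation from the paper), the electro-adhesive pull on a target surface separated from an energized strand by a thin dielectric layer can be gauged from the Maxwell stress; the relative permittivity and field strength below are generic placeholders rather than values from the study.

```latex
% Illustrative estimate only: electrostatic (Maxwell-stress) adhesion pressure
% across a thin dielectric layer of relative permittivity \varepsilon_r
% in an applied field of magnitude E.
P \;\approx\; \tfrac{1}{2}\,\varepsilon_0\,\varepsilon_r\,E^{2}
```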

Vibrations help eliminate contaminants

Sun, Kim and colleagues found that these target objects generate an electrical response in the ionically conducting fibres as soon as the fibres touch them, initiating the adhesion process via electrostatics. They also discovered that they could make the fibres vibrate by applying an electric field between strand pairs. These vibrations, combined with the fibres' hydrophobic coating, help to eliminate contaminants that would otherwise reduce adhesion forces. Indeed, after self-cleaning, the artificial webs recovered nearly 99% of their original adhesive force.

In a related article, Jonathan Rossiter, a roboticist at the University of Bristol, UK, and the head of the Soft Robotics group at the Bristol Robotics Laboratory, notes that flexible and stretchable ionic conducting materials could be used as substitutes for conventional conductors in future electroactive artificial muscles. Rossiter, who was not involved in the Seoul team’s work, suggests that such structures could come in handy when developing soft robotic wearable assistive devices for older people or people with disabilities.

The study's lead author, Younghoon Lee, agrees, adding that the team's approach could also apply to "existing robotics components based on electrostatics, such as electrostatic grippers, dielectric elastomer actuators and capacitive tactile sensors". In addition, the materials' ability to self-clean could be exploited in robots that use electro-adhesion to climb walls, which are currently limited to clean, dust-free surfaces.

As a suggestion for future work, Rossiter notes that it would be “extremely interesting to develop the spider analogy further, potentially seeing how spiders could work with the ionic spider webs in their natural environments, providing new insights into biology.” For their part, the Seoul team aim to improve the robustness of their web fibres and may also seek to adapt them so that they can detect non-charged objects, too.

End-to-end motion QA in radiation therapy treatment planning


With the advancement of treatment techniques designed to escalate delivered tumour doses, there is an increasing need for quality-assurance tools in radiation therapy treatment planning. Medical physicists are looking to improve treatment delivery by optimizing current 4D IGRT protocols and exploring rapidly advancing adaptive techniques.

Modus Medical Devices specializes in end-to-end QA solutions for 4D IGRT. This presentation will provide an overview of our CT and MR-safe motion phantoms with an emphasis on how they will meet your current and future QA needs.

The webinar, presented by Rocco Flores, will discuss:

  • 4D IGRT: current and upcoming methods of mitigating patient motion.
  • A review of Modus QA solutions for CT and MR end-to-end motion management.


Rocco Flores is an MRT(T) clinical application specialist. He has worked at several clinics over 20+ years as a medical radiation therapist with a focus on treatment delivery, planning and research. Since joining Modus QA in 2019, he has been primarily involved in customer application support and new product development.


Sidestepping the side effects of neurostimulation

Stimulation of the vagus nerve by implanted electrodes is used to treat a range of conditions including epilepsy, depression and heart failure. Such stimulation can also inadvertently activate muscles in the throat, however, leading to treatment-limiting side effects such as pain, difficulty swallowing and shortness of breath. By measuring nerve and muscle activity in pigs undergoing vagus nerve stimulation (VNS), researchers in the US have identified the mechanism by which the technique causes these side effects. This new understanding should help clinicians administer VNS at a high enough intensity to achieve therapeutic benefits while keeping below the threshold at which intolerable side effects are triggered.

Despite the effectiveness of VNS at treating some conditions, many details of the process remain unclear. One of the open questions has to do with the technique’s side effects, which recent studies have shown are more severe than researchers would expect. This can be a problem when those side effects become intolerable at a stimulation level below that at which the technique starts to yield benefits.

Part of the reason the side effects of VNS remain mysterious is the relative paucity of human data obtained in the two decades since the technique was first demonstrated.

Evan Nicolai

“Many of the studies that were used to support the application of VNS in humans were performed in animal models that do not represent human physiology very well,” says Evan Nicolai, a graduate student at the University of Wisconsin-Madison and the Mayo Clinic.

Writing in the Journal of Neural Engineering, Nicolai and collaborators report how they sought to fill this gap with a series of experiments on pigs, which represent a much more human-like animal model.

The team wrapped either the right or left vagus nerve of 12 anaesthetized animals with a helical electrode cuff of the type used to administer VNS clinically. They also implanted electrodes to obtain electroneurograms (ENGs) from the vagus nerve and electromyograms (EMGs) from the cricoarytenoid and cricothyroid muscles. These muscles, which govern the motion of the larynx, are controlled by impulses from two bundles of nerve fibres that branch off from the vagus trunk: the recurrent laryngeal branch (RLB) activates the cricoarytenoid muscle; the superior laryngeal branch (SLB) activates both the cricothyroid and cricoarytenoid muscles.

Nicolai and his colleagues applied a stimulation pulse via the electrode cuff, and measured the timing of the resulting ENG and EMG signals recorded in the vagus nerve and the cricoarytenoid and cricothyroid muscles. A long delay between the pulse and a muscle's response would indicate that the muscle was activated by a signal following a circuitous route, passing first along the vagus trunk and then into the motor nerve fibres that branch off. A prompt response, on the other hand, would indicate that the muscle was activated directly by leakage current taking a shortcut from the nearby electrode cuff.

The researchers found that low levels of stimulation (around 0.3 mA) activated the cricoarytenoid muscle with a relatively long latency of 6–10 ms. Pulses of this intensity are well below the tolerable threshold for humans, indicating that signals induced in the RLB are not the cause of the treatment-limiting side effects. Higher stimulation levels (around 1.4 mA) provoked a response in the cricothyroid muscle with a latency of between 4 and 6 ms. This pulse strength corresponds approximately to the tolerable limit for clinical VNS. The researchers conclude, therefore, that the most severe side effects can be attributed to stimulation pulses that are strong enough to activate the nearby SLB directly via leakage currents from the electrode cuff.
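To summarize the interpretation above, here is a minimal sketch in Python of the latency-based reasoning. The latency windows and current levels are those reported in the study, but the function and its labels are purely illustrative and are not the authors' analysis code.

```python
def likely_pathway(stim_current_ma, latency_ms):
    """Attribute a laryngeal EMG response to a likely activation pathway."""
    if latency_ms >= 6:
        # Long latency: the signal travelled along the vagus trunk and back
        # out through the recurrent laryngeal branch (RLB) to the muscle;
        # seen from roughly 0.3 mA upward, well below the tolerable limit.
        return "indirect, via RLB nerve conduction"
    if latency_ms >= 4 and stim_current_ma >= 1.4:
        # Short latency at roughly the clinical tolerability limit: direct
        # activation of the nearby superior laryngeal branch (SLB) by
        # current leaking from the electrode cuff.
        return "direct, via leakage current to the SLB"
    return "unclassified"

print(likely_pathway(0.3, 8))  # cricoarytenoid -> indirect, via RLB nerve conduction
print(likely_pathway(1.4, 5))  # cricothyroid  -> direct, via leakage current to the SLB
```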

The effect of this leakage-induced activation of the SLB could be minimized by using cuff designs with thicker insulation, or by rerouting the nearby nerve branches during surgery. But the team also has plans for avoiding muscle activation via the fibres that take the long way from the stimulation electrode to the RLB.

“One member of our team, Megan Settell, is leading a project to understand the distribution and orientation of the nerve fibres that make up the vagus, to enable targeting of different fibres before a patient has surgery for VNS,” says Nicolai. “In parallel, our collaborators Warren Grill and Nicole Pelot seek to predict activation of certain nerve fibres in the vagus using computational modelling.”

First ‘open flavour’ tetraquark is spotted by LHCb at CERN

The first tetraquark composed of four quarks of different flavours has been discovered by physicists working on the LHCb experiment at CERN. Dubbed X(2900), the "open flavour" tetraquark has a mass of about 2.9 GeV/c² and has been spotted in two spin states. The tetraquark was made by smashing protons together at the Large Hadron Collider (LHC) to produce B mesons – and then searching the B-meson decay products for signs of new particles.

While LHCb physicists are not completely certain about the nature of the particle – hence the “X” in the name – they believe it contains four quarks: anticharm, antistrange, up and down. Because the tetraquark does not contain a quark–antiquark pair of the same flavour, no quark flavours are hidden and therefore the tetraquark is described as open flavour.

Hadrons are made of two or more bound quarks or antiquarks. Mesons comprise a quark and antiquark, whereas baryons such as protons and neutrons comprise three quarks. However, nature does not stop at three quarks and several tetraquarks (two quarks and two antiquarks) and pentaquarks (four quarks and an antiquark) have been discovered.

No predictions

Before the discovery of X(2900), all known tetraquarks contained at least one charm–anticharm or beauty–antibeauty quark pair. According to LHCb physicist Tim Gershon of the University of Warwick, an open-charm/open-strange tetraquark such as X(2900) had been considered possible. However, no-one had made predictions of this particular hadron, so the discovery took the LHCb team by surprise.

Because it only contains one heavy quark (the anticharm), X(2900) has a relatively low mass compared to other tetraquarks. This, says Gershon, makes it easier to produce at the LHC than its heavier cousins. However, he points out that tetraquarks containing charm/anticharm pairs are relatively easy to observe because they decay to a final state that contains the J/psi meson – which itself has a very clean experimental signature.

“Open flavour tetraquarks will tend to have less clean experimental signatures, and it is only due to the unique design, and incredible performance, of the LHCb detector that we are able to make this latest discovery,” explains Gershon.

Mesonic molecules?

Gershon is hopeful that X(2900) could shed light on an important mystery of tetraquarks – how the four quarks arrange themselves internally. This is defined by strong-force interactions between quarks, which are extremely difficult to calculate. One possibility is that all the quarks and antiquarks are tightly bound together. Another is that they are arranged as two quark–antiquark pairs loosely bound in a structure that resembles a molecule made from two mesons.

Gershon points out that the mass of X(2900) is similar to the sum of the masses of an excited D meson and an excited kaon – which together have the same quark content as X(2900). This could point towards the molecular model, but Gershon says it is too early to draw any conclusions.
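To make the comparison concrete, here is an illustrative back-of-the-envelope check (not a calculation from the LHCb analysis): one meson pair with the same quark content as X(2900) is an excited D meson plus an excited kaon, and their Particle Data Group masses do indeed sum to roughly the observed value.

```latex
% Illustrative mass sum for \bar{D}^{*}(2007)^{0}\,(\bar{c}u) plus K^{*}(892)^{0}\,(d\bar{s}):
m_{\bar{D}^{*0}} + m_{K^{*0}}
  \;\approx\; 2.007\ \mathrm{GeV}/c^{2} + 0.896\ \mathrm{GeV}/c^{2}
  \;\approx\; 2.9\ \mathrm{GeV}/c^{2}
  \;\approx\; m_{X(2900)}
```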

Just last month, physicists working on the LHCb experiment discovered the first tetraquark that comprises all charm and anticharm quarks. As for more tetraquark discoveries, Gershon says, “There are many possible avenues of exploration, and I am certain that there is more to be found in LHCb’s treasure trove of data!  The problem is that it is hard to know which directions are more likely to yield discoveries, so I cannot say how long we will need to wait.”

Light waves steer electron beams thanks to whispering gallery effect


Visitors to St Paul's Cathedral in London are often astonished by its whispering gallery, where words spoken in a whisper along the curving wall of the dome can be heard anywhere along that same wall – including on the opposite side of the circular walkway, some 33 metres away. This "whispering gallery effect" arises because sound waves travel around the dome's circumference with very little loss, and variations of it occur wherever waves can travel nearly perfectly around a structure.

Researchers at the University of Göttingen in Germany have now exploited the whispering gallery effect to control the beam of an electron microscope using light. As well as being of fundamental importance, they say that the work might lead to novel technologies for nanoscale sensing and microscopy.

Light wave confined to surface of a sphere

In their work, researchers led by Ofer Kfir and Claus Ropers illuminated small glass spheres with laser light that was trapped in an optical whispering-gallery mode, meaning that it was confined to the surface of the sphere by total internal reflection. Analogous to the acoustic example, the light wave travels around the sphere’s perimeter almost without damping.

The researchers then passed a beam of electrons in a transmission electron microscope near the edge of a sphere. When they measured the distribution of electron velocities using the microscope, they discovered that the electrons had exchanged large amounts of energy with the electric field of the laser light – roughly 10 times more energy, in fact, than any such exchange previously measured in an electron microscope.

The strength of the electron-light interaction comes from two contributions, Kfir explains. First, the whispering-gallery optical modes make it possible to store light and so build up a stronger wave. Second, the velocity of the light wave travelling around the glass sphere matches that of the electrons, which move at about 70% of the speed of light. This matching changes the quantum state of the electron.
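For context, 70% of the speed of light corresponds to a beam energy of roughly 200 keV, a typical operating voltage for transmission electron microscopes; the 200 keV figure is used here for illustration rather than quoted from the paper.

```latex
% Relativistic check for an electron of kinetic energy E_k = 200\ \mathrm{keV}:
\gamma = 1 + \frac{E_k}{m_e c^{2}} = 1 + \frac{200\ \mathrm{keV}}{511\ \mathrm{keV}} \approx 1.39,
\qquad
\beta = \sqrt{1 - \gamma^{-2}} \approx 0.70
```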

The coupling that occurs between an electron and the light’s electric field can be thought of as the acceleration (or deceleration) of the electron by the light field, he tells Physics World. “In the same way a surfer matches the speed of an ocean wave to best use its energy, an electron ‘riding’ the light wave is therefore either purely accelerated or purely decelerated over a substantial distance,” he says. “Indeed, in our study, we observed that individual electrons had picked up or given away the energy of hundreds of photons.”

Further enhancing electron-photon coupling

As well as being of fundamental interest, the researchers say their findings could bring a greater functionality to electron microscopy – an indispensable characterization tool for research in many scientific disciplines as well as in industry. In Kfir’s view, using light to control the electrons’ quantum state in transmission electron microscopes would open up a range of novel imaging and spectroscopy techniques, making it possible to inspect the optical properties of materials with resolutions down to the atomic scale.

The researchers, who report their work in Nature, are now exploring ways of further enhancing the coupling of electrons and photons. “We can use light to steer a beam of electrons in space and time,” says Ropers. “An increase in the coupling strength would allow a single photon to affect an electron, and so allow us to measure optical excitations on the level of individual nanoscopic quantum systems. Doing so with ultrafast temporal resolution would be simply amazing, and may eventually lead to entirely new quantum technologies for nanoscale sensing and microscopy.”

White papers: Mad City Labs and SAES Group

This time we are featuring white papers from Mad City Labs and SAES Group.

When AFM becomes DIY

Atomic force microscopes (AFMs) are versatile tools for characterizing surfaces down to the subnanometre scale. Researchers wanting to, say, map out the optical antennas they’ve inscribed on a chip, or measure the quantum dots they’ve created, can image objects at resolutions down to the picometre level by scanning an AFM over the surface.

Useful as they are, AFMs aren't cheap – typically costing around $250,000 – which puts them out of reach for many academic scientists, graduate students and researchers at small companies. Fortunately, researchers can build their own AFMs for as little as $30,000 using off-the-shelf components such as nanopositioning stages.

In the Building a Do-it-Yourself Atomic Force Microscope white paper from Mad City Labs, the company's director of product development James MacKay describes how home-made AFMs can be not just cheap but high-performance too. DIY AFMs allow researchers to customize the design for whatever kind of surface testing fits their needs and to tune the performance to suit their requirements.

Cutting carbon contamination

Vacuum engineers and scientists have long known that even if a sample is clean and handled with ultra-high vacuum (UHV) standards, a layer of unwanted carbon will form on the material’s surface after it’s placed in a high-vacuum (HV) or UHV chamber. This also happens for materials in the optics vacuum chambers in particle accelerators and synchrotron X-ray beamlines.

Indeed, X-rays can increase the pressure and yield of carbon contaminants by one to two orders of magnitude, significantly reducing how many X-rays end up downstream at experimental stations. And with next-generation synchrotrons ushering in X-ray brightness increases of two to three orders of magnitude, it is critical to minimize these losses from carbon contamination.

Carbon contamination used to be a show stopper, forcing mirrors to be removed, cleaned and sometimes even replaced. But as staff from SAES Group explain in their new white paper ZAO® Based Non Evaporable Getter Pumps in Optics Vacuum Chambers, the company's NEXTorr and CapaciTorr pumps can significantly reduce the gases that cause carbon residues to form in the first place. SAES says that its pumps are much smaller and less invasive than traditional varieties, reducing particle generation to levels acceptable for synchrotron-radiation studies.

Agilent vacuum technology: trusted answers

Bartly Carlson is a full-time trainer and applications engineer at Agilent Technologies. In this video, Carlson introduces Agilent’s transition from wet to dry vacuum technology – describing the advantages for customers in academia and industry. He introduces Agilent’s range of products and provides practical advice for reaching ultrahigh vacuum (UHV) pressures for a variety of applications.

Visit Agilent’s website to find out more about how their vacuum experts can help you. Or contact them directly at vpl-customercare@agilent.com. For more stories about creative technology solutions, you can also take a look at the Physics World Instruments and Vacuum Briefing.

This video was filmed in early 2020 before the coronavirus lockdowns in the US.


Putting new physics on the syllabus

Peter Higgs at CERN

On 4 July 2012 the CERN particle-physics lab near Geneva announced the discovery of the Higgs boson. "I think we have it…but we are only at the beginning," noted Rolf-Dieter Heuer, the then director-general of CERN, in front of a packed auditorium. When I heard those words, I felt a sense of awe and even got a lump in my throat as I watched Peter Higgs wipe away a tear while his theoretical construct from 50 years ago became reality. Then my smile slowly faded as I realized that my Thursday morning A-level physics students would be full of questions about the new discovery. As a geophysicist, I am entirely self-taught in particle physics, so I knew that I had a challenge ahead of me. I spent the rest of that July evening researching and preparing explanations for the questions I expected.

The next day I answered the students with confidence. I used analogies that were accessible and not too simplistic. I was careful not to obscure the fact that there are still many unknowns yet to discover and my students were genuinely engaged. “Will the Higgs boson be on our exam?” they asked me. I reassured them that it would not be this year, but next year’s students would probably need to know about it. However, I was wrong.

Almost eight years later and I'm still wrong. Each year, my students ask me why the Higgs boson is missing from the syllabus and when it will be added. Today, the three major exam boards in the UK – AQA, Edexcel and OCR – have yet to include the Higgs boson in their A-level physics curricula. Particle physics in education appears to be stuck, which is frustrating because some of the most important recent work in physics has been in this field. Unfortunately, very little of it is making its way into A-level education.

Stuck in a rut

I have worked for a small exam board and gained some insight into the process of writing a new syllabus. This board provided qualifications to a handful of colleges around the UK; I consulted on a rewrite of its curriculum and managed to insist that the existence of the Higgs boson was at least acknowledged. I also amended the definition of the neutrino to include the fact that it has mass – a discovery recognized by the 2015 Nobel Prize for Physics. These were small victories, but ones that could not grow beyond that small sphere of influence.

While I have no experience of larger exam boards or the processes of scrutiny and consultation that go on behind closed doors, I find myself questioning the efficacy of those panels. The first few pages of a typical exam-board syllabus tend to include a "why choose us?" section, which states that the board has conducted extensive work with a range of educational providers and employers to ensure the content is inspiring and up to date. Yet I have never seen any mention of the involvement of professional physicists, researchers or industry professionals. The boards don't appear to consult anyone who has been actively working to expand knowledge and understanding in our field.

Science communicator Henry Reich, who created the minutephysics YouTube channel that is dedicated to communicating challenging science, penned an open letter in 2012 to the then US president Barack Obama highlighting this very issue. As Reich pointed out, the omission of discoveries after the mid-1800s means that US high-school physics completely overlooks much of the science that contributed to many of the technologies that the US should be proud of. We’re not quite in such a situation in the UK, but it is becoming increasingly so. Students in the UK are now finding that the Nobel Prize for Physics is seemingly less and less relevant to their everyday lives, when in fact, with the acceleration of technology, the opposite should be true.

Whose responsibility is it then to make sure that key discoveries are integrated into our education system? I would argue that in an ideal world, the onus should be on the exam boards to reach out. But failing this, the responsibility falls to physicists to lobby exam boards to help them understand that the gap is widening between what we now know and what we are teaching our young people.

My current students were only eight years old when the Higgs boson was discovered. When they come across it during a reading assignment, it is often a new concept to them. But because it is not on the exam, they are not motivated to study it in any depth. Year on year, the number of students asking about the Higgs boson has fallen – so much so that most new students now haven't heard of it.

That class from 2012 who asked me all those questions about the Higgs boson are now aged between 24 and 25. The news may well have inspired them to pursue a career in physics – they could even be working at CERN. But the classes I taught in subsequent years have been increasingly isolated from new discoveries from particle physics and beyond. My current students find it harder to see modern physics as the exciting and fast-moving topic that it is. Surely that must change.

Quantum-inspired detection method generates high-quality OCT images

OCT researchers

A novel detection method for optical coherence tomography (OCT) achieves high-quality imaging at low light intensity. The new OCT method was developed by a group of researchers from the University of Auckland in New Zealand and the Nicolaus Copernicus University in Poland. Borrowing ideas for their detection mechanism from quantum optics, they describe in Optics Letters how the method can produce comparable results to standard OCT systems at very low light intensity levels (around 10 pW).

OCT is a widely used medical imaging technique that can non-invasively obtain cross-sectional images of semi-translucent materials. It has applications in several medical disciplines, including ophthalmology, cardiology and dermatology, where its resolution allows visualization of microscopic details. It works by illuminating the tissue with broad-bandwidth visible or near-infrared light. This light is scattered and reflected by the biological media and then captured and assessed using a spectroscopic detector.

OCT can provide images of tissue morphology at 1 µm resolution and it is now a common part of routine eye exams, where an ophthalmologist may recommend the scan to inspect the retina’s distinctive layers and rule out glaucoma or retinal diseases.

In a clinical setting, OCT is limited by the allowed intensity levels. For example, the imposed safety levels for imaging the eye are 1.7 and 5 mW, at wavelengths of 800 and 1060 nm, respectively, constraints that can sometimes affect the fidelity of the obtained images. A team led by Sylwia Kolenderska bypassed this limitation by using a superconducting single-photon detector (SSPD) instead of the standard OCT detector – an idea inspired by quantum optics, where SSPDs are used to study various properties of single photons.

One photon at a time…

Leading the way towards high-fidelity, low-power OCT systems, the team came up with the idea of using an SSPD while researching new OCT detection schemes. Because these sensors can detect single photons, the proposed setup uses far less light than a standard OCT system. In fact, the required power is a few orders of magnitude smaller than that of clinically available counterparts.

To make the new detection scheme work, the team had to make a few changes to the original optical setup, which requires a way of discerning the different wavelengths of light being reflected from the imaged object. As they were now working in a single-photon detection regime, the researchers decided to couple the SSPD with a long (5 km) fibre spool that induces a wavelength-dependent time delay. In other words, different wavelengths of light will travel at different speeds down this fibre, enabling the SSPD to capture the light spectrum.
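To get a feel for how the spool spreads wavelengths out in time, here is a minimal back-of-the-envelope sketch (not the authors' code). The dispersion parameter assumes standard single-mode fibre near 1550 nm, and the 50 nm bandwidth is a hypothetical example; neither value is quoted from the paper.

```python
# Rough sketch of the wavelength-to-arrival-time mapping behind the scheme.
# Assumptions (not from the paper): standard single-mode fibre dispersion of
# ~17 ps/(nm*km) near 1550 nm, and an illustrative 50 nm source bandwidth.

DISPERSION_PS_PER_NM_KM = 17.0   # chromatic dispersion, assumed
FIBRE_LENGTH_KM = 5.0            # spool length quoted in the article

def arrival_time_spread_ps(bandwidth_nm: float) -> float:
    """Time spread accumulated across the source bandwidth after the spool."""
    return DISPERSION_PS_PER_NM_KM * FIBRE_LENGTH_KM * bandwidth_nm

# ~4250 ps (about 4 ns) for a 50 nm bandwidth: easily resolved by an SSPD's
# timing electronics, which is what lets arrival time stand in for wavelength.
print(f"{arrival_time_spread_ps(50.0):.0f} ps")
```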

…yields high-quality images

The team tested the new OCT setup using a 1550 nm pulsed light source to image two objects: a stack of different types of glass and a slice of onion. The glass stack was made up of three layers: quartz (50 µm thick), sapphire (460 µm thick) and BK7 glass (500 µm thick), and it was used as a controlled test to understand how well the different layers can be discerned. The slice of onion was used as a biological sample. Both experiments resulted in good-quality images, comparable with images obtained by standard OCT systems, but using intensity levels five orders of magnitude lower.

The researchers point out that they observed artefacts in the reconstructed images. These are unwanted image features that, in this case, appear because the detection system is recording all types of interactions between photons inside the sample. Artefact removal algorithms can be used to clean up the images and are straightforward to use when the imaged object has a well-defined structure, such as the stack of glass. These artefacts become more problematic when dealing with biological samples, where the structure is highly heterogeneous and complicated.

Copyright © 2025 by IOP Publishing Ltd and individual contributors