How computational modelling is transforming medicine

15 Sep 2020 Samuel Vennin 
Taken from the September 2020 issue of Physics World, where it appeared under the headline "How modelling is transforming medicine".

Computational modelling has been brought under the spotlight during the COVID-19 pandemic, with scientists trying to predict how the SARS-CoV-2 virus will spread. But epidemiology is not the only medical field in which modelling is sparking breakthroughs, as Sam Vennin explains

Model for microcirculatory blood flow (Courtesy: Paul Sweeney)

On 23 March 2020 UK prime minister Boris Johnson announced a lockdown to tackle the spread of coronavirus, following the example of other countries around the world that had chosen this strategy to halt the virus’s progression. The decision came days after Johnson’s government had toyed with the idea of letting the virus spread and infect up to 70% of the population, in order to develop so-called “herd immunity”. The stark policy shift left people wondering what had changed.

To many, the models produced by the physicist-turned-epidemiologist Neil Ferguson and his group at Imperial College London were critical. They predicted that, should no action be taken, the death toll could reach 500,000 in the UK and exceed 2 million in the US. As well as providing a shocking reality check about the pandemic, the work highlighted an increasingly popular tool that is profoundly changing medical research.

While in vitro and in vivo experiments have long been a staple of medical research, the rapid increase in computational power in recent decades has enabled the emergence of a new experimental field: computational (in silico) modelling. From surgery to drug design, these numerical models are used not only to describe physiological phenomena, but also to derive useful information and even drive clinical decisions.

“Modelling is about encapsulating our knowledge into a set of rules or equations. It is thus at the core of any science,” says Pablo Lamata from King’s College London, UK. “We are currently experiencing a ‘computational boost’ in our modelling capabilities.”

Digital twin of a patient's heart

The ever-increasing amount of available data – from wearable sensors to digital medical images – has also accelerated the uptake of modelling. Lamata’s group, for example, combines in silico heart models with medical images of the heart to create patient-specific numerical heart models – so-called digital twins. Such models could in future provide doctors with vital information about cardiac properties that are currently unavailable, such as heart stiffness. This matters because when the heart fills with blood (during diastole), excessive stiffness can prevent the ventricle from filling properly, a phenomenon associated with heart failure in about 50% of patients. These models could also provide new insights into the mechanisms leading to this outcome.

“We obviously cannot touch a beating heart to know the stiffness, but we can use these models governed by the rules and laws of the material properties to infer that important piece of diagnostic and prognostic information,” Lamata explains. “The stiffness of the heart becomes another key biomarker that will tell us how the health of the heart is coping with disease.”
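To give a flavour of how such inference works, here is a minimal sketch that fits a stiffness-like parameter of a toy exponential pressure–volume law to synthetic “measured” data. The law, the parameter values and the data are all invented for illustration – this is not the patient-specific modelling used by Lamata’s group.

```python
# Minimal sketch of "digital twin" parameter inference (illustrative only):
# fit a stiffness-like parameter of a toy end-diastolic pressure-volume law
# P(V) = a * (exp(b * (V - V0)) - 1) to noisy synthetic pressure-volume data.
# The law, parameter values and data are assumptions made for illustration.
import numpy as np
from scipy.optimize import curve_fit

def edpvr(volume_ml, a_mmhg, b_per_ml):
    """Toy exponential end-diastolic pressure-volume relation."""
    v0 = 60.0  # assumed unstressed volume (ml)
    return a_mmhg * (np.exp(b_per_ml * (volume_ml - v0)) - 1.0)

# Synthetic "measurements" generated with a known stiffness, plus noise
rng = np.random.default_rng(0)
true_a, true_b = 1.5, 0.04
volumes = np.linspace(70, 130, 10)
pressures = edpvr(volumes, true_a, true_b) + rng.normal(0, 0.3, volumes.size)

# Infer the parameters by matching the model output to the data
(a_fit, b_fit), _ = curve_fit(edpvr, volumes, pressures, p0=[1.0, 0.02])
print(f"inferred stiffness-like parameter b = {b_fit:.3f} per ml (true value {true_b})")
```

The principle – adjust the model’s parameters until its output matches the measurements – is the same one that underpins the far richer digital twins described above.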

Reducing uncertainty

Similar approaches are used in other medical fields. Paul Sweeney, a mathematician from the University of Cambridge in the UK, for example, uses in silico models to predict perfusion – the passage of fluids through the circulatory or lymphatic systems to an organ or tissue – and drug delivery across whole tumours at the scale of the smallest blood vessels (approximately 3 μm). “Our models allow us to understand how a tumour’s micro-architecture influences the distribution of fluid and mass through cancerous tissue, which is important for engineering new anti-cancer drugs, or optimizing strategies of current therapeutics,” Sweeney says. As with heart stiffness, such data cannot be obtained from conventional experiments alone, which means that introducing modelling can directly inform the development of cancer therapeutics.
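In highly simplified terms, flow in a microvascular network can be treated like current in a resistor network: each vessel segment is assigned a Poiseuille resistance and mass is conserved at every junction. The sketch below solves such a toy four-segment network; the geometry, radii and boundary pressures are invented and bear no relation to the whole-tumour models described above.

```python
# Toy microvascular flow: vessel segments as Poiseuille resistors, with mass
# conservation at each junction. Geometry, radii and boundary pressures are
# invented; real whole-tumour models involve millions of segments.
import numpy as np

mu = 3.5e-3  # blood viscosity (Pa*s), assumed constant here

# (node_i, node_j, radius_m, length_m) for each vessel segment
segments = [
    (0, 1, 5e-6, 200e-6),
    (1, 2, 4e-6, 150e-6),
    (1, 3, 3e-6, 250e-6),
    (2, 3, 4e-6, 150e-6),
]
n_nodes = 4
fixed = {0: 4000.0, 3: 2000.0}  # boundary pressures (Pa), assumed

def conductance(radius, length):
    """Poiseuille conductance of a cylindrical vessel segment."""
    return np.pi * radius**4 / (8.0 * mu * length)

# Assemble the nodal mass-conservation equations, then impose boundary values
A = np.zeros((n_nodes, n_nodes))
b = np.zeros(n_nodes)
for i, j, radius, length in segments:
    g = conductance(radius, length)
    A[i, i] += g
    A[j, j] += g
    A[i, j] -= g
    A[j, i] -= g
for node, pressure in fixed.items():
    A[node, :] = 0.0
    A[node, node] = 1.0
    b[node] = pressure

pressures = np.linalg.solve(A, b)
for i, j, radius, length in segments:
    q = conductance(radius, length) * (pressures[i] - pressures[j])
    print(f"segment {i}->{j}: flow = {q * 1e15:.0f} pl/s")
```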

But getting to the point where information can be derived from these models is only the last stage in a long and thorough development process. From defining the problem to selecting the modelling strategy to address it, each step requires crucial choices, which are reassessed later to ensure they reflect reality. “It is only through many iterations of searching for the agreement between model and data that the weak links are revealed and addressed,” notes Lamata. This is why, just as with weather forecasting or death-toll projections in the COVID-19 crisis, models are constantly updated as more data become available.

Sometimes this means revisiting the assumptions that models are based on: they are necessary at the beginning to facilitate the modeller’s task, but their impact on predictions should be carefully handled. “The best solution to deal with this is the use of sensitivity analysis [assessing a parameter’s influence on the model prediction by varying it while keeping all other parameters constant] – but the limit is always going to be in those aspects that your model can’t account for,” Lamata says. It is important to keep this in mind when considering what can be inferred from the model output and what cannot, while also leaving room for improvement.
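A minimal sketch of that one-at-a-time procedure is given below, using a deliberately simple stand-in model (a two-parameter exponential pressure decay, chosen purely for illustration): each parameter is perturbed by ±10% while the others are held at their baseline values, and the relative change in the output is recorded.

```python
# One-at-a-time sensitivity analysis on a toy stand-in model: perturb each
# parameter by +/-10% around its baseline while holding the others fixed, and
# record the relative change in a scalar output. The model and its baseline
# values are assumptions chosen only to illustrate the procedure.
import numpy as np

def model_output(params):
    """Toy model: pressure remaining after a two-parameter exponential decay."""
    resistance, compliance = params["resistance"], params["compliance"]
    p_start, t = 120.0, 0.6  # assumed starting pressure (mmHg) and time (s)
    return p_start * np.exp(-t / (resistance * compliance))

baseline = {"resistance": 1.1, "compliance": 1.4}  # assumed baseline values
base_out = model_output(baseline)

for name in baseline:
    for factor in (0.9, 1.1):  # -10% and +10% perturbations
        perturbed = dict(baseline)
        perturbed[name] = baseline[name] * factor
        change = (model_output(perturbed) - base_out) / base_out
        print(f"{name} x{factor:.1f}: output changes by {change:+.1%}")
```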

Tumour model

To Sweeney, there is always scope for increasing the accuracy of models, whether by supplying additional experimental data or incorporating more models of other complex biological mechanisms. “A potential trade-off is always between the accuracy of the model and the amount of experimental data available,” he says. “In other words, will additional data make the model more accurate or cause overfitting to the empirical data?”

Indeed, this is the caveat that all models face: the need to balance accuracy against simplicity. If a model relies too heavily on the data it was developed from, its predictions for other datasets might not be accurate – the problem of overfitting that Sweeney refers to. Conversely, a model kept as basic as possible, with little reliance on data or patient characteristics, can also yield unreliable predictions, as the “one-size-fits-all” approach fails to account for a patient’s individual characteristics.
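The trade-off is easy to demonstrate on synthetic data: a very flexible model can reproduce the points it was built from almost perfectly yet predict unseen points poorly, while an overly rigid one misses both. The sketch below compares polynomial fits of increasing degree on invented data; it illustrates the general point only, not any of the specific models discussed here.

```python
# Illustration of the accuracy-versus-simplicity trade-off on synthetic data:
# fit polynomials of increasing degree to noisy samples of a smooth curve and
# compare the error on the fitting data with the error on held-out data.
import numpy as np

rng = np.random.default_rng(1)

def truth(x):
    """Assumed 'ground truth' curve that the data are sampled from."""
    return np.sin(2 * np.pi * x)

x_train = np.sort(rng.uniform(0, 1, 12))
y_train = truth(x_train) + rng.normal(0, 0.15, x_train.size)
x_test = np.linspace(0.05, 0.95, 50)
y_test = truth(x_test)

for degree in (1, 3, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    rmse_train = np.sqrt(np.mean((np.polyval(coeffs, x_train) - y_train) ** 2))
    rmse_test = np.sqrt(np.mean((np.polyval(coeffs, x_test) - y_test) ** 2))
    print(f"degree {degree}: fit error {rmse_train:.2f}, held-out error {rmse_test:.2f}")
```

A straight line (degree 1) underfits, while a high-degree polynomial tends to chase the noise and generalize poorly to the held-out points – the overfitting Sweeney warns about.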

The purpose of the model dictates where to put the emphasis, although simplicity is usually favoured. The focus hence shifts back to the importance of accurately defining the problem that the model is addressing in the first place. “You want the minimum model to correctly address the problem,” concludes Lamata.

Designing new technologies

Comparing a model’s predictions with reality remains the quickest method of validation, although this can be achieved faster in some fields than in others. Take biomaterials, where in silico models help us understand how molecules behave and interact with their environment – predictions that can quickly be tested in vitro.

Over at the University of Nottingham in the UK, for example, bioengineer Alvaro Mata and his group use modelling as a stepping stone for developing innovative materials and therapies for tissue engineering and regenerative medicine. “Molecular dynamics simulations are key to elucidate mechanisms by which molecules assemble together,” explains Mata. “This allows us to translate those assemblies at the molecular level into fabrication platforms capable of engineering functional structures.”

His group recently used this approach to understand how graphene oxide can exploit the flexible regions of a protein to create a new bioink material for 3D-printing tissue-like vascular structures. Through the models, the researchers learnt how to guide the assembly process at various size scales, from the cellular level to the final complex structure. “Simulations can dramatically facilitate testing and optimization of materials, structures and processes, saving both time and money,” Mata says.
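At its core, a molecular dynamics simulation repeatedly evaluates the forces between particles and integrates Newton’s equations of motion with a small timestep. The bare-bones sketch below does this for a handful of Lennard-Jones particles in reduced units; it shows the basic machinery only, not the large-scale protein–graphene oxide simulations performed by Mata’s group.

```python
# Bare-bones molecular dynamics in reduced Lennard-Jones units: compute
# pairwise forces and advance a few particles with the velocity-Verlet
# integrator. Particle count, timestep and initial positions are arbitrary.
import numpy as np

def lj_forces(pos):
    """Pairwise Lennard-Jones forces (epsilon = sigma = 1)."""
    forces = np.zeros_like(pos)
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            rij = pos[i] - pos[j]
            r2 = np.dot(rij, rij)
            inv_r6 = 1.0 / r2**3
            f_over_r = 24.0 * (2.0 * inv_r6**2 - inv_r6) / r2
            forces[i] += f_over_r * rij
            forces[j] -= f_over_r * rij
    return forces

dt = 0.002  # timestep in reduced time units
pos = np.array([[0.0, 0.0], [1.2, 0.0], [0.6, 1.0], [1.8, 1.1]])  # arbitrary
vel = np.zeros_like(pos)
forces = lj_forces(pos)

for step in range(200):  # velocity-Verlet integration
    pos += vel * dt + 0.5 * forces * dt**2
    new_forces = lj_forces(pos)
    vel += 0.5 * (forces + new_forces) * dt
    forces = new_forces

print("final positions:\n", pos)
```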

Molecular dynamics simulation

Compared with in vitro and in vivo experiments, in silico simulations have the advantage of being fast, cheap, safe, easy to implement and free of experimental errors. Consequently, they are becoming increasingly helpful in designing new technologies and strategies.

A prime example of this is cardiac resynchronization therapy (CRT) in patients with heart failure. This treatment involves placing two pacing leads, controlled by a pacemaker, into the patient’s heart to augment the electrical activation and synchronize the beating of the two ventricles. Traditionally, the leads’ locations and the timing of stimulation are derived from electrocardiograms (ECGs) and medical images, but 30% of patients do not see a clear benefit from this strategy. By producing computational heart models from the patient’s scans and simulating various pacing strategies on them, Steven Niederer’s group at King’s College London can identify the best area to electrically stimulate the heart and investigate the effects of changing the pacing.
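A drastically simplified cartoon of this site-selection idea is sketched below: points on a 2D “tissue” patch activate at the earliest wavefront arrival from either pacing lead, and candidate positions for the second lead are scanned and scored by the total activation time. Every number in it is invented for illustration – a toy stand-in for the detailed patient-specific simulations just described.

```python
# Cartoon of pacing-site selection: points on a 2D "tissue" patch activate at
# the earliest wavefront arrival from either pacing lead (distance divided by
# a conduction velocity). Candidate sites for the second lead are scanned and
# scored by total activation time. All numbers are invented for illustration;
# real planning uses electrophysiology models built from the patient's scans.
import numpy as np

velocity = 0.6  # assumed conduction velocity (m/s)
xs, ys = np.meshgrid(np.linspace(0, 0.08, 40),  # 8 cm x 8 cm tissue patch
                     np.linspace(0, 0.08, 40))
lead_one = np.array([0.01, 0.01])  # fixed position of the first lead (assumed)

def total_activation_time(lead_two):
    """Time until the whole patch is activated for a given second-lead site."""
    t1 = np.hypot(xs - lead_one[0], ys - lead_one[1]) / velocity
    t2 = np.hypot(xs - lead_two[0], ys - lead_two[1]) / velocity
    return np.minimum(t1, t2).max()  # the earliest wavefront wins everywhere

# Scan candidate sites for the second lead along the opposite wall
candidates = [np.array([0.07, y]) for y in np.linspace(0.0, 0.08, 9)]
times = [total_activation_time(c) for c in candidates]
best = candidates[int(np.argmin(times))]
print(f"best second-lead site: ({best[0] * 100:.0f} cm, {best[1] * 100:.0f} cm), "
      f"total activation time {min(times) * 1e3:.0f} ms")
```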

These patient-specific heart models are particularly complex, requiring multi-scale modelling to link cellular dynamics, blood flow, electrophysiology and tissue deformations within a common anatomical heart geometry. There is still a long way to go before the technology can be adopted as a support tool for clinical decision-making, but studies in a small number of patients have shown that such models can make patient-specific predictions of the acute haemodynamic response to CRT. This further demonstrates that clinical intervention guided by in silico models is no longer just a pipe dream.

In fact, in 2015 Alberto Figueroa and his team at the University of Michigan in the US helped perform the first surgical intervention that used computerized blood flow simulation. The team’s open-source software, CRIMSON, uses MRI scans and haemodynamic variables (blood pressure and flow) to produce 3D computer models of a patient’s circulatory system. This can be used to simulate different surgical alternatives and determine which yields the best prognosis before heading to the operating theatre.

The CRIMSON software has already been used to plan several complex cardiac operations, such as the “Fontan procedure”, which involves rewiring the pulmonary circulation in patients born with just one functioning ventricle. The venous return – the flow of blood back to the heart – is rerouted to bypass the heart and connected directly to the pulmonary arteries for transport to the lungs. Simulations help surgeons decide where to make the surgical connections so that blood flow is ideally balanced between the lungs.

In silico trials in reality

With the surge in applications of computational modelling and the demonstration of its clinical relevance, regulatory bodies are taking note and have already acknowledged the benefit of these models.

For example, in 2011 the US Food and Drug Administration (FDA) approved the first in silico model of type 1 diabetes as a possible substitute for pre-clinical animal testing of new glucose-control strategies. A few years later, the FDA went further by approving the FFRCT software developed by the US medical firm HeartFlow to measure coronary blockages non-invasively from CT scans – the first clinical technology based on subject-specific modelling to get the green light. The software has also received CE marking and regulatory approval in Japan.

In specific cases, such as the assessment of drug toxicity on the heart, the FDA is now even sparking new collaborations between academia and industry to rise to the challenge.

One success story comes from the University of Oxford in the UK, where Blanca Rodriguez’s group developed Virtual Assay – software that can run in silico drug trials in populations of human cardiac cell models. Designed to predict drug safety and efficacy, the software simulates the effects of drugs on the electrophysiology and calcium dynamics of human cardiomyocytes – the cells that control the heart’s ability to contract. Specific heart-rhythm patterns can therefore be inferred for each drug compound and dose simulated, with the objective of spotting any drug-induced arrhythmias. An in silico trial of 62 drugs, led by Rodriguez’s team in collaboration with Janssen Pharmaceutica, showed that the software was more accurate at predicting abnormal heart rhythms than animal trials (89% accuracy versus 75% in rabbits).

These results captured the attention of other pharmaceutical companies – a sector in which developing a new compound costs several billion dollars, and 20–50% of drug candidates are abandoned because of cardiovascular safety issues. Eight major pharmaceutical companies are now evaluating Virtual Assay in the early drug-development process to assess arrhythmic risk. The strategy is also backed by animal-protection groups, as in silico models for pharmaceutical R&D could reduce animal use by a third.

This conjunction of interest from regulatory bodies, industry, clinics, academia and even animal-welfare groups has led to the establishment of networks and initiatives around the world to promote the development, validation and use of in silico medicine technologies.

The Virtual Physiological Human Institute led the way when it opened to members in 2011. As part of a two-year EU-funded project starting in 2013, it produced a roadmap for the introduction of in silico clinical trials. Lamata, Rodriguez and Niederer are part of a network of universities, industries and regulatory bodies working to bring personalized in silico cardiology to the clinic. As research becomes ever more interdisciplinary, this combination of expertise is vital to steer the use of models in the right direction and facilitate their clinical translation.

“Oddly, one challenge with our models is knowing where to begin analysing their predictions as they produce a vast wealth of data. The interdisciplinary expertise on our team makes this process far easier by providing unique perspectives on how to tackle this challenge,” says Sweeney.

As Lamata puts it, it’s easy to be tempted to “think big” and include a lot of variables and complexity in the hope of boosting a model’s predictive power. “But it’s important to keep things as simple as possible for validation, which is an incremental process. Working with clinicians helps to keep the end-goal in sight.”

If incorporating in silico models to complement traditional in vitro and in vivo experiments is a new paradigm, all stakeholders seem to be adapting well. Sweeney has not perceived much resistance from biologists and clinicians, even though they are more used to working with statistics and correlations drawn from large cohorts than with equation-based, patient-specific simulations. For Lamata, the challenge in demonstrating the accuracy of in silico predictions remains the same as for any scientific advance: the need for evidence. And this takes time to generate. “The main cultural shift we require is one towards open science, where we make our data and tools available for the fast generation of the required evidence,” he notes.

The momentum is there: industrialists, policy-makers and clinicians are on board; personalized medicine is gaining traction as computational power keeps increasing and initiatives flourish; evidence of computational models’ capacity to enhance diagnosis, prognosis and treatment is mounting. The COVID-19 pandemic has shown that models can even influence government decisions. It might just be a matter of time before in silico models in medicine become as ubiquitous as the computers they are run on.

Ultimately, while all models are limited by their assumptions, the possibilities they offer are limitless. You just need to know exactly what you are looking for.
