Processing large amounts of data is crucial both in combating a pandemic and in developing individual treatments for patients.
The more valid data that is available, the better the opportunities. The faster the data can be processed, the sooner researchers can start developing preventive methods, treatments, and new vaccines. This requires fast computing power and reliable mathematical models. Morten Nielsen, professor at DTU, develops such models and algorithms.
“We still lack sufficient data to enable computer models to give us the comprehensive biological understanding we need to develop vaccines against, for example, cancer. But they’re coming. With the help of data and algorithms, our models and predictions are now more accurate than lab tests. If there’s a discrepancy, we can say almost with certainty that our result is right and that the errors are in the test,” says Morten Nielsen.
Training models of the immune system
"We want to model this as best we can with our mathematical models. And these models have to be trained."
Morten Nielsen
Morten Nielsen’s group is designing mathematical models of the immune system to predict how it will react to a cancer or coronavirus vaccine.
“We want to be able to understand individual immune responses. In other words, why each of us reacts differently to viruses. Some get very sick from coronavirus while others don’t. We want to model this as best we can with our mathematical models. And these models have to be trained,” says Morten Nielsen.
Some molecules in the immune system play a key role in determining how it responds, and these molecules have already been described with large amounts of data.
“What happens is that a molecule takes a fragment from, for example, the SARS genome and shows that fragment to the immune system on the surface of an infected cell. If we can predict which fragment from the SARS genome will be displayed, then we can also begin to understand how the immune system will be able to respond to it,” explains Morten Nielsen.
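To give a sense of the kind of prediction Morten Nielsen describes, the sketch below slices a protein sequence into overlapping nine-residue fragments and ranks them with a toy scoring function. The sequence, the anchor-residue heuristic, and the function names are illustrative assumptions, not the group's actual model.

```python
# Hypothetical sketch: scanning a protein sequence for candidate 9-mer
# fragments (peptides) that a presenting molecule might display.
# The scoring function is a toy stand-in, not a real trained predictor.

def peptides(sequence: str, k: int = 9):
    """Yield all overlapping k-mer fragments of a protein sequence."""
    for i in range(len(sequence) - k + 1):
        yield sequence[i:i + k]

def presentation_score(peptide: str) -> float:
    """Placeholder score; a real predictor would be a trained model."""
    # Toy heuristic: favour hydrophobic residues at the two anchor positions.
    anchors = {"L", "I", "V", "M", "F", "Y"}
    return (peptide[1] in anchors) + (peptide[-1] in anchors)

# Illustrative protein fragment (the sequence itself is arbitrary).
protein = "MFVFLVLLPLVSSQCVNLTTRTQLPPAYTNSFTRGVYYPDKVFRSSVLHS"

# Rank fragments by the placeholder score, highest first.
ranked = sorted(peptides(protein), key=presentation_score, reverse=True)
print(ranked[:5])
```

In a real setting, the placeholder score would be replaced by a model trained on measured presentation data, but the scan-and-rank structure is the same.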
Algorithms look for patterns
Morten Nielsen’s research group uses machine learning, in which programs or algorithms look for patterns in the data.
“These algorithms are relatively complex and require both memory and time, because they have to look at millions of data points to learn the patterns. We need many different models and large ensembles to get a complete description of the diversity among all people.”
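As a rough illustration of the ensemble idea described in the quote above, the following sketch trains several small neural networks on synthetic data and averages their predictions. It assumes scikit-learn and stand-in data; the real models, features, and data volumes are far larger.

```python
# Minimal sketch of an ensemble of predictors, assuming scikit-learn and
# synthetic stand-in data; not the group's actual models or features.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic data: rows are numeric encodings of fragments, labels say
# whether the fragment was presented (1) or not (0).
X = rng.normal(size=(1000, 20))
y = (X[:, :5].sum(axis=1) > 0).astype(int)

# Train several small networks with different random seeds (the ensemble).
ensemble = [
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=seed).fit(X, y)
    for seed in range(5)
]

# Average the predicted probabilities across ensemble members.
X_new = rng.normal(size=(3, 20))
avg_prob = np.mean([m.predict_proba(X_new)[:, 1] for m in ensemble], axis=0)
print(avg_prob)
```

Averaging over many differently trained models is one common way to capture diversity that no single model would describe on its own.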
The researchers can train the models in a few days because they have access to Computerome, a national supercomputer with massive computing power located at DTU Risø Campus.