“I spend a lot of time in the clinic, and don’t have the time or the technical expertise to learn, configure, and maintain software. MATLAB makes it easy for physicians like me to get work done and produce meaningful results.”
Dr. Johan Nilsson, Skåne University Hospital, Lund University
A heart transplant recipient’s survival depends on dozens of variables, including the weight, gender, age, and blood type of both donor and recipient, and the ischemic time: the period during the transplant when there is no blood flow to the organ.
To better understand transplant risk factors and improve patient outcomes, researchers at Lund University and Skåne University Hospital in Sweden use artificial neural networks (ANNs) to explore the complex nonlinear relationships among multiple variables. The ANN models are trained using donor and recipient data obtained from two global databases: the International Society for Heart and Lung Transplantation (ISHLT) registry and the Nordic Thoracic Transplantation Database (NTTD). The Lund researchers accelerated the training and simulation of their ANNs by using MATLAB®, Neural Network Toolbox™, and MathWorks parallel computing products.
“Many of the techniques we use are computer-intensive and time-consuming,” says Dr. Johan Nilsson, Associate Professor in the Division of Cardiothoracic Surgery at Lund University. “We used Parallel Computing Toolbox with MATLAB Distributed Computing Server to distribute the work on a 56-processor cluster. This enabled us to rapidly identify an optimal neural network configuration using MATLAB and Neural Network Toolbox, train the network using data from the transplantation databases, and then run simulations to analyze risk factors and survival rates.”
Understanding how various risk factors affect survival rates involved hundreds of thousands of computationally intensive, data-heavy operations. For example, the team had to test hundreds of ANN configurations to identify the best one, and an analysis of just six variables requires simulating 30,000 different combinations. Simulating all these combinations for 50,000 patients took weeks with an open-source software package.
Nilsson and his colleagues encountered reliability problems with the software they were using, as well. “The software was unstable, which led to crashes during long, multiday simulations,” Nilsson explains. “In addition, some of the results it produced were not quite right. When we publish our findings, we need to be very sure we can trust the results.”
To address the speed and reliability challenges, Lund University researchers developed their initial ANN model using MATLAB and Neural Network Toolbox. To find the optimal network configuration, they wrote MATLAB scripts that varied the number of hidden nodes used in the network for a range of weight decay (or regularization) values.
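The grid-search idea can be sketched in a few lines. The snippet below is an illustration in Python rather than the team's MATLAB, with a toy dataset and a hand-rolled one-hidden-layer network standing in for Neural Network Toolbox; it varies the number of hidden nodes across a range of weight-decay values and keeps the best-scoring configuration:

```python
import itertools

import numpy as np

def train_mlp(X, y, n_hidden, weight_decay, epochs=200, lr=0.2, seed=0):
    """Train a one-hidden-layer network with L2 weight decay; return accuracy."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], n_hidden))
    W2 = rng.normal(0.0, 0.5, (n_hidden, 1))
    for _ in range(epochs):
        H = np.tanh(X @ W1)                   # hidden-layer activations
        p = 1.0 / (1.0 + np.exp(-(H @ W2)))   # sigmoid output (survival probability)
        err = p - y                           # cross-entropy gradient w.r.t. the logit
        gW2 = H.T @ err / len(X) + weight_decay * W2   # weight decay = L2 penalty
        gW1 = X.T @ ((err @ W2.T) * (1.0 - H**2)) / len(X) + weight_decay * W1
        W1 -= lr * gW1
        W2 -= lr * gW2
    p = 1.0 / (1.0 + np.exp(-(np.tanh(X @ W1) @ W2)))
    return float(np.mean((p > 0.5) == (y > 0.5)))

# Toy nonlinear data standing in for donor/recipient features.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (400, 2))
y = ((X[:, 0] * X[:, 1]) > 0).astype(float).reshape(-1, 1)

# The grid: candidate hidden-layer sizes crossed with weight-decay values.
grid = list(itertools.product([2, 4, 8], [0.0, 0.01, 0.1]))
scores = {cfg: train_mlp(X, y, *cfg) for cfg in grid}
best = max(scores, key=scores.get)
```

Each grid point trains independently, which is what makes the sweep a natural fit for the parallel run described next.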
The team used Parallel Computing Toolbox™ and MATLAB Distributed Computing Server™ to accelerate the simulation of more than 200 ANN configurations. They then evaluated the results to find the best-performing configuration.
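In spirit, the parallel sweep looks like this minimal Python sketch, where a thread pool stands in for the 56-processor cluster and `evaluate_config` is a hypothetical, synthetic stand-in for training and scoring one ANN configuration:

```python
import itertools
from concurrent.futures import ThreadPoolExecutor

def evaluate_config(config):
    """Hypothetical stand-in for training one ANN configuration and
    returning its validation score (real work would train on registry data)."""
    n_hidden, weight_decay = config
    return 0.8 - abs(n_hidden - 12) * 0.01 - weight_decay  # synthetic score

# 10 hidden-layer sizes x 4 weight-decay values = 40 candidate configurations.
configs = list(itertools.product(range(2, 22, 2), [0.0, 0.01, 0.05, 0.1]))

# Fan the independent evaluations out across workers, then pick the winner.
with ThreadPoolExecutor(max_workers=8) as pool:
    scores = list(pool.map(evaluate_config, configs))
best = configs[scores.index(max(scores))]
```

Because the configurations share no state, the speedup scales with the number of workers, which is why a 56-processor cluster cut the multiweek runtime so sharply.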
After training the ANN using donor and recipient information from the databases, they verified the model’s accuracy by simulating outcomes for 10,000 patients who had been omitted from the training set. They then compared the results against actual survival rates.
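The hold-out check can be illustrated as follows; here both the cohort outcomes and the model probabilities are synthetic stand-ins for the real registry data and ANN output:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical held-out cohort: actual outcomes (1 = survived) and synthetic
# predicted probabilities standing in for the trained ANN's output.
actual = rng.integers(0, 2, 1000)
predicted_prob = np.clip(actual * 0.4 + rng.uniform(0.0, 0.6, 1000), 0.0, 1.0)

# Per-patient agreement between prediction and outcome.
predicted = (predicted_prob >= 0.5).astype(int)
accuracy = float(np.mean(predicted == actual))

# Cohort-level comparison: predicted vs. actual survival rate.
rate_actual = float(actual.mean())
rate_predicted = float(predicted.mean())
```

Holding the 10,000 validation patients out of training is what makes this an honest estimate: the model is scored only on cases it has never seen.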
In the next phase, the team conducted thousands of simulations in parallel to rank the study’s 57 risk factors by their importance in predicting long-term survival.
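The article does not say how the ranking was computed; permutation importance is one common technique for ranking a trained model's inputs, sketched here on toy data where only the first feature matters:

```python
import numpy as np

def permutation_importance(model, X, y, rng):
    """Rank features by how much shuffling each one degrades accuracy."""
    base = np.mean(model(X) == y)
    drops = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        rng.shuffle(Xp[:, j])               # break feature j's link to the outcome
        drops.append(base - np.mean(model(Xp) == y))
    return np.argsort(drops)[::-1]          # largest accuracy drop first

# Toy "model": outcome depends only on feature 0.
model = lambda X: (X[:, 0] > 0).astype(int)
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = (X[:, 0] > 0).astype(int)
ranking = permutation_importance(model, X, y, rng)
```

Each shuffle-and-rescore pass is independent, so with 57 risk factors the passes parallelize across a cluster just like the configuration sweep.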
Using results from Monte Carlo simulations on the computer cluster and simulated annealing techniques, the researchers identified the best and worst possible donors for any particular recipient.
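A minimal simulated-annealing sketch of that search, with a hypothetical `survival_score` surrogate in place of the trained ANN and only two donor variables (age and weight) for readability:

```python
import math
import random

def survival_score(donor):
    """Hypothetical predicted-survival surrogate for an (age, weight) donor
    profile; a real run would query the trained ANN instead."""
    age, weight = donor
    return -((age - 30.0) ** 2) / 400.0 - ((weight - 75.0) ** 2) / 900.0

def anneal(score, start, steps=5000, t0=1.0, seed=0):
    """Maximize `score` by accepting worse moves with falling probability."""
    rng = random.Random(seed)
    cur, cur_s = start, score(start)
    best, best_s = cur, cur_s
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-6          # cooling schedule
        cand = (cur[0] + rng.uniform(-2, 2), cur[1] + rng.uniform(-2, 2))
        cand_s = score(cand)
        if cand_s > cur_s or rng.random() < math.exp((cand_s - cur_s) / t):
            cur, cur_s = cand, cand_s
            if cur_s > best_s:
                best, best_s = cur, cur_s
    return best

best_donor = anneal(survival_score, start=(60.0, 50.0))
```

Running the same search on the negated score would locate the worst-case donor profile for that recipient.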
As a final step, the team developed an automated process that ranks the recipient waiting list to identify the best candidates for a prospective donor.
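That final step amounts to scoring every wait-listed recipient against the prospective donor and sorting. A sketch, with a hypothetical `predict` function standing in for the trained ANN:

```python
def rank_waiting_list(donor, recipients, predict_survival):
    """Order recipients by the model's predicted survival with this donor."""
    scored = [(r, predict_survival(donor, r)) for r in recipients]
    return sorted(scored, key=lambda rs: rs[1], reverse=True)

# Hypothetical stand-in for the trained ANN, using only one matching variable.
predict = lambda donor, r: 1.0 / (1.0 + abs(donor["weight"] - r["weight"]) / 50.0)

donor = {"weight": 80}
recipients = [
    {"id": "R1", "weight": 95},
    {"id": "R2", "weight": 78},
    {"id": "R3", "weight": 60},
]
ranked = rank_waiting_list(donor, recipients, predict)
```

Automating the ranking means that when a donor organ becomes available, the whole list can be re-scored in one pass instead of being evaluated case by case.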
In the next major phase of the project, Lund University researchers are using the ANN to investigate the use of Human Leukocyte Antigen (HLA) genetic profiles to match donors with recipients.
Improve long-term survival rates for heart-transplant recipients by identifying optimal recipient and donor matches
Use MathWorks tools to develop a predictive artificial neural network model and simulate thousands of risk-profile combinations on a 56-processor computing cluster