ORNL researchers developed a scalable CNN to identify pulmonary diseases in medical images. LungXNet was trained using the lab’s Summit supercomputer and more than 100,000 chest X-ray images.
A team of researchers from the Department of Energy’s Oak Ridge National Laboratory recently developed LungXNet, a deep neural network that classifies chest X-rays to identify the signs of 14 chest-related diseases more efficiently, along with a data parallelism algorithm that maximizes GPU utilization while reducing inter-node communication overhead. This combination enabled training AI at scale on Summit, ORNL’s flagship computing system and currently the world’s fastest supercomputer. The increased scalability on a system of Summit’s size, combined with a convolutional neural network trained with data augmentation, yielded higher accuracy than current practices and competing networks.
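The data augmentation mentioned above can be illustrated with a minimal, framework-free sketch. This is not the team’s code; it simply shows two transformations commonly applied to X-ray training sets, horizontal flips and random crops, on a small grayscale image stored as a list of rows. The function names (`horizontal_flip`, `random_crop`, `augment`) are illustrative, not from the publication.

```python
import random

def horizontal_flip(image):
    """Mirror the image left-to-right."""
    return [list(reversed(row)) for row in image]

def random_crop(image, size, rng=random):
    """Cut a size x size patch at a random offset inside the image."""
    h, w = len(image), len(image[0])
    top = rng.randrange(h - size + 1)
    left = rng.randrange(w - size + 1)
    return [row[left:left + size] for row in image[top:top + size]]

def augment(image, size, rng=random):
    """Randomly flip, then crop, producing a new training sample."""
    if rng.random() < 0.5:
        image = horizontal_flip(image)
    return random_crop(image, size, rng)

# A 4x4 stand-in for an X-ray; each augment() call yields a slightly
# different 3x3 view, which increases the effective training-set size.
image = [[r * 4 + c for c in range(4)] for r in range(4)]
patch = augment(image, 3)
```

In practice such transforms run on the fly inside the input pipeline, so each epoch sees different variants of the same underlying images.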
The team used Horovod, Uber’s open-source toolkit for distributed, scalable deep-learning training, along with its own expertise to increase the scalability of LungXNet. The team then demonstrated the feasibility of the algorithm on ChestXRay14, a publicly available dataset of expert-annotated, high-resolution chest X-ray images consisting of 110,000 images from 30,000 patients. The research shows enormous potential for training AI networks at scale and for improving medical image classification models.
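The core idea behind Horovod-style data parallelism can be sketched without any deep-learning framework: each worker computes gradients on its own shard of the batch, the gradients are averaged across workers (an all-reduce), and every replica applies the identical update. The sketch below is illustrative only, using a hypothetical one-parameter linear model; the team’s actual training used Horovod’s collective operations across Summit’s GPUs.

```python
def local_gradients(shard, weight):
    # Hypothetical model y = w * x with squared-error loss.
    # d/dw (w*x - y)^2 = 2 * (w*x - y) * x, averaged over this worker's shard.
    return sum(2 * (weight * x - y) * x for x, y in shard) / len(shard)

def allreduce_average(grads):
    """Average one gradient per worker -- the role MPI/NCCL all-reduce plays."""
    return sum(grads) / len(grads)

def data_parallel_step(shards, weight, lr=0.1):
    grads = [local_gradients(s, weight) for s in shards]  # independent local passes
    g = allreduce_average(grads)                          # synchronization point
    return weight - lr * g                                # identical update on every replica

# Two "workers", each holding its own shard of data generated by y = 2x.
shards = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0), (4.0, 8.0)]]
w = 0.0
for _ in range(50):
    w = data_parallel_step(shards, w)  # w converges toward 2.0
```

Because only the averaged gradients cross node boundaries, not the training data, this pattern keeps inter-node traffic low while keeping every GPU busy on its own shard, which is the scaling property the paragraph above describes.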
Research team: Hong-Jun Yoon, Cameron Kuchta, Folami Alamudun, Jacob Hinkle, Kshitij Srivastava, Gina Tourassi
Funding: This research was supported by the Laboratory Directed Research and Development Program of ORNL, managed by UT-Battelle, LLC, for the U.S. DOE. This research used resources of the Oak Ridge Leadership Computing Facility, which is a DOE Office of Science User Facility supported under Contract DE-AC05-00OR22725.
Publication: “Effects of Transfer Learning for Convolutional Neural Network Models in Medical Imaging,” accepted at the 2019 INFORMS Computing Society Conference.
Contact: Hong-Jun Yoon, yoonh@ornl.gov