Pneumonia. Fibrosis. Emphysema. All are thoracic disorders that affect millions of people globally, with cases ranging from mild to deadly. To help radiologists detect and diagnose these conditions, Bryant's Nafees Qamar, Ph.D., associate professor and program director of Healthcare Informatics, worked alongside outside researchers to develop an automated framework. The framework relies on artificial intelligence to scan chest X-ray radiographs through a computer-based system, which Qamar and colleagues found interprets medical imaging more quickly and reliably than the human eye.
“Machine learning and deep learning algorithms can do this job fast. They can analyze the information more efficiently and in an objective way,” says Qamar, noting that the researchers’ model — named Z-Net — has outperformed the latest frameworks from other studies; their paper was recently published in Diagnostics.
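To give a sense of what this kind of analysis looks like in code, below is a minimal sketch of how a trained deep learning classifier can score a single chest radiograph for several thoracic conditions at once. It assumes a PyTorch workflow with a generic ImageNet backbone, placeholder disease labels, and a hypothetical file name; it is not Z-Net's published architecture or code.

```python
# Hedged sketch: not Z-Net itself, just an illustration of multi-label
# scoring of one chest radiograph for several thoracic conditions.
# Backbone, labels, and file path are placeholder assumptions.
import torch
from torchvision import models, transforms
from PIL import Image

LABELS = ["Pneumonia", "Fibrosis", "Emphysema"]  # illustrative subset of conditions

# Generic ImageNet backbone with its classifier head resized to the number
# of disease labels; in practice this head would be trained on labeled
# radiographs, and Z-Net's real architecture differs.
model = models.densenet121(weights="IMAGENET1K_V1")
model.classifier = torch.nn.Linear(model.classifier.in_features, len(LABELS))
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

image = Image.open("chest_xray.png").convert("RGB")  # hypothetical input file
batch = preprocess(image).unsqueeze(0)               # shape: [1, 3, 224, 224]

with torch.no_grad():
    # Sigmoid rather than softmax, because a single radiograph can show
    # several conditions at the same time.
    probabilities = torch.sigmoid(model(batch)).squeeze(0)

for label, p in zip(LABELS, probabilities):
    print(f"{label}: {p.item():.2f}")
```

A system like this returns an independent probability for each condition, which is what lets it flag multiple findings on one image in a fraction of the time a manual read takes.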
According to Qamar, chest X-rays are one of the most common ways to diagnose thoracic disorders, which are conditions that can affect the heart, great vessels, lungs, esophagus, chest wall, diaphragm, and mediastinum. Interpreting medical images requires extensive knowledge and experience, so radiologists need a significant amount of time to review X-ray data. While radiologists play a vital role in the healthcare system, the World Health Organization reports that 60 percent of the population worldwide does not have access to these medical professionals. Given this deficit, models like Z-Net offer a potential solution.
“This technology is going to be helpful in addressing health disparities and accessing cost-effective healthcare,” says Qamar, noting that, unlike the human eye, automated models can analyze an X-ray down to the individual pixel. “Early detection of thoracic diseases has a huge impact on saving lives.”
Qamar says that, due to bias and human error, two radiologists can interpret the same X-ray differently, an issue that may leave a problem undiagnosed, underdiagnosed, or overdiagnosed. He adds that a 2017 study revealed that an automated model outperformed four experienced radiologists by correctly identifying pneumonia among 420 radiographs.
As automated tools and technologies empower providers to improve their care, Qamar believes automated frameworks will supplement radiologists' roles. In the future, radiologists' responsibilities could shift toward monitoring the model's output and prescribing therapies and medications.
Qamar notes that the framework he and his fellow researchers developed can be applied to other types of medical imaging with slight modifications and a change of dataset. The current model targets thoracic disorders and was trained on a hospital dataset of 112,120 chest radiographs. He says chest X-rays can also reveal signs of heart conditions, such as heart enlargement, fluid accumulation in the lungs, or abnormalities in the blood vessels. The framework may also be applicable to identifying other pulmonary conditions such as chronic obstructive pulmonary disease, pulmonary fibrosis, or lung cancer; additionally, a variation of the model could be used to identify retinal diseases.
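The sketch below illustrates the "swap the dataset, slightly modify the model" idea Qamar describes: the same backbone is reused, and only the training data and output labels change. The folder layout, label count, and hyperparameters are illustrative assumptions, not details from the Diagnostics paper.

```python
# Hedged sketch of retargeting an imaging classifier to a new modality
# (e.g., retinal images) by changing the dataset and the output head.
# Paths and settings are hypothetical.
import torch
from torch import nn, optim
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Point this at a different imaging dataset, organized one class per
# subdirectory; the rest of the pipeline stays the same apart from the
# number of output labels.
dataset = datasets.ImageFolder("retinal_images/", transform=preprocess)  # hypothetical path
loader = DataLoader(dataset, batch_size=16, shuffle=True)

model = models.densenet121(weights="IMAGENET1K_V1")
model.classifier = nn.Linear(model.classifier.in_features, len(dataset.classes))

criterion = nn.CrossEntropyLoss()            # single-label setup, kept simple for illustration
optimizer = optim.Adam(model.parameters(), lr=1e-4)

model.train()
for images, targets in loader:               # one illustrative pass over the data
    optimizer.zero_grad()
    loss = criterion(model(images), targets)
    loss.backward()
    optimizer.step()
```

In this setup, the learned image features carry over from one modality to the next, which is why only modest changes are needed to adapt the approach to new diseases or new kinds of scans.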
Looking to the future, the researchers, who include individuals from COMSATS University Islamabad and the Institute of Space Technology in Pakistan, aim to create a more accurate and sophisticated model that can locate overlapping signs of disease and identify correlations between different diseases.
“Anything in healthcare that has to do with imaging will be transformed in the next few years,” Qamar says.