Research Article
Deep Learning-Based Networks for Detecting Anomalies in
Chest X-Rays
Malek Badr (1,2,3), Shaha Al-Otaibi (4), Nazik Alturki (4), and Tanvir Abir (5)

1 The University of Mashreq, Research Center, Baghdad, Iraq
2 Department of Medical Instruments Engineering Techniques, Al-Farahidi University, Baghdad 10021, Iraq
3 Research Center, The University of Mashreq, Baghdad, Iraq
4 Department of Information Systems, College of Computer and Information Sciences, Princess Nourah bint Abdulrahman University, P. O. Box 84428, Riyadh 11671, Saudi Arabia
5 Department of Business Administration, Faculty of Business and Entrepreneurship, Daffodil International University, Dhaka, Bangladesh
Correspondence should be addressed to Tanvir Abir; tanvir.ba02876.c@diu.edu.bd
Received 5 June 2022; Revised 20 June 2022; Accepted 24 June 2022; Published 23 July 2022
Academic Editor: Dinesh Rokaya
Copyright © 2022 Malek Badr et al. This is an open access article distributed under the Creative Commons Attribution License,
which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
X-ray images aid medical professionals in the diagnosis and detection of pathologies. They are critical, for example, in the
diagnosis of pneumonia, the detection of masses, and, more recently, the detection of COVID-19-related conditions. The chest
X-ray is one of the first imaging tests performed when pathology is suspected because it is one of the most accessible
radiological examinations. Deep learning-based neural networks, particularly convolutional neural networks, have exploded in
popularity in recent years and have become indispensable tools for image classification. Transfer learning approaches, in
particular, have enabled the use of previously trained networks’ knowledge, eliminating the need for large data sets and
lowering the high computational costs associated with this type of network. This research focuses on using deep learning-based
neural networks to detect anomalies in chest X-rays. Different convolutional network-based approaches are investigated using
the ChestX-ray14 database, which contains over 100,000 X-ray images with labels relating to 14 different pathologies, and
different classification objectives are evaluated. Starting with the pretrained networks VGG19, ResNet50, and Inceptionv3,
networks based on transfer learning are implemented, with different schemes for the classification stage and data
augmentation. Similarly, an ad hoc architecture is proposed and evaluated without transfer learning for the classification
objective with more examples. The results show that transfer learning produces acceptable results in most of the tested cases, indicating that it is a viable first step for applying deep networks when labeled images are scarce, a common situation when working with medical images. The ad hoc network, on the other hand, demonstrated good generalization with data augmentation and an acceptable accuracy value. The findings suggest that convolutional neural networks, with or without transfer learning, are a sound basis for designing classifiers that detect pathologies in chest X-rays.
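The transfer-learning setup described above can be sketched as follows. This is a minimal illustration, not the authors' exact code: a VGG19 backbone is frozen and a new multi-label head is attached for the 14 ChestX-ray14 pathologies. The 224 × 224 input size, the 256-unit hidden layer, and the use of `weights=None` (to keep the sketch self-contained; in practice one would pass `weights="imagenet"` to reuse pretrained features) are assumptions of this sketch.

```python
# Sketch of a transfer-learning classifier for ChestX-ray14 (assumed
# hyperparameters; not the authors' exact configuration).
import tensorflow as tf


def build_model(num_labels=14, input_shape=(224, 224, 3)):
    # Pretrained-style backbone; weights=None keeps this sketch offline.
    # For actual transfer learning, use weights="imagenet".
    base = tf.keras.applications.VGG19(
        include_top=False, weights=None, input_shape=input_shape)
    base.trainable = False  # freeze the convolutional backbone

    x = tf.keras.layers.GlobalAveragePooling2D()(base.output)
    x = tf.keras.layers.Dense(256, activation="relu")(x)
    # 14 independent sigmoids: each X-ray can carry several pathology labels.
    outputs = tf.keras.layers.Dense(num_labels, activation="sigmoid")(x)

    model = tf.keras.Model(base.input, outputs)
    model.compile(optimizer="adam", loss="binary_crossentropy")
    return model


model = build_model()
```

Because the labels are not mutually exclusive, the head uses per-label sigmoid outputs with binary cross-entropy rather than a single softmax.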
1. Introduction
Advances in data acquisition, storage, and processing allow
for cheap, large-scale data collection [1]. They have
improved the ability to process data into useful information
and advance knowledge. These advances have caused a substantial increase in the information available in medical imaging, leaving behind the days when health data was scarce. This abundance, in turn, poses a great challenge for developing analysis and interpretation tools that aid decision-making. The computer systems of many modern hospitals store large volumes of chest X-rays and radiological reports [2].
Digital image processing (DIP) allows segmentation and
classification of medical images [3]. In this context, segmen-
tation defines a partition so that the obtained regions corre-
spond to anatomical structures, processes, or regions of
Hindawi, BioMed Research International, Volume 2022, Article ID 7833516, 10 pages. https://doi.org/10.1155/2022/7833516