Cox Regressive Winsorized Correlated Convolutional
Deep Belief Boltzmann Network for Covid-19
Prediction with Big Data
Dr. K. Sankar¹, Dr. Divya Rohatgi² and S. Balakrishna Reddy³
¹,³ CVR College of Engineering, Dept. of CSE, Hyderabad, India
Email: sankarkrish@cvr.ac.in, sama.balakrishnareddy@gmail.com
² Amity University, Dept. of CSE, Mumbai, India
Email: divi.rohatgi@gmail.com
Abstract—Big data analytics in the health care industry is a promising approach for
integrating, discovering, and analyzing huge volumes of complex heterogeneous data. Existing
techniques are not well suited to medical data classification with minimum time
consumption. To improve disease prediction accuracy, a novel technique called Cox Regressive
Winsorized Correlated Convolutional Deep Belief Boltzmann Network (CRWCCDBBN) is
introduced. The proposed CRWCCDBBN technique uses multiple layers to perform the
different processes of feature selection and classification. First, feature selection is
performed using Cox regression, which minimizes the dimensionality of the data by computing the Cox
partial log-likelihood between pairs of features. After feature selection, classification is
performed using the Winsorized correlation coefficient by analyzing the training and testing
disease data. Based on the classification results, Covid-19 is correctly diagnosed with
minimum error by updating the weight values. Finally, the gradient descent first-order iterative
function is used to find the local minimum of the error. Experimental evaluation of the CRWCCDBBN
technique is carried out on a Covid-19 dataset with performance metrics such as
accuracy, precision, recall, F-measure, and prediction time with respect to the number of patient
records. The observed results show that the proposed CRWCCDBBN technique achieves higher
prediction accuracy with lower time consumption than state-of-the-art methods.
Index Terms— Big Data, Convolutional Deep Belief Boltzmann Network, Cox Regression,
feature selection, Winsorized Correlation coefficient, gradient descent first-order iterative
function.
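As a rough, illustrative sketch (not the authors' implementation), the two scoring criteria named in the abstract can be expressed in Python. The 10% Winsorization proportion, the single-covariate Breslow form of the Cox partial likelihood, and the toy data are all assumptions of this sketch:

```python
import numpy as np

def winsorize(x, p=0.1):
    """Clamp the lowest/highest p-fraction of values to the p-th and
    (1-p)-th percentiles, limiting the influence of outliers."""
    lo, hi = np.percentile(x, [100 * p, 100 * (1 - p)])
    return np.clip(x, lo, hi)

def winsorized_correlation(x, y, p=0.1):
    """Pearson correlation of Winsorized copies of x and y: a robust
    similarity measure between two data vectors."""
    return np.corrcoef(winsorize(x, p), winsorize(y, p))[0, 1]

def cox_partial_log_likelihood(time, event, x, beta):
    """Breslow-form Cox partial log-likelihood of a single feature x
    with coefficient beta; higher values suggest the feature is more
    informative about event timing."""
    eta = beta * x
    ll = 0.0
    for i in np.where(event == 1)[0]:
        at_risk = time >= time[i]          # risk set at time t_i
        ll += eta[i] - np.log(np.exp(eta[at_risk]).sum())
    return ll

# Toy demonstration: a gross outlier barely moves the Winsorized coefficient.
rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = x + 0.1 * rng.normal(size=50)
x_out = x.copy()
x_out[0] = 100.0                           # inject an outlier
print("plain     :", np.corrcoef(x_out, y)[0, 1])
print("winsorized:", winsorized_correlation(x_out, y))
```

In this reading, features would be ranked by their partial log-likelihood before classification, and the Winsorized correlation would compare training and testing feature vectors; the trimming proportion controls how aggressively extreme patient values are clipped.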
I. INTRODUCTION
A novel viral disease called COVID-19 has spread across the world, causing various problems. There are still limitations
in the real-time detection of COVID-19, such as large volumes of data, class imbalance, and the
misclassification rates of models. Therefore, a novel deep learning-based model is investigated for the effective
detection of COVID-19. A hybrid deep learning prediction model, CNN-LSTM, was introduced in [1] to correctly
forecast COVID-19 cases. However, it was not tested on large time-series datasets to determine its
suitability and correctness. A Deep-LSTM ensemble model was introduced in [2] to predict Covid-19, but it
was not efficient enough to provide better results in minimum time. Supervised machine learning models were
Grenze ID: 01.GIJET.9.1.547
© Grenze Scientific Society, 2023
Grenze International Journal of Engineering and Technology, Jan Issue