Algorithms 2022, 15, 129. https://doi.org/10.3390/a15040129 www.mdpi.com/journal/algorithms
Article
Convolutional-Neural-Network-Based Handwritten Character Recognition: An Approach with Massive Multisource Data
Nazmus Saqib 1,*, Khandaker Foysal Haque 2, Venkata Prasanth Yanambaka 1 and Ahmed Abdelgawad 1
1 College of Science and Engineering, Central Michigan University, Mount Pleasant, MI 48859, USA; yanam1v@cmich.edu (V.P.Y.); abdel1a@cmich.edu (A.A.)
2 Institute for the Wireless Internet of Things, Northeastern University, Boston, MA 02115, USA; haque.k@northeastern.edu
* Correspondence: saqib1n@cmich.edu
Abstract: Neural networks have made significant strides in image classification, and convolutional neural networks (CNNs) make it possible to run neural networks directly on raw images. Handwritten character recognition (HCR) is now a powerful tool for tasks such as traffic-sign detection, language translation, and information extraction from documents. Although HCR technology is already used in industry, its present accuracy is not outstanding, which compromises both performance and usability. The character recognition systems in use are therefore still not fully reliable and need further improvement before they can be extensively deployed for serious and reliable tasks. On this account, recognition of English alphabet characters and digits is performed by proposing a custom-tailored CNN model trained on two datasets of handwritten images, i.e., Kaggle and MNIST, respectively; the proposed models are lightweight yet achieve higher accuracies than state-of-the-art models. From a total of twelve designed models, the best two are proposed by altering hyper-parameters and observing which model provides the best accuracy for which dataset. In addition, the classification reports (CRs) of these two proposed models are extensively investigated using performance metrics such as precision, recall, specificity, and F1 score, which are obtained from the developed confusion matrix (CM). To simulate a practical scenario, the datasets are kept unbalanced, and three further averages of the F measure (micro, macro, and weighted) are calculated, which facilitates a better understanding of the performance of the models. The highest accuracy for digit recognition, 99.642%, is achieved by the model using the 'RMSprop' optimizer at a learning rate of 0.001, whereas the highest accuracy for alphabet recognition, 99.563%, is obtained by the proposed model using the 'ADAM' optimizer at a learning rate of 0.00001. The macro F1 and weighted F1 scores of the best two models are 0.998 and 0.997 for digit recognition, and 0.992 and 0.996 for alphabet recognition, respectively.
Keywords: handwritten character recognition; English character recognition; convolutional neural
networks (CNNs); deep learning in character recognition; digit recognition; English alphabet
recognition
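The abstract evaluates the unbalanced datasets with macro- and weighted-averaged F1 scores derived from a confusion matrix. As a minimal sketch of how these two averages differ (the confusion matrix below is a made-up two-class example, not data from the paper):

```python
def f1_averages(cm):
    """cm[i][j] = number of samples of true class i predicted as class j."""
    n = len(cm)
    support = [sum(row) for row in cm]  # true samples per class
    f1 = []
    for c in range(n):
        tp = cm[c][c]
        fp = sum(cm[r][c] for r in range(n)) - tp  # column sum minus diagonal
        fn = support[c] - tp                       # row sum minus diagonal
        p = tp / (tp + fp) if tp + fp else 0.0     # precision
        r = tp / (tp + fn) if tp + fn else 0.0     # recall
        f1.append(2 * p * r / (p + r) if p + r else 0.0)
    macro = sum(f1) / n  # unweighted mean: every class counts equally
    weighted = sum(f * s for f, s in zip(f1, support)) / sum(support)
    return macro, weighted

# Hypothetical unbalanced example: class 0 has 52 samples, class 1 has 11.
cm = [[50, 2],
      [1, 10]]
macro, weighted = f1_averages(cm)
```

On unbalanced data the two averages diverge: the weighted score is pulled toward the F1 of the majority class, while the macro score exposes weak minority-class performance, which is why the paper reports both.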
1. Introduction
Handwriting is the most typical and systematic way of recording facts and information. An individual's handwriting is idiosyncratic and unique to that person. The capability of software or a device to recognize and analyze human handwriting in any language is called a handwritten character recognition (HCR) system. Recognition can be performed on both online and offline handwriting. In recent years, applications of handwriting recognition have been thriving; it is widely used in reading postal addresses, language translation, bank forms and check amounts, digital libraries, keyword spotting, and traffic-sign detection.
Citation: Saqib, N.; Haque, K.F.; Yanambaka, V.P.; Abdelgawad, A. Convolutional-Neural-Network-Based Handwritten Character Recognition: An Approach with Massive Multisource Data. Algorithms 2022, 15, 129. https://doi.org/10.3390/a15040129
Academic Editor: Marcos Zampieri
Received: 5 March 2022; Accepted: 12 April 2022; Published: 14 April 2022