Journal of Computer Science 9 (11): 1435-1442, 2013
ISSN: 1549-3636
© 2013 Science Publications
doi:10.3844/jcssp.2013.1435.1442 Published Online 9 (11) 2013 (http://www.thescipub.com/jcs.toc)
Corresponding Author: Hendy Yeremia, Department of IT, School of Computer Science, Bina Nusantara University,
Jakarta-Indonesia
GENETIC ALGORITHM AND NEURAL
NETWORK FOR OPTICAL CHARACTER RECOGNITION
Hendy Yeremia, Niko Adrianus Yuwono, Pius Raymond and Widodo Budiharto
Department of IT, School of Computer Science, Bina Nusantara University, Jakarta-Indonesia
Received 2013-01-15, Revised 2013-04-16; Accepted 2013-09-19
ABSTRACT
Computer systems can now recognize written characters much as the human brain does. The method most widely used for character recognition is the backpropagation network. Backpropagation is known for its accuracy: because the network learns and improves itself, it can reach high recognition rates. On the other hand, backpropagation is used less often because of the long training time needed to achieve the best possible result. In this study, the backpropagation algorithm is combined with a genetic algorithm to achieve both accuracy and training speed in recognizing alphabetic characters. The genetic algorithm is used to find the best initial values for the network's architecture and synaptic weights, so that the network can reach its best accuracy within a shorter period of time. The optimized backpropagation network has better accuracy and a shorter training time than the standard backpropagation network. The recognition accuracies differ by 10.77 percentage points: a success rate of 90.77% for the optimized backpropagation network versus 80% for the standard backpropagation network. The training time needed for the learning phase improved significantly, from 3 h 14 min 40 sec for standard backpropagation to 2 h 18 min 1 sec for the optimized backpropagation network.
Keywords: Backpropagation Network, Genetic Algorithm, Optical Character Recognition, Optimized
Artificial Neural Network
1. INTRODUCTION
The human brain consists of roughly 10^11 interconnected neurons that enable our reading, breathing, motion and thinking. In terms of learning, the human brain is superior to a microprocessor. Because of that fact, the backpropagation network tries to imitate the brain's ability to learn from experience (Pinjare and Kumar, 2012).
Backpropagation is probably the most common method for training feed-forward neural networks. A forward pass propagates an input pattern through the network and produces an actual output. The backward pass uses the desired output corresponding to that input pattern and updates the weights according to the error signal. There are hundreds of papers covering the subject of backpropagation. Unfortunately, many of them tend to exhibit a vast stockpile of equations and complicated partial derivatives with undefined variables to explain a concept that is really quite simple. Quite often, a pseudocode algorithm or an example with pictures is the most efficient way to convey the idea. Nevertheless, the most popular method in optical character recognition is the backpropagation network. This method's weakness is that the time required to achieve the best result in recognizing alphabetic characters tends to be long. Backpropagation itself can make the preprocessing phase for alphabet recognition less complex than a genetic algorithm can (Negnevitsky, 2005).
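In the spirit of the pseudocode example the text recommends, the forward and backward passes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the single hidden layer, sigmoid activation, learning rate and toy XOR data are all assumptions chosen for brevity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train(X, T, hidden=4, lr=0.5, epochs=5000, seed=0):
    # Illustrative layer sizes and learning rate (assumptions).
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))   # input -> hidden
    W2 = rng.normal(scale=0.5, size=(hidden, T.shape[1]))   # hidden -> output
    for _ in range(epochs):
        # Forward pass: propagate the input pattern to an actual output.
        H = sigmoid(X @ W1)
        Y = sigmoid(H @ W2)
        # Backward pass: error signal from the desired output T,
        # scaled by the sigmoid derivative y * (1 - y).
        dY = (Y - T) * Y * (1 - Y)
        dH = (dY @ W2.T) * H * (1 - H)
        # Weight updates proportional to the propagated error signal.
        W2 -= lr * H.T @ dY
        W1 -= lr * X.T @ dH
    return W1, W2

# Toy usage: train on the XOR patterns.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
W1, W2 = train(X, T)
Y = sigmoid(sigmoid(X @ W1) @ W2)
```

A genetic algorithm, as proposed in this study, would replace the random initialization of W1 and W2 (and the choice of the hidden-layer size) with values evolved to shorten this training loop.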
A genetic algorithm is used to optimize what a standard backpropagation network lacks: its architecture and initial weights. This algorithm is often used to find an optimal solution to complex problems (Matic, 2010) by