978-1-5386-7266-2/18/$31.00 ©2018 IEEE
A Facial Expression Recognition Approach Using
DCNN for Autistic Children to Identify Emotions
Md Inzamam Ul Haque
Ingram School of Engineering
Texas State University
San Marcos, USA
m_h536@txstate.edu
Damian Valles
Ingram School of Engineering
Texas State University
San Marcos, USA
dvalles@txstate.edu
Abstract—This paper presents the initial stage of a research effort
to teach young autistic children to recognize human facial
expressions with the help of computer vision and image processing.
The focus of this paper is facial expression recognition using a
deep convolutional neural network (DCNN). Kaggle's FER2013
dataset was used to train and experiment with a DCNN model. Once
a satisfactory result was achieved, the dataset was modified to
produce images under four different lighting conditions, and each
of these modified datasets was trained with the same model. This
step supports the end goal of the research: recognizing facial
expressions in any possible environment. Finally, the results
obtained with the different datasets are compared, and future work
of the project is outlined.
Keywords—Facial Expression Recognition, Autistic Children,
DCNN, Loss, Accuracy
I. INTRODUCTION
Strong and meaningful human interaction is necessary to
convey feelings and communicate with another person. Along
with verbal communication, feelings can also be conveyed
nonverbally, through body language, facial expressions,
attitude, and movement [1]. One of the nonverbal channels
through which the mood or mental state of a person can be
understood is the expression of the face [2]. Facial expressions play a
significant role in interpersonal communication. In 1971,
Ekman et al. [3] identified six facial expressions that are
universal across all cultures: anger, disgust, fear, happiness,
sadness, and surprise.
Infants learn nonverbal communication through social-emotional
interaction, which makes the face, rather than the voice, the
dominant communication channel [4]. For autistic children,
face processing is a challenging task. It has been argued that the
ability of autistic children to understand facial expression is
impaired and this inability may account for other problems that
they demonstrate during social interaction [5]. Several studies,
including [6] and [7], showed that autistic children are impaired
in classifying and understanding facial expressions compared to
normal children of the same age. Interestingly, most of these
studies used static front-view images or drawings.
In this paper, a novel idea is presented: teaching young
autistic children to recognize human facial expressions in a
friendly and practical environment. Since most autistic children
like to play with gadgets such as smartphones and tablets, the
goal is to teach them to recognize facial expressions using these
gadgets' cameras. When a child points the camera towards a
person, the app will automatically detect the face and classify
the person's facial expression, which will then be shown on the
gadget's screen in the form of an emoticon. The goal of this
research is to use these emoticons to show autistic children how
the person at whom they are pointing the camera is feeling and
what emotional characteristics that person is displaying. To make
the model robust to any environment and viewing angle, it will be
trained not only with front-view facial images but also with
images of faces from different orientations, i.e., side, top, and
bottom views. The model should also be able to predict facial
expressions in different lighting environments, from darker to
lighter shades of contrast. Fig. 1 shows a simple workflow
diagram of the application.
Fig. 1. Workflow diagram for the application of the research.
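This section does not specify how the darker and lighter lighting variants of the dataset were produced; as a hedged sketch, gamma correction is one common and minimal way to simulate such conditions on grayscale face images. The gamma values and condition names below are illustrative assumptions, not the paper's actual settings:

```python
import numpy as np

def adjust_lighting(image, gamma):
    """Simulate a lighting condition by gamma-correcting a grayscale image.

    image: uint8 array with pixel values in [0, 255].
    gamma < 1 brightens the image; gamma > 1 darkens it.
    """
    normalized = image.astype(np.float32) / 255.0
    corrected = np.power(normalized, gamma)
    return (corrected * 255.0).clip(0, 255).astype(np.uint8)

# Hypothetical gamma values for four lighting conditions; the
# paper's actual generation procedure is not given in this section.
GAMMAS = {"darkest": 2.0, "dark": 1.5, "light": 0.67, "lightest": 0.5}

def make_lighting_variants(image):
    """Return one modified copy of the input image per lighting condition."""
    return {name: adjust_lighting(image, g) for name, g in GAMMAS.items()}
```

Applying `make_lighting_variants` to every image in a dataset such as FER2013 would yield one derived dataset per lighting condition, each of which could then be used to retrain the same model, in the spirit of the procedure the abstract describes.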
The second step of the overall application, facial expression
recognition, will be carried out with the help of computer vision
and neural networks, mainly using a DCNN (Deep Convolutional
Neural Network) design approach. In recent
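As a minimal, self-contained illustration of what a DCNN's convolutional stages compute on a 48×48 grayscale FER2013 face, the following NumPy sketch applies one convolution, ReLU, and max-pooling step. The input and kernel are random placeholders, not the paper's trained model:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution (no padding, stride 1) producing one feature map."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1), dtype=np.float32)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Rectified linear activation: zero out negative responses."""
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Non-overlapping max pooling that halves each spatial dimension."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# One conv -> ReLU -> pool stage on a FER2013-sized 48x48 grayscale face.
face = np.random.default_rng(0).random((48, 48)).astype(np.float32)
kernel = np.random.default_rng(1).standard_normal((3, 3)).astype(np.float32)
feature_map = max_pool(relu(conv2d(face, kernel)))
```

A full DCNN stacks several such stages, with many learned kernels per stage, followed by fully connected layers that map the pooled features to the expression classes.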