An Approach for Quantitative Evaluation of the Degree of Facial Paralysis Based
on Salient Point Detection
Junyu Dong¹, Lijing Ma¹, Qingqiang Li¹, Shengke Wang¹, Li-an Liu², Yang Lin¹, Muwei Jian¹
¹ College of Information Science and Engineering,
Ocean University of China, Qingdao, Shandong, 266100, China
² Qingdao Hiser Medical Center, Qingdao, Shandong, China
dongjunyu@ouc.edu.cn, malijing66@163.com
Abstract
Facial paralysis can affect one or both sides of the
face, and the unilateral form is much more common. The
disease can impose significant psychological and
functional impairment on patients. Traditionally, patients
with facial paralysis are evaluated and examined by
physicians based on manual measurement of certain
differences between the two sides of the face. In this
paper, we describe a new approach for quantitatively
estimating the degree of facial paralysis. We first
determine key points based on salient point detection;
then the differences between the two facial sides are
calculated. The proposed method has proved useful and
effective in practice.
1. Introduction
Facial paralysis seriously deteriorates patients’
normal lives. Meanwhile, traditional methods that depend
solely on a physician’s diagnosis are time-consuming and
subjective. An accurate method for assessing the facial
nerve system is therefore both useful and necessary.
There are more than twenty methods for facial
paralysis assessment in the literature. Recent and
frequently used methods include the Nottingham system [1],
the Toronto facial grading system (TFGS) [2-3], the facial
nerve function index (FNFI), the linear measurement index
(LMI) [4] and the House-Brackmann (H-B) system [5]. These
methods have shortcomings in integration, feasibility,
accuracy and reliability, and in general are not commonly
employed in practice.
In this paper, we describe a method for evaluating the
degree of unilateral facial paralysis caused by facial
nerve disorders, and for making a quantitative assessment
of a patient’s paralysis and health status in order to
help physicians choose the most appropriate treatment
scheme. Our facial paralysis detection method is composed
of face detection, salient point detection [6], edge
detection, K-MEANS clustering [7], key point detection
and a final evaluation. In our practice, it has proved to
be feasible and time-saving.
The main idea is as follows. We first find the salient
points in the face region. Since the salient points include
some points that cannot describe facial features, edge
detection is used to discard these points. Then K-MEANS
clustering is applied to classify the remaining salient
points into six categories representing the two eyebrows,
two eyes, nose and mouth. Following this, fourteen key
points are found in the six facial regions; these points
largely represent the state of the disease. Finally,
certain vertical distances are calculated to assess the
degree of the patient’s paralysis.
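The final evaluation can be illustrated with a small sketch: given pairs of key points on each facial side, compare the corresponding vertical distances. The point pairing and the averaging formula below are illustrative assumptions, not the paper's exact measurements.

```python
# Hedged sketch: quantify left/right asymmetry from paired key points.
# The pairing scheme and scoring formula are illustrative assumptions,
# not the paper's exact measurements.

def vertical_distance(p1, p2):
    """Absolute difference of the y-coordinates of two (x, y) points."""
    return abs(p1[1] - p2[1])

def asymmetry_score(left_pairs, right_pairs):
    """Mean absolute difference between corresponding vertical
    distances on the left and right facial sides (0 = symmetric)."""
    assert len(left_pairs) == len(right_pairs)
    diffs = [abs(vertical_distance(*l) - vertical_distance(*r))
             for l, r in zip(left_pairs, right_pairs)]
    return sum(diffs) / len(diffs)

# Example: eyebrow-to-eye distance on each side (toy coordinates).
left = [((30, 40), (32, 60))]    # left eyebrow point, left eye point
right = [((90, 40), (92, 70))]   # right eyebrow point, right eye point
score = asymmetry_score(left, right)  # |20 - 30| = 10
```

A larger score indicates a larger difference between the two facial sides, which under this assumed formulation corresponds to a more severe degree of paralysis.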
2. Pre-processing to facial images
The facial images were captured by ourselves in the
hospital. Since they are original images, some noise is
inevitably introduced and deteriorates their quality.
Pre-processing is therefore necessary to remove the noise
and simplify the subsequent processing. First, the images
are converted to grayscale, and then certain filters are
applied to remove the noise.
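As a minimal sketch of this pre-processing, the following converts an RGB image to grayscale and smooths it; the paper does not specify which filters were used, so a 3x3 mean filter stands in here as an assumption.

```python
import numpy as np

# Hedged sketch of the pre-processing step: grayscale conversion
# followed by a 3x3 mean filter. The actual filters used in the paper
# are unspecified; the mean filter here is an illustrative stand-in.

def to_gray(rgb):
    """Luminance-weighted grayscale conversion (ITU-R BT.601 weights)."""
    return rgb @ np.array([0.299, 0.587, 0.114])

def mean_filter3(img):
    """3x3 mean filter; border pixels keep their original values."""
    out = img.astype(float).copy()
    h, w = img.shape
    out[1:-1, 1:-1] = sum(
        img[1 + dy: h - 1 + dy, 1 + dx: w - 1 + dx]
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
    ) / 9.0
    return out
```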
3. Estimating degrees of facial paralysis
In order to estimate the degrees of the facial paralysis,
the following steps are taken:
(1) Face region detection: to confine the facial features
within a small rectangle in the image;
(2) Salient point detection: to find all salient points of
the facial features;
(3) Edge detection: the SUSAN algorithm is used to find the
edges of the facial features, including the two eyebrows,
two eyes, nose and mouth;
(4) K-MEANS clustering: to classify the salient points
into six regions;
(5) Key point detection: to find the fourteen key points;
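Step (4) above can be sketched as a plain K-means loop over 2-D salient points. The initialization (first k points as centers) and the iteration count are illustrative assumptions; the paper does not specify them.

```python
import numpy as np

# Hedged sketch of the K-MEANS step: group 2-D salient points into k
# clusters (k = 6 for two eyebrows, two eyes, nose and mouth). The
# initialization and iteration count are illustrative assumptions.

def kmeans(points, k=6, iters=20):
    """Cluster an (n, 2) array of points into k groups; returns
    per-point labels and the final cluster centers."""
    centers = points[:k].astype(float)      # illustrative initialization
    labels = np.zeros(len(points), dtype=int)
    for _ in range(iters):
        # distance of every point to every center, then nearest center
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean of its assigned points
        for j in range(k):
            if (labels == j).any():
                centers[j] = points[labels == j].mean(axis=0)
    return labels, centers
```

After clustering, each of the six groups corresponds to one facial region, within which the key points can then be selected.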
International Symposium on Intelligent Information Technology Application Workshops
978-0-7695-3505-0/08 $25.00 © 2008 IEEE
DOI 10.1109/IITA.Workshops.2008.93