REVIEW ARTICLE
Recognition of Fetal Facial Expressions Using Artificial
Intelligence Deep Learning
Yasunari Miyagi¹, Toshiyuki Hata², Saori Bouno³, Aya Koyanagi⁴, Takahito Miyake⁵
ABSTRACT
Fetal facial expressions are useful parameters for assessing brain function and development in the latter half of pregnancy. Previous investigations
have studied the subjective assessment of fetal facial expressions using four-dimensional ultrasound. Artificial intelligence (AI) can enable the
objective assessment of fetal facial expressions. AI recognition of fetal facial expressions may open the door to a new scientific field, such as
"AI science of the fetal brain", and fetal neurobehavioral science using AI is at the dawn of a new era. Our knowledge of fetal neurobehavior
and neurodevelopment will be advanced through AI recognition of fetal facial expressions. AI may be an important modality in current and
future research on fetal facial expressions and may assist in the evaluation of fetal brain function.
Keywords: Artificial intelligence, Deep learning, Facial recognition, Fetus, Machine learning, Ultrasonography.
Donald School Journal of Ultrasound in Obstetrics and Gynecology (2021): 10.5005/jp-journals-10009-1710
INTRODUCTION
Fetal behaviors such as fetal movements and facial expressions
that have been observed by four- (4D) or three-dimensional (3D)
ultrasound have been deemed to be related to the development
of the fetal central nervous system.1–11 A scoring system,12 which
was originally reported by Kurjak et al. and later modified by
Stanojevic et al.,13 can evaluate fetal neurobehavioral development
by evaluating fetal movements and facial expressions. Fetal facial
movements and expressions such as blinking, a face without any
expression, mouthing, scowling, smiling, sucking, tongue expulsion,
and yawning can be evaluated by 4D ultrasound from the beginning
of the 2nd trimester of pregnancy.2,14 Eye blinking (blinking) is a
reflex response possibly related to brain function maturation and
development that occurs with advancing gestation.14–18 Mouthing
is the most frequent expression and is recognized as a sign of fetal
brain maturation if it occurs together with non-rapid eye movement
after 35 weeks of gestation.19 The frequency of scowling, which might
indicate fetal suffering from pain or stress in utero,20 increases
with advancing gestation.21 Smiling might indicate a state of brain
development capable of performing complex facial movements.22,23
The correlation of an expressionless face and tongue expulsion with
brain function is unclear.14 Yawning may be utilized as an index of
fetal development.24,25 Therefore, it is important to investigate fetal
facial expressions. There have been, however, no standard objective
methods to evaluate fetal facial expressions.
Recently, artificial intelligence (AI) has advanced into the field of
medicine. In different fields of obstetrics and gynecology, research
works relevant to AI have been published.26–35 A well-trained AI
classifier that can evaluate and classify fetal facial expressions
would help investigate the development of the fetal central
nervous system. The AI recognition of adult facial expressions
has been investigated. Kim et al. reported that the accuracy of AI
facial expression recognition was 0.965.36 Adult facial expressions
can convey human mental states and behavior, and their analysis
is applicable to marketing, healthcare, safety, environment, and
social media.37
In this review article, we introduce the updated status of AI
recognition of fetal facial expressions as a significant parameter
for fetal brain function and suggest recommendations for future
research on fetal brain development and function.
RECOGNITION OF FETAL FACIAL EXPRESSIONS USING AI
All data per fetus are divided at random into test/training/validation
datasets in a ratio that is not fixed but is commonly set to
0.20/0.64/0.16. In this way, training datasets, validation datasets,
and non-overlapping test datasets are created.
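As a minimal illustration, the random split described above can be sketched in Python. This is a hypothetical helper, not the authors' actual procedure: the function name, the fixed seed, and the use of the common 0.20/0.64/0.16 fractions as defaults are assumptions; splitting is applied to whole per-fetus units so that no fetus contributes to more than one subset.

```python
import random

def split_per_fetus(units, seed=0, test_frac=0.20, train_frac=0.64, val_frac=0.16):
    """Randomly partition per-fetus units (e.g., one entry per fetus) into
    non-overlapping test/training/validation subsets.

    The default 0.20/0.64/0.16 ratio follows the commonly used split; the
    fractions and seed are illustrative assumptions.
    """
    assert abs(test_frac + train_frac + val_frac - 1.0) < 1e-9
    shuffled = list(units)                 # copy so the input is untouched
    random.Random(seed).shuffle(shuffled)  # seeded, hence reproducible, shuffle
    n = len(shuffled)
    n_test = round(n * test_frac)
    n_train = round(n * train_frac)
    test = shuffled[:n_test]
    train = shuffled[n_test:n_test + n_train]
    val = shuffled[n_test + n_train:]      # remainder, approximately val_frac
    return test, train, val
```

With 100 units, this yields subsets of 20, 64, and 16, and because each unit appears in exactly one slice of the shuffled list, the three subsets cannot overlap.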
The AI classifier is then designed. An AI classifier composed of a
convolutional neural network (CNN)38–43 for classifying categories
is often used for image recognition. The CNN usually comprises
¹Department of Gynecology, Miyake Ofuku Clinic, Okayama, Japan;
Medical Data Labo, Okayama, Japan; Department of Gynecologic
Oncology, Saitama Medical University International Medical Center,
Hidaka, Japan
²Department of Obstetrics and Gynecology, Miyake Clinic, Okayama,
Japan; Department of Perinatology and Gynecology, Kagawa
University Graduate School of Medicine, Kagawa, Japan
³,⁴Department of Obstetrics and Gynecology, Miyake Clinic, Okayama,
Japan
⁵Department of Gynecology, Miyake Ofuku Clinic, Okayama, Japan;
Department of Obstetrics and Gynecology, Miyake Clinic, Okayama,
Japan; Department of Perinatology and Gynecology, Kagawa
University Graduate School of Medicine, Kagawa, Japan
Corresponding Author: Yasunari Miyagi, Department of Gynecology,
Miyake Ofuku Clinic, Okayama, Japan; Medical Data Labo, Okayama,
Japan; Department of Gynecologic Oncology, Saitama Medical
University International Medical Center, Hidaka, Japan, Phone: +81-
86-281-2020, e-mail: ymiyagi@mac.com
How to cite this article: Miyagi Y, Hata T, Bouno S, et al. Recognition
of Fetal Facial Expressions Using Artificial Intelligence Deep Learning.
Donald School J Ultrasound Obstet Gynecol 2021;15(3):223–228.
Source of support: Nil
Conflict of interest: None
© Jaypee Brothers Medical Publishers. 2021 Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License
(https://creativecommons.org/licenses/by-nc/4.0/), which permits unrestricted use, distribution, and non-commercial reproduction in any medium, provided you give
appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons
Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.