Research Article
Prediction of the Age and Gender Based on Human Face Images
Based on Deep Learning Algorithm
S. Haseena,1 S. Saroja,1 R. Madavan,2 Alagar Karthick,3 Bhaskar Pant,4 and Melkamu Kifetew5
1Department of Information Technology, Mepco Schlenk Engineering College, Sivakasi, 626005 Tamil Nadu, India
2Department of Electrical and Electronics Engineering, PSR Engineering College, Sivakasi, 626140 Tamil Nadu, India
3Renewable Energy Lab, Department of Electrical and Electronics Engineering, KPR Institute of Engineering and Technology, Coimbatore, 641407 Tamil Nadu, India
4Department of Computer Science and Engineering, Graphic Era Deemed to Be University, Bell Road, Clement Town, 248002 Dehradun, Uttarakhand, India
5Department of Environmental Engineering, College of Biological and Chemical Engineering, Addis Ababa Science and Technology University, Addis Ababa, Ethiopia
Correspondence should be addressed to Melkamu Kifetew; melkamu.kifetew@aastustudent.edu.et
Received 10 April 2022; Revised 14 June 2022; Accepted 19 June 2022; Published 24 August 2022
Academic Editor: Muhammad Fazal Ijaz
Copyright © 2022 S. Haseena et al. This is an open access article distributed under the Creative Commons Attribution License,
which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
In recent times, nutrition recommendation systems have gained increasing attention owing to the need for healthy living. Current studies in the food domain deal with recommendation systems that focus on independent users and their health problems but lack nutritional advice tailored to individual users. The proposed system suggests nutritional food to people based on the age and gender predicted from their face images. The designed methodology preprocesses the input image before performing feature extraction with a deep convolutional neural network (DCNN). This network extracts D-dimensional features from the source face image, after which a feature selection stage chooses the face's distinctive and identifiable traits using a hybrid particle swarm optimization (HPSO) technique. A support vector machine (SVM) then classifies a person's age and gender, and the nutrition recommendation relies on these age and gender classes. The proposed system is evaluated using classification rate, precision, and recall on the Adience dataset, the UTKface dataset, and real-world images, and it exhibits good performance in both prediction results and computation time.
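The feature-selection stage of the pipeline summarized above can be sketched as a plain binary particle swarm optimization over feature masks. This is an illustrative simplification, not the authors' hybrid PSO: the fitness here is a Fisher-style class-separation score standing in for the SVM classification rate, and all function names and parameter values are assumptions.

```python
import numpy as np

def fisher_score(X, y, mask):
    # Class-separation score of the selected feature subset:
    # between-class scatter over within-class scatter, summed
    # over the features where mask == 1.
    if mask.sum() == 0:
        return 0.0
    Xs = X[:, mask.astype(bool)]
    mu = Xs.mean(axis=0)
    num = den = 0.0
    for c in np.unique(y):
        Xc = Xs[y == c]
        num += len(Xc) * np.sum((Xc.mean(axis=0) - mu) ** 2)
        den += np.sum((Xc - Xc.mean(axis=0)) ** 2)
    return num / (den + 1e-9)

def pso_feature_select(X, y, n_particles=20, n_iters=50, seed=0):
    # Binary PSO: particles live in a continuous space; a sigmoid
    # threshold converts each position into a 0/1 feature mask.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    pos = rng.uniform(-1, 1, (n_particles, d))
    vel = np.zeros((n_particles, d))

    def to_mask(p):
        return (1.0 / (1.0 + np.exp(-p)) > 0.5).astype(int)

    pbest = pos.copy()
    pbest_fit = np.array([fisher_score(X, y, to_mask(p)) for p in pos])
    gbest = pbest[pbest_fit.argmax()].copy()
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, d))
        # Standard velocity update: inertia + cognitive + social terms.
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos += vel
        fit = np.array([fisher_score(X, y, to_mask(p)) for p in pos])
        improved = fit > pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
        gbest = pbest[pbest_fit.argmax()].copy()
    return to_mask(gbest)
```

In the paper's setting the fitness would instead be the classification rate of the downstream SVM on the selected DCNN features; the separation score above merely keeps the sketch self-contained.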
1. Introduction
In recent years, many real-life applications such as social media, security control, advertising, and entertainment have made use of the information contained in a human face. Automatic age [1] and gender [2] prediction from facial images plays a vital role in interpersonal communication and remains a significant area for computer vision researchers [3]. Face age and gender recognition is a very important aspect of face analysis that has piqued the interest of researchers in areas such as demographic information collection, surveillance, human-computer interaction, marketing intelligence, and security. Recently, nutrition recommendation has gained attention among both healthy and unhealthy people. This paper focuses on recommending nutritional advice to people based on their age and gender.
Different methodologies are available to identify gender based on human biometric traits, mannerisms, and behaviours. A face provides distinguishing information about a person, including age, gender, expression, mood, ethnicity, etc. Gender identification from a person's face image is a difficult problem in the computer vision, image analysis, and artificial intelligence communities, recognising gender based on masculinity and femininity. It is a binary classification problem that assigns a gender class to an individual. Gender identification is one part of facial analysis [4, 5].
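The binary classification view described above can be illustrated with a minimal linear SVM trained by the Pegasos sub-gradient method on two-class data. This is a generic sketch under the assumption of linearly separable features, not the paper's SVM configuration; the function names and hyperparameters are illustrative.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=100, seed=0):
    # Minimal linear SVM trained with the Pegasos sub-gradient method.
    # y must be in {-1, +1}; returns a weight vector w with the bias
    # folded in as the last component.
    rng = np.random.default_rng(seed)
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append bias column
    w = np.zeros(Xb.shape[1])
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(Xb)):
            t += 1
            eta = 1.0 / (lam * t)              # decaying step size
            if y[i] * Xb[i] @ w < 1:           # margin violated: hinge step
                w = (1 - eta * lam) * w + eta * y[i] * Xb[i]
            else:                               # margin satisfied: shrink only
                w = (1 - eta * lam) * w
    return w

def predict(w, X):
    # Sign of the decision function gives the gender class (+1 / -1).
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return np.sign(Xb @ w)
```

In practice the inputs would be the DCNN features selected by the HPSO stage rather than raw pixels, and a kernel SVM could replace the linear one.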
Hindawi
Computational and Mathematical Methods in Medicine
Volume 2022, Article ID 1413597, 16 pages
https://doi.org/10.1155/2022/1413597