Citation: Bouzid, A.; Sierra-Sosa, D.; Elmaghraby, A. A Robust Pedestrian Re-Identification and Out-Of-Distribution Detection Framework. Drones 2023, 7, 352. https://doi.org/10.3390/drones7060352

Academic Editor: Anastasios Dimou
Received: 18 April 2023; Revised: 15 May 2023; Accepted: 25 May 2023; Published: 27 May 2023

Copyright: © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

Article
A Robust Pedestrian Re-Identification and Out-Of-Distribution Detection Framework

Abdelhamid Bouzid 1,*, Daniel Sierra-Sosa 2 and Adel Elmaghraby 1

1 Department of Computer Science and Engineering, University of Louisville, Louisville, KY 40208, USA; adel@louisville.edu
2 Department of Computer Science and Information Technology, Hood College, Frederick, MD 21701, USA; sierra-sosa@hood.edu
* Correspondence: abdelhamid.bouzid@louisville.edu
Abstract: Pedestrian re-identification is an important field due to its applications in security and safety. Most current solutions for this problem use CNN-based feature extraction and assume that only the identities present in the training data can be recognized. The pedestrians in the training data are called In-Distribution (ID); in real-world scenarios, however, new pedestrians and objects can appear in the scene, and the model should detect them as Out-Of-Distribution (OOD). In our previous study, we proposed a pedestrian re-identification approach based on the von Mises–Fisher (vMF) distribution, in which each identity is embedded on the unit sphere as a compact vMF distribution far from the distributions of the other identities. Recently, a framework called Virtual Outlier Synthetic (VOS) was proposed, which detects OOD samples by synthesizing virtual outliers in the embedding space in an online manner. Their approach assumes that samples from the same object map to a compact region of the embedding space, which aligns with the vMF-based approach. Therefore, in this paper, we revisited the vMF approach and merged it with VOS to detect OOD data points. Experimental results showed that our framework was able to detect, in the inference phase, new pedestrians that do not exist in the training data. Furthermore, this framework improved the re-identification performance and holds significant potential for real-world scenarios.
Keywords: pedestrian detection; tracking; re-identification; Virtual Outlier Synthetic (VOS); In-Distribution; Out-Of-Distribution; Unmanned Aerial Vehicles; drones; surveillance; von Mises–Fisher Distributions (vMF)
1. Introduction
Pedestrian tracking and re-identification systems based on machine learning have emerged as a significant solution for various safety and security applications [1,2]. These systems utilize a mapping function that is trained to embed images into a compact Euclidean space, such as a unit sphere [3,4]. The primary objective of this embedding is to ensure that images depicting the same person are mapped to nearby feature points, while images depicting different people are mapped to distant feature points. However, in real-world scenarios, there may be situational changes such as differences in pedestrian position, orientation, and occlusion within a single scene, which can adversely affect the effectiveness of the embedding approach. To overcome these challenges, it is essential to develop robust pedestrian re-identification systems that can handle such variations. Additionally, the embedding approach should not rely on clothing appearance, given that individuals may wear different clothing over time, spanning days or weeks.
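The matching principle described above can be illustrated with a minimal sketch: features are projected onto the unit sphere, and a query is assigned to the gallery identity with the highest cosine similarity, or rejected if no similarity clears a threshold. The `match_identity` function, the toy 3-D vectors, and the threshold value are illustrative assumptions, not part of the paper's actual pipeline, which uses CNN-extracted features.

```python
import numpy as np

def normalize(x):
    # Project feature vectors onto the unit sphere (L2 normalization).
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

def match_identity(query, gallery, threshold=0.7):
    """Return the index of the closest gallery identity by cosine
    similarity, or -1 if no gallery entry exceeds the threshold."""
    q = normalize(query)
    g = normalize(gallery)
    sims = g @ q                      # cosine similarity on the unit sphere
    best = int(np.argmax(sims))
    return best if sims[best] >= threshold else -1

# Toy 3-D embeddings standing in for CNN features of two known identities.
gallery = np.array([[1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0]])
query = np.array([0.9, 0.1, 0.0])     # close to identity 0
print(match_identity(query, gallery))  # -> 0
```

A query far from every gallery distribution falls below the threshold and is rejected, which is the behavior an OOD-aware re-identification system needs at inference time.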
Pedestrian re-identification is a challenging task in computer vision, where the goal is to recognize a pedestrian across multiple camera views. It has been a topic of intense research over the last decade due to its importance in various applications, such as surveillance and forensics [5–7]. Traditional approaches rely on hand-crafted features and metrics to match individuals across cameras [8,9]. However, with the recent advancements in