Towards an Effective Arousal Detection System for Virtual Reality
Ifigeneia Mavridou
Centre of Digital Entertainment
Bournemouth University
Bournemouth, UK
imavridou@bournemouth.ac.uk
Ellen Seiss
Department of Psychology
Bournemouth University
Bournemouth, UK
eseiss@bournemouth.ac.uk
Theodoros Kostoulas
Dept. of Computing and Informatics
Bournemouth University
Bournemouth, UK
tkostoulas@bournemouth.ac.uk
Charles Nduka
Emteq Ltd.
Science Park Square,
Brighton, UK
charles@emteq.net
Emili Balaguer-Ballester
Dept. of Computing and Informatics
Bournemouth University
Bournemouth, UK
eb-ballester@bournemouth.ac.uk
ABSTRACT
Immersive technologies offer the potential to drive engage-
ment and create exciting experiences. A better understanding
of the emotional state of the user within immersive experi-
ences can assist in healthcare interventions and the evaluation
of entertainment technologies. This work describes a feasibil-
ity study to explore the effect of affective video content on
heart-rate recordings for Virtual Reality applications. A low-
cost reflected-mode photoplethysmographic sensor and an
electrocardiographic chest-belt sensor were attached to a
novel non-invasive wearable interface specially designed for
this study. Eleven participants' responses were analysed, and
heart-rate metrics were used for arousal classification. The
reported results demonstrate that fusing the physiological
signals yields a significant performance improvement, and
hence the feasibility of our new approach.
CCS CONCEPTS
• Human-centered computing → Interaction paradigms:
Virtual reality; • Information systems → Sentiment analysis;
• Human-centered computing → Interactive systems and tools
KEYWORDS
Virtual Reality; Arousal; Classification; PPG; ECG; C-SVM
ACM Reference format:
Ifigeneia Mavridou, Ellen Seiss, Theodoros Kostoulas, Charles Nduka,
Emili Balaguer-Ballester. 2018. Towards an Effective Arousal Detec-
tion System for Virtual Reality. In Proc. of ACM Human-Habitat for
Health (H3'18). ACM, Boulder, CO, USA, October 2018, 6 pages.
DOI: 10.1145/3279963.3279969
Permission to make digital or hard copies of part or all of this work for personal
or classroom use is granted without fee provided that copies are not made or
distributed for profit or commercial advantage and that copies bear this notice
and the full citation on the first page. Copyrights for components of this work
owned by others than ACM must be honored. Abstracting with credit is permit-
ted. To copy otherwise, or republish, to post on servers or to redistribute to
lists, requires prior specific permission and/or a fee. Request permissions from
Permissions@acm.org.
Human-Habitat for Health (H3'18), October 16, 2018, Boulder, CO, USA
© 2018 Association for Computing Machinery.
ACM ISBN 978-1-4503-6075-3/18/10…$15.00
https://doi.org/10.1145/3279963.3279969
1 INTRODUCTION
Rapidly evolving Virtual Reality (VR) technologies permit
experimental protocols to be adapted for use in VR. Crucially,
experiment design utilising VR can offer
controlled laboratory conditions while granting a wealth of
content resources and ecological validity [1]. User-input and
sensory interface modalities are increasingly integrated with
VR systems to monitor the user’s actions. These systems use various
haptic and wearable user-interfaces to track head and body
movements, eye gaze and speech patterns [2]. Such metrics
can describe useful information related to the user’s behav-
iour, preferences and actions within VR. As such, they can im-
prove automatic emotion recognition, which is important to
enhance VR user interactions. Previous research on affective
computing offers a wealth of emotion detection solutions
ranging from physiological and speech signals, to monitoring
facial expressions, and movement analysis [3]. Understanding
the user's emotions and behaviour within VR experiences
could not only assist experience designers in evaluating their
content [4, 5] but also support healthcare interventions such as VR
exposure therapy [6].
There are two basic challenges for emotion recognition in
VR. Firstly, the Head Mounted Displays (HMDs) commonly
used during VR experiences cover a significant part of the face
which renders the detection of facial expressions difficult.
Secondly, commercial immersive experiences often require
intense head and limb movements, which can introduce noise
artefacts in wearable sensor signals. To overcome the first
challenge, our team developed a novel prototype for facial ex-
pression recognition, Faceteq™ [7] with surface physiological
sensors. This interface can be incorporated into a commercial
HMD, acting as a non-invasive, soft medium between the user’s
skin and the HMD.
In this work, we propose a system for the detection of high
and low arousal in VR settings via capturing multimodal
heart-rate responses (from low cost, custom-made photople-
thysmographic (PPG) and electrocardiographic (ECG) sen-
sors) and continuous self-ratings of HMD users.
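The fusion-and-classification step can be illustrated in outline. The following is a minimal sketch only, not the paper's implementation: the synthetic inter-beat intervals, the specific heart-rate metrics (mean HR, SDNN, RMSSD), and all names are illustrative assumptions; the C-SVM (here scikit-learn's `SVC`) matches the keyword list, and fusion is realised as simple feature-level concatenation of the per-sensor feature vectors.

```python
# Hedged sketch: feature-level fusion of PPG- and ECG-derived
# heart-rate metrics, classified with a C-SVM (binary arousal).
# All data below are synthetic; this is not the authors' pipeline.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def hr_features(ibi_ms):
    """Common time-domain heart-rate metrics from inter-beat
    intervals in milliseconds: mean HR, SDNN and RMSSD."""
    hr = 60000.0 / ibi_ms                      # beats per minute
    sdnn = ibi_ms.std()                        # overall variability
    rmssd = np.sqrt(np.mean(np.diff(ibi_ms) ** 2))
    return np.array([hr.mean(), sdnn, rmssd])

def simulate_ibi(label, n_beats=60):
    """Toy generator: high arousal (label 1) -> shorter intervals."""
    base = 700 if label == 1 else 900          # ms between beats
    return base + rng.normal(0, 25, n_beats)

X, y = [], []
for label in (0, 1):                           # low / high arousal
    for _ in range(30):
        ppg_ibi = simulate_ibi(label)          # from the PPG sensor
        ecg_ibi = simulate_ibi(label)          # from the ECG chest belt
        # Fusion: concatenate the per-modality feature vectors.
        X.append(np.concatenate([hr_features(ppg_ibi),
                                 hr_features(ecg_ibi)]))
        y.append(label)
X, y = np.array(X), np.array(y)

# C-SVM with standardised features (C and kernel are illustrative).
clf = make_pipeline(StandardScaler(), SVC(C=1.0, kernel="rbf"))
clf.fit(X, y)
print(clf.score(X, y))
```

In practice the intervals would come from beat detection on the recorded PPG and ECG waveforms, the labels from the participants' continuous self-ratings, and performance would be estimated with held-out data rather than the training score shown here.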