Quantifying Coordination in Human Dyads via a Measure of
Verticality
Roshni Kaushik
Mechanical Science and Engineering
Champaign-Urbana, Illinois
rkaushi2@illinois.edu
Ilya Vidrin
Movement Lab
Cambridge, Massachusetts
ilya_vidrin@mail.harvard.edu
Amy LaViers
Mechanical Science and Engineering
Champaign-Urbana, Illinois
alaviers@illinois.edu
ABSTRACT
Working towards the goal of understanding complex, interactive
movement in human dyads, this paper presents a model for analyz-
ing motion capture data of human pairs and proposes measures that
correlate with features of the coordination in the movement. Based
on deep inquiry of what it means to partner in a motion task, a
measure that characterizes the changing verticality of each agent is
developed. In parallel, a naïve human motion expert provides a qualitative description of the features and quality of coordination within a dyad. Analysis of the verticality measure, the cross-correlation of verticality signals, and the deviation of those signals from their trend over time provides quantitative insight that corroborates the naïve expert's analysis. Specifically, the paper shows that, for
four samples of dyadic behavior, these measures provide informa-
tion about 1) whether two agents were involved in the same dyadic
interaction and 2) the level of "resistance" found in these interac-
tions. Future work will test this model over a larger dataset and
develop human-robot coordination schemes based on this model.
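The paper's precise verticality measure is defined later; as a rough illustration of the kind of analysis the abstract describes, the sketch below assumes verticality is a normalized pelvis-height signal and computes two of the quantities mentioned above: the peak normalized cross-correlation between two verticality signals and the RMS deviation of a signal from its moving-average trend. The verticality proxy and all function names here are assumptions for illustration, not the paper's definitions.

```python
import numpy as np

def verticality(pelvis_z, height):
    """Hypothetical verticality proxy: pelvis height normalized by
    standing height (the paper's actual measure is defined later)."""
    return np.asarray(pelvis_z, dtype=float) / height

def max_cross_correlation(a, b):
    """Peak of the normalized cross-correlation between two
    zero-meaned verticality signals (1.0 = perfectly aligned)."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0:
        return 0.0
    return float(np.max(np.correlate(a, b, mode="full")) / denom)

def trend_deviation(v, window=30):
    """RMS deviation of a verticality signal from its moving-average
    trend (window in frames; 30 frames ~ 1 s at 30 fps)."""
    kernel = np.ones(window) / window
    trend = np.convolve(v, kernel, mode="same")
    return float(np.sqrt(np.mean((v - trend) ** 2)))
```

Under this reading, two movers in the same interaction would be expected to show a high cross-correlation peak, while the trend deviation would grow with the "resistance" in the interaction.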
CCS CONCEPTS
· Human-centered computing → Empirical studies in interaction design; · Applied computing → Performing arts;
KEYWORDS
motion-capture, robotics, partner, interaction, coordination, dyad
ACM Reference Format:
Roshni Kaushik, Ilya Vidrin, and Amy LaViers. 2018. Quantifying Coordina-
tion in Human Dyads via a Measure of Verticality. In MOCO: 5th International
Conference on Movement and Computing, June 28–30, 2018, Genoa, Italy, Jen-
nifer B. Sartor, Theo D’Hondt, and Wolfgang De Meuter (Eds.). ACM, New
York, NY, USA, Article 4, 8 pages. https://doi.org/10.1145/3212721.3212805
1 INTRODUCTION
Human movement is a complex physical phenomenon, full of the
richness of contexts, interactions, and variations. In particular, the
intricacies of interactive movement raise many research questions,
including the manner of nonverbal communication between a pair
or dyad performing a task together. In something as simple as
moving a table across a room, two individuals communicate through
the movement of their bodies in addition to the forces applied on the
table and the floor. In partner dance, this communication channel is
even more nuanced. When dissecting interactive human movement,
we seek to identify properties describing and characterizing these
interactions.
A number of studies have been conducted on dyadic interactions.
For example, the making and breaking of head symmetry (mirror symmetry) during conversations was shown to be a meaningful element of communication when modeled with a
dynamical system [5, 7, 8]. Clinicians found that understanding
micro-movements using kinematic recordings could allow them
to classify dyadic interactions of people with social difficulties
more quantitatively [29]. Additionally, movement as an important
design aspect in human-computer interaction prompted a course
on embodied interaction, formalizing the applications for many
movement aspects [11].
Efforts to categorize the large variety of human movement can draw analogies from studies that parameterize other large datasets.
The search for a parameterization of images using thermodynamic
principles such as energy and entropy drew many parallels between
the physical intuition of thermodynamics and properties of the image, revealing measures that reflected natural versus urban images
[26]. An interactive online dance work allowed researchers to better
understand the interactions between the audience and the work
and to develop kinesthetic empathy as a parameter in movement
representations [10].
Machine learning and neural networks can be used to abstract
away the complexities of interaction by training models with ex-
amples. Gaussian Mixture Models (GMM) of Interaction Primitives
model nonlinear correlations between different movers [12, 18].
Task-parameterized dynamical systems combined with learning
allowed a robot to learn a collaborative task after observing a
pair of humans performing the same task [24]. A GMM trained
with examples of two humans interacting recognized new actions
and generated responses of a virtual character [13]. Learning from
demonstrations, a virtual dancer developed an internal model of
a human dancer’s movements using Artificial Neural Networks
(ANN) and Hidden Markov Models (HMM) and reacted to some
movements from a human dancer [19].
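The learning approaches surveyed above share a common pattern: stack both movers' features into one vector, fit a joint density over the pair, and condition one partner's motion on the other's. As a generic illustration of that idea (not the cited works' implementations), the sketch below fits a single joint Gaussian to synthetic paired trajectories, a one-component simplification of the mixture models above, and predicts mover B's feature from mover A's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic paired data: mover B roughly mirrors mover A with noise.
t = np.linspace(0, 2 * np.pi, 300)
mover_a = np.sin(t) + 0.05 * rng.standard_normal(t.size)
mover_b = -mover_a + 0.05 * rng.standard_normal(t.size)

# Fit a joint Gaussian over stacked per-frame features; the off-diagonal
# covariance term captures the correlation between the two movers.
X = np.column_stack([mover_a, mover_b])
mu = X.mean(axis=0)
cov = np.cov(X, rowvar=False)

def predict_b(a):
    """Conditional mean of mover B's feature given mover A's:
    E[b | a] = mu_b + (cov_ba / cov_aa) * (a - mu_a)."""
    return mu[1] + cov[1, 0] / cov[0, 0] * (a - mu[0])
```

A mixture of such Gaussians, as in the cited Interaction Primitive work, extends this conditioning to nonlinear correlations by blending several local linear models.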
Haptic feedback, a way to measure the forces a user exerts on
an interface, is another tool used to understand interactive mo-
tion. A dancing robot adjusted the length of its stride based on
haptic feedback from the physical connection between robot and
human [27], and male and female partner dancing behavior was
synthesized based on haptic interactions and stride length [14]. A