Modelling Human Perception to Leverage the Reuse of Concepts
across the Multi-sensory Design Space
Keith V. Nesbitt
School of Information Technology
Charles Sturt University
Panorama Av, Bathurst, NSW
knesbitt@csu.edu.au
Abstract
Information Visualisation is an emerging discipline that
concerns the design of interactive computer systems that
provide the user with a visual model of abstract data.
Information Visualisation implies a mapping from the
data attributes to the units of visual perception.
Information Sonification is an embryonic field that uses
sound rather than imagery to present abstract data.
Information Sonification implies a mapping from the data
attributes to the units of auditory perception. In both these
fields the need to describe appropriate mappings between
the data and the units of perception has led to models or
taxonomies that describe the available design space.
While these models of the visual design space and the
auditory design space may be appropriate for people
working in a single sensory domain, models based
purely on sensory attributes are very disjoint. For
designers who wish to consider a multi-sensory
solution to information display, these disjoint models of
the different sensory domains make it difficult to compare
and contrast the possible mapping choices.
This paper describes existing conceptual models of the
visual and auditory design space and then proposes a
different conceptual modelling of the multi-sensory
design space. This new model describes the units of
perception but is based not on sensory attributes but on
typical information metaphors. Throughout the paper, all
discussions are illustrated using the UML modelling
notation, a standard notation used to document
the design of software systems.
Keywords: Perception, Modelling, Multi-sensory
1 Introduction
The idea of mining large abstract data sets for useful
patterns is an attractive proposition, especially at a time
when most companies are accumulating ever-larger stores
of data.

Copyright © 2006, Australian Computer Society, Inc. This
paper appeared at the Third Asia-Pacific Conference on
Conceptual Modelling (APCCM2006), Hobart, Australia.
Conferences in Research and Practice in Information
Technology (CRPIT), Vol. 53. Markus Stumptner, Sven
Hartmann, and Yasushi Kiyoki, Eds. Reproduction for
academic, not-for-profit purposes permitted provided this
text is included.

The most traditional notion of data mining is an automated
process that involves running rule-finding algorithms
across the data to detect
patterns. However, there is also a growing interest in the
idea of developing tools that support human pattern
recognition within large data sets. Such human perceptual
tools present the data to the user’s senses (vision, hearing,
touch) in a way that allows the user to search for useful
patterns.
It might be expected that Human Perceptual Tools are
particularly useful where: unpredictable exceptions may
occur in the data; heuristics are required to filter subtle
variations; the target is unknown or cannot be precisely
formalised by rules; and the problem requires intuitive
knowledge that is hard to formalise, such as past
experience.
During the 1990s, the emphasis for Human Perceptual Tools
was on designing visual displays of data. This approach is
sometimes called visual data mining (Soukup 2002),
although the more general term is information
visualisation (Card, Mackinlay et al. 1999). A number of
example applications have been described and the field is
beginning to develop a more theoretical basis (Card,
Mackinlay et al. 1999).
By contrast, the use of other senses, such as hearing and
haptics (touch), to display abstract data is fairly
embryonic. The term information sonification is used to
describe auditory models of abstract data and, despite a
number of validated uses of sound for finding patterns in
abstract data (Kramer 1994), the field can probably best
be described as immature.
Haptic displays are still relatively uncommon although
some novel applications have been developed, for
example, the use of haptic displays for investigating
patterns within force fields (Brooks, Ouh-Young et al.
1990) and fluid flow models (Nesbitt, Gallimore et al.
2001).
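The common idea shared by the visual, auditory and haptic displays surveyed above is a mapping from data attributes onto the perceptual units that a sensory modality offers. A minimal sketch of this idea in Python, where the catalogue of units and all function names are hypothetical illustrations rather than any notation from the literature:

```python
# A hypothetical design-space catalogue: each sensory modality offers a
# set of perceptual units that a designer may map data attributes onto.
PERCEPTUAL_UNITS = {
    "visual": ["position", "colour", "shape", "size"],
    "auditory": ["pitch", "loudness", "timbre", "rhythm"],
    "haptic": ["force", "vibration", "texture"],
}

def design_mapping(data_attributes, modality, units=PERCEPTUAL_UNITS):
    """Pair each data attribute with a perceptual unit of one modality.

    Raises ValueError if the modality offers fewer units than there
    are attributes to display.
    """
    available = units[modality]
    if len(data_attributes) > len(available):
        raise ValueError("not enough perceptual units in this modality")
    return dict(zip(data_attributes, available))

# Example: display three data attributes in the visual modality.
mapping = design_mapping(["price", "volume", "volatility"], "visual")
# mapping == {"price": "position", "volume": "colour",
#             "volatility": "shape"}
```

The sketch makes the disjointness problem concrete: each modality has its own list of units, so a mapping designed for one modality cannot be compared directly with a mapping designed for another.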
Regardless of the particular sensory modality, the
designer of a human perceptual tool can describe the
design as a mapping between the data and some
characteristics of the model. A common approach is to
map the data to the display artefacts that can be perceived
by the user. Assuming there are no shortcomings in the
display itself, the possible range of artefacts can be
described as the fundamental elements or units of human
perception. For example, colour and shape are two of the
visual perceptual units available to a designer. If the