Using a User Experience Evaluation Framework for
eModeration
Cornè J van Staden
School of Computing
UNISA
Florida, South Africa
vstadcj1@unisa.ac.za
Prof Judy A van Biljon
School of Computing
UNISA
Florida, South Africa
vbiljja@unisa.ac.za
Prof Jan H Kroeze
School of Computing
UNISA
Florida, South Africa
kroezjh@unisa.ac.za
Abstract— The use of eModeration (of examination scripts) can
improve the efficiency of examination moderation, while
simultaneously lowering the risk of misplacing scripts or delaying the
moderation process. Despite the potential benefits of using an
eModerate system in terms of optimising examination procedures, the
implementation and application of such online moderation systems in
the South African context is limited. Various factors could
contribute to resistance against the implementation and
adoption of eModerate systems in higher education institutions.
These factors include human factors as well as technical and
organisational resistance to change. This study focuses on the human
factors involved in eModeration (user experience) and attempts to
answer the following research question: How can the User
Experience Evaluation Framework for eModeration be utilised within
the context of higher education institutions in South Africa? The
research used a Design Science Research methodology, which
included the design, development and testing of the User
Experience Evaluation Framework for eModeration. This paper
reports on issues identified with the User Experience
Evaluation Framework for eModeration during the evaluation phase.
The research was conducted at Midrand Graduate Institute (MGI),
now trading as Pearson Institute of Higher Education, a private
higher education institute in South Africa. The data generation
methods included interviews with eModerators from different
faculties within a private higher education institution. This paper
makes a theoretical contribution to this area of study by identifying
the problems that users might have with the implementation of the
User Experience Evaluation Framework for eModeration as well as
providing some insights into the user experience of eModerators.
Keywords — eModeration, eModerators, user experience,
functionality, effectiveness, efficiency
I. INTRODUCTION
Manual paper-based moderation is still widely used at
academic institutions in South Africa, but the manual process
poses challenges in terms of cost and time [4, 5]. An additional
challenge can be found in the intensive management process
associated with manual moderation. The challenges that
educators at academic institutions experience with regard to
paper-based moderation call for an investigation into
moderation processes. “Midrand Graduate Institute (MGI), a
private higher education institution (PHEI) in South Africa,
reviewed their examination and moderation practices and
realised” [37:1] that there was a need to revise “assessment
practices and structures” [37:1]. MGI decided after the
investigation to replace the paper-based moderation process
with an electronic process [5].
The role of information and communication technology has
changed over the years. The researcher recognised an emerging
research trend in the user experience of electronic
moderation, which led to an investigation of some of the
challenges that educators might experience in finding,
evaluating and using electronic moderation systems. Crucially,
the researcher found that no user experience evaluation
framework for eModeration existed at the time of this research.
This work addresses that gap and
contributes to the body of knowledge by proposing a User
Experience Evaluation Framework for
eModeration that educators can use to assess how
appropriate an eModerate system is for their needs [4].
Conceivable limitations to electronic moderation include
connectivity, access to the internet and bandwidth.
Within “the context of this research, electronic moderation,
also referred to as eModeration, involved a process of
moderating examination scripts online” [37:1]. The following
definition of eModeration was used in this study: “eModeration
can be defined as the electronic moderation [quality
assurance/critical reading] of summative examination scripts
by external moderators in a virtual learning environment called
eModerate” [22:3]. Several types of users were
involved in the electronic moderation process, such as
examiners, moderators and the deans of the various faculties.
The deans of the faculties managed the moderation process
[37]. The user experience that was investigated was that of the
eModerators. This paper reflects on the issues identified by
2017 Conference on Information Communications Technology and Society
978-1-4673-8996-9/17/$31.00 ©2017 IEEE