ISMIR 2008 – Session 2c – Knowledge Representation, Tags, Metadata

MOODSWINGS: A COLLABORATIVE GAME FOR MUSIC MOOD LABEL COLLECTION

Youngmoo E. Kim, Erik Schmidt, Lloyd Emelle
Electrical & Computer Engineering, Drexel University
{ykim,eschmidt,lte22}@drexel.edu

ABSTRACT

There are many problems in the field of music information retrieval that are not only difficult for machines to solve, but that do not have well-defined answers. In labeling and detecting emotions within music, this lack of specificity makes it difficult to train systems that rely on quantified labels for supervised machine learning. The collection of such “ground truth” data for these subjectively perceived features necessarily requires human subjects. Traditional methods of data collection, such as the hiring of subjects, can be flawed, since labeling tasks are time-consuming, tedious, and expensive. Recently, there have been many initiatives to use customized online games to harness so-called “Human Computation” for the collection of label data, and several such games have been proposed to collect labels spanning an excerpt of music. We present a new game, MoodSwings (http://schubert.ece.drexel.edu/moodswings), which differs in that it records dynamic (per-second) labels of players’ mood ratings of music, in keeping with the unique time-varying quality of musical mood. As in prior collaborative game approaches, players are partnered to verify each other’s results, and the game is designed to maximize consensus-building between users. We present preliminary results from an initial set of game play data.

1 INTRODUCTION

The detection and labeling of the emotional content (mood) of music is one of many music information retrieval problems without a clear “ground truth” answer.
The lack of easily obtained ground truth for these kinds of problems further complicates the development of automated solutions, since classification methods often employ a supervised learning approach relying on such ground truth labels. The collection of this data on subjectively perceived features, such as musical mood, necessarily requires human subjects. But traditional methods of data collection, such as the hiring of subjects, have their share of difficulties, since labeling tasks can be time-consuming, tedious, error-prone, and expensive.

Recently, significant attention has been placed on the use of collaborative online games to collect such ground truth labels for difficult problems, harnessing so-called “Human Computation”. For example, von Ahn et al. have created several such games for image labeling: the ESP Game, Peekaboom [1], and Phetch. More recently, several such games have been proposed for the collection of music data, such as MajorMiner [2], Listen Game [3], and TagATune [4]. These implementations have primarily focused on the collection of descriptive labels for a relatively short audio clip.

We present a new game, MoodSwings, designed to explore the unique time-varying nature of musical mood. Of course, one of the joys of music is that the mood of a piece may change over time, gradually or suddenly. According to Huron [5], this combination of anticipation and surprise may be at the core of our enjoyment of music. Thus, our game is targeted at collecting dynamic (per-second) labels of users’ mood ratings, which are collected in real time as a player hears the music, using a two-dimensional grid of emotional components: valence and arousal. As in other collaborative games, players are partnered in order to verify each other’s results, providing a strong incentive for producing high-quality labels that others can agree upon. Accordingly, game scoring is designed to maximize consensus-building between partners.
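To make the labeling and scoring setup concrete, the following is a minimal sketch of per-second valence-arousal labels and a distance-based partner-agreement score. It is only an illustration of the general idea (partners earn more when their simultaneous labels lie close together on the grid); the actual MoodSwings scoring rules are not specified here, and the function names, the [-1, 1] axis normalization, and the linear distance-to-score mapping are all assumptions.

```python
import math

# Assumed convention: valence and arousal each normalized to [-1, 1],
# so the maximum possible distance on the grid is the diagonal, 2*sqrt(2).
MAX_DIST = 2 * math.sqrt(2)

def agreement_score(p1, p2):
    """Score one second of play from two partners' (valence, arousal) labels.

    Returns a value in [0, 1]: 1.0 for identical labels, 0.0 for labels
    at opposite corners of the grid.
    """
    return 1.0 - math.dist(p1, p2) / MAX_DIST

def game_score(labels_a, labels_b):
    """Sum per-second agreement over two partners' label sequences."""
    return sum(agreement_score(a, b) for a, b in zip(labels_a, labels_b))

# Two hypothetical players labeling the same 3-second excerpt:
player_a = [(0.5, 0.5), (0.6, 0.4), (-0.2, 0.8)]
player_b = [(0.5, 0.5), (0.5, 0.5), (0.7, -0.6)]
total = game_score(player_a, player_b)
```

Scoring agreement continuously (rather than requiring exact matches, as in tag-based games like the ESP Game) suits the continuous valence-arousal space, since two players are unlikely to ever click the exact same grid point.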
In this paper, we present data from an initial pilot phase of the game and demonstrate the utility of this approach for the collection of high-quality, dynamic labels of musical affect.

2 BACKGROUND

Models of affect and the categorization and labeling of specific emotions have received significant attention from a variety of research areas, including psychology, physiology, neuroscience, and musicology. With the advent of digital music and very large music collections, recent work has focused on the problem of automatic music mood detection. Next, we briefly summarize some of the related work.

2.1 Mood models

Early work on the quantification of musical affect focused on the formation of ontologies using clusters of common