Supporting the Evolution of a Software Visualization Tool Through Usability Studies

Andrian Marcus, Denise Comorski, Andrey Sergeyev
Department of Computer Science
Wayne State University
Detroit, MI 44242
{amarcus, dcomors, andrey}@cs.wayne.edu

Abstract

The paper presents a usability study conducted with graduate and undergraduate computer science students, designed to evaluate the effectiveness of a software visualization tool named sv3D and to provide the user data necessary for the evolution of the system. Sv3D is a software visualization tool that supports the comprehension of large software systems, capable of displaying source code and associated metrics in three dimensions. The participants in the study answered two types of questions: the first set provided objective measurements to test the formulated hypotheses with respect to the accuracy and speed with which users answered questions using sv3D; the second set provided subjective measurements that were used to guide the evolution of sv3D. We formulated two null hypotheses, with respect to accuracy and time respectively. The collected data supported one hypothesis and led to the rejection of the other.

1. Introduction

Software visualization is a maturing area of research. In addition to taxonomies [18, 20, 23, 24], a variety of techniques and tools have been developed, addressing tasks that range from algorithm animation in support of learning to the visualization of software structure, data, and metrics in support of comprehension. All of these tools and technologies share a common promise: that they help the user better understand aspects of the software and, in turn, better perform specific software engineering tasks.

Software visualization is situated at the intersection of information visualization, software engineering, human-computer interaction, graphics, and cognitive psychology. Researchers in this area therefore borrow a number of methodologies from these fields, in particular those concerned with the evaluation of software visualization and comprehension techniques and tools.

There are several ways in which researchers and practitioners choose to evaluate their tools. The ultimate proof of a tool's quality is, of course, its wide adoption and use in the research community or in industry. Achieving this requires a mature tool. While the software visualization field itself is maturing, many of the existing tools are still prototypes in various stages of development. Some of them are used chiefly for proof-of-concept purposes, while others have only a restricted user base outside their originating research group.

There is no single evaluation procedure that fits all software visualization or comprehension systems. The question of which type of evaluation study is best suited to a particular type of application has been studied by information visualization and cognitive psychology researchers [16]. The problem is still far from being completely solved. The consensus in the field is that the type of study one needs to perform (e.g., usability, user, or case study) depends on the technology and the comprehension task at hand.
With all this in mind, researchers in software visualization seem to prefer the following types of evaluation:

a) Interviews with users of software visualization systems;
b) Case studies performed by a very small group of users (usually the builders of the tool) on one or more subject software systems; and
c) Usability studies involving several human subjects who are required to solve particular tasks using one or more software visualization tools. The subjects' levels of expertise range from students to seasoned software professionals.

The last two categories (b and c) can be further divided into studies aimed at the evaluation of a single tool and comparative studies of multiple software visualization systems.

An inherent problem faced by researchers in the field is access to appropriate subjects for evaluation experiments. Students make up most of the participants in such experiments. This is well suited for tools that are