PRiMMA: Privacy Rights Management for Mobile Applications
Funded by the EPSRC (EP/F024037/1)

B. Nuseibeh, A. Bandara, B. Price, A. Joinson, and Y. Rogers
The Open University
[b.nuseibeh, a.k.bandara, b.a.price, a.n.joinson, y.rogers]@open.ac.uk

M. Sloman, E. Lupu, N. Dulay, A. Russo
Imperial College London
[m.sloman, e.c.lupu, n.dulay, a.russo]@imperial.ac.uk

Introduction

The age of Ubiquitous Computing is fast approaching: most people in the UK over the age of 8 carry mobile phones [5], which are becoming increasingly sophisticated interactive computing devices. Location-based services are also growing in popularity and sophistication [6], and many tracking and monitoring devices are being developed with a range of potential applications, from supporting mobile learning to remote health monitoring of the elderly and chronically ill. But do users actually understand how much of their personal information is being shared with others? A recently released report from the UK Information Commissioner [7] warns that the UK in particular is ‘sleepwalking into a surveillance society’, as ordinary members of the public give up vast amounts of personal information for no significant personal or societal advantage. In general, there is a trade-off between the usefulness of disclosing private information and the risk of its misuse. This project will investigate techniques for protecting the private information typically generated by ubiquitous computing applications from malicious or accidental misuse.

Consider the following two mobile learning scenarios. Alice, Bob and Charles are students whose mobile computing devices run Digital Study Assistant (DSA) applications that support their study programme. The application’s functionality includes instant messaging, study group formation and co-ordination, access to tutor support, tutor monitoring, and information sharing.
In order to provide this functionality, the DSA requires access to the student’s personal information (e.g., name, address, email, contact number) and their university academic record. Student privacy is ensured by the Privacy Rights Manager (PRM) component of the DSA, which can be configured with policies that control the disclosure of personal information based on the student’s context.

Scenario 1: Bob and Charles have already agreed to form a study group. They are happy for others to join and therefore configure their DSAs to advertise the group whenever they are together with their status set to ‘studying’. To protect their privacy, they define a policy for their PRMs so that the advertisement discloses only their group size, home city and email addresses. Alice is looking to join a small study group and happens to attend a course seminar at the same time as Bob and Charles. Her DSA receives the advertisement from Bob and Charles’ DSAs for the course she is following. Alice is keen to join a group whose members are of a similar ability to her own, and would therefore like to know the group’s average course scores before deciding to join. When her DSA requests this information, Bob and Charles’ PRMs prevent it from being disclosed and raise an alert. Bob and Charles discuss the request, decide they are happy to disclose the information on this occasion, and respond to Alice’s DSA with their average course scores. On receiving this information, Alice decides to join Bob and Charles’ study group.

Scenario 2: Tom is a tutor who would like to find out whether his tutee Charles is participating in a study group and, if so, who else is in the group. Charles has configured his DSA to indicate that his status is ‘studying’ whenever he is in the library and with Bob and/or Alice.
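For illustration, the selective-disclosure behaviour in Scenario 1 might be sketched as follows. This is a minimal Python sketch under our own assumptions: the policy representation, attribute names, and alert mechanism are invented for the example and are not part of the PRM framework’s actual design.

```python
# Hypothetical sketch of a PRM disclosure check (Scenario 1).
# All names here are illustrative assumptions, not the project's design.
from dataclasses import dataclass, field

@dataclass
class DisclosurePolicy:
    """Attributes a user permits the PRM to disclose in a given context."""
    context: str                        # e.g. 'studying'
    allowed: set = field(default_factory=set)

def handle_request(policy, context, requested):
    """Split a request into disclosable and withheld attributes.

    Withheld attributes would trigger an alert to the user, as when
    Alice's DSA asks for average course scores.
    """
    if context != policy.context:
        return set(), set(requested)    # wrong context: disclose nothing
    granted = set(requested) & policy.allowed
    withheld = set(requested) - policy.allowed
    return granted, withheld

# Bob's policy while 'studying': advertise only group size, city and email.
bob = DisclosurePolicy('studying', {'group_size', 'home_city', 'email'})

granted, withheld = handle_request(
    bob, 'studying', {'email', 'avg_course_score'})
# 'email' is granted; 'avg_course_score' is withheld and raises an alert.
```

The point of the sketch is only that disclosure is computed from policy plus context, so the same request can succeed or fail as the user’s situation changes.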
Whilst Charles’ privacy policy allows disclosure of his status and location to his tutor, Alice and Bob’s privacy policies prevent disclosure of their locations to anyone. Charles’ DSA is therefore unable to respond to Tom’s request to know whom Charles is studying with, and raises an alert to Alice and Bob’s DSAs requesting permission to disclose their presence with Charles. On this occasion Alice and Bob allow the information to be disclosed, and Tom is informed that Charles is studying with Alice and Bob. When this event recurs a number of times, Alice and Bob’s DSAs suggest a new privacy policy that allows Tom to know when they are with Charles.

These scenarios illustrate the need for explicit privacy rights in mobile computing interactions, and the importance of being able to detect and resolve inconsistencies between user privacy policies and the information required to provide particular functionality. The second scenario also shows the need to analyse a collection of user privacy policies before deciding to disclose private information. Additionally, it highlights the need for automated learning of privacy policies, in order to minimise the overhead of user intervention whenever a policy conflicts with another policy or with an information request.

This project will investigate privacy requirements across the general population for a specific set of ubiquitous computing technologies, and produce a reusable framework with demonstrator applications, based on the above scenarios, evaluated with participants across a wide demographic. It will focus on investigating and addressing the privacy issues associated with this type of ubiquitous computing interaction. We propose to develop a Privacy Rights Management (PRM) framework that will enable users to specify and manage the privacy of personal context information generated by a pervasive system [8].
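The cross-policy analysis and policy learning described for Scenario 2 could be sketched as below. Again this is a hypothetical Python sketch: the policy tables, the `others_consent` check, and the override-count threshold are assumptions made for illustration, not the project’s actual analysis or learning algorithms.

```python
# Illustrative sketch of cross-policy checking and simple policy
# learning (Scenario 2). All structures and thresholds are assumed.
from collections import Counter

def others_consent(requester, attribute, policies):
    """Return the users whose policies block disclosing `attribute`
    to `requester` -- a disclosure involving several users needs every
    affected policy to agree."""
    return [user for user, allowed in policies.items()
            if requester not in allowed.get(attribute, set())]

# Alice and Bob disclose 'location' to nobody; Charles allows his tutor.
policies = {
    'alice':   {'location': set()},
    'bob':     {'location': set()},
    'charles': {'location': {'tom'}},
}

blockers = others_consent('tom', 'location', policies)
# blockers == ['alice', 'bob']: their DSAs must be alerted for consent.

# Each manual override is logged; once the same override recurs often
# enough, the PRM suggests promoting the exception to a standing policy.
overrides = Counter()
SUGGEST_AFTER = 3   # assumed threshold

def record_override(user, requester, attribute):
    overrides[(user, requester, attribute)] += 1
    return overrides[(user, requester, attribute)] >= SUGGEST_AFTER

for _ in range(3):
    suggest = record_override('alice', 'tom', 'location')
# suggest is now True: propose 'allow tom to learn alice is with charles'.
```

The sketch separates the two mechanisms the scenario motivates: a static check over the collection of policies before any disclosure, and an incremental learner that turns repeated user decisions into a suggested policy.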
This framework will integrate users’ privacy policies with their personal information to control how that information is used. This is analogous to Digital Rights Management (DRM), which uses software solutions to protect digital information against copyright infringement, often by embedding information such as ‘digital watermarks’ in the data being protected or by encapsulating the data so that it is self-protecting [9]. Our work will identify how people perceive privacy in ubiquitous systems and how they would like to control it, and will provide tools that enable them to manage the privacy of the information they generate. We will draw on a large cohort of over 1000 OU students with a broad range of ages and backgrounds to identify requirements, and a smaller group of over 100 to evaluate the privacy management tools prototyped in the project. The size of the first sample enables us to generalize the