Security Requirements for Outsourced Big Data in a Cloud Environment

Imene Bouleghlimat, Salima Hacini
LIRE Laboratory, Department of Software Technology and Information Systems
University of Constantine 2 - Abdelhamid Mehri, Constantine, Algeria
(imene.bouleghlimat, salima.hacini)@univ-constantine2.dz

Abstract

In the era of big-data-driven services, enterprises and organizations seek high-performance storage and processing systems that can handle large volumes of heterogeneous data to extract useful insights. Cloud computing provides the required software, platforms, and infrastructure. However, the security of outsourced data is a major concern. This paper focuses on big data security requirements and their modeling. A conceptual extension of i* is proposed, and big data security requirements are modeled using this extension.

1 Introduction

Nowadays, enterprises and organizations built on data-driven services need to collect, manage, store, analyze, and visualize large volumes of heterogeneous data to extract useful information and insights for decision making in their businesses. Traditional storage and processing tools cannot handle these data, called big data. Big data is characterized by the "3Vs": Volume, Variety, and Velocity [Mark&all12]. Volume refers to the large size of the data, in the range of zettabytes. The data can be structured, semi-structured, or unstructured, and come in different formats; this is what Variety denotes. Velocity is the speed at which data are generated and processed. Other Vs have since been added, such as Value, Veracity, Viability, and Complexity. Value is the insight that can be extracted from these large data sets. Veracity refers to data quality. Since data are generated and collected from various sources in different formats, they must be cleaned, linked, and transformed into the required format before the processing phase. This process describes big data Complexity.
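The cleaning, linking, and transformation step behind Complexity can be illustrated with a minimal sketch. All data, field names, and conversion functions below are hypothetical, invented purely to show the idea: two sources deliver the same kind of record in different formats, and each must be normalized into one common schema before processing.

```python
# Hypothetical example of the clean/link/transform step that defines
# big data Complexity: two heterogeneous sources, one common schema.
from datetime import datetime

source_a = [{"id": "1", "ts": "2023-01-05", "temp_c": "21.4"}]          # CSV-like: everything is a string
source_b = [{"sensor_id": 1, "timestamp": 1672876800, "temp_f": 70.5}]  # JSON-like: epoch time, Fahrenheit

def from_a(rec):
    # Clean: parse strings into typed values.
    return {"id": int(rec["id"]),
            "time": datetime.strptime(rec["ts"], "%Y-%m-%d"),
            "celsius": float(rec["temp_c"])}

def from_b(rec):
    # Transform: convert epoch seconds and Fahrenheit into the common schema.
    return {"id": rec["sensor_id"],
            "time": datetime.utcfromtimestamp(rec["timestamp"]),
            "celsius": round((rec["temp_f"] - 32) * 5 / 9, 1)}

# Link: records from both sources are now comparable and can be grouped by id.
unified = [from_a(r) for r in source_a] + [from_b(r) for r in source_b]
by_id = {}
for rec in unified:
    by_id.setdefault(rec["id"], []).append(rec)
```

At real big data scale this logic would run inside a distributed processing framework rather than a loop, but the required operations, type cleaning, unit and format conversion, and record linkage, are the same.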
These enterprises require infrastructures, software, and platforms that can support the big data Vs, and they need to process and analyze these data efficiently at any time [Kimball&Ross13]. The Cloud computing environment provides these capabilities. Several big data technologies are available as a service in the Cloud, but ensuring the security of the outsourced data and processing has become one of the main challenges. To propose a security approach for big data outsourced in a Cloud environment, it is important to identify the security requirements and possible threats in each phase of the big data life cycle in the Cloud. Security requirements engineering continues to develop techniques, methods, and modeling language frameworks for tackling this part of the information systems and software development process. For requirements modeling, various languages have been proposed by researchers: Domain-Specific Modeling Languages (DSMLs), which model only a given domain, and General-Purpose Modeling Languages (GPMLs), which can model any domain.

Page 51    The 8th International Seminary on Computer Science Research at Feminine