1932-4537 (c) 2019 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See http://www.ieee.org/publications_standards/publications/rights/index.html for more information. This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication. Citation information: DOI 10.1109/TNSM.2019.2963643, IEEE Transactions on Network and Service Management

Dynamic On-Demand Fog Formation Offering On-The-Fly IoT Service Deployment

Hani Sami and Azzam Mourad
Department of Computer Science & Mathematics, Lebanese American University, Beirut, Lebanon

Abstract—With the increasing number of IoT devices, fog computing has emerged, providing processing resources at the edge for the tremendous amount of sensed data and IoT computation. The advantage of the fog is lost if it is not present near IoT devices. Fogs nowadays are pre-configured in specific locations with pre-defined services, which limits their availability and prevents dynamic service updates. In this paper, we address the aforementioned problem by benefiting from containerization and micro-service technologies to build our on-demand fog framework with the help of volunteering devices. Our approach overcomes the current limitations by providing available fog devices with the ability to have services deployed on the fly. Volunteering devices form a resource capacity for building the fog computing infrastructure. Moreover, our framework leverages an intelligent container placement scheme that produces an efficient selection of volunteers and distribution of services. An Evolutionary Memetic Algorithm (MA) is elaborated to solve our multi-objective container placement optimization problem.
Real-life and simulated experiments demonstrate various improvements over existing approaches, reflected in the relevance and efficiency of (1) forming volunteering fog devices near users with maximum time availability and shortest distance, and (2) deploying services on the fly on selected fogs with improved QoS.

Index Terms—IoT, Fog Computing, On-Demand Fog Formation, Edge Computing, Docker, Kubernetes, Kubeadm, Container Placement, Evolutionary Memetic Algorithm, Micro-Services.

I. INTRODUCTION

In today's era of rapid growth and rising intelligence, we are encountering a vast increase in the number of IoT devices, changing the way we live. This leads to a humongous volume of data that has to be dealt with before going to the cloud, with the help of fog nodes located at the edge next to IoT devices [1]. The purpose of having fog nodes located next to users can be summarized as: providing computation resources for filtering the data, processing the data before it goes to the cloud, minimizing the workload on the cloud, achieving faster response times, and diminishing the devices' energy consumption by reducing data transmission over the network. Fog is a perfect solution for resource-constrained devices and for users in need of a service running nearby to obtain a better Quality of Service (QoS). However, current works in the literature consider fog devices pre-configured in specific locations next to a group of known users and running the same known services all the time. This limits the fog's advantages in terms of having it available all the time next to the user in need, and does not allow the fog to change and update the services in its hosting environment on demand. Accordingly, there is a need for an architecture that can help in creating fogs on demand and that can adapt or configure the installed services, which should be updated, removed, or changed dynamically.
In parallel, the rise of containerization technology is opening the door for interesting solutions serving fog computing objectives. Containers run services on a device by sharing its actual operating system. This makes containers more lightweight and gives them an advantage over virtual machines, which use a full copy of an operating system and are much heavier on devices [2]. It is now easy to have an abstracted operating system with all the environments needed to run multiple services in multiple containers, each a copy of an image pulled from image repositories such as Docker Hub. Docker and Kubernetes are the main containerization and orchestration technologies used nowadays [3]. Moreover, the fast emergence of the micro-service architecture, where services are designed to be decoupled and lightweight, makes such services perfect candidates to be deployed and executed in containers.

In this paper, we benefit from containerization and micro-service technologies to address the aforementioned problems of the existing statically formed fog computing architectures and solutions in the literature. In this context, we propose a dynamic on-demand fog computing framework based on Kubeadm and Docker with the presence of volunteering devices. Images embedding lightweight micro-services can be deployed and run efficiently on the fly, even on devices with limited resources [4]. The advantage of supporting the on-the-fly deployment technique is to push only the currently needed services, which were not predicted to be requested. Moreover, the motivation behind using volunteering devices to join the fog network is to increase the available resource capacity wherever possible, which leads to sustained service availability everywhere.
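To make the on-the-fly deployment idea concrete, the following is a minimal sketch (not part of the proposed framework; the service name, labels, image name, and resource limits are hypothetical) of a Kubernetes Deployment manifest that pulls a lightweight micro-service image from Docker Hub and runs it as a container on an available worker node:

```yaml
# Hypothetical manifest: runs one replica of a micro-service container
# from a Docker Hub image, with a small resource footprint suited to
# resource-constrained volunteering devices.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: sensor-filter              # hypothetical service name
spec:
  replicas: 1
  selector:
    matchLabels:
      app: sensor-filter
  template:
    metadata:
      labels:
        app: sensor-filter
    spec:
      containers:
      - name: sensor-filter
        image: example/sensor-filter:latest   # hypothetical Docker Hub image
        resources:
          limits:
            memory: "128Mi"        # keep the footprint small for limited devices
            cpu: "250m"
```

Applying such a manifest from the master node (e.g., with `kubectl apply -f deployment.yaml`) schedules the container onto a worker, which is the kind of push-on-demand deployment described above.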
Our proposed on-demand creation serves as a solution to overcome the limitations of fog availability: fogs in statically defined locations, hosted on pre-configured devices, and embedding pre-selected services. A master-worker node architecture is implemented with the help of a Kubernetes utility called Kubeadm [5], which uses Docker and monitors, through the master node, the status of the containers running on every worker node. Moreover, another problem arises: selecting the best volunteers to host the different required services. In this regard, we formulate this problem as a multi-objective container placement optimization problem and provide an Evolutionary Memetic Algorithm to solve it [6]. Using heuristics, our decision model efficiently dictates, for each service, the volunteer to which it should be pushed. To overview our approach and illustrate its contributions