Edge-Assisted Rendering of 360 Videos Streamed to Head-Mounted Virtual Reality

Wen-Chih Lo
Department of Computer Science, National Tsing Hua University, Hsin-Chu, Taiwan
wchih.lo@gmail.com

Chih-Yuan Huang
Department of Computer Science, National Tsing Hua University, Hsin-Chu, Taiwan
yuan64198@gmail.com

Cheng-Hsin Hsu
Department of Computer Science, National Tsing Hua University, Hsin-Chu, Taiwan
chsu@cs.nthu.edu.tw

Abstract—Over the past few years, 360 video streaming has become increasingly popular. Watching these videos with Head-Mounted Displays (HMDs), also known as Virtual Reality (VR) headsets, provides a more immersive experience than traditional planar monitors. Delivering a truly immersive experience, however, poses several challenges, including high bandwidth consumption, latency sensitivity, and heterogeneous HMD devices. In this paper, we propose an edge-assisted 360 video streaming system that leverages edge servers to render viewports for viewers of 360 videos. We formulate an optimization problem to determine which HMD clients should be served by the edge server, design an algorithm to solve it, and implement a real testbed as a proof of concept. The resulting edge-assisted 360 video streaming system is extensively evaluated with a public 360 viewing dataset. By leveraging edge servers, we reduce the bandwidth usage and computational workload on HMD clients while also achieving lower network latency. The evaluation results show that, compared to current 360 video streaming platforms, our edge-assisted rendering platform: (i) saves up to 62% in bandwidth consumption, (ii) achieves higher viewing quality, (iii) reduces the computational workload on lightweight HMDs, and (iv) extends the battery life of HMD clients.

Index Terms—Edge computing; tiled streaming; Head-Mounted Display; resource allocation; omnidirectional videos.

I. INTRODUCTION

In recent years, Augmented and Virtual Reality (AR/VR) technologies have developed rapidly and attracted attention from both academia and industry. A report [1] states that more than a billion people worldwide will regularly use and access AR/VR applications, content, and data by 2021. Another market research report [2] predicts that the AR/VR market will generate 108 billion USD in annual revenue by 2021. Head-Mounted Displays (HMDs), also known as VR headsets (such as the HTC Vive Focus or Samsung Gear VR), are heterogeneous in terms of computing power and network conditions. Accelerating computation and performing high-resolution viewport rendering for HMDs requires a high-end GPU. While increasingly powerful mobile GPUs are being integrated into HMDs, running computationally intensive applications, such as online games [3], [4], solely on HMDs may still result in excessive heat, short battery life, and higher unit prices. Hence, to reduce the computational workload on HMDs, we employ edge servers to perform viewport rendering for viewers.

Edge computing, a.k.a. fog computing, pushes applications, data, and services from centralized servers to edge servers close to users [5], [6]. By using distributed edge servers, we aim to reduce latency, save network resources, and achieve location awareness. Delivering an immersive AR/VR experience, however, faces three major challenges:

High bandwidth consumption. Streaming 360 videos dramatically increases network bandwidth consumption. A viewer wearing an HMD may rotate his/her head at any time to change the viewing orientation while watching a 360 video, and the HMD then displays the current viewport based on that orientation. Since each viewport covers only a small part of the entire video frame, streaming each frame of the 360 video at its full resolution wastes resources.

Latency sensitivity. Watching 360 videos on HMDs for an immersive experience is becoming increasingly common.
Viewers, however, demand precise and smooth viewport movements, which in turn dictate low network latency. Indeed, research shows that the round-trip latency should be less than 60 ms to avoid motion sickness [7].

Heterogeneous HMD devices. There are mainly three types of HMDs: (i) HMDs used with a PC, such as the Oculus Rift DK2, (ii) HMDs used with a mobile device, such as the Samsung Gear VR, and (iii) standalone HMDs, such as the HTC Vive Focus. These HMD devices differ in computing power and network conditions [8]. For example, HMDs with PCs may have access to powerful GPUs, while standalone HMDs come with weak mobile GPUs.

In this paper, we design, implement, and evaluate an edge-assisted 360 video streaming system to address the above challenges. First, tiled streaming is employed to reduce bandwidth consumption. That is, we split a 360 video into several tiles of sub-videos, which are encoded by modern video codecs, such as HEVC (High Efficiency Video Coding) [9], into tiled video bitstreams. To save bandwidth, each client requests and decodes only the tiles that will be watched in high quality, while the remaining tiles are fetched in low quality. Second, we propose to leverage edge servers (for example, in base stations or next to access points) to perform viewport rendering. Doing so reduces the network latency, compared