A Key Based Cache Replacement Policy for Cooperative Caching in Mobile Ad hoc Networks

Preetha Theresa Joy
Dept. of Computer Science, Cochin University of Science and Technology, Kochi, Kerala, India

K. Poulose Jacob
Dept. of Computer Science, Cochin University of Science and Technology, Kochi, Kerala, India

Abstract—Cooperative caching in mobile ad hoc networks aims to improve the efficiency of information access by reducing access latency and bandwidth usage. The cache replacement policy plays a vital role in the performance of a mobile node's cache, since the node has limited memory. In this paper we propose a new key based cache replacement policy, E-LRU, for cooperative caching in ad hoc networks. The proposed scheme considers the time interval between recent references, size, and consistency as the key factors for replacement. A simulation study shows that the proposed policy significantly improves cache performance in terms of cache hit ratio and query delay.

Keywords—Data Caching; Cooperative Cache; Ad hoc Networks; Cache Replacement.

I. INTRODUCTION

Advances in wireless technology have led to the wide popularity of Mobile Ad hoc Networks (MANETs). The primary attraction of a wireless ad hoc network is that it can be formed spontaneously, without any fixed infrastructure. This makes ad hoc networks ideal for settings where an initial connection setup is impossible or unreliable, such as military operations and disaster areas. They also have a wide range of commercial applications, including personal area networks, sensor networks, emergency networks and vehicular communication. In these networks, devices generally have limited energy reserves and processing capabilities, and bandwidth is a scarce resource limited by the nature of the wireless medium. From a data-management point of view, these restrictions introduce several issues that need to be addressed.
Data transfers must be reduced, and mechanisms must be deployed to cope with frequent disconnections and low bandwidth. It is therefore a challenging task to deliver data to the end user efficiently, with low delay or waiting time. Data caching is widely used in many domains to improve data access efficiency by reducing the waiting time, or latency, experienced by end users. In a wireless mobile network, holding frequently accessed data items in a mobile node's local storage can reduce network traffic, response time and server load. Caching in ad hoc networks is effective because a few resources are requested often by many users, or repeatedly by a specific user; this is known as locality of reference.

To obtain the full benefit of caching, neighboring nodes can cooperate and serve each other's misses, further reducing wireless traffic. This process is called cooperative caching. Since a mobile node can make use of the data stored in another node's cache, the effective cache size is increased. However, because mobile nodes have limited memory, the cache area available for storage is also limited. Whenever the cache is full, an efficient method is needed to evict some data to make room for newly arrived data; the cache replacement strategy decides which data item is removed to accommodate the new one. The replacement algorithm plays a central role in reducing response time by selecting a suitable subset of data for caching. Numerous cache replacement algorithms have been proposed for web caching, but only a few for ad hoc networks. In this paper we present a coordinated cache replacement policy, E-LRU (Extended LRU), which evicts data based on size, the time interval between recent accesses, and consistency.
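As background, the classic LRU baseline that E-LRU extends can be sketched in a few lines. This is an illustrative sketch, not part of the paper's system; on a miss, a real cooperative cache would consult neighboring nodes or the server before giving up:

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache: when full, the least recently used item is
    evicted. Illustrative sketch of the baseline policy only."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()   # insertion order tracks recency

    def get(self, key):
        if key not in self.data:
            return None             # cache miss (cooperative caching
                                    # would query neighbors here)
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        elif len(self.data) >= self.capacity:
            self.data.popitem(last=False)   # evict least recently used
        self.data[key] = value
```

For example, with capacity 2, inserting `a` and `b`, touching `a`, then inserting `c` evicts `b`, since `b` is the least recently used item.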
Almost all of the replacement algorithms proposed for cooperative caching are function-based policies, which compute a value function over several parameters and rely on complex data structures that make implementation difficult. Least Recently Used (LRU), on the other hand, is the simplest replacement policy and has been used for data caching since the early days. The disadvantage of LRU is that it considers too little information when choosing a victim. The policy we propose is an extension of LRU that evicts data objects based on the time difference between recent references. The novelty of our approach lies in the key based replacement: we first consider the size of the data item, then take into account the time interval between its recent references, and the item with the longest inter-reference interval is dropped first. The advantage of our scheme is that items with the shortest inter-reference interval, which are the most likely to be referenced again, are kept in the cache. Simulations show that our replacement policy outperforms LRU in cache hit ratio and average query delay.

978-1-4673-4529-3/12/$31.00 © 2012 IEEE
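The eviction rule just described can be sketched as follows. The class name, data layout, and exact ordering of the keys are our own assumptions for illustration; the paper does not give an implementation. Stale (expired) items are evicted first for consistency; otherwise the largest item with the longest interval between its two most recent references is chosen:

```python
class ELRUCache:
    """Illustrative sketch of an E-LRU style cache (hypothetical API).
    Tracks the two most recent access times per item so the
    inter-reference interval can be computed."""

    def __init__(self, capacity):
        self.capacity = capacity   # total cache size (e.g. in KB)
        self.used = 0
        self.items = {}            # key -> {size, t_prev, t_last, expiry}

    def access(self, key, now):
        """Record a reference, keeping the two most recent access times."""
        e = self.items[key]
        e["t_prev"], e["t_last"] = e["t_last"], now

    def put(self, key, size, ttl, now):
        """Insert a data item, evicting until it fits."""
        while self.items and self.used + size > self.capacity:
            self._evict(now)
        self.items[key] = {"size": size, "t_prev": now,
                           "t_last": now, "expiry": now + ttl}
        self.used += size

    def _evict(self, now):
        # Consistency first: expired items are dropped before any
        # valid item.
        stale = [k for k, e in self.items.items() if e["expiry"] <= now]
        if stale:
            victim = stale[0]
        else:
            # Size is considered first, then the inter-reference
            # interval: among equally sized items, the one with the
            # longest gap between its two most recent references goes
            # first (assumed ordering of the two keys).
            victim = max(self.items,
                         key=lambda k: (self.items[k]["size"],
                                        self.items[k]["t_last"]
                                        - self.items[k]["t_prev"]))
        self.used -= self.items.pop(victim)["size"]
```

Note that a newly inserted item has an inter-reference interval of zero, so it is the last candidate for eviction among items of its size, matching the intuition that short intervals signal a high probability of re-reference.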