Impact of Primary User Emulation Attacks on Dynamic Spectrum Access Networks

Z. Jin, Student Member, IEEE, S. Anand, Member, IEEE, and K. P. Subbalakshmi, Member, IEEE

Abstract—Primary user emulation attack (PUEA) is a denial of service (DoS) attack unique to dynamic spectrum access (DSA) networks. While there have been studies in the literature on detecting and mitigating PUEA, the impact of PUEA on the performance of secondary networks has seldom been studied. In this paper, we analyze how PUEA affects call dropping and delay in secondary networks carrying both real-time and non-real-time traffic. Numerical results indicate that PUEA can increase the number of dropped calls by up to two orders of magnitude and can increase the mean delay by a factor larger than two. We then evaluate the performance of secondary networks that deploy the protocols we proposed previously to mitigate PUEA. Our protocols reduce the number of dropped calls by up to one order of magnitude. They are also shown to exhibit almost the same delay performance as a system with no PUEA for low malicious traffic load. When the malicious traffic load is high, our protocols improve the delay performance by up to 54%.

Index Terms—Dynamic spectrum access (DSA), primary user emulation attack (PUEA), Markov model, call dropping, delay

I. INTRODUCTION

Cognitive radio enabled dynamic spectrum access (DSA) networks [1]-[3] allow unlicensed “secondary users” to access spectrum bands unused by licensed “primary users” in order to improve spectrum utilization. The secondary users evacuate the spectrum bands upon the return of the primary users. This spectrum etiquette can be exploited by malicious users to mount a DSA-specific attack called the primary user emulation attack (PUEA) [4].
In such an attack, a set of malicious secondary users transmit signals whose characteristics resemble those of the primary transmitter, misleading legitimate secondary users into believing that the primary user is active and evacuating the spectrum unnecessarily. Several studies in the literature deal with the detection and mitigation of PUEA [4]-[11]. Some isolate malicious users using directional antennas on the secondary users [4] or underlying sensors [5], while others mitigate PUEA by hypothesis testing [7], [8]. Additional descriptions and references on PUEA can be found in [12]-[17]. We proposed the first centralized protocol to mitigate PUEA [9], in which secondary users convey their individual decisions to a centralized controller, which in turn uses the decisions obtained from all the secondary users to arrive at a decision for the entire network. We then developed the first distributed protocol to mitigate PUEA [10], in which secondary users exchange their individual spectrum decisions with their one-hop neighbors to thwart PUEA. The centralized protocol in [9] and the distributed protocol in [10] were found to reduce the probability of successful PUEA by up to four and three orders of magnitude, respectively. While research has been performed on the mitigation of PUEA, most studies focus on reducing the error probabilities in the primary user sensing mechanism. The impact of PUEA on the quality-of-service (QoS) performance of the secondary network has not been studied in detail.

Manuscript received November 22, 2010. Revised September 14, 2011, January 21, 2012, and March 29, 2012. This work was presented in part at IEEE GLOBECOM 2010 [21]. The authors are with the Department of Electrical and Computer Engineering, Stevens Institute of Technology. Email: {zjin, asanthan, ksubbala}@stevens.edu.
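The centralized fusion step described above can be sketched as follows. The actual fusion rule of [9] is not reproduced in this section, so a simple majority vote stands in as an assumed placeholder, and all function and variable names are hypothetical:

```python
def fuse_decisions(individual_decisions):
    """Centralized fusion sketch: the controller collects each secondary
    user's binary verdict (True = 'primary detected') and declares the
    primary active only if a strict majority agrees.

    Majority voting is an ASSUMED stand-in for the actual decision rule
    in [9]; it only illustrates the convey-and-combine structure.
    """
    votes = sum(individual_decisions)          # count of 'primary detected' verdicts
    return votes > len(individual_decisions) / 2


# Example: three secondary users report to the controller.
network_decision = fuse_decisions([True, True, False])   # majority says primary is active
```

In the distributed protocol of [10], each node would instead apply such a combining rule locally over the verdicts exchanged with its one-hop neighbors rather than at a single controller.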
To illustrate this, consider a secondary network in which secondary users communicate over channels from the set C = {1, 2, · · · , C}. Let a particular secondary user use channel 1. If a PUEA is successfully launched on channel 1, this user must switch to another channel that is used neither by the primary users nor by other secondary users. If no such channel is available, the call incident on the secondary user is dropped if it is a delay-sensitive real-time application. If the call corresponds to non-real-time traffic, it can be buffered until a channel becomes available by virtue of a primary or another secondary call leaving the system. A dropped call results in unreliable communication; a buffered call incurs additional delay. Either way, the quality-of-service (QoS) degrades. To the best of our knowledge, the effect of PUEA on the performance of secondary networks (e.g., call dropping, delay) has not been studied. In this paper, we study the impact of PUEA on secondary networks. Specifically, we study the call dropping in secondary networks carrying real-time traffic and the delay suffered by secondary networks carrying non-real-time traffic, due to PUEA. Note that in some scenarios, certain real-time traffic may tolerate some delay, while some calls corresponding to non-real-time traffic may be dropped if their waiting time in the buffer exceeds a specific threshold. In this paper, by “real-time traffic” we mean delay-intolerant traffic that is dropped immediately when no available channel is found. By “non-real-time traffic” we mean delay-tolerant traffic that can be buffered in the system until a channel becomes available.
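The switch/drop/buffer logic above can be summarized in a minimal sketch. All names (`handle_puea`, `free_channels`, `is_real_time`) are hypothetical illustrations, not part of the paper's model:

```python
from collections import deque

def handle_puea(call_id, free_channels, is_real_time, buffer):
    """Illustrative handling of a secondary call whose channel is hit by a
    successful PUEA (a sketch of the scenario in the text, not the paper's
    analytical model):
      - switch to a channel unused by both primaries and secondaries, if any;
      - otherwise drop the call if it is delay-intolerant (real-time);
      - otherwise buffer it until a channel is freed by a departing call.
    """
    if free_channels:                       # channel used by neither primaries nor secondaries
        return ("switched", free_channels.pop())
    if is_real_time:                        # delay-intolerant: dropped immediately
        return ("dropped", None)
    buffer.append(call_id)                  # delay-tolerant: wait for a departure
    return ("buffered", None)


waiting = deque()                            # FIFO buffer for non-real-time calls
outcome = handle_puea(call_id=1, free_channels=[], is_real_time=False, buffer=waiting)
```

A real-time call in the same situation (`is_real_time=True`, no free channel) would return `("dropped", None)`, which is exactly the call-dropping event analyzed later in the paper.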
To perform our study, we consider two types of malicious behavior: (i) “obstructive” malicious users, who launch PUEA with the sole objective of evacuating secondary users but do not use the white spaces themselves, and (ii) “greedy” malicious users, who use the white spaces like other secondary users in addition to launching PUEA. We model the channel occupancy in DSA networks under PUEA as a three-dimensional continuous time Markov chain (3D-CTMC), and use the 3D-CTMC to analyze the call dropping in secondary networks with real-time traffic and the delay in secondary networks carrying non-real-time traffic, in