Performance study of Byzantine Agreement Protocol with artificial neural network

Kok-Wah Lee a,*, Hong-Tat Ewe b

a Faculty of Engineering & Technology, Multimedia University, Jalan Ayer Keroh Lama, 75450 Bukit Beruang, Melaka, Malaysia
b Faculty of Information Technology, Multimedia University, Jalan Multimedia, 63100 Cyberjaya, Selangor, Malaysia

Received 23 December 2005; received in revised form 10 April 2007; accepted 16 April 2007

Abstract

Since 1982, numerous Byzantine Agreement Protocols (BAPs) have been developed to solve arbitrary faults in the Byzantine Generals Problem (BGP). A novel BAP, using an artificial neural network (ANN), was proposed by Wang and Kao. It requires message exchange rounds similar to the traditional BAP, and its suitability with respect to network size has not been investigated. In the present study, we propose to adopt Nguyen–Widrow initialization in ANN training, which modifies the message communication and limits the message exchange to three rounds. This modified approach is referred to as BAP-ANN. BAP-ANN performs better than the traditional BAP when the network size n is greater than nine. We also evaluate the message exchange matrix (MEM) constructed during the message exchange stage. For a fixed number of faulty nodes and remainder cases of (n mod 3), the study shows that the mean epoch for ANN training decreases as the network size increases, which indicates better fault tolerance.
© 2007 Elsevier Inc. All rights reserved.

Keywords: Byzantine Agreement; Communication protocol; Artificial neural networks; Cryptology; Fault tolerance

1. Introduction

With the emergence of the Internet, the consensus issue, or Byzantine Generals Problem (BGP), in distributed systems has become more significant in achieving fault tolerance. The fault tolerance [3,4,22] required to reach common agreement underlies the importance of earlier investigations [14,16,20,32]. Since the study of Lamport et al.
[21], various Byzantine Agreement Protocols (BAPs) for fault tolerance have been developed [10,17,27,31,39]. These studies made numerous assumptions to construct variations of the traditional BAP of Lamport et al. [21]. Wang and Kao [43] used a feedforward neural network (FFNN) with a backpropagation learning algorithm (BPLA) to construct a BAP. The message communication mode of Wang and Kao's BAP is similar to that of the traditional BAP, and its suitability with respect to network size has not been investigated.

0020-0255/$ - see front matter © 2007 Elsevier Inc. All rights reserved.
doi:10.1016/j.ins.2007.04.011
* Corresponding author. Tel.: +60 6 252 3320; fax: +60 6 231 6552.
E-mail addresses: kwlee@mmu.edu.my (K.-W. Lee), htewe@mmu.edu.my (H.-T. Ewe).
Information Sciences 177 (2007) 4785–4798
www.elsevier.com/locate/ins
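The Nguyen–Widrow initialization adopted for ANN training in the abstract can be sketched roughly as follows. This is a minimal illustration of the standard Nguyen–Widrow rule for one hidden layer, not the paper's implementation; the function name and NumPy usage are our own. Each hidden neuron's weight vector is drawn uniformly in [-0.5, 0.5] and rescaled to magnitude beta = 0.7 * H^(1/N) (H hidden neurons, N inputs), with biases drawn uniformly in [-beta, beta].

```python
import numpy as np

def nguyen_widrow_init(n_inputs, n_hidden, seed=None):
    """Nguyen-Widrow initialization for a single hidden layer.

    Returns a (n_hidden, n_inputs) weight matrix whose rows all have
    magnitude beta = 0.7 * n_hidden**(1/n_inputs), and a bias vector
    drawn uniformly from [-beta, beta].
    """
    rng = np.random.default_rng(seed)
    beta = 0.7 * n_hidden ** (1.0 / n_inputs)

    # Step 1: small random weights in [-0.5, 0.5].
    w = rng.uniform(-0.5, 0.5, size=(n_hidden, n_inputs))

    # Step 2: rescale each neuron's weight vector to magnitude beta,
    # spreading the active regions of the neurons over the input space.
    norms = np.linalg.norm(w, axis=1, keepdims=True)
    w = beta * w / norms

    # Step 3: biases uniform in [-beta, beta].
    b = rng.uniform(-beta, beta, size=n_hidden)
    return w, b

w, b = nguyen_widrow_init(n_inputs=5, n_hidden=10, seed=0)
```

Compared with plain random initialization, this spreads the hidden units' active regions evenly over the input range, which typically reduces the number of backpropagation epochs needed to converge, consistent with the decreasing mean epoch reported in the abstract.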