International Journal of Electrical and Computer Engineering (IJECE)
Vol. 15, No. 2, April 2025, pp. 2181-2191
ISSN: 2088-8708, DOI: 10.11591/ijece.v15i2.pp2181-2191
Journal homepage: http://ijece.iaescore.com

Markov processes in Bayesian network computation

Assem Shayakhmetova 1, Nurbolat Tasbolatuly 2,3, Ardak Akhmetova 1, Assel Abdildayeva 1, Marat Shurenov 4, Anar Sultangaziyeva 1

1 Department of Artificial Intelligence and Big Data, Al-Farabi Kazakh National University, Almaty, Republic of Kazakhstan
2 Higher School of Information Technology and Engineering, Astana International University, Astana, Republic of Kazakhstan
3 Department of Computer Engineering, Astana IT University, Astana, Republic of Kazakhstan
4 Department of IT and Service, QUniversity, Almaty, Republic of Kazakhstan

Article history:
Received Aug 22, 2024
Revised Oct 26, 2024
Accepted Nov 20, 2024

ABSTRACT

The article examines the influence of Markov processes on computations in Bayesian networks (BNs), an important area of research within probabilistic graphical models. The concept of Bayesian Markov networks (BMNs) is introduced: an extension of traditional Bayesian networks with an added Markov constraint, under which the probability at a node can depend only on the states of neighboring nodes. This constraint makes the model more realistic for many practical tasks, since most graphical models that reflect real-world processes possess the Markov property. The article also shows that Bayesian networks, in the absence of evidence, in fact exhibit the Markov property. When evidence (additional information) is introduced into the model, however, challenges arise that require more complex computational methods. In response, the article proposes algorithms adapted for working with Bayesian Markov networks in the presence of evidence. These algorithms aim to optimize computations and reduce computational complexity.
Additionally, a comparative analysis of calculations in Bayesian networks with and without Markov constraints is conducted, highlighting the advantages and disadvantages of each approach. Special attention is paid to the practical applications of the proposed methods and their effectiveness in various scenarios.

Keywords:
Bayesian network
Markov network
Markov random field
Markov properties
Graphical model
Evidence
Belief propagation

This is an open access article under the CC BY-SA license.

Corresponding Author:
Nurbolat Tasbolatuly
Higher School of Information Technology and Engineering, Astana International University
010000 Astana, Republic of Kazakhstan
Email: tasbolatuly@gmail.com

1. INTRODUCTION

In the theory of Bayesian networks [1]-[3], the ideas of Markov [4]-[6] play a key role, especially in the development of efficient algorithms for data processing under uncertainty. Markov processes are of particular interest to researchers for several reasons. First, they are intuitive and natural for many applied tasks, which makes them attractive across a variety of fields. Second, incorporating Markov ideas into Bayesian networks (BNs) [7]-[9] can significantly simplify computational algorithms, thereby speeding up probability calculations and making it feasible to work with larger networks. Finally, Markov processes are well studied within the framework of Markov chain theory [10]-[12], which allows existing results to be adapted to Bayesian networks. Intelligent decision support systems [13], [14], especially those operating under various types of uncertainty, are often described using Bayesian networks. One of the main challenges in working with BNs is the need to perform calculations in the presence of evidence, that is, information that can alter the probability distribution in the network. Standard methods, such as
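The two ideas above, that a Bayesian network without evidence satisfies the Markov property, and that introducing evidence reshapes the distribution, can be illustrated on a minimal example. The sketch below uses a hypothetical three-node chain X1 -> X2 -> X3 with illustrative conditional probability tables (the network and all numbers are assumptions for demonstration, not taken from the paper): it verifies by enumeration that P(X3 | X2) does not change when X1 is additionally conditioned on, and that observing X3 as evidence shifts the belief about the upstream node X1.

```python
# Hypothetical chain Bayesian network X1 -> X2 -> X3 with binary variables.
# All CPT values below are illustrative assumptions, not from the article.
p_x1 = {0: 0.6, 1: 0.4}                       # P(X1)
p_x2_given_x1 = {0: {0: 0.7, 1: 0.3},         # P(X2 | X1)
                 1: {0: 0.2, 1: 0.8}}
p_x3_given_x2 = {0: {0: 0.9, 1: 0.1},         # P(X3 | X2)
                 1: {0: 0.5, 1: 0.5}}

def joint(x1, x2, x3):
    """Full joint P(X1, X2, X3) from the chain factorization."""
    return p_x1[x1] * p_x2_given_x1[x1][x2] * p_x3_given_x2[x2][x3]

def p_x3_given(x3, x2, x1=None):
    """P(X3 = x3 | X2 = x2 [, X1 = x1]) computed by enumeration."""
    if x1 is None:
        num = sum(joint(a, x2, x3) for a in (0, 1))
        den = sum(joint(a, x2, b) for a in (0, 1) for b in (0, 1))
    else:
        num = joint(x1, x2, x3)
        den = sum(joint(x1, x2, b) for b in (0, 1))
    return num / den

def p_x1_given_x3(x1, x3):
    """Posterior P(X1 = x1 | X3 = x3): evidence at X3 updates belief in X1."""
    num = sum(joint(x1, b, x3) for b in (0, 1))
    den = sum(joint(a, b, x3) for a in (0, 1) for b in (0, 1))
    return num / den

# Markov property (no evidence): conditioning additionally on X1 leaves
# P(X3 | X2) unchanged.
for x1 in (0, 1):
    for x2 in (0, 1):
        assert abs(p_x3_given(1, x2) - p_x3_given(1, x2, x1)) < 1e-12

# Evidence changes the distribution: the posterior P(X1 | X3) differs from
# the prior P(X1).
assert p_x1_given_x3(1, 0) != p_x1[1]
```

Enumeration over the full joint, as here, is exponential in the number of variables; this is precisely the cost that the paper's evidence-aware algorithms for Bayesian Markov networks aim to reduce.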