Mathematical Statistics and Engineering Applications
ISSN: 2094-0343
DOI: https://doi.org/10.17762/msea.v70i2.2454
Vol. 70 No. 2 (2021), pp. 1641-1650
http://philstat.org.ph

BERT Algorithm Used in Google Search

Sumeshwar Singh
Asst. Professor, Department of Comp. Sc. & Info. Tech., Graphic Era Hill University, Dehradun, Uttarakhand, India 248002

Article History
Article Received: 05 September 2021
Revised: 09 October 2021
Accepted: 22 November 2021
Publication: 26 December 2021

Abstract
With the explosive expansion of digital material on the internet, search engines have become a necessity for obtaining information. Google, one of the most widely used search engines, works continually to improve its search functionality. In recent years, Google has adopted cutting-edge natural language processing (NLP) methods to enhance search results. One such ground-breaking innovation is the Bidirectional Encoder Representations from Transformers (BERT) algorithm. This study seeks to offer a thorough evaluation of the BERT algorithm and its use in Google Search. We examine BERT's design, training procedure, and salient characteristics, emphasising its capacity to comprehend the subtleties and context of natural language. We also discuss BERT's effects on user experience and search engine optimisation (SEO), as well as potential future advances and difficulties.

Keywords: BERT model, Bidirectional Encoder Representations from Transformers, natural language understanding, contextual understanding, language modeling

I. Introduction
The advent of the internet has completely changed how we access and use information. Search engines have evolved into essential tools for navigating the vast digital universe, allowing users to rapidly locate relevant and trustworthy material. Among the many search engines available, Google has retained its supremacy thanks to its ongoing efforts to enhance search quality and user experience [1].
Google has considerably improved its search capabilities in recent years by harnessing the strength of cutting-edge natural language processing (NLP) methods, notably the Bidirectional Encoder Representations from Transformers (BERT) algorithm.

1.1 Background
Search engines such as Google face a significant challenge in closing the gap between user intent and search results. Users frequently phrase their queries in natural language, expecting the search engine to understand their intent and return appropriate results that meet their needs. Traditional algorithms, on the other hand, had trouble understanding the nuances of human language, which led to erroneous and inadequate search results [2].

1.2 Problem Statement
In the sections that follow, we present a thorough review of Google Search, its development, and the difficulties that search engine technology currently faces. The BERT algorithm [3] is then introduced, with its neural network design, pre-training, and fine-tuning procedures explained. We
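The bidirectional property mentioned above can be illustrated with a toy sketch (this is not BERT itself, just a minimal pure-Python illustration of the idea): a left-to-right model sees only the tokens preceding a word, while a BERT-style encoder conditions on tokens from both sides, which is often what disambiguates a word like "bank".

```python
# Toy illustration of why bidirectional context matters for word-sense
# disambiguation. This is a hypothetical sketch, not the BERT algorithm:
# real BERT learns contextual vectors via masked language modelling.

def left_context(tokens, i, window=2):
    """Context visible to a unidirectional (left-to-right) model."""
    return tokens[max(0, i - window):i]

def bidirectional_context(tokens, i, window=2):
    """Context visible to a BERT-style bidirectional encoder."""
    return tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]

s1 = "he sat by the bank of the river".split()
s2 = "he sat by the bank to withdraw cash".split()
i1, i2 = s1.index("bank"), s2.index("bank")

# A left-to-right model sees identical context for both senses of "bank":
print(left_context(s1, i1))  # ['by', 'the']
print(left_context(s2, i2))  # ['by', 'the']

# A bidirectional view separates the two senses:
print(bidirectional_context(s1, i1))  # ['by', 'the', 'of', 'the']
print(bidirectional_context(s2, i2))  # ['by', 'the', 'to', 'withdraw']
```

In the first sentence only the right-hand context ("of the river") reveals the riverbank sense; a model restricted to left context cannot tell the two queries apart, which is precisely the limitation of pre-BERT, unidirectional language models discussed here.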