Incorporating Distinct Translation System Outputs into Statistical and Transformer Model

Mani Bansal, D.K. Lobiyal
Jawaharlal Nehru University, Hauz Khas, South, New Delhi, 110067, India

Abstract
Finding the correct translation of an input sentence is not an easy task in Natural Language Processing (NLP). Hybridization of different translation models has been found to handle this problem effectively. This paper presents an approach that takes advantage of various translation models by combining their outputs with statistical machine translation (SMT) and a Transformer model. First, we obtain Google Translator and Bing Microsoft Translator outputs as external system outputs. Then these outputs are fed into the SMT and Transformer systems. Finally, the combined output is generated by analyzing the Google Translator, Bing, SMT, and Transformer outputs. Prior work has used system combination, but no existing approach combines statistical and Transformer systems with other translation systems. Experimental results on the English-Hindi and Hindi-English language pairs show significant improvement.

Keywords
Machine Translation, Transformer, Statistical Machine Translation, Google Translator, Bing Microsoft Translator, BLEU.

ISIC'21: International Semantic Intelligence Conference, February 25–27, 2021, New Delhi, India
✉ manibansal1991@gmail.com (M. Bansal); lobiyal@gmail.com (D.K. Lobiyal)
© 2021 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0). CEUR Workshop Proceedings (CEUR-WS.org)

1. INTRODUCTION
Machine Translation is a major area of Natural Language Processing. There are various translation approaches, each with its pros and cons. One of the established approaches to Machine Translation (MT) is Statistical Machine Translation (SMT). The statistical system [1] is structured for adequacy and for handling out-of-vocabulary words.
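The combination step described above, which "analyzes" the outputs of the four systems, can be illustrated with a minimal consensus-selection sketch. This is a hypothetical illustration only: the paper does not specify its combination strategy, and the candidate strings, the token-overlap similarity, and the function names below are all assumptions introduced for the example.

```python
# Hypothetical sketch of hypothesis selection for system combination:
# pick the candidate translation that agrees most with the others
# (a minimum-Bayes-risk-style consensus; token overlap stands in for
# a proper similarity metric such as sentence-level BLEU).

def token_overlap(a, b):
    """Fraction of tokens in `a` that also appear in `b` (crude similarity)."""
    ta, tb = a.split(), b.split()
    if not ta:
        return 0.0
    return sum(1 for t in ta if t in tb) / len(ta)

def select_consensus(candidates):
    """Return the candidate most similar, on average, to all the others."""
    best, best_score = None, -1.0
    for i, cand in enumerate(candidates):
        score = sum(token_overlap(cand, other)
                    for j, other in enumerate(candidates) if j != i)
        if score > best_score:
            best, best_score = cand, score
    return best

# Candidate outputs from four (hypothetical) systems for one source sentence:
outputs = [
    "the cat sat on the mat",       # e.g. Google Translator
    "the cat sat on a mat",         # e.g. Bing Microsoft Translator
    "a cat is sitting on the mat",  # e.g. SMT
    "the cat sat on the mat",       # e.g. Transformer
]
print(select_consensus(outputs))  # -> the cat sat on the mat
```

In this toy run the hypothesis shared by two systems accumulates the highest agreement score and is selected as the combined output.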
Neural Machine Translation (NMT) is a breakthrough that reduces post-editing effort [2] and helps in dealing with the syntactic structure of a sentence. NMT [3] produces more fluent translations. Therefore, we build a hybrid system by combining Statistical and Transformer (NMT with a multi-head self-attention architecture) outputs to refine machine translation output. Combining these approaches into one is not an easy task. Using either SMT or a Transformer alone does not solve all issues. NMT over-translates and under-translates to some extent. Long-distance dependency, phrase repetition, translation adequacy for rare words, and word alignment problems are also observed in neural systems. As SMT [4] handles long-term