Journal of Babylon University / Pure and Applied Sciences, No. (1), Vol. (26): 2018

The Essential Order of Approximation Using Regular Neural Networks

Eman S. Bhaya, Omar A. Al-sammak
University of Babylon, College of Education for Pure Sciences, Department of Mathematics
emanbhaya@itnet.uobabylon.edu.iq, omar.alhelly@itnet.uobabylon.edu.iq

Abstract

This paper concerns the essential degree of approximation using regular neural networks, and how a multivariate function in $L_p$ spaces can be approximated by a feedforward regular neural network. In this way we establish the essential approximation ability of regular feedforward neural networks for multivariate functions in $L_p$ spaces.

Keywords: neural network approximation, modulus of smoothness, $L_p$ spaces, best approximation.

Abstract (translated from Arabic)

In this paper we study the essential degree of approximation using regular neural networks, and how multivariate functions in $L_p$ spaces can be approximated using regular neural networks. In this way we obtain direct and inverse theorems, and an equivalence theorem, for the approximation of multivariate functions in $L_p$ spaces using regular feedforward neural networks.

Keywords: neural networks, modulus of smoothness, $L_p$ spaces, best approximation.

1. Introduction

Various papers on the feasibility of approximation by feedforward neural networks have appeared in past years (see [Cardaliaguet & Euvrard, 1992; Chen, 1995; Chen, 1994; Chui, 1992; Cybenko, 1989; Gallant, 1992; Hornik, 1989; Hornik & Stinchombe, 1990; Leshno et al., 1993; Mhaskar & Micchelli, 1992]). The most important result among these papers is the following: every continuous multivariate function on a compact subset of $\mathbb{R}^d$ can be approximated arbitrarily well by a feedforward neural network (FNN) with a sigmoidal activation function.
A three-layer FNN with $d$ input units, one hidden layer of $n$ neurons, and one output unit can be expressed mathematically as
$$N(x) = \sum_{j=1}^{n} c_j \, \sigma(\langle a_j, x \rangle + \theta_j),$$
where $\theta_j \in \mathbb{R}$ is the threshold of the $j$-th hidden neuron, $a_j \in \mathbb{R}^d$ are the connection weights of the $j$-th hidden neuron with the input neurons, $c_j$ is the connection strength of the $j$-th hidden neuron with the output neuron, and $\sigma$ is the sigmoidal activation function used in the network. In this paper we prove direct and inverse estimates, and solve the saturation problem, for the approximation of multivariate functions in $L_p$ spaces by a feedforward regular neural network.
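The three-layer network above is straightforward to evaluate numerically. The following is a minimal sketch, not taken from the paper: the function name `fnn` and the example weights are hypothetical, and the logistic function stands in for the (unspecified) sigmoidal activation $\sigma$.

```python
import math

def sigmoid(t):
    # Logistic sigmoidal activation: monotone, sigma(-inf)=0, sigma(+inf)=1.
    return 1.0 / (1.0 + math.exp(-t))

def fnn(x, weights, thresholds, outer):
    """Evaluate N(x) = sum_j outer[j] * sigma(<weights[j], x> + thresholds[j]).

    x          -- input vector (length d)
    weights    -- list of n hidden-layer weight vectors a_j (each length d)
    thresholds -- list of n thresholds theta_j
    outer      -- list of n output-layer connection strengths c_j
    """
    total = 0.0
    for a, theta, c in zip(weights, thresholds, outer):
        inner = sum(ai * xi for ai, xi in zip(a, x)) + theta  # <a_j, x> + theta_j
        total += c * sigmoid(inner)
    return total

# Hypothetical example: d = 2 inputs, n = 2 hidden neurons.
print(fnn([0.3, 0.7], [[1.0, -1.0], [0.5, 2.0]], [0.0, -1.0], [2.0, -3.0]))
```

At $x = 0$ each inner product vanishes, so the network reduces to $\sum_j c_j \sigma(\theta_j)$, which is a convenient sanity check on an implementation.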