Nonconventional Technologies Review 2012, Romanian Association of Nonconventional Technologies, Romania, September 2012

A NEW APPROACH TO DEVELOPING A PARAMETRIC FINITE ELEMENT ANALYSIS MODEL FOR SIMULATION OF THE ELECTRODISCHARGE MACHINING PROCESS ASSISTED BY ULTRASONICS

Alexandru Sergiu NANU 1, Niculae Ion MARINESCU 2, Liviu Daniel GHICULESCU 3
1 - Politehnica University of Bucharest, sergiu.nanu@nsn.pub.ro
2 - Politehnica University of Bucharest, niculae.marinescu@nsn.pub.ro
3 - Politehnica University of Bucharest, daniel.ghiculescu@nsn.pub.ro

ABSTRACT: In recent years, much research in the field of electrodischarge-based technologies and their improvement has been published, and this tendency is expected to continue for a considerable time. It has conventionally been believed that the understanding of discharge phenomena and their uses is owed mostly to experimental discovery, but a more modern approach can be promoted, based on the accumulation of knowledge through systematic study of the process. This study deals with a new concept for modeling and virtual simulation of the EDM+US process.

KEY WORDS: EDM/EDM+US, modeling, virtual experiments, parametric model, accumulated knowledge, data visualisation

1. INTRODUCTION

The need to understand large and complex (information-rich) data sets is common to virtually all human activities. Nowadays a database is recognized as a strategic asset. The ability to extract the useful knowledge hidden in those data, and to act on that knowledge, has become increasingly important in today's competitive world. There are two problems in modern science [1]: too many people use different terminology to solve the same problems, and even more people use the same terminology to address completely different issues.

The growing use of computers and database technology has resulted in an explosive growth of methods for estimating useful models from data. We can consider that a model estimates an unknown dependency between a system's inputs and outputs. The most common approach is modeling from samples, which means that the data must be collected first. According to [1], there are three distinct methodologies currently used for estimating empirical models from data:
1. Statistical model estimation, based on extending classical statistics and function approximation to develop adaptive model formulations;
2. Predictive learning, developed by practitioners in the field of artificial neural networks (in the late 1980s, initially with no particular theoretical justification). Under this approach, the main focus is on estimating models with good generalization capability, as opposed to estimating "true" models under the statistical model estimation methodology. The theoretical framework for predictive learning is known as Statistical Learning, a term denoting a methodology for estimating models from data.
3. Data mining, a newer practical methodology developed at the intersection of computer science (database technology), information retrieval and statistics. The main goal of data mining is to estimate useful models from data by extracting a subset of data samples (from a given larger dataset) with useful (particular) properties. The method is similar to exploratory data analysis in statistics.
The term "learning method" is used here to denote an algorithm (frequently implemented in software) that estimates an unknown mapping from known system input-output samples.
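As an illustration of such a learning method, the following minimal Python sketch estimates an unknown input-output mapping from a set of samples by least-squares polynomial fitting; the hidden quadratic law, the noise level and the use of NumPy are illustrative assumptions and are not taken from the EDM+US process.

    import numpy as np

    # Known input-output samples drawn from an unknown dependency
    # (here simulated by a hidden quadratic law plus measurement noise).
    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 30)                                    # system inputs
    y = 2.0 * x**2 - 0.5 * x + 0.1 + rng.normal(0.0, 0.02, x.size)   # observed outputs

    # Learning method: estimate the mapping from the samples alone,
    # here by least-squares fitting of a second-degree polynomial.
    coeffs = np.polyfit(x, y, deg=2)
    model = np.poly1d(coeffs)

    # The estimated model can now predict the output for unseen inputs.
    print(model(0.75))

The same sample-based scheme applies regardless of the model class chosen: only the fitting step changes when a neural network or another adaptive formulation replaces the polynomial.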
2. THEORETICAL ISSUES ON MODELING

The general experimental procedure adopted in classical statistics, described by Dowdy and Wearden [2], involves the following steps:
1. state the problem;
2. formulate the hypothesis;
3. design the experiment and generate the data;
4. collect the data and perform preprocessing;
5. estimate the model;
6. interpret the model and draw the conclusions.
It is important to mention that steps 1-4, which precede model estimation, are application-domain dependent.

The main goals of modeling are: gaining insight into the unknown system; understanding the limits of applicability of a given modeling method; identifying the most important (relevant) input variables, i.e. those responsible for most of the variation of the output; and making decisions based on the interpretation of the model.

The growing use of computers has fundamentally changed the traditional boundaries between a statistician (as a data modeler) and a user (as an application expert). Nowadays, engineers as well as computer scientists successfully develop sophisticated empirical data modeling techniques to estimate complex nonlinear dependencies from data.

Paul and Balmer [3] defined simulation as the process of designing a model of a real system and conducting experiments with this model for the purpose of either understanding the behavior of the system or evaluating various strategies for its operation. Referring to simulation, Shannon [4] defines it as the process of conducting experiments on a model of a system in lieu of either direct experimentation with the system itself or direct analytical solution of some problem associated with the system. We can conclude that simulation is used to describe and analyze the behavior of an existing or conceptual system.
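In the spirit of these definitions, the following minimal Python sketch conducts virtual experiments on a model instead of the real system: a hypothetical response function (its form, its coefficients and the parameter ranges are illustrative assumptions, not the paper's EDM+US model or measured data) is evaluated under randomized operating conditions and the behavior of its output is summarized statistically.

    import numpy as np

    def response(pulse_energy, frequency):
        # Hypothetical model of the system under study; the functional form
        # and the coefficients are assumptions made for illustration only.
        return 0.8 * pulse_energy**0.6 * (1.0 + 0.05 * np.log(frequency))

    # Virtual experiments: sample the operating conditions and evaluate the
    # model in lieu of direct experimentation with the physical system.
    rng = np.random.default_rng(1)
    energy = rng.uniform(0.1, 2.0, 1000)       # pulse energy, arbitrary units
    frequency = rng.uniform(20.0, 40.0, 1000)  # ultrasonic frequency, kHz
    output = response(energy, frequency)

    print("mean output:", output.mean(), "standard deviation:", output.std())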