Statistics & Probability Letters 20 (1994) 69-73

Minimum disparity estimation in the errors-in-variables model

Ayanendranath Basu^a,*, Sahadeb Sarkar^b

^a Department of Mathematics, University of Texas at Austin, Austin, TX 78712, USA
^b Department of Statistics, Oklahoma State University, Stillwater, OK 74078, USA

Received April 1993

Abstract

Robust estimators are determined using the minimum disparity estimation method (Lindsay, 1994; Basu and Lindsay, 1994) in the errors-in-variables model. These estimators are asymptotically fully efficient for the model considered and have strong robustness features. In a numerical example these estimators compare favorably with the orthogonal regression M-estimators of Zamar (1989).

Key words: Hellinger distance; Kernel density estimation; Robustness; Transparent kernel

1. Introduction

We consider the following classical measurement error model:

$$Y_t = \beta_0 + \beta_1 x_t + e_t, \qquad X_t = x_t + u_t,$$

where $Y_t$ and $X_t$ are observable random variables and $x_t$ denotes the true value of the explanatory variable, which is not observed directly; $\{x_t\}$, $\{e_t\}$ and $\{u_t\}$ are independent sequences of i.i.d. random variables and

$$(x_t, e_t, u_t)' \sim N\big[(\mu_x, 0, 0)',\ \operatorname{diag}(\sigma_{xx}, \sigma_{ee}, \sigma_{uu})\big]. \tag{1.1}$$

Under the model assumptions, the common distribution of the observations $(Y_t, X_t)$ is bivariate normal:

$$\begin{pmatrix} Y_t \\ X_t \end{pmatrix} \sim N\left[\begin{pmatrix} \beta_0 + \beta_1 \mu_x \\ \mu_x \end{pmatrix},\ \begin{pmatrix} \beta_1^2 \sigma_{xx} + \sigma_{ee} & \beta_1 \sigma_{xx} \\ \beta_1 \sigma_{xx} & \sigma_{xx} + \sigma_{uu} \end{pmatrix}\right]. \tag{1.2}$$

Hence the distribution of $(Y_t, X_t)$ is characterized completely by the five elements of its mean vector and covariance matrix. But because model (1.1) contains six parameters, it is not identified. Model (1.1) becomes identifiable, for example, if we assume that either the measurement error variance $\sigma_{uu}$ or the ratio of $\sigma_{ee}$ to $\sigma_{uu}$ is known. In this paper we will assume $\sigma_{ee}/\sigma_{uu}$ is known. For a detailed description of measurement error models see Fuller (1987).

* Corresponding author.
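As a quick sanity check on the implied distribution (1.2), the following sketch simulates model (1.1) and compares the sample moments of $(Y_t, X_t)$ with the mean vector and covariance matrix given in (1.2). The parameter values are hypothetical, chosen only for illustration; they do not come from the paper.

```python
import numpy as np

# Hypothetical parameter values (illustrative only, not from the paper).
beta0, beta1 = 1.0, 2.0
mu_x, s_xx, s_ee, s_uu = 5.0, 4.0, 1.0, 1.0

rng = np.random.default_rng(0)
n = 200_000

# Model (1.1): latent x_t plus independent normal errors e_t, u_t.
x = rng.normal(mu_x, np.sqrt(s_xx), n)
e = rng.normal(0.0, np.sqrt(s_ee), n)
u = rng.normal(0.0, np.sqrt(s_uu), n)
Y = beta0 + beta1 * x + e   # observed response
X = x + u                   # observed, error-contaminated covariate

# Moments implied by the bivariate normal in (1.2).
mean_implied = np.array([beta0 + beta1 * mu_x, mu_x])
cov_implied = np.array([[beta1**2 * s_xx + s_ee, beta1 * s_xx],
                        [beta1 * s_xx,           s_xx + s_uu]])

sample = np.vstack([Y, X])
mean_hat = sample.mean(axis=1)   # should be close to mean_implied
cov_hat = np.cov(sample)         # should be close to cov_implied
```

With the five observable moments pinned down by the data, the six model parameters are not separately recoverable, which is exactly the identifiability problem the assumption of a known $\sigma_{ee}/\sigma_{uu}$ resolves.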