Manuscript submitted to AIMS' Journals
doi:10.3934/xx.xx.xx.xx
Volume X, Number 0X, XX 200X, pp. X–XX

SUPERVISED DISTANCE PRESERVING PROJECTION USING ALTERNATING DIRECTION METHOD OF MULTIPLIERS

Sohana Jahan
School of Mathematics, University of Dhaka, Bangladesh
School of Mathematics, University of Southampton, UK

Abstract. Supervised Distance Preserving Projection (SDPP) is a dimension reduction method in the supervised setting proposed recently by Zhu et al. in [55]. The method learns a linear mapping from the input space to a reduced feature space. While the method has shown very promising results in regression tasks, its performance on classification problems is not satisfactory. Preserving distance relations with neighbouring points forces the data to project very close to one another in the projected space irrespective of their classes, which results in a low classification rate. To avoid this crowdedness of the SDPP approach, we propose a modification of SDPP that handles both regression and classification problems and significantly improves the performance of SDPP. We incorporate the total variance of the projected covariates into the SDPP problem, which is maximized to preserve the global structure. This approach not only facilitates efficient regression like SDPP but also successfully classifies data into different classes. We formulate the proposed optimization problem as a Semidefinite Least Squares (SLS) SDPP problem. A two-block Alternating Direction Method of Multipliers has been developed to learn the transformation matrix by solving the SLS-SDPP, which can easily handle out-of-sample data.

1. Introduction. In this paper we consider a dimension reduction method in the supervised setting. Supervised Distance Preserving Projection (SDPP) is a dimension reduction method proposed recently by Zhu et al. [55] which has shown very promising results in regression problems.
The basic formulation of SDPP aims to project data points into a reduced feature space in such a way that the distances between data points in the projected space mimic the distances between them in the response space. The method learns a linear mapping from the input space to the reduced feature space that leads to an efficient regression design. Suppose we have $n$ data points $\{x_1, x_2, \ldots, x_n\}$, $x_i \in \Re^m$, and their responses $\{y_1, y_2, \ldots, y_n\}$, $y_i \in \Re^k$. Assuming that the mapping $X \to Y$ is continuous and $X$ is well sampled, the idea is to project the high-dimensional data $\{x_1, x_2, \ldots, x_n\}$ into a lower-dimensional space $Z$ with dimensionality $r \ll m$ by $Z = f(X) = W^T X$ in such a way that the projection preserves distances locally between data points in the projected space (reduced

2010 Mathematics Subject Classification. Primary: 62H30, 68T10; Secondary: 90C25.
Key words and phrases. Supervised Distance Preserving Projection, Semidefinite Programming, Alternating Direction Method of Multipliers.
The research of the author was supported by the Commonwealth Scholarship Commission, UK, BDCS-2012-44. Email: sjahan.mat@du.ac.bd.
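To make the local distance-preservation idea above concrete, the following is a minimal sketch (not the authors' implementation) of an SDPP-style objective: for each point's $k$ nearest input-space neighbours, it penalises the squared mismatch between the pairwise squared distances in the projected space and in the response space. The neighbourhood size `k` and the use of a brute-force k-nearest-neighbour graph are illustrative assumptions.

```python
import numpy as np

def sdpp_objective(W, X, Y, k=5):
    """SDPP-style objective: sum over each point's k nearest input-space
    neighbours of (||W^T x_i - W^T x_j||^2 - ||y_i - y_j||^2)^2.
    X: (n, m) inputs; Y: (n, k_resp) responses; W: (m, r) projection."""
    n = X.shape[0]
    Z = X @ W  # projected covariates, shape (n, r)
    # pairwise input-space distances, used only to build the neighbour graph
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    obj = 0.0
    for i in range(n):
        nbrs = np.argsort(D[i])[1:k + 1]  # skip the point itself
        for j in nbrs:
            dz = np.sum((Z[i] - Z[j]) ** 2)  # projected-space squared distance
            dy = np.sum((Y[i] - Y[j]) ** 2)  # response-space squared distance
            obj += (dz - dy) ** 2
    return obj
```

Note that if the responses happen to be an exact linear function of the inputs, $Y = W^T X$, then the objective vanishes at that $W$, which matches the intuition that the projection should reproduce the response-space geometry.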