https://doi.org/10.1007/s10489-022-03492-6
Robust kernel ensemble regression in diversified kernel space with
shared parameters
Zhi-feng Liu¹ · Liu Chen¹ · Sumet Mehta¹,² · Xiang-Jun Shen¹ · Yu-bao Cui³
Accepted: 10 March 2022
© The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2022
Abstract
Kernel regression is an effective non-parametric regression method. However, such regression methods face the problem of choosing an appropriate kernel and its parameters. In this paper, we propose a robust kernel ensemble regression model (RKER) in diversified multiple Reproducing Kernel Hilbert Spaces (RKHSs). Motivated by multi-view data processing, we treat each kernel representation as one view of the data and apply this multi-view modeling idea to the kernel regression scenario. The proposed RKER combines multiple individual regressors into one ensemble, where each kernel regressor is associated with a weight that is learned directly from one view of the data without manual intervention. Thus, the problem of selecting a kernel and its parameters in traditional kernel regression methods is overcome by finding the best kernel combination in diversified multiple solution spaces. With this multi-view modeling, RKER achieves superior overall regression performance and is more robust to parameter selection. Further, we can learn the parameters in multiple RKHSs with individual specific and shared structures. Experimental results on the Abalone and FaceBook datasets demonstrate that our proposed RKER model achieves the best performance among state-of-the-art regression and ensemble methods, such as Random Forest, Gradient Boosting Regressor and eXtreme Gradient Boosting.
Keywords Kernel regression · Ensemble regression · Multiple kernels · Shared parameters
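As a rough illustration of the ensemble idea described in the abstract, the sketch below trains one kernel ridge regressor per RBF bandwidth (each playing the role of one "view" in its own RKHS) and combines their predictions with per-regressor weights. The weighting rule used here (a softmax over negative validation errors) and all parameter values are illustrative assumptions, not the learned-weight optimization that RKER actually proposes.

```python
# Sketch: an ensemble of kernel ridge regressors in multiple RKHSs.
# The weighting rule below is a simple stand-in, NOT the paper's scheme.
import numpy as np

def rbf_kernel(A, B, gamma):
    """RBF (Gaussian) kernel matrix between row sets A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def fit_krr(K, y, lam=1e-2):
    """Kernel ridge regression dual coefficients: (K + lam*I)^-1 y."""
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=80)
Xtr, ytr, Xva, yva = X[:60], y[:60], X[60:], y[60:]

gammas = [0.1, 1.0, 10.0]  # one RKHS ("view") per RBF bandwidth
alphas, errs = [], []
for g in gammas:
    a = fit_krr(rbf_kernel(Xtr, Xtr, g), ytr)
    pred_va = rbf_kernel(Xva, Xtr, g) @ a
    alphas.append(a)
    errs.append(np.mean((pred_va - yva) ** 2))

# Weight each kernel regressor by its validation fit (hypothetical rule).
w = np.exp(-np.array(errs) / np.min(errs))
w /= w.sum()

def ensemble_predict(Xnew):
    """Weighted combination of the individual kernel regressors."""
    preds = [rbf_kernel(Xnew, Xtr, g) @ a for g, a in zip(gammas, alphas)]
    return sum(wi * p for wi, p in zip(w, preds))

print(np.mean((ensemble_predict(Xva) - yva) ** 2))  # ensemble validation MSE
```

Because the weights are learned from held-out fit rather than chosen by hand, no single bandwidth has to be picked in advance, which is the practical point the abstract makes about kernel and parameter selection.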
1 Introduction
Kernel regression is a popular non-parametric estimation method that is widely used in various regression learning tasks due to its good performance in modeling non-linear relationships. For example, Moreno-Salinas et al. [1] used a kernel ridge regression confidence machine to identify and predict real ship behavior with high accuracy. Also, Yang et al. [2] proposed a kernel regression method based on ridgelet theory and kernel machines, which has low computational complexity and robustness to noise.
✉ Xiang-Jun Shen
xjshen@ujs.edu.cn
Yu-bao Cui
ybbcui1975@hotmail.com; ybcui1975@njmu.edu.cn
¹ School of Computer Science and Communication Engineering, JiangSu University, Zhenjiang, 212013 China
² Department of Electronics and Communication Engineering, JCDM College of Engineering, Haryana 125055, India
³ Clinical Research Center, The Affiliated Wuxi People's Hospital of Nanjing Medical University, No. 299 at Qingyang Road, Wuxi 214023, People's Republic of China
Challenges However, the performance of kernel-based regression methods depends mainly on the choice of an appropriate kernel function, and it is very difficult to determine an optimal kernel in practice. To address the selection of the kernel function and its parameters, Peter et al. [3] used a set of flexible nonlinear prediction functions to study the selection of the kernel function and its parameters in kernel ridge regression. Samah et al. [4] proposed a technique to eliminate false edges from binary edge images by using a local adaptive regression kernel as the descriptor for edge detection. Salhov et al. [5] designed the kernel by approximating the similarity between the ensemble parameters shared by multiple feature subsets. Meanwhile, some researchers apply ensemble learning (EL) to kernel regression tasks to improve performance. For example, Berikov et al. [6] applied the concept of a kernel ensemble scheme to solve a semi-supervised regression problem. They proved that the probability of a significant error converges to its minimum possible value as the
Published online: 25 April 2022
Applied Intelligence (2023) 53:1051–1067