Least square regression with indefinite kernels and coefficient regularization

https://doi.org/10.1016/j.acha.2010.04.001
Open archive under an Elsevier user license.

Abstract

In this paper, we provide a mathematical foundation for least square regression learning with an indefinite kernel and coefficient regularization. Beyond continuity and boundedness, the kernel function is not required to satisfy any further regularity conditions. An explicit expression for the solution in terms of the sampling operator and the empirical integral operator is derived and plays an important role in our analysis. It yields a natural error decomposition in which the approximation error is characterized by a reproducing kernel Hilbert space associated with a certain Mercer kernel. A careful analysis shows that the sample error decays as O(1/√m). We deduce the error bound and prove asymptotic convergence. Satisfactory learning rates are then derived under a very mild regularity condition on the regression function. When the kernel is itself a Mercer kernel, better rates are obtained by a rigorous analysis, showing that coefficient regularization is powerful for learning smooth functions. The saturation effect and the relation to spectral algorithms are also discussed.
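The scheme described in the abstract can be sketched numerically. The sketch below is an assumption-laden illustration, not the paper's exact formulation: it takes the hypothesis f(x) = Σᵢ αᵢ K(x, xᵢ), the objective (1/m)‖Kα − y‖² + λm‖α‖², and its normal equations (KᵀK + λm²I)α = Kᵀy. Because KᵀK is always positive semi-definite, the linear system is solvable even when the kernel K itself is indefinite (here, a tanh kernel, which is continuous and bounded but not positive semi-definite in general).

```python
import numpy as np

def indefinite_kernel(s, t):
    # Sigmoid (tanh) kernel: continuous and bounded, but in general
    # not positive semi-definite, i.e. indefinite.
    return np.tanh(1.0 + np.outer(s, t))

def fit(x, y, lam):
    # Solve the normal equations (K^T K + lam * m^2 * I) alpha = K^T y.
    # This assumed normalization mirrors the abstract's l2 coefficient
    # regularization; the paper's exact scaling may differ.
    m = len(x)
    K = indefinite_kernel(x, x)               # m x m Gram matrix
    A = K.T @ K + lam * m**2 * np.eye(m)      # PSD matrix plus ridge term
    return np.linalg.solve(A, K.T @ y)

def predict(alpha, x_train, x_new):
    return indefinite_kernel(x_new, x_train) @ alpha

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 50)
y = np.sin(np.pi * x) + 0.05 * rng.standard_normal(50)
alpha = fit(x, y, lam=1e-6)
y_hat = predict(alpha, x, x)
print(float(np.mean((y_hat - y) ** 2)))  # small training MSE
```

Note that no symmetry or positivity of the Gram matrix is used: the ridge term λm²I on top of KᵀK guarantees a unique coefficient vector, which is precisely why coefficient regularization accommodates indefinite kernels.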

Keywords

Indefinite kernel
Mercer kernel
Coefficient regularization
Least square regression
Integral operator
Capacity independent error bounds
Learning rates


1. The author is supported by the Nature Science Fund of Shandong Province, China [Project No. Y2007A11], and the Doctor Fund of University of Jinan [Project No. XBS0832].