
Learning Rates of Least-Square Regularized Regression

Published in Foundations of Computational Mathematics

Abstract

This paper considers the regularized learning algorithm associated with the least-square loss and reproducing kernel Hilbert spaces. The target is the error analysis for the regression problem in learning theory. A novel regularization approach is presented, which yields satisfactory learning rates. The rates depend on the approximation property and on the capacity of the reproducing kernel Hilbert space measured by covering numbers. When the kernel is C^∞ and the regression function lies in the corresponding reproducing kernel Hilbert space, the rate is m^(−ζ) with ζ arbitrarily close to 1, regardless of the variance of the bounded probability distribution.
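The algorithm analyzed in the abstract is regularized least-squares (kernel ridge) regression: given a sample z = {(x_i, y_i)}, it minimizes (1/m) Σ (f(x_i) − y_i)² + λ‖f‖²_K over the reproducing kernel Hilbert space. By the representer theorem, the minimizer has the form f_z = Σ α_i K(x_i, ·) with the coefficients solving a linear system. Below is a minimal NumPy sketch under illustrative assumptions: a Gaussian kernel, a hand-picked bandwidth and regularization parameter λ, and synthetic data; none of these choices come from the paper.

```python
import numpy as np

def gaussian_kernel(x, z, sigma=0.5):
    # Gram matrix K[i, j] = exp(-(x_i - z_j)^2 / (2 sigma^2)) for 1-D inputs.
    d2 = (x[:, None] - z[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_regularized_ls(x, y, lam, sigma=0.5):
    # Minimizing (1/m) sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2 over the RKHS
    # reduces, via the representer theorem, to solving
    #   (K + m * lam * I) alpha = y.
    m = len(x)
    K = gaussian_kernel(x, x, sigma)
    return np.linalg.solve(K + m * lam * np.eye(m), y)

def predict(alpha, x_train, x_new, sigma=0.5):
    # Evaluate f_z(x) = sum_i alpha_i K(x_i, x) at new points.
    return gaussian_kernel(x_new, x_train, sigma) @ alpha

# Illustrative data: noisy samples of a smooth regression function.
rng = np.random.default_rng(0)
m = 200
x = rng.uniform(-1.0, 1.0, m)
y = np.sin(np.pi * x) + 0.1 * rng.normal(size=m)

alpha = fit_regularized_ls(x, y, lam=1e-3)
x_test = np.linspace(-1.0, 1.0, 50)
err = np.max(np.abs(predict(alpha, x, x_test) - np.sin(np.pi * x_test)))
```

The m·λ factor in the linear system reflects the 1/m normalization of the empirical error term; some texts absorb it into λ. As the paper's rates suggest, for a smooth kernel and a target in the RKHS, the estimator converges quickly as the sample size m grows (with λ chosen appropriately as a function of m).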


Author information

Correspondence to Qiang Wu, Yiming Ying or Ding-Xuan Zhou.


About this article

Cite this article

Wu, Q., Ying, Y. & Zhou, DX. Learning Rates of Least-Square Regularized Regression. Found Comput Math 6, 171–192 (2006). https://doi.org/10.1007/s10208-004-0155-9
