

Generalized Mercer Kernels and Reproducing Kernel Banach Spaces

About this Title

Yuesheng Xu, School of Data and Computer Science, Guangdong Province Key Laboratory of Computational Science, Sun Yat-Sen University, Guangzhou, Guangdong 510275 P. R. China and Qi Ye, School of Mathematical Sciences, South China Normal University, Guangzhou, Guangdong 510631 P. R. China

Publication: Memoirs of the American Mathematical Society
Publication Year: 2019; Volume 258, Number 1243
ISBNs: 978-1-4704-3550-9 (print); 978-1-4704-5077-9 (online)
DOI: https://doi.org/10.1090/memo/1243
Published electronically: February 21, 2019
Keywords: Reproducing Kernel Banach Spaces, Generalized Mercer Kernels, Positive Definite Kernels, Machine Learning, Support Vector Machines, Sparse Learning Methods.
MSC: Primary 68Q32, 68T05; Secondary 46E22, 68P01

Table of Contents

Chapters

  • 1. Introduction
  • 2. Reproducing Kernel Banach Spaces
  • 3. Generalized Mercer Kernels
  • 4. Positive Definite Kernels
  • 5. Support Vector Machines
  • 6. Concluding Remarks
  • Acknowledgments

Abstract

This article studies constructions of reproducing kernel Banach spaces (RKBSs), which may be viewed as a generalization of reproducing kernel Hilbert spaces (RKHSs). A key point is to endow Banach spaces with reproducing kernels such that machine learning in RKBSs is well-posed and easy to implement. First we verify many advanced properties of general RKBSs, such as density, continuity, separability, implicit representation, embedding, compactness, the representer theorem for learning methods, oracle inequalities, and universal approximation. Then we develop a new concept of generalized Mercer kernels to construct p-norm RKBSs for 1 ≤ p ≤ ∞. The p-norm RKBSs preserve the same simple format as the Mercer representation of RKHSs. Moreover, the p-norm RKBSs are isometrically equivalent to the standard p-norm spaces of countable sequences; hence the p-norm RKBSs possess richer geometric structure than RKHSs, including sparsity. More precisely, suitable countable expansion terms of the generalized Mercer kernels can be used to represent the pairs of Schauder bases and biorthogonal systems of the p-norm RKBSs, so that the generalized Mercer kernels become the reproducing kernels of the p-norm RKBSs. The generalized Mercer kernels also cover many well-known kernels, for example, min kernels, Gaussian kernels, and power series kernels. Finally, we propose to solve support vector machines in the p-norm RKBSs, which minimize the regularized empirical risks over the p-norm RKBSs. We show that the infinite-dimensional support vector machines in the p-norm RKBSs can be equivalently reduced to finite-dimensional convex optimization problems, so that we obtain finite-dimensional representations of the support vector machine solutions for practical applications.
In particular, we verify that some special support vector machines in the 1-norm RKBSs are equivalent to classical 1-norm sparse regressions. This provides fundamental support for a novel learning tool, called sparse learning methods, to be investigated in our next research project.
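The classical 1-norm sparse regression that the abstract equates these special support vector machines with can be illustrated by a small numerical sketch: minimize the regularized empirical risk 0.5·||Kc − y||² + λ·||c||₁ over the coefficients c of a kernel expansion. The kernel choice (Gaussian), the synthetic data, and the solver (ISTA, iterative soft-thresholding) below are illustrative assumptions for a minimal example, not the authors' construction.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    # Gaussian kernel matrix: K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise shrinkage)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_kernel_regression(K, y, lam=0.1, n_iter=500):
    """Minimize 0.5 * ||K c - y||^2 + lam * ||c||_1 by ISTA
    (gradient step on the smooth part, then soft-thresholding)."""
    step = 1.0 / np.linalg.norm(K, 2) ** 2  # 1 / Lipschitz constant of the gradient
    c = np.zeros(K.shape[1])
    for _ in range(n_iter):
        grad = K.T @ (K @ c - y)
        c = soft_threshold(c - step * grad, step * lam)
    return c

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(30, 1))           # sample sites
    y = np.sin(np.pi * X[:, 0]) + 0.05 * rng.standard_normal(30)
    K = gaussian_kernel(X, X, sigma=0.5)
    c = l1_kernel_regression(K, y, lam=0.5)
    # The 1-norm penalty typically drives many coefficients to exactly zero,
    # which is the sparsity the p-norm RKBS framework is designed to exploit.
    print("nonzero coefficients:", np.count_nonzero(c), "of", c.size)
```

The sparsity of the solution (many coefficients exactly zero) is the geometric feature of the 1-norm spaces that the abstract highlights; the same objective with a squared 2-norm penalty would instead yield a dense coefficient vector.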
