Mathematics of Computation

Published by the American Mathematical Society since 1960 (published as Mathematical Tables and Other Aids to Computation, 1943-1959), Mathematics of Computation is devoted to research articles of the highest quality in computational mathematics.

ISSN 1088-6842 (online) ISSN 0025-5718 (print)

The 2020 MCQ for Mathematics of Computation is 1.78.

The Mathematical Citation Quotient (MCQ) measures journal impact by looking at citations over a five-year period.

 

The power of adaption for approximating functions with singularities

by Leszek Plaskota, Grzegorz W. Wasilkowski and Yaxi Zhao
Math. Comp. 77 (2008), 2309-2338

Abstract:

Consider approximating functions based on a finite number of their samples. We show that adaptive algorithms are much more powerful than nonadaptive ones when dealing with piecewise smooth functions. More specifically, let $F_r^1$ be the class of scalar functions $f:[0,T]\to \mathbb {R}$ whose derivatives of order up to $r$ are continuous everywhere except at one unknown singular point. We provide an adaptive algorithm $\mathcal {A}_n^\textrm {ad}$ that uses at most $n$ samples of $f$ and whose worst case $L^p$ error ($1\le p<\infty$) with respect to ‘reasonable’ function classes $\mathcal {F}_r^1\subset F_r^1$ is proportional to $n^{-r}$. On the other hand, the worst case error of any nonadaptive algorithm that uses $n$ samples is at best proportional to $n^{-1/p}$.
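To illustrate why adaptivity helps in this setting, here is a minimal sketch (not the authors' algorithm $\mathcal {A}_n^\textrm {ad}$; the test function, the location of its jump, and the bisection rule are all hypothetical choices) that localizes a single jump discontinuity by repeatedly bisecting toward the half-interval whose endpoint values differ the most. A nonadaptive method must fix all $n$ sample points in advance and therefore cannot refine near the singularity in this way.

```python
import numpy as np

JUMP_AT = np.sqrt(2) - 1  # hypothetical "unknown" singular point


def f(x):
    # Piecewise smooth test function with a single jump of size 2 at JUMP_AT.
    return np.where(x < JUMP_AT, np.sin(3 * x), np.sin(3 * x) + 2.0)


def locate_jump(f, a, b, iters=40):
    """Bracket the discontinuity of f in [a, b] by bisection.

    At each step, keep the half-interval whose endpoint gap |f(.)-f(.)|
    is larger; away from the jump f is smooth, so the jump dominates the
    gap once the interval is small enough.  Each function evaluation
    depends on the previous ones -- this is exactly the adaptivity that
    nonadaptive (fixed-grid) sampling lacks.
    """
    for _ in range(iters):
        m = 0.5 * (a + b)
        if abs(f(m) - f(a)) > abs(f(b) - f(m)):
            b = m  # jump lies in the left half
        else:
            a = m  # jump lies in the right half
    return a, b
```

Once the singular point is bracketed in an interval of width roughly $2^{-n}$ using only $O(n)$ samples, each smooth piece can be approximated separately (e.g., by polynomial interpolation of degree $r$), which is the intuition behind an adaptive $n^{-r}$ rate versus the nonadaptive $n^{-1/p}$ barrier.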

The restriction to only one singularity is necessary for the superiority of adaption in the worst case setting. Fortunately, adaption regains its power in the asymptotic setting even for a very general class $F_r^\infty$ consisting of piecewise $C^{r}$-smooth functions, each having a finite number of singular points. For any $f\in F_r^\infty$ our adaptive algorithm approximates $f$ with error converging to zero at least as fast as $n^{-r}$. We also prove that the rate of convergence for nonadaptive methods cannot be better than $n^{-1/p}$, i.e., is much slower.

The results mentioned above do not hold if the errors are measured in the $L^\infty$ norm, since no algorithm produces small $L^\infty$ errors for functions with unknown discontinuities. However, we strongly believe that the $L^\infty$ norm is inappropriate when dealing with singular functions and that the Skorohod metric should be used instead. We show that our adaptive algorithm retains its positive properties when the approximation error is measured in the Skorohod metric. That is, the worst case error with respect to $\mathcal {F}_r^1$ equals $\Theta (n^{-r})$, and the convergence in the asymptotic setting for $F_r^\infty$ is $n^{-r}$.
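For context, one standard formulation of the Skorohod distance between $f,g:[0,T]\to \mathbb {R}$ (recalled here from the general theory of functions with jump discontinuities, not quoted from the paper) is \[ d_S(f,g)=\inf _{\lambda \in \Lambda }\,\max \Big ( \sup _{t\in [0,T]}|\lambda (t)-t|,\ \sup _{t\in [0,T]}|f(t)-g(\lambda (t))| \Big ), \] where $\Lambda$ is the set of continuous, strictly increasing bijections of $[0,T]$ onto itself. Unlike the $L^\infty$ norm, this metric charges only a small distance for a small horizontal shift of a jump location, which is why it is forgiving of uncertainty in the singular point.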

Numerical results confirm the theoretical properties of our algorithms.

Additional Information
  • Leszek Plaskota
  • Affiliation: Faculty of Mathematics, Informatics, and Mechanics, University of Warsaw, Banacha 2, 02-097 Warsaw, Poland
  • Email: leszekp@mimuw.edu.pl
  • Grzegorz W. Wasilkowski
  • Affiliation: Department of Computer Science, University of Kentucky, 773 Anderson Hall, Lexington, Kentucky 40506-0046
  • MR Author ID: 189251
  • ORCID: 0000-0003-4727-7368
  • Email: greg@cs.uky.edu
  • Yaxi Zhao
  • Affiliation: Department of Computer Science, University of Kentucky, 773 Anderson Hall, Lexington, Kentucky 40506-0046
  • Email: yaxi@uky.edu
  • Received by editor(s): July 24, 2007
  • Published electronically: April 24, 2008
  • © Copyright 2008 American Mathematical Society
    The copyright for this article reverts to public domain 28 years after publication.
  • Journal: Math. Comp. 77 (2008), 2309-2338
  • MSC (2000): Primary 65Y20, 65D05, 41A10, 41A25
  • DOI: https://doi.org/10.1090/S0025-5718-08-02103-0
  • MathSciNet review: 2429887