
Mathematics of Computation

ISSN 1088-6842(online) ISSN 0025-5718(print)


The power of adaption for approximating functions with singularities


Authors: Leszek Plaskota, Grzegorz W. Wasilkowski and Yaxi Zhao
Journal: Math. Comp. 77 (2008), 2309-2338
MSC (2000): Primary 65Y20, 65D05, 41A10, 41A25
DOI: https://doi.org/10.1090/S0025-5718-08-02103-0
Published electronically: April 24, 2008
MathSciNet review: 2429887


Abstract: Consider approximating functions based on a finite number of their samples. We show that adaptive algorithms are much more powerful than nonadaptive ones when dealing with piecewise smooth functions. More specifically, let $F_r^1$ be the class of scalar functions $f:[0,T]\to\mathbb{R}$ whose derivatives of order up to $r$ are continuous at any point except for one unknown singular point. We provide an adaptive algorithm $\mathcal{A}_n^{\rm ad}$ that uses at most $n$ samples of $f$ and whose worst case $L^p$ error ($1\le p<\infty$) with respect to “reasonable” function classes $\mathcal{F}_r^1\subset F_r^1$ is proportional to $n^{-r}$. On the other hand, the worst case error of any nonadaptive algorithm that uses $n$ samples is at best proportional to $n^{-1/p}$.

The restriction to only one singularity is necessary for the superiority of adaption in the worst case setting. Fortunately, adaption regains its power in the asymptotic setting even for a very general class $F_r^\infty$ consisting of piecewise $C^{r}$-smooth functions, each having a finite number of singular points. For any $f\in F_r^\infty$, our adaptive algorithm approximates $f$ with error converging to zero at least as fast as $n^{-r}$. We also prove that the rate of convergence for nonadaptive methods cannot be better than $n^{-1/p}$, i.e., is much slower.

The results mentioned above do not hold if the errors are measured in the $L^\infty$ norm, since no algorithm produces small $L^\infty$ errors for functions with unknown discontinuities. However, we strongly believe that the $L^\infty$ norm is inappropriate when dealing with singular functions and that the Skorohod metric should be used instead. We show that our adaptive algorithm retains its positive properties when the approximation error is measured in the Skorohod metric. That is, the worst case error with respect to $\mathcal{F}_r^1$ equals $\Theta(n^{-r})$, and the convergence in the asymptotic setting for $F_r^\infty$ is $n^{-r}$.
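Recall the classical Skorohod metric (this standard definition is supplied here for context, not quoted from the article): for $f,g:[0,T]\to\mathbb{R}$, $d_S(f,g)=\inf_{\lambda\in\Lambda}\max\bigl(\sup_{t\in[0,T]}|\lambda(t)-t|,\ \sup_{t\in[0,T]}|f(t)-g(\lambda(t))|\bigr)$, where $\Lambda$ denotes the increasing continuous bijections of $[0,T]$ onto itself. A small $d_S$-error thus permits a slight reparametrization of the time argument, so a singular point that is recovered only approximately incurs a small error, in contrast to the $L^\infty$ norm, where any misplaced jump costs the full jump height.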

Numerical results confirm the theoretical properties of our algorithms.
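The adaptive idea, first localize the unknown singular point from samples and then sample each smooth piece, can be illustrated by the following toy Python sketch. It is not the paper's algorithm $\mathcal{A}_n^{\rm ad}$; it assumes a single jump discontinuity whose height dominates the smooth variation of $f$, and uses Chebyshev-node interpolation on each piece purely for illustration.

```python
import math

def locate_jump(f, a, b, tol=1e-10):
    """Bisect to localize a single jump: the half-interval whose endpoint
    values differ more is taken to contain the jump (a valid test when the
    jump height dominates the smooth variation of f on each half)."""
    while b - a > tol:
        m = 0.5 * (a + b)
        if abs(f(m) - f(a)) >= abs(f(b) - f(m)):
            b = m
        else:
            a = m
    return 0.5 * (a + b)

def lagrange_eval(xs, ys, t):
    """Evaluate the Lagrange interpolant through points (xs, ys) at t."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        w = 1.0
        for j, xj in enumerate(xs):
            if j != i:
                w *= (t - xj) / (xi - xj)
        total += yi * w
    return total

def adaptive_approx(f, T, nodes_per_piece=8):
    """Toy adaptive scheme: spend O(log(1/tol)) samples to localize the
    jump, then interpolate each smooth piece at Chebyshev nodes."""
    s = locate_jump(f, 0.0, T)
    pieces = []
    for lo, hi in ((0.0, s - 1e-9), (s + 1e-9, T)):
        xs = [0.5 * (lo + hi) + 0.5 * (hi - lo) *
              math.cos((2 * i + 1) * math.pi / (2 * nodes_per_piece))
              for i in range(nodes_per_piece)]
        pieces.append((xs, [f(x) for x in xs]))
    def approx(t):
        xs, ys = pieces[0] if t < s else pieces[1]
        return lagrange_eval(xs, ys, t)
    return approx, s
```

A nonadaptive method with the same total budget must place its nodes without knowing where the jump lies, which is what forces the much slower $n^{-1/p}$ worst case rate discussed above.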






Additional Information

Leszek Plaskota
Affiliation: Faculty of Mathematics, Informatics, and Mechanics, University of Warsaw, Banacha 2, 02-097 Warsaw, Poland
Email: leszekp@mimuw.edu.pl

Grzegorz W. Wasilkowski
Affiliation: Department of Computer Science, University of Kentucky, 773 Anderson Hall, Lexington, Kentucky 40506-0046
Email: greg@cs.uky.edu

Yaxi Zhao
Affiliation: Department of Computer Science, University of Kentucky, 773 Anderson Hall, Lexington, Kentucky 40506-0046
Email: yaxi@uky.edu

Keywords: Function approximation, sampling, singularities, adaptive algorithms
Received by editor(s): July 24, 2007
Published electronically: April 24, 2008
Article copyright: © Copyright 2008 American Mathematical Society
The copyright for this article reverts to public domain 28 years after publication.