Math @ Duke
Publications [#360475] of Shira Faigenbaum-Golovin
Papers Published
- Daubechies, I.; DeVore, R.; Dym, N.; Faigenbaum-Golovin, S.; Kovalsky, S.Z.; Lin, K.-C.; Park, J.; Petrova, G.; Sober, B., Neural Network Approximation of Refinable Functions, arXiv:2107.13191 (July 2021)
(last updated on 2024/07/30)
Abstract: In the desire to quantify the success of neural networks in deep learning and
other applications, there is great interest in understanding which functions
are efficiently approximated by the outputs of neural networks. By now, there
exists a variety of results showing that a wide range of functions can be
approximated with sometimes surprising accuracy by these outputs. For example,
it is known that the set of functions that can be approximated with exponential
accuracy (in terms of the number of parameters used) includes, on the one hand,
very smooth functions such as polynomials and analytic functions (see e.g.
\cite{E,S,Y}) and, on the other hand, very rough functions such as the
Weierstrass function (see e.g. \cite{EPGB,DDFHP}), which is nowhere
differentiable. In this paper, we add to the latter class of rough functions by
showing that it also includes refinable functions. Namely, we show that
refinable functions are approximated by the outputs of deep ReLU networks of
fixed width and increasing depth, with accuracy exponential in the number of
network parameters. Our results apply to functions used in the standard
construction of wavelets as well as to functions constructed via subdivision
algorithms in Computer Aided Geometric Design.
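The connection between refinable functions and ReLU networks can be illustrated with a minimal sketch (not taken from the paper): the piecewise-linear "hat" function is a classic refinable function, satisfying the two-scale refinement equation hat(x) = 0.5·hat(2x) + hat(2x−1) + 0.5·hat(2x−2), and it is also exactly expressible as a combination of three ReLU units. All function names here are illustrative, not from the paper.

```python
def relu(x):
    """Rectified linear unit: max(x, 0)."""
    return max(x, 0.0)

def hat(x):
    """Hat function on [0, 2] (hat(0)=0, hat(1)=1, hat(2)=0),
    written exactly as a sum of three ReLU units."""
    return relu(x) - 2.0 * relu(x - 1.0) + relu(x - 2.0)

def hat_refined(x):
    """Right-hand side of the two-scale refinement equation
    hat(x) = 0.5*hat(2x) + hat(2x - 1) + 0.5*hat(2x - 2)."""
    return 0.5 * hat(2.0 * x) + hat(2.0 * x - 1.0) + 0.5 * hat(2.0 * x - 2.0)

if __name__ == "__main__":
    # Verify the refinement identity pointwise on a grid covering [-1, 3].
    for i in range(-10, 31):
        x = i / 10.0
        assert abs(hat(x) - hat_refined(x)) < 1e-12
    print("refinement equation verified")
```

For general refinable functions (e.g. higher-order B-splines or Daubechies scaling functions) no exact finite ReLU formula exists, which is where the paper's fixed-width, increasing-depth approximation result comes in.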
dept@math.duke.edu
ph: 919.660.2800
fax: 919.660.2821
Mathematics Department
Duke University, Box 90320
Durham, NC 27708-0320