Abstract

This work presents an original result linking approximation theory and optimization theory for deep learning. Several examples from the recent literature show that, for the same number of learnable parameters, deep neural networks can approximate richer classes of functions, and with better accuracy, than classical methods. Most approximation theory results, however, concern only the infimum of the error over all possible parameterizations of a network of a given size. Their proofs often rely on hand-crafted networks whose weights and biases are carefully selected, and optimization theory indicates that such models would be difficult or impossible to realize with standard gradient-based training methods. The main result of this thesis proves that, for a single-layer neural network with m parameters trained by gradient flow on a univariate function, a conservative approximation rate of O(m^(-1/4)) is achieved. This is especially noteworthy because we make no assumption of overparameterization, as is typically done with neural tangent kernel (NTK) techniques. The proof relies on the assumption that the H¹-norm of the residual error remains uniformly bounded throughout training. This assumption is supported by numerical experiments, which also show that rates better than 1/4 are achieved in practice, indicating that a sharper theoretical result is most likely possible. Future work will focus on proving that the bounded-H¹ assumption is not needed and on extending the main result to multi-dimensional inputs and deep networks.
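The setting described above — a single-hidden-layer network with m parameters trained by (discretized) gradient flow on a univariate target — can be sketched numerically. The following is a minimal illustration only: the ReLU activation, network width, step size, and target function are illustrative assumptions, not the thesis's actual experimental setup, and plain gradient descent with a small step stands in for continuous-time gradient flow.

```python
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # Illustrative smooth univariate target (an assumption, not from the thesis)
    return np.sin(2 * np.pi * x)

# Training grid on [0, 1]
x = np.linspace(0.0, 1.0, 256)
y = target(x)

def network(x, W, b, a):
    # f(x) = sum_k a_k * relu(W_k * x + b_k): one hidden layer, ~3*width parameters
    return np.maximum(W[:, None] * x[None, :] + b[:, None], 0.0).T @ a

def l2_error(W, b, a):
    return np.sqrt(np.mean((network(x, W, b, a) - y) ** 2))

width = 32
W = rng.normal(size=width)
b = rng.normal(size=width)
a = rng.normal(size=width) / width

err0 = l2_error(W, b, a)          # error at initialization

lr = 1e-2                         # small step size mimics gradient flow
for step in range(20000):
    pre = W[:, None] * x[None, :] + b[:, None]   # pre-activations, (width, n)
    act = np.maximum(pre, 0.0)
    res = act.T @ a - y                          # residual on the grid, (n,)
    # Gradients of the mean-squared loss with respect to each parameter group
    grad_a = act @ res / x.size
    mask = (pre > 0).astype(float)               # ReLU derivative
    grad_W = (mask * x[None, :]) @ res * a / x.size
    grad_b = mask @ res * a / x.size
    a -= lr * grad_a
    W -= lr * grad_W
    b -= lr * grad_b

print(f"L2 error: {err0:.4f} -> {l2_error(W, b, a):.4f}")
```

Repeating such a run over a range of widths and comparing the final errors against m^(-1/4) is the kind of experiment the abstract alludes to when it says rates beyond 1/4 are observed in practice.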

Graduation Date

2022

Semester

Summer

Advisor

Welper, Gerrit

Degree

Master of Science (M.S.)

College

College of Sciences

Department

Mathematics

Degree Program

Mathematical Science

Identifier

CFE0009174; DP0026770

URL

https://purls.library.ucf.edu/go/DP0026770

Language

English

Release Date

August 2022

Length of Campus-only Access

None

Access Status

Masters Thesis (Open Access)

Included in

Mathematics Commons
