MiS Preprint Repository

We have decided to discontinue the publication of preprints on our preprint server as of 1 March 2024. The publication culture within mathematics has changed so much due to the rise of repositories such as arXiv (www.arxiv.org) that we encourage all institute members to make their preprints available there. The institute's repository in its previous form is therefore no longer necessary. The preprints published to date will remain available here, but no new preprints will be added.

MiS Preprint
55/2019

Best k-layer neural network approximations

Lek-Heng Lim, Mateusz Michałek and Yang Qi

Abstract

We investigate the geometry of the empirical risk minimization problem for k-layer neural networks. We provide examples showing that for the classical activation functions σ(x)=1/(1+exp(−x)) and σ(x)=tanh(x), there exists a subset of target functions of positive measure that do not have best approximations by a fixed number of layers of neural networks. In addition, we study in detail the properties of shallow networks, classifying the cases in which a best k-layer neural network approximation always exists or fails to exist for the ReLU activation σ(x)=max(0,x). We also determine the dimensions of shallow ReLU-activated networks.
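The setting can be made concrete with a small sketch. The Python code below is an illustration under assumed names and shapes, not code from the paper: it evaluates the empirical risk Σᵢ ‖tᵢ − ν_θ(sᵢ)‖² of a shallow (one-hidden-layer) network ν_θ for the three activations named in the abstract. The question the paper studies is whether the infimum of this risk over the weights θ is attained.

# Illustrative sketch (not from the paper): empirical risk of a shallow
# network nu_theta(s) = W2 @ sigma(W1 @ s + b1) + b2 under the three
# activations discussed in the abstract. All names and shapes are
# assumptions made for this example.
import numpy as np

def sigma_logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigma_tanh(x):
    return np.tanh(x)

def sigma_relu(x):
    return np.maximum(0.0, x)

def shallow_net(s, W1, b1, W2, b2, sigma):
    # One hidden layer: affine map, activation, affine map.
    return W2 @ sigma(W1 @ s + b1) + b2

def empirical_risk(samples, targets, params, sigma):
    # Sum of squared errors over the training set; the paper asks when
    # the infimum of this quantity over params is attained.
    W1, b1, W2, b2 = params
    return sum(np.sum((t - shallow_net(s, W1, b1, W2, b2, sigma)) ** 2)
               for s, t in zip(samples, targets))

# Example usage with illustrative dimensions (input dim 3, hidden dim 4,
# output dim 2, five training samples):
rng = np.random.default_rng(0)
samples = [rng.standard_normal(3) for _ in range(5)]
targets = [rng.standard_normal(2) for _ in range(5)]
params = (rng.standard_normal((4, 3)), rng.standard_normal(4),
          rng.standard_normal((2, 4)), rng.standard_normal(2))
print(empirical_risk(samples, targets, params, sigma_relu))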

Received:
Jul 16, 2019
Published:
Aug 8, 2019
MSC Codes:
92B20, 41A50, 41A30

Related publications

Journal article (2022, Repository Open Access)
Lek-Heng Lim, Mateusz Michałek and Yang Qi

Best k-layer neural network approximations

In: Constructive Approximation, 55 (2022) 1, pp. 583–604