Pure and Applied Mathematics Quarterly

Volume 18 (2022)

Number 1

Special Issue in Honor of Bernie Shiffman

Guest Editors: Yuan Yuan, Christopher Sogge, and Steven Morris Zelditch

Holomorphic feedforward networks

Pages: 251–268

DOI: https://dx.doi.org/10.4310/PAMQ.2022.v18.n1.a7


Michael R. Douglas (Center of Mathematical Sciences and Applications (CMSA), Harvard University, Cambridge, Massachusetts, U.S.A.)


A very popular model in machine learning is the feedforward neural network (FFN). The FFN can approximate general functions and mitigate the curse of dimensionality. Here we introduce FFNs that represent sections of holomorphic line bundles on complex manifolds, and ask some questions about their approximating power. We also explain formal similarities between the standard approach to supervised learning and the problem of finding numerical Ricci flat Kähler metrics, which allow some ideas to be carried between the two problems.
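For readers unfamiliar with the model the abstract starts from, a feedforward network is simply a composition of affine maps and a pointwise nonlinearity. A minimal NumPy sketch (the architecture, shapes, and names here are illustrative only, not taken from the paper):

```python
import numpy as np

def ffn(x, weights, biases):
    """Minimal feedforward network: alternate affine maps with a tanh
    nonlinearity on the hidden layers, then apply a final affine layer."""
    for W, b in zip(weights[:-1], biases[:-1]):
        x = np.tanh(W @ x + b)
    return weights[-1] @ x + biases[-1]

# A two-hidden-layer network mapping R^3 -> R (dimensions are arbitrary).
rng = np.random.default_rng(0)
dims = [3, 8, 8, 1]
weights = [rng.standard_normal((dims[i + 1], dims[i])) for i in range(len(dims) - 1)]
biases = [rng.standard_normal(dims[i + 1]) for i in range(len(dims) - 1)]

y = ffn(rng.standard_normal(3), weights, biases)
print(y.shape)  # a single output value, shape (1,)
```

The paper's holomorphic variant replaces these real-valued layers with maps built from holomorphic data, so the sketch above is only the classical starting point.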

2010 Mathematics Subject Classification

Primary 32Q25. Secondary 65M99, 68Txx.

Received 31 July 2020

Accepted 3 May 2021

Published 10 February 2022