Geometry of polynomial neural networks

Kaie Kubjas, Jiayi Li, Maximilian Wiesmann

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

We study the expressivity and learning process of polynomial neural networks (PNNs) with monomial activation functions. The weights of the network parametrize the neuromanifold. We study certain neuromanifolds using tools from algebraic geometry: we give explicit descriptions as semialgebraic sets and characterize their Zariski closures, called neurovarieties. We study their dimension and associate an algebraic degree, the learning degree, to the neurovariety. The dimension serves as a geometric measure of the expressivity of the network, while the learning degree measures the complexity of training the network and provides upper bounds on the number of learnable functions. These theoretical results are accompanied by experiments.
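To make the setup concrete, here is a minimal sketch (not taken from the paper) of a polynomial neural network with a monomial activation; the function name `pnn` and the (2, 2, 1) architecture are illustrative choices. The coefficients of the output polynomial, as the weights vary, sweep out the neuromanifold described in the abstract.

```python
import numpy as np
import sympy as sp

def pnn(weights, x, r=2):
    """Forward pass of a polynomial neural network whose activation is the
    coordinatewise monomial map sigma(t) = t**r; the final layer is linear.
    `weights` is a list of weight matrices W_1, ..., W_L.
    (Illustrative sketch; names and architecture are assumptions.)"""
    h = sp.Matrix(x)
    for W in weights[:-1]:
        h = (sp.Matrix(W.tolist()) * h).applyfunc(lambda t: t**r)
    return (sp.Matrix(weights[-1].tolist()) * h).applyfunc(sp.expand)

# Architecture (2, 2, 1) with activation degree r = 2: the network computes
# a single homogeneous polynomial of degree r**(L-1) = 2 in the inputs.
x1, x2 = sp.symbols("x1 x2")
rng = np.random.default_rng(0)
W1, W2 = rng.standard_normal((2, 2)), rng.standard_normal((1, 2))
p = pnn([W1, W2], [x1, x2])
print(p[0])  # a*x1**2 + b*x1*x2 + c*x2**2; the coefficient vectors (a, b, c)
             # over all weight choices trace out the neuromanifold
```

Under these assumptions, the map from weight matrices to polynomial coefficients is itself polynomial, which is what makes the algebro-geometric tools (semialgebraic descriptions, Zariski closures) applicable.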
Original language: English
Pages (from-to): 295-328
Journal: Algebraic Statistics
Volume: 15
Issue number: 2
DOIs
Publication status: Published - 3 Dec 2024
MoE publication type: A1 Journal article-refereed

Keywords

  • neuromanifold
  • neural network expressivity
  • nonlinear network
  • semialgebraic sets
  • tensor decomposition
  • optimization
  • Euclidean distance degree

