Approximation with random bases: Pro et Contra

Alexander N. Gorban, Ivan Yu Tyukin*, Danil V. Prokhorov, Konstantin I. Sofeikov

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

111 Citations (Scopus)

Abstract

In this work we discuss the problem of selecting suitable approximators from families of parameterized elementary functions that are known to be dense in a Hilbert space of functions. We consider and analyze published procedures, both randomized and deterministic, for selecting elements from these families; such procedures have been shown to ensure a rate of convergence in the L2 norm of order O(1/N), where N is the number of elements. We show that both randomized and deterministic procedures succeed when additional information about the family of functions to be approximated is provided. In the absence of such information, one may observe exponential growth in the number of terms needed to approximate the function, and/or extreme sensitivity of the approximation outcome to parameters. Implications of our analysis for applications of neural networks in modeling and control are illustrated with examples.
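To make the setting concrete, the sketch below builds a randomized approximator of the kind the abstract describes: N sigmoidal elements whose inner weights and biases are drawn at random, with only the outer coefficients fitted by least squares, so the discrete L2 error can be tracked as N grows. This is a minimal illustration, not the procedure analyzed in the paper; the target function, the uniform distribution of the inner parameters, and the scale 10.0 are arbitrary choices for the demo.

```python
# Minimal sketch of approximation with a random basis (assumed setup, not
# the paper's exact procedure): random sigmoidal features, outer weights
# fitted by least squares, L2 error reported as the basis size N grows.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 400)                    # evaluation grid on [-1, 1]
target = np.sin(3.0 * np.pi * x) * np.exp(-x**2)   # illustrative target function

def random_sigmoid_features(x, n, scale=10.0):
    """n sigmoids with random inner weights/biases: the 'random basis'."""
    w = rng.uniform(-scale, scale, size=n)         # random inner weights
    b = rng.uniform(-scale, scale, size=n)         # random biases
    return 1.0 / (1.0 + np.exp(-(np.outer(x, w) + b)))  # shape (len(x), n)

for n in (10, 50, 250, 1250):
    Phi = random_sigmoid_features(x, n)
    # Outer coefficients are the only trained parameters.
    c, *_ = np.linalg.lstsq(Phi, target, rcond=None)
    err = np.sqrt(np.mean((Phi @ c - target) ** 2))  # discrete L2 error
    print(f"N = {n:5d}   discrete L2 error = {err:.3e}")
```

Whether the observed error actually decays like O(1/N) depends on the match between the random basis and the target: as the abstract warns, for targets outside the relevant class, or for an ill-chosen distribution of the inner parameters, the number of terms required can grow very rapidly.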

Original language: English
Pages (from-to): 129-145
Number of pages: 17
Journal: INFORMATION SCIENCES
Volume: 364-365
Publication status: Published - 1 Oct 2016

Keywords

  • Approximation
  • Measure concentration
  • Neural networks
  • Random bases
