Estimates on compressed neural networks regression

Research output: Contribution to journal › Article › peer-review

7 Scopus citations

Abstract

When the number of neural elements n in a neural network exceeds the sample size m, the overfitting problem arises because there are more parameters than data points (more variables than constraints). To overcome this, we propose reducing the number of neural elements by using a compressed projection A, which need not satisfy the Restricted Isometry Property (RIP). By applying probability inequalities and the approximation properties of feedforward neural networks (FNNs), we prove that solving the FNN regression learning algorithm in the compressed domain, rather than the original domain, reduces the sample error at the price of an increased (but controlled) approximation error. Covering number theory is used to estimate the excess error, and an upper bound on the excess error is given.
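The scheme described in the abstract can be sketched as follows: build an overparameterized hidden-layer feature map with n > m units, multiply it by a random compressed projection A of lower dimension d < m, and solve the regression problem in the compressed domain. This is only an illustrative NumPy sketch; the choice of random sigmoid units, a Gaussian A, and plain least squares are assumptions for demonstration, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data: m samples of f(x) = sin(2*pi*x) plus noise
m = 50
x = rng.uniform(0.0, 1.0, size=(m, 1))
y = np.sin(2 * np.pi * x[:, 0]) + 0.05 * rng.normal(size=m)

# Overparameterized FNN feature map: n sigmoid units with n > m
# (random inner weights and biases are an illustrative choice)
n = 200
W = rng.normal(size=(1, n))
b = rng.normal(size=n)
Phi = 1.0 / (1.0 + np.exp(-(x @ W + b)))   # m x n hidden-layer outputs

# Compressed projection: random Gaussian A (d x n) with d < m;
# as in the abstract, no RIP condition is imposed on A
d = 20
A = rng.normal(size=(d, n)) / np.sqrt(d)
Phi_c = Phi @ A.T                           # m x d compressed features

# Solve least-squares regression in the compressed domain, so only
# d coefficients are fit instead of n > m
coef, *_ = np.linalg.lstsq(Phi_c, y, rcond=None)
y_hat = Phi_c @ coef
mse = np.mean((y - y_hat) ** 2)
```

With d < m the compressed system is overdetermined, which removes the more-variables-than-constraints degeneracy; the trade-off proved in the paper is a reduced sample error against a controlled increase in approximation error.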

Original language: English
Pages (from-to): 10-17
Number of pages: 8
Journal: Neural Networks
Volume: 63
DOIs
State: Published - 1 Mar 2015
Externally published: Yes

Keywords

  • Compressed projection
  • Neural networks
  • Regression learning

