Some efficient strategies for improving the eigenstructure method in synthesis of feedback neural networks

Research output: Contribution to journal › Article › peer-review


Abstract

Two efficient strategies are proposed for improving the eigenstructure method from the viewpoint of best-approximation projectors. By interpreting the method in terms of two complementary best-approximation projectors, we reformulate it in a much simpler form. We then develop a new synthesis procedure that constructs the related best-approximation projectors with a simple recursive formula. This procedure improves on the existing eigenstructure method not only by significantly reducing computational complexity but also by incorporating a learning capability comparable to that of the outer-product method. Networks designed by the proposed procedure outperform those designed by several other known methods. We also propose a new forgetting algorithm for deleting any specified stored memories from a synthesized network. The algorithm performs efficiently and reliably and, in particular, eliminates the overforgetting drawback of the Yen-Michel algorithm. The feasibility and effectiveness of the algorithm are supported by theoretical analysis and computer simulations.
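The paper's own recursive formula is not reproduced in this abstract; as a hedged illustration only, a standard rank-one recursion for building the orthogonal (best-approximation) projector onto the span of a set of stored patterns, of the kind used in eigenstructure synthesis of feedback networks, might look like the following sketch (function and variable names are hypothetical, not taken from the paper):

```python
import numpy as np

def recursive_projector(patterns, tol=1e-10):
    """Recursively build the orthogonal projector onto span(patterns).

    At each step the next pattern is projected onto the orthogonal
    complement of the span accumulated so far, and a rank-one term
    is added:  P_k = P_{k-1} + v v^T / (v^T v),  v = (I - P_{k-1}) x_k.
    Linearly dependent patterns contribute nothing and are skipped,
    so each new pattern is "learned" incrementally.
    """
    n = len(patterns[0])
    P = np.zeros((n, n))
    for x in patterns:
        v = x - P @ x              # component of x outside the current span
        nv2 = float(v @ v)
        if nv2 > tol:              # skip patterns already in the span
            P += np.outer(v, v) / nv2
    return P

# Example: projector onto the span of two bipolar patterns.
x1 = np.array([1.0, 1.0, -1.0, 1.0])
x2 = np.array([1.0, -1.0, 1.0, 1.0])
P = recursive_projector([x1, x2])
```

The stored patterns are fixed points of the projector (`P @ x1 == x1`), and `P` is symmetric and idempotent, which is the defining property of a best-approximation (orthogonal) projector; the incremental rank-one update is what avoids recomputing a full eigendecomposition when a pattern is added.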

Original language: English
Pages (from-to): 233-245
Number of pages: 13
Journal: IEEE Transactions on Neural Networks
Volume: 7
Issue number: 1
DOIs
State: Published - 1996
