Abstract
The higher-order Hebbian learning rule is the rule most often used in higher-order associative memory networks. By analyzing two dynamic properties of higher-order Hebbian-type networks, stability and attractivity, a deficiency of the higher-order Hebbian rule is shown: it is suitable only for orthogonal patterns. A new and better learning mechanism, the higher-order projection learning rule, is derived on the basis of rank-one tensor analysis. The new rule ensures the stability of any set of linearly independent patterns. An attractivity analysis of the network under the higher-order projection rule is also derived. Numerical simulations are presented to clarify the merits of higher-order associative memories and the potential applications of the new learning rule.
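To make the two storage rules concrete, here is a minimal numerical sketch of a second-order associative memory (the paper's general higher-order rules, specialized to order two for brevity). The Hebbian rule sums one rank-one tensor per stored pattern; `projection_store` is one plausible projection-style construction consistent with the abstract's claim about linearly independent patterns, not necessarily the paper's exact rule.

```python
import numpy as np

def hebbian_store(patterns):
    """Second-order Hebbian rule: each stored pattern x contributes the
    rank-one tensor x_i x_j x_k, so W[i,j,k] = sum_p x_i x_j x_k."""
    n = patterns.shape[1]
    W = np.zeros((n, n, n))
    for x in patterns:
        W += np.einsum('i,j,k->ijk', x, x, x)
    return W

def projection_store(patterns):
    """Projection-style rule (a hypothetical form, not the paper's exact
    construction): choose W with W @ (x kron x) = x for every stored x,
    which exists whenever the vectors x kron x are linearly independent."""
    Q = np.stack([np.kron(x, x) for x in patterns])  # (P, n^2)
    return patterns.T @ np.linalg.pinv(Q.T)          # (n, n^2)

def recall(update, s, steps=10):
    """Iterate s <- sgn(h(s)) until the state reaches a fixed point."""
    for _ in range(steps):
        h = update(s)
        s_new = np.where(h >= 0, 1.0, -1.0)
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Two mutually orthogonal bipolar patterns of length 8.
patterns = np.array([[1, 1, 1, 1, -1, -1, -1, -1],
                     [1, 1, -1, -1, 1, 1, -1, -1]], dtype=float)

W_heb = hebbian_store(patterns)
heb_update = lambda s: np.einsum('ijk,j,k->i', W_heb, s, s)

# Under the Hebbian rule, a probe with one flipped bit is attracted
# back to the stored (orthogonal) pattern.
probe = patterns[0].copy()
probe[0] = -1.0
print(np.array_equal(recall(heb_update, probe), patterns[0]))  # True

# The projection rule also stabilizes a non-orthogonal (but linearly
# independent) third pattern exactly.
pat3 = np.vstack([patterns, [[1, 1, 1, -1, -1, 1, -1, -1]]])
W_proj = projection_store(pat3)
proj_update = lambda s: W_proj @ np.kron(s, s)
print(all(np.array_equal(recall(proj_update, x.copy()), x) for x in pat3))  # True
```

Note the design choice in `projection_store`: flattening the second-order state into the Kronecker vector `x kron x` turns the higher-order network into a linear map on rank-one vectors, so the pseudoinverse gives exact recall whenever those vectors are linearly independent, mirroring the stability claim in the abstract.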
| Original language | English |
|---|---|
| Pages (from-to) | 17-30 |
| Number of pages | 14 |
| Journal | Neurocomputing |
| Volume | 50 |
| DOIs | |
| State | Published - Jan 2003 |
Keywords
- Associative memory
- Dynamics
- Higher order neural networks
- Learning rule
- Tensor of rank one