TY - JOUR
T1 - Generalized extreme learning machine autoencoder and a new deep neural network
AU - Sun, Kai
AU - Zhang, Jiangshe
AU - Zhang, Chunxia
AU - Hu, Junying
N1 - Publisher Copyright:
© 2016 Elsevier B.V.
PY - 2017/3/22
Y1 - 2017/3/22
N2 - Extreme learning machine (ELM) is an efficient algorithm for training single-layer feed-forward neural networks (SLFNs). With the development of unsupervised learning in recent years, integrating ELM with an autoencoder has become a new approach to extracting features from unlabeled data. In this paper, we propose a new variant of the extreme learning machine autoencoder (ELM-AE), called the generalized extreme learning machine autoencoder (GELM-AE), which adds manifold regularization to the objective of ELM-AE. Experiments carried out on real-world data sets show that GELM-AE outperforms several state-of-the-art unsupervised learning algorithms, including k-means, Laplacian embedding (LE), spectral clustering (SC) and ELM-AE. Furthermore, we also propose a new deep neural network, called the multilayer generalized extreme learning machine autoencoder (ML-GELM), built by stacking several GELM-AEs to detect more abstract representations. The experimental results show that ML-GELM outperforms ELM and many other deep models, such as the multilayer ELM autoencoder (ML-ELM), the deep belief network (DBN) and the stacked autoencoder (SAE). Owing to its use of ELM, ML-GELM is also faster than DBN and SAE.
AB - Extreme learning machine (ELM) is an efficient algorithm for training single-layer feed-forward neural networks (SLFNs). With the development of unsupervised learning in recent years, integrating ELM with an autoencoder has become a new approach to extracting features from unlabeled data. In this paper, we propose a new variant of the extreme learning machine autoencoder (ELM-AE), called the generalized extreme learning machine autoencoder (GELM-AE), which adds manifold regularization to the objective of ELM-AE. Experiments carried out on real-world data sets show that GELM-AE outperforms several state-of-the-art unsupervised learning algorithms, including k-means, Laplacian embedding (LE), spectral clustering (SC) and ELM-AE. Furthermore, we also propose a new deep neural network, called the multilayer generalized extreme learning machine autoencoder (ML-GELM), built by stacking several GELM-AEs to detect more abstract representations. The experimental results show that ML-GELM outperforms ELM and many other deep models, such as the multilayer ELM autoencoder (ML-ELM), the deep belief network (DBN) and the stacked autoencoder (SAE). Owing to its use of ELM, ML-GELM is also faster than DBN and SAE.
KW - Deep neural network
KW - Extreme learning machine
KW - Generalized extreme learning machine autoencoder
KW - Manifold regularization
KW - Multilayer generalized extreme learning machine autoencoder
UR - https://www.scopus.com/pages/publications/85010723396
U2 - 10.1016/j.neucom.2016.12.027
DO - 10.1016/j.neucom.2016.12.027
M3 - Article
AN - SCOPUS:85010723396
SN - 0925-2312
VL - 230
SP - 374
EP - 381
JO - Neurocomputing
JF - Neurocomputing
ER -