Abstract
Dimensionality reduction is an effective way to alleviate the curse of dimensionality in high-dimensional data. As a popular self-supervised learning paradigm, contrastive learning has recently garnered considerable attention. In this paper, we propose NCLDR: Nearest-Neighbor Contrastive Learning with Dual Correlation Loss for Dimensionality Reduction, a novel method that ports the contrastive learning framework to the specific task of dimensionality reduction. First, NCLDR uses nearest neighbors to construct feature pairs from the training set itself. Then, a simple multi-layer perceptron (MLP) architecture with a dual correlation loss is designed to decorrelate feature variables while producing representations that are invariant across such pairs. Compared with most dimensionality reduction methods, NCLDR bypasses the complexity of optimizing kNN graphs and facilitates the embedding of out-of-sample data. It also alleviates the issue of "dimensional collapse" in the low-dimensional representation space. Finally, experimental results demonstrate that the proposed method achieves significant improvements over state-of-the-art dimensionality reduction methods.
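The two ingredients named in the abstract — nearest-neighbor pair construction from the training set itself, and a loss that decorrelates feature dimensions while keeping paired representations aligned — can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the exact form of NCLDR's dual correlation loss is not given here, so the sketch below assumes a Barlow Twins-style cross-correlation objective (diagonal pulled toward 1, off-diagonal toward 0), and all function names are hypothetical.

```python
import numpy as np

def nearest_neighbor_pairs(X):
    """Form positive pairs by matching each sample with its nearest
    neighbor in the training set itself (self-matches excluded)."""
    # Pairwise squared Euclidean distances via the expansion
    # ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    np.fill_diagonal(d2, np.inf)  # a sample is not its own neighbor
    nn_idx = np.argmin(d2, axis=1)
    return X, X[nn_idx]

def dual_correlation_loss(Za, Zb, lam=5e-3):
    """Assumed decorrelation objective (Barlow Twins-style): drive the
    cross-correlation matrix of the two embedding batches toward the
    identity, so paired views agree (diagonal -> 1) while distinct
    feature dimensions decorrelate (off-diagonal -> 0), which is one
    way to counteract dimensional collapse."""
    n = Za.shape[0]
    # Standardize each embedding dimension over the batch.
    Za = (Za - Za.mean(axis=0)) / (Za.std(axis=0) + 1e-8)
    Zb = (Zb - Zb.mean(axis=0)) / (Zb.std(axis=0) + 1e-8)
    C = Za.T @ Zb / n  # cross-correlation matrix, shape (d, d)
    on_diag = np.sum((np.diag(C) - 1.0) ** 2)
    off_diag = np.sum(C ** 2) - np.sum(np.diag(C) ** 2)
    return on_diag + lam * off_diag
```

In a training loop, the two members of each pair would be passed through the MLP encoder and the loss applied to the resulting low-dimensional embeddings; because inference is a single forward pass, out-of-sample points embed without re-optimizing any kNN graph.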
| Field | Value |
|---|---|
| Original language | English |
| Article number | 127848 |
| Journal | Neurocomputing |
| Volume | 594 |
| DOIs | |
| State | Published - 14 Aug 2024 |
Keywords
- Contrastive learning
- Dimensionality reduction
- Neighbor embedding