ALR-HT: A fast and efficient Lasso regression without hyperparameter tuning

  • Yuhang Wang
  • Bin Zou
  • Jie Xu
  • Chen Xu
  • Yuan Yan Tang

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

Lasso regression, known for its efficacy in high-dimensional data analysis and feature selection, is a cornerstone of supervised learning for regression estimation. However, hyperparameter tuning for Lasso regression is often time-consuming and susceptible to noisy data in big data scenarios. In this paper, we introduce a new Additive Lasso Regression without Hyperparameter Tuning (ALR-HT) by integrating Markov resampling with additive models. We estimate the generalization bounds of the proposed ALR-HT and establish its fast learning rate. Experimental results on benchmark datasets confirm that the proposed ALR-HT algorithm outperforms other algorithms in terms of total sampling and training time and mean squared error (MSE). We present some discussion of the ALR-HT algorithm and apply it to Ridge regression to show its versatility and effectiveness in regularized regression scenarios.
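
For context: Lasso solves min_β (1/2n)‖y − Xβ‖² + λ‖y‖₁-style penalized least squares, i.e. min_β (1/2n)‖y − Xβ‖² + λ‖β‖₁, where the regularization strength λ is conventionally chosen by cross-validation. The sketch below is a minimal illustration of that conventional tuned baseline, whose tuning cost is what ALR-HT is designed to remove; it is not the authors' algorithm, and the dataset sizes and scikit-learn workflow are assumptions made purely for illustration.

```python
# Minimal sketch of the conventional cross-validated Lasso baseline that a
# tuning-free method such as ALR-HT would be compared against. This is NOT
# the ALR-HT algorithm from the paper, only the standard workflow whose
# hyperparameter search is the time-consuming step the paper targets.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, LassoCV

# Synthetic high-dimensional data (dimensions are illustrative, not from the paper).
X, y = make_regression(n_samples=500, n_features=200, n_informative=10,
                       noise=5.0, random_state=0)

# Conventional approach: pick the regularization strength lambda (alpha in
# scikit-learn) by k-fold cross-validation -- the tuning step ALR-HT avoids.
cv_model = LassoCV(cv=5, random_state=0).fit(X, y)
print(f"selected alpha: {cv_model.alpha_:.4f}")

# Refit a plain Lasso at the selected alpha and report in-sample MSE.
model = Lasso(alpha=cv_model.alpha_).fit(X, y)
mse = np.mean((model.predict(X) - y) ** 2)
print(f"in-sample MSE: {mse:.4f}")
```

The k-fold search refits the model once per candidate λ per fold, which is exactly the cost that grows painful on large, noisy datasets and motivates a tuning-free alternative.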

Original language: English
Article number: 106885
Journal: Neural Networks
Volume: 181
DOIs
State: Published - Jan 2025
Externally published: Yes

Keywords

  • Additive models
  • Generalization bound
  • Hyperparameter tuning
  • Lasso regression
  • Markov resampling
  • Ridge regression
