A Simple and Effective Architecture Selection Method for Differentiable Architecture Search

Research output: Contribution to journal › Article › peer-review

Abstract

Although differentiable architecture search (DARTS) improves the search efficiency of neural architecture search (NAS), the widely applied magnitude-based selection method of DARTS frequently yields degenerate architectures with poor performance. Most existing works address this issue by improving the supernet's optimization to guarantee the applicability of the magnitude-based method, while little attention has been paid to the selection criterion used to obtain the final architecture. In this brief, we introduce a novel, simple, and effective architecture selection method, Manda (Magnitudes and activations), which estimates the contribution of an operation in an optimized supernet by both its architecture parameter's magnitude and the activation it generates. Notably, Manda effectively addresses the notorious degeneration issue in DARTS without any modification of the supernet's optimization procedure, indicating that the instability in DARTS can be attributed to the widely applied magnitude-based selection method. Experimental results on both the NAS-Bench-201 and DARTS search spaces show the effectiveness of our method.
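The abstract only sketches the idea at a high level: score each candidate operation on a supernet edge by combining its architecture parameter's magnitude with the activation it produces, rather than by magnitude alone. The sketch below is a hypothetical illustration of that idea, not the paper's actual formula; the softmax-times-activation-norm combination and all variable names are assumptions made for clarity.

```python
import numpy as np

def combined_score(alphas, activations):
    """Hypothetical magnitude-plus-activation score for candidate
    operations on one supernet edge. The softmax of the architecture
    parameters (the magnitude-based criterion) is reweighted by the
    L2 norm of each operation's output activation. This is an
    illustrative combination, not the exact Manda criterion."""
    weights = np.exp(alphas - alphas.max())
    weights /= weights.sum()                        # softmax magnitudes
    act_norms = np.array([np.linalg.norm(a) for a in activations])
    return weights * act_norms                      # per-operation score

# Toy edge with 3 candidate operations and synthetic activations.
alphas = np.array([1.5, 1.2, -0.3])
activations = [np.full(8, s) for s in (0.05, 2.0, 0.5)]

magnitude_pick = int(np.argmax(alphas))             # magnitude-only choice
combined_pick = int(np.argmax(combined_score(alphas, activations)))
```

In this toy example the magnitude-only criterion picks operation 0, whose output is nearly zero, while the combined score prefers operation 1, which actually contributes to the edge's output — the kind of disagreement the abstract attributes the degeneration issue to.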

Original language: English
Journal: IEEE Transactions on Artificial Intelligence
DOIs
State: Accepted/In press - 2025

Keywords

  • AutoML
  • differentiable architecture search
  • neural architecture search
  • neural networks

