Deep attention based music genre classification

  • Yang Yu
  • Sen Luo
  • Shenglan Liu
  • Hong Qiao
  • Yang Liu
  • Lin Feng

Research output: Contribution to journal › Article › peer-review

94 Scopus citations

Abstract

As an important component of music information retrieval, music genre classification has attracted great attention in recent years. Benefiting from the outstanding performance of deep neural networks in computer vision, some researchers apply CNNs to music genre classification tasks, taking audio spectrograms, which share similarities with RGB images, as input. These methods rest on a latent assumption that spectra at different temporal steps are equally important. However, this contradicts the theory of the processing bottleneck in psychology, as well as our observations of audio spectrograms. Accounting for the differences between spectra, we propose a new model that incorporates an attention mechanism based on a Bidirectional Recurrent Neural Network. Furthermore, two attention-based models (serial attention and parallelized attention) are implemented in this paper. Compared with serial attention, parallelized attention is more flexible and achieves better results in our experiments. In particular, the CNN-based parallelized attention models taking STFT spectrograms as input outperform previous work.
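The abstract's core idea is that spectrogram time steps should be weighted unequally rather than averaged uniformly. A minimal sketch of such attention-based temporal pooling is shown below; the array sizes, the random features standing in for BiRNN outputs, and the single learned attention vector `w` are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, w):
    """Collapse T frame features (T, d) into one clip-level vector (d,)
    using attention weights instead of a uniform average."""
    scores = H @ w           # (T,): relevance score per time step
    alpha = softmax(scores)  # attention distribution over frames
    return alpha @ H, alpha  # weighted sum of frames, plus the weights

T, d = 128, 64                     # frames and feature dim (illustrative)
H = rng.standard_normal((T, d))    # stand-in for BiRNN frame outputs
w = rng.standard_normal(d)         # hypothetical learned attention vector

clip_vec, alpha = attention_pool(H, w)
assert clip_vec.shape == (d,)      # one fixed-size vector per clip
assert np.isclose(alpha.sum(), 1.0)  # weights form a distribution
```

The clip-level vector would then feed a genre classifier; uniform mean pooling corresponds to the special case where every `alpha` entry equals `1/T`.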

Original language: English
Pages (from-to): 84-91
Number of pages: 8
Journal: Neurocomputing
Volume: 372
DOIs
State: Published - 8 Jan 2020
Externally published: Yes

Keywords

  • Deep neural networks
  • Music genre classification
  • Parallelized attention
  • Serial attention
