Music style analysis among Haydn, Mozart and Beethoven: An unsupervised machine learning approach

Ru Wen, Zheng Xie, Kai Chen, Ruoxuan Guo, Kuan Xu, Wenmin Huang, Jiyuan Tian, Jiang Wu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Different musicians have quite different styles, which are influenced by their different historical backgrounds, personalities, and experiences. In this paper, we propose an approach to extract melody-based features from sheet music, as well as an unsupervised clustering method for discovering music styles. Since existing corpora are not sufficient for this research in terms of completeness or data format, a new corpus of Haydn, Mozart and Beethoven in MusicXML format is created. By applying this approach, similar and different styles are discovered. The analysis results conform to the Implication-Realization model, one of the most significant modern theories of melodic expectation, which confirms the validity of our approach.
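The abstract's pipeline (melody-based features extracted from scores, then unsupervised clustering) might be sketched roughly as follows. Note the specifics here are illustrative assumptions, not the paper's actual method: the feature (a histogram of melodic intervals in semitones) and the plain k-means routine are stand-ins, and the toy pitch sequences are invented.

```python
from collections import Counter

def interval_histogram(pitches, max_interval=12):
    """Hypothetical melody feature: a normalized histogram of melodic
    intervals (in semitones), clipped to +/- max_interval.
    The paper's exact features are not specified in this record."""
    intervals = [max(-max_interval, min(max_interval, b - a))
                 for a, b in zip(pitches, pitches[1:])]
    counts = Counter(intervals)
    total = sum(counts.values()) or 1
    # Fixed-length vector over the interval range [-max_interval, +max_interval]
    return [counts.get(i, 0) / total
            for i in range(-max_interval, max_interval + 1)]

def kmeans(vectors, k=2, iters=20):
    """Plain k-means; returns one cluster label per input vector.
    Deterministic init for this sketch: centers spread across the data."""
    centers = [list(vectors[i * (len(vectors) - 1) // (k - 1)])
               for i in range(k)]
    labels = [0] * len(vectors)
    for _ in range(iters):
        # Assign each vector to its nearest center (squared Euclidean distance)
        labels = [min(range(k),
                      key=lambda c: sum((v - u) ** 2
                                        for v, u in zip(vec, centers[c])))
                  for vec in vectors]
        # Recompute each center as the mean of its assigned vectors
        for c in range(k):
            members = [vec for vec, lab in zip(vectors, labels) if lab == c]
            if members:
                centers[c] = [sum(col) / len(members)
                              for col in zip(*members)]
    return labels

# Toy melodies as MIDI pitch sequences (invented, not from the corpus):
# two stepwise lines and one leap-heavy line.
melodies = [
    [60, 62, 64, 65, 67, 65, 64, 62, 60],   # mostly stepwise
    [67, 65, 64, 62, 60, 62, 64, 65, 67],   # mostly stepwise
    [60, 72, 60, 72, 55, 67, 55, 67, 60],   # wide leaps
]
features = [interval_histogram(m) for m in melodies]
labels = kmeans(features, k=2)
print(labels)  # → [0, 0, 1]: the stepwise melodies cluster together
```

The same shape of pipeline scales to real scores: parse each MusicXML file into a pitch sequence, compute a feature vector per piece, and cluster the vectors; the cluster assignments can then be compared per composer.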

Original language: English
Title of host publication: 2017 ICMC/EMW - 43rd International Computer Music Conference and the 6th International Electronic Music Week
Publisher: Shanghai Conservatory of Music
Pages: 323-328
Number of pages: 6
ISBN (Electronic): 9780984527465
State: Published - 2017
Event: 43rd International Computer Music Conference, ICMC 2017 and the 6th International Electronic Music Week, EMW 2017 - Shanghai, China
Duration: 15 Oct 2017 - 20 Oct 2017

Publication series

Name: 2017 ICMC/EMW - 43rd International Computer Music Conference and the 6th International Electronic Music Week

Conference

Conference: 43rd International Computer Music Conference, ICMC 2017 and the 6th International Electronic Music Week, EMW 2017
Country/Territory: China
City: Shanghai
Period: 15/10/17 - 20/10/17
