Exploiting Online Music Tags for Music Emotion Classification

YU-CHING LIN, National Taiwan University
YI-HSUAN YANG, Academia Sinica
HOMER H. CHEN, National Taiwan University



Publisher
Association for Computing Machinery
Copyright
Copyright © 2011 ACM, Inc.
ISSN
1551-6857
DOI
10.1145/2037676.2037683

Abstract

The online repository of music tags provides a rich source of semantic descriptions useful for training emotion-based music classifiers. However, the imbalance of the online tags affects the performance of emotion classification. In this paper, we present a novel data-sampling method that eliminates the imbalance but still takes the prior probability of each emotion class into account. In addition, a two-layer emotion classification structure is proposed to harness the genre information available in the online repository of music tags. We show that genre-based grouping as a precursor greatly improves the performance of emotion classification. On average, the incorporation of online genre tags improves the performance of emotion classification by 55% over the conventional single-layer system. The performance of our algorithm for classifying 183 emotion classes reaches 0.36 in example-based F-score.

Categories and Subject Descriptors: H.3.1 [Information Storage and Retrieval]: Information Search and Retrieval—Retrieval models; H.5.5 [Information Interfaces and Presentation]: Sound and Music Computing—Modeling, Systems

General Terms: Algorithms, Performance, Human Factors
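Since the full text sits behind the publisher's site, the abstract alone only suggests the shape of the two ideas. The sketch below is a minimal illustration of one plausible reading, not the authors' implementation: the helper sample_with_prior, the class TwoLayerEmotionClassifier, the choice of scikit-learn logistic regression, and the assumption that each track carries a single genre tag at prediction time are all illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression


def sample_with_prior(X, y, prior, rng=None):
    """Undersample negatives of one binary emotion label so that the
    positive ratio follows the label's prior instead of a flat 50/50."""
    rng = rng or np.random.default_rng(0)
    pos = np.flatnonzero(y == 1)
    neg = np.flatnonzero(y == 0)
    # Keep all positives; draw negatives so pos/(pos+neg) is roughly `prior`.
    n_neg = min(len(neg), int(len(pos) * (1.0 - prior) / max(prior, 1e-6)))
    keep = np.concatenate([pos, rng.choice(neg, size=n_neg, replace=False)])
    return X[keep], y[keep]


class TwoLayerEmotionClassifier:
    """Layer 1: group tracks by their online genre tag.
    Layer 2: one binary emotion classifier per (genre, emotion) pair."""

    def __init__(self, n_emotions):
        self.n_emotions = n_emotions
        self.models = {}  # (genre, emotion index) -> fitted classifier

    def fit(self, X, Y, genres, priors):
        genres = np.asarray(genres)
        for g in np.unique(genres):
            idx = np.flatnonzero(genres == g)
            for e in range(self.n_emotions):
                Xg, yg = sample_with_prior(X[idx], Y[idx, e], priors[e])
                if len(np.unique(yg)) < 2:  # label absent in this genre group
                    continue
                self.models[(g, e)] = LogisticRegression(max_iter=1000).fit(Xg, yg)

    def predict(self, X, genres):
        Y_hat = np.zeros((len(X), self.n_emotions), dtype=int)
        for i, g in enumerate(genres):
            for e in range(self.n_emotions):
                clf = self.models.get((g, e))
                if clf is not None:
                    Y_hat[i, e] = clf.predict(X[i:i + 1])[0]
        return Y_hat
```

In this reading, "eliminating the imbalance" does not mean forcing an even split: the sampler ties each emotion's positive/negative ratio to that emotion's prior probability, and the genre layer simply routes each track to the per-genre models trained on its group before the multi-label emotion decision is made.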

Journal

ACM Transactions on Multimedia Computing, Communications, and Applications (TOMCCAP), Association for Computing Machinery

Published: Oct 1, 2011
