
Conference Paper: Music Affect Recognition: The State-of-the-art and Lessons Learned

Title: Music Affect Recognition: The State-of-the-art and Lessons Learned
Authors: Hu, X; Yang, YH
Issue Date: 2012
Publisher: International Society for Music Information Retrieval Conference (ISMIR)
Citation: The 13th International Society for Music Information Retrieval Conference (ISMIR 2012) Tutorial, Porto, Portugal, 8 October 2012
Abstract: The affective aspect (popularly known as emotion or mood) of music information has gained fast-growing attention in the Music Information Retrieval (MIR) community. Recent years have witnessed an explosive growth of studies on music affect recognition. This tutorial offers ISMIR participants an opportunity to learn about a range of topics closely involved in the affective indexing of music and to discuss how findings and methods can (or cannot) be borrowed from and applied to other multimedia information types such as speech (audio), images (visual), and movies (audio-visual). Topics in this tutorial include: the most influential psychological models of human emotion; musical, personal, and situational factors of music listening that influence the perception and description of music affect; building emotion taxonomies from online music metadata and social media; best practices for constructing ground-truth datasets; approaches to and tools for automatic affect classification and regression; benchmarking and evaluation; a sample of deployed prototype systems; issues and challenges in affect analysis; and the common ground of affect in music, images, and movies. All the tools and systems covered in this tutorial are open source or freeware, and the datasets are available in transformed formats (due to the copyright of the audio and lyrics). The tutorial format includes lectures, group discussions, demonstrations of sample systems and technical results with illustrative musical examples, and spontaneous interaction between the presenters and the audience.
Persistent Identifier: http://hdl.handle.net/10722/257762


Dublin Core Record (DC Field: Value)

dc.contributor.author: Hu, X
dc.contributor.author: Yang, YH
dc.date.accessioned: 2018-08-14T07:04:17Z
dc.date.available: 2018-08-14T07:04:17Z
dc.date.issued: 2012
dc.identifier.citation: The 13th International Society for Music Information Retrieval Conference (ISMIR 2012) Tutorial, Porto, Portugal, 8 October 2012
dc.identifier.uri: http://hdl.handle.net/10722/257762
dc.language: eng
dc.publisher: International Society for Music Information Retrieval Conference (ISMIR)
dc.relation.ispartof: International Society for Music Information Retrieval Conference, ISMIR 2012
dc.title: Music Affect Recognition: The State-of-the-art and Lessons Learned
dc.type: Conference_Paper
dc.identifier.email: Hu, X: xiaoxhu@hku.hk
dc.identifier.authority: Hu, X=rp01711
dc.identifier.hkuros: 275124
dc.publisher.place: Porto, Portugal
