Music Emotion Recognition, Yang, Yi-Hsuan
Resource Information
The item Music Emotion Recognition, Yang, Yi-Hsuan represents a specific, individual, material embodiment of a distinct intellectual or artistic creation found in Merrimack Valley Library Consortium.
This item is available to borrow from 1 library branch.
- Summary
- Providing a complete review of existing work on music emotion developed in psychology and engineering, Music Emotion Recognition explains how to account for the subjective nature of emotion perception in the development of automatic music emotion recognition (MER) systems. Among the first publications dedicated to automatic MER, it begins with a comprehensive introduction to the essential aspects of MER, including background, key techniques, and applications. This ground-breaking reference examines emotion from a dimensional perspective. It defines emotions in music as points in a 2D plane in terms of two of the most fundamental emotion dimensions according to psychologists: valence and arousal. The authors present a computational framework that generalizes emotion recognition from the categorical domain to real-valued 2D space. They also: introduce novel emotion-based music retrieval and organization methods; describe a ranking-based emotion annotation and model training method; present methods that integrate information extracted from lyrics, chord sequences, and genre metadata for improved accuracy; and consider an emotion-based music retrieval system that is particularly useful for mobile devices. The book details techniques for addressing issues related to the ambiguity and granularity of emotion description, the heavy cognitive load of emotion annotation, the subjectivity of emotion perception, and the semantic gap between low-level audio signals and high-level emotion perception. Complete with more than 360 useful references, 12 example MATLAB® codes, and a listing of key abbreviations and acronyms, this cutting-edge guide supplies the technical understanding and tools needed to develop your own automatic MER system based on the automatic recognition model.
- Language
- eng
- Edition
- 1st edition
- Extent
- 1 online resource (261 pages)
- Label
- Music Emotion Recognition
- Title
- Music Emotion Recognition
- Statement of responsibility
- Yang, Yi-Hsuan
- Creator name
- Yang, Yi-Hsuan
- Nature of contents
- dictionaries
- Related work or contributor name
- Chen, Homer
- O'Reilly Media Company
- Label
- Music Emotion Recognition, Yang, Yi-Hsuan
- Carrier category
- online resource
- Carrier category code
- cr
- Carrier MARC source
- rdacarrier
- Color
- multicolored
- Content category
- text
- Content type code
- txt
- Content type MARC source
- rdacontent
- Dimensions
- unknown
- Edition
- 1st edition
- Extent
- 1 online resource (261 pages)
- Form of item
- online
- Issuing body
- Made available through: O'Reilly Media Company.
- Media category
- computer
- Media MARC source
- rdamedia
- Media type code
- c
- Reproduction note
- Electronic reproduction.
- Specific material designation
- remote
- System control number
- (CaSebORM)9781439850466
- System details
- Mode of access: World Wide Web