Moody music generator

Characterising control parameters using crowdsourcing

Marco Scirea, Mark J. Nelson, Julian Togelius

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    We characterise the expressive effects of a music generator capable of varying its moods through two control parameters. The two control parameters were constructed on the basis of existing work on valence and arousal in music, and intended to provide control over those two mood factors. In this paper we conduct a listener study to determine how people actually perceive the various moods the generator can produce. Rather than directly attempting to validate that our two control parameters represent arousal and valence, instead we conduct an open-ended study to crowd-source labels characterising different parts of this two-dimensional control space. Our aim is to characterise perception of the generator’s expressive space, without constraining listeners’ responses to labels specifically aimed at validating the original arousal/valence motivation. Subjects were asked to listen to clips of generated music over the Internet, and to describe the moods with free-text labels. We find that the arousal parameter does roughly map to perceived arousal, but that the nominal “valence” parameter has strong interaction with the arousal parameter, and produces different effects in different parts of the control space. We believe that the characterisation methodology described here is general and could be used to map the expressive range of other parameterisable generators.

    Original language: English (US)
    Title of host publication: Evolutionary and Biologically Inspired Music, Sound, Art and Design - 4th International Conference, EvoMUSART 2015, Proceedings
    Publisher: Springer Verlag
    Pages: 200-211
    Number of pages: 12
    Volume: 9027
    ISBN (Print): 9783319164977
    DOI: 10.1007/978-3-319-16498-4_18
    State: Published - 2015
    Event: 4th International Conference on Evolutionary and Biologically Inspired Music, Sound, Art and Design, EvoMUSART 2015 - Copenhagen, Denmark
    Duration: Apr 8 2015 - Apr 10 2015

    Publication series

    Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    Volume: 9027
    ISSN (Print): 0302-9743
    ISSN (Electronic): 1611-3349

    Other

    Other: 4th International Conference on Evolutionary and Biologically Inspired Music, Sound, Art and Design, EvoMUSART 2015
    Country: Denmark
    City: Copenhagen
    Period: 4/8/15 - 4/10/15

    ASJC Scopus subject areas

    • Computer Science (all)
    • Theoretical Computer Science

    Cite this

    Scirea, M., Nelson, M. J., & Togelius, J. (2015). Moody music generator: Characterising control parameters using crowdsourcing. In Evolutionary and Biologically Inspired Music, Sound, Art and Design - 4th International Conference, EvoMUSART 2015, Proceedings (Vol. 9027, pp. 200-211). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 9027). Springer Verlag. https://doi.org/10.1007/978-3-319-16498-4_18

    @inproceedings{955cafdde703440e8de9699dfe4e4373,
    title = "Moody music generator: Characterising control parameters using crowdsourcing",
    abstract = "We characterise the expressive effects of a music generator capable of varying its moods through two control parameters. The two control parameters were constructed on the basis of existing work on valence and arousal in music, and intended to provide control over those two mood factors. In this paper we conduct a listener study to determine how people actually perceive the various moods the generator can produce. Rather than directly attempting to validate that our two control parameters represent arousal and valence, instead we conduct an open-ended study to crowd-source labels characterising different parts of this two-dimensional control space. Our aim is to characterise perception of the generator’s expressive space, without constraining listeners’ responses to labels specifically aimed at validating the original arousal/valence motivation. Subjects were asked to listen to clips of generated music over the Internet, and to describe the moods with free-text labels. We find that the arousal parameter does roughly map to perceived arousal, but that the nominal “valence” parameter has strong interaction with the arousal parameter, and produces different effects in different parts of the control space. We believe that the characterisation methodology described here is general and could be used to map the expressive range of other parameterisable generators.",
    author = "Marco Scirea and Nelson, {Mark J.} and Julian Togelius",
    year = "2015",
    doi = "10.1007/978-3-319-16498-4_18",
    language = "English (US)",
    isbn = "9783319164977",
    volume = "9027",
    series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
    publisher = "Springer Verlag",
    pages = "200--211",
    booktitle = "Evolutionary and Biologically Inspired Music, Sound, Art and Design - 4th International Conference, EvoMUSART 2015, Proceedings",
    }

    TY - GEN

    T1 - Moody music generator

    T2 - Characterising control parameters using crowdsourcing

    AU - Scirea, Marco

    AU - Nelson, Mark J.

    AU - Togelius, Julian

    PY - 2015

    Y1 - 2015

    N2 - We characterise the expressive effects of a music generator capable of varying its moods through two control parameters. The two control parameters were constructed on the basis of existing work on valence and arousal in music, and intended to provide control over those two mood factors. In this paper we conduct a listener study to determine how people actually perceive the various moods the generator can produce. Rather than directly attempting to validate that our two control parameters represent arousal and valence, instead we conduct an open-ended study to crowd-source labels characterising different parts of this two-dimensional control space. Our aim is to characterise perception of the generator’s expressive space, without constraining listeners’ responses to labels specifically aimed at validating the original arousal/valence motivation. Subjects were asked to listen to clips of generated music over the Internet, and to describe the moods with free-text labels. We find that the arousal parameter does roughly map to perceived arousal, but that the nominal “valence” parameter has strong interaction with the arousal parameter, and produces different effects in different parts of the control space. We believe that the characterisation methodology described here is general and could be used to map the expressive range of other parameterisable generators.

    AB - We characterise the expressive effects of a music generator capable of varying its moods through two control parameters. The two control parameters were constructed on the basis of existing work on valence and arousal in music, and intended to provide control over those two mood factors. In this paper we conduct a listener study to determine how people actually perceive the various moods the generator can produce. Rather than directly attempting to validate that our two control parameters represent arousal and valence, instead we conduct an open-ended study to crowd-source labels characterising different parts of this two-dimensional control space. Our aim is to characterise perception of the generator’s expressive space, without constraining listeners’ responses to labels specifically aimed at validating the original arousal/valence motivation. Subjects were asked to listen to clips of generated music over the Internet, and to describe the moods with free-text labels. We find that the arousal parameter does roughly map to perceived arousal, but that the nominal “valence” parameter has strong interaction with the arousal parameter, and produces different effects in different parts of the control space. We believe that the characterisation methodology described here is general and could be used to map the expressive range of other parameterisable generators.

    UR - http://www.scopus.com/inward/record.url?scp=84925275050&partnerID=8YFLogxK

    UR - http://www.scopus.com/inward/citedby.url?scp=84925275050&partnerID=8YFLogxK

    U2 - 10.1007/978-3-319-16498-4_18

    DO - 10.1007/978-3-319-16498-4_18

    M3 - Conference contribution

    SN - 9783319164977

    VL - 9027

    T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

    SP - 200

    EP - 211

    BT - Evolutionary and Biologically Inspired Music, Sound, Art and Design - 4th International Conference, EvoMUSART 2015, Proceedings

    PB - Springer Verlag

    ER -