Can you feel it? Evaluation of affective expression in music generated by MetaCompose

Marco Scirea, Peter Eklund, Julian Togelius, Sebastian Risi

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    This paper describes an evaluation of the MetaCompose music generator, which is based on evolutionary computation and uses a hybrid evolutionary technique that combines FI-2POP and multi-objective optimization. The main objective of MetaCompose is to create music in real time that can express different mood-states. The experiment presented here aims to evaluate: (i) whether the mood participants perceive in a music score matches the intended mood the system is trying to express, and (ii) whether participants can identify transitions in the mood expression that occur mid-piece. Music clips with static affective states and clips including transitions were produced by MetaCompose, and a quantitative user study was performed. Participants were tasked with annotating the perceived mood and were moreover asked to annotate changes in valence in real time. The data collected confirms the hypothesis that people can recognize changes in music mood and that MetaCompose can express perceptibly different levels of arousal. Regarding valence, we observe that, while it is mostly perceived as expected, changes in arousal also seem to influence perceived valence, suggesting that one or more of the music features MetaCompose associates with arousal has some effect on valence as well.
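    The hybrid technique the abstract names can be illustrated with a minimal sketch of the FI-2POP idea (feasible-infeasible two-population evolution): infeasible genomes evolve to minimize constraint violation, feasible genomes evolve on the objective, and offspring migrate between the two populations according to their feasibility. The genome, constraint, and objective below are hypothetical toys for illustration, not the actual MetaCompose implementation, which additionally layers multi-objective optimization on the feasible population.

    ```python
    # Illustrative FI-2POP sketch: two populations evolved in parallel.
    # All domain details (genome, constraint, objective) are hypothetical.
    import random

    GENOME_LEN = 8

    def constraint_violation(genome):
        # Toy constraint: every gene must be non-negative.
        # Returns 0 for feasible genomes, a positive penalty otherwise.
        return sum(-g for g in genome if g < 0)

    def objective(genome):
        # Toy objective to maximize on the feasible population.
        return sum(genome)

    def mutate(genome):
        # Perturb one randomly chosen gene.
        g = list(genome)
        g[random.randrange(len(g))] += random.uniform(-1.0, 1.0)
        return g

    def fi2pop_step(feasible, infeasible, keep=10):
        """One generation: breed from both populations, route offspring
        by feasibility, then truncation-select each population on its
        own score (objective vs. constraint violation)."""
        parents = feasible + infeasible
        offspring = [mutate(random.choice(parents)) for _ in parents]
        for child in offspring:
            if constraint_violation(child) == 0:
                feasible.append(child)      # migrates to feasible pop
            else:
                infeasible.append(child)    # evolves toward feasibility
        feasible.sort(key=objective, reverse=True)
        infeasible.sort(key=constraint_violation)  # lower is better
        return feasible[:keep], infeasible[:keep]

    random.seed(0)
    pop = [[random.uniform(-1, 1) for _ in range(GENOME_LEN)]
           for _ in range(20)]
    feasible = [g for g in pop if constraint_violation(g) == 0]
    infeasible = [g for g in pop if constraint_violation(g) > 0]
    for _ in range(50):
        feasible, infeasible = fi2pop_step(feasible, infeasible)
    print("feasible genomes:", len(feasible))
    ```

    The key design point is that infeasible individuals are not discarded: they are selected for low constraint violation, so they can drift back into the feasible population and preserve genetic diversity that a single penalized population would lose.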

    Original language: English (US)
    Title of host publication: GECCO 2017 - Proceedings of the 2017 Genetic and Evolutionary Computation Conference
    Publisher: Association for Computing Machinery, Inc
    Pages: 211-218
    Number of pages: 8
    ISBN (Electronic): 9781450349208
    DOI: 10.1145/3071178.3071314
    State: Published - Jul 1 2017
    Event: 2017 Genetic and Evolutionary Computation Conference, GECCO 2017 - Berlin, Germany
    Duration: Jul 15 2017 - Jul 19 2017

    Other

    Other: 2017 Genetic and Evolutionary Computation Conference, GECCO 2017
    Country: Germany
    City: Berlin
    Period: 7/15/17 - 7/19/17


    Keywords

    • Affective Computing
    • Music generation
    • Quantitative evaluation

    ASJC Scopus subject areas

    • Software
    • Computer Science Applications
    • Computational Theory and Mathematics

    Cite this

    Scirea, M., Eklund, P., Togelius, J., & Risi, S. (2017). Can you feel it? Evaluation of affective expression in music generated by MetaCompose. In GECCO 2017 - Proceedings of the 2017 Genetic and Evolutionary Computation Conference (pp. 211-218). Association for Computing Machinery, Inc. https://doi.org/10.1145/3071178.3071314
