Deep interactive evolution

Philip Bontrager, Wending Lin, Julian Togelius, Sebastian Risi

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    This paper describes an approach that combines generative adversarial networks (GANs) with interactive evolutionary computation (IEC). While GANs can be trained to produce lifelike images, they are normally sampled randomly from the learned distribution, providing limited control over the resulting output. On the other hand, interactive evolution has shown promise in creating various artifacts such as images, music and 3D objects, but traditionally relies on a hand-designed evolvable representation of the target domain. The main insight in this paper is that a GAN trained on a specific target domain can act as a compact and robust genotype-to-phenotype mapping (i.e. most produced phenotypes do resemble valid domain artifacts). Once such a GAN is trained, the latent vector given as input to the GAN’s generator network can be put under evolutionary control, allowing controllable and high-quality image generation. In this paper, we demonstrate the advantage of this novel approach through a user study in which participants were able to evolve images that strongly resemble specific target images.
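    The loop the abstract describes — sample random latent vectors, render them through the generator, let the user pick favourites, then breed new latent vectors from the picks — can be sketched as follows. This is a minimal illustration, not the paper's implementation: `generator` here is a stand-in for a trained GAN generator network, the selection step is hard-coded where a human would click, and `LATENT_DIM`, `POP_SIZE`, and `MUTATION_STD` are assumed values.

    ```python
    import numpy as np

    LATENT_DIM = 100    # dimensionality of the GAN latent space (assumed)
    POP_SIZE = 8        # number of images shown to the user per generation
    MUTATION_STD = 0.5  # mutation strength (illustrative value)

    rng = np.random.default_rng(0)

    def generator(z):
        """Stand-in for a trained GAN generator G(z) -> image.
        Returns a deterministic toy 'image' derived from z."""
        return np.tanh(np.outer(z[:10], z[:10]))

    def next_generation(population, selected_indices):
        """Breed a new population from the user's selections.
        Selected latent vectors (genotypes) are kept unchanged;
        the rest of the population is filled with mutated offspring."""
        parents = [population[i] for i in selected_indices]
        children = list(parents)  # elitism: keep the chosen genotypes
        while len(children) < POP_SIZE:
            parent = parents[rng.integers(len(parents))]
            child = parent + rng.normal(0.0, MUTATION_STD, size=LATENT_DIM)
            children.append(child)
        return children

    # Initialise with random latent vectors, exactly as plain GAN sampling would.
    population = [rng.normal(size=LATENT_DIM) for _ in range(POP_SIZE)]

    # One interactive generation: render phenotypes, take the user's picks,
    # and evolve. In the real system the picks come from a human.
    images = [generator(z) for z in population]
    selected = [0, 3]  # pretend the user clicked images 0 and 3
    population = next_generation(population, selected)
    ```

    The key point is that evolution operates only on the latent vector, so every offspring is still decoded by the generator and therefore stays inside the learned distribution of valid domain artifacts.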

    Original language: English (US)
    Title of host publication: Computational Intelligence in Music, Sound, Art and Design - 7th International Conference, EvoMUSART 2018, Proceedings
    Publisher: Springer-Verlag
    Pages: 267-282
    Number of pages: 16
    ISBN (Print): 9783319775821
    DOI: 10.1007/978-3-319-77583-8_18
    State: Published - Jan 1 2018
    Event: 7th International Conference on Computational Intelligence in Music, Sound, Art and Design, EvoMUSART 2018 - Parma, Italy
    Duration: Apr 4 2018 - Apr 6 2018

    Publication series

    Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    Volume: 10783 LNCS
    ISSN (Print): 0302-9743
    ISSN (Electronic): 1611-3349



    ASJC Scopus subject areas

    • Theoretical Computer Science
    • Computer Science (all)

    Cite this

    Bontrager, P., Lin, W., Togelius, J., & Risi, S. (2018). Deep interactive evolution. In Computational Intelligence in Music, Sound, Art and Design - 7th International Conference, EvoMUSART 2018, Proceedings (pp. 267-282). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 10783 LNCS). Springer-Verlag. https://doi.org/10.1007/978-3-319-77583-8_18

