Empirical studies in information visualization: Seven scenarios

Heidi Lam, Enrico Bertini, Petra Isenberg, Catherine Plaisant, Sheelagh Carpendale

    Research output: Contribution to journal › Article

    Abstract

    We take a new, scenario-based look at evaluation in information visualization. Our seven scenarios (evaluating visual data analysis and reasoning, evaluating user performance, evaluating user experience, evaluating environments and work practices, evaluating communication through visualization, evaluating visualization algorithms, and evaluating collaborative data analysis) were derived through an extensive literature review of over 800 visualization publications. These scenarios distinguish different study goals and types of research questions and are illustrated through example studies. Through this broad survey and the distillation of these scenarios, we make two contributions. First, we encapsulate the current practices in the information visualization research community; second, we provide a different approach to reaching decisions about what might be the most effective evaluation of a given information visualization. Scenarios can be used to choose appropriate research questions and goals, and the provided examples can be consulted for guidance on how to design one's own study.

    Original language: English (US)
    Article number: 6095544
    Pages (from-to): 1520-1536
    Number of pages: 17
    Journal: IEEE Transactions on Visualization and Computer Graphics
    ISSN: 1077-2626
    Publisher: IEEE Computer Society
    Volume: 18
    Issue number: 9
    DOI: 10.1109/TVCG.2011.279
    State: Published - 2012

    Keywords

    • evaluation
    • Information visualization

    ASJC Scopus subject areas

    • Computer Graphics and Computer-Aided Design
    • Software
    • Computer Vision and Pattern Recognition
    • Signal Processing

    Cite this

    Lam, H., Bertini, E., Isenberg, P., Plaisant, C., & Carpendale, S. (2012). Empirical studies in information visualization: Seven scenarios. IEEE Transactions on Visualization and Computer Graphics, 18(9), 1520-1536. [6095544]. https://doi.org/10.1109/TVCG.2011.279

