RuleMatrix: Visualizing and Understanding Classifiers with Rules

Yao Ming, Huamin Qu, Enrico Bertini

    Research output: Contribution to journal › Article

    Abstract

    With the growing adoption of machine learning techniques, there is a surge of research interest in making machine learning systems more transparent and interpretable. Various visualizations have been developed to help model developers understand, diagnose, and refine machine learning models. However, a large group of potential but neglected users are domain experts who have little knowledge of machine learning yet are expected to work with machine learning systems. In this paper, we present an interactive visualization technique to help users with little expertise in machine learning to understand, explore, and validate predictive models. By viewing the model as a black box, we extract a standardized rule-based knowledge representation from its input-output behavior. Then, we design RuleMatrix, a matrix-based visualization of rules that helps users navigate and verify the rules and the black-box model. We evaluate the effectiveness of RuleMatrix via two use cases and a usability study.
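The core idea in the abstract, probing a black box and distilling its input-output behavior into rules, can be illustrated with a minimal sketch. This is a hypothetical toy (a single-feature threshold rule fitted to a stand-in model), not the paper's actual rule-extraction pipeline; all names and the black-box function are illustrative assumptions.

```python
# Hedged sketch: mimic an opaque classifier with a simple
# "IF feature > threshold THEN class 1" rule learned purely from
# the model's input-output behavior. Illustrative only; the paper's
# method extracts a richer standardized rule representation.
import random

def black_box(x):
    # Stand-in for any opaque model: here, a hidden linear decision.
    return 1 if 0.8 * x[0] + 0.6 * x[1] > 0.7 else 0

# Probe the black box on sampled inputs (in practice one would use
# the model's training distribution).
random.seed(0)
samples = [(random.random(), random.random()) for _ in range(500)]
labels = [black_box(x) for x in samples]

def best_threshold_rule(samples, labels):
    """Return (feature, threshold, fidelity) for the single-feature
    rule that best agrees with the black box on the probe set."""
    best = (0, 0.0, 0.0)
    for f in range(2):
        for t in sorted({x[f] for x in samples}):
            preds = [1 if x[f] > t else 0 for x in samples]
            fidelity = sum(p == y for p, y in zip(preds, labels)) / len(labels)
            if fidelity > best[2]:
                best = (f, t, fidelity)
    return best

feature, threshold, fidelity = best_threshold_rule(samples, labels)
print(f"IF x[{feature}] > {threshold:.2f} THEN class 1  (fidelity {fidelity:.0%})")
```

Fidelity here measures how often the extracted rule agrees with the black box, which is the quantity a user would inspect in a rule-based explanation before trusting it.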

    Original language: English (US)
    Journal: IEEE Transactions on Visualization and Computer Graphics
    Publication status: Accepted/In press - Aug 17, 2018

    Keywords

    • explainable machine learning
    • rule visualization
    • visual analytics

    ASJC Scopus subject areas

    • Software
    • Signal Processing
    • Computer Vision and Pattern Recognition
    • Computer Graphics and Computer-Aided Design
