Fuzzy independence and extended conditional probability

Mark E. Hoffman, L. M. Manevitz, Edward K. Wong

    Research output: Contribution to journal › Article

    Abstract

    In many applications, the use of Bayesian probability theory is problematical: the information needed to carry out the calculations feasibly is unavailable. There are different methodologies for dealing with this problem, e.g., maximal entropy and Dempster-Shafer theory. If one can make independence assumptions, many of the problems disappear, and in fact this is often the method of choice even when it is obviously incorrect. The notion of independence is a 0-1 concept, which implies that human guesses about its validity will not lead to robust systems. In this paper, we propose a fuzzy formulation of this concept. It should lend itself to probabilistic updating formulas by allowing heuristic estimation of the "degree of independence." We show how this can be applied to compute a new notion of conditional probability (we call this "extended conditional probability"). Given information, one typically has the choice of fully conditioning on it (standard dependence) or ignoring it (standard independence). We list some desiderata for extending this choice to allow a degree of conditioning. We then show how our formulation of the degree of independence leads to a formula fulfilling these desiderata. After describing this formula, we show how it compares with other possible formulations of parameterized independence. In particular, we compare it to a linear interpolant, to a higher power of a linear interpolant, and to a notion originally presented by Hummel and Manevitz [Tenth Int. Joint Conf. on Artificial Intelligence, 1987]. Interestingly, it turns out that a transformation of the Hummel-Manevitz method and our "fuzzy" method are close approximations of each other. Two examples illustrate how fuzzy independence and extended conditional probability might be applied. The first shows how linguistic probabilities result from treating fuzzy independence as a linguistic variable. The second is an industrial example of troubleshooting on the shop floor.
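
    As an illustration only (not taken from the paper): one of the baseline formulations the abstract compares against is a linear interpolant between the two classical extremes, full conditioning P(A|B) and full independence P(A). A minimal sketch of that baseline, with a hypothetical "degree of independence" parameter alpha in [0, 1], follows; the function and variable names are assumptions made for this sketch, and the paper's own extended-conditional-probability formula is not reproduced here.

    def linear_extended_conditional(p_a, p_a_given_b, alpha):
        """Illustrative linear-interpolant baseline (not the paper's own formula).

        alpha = 1.0 treats A and B as fully independent and returns P(A);
        alpha = 0.0 conditions fully on B and returns P(A | B);
        intermediate values blend the two classical choices linearly.
        """
        if not 0.0 <= alpha <= 1.0:
            raise ValueError("alpha (degree of independence) must lie in [0, 1]")
        return alpha * p_a + (1.0 - alpha) * p_a_given_b

    # Example: P(A) = 0.3, P(A | B) = 0.7, partial dependence with alpha = 0.4
    print(linear_extended_conditional(0.3, 0.7, 0.4))  # 0.4*0.3 + 0.6*0.7 = 0.54

    With alpha = 1 the evidence B is ignored entirely; with alpha = 0 the result is ordinary full conditioning; intermediate values interpolate between the two, which is the kind of behavior the paper's desiderata and its own (different) formula are meant to capture.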

    Original language: English (US)
    Pages (from-to): 137-156
    Number of pages: 20
    Journal: Information Sciences
    Volume: 90
    Issue number: 1-4
    State: Published - Apr 1996

    ASJC Scopus subject areas

    • Artificial Intelligence
    • Computer Science Applications
    • Information Systems
    • Information Systems and Management
    • Statistics, Probability and Uncertainty
    • Electrical and Electronic Engineering
    • Statistics and Probability

    Cite this

    Hoffman, M. E., Manevitz, L. M., & Wong, E. K. (1996). Fuzzy independence and extended conditional probability. Information Sciences, 90(1-4), 137-156.

    @article{08a9d7f1f39e4af7aff6987e028bd9a5,
    title = "Fuzzy independence and extended conditional probability",
    abstract = "In many applications, the use of Bayesian probability theory is problematical: the information needed to carry out the calculations feasibly is unavailable. There are different methodologies for dealing with this problem, e.g., maximal entropy and Dempster-Shafer theory. If one can make independence assumptions, many of the problems disappear, and in fact this is often the method of choice even when it is obviously incorrect. The notion of independence is a 0-1 concept, which implies that human guesses about its validity will not lead to robust systems. In this paper, we propose a fuzzy formulation of this concept. It should lend itself to probabilistic updating formulas by allowing heuristic estimation of the {"}degree of independence.{"} We show how this can be applied to compute a new notion of conditional probability (we call this {"}extended conditional probability{"}). Given information, one typically has the choice of fully conditioning on it (standard dependence) or ignoring it (standard independence). We list some desiderata for extending this choice to allow a degree of conditioning. We then show how our formulation of the degree of independence leads to a formula fulfilling these desiderata. After describing this formula, we show how it compares with other possible formulations of parameterized independence. In particular, we compare it to a linear interpolant, to a higher power of a linear interpolant, and to a notion originally presented by Hummel and Manevitz [Tenth Int. Joint Conf. on Artificial Intelligence, 1987]. Interestingly, it turns out that a transformation of the Hummel-Manevitz method and our {"}fuzzy{"} method are close approximations of each other. Two examples illustrate how fuzzy independence and extended conditional probability might be applied. The first shows how linguistic probabilities result from treating fuzzy independence as a linguistic variable. The second is an industrial example of troubleshooting on the shop floor.",
    author = "Hoffman, {Mark E.} and Manevitz, {L. M.} and Wong, {Edward K.}",
    year = "1996",
    month = "4",
    language = "English (US)",
    volume = "90",
    pages = "137--156",
    journal = "Information Sciences",
    issn = "0020-0255",
    publisher = "Elsevier Inc.",
    number = "1-4",

    }
