### Abstract

In many applications, the use of Bayesian probability theory is problematic. The information needed for feasible calculation is unavailable. There are different methodologies for dealing with this problem, e.g., maximal entropy and Dempster-Shafer theory. If one can make independence assumptions, many of the problems disappear, and in fact, this is often the method of choice even when it is obviously incorrect. The notion of independence is a 0-1 concept, which implies that human guesses about its validity will not lead to robust systems. In this paper, we propose a fuzzy formulation of this concept. It should lend itself to probabilistic updating formulas by allowing heuristic estimation of the "degree of independence." We show how this can be applied to compute a new notion of conditional probability (we call this "extended conditional probability"). Given information, one typically has the choice of full conditioning (standard dependence) or ignoring the information (standard independence). We list some desiderata for extending this to allow a degree of conditioning. We then show how our formulation of degree of independence leads to a formula fulfilling these desiderata. After describing this formula, we show how it compares with other possible formulations of parameterized independence. In particular, we compare it to a linear interpolant, a higher power of a linear interpolant, and a notion originally presented by Hummel and Manevitz [Tenth Int. Joint Conf. on Artificial Intelligence, 1987]. Interestingly, it turns out that a transformation of the Hummel-Manevitz method and our "fuzzy" method are close approximations of each other. Two examples illustrate how fuzzy independence and extended conditional probability might be applied. The first shows how linguistic probabilities result from treating fuzzy independence as a linguistic variable. The second is an industrial example of troubleshooting on the shop floor.
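To make the "degree of conditioning" idea concrete, the following is a minimal sketch of the linear-interpolant baseline mentioned in the abstract (one of the formulations the paper compares against, not the authors' fuzzy formula). A parameter alpha in [0, 1] blends between ignoring the evidence (standard independence, alpha = 0) and fully conditioning on it (standard dependence, alpha = 1). All names here are illustrative.

```python
def extended_conditional(p_a: float, p_a_given_b: float, alpha: float) -> float:
    """Linear-interpolant baseline for a 'degree of conditioning' alpha in [0, 1].

    alpha = 0 -> treat A and B as independent: returns P(A).
    alpha = 1 -> fully condition on B:         returns P(A | B).
    Intermediate alpha blends the two estimates linearly.
    (Illustrative sketch only; the paper's fuzzy formulation differs.)
    """
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("alpha must lie in [0, 1]")
    return (1.0 - alpha) * p_a + alpha * p_a_given_b


# Example: prior P(A) = 0.3, fully conditioned P(A | B) = 0.9
assert extended_conditional(0.3, 0.9, 0.0) == 0.3  # standard independence
assert extended_conditional(0.3, 0.9, 1.0) == 0.9  # standard conditioning
```

The paper's desiderata concern precisely how such an interpolation should behave between the two endpoints; the abstract notes that a higher power of this interpolant and the Hummel-Manevitz construction are among the alternatives compared.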

Original language | English (US)
---|---
Pages (from-to) | 137-156
Number of pages | 20
Journal | Information Sciences
Volume | 90
Issue number | 1-4
State | Published - Apr 1996

### ASJC Scopus subject areas

- Artificial Intelligence
- Computer Science Applications
- Information Systems
- Information Systems and Management
- Statistics, Probability and Uncertainty
- Electrical and Electronic Engineering
- Statistics and Probability

### Cite this

Hoffman, M. E., Manevitz, L. M., & Wong, E. K. (1996). Fuzzy independence and extended conditional probability. *Information Sciences*, *90*(1-4), 137-156.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Fuzzy independence and extended conditional probability

AU - Hoffman, Mark E.

AU - Manevitz, L. M.

AU - Wong, Edward K.

PY - 1996/4

Y1 - 1996/4


UR - http://www.scopus.com/inward/record.url?scp=0030129296&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0030129296&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:0030129296

VL - 90

SP - 137

EP - 156

JO - Information Sciences

JF - Information Sciences

SN - 0020-0255

IS - 1-4

ER -