ITT63 ARTIFICIAL INTELLIGENCE UNIT III
Academic Year 2016-2017(EVEN SEM)
Reasoning under uncertainty: Logics of non-monotonic reasoning - Implementation- Basic
probability notation - Bayes rule – Certainty factors and rule based systems-Bayesian networks –
Dempster - Shafer Theory - Fuzzy Logic.
Uncertainty
Let action At = leave for airport t minutes before flight
Will At get me there on time?
Problems:
1) partial observability (road state, other drivers' plans, etc.)
2) noisy sensors (KCBS traffic reports)
3) uncertainty in action outcomes (flat tire, etc.)
4) immense complexity of modelling and predicting traffic
Hence a purely logical approach either
1) risks falsehood: "A25 will get me there on time"
or 2) leads to conclusions that are too weak for decision making:
"A25 will get me there on time if there's no accident on the bridge and it doesn't rain and my tires
remain intact, etc."
Methods for handling uncertainty
Default or nonmonotonic logic:
Assume my car does not have a flat tire
Assume A25 works unless contradicted by evidence
Issues: What assumptions are reasonable? How to handle contradiction?
Rules with fudge factors:
Issues: problems with combining the factors of chained rules, e.g., does Sprinkler cause Rain?
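The combination problem can be seen in a minimal Python sketch (the rules and factor values below are illustrative assumptions, not from the notes): if rule factors are naively chained by multiplication, evidence about the sprinkler ends up raising belief in rain, even though a sprinkler has nothing to do with rain.

```python
# Naive "fudge factor" rules, chained by multiplication.
rules = {
    ("Sprinkler", "WetGrass"): 0.99,  # Sprinkler -> WetGrass, factor 0.99
    ("WetGrass", "Rain"): 0.70,       # WetGrass -> Rain, factor 0.70
}

def chain(*facts):
    """Multiply the rule factors along a chain of facts."""
    belief = 1.0
    for pair in zip(facts, facts[1:]):
        belief *= rules[pair]
    return belief

# Blind chaining treats the sprinkler as evidence for rain:
print(round(chain("Sprinkler", "WetGrass", "Rain"), 3))  # 0.693
```

This is exactly the combination issue the notes point at: the rules are individually plausible, but the logic of combining them ignores the causal direction.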
Non-monotonic reasoning
A non-monotonic logic is a formal logic whose consequence relation is not monotonic. Most
studied formal logics have a monotonic consequence relation, meaning that adding a formula to a theory
never produces a reduction of its set of consequences. Intuitively, monotonicity indicates that learning a new
piece of knowledge cannot reduce the set of what is known.
A monotonic logic cannot handle various reasoning tasks such as reasoning by default
(consequences may be derived only because of lack of evidence of the contrary), abductive reasoning
(consequences are only deduced as most likely explanations), some important approaches to reasoning
about knowledge (the ignorance of a fact must be retracted when the fact becomes known), and belief
revision (new knowledge may contradict old beliefs).
Default reasoning
An example of a default assumption is that the typical bird flies. As a result, if a given animal is
known to be a bird, and nothing else is known, it can be assumed to be able to fly. The default
assumption must however be retracted if it is later learned that the considered animal is a penguin. This
example shows that a logic that models default reasoning should not be monotonic.
Logics formalizing default reasoning can be roughly divided into two categories: logics able to deal
with arbitrary default assumptions (default logic, defeasible logic, and answer set programming), and
logics that formalize the specific default assumption that facts not known to be true can be assumed
false by default (the closed-world assumption and circumscription).
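The birds-fly default above can be sketched in a few lines of Python (an illustrative toy, not a full default logic): the default conclusion is drawn unless more specific contradicting knowledge is present.

```python
def can_fly(facts):
    """Apply the default 'birds fly' unless an exception is known."""
    if "penguin" in facts:        # more specific knowledge defeats the default
        return False
    return "bird" in facts        # default assumption for birds

print(can_fly({"bird"}))              # True: default applies
print(can_fly({"bird", "penguin"}))   # False: default retracted
```

Note the non-monotonicity: adding the fact "penguin" removes a conclusion that was derivable before, which a monotonic consequence relation cannot do.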
Abductive reasoning
Abductive reasoning is the process of deriving the most likely explanations of the known facts.
An abductive logic should not be monotonic because the most likely explanations are not necessarily
correct.
For example, the most likely explanation for seeing wet grass is that it rained; however, this
explanation has to be retracted when learning that the real cause of the grass being wet was a sprinkler.
Since the old explanation (it rained) is retracted because of the addition of a piece of knowledge (a
sprinkler was active), any logic that models explanations is non-monotonic.
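The wet-grass example can be sketched as choosing the highest-scoring explanation and retracting it when better evidence arrives (the candidate explanations and scores below are illustrative assumptions):

```python
def most_likely(candidates):
    """Return the highest-scoring candidate explanation."""
    return max(candidates, key=candidates.get)

explanations = {"rain": 0.7, "sprinkler": 0.2}
print(most_likely(explanations))      # rain: the best explanation so far

# New knowledge: the sprinkler was seen running, so it now dominates.
explanations["sprinkler"] = 0.95
print(most_likely(explanations))      # sprinkler: the old explanation is retracted
```

Again the behaviour is non-monotonic: adding a piece of knowledge withdrew a previously derived conclusion.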
Reasoning about knowledge
If a logic includes formulae that mean that something is not known, this logic should not be
monotonic. Indeed, learning something that was previously not known leads to the removal of the formula
specifying that this piece of knowledge is not known. This second change (a removal caused by an addition)
violates the condition of monotonicity. Autoepistemic logic is an example of a logic for reasoning about knowledge.
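A conclusion that depends on ignorance can be sketched as follows (an illustrative toy example, not autoepistemic logic proper): a conclusion drawn from "X is not known" must be withdrawn once X becomes known.

```python
def conclusions(known):
    """Derive conclusions, one of which rests on a fact being unknown."""
    out = set()
    if "meeting_cancelled" not in known:   # "not known to be cancelled"
        out.add("attend_meeting")
    return out

print(conclusions(set()))                   # {'attend_meeting'}
print(conclusions({"meeting_cancelled"}))   # set(): the conclusion is removed
```

Learning the new fact shrinks the set of conclusions, which is precisely the violation of monotonicity described above.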
Belief revision
Belief revision is the process of changing beliefs to accommodate a new belief that might be inconsistent
with the old ones. On the assumption that the new belief is correct, some of the old ones have to be retracted
in order to maintain consistency. This retraction in response to the addition of a new belief makes any logic
for belief revision non-monotonic. The belief revision approach is an alternative to paraconsistent logics,
which tolerate inconsistency rather than attempting to remove it.
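A minimal sketch of this retraction, assuming beliefs are represented as a set of literals with "~" marking negation (a toy representation, far simpler than real belief revision operators):

```python
def revise(beliefs, new_belief):
    """Add new_belief, retracting its direct negation to keep the set consistent."""
    negation = new_belief[1:] if new_belief.startswith("~") else "~" + new_belief
    return (beliefs - {negation}) | {new_belief}

beliefs = {"rained", "grass_wet"}
beliefs = revise(beliefs, "~rained")   # learn that it did not rain
print(sorted(beliefs))                 # ['grass_wet', '~rained']
```

The old belief "rained" is retracted when its negation is added, so the consequence set does not grow monotonically.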
Implementation Issues in non-monotonic reasoning
Logical frameworks alone are not enough for implementing non-monotonic reasoning in problem-solving
programs; such logical systems have some practical weaknesses. The four important problems that arise in real
systems are as follows.
The first is how to derive exactly those nonmonotonic conclusions that are relevant to solving the
problem at hand, while not wasting time on conclusions that, although they may be licensed by the
logic, are not relevant to the problem.
The second problem is how to update our knowledge incrementally as problem solving progresses.
The third problem is that in nonmonotonic reasoning systems, it often happens that more than one
interpretation of the known facts is licensed by the available inference rules.
The final problem is that, in general, these theories are not computationally effective.
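The third problem, multiple licensed interpretations, can be illustrated with the classic "Nixon diamond" example (not taken from these notes): two defaults apply to the same individual and license contradictory conclusions, giving two rival interpretations of the facts.

```python
facts = {"quaker", "republican"}
defaults = [
    ("quaker", "pacifist"),        # Quakers are typically pacifists
    ("republican", "~pacifist"),   # Republicans are typically not pacifists
]

def licensed_conclusions(facts, defaults):
    """Collect the conclusion of every default whose precondition holds."""
    return [conclusion for pre, conclusion in defaults if pre in facts]

# Both defaults fire, so the system must choose between two interpretations:
print(licensed_conclusions(facts, defaults))  # ['pacifist', '~pacifist']
```

A real system must commit to one interpretation (one "extension") or reason about all of them, which is part of what makes these theories hard to implement efficiently.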
Implementation – Depth first search
The implementation of DFS is directed by