[read Chapter 7]
[Suggested exercises: 7.1, 7.2, 7.5, 7.8]
Computational learning theory
Setting 1: learner poses queries to teacher
Setting 2: teacher chooses examples
Setting 3: randomly generated instances, labeled by teacher
Probably approximately correct (PAC) learning
Vapnik-Chervonenkis Dimension
Mistake bounds
lecture slides for textbook Machine Learning, © Tom M. Mitchell, McGraw Hill, 1997
Computational Learning Theory
What general laws constrain inductive learning?
We seek theory to relate:
Probability of successful learning
Number of training examples
Complexity of hypothesis space
Accuracy to which target concept is approximated
Manner in which training examples presented
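One such law, developed later in the chapter, ties several of these quantities together: for a consistent learner over a finite hypothesis space H, m >= (1/epsilon)(ln|H| + ln(1/delta)) random examples suffice for PAC learning. A minimal sketch in Python, taking |H| = 973 (the count of semantically distinct EnjoySport hypotheses from Chapter 2) purely for illustration:

```python
import math

def pac_sample_complexity(h_size, epsilon, delta):
    """Examples sufficient for any consistent learner over a finite
    hypothesis space of size h_size to output, with probability at
    least 1 - delta, a hypothesis with true error below epsilon:
        m >= (1/epsilon) * (ln|H| + ln(1/delta))
    """
    return math.ceil((math.log(h_size) + math.log(1.0 / delta)) / epsilon)

# e.g. |H| = 973, epsilon = 0.1, delta = 0.05
m = pac_sample_complexity(973, epsilon=0.1, delta=0.05)  # m = 99
```

Note how the bound grows only logarithmically in |H| and 1/delta, but linearly in 1/epsilon.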
Prototypical Concept Learning Task
Given:
– Instances X: possible days, each described by the attributes Sky, AirTemp, Humidity, Wind, Water, Forecast
– Target function c: EnjoySport : X → {0, 1}
– Hypotheses H: conjunctions of literals, e.g. ⟨?, Cold, High, ?, ?, ?⟩
– Training examples D: positive and negative examples of the target function, ⟨x1, c(x1)⟩, …, ⟨xm, c(xm)⟩
Determine:
– A hypothesis h in H such that h(x) = c(x) for all x in D?
– A hypothesis h in H such that h(x) = c(x) for all x in X?
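For conjunctive hypothesis spaces, the first question (consistency with D) can be answered by the Find-S idea from Chapter 2: start with the most specific hypothesis and generalize it just enough to cover each positive example. A minimal Python sketch (not the book's code; the training data below are illustrative):

```python
def find_s(examples):
    """examples: list of (attribute_tuple, label) pairs, label 1 = positive.
    Returns the most specific conjunction consistent with the positive
    examples, where '?' matches any attribute value."""
    h = None
    for x, label in examples:
        if label != 1:
            continue            # Find-S ignores negative examples
        if h is None:
            h = list(x)         # initialize to the first positive example
        else:                   # generalize mismatched attributes to '?'
            h = [hi if hi == xi else "?" for hi, xi in zip(h, x)]
    return h

D = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), 1),
    (("Sunny", "Warm", "High", "Strong", "Warm", "Same"), 1),
    (("Rainy", "Cold", "High", "Strong", "Warm", "Change"), 0),
]
h = find_s(D)  # ['Sunny', 'Warm', '?', 'Strong', 'Warm', 'Same']
```

The second question (agreement with c over all of X) cannot be settled from D alone; bounding how close h comes to it is exactly what the PAC framework below formalizes.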