11:14 2014-10-07
start Caltech machine learning, video 11
overfitting
11:14 2014-10-07
review:
* Multilayer perceptrons
* Neural networks
* Backpropagation
11:16 2014-10-07
overfitting, regularization, validation
11:20 2014-10-07
outline:
* What is overfitting
* The role of noise
* Deterministic noise
* Dealing with overfitting
11:23 2014-10-07
simple target function
11:28 2014-10-07
we're going to generate 5 points from the target
function to learn from.
11:29 2014-10-07
the target disappears, and you have 5 points to fit
11:30 2014-10-07
What is overfitting?
4th-order polynomial fit, Ein = 0, Eout is huge!!!
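a quick numeric sketch of that slide (my own toy setup, assuming a quadratic target with Gaussian noise, not the lecture's exact example): a 4th-order polynomial has 5 coefficients, so it interpolates 5 points exactly, driving Ein to 0 while Eout blows up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target: a quadratic with a little Gaussian noise.
f = lambda x: x ** 2
x_train = np.linspace(-1, 1, 5)
y_train = f(x_train) + 0.3 * rng.standard_normal(5)

# A 4th-order polynomial has 5 coefficients -> it interpolates 5 points exactly.
coeffs = np.polyfit(x_train, y_train, deg=4)
e_in = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)

# Out-of-sample error, measured against the noiseless target.
x_test = np.linspace(-1, 1, 200)
e_out = np.mean((np.polyval(coeffs, x_test) - f(x_test)) ** 2)
print(e_in, e_out)   # Ein ~ 0; Eout much larger
```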
11:33 2014-10-07
you're exploring more & more the space of weights
11:53 2014-10-07
the total number of free parameters in the model
11:54 2014-10-07
neural network fitting noisy data
11:55 2014-10-07
overfitting occurs when, as you push Ein down, Eout goes up!
11:55 2014-10-07
generalization error
11:56 2014-10-07
simply stop at that point
12:03 2014-10-07
early stopping
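a toy illustration of early stopping (my own setup, not the lecture's figure): train a flexible model by gradient descent on Ein and track Eout at each iteration; the idea is that Eout tends to dip and then rise as the model starts fitting noise, and you stop at the dip.

```python
import numpy as np

rng = np.random.default_rng(3)
f = lambda t: np.sin(np.pi * t)          # hypothetical target
x = np.linspace(-1, 1, 15)
y = f(x) + 0.3 * rng.standard_normal(15)
Z = np.vander(x, 11)                     # 10th-order polynomial model, trained iteratively
x_test = np.linspace(-1, 1, 300)
Zt = np.vander(x_test, 11)

w = np.zeros(11)
eta = 0.05
eout_history = []
for _ in range(2000):
    w -= eta * 2 * Z.T @ (Z @ w - y) / len(y)   # gradient step on in-sample MSE
    eout_history.append(np.mean((Zt @ w - f(x_test)) ** 2))

# Early stopping would halt at the iteration where Eout bottoms out.
best = int(np.argmin(eout_history))
print(best, eout_history[best], eout_history[-1])
```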
12:03 2014-10-07
overfitting happens when you compare 2 things,
whether the 2 things are 2 different models or
2 instances within the same model,
12:05 2014-10-07
if there was overfitting, we better detect it, and
stop earlier
12:05 2014-10-07
overfitting: fitting the data more than is warranted.
culprit: fitting the noise
12:14 2014-10-07
fitting the noise is the cost of doing business
12:15 2014-10-07
it's taken you away from the correct solution
12:16 2014-10-07
so let's say I'm going to generate 15 points in this case
12:17 2014-10-07
Ein // in-sample error
Eout // out-of-sample error
12:26 2014-10-07
now let's apply the 10th order fit
12:33 2014-10-07
and what is the out-of-sample error(Eout)?
just terrible
12:34 2014-10-07
you're actually fitting the noise
12:34 2014-10-07
getting that notion down is very important
12:36 2014-10-07
where are you going to get overfitting?
you could be facing a situation that is completely
noiseless in the conventional sense, and yet there is
overfitting because you're fitting another type
of noise.
12:38 2014-10-07
noisy simple target
noiseless higher-order target
12:38 2014-10-07
if I told you what the target is, this is not machine learning
12:39 2014-10-07
choose your model
12:39 2014-10-07
you match the "data resources" rather than the "target complexity"
12:41 2014-10-07
in this case, you're looking at the generalization issues,
you know the generalization issues depend on the size & quality
of the data set.
12:42 2014-10-07
overfitting even without noise
12:58 2014-10-07
impact of "noise level" & "target complexity"
13:02 2014-10-07
y = f(x) + ε(x) // target function + noise
13:02 2014-10-07
factors that affect overfitting:
* noise level
* target complexity
* data set size
13:08 2014-10-07
we're fitting a data set with 2 models
13:10 2014-10-07
so you get a pattern for what is going on
13:14 2014-10-07
as I increase the noise level, overfitting worsens
13:15 2014-10-07
as you increase the number of points, overfitting goes down
13:16 2014-10-07
let's look at the impact of Qf(target complexity)
13:17 2014-10-07
overfitting error = f(noise level, target complexity, number of points)
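this dependence can be sketched as a small experiment (my stand-in: a sine target, where the lecture uses random polynomial targets): measure overfitting as Eout(10th order) minus Eout(2nd order), averaged over many noisy data sets.

```python
import numpy as np

def overfit_measure(sigma, n, trials=200, seed=0):
    """Average Eout(H10) - Eout(H2); positive means the complex model overfits."""
    rng = np.random.default_rng(seed)
    f = lambda t: np.sin(np.pi * t)      # hypothetical target (lecture uses polynomials)
    x = np.linspace(-1, 1, n)            # fixed design; noise redrawn each trial
    x_test = np.linspace(-1, 1, 200)
    total = 0.0
    for _ in range(trials):
        y = f(x) + sigma * rng.standard_normal(n)
        e = [np.mean((np.polyval(np.polyfit(x, y, d), x_test) - f(x_test)) ** 2)
             for d in (2, 10)]
        total += e[1] - e[0]
    return total / trials

# Pattern from the lecture: more noise -> more overfitting; more data -> less.
print(overfit_measure(0.1, 15), overfit_measure(0.5, 15), overfit_measure(0.5, 100))
```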
13:18 2014-10-07
there are 2 things that you can derive from these 2 figures
13:19 2014-10-07
impact of noise:
* stochastic noise // noise level
* deterministic noise // target complexity
13:21 2014-10-07
I get more overfitting as I get more deterministic noise
13:22 2014-10-07
Defn of deterministic noise:
The part of f that H cannot capture
// f == target function
// H == hypothesis set
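one way to see this definition numerically (my hypothetical example: H = 2nd-order polynomials, target f(x) = sin(pi*x)): fit the noiseless target itself, with no stochastic noise anywhere, and look at what is left over.

```python
import numpy as np

# Hypothetical example: H = 2nd-order polynomials, target f(x) = sin(pi*x).
f = lambda x: np.sin(np.pi * x)
x = np.linspace(-1, 1, 1000)

# Best h* in H, fit to the noiseless target itself.
h_star = np.polyval(np.polyfit(x, f(x), 2), x)

# Deterministic noise: the part of f that H cannot capture.
det_noise = f(x) - h_star
energy2 = np.mean(det_noise ** 2)

# A richer H (10th order) captures more of f, so its deterministic noise shrinks.
h10 = np.polyval(np.polyfit(x, f(x), 10), x)
energy10 = np.mean((f(x) - h10) ** 2)
print(energy2, energy10)
```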
13:23 2014-10-07
it's the part of the target that your hypothesis set cannot capture
13:24 2014-10-07
you're still not going to get f even if you try your best,
because your hypothesis set is limited.
13:25 2014-10-07
it cannot be captured because I'm limited in capturing.
13:25 2014-10-07
out of my league
13:26 2014-10-07
why are we calling it noise?
13:26 2014-10-07
because their hypothesis set is so limited.
13:27 2014-10-07
you're better off just killing that part, and giving
them a simple thing they can learn, because the additional
part will mislead them.
13:29 2014-10-07
so that's why it is called noise.
13:30 2014-10-07
noise:
* stochastic noise
* deterministic noise
13:34 2014-10-07
it is out of your ability.
13:42 2014-10-07
that tells me the bias of my hypothesis set away from the target
13:43 2014-10-07
noise & Bias-Variance decomposition
y = f(x) + ε(x) // what if we add noise?
13:44 2014-10-07
Actually, there are two noise terms:
var + bias (deterministic noise) + energy of noise (stochastic noise)
13:49 2014-10-07
your hypothesis => centroid => target proper => actual output
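written out (the standard bias-variance decomposition of a noisy target y = f(x) + ε(x), with ḡ the average hypothesis, i.e. the "centroid"):

```latex
\mathbb{E}_{\mathcal{D},\varepsilon}\!\left[\big(g^{(\mathcal{D})}(x) - y\big)^2\right]
  = \underbrace{\mathbb{E}\!\left[\big(g^{(\mathcal{D})}(x) - \bar{g}(x)\big)^2\right]}_{\text{var}}
  + \underbrace{\mathbb{E}\!\left[\big(\bar{g}(x) - f(x)\big)^2\right]}_{\text{bias (deterministic noise)}}
  + \underbrace{\sigma^2}_{\text{stochastic noise}}
```

here ε has zero mean and variance σ², so σ² is the "energy of noise" term from above.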
13:50 2014-10-07
look at this decomposition,
13:54 2014-10-07
How do we deal with overfitting? // 2 cures
* regularization // putting the brakes
* validation // checking the bottom line
13:56 2014-10-07
putting the brakes: the regularization part
13:56 2014-10-07
the amount of brake I'm going to put here is so minimal,
13:58 2014-10-07
that little bit of brake will result in this...
totally dramatic, fantastic fit
13:58 2014-10-07
free fit => restrained fit // regularization(some brake)
13:59 2014-10-07
we don't have to do much to prevent the overfitting
13:59 2014-10-07
but we need to understand what is regularization &
how to choose it etc.
14:00 2014-10-07
validation is the other prescription