ML (NTU)
zypandora
Lecture1-1 Course Introduction
Course Design (Original post · 2015-09-09 22:09:19 · 185 views · 0 comments)
Lecture3-2 Learning with different data label
Supervised Learning, Unsupervised Learning, Semi-supervised Learning, Reinforcement Learning: a very different but natural way of learning. Teach your dog: SIT DOWN; cannot easily show the dog that $y_n =$ sit… (Original post · 2015-09-20 11:36:59 · 213 views · 0 comments)
Lecture4-1 Learning is impossible?
Puzzle problem; no free lunch. We all have to be aware that $f$ is unknown, and according to the previous examples any possible $f$ can happen outside $\mathcal{D}$ (no free lunch). So we have to make some assumptions… (Original post · 2015-09-20 12:28:07 · 270 views · 0 comments)
Lecture4-2 Probability to the rescue
Inferring something unknown: it is difficult to infer the unknown target $f$ outside $\mathcal{D}$; can we infer something unknown in other scenarios? How to infer the orange probability? Sampling. Bin: ASSUME orange probability… (Original post · 2015-09-20 13:26:02 · 213 views · 1 comment)
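The bin-sampling idea in this excerpt can be sketched as a quick simulation; the bin, its true orange probability `mu = 0.4`, the sample size, and the function name are all illustrative assumptions, not from the post:

```python
import random

def sample_orange_fraction(mu, n, seed=0):
    """Draw n marbles from a bin whose (unknown to the learner) orange
    probability is mu, and return the in-sample orange fraction nu."""
    rng = random.Random(seed)
    oranges = sum(1 for _ in range(n) if rng.random() < mu)
    return oranges / n

# With a large sample, nu is probably close to the unknown mu (Hoeffding).
nu = sample_orange_fraction(mu=0.4, n=10000)
```

The point of the lecture's bin model is exactly this: the in-sample fraction `nu` says something probabilistic about the unseen `mu`.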
Lecture3-1 Learning with different output space
Binary Classification; Multiclass Classification: Coin Recognition Problem; Regression: Patient Recovery Prediction Problem; Structured Learning: Tagging. Mini summary. (Original post · 2015-09-20 10:45:08 · 360 views · 0 comments)
Lecture2-3 Guarantee of PLA
Linear separable: if PLA halts (no mistakes), then (necessary condition) $\mathcal{D}$ allows some $\mathbf{w}$ to make no mistake; call such a $\mathcal{D}$ linear separable. Linear separable $\Longleftrightarrow$ … (Original post · 2015-09-19 22:11:03 · 329 views · 0 comments)
Lecture2-4Non-Separable Data
In reality, we don’t know if \mathcal{D} is linear separable, and we don’t know what T exactly is, because we cannot calculate ρ\rho, which depends on wf\mathbf{w_f}.Flow Chart 2: With Noise What abou原创 2015-09-19 23:21:41 · 246 阅读 · 0 评论 -
Lecture3-4 Learning with different input space
Concrete features, raw features, abstract features; need to extract real concrete features from abstract features. Mini summary; summary. (Original post · 2015-09-20 12:10:21 · 221 views · 0 comments)
Lecture4-3 Connection to Learning
Bin model vs. learning: unknown orange probability $\mu$ $\leftrightarrow$ fixed hypothesis $h(\mathbf{x}) \stackrel{?}{=}$ target $f(\mathbf{x})$; marble $\in$ bin $\leftrightarrow$ $\mathbf{x} \in \mathcal{X}$; orange $\leftrightarrow$ … (Original post · 2015-09-20 22:32:00 · 166 views · 0 comments)
Lecture4-4 Connection to Real Learning
Multiple $h$: real learning, like PLA, chooses among many hypotheses; what about when getting all green (right)? Bad sample: $E_{in}$ and $E_{out}$ far apart, which can give worse results, e.g. $E_{out} = \frac{1}{2}$ but getting all heads (… (Original post · 2015-09-22 21:57:48 · 231 views · 0 comments)
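The "all heads" bad-sample point can be made concrete with a small calculation: even if every coin is fair ($E_{out} = \frac{1}{2}$), with enough coins some coin is likely to come up all heads. The specific numbers (150 coins, 5 tosses) are illustrative assumptions, not taken from the excerpt:

```python
def prob_some_all_heads(num_coins, num_tosses):
    """Probability that at least one fair coin among num_coins
    shows all heads in num_tosses flips (complement of no coin doing so)."""
    p_single = 0.5 ** num_tosses            # one given coin all heads
    return 1 - (1 - p_single) ** num_coins

p = prob_some_all_heads(num_coins=150, num_tosses=5)
# p exceeds 0.99: a "perfect-looking" coin is nearly guaranteed,
# even though its out-of-sample error is still 1/2.
```

This is why the union bound over many hypotheses matters: the chance of *some* hypothesis having a bad sample grows with the number of hypotheses.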
Lecture6-1 Restriction of Break Point
What must be true when the minimum break point is $k=2$? $N=1$: every $m_{\mathcal{H}}(N)=2$ by definition. $N=2$: every $m_{\mathcal{H}}(N)<4$ by definition. $N=3$?? Must confirm that $m_{\mathcal{H}}(N)<4$… (Original post · 2015-10-12 14:39:13 · 299 views · 0 comments)
Lecture2-2 Perceptron Learning Algorithm
Select $g$ from $\mathcal{H} =$ all possible perceptrons; $g = ?$ Want $g \approx f$; almost necessary: $g \approx f$ on $\mathcal{D}$, ideally $g(x_n) = f(x_n) = y_n$. Difficult: $\mathcal{H}$ … (Original post · 2015-09-19 18:30:07 · 327 views · 0 comments)
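The PLA described in this entry can be sketched in a few lines; the toy dataset and function name below are illustrative assumptions, not the post's own code:

```python
import numpy as np

def pla(X, y, max_iters=1000):
    """Perceptron Learning Algorithm: scan for a misclassified example,
    correct on it, and repeat until no mistakes remain."""
    X = np.hstack([np.ones((len(X), 1)), X])  # prepend x0 = 1 for the threshold
    w = np.zeros(X.shape[1])
    for _ in range(max_iters):
        mistakes = np.where(np.sign(X @ w) != y)[0]
        if len(mistakes) == 0:
            return w                          # halts: D is linearly separable
        n = mistakes[0]
        w = w + y[n] * X[n]                   # the PLA correction step
    return w

# Toy separable data: label +1 iff x1 + x2 > 1.
X = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0], [2.0, 2.0]])
y = np.array([-1, 1, 1, 1])
w = pla(X, y)
```

On linearly separable data the update is guaranteed to halt (the guarantee discussed in Lecture 2-3); on non-separable data the `max_iters` cap simply stops the loop.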
Lecture3-3 Learning with different protocols
Batch Learning, Online Learning, Active Learning. Mini summary. (Original post · 2015-09-20 11:54:35 · 264 views · 0 comments)
Lecture1-2 What is Machine Learning?
From Learning to Machine Learning. (Original post · 2015-09-11 22:44:26 · 528 views · 0 comments)
Lecture1-3 Applications of ML
Applications for daily needs. Food: data: twitter data (words + location); skill: tell the food-poisoning likelihood of a restaurant properly. Clothing: data: sales figures + client surveys; skill: give good fashion… (Original post · 2015-09-11 23:13:10 · 249 views · 0 comments)
Lecture5-1 Recap and Preview
Flow chart recap. Two central questions: learning is split into two questions. Can we make sure that $E_{in} \approx E_{out}$? Can we make $E_{in}$ small enough? Trade-off on $M$: small $M$ :), few choices… (Original post · 2015-09-27 21:47:06 · 214 views · 0 comments)
Lecture1-5 ML and Other Fields
ML and Data Mining. Machine Learning: use data to compute a hypothesis $g$ that approximates the target $f$. Data Mining: use (huge) data to find a property that is interesting. If 'interesting property' is the same as 'hy… (Original post · 2015-09-12 23:47:57 · 269 views · 0 comments)
Lecture1-4 Components of ML
Components of ML: a metaphor using credit approval. The bank has the applicant information, and there is an unknown pattern to be learned, which is used to answer the final question: whether to approve credit… (Original post · 2015-09-12 23:30:41 · 237 views · 0 comments)
Lecture5-3 Effective number of hypotheses
Dichotomies: $\mathcal{H} = \{\textrm{hypothesis } h: \mathcal{X} \rightarrow \{x, o\}\}$. Call $h(\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_N) = (h(\mathbf{x}_1), h(\mathbf{x}_2), \ldots, h(\mathbf{x}_N)) \in \{x, o\}^N$ … (Original post · 2015-09-29 22:49:41 · 506 views · 0 comments)
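The dichotomy idea can be illustrated with a 1-D "positive ray" hypothesis set, $h(x) = +1$ iff $x > a$; this hypothesis set and the helper below are my illustrative choices, not from the excerpt. Each hypothesis maps the $N$ points to a tuple in $\{x, o\}^N$, and only the distinct tuples count:

```python
def positive_ray_dichotomies(points):
    """Distinct dichotomies that positive rays h(x) = +1 iff x > a
    produce on the given distinct 1-D points ('o' for +1, 'x' for -1)."""
    pts = sorted(points)
    candidates = [pts[0] - 1.0] + pts   # one threshold per "gap" between points
    return {tuple('o' if x > a else 'x' for x in pts) for a in candidates}

# 3 distinct points yield only 3 + 1 = 4 dichotomies, far fewer than 2^3 = 8:
# infinitely many thresholds a collapse into finitely many dichotomies.
```

This collapse from infinitely many hypotheses to finitely many dichotomies is what makes the "effective number of hypotheses" finite.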
Lecture5-2 Effective number of lines
Where did $M$ come from? Where did the union bound fail? The union bound is over-estimating!! Can we group hypotheses by kind? How many lines are there? For only one point $x_1$, how many different lines are there… (Original post · 2015-09-29 22:16:08 · 290 views · 0 comments)
Lecture5-4 Breaking Point
The 4 growth functions; break point of $m_{\mathcal{H}}$. For the 2D perceptron, we know: for 3 inputs, a shatter exists; for 4 inputs, no shatter for any data. If no $k$ points can be shattered by $\mathcal{H}$, call… (Original post · 2015-10-01 00:18:21 · 269 views · 0 comments)
Lecture2-1 Perceptron Hypothesis Set
Credit Approval revisited. A simple hypothesis set: the perceptron. For $\mathbf{x} = (x_1, x_2, \ldots, x_d)$, 'features of the customer', compute a weighted 'score' and approve credit if $\sum_{i=1}^{d} w_i x_i > \textrm{threshold}$… (Original post · 2015-09-15 22:37:17 · 267 views · 0 comments)
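The scoring rule in this entry can be written out directly; the feature values, weights, and threshold below are purely illustrative assumptions:

```python
def perceptron(x, w, threshold):
    """Approve (+1) if the weighted score exceeds the threshold, else deny (-1)."""
    score = sum(w_i * x_i for w_i, x_i in zip(w, x))
    return 1 if score > threshold else -1

# Hypothetical customer features and weights:
# score = 0.8*2.0 + 0.2*(-1.0) + 0.5*1.5 = 2.15 > 1.0, so approve (+1).
decision = perceptron(x=[0.8, 0.2, 0.5], w=[2.0, -1.0, 1.5], threshold=1.0)
```

Folding the threshold into the weights as $w_0$ with a constant feature $x_0 = 1$ gives the usual $h(\mathbf{x}) = \textrm{sign}(\mathbf{w}^T\mathbf{x})$ form used by PLA.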
Lecture6-2 Bounding Function
Bounding function $B(N,k)$: the maximum possible $m_{\mathcal{H}}(N)$ when the break point is $k$. A combinatorial quantity: no shattering of any $k$ of the $N$ points; irrelevant of the details of $\mathcal{H}$ or a… (Original post · 2015-10-12 14:51:04 · 301 views · 0 comments)
Lecture6-3 Bounding Function: inductive cases
Estimating $B(4,3)$: $B(4,3) = 11$ after searching all 16 possibilities. Reorganizing the dichotomies of $B(4,3)$; check against $B(3,3)$. $\alpha + \beta$: # of different $x_1, x_2, x_3$ co… (Original post · 2015-10-12 15:39:32 · 310 views · 0 comments)
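The inductive cases sketched in these two entries give the standard recurrence $B(N,k) \le B(N-1,k) + B(N-1,k-1)$; a minimal sketch of that table-filling, with base cases as I understand them from the definitions above (the function name is mine):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def bounding(N, k):
    """Bound B(N, k) on the growth function with break point k,
    filled in via B(N, k) <= B(N-1, k) + B(N-1, k-1)."""
    if k == 1:
        return 1            # no single point may be shattered
    if N < k:
        return 2 ** N       # too few points to reach the break point
    return bounding(N - 1, k) + bounding(N - 1, k - 1)

# Matches the exhaustive search in the notes: B(4, 3) = 11.
```

Unrolling the recurrence gives the closed form $B(N,k) = \sum_{i=0}^{k-1} \binom{N}{i}$, a polynomial in $N$, which is what makes the growth function polynomially bounded whenever a break point exists.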