[Original] Andrew Ng · Machine Learning || chap18 Application example photo OCR & chap19 Conclusion (notes)
18 Application example photo OCR · 18-1 Problem description and pipeline · The photo OCR problem: 1. Text detection; 2. Character segmentation; 3. Character classification (recognition); 4. *Spelling correction · Photo OCR pipeline · 18-2 Sliding windows · Text detection…
2021-09-12 20:03:18
143
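The sliding-window scan named in 18-2 can be sketched with a toy example. Everything here is illustrative: the image is a hand-made list of rows, and `looks_like_text` is a hypothetical stand-in for a trained text/no-text classifier, not anything from the course.

```python
# Sliding-window scan for text detection (chap 18-2), minimal sketch.

def looks_like_text(patch):
    # Hypothetical classifier: flag patches whose mean intensity is high.
    vals = [v for row in patch for v in row]
    return sum(vals) / len(vals) > 0.5

def sliding_windows(image, win=2, step=1):
    """Yield (row, col, patch) for every win x win window in the image."""
    rows, cols = len(image), len(image[0])
    for r in range(0, rows - win + 1, step):
        for c in range(0, cols - win + 1, step):
            patch = [row[c:c + win] for row in image[r:r + win]]
            yield r, c, patch

image = [  # toy grayscale image: bright block in the top-right corner
    [0.0, 0.1, 0.9, 0.8],
    [0.1, 0.0, 0.9, 0.9],
    [0.0, 0.1, 0.1, 0.0],
]
detections = [(r, c) for r, c, p in sliding_windows(image) if looks_like_text(p)]
print(detections)  # -> [(0, 2)]
```

In the real pipeline the step size, window scale, and classifier all matter; this only shows the scanning pattern itself.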
[Original] Andrew Ng · Machine Learning || chap17 Large scale machine learning (notes)
17 Large scale machine learning · 17-1 Learning with large datasets · Machine learning and data · Classify between confusable words, e.g., (to, two, too), (then, than) · "It's not who has the best algorithm that wins. It's who has the most data." · Learning with large…
2021-09-12 11:44:17
141
[Original] Andrew Ng · Machine Learning || chap16 Recommender System (notes)
16 Recommender System · 16-1 Problem formulation · Example: predicting movie ratings · User rates movies using one to five stars · 16-2 Content-based recommendations · Content-based recommender systems · Problem formulation: $r(i,j)=1$ if user $j$ has rated movie $i$ (0 otherwise)…
2021-09-11 17:20:26
104
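The $r(i,j)$ indicator and the content-based prediction $\theta^{(j)\top} x^{(i)}$ from 16-2 can be sketched concretely. The rating matrix, movie features, and per-user parameters below are made-up illustrative numbers, not learned values.

```python
# Content-based recommender sketch (chap 16-2).

ratings = {  # (movie i, user j) -> stars; absence means r(i,j) = 0
    (0, 0): 5, (0, 1): 5, (1, 0): 1,
}

def r(i, j):
    """r(i,j) = 1 if user j has rated movie i, 0 otherwise."""
    return 1 if (i, j) in ratings else 0

# x_i: [intercept, romance, action] per movie (hand-picked features)
x = [[1.0, 0.9, 0.0],       # movie 0: romance
     [1.0, 0.1, 1.0]]       # movie 1: action
# theta_j: per-user parameters (pretend these were learned)
theta = [[0.0, 5.0, 0.0],   # user 0 likes romance
         [0.0, 0.0, 5.0]]   # user 1 likes action

def predict(i, j):
    """Predicted rating: theta_j . x_i."""
    return sum(t * f for t, f in zip(theta[j], x[i]))

print(r(1, 1), predict(1, 1))  # user 1 hasn't rated movie 1; predict 5.0
```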
[Original] Andrew Ng · Machine Learning || chap15 Anomaly detection (notes)
15 Anomaly detection · 15-1 Problem motivation · Anomaly detection example · Aircraft engine features: $x_1$ = heat generated, $x_2$ = vibration intensity, $\cdots$ · Dataset: $\{x^{(1)}, x^{(2)}, \cdots, x^{(m)}\}$…
2021-09-10 18:55:48
183
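The density-estimation idea behind this chapter can be sketched per feature: fit a Gaussian to the dataset $\{x^{(1)},\dots,x^{(m)}\}$ and flag a new example as anomalous when $p(x) < \varepsilon$. The engine readings and the threshold below are made-up numbers.

```python
import math

# Gaussian density estimation for anomaly detection (chap 15), one feature.

def fit_gaussian(values):
    """Maximum-likelihood mean and variance of a list of readings."""
    m = len(values)
    mu = sum(values) / m
    var = sum((v - mu) ** 2 for v in values) / m
    return mu, var

def p(x, mu, var):
    """Gaussian density N(x; mu, var)."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

heat = [50.1, 49.8, 50.3, 50.0, 49.9]   # x1 = heat generated (toy data)
mu, var = fit_gaussian(heat)
eps = 1e-3
print(p(50.0, mu, var) < eps)   # typical reading: not anomalous
print(p(60.0, mu, var) < eps)   # far-off reading: anomalous
```

With several features the course multiplies the per-feature densities; the single-feature case above shows the core fit-then-threshold step.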
[Original] Andrew Ng · Machine Learning || chap14 Dimensionality Reduction (notes)
14-1 Motivation I: Data Compression · Data Compression · Reduce data from 2D to 1D: project onto a line, $x_1, x_2 \longrightarrow z_1$ · Reduce data from 3D to 2D: project onto a plane, $x_1, x_2, x_3 \longrightarrow z_1, z_2$ · 14-2 Motivation…
2021-09-09 17:45:49
104
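The 2D-to-1D compression in 14-1 is just a projection: each $(x_1, x_2)$ is mapped to a single coordinate $z_1$ along a line. A minimal sketch, assuming a hand-picked unit direction rather than the PCA-optimal one:

```python
import math

# 2D -> 1D data compression sketch (chap 14-1): project onto a line.

u = (1 / math.sqrt(2), 1 / math.sqrt(2))  # unit direction of the line (45 deg)

def project(x):
    """z1 = u . x, the 1-D coordinate of x along the line."""
    return u[0] * x[0] + u[1] * x[1]

data_2d = [(1.0, 1.1), (2.0, 1.9), (3.0, 3.05)]   # roughly collinear points
data_1d = [project(x) for x in data_2d]
print(data_1d)  # one number per example instead of two
```

PCA's contribution is choosing `u` to minimize the projection error; the mapping itself is exactly this inner product.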
[Original] Andrew Ng · Machine Learning || chap13 Clustering (notes)
13 Clustering · 13-1 Unsupervised learning introduction · Supervised learning · Training set: $\{(x^{(1)},y^{(1)}),(x^{(2)},y^{(2)}),(x^{(3)},y^{(3)}),\cdots,(x^{(m)},y^{(m)})\}$…
2021-09-08 16:18:06
142
[Original] Andrew Ng · Machine Learning || chap12 Support Vector Machines (notes)
12-1 Optimization objective · Alternative view of logistic regression · $h_\theta(x)=\frac{1}{1+e^{-\theta^T x}}$ · If $y=1$, we want $h_\theta(x)\approx 1$, $\theta^T x \gg 0$ · If…
2021-09-06 18:55:43
108
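The logistic hypothesis in 12-1 and its behavior for $\theta^T x \gg 0$ can be checked numerically. The parameter values are illustrative, not fitted:

```python
import math

# Logistic hypothesis from 12-1: h_theta(x) = 1 / (1 + e^(-theta^T x)).

def h(theta, x):
    z = sum(t * v for t, v in zip(theta, x))   # theta^T x
    return 1 / (1 + math.exp(-z))

theta = [-4.0, 2.0]          # illustrative parameters; x[0] = 1 is the bias
print(h(theta, [1.0, 4.0]))  # theta^T x = 4  -> close to 1 (the y = 1 case)
print(h(theta, [1.0, 0.0]))  # theta^T x = -4 -> close to 0 (the y = 0 case)
```

The SVM objective in this chapter replaces the smooth log-loss built on this $h$ with hinge-like cost functions, but the inner product $\theta^T x$ plays the same role.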
[Original] Andrew Ng · Machine Learning || chap11 Machine learning system design (notes)
11 Machine learning system design · 11-1 Prioritizing what to work on: spam classification example · Building a spam classifier · Supervised learning: x = features of email, y = spam (1) or not spam (0) · Features x: choose 100 words indicative of spam/not spam · $x_j = \{1$…
2021-09-04 16:23:17
388
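The indicator features described in 11-1 ($x_j = 1$ if word $j$ appears in the email, else 0) are easy to make concrete. The five-word vocabulary below is a tiny stand-in for the 100-word list in the notes:

```python
# Spam-classifier feature vector sketch (chap 11-1).

vocab = ["deal", "buy", "discount", "andrew", "now"]  # stand-in word list

def email_features(email_text):
    """x_j = 1 if vocab word j appears in the email, 0 otherwise."""
    words = set(email_text.lower().split())
    return [1 if w in words else 0 for w in vocab]

x = email_features("Buy now exclusive discount deal")
print(x)  # -> [1, 1, 1, 0, 1]
```

A real pipeline would also normalize punctuation and word forms (stemming); this shows only the indicator encoding itself.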
[Original] Andrew Ng · Machine Learning || chap10 Advice for applying machine learning (notes)
10 Advice for applying machine learning · 10-1 Deciding what to try next · Debugging a learning algorithm · Suppose you have implemented regularized linear regression to predict housing prices: $J(\theta)=\frac{1}{2m}\left[\sum_{i=1}^{m}\left(h_\theta(x^{(i)})-y^{(i)}\right)^2+\lambda\sum_{j=1}^{m}\theta_j^2\right]$…
2021-09-03 16:29:22
160
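The regularized cost $J(\theta)$ quoted in 10-1 can be computed directly. The tiny dataset below is made up, and, following the usual convention, $\theta_0$ is left out of the penalty term:

```python
# Regularized linear-regression cost from chap 10-1:
# J(theta) = (1/2m) [ sum_i (h_theta(x^(i)) - y^(i))^2 + lambda * sum_j theta_j^2 ]

def h(theta, x):
    """Linear hypothesis: theta^T x."""
    return sum(t * v for t, v in zip(theta, x))

def cost(theta, X, y, lam):
    m = len(X)
    sq_err = sum((h(theta, X[i]) - y[i]) ** 2 for i in range(m))
    reg = lam * sum(t ** 2 for t in theta[1:])   # skip theta_0 by convention
    return (sq_err + reg) / (2 * m)

X = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]   # first column is the intercept
y = [1.0, 2.0, 3.0]
print(cost([0.0, 1.0], X, y, lam=1.0))  # perfect fit: only the penalty remains
```

With theta = [0, 1] every prediction is exact, so the squared-error term vanishes and J reduces to lambda * theta_1^2 / (2m) = 1/6, which makes the role of the two terms easy to see.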
[Original] Andrew Ng · Machine Learning || chap9 Neural Network: Learning (notes)
9 Neural Network: Learning · 9-1 Cost function · Neural Network (classification) · $(x^{(1)},y^{(1)}),(x^{(2)},y^{(2)}),\cdots,(x^{(m)},y^{(m)})$ · $L$ = total no. of layers in network · $s_l$…
2021-08-16 20:46:22
118
[Original] Andrew Ng · Machine Learning || chap8 Neural Networks: Representation (notes)
8 Neural Networks: Representation · 8-1 Non-linear hypotheses · Non-linear classification (0, 1) · 8-2 Neurons and the brain · Neural Networks origins: algorithms that try to mimic the brain. Was very widely used in the 80s and early 90s; popularity diminished in late…
2021-08-13 17:00:08
142
[Original] Andrew Ng · Machine Learning || chap7 Regularization (notes)
7 Regularization · 7-1 The problem of overfitting · Underfitting: high bias · Just right · Overfitting: high variance · Overfitting: if we have too many features, the learned hypothesis may fit the training set very well but fail to generalize to new examples. A…
2021-08-11 21:34:57
136
[Original] Andrew Ng · Machine Learning || chap5&6 Octave Tutorial & Logistic Regression (notes)
5 Octave Tutorial (these chapters can also be followed in MATLAB; I personally decided to use Python for my implementations; chapters 5-6 are worth watching whichever language you learn) · 5-1 Basic operations · 5-2 Moving data around · 5-3 Computing on data · 5-4 Plotting data · 5-5 for, while, if statements, and functions · 5-6 Vectorization · Vectorization example: $h_\theta(x)=\sum_{j=0}^{n}\theta_j x_j$…
2021-08-06 18:19:37
125
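The vectorization example from 5-6, $h_\theta(x)=\sum_{j=0}^{n}\theta_j x_j$, contrasts an element-by-element loop with a single inner-product operation. A stdlib sketch (in NumPy the vectorized form would literally be `theta @ x`, in Octave `theta' * x`):

```python
# Vectorization example from chap 5-6: h_theta(x) = sum_j theta_j * x_j.

def h_unvectorized(theta, x):
    """Accumulate the sum term by term with an explicit loop."""
    total = 0.0
    for j in range(len(theta)):
        total += theta[j] * x[j]
    return total

def h_vectorized(theta, x):
    """One inner-product expression; stdlib stand-in for theta @ x."""
    return sum(t * v for t, v in zip(theta, x))

theta = [1.0, 0.5, -2.0]
x = [1.0, 2.0, 3.0]       # x[0] = 1 is the usual bias term
print(h_unvectorized(theta, x), h_vectorized(theta, x))  # -> -4.0 -4.0
```

Both forms compute the same value; the point of the lecture is that the vectorized form is shorter and, in a real numeric library, much faster.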
[Original] Andrew Ng · Machine Learning || chap4 Linear Regression with multiple variables (notes)
4 Linear Regression with multiple variables · 4-1 Multiple features · Multiple features (variables) · Notation: $n$ = number of features; $x^{(i)}$ = input (features) of $i^{th}$ training example; $x_j^{(i)}$ = value of feature $j$ in $i^{th}$…
2021-07-31 16:21:25
114
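The notation in 4-1 maps directly onto array indexing: with the training examples as rows of a matrix, $x^{(i)}$ is row $i$ and $x_j^{(i)}$ is one entry. Python indexes from 0 while the notes count from 1; the housing rows below are illustrative:

```python
# Multivariate notation from chap 4-1 as concrete indexing.

X = [  # each row: [size, bedrooms, floors, age] for one training example
    [2104, 5, 1, 45],
    [1416, 3, 2, 40],
    [1534, 3, 2, 30],
]
m = len(X)        # m = number of training examples
n = len(X[0])     # n = number of features
x_2 = X[1]        # x^(2): all features of the 2nd example (0-indexed row 1)
x_2_1 = X[1][0]   # x_1^(2): value of feature 1 in the 2nd example
print(m, n, x_2_1)  # -> 3 4 1416
```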
[Original] Andrew Ng · Machine Learning || chap3 Linear Algebra review (optional) (notes)
3 Linear Algebra review (optional) · 3-1 Matrices and Vectors · Matrix: rectangular array of numbers · Dimension of matrix: number of rows × number of columns (m × n) · Matrix elements (entries of matrix) · Vector: an n × 1 matrix, 1-indexed · Matrices referred to as A, B, C, X; vectors as a, b, x, y · 3-…
2021-07-30 18:06:42
154
[Original] Andrew Ng · Machine Learning || chap2 Linear regression with one variable (notes)
2 Linear regression with one variable · 2-1 Model representation · Training set · m = number of training examples · x's = "input" variable/features · y's = "output" variable/"target" variable · (x, y) = one training example · Hypothesis function: $h_\theta(x)=\theta_0+\theta_1 x$…
2021-07-28 17:34:16
115
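The one-variable hypothesis $h_\theta(x)=\theta_0+\theta_1 x$ from 2-1 is a one-liner; the parameter values below are illustrative, not fitted to any data:

```python
# One-variable linear hypothesis from chap 2-1.

def h(theta0, theta1, x):
    """h_theta(x) = theta_0 + theta_1 * x."""
    return theta0 + theta1 * x

# e.g. predicting a house price from its size with theta_0 = 50, theta_1 = 0.1
print(h(50.0, 0.1, 1000.0))
```

Chapter 2 then defines the squared-error cost over the training set and uses gradient descent to pick theta_0 and theta_1; this block is only the model being fitted.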
[Original] Basic usage of DOS commands
Basic DOS commands · Ways to open cmd: Start → System → Command Prompt; Win+R, then type cmd to open the console; Shift + right-click in any folder; prefix the Explorer address bar with "cmd <path>"; run as administrator · Common DOS commands: switch drive (D:); list all files in the current directory (dir); change directory (cd <path or folder name>, cd /d to switch drives); cd.. to go up one level; md to create a folder; cd> to create a file; del to delete a file
2021-07-27 11:32:24
107
[Original] Andrew Ng · Machine Learning || chap1 Introduction (notes)
Custom table-of-contents title · Welcome to the Markdown editor · What's new · Shortcut keys · Sensible headings help generate a table of contents · How to change text styles · Inserting links and images · How to insert a nice code block · Generating a list that suits you · Creating a table · Centering, left- and right-aligning content · SmartyPants · Creating a custom list · How to create a footnote · Comments are also essential · KaTeX math formulas · The new Gantt chart feature to enrich your posts · UML diagrams · Flowchart flowcharts · Export and import · Welcome to the Markdown editor · Hello! This is the welcome page shown the first time you use the Markdown editor. If you want to learn how to use Mar…
2021-07-26 16:40:16
106