Machine Learning - III. Linear Algebra Review (Week 1, Optional)

http://blog.csdn.net/pipisorry/article/details/43490965

Study notes for Andrew Ng's Machine Learning course

Linear Algebra Review

Introduction to matrices, vectors, and their notation

What are matrices

A matrix is just another way of saying a 2D, or two-dimensional, array.
The dimension of a matrix is written as the number of rows times the number of columns.
A matrix with 4 rows and 2 columns is written as R^(4 x 2); concretely, people will sometimes say the matrix is an element of the set R^(4 x 2).
Matrix elements (the entries of the matrix) are the numbers inside the matrix; A_ij denotes the entry in row i, column j.

A matrix gives you a way to quickly organize, index, and access a lot of data.
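For a concrete illustration (a minimal NumPy sketch, not part of the original lecture notes), here is a 4 x 2 matrix, its dimension, and how an entry is accessed; note that NumPy indexes entries from 0 while the lecture indexes them from 1.

```python
# A 4 x 2 matrix: 4 rows, 2 columns, so it is an element of R^(4 x 2).
import numpy as np

A = np.array([[1402,  191],
              [1371,  821],
              [ 949, 1437],
              [ 147, 1448]])

print(A.shape)   # (4, 2) -> the dimension: rows x columns
print(A[0, 1])   # 191    -> the entry in row 1, column 2 (0-based indexing in NumPy)
```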


What are vectors
A vector turns out to be a special case of a matrix: a vector is a matrix that has only 1 column, i.e. an n x 1 matrix. (All vectors in this course are column vectors.)
Dimension: if the vector has n = 4 elements, we call it a four-dimensional vector, which just means it is a vector with four elements, four numbers, in it.
We refer to it as a vector in the set R^4.
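A similar sketch for vectors (again a NumPy illustration, not from the original notes): a four-dimensional vector stored as a 4 x 1 matrix.

```python
# A vector is an n x 1 matrix; here n = 4, so y is a vector in R^4.
import numpy as np

y = np.array([[460],
              [232],
              [315],
              [178]])

print(y.shape)    # (4, 1)
print(y[2, 0])    # 315 -> the third element (0-based index 2)
```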

Notation conventions:

Throughout the rest of these videos on the linear algebra review, 1-indexed vectors are used (most vector subscripts in the course start from 1).
When talking about machine learning applications, it will be stated explicitly when we need to switch to 0-indexed vectors (indexing from 0 is used in some ML applications).

Finally, by convention, uppercase letters such as A, B, C refer to matrices, and usually lowercase letters such as a, b, x, y refer to numbers (raw numbers or scalars) or to vectors.



Matrix operations

Matrix Addition and Scalar Multiplication

Scalar multiplication: multiply a matrix by a number (every entry is multiplied by that number). Matrix addition: add two matrices of the same dimension by adding corresponding entries.
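A minimal NumPy sketch of both operations, using made-up matrices:

```python
# Matrix addition is element-wise; scalar multiplication scales every entry.
import numpy as np

A = np.array([[1.0, 0.0],
              [2.0, 5.0],
              [3.0, 1.0]])
B = np.array([[4.0, 0.5],
              [2.0, 5.0],
              [0.0, 1.0]])

print(A + B)   # element-wise sum; A and B must have the same dimension
print(3 * A)   # every entry of A multiplied by the scalar 3
```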

Matrix-Vector Multiplication

Example: applying this to the house price prediction problem

(Once theta0 and theta1 have been computed, the left-hand side uses matrix multiplication to predict the prices for multiple house sizes at once; the right-hand side uses a for loop to compute them one by one.)

Advantages of matrix computation

The code on the left not only simplifies the code but, for somewhat subtle reasons, is also much more computationally efficient: making predictions for the prices of all of your houses the way shown on the left is faster than writing out your own formula in a loop, as on the right.
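A sketch of the idea in NumPy; the house sizes and the values theta0 = -40, theta1 = 0.25 are purely illustrative, not taken from these notes.

```python
# Predict h(x) = theta0 + theta1 * x for several house sizes with one
# matrix-vector product instead of a for loop.
import numpy as np

sizes = np.array([2104.0, 1416.0, 1534.0, 852.0])
X = np.column_stack([np.ones_like(sizes), sizes])   # 4 x 2 design matrix: [1, size]
theta = np.array([-40.0, 0.25])                     # [theta0, theta1]

predictions = X @ theta                             # all four predictions in one shot
print(predictions)

# Equivalent for-loop version (the approach on the right):
for s in sizes:
    print(-40.0 + 0.25 * s)
```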

Matrix-Matrix Multiplication

Why this matters: it comes up when we talk about the method in linear regression (the normal equation) for solving for the parameters theta0 and theta1 all in one shot, without needing an iterative algorithm like gradient descent.

Example:

We have four houses whose prices we want to predict, but now we have three competing hypotheses, and we want to apply all 3 competing hypotheses to all four houses.
(That is, 3 different hypothesis functions predict the prices for 4 house sizes.)
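A sketch of the same trick in NumPy; the three parameter columns below are made-up competing hypotheses, one per column of the parameter matrix.

```python
# Apply 3 hypotheses h(x) = theta0 + theta1 * x to 4 house sizes at once.
import numpy as np

sizes = np.array([2104.0, 1416.0, 1534.0, 852.0])
X = np.column_stack([np.ones_like(sizes), sizes])   # 4 x 2 design matrix

Theta = np.array([[-40.0, 200.0, -150.0],           # theta0 of each hypothesis
                  [ 0.25,   0.1,    0.4]])          # theta1 of each hypothesis

predictions = X @ Theta                             # 4 x 3: one column per hypothesis
print(predictions)
```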


Advantages of matrix-matrix multiplication

Most popular programming languages have excellent linear algebra libraries that are highly optimized to perform matrix-matrix multiplication very efficiently, including taking advantage of any parallel computation your computer is capable of: when the machine has multiple cores or multiple processors, and even within a single processor there is sometimes parallelism as well, known as SIMD parallelism.
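A rough, illustrative timing sketch (NumPy assumed; the exact numbers depend on your machine and BLAS build) comparing a hand-written Python loop with the library's optimized matrix product:

```python
# Compare a naive per-entry loop against the optimized A @ B.
import time
import numpy as np

n = 300
A = np.random.rand(n, n)
B = np.random.rand(n, n)

t0 = time.time()
C_loop = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        C_loop[i, j] = A[i, :] @ B[:, j]
print("loop:   %.3f s" % (time.time() - t0))

t0 = time.time()
C_lib = A @ B
print("matmul: %.3f s" % (time.time() - t0))

print(np.allclose(C_loop, C_lib))   # True: same result, very different speed
```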


Matrix Multiplication Properties

Not commutative: in general, A x B != B x A.

Associative: (A x B) x C = A x (B x C).
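A quick numeric check of both properties with small random matrices (a NumPy sketch, not from the original notes):

```python
# Matrix multiplication is associative but, in general, not commutative.
import numpy as np

A = np.random.rand(3, 3)
B = np.random.rand(3, 3)
C = np.random.rand(3, 3)

print(np.allclose(A @ B, B @ A))              # almost always False: not commutative
print(np.allclose((A @ B) @ C, A @ (B @ C)))  # True: associative
```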


Identity Matrix

Note: in the identity A * I = I * A = A, the two identity matrices have different dimensions (for an m x n matrix A, the I on the right is n x n and the I on the left is m x m), so a subscript such as I_n is sometimes attached to the matrix to indicate its dimension.
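A small NumPy sketch showing why the dimensions of I differ depending on which side it multiplies from:

```python
# For a 2 x 3 matrix A: A @ I_3 = A and I_2 @ A = A.
import numpy as np

A = np.random.rand(2, 3)
print(np.allclose(A @ np.eye(3), A))   # right multiply: I must be 3 x 3
print(np.allclose(np.eye(2) @ A, A))   # left multiply:  I must be 2 x 2
```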


Matrix Inverse and Transpose

Matrix inverse


Note: only square matrices have inverses. If A is an m x m matrix and has an inverse, then A * A^-1 = A^-1 * A = I.

The intuition, if you want one, is that a matrix that does not have an inverse is, in some sense, "too close to zero".
Singular or degenerate matrices:

Matrices that do not have an inverse are sometimes called singular or degenerate matrices.
The all-zero matrix is an example of a matrix that is singular, or degenerate.
Matrix transpose: if B = A^T, then B_ij = A_ji (the rows of A become the columns of A^T).
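A small NumPy sketch of both operations; the 2 x 2 matrix below is an arbitrary invertible example:

```python
# Inverse (square, non-singular matrices only) and transpose.
import numpy as np

A = np.array([[3.0,  4.0],
              [2.0, 16.0]])

A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))   # A @ A^-1 = I

print(A.T)                                 # transpose: (A.T)[i, j] == A[j, i]

# A singular (degenerate) matrix such as the all-zero matrix has no inverse:
# np.linalg.inv(np.zeros((2, 2))) raises numpy.linalg.LinAlgError.
```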


from:http://blog.csdn.net/pipisorry/article/details/43490965

