MIT 18.06 linear algebra video notes

//
14:15 2014-8-24 Sunday
start "introduction to linear algebra", video I


2 videos * 17 days == 34 videos


14:16 2014-8-24
row picture, column picture


14:30 2014-8-24
linear combination


14:34 2014-8-24
"the big picture"


-----------------------------------
15:43 2014-8-24
introduction to linear algebra, video II


15:43 2014-8-24
elimination


15:57 2014-8-24
Gaussian elimination


15:57 2014-8-24
pivot


15:57 2014-8-24
forward elimination, backward substitution


15:58 2014-8-24
row exchange


16:10 2014-8-24
augmented matrix


16:21 2014-8-24
I want to express the elimination steps as matrices


16:29 2014-8-24
elimination matrix


16:48 2014-8-24
particularly important:


1. matrix * column vector


2. row vector * matrix


16:49 2014-8-24
E  // elimination matrix


P  // permutation matrix


17:11 2014-8-24
row operation, column operation


17:16 2014-8-24
EA = U  // from A to U


A = LU  // from U to A
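
a minimal sketch of these two notes (assuming NumPy; the 2x2 matrix is my own example): an elimination matrix E knocks out the entry under the first pivot, so EA = U, and A = inv(E) * U recovers A = LU

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 8.0]])

E = np.array([[1.0, 0.0],      # E21: subtract 3 * row1 from row2
              [-3.0, 1.0]])

U = E @ A                      # upper triangular: [[1, 2], [0, 2]]
L = np.linalg.inv(E)           # inv(E) is lower triangular, with +3 back
print(U)
print(np.allclose(A, L @ U))   # A = LU -> True
```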


--------------------------------------------------
17:35 2014-8-24
introduction to linear algebra, video III


17:36 2014-8-24
matrix multiplication


17:40 2014-8-24
block multiplication


18:09 2014-8-24
invertible (nonsingular)


18:19 2014-8-24
Gauss-Jordan idea


18:58 2014-8-24
upper triangular matrix // U


19:05 2014-8-24
matrix inverse with Gauss-Jordan method
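
a hedged sketch of the Gauss-Jordan idea (assuming NumPy; the 3x3 A is illustrative): row-reduce the augmented block [A | I] until the left half becomes I, and the right half is inv(A)

```python
import numpy as np

def gauss_jordan_inverse(A):
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])        # [A | I]
    for col in range(n):
        pivot_row = col + np.argmax(np.abs(M[col:, col]))  # row exchange if needed
        M[[col, pivot_row]] = M[[pivot_row, col]]
        M[col] /= M[col, col]                          # scale the pivot to 1
        for row in range(n):                           # eliminate above AND below
            if row != col:
                M[row] -= M[row, col] * M[col]
    return M[:, n:]                                    # right half is inv(A)

A = np.array([[2.0, 1.0, 1.0],
              [4.0, -6.0, 0.0],
              [-2.0, 7.0, 2.0]])
print(np.allclose(gauss_jordan_inverse(A) @ A, np.eye(3)))  # True
```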


/


///
7:03 2014-8-25 Monday
start introduction to linear algebra, video 4


13:45 2014-8-25
A = LU //  EA = U => A = inv(E) * U


13:45 2014-8-25
PA = LU


14:47 2014-8-25
permutation matrix
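
a sketch of PA = LU (assuming SciPy is available; the 2x2 example is mine). Note scipy.linalg.lu returns the factorization as A = P @ L @ U, so the note's PA = LU corresponds to P.T @ A = L @ U:

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[0.0, 1.0],
              [2.0, 3.0]])          # a11 = 0 is no pivot: needs a row exchange
P, L, U = lu(A)                     # A = P @ L @ U
print(np.allclose(P.T @ A, L @ U))  # PA = LU, with P.T as the permutation
```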


14:50 2014-8-25
introduction to linear algebra, video 5


15:06 2014-8-25
vector space


15:06 2014-8-25
subspace


15:06 2014-8-25
symmetric matrix


15:28 2014-8-25
vector space


15:38 2014-8-25
zero vector


15:38 2014-8-25
column vector, column space


15:43 2014-8-25
a subspace coming from a matrix A: C(A)


// column space


16:04 2014-8-25
linear combination of columns => column space


16:06 2014-8-25
how to create a subspace from a matrix? // column space


///
14:43 2014-8-26 Tuesday
finish linear algebra text, chapter 3


------------------------------------------------------
today's goal, video 6, 7


14:44 2014-8-26
introduction to linear algebra, video 6 // chapter 3


vector space, subspace, column space


14:47 2014-8-26
C(A) // column space


N(A) // null space


14:48 2014-8-26
vector space // closed under linear combinations


14:51 2014-8-26
column space of a matrix


15:03 2014-8-26
which right-hand sides allow me to solve this?


15:20 2014-8-26
N(A)   // nullspace


-----------------------------------------------------
16:05 2014-8-26
start introduction to linear algebra, video 7


find nullspace...


rref // reduced row echelon form


16:05 2014-8-26
pivot column, free column


pivot variable, free variable


16:06 2014-8-26
while I'm doing the elimination, I'm 


not changing the nullspace


16:08 2014-8-26
elimination does change the column space!


16:08 2014-8-26
echelon form // U for rectangular matrix


16:13 2014-8-26
rank of matrix:


number of pivots


16:15 2014-8-26
special solution  // linear combination => nullspace solution


particular solution


16:41 2014-8-26
echelon form (staircase, U) => rref // reduced row echelon form


16:47 2014-8-26
I could do elimination upward // echelon form => rref


16:53 2014-8-26
block matrix


17:16 2014-8-26
N // nullspace matrix
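
a sketch of these rref notes (assuming SymPy; the matrix is Strang-style but my own choice): rref exposes the pivot columns and free columns, and each free column gives one special solution — together they make up the nullspace matrix N

```python
import sympy as sp

A = sp.Matrix([[1, 2, 2, 2],
               [2, 4, 6, 8],
               [3, 6, 8, 10]])

R, pivot_cols = A.rref()      # reduced row echelon form + pivot column indices
print(pivot_cols)             # (0, 2): columns 0 and 2 are pivot columns
print(A.nullspace())          # special solutions, one per free column
```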


/
13:50 2014-8-27 Wednesday


introduction to linear algebra, video 8, 9


13:51 2014-8-27
[A b] // augmented matrix


13:54 2014-8-27
Xcomplete = Xparticular + Xnullspace


14:28 2014-8-27
X = Xr + Xn 


// Xr rowspace solution


// Xn nullspace solution
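
a minimal sketch of Xcomplete = Xparticular + Xnullspace (assuming NumPy/SciPy; the 2x3 system is my own toy example): lstsq gives one particular solution, null_space gives a basis for N(A), and any combination of the two still solves Ax = b

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 1.0],
              [2.0, 4.0, 3.0]])
b = np.array([3.0, 7.0])

xp, *_ = np.linalg.lstsq(A, b, rcond=None)   # a particular solution
N = null_space(A)                            # nullspace basis, as columns
x = xp + 5.0 * N[:, 0]                       # particular + any nullspace vector
print(np.allclose(A @ x, b))                 # True: still solves Ax = b
```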


14:29 2014-8-27
full column rank


full row rank


14:54 2014-8-27
square matrix, rectangular matrix


---------------------------------------------
15:23 2014-8-27
start introduction to linear algebra, video 9


15:23 2014-8-27
independence, span, basis, dimension


15:26 2014-8-27
there is something in the nullspace of A, 


besides just the zero vector!


15:28 2014-8-27
basis for vector space


15:56 2014-8-27
dim C(A) = r


dim N(A) = n - r
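
a quick numeric check of this dimension count (assuming NumPy/SciPy; same example matrix as above): rank r gives dim C(A), and the nullspace basis has n - r columns

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 2.0, 2.0],
              [2.0, 4.0, 6.0, 8.0],
              [3.0, 6.0, 8.0, 10.0]])
n = A.shape[1]
r = np.linalg.matrix_rank(A)
print(r, null_space(A).shape[1], n - r)   # 2, 2, 2 -> dim N(A) = n - r
```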


/
9:52 2014-8-28 Thursday
complete introduction to linear algebra text,


chapter on determinants


9:52 2014-8-28
introduction to linear algebra, video 10


the four fundamental subspaces


9:53 2014-8-28
standard basis


9:55 2014-8-28
orthogonal complement


10:11 2014-8-28
dimension of a vector space:


number of basis vectors


10:13 2014-8-28
we have this great fact to establish


10:17 2014-8-28
elimination, row reduction


10:22 2014-8-28
A is "square & invertible" matrix =>


A is rectangular matrix


10:40 2014-8-28
elementary matrix


10:43 2014-8-28
rref == reduced row echelon form


-----------------------------------------
11:04 2014-8-28
start introduction to linear algebra, video 11


11:04 2014-8-28
matrix space


11:14 2014-8-28
solution space


11:38 2014-8-28
rank one matrix


11:45 2014-8-28
How many steps does it take from anybody to anybody?


12:45 2014-8-28
small world graph  // node, edge


///
8:09 2014-8-29 Friday
potential, potential difference, flow


13:40 2014-8-29
graph // node, edge


13:40 2014-8-29
incidence matrix


13:42 2014-8-29
orthogonal vectors,


orthogonal subspace,


orthogonal basis


14:56 2014-8-29
orthogonal vectors => orthogonal subspaces


15:09 2014-8-29
orthogonal complement


15:20 2014-8-29
row space is orthogonal to nullspace


15:20 2014-8-29
fundamental theorem of linear algebra


15:41 2014-8-29
least squares:


Ax = b => A'Ax = A'b // normal equation
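
a hedged sketch of this least-squares note (assuming NumPy; the straight-line fit data is my own toy example): solve the normal equation A'Ax = A'b and compare with np.linalg.lstsq

```python
import numpy as np

t = np.array([0.0, 1.0, 2.0])
b = np.array([6.0, 0.0, 0.0])
A = np.column_stack([np.ones_like(t), t])      # fit b ~ c + d*t

x_normal = np.linalg.solve(A.T @ A, A.T @ b)   # A'Ax = A'b
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_normal, x_lstsq))          # True (best line: 5 - 3t)
```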


---------------------------------------------------
16:07 2014-8-29
start introduction to linear algebra, video 15


projections, projection matrix, least square


16:08 2014-8-29
e == error vector


17:00 2014-8-29
square matrix, 


rectangular matrix


square & invertible matrix


/
18:33 2014/8/30 Saturday
regression, linear regression


18:34 2014/8/30
normal equations




///
8:11 2014/8/31
start introduction to linear algebra, video 17


orthogonal basis, orthogonal matrix, Gram-Schmidt


8:12 2014/8/31
orthonormal


8:12 2014/8/31
orthogonal matrix: Q


8:13 2014/8/31
Q is orthogonal matrix =>


Q has orthonormal columns


8:46 2014/8/31
independent vectors => orthogonal vectors => orthonormal vectors


9:01 2014/8/31
projection, orthogonal, error vector


9:04 2014/8/31
A = QR 


// R is upper triangular


// Q is orthogonal matrix


// A is square matrix with independent columns
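
a sketch of the A = QR note (assuming NumPy; the 3x2 example is mine — qr also handles a rectangular A with independent columns): Q gets orthonormal columns, R is upper triangular

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [1.0, 2.0]])              # independent columns

Q, R = np.linalg.qr(A)                  # reduced QR: Q is 3x2, R is 2x2 upper triangular
print(np.allclose(Q.T @ Q, np.eye(2)))  # orthonormal columns
print(np.allclose(Q @ R, A))            # A = QR
```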


9:33 2014/8/31
start introduction to linear algebra, video 18


determinant of a square matrix


9:36 2014/8/31
the determinant is a number associated with


every square matrix


9:55 2014/8/31
invertible <=> determinant != 0


singular   <=> determinant == 0


9:57 2014/8/31
the 3 basic properties of determinant:


1. det(I) = 1


2. exchange row => reverse sign


3. linear in each row (or column) separately


10:10 2014/8/31
elimination does not change the determinant


10:14 2014/8/31
property 7:


det(triangular matrix) == (d1)(d2)...(dn) // product of diagonal


10:30 2014/8/31
determinant of "triangular matrix" is just


the product of diagonal entries


10:36 2014/8/31
property 9:  ???


det(AB) = det(A) * det(B)


10:46 2014/8/31
property 10:


det(A) == det(A') // transpose does not change det


/



8:02 2014/9/1 Monday
introduction to linear algebra, video 19


determinant


8:03 2014/9/1
How to find determinant of square matrix?


1. big formula


2. cofactor


3. pivots


8:03 2014/9/1
a row exchange reverses the sign


8:41 2014/9/1
Why can we use elimination to get an upper triangular matrix,


then use the product of pivots to get det(A)?


because elimination does not change determinant!
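
a sketch of the product-of-pivots idea (assuming SciPy; the 3x3 A is mine): eliminate to U, multiply the diagonal pivots, and fix the sign for any row exchanges via det(P) = ±1

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[0.0, 2.0, 1.0],
              [1.0, 1.0, 1.0],
              [2.0, 0.0, 3.0]])
P, L, U = lu(A)                       # A = P @ L @ U
det_from_pivots = np.linalg.det(P) * np.prod(np.diag(U))
print(np.isclose(det_from_pivots, np.linalg.det(A)))  # True
```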


--------------------------------------------------------------
9:56 2014/9/1
start introduction to linear algebra, video 20


1. formula for determinant


2. Cramer's rule for x = inv(A) * b


9:57 2014/9/1
cofactor matrix


10:26 2014/9/1
inv(A) =  C' / det(A) // C: cofactor matrix


10:27 2014/9/1
How to find the inv(A)?


1. Gauss-Jordan method


2. formula: inv(A) =  C' / det(A)


10:35 2014/9/1
the validity of the formula:


just check, C'A = det(A) I 
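
a sketch of inv(A) = C' / det(A) (assuming NumPy; the 3x3 A is my own example): build the cofactor matrix C from minors — np.delete drops a row/column — and check C'A = det(A) * I

```python
import numpy as np

def cofactor_matrix(A):
    n = A.shape[0]
    C = np.empty_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])
C = cofactor_matrix(A)
print(np.allclose(C.T @ A, np.linalg.det(A) * np.eye(3)))        # C'A = det(A) I
print(np.allclose(np.linalg.inv(A), C.T / np.linalg.det(A)))     # the formula
```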


10:35 2014/9/1
why det(A*B) == det(A) * det(B)?


why Cramer's rule?


///
7:24 2014/9/2 Tuesday
start introduction to linear algebra, video 21


eigenvalues


10:57 2014/9/2
det(A - λI) = 0


trace = sum of eigenvalues


10:58 2014/9/2
eigenvalues, eigenvectors


10:58 2014/9/2
Ax = λx // x: eigenvector, λ: eigenvalue


Ax is parallel to x; Ax is some multiple λ of x!


11:09 2014/9/2
we look for special vectors!


11:09 2014/9/2
following the eigenvector direction


11:22 2014/9/2
projection matrix


11:27 2014/9/2
sum of the diagonal values == sum of the eigenvalues


a11 + a22 + ... + ann == λ1 + λ2 + ... + λn


// trace


11:45 2014/9/2
det(A-λI) = 0 


// characteristic equation, eigenvalue equation
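
a quick numeric check of these facts (assuming NumPy; the 2x2 A is mine): eig solves det(A - λI) = 0, the trace equals the sum of the eigenvalues, and Ax = λx holds for each eigenvector

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
lam, X = np.linalg.eig(A)                          # eigenvalues 4 and 2
print(lam)
print(np.isclose(np.trace(A), lam.sum()))          # a11 + a22 = λ1 + λ2
print(np.allclose(A @ X[:, 0], lam[0] * X[:, 0]))  # Ax = λx
```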



7:56 2014/9/3 Wednesday
introduction to linear algebra, video 22


diagonalizing a matrix


powers of A / difference equation u(k+1) = A * u(k)


7:57 2014/9/3
diagonalizable


8:38 2014/9/3
distinct eigenvalues => independent eigenvectors


8:54 2014/9/3
u(k+1) = A * u(k) // A: transition matrix


8:59 2014/9/3
powers of A


8:59 2014/9/3
system of differential equations


system of equations


8:59 2014/9/3
to really solve, write U0 as a combination


of eigenvectors!


9:02 2014/9/3
"following the eigenvector direction!"


9:02 2014/9/3
eigenvalues == poles ???


9:33 2014/9/3
dominant pole


9:33 2014/9/3
we're doing problems that are evolving,


we're doing "dynamic", things evolving with time,


eigenvalues are crucial numbers!


9:36 2014/9/3
find the eigenvalue & eigenvector of A,


break U0 as combination of eigenvectors of A
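
a sketch of this recipe (assuming NumPy; the Fibonacci-style matrix is my own example): diagonalize A = S Λ inv(S), write u0 in the eigenvector basis, then u(k) = A^k u0 = S Λ^k inv(S) u0 — each component just follows its own eigenvector direction

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 1.0]])              # Fibonacci-style transition matrix
u0 = np.array([1.0, 0.0])

lam, S = np.linalg.eig(A)               # A = S @ diag(lam) @ inv(S)
c = np.linalg.solve(S, u0)              # u0 as a combination of eigenvectors
k = 10
uk = S @ (lam ** k * c)                 # S Λ^k c
print(np.allclose(uk, np.linalg.matrix_power(A, k) @ u0))  # True
```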


///
7:46 2014-09-04
start linear algebra, video 23


differential equations


7:46 2014-09-04
exp(At) ???


7:47 2014-09-04
eigenvalues <=> poles


8:14 2014-09-04
the whole point of eigenvectors is to uncouple


8:38 2014-09-04
by uncoupling it, I mean to diagonalize it


8:40 2014-09-04
it's a system of equations, but they're 


not connected


8:44 2014-09-04
matrix exponential: exp(At)


8:47 2014-09-04
power series  // infinite series, series expansion
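
a hedged sketch of exp(At) (assuming SciPy; the stable 2x2 system is mine): the power series I + At + (At)²/2! + ... is what scipy.linalg.expm computes, and diagonalization gives the same answer as S exp(Λt) inv(S) when A is diagonalizable

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])            # eigenvalues -1 and -2: stable system
t = 0.5

lam, S = np.linalg.eig(A)
via_eig = S @ np.diag(np.exp(lam * t)) @ np.linalg.inv(S)
print(np.allclose(expm(A * t), via_eig))   # True
```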


8:48 2014-09-04
complex plane


--------------------------------------------
15:52 2014-09-04 Thursday
Harvard statistics video I


15:52 2014-09-04
pattern recognition skills


15:52 2014-09-04
sample space


16:47 2014-09-04
experiment


16:47 2014-09-04
event


16:48 2014-09-04
an event is a subset of a sample space


16:51 2014-09-04
permutation, combination // counting


17:31 2014-09-04
multiplication rule


17:31 2014-09-04
sampling


---------------------------------------------
17:45 2014-09-04
mit multivariable calculus, video I


vector


17:45 2014-09-04
law of cosines


18:14 2014-09-04
detect orthogonality



7:31 2014-09-05 Friday
start linear algebra, video 24


Markov matrix


8:41 2014-09-05
2 properties of Markov matrix:


1. all entries >= 0


2. all columns add to 1


9:01 2014-09-05
eigenvalues:


1. stability    // all |λ| < 1


2. steady state // λ = 1 (other |λ| < 1)


3. blow up      // some |λ| > 1


9:03 2014-09-05
eigenvalues of A == eigenvalues of A'


9:25 2014-09-05
state transition matrix


9:39 2014-09-05
what can you tell me about the population in k steps?


9:40 2014-09-05
eigenvalue, eigenvector, Markov matrix
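
a sketch of the Markov facts above (assuming NumPy; the 2-state matrix is my own example): columns sum to 1, so λ = 1 is an eigenvalue, and the λ = 1 eigenvector (scaled to sum to 1) is the steady state that A^k u0 approaches

```python
import numpy as np

A = np.array([[0.9, 0.2],
              [0.1, 0.8]])              # columns add to 1, entries >= 0

lam, X = np.linalg.eig(A)
i = np.argmin(np.abs(lam - 1.0))        # pick the λ = 1 eigenvector
steady = X[:, i] / X[:, i].sum()        # normalize so components sum to 1
u0 = np.array([0.0, 1.0])
print(steady)                                 # [2/3, 1/3]
print(np.linalg.matrix_power(A, 100) @ u0)    # approaches the steady state
```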


9:45 2014-09-05
Fourier series projections


9:53 2014-09-05
projection with orthonormal basis:


q1, q2, ..., qn


9:53 2014-09-05
v = x1 * q1 + x2 * q2 + ... + xn * qn


9:55 2014-09-05
but what is the dot product of functions?
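
a sketch answering that question (assuming NumPy; the grid size is my choice): replace the dot product's sum by an integral of the product over the interval — cos and sin come out orthogonal on [-π, π]

```python
import numpy as np

N = 4096
x = np.linspace(-np.pi, np.pi, N, endpoint=False)   # periodic grid
dx = 2 * np.pi / N

def dot(f, g):
    return np.sum(f(x) * g(x)) * dx   # (f, g) = ∫ f(x) g(x) dx, approximately

print(np.isclose(dot(np.cos, np.sin), 0.0, atol=1e-10))  # orthogonal
print(np.isclose(dot(np.cos, np.cos), np.pi))            # ||cos||² = π
```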



7:14 2014-09-06 Saturday
symmetric matrix, positive definite matrix


7:15 2014-09-06
What is special about symmetric matrix?  // A == A'


1. The eigenvalues are REAL


2. The eigenvectors are ORTHOGONAL  // can be chosen


7:22 2014-09-06
the usual case: A == SΛinv(S)


symmetric case: A = QΛQ'  // Q is orthogonal matrix
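
a sketch of the symmetric case (assuming NumPy; the 2x2 A is mine): eigh returns real eigenvalues and orthonormal eigenvectors, so A = QΛQ' with inv(Q) = Q'

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])                    # A == A'
lam, Q = np.linalg.eigh(A)                    # real λ, orthonormal columns
print(lam)                                    # [1, 3] -- real
print(np.allclose(Q.T @ Q, np.eye(2)))        # Q'Q = I
print(np.allclose(Q @ np.diag(lam) @ Q.T, A)) # A = QΛQ'
```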


7:28 2014-09-06
Q: orthogonal matrix // its columns form an orthonormal basis


7:29 2014-09-06
 A = QΛQ'


// spectral theorem


// principal axis theorem


7:34 2014-09-06
Why real eigenvalues? // symmetric matrix


15:07 2014-09-06
a symmetric matrix is a combination of perpendicular


projection matrices


15:07 2014-09-06
for a symmetric matrix (A == A'), the signs


of the pivots are the same as the signs of the eigenvalues


15:12 2014-09-06
det(A) == product of pivots (no row exchanges) == product of eigenvalues


15:15 2014-09-06
What is a positive definite matrix?


they're symmetric matrices, with three equivalent tests:


// all eigenvalues are positive


// all pivots are positive


// all leading subdeterminants are positive
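
a sketch of the three tests on one example (assuming NumPy; the tridiagonal A is my own choice): positive eigenvalues, positive leading subdeterminants, and — as a practical stand-in for the pivot test — a successful Cholesky factorization

```python
import numpy as np

A = np.array([[2.0, -1.0, 0.0],
              [-1.0, 2.0, -1.0],
              [0.0, -1.0, 2.0]])       # symmetric

print(np.all(np.linalg.eigvalsh(A) > 0))                       # eigenvalues
print(all(np.linalg.det(A[:k, :k]) > 0 for k in range(1, 4)))  # subdeterminants
try:
    np.linalg.cholesky(A)              # succeeds exactly when A is pos. def.
    print(True)
except np.linalg.LinAlgError:
    print(False)
```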


15:18 2014-09-06
start linear algebra, video 26


complex matrix, DFT, FFT


16:51 2014-09-06
Hermitian matrix


17:22 2014-09-06
real => complex analogues:


1. symmetric => Hermitian


2. orthogonal => unitary


17:29 2014-09-06
DFT matrix


17:32 2014-09-06
Fourier matrix


17:42 2014-09-06
matrix factorization


//
7:57 2014-09-07
linear algebra, video 27


positive definite matrix, test for minimum...


7:58 2014-09-07
positive definite: x'Ax > 0




/



10:28 2014-09-08 Monday
introduction to linear algebra, video 28


similar matrix


10:28 2014-09-08
positive definite matrix:


x'Ax > 0 except for x = 0


10:38 2014-09-08
where does positive definite matrix come from?


// least square Ax = b => 


// A'Ax = A'b, normal equation


10:38 2014-09-08
A'A is positive definite, just see


x'A'Ax == (Ax)'(Ax) == |Ax|^2 >= 0


10:46 2014-09-08
A'A is square, symmetric, positive definite // when A has independent columns


10:51 2014-09-08
SVD == Singular Value Decomposition


10:52 2014-09-08
singular value


10:52 2014-09-08
similar matrix:


A & B are both n by n matrix,


17:46 2014-09-08
similar matrices have the same eigenvalues


17:46 2014-09-08
A = UΣV'  


// A is any rectangular matrix


// Σ is diagonal matrix


// U & V are orthogonal matrix


18:22 2014-09-08
AV = UΣ // U, V orthogonal basis
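
a sketch of A = UΣV' (assuming NumPy; the 2x2 A is a standard Strang-style example, my own choice): U and V have orthonormal columns, Σ is diagonal with the singular values

```python
import numpy as np

A = np.array([[4.0, 4.0],
              [-3.0, 3.0]])
U, s, Vt = np.linalg.svd(A)              # s holds the singular values
Sigma = np.diag(s)
print(s)                                 # ~[5.657, 4.243] = [√32, √18]
print(np.allclose(U @ Sigma @ Vt, A))    # A = UΣV'
print(np.allclose(A @ Vt.T, U @ Sigma))  # AV = UΣ
```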


18:36 2014-09-08
start Harvard Gambler's Ruin and Random variables, video 7


19:25 2014-09-08
random variables & their distribution


19:27 2014-09-08
Gambler's ruin


19:28 2014-09-08
LOTP == Law Of Total Probability


19:42 2014-09-08
PMF == Probability Mass Function


PDF == Probability Density Function


20:30 2014-09-08
SVD for symmetric positive definite matrix:


A = QΛQ'




/


13:31 2014-09-11
linear algebra, video 30,  linear transformation


13:32 2014-09-11
the projection is a linear transformation


13:46 2014-09-11
rotation transformation


13:56 2014-09-11
rotation is also a "linear transformation"


13:57 2014-09-11
linear transformation with coordinates // matrix


14:08 2014-09-11
find the matrix behind it (the linear transformation)


14:10 2014-09-11
coordinates come from basis


14:30 2014-09-11
component == coordinate * basis vector


14:32 2014-09-11
input basis, output basis


14:40 2014-09-11
basis, coordinate, linear transformation


14:51 2014-09-11
A * input coordinate == output coordinate


// matrix does the job!


14:52 2014-09-11
A * x = λ * x // eigenvectors give good coordinates!


15:06 2014-09-11
standard basis => eigenvector basis


15:08 2014-09-11
input space => output space


input basis => output basis


input coordinate => output coordinate


15:12 2014-09-11
the linear transformation which takes a derivative
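
a sketch of that idea (assuming NumPy; the bases are my own choice): with input basis {1, x, x²} and output basis {1, x}, d/dx becomes the 2x3 matrix below — it acts on coordinates, not on functions

```python
import numpy as np

D = np.array([[0.0, 1.0, 0.0],    # d/dx(1) = 0, d/dx(x) = 1, d/dx(x²) = 2x
              [0.0, 0.0, 2.0]])

p = np.array([5.0, 3.0, 2.0])     # coordinates of 5 + 3x + 2x²
print(D @ p)                      # [3, 4] -> the derivative is 3 + 4x
```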


15:27 2014-09-11
inverse matrix gives the inverse of the linear transformation


----------------------------------------------------------
15:36 2014-09-11
start linear algebra, video 31, the last video!


change of basis


15:38 2014-09-11
linear transformation <=> matrix


16:02 2014-09-11
image compression


16:02 2014-09-11
lossless compression, lossy compression


16:07 2014-09-11
JPEG // change of basis, standard basis => Fourier basis


16:13 2014-09-11
standard basis => better basis


16:13 2014-09-11
what basis to choose?  // image compression


16:15 2014-09-11
JPEG choose "Fourier basis"


16:16 2014-09-11
you have to use prediction & correction


16:38 2014-09-11
wavelet basis


16:40 2014-09-11
Fourier basis => wavelet basis


16:41 2014-09-11
standard basis => wavelet basis


16:47 2014-09-11
a good basis has a nice fast inverse!


16:48 2014-09-11
p = Wc  =>


c = inv(W) * p


16:49 2014-09-11
FFT == Fast Fourier Transform


16:49 2014-09-11
the nice property of orthogonal matrix:


inv(Q) = Q' // inverse is just transpose
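
a sketch tying the last two notes together (assuming NumPy; the 4x4 Haar-style wavelet basis is a standard small example, my own choice here): with an orthogonal basis matrix W, recovering c from p = Wc needs no elimination, since inv(W) is just W'

```python
import numpy as np

W = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 1.0, -1.0, 0.0],
              [1.0, -1.0, 0.0, 1.0],
              [1.0, -1.0, 0.0, -1.0]])
W = W / np.linalg.norm(W, axis=0)   # normalize columns -> orthonormal basis

p = np.array([4.0, 2.0, 5.0, 7.0])
c = W.T @ p                         # c = inv(W) * p, but inv(W) = W'
print(np.allclose(W @ c, p))        # reconstruct p from the wavelet basis
```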


16:53 2014-09-11
JPEG     // Fourier basis


JPEG2000 // wavelet basis


17:02 2014-09-11
change of basis


17:05 2014-09-11
I have my vector in one basis, and I want to 


change it to another one.


17:06 2014-09-11
eigenvector basis


17:28 2014-09-11
following the eigenvector direction!


---------------------------------------------------------------