18.065 LEC3. Orthonormal Columns in Q Give Q′Q = I

Exercise solutions for the MIT course *Matrix Methods in Data Analysis, Signal Processing, and Machine Learning* (MIT Course Number: 18.065).

LEC3. Orthonormal Columns in Q Give Q′Q = I

Problems of Lecture 3 (from textbook Section I.5)

2 Draw unit vectors $u$ and $v$ that are not orthogonal. Show that $w = v - u(u^Tv)$ is orthogonal to $u$ (and add $w$ to your picture).

4 Key property of every orthogonal matrix: $\|Qx\|^2 = \|x\|^2$ for every vector $x$. More than this, show that $(Qx)^T(Qy) = x^Ty$ for every pair of vectors $x$ and $y$. So lengths and angles are not changed by $Q$. Computations with $Q$ never overflow!

6 A permutation matrix has the same columns as the identity matrix (in some order). Explain why this permutation matrix and every permutation matrix is orthogonal:

$$P = \begin{bmatrix} 0&1&0&0 \\ 0&0&1&0 \\ 0&0&0&1 \\ 1&0&0&0 \end{bmatrix} \text{ has orthonormal columns, so } P^TP = \underline{\;***\;} \text{ and } P^{-1} = \underline{\;***\;}.$$
When a matrix is symmetric or orthogonal, it will have orthogonal eigenvectors.
This is the most important source of orthogonal vectors in applied mathematics.
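As a quick numerical illustration of that claim (not from the textbook; a minimal sketch assuming numpy is available), the eigenvector matrix returned by `numpy.linalg.eigh` for a symmetric matrix is itself orthogonal:

```python
import numpy as np

# A small numerical illustration (not from the textbook): for a symmetric matrix S,
# eigh returns orthonormal eigenvectors, so the eigenvector matrix V satisfies V^T V = I.
np.random.seed(0)
A = np.random.randn(4, 4)
S = (A + A.T) / 2                       # symmetrize

eigvals, V = np.linalg.eigh(S)          # columns of V are orthonormal eigenvectors
print(np.allclose(V.T @ V, np.eye(4)))  # True
```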

Solutions to Lecture 3

2 $v$ is separated into a piece $u(u^Tv)$ in the direction of $u$ and the remaining piece $w = v - u(u^Tv)$ perpendicular to $u$. Check: $u^Tw = u^Tv - (u^Tu)(u^Tv) = 0$ because $u^Tu = 1$ for the unit vector $u$.
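A short numerical check (illustrative only, using numpy) confirms that $w$ is orthogonal to $u$ for a randomly chosen pair of unit vectors:

```python
import numpy as np

# Numerical check of Problem 2 (illustrative): w = v - u(u^T v) is orthogonal to u.
np.random.seed(1)
u = np.random.randn(3)
u = u / np.linalg.norm(u)      # unit vector, so u^T u = 1
v = np.random.randn(3)
v = v / np.linalg.norm(v)      # unit vector, generally not orthogonal to u

w = v - u * (u @ v)            # subtract the component of v along u
print(u @ w)                   # ~0 (orthogonal up to rounding error)
```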

4 Check $(Qx)^T(Qy) = x^TQ^TQy = x^Ty$: angles are preserved when all vectors are multiplied by $Q$. Remember $x^Ty = \|x\|\,\|y\|\cos\theta = (Qx)^T(Qy)$: same $\theta$!
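The same identity can be verified numerically (a small sketch, assuming an orthogonal $Q$ taken from a QR factorization):

```python
import numpy as np

# Numerical check of Problem 4 (illustrative): an orthogonal Q preserves
# inner products, lengths, and therefore angles.
np.random.seed(2)
Q, _ = np.linalg.qr(np.random.randn(4, 4))   # Q from QR has orthonormal columns

x = np.random.randn(4)
y = np.random.randn(4)
print(np.isclose((Q @ x) @ (Q @ y), x @ y))                   # inner product unchanged
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))   # length unchanged
```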

6 Every permutation matrix has unit vectors in its columns (a single 1 and $(n-1)$ zeros). Those columns are orthogonal because their 1's are in different positions. $P^TP = \underline{I}$ and $P^{-1} = \underline{P^T}$.
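For the specific $P$ in Problem 6, the conclusion can be checked directly (illustrative sketch in numpy):

```python
import numpy as np

# Numerical check of Problem 6 (illustrative): the permutation matrix P from the
# problem has orthonormal columns, so P^T P = I and P^{-1} = P^T.
P = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0]])
print(np.array_equal(P.T @ P, np.eye(4, dtype=int)))   # True
print(np.allclose(np.linalg.inv(P), P.T))              # True: P^{-1} = P^T
```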
