MTH 496 – Machine Learning
Homework 2
Due date: Feb. 25th, 2019
(2 problems/1 page)
1 Handwritten Homework
Note: All problems in this section require handwritten answers.
Problem 1.1 (10pts). Given training data {x^(i), y^(i)} with i = 1, 2, ..., M, where x^(i) ∈ R^N and y^(i) ∈ R. Consider a linear regression model with the predictor and loss defined in the lecture notes. Calculate and simplify the gradient of the loss function.
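As a sanity check for your derivation, here is a sketch under the standard mean-squared-error formulation (an assumption; the lecture notes may scale the loss differently, e.g. without the 1/2M factor, which changes the gradient only by a constant):

```latex
h_w(x) = w^\top x, \qquad
J(w) = \frac{1}{2M}\sum_{i=1}^{M}\bigl(w^\top x^{(i)} - y^{(i)}\bigr)^2
\quad\Longrightarrow\quad
\nabla_w J(w) = \frac{1}{M}\sum_{i=1}^{M}\bigl(w^\top x^{(i)} - y^{(i)}\bigr)\,x^{(i)}
```

Each summand follows from the chain rule: the outer square contributes the residual, and the inner linear map contributes x^(i).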
Problem 1.2 (10pts). Given training data {x^(i), y^(i)} with i = 1, 2, ..., M, where x^(i) ∈ R^N and y^(i) ∈ {0, 1}. Consider a logistic regression model with the predictor and loss defined in the lecture notes. Calculate and simplify the gradient of the loss function.
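For reference, a sketch assuming the sigmoid predictor and average cross-entropy loss (again an assumption about the lecture's exact formulation):

```latex
h_w(x) = \sigma(w^\top x) = \frac{1}{1 + e^{-w^\top x}}, \qquad
J(w) = -\frac{1}{M}\sum_{i=1}^{M}\Bigl[\,y^{(i)}\log h_w(x^{(i)}) + \bigl(1 - y^{(i)}\bigr)\log\bigl(1 - h_w(x^{(i)})\bigr)\Bigr]
```

Using the identity \(\sigma'(z) = \sigma(z)\bigl(1 - \sigma(z)\bigr)\), the gradient simplifies to the same residual-times-input form as in linear regression:

```latex
\nabla_w J(w) = \frac{1}{M}\sum_{i=1}^{M}\bigl(h_w(x^{(i)}) - y^{(i)}\bigr)\,x^{(i)}
```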
2 Programming Homework
Note: Write your code in Jupyter notebook format. Put each problem in a separate notebook and submit all of them via the dropbox in D2L. Machine learning libraries in Python packages are not allowed.
Problem 2.1 (30pts). Given training data X_iris_train.csv (feature values) and y_iris_train.csv (labels), and test data X_iris_test.csv (feature values) and y_iris_test.csv (labels). The file Iris_feature_description.csv describes the meaning of each column in the data set.
a) Program a logistic regression model to predict the labels in the test data. Explicitly write down the representation of the model's predictor (note: type your formulation in the notebook).
b) Calculate the accuracy of your model.
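A minimal from-scratch sketch of parts (a) and (b) using only NumPy (allowed, since it is not a machine learning library). This assumes binary labels in {0, 1} as in Problem 1.2; if the provided Iris labels have three classes, extend this with a one-vs-rest scheme. Function names and hyperparameters here are illustrative, not prescribed by the assignment:

```python
import numpy as np

def sigmoid(z):
    """Logistic function sigma(z) = 1 / (1 + e^{-z})."""
    return 1.0 / (1.0 + np.exp(-z))

def train_logreg(X, y, lr=0.1, epochs=1000):
    """Fit w, b by batch gradient descent on the cross-entropy loss.

    X: (M, N) feature matrix; y: (M,) labels in {0, 1}.
    """
    M, N = X.shape
    w = np.zeros(N)
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)        # predicted probabilities, shape (M,)
        grad_w = X.T @ (p - y) / M    # gradient w.r.t. weights
        grad_b = np.mean(p - y)       # gradient w.r.t. bias
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def accuracy(X, y, w, b):
    """Fraction of examples whose thresholded prediction matches the label."""
    preds = (sigmoid(X @ w + b) >= 0.5).astype(int)
    return np.mean(preds == y)
```

In the notebook you would load the CSV files (e.g. with `np.loadtxt`), call `train_logreg` on the training split, and report `accuracy` on the test split.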
