
Date: 2019-02-26 09:24

Homework 2

MTH 496 – Machine Learning

Due date: Feb. 25th, 2019

(2 problems/1 page)

1 Handwritten Homework

Note: All problems in this section require handwritten answers.

Problem 1.1 (10pts). Given training data {x^(i), y^(i)} with i = 1, 2, ..., M, where x^(i) ∈ R^N and y^(i) ∈ R, consider a linear regression model with the predictor and loss defined in the lecture notes. Calculate and simplify the gradient of the loss function.
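The lecture's exact definitions are not reproduced on this page; assuming the standard linear predictor and mean-squared-error loss, the gradient works out as below (a sketch of the expected form, not a substitute for the handwritten derivation):

```latex
% Assumed predictor and loss:
%   h_w(x) = w^\top x, \qquad
%   L(w) = \frac{1}{2M} \sum_{i=1}^{M} \bigl( w^\top x^{(i)} - y^{(i)} \bigr)^2
%
% Differentiating term by term and applying the chain rule gives:
\nabla_w L(w) \;=\; \frac{1}{M} \sum_{i=1}^{M} \bigl( w^\top x^{(i)} - y^{(i)} \bigr)\, x^{(i)}
```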

Problem 1.2 (10pts). Given training data {x^(i), y^(i)} with i = 1, 2, ..., M, where x^(i) ∈ R^N and y^(i) ∈ {0, 1}, consider a logistic regression model with the predictor and loss defined in the lecture notes. Calculate and simplify the gradient of the loss function.
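Again assuming the standard formulation (a sigmoid predictor with cross-entropy loss, which may differ in notation from the lecture notes), the gradient simplifies to the same form as in linear regression:

```latex
% Assumed predictor (sigmoid) and cross-entropy loss:
%   h_w(x) = \sigma(w^\top x) = \frac{1}{1 + e^{-w^\top x}}
%   L(w) = -\frac{1}{M} \sum_{i=1}^{M}
%          \Bigl[ y^{(i)} \log h_w(x^{(i)}) + \bigl(1 - y^{(i)}\bigr) \log\bigl(1 - h_w(x^{(i)})\bigr) \Bigr]
%
% Using \sigma'(z) = \sigma(z)\bigl(1 - \sigma(z)\bigr), the log terms cancel and:
\nabla_w L(w) \;=\; \frac{1}{M} \sum_{i=1}^{M} \bigl( h_w(x^{(i)}) - y^{(i)} \bigr)\, x^{(i)}
```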

2 Programming Homework

Note: Write your code in Jupyter notebook format. Put each problem in a separate notebook and submit all of them via the dropbox in D2L. You are not allowed to use machine learning libraries from Python packages.

Problem 2.1 (30pts). You are given training data X_iris_train.csv (feature values) and y_iris_train.csv (labels), and test data X_iris_test.csv (feature values) and y_iris_test.csv (labels). The file Iris_feature_description.csv describes the meaning of each column in the data set.

a) Program a logistic regression model to predict the labels in the test data. Explicitly write down the representation of the model's predictor (note: type your formulation in the notebook).

b) Calculate the accuracy of your model.
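A from-scratch logistic regression fits the assignment's constraints, since only general-purpose libraries like NumPy are used. The sketch below uses batch gradient descent on the cross-entropy loss and reports accuracy; the synthetic data at the bottom stands in for the CSV files (replace it with `np.loadtxt` on the actual files, whose column layout is described in Iris_feature_description.csv):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic(X, y, lr=0.1, epochs=1000):
    """Batch gradient descent on the cross-entropy loss."""
    # Prepend a column of ones so w[0] acts as the intercept term.
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = sigmoid(Xb @ w)              # predicted probabilities h_w(x)
        grad = Xb.T @ (p - y) / len(y)   # gradient from Problem 1.2
        w -= lr * grad
    return w

def predict(w, X):
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    return (sigmoid(Xb @ w) >= 0.5).astype(int)

# Synthetic stand-in for the CSV data: two features, linearly separable labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

w = train_logistic(X, y)
acc = (predict(w, X) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

For the actual submission, train on the `*_train` files and compute accuracy on the `*_test` files with the same `predict` helper.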

