
Date: 2019-02-19 10:53

2019 Midterm due at Noon on Feb 23

Please submit the solutions by email to me or place them in my mailbox on the second floor of Old Chemistry.

Pr. 1.1 Consider the data set in Lab 1. This is the Advertisement.csv data set.

1. Compute the leave-one-out cross validation error for a Gaussian process regression model as well as a Bayesian linear regression model.

2. Plot the predictive posterior distribution when observations 1, 50, 100, 150 are respectively left out of the training set and you are asked to predict their response.

3. Use a bootstrap procedure to output confidence intervals when observations 1, 50, 100, 150 are respectively left out of the training set and you are asked to use ordinary least squares as your regression method.
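A minimal sketch of part 1 with scikit-learn, assuming the Advertisement.csv features and response have been loaded into `X` and `y`; synthetic stand-in data is used here only so the snippet runs on its own:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.linear_model import BayesianRidge
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Synthetic stand-in for the Advertisement.csv columns; replace X, y
# with the features/response loaded from the Lab 1 data.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(60, 3))
y = X @ np.array([0.4, -0.2, 0.1]) + rng.normal(scale=0.5, size=60)

loo = LeaveOneOut()
for name, model in [("GP regression", GaussianProcessRegressor(alpha=0.25)),
                    ("Bayesian linear regression", BayesianRidge())]:
    # One fold per observation; averaging the per-fold squared errors
    # gives the LOOCV error for that model.
    scores = cross_val_score(model, X, y, cv=loo,
                             scoring="neg_mean_squared_error")
    print(f"{name}: LOOCV MSE = {-scores.mean():.4f}")
```

The same `LeaveOneOut` splits can drive parts 2 and 3: refit on each training fold, then plot the predictive posterior (part 2) or resample the training fold with replacement before each OLS fit to form bootstrap confidence intervals (part 3).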

Pr. 1.2 Write out the EM update steps for a mixture of multinomials model. Specifically, consider the following likelihood

    f(x_1, \ldots, x_n; \pi, \{\theta_1, \ldots, \theta_7\}) = \prod_{i=1}^{n} \left[ \sum_{k=1}^{7} \pi_k f(x_i; \theta_k) \right]

where x takes values 1, \ldots, 4 (there are four categories) and

    f(x = c; \theta) = \theta_c, \qquad \theta_c \ge 0, \qquad \sum_c \theta_c = 1.
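The updates the problem asks you to derive can be sketched as a generic EM loop for a K-component categorical mixture (categories coded 0..C-1 rather than 1..4; the random data below is only so the snippet runs):

```python
import numpy as np

def em_mixture_multinomial(x, K=7, C=4, n_iter=50, seed=0):
    """EM for a mixture of categoricals: x[i] in {0, ..., C-1}."""
    rng = np.random.default_rng(seed)
    n = len(x)
    pi = np.full(K, 1.0 / K)                   # mixing weights pi_k
    theta = rng.dirichlet(np.ones(C), size=K)  # theta[k, c] = P(x = c | k)
    for _ in range(n_iter):
        # E-step: responsibilities gamma[i, k] ∝ pi_k * theta[k, x_i]
        gamma = pi * theta[:, x].T             # shape (n, K)
        gamma /= gamma.sum(axis=1, keepdims=True)
        # M-step: pi_k = mean responsibility; theta[k, c] = weighted
        # frequency of category c under component k
        Nk = gamma.sum(axis=0)
        pi = Nk / n
        counts = np.array([gamma[x == c].sum(axis=0) for c in range(C)]).T
        theta = counts / Nk[:, None]
    return pi, theta

x = np.random.default_rng(1).integers(0, 4, size=200)
pi, theta = em_mixture_multinomial(x)
print(np.allclose(theta.sum(axis=1), 1.0))  # each theta_k is a distribution
```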

Pr. 1.3 Given the classification data set in Lab 2, run regularized logistic regression versus SVM and compare classification accuracy on a test-train split.
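A minimal comparison sketch, using `make_classification` as a stand-in for the Lab 2 data set:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in; load the real Lab 2 features/labels here instead.
X, y = make_classification(n_samples=300, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0)

# In both models C is the inverse regularization strength.
logit = LogisticRegression(C=1.0, max_iter=1000).fit(X_tr, y_tr)
svm = SVC(C=1.0, kernel="rbf").fit(X_tr, y_tr)
print("logistic accuracy:", logit.score(X_te, y_te))
print("SVM accuracy:     ", svm.score(X_te, y_te))
```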

Pr. 1.4 Show that the EM algorithm does not decrease the likelihood value at each step.
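The standard route to this result goes through the evidence lower bound; a sketch in generic EM notation:

```latex
\log p(x;\theta)
  = \mathcal{L}(q,\theta)
  + \mathrm{KL}\!\big(q(z)\,\big\|\,p(z\mid x;\theta)\big),
\qquad
\mathcal{L}(q,\theta) = \sum_z q(z)\log\frac{p(x,z;\theta)}{q(z)}.
```

The E-step at \theta^t sets q(z) = p(z \mid x; \theta^t), making the KL term zero so \mathcal{L}(q, \theta^t) = \log p(x; \theta^t); the M-step chooses \theta^{t+1} to maximize \mathcal{L}(q, \cdot). Since KL \ge 0,

```latex
\log p(x;\theta^{t+1})
  \;\ge\; \mathcal{L}(q,\theta^{t+1})
  \;\ge\; \mathcal{L}(q,\theta^{t})
  \;=\; \log p(x;\theta^{t}).
```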

Pr. 1.5 Sketch how the Least Angle Regression problem implements a form of sparse regression.
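One way to see the sparsity in practice is scikit-learn's `LassoLars`, which follows the Least Angle Regression path with the lasso modification; the synthetic design below has only 3 informative features out of 20, so most fitted coefficients land exactly at zero:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoLars

# Synthetic stand-in data: 3 informative features, 17 pure noise.
X, y = make_regression(n_samples=100, n_features=20, n_informative=3,
                       noise=1.0, random_state=0)
model = LassoLars(alpha=1.0).fit(X, y)
print("nonzero coefficients:", np.count_nonzero(model.coef_))
```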

Pr. 1.7 Run the sklearn.mixture program with 8 components. Output the probability assignment for each observation. Explain the difference between a hard assignment versus a soft assignment for each observation.
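A sketch of the requested run with `sklearn.mixture.GaussianMixture`, on synthetic data in place of the lab observations:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic 2-D stand-in data: 8 clusters of 40 points each.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(40, 2))
               for c in range(8)])

gm = GaussianMixture(n_components=8, random_state=0).fit(X)
soft = gm.predict_proba(X)  # soft assignment: distribution over components
hard = gm.predict(X)        # hard assignment: single most likely component
print(soft.shape)           # (320, 8); each row sums to 1
print(np.array_equal(hard, soft.argmax(axis=1)))  # hard = argmax of soft
```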

