Assignment
Machine Learning
Due: March 26th at 11:59PM
Problem 1 [30%]
In this problem, we will establish some basic properties of vectors and linear functions.
1. The L2 norm of a vector measures the length (size) of a vector. The norm for a vector x of size n is defined as:
||x||_2 = sqrt( sum_{i=1}^{n} x_i^2 )
Show that the squared norm can be expressed as the quadratic expression ||x||_2^2 = x^T x.
2. Let a be a vector of size n and consider the linear function f(x) = a^T x. Show that the gradient of f is: ∇_x f(x) = a.
3. Let A be a symmetric matrix of size n × n and consider the quadratic function f(x) = x^T A x. Show that the gradient of f is: ∇_x f(x) = 2Ax.
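As a starting point for the linear case in part 2, the identity can be checked componentwise. The following is a sketch of the idea in LaTeX, not a complete proof:

```latex
f(x) = a^\top x = \sum_{i=1}^{n} a_i x_i,
\qquad
\frac{\partial f}{\partial x_j} = a_j
\quad\Longrightarrow\quad
\nabla_x f(x) = a.
```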
Problem 2 [40%]
In this problem, you will implement a gradient descent algorithm for solving a linear regression problem. The RSS objective in linear regression is:
f(β) = ||y − Aβ||_2^2
Consider the problem of predicting revenue as a function of spending on TV and Radio advertising. There are only 4 data points:
Revenue TV Radio
20 3 7
15 4 6
32 6 1
5 1 1
1. Write down the matrix A and vector y for this regression problem. Do not forget about the intercept, which can be modeled as a feature with a constant value over all data points. The matrix A should be 4 × 3 dimensional.
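For reference when checking your answer, the design matrix and target can be written in NumPy as follows. Placing the intercept column first is an assumption here, not a requirement of the assignment:

```python
import numpy as np

# Design matrix: a column of ones for the intercept, then the TV and
# Radio spending columns, one row per data point.
A = np.array([
    [1.0, 3.0, 7.0],
    [1.0, 4.0, 6.0],
    [1.0, 6.0, 1.0],
    [1.0, 1.0, 1.0],
])

# Target vector: revenue for each of the 4 data points.
y = np.array([20.0, 15.0, 32.0, 5.0])
```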
2. Express the objective f in terms of linear and quadratic terms.
3. Derive the gradient ∇_β f(β) using the linear and quadratic terms above.
4. Implement a gradient descent method with a fixed step size in R/Python.
5. Use your implementation of linear regression to solve the simple problem above and on a small dataset of your choice. Compare the solution with linear regression from R or Python (sklearn). Do not forget about the intercept.
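Step 4 above could be sketched in Python roughly as follows, assuming NumPy and the gradient 2 A^T (Aβ − y) that falls out of steps 2 and 3. The step size and iteration count here are illustrative only and were picked for the small example, not prescribed by the assignment:

```python
import numpy as np

def gradient_descent(A, y, step_size=1e-3, num_iters=100_000):
    """Minimize f(beta) = ||y - A beta||^2 with fixed-step gradient descent.

    Uses the gradient 2 A^T (A beta - y) obtained from the linear and
    quadratic expansion of the RSS objective.
    """
    beta = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = 2.0 * A.T @ (A @ beta - y)
        beta -= step_size * grad
    return beta

# Example: the 4-point advertising dataset, intercept column first.
A = np.array([[1., 3., 7.], [1., 4., 6.], [1., 6., 1.], [1., 1., 1.]])
y = np.array([20., 15., 32., 5.])
beta = gradient_descent(A, y)
```

A quick sanity check is to compare `beta` against a library least-squares solver such as `np.linalg.lstsq(A, y)` (or sklearn's `LinearRegression` with `fit_intercept=False`, since the intercept is already a column of A); the results should agree closely.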
Problem 3 [45%]
Hint: You can follow the slides from the March 20th class, or the LAR reference from the class. See the class
website for some recommended linear algebra references.
You will derive the formula used to compute the solution to ridge regression. The objective in ridge regression is:
f(β) = ||y − Aβ||_2^2 + λ||β||_2^2
Here, β is the vector of coefficients that we want to optimize, A is the design matrix, y is the target, and λ is the regularization coefficient. The notation ||·||_2 represents the Euclidean (or L2) norm.
Our goal is to find the β that solves:
min_β f(β)
Follow the next steps to compute it.
1. Express the ridge regression objective f(β) in terms of linear and quadratic terms. Recall the quadratic expression for the squared norm from Problem 1. The result should be similar to the objective function of linear regression.
2. Derive the gradient ∇_β f(β) using the linear and quadratic terms above.
3. Since f is convex, its minimal value is attained when ∇_β f(β) = 0.
Derive the expression for the β that satisfies the equation above.
4. Implement the algorithm for β and use it on a small dataset of your choice. Do not forget about the
intercept.
5. Compare your solution with glmnet (or another implementation) using a small example and see if you can make the results match.
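Steps 4 and 5 could be sketched in NumPy as below. This assumes the closed form that step 3 leads to, namely the normal equations (A^T A + λI)β = A^T y; the function name and the choice to penalize the intercept are my own illustration, not prescribed by the assignment:

```python
import numpy as np

def ridge_regression(A, y, lam):
    """Solve the ridge normal equations (A^T A + lam * I) beta = A^T y.

    This closed form comes from setting the gradient of
    f(beta) = ||y - A beta||^2 + lam * ||beta||^2 to zero.
    """
    n_features = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n_features), A.T @ y)

# Illustration on the 4-point advertising data from Problem 2. Note that
# the intercept column is penalized here along with the other features,
# whereas glmnet leaves the intercept unpenalized.
A = np.array([[1., 3., 7.], [1., 4., 6.], [1., 6., 1.], [1., 1., 1.]])
y = np.array([20., 15., 32., 5.])
beta = ridge_regression(A, y, lam=1.0)
```

When comparing with glmnet, keep in mind that glmnet also standardizes features by default, so matching its output exactly usually requires disabling standardization and handling the unpenalized intercept separately.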