# CSE 151A: Introduction to Machine Learning Homework Assignment 2


You are given the points belonging to class- 1 and class-2 as follows:

Class 1 points: (11, 11), (13, 11), (8, 10), (9, 9), (7, 7), (7, 5), (16, 3)

Class 2 points: (7, 11), (15, 9), (15, 7), (13, 5), (14, 4), (9, 3), (11, 3)

What is the label of the sample (14, 3) under a nearest-neighbor classifier using L2 (Euclidean) distance?
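One way to check the answer is to code the classifier directly. The sketch below pools the two classes, finds the training point closest to the query under Euclidean distance, and returns its label (the function and variable names here are my own, not part of the assignment):

```python
# Sketch: 1-nearest-neighbor classification of (14, 3) with L2 distance.
import math

class1 = [(11, 11), (13, 11), (8, 10), (9, 9), (7, 7), (7, 5), (16, 3)]
class2 = [(7, 11), (15, 9), (15, 7), (13, 5), (14, 4), (9, 3), (11, 3)]

def nearest_neighbor_label(query, class1, class2):
    # Pool all labeled points and return the closest one and its class label.
    labeled = [(p, 1) for p in class1] + [(p, 2) for p in class2]
    return min(labeled, key=lambda pl: math.dist(query, pl[0]))  # L2 distance

point, label = nearest_neighbor_label((14, 3), class1, class2)
print(point, label)  # nearest point is (14, 4) at distance 1, so class 2
```

The closest training point to (14, 3) is (14, 4) at distance 1, so the predicted label is class 2.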

Consider a house-rent prediction problem where you are supposed to predict the rent of a house based on just its area. Suppose you have n samples with their respective areas x^(1), x^(2), . . . , x^(n) and their true house rents y^(1), y^(2), . . . , y^(n). Let's say you train a linear regressor that predicts f(x^(i)) = θ0 + θ1·x^(i). The parameters θ0 and θ1 are scalars and are learned by minimizing the mean-squared-error loss through gradient descent with a learning rate α. Answer the following questions.
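The gradient-descent procedure described above can be sketched as follows. The partial derivatives come from differentiating the MSE, J(θ0, θ1) = (1/n) Σ (θ0 + θ1·x^(i) − y^(i))²; the data, learning rate, and iteration count below are illustrative assumptions, not values from the assignment:

```python
import numpy as np

def fit_linear_gd(x, y, alpha=0.05, iters=5000):
    # Minimize MSE J = (1/n) * sum((theta0 + theta1*x_i - y_i)^2) by gradient descent.
    theta0, theta1 = 0.0, 0.0
    n = len(x)
    for _ in range(iters):
        err = theta0 + theta1 * x - y       # residuals f(x_i) - y_i
        grad0 = (2.0 / n) * err.sum()       # dJ/d(theta0)
        grad1 = (2.0 / n) * (err * x).sum() # dJ/d(theta1)
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

# Hypothetical data generated from rent = 2 * area + 1.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0
t0, t1 = fit_linear_gd(x, y)
print(t0, t1)  # converges near theta0 = 1, theta1 = 2
```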

Consider the same house-rent prediction problem where you are supposed to predict the rent of a house based on just its area. Suppose you have n samples with their respective areas x^(1), x^(2), . . . , x^(n) and their true house rents y^(1), y^(2), . . . , y^(n). Let's say you train a linear regressor that predicts f(x^(i)) = θ0 + θ1·x^(i). The parameters θ0 and θ1 are scalars and are learned by minimizing the mean-squared-error loss with L1-regularization through gradient descent with a learning rate α. Answer the following questions.
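With L1-regularization, one common formulation (assuming a regularization strength λ and, as is conventional, leaving the bias θ0 unpenalized) adds λ|θ1| to the loss. Since |θ1| is not differentiable at zero, the θ1 update uses a subgradient:

```latex
J(\theta_0, \theta_1) = \frac{1}{n}\sum_{i=1}^{n}\bigl(\theta_0 + \theta_1 x^{(i)} - y^{(i)}\bigr)^2 + \lambda\,\lvert\theta_1\rvert
```

```latex
\theta_1 \leftarrow \theta_1 - \alpha\left(\frac{2}{n}\sum_{i=1}^{n}\bigl(\theta_0 + \theta_1 x^{(i)} - y^{(i)}\bigr)x^{(i)} + \lambda\,\operatorname{sign}(\theta_1)\right)
```

The θ0 update is unchanged from the unregularized case. The sign(θ1) term pushes θ1 toward zero by a constant amount each step, which is why L1 tends to produce sparse weights.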

Consider the same house-rent prediction problem where you are supposed to predict the rent of a house based on just its area. Suppose you have n samples with their respective areas x^(1), x^(2), . . . , x^(n) and their true house rents y^(1), y^(2), . . . , y^(n). Let's say you train a linear regressor that predicts f(x^(i)) = θ0 + θ1·x^(i). The parameters θ0 and θ1 are scalars and are learned by minimizing the mean-squared-error loss with L2-regularization through gradient descent with a learning rate α. Answer the following questions.
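For L2-regularization, the analogous formulation (again with an assumed strength λ and an unpenalized bias θ0) adds λθ1² to the loss, which contributes a smooth 2λθ1 term to the gradient:

```latex
J(\theta_0, \theta_1) = \frac{1}{n}\sum_{i=1}^{n}\bigl(\theta_0 + \theta_1 x^{(i)} - y^{(i)}\bigr)^2 + \lambda\,\theta_1^2
```

```latex
\theta_1 \leftarrow \theta_1 - \alpha\left(\frac{2}{n}\sum_{i=1}^{n}\bigl(\theta_0 + \theta_1 x^{(i)} - y^{(i)}\bigr)x^{(i)} + 2\lambda\,\theta_1\right)
```

Unlike the L1 case, the penalty shrinks θ1 proportionally to its magnitude (weight decay) rather than by a fixed amount, so it rarely drives the weight exactly to zero.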

Now, you will implement a linear regression model from scratch. We have provided a skeleton code file (i.e. LinearRegression.py) for you to implement the algorithm, as well as a notebook file (i.e. Linear Regression.ipynb) for you to conduct experiments and answer relevant questions. Libraries such as numpy and pandas may be used for auxiliary tasks (such as matrix multiplication, matrix inversion, and so on), but not for the algorithms. That is, you can use numpy to implement your model, but you cannot directly call libraries such as scikit-learn to get a linear regression model for your skeleton code. We will grade this question based on the three following criteria:
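As a rough illustration of what "numpy for matrix algebra, but not for the algorithm" can look like, here is a minimal sketch of a from-scratch linear regression class fit via the normal equations. The class and method names are hypothetical; the actual LinearRegression.py skeleton may prescribe different names and may require gradient descent instead:

```python
import numpy as np

class LinearRegression:
    """Hypothetical sketch; the real skeleton's API may differ."""

    def fit(self, X, y):
        # Closed-form least squares via the normal equations,
        # using numpy only for the matrix algebra.
        X = np.asarray(X, dtype=float)
        y = np.asarray(y, dtype=float)
        Xb = np.hstack([np.ones((X.shape[0], 1)), X])  # prepend a bias column
        # Solve (Xb^T Xb) theta = Xb^T y for the weight vector theta.
        self.theta = np.linalg.solve(Xb.T @ Xb, Xb.T @ y)
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        Xb = np.hstack([np.ones((X.shape[0], 1)), X])
        return Xb @ self.theta

# Usage on hypothetical data following y = 2x + 1:
model = LinearRegression().fit([[1.0], [2.0], [3.0]], [3.0, 5.0, 7.0])
preds = model.predict([[4.0]])
print(preds)  # approximately [9.0]
```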

A single **PDF** file that includes: