## 1. Linear Regression

Linear regression is a classic introductory problem in machine learning. Based on the first programming assignment of the course, this article walks through solving a linear regression problem with machine learning.

### 1.1 Code for computing the cost function

```matlab
function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

h_theta = X * theta;
% Note: sumsq is Octave-specific; in MATLAB use sum((h_theta - y).^2)
J = sumsq(h_theta - y) / (2 * m);

% =========================================================================

end
```
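The same vectorized cost computation can be sketched in Python/NumPy (a translation for illustration, not part of the assignment; `compute_cost` and the toy data below are made up):

```python
import numpy as np

def compute_cost(X, y, theta):
    """Squared-error cost J(theta) = (1/2m) * sum((X @ theta - y)^2)."""
    m = len(y)
    residual = X @ theta - y          # h_theta - y, all examples at once
    return (residual @ residual) / (2 * m)

# Toy data generated from y = 1 + 2x; X carries a leading column of ones.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([3.0, 5.0, 7.0])

print(compute_cost(X, y, np.array([1.0, 2.0])))  # exact fit -> 0.0
print(compute_cost(X, y, np.zeros(2)))           # nonzero cost for theta = 0
```

The dot product `residual @ residual` plays the role of Octave's `sumsq`, summing the squared errors over all m examples.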


X is the $(m \times 2)$ matrix of sample inputs.

y is the $(m \times 1)$ vector of sample targets.

The extra first column of ones in X is there so the matrix product with $\theta_0$ works out: $X \cdot \theta$ yields an $(m \times 1)$ vector, which is exactly $h_\theta$. The cost

$$J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2$$

is then computed in one vectorized step:

```matlab
J = sumsq(h_theta - y) / (2 * m);
```
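To see concretely why the prepended column of ones turns $X\theta$ into the hypothesis, here is a small NumPy sketch (the feature values and `theta` below are arbitrary, for illustration only):

```python
import numpy as np

x = np.array([6.1101, 5.5277, 8.5186])     # raw feature values (made up)
X = np.column_stack([np.ones_like(x), x])  # prepend the ones column -> (m, 2)

theta = np.array([-3.0, 1.2])              # [theta0, theta1], arbitrary
h = X @ theta                              # theta0 * 1 + theta1 * x per row

print(np.allclose(h, theta[0] + theta[1] * x))  # True
```

Each row of `X` is `[1, x_i]`, so the matrix product applies the intercept $\theta_0$ and slope $\theta_1$ to every example simultaneously.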

```matlab
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

% ====================== YOUR CODE HERE ======================
% Instructions: Perform a single gradient step on the parameter vector
%               theta.
%
% Hint: While debugging, it can be useful to print out the values
%       of the cost function (computeCost) and gradient here.
%
h_theta = X * theta;
cost = h_theta - y;
theta0_update = alpha * sum(cost .* X(:,1)) / m;
theta1_update = alpha * sum(cost .* X(:,2)) / m;
theta(1) = theta(1) - theta0_update;
theta(2) = theta(2) - theta1_update;
% ============================================================

% Save the cost J in every iteration
J_history(iter) = computeCost(X, y, theta);

end

end
```
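The per-parameter updates above can be collapsed into a single matrix expression, $\theta \leftarrow \theta - \frac{\alpha}{m} X^{\top}(X\theta - y)$, which also generalizes beyond two parameters. A NumPy sketch of this (the toy data and names are illustrative, not from the assignment):

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    """Batch gradient descent: theta -= (alpha/m) * X^T (X theta - y)."""
    m = len(y)
    J_history = np.empty(num_iters)
    for i in range(num_iters):
        error = X @ theta - y
        theta = theta - (alpha / m) * (X.T @ error)  # update all params at once
        J_history[i] = (error @ error) / (2 * m)     # cost before this update
    return theta, J_history

# Toy data from y = 1 + 2x: descent should recover theta close to [1, 2].
x = np.array([0.0, 1.0, 2.0, 3.0])
X = np.column_stack([np.ones_like(x), x])
y = 1.0 + 2.0 * x
theta, J_hist = gradient_descent(X, y, np.zeros(2), alpha=0.1, num_iters=2000)
print(theta)  # close to [1., 2.]
```

Because the update multiplies the full error vector by $X^{\top}$, the same code handles any number of features without writing one update line per parameter.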


With `alpha = 0.01` and `num_iters = 1500`, after 1500 iterations we have: