1. Theory
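In brief (standard background for this exercise, summarized here because it is exactly what the code below implements): the hypothesis is a line, the cost is the squared error averaged over the m training examples, and batch gradient descent minimizes that cost.

$$h_\theta(x) = \theta_0 + \theta_1 x$$

$$J(\theta_0, \theta_1) = \frac{1}{2m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)^2$$

Gradient descent repeats the following simultaneous update with learning rate $\alpha$ until convergence:

$$\theta_0 := \theta_0 - \frac{\alpha}{m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr), \qquad \theta_1 := \theta_1 - \frac{\alpha}{m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)\,x^{(i)}$$

The factor of 2 in the denominator of J cancels the 2 produced by differentiating the squared term, which is why the update rules above have no stray constants.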

2. Dataset

Each line below is one training example from ex1data1.txt: the first column is a city's population in units of 10,000, and the second is the profit earned in that city in units of $10,000 (negative values are losses).

6.1101,17.592
5.5277,9.1302
8.5186,13.662
7.0032,11.854
5.8598,6.8233
8.3829,11.886
7.4764,4.3483
8.5781,12
6.4862,6.5987
5.0546,3.8166
5.7107,3.2522
14.164,15.505
5.734,3.1551
8.4084,7.2258
5.6407,0.71618
5.3794,3.5129
6.3654,5.3048
5.1301,0.56077
6.4296,3.6518
7.0708,5.3893
6.1891,3.1386
20.27,21.767
5.4901,4.263
6.3261,5.1875
5.5649,3.0825
18.945,22.638
12.828,13.501
10.957,7.0467
13.176,14.692
22.203,24.147
5.2524,-1.22
6.5894,5.9966
9.2482,12.134
5.8918,1.8495
8.2111,6.5426
7.9334,4.5623
8.0959,4.1164
5.6063,3.3928
12.836,10.117
6.3534,5.4974
5.4069,0.55657
6.8825,3.9115
11.708,5.3854
5.7737,2.4406
7.8247,6.7318
7.0931,1.0463
5.0702,5.1337
5.8014,1.844
11.7,8.0043
5.5416,1.0179
7.5402,6.7504
5.3077,1.8396
7.4239,4.2885
7.6031,4.9981
6.3328,1.4233
6.3589,-1.4211
6.2742,2.4756
5.6397,4.6042
9.3102,3.9624
9.4536,5.4141
8.8254,5.1694
5.1793,-0.74279
21.279,17.929
14.908,12.054
18.959,17.054
7.2182,4.8852
8.2951,5.7442
10.236,7.7754
5.4994,1.0173
20.341,20.992
10.136,6.6799
7.3345,4.0259
6.0062,1.2784
7.2259,3.3411
5.0269,-2.6807
6.5479,0.29678
7.5386,3.8845
5.0365,5.7014
10.274,6.7526
5.1077,2.0576
5.7292,0.47953
5.1884,0.20421
6.3557,0.67861
9.7687,7.5435
6.5159,5.3436
8.5172,4.2415
9.1802,6.7981
6.002,0.92695
5.5204,0.152
5.0594,2.8214
5.7077,1.8451
7.6366,4.2959
5.8707,7.2029
5.3054,1.9869
8.2934,0.14454
13.394,9.0551
5.4369,0.61705

3. Code Implementation

clear all;
clc;

% Load the data: column 1 is population, column 2 is profit
data = load('ex1data1.txt');
X = data(:, 1); y = data(:, 2);
m = length(y); % number of training examples

% Plot the raw training data
plot(X, y, 'rx');

%% =================== Part 3: Gradient descent ===================
fprintf('Running Gradient Descent ...\n')
% Prepend a column of ones to X so that theta(1) multiplies 1
% when the cost J is computed
X = [ones(m, 1), data(:,1)];
theta = zeros(2, 1); % initialize fitting parameters

% Some gradient descent settings
iterations = 1500;
alpha = 0.01;

% Compute and display the initial cost
computeCost(X, y, theta)

% Run gradient descent
[theta, J_history] = gradientDescent(X, y, theta, alpha, iterations);

% Plot the linear fit over the training data
hold on; % keep previous plot visible
plot(X(:,2), X*theta, '-')
legend('Training data', 'Linear regression')
hold off % don't overlay any more plots on this figure

% Predict values for population sizes of 35,000 and 70,000
predict1 = [1, 3.5] * theta;
fprintf('For population = 35,000, we predict a profit of %f\n', ...
    predict1*10000);
predict2 = [1, 7] * theta;
fprintf('For population = 70,000, we predict a profit of %f\n', ...
    predict2*10000);

% Grid over which we will calculate J
theta0_vals = linspace(-10, 10, 100);
theta1_vals = linspace(-1, 4, 100);

% Initialize J_vals to a matrix of zeros, then fill it in
J_vals = zeros(length(theta0_vals), length(theta1_vals));
for i = 1:length(theta0_vals)
    for j = 1:length(theta1_vals)
        t = [theta0_vals(i); theta1_vals(j)];
        J_vals(i,j) = computeCost(X, y, t);
    end
end

% Because of the way meshgrids work in the surf command, we need to
% transpose J_vals before calling surf, or else the axes will be flipped
J_vals = J_vals';

% Surface plot
figure;
surf(theta0_vals, theta1_vals, J_vals)
xlabel('\theta_0'); ylabel('\theta_1');

% Contour plot
figure;
% Plot J_vals as 20 contours spaced logarithmically between 10^-2 and 10^3;
% logspace(-2, 3, 20) sets both the range and the spacing of the levels
contour(theta0_vals, theta1_vals, J_vals, logspace(-2, 3, 20))
xlabel('\theta_0'); ylabel('\theta_1');
hold on;
plot(theta(1), theta(2), 'rx', 'MarkerSize', 10, 'LineWidth', 2);


function J = computeCost(X, y, theta)
% Compute the least-squares cost J(theta) over all training examples

m = length(y); % number of training examples
J = 0;
for i = 1:m
    J = J + (theta(1)*X(i,1) + theta(2)*X(i,2) - y(i))^2;
end
% Divide by 2m: the 2 cancels the factor of 2 that appears when the
% quadratic J is differentiated, which simplifies the update rule, and
% the m keeps J from growing with the number of examples (the sum over
% i = 1:m already contains m terms)
J = J/(m*2);
end
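For comparison, the same cost can be computed without the explicit loop. A minimal vectorized sketch (the name computeCostVec is hypothetical), assuming X already carries the leading column of ones as in the script above:

function J = computeCostVec(X, y, theta)
% Vectorized least-squares cost: same value as computeCost above
m = length(y);               % number of training examples
err = X * theta - y;         % m-by-1 vector of residuals h_theta(x) - y
J = (err' * err) / (2*m);    % sum of squared residuals, divided by 2m
end

On this 97-example dataset the speed difference is negligible, but the vectorized form scales better and mirrors the formula for J more directly.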


function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
% Batch gradient descent for univariate linear regression

m = length(y); % number of training examples
J_history = zeros(num_iters, 1);
for iter = 1:num_iters
    % Partial derivatives of J with respect to theta(1) and theta(2);
    % they must be reset to zero at the start of every iteration,
    % otherwise the sums accumulate across iterations
    J_1 = 0;
    J_2 = 0;
    for i = 1:m
        J_1 = J_1 + theta(1)*X(i,1) + theta(2)*X(i,2) - y(i);
        J_2 = J_2 + (theta(1)*X(i,1) + theta(2)*X(i,2) - y(i)) * X(i,2);
    end
    % Divide by m once here rather than inside the loop; once is enough
    J_1 = J_1/m;
    J_2 = J_2/m;
    % Both gradients were computed from the old theta before either
    % component changes, so this is a simultaneous update; no temporary
    % variables are needed
    theta(1) = theta(1) - alpha * J_1;
    theta(2) = theta(2) - alpha * J_2;
    J_history(iter) = computeCost(X, y, theta);
end
end
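The inner loop can likewise be collapsed into one matrix expression per iteration. A minimal vectorized sketch (the name gradientDescentVec is hypothetical):

function [theta, J_history] = gradientDescentVec(X, y, theta, alpha, num_iters)
% Vectorized batch gradient descent: X' * (X*theta - y) produces the
% 2-by-1 vector of partial derivatives in a single expression
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);
for iter = 1:num_iters
    theta = theta - (alpha/m) * (X' * (X*theta - y));
    J_history(iter) = computeCost(X, y, theta);
end
end

Because theta is updated in a single assignment, the simultaneous-update requirement is satisfied automatically.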

4. Results
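With alpha = 0.01 and 1500 iterations, gradient descent on this dataset should converge to approximately theta = (-3.63, 1.17), the well-known result for this exercise, so the script prints predicted profits of roughly $4,520 for a population of 35,000 and roughly $45,342 for a population of 70,000. The surface plot shows a bowl-shaped J(theta_0, theta_1), and the red x on the contour plot marks this minimum.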
