Multivariate Linear Regression
Multiple Features

Linear regression with multiple variables is also known as "multivariate linear regression".
We now introduce notation for equations where we can have any number of input variables:

$x_j^{(i)}$ = value of feature $j$ in the $i$-th training example
$x^{(i)}$ = the input (features) of the $i$-th training example
$m$ = the number of training examples
$n$ = the number of features

The multivariable form of the hypothesis function accommodating these multiple features is as follows:

$h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_2 + \cdots + \theta_n x_n$
In order to develop intuition about this function, we can think about θ0 as the basic price of a house, θ1 as the price per square meter, θ2 as the price per floor, etc. x1 will be the number of square meters in the house, x2 the number of floors, etc.
Using the definition of matrix multiplication, our multivariable hypothesis function can be concisely represented as:

$h_\theta(x) = \begin{bmatrix}\theta_0 & \theta_1 & \cdots & \theta_n\end{bmatrix} \begin{bmatrix}x_0 \\ x_1 \\ \vdots \\ x_n\end{bmatrix} = \theta^T x$
This is a vectorization of our hypothesis function for one training example; see the lessons on vectorization to learn more.
Remark: Note that, for convenience, in this course we assume $x_0^{(i)} = 1$ for $i \in 1, \dots, m$. This allows us to do matrix operations with $\theta$ and $x$, making the two vectors $\theta$ and $x^{(i)}$ match each other element-wise (that is, have the same number of elements: $n+1$).
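
As an illustration, here is a minimal NumPy sketch of the vectorized hypothesis; the θ values and feature values are made-up numbers, not from the course:

```python
import numpy as np

# Made-up parameters: theta_0 (base price), theta_1 (price per square meter),
# theta_2 (price per floor)
theta = np.array([50.0, 0.25, 4.0])

# One training example, using the convention x0 = 1: [x0, square meters, floors]
x = np.array([1.0, 120.0, 2.0])

# h_theta(x) = theta^T x
h = theta @ x
print(h)  # 50 + 0.25*120 + 4*2 = 88.0
```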
Gradient Descent For Multiple Variables
The gradient descent equation itself is generally the same form; we just have to repeat it for our $n$ features:

repeat until convergence: {
  $\theta_j := \theta_j - \alpha \dfrac{1}{m} \sum_{i=1}^{m} \left(h_\theta(x^{(i)}) - y^{(i)}\right) x_j^{(i)}$   for $j := 0 \dots n$
}
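
A minimal sketch of this update rule in NumPy, assuming $X$ is an $(m, n+1)$ design matrix whose first column is all ones and $y$ is the $(m,)$ target vector; `alpha` and `num_iters` are illustrative defaults, not values prescribed by the course:

```python
import numpy as np

def gradient_descent(X, y, alpha=0.01, num_iters=1000):
    """Batch gradient descent for multivariate linear regression.

    X: (m, n+1) design matrix with a leading column of ones (x0 = 1)
    y: (m,) target vector
    """
    m = len(y)
    theta = np.zeros(X.shape[1])
    for _ in range(num_iters):
        error = X @ theta - y                 # h_theta(x^(i)) - y^(i) for every i
        theta -= alpha * (X.T @ error) / m    # simultaneous update of all theta_j
    return theta
```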
Gradient Descent in Practice I - Feature Scaling
Note: [6:20 - The average size of a house is 1000 but 100 is accidentally written instead]
We can speed up gradient descent by having each of our input values in roughly the same range. This is because θ will descend quickly on small ranges and slowly on large ranges, and so will oscillate inefficiently down to the optimum when the variables are very uneven.
The way to prevent this is to modify the ranges of our input variables so that they are all roughly the same. Ideally:

$-1 \le x_{(i)} \le 1$  or  $-0.5 \le x_{(i)} \le 0.5$
These aren't exact requirements; we are only trying to speed things up. The goal is to get all input variables into roughly one of these ranges, give or take a few.
Two techniques to help with this are feature scaling and mean normalization. Feature scaling involves dividing the input values by the range (i.e. the maximum value minus the minimum value) of the input variable, resulting in a new range of just 1. Mean normalization involves subtracting the average value for an input variable from the values for that input variable, resulting in a new average value for the input variable of just zero. To implement both of these techniques, adjust your input values as shown in this formula:

$x_i := \dfrac{x_i - \mu_i}{s_i}$

where $\mu_i$ is the average of all the values for feature $(i)$ and $s_i$ is the range of values (max − min), or $s_i$ is the standard deviation.
Note that dividing by the range and dividing by the standard deviation give different results. The quizzes in this course use the range; the programming exercises use the standard deviation.
For example, if $x_i$ represents housing prices with a range of 100 to 2000 and a mean value of 1000, then $x_i := \dfrac{price - 1000}{1900}$.
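
A short NumPy sketch of both techniques combined (the helper name `feature_normalize` is our own choice); it divides by the standard deviation, as the programming exercises do, with the range-based variant noted in a comment:

```python
import numpy as np

def feature_normalize(X):
    """Mean-normalize and scale each feature (column) of X: (x - mu) / s."""
    mu = X.mean(axis=0)
    s = X.std(axis=0)   # or: X.max(axis=0) - X.min(axis=0) to divide by the range
    return (X - mu) / s, mu, s

# Example: housing prices as in the text, range 100-2000, mean 1000 (made-up data)
prices = np.array([[100.0], [500.0], [1400.0], [2000.0]])
X_norm, mu, s = feature_normalize(prices)
```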
Gradient Descent in Practice II - Learning Rate
Note: [5:20 - the x -axis label in the right graph should be θ rather than No. of iterations ]
Debugging gradient descent. Make a plot with the number of iterations on the x-axis and the cost function $J(\theta)$ on the y-axis, evaluated over the iterations of gradient descent. If $J(\theta)$ ever increases, then you probably need to decrease $\alpha$.
Automatic convergence test. Declare convergence if $J(\theta)$ decreases by less than $E$ in one iteration, where $E$ is some small value such as $10^{-3}$. However, in practice it is difficult to choose this threshold value.

It has been proven that if learning rate α is sufficiently small, then J(θ) will decrease on every iteration.

To summarize:
If α is too small: slow convergence.
If α is too large: may not decrease on every iteration and thus may not converge.
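
A sketch of the debugging recipe above: record $J(\theta)$ at every iteration so it can be plotted against the iteration number. The cost function is the standard one for linear regression; the structure of the helper is our own:

```python
import numpy as np

def compute_cost(X, y, theta):
    """J(theta) = (1/2m) * sum((h_theta(x^(i)) - y^(i))^2)"""
    m = len(y)
    error = X @ theta - y
    return (error @ error) / (2 * m)

def gradient_descent_with_history(X, y, alpha, num_iters):
    theta = np.zeros(X.shape[1])
    J_history = []
    for _ in range(num_iters):
        theta -= alpha * (X.T @ (X @ theta - y)) / len(y)
        J_history.append(compute_cost(X, y, theta))
    return theta, J_history

# Plot J_history against the iteration number; if it ever increases,
# decrease alpha. Declare convergence once successive values differ by
# less than, say, 1e-3.
```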
Features and Polynomial Regression
We can improve our features and the form of our hypothesis function in a couple different ways.
We can combine multiple features into one. For example, we can combine x1 and x2 into a new feature x3 by taking x1⋅x2.
Polynomial Regression
Our hypothesis function need not be linear (a straight line) if that does not fit the data well.
We can change the behavior or curve of our hypothesis function by making it a quadratic, cubic or square root function (or any other form).

For example, if our hypothesis function is $h_\theta(x) = \theta_0 + \theta_1 x_1$, then we can create additional features based on $x_1$ to get the quadratic function $h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_1^2$ or the cubic function $h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_1^2 + \theta_3 x_1^3$. In the cubic version, we have created new features $x_2$ and $x_3$ where $x_2 = x_1^2$ and $x_3 = x_1^3$. To make it a square root function, we could do: $h_\theta(x) = \theta_0 + \theta_1 \sqrt{x_1}$.
One important thing to keep in mind is that if you choose your features this way, then feature scaling becomes very important. For example, if $x_1$ has range 1–1000, then the range of $x_1^2$ becomes 1–1,000,000 and that of $x_1^3$ becomes 1–1,000,000,000.
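
A minimal sketch of creating polynomial features by hand and scaling them before running gradient descent; the raw values are made up:

```python
import numpy as np

x1 = np.array([1.0, 10.0, 100.0, 1000.0])   # made-up raw feature

# New features x2 = x1^2 and x3 = x1^3; their ranges explode
# (1-1e6 and 1-1e9), which is why feature scaling matters here.
X_poly = np.column_stack([x1, x1**2, x1**3])

# Mean-normalize each column, then prepend the column of ones for x0
X_poly = (X_poly - X_poly.mean(axis=0)) / X_poly.std(axis=0)
X = np.column_stack([np.ones(len(x1)), X_poly])
```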
