1. Multiple Features

Note: x0 is equal to 1 by convention, so the hypothesis can be written as a single dot product theta^T x.
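With x0 = 1, the hypothesis h(x) = theta0 + theta1*x1 + ... + thetan*xn collapses to a dot product. A tiny NumPy sketch (the theta and feature values below are made up):

```python
import numpy as np

theta = np.array([50.0, 0.1, 20.0])   # [theta0, theta1, theta2], hypothetical
x = np.array([1.0, 2000.0, 3.0])      # x0 = 1 prepended to (size, #bedrooms)

h = theta @ x                          # h_theta(x) = theta^T x
print(h)                               # 50 + 0.1*2000 + 20*3 = 310.0
```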

2. Feature Scaling

Idea: make sure the features are on a similar scale, roughly in the range -1 <= xi <= 1

For example:

x1 = size (0-2000 feet^2); divide by the range (max - min) or by the standard deviation

x2 = number of bedrooms (1-5)

The contour plot of J over theta1 and theta2 is a very skewed elliptical shape, and

gradient descent running on it may take a long time

to find the global minimum.

Cure:

x1 = size (0-2000 feet^2) / 2000

x2 = number of bedrooms (1-5) / 5

so the contour plots become more circular and gradient descent takes a much more direct path to the minimum

Mean normalization:

Replace xi with xi - mu_i (mu_i is the mean of feature i) so the features have zero mean (do not apply this to x0)

Eg:

x1 = (size - 1000) / 2000

x2 = (#bedrooms - 2) / 5
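A minimal mean-normalization sketch in NumPy (the four training examples are hypothetical):

```python
import numpy as np

# Hypothetical training set: columns are size (feet^2) and #bedrooms
X = np.array([[2104.0, 3.0],
              [1416.0, 2.0],
              [1534.0, 3.0],
              [ 852.0, 2.0]])

mu = X.mean(axis=0)                    # per-feature mean (mu_i)
rng = X.max(axis=0) - X.min(axis=0)    # range (max - min); X.std(axis=0) also works

X_norm = (X - mu) / rng                # zero mean, values within [-1, 1]
print(X_norm.mean(axis=0))             # ~[0, 0]
```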

3. Learning Rate

We can plot J(theta) against the number of iterations; J(theta) should

decrease after every iteration, and the plot also shows whether gradient descent has converged.

If gradient descent is not working (J(theta) increases or oscillates), it usually means that:

you should use a smaller value of alpha (the learning rate)

To choose alpha, try values spaced roughly 3x apart:

..., 0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1...
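The diagnostic above can be sketched in NumPy (the data, alphas, and iteration count are made up; in practice you would plot the J history):

```python
import numpy as np

def gradient_descent(X, y, alpha, iters):
    """Batch gradient descent; returns theta and the J(theta) history."""
    m, n = X.shape
    theta = np.zeros(n)
    history = []
    for _ in range(iters):
        error = X @ theta - y
        theta = theta - (alpha / m) * (X.T @ error)
        history.append(((X @ theta - y) ** 2).sum() / (2 * m))  # J(theta)
    return theta, history

# Hypothetical (already scaled) data: a column of ones for x0, one feature
X = np.array([[1.0, -0.5],
              [1.0,  0.0],
              [1.0,  0.5]])
y = np.array([1.0, 2.0, 3.0])

for alpha in (0.1, 1.0, 2.5):
    _, hist = gradient_descent(X, y, alpha, iters=100)
    decreasing = all(b <= a + 1e-12 for a, b in zip(hist, hist[1:]))
    print(f"alpha={alpha}: J decreasing every iteration? {decreasing}")
```

On this toy data, the smaller alphas make J shrink every iteration, while alpha = 2.5 overshoots and J blows up.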

4. Features

You can define new features from existing ones; for example:

Area = frontage * depth

Polynomial regression:

we can set x1 = size, x2 = (size)^2, x3 = (size)^3 (remember to apply feature scaling),

and the model becomes linear regression in the new features
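A sketch of building the polynomial features (the sizes are made up; note how scaling tames the very different ranges of size, size^2, size^3):

```python
import numpy as np

size = np.array([1000.0, 1500.0, 2000.0, 2500.0])  # hypothetical sizes

# x1 = size, x2 = size^2, x3 = size^3
X_poly = np.column_stack([size, size**2, size**3])

# Feature scaling is essential here: the columns span wildly different ranges
mu = X_poly.mean(axis=0)
sigma = X_poly.std(axis=0)
X_scaled = (X_poly - mu) / sigma

# Ordinary linear regression can now be run on the scaled features
print(X_scaled.std(axis=0))  # ~[1, 1, 1]
```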

5. Normal Equations

Idea: a method to solve for theta analytically: theta = (X^T X)^(-1) X^T y

where X is the m x (n+1) design matrix and y is an m-dimensional vector,

n: number of features, m: number of training examples

And feature scaling is not necessary for normal equations
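A minimal NumPy sketch of the normal equation theta = (X^T X)^(-1) X^T y (the four training examples are made up, with x0 = 1 in the first column):

```python
import numpy as np

# Hypothetical data: X is m x (n+1) with a leading column of ones, y has length m
X = np.array([[1.0, 2104.0, 3.0],
              [1.0, 1416.0, 2.0],
              [1.0, 1534.0, 3.0],
              [1.0,  852.0, 2.0]])
y = np.array([460.0, 232.0, 315.0, 178.0])

# theta = (X^T X)^(-1) X^T y; pinv also handles the non-invertible case
theta = np.linalg.pinv(X.T @ X) @ X.T @ y
print(theta)
```

Note that no feature scaling was applied. Using the pseudo-inverse (`pinv`) instead of `inv` still returns a sensible least-squares solution even when X^T X is singular.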

Gradient descent

1. choose alpha

2. need many iterations

3. works well even when the number of features n is large

Normal equation:
1. no need for alpha and iterations

2. need to compute matrix inverse

3. slow for large n (e.g. n = 10^6), since it inverts an (n+1) x (n+1) matrix, roughly O(n^3)

Note: if X^T X is not invertible (singular), it usually means that:

1. you have redundant features (linearly dependent columns of X)

2. there are too many features (e.g. m <= n); delete some features, or use regularization
