3. Bayesian statistics and Regularization Content 3. Bayesian statistics and Regularization. 3.1 Underfitting and overfitting. 3.2 Bayesian statistics and regularization. 3.3 Optimize Cost function by regularization. 3.3.1 Regularized linear regressi…
5 Neural Networks (part two) Content: 5 Neural Networks (part two) 5.1 cost function 5.2 Back Propagation 5.3 Neural network summary. Continuing from the previous post, 4. Neural Networks (part one), this post first defines the cost function of a neural network, then introduces the back propagation (BP) algorithm, which efficiently computes the partial derivatives of the cost function with respect to the connection weights, and finally summarizes the process of training a neural network. 5.1 cost func…
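As a quick illustration of what back propagation computes, here is a minimal NumPy sketch of BP for a tiny two-layer sigmoid network with a cross-entropy cost. The network shape, variable names, and toy data are illustrative assumptions, not code from the post.

```python
import numpy as np

# Minimal back propagation sketch: input -> hidden -> output,
# sigmoid activations, cross-entropy cost (all shapes/names assumed).

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, b1, W2, b2):
    # Forward propagation: cache the activations BP needs.
    a1 = sigmoid(X @ W1 + b1)
    a2 = sigmoid(a1 @ W2 + b2)   # network output h(x)
    return a1, a2

def backprop(X, y, W1, b1, W2, b2):
    m = X.shape[0]
    a1, a2 = forward(X, W1, b1, W2, b2)
    # Cross-entropy cost J
    J = -np.mean(y * np.log(a2) + (1 - y) * np.log(1 - a2))
    # Output-layer error term (sigmoid + cross-entropy): dJ/dz2
    d2 = (a2 - y) / m
    # Propagate the error back through the hidden layer
    d1 = (d2 @ W2.T) * a1 * (1 - a1)
    # Partial derivatives of J w.r.t. every connection weight and bias
    return J, (X.T @ d1, d1.sum(0), a1.T @ d2, d2.sum(0))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(8, 3))                  # toy inputs
    y = (X[:, :1] > 0).astype(float)             # toy binary labels
    W1, b1 = rng.normal(size=(3, 4)) * 0.1, np.zeros(4)
    W2, b2 = rng.normal(size=(4, 1)) * 0.1, np.zeros(1)
    for _ in range(200):                         # plain gradient descent
        J, (dW1, db1, dW2, db2) = backprop(X, y, W1, b1, W2, b2)
        W1 -= dW1; b1 -= db1; W2 -= dW2; b2 -= db2
    print("final cost:", J)
```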
6. Evaluation and Selection of Learning Models Content 6. Evaluation and Selection of Learning Models 6.1 How to debug a learning algorithm 6.2 Evaluating a hypothesis 6.3 Model selection and training/validation/test sets 6.4 Bias and variance 6.4.1 Diagnosing bias vs. variance. 6.4.2 Regularization and bias/variance…
10. Dimensionality Reduction Content 10. Dimensionality Reduction 10.1 Motivation 10.1.1 Motivation one: Data Compression 10.1.2 Motivation two: Visualization 10.2 Principal Component Analysis 10.2.1 Problem formulation 10.2.2 Principal Component An…
8. Support Vector Machines (SVMs) Content 8. Support Vector Machines (SVMs) 8.1 Optimization Objective 8.2 Large margin intuition 8.3 Mathematics Behind Large Margin Classification 8.4 Kernels 8.5 Using an SVM 8.5.1 Multi-class Classification 8.5.2 Logi…
7 Machine Learning System Design Content 7 Machine Learning System Design 7.1 Prioritizing What to Work On 7.2 Error Analysis 7.3 Error Metrics for Skewed Classes 7.3.1 Precision/Recall 7.3.2 Trading off precision and recall: F1 Score 7.4 Data for ma…
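For the skewed-class metrics listed above, precision, recall, and the F1 score come straight from the confusion-matrix counts. The sketch below is a minimal illustration with made-up labels, not code from the post.

```python
import numpy as np

def precision_recall_f1(y_true, y_pred):
    """Precision, recall, and F1 for a binary classifier (1 = positive class)."""
    tp = np.sum((y_pred == 1) & (y_true == 1))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    # F1 = 2PR / (P + R): high only when precision and recall are both high.
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

if __name__ == "__main__":
    y_true = np.array([1, 0, 0, 1, 1, 0, 0, 0, 1, 0])
    y_pred = np.array([1, 0, 1, 1, 0, 0, 0, 0, 1, 0])
    print(precision_recall_f1(y_true, y_pred))  # (0.75, 0.75, 0.75)
```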
4. Neural Networks (part one) Content: 4. Neural Networks (part one) 4.1 Non-linear Classification. 4.2 Neuron Model 4.3 Forward Propagation 4.4 Implementing AND/OR/NOT and XOR gates with a neural network 4.4.1 Implementing AND/OR/NOT gates 4.4.2 Implementing XOR/XNOR gates 4.5 Multi-class classification k…
9. Clustering Content 9. Clustering 9.1 Supervised Learning and Unsupervised Learning 9.2 K-means algorithm (code: https://github.com/llhthinker/MachineLearningLab/tree/master/K-Means) 9.3 Optimization objective 9.4 Random Initialization 9.5 Choosing t…
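The post links its own K-means implementation above; as a rough standalone sketch of the same algorithm (random initialization, then alternating assignment and centroid-update steps), something like the following works. Names and toy data here are assumptions, not the linked repository's code.

```python
import numpy as np

def kmeans(X, K, iters=100, seed=0):
    """Minimal K-means: alternate cluster assignment and centroid update."""
    rng = np.random.default_rng(seed)
    # Random initialization: pick K distinct training examples as centroids.
    centroids = X[rng.choice(len(X), K, replace=False)]
    for _ in range(iters):
        # Assignment step: index of the closest centroid for each point.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each centroid to the mean of its assigned points.
        new = np.array([X[labels == k].mean(axis=0) if np.any(labels == k)
                        else centroids[k] for k in range(K)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids, labels

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Two toy blobs around (0, 0) and (5, 5).
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
    centroids, labels = kmeans(X, K=2)
    print(centroids)
```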
2. Logistic Regression Content: 2 Logistic Regression. 2.1 Classification. 2.2 Hypothesis representation. 2.2.1 Interpreting hypothesis output. 2.3 Decision boundary. 2.3.1 Non-linear decision boundaries. 2.4 Cost function for logistic regression. 2.4.1 A convex logistic r…
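As a small companion to the sections listed above, here is a minimal sketch of the sigmoid hypothesis h(x) = g(theta^T x), whose output is read as P(y=1 | x; theta), together with the convex cross-entropy cost, trained by plain gradient descent. The data and variable names are illustrative assumptions, not taken from the post.

```python
import numpy as np

def sigmoid(z):
    # Hypothesis h_theta(x) = g(theta^T x), interpreted as P(y=1 | x; theta)
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    # Convex logistic regression cost:
    # J(theta) = -(1/m) * sum[ y*log(h) + (1-y)*log(1-h) ]
    h = sigmoid(X @ theta)
    return -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))

def gradient(theta, X, y):
    # dJ/dtheta = (1/m) * X^T (h - y)
    return X.T @ (sigmoid(X @ theta) - y) / len(y)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.c_[np.ones(100), rng.normal(size=(100, 2))]   # add intercept column
    y = (X[:, 1] + X[:, 2] > 0).astype(float)            # toy labels
    theta = np.zeros(3)
    for _ in range(500):                                  # gradient descent
        theta -= 0.5 * gradient(theta, X, y)
    print("cost:", cost(theta, X, y))
```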