Statistical Machine Learning Outline

Supervised learning: linear regression; logistic regression; perceptron; k-nearest neighbors; decision trees; naive Bayes; support vector machines; maximum entropy model

Ensemble learning: the Boosting family of algorithms; the Bagging family of algorithms; AdaBoost; XGBoost; random forests

Unsupervised learning: k-means clustering; BIRCH clustering; DBSCAN density-based clustering; spectral clustering

Dimensionality reduction: principal component analysis (PCA); linear discriminant analysis (LDA); singular value decomposition (SVD); locally linear embedding (LLE)

Recommendation algorithms: the Apriori association rule algorithm; the FP-Tree algorithm; the PrefixSpan algorithm; collaborative filtering; matrix factorization

Feature engineering: feature selection; feature representation; feature preprocessing

Also covered: the Bayesian personalized ranking (BPR) algorithm, cross-platform deployment of machine learning models, and anomaly detection algorithms

I used to be a training instructor; because of company copyright concerns, all of the articles have been rewritten from scratch at https://www.cnblogs.com/nickchen121/p/11686958.html.
For more, and more frequently updated, Python-related content — data structures, artificial intelligence, the MySQL database, web crawlers, big data analytics, and more — see https://www.cnblogs.com/nickchen121/

Part 1: Optimization Algorithms

<div><a href="https://www.cnblogs.com/nickchen121/p/11214940.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">001 Gradient Descent Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214844.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">002 Least Squares Summary</a></div>
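None of the code below comes from the linked articles; it is a minimal sketch of the idea behind article 001: gradient descent repeatedly steps against the gradient until it settles near a minimum.

```python
# Minimal illustrative sketch: minimizing f(x) = (x - 3)^2 with plain
# gradient descent. Function and parameter names are my own.
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly apply the update x <- x - lr * grad(x)."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# The gradient of f(x) = (x - 3)^2 is 2 * (x - 3); the minimum is at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

With this learning rate the iterate contracts toward 3 geometrically, so after 100 steps `x_min` is within floating-point noise of the true minimizer.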

Part 2: Model Tuning and Evaluation

<div><a href="https://www.cnblogs.com/nickchen121/p/11214784.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">003 Cross-Validation Principles Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214851.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">004 Precision and Recall; ROC and PR Curves</a></div>
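As a taste of article 003 (this sketch is mine, not the article's): the heart of k-fold cross-validation is partitioning the sample indices into k folds and holding each fold out exactly once for evaluation.

```python
# Illustrative sketch of k-fold index splitting; names are my own.
def kfold_indices(n, k):
    """Yield (train_idx, test_idx) pairs for k-fold cross-validation."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        test_idx = list(range(start, start + size))
        train_idx = [i for i in range(n) if i < start or i >= start + size]
        yield train_idx, test_idx
        start += size

folds = list(kfold_indices(10, 5))
# 5 folds of 2 test samples each; every sample is held out exactly once.
```

In practice one would typically use a library routine such as scikit-learn's `KFold` rather than hand-rolling the split.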

Part 3: Linear Regression

<div><a href="https://www.cnblogs.com/nickchen121/p/11214779.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">005 Linear Regression Principles Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214833.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">006 Setting Up a Standalone scikit-learn and pandas Machine Learning Environment on Windows</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214941.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">007 Learning Linear Regression with scikit-learn and pandas</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214936.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">008 Lasso Regression: Coordinate Descent and Least Angle Regression Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214790.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">009 Learning Ridge Regression with scikit-learn and pandas</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214892.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">010 scikit-learn Linear Regression Library Summary</a></div>
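The core of articles 005 and 007 can be previewed in a few lines (this sketch is mine, not taken from the articles): for a single feature, ordinary least squares reduces to slope = cov(x, y) / var(x) and intercept = mean(y) − slope · mean(x).

```python
# Illustrative sketch: closed-form OLS for one feature; names are my own.
def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line through (xs, ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

slope, intercept = fit_line([0, 1, 2, 3], [1, 3, 5, 7])  # data on y = 2x + 1
```

For multiple features the same idea generalizes to the normal equations, which is roughly what scikit-learn's `LinearRegression` solves.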

Part 4: Logistic Regression

<div><a href="https://www.cnblogs.com/nickchen121/p/11214884.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">011 Logistic Regression Principles Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214911.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">012 scikit-learn Logistic Regression Library Usage Summary</a></div>

Part 5: Perceptron

<div><a href="https://www.cnblogs.com/nickchen121/p/11214821.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">013 Perceptron Principles Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214847.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">014 Random Data Generation for Machine Learning Algorithms</a></div>

Part 6: Decision Trees

<div><a href="https://www.cnblogs.com/nickchen121/p/11214819.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">015 Decision Tree Algorithm Principles (Part 1)</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214818.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">016 Decision Tree Algorithm Principles (Part 2)</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214915.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">017 scikit-learn Decision Tree Library Usage Summary</a></div>

Part 7: K-Nearest Neighbors (KNN)

<div><a href="https://www.cnblogs.com/nickchen121/p/11214794.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">018 K-Nearest Neighbors (KNN) Principles Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214857.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">019 scikit-learn KNN Library Usage Summary</a></div>

Part 8: Naive Bayes

<div><a href="https://www.cnblogs.com/nickchen121/p/11214817.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">020 Naive Bayes Principles Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214786.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">021 scikit-learn Naive Bayes Library Usage Summary</a></div>

Part 9: Maximum Entropy Model

<div><a href="https://www.cnblogs.com/nickchen121/p/11214926.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">022 Maximum Entropy Model Principles Summary</a></div>

Part 10: Support Vector Machines

<div><a href="https://www.cnblogs.com/nickchen121/p/11214876.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">023 Support Vector Machines (1): Linear SVM</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214859.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">024 Support Vector Machines (2): Soft-Margin Maximization for Linear SVM</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214903.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">025 Support Vector Machines (3): Linearly Non-Separable SVM and Kernel Functions</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214776.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">026 Support Vector Machines (4): The SMO Algorithm</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214837.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">027 Support Vector Machines (5): Linear Support Vector Regression</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214789.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">028 scikit-learn SVM Library Usage Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214882.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">029 Tuning the Gaussian Kernel for SVMs</a></div>

Part 11: Ensemble Learning

<div><a href="https://www.cnblogs.com/nickchen121/p/11214792.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">030 Ensemble Learning Principles Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214803.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">031 AdaBoost Algorithm Principles Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214863.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">032 scikit-learn AdaBoost Library Usage Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214831.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">033 Gradient Boosted Decision Trees (GBDT) Principles Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214872.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">034 scikit-learn GBDT Tuning Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214797.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">035 Bagging and Random Forest Principles Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214871.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">036 scikit-learn Random Forest Tuning Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214807.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">037 XGBoost Algorithm Principles Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214842.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">038 XGBoost Library Usage Summary</a></div>
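The common thread of this part, combining several weak learners into a stronger one, can be illustrated with the simplest combiner: majority voting over the base models' predictions. This sketch is mine, not from article 030.

```python
# Illustrative sketch of majority voting; names are my own.
from collections import Counter

def majority_vote(predictions):
    """predictions: list of per-model prediction lists, all the same length.
    Returns the most common label at each position."""
    return [Counter(votes).most_common(1)[0][0] for votes in zip(*predictions)]

# Three classifiers each get one sample wrong, but the ensemble is right
# on all three samples.
combined = majority_vote([[1, 0, 1], [1, 1, 0], [0, 1, 1]])
```

Bagging and random forests vote (or average) in essentially this way; boosting instead weights the learners, which the linked articles cover in detail.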

Part 12: Unsupervised Learning

<div><a href="https://www.cnblogs.com/nickchen121/p/11214929.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">039 K-Means Clustering Algorithm Principles</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214885.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">040 Learning K-Means Clustering with scikit-learn</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214791.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">041 BIRCH Clustering Algorithm Principles</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214922.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">042 Learning BIRCH Clustering with scikit-learn</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214901.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">043 The DBSCAN Density-Based Clustering Algorithm</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214923.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">044 Learning DBSCAN Clustering with scikit-learn</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214909.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">045 Spectral Clustering Principles Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214826.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">046 Learning Spectral Clustering with scikit-learn</a></div>
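A quick preview of article 039 (this sketch is mine, not the article's): k-means alternates between assigning each point to its nearest centroid and re-averaging each cluster.

```python
# Illustrative sketch: k-means on one-dimensional data; names are my own.
def kmeans_1d(points, centroids, iters=10):
    """Run the assign/re-average loop of k-means and return the centroids."""
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            # Assign p to the nearest centroid.
            nearest = min(range(len(centroids)), key=lambda c: abs(p - centroids[c]))
            clusters[nearest].append(p)
        # Recompute each centroid as the mean of its cluster (keep empty ones).
        centroids = [sum(ps) / len(ps) if ps else centroids[i]
                     for i, ps in enumerate(clusters)]
    return centroids

centers = kmeans_1d([1.0, 1.2, 0.8, 9.0, 9.5, 8.5], centroids=[0.0, 10.0])
```

With these starting centroids the two natural groups are found immediately; scikit-learn's `KMeans` adds smarter initialization and convergence checks on top of the same loop.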

Part 13: Dimensionality Reduction

<div><a href="https://www.cnblogs.com/nickchen121/p/11214834.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">047 Principal Component Analysis (PCA) Principles Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214894.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">048 Learning Principal Component Analysis (PCA) with scikit-learn</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214783.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">049 Linear Discriminant Analysis (LDA) Principles Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214860.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">050 LDA Dimensionality Reduction with scikit-learn</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214858.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">051 Singular Value Decomposition (SVD): Principles and Applications in Dimensionality Reduction</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214918.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">052 Locally Linear Embedding (LLE) Principles Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214829.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">053 Exploring Locally Linear Embedding (LLE) with scikit-learn</a></div>

Part 14: Recommendation Algorithms

<div><a href="https://www.cnblogs.com/nickchen121/p/11214868.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">054 Canonical Correlation Analysis (CCA) Principles Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214896.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">055 Apriori Algorithm Principles Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214938.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">056 FP-Tree Algorithm Principles Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214773.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">057 PrefixSpan Algorithm Principles Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214870.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">058 Learning the FP-Tree and PrefixSpan Algorithms with Spark</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214846.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">059 Collaborative Filtering Recommendation Algorithms Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214854.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">060 Matrix Factorization in Collaborative Filtering Recommendation</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214811.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">061 The SimRank Collaborative Filtering Algorithm</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214809.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">062 Learning Matrix Factorization Recommendation with Spark</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214770.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">063 Factorization Machines Recommendation Algorithm Principles</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214800.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">064 EM Algorithm Principles Summary</a></div>
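A preview of the association rule mining in article 055 (this sketch is mine, not the article's): the support-counting step at the heart of Apriori declares an itemset frequent when it appears in at least `min_support` transactions.

```python
# Illustrative sketch of counting frequent item pairs; names are my own.
from collections import Counter
from itertools import combinations

def frequent_pairs(transactions, min_support):
    """Return every 2-itemset occurring in at least min_support transactions."""
    counts = Counter()
    for t in transactions:
        for pair in combinations(sorted(set(t)), 2):
            counts[pair] += 1
    return {pair for pair, c in counts.items() if c >= min_support}

pairs = frequent_pairs(
    [["milk", "bread"], ["milk", "bread", "eggs"], ["bread", "eggs"]],
    min_support=2,
)
```

Full Apriori prunes candidates level by level using the fact that every subset of a frequent itemset must itself be frequent; FP-Tree avoids candidate generation altogether, as the linked articles explain.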

Part 15: Feature Engineering

<div><a href="https://www.cnblogs.com/nickchen121/p/11214874.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">065 Feature Engineering: Feature Selection</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214916.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">066 Feature Engineering: Feature Representation</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214772.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">067 Feature Engineering: Feature Preprocessing</a></div>

Part 16: Bayesian Personalized Ranking (BPR)

<div><a href="https://www.cnblogs.com/nickchen121/p/11214802.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">068 Bayesian Personalized Ranking (BPR) Summary</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11214768.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">069 Learning Bayesian Personalized Ranking (BPR) with TensorFlow</a></div>

Part 17: Cross-Platform Deployment of Machine Learning Models with PMML

<div><a href="https://www.cnblogs.com/nickchen121/p/11214889.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">070 Deploying Machine Learning Models Across Platforms with PMML</a></div>

Part 18: Anomaly Detection

<div><a href="https://www.cnblogs.com/nickchen121/p/11214849.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">071 Anomaly Detection Algorithms Summary</a></div>

Part 19: Recommended Reading

<div><a href="https://www.cnblogs.com/nickchen121/p/11569883.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">Recommended Books and Online Courses: Life / Programming / Python / Machine Learning</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/10718112.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">Python from Beginner to Giving Up (Table of Contents)</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11164387.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">A Quick Introduction to Python in Ten Days (Table of Contents)</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11407287.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">Data Structures and Algorithms in Python/C (Table of Contents)</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11517502.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">Go from Beginner to Giving Up (Table of Contents)</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11215237.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">Statistical Machine Learning (Table of Contents)</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/10840284.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">TensorFlow2 Tutorial (Table of Contents)</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/10802091.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">Machine Learning (Table of Contents)</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/10825705.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">What Can Python Do? (Table of Contents)</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11637985.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">Daily Postgraduate Entrance Exam Study Notes</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/11120894.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">A Primer on Big Data Analytics and Artificial Intelligence</a></div>
<div><a href="https://www.cnblogs.com/nickchen121/p/10653178.html" style="text-decoration: none; color: rgba(7, 137, 224, 1)" target="_blank">A Recommended Learning Path for Artificial Intelligence (Machine Learning)</a></div>

Copyright notice: this is an original article by nickchen121, released under the CC 4.0 BY-SA license. When reposting, please include a link to the original article and this notice.
Original article: https://www.cnblogs.com/nickchen121/p/11215237.html