University Course: Statistical Machine Learning, 8 Lectures (complete)
Index:
Outline(00:00:08)
Challenging problems(00:00:19)
Data Mining(00:00:53)
Machine Learning(00:02:15)
Application in PR(00:03:14)
Difference(00:03:28)
Biometrics(00:04:04)
Bioinformatics(00:04:39)
ISI(00:05:08)
Confusion(00:05:34)
Foundational research in statistical machine learning(00:06:00)
Machine learning community(00:06:31)
Learning(00:06:55)
Performance(00:08:15)
More(00:08:53)
Theoretical Analysis(00:09:11)
Ian Hacking(00:09:44)
Statistical learning(00:10:28)
Andreas Buja(00:10:46)
Interpretation of Algorithms(00:11:22)
Statistical learning(00:11:58)
Main references(00:13:18)
Main kinds of theory(00:13:39)
Definition of Classifications(00:14:02)
Statistical learning(00:14:23)
Definition of regression(00:15:50)
Several well-known algorithms(00:16:27)
Framework of algorithms(00:17:02)
Design of algorithms(00:17:58)
Statistical decision theory(00:18:39)
Bayesian: classification(00:19:26)
Bayesian: regression(00:20:18)
Estimating densities(00:21:25)
KNN(00:22:45)
Interpretation: KNN(00:23:20)
High-dimensional spaces(00:24:15)
The curse of dimensionality(00:25:01)
The curse of dimensionality(00:25:50)
The curse of dimensionality: other manifestations(00:26:45)
LMS(00:27:33)
Interpretation: LMS(00:29:57)
Fisher Discriminant Analysis(00:31:40)
Interpretation: FDA(00:32:35)
FDA and LMS(00:33:04)
FDA: a novel interpretation(00:33:38)
FDA: parameters(00:34:24)
FDA: framework of algorithms(00:35:09)
Disadvantage(00:35:59)
Bias and variance analysis(00:36:44)
Bias-Variance Decomposition(00:37:17)
Bias-Variance Tradeoff(00:39:05)
Interpretation: KNN(00:40:29)
Ridge regression(00:41:35)
Interpretation: ridge regression(00:42:03)
Interpretation: parameter(00:43:28)
A note(00:44:32)
Other loss functions(00:45:39)
Interpretation: boosting(00:46:35)
Origin of the boosting method(00:47:22)
Boosting procedure (AdaBoost)(00:48:18)
Interpretation: margin(00:48:47)
Interpretation: SVM(00:49:43)
SVM: experimental analysis(00:50:48)
Interpretation: base learners(00:51:57)
Disadvantage(00:52:38)
Generalization bound(00:53:15)
PAC Framework(00:54:16)
VC Theory and PAC Bounds(00:54:44)
PAC Bounds for Classification(00:55:38)
VC Dimension(00:56:27)
A consistency problem(00:57:39)
Remarks on PAC+VC Bounds(00:58:33)
SVM: Linearly separable(00:59:21)
SVM: soft margin(01:00:28)
SVM: algorithms(01:01:59)
Bounds on generalization ability(01:03:01)
Bound: VC Dimension(01:04:04)
Bound: VC dimension + errors(01:04:45)
Disadvantages of SRM(01:05:52)
Disadvantage: PAC+VC bound(01:06:52)
Several concepts(01:07:51)
Generalization Bound: margin(01:08:35)
Importance of Margin(01:09:48)
Vapnik’s three periods(01:10:35)
Neural networks(01:11:51)
Interpretation: neural networks(01:12:55)
BP Algorithms(01:14:17)
Disadvantage(01:15:42)
The End(01:16:32)