
Deep Learning Book - Chapter 5: Machine Learning Basics

Tags: deep learning book


A few git links:

Table of contents:

  • 5.1 Learning Algorithms
    • The Task, T
    • The Performance Measure, P
    • The Experience, E
    • Example: Linear Regression
  • 5.2 Capacity, Overfitting and Underfitting
    • The No Free Lunch Theorem
    • Regularization
  • 5.3 Hyperparameters and Validation Sets
    • Cross-Validation
  • 5.4 Estimators, Bias and Variance
    • Point Estimation
    • Bias
    • Variance and Standard Error
    • Trading off Bias and Variance to Minimize Mean Squared Error
    • Consistency
  • 5.5 Maximum Likelihood Estimation
    • Conditional Log-Likelihood and Mean Squared Error
    • Properties of Maximum Likelihood
  • 5.6 Bayesian Statistics
    • Maximum A Posteriori (MAP) Estimation
  • 5.7 Supervised Learning Algorithms
    • Probabilistic Supervised Learning
    • Support Vector Machines
    • Other Simple Supervised Learning Algorithms
  • 5.8 Unsupervised Learning Algorithms
    • Principal Components Analysis
    • k-means Clustering
  • 5.9 Stochastic Gradient Descent
  • 5.10 Building a Machine Learning Algorithm
  • 5.11 Challenges Motivating Deep Learning
    • The Curse of Dimensionality
    • Local Constancy and Smoothness Regularization
    • Manifold Learning
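
As a quick illustration of two of the entries above ("Example: Linear Regression" and "5.9 Stochastic Gradient Descent"), here is a minimal NumPy sketch that fits a one-variable linear model both with the normal equations and with per-example SGD on the mean squared error. The synthetic data, learning rate, and epoch count are arbitrary choices for this example, not taken from the book.

```python
# Minimal sketch: linear regression fit two ways.
# The toy data, learning rate, and epoch count are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 2*x + 1 plus a little noise (toy setup).
X = rng.uniform(-1, 1, size=(200, 1))
y = 2.0 * X[:, 0] + 1.0 + 0.1 * rng.standard_normal(200)

# Append a bias column so the parameter vector is w = [slope, intercept].
Xb = np.hstack([X, np.ones((X.shape[0], 1))])

# 1) Closed form via the normal equations: w = (X^T X)^{-1} X^T y.
w_closed = np.linalg.solve(Xb.T @ Xb, Xb.T @ y)

# 2) Stochastic gradient descent on the MSE, one example per update.
w_sgd = np.zeros(2)
lr = 0.1                                    # learning rate (a hyperparameter)
for epoch in range(50):
    for i in rng.permutation(len(y)):
        err = Xb[i] @ w_sgd - y[i]          # prediction error on one example
        w_sgd -= lr * err * Xb[i]           # gradient of 0.5*err^2 w.r.t. w

print("closed form:", w_closed)   # roughly [2.0, 1.0]
print("SGD:        ", w_sgd)      # should land close to the closed-form solution
```

The closed-form solution is exact for this small problem; the SGD loop is only there to show the per-example update rule the chapter discusses for large-scale settings.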

Original article; please credit the source when reposting!
Permalink: http://hxhlwf.github.io/posts/dl-dlbook-chap5.html