Having worked on machine learning for a while, I have realized that many models work only if you understand them deeply and use them correctly, even the simplest ones. Most of the time, the literature describing them omits many details that are important for practical usage. Even worse, some authors may not understand the models deeply or correctly themselves, sometimes even the inventors of the models. In this series of blog posts, I will share some of my understanding of popular models that you might not find in the literature.

Since I am not a famous figure in this area, I can take a more critical stance when evaluating these models. It is certainly possible that my understanding or criticism is wrong, so I also keep an open mind toward criticism of my posts. The hope is that with this series, the audience (if there is any) and I can both learn something.

This page serves as a table of contents for the series and will be updated regularly. I actually have a very long list in mind, from very basic models such as linear regression to more complicated ones such as Bayesian nonparametrics and deep learning. I hope I can squeeze out enough time to finish them. Here are the topics I will discuss in upcoming posts:

(I) Linear Regression

(II) Kernel Regression

(III) From Probabilistic Regression to Gaussian Process

(IV) Exponential Family

(V) Naive Bayes

(VI) Logistic Regression

(VII) Discriminative vs. Generative Models (Maximum Entropy)

(VIII) Bayesian View of Semi-supervised Learning

(IX) Dirichlet Process

(X) Variational (Empirical) Bayesian Inference

(XI) TBD

Since these blog posts are not research papers, I will omit some detailed derivations from time to time and try to keep the posts succinct when the details can easily be found in a textbook. One excellent machine learning textbook that I like very much is Pattern Recognition and Machine Learning by Dr. Bishop, and I will try to use notation similar to that book's. If you find my posts confusing, it may help to first take a look at the corresponding part of that book. Of course, you can always leave a comment; I will be happy to discuss with you.
