library.ml
Implementations of probabilistic machine learning models, developed while reading the book Pattern Recognition and Machine Learning (Bishop).
My handwritten solutions to the book's exercises can be found here.
The purpose of this library is to gain a better understanding of the algorithms and to use them for teaching. For that reason, the code is written to be understandable rather than maximally computationally efficient.
Index
Regression
- Least Squares (Max. Likelihood) Linear Regression
- Gradient descent
- Closed form (sketched below)
- Bayesian Linear Regression
Classification
- Discriminant Functions
- Least Squares
- Fisher's Linear Discriminant
- Perceptron algorithm
- Probabilistic Generative Models
- Maximum Likelihood
- Probabilistic Discriminative Models
- Logistic Regression (Iteratively Reweighted Least Squares; sketched below)
- Probit Regression
- Bayesian Logistic Regression
Note: every algorithm must handle binary classification with continuous features. Algorithms that can also handle multi-class classification or discrete features must support those cases as well.
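To illustrate the first item in the Regression index, here is a minimal NumPy sketch of the closed-form (maximum-likelihood) least-squares fit. This README does not show the library's actual interface, so the function name `fit_least_squares` and the toy data are assumptions for illustration only, not part of library.ml.

```python
import numpy as np

def fit_least_squares(Phi, t):
    """Maximum-likelihood weights via the normal equations: w = (Phi^T Phi)^-1 Phi^T t."""
    # The pseudo-inverse handles rank-deficient design matrices gracefully.
    return np.linalg.pinv(Phi) @ t

# Toy usage: recover y = 1 + 2x from noisy samples.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
t = 1.0 + 2.0 * x + rng.normal(scale=0.05, size=x.shape)
Phi = np.column_stack([np.ones_like(x), x])  # design matrix with a bias feature
w = fit_least_squares(Phi, t)
print(w)  # roughly [1.0, 2.0]
```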
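Similarly, here is a minimal sketch of binary logistic regression trained with iteratively reweighted least squares (Newton-Raphson updates on the cross-entropy error, as in PRML section 4.3.3). Again, the function name, the small ridge term, and the toy data are assumptions and do not reflect the library's own API.

```python
import numpy as np

def fit_logistic_irls(Phi, t, n_iter=20):
    """Binary logistic regression via iteratively reweighted least squares."""
    w = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        y = 1.0 / (1.0 + np.exp(-Phi @ w))   # predicted P(class 1 | x)
        R = y * (1.0 - y)                    # diagonal of the IRLS weighting matrix
        grad = Phi.T @ (y - t)               # gradient of the negative log-likelihood
        H = Phi.T @ (Phi * R[:, None])       # Hessian
        H += 1e-8 * np.eye(len(w))           # tiny ridge for numerical stability
        w -= np.linalg.solve(H, grad)        # Newton step
    return w

# Toy usage: two overlapping Gaussian blobs, one per class.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, (50, 2)), rng.normal(1.0, 1.0, (50, 2))])
t = np.r_[np.zeros(50), np.ones(50)]
Phi = np.column_stack([np.ones(len(X)), X])  # add a bias feature
w = fit_logistic_irls(Phi, t)
y = 1.0 / (1.0 + np.exp(-Phi @ w))
print("training accuracy:", ((y > 0.5) == t).mean())
```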