Overview
Master machine learning by implementing regression, classification, and optimization algorithms from scratch in C++. Build models like linear regression, k-NN, and decision trees, and learn key evaluation metrics—no high-level libraries required.
Syllabus
- Course 1: Regression and Gradient Descent
- Course 2: Classification Algorithms and Metrics
- Course 3: Gradient Descent: Building Optimization Algorithms from Scratch
Courses
- Dig deep into regression and learn the gradient descent algorithm. Rather than relying on high-level libraries such as scikit-learn, this course focuses on building the algorithms from scratch for a thorough understanding. Master the implementation of simple linear regression, multiple linear regression, and logistic regression, all powered by gradient descent.
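To give a sense of the from-scratch style, here is a minimal sketch of simple linear regression fit by batch gradient descent. The names (`fit_linear`, `LinearModel`) and the learning-rate and epoch settings are illustrative assumptions, not the course's actual code:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Illustrative sketch, not the course's code: fit y = w*x + b by
// batch gradient descent on the mean squared error.
struct LinearModel { double w = 0.0, b = 0.0; };

LinearModel fit_linear(const std::vector<double>& x,
                       const std::vector<double>& y,
                       double lr = 0.01, int epochs = 2000) {
    LinearModel m;
    const std::size_t n = x.size();
    for (int e = 0; e < epochs; ++e) {
        double grad_w = 0.0, grad_b = 0.0;
        for (std::size_t i = 0; i < n; ++i) {
            double err = m.w * x[i] + m.b - y[i];  // prediction error
            grad_w += 2.0 * err * x[i] / n;        // d(MSE)/dw
            grad_b += 2.0 * err / n;               // d(MSE)/db
        }
        m.w -= lr * grad_w;                        // step against the gradient
        m.b -= lr * grad_b;
    }
    return m;
}
```

On noiseless data drawn from y = 2x + 1, the fitted slope and intercept converge close to 2 and 1.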
- Go beneath the surface of classification algorithms and metrics, implementing them from scratch for a deeper understanding. Bypass commonly used libraries such as scikit-learn to construct Logistic Regression, k-Nearest Neighbors, the Naive Bayes Classifier, and Decision Trees from the ground up. This course also covers building evaluation metrics, including the AUC-ROC metric for Logistic Regression.
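As one example of the kind of classifier built here, the following is a hedged sketch of k-Nearest Neighbors with Euclidean distance and majority voting. The struct and function names are illustrative assumptions:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <cstddef>
#include <map>
#include <utility>
#include <vector>

// Illustrative sketch of a k-NN classifier: squared Euclidean
// distance, majority vote over the k closest training samples.
struct Sample { std::vector<double> x; int label; };

int knn_predict(const std::vector<Sample>& train,
                const std::vector<double>& query, std::size_t k) {
    std::vector<std::pair<double, int>> by_dist;  // (squared distance, label)
    for (const auto& s : train) {
        double d2 = 0.0;
        for (std::size_t j = 0; j < query.size(); ++j) {
            double diff = s.x[j] - query[j];
            d2 += diff * diff;
        }
        by_dist.emplace_back(d2, s.label);
    }
    // Only the k closest neighbors need to be sorted into place.
    std::partial_sort(by_dist.begin(), by_dist.begin() + k, by_dist.end());
    std::map<int, int> votes;
    for (std::size_t i = 0; i < k; ++i) ++votes[by_dist[i].second];
    return std::max_element(votes.begin(), votes.end(),
                            [](const auto& a, const auto& b) {
                                return a.second < b.second;
                            })->first;
}
```

Squared distances are compared directly, since the square root does not change their ordering.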
- Delve into the intricacies of optimization techniques in this immersive course focused on implementing algorithms from scratch. Bypass high-level libraries to explore Stochastic Gradient Descent, Mini-Batch Gradient Descent, and advanced optimization methods such as Momentum, RMSProp, and Adam.
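To illustrate one of these methods, here is a sketch of the classical momentum update applied to a toy one-dimensional objective. The function name and the `lr`/`beta` values are assumptions for the example, not the course's settings:

```cpp
#include <cassert>
#include <cmath>

// Illustrative sketch of momentum on the toy objective
// f(w) = (w - 3)^2, whose minimum is at w = 3.
double minimize_with_momentum(double w0, double lr = 0.1,
                              double beta = 0.9, int steps = 200) {
    double w = w0;
    double v = 0.0;                    // velocity: decayed sum of past gradients
    for (int t = 0; t < steps; ++t) {
        double grad = 2.0 * (w - 3.0); // df/dw for f(w) = (w - 3)^2
        v = beta * v + grad;           // momentum accumulation
        w -= lr * v;                   // parameter update along the velocity
    }
    return w;
}
```

The velocity term smooths successive gradients, which is what lets momentum outpace plain gradient descent on ill-conditioned problems.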