Overview

Learn about ensemble methods and how to implement them from scratch in C++ using mlpack. This course covers understanding and implementing several ensemble methods, including Bagging, Random Forests, AdaBoost, and Stacking. The focus is on building these models and aggregating their results yourself, without relying on high-level machine learning libraries beyond mlpack.

Syllabus
- Unit 1: Bagging with Decision Trees
- Iris Flower Species Prediction Using Bagging Technique
- Bootstrapping Function for Bagging Algorithm
- Bagging Predictions Implementation Task
- Decision Tree Prediction Aggregation Task
- Unit 2: Random Forests in C++
- Assessing Random Forest Classifier Accuracy on Iris Dataset
- Adjusting Random Forest Max Depth for Optimal Accuracy
- Implementing a Random Forest Constructor in C++
- Unit 3: AdaBoost in C++
- Efficient Spam Email Filtering with AdaBoost Classifier
- Experimenting with AdaBoost Learning Rate
- Updating Weights in AdaBoost Model
- Complete the AdaBoost Prediction Function
- Unit 4: Stacking Ensemble Learning
- Iris Species Prediction Using Stacking Model
- Switching Meta-Model to Decision Tree Classifier
- Stacking Model Predictions Preparation