Overview
Master feature engineering with Kaggle datasets. Progress from baseline models to advanced, model-specific optimizations for Linear Regression, Random Forest, and LightGBM. Learn to diagnose model weaknesses and build weighted ensembles for peak predictive performance.
Syllabus
- Course 1: Data Exploration and Baseline Modeling
- Course 2: Feature Engineering with Pandas and LightGBM
- Course 3: Evaluating and Finalizing Your Feature-Driven Model
Courses
- In this course, learners will load and inspect a Kaggle dataset, perform exploratory data analysis, preprocess features, and build baseline regression models to establish initial performance benchmarks.
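A baseline benchmark can be as simple as always predicting the training-set mean and scoring it with RMSE; any real model must beat this number to justify its complexity. A minimal stdlib-only sketch (the target values are hypothetical, standing in for a Kaggle target column):

```python
import math

def rmse(y_true, y_pred):
    """Root mean squared error between two equal-length sequences."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

# Hypothetical target column (e.g. sale prices in $1000s).
y = [200.0, 250.0, 300.0, 350.0]

# Mean-predictor baseline: always predict the training mean.
mean_pred = sum(y) / len(y)
baseline_rmse = rmse(y, [mean_pred] * len(y))
print(f"baseline RMSE: {baseline_rmse:.2f}")
```

The baseline RMSE becomes the benchmark that later feature-engineered models are measured against.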
- This course guides learners through diagnosing baseline model weaknesses, applying foundational and advanced feature engineering techniques, and building enhanced models to improve predictive performance.
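Two foundational techniques covered here are log-transforming skewed numeric columns and creating interaction features. A minimal sketch using plain dicts in place of a pandas DataFrame (the column names `lot_area` and `overall_qual` are hypothetical examples):

```python
import math

# Hypothetical rows from a housing-style Kaggle dataset.
rows = [
    {"lot_area": 8450, "overall_qual": 7},
    {"lot_area": 9600, "overall_qual": 6},
]

def engineer(row):
    """Return a copy of the row with two engineered features added."""
    out = dict(row)
    # log1p tames the right skew typical of area/price columns.
    out["log_lot_area"] = math.log1p(row["lot_area"])
    # Interaction term: quality and size often matter jointly.
    out["qual_x_area"] = row["overall_qual"] * row["lot_area"]
    return out

engineered = [engineer(r) for r in rows]
```

With pandas the same ideas become one-liners, e.g. `df["log_lot_area"] = np.log1p(df["lot_area"])`.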
- This course shows how feature engineering strategies differ across models such as Linear Regression, Random Forest, and LightGBM. You’ll build and test model-specific features, compare results with RMSE, and refine your pipeline based on evidence.
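Comparing models by RMSE leads naturally to the weighted ensembles mentioned in the overview: blend each model's predictions with weights and check whether the blend's RMSE beats either model alone. A stdlib-only sketch with hypothetical prediction vectors:

```python
import math

def rmse(y_true, y_pred):
    """Root mean squared error between two equal-length sequences."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

y_true = [100.0, 200.0, 300.0]
pred_linear = [110.0, 190.0, 320.0]  # hypothetical Linear Regression predictions
pred_gbm = [95.0, 205.0, 290.0]      # hypothetical LightGBM predictions

# Weighted ensemble: blend the two prediction vectors.
w = 0.3  # weight on the linear model; chosen on validation data in practice
pred_blend = [w * a + (1 - w) * b for a, b in zip(pred_linear, pred_gbm)]

for name, pred in [("linear", pred_linear), ("gbm", pred_gbm), ("blend", pred_blend)]:
    print(f"{name}: RMSE = {rmse(y_true, pred):.3f}")
```

On these contrived numbers the blend's errors partially cancel, so it scores below both individual models; on real data the weight `w` would be tuned against a held-out validation set.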