XGBoost and Data Leakage in Machine Learning - Day 14 of 30 Days of ML
1littlecoder via YouTube
Overview
Dive into Day 14 of Kaggle's 30 Days of ML Challenge, focusing on XGBoost and data leakage in machine learning. Explore the power of gradient boosting, a technique that dominates many Kaggle competitions and achieves state-of-the-art results on various datasets. Learn how to build and optimize models using XGBoost, and gain insights from a comprehensive tutorial. Understand the critical concept of data leakage, its potential to ruin models in subtle and dangerous ways, and discover effective prevention strategies. Complete hands-on exercises to reinforce your learning and wrap up Kaggle's Intermediate ML Course. Access additional resources, including related videos and a StatQuest tutorial on XGBoost, to deepen your understanding of these essential machine learning concepts.
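The data-leakage idea described above can be sketched in a short example. This is a hypothetical toy illustration (not the course's exact code), assuming scikit-learn is installed: the leak happens when a preprocessing step such as scaling is fit on the full dataset before splitting, so test-set statistics contaminate training; a Pipeline fit after the split prevents it.

```python
# Train/test data leakage via preprocessing, and how a Pipeline prevents it.
# Hypothetical toy example using scikit-learn.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Leaky pattern: the scaler would see the test rows, so test-set statistics
# leak into training.
# X_scaled = StandardScaler().fit_transform(X)   # DON'T do this before splitting

# Safe pattern: split first, then let the Pipeline fit the scaler on the
# training fold only.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)   # scaler statistics come from X_train only
print(round(model.score(X_test, y_test), 3))
```

The same Pipeline object can be passed to cross-validation utilities, so the scaler is refit inside each fold, which is the general cure for this class of leakage.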
Syllabus
Kaggle 30 Days of ML (Day 14) - XGBoost, Data Leakage - Learn Python ML in 30 Days
Taught by
1littlecoder
Reviews
5.0 rating, based on 1 Class Central review
XGBoost Part: I understand how gradient boosting works, the purpose of key hyperparameters (learning rate, max depth, n_estimators, subsample, etc.), and how to control overfitting. I can interpret feature importance and apply XGBoost for classifica…
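The hyperparameters this review lists can be sketched in a few lines. As a stand-in for `xgboost.XGBClassifier`, this hypothetical example uses scikit-learn's `GradientBoostingClassifier`, which exposes the same knobs (`n_estimators`, `learning_rate`, `max_depth`, `subsample`) and a `feature_importances_` attribute; the values shown are illustrative, not tuned.

```python
# Gradient boosting with the hyperparameters named in the review, using
# scikit-learn's GradientBoostingClassifier as a stand-in for XGBoost.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, n_features=8, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

model = GradientBoostingClassifier(
    n_estimators=200,    # number of boosting rounds (trees)
    learning_rate=0.05,  # shrinkage per tree; lower values resist overfitting
    max_depth=3,         # shallow trees are the usual weak learners
    subsample=0.8,       # row subsampling adds randomness (stochastic boosting)
    random_state=1,
)
model.fit(X_train, y_train)
print(round(model.score(X_test, y_test), 3))
# Importances are normalized to sum to 1; argmax gives the top feature index.
print(int(model.feature_importances_.argmax()))
```

Lowering `learning_rate` while raising `n_estimators` is the standard trade-off for controlling overfitting that the review alludes to.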