
XGBoost & Random Forest: Decision Trees + Boosting in R

via Udemy

Overview

XGBoost, Random Forest, Decision Trees, Gradient Boosting, ROC Curve/AUC, Machine Learning in R (RStudio), rpart, party

What you'll learn:
  • The algorithm behind recursive partitioning decision trees
  • Construct conditional inference decision trees with R's ctree function
  • Construct recursive partitioning decision trees with R's rpart function
  • Learn to estimate Gini impurity
  • Construct ROC and estimate AUC
  • Random Forests with R's randomForest package
  • Gradient Boosting with R's XGBoost package
  • Deal with missing data
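
As a taste of the tree-building topics above, here is a minimal sketch in R: it fits a recursive-partitioning tree with rpart and a conditional inference tree with party's ctree on the built-in iris data, and computes Gini impurity by hand. It assumes the rpart and party packages are installed; the small gini helper is our own illustration, not part of either package.

```r
# Assumes rpart and party are installed: install.packages(c("rpart", "party"))
library(rpart)
library(party)

# Recursive partitioning tree (CART-style splits)
fit_rpart <- rpart(Species ~ ., data = iris, method = "class")
print(fit_rpart)

# Conditional inference tree (splits chosen by significance tests)
fit_ctree <- ctree(Species ~ ., data = iris)
print(fit_ctree)

# Gini impurity of a label vector: 1 - sum of squared class proportions
gini <- function(labels) {
  p <- table(labels) / length(labels)
  1 - sum(p^2)
}
gini(iris$Species)  # three balanced classes: 1 - 3 * (1/3)^2 = 2/3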

Do you want to build predictive models with machine learning—and actually understand what’s happening under the hood?

Welcome to “Decision Trees, Random Forests, and Gradient Boosting in R.” This is a hands-on, learning-by-doing course where you’ll work with real datasets and build models step by step, using the most important tree-based methods in applied machine learning.

I’m Carlos Martínez (Ph.D., University of St. Gallen). I designed this course to be practical, structured, and rigorous, so you can go beyond “running code” and gain the judgment you need to build, tune, and evaluate models properly.

What you’ll learn

By the end of the course, you’ll be able to:

  • Understand how recursive partitioning works (the logic behind decision trees)

  • Build trees in R using rpart and ctree (conditional inference trees)

  • Control complexity, reduce overfitting, and improve generalization using:

    • complexity parameter (cp)

    • pruning strategies

  • Apply and compare two high-performance ensemble methods:

    • Random Forests

    • Gradient Boosting

  • Evaluate predictive performance using ROC curves and AUC, so you can compare models with a robust metric
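
To illustrate the ensemble-comparison workflow described above, here is a hedged sketch: it turns iris into a binary problem, fits a random forest and a gradient-boosted model, and compares them by AUC. It assumes the randomForest, xgboost, and pROC packages are installed; the split sizes and hyperparameters are arbitrary choices for illustration, not course settings.

```r
# Assumes randomForest, xgboost, and pROC are installed
library(randomForest)
library(xgboost)
library(pROC)

# Binary target: is the flower virginica?
df <- iris
df$virginica <- factor(iris$Species == "virginica")
df$Species <- NULL

set.seed(42)
idx   <- sample(nrow(df), 100)
train <- df[idx, ]
test  <- df[-idx, ]

# Random forest: averages many decorrelated trees
rf <- randomForest(virginica ~ ., data = train, ntree = 500)
rf_prob <- predict(rf, test, type = "prob")[, "TRUE"]

# Gradient boosting: trees fitted sequentially to residual errors
gb <- xgboost(data = as.matrix(train[, 1:4]),
              label = as.numeric(train$virginica) - 1,
              nrounds = 50, objective = "binary:logistic", verbose = 0)
gb_prob <- predict(gb, as.matrix(test[, 1:4]))

# ROC/AUC gives a threshold-free comparison of the two models
auc(roc(test$virginica, rf_prob))
auc(roc(test$virginica, gb_prob))
```

The same pattern extends to any binary classifier that outputs probabilities: compute the ROC curve once per model, then compare areas under the curve.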

What’s included

  • Video lessons + structured explanations

  • Real datasets and all course code (R scripts)

  • Practice assignments + detailed solutions, so you can self-check and build confidence

Who this course is for

  • University students and professionals who want practical machine learning skills

  • Analysts working in business intelligence, analytics, finance, operations, or data roles

  • Anyone who wants to learn tree-based modeling properly, from fundamentals to evaluation

Prerequisites

  • Basic comfort with spreadsheets

  • Basic familiarity with R (you don’t need to be advanced)

What students say

  • Stefan L.: “Even though the topic was new to me, the course is easy to understand and the RStudio exercises work as explained.”

  • Frank B.: “Very beneficial… well organized and easy to understand. It gave me new ideas to assess model validity.”

  • Steven H.: “A very good review before my test tomorrow.”

  • Al M.: “Excellent.”

If you want a clear, practical path to mastering decision trees and modern ensembles in R—and learning how to evaluate them correctly—this course is for you.

Enroll today, and I’ll see you in the first lesson.

Syllabus

  • Introduction
  • Data Preprocessing
  • Decision Trees with CTREE
  • Decision Trees with RPART
  • Random Forests
  • Gradient Boosting Trees
  • New section: Neural Network for Business Analytics

Taught by

Carlos Martínez, PhD (instructor with 20+ best-selling courses) and Mercedes Chávez

Reviews

4.4 rating at Udemy based on 53 ratings
