

Coding Challenge 187 - Bayes Theorem

Coding Train via YouTube

Overview

Learn to implement a Naive Bayes text classifier from scratch in JavaScript using p5.js in this comprehensive coding tutorial. Explore the mathematical foundations of Bayes' theorem and understand how it applies to text classification problems. Build a complete sentiment analysis system that analyzes word frequencies, implements Laplacian smoothing to handle unseen words, and creates probability calculations for classification decisions. Follow along as the implementation progresses from basic setup through advanced features like normalization techniques and user interface development. Master key machine learning concepts including training data processing, probability distribution calculations, and smoothing algorithms while creating a fully functional text classifier that runs entirely in the browser. Access multiple working code examples and interactive p5.js sketches to experiment with different approaches to Bayesian classification, from initial implementations to refactored versions with file loading capabilities.
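The classifier described above rests on Bayes' theorem, P(A|B) = P(B|A) · P(A) / P(B). A minimal sketch in plain JavaScript (the numbers and function name here are illustrative, not taken from the video):

```javascript
// Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
function bayes(pBgivenA, pA, pB) {
  return (pBgivenA * pA) / pB;
}

// Illustrative numbers: 50% of messages are spam, the word "free"
// appears in 20% of spam messages and 5% of non-spam messages, so
// P(free) = 0.2 * 0.5 + 0.05 * 0.5 = 0.125.
const pSpamGivenFree = bayes(0.2, 0.5, 0.125);
console.log(pSpamGivenFree); // 0.8
```

The same arithmetic, scaled up over many words, is what the classifier built in the video performs for each candidate category.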

Syllabus

0:00:00 Hello!
0:03:34 Explaining Bayes' Theorem
0:12:07 What is Naive Bayes?
0:13:49 Setting up the Classifier in p5.js
0:15:41 Coding the train function
0:22:14 Coding the classify Function
0:24:45 Revising the train function
0:29:06 Implementing Probability Calculations
0:33:24 Laplacian Additive Smoothing
0:42:21 Ignoring the Denominator Normalization
0:45:36 Quick User Interface
0:49:42 Final thoughts and next steps
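The main steps in the syllabus above (training on word frequencies, classifying, Laplacian additive smoothing, and ignoring the shared denominator) can be sketched in plain JavaScript roughly as follows. This is a hedged outline under assumptions of my own, not the video's exact code: the class name, method names, and the use of log-probabilities to avoid underflow are all illustrative choices.

```javascript
// Minimal Naive Bayes text classifier sketch (illustrative).
// Counts word frequencies per category, applies Laplacian (add-one)
// smoothing so unseen words never zero out a score, and compares
// un-normalized scores, ignoring the shared denominator P(words).
class NaiveBayes {
  constructor() {
    this.counts = {};        // category -> { word -> count }
    this.totals = {};        // category -> total word count
    this.docs = {};          // category -> number of training documents
    this.vocab = new Set();  // all words seen across categories
    this.totalDocs = 0;
  }

  train(text, category) {
    if (!this.counts[category]) {
      this.counts[category] = {};
      this.totals[category] = 0;
      this.docs[category] = 0;
    }
    this.docs[category]++;
    this.totalDocs++;
    for (const word of text.toLowerCase().split(/\W+/).filter(Boolean)) {
      this.counts[category][word] = (this.counts[category][word] || 0) + 1;
      this.totals[category]++;
      this.vocab.add(word);
    }
  }

  // P(word | category) with add-one (Laplacian) smoothing.
  wordProb(word, category) {
    const count = this.counts[category][word] || 0;
    return (count + 1) / (this.totals[category] + this.vocab.size);
  }

  classify(text) {
    const words = text.toLowerCase().split(/\W+/).filter(Boolean);
    let best = null;
    let bestScore = -Infinity;
    for (const category of Object.keys(this.counts)) {
      // Sum logs instead of multiplying raw probabilities to avoid
      // floating-point underflow on long texts.
      let score = Math.log(this.docs[category] / this.totalDocs);
      for (const word of words) {
        score += Math.log(this.wordProb(word, category));
      }
      if (score > bestScore) {
        bestScore = score;
        best = category;
      }
    }
    return best;
  }
}
```

A quick usage example with two tiny training sentences:

```javascript
const nb = new NaiveBayes();
nb.train('what a wonderful happy day', 'positive');
nb.train('terrible awful sad news', 'negative');
console.log(nb.classify('a happy day')); // 'positive'
```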

Taught by

The Coding Train

Reviews

