

Death by Python Pickle - Betrayal ML

Linux Foundation via YouTube

Overview

Explore the critical security vulnerabilities in Python's pickle serialization format and its dangerous implications for machine learning systems in this 39-minute conference talk. Learn how malicious actors can exploit pickle files to inject harmful code into ML models, similar to how Agent Smith might tamper with Neo's Kung Fu upload in The Matrix. Discover the mechanics behind these "Betrayal ML" attacks, where seemingly innocent model files can contain hidden malicious payloads that execute when loaded. Examine real-world examples of pickle-based attacks and understand why this serialization method poses such significant risks to AI and ML deployments. Gain insights into emerging detection capabilities and defensive strategies to protect your machine learning infrastructure from these sophisticated supply chain attacks. Master the technical details of how pickle deserialization can be weaponized and develop the knowledge needed to identify and mitigate these threats in your own ML workflows.
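The core risk the talk describes comes from pickle's `__reduce__` protocol: on deserialization, pickle will import and call whatever callable a pickled object specifies. A minimal sketch of both the attack and an opcode-level detection check (the `PoisonedModel` class and the benign `eval` payload are illustrative stand-ins, not from the talk):

```python
import pickle
import pickletools

# A "model file" whose __reduce__ smuggles in a call to eval.
# eval("40 + 2") is a harmless stand-in for os.system, exec, etc.
class PoisonedModel:
    def __reduce__(self):
        return (eval, ("40 + 2",))

blob = pickle.dumps(PoisonedModel())

# Merely loading the file executes the attacker's callable; no
# PoisonedModel instance ever comes back, just eval's return value.
result = pickle.loads(blob)
print(result)  # 42

# Static detection sketch: walk the opcode stream WITHOUT loading it and
# flag GLOBAL/STACK_GLOBAL/REDUCE opcodes, which import and call objects.
suspicious = [
    op.name for op, arg, _ in pickletools.genops(blob)
    if op.name in ("GLOBAL", "STACK_GLOBAL", "REDUCE")
]
print(suspicious)
```

This is why "just loading" an untrusted model file is equivalent to running untrusted code, and why scanners inspect the opcode stream rather than deserializing.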

Syllabus

Death by (Python) Pickle: "Betrayal ML" - Kadi McKean & Andy Lewis, ReversingLabs

Taught by

Linux Foundation

