
YouTube

MedGemma 27B Local Multimodal Health AI Advisor - X-rays and Text-Only Diagnosis Test

Venelin Valkov via YouTube

Overview

Explore Google's MedGemma 27B, a fine-tuned multimodal healthcare AI model based on Gemma 3, through hands-on testing and implementation in this tutorial. Learn about the model's capabilities and benchmarks, then work through a practical setup in Google Colab: resource requirements, model loading, prompt formatting, and Hugging Face pipeline configuration. Test the model's diagnostic capabilities on real-world cases from the /r/AskDocs/ subreddit: a text-only back pain question, multimodal evaluation of hip X-rays with accompanying text, lung X-ray interpretation combined with a patient description, and a hearing loss assessment. Along the way, see how this open-source healthcare model processes both textual patient information and medical imaging data to produce diagnostic insights, and pick up the technical implementation details needed to deploy and run this medical AI system locally.
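As a rough illustration of the prompt formatting step described above, the sketch below assembles chat-style messages in the image-plus-text convention used by Hugging Face multimodal pipelines. The helper name, system prompt text, and file names are illustrative assumptions, not taken from the video.

```python
# Minimal sketch of multimodal prompt assembly, assuming the Hugging Face
# chat-message convention for image+text models such as MedGemma.
# build_messages and all file names below are hypothetical examples.

def build_messages(question, image_paths=()):
    """Assemble a chat-style message list: optional X-ray images plus text."""
    content = [{"type": "image", "image": p} for p in image_paths]
    content.append({"type": "text", "text": question})
    return [
        {"role": "system",
         "content": [{"type": "text",
                      "text": "You are a careful medical assistant."}]},
        {"role": "user", "content": content},
    ]

# Text-only case (e.g. the back pain question):
text_only = build_messages(
    "I have lower back pain radiating down one leg. What could it be?")

# Multimodal case (e.g. hip X-rays with accompanying text):
with_images = build_messages(
    "Do these hip X-rays show a fracture?",
    ["hip_ap.png", "hip_lateral.png"])
```

With the `transformers` library installed and sufficient GPU memory, messages in this shape could then be passed to an image-text-to-text pipeline loaded with the MedGemma checkpoint; the exact model ID and pipeline task shown in the video should be taken from the notebook itself.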

Syllabus

00:00 - MedGemma 27B multimodal, benchmarks
03:27 - Notebook setup: resource requirements, model loading, prompt format, Hugging Face pipeline
06:08 - Back pain (text-only)
08:55 - Hip X-rays (text with multiple images)
11:27 - Lung X-rays (text with image)
13:40 - Hearing loss
17:30 - Conclusion

Taught by

Venelin Valkov

