Coursera

Understanding Open AI Workspaces

Coursera via Coursera

Overview

The Understanding Open AI Workspaces course is for developers with intermediate machine learning experience and Python skills who are new to Generative AI and want to learn how to build, customize, optimize, and deploy open source large language models. This course provides learners with the skills to set up, configure, and manage environments for open generative AI development.

Beginning with local installations, learners practice running large language models on their own machines using Ollama, exploring performance optimization techniques for consumer hardware, and integrating external applications through APIs. The course then introduces Docker and Docker Compose, guiding learners through containerized environments that ensure reproducibility, persistence, and scalability. Learners build multi-container architectures to separate models and services while managing GPU passthrough and memory optimization.

Finally, the course covers Google Colab for cloud-based GPU access, where learners configure free resources, manage storage through Google Drive, and monitor performance within session constraints. By the end, learners will have set up both local and cloud environments, documented their processes, and gained the ability to choose the right workspace for different AI workloads.
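As a taste of the API integration the course describes, here is a minimal Python sketch that sends a prompt to a locally running Ollama server. It assumes Ollama's default local endpoint (`http://localhost:11434/api/generate`) and a model that has already been pulled (e.g. with `ollama pull llama3`); the model name and prompt are illustrative.

```python
import json
import urllib.request

# Ollama's default local REST endpoint for single-shot generation
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str, stream: bool = False) -> dict:
    """Assemble the JSON payload for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": stream}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return the response text."""
    payload = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example usage (requires `ollama serve` running and the model pulled locally):
# print(generate("llama3", "Explain containerization in one sentence."))
```

Because the request is plain HTTP with a JSON body, any external application can integrate with a local model the same way, without Ollama-specific client libraries.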

Syllabus

  • Local Large Language Model (LLM) Setup with Ollama
    • In this module, you’ll set up a local environment for working with large language models using Ollama. You’ll install and configure the tool, download and switch between different models, and practice operating through the command-line interface. You’ll also explore how to optimize performance and connect Ollama with external applications, giving you a hands-on way to manage and experiment with LLMs.
  • Containerized Environments with Docker
    • In this module, you’ll learn the essentials of using Docker to set up stable, reproducible environments for AI development. You’ll practice building containers, managing model persistence and data volumes, and designing multi-container setups that separate models from applications. You’ll also explore strategies to optimize memory and GPU resources, giving you the confidence to run and experiment with AI projects.
  • Navigating and Configuring Jupyter for GPUs
    • In this module, you’ll learn how to make Jupyter work effectively for AI development. You’ll navigate the notebook interface, set up GPU access, and manage dependencies with pip and conda. You’ll also implement strategies for persistent storage and monitor system performance during training, so your workflows stay efficient, stable, and ready for real-world projects.
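The multi-container pattern described in the Docker module, separating the model server from the application and persisting models in a volume, can be sketched as a Docker Compose file. The service names and the `./app` build context are illustrative, and the GPU reservation uses Compose's `deploy.resources` syntax, which requires the NVIDIA Container Toolkit on the host:

```yaml
services:
  ollama:
    image: ollama/ollama              # model server in its own container
    volumes:
      - ollama-models:/root/.ollama   # persist downloaded models across restarts
    ports:
      - "11434:11434"
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]     # GPU passthrough via NVIDIA Container Toolkit

  app:
    build: ./app                      # hypothetical application that calls the model API
    environment:
      - OLLAMA_HOST=http://ollama:11434
    depends_on:
      - ollama

volumes:
  ollama-models:                      # named volume for model persistence
```

Keeping the model server and the application in separate services means either can be rebuilt or scaled independently, while the named volume ensures downloaded models survive container restarts.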

Taught by

Professionals from the Industry

