Add intelligence to your app using OpenAI’s LLMs. Learn to generate structured recipes from ingredients with prompt engineering, render dynamic prompts, call the API, and process the outputs. Build a script that extracts recipes from messy HTML and stores them cleanly. By the end, your app will auto-generate and parse recipes.
Overview
Syllabus
- Unit 1: Making Basic LLM Calls
- Setting Up Your OpenAI Client
- Changing Personas with System Prompts
- Crafting Effective User Prompts
- Controlling Randomness with Temperature Settings
- Selecting the Right LLM
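Unit 1's lessons combine into a single chat-completion call: a system prompt sets the persona, the user prompt carries the request, `temperature` controls randomness, and `model` selects the LLM. A minimal sketch (the persona text and model name are illustrative assumptions, not the course's exact choices):

```python
def build_messages(system_prompt: str, user_prompt: str) -> list:
    """Pair a persona-setting system prompt with the user's request."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

def call_llm(user_prompt: str,
             system_prompt: str = "You are a helpful chef.",  # hypothetical persona
             model: str = "gpt-4o-mini",                      # example model name
             temperature: float = 0.7) -> str:
    """Send one chat-completion request and return the reply text."""
    from openai import OpenAI  # requires OPENAI_API_KEY in the environment
    client = OpenAI()
    response = client.chat.completions.create(
        model=model,
        messages=build_messages(system_prompt, user_prompt),
        temperature=temperature,  # 0 = near-deterministic, higher = more varied
    )
    return response.choices[0].message.content
```

Lower temperatures suit structured output such as recipes; higher ones suit creative variation.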
- Unit 2: Prompt Structure and Variables
- Loading Templates from Files
- Replacing Placeholders with Regular Expressions
- Custom Variable Substitution
- Integrating the Prompt Generation Pipeline
- Creating a Recipe Generator with Templates
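Unit 2's pipeline, loading a template file and then substituting its placeholders with a regular expression, might look like the sketch below. The `{{ name }}` placeholder syntax is an assumption for illustration; the course may use a different convention:

```python
import re

# Matches {{ name }} placeholders, with optional inner whitespace.
PLACEHOLDER = re.compile(r"\{\{\s*(\w+)\s*\}\}")

def render(template: str, variables: dict) -> str:
    """Replace every {{ name }} in the template with variables["name"]."""
    def substitute(match):
        name = match.group(1)
        if name not in variables:
            raise KeyError(f"missing template variable: {name}")
        return str(variables[name])
    return PLACEHOLDER.sub(substitute, template)

def render_file(path: str, variables: dict) -> str:
    """Load a prompt template from disk, then fill in its placeholders."""
    with open(path, encoding="utf-8") as f:
        return render(f.read(), variables)

# A recipe-generator prompt built from a template string:
prompt = render(
    "Create a recipe using: {{ ingredients }}. Serve {{ servings }} people.",
    {"ingredients": "eggs, spinach", "servings": 2},
)
```

Raising on a missing variable catches typos in templates early instead of sending a half-filled prompt to the model.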
- Unit 3: Creating the LLM Manager
- Setting Up the LLM Foundation
- Building the Prompt Rendering Pipeline
- Making the LLM Call
- Adding Robust Error Handling
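One way Unit 3's pieces could fit together: a manager class that renders a prompt, makes the call, and retries on transient failures. The class and method names are illustrative, and the LLM call is injected as a callable so it can be stubbed in tests:

```python
import time

class LLMManager:
    """Render prompts and call an LLM with simple retry-based error handling."""

    def __init__(self, send, retries: int = 3, backoff: float = 0.5):
        self.send = send        # callable: prompt text -> reply text
        self.retries = retries
        self.backoff = backoff  # seconds to wait; doubled after each failure

    def render(self, template: str, variables: dict) -> str:
        """Fill {placeholder} slots using str.format-style substitution."""
        return template.format(**variables)

    def run(self, template: str, variables: dict) -> str:
        """Render the prompt, then call the model, retrying on errors."""
        prompt = self.render(template, variables)
        delay = self.backoff
        for attempt in range(1, self.retries + 1):
            try:
                return self.send(prompt)
            except Exception:
                if attempt == self.retries:
                    raise           # out of retries: surface the error
                time.sleep(delay)
                delay *= 2          # exponential backoff
```

In production, `send` would wrap the real API call (e.g. `client.chat.completions.create(...)`); in tests it can be a deterministic stub.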
- Unit 4: Generating Recipes with AI
- Creating Recipe Generation Prompt Templates
- Implementing Recipe Generation Input Validation
- Connecting Flask Routes to AI
- Setting Up AI Response Parsing Structure
- Implementing AI Response Parsing Logic
- Completing the Recipe Generation Endpoint
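Unit 4 chains validation, the AI call, and response parsing behind a Flask route. A sketch under two stated assumptions: the AI is asked to reply as `TITLE:` plus `- ` item lines (a hypothetical format), and the route path `/recipes` is illustrative:

```python
def validate_input(data: dict) -> list:
    """Return a list of validation errors for a recipe-generation request."""
    errors = []
    ingredients = data.get("ingredients")
    if not isinstance(ingredients, list) or not ingredients:
        errors.append("'ingredients' must be a non-empty list")
    elif not all(isinstance(i, str) and i.strip() for i in ingredients):
        errors.append("every ingredient must be a non-empty string")
    return errors

def parse_recipe(text: str) -> dict:
    """Parse an AI reply shaped as 'TITLE: ...' followed by '- ' item lines."""
    recipe = {"title": "", "items": []}
    for line in text.splitlines():
        line = line.strip()
        if line.upper().startswith("TITLE:"):
            recipe["title"] = line[len("TITLE:"):].strip()
        elif line.startswith("- "):
            recipe["items"].append(line[2:])
    return recipe

def create_app(generate):
    """Wire the pieces into a Flask route; `generate` is the LLM call."""
    from flask import Flask, jsonify, request
    app = Flask(__name__)

    @app.post("/recipes")
    def make_recipe():
        data = request.get_json(force=True) or {}
        errors = validate_input(data)
        if errors:
            return jsonify({"errors": errors}), 400
        return jsonify(parse_recipe(generate(data))), 200

    return app
```

Validating before calling the model avoids spending tokens on requests that were never going to succeed.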
- Unit 5: Extracting Recipes from HTML
- Creating Recipe Extraction Prompt Templates
- Wiring AI to Extract Recipes
- Parsing AI Response into Recipe Data
- Storing Extracted Recipes in the Database
- Building the Complete CLI Pipeline
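Unit 5's CLI pipeline can be sketched end to end: prompt the model with raw HTML, parse its reply, and store the result. The prompt wording, the JSON reply shape, and the `recipes` table schema are all assumptions for illustration; the LLM call is injected so the pipeline is testable without an API key:

```python
import json
import sqlite3

# Hypothetical extraction prompt: ask for a strict JSON reply.
EXTRACT_TEMPLATE = (
    "Extract the recipe from the HTML below. Reply with JSON containing "
    '"title" and "ingredients" (a list of strings).\n\nHTML:\n{html}'
)

def store_recipe(conn: sqlite3.Connection, recipe: dict) -> int:
    """Insert one parsed recipe into SQLite; return its row id."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS recipes ("
        "id INTEGER PRIMARY KEY, title TEXT, ingredients TEXT)"
    )
    cur = conn.execute(
        "INSERT INTO recipes (title, ingredients) VALUES (?, ?)",
        (recipe["title"], json.dumps(recipe["ingredients"])),
    )
    conn.commit()
    return cur.lastrowid

def extract_and_store(html: str, send, conn: sqlite3.Connection) -> int:
    """Full pipeline: prompt the LLM with messy HTML, parse, and store."""
    reply = send(EXTRACT_TEMPLATE.format(html=html))
    recipe = json.loads(reply)  # assumes the model returned valid JSON
    return store_recipe(conn, recipe)
```

A real CLI would read the HTML path from `sys.argv` and handle `json.JSONDecodeError` when the model strays from the requested format.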