This lab gives you hands-on experience building a serverless backend, integrating an LLM service, and connecting it to a frontend application, enabling seamless communication between the layers of a modern web application.
Objectives
- Build an AWS Lambda function that invokes the Amazon Bedrock LLM service to generate flash cards from study notes, returning JSON the frontend can render.
- Create an API Gateway REST API with a gateway proxy method and configure CORS settings to enable communication between the frontend and backend.
- Integrate the frontend application with the backend by updating the API endpoint URL in the frontend source code.
- Test the end-to-end application by submitting study notes and verifying the generated flash cards.
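To preview what the Lambda and API Gateway objectives involve, the sketch below shows one way a handler might shape a model's text reply into flash cards and wrap them in the Lambda proxy-integration response format, with the CORS header the frontend needs. This is an illustrative assumption, not the lab's exact solution: the card fields, helper names, and "Q:/A:" reply format are invented here, and the Bedrock invocation itself is omitted.

```python
import json

def parse_flash_cards(model_text):
    """Parse 'Q: ... / A: ...' lines from the model's reply into card dicts.
    The Q:/A: reply format and card fields are assumptions for this sketch."""
    cards, question = [], None
    for line in model_text.splitlines():
        line = line.strip()
        if line.startswith("Q:"):
            question = line[2:].strip()
        elif line.startswith("A:") and question:
            cards.append({"question": question, "answer": line[2:].strip()})
            question = None
    return cards

def build_proxy_response(cards):
    """Wrap the cards in the response shape API Gateway's Lambda proxy
    integration expects: statusCode, headers, and a JSON string body."""
    return {
        "statusCode": 200,
        "headers": {
            "Content-Type": "application/json",
            # CORS: lets the browser-based frontend read this response
            "Access-Control-Allow-Origin": "*",
        },
        "body": json.dumps({"flashcards": cards}),
    }

# Example: shaping a model reply (the Bedrock call itself is not shown here)
reply = "Q: What is AWS Lambda?\nA: A serverless compute service."
print(build_proxy_response(parse_flash_cards(reply))["body"])
```

In the lab itself, the handler would first call Bedrock with the study notes and feed the model's reply through logic like this before returning.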
Prerequisites
- Basic understanding of Amazon API Gateway
- Basic understanding of AWS Lambda
- Familiarity with Python
Outline
Task 1: Set up and launch the frontend application
Task 2: Request Amazon Bedrock model access
Task 3: Create a Lambda function to generate flash cards
Task 4: Create an API Gateway REST API with a gateway proxy method
Task 5: Integrate the frontend application with the backend
Task 6: Test the end-to-end application