

Kafka for Developers - Data Contracts Using Schema Registry

Packt via Coursera

Overview

Updated in May 2025. This course now features Coursera Coach, a smarter way to learn with interactive, real-time conversations that help you test your knowledge, challenge assumptions, and deepen your understanding as you progress through the course.

Unlock the power of data contracts in Kafka with this comprehensive course on Schema Registry and AVRO serialization. You'll explore how to create robust data pipelines, ensuring compatibility and scalability across producer-consumer applications. By the end, you'll have mastered tools and techniques for efficient data processing with seamless schema evolution.

You'll start with the fundamentals of data serialization in Kafka, diving deep into popular formats such as AVRO, Protobuf, and Thrift. You'll then build hands-on expertise by setting up Kafka in a local environment using Docker, creating custom AVRO schemas, and generating Java records for real-world applications. The course includes practical exercises, such as building an end-to-end Coffee Shop order service and exploring schema evolution strategies in Schema Registry. You'll also learn naming conventions, logical schema types, and compatibility strategies that ensure smooth upgrades in production environments.

Designed for software developers and data engineers, this course assumes basic knowledge of Java and Kafka. Whether you're a beginner or looking to deepen your expertise in Kafka and Schema Registry, this course is your gateway to mastering data contracts.
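
To make the overview concrete, here is a minimal AVRO schema of the kind the course has you write; the record name, namespace, and fields are illustrative assumptions, not files from the course itself. The Gradle and Maven AVRO plugins covered in later modules generate a Java class from a file like this.

```json
{
  "type": "record",
  "name": "Greeting",
  "namespace": "com.example.avro",
  "doc": "Minimal example schema; names and fields are illustrative.",
  "fields": [
    { "name": "message", "type": "string" },
    { "name": "createdAt", "type": "long" }
  ]
}
```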

Syllabus

  • Getting Started with the Course
    • In this module, we will set the foundation for the course by providing an overview of its objectives, structure, and prerequisites. You’ll gain a clear understanding of what to expect and how to prepare for success in this learning journey.
  • Data Contract and Serialization in Kafka
    • In this module, we will dive into the intricacies of data contracts and serialization in Kafka. You’ll explore how serialization enhances Kafka's architecture and examine different serialization formats, such as AVRO, Protobuf, and Thrift, to understand their schema compatibility and applications.
  • Introduction to AVRO - A Data Serialization System
    • In this module, we will introduce you to AVRO, one of the most popular serialization systems. You’ll explore the reasons behind its popularity and learn how to build a simple AVRO schema to get hands-on experience with its functionality.
  • Kafka Setup and Demo in Local Using Docker
    • In this module, we will guide you through setting up a local Kafka environment using Docker Compose. You’ll practice producing and consuming messages with CLI tools and delve into AVRO serialization by leveraging the AVRO console producer and consumer for hands-on experience. A sample CLI session is sketched after the syllabus.
  • Greeting App - Base AVRO Project Setup - Gradle
    • In this module, we will set up the foundational components for a Greeting App using Gradle. You’ll learn how to configure the project for AVRO support and generate Java records from schema files, preparing the groundwork for seamless AVRO integration.
  • Greeting App - Base AVRO Project Setup - Maven
    • In this module, we will set up the Greeting App project using Maven as the build tool. You’ll learn how to configure Maven for AVRO support and generate Java records from schema files, ensuring the project is ready for AVRO-based serialization.
  • Build AVRO Producer and Consumer in Java
    • In this module, we will guide you through building an AVRO-based Kafka producer and consumer using Java. You’ll learn how to implement serialization and deserialization for seamless data exchange within Kafka topics; a minimal producer sketch follows the syllabus.
  • Coffee Shop Order Service Using AVRO - A Real-Time Use Case
    • In this module, we will build a real-time Coffee Shop Order Service using AVRO and Kafka. You’ll start with an application overview and progress through project setup, schema creation, and AVRO class generation. Finally, you’ll create producers and consumers to simulate and process coffee shop orders in a real-time streaming scenario.
  • Logical Schema Types in AVRO
    • In this module, we will explore logical schema types in AVRO and their practical applications. You’ll learn to enhance the CoffeeOrder schema by adding logical types like timestamps, decimals, UUIDs, and dates to improve data accuracy and functionality in real-world scenarios. An example schema using these types appears after the syllabus.
  • AVRO Record - Under the Hood
    • In this module, we will delve into the inner workings of an AVRO record. You’ll uncover how data is stored, organized, and encoded, gaining insights into its efficiency and compatibility with evolving schemas.
  • Schema Changes in AVRO - Issues without Schema Registry
    • In this module, we will explore the effects of schema changes in AVRO, focusing on the consumer application's failure to process data with an updated schema. You’ll gain practical insights into schema evolution challenges and why a schema registry is crucial for maintaining compatibility.
  • Introduction to Schema Registry
    • In this module, we will introduce you to the schema registry and its critical role in handling AVRO schemas effectively. You’ll learn how to publish and consume records using the schema registry, interact with its REST API (sample calls appear after the syllabus), and work with "key" fields as AVRO records to enhance data management and compatibility.
  • Data Evolution Using Schema Registry
    • In this module, we will focus on data evolution through the lens of schema changes managed by a schema registry. You’ll learn to update project configurations, explore compatibility types like backward, forward, and full, and understand the consequences of modifying AVRO schemas, enabling you to manage data evolution effectively. An example of setting a subject’s compatibility mode appears after the syllabus.
  • Schema Naming Strategies
    • In this module, we will cover different schema naming strategies in AVRO and their practical applications. You’ll create a schema for Coffee Update events and learn how to utilize RecordNameStrategy to manage schema versions while ensuring seamless data processing; a one-line configuration example follows the syllabus.
  • Build a Coffee Order Service Using Spring Boot and Schema Registry
    • In this module, we will create a coffee order service application using Spring Boot, Kafka, and Schema Registry. We'll cover every step, from setting up the project using Gradle or Maven to building RESTful endpoints for creating and updating coffee orders. Additionally, we'll explore how to configure Kafka producers and consumers to publish and process coffee order events, leveraging AVRO for structured data serialization. A controller sketch closes out the examples after the syllabus.
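
For the "Kafka Setup and Demo in Local Using Docker" module, a CLI session might look like the sketch below. The topic name and ports are assumptions (9092 for the broker and 8081 for Schema Registry are common defaults), and exact flags vary slightly across Kafka and Confluent Platform versions.

```sh
# Create a topic on the local broker
kafka-topics --bootstrap-server localhost:9092 --create --topic greeting-topic

# Produce and consume plain-text messages
kafka-console-producer --bootstrap-server localhost:9092 --topic greeting-topic
kafka-console-consumer --bootstrap-server localhost:9092 --topic greeting-topic --from-beginning

# Produce AVRO messages with an inline schema; the console producer
# registers the schema with Schema Registry on first use
kafka-avro-console-producer --bootstrap-server localhost:9092 --topic greeting-topic \
  --property schema.registry.url=http://localhost:8081 \
  --property value.schema='{"type":"record","name":"Greeting","fields":[{"name":"message","type":"string"}]}'
```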
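
For "Build AVRO Producer and Consumer in Java", the producer side might look roughly like this sketch. Greeting stands in for the Java class generated from the schema shown earlier; the topic name and Schema Registry URL are assumptions.

```java
import io.confluent.kafka.serializers.KafkaAvroSerializer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class GreetingProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
        // The serializer looks up (or registers) the schema here before sending
        props.put("schema.registry.url", "http://localhost:8081");

        // Greeting is assumed to be generated from the .avsc file by the AVRO plugin
        Greeting greeting = Greeting.newBuilder()
                .setMessage("Hello, Kafka!")
                .setCreatedAt(System.currentTimeMillis())
                .build();

        try (KafkaProducer<String, Greeting> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("greeting-topic", "key-1", greeting));
        }
    }
}
```

The consumer side mirrors this with KafkaAvroDeserializer and the specific.avro.reader property set to true, so records deserialize into the generated class rather than a generic record.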
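
The "Logical Schema Types in AVRO" module enhances the CoffeeOrder schema with logical types; a sketch of what that can look like follows, with the field names assumed for illustration.

```json
{
  "type": "record",
  "name": "CoffeeOrder",
  "namespace": "com.example.avro",
  "fields": [
    { "name": "id", "type": { "type": "string", "logicalType": "uuid" } },
    { "name": "orderedAt", "type": { "type": "long", "logicalType": "timestamp-millis" } },
    { "name": "orderDate", "type": { "type": "int", "logicalType": "date" } },
    { "name": "total",
      "type": { "type": "bytes", "logicalType": "decimal", "precision": 6, "scale": 2 } }
  ]
}
```

Logical types annotate a primitive type with extra meaning: the wire format stays a plain long, int, or bytes, while the generated Java code can expose the fields as Instant, LocalDate, and BigDecimal.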
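
For the Schema Registry module, the REST API can be explored with plain curl; the subject name below follows the default TopicNameStrategy (topic name plus "-value") and is an assumption.

```sh
# List all subjects the registry knows about
curl http://localhost:8081/subjects

# Fetch the latest registered schema for a topic's value subject
curl http://localhost:8081/subjects/greeting-topic-value/versions/latest
```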
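
For "Data Evolution Using Schema Registry", compatibility modes are set per subject (or globally) through the same REST API; BACKWARD appears here only as an example value.

```sh
# Set the compatibility mode for one subject (BACKWARD, FORWARD, FULL, ...)
curl -X PUT -H "Content-Type: application/json" \
  --data '{"compatibility": "BACKWARD"}' \
  http://localhost:8081/config/greeting-topic-value
```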
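
For "Schema Naming Strategies", switching from the default TopicNameStrategy to RecordNameStrategy is a single producer property, added to the Properties object shown in the producer sketch above.

```java
// Derive the subject from the record's full name instead of the topic name,
// so one topic can carry multiple event types (e.g., orders and updates)
props.put("value.subject.name.strategy",
        "io.confluent.kafka.serializers.subject.RecordNameStrategy");
```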
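
Finally, for the Spring Boot module, a REST endpoint that publishes coffee order events might look like the sketch below; the path, topic name, and CoffeeOrder class (generated from the AVRO schema) are assumptions.

```java
import org.springframework.http.ResponseEntity;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class CoffeeOrderController {

    private final KafkaTemplate<String, CoffeeOrder> kafkaTemplate;

    public CoffeeOrderController(KafkaTemplate<String, CoffeeOrder> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @PostMapping("/v1/coffee-orders")
    public ResponseEntity<CoffeeOrder> createOrder(@RequestBody CoffeeOrder order) {
        // The AVRO serializer configured in application properties handles
        // Schema Registry lookups when this record is published
        kafkaTemplate.send("coffee-orders", order.getId().toString(), order);
        return ResponseEntity.ok(order);
    }
}
```

In practice a JSON DTO would typically be mapped onto the generated AVRO class rather than binding it directly from the request body; the sketch keeps that step out for brevity.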

Taught by

Packt - Course Instructors

