- Practice using core data engineering platforms.
- Solve problems data engineers encounter in the real world.
- Create a data engineering project using open-source tools.
Overview
Want to elevate your data engineering skills, from pipeline building and management to effectively using tools such as dbt and Airflow? In this learning path, you'll learn how to solve common problems encountered by data engineers in the workplace. You'll also find out how to set up a data warehouse and build a data engineering pipeline from scratch.
Syllabus
Courses under this program:
Course 1: Data Engineering with dbt
-Learn how to set up, run, and manage a dbt project.
Course 2: Problem-Solving Strategies for Data Engineers
-Data engineers face a wide variety of problems every day. Learn best practices for approaching these typical challenges.
Course 3: ETL in Python and SQL
-Gain the knowledge you need to build data pipelines in a data-driven world.
Course 4: Fundamentals of Data Transformation for Data Engineering
-Get an introduction to the transformation landscape and how the pieces of data transformation fit together.
Course 5: Complete Guide to Data Lakes and Lakehouses
-Build the foundational knowledge and practical skills essential for data engineers, data scientists, and related professionals to effectively design and utilize data lakes.
Course 6: Learning Apache Airflow
-Get an introduction to Apache Airflow—its uses, structure, how to get it up and running, and how to create and execute workflows.
Course 7: Data Engineering Pipeline Management with Apache Airflow
-Explore ways to work with role-based access control, manage SLAs, schedule DAGs with datasets, work with Airflow plugins, and scale Airflow.
Course 8: Data Engineering Project: Build Streaming Ingestion Pipelines for Snowflake with AWS
-Upskill as a data professional by learning how to build streaming pipelines using Snowflake, Kafka, and AWS.
Course 9: Data Pipeline Automation with GitHub Actions Using R and Python
-Learn how to set up workflows on GitHub Actions to automate processes with both R and Python.
Course 10: End-to-End Data Engineering Project
-Learn how to create an end-to-end data engineering project using open tools from the modern data stack to turn scattered data into a model that drives insights and decision-making.
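The kind of scheduled automation covered in Course 9 is configured in a YAML workflow file. The sketch below is a hypothetical example, not from the course; the file path, script names, and schedule are placeholders.

```yaml
# Hypothetical file: .github/workflows/pipeline.yml
name: data-pipeline
on:
  schedule:
    - cron: "0 6 * * *"   # run daily at 06:00 UTC
  workflow_dispatch:       # also allow manual runs
jobs:
  refresh:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - uses: r-lib/actions/setup-r@v2
      - run: python etl/extract.py    # placeholder Python step
      - run: Rscript reports/render.R # placeholder R step
```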
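The extract-transform-load pattern covered in Course 3 can be sketched with just the Python standard library. This is a minimal illustration, not course material: the table name, column names, and sample data below are all hypothetical.

```python
import csv
import io
import sqlite3

# Extract: read raw CSV data. An in-memory string keeps the example
# self-contained; in practice this would come from a file or an API.
raw_csv = """order_id,amount,currency
1,19.99,usd
2,5.00,usd
3,12.50,eur
"""
rows = list(csv.DictReader(io.StringIO(raw_csv)))

# Transform: normalize currency codes and convert amounts to integer
# cents (round() avoids float-precision errors like 19.99*100 -> 1998.99...).
transformed = [
    (int(r["order_id"]), round(float(r["amount"]) * 100), r["currency"].upper())
    for r in rows
]

# Load: write the cleaned rows into a SQLite table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (order_id INTEGER PRIMARY KEY, "
    "amount_cents INTEGER, currency TEXT)"
)
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", transformed)
conn.commit()

# A downstream query over the loaded data.
total_usd = conn.execute(
    "SELECT SUM(amount_cents) FROM orders WHERE currency = 'USD'"
).fetchone()[0]
print(total_usd)  # 2499
```

Real pipelines swap each stage for something sturdier (an API client, a transformation framework such as dbt, a warehouse loader), but the three-stage shape stays the same.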
Taught by
Mark Freeman II, Lynn Langit, Andreas Kretz, Jennifer Ebe, Janani Ravi, Sagar Suri and Thalia Barrera