Overview
Learn how to deploy and manage AI applications at the edge using ZEDEDA's Kubernetes service in this 24-minute conference showcase. Discover why edge AI deployment is becoming critical as bandwidth limitations, high latency, and data privacy concerns make cloud-based analysis impractical for sensor data.

Explore the unique challenges of edge AI deployment, including managing diverse hardware configurations, ensuring autonomous operation in disconnected environments, and scaling AI model updates across distributed infrastructure. Understand how Kubernetes provides an ideal solution for packaging and managing complex AI pipelines at the edge through its lightweight architecture and robust ecosystem.

Follow a practical demonstration of deploying an edge AI solution for car classification using ZEDEDA's platform, including the setup of a multi-component application with an OpenVINO inference server, a model-pulling sidecar, and a demo client application through Helm charts. See how ZEDEDA's unified control plane enables zero-touch provisioning and lifecycle management while keeping models private within on-premise networks, without cloud exposure. Witness real-time inference capabilities in action and learn how to leverage ZEDEDA's open-source repositories to build custom edge AI solutions for your own applications.
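As a rough illustration of the kind of multi-component Helm deployment the demonstration describes, the sketch below shows hypothetical chart values wiring together an OpenVINO inference server, a model-pulling sidecar, and a demo client. All names, images, and paths here are illustrative assumptions, not the actual chart used in the session.

```yaml
# Hypothetical values.yaml sketch for a car-classification edge AI chart.
# Component names, images, and ports are assumptions for illustration only.
inferenceServer:
  # OpenVINO Model Server container serving the classification model
  image: openvino/model_server:latest
  port: 9000                # gRPC inference endpoint
  modelPath: /models/cars   # shared volume populated by the sidecar

modelSidecar:
  # Sidecar that pulls model files into the shared volume at startup,
  # so models stay within the on-premise network (no cloud exposure)
  image: example/model-puller:latest   # hypothetical image
  source: http://local-registry/models/cars   # assumed in-network source

demoClient:
  # Client app that sends frames to the inference server and shows results
  image: example/car-classifier-demo:latest   # hypothetical image
  inferenceEndpoint: inference-server:9000
```

A chart structured this way would typically be installed with a single `helm install` against the edge cluster, with ZEDEDA's control plane handling provisioning and lifecycle updates across devices.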
Syllabus
Manage Edge AI Using ZEDEDA Kubernetes Service
Taught by
Tech Field Day