
YouTube

Accelerating Inference at the Edge - Unlock Scalable, Secure, and Low Latency Connectivity for AI Applications

Open Compute Project via YouTube

Overview

Learn how IP over DWDM (IPoDWDM) technology revolutionizes edge AI inference by collapsing traditional multi-layer optical transport into a unified, efficient architecture. Explore the networking challenges facing AI inference applications in retail analytics, fraud detection, smart cities, and autonomous systems, where traditional optical transport creates latency bottlenecks and scalability limitations. Discover how IPoDWDM eliminates these constraints by integrating IP and optical layers, delivering significantly lower latency, improved bandwidth efficiency, and reduced total cost of ownership. Examine practical deployment strategies for dynamic AI workload delivery across edge, metro, and regional network nodes, and understand how this unified architecture bridges the gap between theoretical AI potential and real-world implementation requirements for secure, real-time, high-throughput data exchange at the edge.

Syllabus

Accelerating Inference at the Edge - Unlock Scalable, Secure, and Low Latency Connectivity for AI Applications

Taught by

Open Compute Project

