

CONGO - Compressive Online Gradient Optimization

Centre for Networked Intelligence, IISc via YouTube

Overview

Attend this online lecture exploring CONGO (Compressive Online Gradient Optimization), a framework for zeroth-order online convex optimization problems whose gradients are sparse. Learn how the approach exploits gradient sparsity to obtain useful gradient estimates from a limited number of function samples, a setting motivated by large-scale queueing networks that process time-sensitive jobs.

Discover how CONGO applies compressive sensing to achieve optimal regret bounds in which the full problem dimensionality does not appear: the number of samples required per gradient estimate scales with the gradient's sparsity rather than with the ambient dimension. Explore the motivating application of resource allocation in queueing systems, where jobs pass through multiple queues, service times depend on the resources allocated, and the goal is to balance end-to-end latency against total resource cost. Examine numerical simulations and real-world microservices benchmarks demonstrating CONGO's advantage over gradient descent approaches that ignore sparsity.

The presentation covers joint research with Jeremy Carleton, Prathik Vijaykumar, Divyanshu Saxena, Dheeraj Narasimha, and Aditya Akella, delivered by Prof. Srinivas Shakkottai of Texas A&M University, an expert in wireless networks, reinforcement learning, multi-agent learning, and networked systems.
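To make the core idea concrete, the sketch below illustrates (in a simplified form, not the authors' actual CONGO algorithm) how compressive sensing can recover a sparse gradient from far fewer function evaluations than the problem dimension: probe the function along a small number of random directions, treat the resulting finite differences as compressive measurements of the gradient, and recover the sparse gradient with orthogonal matching pursuit. The function `estimate_sparse_gradient` and all parameter choices here are illustrative assumptions.

```python
import numpy as np

def estimate_sparse_gradient(f, x, s, m, delta=1e-4, seed=0):
    """Estimate the gradient of f at x, assumed s-sparse, from only
    m directional finite differences (m << dim(x)).

    Illustrative sketch: random Gaussian probes + orthogonal matching
    pursuit, not the CONGO algorithm itself."""
    rng = np.random.default_rng(seed)
    d = x.size
    # Random measurement matrix; each row is one probe direction.
    A = rng.standard_normal((m, d)) / np.sqrt(m)
    # Each finite difference approximates one inner product a_i . grad f(x),
    # so y ~= A @ grad f(x): a compressive measurement of the gradient.
    y = np.array([(f(x + delta * a) - f(x)) / delta for a in A])
    # Orthogonal matching pursuit: greedily pick the column most
    # correlated with the residual, then refit by least squares.
    support, residual = [], y.copy()
    coef = np.zeros(0)
    for _ in range(s):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    grad = np.zeros(d)
    grad[support] = coef
    return grad

# Toy objective in 100 dimensions whose gradient is 2-sparse:
# f(x) = 3*x[4]^2 + 2*x[17], so grad f(1,...,1) has 6 at index 4
# and 2 at index 17. Only m = 40 samples are used, not 100.
d = 100
f = lambda x: 3 * x[4] ** 2 + 2 * x[17]
g = estimate_sparse_gradient(f, np.ones(d), s=2, m=40)
```

Here the sample count per gradient estimate is governed by the sparsity `s` (up to logarithmic factors in compressive-sensing theory), which mirrors the scaling the lecture attributes to CONGO.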

Syllabus

Time: 5:30 PM - 6:30 PM IST

Taught by

Centre for Networked Intelligence, IISc

