Overview
Learn how to handle large-scale optimization problems in Julia by leveraging constraint generators and parallelization techniques in this conference talk. Discover GenOpt, a JuMP extension that enables explicit specification of parameterized constraint groups, allowing existing JuMP codebases to scale with minimal modification. Explore how grouping constraints into parametric groups makes it possible to parallelize differentiation on GPUs and to communicate JuMP models efficiently across clusters for distributed solving. Understand the memory and computational advantages of processing constraint groups at the MathOptInterface level rather than the JuMP level, which reduces overhead and accelerates model generation. See how this approach makes JuMP models with billions of constraints manageable when they are generated from just a few groups, and learn about the integration with ExaModels for GPU-accelerated differentiation. Gain insight into recovering the grouped model printing functionality that was available in JuMP versions prior to v0.19, making large-scale optimization more accessible and efficient.
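To make the idea of a "parameterized constraint group" concrete, here is a minimal sketch in plain JuMP (not the GenOpt API, which the talk introduces): a single indexed `@constraint` call generates a whole family of structurally identical constraints from one template, which is the kind of grouping the talk exploits for parallel differentiation and compact model communication.

```julia
# Minimal sketch using standard JuMP syntax only; GenOpt's own
# constraint-generator API is not reproduced here.
using JuMP

model = Model()
n = 1_000
@variable(model, x[1:n] >= 0)

# One parameterized group: a single template stamps out n
# structurally identical constraints, indexed by i.
@constraint(model, balance[i in 1:n], x[i] <= 2i)

# The group is addressable as a whole or per element, e.g. balance[10];
# a few such groups can describe a model with billions of constraints.
```

Because the entire group is described by one template plus an index set, a tool operating at the MathOptInterface level can process the group in bulk rather than constraint by constraint, which is the source of the memory and speed advantages described above.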
Syllabus
Large Scale JuMP Models with Constraint Generators
Taught by
The Julia Programming Language