Detailed explanation of Scaling Transformers and the Terraformer architecture, focusing on leveraging sparsity to improve efficiency and speed in large language models while maintaining accuracy.
Explore the grafting technique for transferring learning rate schedules between optimizers, improving deep learning model performance and reducing computational costs in hyperparameter tuning.
Explores limitations of differentiable programming in machine learning, focusing on chaos-based failures in various systems. Discusses alternatives to backpropagation for gradient estimation in complex, stochastic environments.
Explore Autoregressive Diffusion Models, a novel approach combining autoregressive and diffusion models for efficient, order-agnostic generation and compression of text and image data.
Explore Topographic VAEs: a novel approach to deep generative models with organized latent variables, bridging topographic organization and equivariance in neural networks for improved feature learning and transformation handling.
Explores an innovative Transformer model with unbounded memory, enabling processing of arbitrarily long sequences. Discusses continuous attention mechanisms, sticky memories, and potential applications in language modeling.
Explore DeepMind's PonderNet, a novel approach to adaptive computation in neural networks. Learn how it dynamically allocates computational steps based on problem complexity, improving efficiency and performance.
Explores the Dimpled Manifold Model to explain adversarial examples in machine learning, challenging existing theories and providing experimental evidence for a new perspective on neural network vulnerabilities.
Explore XCiT, a novel Transformer architecture for computer vision using cross-covariance attention. Learn about its linear complexity, scalability, and performance across various vision tasks.
Explores a unified framework for implicit differentiation in machine learning, enabling automatic differentiation of optimization problems without manual derivations or loop unrolling, with applications in meta-learning and hyperparameter optimization.
Explores the hypothesis that reward maximization drives intelligence development, discussing its implications for AGI and reinforcement learning, with critical analysis and commentary.
Explore how Diffusion Models outperform GANs in image synthesis, with improved architecture, classifier guidance, and better sample quality while maintaining distribution coverage.
Explores MLP-Mixer, a novel architecture using only multi-layer perceptrons for image classification. Discusses its design, performance, and implications for computer vision beyond CNNs and Transformers.
Explore Facebook AI's DINO system, combining self-supervised learning with Vision Transformers for impressive image analysis without labels. Learn about its architecture, methodology, and groundbreaking results in computer vision.
Explore cutting-edge reinforcement learning with DreamerV2, a model-based AI agent achieving human-level performance on Atari games using discrete world models and latent space predictions.