
End-to-End Differentiable Proving - Tim Rocktäschel, University of Oxford

Alan Turing Institute via YouTube

Overview

Explore an innovative approach to knowledge base reasoning in this 42-minute lecture by Tim Rocktäschel from the University of Oxford, presented at the Alan Turing Institute. Delve into the concept of end-to-end differentiable proving using neural networks that operate on dense vector representations of symbols. Discover how this method combines symbolic reasoning with learning subsymbolic vector representations by replacing symbolic unification with a differentiable computation using a radial basis function kernel. Learn how gradient descent enables the neural network to infer facts from incomplete knowledge bases, place similar symbols in close proximity within a vector space, prove queries using these similarities, induce logical rules, and perform multi-hop reasoning. Examine the performance of this architecture compared to ComplEx, a state-of-the-art neural link prediction model, across four benchmark knowledge bases, and understand its ability to induce interpretable function-free first-order logic rules.
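The core idea described above, replacing exact symbolic unification with a differentiable similarity between symbol embeddings, can be sketched in a few lines. The snippet below is a minimal illustration, not the lecture's actual implementation: the embedding vectors, the bandwidth parameter `mu`, and the function name `rbf_unify` are all hypothetical choices for demonstration.

```python
import numpy as np

def rbf_unify(u, v, mu=1.0):
    # Soft unification: rather than requiring two symbols to be identical,
    # score their embeddings with a radial basis function (RBF) kernel.
    # Identical vectors score 1.0; the score decays toward 0 with distance,
    # and the whole computation is differentiable, so gradient descent can
    # pull related symbols close together in the vector space.
    return np.exp(-np.sum((u - v) ** 2) / (2 * mu ** 2))

# Toy embeddings (hypothetical): after training, symbols used in similar
# facts, e.g. grandfatherOf and grandpaOf, end up near each other.
grandfather = np.array([0.90, 0.10, 0.00])
grandpa     = np.array([0.85, 0.15, 0.05])
located_in  = np.array([0.00, 0.00, 1.00])

print(rbf_unify(grandfather, grandpa))     # high score: symbols unify softly
print(rbf_unify(grandfather, located_in))  # low score: dissimilar symbols
```

Because the unification score is a smooth function of the embeddings, a query can be proved through similar (not just identical) symbols, which is what lets the prover infer facts missing from an incomplete knowledge base.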

Syllabus

End-to-End Differentiable Proving: Tim Rocktäschel, University of Oxford

Taught by

Alan Turing Institute

