Machine Learning Street Talk (MLST)
Technology
Welcome to the Christmas special community edition of MLST! We discuss some recent and interesting papers: Pedro Domingos (are NNs kernel machines?), DeepMind (can NNs out-reason symbolic machines?), Anna Rogers (When BERT Plays The Lottery, All Tickets Are Winning), and Prof. Mark Bishop (even causal methods won't deliver understanding). We also cover our favourite bits from the recent Montreal AI event run by Prof. Gary Marcus (including Rich Sutton, Danny Kahneman and Christof Koch), and we respond to a reader letter on capsule networks. Then we do a deep dive into Type Theory and Lambda Calculus with community member Alex Mattick (a small illustrative sketch follows the timestamps below). In the final hour we discuss inductive priors and label information density with another one of our Discord community members.
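For context on the Pedro Domingos paper covered below: its headline result, stated roughly here (our paraphrase, not a quote from the episode), is that a model trained by gradient descent with a small enough step size can be rewritten as a kernel machine over its training points,

    y(x) \approx \sum_i a_i \, K_{path}(x, x_i) + b

where the "path kernel" K_{path}(x, x_i) integrates the dot product of the model's weight gradients at x and x_i along the gradient-descent trajectory, the coefficients a_i aggregate the loss derivatives for each training example along that trajectory, and b is the initial model's output.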
Panel: Dr. Tim Scarfe, Yannic Kilcher, Alex Stenlake, Dr. Keith Duggar
Enjoy the show and don't forget to subscribe!
00:00:00 Welcome to Christmas Special!
00:00:44 SOTA meme
00:01:30 Happy Christmas!
00:03:11 Paper -- DeepMind - Outperforming neuro-symbolic models with NNs (Ding et al.)
00:08:57 What does it mean to understand?
00:17:37 Paper -- Prof. Mark Bishop - Artificial Intelligence is stupid and causal reasoning won't fix it
00:25:39 Paper -- Pedro Domingos - Every Model Learned by Gradient Descent Is Approximately a Kernel Machine
00:31:07 Paper - Bengio - Inductive Biases for Deep Learning of Higher-Level Cognition
00:32:54 Paper -- Anna Rogers - When BERT Plays The Lottery, All Tickets Are Winning
00:37:16 Montreal AI event - Gary Marcus on reasoning
00:40:37 Montreal AI event -- Rich Sutton on universal theory of AI
00:49:45 Montreal AI event -- Danny Kahneman, System 1 vs 2 and Generative Models a la the free energy principle
01:02:57 Montreal AI event -- Christof Koch - Neuroscience is hard
01:10:55 Markus Carr -- reader letter on capsule networks
01:13:21 Alex's response to Markus Carr
01:22:06 Type theory segment -- with Alex Mattick from Discord
01:24:45 Type theory segment -- What is Type Theory
01:28:12 Type theory segment -- Difference between functional and OOP languages
01:29:03 Type theory segment -- Lambda calculus
01:30:46 Type theory segment -- Closures
01:35:05 Type theory segment -- Term rewriting (confluency and termination)
01:42:02 Type theory segment -- eta term rewriting system - Lambda Calculus
01:54:44 Type theory segment -- Types / semantics
02:06:26 Type theory segment -- Calculus of constructions
02:09:27 Type theory segment -- Homotopy type theory
02:11:02 Type theory segment -- Deep learning link
02:17:27 Jan from Discord segment -- Chrome MRU skit
02:18:56 Jan from Discord segment -- Inductive priors (with XMaster96/Jan from Discord)
02:37:59 Jan from Discord segment -- Label information density (with XMaster96/Jan from Discord)
02:55:13 Outro
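As a companion to the Type Theory segment, here is a minimal Python sketch (our own illustration for the show notes, not code from the episode) of two of the Lambda Calculus ideas discussed there: closures that capture their environment, and Church numerals built purely from function abstraction and application.

# Church numerals: the number n is represented by a closure that applies a
# function f to an argument x exactly n times (pure abstraction/application).
def church(n):
    def numeral(f):
        def apply_n_times(x):
            for _ in range(n):
                x = f(x)
            return x
        return apply_n_times
    return numeral

# SUCC = \n. \f. \x. f (n f x) -- each nested lambda is a closure over n and f.
succ = lambda n: lambda f: lambda x: f(n(f)(x))

# PLUS = \m. \n. \f. \x. m f (n f x)
plus = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

# Convert a Church numeral back to a Python int by counting applications.
def to_int(numeral):
    return numeral(lambda k: k + 1)(0)

print(to_int(succ(church(2))))             # 3
print(to_int(plus(church(2))(church(3))))  # 5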