Machine Learning Street Talk (MLST)
Technology
We are now sponsored by Weights and Biases! Please visit our sponsor link: http://wandb.me/MLST
Patreon: https://www.patreon.com/mlst
Yann LeCun thinks it is specious to say neural network models are interpolating, because in high dimensions everything is extrapolation. Recently, Dr. Randall Balestriero, Dr. Jerome Pesenti, and Prof. Yann LeCun released their paper "Learning in High Dimension Always Amounts to Extrapolation". This discussion has completely changed how we think about neural networks and their behaviour.
[00:00:00] Pre-intro
[00:11:58] Intro Part 1: On linearisation in NNs
[00:28:17] Intro Part 2: On interpolation in NNs
[00:47:45] Intro Part 3: On the curse
[00:48:19] LeCun
[01:40:51] Randall Balestriero
YouTube version: https://youtu.be/86ib0sfdFtw
Robert Lange on NN Pruning and Collective Intelligence
WelcomeAIOverlords (Zak Jost)
Facebook Research - Unsupervised Translation of Programming Languages
Francois Chollet - On the Measure of Intelligence
OpenAI GPT-3: Language Models are Few-Shot Learners
Jordan Edwards: ML Engineering and DevOps on AzureML
One Shot and Metric Learning - Quadruplet Loss (Machine Learning Dojo)
Harri Valpola: System 2 AI and Planning in Model-Based Reinforcement Learning
ICLR 2020: Yoshua Bengio and the Nature of Consciousness
ICLR 2020: Yann LeCun and Energy-Based Models
The Lottery Ticket Hypothesis with Jonathan Frankle
Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
CURL: Contrastive Unsupervised Representations for Reinforcement Learning
Exploring Open-Ended Algorithms: POET