The Mathematical Foundations of Intelligence [Professor Yi Ma]
Machine Learning Street Talk (MLST)
2025/12/13

Shownote
What if everything we think we know about AI understanding is wrong? Is compression the key to intelligence? Or is there something more—a leap from memorization to true abstraction? In this fascinating conversation, we sit down with **Professor Yi Ma**...
Highlights
This episode features a deep, theory-driven conversation with Professor Yi Ma on the mathematical foundations of intelligence—challenging mainstream assumptions about how AI systems learn, represent, and reason about the world.
Chapters
Introduction
00:00 The First Principles Book & Research Vision
02:08 Two Pillars: Parsimony & Consistency
05:21 Evolution vs. Learning: The Compression Mechanism
09:50 LLMs: Memorization Masquerading as Understanding
14:36 The Leap to Abstraction: Empirical vs. Scientific
19:55 Platonism, Deduction & The ARC Challenge
27:30 Specialization & The Cybernetic Legacy
35:57 Deriving Maximum Rate Reduction
41:23 The Illusion of 3D Understanding: Sora & NeRF
48:21 All Roads Lead to Rome: The Role of Noise
59:56 Benign Non-Convexity: Why Optimization Works
1:00:14 Double Descent & The Myth of Overfitting
1:06:35 Self-Consistency: Closed-Loop Learning
1:14:26 Deriving Transformers from First Principles
1:21:03 Verification & The Kevin Murphy Question
1:30:11 CRATE vs. ViT: White-Box AI & Conclusion
Transcript
Yi Ma: In the past 10 years, I think the question of intelligence, or artificial intelligence, has captured people's imagination. I'm one of them, but it took me about 10 years to try to really understand: can we actually make understanding intelligence a...