Google DeepMind Developers: How Nano Banana Was Made
The a16z Show
2025/10/28
Shownote
Google DeepMind’s new image model Nano Banana took the internet by storm. In this episode, we sit down with Principal Scientist Oliver Wang and Group Product Manager Nicole Brichtova to discuss how Nano Banana was created, why it’s so viral, and the futur...
Highlights
A new AI image model from Google DeepMind has captured widespread attention, not just for its technical prowess but for how it's reshaping creative expression. In this conversation with key developers behind the project, we explore the ideas and design choices that fueled its rapid adoption and what it reveals about the future of generative AI in art and beyond.
Chapters
00:00 What made Nano Banana go viral overnight?
05:00 How is AI redefining the boundaries of art and creativity?
07:20 Is art still art if a machine makes it? The role of intent.
12:21 Why one-size-fits-all interfaces don’t work for AI tools
14:46 Can AI teach a kindergartner to draw?
19:49 When should AI think in 2D vs. 3D?
22:30 How do we measure if a character looks like itself?
25:03 What gets left out when building an AI model?
27:25 How does AI know what you really want to edit?
29:44 Are pixels still the best way to create digital images?
32:21 Who’s using Nano Banana—and how are they hacking it?
34:54 From manga to movies: can AI animate your script?
37:31 If you can generate an image, can you generate a video?
40:15 What happens when AI starts solving math problems with drawings?
Transcript
Speaker 2: These models are allowing creators to do less of the tedious parts of the job, right? They can be more creative, and they can spend, you know, 90% of their time being creative versus 90% of their time, like, editing things and doing these tedious kind o...