

Hungry Hungry Hippos (H3) and Language Modeling with State Space Models
Deep Papers, a Podcast from AI Pub and Arize AI
Deep Papers is a podcast series featuring deep dives on today’s seminal AI papers and research. Hosted by AI Pub creator Brian Burns and Arize AI founders Jason Lopatecki and Aparna Dhinakaran, each episode profiles the people and techniques behind cutting-edge breakthroughs in machine learning.
About This Episode
In this episode, we interview Dan Fu and Tri Dao, inventors of “Hungry Hungry Hippos” (aka “H3”). This language modeling architecture performs comparably to transformers while admitting much longer context lengths: for the technically inclined, its compute scales as O(n log n) in sequence length rather than the O(n²) of attention. Listen to learn about the major ideas and history behind H3, state space models and what makes them special, what products can be built with long-context language models, and hints of Dan and Tri’s future (unpublished) research.
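For readers curious where the O(n log n) scaling comes from: state space layers can be applied as a long convolution over the input sequence, and a convolution of two length-n signals can be computed with the FFT in O(n log n) time instead of the O(n²) of a naive pairwise computation. The snippet below is a minimal illustrative sketch of that idea using NumPy; the function name and setup are ours, not the authors’ H3 implementation.

```python
import numpy as np

def fft_long_convolution(u, k):
    """Convolve a length-n input u with a length-n filter k in O(n log n)
    time via the FFT, rather than the O(n^2) of a naive computation.
    Illustrative sketch only; not the authors' H3 code."""
    n = len(u)
    # Zero-pad to 2n so the circular convolution matches a causal linear one.
    fft_size = 2 * n
    u_f = np.fft.rfft(u, n=fft_size)
    k_f = np.fft.rfft(k, n=fft_size)
    # Multiply in frequency space, invert, and keep the first n outputs.
    y = np.fft.irfft(u_f * k_f, n=fft_size)[:n]
    return y

# Usage: a length-8192 sequence convolved with an equally long filter.
rng = np.random.default_rng(0)
u = rng.standard_normal(8192)
k = rng.standard_normal(8192)
print(fft_long_convolution(u, k).shape)  # (8192,)
```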
Listen
🎧 SUBSCRIBE Spotify | Apple Podcasts | YouTube
Links
- Read Fu and Dao’s original paper titled Hungry Hungry Hippos: Towards Language Modeling with State Space Models
- Sign up for the Arize AI Slack community to ask the authors questions
- Follow Dan Fu on Twitter and his website
- Follow Tri Dao on Twitter and his website
- Follow AI Pub on Twitter
- Learn more about Arize AI and sign up for a free account