Building Semantic Memory for AI With Cognee
AI Engineering Podcast
2024/11/25

Shownote
Summary
In this episode of the AI Engineering Podcast, Vasilije Markovic talks about enhancing Large Language Models (LLMs) with memory to improve their accuracy. He discusses the concept of memory in LLMs, which involves managing context windows to enhan...
Highlights
In this episode of the AI Engineering Podcast, host Tobias Macey speaks with Vasilije Markovic about the evolving challenge of memory integration in large language models (LLMs). As LLMs become more central to complex applications, maintaining context and long-term knowledge remains a persistent hurdle. Markovic shares insights from his experience building Cognee, an open-source semantic memory engine designed to enhance how AI systems retain and retrieve information over time.
Chapters
00:00 Introduction to AI Engineering Podcast
00:19 Interview with Vasilije Markovic
01:39 Understanding Memory in LLM Systems
03:05 Challenges of Forgetting in LLMs
05:06 Multi-Turn Interactions and Context Management
06:52 Hierarchical Memory in LLM Applications
10:10 Semantic Memory and Cognitive Science
14:51 Architectural Components for Semantic Memory
17:15 Development and Evolution of Cognee
23:32 Data Structures and Ontologies in LLMs
29:29 Integrating Cognee into System Design
34:34 Personalization and Use Cases for Cognee
38:40 Navigating Unknowns in AI Ecosystem
42:06 Potential Applications of Cognee
46:36 Lessons Learned in Building Cognee
48:19 Future Plans for Cognee
Transcript
Tobias Macey: Hello, and welcome to the AI Engineering Podcast, your guide to the fast-moving world of building scalable and maintainable AI systems. Your host is Tobias Macey, and today I'm interviewing Vasilije Markovic about adding memory to LLMs to imp...