One of the greatest challenges of developing with LLMs is their lack of up-to-date knowledge, whether about the world at large or the niche they operate in. Retrieval Augmented Generation (RAG) tackles this by giving LLMs direct access to external information.
Pairing RAG with AI agents lets us build systems that are both more accurate and more flexible. And since the superpower of LLMs is conversing in natural language, we get the most out of RAG and agents by also incorporating short-term memory, a fundamental component of conversational systems such as chatbots and personal assistants.
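To make the moving parts concrete, here is a minimal sketch of the pattern described above: a retriever that grounds each answer in external documents, plus a bounded short-term memory buffer of recent chat turns. The document list, the word-overlap retriever, and the `call_llm` stub are all illustrative assumptions, not any specific library's API; a real system would use a vector store and an actual LLM call.

```python
# Minimal sketch of RAG with short-term conversational memory.
# DOCS, retrieve(), and call_llm() are illustrative stand-ins,
# not a real retriever or LLM API.

from collections import deque

DOCS = [
    "RAG pairs an LLM with a retriever over external documents.",
    "Short-term memory stores recent chat turns for context.",
    "Agents decide which tools, such as retrieval, to invoke.",
]

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Toy retriever: rank documents by word overlap with the query."""
    words = set(query.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def call_llm(prompt: str) -> str:
    """Placeholder: a real system would call an LLM API here."""
    return "(model reply grounded in retrieved context and history)"

class ChatWithRAG:
    def __init__(self, max_turns: int = 5):
        # Short-term memory: a bounded buffer of recent turns, so the
        # prompt always carries recent conversational context.
        self.memory = deque(maxlen=max_turns)

    def build_prompt(self, user_msg: str) -> str:
        context = "\n".join(retrieve(user_msg, DOCS))
        history = "\n".join(self.memory)
        return (
            f"Context:\n{context}\n\n"
            f"History:\n{history}\n\n"
            f"User: {user_msg}\nAssistant:"
        )

    def ask(self, user_msg: str) -> str:
        reply = call_llm(self.build_prompt(user_msg))
        self.memory.append(f"User: {user_msg}")
        self.memory.append(f"Assistant: {reply}")
        return reply
```

The key design point is that retrieval and memory feed the same prompt: retrieval supplies external facts the model was never trained on, while the memory buffer keeps the exchange coherent across turns without growing without bound.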
Find the slides here: https://docs.google.com/presentation/d/1HgqyX3O8RJsKI8uMPoH-KuKLZ49KF72Fz_ihGV71y-4/edit?usp=sharing