LLMs for Advanced Question-Answering over Tabular/CSV/SQL Data (Building Advanced RAG, Part 2)
LlamaIndex Query Pipelines make it possible to express complex pipeline DAGs in a concise, readable, and visual manner. It’s easy to add few-shot examples and to chain together prompts, LLMs, custom functions, retrievers, and more.
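For context, here is a minimal sketch of the pipeline-as-DAG idea: a prompt template chained into an LLM with QueryPipeline. This is not the exact notebook code; the table schema, prompt wording, and model name are illustrative assumptions.

```python
from llama_index.core import PromptTemplate
from llama_index.core.query_pipeline import QueryPipeline
from llama_index.llms.openai import OpenAI

# A two-stage chain: prompt template -> LLM. The schema, prompt wording,
# and model name below are placeholders, not the ones used in the video.
prompt_tmpl = PromptTemplate(
    "Write a SQL query over a table `city_stats(city, population)` "
    "that answers the question: {query_str}"
)
llm = OpenAI(model="gpt-3.5-turbo")

# QueryPipeline links components into a DAG; `chain` covers the simple
# sequential case. `verbose=True` prints each intermediate input/output.
p = QueryPipeline(chain=[prompt_tmpl, llm], verbose=True)

output = p.run(query_str="Which city has the highest population?")
print(str(output))
```

More complex DAGs (table retrievers feeding a text-to-SQL prompt, output parsers, etc.) are built the same way by adding modules and links instead of a single chain.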
Colab notebook used in this video: https://colab.research.google.com/drive/1fRkgSn2PSlXSMgLk32beldVnLMLtI1Pc?usp=sharing
This presentation was taken from our documentation guides – check them out 👇
Text-to-SQL: https://docs.llamaindex.ai/en/stable/examples/pipeline/query_pipeline_sql.html
Text-to-Pandas: https://docs.llamaindex.ai/en/stable/examples/pipeline/query_pipeline_pandas.html
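As a rough sketch of the Text-to-Pandas flow (not the exact code from the guide, which uses LlamaIndex's pandas output parser), one can chain a prompt, an LLM, and a small custom function component that evaluates the generated expression. The dataframe, prompt wording, and model below are assumptions.

```python
import pandas as pd
from llama_index.core import PromptTemplate
from llama_index.core.query_pipeline import FnComponent, QueryPipeline
from llama_index.llms.openai import OpenAI

# Toy dataframe standing in for the CSV loaded in the notebook (assumption).
df = pd.DataFrame(
    {"city": ["Toronto", "Tokyo", "Berlin"],
     "population": [2_930_000, 13_960_000, 3_645_000]}
)

# Ask the LLM for a single pandas expression over `df`.
pandas_prompt = PromptTemplate(
    "You are working with a pandas dataframe named `df`.\n"
    "Here is the output of `df.head()`:\n{df_str}\n\n"
    "Write one pandas expression (no explanation, no code fences) that answers:\n"
    "{query_str}"
).partial_format(df_str=str(df.head()))

def run_pandas(response) -> str:
    """Pull the generated expression out of the LLM response and evaluate it.
    This hypothetical helper stands in for the pandas output parser used in the guide."""
    message = getattr(response, "message", None)
    text = message.content if message is not None else str(response)
    expr = text.strip().strip("`")
    # Illustrative only; eval of model output is not safe for untrusted input.
    return str(eval(expr, {"df": df, "pd": pd}))

p = QueryPipeline(
    chain=[pandas_prompt, OpenAI(model="gpt-3.5-turbo"), FnComponent(fn=run_pandas)],
    verbose=True,
)
print(p.run(query_str="Which city has the highest population?"))
```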
Timeline:
00:00-06:18 – Intro
06:18-12:13 – Text-to-Pandas (Basic)
12:13-27:05 – Query-Time Table Retrieval for Advanced Text-to-SQL
27:05 – Query-Time Row Retrieval for Advanced Text-to-SQL