10 RAG and Agentic apps with LangGraph and Streamlit


This chapter covers

  • Developing a chatbot frontend with Streamlit's chat elements
  • Using LangGraph and LangChain to streamline an advanced AI app
  • How embeddings and vector databases work
  • Augmenting an LLM's pre-trained knowledge using Retrieval Augmented Generation (RAG)
  • Enabling an LLM to access and execute real-world actions

Creating a fun and engaging experience, like the trivia game we built in chapter 9, is exciting, but the true power of AI lies in its ability to drive real business value. AI isn't just about answering questions or generating text; it's about transforming industries, streamlining operations, and enabling entirely new business models.

However, building AI applications that deliver economic value requires more than just calling a pre-trained model. For AI to be useful in real-world scenarios, it needs to be aware of the context in which it operates, connect to external data sources, and take meaningful actions. Companies need AI to understand and respond to domain-specific queries, interact with business systems, and provide personalized assistance.
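The retrieval idea this chapter builds toward (sections 10.5.1 and 10.5.2) can be previewed in miniature: embed a handful of documents, find the one closest to the user's question, and prepend it to the prompt before calling the model. The sketch below is a toy, not the chapter's implementation — `embed` is a hypothetical stand-in (a word-count vector) for a real embedding model, and the documents are invented examples.

```python
# Toy sketch of retrieval-augmented generation: embed documents, retrieve
# the one most similar to a query, and use it as context for the LLM.
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Hypothetical stand-in for a learned embedding: a bag-of-words
    # count vector over lowercase alphanumeric tokens.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Invented example documents standing in for a real knowledge base.
docs = [
    "Returns are accepted within 30 days of purchase.",
    "Shipping is free on orders over $50.",
]

def retrieve(query: str) -> str:
    # Pick the document most similar to the query.
    return max(docs, key=lambda d: cosine(embed(query), embed(d)))

# The retrieved text would be injected into the prompt sent to the model.
question = "How many days do I have to return an item?"
prompt = f"Context: {retrieve(question)}\nQuestion: {question}"
```

A production app replaces the count vectors with model-generated embeddings and the list scan with a vector database query, but the retrieve-then-prompt shape stays the same.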

10.1 Nibby: A customer service bot

10.1.1 Stating the concept and requirements

10.1.2 Visualizing the user experience

10.1.3 Brainstorming the implementation

10.1.4 Installing dependencies

10.2 Creating a basic chatbot

10.2.1 Intro to LangGraph and LangChain

10.2.2 Graphs, nodes, edges, and state

10.2.3 A one-node LLM graph

10.3 Multi-turn conversations

10.3.1 Adding memory to our graph

10.3.2 Displaying the conversation history

10.4 Restricting our bot to customer support

10.4.1 Creating a base prompt

10.4.2 Inserting a base context node in our graph

10.5 Retrieval Augmented Generation

10.5.1 What is Retrieval Augmented Generation?

10.5.2 Implementing RAG in our app

10.6 Turning our bot into an agent

10.6.1 What are agents?

10.6.2 The ReAct framework

10.6.3 Creating tools our agent can use

10.6.4 Making our graph agentic

10.7 Summary