8 Deploying and orchestrating GenAI apps
In this chapter
- Learn how to deploy applications so they can be accessed by others through websites and APIs.
- Understand the role of agent protocols (MCP, A2A, ACP) in orchestrating GenAI systems.
- Explore how to monitor and debug your GenAI apps using observability tools like LangSmith.
- Build an agent leveraging an MCP server and embed it on your personal website.
Up to now, we’ve been happily running our GenAI applications on our own machines. That’s fine for experimenting, but sooner or later you want your creations to be used by others: colleagues, customers, or complete strangers visiting a website. To make this possible, your applications must be reachable from outside your machine. That means deploying them to a server or hosting service that keeps them “always on.”
And not only that: you’ll often want them to be interoperable, meaning that a Langflow flow can call other applications, other applications can call yours, and agents can interact with one another. Making this kind of collaboration reliable requires orchestrating the various parts properly.
You already had a taste of these topics in chapter 7, when KNIME invoked a Langflow flow through an API. Here, we take it a step further and put all the pieces together, so you have what you need to both deploy your creations and make them interoperable. We’ll cover practical matters (protocols, servers, clients, URLs) one small step at a time, with simple examples you can follow.
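As a quick reminder of what such a call looks like, here is a minimal Python sketch that invokes a deployed Langflow flow over HTTP. The host URL, flow ID, API key, and input text are placeholders, and the payload fields follow Langflow’s run API in recent versions; check your deployment’s API documentation for the exact shape it expects.

```python
import requests

# Illustrative values: replace with your own deployment's URL, flow ID, and API key
LANGFLOW_URL = "https://my-langflow-host.example.com"  # where the flow is deployed
FLOW_ID = "my-flow-id"                                 # hypothetical flow identifier
API_KEY = "sk-..."                                     # your Langflow API key

# Call the flow's run endpoint with a chat-style input
response = requests.post(
    f"{LANGFLOW_URL}/api/v1/run/{FLOW_ID}",
    headers={"x-api-key": API_KEY, "Content-Type": "application/json"},
    json={"input_value": "Hello from outside!", "input_type": "chat", "output_type": "chat"},
    timeout=60,
)
response.raise_for_status()
print(response.json())  # the flow's output, as returned by the server
```

Any client that can make an HTTP request, whether KNIME, a script, or another agent, can call a deployed flow this way; the rest of the chapter builds on that idea.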