LangGraph is a library for building stateful, multi-actor applications with LLMs, extending the capabilities of LangChain. While LangGraph gives you the building blocks for these agents, deploying them to production often involves extra infrastructure setup, and the path isn't always obvious.
What if you could deploy your existing LangGraph agent, complete with structured output, with just one command? That's where Agentuity comes in.
The LangGraph Agent with Structured Output
Let's take a simple example of a LangGraph agent using the createReactAgent helper, designed to get weather information and return it in a specific JSON format. This example uses Zod for schema definition and OpenAI for the LLM, but the core concepts apply broadly.
Integrating with Agentuity
Notice how little code is needed to make this LangGraph agent work within Agentuity. The core logic remains the same. We just need to:

- Wrap the agent invocation within an AgentHandler function.
- Read the user's input from req.data.text().
- Return the structured output using resp.json(). Agentuity handles the JSON serialization automatically.
Deploying to the Cloud
Once your agent is set up, deploying is trivial. Assuming you have the Agentuity CLI installed and configured, navigate to your project directory and run agentuity deploy.
That's it! Your LangGraph agent is now live and accessible via an API endpoint provided by Agentuity.
Conclusion
Agentuity makes it incredibly simple to take your existing LangChain and LangGraph projects, like this structured-output agent, and deploy them to the cloud without fuss. Focus on building your agent logic, and let Agentuity handle the deployment. And if you want to do some cooler things, you can even build a Vercel AI SDK agent or a CrewAI agent and have them all talk to each other. Cool, eh?