Community Example: Snow Leopard + Agentuity for Enterprise Data Retrieval
February 12, 2026 by Agentuity

Getting AI agents to work with enterprise data is still harder than it should be. SQL databases contain the information your business runs on, but connecting agents to that data typically means setting up MCP servers, building ETL pipelines, implementing RAG, or spending weeks on context engineering.
Snow Leopard takes a different approach. Their platform sits between your AI agents and your SQL data sources, translating natural language queries into SQL, executing them against your database, and returning clean JSON for your agents to use. Their semantic engine learns your schema's business logic automatically, delivering 90%+ accuracy out of the box.
What Snow Leopard Does
When your agent needs data, it sends a natural language question to Snow Leopard's /retrieve endpoint. Behind the scenes, Snow Leopard:
- Creates a retrieval plan in real-time
- Generates native SQL for your specific database
- Fetches live data directly from the source and returns structured JSON
This goes beyond basic text-to-SQL: Snow Leopard's semantic engine understands your schema's business logic, so it routes to the right tables and joins them correctly without you having to spell anything out. Every query hits your live database directly, so your agents always work with current data.
The key insight: Snow Leopard needs no ETL or data pipelines, and it never touches your actual data during query generation, making it a privacy-preserving option for enterprises that care about where their data goes.
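In practice, a retrieval call can be as small as a single POST. Here is a hedged sketch in TypeScript; the endpoint URL, payload field name, and bearer-token auth are all assumptions to check against Snow Leopard's API docs, not their actual contract:

```typescript
// Hypothetical /retrieve call. Endpoint URL, payload shape, and auth header
// are assumptions -- consult Snow Leopard's API docs for the real contract.

interface RetrieveRequest {
  query: string; // the natural-language question
}

// Build the JSON body the agent sends for a question.
function buildRetrieveRequest(question: string): RetrieveRequest {
  return { query: question.trim() };
}

async function retrieve(question: string): Promise<unknown> {
  const res = await fetch('https://api.snowleopard.ai/retrieve', { // assumed URL
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.SNOWLEOPARD_API_KEY}`, // assumed auth scheme
    },
    body: JSON.stringify(buildRetrieveRequest(question)),
  });
  if (!res.ok) throw new Error(`retrieve failed: ${res.status}`);
  return res.json(); // structured JSON, ready for the agent to reason over
}
```

The point is the shape of the exchange: one natural-language question out, structured JSON back, with no SQL anywhere in your agent code.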
From Data to Deployed Agent
Snow Leopard gives your agents accurate enterprise data on demand. Agentuity gives you the infrastructure to build and deploy those agents. The Snow Leopard team put together a great example showing how to combine both:
- Data retrieval — The `/retrieve` endpoint gives the agent a tool for querying enterprise data in natural language
- Thread state — `ctx.thread.state` persists conversation history between requests, so users can have multi-turn conversations about their data
- Observability — Every agent gets OpenTelemetry instrumentation for traces, metrics, and logs via `ctx.logger`
- Deployment — `agentuity deploy` handles containerization, routing, TLS, and scaling in a single command
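The thread-state pattern boils down to one small, pure step per request: load any stored history, seed a system prompt on the first turn, and append the new user message. A minimal sketch, with `Message` standing in for the AI SDK's `ModelMessage` type:

```typescript
// Sketch of per-request history handling. `Message` is a simplified stand-in
// for the AI SDK's ModelMessage; the stored array would come from thread state.

type Role = 'system' | 'user' | 'assistant';
interface Message { role: Role; content: string }

const SYSTEM: Message = {
  role: 'system',
  content: 'You are a helpful assistant that answers questions using your data tools.',
};

// Given whatever history was stored for this thread (undefined on the first
// turn), return the message list to send to the model, new user turn included.
function nextMessages(stored: Message[] | undefined, userInput: string): Message[] {
  const history = stored ?? [SYSTEM];
  return [...history, { role: 'user', content: userInput }];
}
```

Because the history round-trips through thread state, a follow-up like "drill down into Atlanta" arrives with the earlier turns already in context.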
Here's how it all comes together in one agent:
```typescript
import { createAgent } from '@agentuity/runtime';
import { s } from '@agentuity/schema';
import { generateText, type ModelMessage } from 'ai';
import { openai } from '@ai-sdk/openai';
import { getData } from './getData'; // Snow Leopard retrieval tool

const agent = createAgent('chat', {
  description: 'A data retrieval agent powered by Snow Leopard',
  handler: async (ctx, { message }) => {
    const messages: ModelMessage[] =
      (await ctx.thread.state.get('messages')) ?? [{
        role: 'system',
        content: 'You are a helpful assistant that answers questions using your data tools.',
      }];
    messages.push({ role: 'user', content: message });

    const result = await generateText({
      model: openai('gpt-5-mini'),
      messages,
      tools: { getData },
    });

    return { response: result.text };
  },
  schema: {
    input: s.object({ message: s.string() }),
    output: s.object({ response: s.string() }),
  },
});

export default agent;
```
Ask the agent a question about your data:
```shell
curl -X POST http://localhost:3500/api/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "What are the top 20 performing territories by revenue?"}'
```
Snow Leopard fetches live data from the database, and the agent summarizes it:
```json
{
  "response": "The top performing territory is Columbia with $110,622 in revenue, followed by Santa Monica ($93,865) and Atlanta ($91,317). The top 20 territories range from $52,207 to $110,622. Would you like to see performance trends over time or drill down into specific regions?"
}
```
Behind that one plain-English question, Snow Leopard joined multiple tables (territories, employees, orders, order details) and computed the revenue, and the agent summarized the result. For the full code walkthrough, including the Snow Leopard tool definition, check out the quickstart guide.
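As a rough idea of what that tool definition can look like, here is an illustrative sketch. The field names (`description`, `inputSchema`, `execute`) mirror the AI SDK's tool shape, but everything else, including the injected retrieval function, is an assumption; the real `getData` lives in the quickstart repo:

```typescript
// Illustrative sketch of a getData-style tool, not the quickstart's actual code.
// The retrieval function is injected so the tool can be exercised without
// network access; in the real tool this would call Snow Leopard's API.

type RetrieveFn = (question: string) => Promise<unknown>;

function makeGetData(retrieve: RetrieveFn) {
  return {
    description: 'Fetch live enterprise data by asking a natural-language question',
    // The AI SDK typically takes a zod schema here; a plain JSON Schema object
    // is shown to keep this sketch dependency-free.
    inputSchema: {
      type: 'object',
      properties: { question: { type: 'string' } },
      required: ['question'],
    },
    // The model invokes this with its question; Snow Leopard returns structured JSON.
    execute: async ({ question }: { question: string }) => retrieve(question),
  };
}
```

Wiring a factory like this to a real HTTP call against the retrieval endpoint yields the `getData` tool the agent imports above.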
Get Started
To try this yourself or dive deeper into the code, check out the full example on GitHub:
github.com/SnowLeopard-AI/snowy-examples/quickstart/agentuity
Clone it, set your API keys, and deploy:
```shell
git clone https://github.com/SnowLeopard-AI/snowy-examples.git
cd snowy-examples/quickstart/agentuity
agentuity project import
agentuity deploy
```
Want to try Snow Leopard with your own data? Their Playground APIs let you upload a dataset and start querying in minutes.
Resources
- Snow Leopard + Agentuity Quickstart
- Snow Leopard Documentation
- Snow Leopard Blog
- Agentuity Documentation
- Agentuity Web Console
- Get Started with Agentuity (let your coding agent onboard itself)
Building something cool with Agentuity? Share it with us on Discord or tag us on social media.