Natural language → SQL,
as a service.
Register your database schemas, ask questions in English, get back optimized SQL — and let LLM agents talk to your data through a standard MCP server. Cloud models or fully local CPU inference, your choice.
Register a schema
Connect to SQL Server, PostgreSQL, MySQL, or SQLite from the browser, or extract locally with our cross-platform CLI and upload the JSON.
Each schema carries a description, author and owners — surfaced to LLM agents through MCP.
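As a rough sketch, an extracted schema upload could look like the JSON below. The field names and nesting here are illustrative assumptions, not the service's actual upload format; they only show the kind of metadata (description, author, owners, tables) that travels with a registered schema.

```python
# Hypothetical shape of a CLI-extracted schema document.
# All field names below are illustrative assumptions.
import json

schema_doc = {
    "name": "sales_db",
    "description": "Orders and customers for the EU storefront",
    "author": "dana@example.com",
    "owners": ["dana@example.com", "ops@example.com"],
    "tables": [
        {
            "name": "orders",
            "columns": [
                {"name": "id", "type": "INTEGER", "primary_key": True},
                {"name": "customer_id", "type": "INTEGER"},
                {"name": "total", "type": "DECIMAL(10,2)"},
            ],
        }
    ],
}

# The CLI would serialize this and upload it as JSON.
print(json.dumps(schema_doc, indent=2))
```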
Pick a model
Drop-in choice of Azure OpenAI, any OpenAI-compatible local endpoint (llama.cpp · LM Studio · Ollama · vLLM), or an in-process GGUF model running on CPU.
Token-optimized prompting only sends the relevant tables, so even small models do real work.
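The token-optimized prompting idea can be sketched like this: filter the schema down to the tables relevant to the question, then send only those in the request body that any OpenAI-compatible endpoint (llama.cpp, LM Studio, Ollama, vLLM) accepts at `/v1/chat/completions`. The toy relevance filter and prompt wording are assumptions; the real service's selection logic is not shown here.

```python
# Sketch: build a chat-completions body that includes only relevant tables.
import json

ALL_TABLES = {
    "orders": "orders(id, customer_id, total, placed_at)",
    "customers": "customers(id, name, country)",
    "audit_log": "audit_log(id, actor, action, at)",  # unrelated to the question
}

def relevant_tables(question: str) -> list[str]:
    # Toy filter: keep tables whose singular name appears in the question.
    # A real system would use embeddings or the model itself to rank tables.
    q = question.lower()
    return [ddl for name, ddl in ALL_TABLES.items() if name.rstrip("s") in q]

question = "Total order value per customer country?"
body = {
    "model": "local-gguf",  # whatever model the endpoint serves
    "messages": [
        {
            "role": "system",
            "content": "Translate the question to SQL. Schema:\n"
            + "\n".join(relevant_tables(question)),
        },
        {"role": "user", "content": question},
    ],
}
print(json.dumps(body, indent=2))
```

Sending two table definitions instead of the whole schema is what lets a small local model stay inside its context window and still answer well.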
Ask & run
Ask in plain English. Inspect the generated SQL, validate it, execute it read-only against the stored connection, and like or dislike the answer to tune future results.
Full audit log of every prompt, response and token cost in a per-tenant SQLite store.
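Read-only execution can be illustrated with SQLite, where a single pragma turns the connection into a query-only handle before any generated SQL runs. This is a minimal sketch of the technique, not the service's actual connection handling, and it assumes SQLite rather than one of the server databases.

```python
# Sketch: run generated SQL read-only by flipping SQLite into query-only mode.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, total REAL)")
conn.execute("INSERT INTO orders VALUES (1, 10.5), (2, 19.5)")
conn.commit()

# From here on, only reads are allowed on this connection.
conn.execute("PRAGMA query_only = ON")

total, = conn.execute("SELECT SUM(total) FROM orders").fetchone()
print(total)  # 30.0

try:
    conn.execute("DELETE FROM orders")  # any write is now rejected
except sqlite3.OperationalError as e:
    print("blocked:", e)
```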
Talk to it as an agent
Built-in MCP server exposes your registered schemas as resources and the translator as a tool. Wire it into Claude Desktop, Cursor, your own agent — anything that speaks MCP.
Every JSON-RPC call is captured to acp_audit.db for full traceability.
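For a sense of what lands in that audit trail, here is the JSON-RPC 2.0 envelope an MCP client sends when invoking a tool. The envelope shape (`method: "tools/call"` with `name` and `arguments` params) follows the MCP specification; the tool name and argument fields below are assumptions about this service, not its documented interface.

```python
# Illustrative MCP tools/call request as it travels over JSON-RPC 2.0.
import json

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "translate_to_sql",  # hypothetical tool name
        "arguments": {
            "schema": "sales_db",
            "question": "Top 5 customers by revenue this quarter",
        },
    },
}

# This serialized form is what a server could capture for auditing.
wire = json.dumps(request)
print(wire)
```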
Your data, your folder
Sign-in is email + one-time code. On verification we create Data/<your-email>/ on the server: your schemas, audit databases, and API key live exclusively in that folder. Other tenants can't see them.
MCP and CLI access are gated by your personal API key shown in Settings.
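Attaching the API key to a request might look like the sketch below. The header name, environment variable, and endpoint path are all assumptions for illustration; check Settings for the real values.

```python
# Sketch: authenticate a CLI or MCP call with the per-tenant API key.
import os
import urllib.request

# Hypothetical env var; the key itself is shown in Settings.
api_key = os.environ.get("NL2SQL_API_KEY", "demo-key")

req = urllib.request.Request(
    "https://example.invalid/api/translate",  # placeholder URL
    headers={"Authorization": f"Bearer {api_key}"},  # assumed header scheme
)
print(req.get_header("Authorization"))
```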