Auto-generate, govern, and deploy MCP servers from any API. Powered by multi-model AI routing, DAuth, and enterprise-grade pipelines.
$ mcp-maker run --spec openapi.json --target mcp
✓ Ingesting spec... petstore-api v1.0.0
✓ Mining capabilities... 4 tools, 2 resources
✓ Synthesizing schemas... JSON Schema validated
✓ Generating server... Python + dedalus_mcp
✓ Testing... 4/4 passed
→ Deploying to Dedalus... live
Compatible with your entire stack
From spec ingestion to production deployment — one platform, no glue code.
Auto-generate fully typed MCP servers from OpenAPI specs, SDKs, docs, or recorded traffic.
Run multi-step pipelines: ingest → mine → synth → generate → test → deploy.
First-class per-tenant credential management. Users grant access with OAuth2/PAT flows.
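A minimal sketch of what per-tenant credential storage could look like behind this feature. All class and method names here are illustrative, not the real DAuth API; a production store would encrypt tokens at rest and refresh OAuth2 access tokens before expiry.

```python
from dataclasses import dataclass, field

@dataclass
class TenantCredentials:
    """Holds one tenant's granted credentials, keyed by upstream API name."""
    tenant_id: str
    tokens: dict[str, str] = field(default_factory=dict)

class CredentialStore:
    """In-memory per-tenant credential store (hypothetical sketch)."""

    def __init__(self):
        self._tenants: dict[str, TenantCredentials] = {}

    def grant(self, tenant_id: str, api: str, token: str) -> None:
        # Record a token a user granted via an OAuth2 or PAT flow.
        creds = self._tenants.setdefault(tenant_id, TenantCredentials(tenant_id))
        creds.tokens[api] = token

    def lookup(self, tenant_id: str, api: str):
        # Return the tenant's token for this API, or None if never granted.
        creds = self._tenants.get(tenant_id)
        return creds.tokens.get(api) if creds else None

store = CredentialStore()
store.grant("acme", "petstore", "pat_123")
assert store.lookup("acme", "petstore") == "pat_123"
assert store.lookup("other-tenant", "petstore") is None
```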
Unified OpenAI-compatible API layer to call any provider. Automatic fallback and cost optimization.
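Cost-aware fallback routing can be sketched in a few lines: try providers cheapest-first and move to the next on failure. The provider names, error type, and `call` signature below are assumptions for illustration, not the platform's actual routing API.

```python
class ProviderError(Exception):
    """Raised by a provider call on rate limits, outages, etc."""

def route_completion(prompt, providers, call):
    """Try providers cheapest-first; fall back to the next on failure.

    providers: list of (name, cost_per_1k_tokens) tuples.
    call: callable(name, prompt) that returns a completion or raises
    ProviderError. Illustrative sketch only.
    """
    last_err = None
    for name, _cost in sorted(providers, key=lambda p: p[1]):
        try:
            return name, call(name, prompt)
        except ProviderError as err:
            last_err = err
    raise last_err  # every provider failed

providers = [("big-model", 10.0), ("small-model", 0.5)]

def flaky_call(name, prompt):
    if name == "small-model":
        raise ProviderError("rate limited")
    return f"{name}: ok"

used, result = route_completion("hello", providers, flaky_call)
# the cheap provider failed, so routing fell back to the next one
assert used == "big-model"
```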
Granular read/write/destructive controls per tool. Allowlist, denylist, and scope-based authorization.
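One way to model these controls is scopes as bit flags plus allowlist/denylist checks, with the denylist taking precedence. A hedged sketch, not the platform's real policy engine:

```python
from enum import Flag, auto

class Scope(Flag):
    READ = auto()
    WRITE = auto()
    DESTRUCTIVE = auto()

def authorize(tool, granted, allowlist=None, denylist=()):
    """Return True if `tool` may run under the caller's granted scopes.

    tool: (name, required_scope) tuple. The denylist always wins;
    allowlist=None means any tool name is allowed. Illustrative only.
    """
    name, required = tool
    if name in denylist:
        return False
    if allowlist is not None and name not in allowlist:
        return False
    # every required scope bit must be present in the grant
    return required & granted == required

assert authorize(("list_pets", Scope.READ), Scope.READ)
assert not authorize(("delete_pet", Scope.DESTRUCTIVE), Scope.READ | Scope.WRITE)
assert not authorize(("list_pets", Scope.READ), Scope.READ, denylist={"list_pets"})
```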
Auto-generated test harness, containerization with Docker, and one-click deployment.
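The shape of an auto-generated test harness can be sketched as a function that emits one smoke test per tool. What each generated test checks here is a stand-in; a real harness would invoke the running MCP server.

```python
def make_tests(tools):
    """Auto-generate one smoke test per tool name (illustrative sketch)."""
    def make(tool):
        def test():
            # stand-in check: the tool id is well-formed as "method:path";
            # a real harness would call the tool and validate its response
            method, _, path = tool.partition(":")
            assert method and path.startswith("/")
        return test
    return {f"test_{t}": make(t) for t in tools}

tests = make_tests(["get:/pets", "post:/pets"])
for name, test in tests.items():
    test()  # raises AssertionError on failure
assert len(tests) == 2
```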
The pipeline runner orchestrates each stage with retries, rollback, and full observability.
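Retry-then-rollback orchestration can be sketched as follows: each stage retries up to a limit, and if one stage ultimately fails, every completed stage is rolled back in reverse order. The stage tuple shape and retry policy are assumptions for illustration.

```python
def run_pipeline(stages, max_retries=2):
    """Run stages in order; retry each, roll back completed ones on failure.

    Each stage is a (name, run, rollback) tuple of callables.
    Hypothetical sketch of a runner with retries and rollback.
    """
    completed = []
    for name, run, rollback in stages:
        for attempt in range(max_retries + 1):
            try:
                run()
                completed.append((name, rollback))
                break
            except Exception:
                if attempt == max_retries:
                    # undo every finished stage, newest first, then re-raise
                    for _done, undo in reversed(completed):
                        undo()
                    raise
    return [name for name, _ in completed]

# demo: the second stage fails once, then succeeds on retry
log = []
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] == 1:
        raise RuntimeError("transient")

stages = [
    ("ingest", lambda: log.append("ingest"), lambda: log.append("undo ingest")),
    ("generate", flaky, lambda: log.append("undo generate")),
]
assert run_pipeline(stages) == ["ingest", "generate"]
```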
Parse OpenAPI, SDK types, docs, or traffic captures
AI-powered extraction of tools, resources, and schemas
Generate JSON Schema inputs with validation rules
Emit typed MCP server with policy + auth layers
Run auto-generated test suite against the server
Push to GitHub and deploy to Dedalus
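The first four stages above can be sketched as plain functions chained together; all function names and the toy spec shape are illustrative, not the product's real internals (testing and deployment are omitted).

```python
def ingest(spec):
    """Parse a raw OpenAPI-like spec dict into (title, operations)."""
    return spec["info"]["title"], spec.get("paths", {})

def mine(operations):
    """Extract one tool id per (path, method) pair."""
    return [f"{method}:{path}"
            for path, methods in operations.items()
            for method in methods]

def synthesize(tools):
    """Attach a trivial JSON Schema input stub to every tool."""
    return {t: {"type": "object", "properties": {}} for t in tools}

def generate(schemas):
    """Stand-in for emitting a server: return its exposed tool ids."""
    return sorted(schemas)

spec = {"info": {"title": "petstore-api"},
        "paths": {"/pets": {"get": {}, "post": {}}}}
title, ops = ingest(spec)
server_tools = generate(synthesize(mine(ops)))
assert title == "petstore-api"
assert server_tools == ["get:/pets", "post:/pets"]
```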
Security, governance, and reliability built into every generated artifact.
Generated MCP tools are typed, validated, and policy-governed — agents execute predictably every time.
The runner provides memory, rollback, and transactional integrity across multi-step pipelines.
Built-in governance, DAuth authentication, and audit trails at every layer of the stack.