Custom MCP Servers: Connecting an Existing Stack to AI Workflows
Built custom Model Context Protocol servers to give Claude direct access to n8n workflows, Notion databases, and internal REST APIs — turning a fragmented tool stack into a unified AI-native workspace.
Internal / Proof of Concept · Latin America
Results
- MCP servers built: 4
- Manual steps eliminated: ~30/week
- Integration layer: MCP + REST
- AI runtime: Claude
The problem
Most teams that adopt AI hit the same wall six weeks in: the AI is capable, but it can't reach anything. It can reason about your data, but it can't read your Notion database. It can draft a workflow, but it can't trigger your n8n pipeline. Every useful action requires a human in the middle — copy content, paste into the prompt, copy the result, paste it back.
The result is that AI becomes a glorified text editor instead of an autonomous co-worker.
The goal here was to close that gap — to give Claude the ability to read, write, and act across the tools already in use, without rebuilding the stack from scratch.
What MCP is
Model Context Protocol (MCP) is Anthropic's open standard for connecting AI models to external tools and data sources. Instead of hardcoding integrations into a prompt or building a custom agent framework, MCP lets you expose any capability — a database query, an API call, a workflow trigger — as a typed tool that Claude can call natively during a conversation or autonomous task.
Think of it as a universal adapter between Claude and the rest of your software.
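Conceptually, an MCP tool is just a name, a description, a typed input schema, and a handler; the client advertises the tool list to the model and routes the model's calls back to the handler. A minimal sketch of that shape (simplified, not the real `@modelcontextprotocol/sdk` API):

```typescript
// Simplified sketch of the MCP "typed tool" idea — not the actual SDK surface.
// A tool = name + description + input schema + handler.

type ToolHandler = (args: Record<string, unknown>) => Promise<string>;

interface Tool {
  name: string;
  description: string;
  inputSchema: Record<string, string>; // param name -> type hint (simplified)
  handler: ToolHandler;
}

class ToolRegistry {
  private tools = new Map<string, Tool>();

  register(tool: Tool): void {
    this.tools.set(tool.name, tool);
  }

  // The model sees this list and decides which tool to call.
  list(): { name: string; description: string }[] {
    return [...this.tools.values()].map(({ name, description }) => ({ name, description }));
  }

  // The client routes the model's tool call here.
  async call(name: string, args: Record<string, unknown>): Promise<string> {
    const tool = this.tools.get(name);
    if (!tool) throw new Error(`unknown tool: ${name}`);
    return tool.handler(args);
  }
}

// Example: expose a workflow trigger as a callable tool (handler stubbed).
const registry = new ToolRegistry();
registry.register({
  name: "trigger_workflow",
  description: "Trigger an n8n workflow by id",
  inputSchema: { workflowId: "string" },
  handler: async (args) => `triggered ${args.workflowId}`,
});
```

The real SDK adds transport, schema validation, and protocol framing on top of this shape; the point is that each capability becomes one small, typed unit the model can discover and invoke.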
What we built
Four custom MCP servers, each wrapping a different part of the stack:
1. n8n MCP server: Exposes n8n workflow execution as callable tools. Claude can list available workflows, trigger them with parameters, and read execution results — without opening the n8n UI. Used to kick off data sync pipelines, send notifications, and run scheduled automations on demand.
2. Notion MCP server: Wraps the Notion API to give Claude structured read/write access to databases and pages. Claude can query a project database, update task statuses, append notes to meeting pages, and create new records — all from a single conversation.
3. Internal REST API MCP server: A thin adapter over a set of internal REST endpoints. Exposes business-specific operations (fetching records, updating states, triggering batch jobs) as first-class tools without exposing the raw API surface to the model.
4. File system + context MCP server: Gives Claude access to a curated set of local files and documents — specs, templates, runbooks — so it can reference the right context without being fed it manually in every prompt.
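The n8n server above is essentially a thin wrapper over n8n's webhook entry points. A sketch of that core, with the base URL and the `post` function injected so the HTTP layer stays swappable (the URL and payload shape here are illustrative, not n8n's exact API surface):

```typescript
// Sketch: wrap an n8n workflow's webhook endpoint as a callable tool.
// Base URL and webhook path are illustrative; `post` is injected so it
// can be a real HTTP client in production and a stub in tests.

type PostFn = (url: string, body: unknown) => Promise<{ status: number; text: string }>;

function makeN8nTriggerTool(baseUrl: string, post: PostFn) {
  return {
    name: "trigger_n8n_workflow",
    description: "Trigger an n8n workflow via its webhook and return the response body",
    async handler(args: { webhookPath: string; payload?: unknown }): Promise<string> {
      const res = await post(`${baseUrl}/webhook/${args.webhookPath}`, args.payload ?? {});
      if (res.status >= 400) throw new Error(`n8n returned ${res.status}`);
      return res.text;
    },
  };
}
```

The same thin-adapter pattern applies to the internal REST API server: each endpoint worth exposing becomes one tool with a narrow, typed input, rather than handing the model the whole API.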
Key decisions
Why MCP instead of a custom function-calling layer? MCP servers are composable and reusable across any Claude-compatible client (Claude.ai, Claude Code, custom apps). Building a bespoke function-calling layer would have locked the integrations to a single entry point. MCP means the same server works whether Claude is being called from a chat interface, a CLI, or an automated pipeline.
Why n8n as the automation backbone instead of native code? n8n already existed in the stack and handled dozens of automations. Rather than rewrite those in code, the MCP server treats n8n as a capability registry — Claude knows which workflows exist and what they do, and can invoke them as needed. This preserves the existing logic and the non-technical visibility that n8n provides.
Why expose Notion instead of a purpose-built database? The team already lived in Notion. Migrating to a structured database would have broken existing processes and required training. The MCP server translates Notion's flexible structure into typed schemas Claude can work with reliably.
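The "translate Notion's flexible structure into typed schemas" step boils down to flattening Notion's nested property objects into plain records the model can reason about. A simplified sketch (property shapes mirror Notion's API but cover only three types):

```typescript
// Sketch: flatten Notion-style page properties into a flat, typed record.
// Shapes mirror Notion's API (title / select / rich_text), heavily simplified.

type NotionProp =
  | { type: "title"; title: { plain_text: string }[] }
  | { type: "select"; select: { name: string } | null }
  | { type: "rich_text"; rich_text: { plain_text: string }[] };

function flatten(props: Record<string, NotionProp>): Record<string, string | null> {
  const out: Record<string, string | null> = {};
  for (const [key, prop] of Object.entries(props)) {
    switch (prop.type) {
      case "title":
        out[key] = prop.title.map((t) => t.plain_text).join("");
        break;
      case "select":
        out[key] = prop.select?.name ?? null; // unset selects become null
        break;
      case "rich_text":
        out[key] = prop.rich_text.map((t) => t.plain_text).join("");
        break;
    }
  }
  return out;
}
```

Doing this translation once, inside the server, means the model never has to parse Notion's raw property objects, which keeps tool outputs small and predictable.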
Results
With the four MCP servers in place, Claude can now autonomously complete multi-step tasks that previously required 5–10 manual handoffs: read a project brief from Notion → trigger the relevant n8n data pipeline → update the project status → append a summary back to the Notion page.
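That chain can be pictured as sequential tool calls sharing one accumulating context. A sketch with the four steps stubbed out (step names mirror the example; the implementations stand in for real MCP tool calls):

```typescript
// Sketch: the multi-step chain as sequential steps over a shared context.
// Each step stands in for a real MCP tool call; values here are stubs.

type Ctx = Record<string, string>;
type Step = (ctx: Ctx) => Promise<Ctx>;

async function runChain(steps: Step[], initial: Ctx): Promise<Ctx> {
  let ctx = initial;
  for (const step of steps) ctx = await step(ctx); // each step enriches the context
  return ctx;
}

// Stubbed stand-ins for the Notion and n8n tool calls.
const readBrief: Step = async (ctx) => ({ ...ctx, brief: `brief for ${ctx.project}` });
const triggerPipeline: Step = async (ctx) => ({ ...ctx, runId: "run-42" });
const updateStatus: Step = async (ctx) => ({ ...ctx, status: "in-review" });
const appendSummary: Step = async (ctx) => ({ ...ctx, summary: `done: ${ctx.runId}` });
```

In practice Claude itself decides the sequence at runtime from the tool descriptions; this sketch just makes the data flow between steps explicit.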
Roughly 30 manual copy-paste or context-switch actions per week were eliminated across the workflows where MCP servers were deployed. More significantly, it shifted the human role from data mover to decision maker — Claude handles the retrieval and execution; humans handle judgment and approval.
The architecture is now the foundation for every back-office automation engagement at Solaar: identify the tools in the client's stack, build MCP servers to expose them, and connect them to an AI workflow designed around the client's actual processes.
Tech stack
- Claude AI
- Model Context Protocol (MCP)
- n8n
- REST API
- Node.js