We spent years drilling API-first thinking into our teams. Clean contracts. Solid specs and docs. Seamless integrations. That worked fine, right until AI made it onto the org chart.
Now LLMs keep asking for context, and the old APIs just shrug. Your systems are neatly locked behind endpoints built for humans, not machines. That means a constant mess of fragile connectors and “just one more” integration.
The Model Context Protocol (MCP) solves this with one clever move. It doesn’t replace your APIs. It finally makes them usable by AI, connecting your products to the next generation of intelligent assistants and agents.
Here’s how it works, and why every AI engineer and engineering leader should care.
What Is MCP, and Why Does It Matter?
Think of MCP as a universal adapter for AI applications, similar to what USB-C is for physical devices. It’s an open standard, introduced by Anthropic, that lets AI plug into any tool, database, or service with one simple connector. No more writing custom code every time you want to connect your AI models to different data sources and tools.
For years, every new model or data source meant another delicate integration. Before long, you’re drowning in breaking changes, custom connectors, and yet another round of authentication hassles. MCP fixes this. Build once, and your AI systems can finally collaborate, freeing your team to focus on the work that matters.
The key difference from traditional API integrations: instead of writing a bespoke connector for each service, every AI application and every data source speaks one shared protocol.
Why bother with MCP? It lets you build AI agents and complex workflows on top of LLMs, minus the integration headaches. Since agents often need to work with data and tools, MCP offers a growing set of pre-built integrations your models can use right away. You also get the freedom to swap out LLM vendors when it suits you, plus best practices for keeping your data secure inside your own infrastructure.
Instead of juggling custom API integrations for every data source, developers can now build against a single standard. As MCP gains traction, AI systems will keep their context across tools and data, paving the way for a more flexible and scalable architecture.
MCP isn’t here to take over APIs. In fact, most MCP servers work by wrapping existing APIs. For example, the GitHub MCP server exposes high-level tools like “repository/list” for AI, but under the hood, it’s making regular REST API calls to GitHub. MCP and APIs aren’t rivals; they’re layers in the modern AI stack. MCP gives you an AI-friendly interface on top, often using familiar APIs as the foundation.
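To make the layering concrete, here’s a hedged sketch of what a wrapping tool handler might look like. The tool name (`list_repositories`) and the handler shape are invented for illustration, not taken from the real GitHub MCP server; the point is that the “tool” is just a thin translation onto the REST API underneath.

```python
# Hypothetical sketch: an MCP tool that wraps an existing REST API.
# The MCP server would execute the request and hand the JSON body back
# to the client as the tool result; here we only construct the request.
import urllib.request

GITHUB_API = "https://api.github.com"

def list_repositories(owner: str) -> urllib.request.Request:
    """Translate a high-level tool call into the underlying REST request."""
    url = f"{GITHUB_API}/users/{owner}/repos"
    return urllib.request.Request(
        url, headers={"Accept": "application/vnd.github+json"}
    )

req = list_repositories("octocat")
print(req.full_url)
```

The AI never sees the URL scheme, pagination, or auth headers; it sees a named tool with typed arguments, and the server handles the rest.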
MCP Architecture
The Model Context Protocol (MCP) follows a clear client-server model. The idea is simple: keep the AI “brain” separate from the data and tools it needs, and control what it can reach.
At its core, MCP lets a host application connect to as many servers as needed, each one exposing different capabilities. The protocol uses JSON-RPC 2.0 messages to establish communication between:
MCP Host: This is the AI-powered app or agent, like Claude Desktop, an AI IDE like Cursor or VS Code with Copilot, or any tool that wants to access outside data. The host runs the LLM and decides what to ask for.
MCP Client: The connector managed by the host. It sets up a dedicated line to each MCP server, relaying requests and responses in both directions. The LLM never talks directly to the outside world. Every call and response is filtered and controlled.
MCP Server: A lightweight program that exposes a specific set of capabilities. Servers can connect to files, databases, APIs, or any system you care about. Each one describes what it can do in a standard way, so the AI always knows what’s possible.
This architecture enables safe, two-way communication between AI and external sources. The AI can fetch data, trigger actions, or stream updates, all through one standardized protocol. You can run servers locally to keep things private or connect to remote services over HTTP. Either way, you avoid tangled code and scale up integrations without the usual hassles.
MCP draws inspiration from the Language Server Protocol, which made it easy to add new programming languages to any development tool. Now, MCP brings the same simplicity to AI applications, letting you add new data and tools with minimal effort. Define what’s allowed, plug in your tools, and let MCP handle the heavy lifting. The result: cleaner, safer, and much more flexible AI integrations.
MCP Core Capabilities
Every MCP server exposes three core building blocks: tools, resources, and prompts. These simple parts make rich interactions possible between clients, servers, and language models. Here’s what each one does:
Tools: The Actions
Tools are the actions your AI can take in the outside world. Think sending a message, running a database query, creating a new ticket, or kicking off a deployment. Each tool is uniquely identified by a name and includes metadata describing its schema. No guesswork, no fragile scripts.
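Here’s a rough sketch of what that self-describing metadata looks like. The field names (`name`, `description`, `inputSchema`) follow the MCP tool shape; the ticket-creation tool itself is hypothetical.

```python
# Illustrative tool definition: a name plus a JSON Schema for its inputs.
# The schema tells the model exactly what arguments are valid.
create_ticket = {
    "name": "create_ticket",
    "description": "Open a new ticket in the issue tracker.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "title": {"type": "string"},
            "priority": {"type": "string", "enum": ["low", "medium", "high"]},
        },
        "required": ["title"],
    },
}

# Sanity check: every required field is actually declared in properties.
schema = create_ticket["inputSchema"]
assert set(schema["required"]) <= set(schema["properties"])
```

The model reads this catalog entry and knows, without any custom glue code, that `title` is mandatory and `priority` must be one of three values.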
Resources: The Data
Resources are the data or content the AI can access. This could be files from Google Drive, rows from a database, issues from GitHub, or any other information you want your model to see. Resources ground the AI in real context, letting it read, reference, and summarize actual data instead of making things up.
Prompts: The Guidance
Prompts are reusable instructions or templates that steer the AI through more complex tasks. They act like recipes, guiding the model step by step, whether it’s debugging code, filling out a report, or updating CRM records. Good prompts keep the AI focused, consistent, and less likely to go off track.
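A prompt in this sense is just a named, parameterized template the client can fetch and fill in. The `debug_error` prompt below is invented for illustration; the structure (a name, declared arguments, and a template that expands into model-ready text) mirrors how MCP prompts are meant to be used.

```python
# Hypothetical reusable prompt: the server publishes it, the client fills it in.
debug_error = {
    "name": "debug_error",
    "description": "Walk through a stack trace step by step.",
    "arguments": [{"name": "trace", "required": True}],
    "template": (
        "Here is a stack trace:\n{trace}\n"
        "Explain the root cause, then propose a fix."
    ),
}

def render_prompt(template: str, **args: str) -> str:
    """Expand a prompt template with the caller-supplied arguments."""
    return template.format(**args)

filled = render_prompt(debug_error["template"], trace="KeyError: 'user_id'")
print(filled)
```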
Not every MCP server uses all three building blocks. Many just focus on tools. What matters is that AI agents can query any MCP server at runtime to see which features are available and use them right away. Each server publishes a catalog, so agents can discover and use new capabilities without changing any code. This is how MCP keeps your workflows flexible and ready for the future.
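That runtime discovery is the part worth internalizing, so here’s a toy sketch of it. The two servers and their tools are invented; the point is that the agent builds its catalog by asking, not by hardcoding integrations.

```python
# Toy sketch of runtime discovery: merge each server's advertised tools
# into one catalog, then dispatch by name. Servers and tools are invented.
def make_server(name: str, tools: dict) -> dict:
    return {"name": name, "tools": tools}

slack = make_server("slack", {"send_message": lambda channel, text: f"[{channel}] {text}"})
db = make_server("db", {"run_query": lambda sql: f"ran: {sql}"})

def discover(servers: list[dict]) -> dict:
    """Ask every server for its tools and build a combined catalog."""
    return {
        f"{s['name']}.{tool}": fn
        for s in servers
        for tool, fn in s["tools"].items()
    }

catalog = discover([slack, db])
result = catalog["slack.send_message"](channel="#ops", text="deploy done")
print(sorted(catalog))
print(result)
```

Add a third server tomorrow and the agent picks up its tools on the next discovery pass, with no code changes on the client side.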
MCP Adoption
MCP is gaining momentum across the industry. Since Anthropic open-sourced the protocol in November 2024, adoption has moved quickly. Within months, companies like Block, Replit, Windsurf, and Sourcegraph were using it to power new AI workflows and assistants. By spring 2025, OpenAI, Google, and Microsoft had joined in, making MCP a de facto industry standard.
Today, thousands of MCP servers and connectors are active, with new ones appearing each week. The Model Context Protocol servers repository on GitHub is the main hub. There you’ll find official reference servers and a growing collection of community-built connectors for Google Drive, Slack, GitHub, databases, cloud services, and more. These projects let AI agents pull files, post messages, run code searches, or automate workflows.
It’s not just Big Tech. Open-source contributors have added servers for tools like Algolia, Chroma, Apify, Sentry, and even for creative services like text-to-speech and image generation. This means your AI agent can already connect to hundreds of real-world apps and data sources without custom code.
With both tech giants and open-source communities building on MCP, the pace is only speeding up. For AI engineers, the message is simple: if your systems are not MCP-ready yet, the ecosystem is growing fast enough to get you there. The sooner your products speak MCP, the sooner you’ll be ready for what comes next in AI.
From API-First to MCP-First
APIs made our products accessible to developers. Now, MCP is making them accessible to AI. API-first thinking still matters, but it only gets you halfway in a world where agentic AI wants to do real work.
Being “MCP-first” means designing your products so AI can use them right out of the box. You provide an MCP server, and suddenly AI assistants, copilots, and bots can plug in.
For instance, Amazon, famous for its API-first culture, is moving toward a true MCP-first approach. The company is adding MCP support to its tools and open-sourcing MCP servers for AWS, making it easy for anyone to get the most out of AWS wherever MCP is used.
Why shift to MCP-first?
AI as a first-class user: MCP-first products let AI agents use your service as easily as any developer, making your platform accessible in an AI-driven world.
Faster integration: Enable others to add your service to AI workflows in minutes, not weeks. No more patchwork integrations.
Future-proofing: With major players on board, one standard connector works everywhere, cutting down on vendor lock-in and plugin friction.
Smarter products: Expose your features through MCP and let AI automate tasks, create workflows, or combine your product’s strengths with others.
Easier internal automation: MCP makes it simple for enterprises to connect internal tools and let AI handle busywork, without brittle scripts or shadow IT.
MCP sits alongside your APIs as the standard way for AI to plug into your world. Just as APIs became essential during the mobile and SaaS booms, MCP is now essential for the AI era.
Final Thoughts
The industry learned API-first thinking the hard way. Now MCP is quietly rewriting the rules for the next generation of AI. This isn’t just another protocol; it’s a practical standard that’s already changing how products and teams connect their systems to real intelligence.
Look at the pace of adoption. A handful of reference servers and open-source SDKs has turned into a growing ecosystem, backed by cloud giants and scrappy startups alike. MCP cuts down integration challenges, limits lock-in, and gives AI agents the kind of access that actually gets work done.
If you run a product or engineering team, now’s the time to make your world legible to AI. Treat MCP as a key part of your strategy, not an afterthought. You don’t have to scrap your APIs, but if you want your product to keep up with tomorrow’s agents and workflows, get it MCP-ready.
AI is about to become your sharpest user. MCP makes sure it plays by your rules.
Thanks for reading The Engineering Leader. 🙏
If you enjoyed this issue, tap the ❤️, share it with someone who'd appreciate it, and subscribe to stay in the loop for future editions.
👋 Let’s keep in touch. Connect with me on LinkedIn.