By James Aspinwall, co-written by Alfred Pennyworth (my trusted AI) – March 2, 2026, 11:00
ServiceNow just shipped native support for both Anthropic’s Model Context Protocol (MCP) and Google’s Agent2Agent (A2A) protocol in its Zurich Patch 4 release. This isn’t a lab experiment – it’s production infrastructure for connecting AI agents to ITSM, knowledge bases, service catalogs, and each other. Here’s what they built, how it works, and why it matters.
The Two Protocols, Quick Refresher
MCP (Anthropic, open-source) standardizes how AI agents connect to tools and data. Think of it as USB for AI – a universal plug that lets any agent discover and call any tool without custom integration code. The agent asks “what can you do?”, gets a tool list with schemas, and calls them by name with JSON arguments.
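That discovery-then-call handshake is plain JSON-RPC 2.0. A minimal sketch of the message shapes (the `kb_search` tool, its schema, and the query are invented for illustration):

```python
import json

# JSON-RPC 2.0 request an MCP client sends to ask "what can you do?"
list_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# The general shape of a server's answer (tool name and schema are illustrative)
list_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "kb_search",
                "description": "Search the knowledge base",
                "inputSchema": {
                    "type": "object",
                    "properties": {"query": {"type": "string"}},
                    "required": ["query"],
                },
            }
        ]
    },
}

# The agent then calls any discovered tool by name with JSON arguments
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {"name": "kb_search", "arguments": {"query": "VPN reset"}},
}

print(json.dumps(call_request, indent=2))
```

No custom integration code is involved on the client side – the schema in `tools/list` is all the agent needs to construct a valid call.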
A2A (Google, donated to the Linux Foundation with 50+ partners) standardizes how agents talk to each other. Not agent-to-tool, but agent-to-agent. One agent can discover another’s capabilities, delegate tasks, and get structured results back. It’s JSON-RPC 2.0 with an agent card discovery mechanism at /.well-known/agent.json.
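The agent card itself is just a JSON document served at that well-known path. A hedged sketch – field names follow the general shape of the published A2A schema, but every value here is made up:

```python
import json

# Illustrative agent card of the kind served at /.well-known/agent.json
# (field names follow the A2A spec's general shape; all values are invented)
agent_card = {
    "name": "incident-helper",
    "description": "Creates and triages ITSM incidents",
    "url": "https://agents.example.com/a2a",  # hypothetical A2A endpoint
    "version": "1.0.0",
    "capabilities": {"streaming": False, "pushNotifications": False},
    "skills": [
        {
            "id": "create_incident",
            "name": "Create incident",
            "description": "Opens a new incident from a problem summary",
        }
    ],
}

print(json.dumps(agent_card, indent=2))
```

A remote agent fetches this card once, reads the skills list, and knows what it can delegate – the A2A analogue of MCP's `tools/list`.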
MCP is vertical (agent reaches down to tools). A2A is horizontal (agents reach across to other agents). ServiceNow now supports both directions.
What ServiceNow Actually Ships
AI Agent Fabric
The umbrella product is called AI Agent Fabric, announced at Knowledge 2025 alongside the AI Control Tower. Agent Fabric is the runtime layer that enables three communication patterns:
- Agent-to-tool – Now Assist agents call external MCP servers for capabilities ServiceNow doesn’t have natively (custom databases, third-party APIs, proprietary systems)
- Tool-to-agent – External AI agents call into ServiceNow via MCP to create incidents, search knowledge bases, manage catalogs
- Agent-to-agent – Now Assist agents collaborate with external agents (Google Vertex AI, AWS Bedrock, Azure AI Foundry) via A2A protocol
Thirteen launch partners are building integrations: Accenture, Adobe, Box, Cisco, Google Cloud, IBM, Jit, Microsoft, Moonhub, NVIDIA, RADCOM, UKG, and Zoom.
MCP Client – ServiceNow Calls Out
When a Now Assist agent needs to reach an external tool, ServiceNow acts as an MCP client. You configure external MCP server connections in AI Agent Studio with three auth options: authless, API key, or OAuth 2.1. The agent discovers the server’s tools at runtime and can call them during task execution.
Use case: A Now Assist agent handling an incident needs to check your internal monitoring system. Instead of building a custom integration, you point it at your monitoring tool’s MCP server. The agent discovers the check_service_health tool, calls it, and incorporates the result into its incident response.
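On the wire, that tool call is a single JSON-RPC POST to the monitoring tool's MCP endpoint. A sketch of the request the agent (as MCP client) would assemble – the URL, tool arguments, and service name are all hypothetical:

```python
import json

MCP_ENDPOINT = "https://monitoring.example.com/mcp"  # hypothetical server URL

# JSON-RPC body for the discovered tool; the schema came from tools/list
body = {
    "jsonrpc": "2.0",
    "id": 7,
    "method": "tools/call",
    "params": {
        "name": "check_service_health",           # discovered at runtime
        "arguments": {"service": "payments-api"},  # hypothetical argument
    },
}

headers = {
    "Content-Type": "application/json",
    # Streamable HTTP servers may answer with plain JSON or an SSE stream
    "Accept": "application/json, text/event-stream",
    # The 2025-06-18 spec revision has clients identify the protocol version
    "MCP-Protocol-Version": "2025-06-18",
}

print(json.dumps(body))
```

The `Accept` header is what makes Streamable HTTP work: the same endpoint can return one JSON response or upgrade to server-sent events for long-running calls.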
Transport: Streamable HTTP and SSE. No stdio (makes sense – ServiceNow is cloud-hosted, not running local processes). Protocol version 2025-06-18.
MCP Server – External Agents Call In
ServiceNow also acts as an MCP server, exposing Now Assist skills as callable tools. This is the more interesting direction – it means any MCP-capable agent (Claude, GPT, your custom agent) can interact with ServiceNow without the ServiceNow SDK.
The MCP Server Console gives administrators control over:
- Which Now Assist skills are exposed as MCP tools
- Who can access them (OAuth + ServiceNow identity)
- Usage, latency, and reliability metrics per tool per server
- Cost tracking (each skill invocation consumes assists)
The Quickstart MCP Server ships as a ready-to-use package through ServiceNow’s GenAI Innovation Lab – preconfigured tools for testing and prototyping without touching production.
A2A Protocol – Agent Collaboration
With A2A support (protocol version 0.3), Now Assist agents can play either role:
- Primary (orchestrating) agent – delegates subtasks to external agents and coordinates their responses
- Secondary (invoked) agent – receives task requests from external agents and executes them using ServiceNow capabilities
Authentication supports OAuth 2.0 and API keys, with federated token auth for cross-identity-provider scenarios. Tested against Google Vertex AI, AWS Bedrock, and Azure AI Foundry.
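Delegation in A2A is also JSON-RPC. A sketch of the envelope a primary agent might send a ServiceNow secondary agent – the `message/send` method name follows the A2A 0.3 spec, while the IDs and task text are invented:

```python
import json
import uuid

# JSON-RPC request a primary (orchestrating) agent sends to delegate a task
delegate_request = {
    "jsonrpc": "2.0",
    "id": str(uuid.uuid4()),
    "method": "message/send",
    "params": {
        "message": {
            "role": "user",
            "parts": [
                {
                    "kind": "text",
                    "text": "Create a P2 incident: payments-api error rate above 5%",
                }
            ],
            "messageId": str(uuid.uuid4()),
        }
    },
}

print(json.dumps(delegate_request, indent=2))
```

The secondary agent answers with a task object it updates as work progresses – which is where the not-yet-shipped artifact and streaming support (see limitations below) would eventually slot in.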
The Community MCP Servers
Beyond ServiceNow’s official MCP server, the community has built open-source MCP servers that expose ServiceNow’s REST API as MCP tools. These run as standalone processes (Python, typically) that bridge between MCP clients and ServiceNow instances.
A typical community server exposes tools organized into role-based packages:
| Package | Tools |
|---|---|
| service_desk | incident_create, incident_update, incident_resolve, incident_search, kb_search |
| change_coordinator | change_create, change_update, change_approve, change_reject, task management |
| knowledge_author | article_create, article_publish, category_manage, kb_admin |
| catalog_builder | catalog_browse, item_create, variable_manage, catalog_optimize |
| agile_management | story_create, epic_manage, scrum_tasks, sprint_tracking |
| platform_developer | workflow_manage, script_includes, ui_policies, CRUD operations |
| system_administrator | user_create, group_manage, membership, system_logs |
You select which packages to load via an environment variable (MCP_TOOL_PACKAGE=service_desk,knowledge_author), keeping the tool surface minimal for each use case. Auth supports basic, OAuth, and API key against the ServiceNow instance.
This means you can connect Claude Code, Cursor, or any MCP client directly to your ServiceNow instance today – no waiting for the official MCP Server Console rollout.
The AI Control Tower
Sitting above all of this is the AI Control Tower – ServiceNow’s governance layer for AI agents. Think of it as a management dashboard for your AI workforce:
- Visibility – monitor every AI agent operating across ServiceNow and third-party systems
- Governance – assign human managers to oversee agent activities, set boundaries
- Risk mitigation – track agent decisions, flag anomalies, maintain audit trails
- Optimization – measure agent impact, identify bottlenecks, coordinate workflows
The Control Tower treats AI agents like employees – they have roles, managers, performance metrics, and access controls. This is the enterprise governance layer that makes the MCP/A2A plumbing acceptable to compliance teams.
Current Limitations
ServiceNow is honest about what’s not ready yet:
- No MCP resources or prompts – only tools are supported. MCP’s resource and prompt primitives are on the roadmap.
- No artifacts – A2A artifact support (rich data exchange between agents) is planned but not shipped.
- No parallel tasking – agents execute tasks sequentially, not concurrently.
- No HTTP streaming for A2A – responses are synchronous.
- MCP server enablement requires a support ticket – until mid-January 2026, you had to log an incident (KB2676985) to get it turned on. This should be self-service by now.
These are reasonable gaps for a first release. The core plumbing – tool discovery, authenticated tool calls, agent delegation – is there.
Why This Matters
ServiceNow processes a staggering amount of enterprise workflow. Incident management, change requests, service catalogs, knowledge bases, HR cases, security operations – it’s the system of record for how large organizations operate.
Opening that up via standard protocols changes the game in three ways:
1. Any agent can now work with ServiceNow data. Before MCP, integrating an AI agent with ServiceNow meant building to their REST API, understanding their table schema, handling authentication flows, and maintaining custom code. Now you configure an MCP connection and the agent discovers capabilities at runtime.
2. ServiceNow agents can reach beyond ServiceNow. The MCP client capability means Now Assist agents aren’t trapped inside the ServiceNow ecosystem. They can call out to any MCP server – your custom tools, third-party services, internal APIs – and incorporate external data into their workflows.
3. Multi-vendor agent orchestration becomes possible. With A2A, a Google Vertex agent can delegate an incident creation task to a ServiceNow agent, which can then call a Slack MCP server to notify the on-call engineer. No vendor lock-in at the agent layer. The protocols handle the handshake.
What This Tells Us About the Market
ServiceNow’s bet is that MCP and A2A become the TCP/IP of the agent era. They’re not hedging – they’re building core product features on these protocols, not just connectors or plugins.
The partner list tells the story: Accenture (consulting), Adobe (content), Box (documents), Cisco (networking), Google Cloud, IBM, Microsoft, NVIDIA, Zoom. These aren’t indie startups experimenting with protocols. These are the companies whose enterprise customers demand interoperability.
The pattern emerging across the industry is clear: MCP for tools, A2A for agents. Anthropic owns the tool layer. Google owns the agent layer (and donated it to the Linux Foundation for credibility). Everyone else – ServiceNow included – builds on top.
For developers building AI agent systems today, the signal is: adopt MCP and A2A now. The enterprise platforms are standardizing on them. If your agent speaks these protocols, it can talk to ServiceNow, and to every other platform making the same bet.
Prerequisites If You Want to Try It
- ServiceNow Zurich Patch 4+ (or Yokohama Patch 11+)
- Now Assist SKU – Pro Plus or Enterprise Plus
- AI Agents store app version 6.x+
- Model Context Protocol Client app version 1.1+
- For community MCP servers: Python 3.11+, ServiceNow instance credentials
The enterprise just got plugged into the agent mesh. Now the interesting question is what agents will do with all that ITSM data.