Implementing MCP Servers with Ansible Automation Platform
Model Context Protocol (MCP) servers can be integrated with Ansible Automation Platform (AAP) to enhance AI-driven network operations. By enabling natural language interaction with real-time network data, MCP augments large language models (LLMs) with contextual awareness, reducing technical debt and improving operational efficiency. A practical firewall use case illustrates the transformative potential of MCP in enterprise automation.
Executive summary
Integrating Model Context Protocol (MCP) servers with Ansible Automation Platform (AAP) enables AI systems to interact with live network data using natural language. This approach enhances the reasoning capabilities of large language models (LLMs), allowing them to deliver context-rich responses grounded in operational realities. By embedding MCP into existing automation workflows, organizations can reduce technical debt, improve customer service and accelerate decision-making. This Research Note outlines the MCP architecture, implementation strategies and a real-world firewall use case, concluding with actionable recommendations for enterprise adoption.
Introduction
The rapid evolution of artificial intelligence has introduced new paradigms for how systems interact with data. Among these, Model Context Protocol (MCP) represents a significant advancement in enabling LLMs to reason over real-time, organization-specific information. MCP facilitates natural language queries against live data sources, transforming static AI models into dynamic, context-aware agents.
This research note examines the integration of MCP servers with Ansible Automation Platform (AAP), a widely adopted tool for enterprise automation. By leveraging existing workflows within AAP, MCP can serve as a bridge between AI agents and operational data, enabling intelligent, responsive automation. The note provides a technical overview of MCP, details its implementation within AAP, and presents a firewall use case to illustrate its practical value.
Analysis
Understanding Model Context Protocol
Model Context Protocol (MCP), introduced by Anthropic, is an open standard designed to facilitate communication between LLMs and external data sources. MCP operates on a client-server architecture:
- MCP server: Connects to remote or local data sources and exposes tools or capabilities.
- MCP client: Interfaces with the server, selecting tools based on declared capabilities and invoking them via standard transport protocols such as stdio, HTTP or, in some implementations, in-memory channels.
This architecture supports chaining multiple servers to enrich data responses, promoting interoperability across vendor, community and custom-built solutions. MCP's flexibility allows organizations to tailor AI interactions to their unique data environments.
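To make the client-server pattern concrete, the sketch below shows a minimal pairing built with the community FastMCP Python package. The server name, the ping tool and the target host are illustrative placeholders rather than details from this report, and the exact client API may vary with the FastMCP version installed.

```python
# server.py -- minimal MCP server sketch: declares one tool and serves it over stdio.
from fastmcp import FastMCP

mcp = FastMCP("demo-server")

@mcp.tool()
def ping(host: str) -> str:
    """A declared capability the client can discover and invoke (placeholder logic)."""
    return f"{host} responded"

if __name__ == "__main__":
    mcp.run()  # stdio transport by default
```

```python
# client.py -- spawns the server as a subprocess and invokes its tool over stdio.
import asyncio
from fastmcp import Client

async def main() -> None:
    async with Client("server.py") as client:
        print(await client.list_tools())                              # discover capabilities
        print(await client.call_tool("ping", {"host": "10.0.0.1"}))   # invoke a tool

asyncio.run(main())
```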
Integrating MCP with Ansible Automation Platform
Ansible Automation Platform (AAP) is a cornerstone of enterprise automation, supporting configuration management, testing and deployment across teams. Integrating MCP with AAP enables AI agents to interact with automation workflows in real time, enhancing their ability to reason over operational data.
In this implementation, an MCP server is configured to retrieve workflow history and launch playbooks within AAP. This setup allows AI agents to query live automation data and respond with contextually accurate insights. The integration leverages existing development lifecycles, minimizing disruption while maximizing value.
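The two AAP interactions described here map to straightforward REST calls against the automation controller. The sketch below is a minimal illustration using the requests library; the endpoint paths follow the AWX-style /api/v2/ convention and may differ by AAP version, and the host, token and job template id are placeholders rather than values from the implementation.

```python
# Helper calls against the AAP automation controller API: read recent workflow
# job history and launch a job template. Host, token and template id are placeholders.
import requests

AAP_HOST = "https://aap.example.com"
HEADERS = {"Authorization": "Bearer <AAP_TOKEN>"}  # controller OAuth token

def recent_workflow_jobs(limit: int = 5) -> list[dict]:
    """Return the most recently finished workflow jobs."""
    resp = requests.get(
        f"{AAP_HOST}/api/v2/workflow_jobs/",
        headers=HEADERS,
        params={"order_by": "-finished", "page_size": limit},
    )
    resp.raise_for_status()
    return resp.json()["results"]

def launch_job_template(template_id: int, extra_vars: dict) -> int:
    """Launch a job template (e.g. one wrapping get-panos-facts) and return the job id."""
    resp = requests.post(
        f"{AAP_HOST}/api/v2/job_templates/{template_id}/launch/",
        headers=HEADERS,
        json={"extra_vars": extra_vars},  # assumes the template prompts for variables
    )
    resp.raise_for_status()
    return resp.json()["job"]
```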
Use case: Firewall query automation
Network engineers frequently face the question: "Is the firewall blocking this traffic?" Traditionally, answering this requires manual inspection of firewall rulesets — a time-consuming process. MCP offers a streamlined alternative.
Frameworks and tools used
The implementation utilizes several emerging MCP frameworks:
- FastMCP: A community-driven MCP server package supporting extensible tooling.
- Langchain and Langgraph: Popular frameworks for building AI agents and managing toolchains.
- Streamlit: Used as the frontend interface for user interaction.
The client employs FastMCP and Langchain, while the server runs FastMCP. This modular setup allows for rapid iteration and customization.
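On the client side, the Langchain portion can be as simple as a prompt-plus-model chain that receives the ruleset returned by an MCP tool along with the user's question. The sketch below assumes the langchain-openai integration package; the prompt wording and model name are illustrative choices, not taken from the implementation.

```python
# A "policy analysis assistant" chain: combines the ruleset retrieved via MCP
# with the user's question and asks an OpenAI chat model to reason over both.
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages([
    ("system",
     "You are a firewall policy analysis assistant. Answer using only the "
     "JSON ruleset provided."),
    ("human", "Ruleset:\n{ruleset}\n\nQuestion: {question}"),
])

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)  # model choice is illustrative
policy_chain = prompt | llm

def analyze_policy(ruleset_json: str, question: str) -> str:
    """Return a natural-language answer grounded in the supplied ruleset."""
    return policy_chain.invoke({"ruleset": ruleset_json, "question": question}).content
```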
Workflow overview
- User query: A user submits a question about traffic flow, specifying source, destination and port.
- Client-server interaction: The MCP server, named firewall-server, exposes its tools to the client; in this case, the acl_audit tool is invoked via stdio.
- AAP integration: The MCP server triggers the get-panos-facts playbook in AAP via an API call.
- Playbook response: The get-panos-facts playbook runs, and its result is returned to the MCP server.
- AI reasoning: The MCP server returns the JSON-formatted ruleset to the client. The AI agent, acting as a policy analysis assistant, builds a prompt for OpenAI that includes the policy ruleset and the original user query; OpenAI reasons over the data and returns a natural language response.
- User interface: The answer is displayed in a Streamlit-based chat interface (a sketch of this front end follows below).
This workflow demonstrates how MCP can transform routine network diagnostics into intelligent, conversational experiences.
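A compact way to tie these steps together is a Streamlit script that takes the user's question, calls the acl_audit tool on firewall-server over stdio, asks OpenAI to reason over the returned ruleset and renders the answer in a chat view. The script below is a sketch under those assumptions; the server script path, the fixed query fields and the prompt text are placeholders rather than details from the implementation.

```python
# app.py -- sketch of the Streamlit chat front end for the firewall use case.
import asyncio
import streamlit as st
from fastmcp import Client
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

async def acl_audit(source: str, destination: str, port: int) -> str:
    """Invoke the acl_audit tool exposed by the firewall-server MCP server."""
    async with Client("firewall_server.py") as client:  # stdio transport
        result = await client.call_tool(
            "acl_audit", {"source": source, "destination": destination, "port": port}
        )
        return str(result)

st.title("Firewall policy assistant")
question = st.chat_input("Is traffic from 10.0.0.5 to 10.1.1.9 on tcp/443 allowed?")

if question:
    st.chat_message("user").write(question)
    # Fixed values stand in for source/destination/port parsed from the question.
    ruleset = asyncio.run(acl_audit("10.0.0.5", "10.1.1.9", 443))
    answer = llm.invoke(
        "You are a firewall policy analysis assistant.\n"
        f"Ruleset:\n{ruleset}\n\nQuestion: {question}"
    ).content
    st.chat_message("assistant").write(answer)
```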
Recommendations
To effectively adopt MCP within enterprise environments, organizations should consider the following strategic steps:
- Leverage existing automation platforms: Integrate MCP with tools like AAP to capitalize on existing workflows. This reduces implementation overhead and accelerates value realization.
- Define clear use cases: Identify operational pain points, such as firewall diagnostics or configuration audits, where AI augmentation can deliver measurable benefits.
- Establish governance and guardrails: As AI agents begin to interact with live systems, implement strict controls to prevent unintended actions. Use role-based access, approval workflows and audit logging to maintain security and compliance.
- Measure impact with operational metrics: Track metrics such as engineer time saved, ticket resolution rates and user satisfaction to evaluate MCP's effectiveness. Use these insights to refine workflows and justify further investment.
- Plan for hybrid AI strategies: Recognize that MCP complements, rather than replaces, traditional automation and retrieval-augmented generation (RAG) models. Use MCP where open-ended reasoning is valuable and retain deterministic automation for tasks requiring predictable outcomes.
Conclusion
Model Context Protocol represents a powerful evolution in AI-human interaction, enabling LLMs to reason over live, contextual data. When integrated with Ansible Automation Platform, MCP transforms static automation into dynamic, intelligent data sources. The firewall use case illustrates how MCP can reduce technical debt, improve operational efficiency and enhance customer service.
As AI continues to evolve, organizations must adopt flexible, secure and strategic approaches to integration. MCP offers a compelling path forward — one that augments existing capabilities while unlocking new possibilities for intelligent automation.
Appendix
Code snippet of an MCP server exposing the acl_audit tool.
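The snippet referenced above is not reproduced in this copy of the report. As a stand-in, below is a hedged reconstruction of the server side of the use case: a FastMCP server named firewall-server exposing acl_audit, which launches the get-panos-facts job template through the controller API, waits for completion and returns the ruleset. The template id, the endpoint paths and the assumption that the playbook publishes its results as job artifacts (via set_stats) are illustrative, not details from the implementation.

```python
# firewall_server.py -- sketch of the firewall-server MCP server from the use case.
import time
import requests
from fastmcp import FastMCP

AAP_HOST = "https://aap.example.com"
HEADERS = {"Authorization": "Bearer <AAP_TOKEN>"}
GET_PANOS_FACTS_TEMPLATE_ID = 42  # placeholder job template id

mcp = FastMCP("firewall-server")

@mcp.tool()
def acl_audit(source: str, destination: str, port: int) -> dict:
    """Return the firewall ruleset relevant to a source/destination/port query."""
    # Launch the get-panos-facts job template with the query as extra vars.
    launch = requests.post(
        f"{AAP_HOST}/api/v2/job_templates/{GET_PANOS_FACTS_TEMPLATE_ID}/launch/",
        headers=HEADERS,
        json={"extra_vars": {"source": source, "destination": destination, "port": port}},
    )
    launch.raise_for_status()
    job_id = launch.json()["job"]

    # Poll the controller until the job reaches a terminal state.
    while True:
        job = requests.get(f"{AAP_HOST}/api/v2/jobs/{job_id}/", headers=HEADERS).json()
        if job["status"] in ("successful", "failed", "error", "canceled"):
            break
        time.sleep(5)

    # Assumes the playbook publishes the ruleset with set_stats so that it
    # appears in the job's artifacts field.
    return {"status": job["status"], "ruleset": job.get("artifacts", {})}

if __name__ == "__main__":
    mcp.run()  # stdio transport, as described in the workflow above
```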
This report may not be copied, reproduced, distributed, republished, downloaded, displayed, posted or transmitted in any form or by any means, including, but not limited to, electronic, mechanical, photocopying, recording, or otherwise, without the prior express written permission of WWT Research.
This report is compiled from surveys WWT Research conducts with clients and internal experts; conversations and engagements with current and prospective clients, partners and original equipment manufacturers (OEMs); and knowledge acquired through lab work in the Advanced Technology Center and real-world client project experience. WWT provides this report "AS-IS" and disclaims all warranties as to the accuracy, completeness or adequacy of the information.