What is MCP-Airflow-API?
A Model Context Protocol (MCP) server that brings natural language
operations to Apache Airflow workflow management. Works with Claude
Desktop, Claude Code, OpenWebUI, or any MCP-compatible LLM client.
Stop juggling the Airflow Web UI, REST API curl commands, and CLI sessions.
Just ask:
- "Show me all currently running DAGs"
- "Trigger DAG 'example_complex' and show me the run status"
- "List failed task instances from yesterday with logs"
- "Show task durations for the latest run of etl_pipeline"
- "Find DAGs containing 'ETL' that haven't run in the last 24 hours"
Why I built it
Day-to-day Airflow operations — investigating failed DAG runs, checking
task logs, managing pools and variables, analyzing event logs — require
constant context switching between the Web UI, REST API documentation,
and CLI. I wanted an LLM agent that could translate operator intent
into precise Airflow API calls across both Airflow 2.x and 3.0+
clusters.
Key Features
- Dual API Version Support: A single MCP server handles both Airflow
API v1 (2.x) and v2 (3.0+); switch versions by changing the
AIRFLOW_API_VERSION environment variable, with no code changes
- 45 Tools: 43 shared tools + 2 v2-exclusive asset management tools
- DAG Operations: Trigger, pause, resume, list, filter DAGs with
pagination support for 1000+ DAG environments
- Task Instance Analysis: Filter by state, pool, duration, time
range; retrieve logs and extra links
- Smart Date Handling: Natural language dates ("yesterday",
"last 3 days", "this morning") auto-converted to API parameters
- Pool, Variable, Connection Management: Full CRUD via natural
language
- Configuration Inspection: Search and filter Airflow config
sections and options
- Event Log Analysis: DAG event summaries, import error reports,
per-DAG event tracking
- XCom Management: List and retrieve XCom entries by task and run
- Asset Management (v2): Data-aware scheduling support for
Airflow 3.0+ assets and asset events
- Multi-Cluster Support: Manage multiple Airflow versions
simultaneously via separate MCP server entries
- Bearer Token Auth: Production-ready authentication for
streamable-http mode
- Flexible Transports: stdio for local, streamable-http for
remote/Docker
Installation
PyPI package available:
uvx --python 3.12 mcp-airflow-api
Claude Desktop config:
{
  "mcpServers": {
    "mcp-airflow-api": {
      "command": "uvx",
      "args": ["--python", "3.12", "mcp-airflow-api"],
      "env": {
        "AIRFLOW_API_VERSION": "v2",
        "AIRFLOW_API_BASE_URL": "http://localhost:8080/api",
        "AIRFLOW_API_USERNAME": "airflow",
        "AIRFLOW_API_PASSWORD": "airflow"
      }
    }
  }
}
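Multi-cluster support works the same way: each cluster is a separate MCP server entry with its own env block. A sketch assuming one 2.x and one 3.x cluster (entry names, hosts, and credentials are illustrative):

```json
{
  "mcpServers": {
    "airflow-2x": {
      "command": "uvx",
      "args": ["--python", "3.12", "mcp-airflow-api"],
      "env": {
        "AIRFLOW_API_VERSION": "v1",
        "AIRFLOW_API_BASE_URL": "http://airflow2.internal:8080/api",
        "AIRFLOW_API_USERNAME": "airflow",
        "AIRFLOW_API_PASSWORD": "airflow"
      }
    },
    "airflow-3x": {
      "command": "uvx",
      "args": ["--python", "3.12", "mcp-airflow-api"],
      "env": {
        "AIRFLOW_API_VERSION": "v2",
        "AIRFLOW_API_BASE_URL": "http://airflow3.internal:8080/api",
        "AIRFLOW_API_USERNAME": "airflow",
        "AIRFLOW_API_PASSWORD": "airflow"
      }
    }
  }
}
```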
Or run the full Docker Compose stack with MCP-Server, MCPO Proxy, and
OpenWebUI in 5 minutes. A companion repository (Airflow-Docker-Compose)
provides ready-to-use Airflow 2.x and 3.x test clusters.
Tech Stack
- Python 3.12 + asyncio
- FastMCP framework
- Apache Airflow REST API v1 (2.x) and v2 (3.0+)
- stdio and streamable-http transports
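Dual-version support mostly comes down to routing requests to the right API prefix. A simplified sketch of the idea (the function name is illustrative, not the server's actual code):

```python
import os

def build_endpoint(path: str) -> str:
    """Join the configured base URL and API version with a resource path.
    Illustrative only; the real server layers auth and error handling on top."""
    base = os.environ.get("AIRFLOW_API_BASE_URL", "http://localhost:8080/api").rstrip("/")
    # "v1" targets Airflow 2.x, "v2" targets Airflow 3.0+
    version = os.environ.get("AIRFLOW_API_VERSION", "v2")
    return f"{base}/{version}/{path.lstrip('/')}"
```

With AIRFLOW_API_VERSION=v1, `build_endpoint("dags")` yields `http://localhost:8080/api/v1/dags`; flipping the variable to v2 retargets every tool at the 3.0+ API.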
Looking for Feedback
- Which Airflow areas should be enhanced next? (DAG-as-code editing?
Sensor-specific tools? Provider package introspection?)
- How should destructive operations (clear DAG runs, delete
variables) be gated for safety?
- Anyone running this against large production clusters with
thousands of DAGs — performance feedback welcome
- Multi-tenant Airflow deployments — interest in workspace/team
scoping?
Repo: https://github.com/call518/MCP-Airflow-API
PyPI: https://pypi.org/project/MCP-Airflow-API/
DeepWiki: https://deepwiki.com/call518/MCP-Airflow-API
Test Clusters: https://github.com/call518/Airflow-Docker-Compose
⭐ Stars, issues, and PRs welcome. The codebase includes an
"Adding Custom Tools" guide for easy extension.