# MCP Tools

Seven tools for LLM-assisted Airflow→Prefect conversion. The LLM reads raw code, looks up translation knowledge, and generates Prefect flows.
## Architecture

## Tools
### read_dag
Read an Airflow DAG file and return the raw source code with metadata.
Input:

```json
{
  "path": "dags/etl_pipeline.py",
  "content": "# optional: inline DAG code"
}
```
Output:

- `source`: The raw DAG source code
- `file_path`: Resolved file path
- `file_size_bytes`: File size in bytes
- `line_count`: Number of lines
The LLM reads the source directly — no AST intermediary.
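As a rough sketch of what this contract implies — the helper name and return shape below are illustrative, not the server's actual implementation:

```python
from pathlib import Path


def read_dag(path: str) -> dict:
    # Illustrative sketch of the file-path branch: read the DAG
    # source and attach the metadata fields the tool returns.
    source = Path(path).read_text()
    return {
        "source": source,
        "file_path": str(Path(path).resolve()),
        "file_size_bytes": len(source.encode("utf-8")),
        "line_count": len(source.splitlines()),
    }
```

The inline-`content` branch would simply skip the file read and fill in the same metadata fields from the provided string.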
### lookup_concept
Query pre-compiled Airflow→Prefect translation knowledge. Covers operators, patterns, connections, and core concepts. Backed by 78 Colin-compiled entries with built-in fallback mappings.
Input:

```json
{
  "concept": "PythonOperator"
}
```
Output includes:

- `concept_type`: operator, pattern, connection, or concept
- Airflow details and the Prefect equivalent
- `translation_rules`: How to convert
- `source`: `"colin"` (pre-compiled) or `"fallback"` (built-in)
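A minimal sketch of how the fallback path might work — the entries and field values below are illustrative, not the actual Colin-compiled data:

```python
# Illustrative fallback entries, not the real 78-entry knowledge base.
FALLBACK_MAPPINGS = {
    "PythonOperator": {
        "concept_type": "operator",
        "prefect_equivalent": "a @task-decorated function",
        "translation_rules": "Move the python_callable body into a @task "
                             "function and pass op_kwargs as arguments.",
    },
    "BashOperator": {
        "concept_type": "operator",
        "prefect_equivalent": "a subprocess call inside a @task",
        "translation_rules": "Run bash_command via subprocess.run in a task.",
    },
}


def lookup_concept(concept: str) -> dict:
    # Pre-compiled entries would be consulted first; this sketch
    # only models the built-in fallback branch.
    entry = FALLBACK_MAPPINGS.get(concept)
    if entry is None:
        return {"concept": concept, "found": False}
    return {"concept": concept, "found": True, "source": "fallback", **entry}
```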
### search_prefect_docs
Search current Prefect documentation via the Prefect MCP server. For real-time queries beyond what Colin pre-compiled.
Input:

```json
{
  "query": "task retries and caching"
}
```
Output: Search results from docs.prefect.io, or an error with fallback guidance.
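The error-with-fallback behavior can be sketched as follows; `search_fn` stands in for the remote Prefect MCP call, and the guidance string is illustrative:

```python
def search_prefect_docs(query: str, search_fn) -> dict:
    # search_fn is a placeholder for the Prefect MCP server call;
    # on failure the tool degrades to an error plus fallback guidance.
    try:
        return {"query": query, "results": search_fn(query)}
    except Exception as exc:
        return {
            "query": query,
            "error": str(exc),
            "fallback_guidance": "Consult docs.prefect.io directly "
                                 "or use lookup_concept.",
        }
```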
### validate
Syntax-check generated Prefect code and return both sources for structural comparison.
Input:

```json
{
  "original_dag": "dags/etl_pipeline.py",
  "converted_flow": "# generated Prefect code..."
}
```
Output:

- `original_source`: The original DAG code
- `converted_source`: The generated flow code
- `syntax_valid`: Whether the generated code parses
- `syntax_errors`: Line/column/message if invalid
- `comparison_guidance`: Checklist for the LLM to verify fidelity
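The syntax-check half of this tool can be approximated with the standard-library `ast` module; the helper name and error-dict shape here are assumptions:

```python
import ast


def check_syntax(code: str) -> dict:
    # Sketch of validate's syntax check; the structural comparison
    # of the two sources is left to the LLM, per the tool description.
    try:
        ast.parse(code)
        return {"syntax_valid": True, "syntax_errors": []}
    except SyntaxError as exc:
        return {
            "syntax_valid": False,
            "syntax_errors": [
                {"line": exc.lineno, "column": exc.offset, "message": exc.msg}
            ],
        }
```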
### scaffold
Generate a Prefect project directory structure following prefecthq/flows conventions. Does not generate flow code.
Input:

```json
{
  "output_directory": "/path/to/project",
  "project_name": "my-project",
  "workspace": "default",
  "flow_names": ["my_flow"],
  "include_docker": true,
  "include_github_actions": true,
  "schedule_interval": "@daily"
}
```
Output: Created directories, files, `prefect.yaml` template, and next steps.
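A simplified sketch of the layout, assuming the `deployments/<workspace>/<flow>/` convention from prefecthq/flows; the Docker and GitHub Actions files the real tool can emit are omitted here:

```python
from pathlib import Path


def scaffold(output_directory: str, project_name: str,
             flow_names: list[str]) -> list[str]:
    # Illustrative layout only: one directory per flow plus a
    # prefect.yaml stub carrying the project name.
    root = Path(output_directory)
    created = []
    for flow in flow_names:
        flow_dir = root / "deployments" / "default" / flow
        flow_dir.mkdir(parents=True, exist_ok=True)
        created.append(str(flow_dir))
    (root / "prefect.yaml").write_text(f"name: {project_name}\n")
    created.append(str(root / "prefect.yaml"))
    return created
```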
### generate_deployment

Write a `prefect.yaml` deployment configuration from DAG metadata. Call after generating `flow.py`. Produces a complete `prefect.yaml` with YAML anchors, schedule config, parameter defaults, and TODO stubs for work pool and pull step configuration.
Input:

```json
{
  "output_directory": "/path/to/project",
  "flows": [
    {
      "flow_name": "my_flow",
      "entrypoint": "deployments/default/my_flow/flow.py:my_flow",
      "schedule": "0 6 * * *",
      "parameters": {"date": null},
      "tags": ["etl"]
    }
  ],
  "workspace": "default"
}
```
Output: JSON with `created_file`, `deployment_names`, and `next_steps`.
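A string-templating sketch of the output: it emits a minimal deployments list with the TODO work-pool stub, but omits the YAML anchors and pull steps the real tool produces.

```python
def render_prefect_yaml(flows: list[dict]) -> str:
    # Sketch: build the deployments section of prefect.yaml as text,
    # using the same per-flow fields as the tool's input.
    lines = ["deployments:"]
    for spec in flows:
        lines.append(f"- name: {spec['flow_name']}")
        lines.append(f"  entrypoint: {spec['entrypoint']}")
        if spec.get("schedule"):
            lines.append("  schedules:")
            lines.append(f'  - cron: "{spec["schedule"]}"')
        if spec.get("tags"):
            lines.append("  tags: [" + ", ".join(spec["tags"]) + "]")
        lines.append("  work_pool:")
        lines.append("    name: null  # TODO: set your work pool")
    return "\n".join(lines) + "\n"
```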
### generate_migration_report

Write `MIGRATION.md` — a human-readable record of a DAG conversion. Call as the final step after `generate_deployment`. Documents every conversion decision, produces a before-production checklist with Prefect doc links, and suggests adding the Prefect MCP server.
Input:

```json
{
  "output_directory": "/path/to/project",
  "dag_path": "dags/etl_pipeline.py",
  "flow_path": "deployments/default/my_flow/flow.py",
  "decisions": [
    {
      "component": "PythonOperator",
      "outcome": "converted",
      "rationale": "Mapped to @task decorator"
    }
  ],
  "manual_actions": ["setup_work_pool", "migrate_connections"]
}
```
Output: JSON with `created_file` and `checklist_items_count`.
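A minimal sketch of the report writer; the section headings and bullet format are illustrative, and the real tool additionally renders the before-production checklist with doc links:

```python
from pathlib import Path


def write_migration_report(output_directory: str, dag_path: str,
                           decisions: list[dict]) -> str:
    # Sketch: record each conversion decision as a bullet under a
    # dedicated section, then write MIGRATION.md to the project root.
    lines = [
        "# Migration Report",
        "",
        f"Source DAG: `{dag_path}`",
        "",
        "## Conversion Decisions",
        "",
    ]
    for d in decisions:
        lines.append(f"- **{d['component']}** ({d['outcome']}): {d['rationale']}")
    report = Path(output_directory) / "MIGRATION.md"
    report.write_text("\n".join(lines) + "\n")
    return str(report)
```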
## External MCP Integration

The `search_prefect_docs` tool connects to the Prefect MCP server via the FastMCP Client:
| Server | URL | Purpose |
|---|---|---|
| Prefect | https://docs.prefect.io/mcp | Documentation search |
Configure via environment:

```shell
export MCP_PREFECT_ENABLED=true
export MCP_PREFECT_URL="https://docs.prefect.io/mcp"
```
## Next Steps
- Tool Design — Design principles
- Examples — Usage examples
- Schemas — Input/output schemas