Troubleshooting

Common issues and solutions when migrating Airflow DAGs to Prefect flows.

Validation Issues

Syntax Errors in Generated Code

Symptom: validate reports syntax_valid: false.

{
  "syntax_valid": false,
  "syntax_errors": [
    { "line": 3, "column": 24, "message": "unexpected EOF while parsing" }
  ]
}

Solutions:

  1. Check the syntax_errors array for line/column/message details
  2. Fix the generated code and re-validate
  3. Common causes: missing closing parentheses, unclosed strings, indentation errors
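Between validation runs, you can reproduce the same kind of syntax check locally with Python's standard-library ast module. This is a minimal sketch (the `check_syntax` helper and its output shape are illustrative, not part of the validate tool):

```python
import ast

def check_syntax(source: str) -> list[dict]:
    """Return a list of syntax errors (empty if the code parses cleanly)."""
    try:
        ast.parse(source)
        return []
    except SyntaxError as e:
        return [{"line": e.lineno, "column": e.offset, "message": e.msg}]

# A flow definition with a stray colon fails to parse;
# the parser reports the offending line and column.
print(check_syntax("def my_flow(:\n    pass"))
print(check_syntax("x = 1"))  # valid code yields an empty list
```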

Task Count Differences

Symptom: The generated flow has fewer tasks than the original DAG.

Solutions:

  1. Use the comparison_guidance from validate output to check each item:

    • Are all tasks from the DAG represented in the flow?
    • Were DummyOperator/EmptyOperator nodes intentionally removed?
    • Were dynamic tasks fully converted?
  2. For dynamic tasks, ensure all generated tasks are represented:

    # Airflow dynamic tasks
    for i in range(3):
        task = PythonOperator(task_id=f"task_{i}", ...)
    
    # Prefect equivalent
    for i in range(3):
        my_task.submit(i)  # or map over the range: my_task.map(range(3))
    

Dependency Ordering

Symptom: Tasks run in the wrong order in the Prefect flow.

Solutions:

  1. Ensure explicit data flow between tasks:

    # Wrong - no explicit dependency between submitted tasks
    @flow
    def my_flow():
        extract.submit()
        transform.submit()  # May start before extract finishes
    
    # Correct - explicit data dependency
    @flow
    def my_flow():
        data = extract()
        transform(data)  # Waits for extract
    
  2. Use wait_for for non-data dependencies:

    from prefect import flow, task
    
    @flow
    def my_flow():
        a = task_a.submit()
        b = task_b.submit(wait_for=[a])  # Explicit ordering
    

XCom Pattern Not Converted

Symptom: Generated code still references ti.xcom_pull or context['ti'].

Solutions:

  1. Replace XCom pull with function parameters:

    # Airflow
    def transform(**context):
        data = context['ti'].xcom_pull(task_ids='extract')
    
    # Prefect
    @task
    def transform(data):  # Receive as parameter
        ...
    
    @flow
    def my_flow():
        data = extract()
        transform(data)  # Pass directly
    
  2. For multi-task pulls, use explicit parameters:

    # Airflow
    data = ti.xcom_pull(task_ids=['task_a', 'task_b'])
    
    # Prefect
    @task
    def combine(data_a, data_b):
        ...
    
    @flow
    def my_flow():
        a = task_a()
        b = task_b()
        combine(a, b)
    

Use lookup_concept("xcom") for detailed XCom→return-value translation rules.

Tool Issues

read_dag Errors

Symptom: read_dag fails to read a file.

Solutions:

  1. Ensure the file path is correct and accessible
  2. Both file paths and inline code are supported — pass code directly via the content parameter
  3. Verify the file contains valid Python

lookup_concept Returns No Match

Symptom: lookup_concept doesn't find a translation for your concept.

Solutions:

  1. Try variations: PythonOperator, python-operator, python_operator all work
  2. Check Operator Coverage for supported operators
  3. Use search_prefect_docs for concepts not in the pre-compiled knowledge base
  4. For custom operators, manual conversion is required

search_prefect_docs Not Available

Symptom: search_prefect_docs returns an error or no results.

Solutions:

  1. Check that MCP_PREFECT_ENABLED is not set to false
  2. Verify network connectivity to docs.prefect.io
  3. The tool requires the Prefect MCP server at https://docs.prefect.io/mcp

Runtime Issues

Import Errors

Symptom: Converted flow fails to import.

Solutions:

  1. Install required Prefect integrations:

    pip install prefect-aws prefect-gcp prefect-sqlalchemy
    
  2. Check for missing custom modules

  3. Verify Python version compatibility
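To find which packages a failing import is missing, a quick standard-library check can help (the package names below are examples of integrations a converted flow might need):

```python
import importlib.util

def missing_packages(names: list[str]) -> list[str]:
    """Return the names that are not importable in the current environment."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# Any names printed here need a `pip install` before the flow will import
print(missing_packages(["prefect_aws", "prefect_gcp", "prefect_sqlalchemy"]))
```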

Connection/Block Not Found

Symptom: The flow fails while looking up an Airflow connection.

Solutions:

  1. Create corresponding Prefect Block — use lookup_concept("connections") for mapping guidance
  2. Update code to use Block.load() pattern
  3. See Prefect Cloud Guide for Block examples

Schedule Not Triggering

Symptom: Converted flow doesn't run on schedule.

Solutions:

  1. Verify deployment was created:

    prefect deployment ls
    
  2. Check schedule configuration in prefect.yaml

  3. Ensure worker is running:

    prefect worker start --pool "my-pool"
    
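A minimal deployment entry with a schedule in prefect.yaml might look like this (the deployment name, entrypoint path, and pool name are placeholders; check the Prefect deployment docs for the full schema):

```yaml
deployments:
  - name: my-deployment
    entrypoint: flows/my_flow.py:my_flow
    work_pool:
      name: my-pool
    schedules:
      - cron: "0 6 * * *"
        timezone: "UTC"
```

If the schedule is defined but nothing runs, the usual culprit is the worker: schedules only create flow runs, and a worker polling the named pool must pick them up.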

Getting Help

If you encounter issues not covered here:

  1. Check the validation output for syntax errors and comparison guidance
  2. Use lookup_concept and search_prefect_docs for translation guidance
  3. Review the Prefect Cloud guide for deployment patterns
  4. Open an issue on GitHub with:
    • Original DAG (sanitized)
    • Generated flow code
    • Validation result
    • Expected vs actual behavior