DAG Architect – Apache Airflow / Astronomer – Atlanta, GA
Client: TRUIST
Position: DAG Architect – Apache Airflow / Astronomer
Location: Atlanta, GA (Onsite)
To apply or for details, send an updated resume to chay@logic-loops.com
Role Overview
We are looking for an experienced DAG Architect who can design, standardize, and scale Apache Airflow workflows in an enterprise environment. This role goes beyond writing individual DAGs — the focus is on DAG architecture, DAG Factory patterns, orchestration best practices, and platform-level design.
The ideal candidate understands how Airflow works internally, how to build reusable DAG frameworks, and how to integrate Airflow with multiple data platforms while following clean code, CI/CD, and cloud best practices.
⸻
Key Responsibilities
Airflow & DAG Architecture (Core)
• Design and own enterprise-grade DAG architecture using Apache Airflow (Astronomer preferred)
• Build and maintain DAG Factory frameworks using YAML / config-driven DAG generation
• Define standard DAG templates (ingestion, transformation, validation, reconciliation, health checks)
• Ensure separation of orchestration vs business logic
• Design dependency patterns (task-level, DAG-level, cross-DAG dependencies)
• Implement best practices for:
  • Scheduling
  • Backfills
  • Catchup behavior
  • Retries, SLAs, alerts
  • Idempotency and reruns
• Optimize DAG performance and scalability
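The config-driven DAG Factory pattern described above can be sketched as follows. This is an illustrative example only: the config layout and the `build_dag_spec` helper are hypothetical names, and Airflow itself is not imported so the sketch stays self-contained; a real factory would consume the same kind of YAML config (via `yaml.safe_load`) and emit `airflow.DAG` objects with wired task dependencies.

```python
# Hypothetical sketch of a config-driven DAG factory. A production
# factory would build airflow.DAG objects from this same config shape;
# here we only validate the config and compute an execution order.

def build_dag_spec(config: dict) -> list[str]:
    """Return task ids in a valid execution order, or raise on bad config."""
    tasks = {t["id"]: t.get("upstream", []) for t in config["tasks"]}
    # Every upstream reference must point at a defined task.
    for task_id, upstream in tasks.items():
        for dep in upstream:
            if dep not in tasks:
                raise ValueError(f"{task_id!r} depends on unknown task {dep!r}")
    # Topological sort (Kahn's algorithm) doubles as a cycle check.
    ordered = []
    ready = [t for t, ups in tasks.items() if not ups]
    remaining = {t: set(ups) for t, ups in tasks.items() if ups}
    while ready:
        current = ready.pop(0)
        ordered.append(current)
        for t in list(remaining):
            remaining[t].discard(current)
            if not remaining[t]:
                ready.append(t)
                del remaining[t]
    if remaining:
        raise ValueError(f"Cycle detected involving: {sorted(remaining)}")
    return ordered

# A config as it might look after parsing a factory YAML file:
config = {
    "dag_id": "ingest_orders",
    "schedule": "@daily",
    "tasks": [
        {"id": "extract"},
        {"id": "load", "upstream": ["extract"]},
        {"id": "validate", "upstream": ["load"]},
    ],
}
```

Keeping validation in the factory (rather than in each DAG file) is what lets one team own orchestration standards while many teams contribute configs.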
⸻
Operators, Hooks & Integrations
• Strong hands-on experience with Airflow Operators, Hooks, and Sensors
• Experience integrating Airflow with:
  • Snowflake
  • Talend / DTF frameworks
  • Cloud storage (S3 / GCS / ADLS)
  • Databases and APIs
• Build custom operators and plugins when required
• Work with connections, variables, and secrets backends
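The custom-operator pattern above can be sketched as below. Because Airflow may not be installed where this is read, a minimal `BaseOperator` stand-in is defined inline; in practice you would subclass `airflow.models.BaseOperator` directly, and the `ReconciliationOperator` name and its parameters are hypothetical examples.

```python
# Stand-in for airflow.models.BaseOperator so this sketch is
# self-contained; a real operator would subclass the Airflow class.
class BaseOperator:
    def __init__(self, task_id: str, **kwargs):
        self.task_id = task_id

class ReconciliationOperator(BaseOperator):
    """Hypothetical operator: compare row counts between two sources."""

    def __init__(self, task_id: str, source_count_fn, target_count_fn, **kwargs):
        super().__init__(task_id=task_id, **kwargs)
        self.source_count_fn = source_count_fn
        self.target_count_fn = target_count_fn

    def execute(self, context: dict):
        # In Airflow, execute() is invoked by the executor at run time,
        # and its return value is pushed to XCom automatically.
        src = self.source_count_fn()
        tgt = self.target_count_fn()
        if src != tgt:
            raise ValueError(f"Reconciliation failed: {src} != {tgt}")
        return {"rows": src}
```

Usage in a DAG would look like `ReconciliationOperator(task_id="recon_orders", source_count_fn=..., target_count_fn=...)`; packaging such operators as a shared plugin is what keeps business logic out of individual DAG files.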
⸻
Python (Required)
• Strong Python skills focused on:
  • Writing clean, modular DAG code
  • Reusable helper modules and utilities
  • Config-driven DAG generation
• Understanding of:
  • Python packaging
  • Virtual environments
  • Dependency management
• Ability to review and refactor Python code written by other developers
⸻
Platform & Cloud (Preferred)
• Experience running Airflow on:
  • Astronomer
  • Kubernetes-based environments
• Cloud experience in one or more of AWS, GCP, or Azure
• Understanding of:
  • IAM / service accounts
  • Secrets management
  • Environment separation (dev, test, prod)
⸻
CI/CD & DevOps (Good to Have)
• Experience with CI/CD pipelines for Airflow:
  • DAG validation
  • Linting and unit testing
  • Deployment through Git-based workflows
• Familiarity with:
  • Git
  • Jenkins / GitLab CI / GitHub Actions
• Understanding of release management and rollback strategies
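The DAG-validation step mentioned above often amounts to a test suite CI runs before deployment. A minimal sketch of such a check, with hypothetical conventions (lowercase snake_case ids, explicit retries): in a real pipeline this would typically run under pytest against `airflow.models.DagBag`, whereas here plain dicts stand in for DAGs so the example runs anywhere.

```python
import re

# Hypothetical CI lint: enforce naming and configuration conventions
# on DAG definitions before they are deployed.
DAG_ID_PATTERN = re.compile(r"^[a-z][a-z0-9_]*$")

def validate_dag(dag: dict) -> list[str]:
    """Return a list of human-readable violations (empty means valid)."""
    errors = []
    if not DAG_ID_PATTERN.match(dag["dag_id"]):
        errors.append(f"dag_id {dag['dag_id']!r} violates naming convention")
    task_ids = [t["id"] for t in dag["tasks"]]
    if len(task_ids) != len(set(task_ids)):
        errors.append(f"duplicate task ids in {dag['dag_id']!r}")
    for t in dag["tasks"]:
        if "retries" not in t:
            errors.append(f"task {t['id']!r} has no explicit retries setting")
    return errors

good_dag = {
    "dag_id": "ingest_orders",
    "tasks": [{"id": "extract", "retries": 2}, {"id": "load", "retries": 2}],
}
bad_dag = {
    "dag_id": "Ingest-Orders",
    "tasks": [{"id": "extract"}, {"id": "extract"}],
}
```

Failing the build on a non-empty violation list gives developers fast feedback and keeps the shared Airflow platform consistent across teams.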
⸻
Governance, Standards & Collaboration
• Define coding standards, folder structures, naming conventions
• Review DAGs built by developers and provide architectural guidance
• Work closely with:
  • Data engineers
  • Platform teams
  • Cloud and DevOps teams
• Mentor junior developers on Airflow and DAG best practices
⸻
Required Skills (Must Have)
• Strong experience with Apache Airflow
• Proven experience as a DAG Architect or Lead Airflow Engineer
• Deep understanding of:
  • DAG lifecycle
  • Scheduler, executor, metadata DB
  • XComs, sensors, triggers
• Strong Python skills
• Experience designing config-driven / DAG Factory frameworks
⸻
Nice to Have
• Experience with Astronomer
• Experience in regulated or enterprise environments
• Exposure to data quality, reconciliation, or health-check frameworks
• Experience supporting multiple teams using a shared Airflow platform
