Friday, January 9, 2026

SDET/QE - Atlanta, GA

Status- Active 

Client- Equifax

Position: SDET/QE

Location : Alpharetta, GA (In Person)

To apply or request details, send an updated resume to chay@logic-loops.com

Must-Have Skills: TypeScript and Playwright (a minimal Playwright sketch follows the requirements below).

Cloud-Native Application Development: 5+ years. Solid experience with software QA methodologies, tools, and processes, specifically in a cloud-based environment.

Frontend and backend software testing: 5+ years of experience working in a TDD/BDD environment, using technologies such as JUnit, Rest Assured, Appium, JBehave/Cucumber frameworks, and APIs (REST/SOAP).

Java Experience: 1+ year of general proficiency with Java, in the context of writing test cases.

Frontend Development and Testing: 3+ years with Angular, JavaScript, TypeScript, or other modern web application development frameworks; Jasmine, Jest, and other unit testing frameworks; Selenium, Cucumber, and other integration testing frameworks.

Architecture Knowledge: Understanding of modular systems, performance, scalability, security

Agile Experience: Agile development mindset and experience

Service-Oriented Architecture: Knowledge of RESTful web services, JSON, AVRO

Application Troubleshooting: Debugging, performance tuning, production support

Test-Driven Development: Unit, integration, and load testing experience; profiling (e.g., Java JVM, databases); and tools such as LoadRunner and JMeter.

Documentation Skills: Strong written and verbal communication

General SDLC: Experience with CI/CD concepts, tools such as Jenkins/Bamboo, and release management concepts.

Understanding of GCP services related to big data, such as BigQuery, Dataflow, Pub/Sub, GCS, and Composer/Airflow, or similar AWS solutions such as Redshift, SNS, SQS, S3, and Kinesis.
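
Purely for illustration, a minimal Playwright sketch of the kind of browser check this role involves. It is shown with Playwright's Python binding to keep the examples on this page in one language; the role itself asks for TypeScript, and the URL and title assertion below are placeholders, not part of this posting.

```python
# Minimal Playwright sketch (Python binding shown for illustration only;
# the role asks for TypeScript). URL and expected title are placeholders.
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    page.goto("https://example.com")         # placeholder URL
    assert "Example Domain" in page.title()  # placeholder assertion
    browser.close()
```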


Thursday, January 8, 2026

GCP Data Engineer - Atlanta, GA

 Status- Active 

Client- Equifax

Position: Data Engineer

Location : Alpharetta, GA (In Person)


To apply or request details, send an updated resume to chay@logic-loops.com


Job Description: We are seeking an experienced Data Engineer with a strong background in Python, GCP, SQL, and BigQuery to join our dynamic team.


Key Responsibilities:

Design, build, and maintain scalable and reliable data pipelines to process large volumes of data.

Develop and optimize SQL queries for data extraction, transformation, and loading (ETL) processes.

Work with BigQuery to manage and analyze large datasets, ensuring high performance and efficiency (see the query sketch after this list).

Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions.

Implement data validation and quality checks to ensure data integrity and accuracy.

Monitor and troubleshoot data pipelines and workflows to resolve any issues and ensure smooth operation.

Stay updated with the latest trends and technologies in data engineering and apply best practices.
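
For illustration, a minimal sketch of the kind of BigQuery work described above, using the google-cloud-bigquery Python client; the project, dataset, and table names are hypothetical.

```python
# Sketch only: run an aggregation in BigQuery with the Python client.
# Project, dataset, and table names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

query = """
    SELECT event_date, COUNT(*) AS event_count
    FROM `my-project.analytics.events`
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY event_date
    ORDER BY event_date
"""

for row in client.query(query).result():
    print(row.event_date, row.event_count)
```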


Qualifications:

Bachelor's degree in Computer Science, Engineering, or a related field.

Proven experience as a Data Engineer.

Strong proficiency in Python for data processing and automation.

Advanced SQL skills, with experience in writing complex queries and optimizing performance.

Hands-on experience with BigQuery for data storage, processing, and analysis.

Familiarity with data warehousing concepts and data modeling.

Experience with cloud platforms (e.g., Google Cloud Platform, AWS, Azure).

Excellent problem-solving skills and attention to detail.

Strong communication skills and the ability to work effectively in a team-oriented environment.

Preferred Skills:

Experience with other data processing tools and frameworks such as Apache Beam, Apache Spark, or similar.

Knowledge of data governance and data security best practices.

Familiarity with CI/CD pipelines and version control systems (e.g., Git)


DAG Architect – Apache Airflow / Astronomer - Atlanta, GA

Client- TRUIST

Position: DAG Architect – Apache Airflow / Astronomer

Location: Atlanta, GA - Onsite


To apply or request details, send an updated resume to chay@logic-loops.com





Role Overview


We are looking for an experienced DAG Architect who can design, standardize, and scale Apache Airflow workflows in an enterprise environment. This role goes beyond writing individual DAGs — the focus is on DAG architecture, DAG Factory patterns, orchestration best practices, and platform-level design.


The ideal candidate understands how Airflow works internally, how to build reusable DAG frameworks, and how to integrate Airflow with multiple data platforms while following clean code, CI/CD, and cloud best practices.



Key Responsibilities


Airflow & DAG Architecture (Core)

Design and own enterprise-grade DAG architecture using Apache Airflow (Astronomer preferred)

Build and maintain DAG Factory frameworks using YAML / config-driven DAG generation (a minimal sketch follows at the end of this section)

Define standard DAG templates (ingestion, transformation, validation, reconciliation, health checks)

Ensure separation of orchestration vs business logic

Design dependency patterns (task-level, DAG-level, cross-DAG dependencies)

Implement best practices for:

Scheduling

Backfills

Catchup behavior

Retries, SLAs, alerts

Idempotency and reruns

Optimize DAG performance and scalability
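
As a rough illustration of the config-driven DAG Factory pattern mentioned above, here is a minimal sketch. It assumes a recent Airflow 2.x install, YAML configs in a local dag_configs/ folder, and a placeholder task callable; none of these specifics come from the client environment.

```python
# Sketch of a config-driven DAG factory. Assumes YAML files like:
#   dag_id: example_ingestion
#   schedule: "@daily"
#   tasks: [extract, transform, load]
# live in a local dag_configs/ folder. The callable is a placeholder for real
# business logic, which stays outside the DAG definition.
from datetime import datetime
from pathlib import Path

import yaml
from airflow import DAG
from airflow.operators.python import PythonOperator


def _run_task(task_name: str, **_):
    print(f"running {task_name}")  # placeholder for real business logic


def build_dag(config: dict) -> DAG:
    dag = DAG(
        dag_id=config["dag_id"],
        schedule=config.get("schedule", "@daily"),
        start_date=datetime(2026, 1, 1),
        catchup=False,
        default_args={"retries": config.get("retries", 2)},
    )
    previous = None
    for task_name in config["tasks"]:
        task = PythonOperator(
            task_id=task_name,
            python_callable=_run_task,
            op_kwargs={"task_name": task_name},
            dag=dag,
        )
        if previous is not None:
            previous >> task  # simple linear dependency chain
        previous = task
    return dag


# Airflow discovers DAG objects at module level, so register each generated DAG.
for path in Path("dag_configs").glob("*.yaml"):
    cfg = yaml.safe_load(path.read_text())
    globals()[cfg["dag_id"]] = build_dag(cfg)
```

The point of the pattern is that the YAML carries the orchestration shape (IDs, schedules, retries, task order) while the callables carry business logic, which keeps orchestration and business logic separated as described above.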



Operators, Hooks & Integrations

Strong hands-on experience with Airflow Operators, Hooks, Sensors

Experience integrating Airflow with:

Snowflake

Talend / DTF frameworks

Cloud Storage (S3 / GCS / ADLS)

Databases & APIs

Build custom operators and plugins when required (see the sketch after this section)

Work with connections, variables, and secrets backends
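
For illustration, a minimal custom-operator sketch. The row-count check, table name, and hook factory are hypothetical; a real implementation would normally wrap a concrete provider hook (Snowflake, Postgres, etc.).

```python
# Illustrative custom operator: fails the task when a table falls below a
# minimum row count. The hook_factory is a placeholder for any DbApiHook-style
# hook; table and threshold are hypothetical.
from airflow.models import BaseOperator


class RowCountCheckOperator(BaseOperator):
    def __init__(self, table: str, min_rows: int, hook_factory, **kwargs):
        super().__init__(**kwargs)
        self.table = table
        self.min_rows = min_rows
        self.hook_factory = hook_factory  # callable returning a DB hook

    def execute(self, context):
        hook = self.hook_factory()
        count = hook.get_first(f"SELECT COUNT(*) FROM {self.table}")[0]
        if count < self.min_rows:
            raise ValueError(
                f"{self.table} has {count} rows, expected at least {self.min_rows}"
            )
        return count  # pushed to XCom for downstream tasks by default
```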



Python (Required)

Strong Python skills focused on:

Writing clean, modular DAG code

Reusable helper modules and utilities

Config-driven DAG generation

Understanding of:

Python packaging

Virtual environments

Dependency management

Ability to review and refactor Python code written by other developers



Platform & Cloud (Preferred)

Experience running Airflow on:

Astronomer

Kubernetes-based environments

Cloud experience in one or more:

AWS, GCP, or Azure

Understanding of:

IAM / service accounts

Secrets management

Environment separation (dev, test, prod)



CI/CD & DevOps (Good to Have)

Experience with CI/CD pipelines for Airflow:

DAG validation

Linting and unit testing (see the DAG-import test sketch after this section)

Deployment through Git-based workflows

Familiarity with:

Git

Jenkins / GitLab CI / GitHub Actions

Understanding of release management and rollback strategies
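
As an example of the DAG validation step referenced above, here is a common unit test that CI can run on every commit; the dags/ folder path is an assumption about project layout.

```python
# Sketch: fail CI if any DAG in the repository cannot be imported.
# The "dags/" folder path is an assumption about the project layout.
from airflow.models import DagBag


def test_dags_import_cleanly():
    dag_bag = DagBag(dag_folder="dags/", include_examples=False)
    assert dag_bag.import_errors == {}, f"DAG import errors: {dag_bag.import_errors}"
```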



Governance, Standards & Collaboration

Define coding standards, folder structures, naming conventions

Review DAGs built by developers and provide architectural guidance

Work closely with:

Data engineers

Platform teams

Cloud and DevOps teams

Mentor junior developers on Airflow and DAG best practices



Required Skills (Must Have)

Strong experience with Apache Airflow

Proven experience as a DAG Architect or Lead Airflow Engineer

Deep understanding of:

DAG lifecycle

Scheduler, executor, metadata DB

XComs, sensors, triggers

Strong Python skills

Experience designing config-driven / DAG Factory frameworks



Nice to Have

Experience with Astronomer

Experience in regulated or enterprise environments

Exposure to data quality, reconciliation, or health-check frameworks

Experience supporting multiple teams using a shared Airflow platform


Python / Java with AI/ML


Location: Alpharetta, GA - Onsite

Client- Morgan Stanley

To apply or request details, send an updated resume to chay@logic-loops.com



Description:

We are seeking a talented developer with a strong background in utilizing Python, Java, and other modern programming languages.

In this role, the developer will be responsible for designing and deploying pipelines on the Snowflake and Postgres platforms.

The developer will be operating in a large centralized enterprise database engineering team, utilizing AI extensively in their code development process.

The developer will collaborate closely with database infrastructure engineers and automation teams to drive enterprise-level adoption of advanced data and AI solutions.

Develop, test, deploy applications and prototypes using Python, Java, and additional relevant technologies.

Leverage Snowflake AI, Notebooks, and Postgres for building and integrating data pipelines and ML workflows.

Implement and experiment with embeddings and machine learning models, demonstrating real-world business value through prototypes (a minimal sketch follows this list).

Work alongside infrastructure and automation teams to support global deployment and management of database products.
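
As a rough sketch of the embedding prototyping described above: the open-source model, sample texts, table, and connection string below are placeholders for illustration, not the client's actual stack.

```python
# Illustrative only: compute text embeddings and store them in Postgres.
# Model name, sample texts, table, and DSN are all placeholders.
import psycopg2
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # hypothetical model choice
texts = ["quarterly risk summary", "client onboarding checklist"]
vectors = model.encode(texts).tolist()

conn = psycopg2.connect("dbname=demo user=demo")  # placeholder DSN
with conn, conn.cursor() as cur:
    cur.execute(
        "CREATE TABLE IF NOT EXISTS doc_embeddings (doc TEXT, embedding FLOAT8[])"
    )
    for text, vec in zip(texts, vectors):
        cur.execute(
            "INSERT INTO doc_embeddings (doc, embedding) VALUES (%s, %s)",
            (text, vec),
        )
conn.close()
```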


Requirements:

Proficiency in Python and Java; experience with UNIX/Linux scripting is a plus.

Demonstrated experience with Snowflake (AI and Notebooks), Postgres, and embedding techniques.

Familiarity with machine learning concepts and practical implementation in modern data platforms.

Ability to create clear, maintainable demo code and prototypes for both technical and non-technical stakeholders.

Excellent problem-solving skills and a collaborative mindset.

Snowflake certifications, particularly those with an AI focus, are nice to have.


GCP Data Engineer - NJ - Onsite

 Client- UPS

Location- NJ onsite.

Multiple Positions:

- GCP Data Engineer

- Sr. GCP Data Engineer

- Sr. GCP Architect

Sr. positions must have 15+ years of experience and be onsite in NJ.


To apply or request details, send an updated resume to chay@logic-loops.com



Job Summary:

We are seeking a skilled Google Cloud Platform (GCP) Data Engineer to design, build, and optimize data pipelines and analytics solutions in the cloud. The ideal candidate must have hands-on experience with GCP data services, strong ETL/ELT development skills, and a solid understanding of data architecture, data modeling, data warehousing and performance optimization.

 

Key Responsibilities:

Develop ETL/ELT processes to extract data from various sources, transform it, and load it into BigQuery or other target systems.

Build and maintain data models, data warehouses, and data lakes for analytics and reporting.

Design and implement scalable, secure, and efficient data pipelines on GCP using tools such as Dataflow, Pub/Sub, Cloud Run, Python, and Linux scripting.

Optimize BigQuery queries, manage partitioning and clustering, and handle cost optimization (see the table-definition sketch after this list).

Integrate data from on-premises and cloud systems using Cloud Storage and APIs.

Work closely with DevOps teams to automate deployments using Terraform, Cloud Build, or CI/CD pipelines.

Ensure security and compliance by applying IAM roles, encryption, and network controls.

Collaborate with data analysts, data scientists, and application teams to deliver high-quality data solutions.

Implement best practices for data quality, monitoring, and governance.
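
For illustration, a minimal sketch of defining a day-partitioned, clustered BigQuery table with the Python client; the project, dataset, table, and column names are placeholders.

```python
# Sketch: define a day-partitioned, clustered BigQuery table via the Python
# client. Project, dataset, table, and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

schema = [
    bigquery.SchemaField("event_ts", "TIMESTAMP"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
]
table = bigquery.Table("my-project.analytics.transactions", schema=schema)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_ts"
)
table.clustering_fields = ["customer_id"]

client.create_table(table, exists_ok=True)
```

Partitioning on the timestamp column and clustering on the most common filter key is the usual starting point for keeping scan volume, and therefore query cost, down.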

Required Skills and Experience:

Bachelor’s degree in Computer Science, Information Technology, or related field.

Minimum 8 years of experience in data engineering, preferably in a cloud environment.

Minimum 3 years of hands-on experience and strong expertise in GCP services: BigQuery, Cloud Storage, Cloud Run, Dataflow, Cloud SQL, AlloyDB, Cloud Load Balancing, Pub/Sub, IAM, and Logging and Monitoring.

Proficiency in SQL, Python and Linux scripting.

Prior experience with ETL tools such as DataStage, Informatica, or SSIS.

Familiarity with data modeling (star/snowflake) and data warehouse concepts.

Understanding of CI/CD, version control (Git), and Infrastructure as Code (Terraform).

Strong problem-solving and analytical mindset.

Effective communication and collaboration skills.

Ability to work in an agile and fast-paced environment.

GCP Professional Data Engineer or Cloud Architect certification is a plus.
