Thursday, July 24, 2025

Senior Marketing Technology Lead (MarTech)- Status- ACTIVE- Currently accepting Resumes


Senior Marketing Technology Lead (MarTech)
Location: Atlanta, GA (Hybrid – 3 Days/Week Onsite)
Client: Synovus
About the Role
We are seeking a strategic and hands-on Marketing Technology Lead to support a leading banking client in transforming their marketing technology and digital engagement capabilities. This position plays a critical role in optimizing the client’s internet presence (customer-facing channels) and intranet presence (internal communications and marketing enablement), while ensuring platform integration, regulatory compliance, and cross-team collaboration.
Key Responsibilities
Martech Leadership
  • Define and lead the MarTech strategy and roadmap aligned with business growth, compliance requirements, and customer engagement goals
  • Advise senior leadership on marketing technology trends, innovations, and investment priorities
  • Conduct vendor evaluations and recommend MarTech solutions that align with business objectives
Platform Management & Optimization
  • Lead strategy and execution of tools including Salesforce Marketing Cloud (SFMC), CDPs, CRM systems, analytics, and web platforms.
  • Drive seamless integration across digital platforms to enable consistent customer experience.
Data, Analytics, and Compliance
  • Support data-driven campaign execution, personalization, and customer segmentation.
  • Ensure adherence to privacy and financial compliance standards (e.g., CCPA, GDPR, FFIEC).
Internet & Intranet Presence
  • Strengthen the client’s internet presence by supporting public-facing digital experiences, customer journeys, SEO/SEM strategies, and personalized content delivery.
  • Enhance the intranet presence by delivering tools, dashboards, and content that empower internal marketing and sales teams.
Collaboration & Enablement
  • Partner with Marketing, IT, Compliance, and Analytics teams to ensure platform alignment and user adoption.
  • Communicate technical concepts clearly to both technical and non-technical audiences.
Team Leadership
  • Lead a focused team of marketing technology specialists and analysts
  • Mentor team members on MarTech planning, analysis, and strategic thinking
  • Coordinate with implementation teams to ensure strategic alignment
 
Qualifications
Required Experience
  • 8–12 years of experience in marketing technology strategy and leadership, ideally within the banking or financial services sector
  • Proven experience in strategic planning and evaluation for marketing technology platforms
  • Strong strategic understanding of Salesforce Marketing Cloud, Adobe Experience Cloud, Marketo, or similar platforms
  • Deep knowledge of CRM integration strategy, customer data architecture, analytics frameworks, and personalization strategies
  • Comprehensive understanding of data governance, marketing compliance (e.g., FFIEC, FDIC), and privacy regulations
  • Exceptional cross-functional leadership and strategic project management skills
  • Bachelor’s degree in Marketing, Information Systems, or a related field (Master's preferred)
Preferred Experience
  • Experience working in regulated industries, especially banking or insurance
  • Familiarity with digital banking tools, omnichannel marketing strategy, and financial customer journey frameworks
  • Certifications in SFMC, Marketo, Adobe, or related platforms
  • Experience with marketing technology vendor management and contract negotiations
  • Background in change management and technology transformation initiatives.

Please send resumes to chay@logic-loops.com

Labels:

Sr Fullstack Developer with GCP- Status- ACTIVE- Currently accepting Resumes

Title: Sr Fullstack Developer with GCP

Location: Alpharetta, GA (Face-to-Face Interview; Day 1 onsite role)

Duration: Long-term

Client: Equifax

 

Must-have skills: Apache Beam, Dataflow, GCP, CI/CD, Spring Boot, Microservices, Web Services, Batch Processing, Big Data Development

Seeking an experienced, resourceful full stack engineer who can adapt and hit the ground running with minimal supervision. This individual will be passionate about end-user experience and best-in-class engineering excellence, and will be part of a tight-knit, distributed engineering team developing and delivering a comprehensive data operations management solution.

Data operations management solution consists of:

·       A web portal UI/UX that provides a single point of access to all data management and data reliability engineering functions

·       A suite of backend API services that serve the UI and integrate with low-level Data Fabric and other third-party system APIs

·       Modern data lakehouse (data lake, data warehouse, batch and streaming ELT pipelines) 

The data operations roadmap envisions a set of rich management capabilities including:

·       Serving a large community of geographically dispersed data operations stakeholders

·       Data quality and observability management to detect, alert, and prevent data anomalies

·       Troubleshooting, triaging and resolving data and data pipeline issues

·       OLAP, batch and streaming big data processing, and BI reporting

·       MLOps

·       Real-time dashboards, alerting and notifications, case management, user/group management, AuthZ, and many other foundational capabilities

Tech Stack

·       Frontend: Angular 17+, JavaScript, TypeScript, HTML, SCSS, Webpack Module Federation, Tailwind CSS, Angular Material, Angular Elements

·       Backend: Java (JDK 17+), Spring Framework 6.x, Spring Boot 3.x, NestJS 10.x, REST and GraphQL microservices, Node.js

·       Tools & Frameworks: Nx build management, Monorepo architecture, Jenkins CI/CD, Fortify, Sonar, GitHub

·       Cloud & Data: GCP (GKE, Composer + Airflow, Dataflow + Apache Beam, BigQuery, BigTable, Firestore, GCS, PubSub, Vertex AI), Terraform, Helm Charts, GitOps

·       Other Technologies: Websockets, SSE, event-driven architecture
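The must-have skills for this role center on Apache Beam/Dataflow batch processing. As a rough illustration of the shape of such a pipeline (a plain-Python sketch with no Beam dependency; the stage names and sample records are hypothetical, not from the posting), the ParDo → GroupByKey → Combine flow looks like:

```python
from collections import defaultdict

# A Beam-style batch pipeline sketched in plain Python. A real Dataflow job
# would express the same shape as PCollection transforms
# (ParDo -> GroupByKey -> Combine); records here are hypothetical samples.

def parse(record: str) -> tuple[str, int]:
    """ParDo-equivalent: split 'key,value' lines into (key, int) pairs."""
    key, value = record.split(",")
    return key, int(value)

def group_by_key(pairs):
    """GroupByKey-equivalent: collect values under each key."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def combine(grouped):
    """Combine-equivalent: reduce each key's values to a sum."""
    return {key: sum(values) for key, values in grouped.items()}

def run_pipeline(records):
    return combine(group_by_key(parse(r) for r in records))

print(run_pipeline(["a,1", "b,2", "a,3"]))  # {'a': 4, 'b': 2}
```

In Beam proper, each stage becomes a labeled transform (`p | beam.Map(parse) | beam.GroupByKey() | ...`) so the runner can parallelize and checkpoint it.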

Environment

·       Culture: Fast-paced, creative, results-oriented

·       Team Structure: Agile, working in 2-week sprints using Aha and Jira for project management

·       Expectations: Self-starters who can work independently with limited guidance, delivering solutions that end-users value and love

Responsibilities

·       End-to-End Development: Design, develop, test, deploy, and operate software applications, covering both frontend and backend

·       Cross-Functional Work: Collaborate with global teams to integrate with existing internal systems and GCP cloud

·       Issue Resolution: Triage and resolve product or system issues, ensuring quality and performance

·       Documentation: Write technical documentation, support guides, and run books

·       Agile Practices: Participate in sprint planning, retrospectives, and other agile activities

·       Compliance: Ensure software meets secure development guidelines and engineering standards




Please send resumes to chay@logic-loops.com

Labels:

Data Infrastructure Engineer/Data Platform-Status- ACTIVE- Currently accepting Resumes

Client: T-Mobile

Title: Data Infrastructure Engineer/Data Platform

Location: Atlanta, GA / Frisco, TX / Kansas City, KS

Must-Have:

DataOps experience with Kafka, Databricks, and Snowflake

Data Infrastructure Engineer/Data Platform SRE: maintains uptime and reliability of DataOne data processing

Automates IaC, deployment, and monitoring



Please send resumes to chay@logic-loops.com

Labels:

Engineer, Systems Reliability -Status- ACTIVE- Currently accepting Resumes

 Client: T-Mobile

Title: Engineer, Systems Reliability 

Location: Atlanta, GA / Frisco, TX / Kansas City, KS

Must-Have:

3+ years of experience in Site Reliability Engineering, DevOps, or related fields.

Proficiency with cloud platforms like AWS, GCP, or Azure.

Strong experience with containerization (Docker, Kubernetes).

Solid knowledge of scripting/programming languages (Python, Go, Bash).

Experience with CI/CD tools (e.g., Jenkins, GitHub Actions, ArgoCD).

Familiarity with observability stacks and tools (e.g., ELK, Prometheus/Grafana, Datadog).

Good understanding of networking, DNS, load balancing, and security best practices.



Please send resumes to chay@logic-loops.com

Labels:

Sr Engineer, Site Reliability -Status- ACTIVE- Currently accepting Resumes

Client: T-Mobile

Title: Sr Engineer, Site Reliability 

Location: Atlanta, GA / Frisco, TX / Kansas City, KS

Must-Have:

5+ years of experience in Site Reliability Engineering, DevOps, or related fields.

Proficiency with cloud platforms like AWS, GCP, or Azure.

Strong experience with containerization (Docker, Kubernetes).

Solid knowledge of scripting/programming languages (Python, Go, Bash).

Experience with CI/CD tools (e.g., Jenkins, GitHub Actions, ArgoCD).

Familiarity with observability stacks and tools (e.g., ELK, Prometheus/Grafana, Datadog).

Good understanding of networking, DNS, load balancing, and security best practices.


Please send resumes to chay@logic-loops.com

Labels:

Principal Engineer, Site Reliability - Status- ACTIVE- Currently accepting Resumes

Client: T-Mobile

Title: Principal Engineer, Site Reliability  

Location: Atlanta, GA / Frisco, TX / Kansas City, KS

Must-Have:

8+ years of experience in Site Reliability Engineering, DevOps, or related fields.

Proficiency with cloud platforms like AWS, GCP, or Azure.

Strong experience with containerization (Docker, Kubernetes).

Solid knowledge of scripting/programming languages (Python, Go, Bash).

Experience with CI/CD tools (e.g., Jenkins, GitHub Actions, ArgoCD).

Familiarity with observability stacks and tools (e.g., ELK, Prometheus/Grafana, Datadog).

Good understanding of networking, DNS, load balancing, and security best practices.


Please send resumes to chay@logic-loops.com



Labels:

DataOps Engineer-Status- ACTIVE- Currently accepting Resumes

 Client: T-Mobile

Title: DataOps Engineer

Location: Atlanta, GA / Frisco, TX / Kansas City, KS

We are seeking a highly skilled DataOps Engineer to help implement automation that drives the technical and business processes associated with our cloud-based data platform. This includes coordinating with the Business Technology team, implementing CI/CD for our data workflows, and helping to establish critical platform integrations into key governance and quality tools as well as downstream systems. The ideal candidate will have a strong understanding of DataOps principles, experience on teams that manage data as a product and leverage modern data tools (Kafka, Databricks, Snowflake), and a passion for automation and efficiency. This role will be instrumental in building and scaling our data platform to support data-driven decision-making across the organization.


Responsibilities:

Hands-on implementation of automated workflows and processes for the Enterprise Data & Analytics team, ensuring performance and alignment to architectural guidelines.

Implement and maintain DataOps CI/CD pipelines for rapid, reliable infrastructure and data product deployment, encompassing automated environment provisioning and management, testing, and linting stages.

Develop and deploy sophisticated data pipeline automation and orchestration solutions for efficient and reliable data workflows.

Implement and optimize monitoring and alerting systems to proactively identify and resolve issues.

Collaborate across the organization.

Implement DataOps best practices through automated implementation of tooling (Snowflake, dbt, Atlan, Acceldata, Airflow, and Git).

Measure adoption and look for continuous improvement opportunities related to technology, process, and skills.

Drive and, to the extent possible, automate the creation of detailed documentation for implemented data pipelines and operational procedures.

Continuously tune solutions for optimal performance.
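The monitoring-and-alerting responsibility above typically starts with simple freshness checks run after each pipeline load. A minimal sketch (the 2-hour SLA and payload shape are assumptions for illustration, not details from the posting):

```python
from datetime import datetime, timedelta, timezone

# Hypothetical freshness check of the kind a DataOps monitoring stage runs
# after each load; in production the result would feed an alerting tool
# rather than being returned to the caller.

FRESHNESS_SLA = timedelta(hours=2)  # assumed SLA

def check_freshness(last_loaded_at, now=None):
    """Return lag and whether a table's latest load breaches the SLA."""
    now = now or datetime.now(timezone.utc)
    lag = now - last_loaded_at
    return {
        "lag_minutes": round(lag.total_seconds() / 60),
        "breached": lag > FRESHNESS_SLA,
    }

now = datetime(2025, 7, 24, 12, 0, tzinfo=timezone.utc)
stale = check_freshness(datetime(2025, 7, 24, 9, 0, tzinfo=timezone.utc), now)
print(stale)  # {'lag_minutes': 180, 'breached': True}
```

Tools named in the posting (Acceldata, Atlan, Airflow sensors) provide this class of check out of the box; the sketch only shows the underlying logic.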


Basic Qualifications:

Builds and supports data workflows and analytics, with a focus on data availability, quality, and resiliency.

Experience in the DataOps, DevOps, and/or Data Engineering space.

Proven experience (3-5+ years) with core data platforms and orchestration tools such as Kafka, Databricks, Snowflake, and GitHub Actions.

Deep practical understanding and application of DataOps principles and methodologies.

High-level understanding of data modeling, ETL/ELT processes, and data warehousing best practices.

Hands-on experience implementing and maintaining robust CI/CD pipelines for data platforms and products.

Please send resumes to chay@logic-loops.com

Labels:

Senior Java Backend Engineer -Status- ACTIVE- Currently accepting Resumes

 Client: APPLE

Job Title: Senior Java Backend Engineer 

No. of Openings: 2

Location: Austin, TX (onsite) 

Experience: 7+ years of senior Java backend development

Job Description: 

We are seeking a talented Senior Java Backend Engineer with a minimum of 7 years of experience to join our dynamic team in Austin, TX. The ideal candidate will have extensive experience in designing, developing, and deploying multi-tier distributed web applications in an enterprise environment. You will play a critical role in creating and maintaining RESTful APIs and collaborating with front-end and back-end development teams to ensure seamless integration. 

 Key Responsibilities: 

Design, develop, and deploy multi-tier, distributed web applications in an enterprise setting. 

Create and maintain RESTful APIs while ensuring effective collaboration with development teams for smooth integration. 

Work with relational databases (Oracle) and NoSQL databases (Cassandra) to effectively manage and model data. 

Develop and optimize low-latency service APIs and data aggregation pipelines. 

Utilize Java Spring, JPA, and Hibernate frameworks to build scalable applications. 

Advocate for and implement best practices in Test Driven Development (TDD) and Continuous Integration (CI). 

Solve complex technical problems, debug issues efficiently, and propose scalable solutions. 

Utilize AWS services to deploy and maintain applications, ensuring their reliability and scalability. 
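The responsibilities above include aggregating data across a relational store (Oracle) and a NoSQL store (Cassandra). The join step of such an aggregation can be sketched in plain Python (all field names and sample rows below are hypothetical; in the real service this would run in Java over JPA/driver result sets):

```python
# Sketch of the aggregation step only: joining rows from a relational source
# (stand-in for Oracle) with documents from a NoSQL source (stand-in for
# Cassandra) by customer id. All names and sample data are hypothetical.

oracle_rows = [
    {"customer_id": 1, "name": "Ada"},
    {"customer_id": 2, "name": "Grace"},
]
cassandra_docs = [
    {"customer_id": 1, "events": 7},
    {"customer_id": 2, "events": 3},
]

def aggregate(rows, docs):
    """Index the NoSQL side by key once, then enrich each relational row.

    Building the index first keeps the join O(n + m) instead of O(n * m),
    which matters for the low-latency APIs the role describes.
    """
    by_id = {d["customer_id"]: d for d in docs}
    return [
        {**row, "events": by_id.get(row["customer_id"], {}).get("events", 0)}
        for row in rows
    ]

print(aggregate(oracle_rows, cassandra_docs))
```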

 Must-Have Skills: 

Minimum of 7 years of experience in backend development 

REST-based Web Services 

Spring (Spring Data, Spring JPA, Spring Web Services) 

Advanced Java (Java 8+) 

AWS 

DataStax Cassandra 

Hibernate 

Oracle


Please send resumes to chay@logic-loops.com

Labels:

Senior Engineering Program Manager – Full Stack & DevOps-Status- ACTIVE- Currently accepting Resumes

Client: APPLE

Title: Senior Engineering Program Manager – Full Stack & DevOps

Location: Austin, TX

We are urgently looking for a Senior Engineering Program Manager to drive large-scale, cross-functional technical initiatives. The ideal candidate should have strong experience with full stack development, CI/CD, Agile methodologies, and proven success in managing complex programs across global teams.

Responsibilities:

Support multiple cross-functional technical teams to deliver their objectives in fast-paced and complex programs and projects.

Develop strong partnerships with engineering leaders to drive focus on strategic and tactical program objectives.

Own and manage program and portfolio planning, execution of roadmap, and engineering operations for multiple verticals.

Drive alignments and prioritization across all technology initiatives, manage roadmap for a single consistent view and lead communications.

Build strategic relationships with key technology and business leaders to ensure program success

Drive teams and individuals in planning and executing roadmaps, releases, and work backlogs.

Lead efforts to identify risks, resolve key project conflicts, and establish appropriate resolution paths.

Fill in gaps across roles and functions as needed, performing as an adaptive problem solver.

Report on project status or portfolio roadmap, risks, issues and mitigation plans.

Create a collaborative work environment that fosters shared understanding, transparency, mastery, autonomy, innovation, and continuous learning.

Coach and mentor others in the best practices of modern planning and execution.

 

Qualifications:


5+ years of strong hands-on program management experience

Direct experience working with cross-functional teams, engineering leadership, technical teams and individual contributors.

Experience managing organization wide, large scale, high impact programs.

Proven success on establishing organization-wide processes.

Direct work experience in an engineering program management capacity driving large technical initiatives, including all aspects of process development and execution.

Knowledge of complex technical ecosystems and adequate technical depth.

Experience managing multiple major and concurrent projects/programs executed through multiple geographic locations.

Ability to quickly adapt to a fast pace, shifting priorities, demands, and timelines through analytical and problem-solving capabilities.

 Preferred (not required):

Experience working in Security domain

 Required Skills:

Program Management

Experience with Agile Methodologies

Experience with full stack implementations, GitHub, CI/CD and micro services/APIs

Cross-functional collaboration


Please send resumes to chay@logic-loops.com

Labels:

Automation Tester (TOSCA)- Status- ACTIVE- Currently accepting Resumes

 Client: T-Mobile

Role: Automation Tester

Location: Atlanta, GA / Frisco, TX / Kansas City, KS / Seattle, WA

 We are looking for a dedicated Tester with strong expertise in Automation testing, with experience in development as well. The ideal candidate will focus on designing and executing test cases while contributing to automation initiatives as needed. This role requires a strong grasp of testing methodologies, attention to detail, and a willingness to grow automation skills to enhance overall testing efficiency.


The candidate must have:


Strong hands-on experience, with extensive work in web-service testing

Have strong technical knowledge of Automation Testing

In-depth knowledge of testing against Swagger/OpenAPI specifications.


Key Responsibilities:


Manual Testing:


Analyze requirements and design detailed manual test cases to validate software functionality, usability, and performance.

Execute test cases, record results, and report defects with comprehensive descriptions and steps to reproduce.

Perform various types of testing, including regression, functional, exploratory, and usability testing.

Collaborate with cross-functional teams to ensure complete test coverage and alignment with project goals.

Automation Testing (Preferred):

Contribute to the development and execution of basic automation scripts for repetitive and high-impact test cases.

Support automation initiatives by maintaining and updating test scripts under guidance.

Assist in integrating automated tests into CI/CD pipelines to improve testing efficiency.


Defect Management:


Track, log, and manage defects using tools such as JIRA, qTest, or similar platforms.

Retest resolved issues and provide detailed feedback to developers.


Development:

Experience writing pre-request scripts in Postman

Create, edit and import schema formats like WADL/WSDL

Simulate API endpoints with mock servers

Extensive hands-on working knowledge of RESTful APIs, JSON, OAuth, etc., and how to integrate and test development code in Postman
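Postman test scripts themselves are JavaScript, but the assertion pattern the bullets above describe (check status, parse JSON, verify required fields) can be sketched language-neutrally. In the Python sketch below, the response body and required fields are hypothetical stand-ins, not a real API:

```python
import json

# The same assertion pattern a Postman "Tests" tab applies to a response,
# sketched in Python. Status code, body, and field names are hypothetical.

def validate_response(status_code, body, required_fields):
    """Collect failures the way pm.test(...) results are reported."""
    failures = []
    if status_code != 200:
        failures.append(f"expected 200, got {status_code}")
    payload = json.loads(body)
    for field in required_fields:
        if field not in payload:
            failures.append(f"missing field: {field}")
    return failures

body = json.dumps({"access_token": "abc", "token_type": "Bearer"})
print(validate_response(200, body, ["access_token", "token_type", "expires_in"]))
# ['missing field: expires_in']
```

In Postman the equivalent lives in the test tab as `pm.response.to.have.status(200)` plus `pm.expect(pm.response.json()).to.have.property(...)` calls.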

Collaboration and Reporting:

Work closely with developers, product owners, and stakeholders to deliver high-quality software.

Document test cases, results, and issues clearly and concisely.

Contribute to test planning and strategy discussions.

Years of experience needed – 

4+ years of experience in testing.

1+ year as developer

Technical Skills: 

Proficient in designing and executing detailed manual test cases.

Strong understanding of software testing life cycle (STLC) and QA methodologies.

Familiarity with defect tracking tools (e.g., JIRA, qTest).

Knowledge of Agile methodologies and processes.

Deep understanding of automation tools such as Rest Assured and/or TOSCA.

Basic Programming knowledge in Java or JavaScript is required

Working knowledge in API development.

Strong in API testing using tools like Postman.

Familiarity with continuous integration tools like Jenkins or GitLab CI/CD.

Detail-oriented with a strong focus on quality and accuracy.

Eager to learn and grow in automation testing.

Strong problem-solving and analytical skills.

Excellent communication and teamwork abilities.

Certifications Needed: 

TOSCA certification is preferred.

Please send resumes to chay@logic-loops.com

Labels:

Friday, July 18, 2025

Oracle Fusion ERP & EPM Integration Developer (Only Atlanta local candidates)- Status- ACTIVE- Currently accepting Resumes

 Client: Equifax

Oracle Fusion ERP & EPM Integration Developer

We are looking for an Oracle Fusion ERP & EPM Integration Developer who will be responsible for building and maintaining integrations between Oracle Fusion Cloud ERP & EPM applications and other systems, both within and outside of the Oracle ecosystem. This involves using various tools and technologies to ensure seamless data flow and business process execution.

Key Responsibilities:

  • Developing Integrations:

Creating custom integrations using Oracle Integration Cloud (OIC), Visual Builder Cloud Service (VBCS), and other relevant tools to connect Oracle Fusion with other applications (e.g., Ariba, FIS, third-party systems).

  • Utilizing Adapters:

Working with prebuilt Oracle ERP Cloud adapters, technology adapters (REST, SOAP, FTP, etc.), and custom adapters to facilitate communication and data exchange. 

  • Understanding Integration Patterns:

Implementing various integration patterns, such as message queues, APIs, file-based transfers, and event-driven architectures. 

  • Ensuring Data Integrity:

Working with various data formats (XML, JSON, etc.) and ensuring data consistency and accuracy during integrations. 

  • Managing Security:

Implementing appropriate security measures, including OAuth, SAML, and data encryption, to protect sensitive data during integration.

  • Troubleshooting and Support:

Diagnosing and resolving integration-related issues, providing support to end-users, and optimizing integration performance. 

Key Skills and Technologies:

  • Oracle Integration Cloud (OIC): A core platform for building and managing integrations. 
  • Visual Builder Cloud Service (VBCS): Used for building user interfaces and extending functionality within Oracle Fusion ERP. 
  • Oracle ERP Cloud Adapter: A prebuilt adapter for connecting to Oracle Fusion ERP. 
  • SOAP, REST, XML, JSON: Fundamental web service and data exchange technologies. 
  • Cloud Native Technologies: Familiarity with cloud-based solutions and services. 
  • Security Protocols: Knowledge of OAuth 2.0, SAML 2.0, and data encryption. 
  • Oracle ERP Cloud (Financials, Procurement, SCM): Understanding the core modules and APIs of Oracle Fusion ERP. 
  • Oracle EPM Cloud (Account Reconciliation, Financial Close and Consolidation): Understanding the core modules and APIs of Oracle Fusion EPM. 
  • Integration Patterns: Understanding various integration approaches and best practices. 
  • API Client & Testing: Familiarity with tools such as Postman, Bruno, or SOAP UI.
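A recurring task behind the skills above is mapping between XML (SOAP side) and JSON (REST side) payloads. A minimal stdlib sketch of that conversion follows; the `<Invoice>` element names are hypothetical, not an actual Fusion ERP schema:

```python
import json
import xml.etree.ElementTree as ET

# Minimal XML-to-JSON mapping of the kind an OIC integration performs between
# a SOAP response and a REST payload. Element names here are hypothetical.

xml_payload = """
<Invoice>
  <InvoiceId>INV-1001</InvoiceId>
  <Amount>250.00</Amount>
  <Currency>USD</Currency>
</Invoice>
"""

def xml_to_dict(xml_text):
    """Flatten a single-level XML element into a dict of tag -> text."""
    root = ET.fromstring(xml_text)
    return {child.tag: child.text for child in root}

print(json.dumps(xml_to_dict(xml_payload)))
# {"InvoiceId": "INV-1001", "Amount": "250.00", "Currency": "USD"}
```

In OIC itself this mapping would be declared in the visual mapper rather than coded, but the shape of the transformation is the same.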

This Oracle Fusion Integration Developer will play a crucial role in ensuring that Oracle Fusion Cloud can seamlessly interact with other systems, enabling businesses to streamline their processes and achieve greater efficiency.


Please send resumes to chay@logic-loops.com

Labels:

Sr Big Data Developer with GCP and Microservices (Only Atlanta local candidates)- Status- ACTIVE- Currently accepting Resumes

 Client: Equifax

Sr Big Data Developer with GCP and Microservices (Atlanta locals only)

Day 1 onsite.

10+ Years exp

Skill Set: Apache Beam, Dataflow, GCP, CI/CD, Spring Boot, Microservices, Web Services, Batch Processing, Big Data Development

Please send resumes to chay@logic-loops.com

Labels:

Thursday, July 17, 2025

GCP- Google Cloud Architect-Atlanta, GA (Only Atlanta local candidates)- Status- ACTIVE- Currently accepting Resumes

Title: GCP Cloud Architect
Location: Alpharetta, GA (Day 1 onsite role)
Duration: Long-term
Client: Equifax
Visa Type: USC only
 Responsibilities:
  • Architectural Leadership: Lead the design and architecture of highly scalable, resilient, and cost-effective data solutions leveraging a diverse set of big data and cloud-native services in GCP and AWS.
  • Technical Guidance & Mentorship: Provide expert architectural guidance, technical leadership, and mentorship to multiple engineering teams, ensuring adherence to architectural principles, best practices, and design patterns.
  • Platform Development & Evolution: Drive the selection, implementation, and continuous improvement of core data platform components, tools, and frameworks.
  • Cloud-Native Expertise: Leverage deep understanding of GCP and AWS data services (e.g., BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, S3, EMR, Kinesis, Redshift, Glue, Athena) to design optimal solutions.
  • Data Governance & Security: Architect and implement robust data governance, security, privacy, and compliance measures within the data platform, ensuring data integrity and regulatory adherence.
  • Performance & Optimization: Identify and address performance bottlenecks, optimize data pipelines, and ensure efficient resource utilization across cloud environments.
  • Innovation & Research: Stay abreast of emerging big data and cloud technologies, evaluate their potential impact, and recommend their adoption where appropriate.
  • Cross-Functional Collaboration: Collaborate closely with Developers, Test Engineers, Data scientists, data engineers, analytics teams, product managers, and other stakeholders to understand data requirements and translate them into architectural designs.
  • Documentation & Standards: Develop and maintain comprehensive architectural documentation, standards, and guidelines for data platform development.
  • Proof-of-Concepts (POCs): Lead and execute proof-of-concepts for new technologies and architectural patterns to validate their feasibility and value.
 
Qualifications:
  • Bachelor's or Master's degree in Computer Science, Software Engineering, or a related quantitative field.
  • 10+ years of progressive experience in application development using Java/J2EE, data architecture, data engineering, or cloud platform engineering.
  • Strong experience in API development, the Spring Framework, web services (REST/SOAP), and microservices
  • 5+ years of hands-on experience specifically designing and building large-scale data platforms in a cloud environment.
  • Expertise in designing and implementing data lakes, data warehouses, and data marts in cloud environments.
  • Proficiency in at least one major programming language for data processing (e.g., Python, Scala, Java/J2EE).
  • Deep understanding of distributed data processing frameworks (e.g., Apache Spark, Flink).
  • Experience with various data modeling techniques (dimensional, relational, NoSQL).
  • Solid understanding of DevOps principles, CI/CD pipelines, and infrastructure as code (e.g., Terraform, CloudFormation).
  • Experience with real-time data streaming technologies (e.g., Kafka, Kinesis, Pub/Sub).
  • Strong understanding of data governance, data quality, and metadata management concepts.
  • Excellent communication, presentation, and interpersonal skills with the ability to articulate complex technical concepts to both technical and non-technical audiences.
  • Proven ability to lead and influence technical teams without direct authority.
  • Strong, demonstrable experience with Google Cloud Platform (GCP) big data services (e.g., BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Composer, Cloud Functions). GCP certifications (e.g., Professional Data Engineer, Professional Cloud Architect) are a plus.
  • Strong, demonstrable experience with Amazon Web Services (AWS) big data services (e.g., S3, EMR, Kinesis, Redshift, Glue, Athena, Lambda).
  • GCP/AWS certifications (e.g., Solutions Architect Professional, Big Data Specialty).
  • Experience with data mesh principles and implementing domain-oriented data architectures.
  • Familiarity with other cloud platforms (e.g., Azure) or on-premise data technologies.
  • Experience with containerization technologies (e.g., Docker, Kubernetes).
  • Knowledge of machine learning operationalization (MLOps) principles and platforms.
  • Contributions to open-source big data projects
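The streaming qualifications above (Kafka, Kinesis, Pub/Sub) all involve at-least-once delivery, so consumers must be idempotent. A plain-Python sketch of deduplication by message id (message shapes are hypothetical; in production the seen-id set would live in a keyed state store or be handled by exactly-once sinks, not in memory):

```python
# At-least-once delivery (Kafka, Kinesis, Pub/Sub) means a consumer may see
# the same message twice. This sketch drops redeliveries by message id.

def consume(messages, seen_ids=None):
    """Process each message at most once, skipping redelivered ids."""
    seen_ids = seen_ids if seen_ids is not None else set()
    processed = []
    for msg in messages:
        if msg["id"] in seen_ids:
            continue  # duplicate redelivery: drop it
        seen_ids.add(msg["id"])
        processed.append(msg["payload"])
    return processed

stream = [
    {"id": "m1", "payload": "a"},
    {"id": "m2", "payload": "b"},
    {"id": "m1", "payload": "a"},  # redelivered by the broker
]
print(consume(stream))  # ['a', 'b']
```

Passing the `seen_ids` set across calls lets the same dedup state span multiple poll batches, which is the behavior a real stateful consumer needs.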

Please send resumes to chay@logic-loops.com

Labels: