GCP Cloud Architect - Atlanta, GA (Atlanta-local candidates only) - Status: ACTIVE - Currently accepting resumes
Title: GCP Cloud Architect
Location: Alpharetta, GA (Day 1 onsite role)
Duration: Long-term
Client: Equifax
Visa Type: USC only
Responsibilities:
- Architectural Leadership: Lead the design and architecture of highly scalable, resilient, and cost-effective data solutions leveraging a diverse set of big data and cloud-native services in GCP and AWS.
- Technical Guidance & Mentorship: Provide expert architectural guidance, technical leadership, and mentorship to multiple engineering teams, ensuring adherence to architectural principles, best practices, and design patterns.
- Platform Development & Evolution: Drive the selection, implementation, and continuous improvement of core data platform components, tools, and frameworks.
- Cloud-Native Expertise: Leverage deep understanding of GCP and AWS data services (e.g., BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, S3, EMR, Kinesis, Redshift, Glue, Athena) to design optimal solutions.
- Data Governance & Security: Architect and implement robust data governance, security, privacy, and compliance measures within the data platform, ensuring data integrity and regulatory adherence.
- Performance & Optimization: Identify and address performance bottlenecks, optimize data pipelines, and ensure efficient resource utilization across cloud environments.
- Innovation & Research: Stay abreast of emerging big data and cloud technologies, evaluate their potential impact, and recommend their adoption where appropriate.
- Cross-Functional Collaboration: Collaborate closely with developers, test engineers, data scientists, data engineers, analytics teams, product managers, and other stakeholders to understand data requirements and translate them into architectural designs.
- Documentation & Standards: Develop and maintain comprehensive architectural documentation, standards, and guidelines for data platform development.
- Proof-of-Concepts (POCs): Lead and execute proof-of-concepts for new technologies and architectural patterns to validate their feasibility and value.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related quantitative field.
- 10+ years of progressive experience in application development using Java/J2EE, data architecture, data engineering, or cloud platform engineering.
- Strong experience in API development, the Spring Framework, web services (REST/SOAP), and microservices.
- 5+ years of hands-on experience specifically designing and building large-scale data platforms in a cloud environment.
- Expertise in designing and implementing data lakes, data warehouses, and data marts in cloud environments.
- Proficiency in at least one major programming language for data processing (e.g., Python, Scala, Java/J2EE).
- Deep understanding of distributed data processing frameworks (e.g., Apache Spark, Flink).
- Experience with various data modeling techniques (dimensional, relational, NoSQL).
- Solid understanding of DevOps principles, CI/CD pipelines, and infrastructure as code (e.g., Terraform, CloudFormation).
- Experience with real-time data streaming technologies (e.g., Kafka, Kinesis, Pub/Sub).
- Strong understanding of data governance, data quality, and metadata management concepts.
- Excellent communication, presentation, and interpersonal skills with the ability to articulate complex technical concepts to both technical and non-technical audiences.
- Proven ability to lead and influence technical teams without direct authority.
- Strong, demonstrable experience with Google Cloud Platform (GCP) big data services (e.g., BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Composer, Cloud Functions).
- Strong, demonstrable experience with Amazon Web Services (AWS) big data services (e.g., S3, EMR, Kinesis, Redshift, Glue, Athena, Lambda).
- GCP/AWS certifications (e.g., Professional Cloud Architect, Professional Data Engineer, Solutions Architect Professional, Big Data Specialty).
- Experience with data mesh principles and implementing domain-oriented data architectures.
- Familiarity with other cloud platforms (e.g., Azure) or on-premise data technologies.
- Experience with containerization technologies (e.g., Docker, Kubernetes).
- Knowledge of machine learning operationalization (MLOps) principles and platforms.
- Contributions to open-source big data projects.
Please send resumes to chay@logic-loops.com