Sr Big Data Engineer with Hadoop, Scala, Spark, Python, Kafka- Remote
Client: Health Care (Remote)
No. of Positions: 2
Experience: 10+ Years
Required Skills:
• Experience with Hadoop ecosystem components: Hive, PySpark, HDFS, Spark, Scala, and streaming (Kinesis, Kafka)
• Strong experience in PySpark and Python development
• Proficient in writing Hive and Impala queries
• Ability to write complex SQL queries
• Experience with AWS Lambda, EMR, clusters, partitions, and data pipelines
Please send resumes to chay@logic-loops.com