Urgent Opening for Senior Big Data Developer with GCP & Spark
Abzooba India Infotech Private Limited
Last updated: 06-05-2024
Location: Pune, Maharashtra
Category: R&D
Industry: IT Services & Consulting
Position: Other
Job type: Full Time, Permanent
Experience: 5 - 7 years
Job description
We are only looking for early joiners; candidates with a 60- or 90-day notice period need not apply.
Job Responsibilities:
- Selecting and integrating the Big Data tools and frameworks required to provide the requested capabilities.
- Implementing ingestion and ETL/ELT processes.
- Monitoring performance and advising on any necessary infrastructure changes.
Minimum Job Requirements:
- MUST have hands-on experience implementing ingestion and ETL/ELT processes with Hadoop tools/technologies such as Spark (strong Spark skills required), MapReduce, Hive, and HDFS.
- Proficiency in at least one programming language (Scala, Python, or Java), with at least 4 years of total experience.
- Must have experience with GCP infrastructure such as GCS, gsutil, Cloud Functions, Cloud Pub/Sub, Dataflow, Bigtable, BigQuery, etc.
- Hands-on expertise in, and an excellent understanding of, big data tools such as Sqoop, Spark Streaming, Kafka, and NiFi.
- Must have experience designing, developing, and deploying big data projects into production.
- Basic knowledge of REST APIs.
Roles and Responsibilities
Desired Candidate Profile
We are only looking for early joiners; candidates with a 60- or 90-day notice period need not apply. Candidates who are already serving their notice period are highly preferred.
- Hands-on experience implementing ingestion and ETL/ELT processes with Hadoop tools/technologies such as Spark (strong Spark skills required), MapReduce, Hive, and HDFS.
- Proficiency in at least one programming language (Scala, Python, or Java), with at least 4 years of total experience.
- Must have experience with GCP infrastructure such as Dataflow and BigQuery.
- Basic knowledge of REST APIs.
- Should be familiar with Jenkins, Terraform, etc.
Perks and Benefits: Attractive CTC structure with additional benefits.
Application deadline: 20-06-2024