Position: Associate

Job type: Full-time


Job content

Responsibilities

- Create Scala/Spark jobs for data transformation and aggregation
- Develop and debug Spark/Scala, Java, and Python code
- Develop and apply patches
- Work with core production support personnel in IT and Engineering to automate deployment and operation of the infrastructure; manage, deploy, and configure infrastructure with Ansible or other automation tool sets
- Integrate ML libraries
- Perform proofs of concept on scaling, reliability, performance, and manageability
- Handle deployment during releases
- Help the QA team with production parallel testing and performance testing
- Help the Dev team with POC/ad hoc execution of jobs for debugging and cost analysis
- Test and support infrastructure component changes (such as changing the load balancer to F5)
- Create metrics and measures of utilization and performance
- Perform capacity planning and implementation of new/upgraded hardware and software releases, as well as for the storage infrastructure
- Research and recommend innovative and, where possible, automated approaches for system administration tasks
- Architect next-generation solutions
- Debug infrastructure issues (such as underlying network issues or issues with the nodes)

Qualifications

- 5 to 10 years of total professional experience in Java, Scala, and Python
- 3+ years of experience with Spark/MapReduce in a production environment
- Ability to work well with a global team of highly motivated and skilled personnel
- A deep understanding of Hadoop design principles, cluster connectivity, security, and the factors that affect distributed system performance
- Knowledge of and experience with Kafka, HBase, and Hortonworks (mandatory)
- Knowledge of and experience in integrating ML libraries
- Prior experience with remote monitoring and event handling using Nagios or ELK is an added advantage
- Good collaboration and communication skills, and the ability to participate in an interdisciplinary team
- Strong written communication and documentation experience
- Knowledge of best practices related to Spark security, performance, and disaster recovery
- BE/BTech/BS/BCS/MCS/MCA in Computers or equivalent
- Excellent interpersonal, written, and verbal communication skills

(ref:hirist.com)

Deadline: 20-06-2024

