

Job Description:

  • Deploy Hadoop big data clusters; commission and decommission nodes, track jobs, monitor services such as ZooKeeper, HBase and SOLR indexing, and configure NameNode HA.

  • Schedule and configure backups and restores; install and upgrade MongoDB databases, Docker UCP and DTR; enable LUKS encryption and Gazzang security on the MongoDB databases on the WALL PROD servers; develop scripts that review logs and raise alerts for long-running queries (a sketch of such a monitoring script appears after this list).

  • Configure Hadoop components such as Hive, BigSQL, Active Directory profiles, clusters, nodes, Ambari servers and Ambari slave agents for effective Hadoop cluster monitoring.

  • Write automation scripts to automate application deployment on big data servers; configure CRON and MAESTRO jobs for job scheduling; perform disaster recovery for MongoDB between the RISC and SISC data centers.

  • Support Hadoop applications to ensure BAU; configure SSL security in Hadoop clusters for big data application servers; monitor, analyze and fine-tune application resource and runtime parameters such as JVM heap size, memory usage and thread pools.

  • Set up synchronization between PROD and DR to ensure high availability via Hadoop distcp (distributed copy), as sketched after this list; set up enterprise online store (EOS) backup and restore using TALENA, and restore from it in case of disaster failures.

  • Monitor and analyze servers and applications using tools such as AppDynamics, Grafana, Prometheus and Splunk; onboard new MongoDB instances and DBA applications onto the Docker infrastructure environment.

  • As part of application batch job monitoring, ensure referential integrity; accomplish data restatements; load large data volumes into the Hadoop Distributed File System in a timely manner.

  • Develop DBA maintenance plans covering gathering strategy, indexing, Guardium monitoring, scheduled automated scripts, Maestro jobs, etc.; troubleshoot connectivity and availability issues such as a server or node being down, the database being down, or network problems between Hadoop servers and application servers such as Informatica.
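
The long-running-query alerting duty mentioned above is not spelled out in the posting; as a minimal sketch, assuming PyMongo and MongoDB's currentOp command, such a monitor could look like the following. The connection string, threshold and print-based "alerting" are placeholder assumptions, not details from the posting.

# Minimal sketch of a long-running-query monitor for MongoDB.
# The URI, threshold and print-based alerting below are illustrative placeholders.
from pymongo import MongoClient

MONGO_URI = "mongodb://localhost:27017"  # hypothetical connection string
THRESHOLD_SECS = 60                      # hypothetical cutoff for "long running"

def find_long_running_ops(client, threshold=THRESHOLD_SECS):
    """Return active operations that have run longer than `threshold` seconds."""
    status = client.admin.command({"currentOp": True, "active": True})
    return [op for op in status.get("inprog", [])
            if op.get("secs_running", 0) >= threshold]

if __name__ == "__main__":
    client = MongoClient(MONGO_URI)
    for op in find_long_running_ops(client):
        # A real deployment would page or e-mail the DBA team instead of printing.
        print(f"ALERT: opid={op.get('opid')} ns={op.get('ns')} "
              f"running {op.get('secs_running')}s")

In practice such a script would be run on a schedule (for example from a CRON or Maestro job, as described above) rather than on demand.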
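
Likewise, the PROD-to-DR synchronization via hadoop distcp could be scripted roughly as below; the NameNode hosts, paths and the -update/-delete flag choice are assumptions for illustration, not requirements stated in the posting.

# Illustrative wrapper that keeps a DR HDFS path in sync with PROD via distcp.
# Hosts and paths are placeholders; flags should match the actual sync policy.
import subprocess
import sys

PROD_PATH = "hdfs://prod-nn:8020/data/warehouse"  # hypothetical source path
DR_PATH = "hdfs://dr-nn:8020/data/warehouse"      # hypothetical DR target path

def run_distcp(src, dst):
    """Run `hadoop distcp -update -delete` so the target mirrors the source."""
    cmd = ["hadoop", "distcp", "-update", "-delete", src, dst]
    return subprocess.run(cmd).returncode

if __name__ == "__main__":
    rc = run_distcp(PROD_PATH, DR_PATH)
    if rc != 0:
        # A non-zero exit would normally raise an alert to the on-call admin.
        sys.exit(f"distcp {PROD_PATH} -> {DR_PATH} failed with exit code {rc}")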


Primary Skills:

  • Hadoop administration, MongoDB, ZooKeeper, HBase

  • Deploying Hadoop big data clusters

Secondary Skills:

  • Configuration of Hadoop components such as Hive, BigSQL and Active Directory
  • Big data servers; writing automation scripts

Deadline: 14-07-2024
