
Job content

The health and safety of our employees and candidates is very important to us. Due to the current situation related to the Novel Coronavirus (2019-nCoV), we’re leveraging our digital capabilities to ensure we can continue to recruit top talent at the HSBC Group. As your application progresses, you may be asked to use one of our digital tools to help you through your recruitment journey. If so, one of our Resourcing colleagues will explain how our video-interviewing technology will be used throughout the recruitment process and will be on hand to answer any questions you might have.

Some careers have more impact than others.

If you’re looking for a career where you can make a real impression, join HSBC and discover how valued you’ll be.

HSBC is one of the largest banking and financial services organizations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realize their ambitions.

We are currently seeking an experienced professional to join our team in the role of Lead Architect.

Job Profile:

Principal Responsibilities

  • Strong experience administering big data platforms and their allied toolsets, including distributions from Hortonworks, Cloudera and MapR
  • Experience working in secured environments using technologies such as Kerberos, Knox, Ranger, KMS, encryption zones and server SSL certificates
  • Prior experience of Linux system administration
  • Good experience with Hadoop capacity planning in terms of the HDFS file system and YARN resources
  • Good stakeholder management skills, able to engage in both formal and informal conversations and drive the right decisions
  • Good troubleshooting skills: able to identify the specific service causing an issue, review logs to pinpoint problem entries, and recommend solutions in collaboration with the product vendor
  • Capable of reviewing and accepting or challenging solutions provided by product vendors for platform optimization and root cause analysis tasks
  • Experience performing product upgrades of the core big data platform, expanding clusters and setting up high availability for core services
  • Good knowledge of Hive as a service, HBase, Kafka and Spark
  • Knowledge of basic data pipeline tools such as Sqoop, file ingestion and DistCp, and their optimal usage patterns
  • Knowledge of the various file formats and compression techniques used within HDFS, with the ability to recommend the right patterns based on application use cases
  • Exposure to Amazon Web Services (AWS) and Google Cloud Platform (GCP) services relevant to the big data landscape, their usage patterns and administration
  • Working with application teams to enable their access to the clusters with the right level of access control and logging, using Active Directory (AD) and big data tools
  • Setting up disaster recovery solutions for clusters using platform native tools and custom code depending on the requirements
  • Configuring Java heap and related parameters to ensure all Hadoop services run optimally
  • Working knowledge of Hortonworks DataFlow (HDF) architecture, setup and ongoing administration
  • Significant experience with Linux shell scripting and with Python or Perl scripting
  • Experience with industry-standard version control tools (Git, GitHub, Subversion) and automated deployment and testing tools (Ansible, Jenkins, Bamboo, etc.)
  • Experience on projects run with Agile/DevOps as the product management framework, a good understanding of its principles and the ability to work as part of POD teams
  • Working knowledge of open-source RDBMSs such as MySQL, PostgreSQL and MariaDB
  • Ability to go under the hood of Hadoop services (Ambari, Ranger, etc.) that are backed by a database
  • Identify project issues, communicate them and assist in their resolution
  • Assist in continuous improvement efforts in enhancing project team methodology and performance

Requirements

  • Excellent communication, interpersonal and decision-making skills
  • Implementation and administration of ETL tools such as Ab Initio and Pentaho
  • DevOps tooling
  • Good knowledge of ANSI standard SQL and optimization
  • Working knowledge of batch ingestion and data lake management
  • Python scripting
  • Java knowledge
  • Implementation and administration of change data capture tools such as Attunity and IBM CDC
  • Contributions to Apache open source projects, or a public GitHub repository with a sizeable big data operations and application code base

You’ll achieve more when you join HSBC.

HSBC is an equal opportunity employer committed to building a culture where all employees are valued and respected and all opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. We encourage applications from all suitably qualified persons irrespective of, but not limited to, gender, genetic information, sexual orientation, ethnicity, religion, social status, medical care leave requirements, political affiliation, disability, color, national origin and veteran status. We consider all applications based on merit and suitability for the role.

Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

***Issued By HSBC Software Development Centre***

Qualifications

Any Bachelor's degree


Deadline: 16-05-2024
