
(GDT) Hadoop Administrator

HSBC Service Delivery (Polska) Sp. z o.o.

  • Kraków, Lesser Poland
  • offer expired 2 years ago
  • contract of employment
  • full-time
  • specialist (Mid / Regular)
  • remote recruitment
  • We invite workers from Ukraine

The employer is open to employing citizens of Ukraine

HSBC Service Delivery (Polska) Sp. z o.o.

Kapelanka 42a

Dębniki

Kraków

Technologies we use

Expected

  • SAFe

  • Scrum

  • Kanban

  • Lean

Your responsibilities

  • Good experience administering a big data platform and its allied toolset; the big data platform software is from Cloudera
  • Working knowledge of Hortonworks DataFlow (HDF) architecture, setup and ongoing administration
  • Experience working in secured environments using a variety of technologies such as Kerberos, Knox, Ranger, KMS, encryption zones and server SSL certificates
  • Prior experience of Linux system administration
  • Good experience of Hadoop capacity planning in terms of HDFS file system, Yarn resources
  • Good stakeholder management skills: able to engage in formal and casual conversations and drive the right decisions
  • Good troubleshooting skills: able to identify the specific service causing issues, review logs, identify problem entries and recommend solutions while working with the product vendor
  • Capable of reviewing and accepting or challenging solutions provided by product vendors for platform optimization and root cause analysis tasks
  • Experience in doing product upgrades of the core big data platform, cluster expansion, setting up High availability for core services
  • Good knowledge of Hive as a service, Hbase, Kafka, Spark
  • Knowledge of basic data pipeline tools such as Sqoop, file ingestion and DistCp, and their optimal usage patterns
  • Knowledge of the various file formats and compression techniques used within HDFS and ability to recommend right patterns based on application use cases
  • Exposure to Amazon Web services (AWS) and Google cloud platform (GCP) services relevant to big data landscape, their usage patterns and administration
  • Working with application teams and enabling their access to the clusters with the right level of access control and logging using Active directory (AD) and big data tools
  • Setting up disaster recovery solutions for clusters using platform native tools and custom code depending on the requirements
  • Configuring Java heap and allied parameters to ensure all Hadoop services are running at their optimal best
  • Significant experience in Linux shell scripting and in Python or Perl scripting
  • Experience with industry-standard version control tools (Git, GitHub, Subversion) and automated deployment and testing tools (Ansible, Jenkins, Bamboo, etc.)
  • Has worked on projects with Agile/DevOps as the product management framework; good understanding of the principles and ability to work as part of pod teams
  • Working knowledge of open-source RDBMSs: MySQL, Postgres, MariaDB
  • Ability to go under the hood of Hadoop services (Ambari, Ranger, etc.) that use a DB as the driver
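As an illustration of the HDFS capacity-planning duty listed above, the sketch below estimates raw disk requirements from a logical data volume. The formula and defaults (replication factor 3, 25% temp-space overhead, 20% free headroom) are common rules of thumb assumed for illustration, not figures from this role.

```python
def hdfs_raw_capacity_tb(logical_data_tb: float,
                         replication: int = 3,
                         temp_overhead: float = 0.25,
                         reserve: float = 0.20) -> float:
    """Rough raw-disk estimate for a given amount of logical HDFS data.

    replication   -- HDFS block replication factor (3 is the usual default)
    temp_overhead -- extra fraction for intermediate/temporary job data
    reserve       -- fraction of raw capacity kept free as headroom
    """
    replicated = logical_data_tb * replication        # every block stored N times
    with_temp = replicated * (1 + temp_overhead)      # room for shuffle/temp files
    return with_temp / (1 - reserve)                  # keep some capacity unused

if __name__ == "__main__":
    # 100 TB of logical data -> raw disk requirement in TB
    print(round(hdfs_raw_capacity_tb(100), 1))
```

With these assumptions, 100 TB of logical data works out to roughly 469 TB of raw disk, which is why capacity planning is called out separately from simple storage sizing.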

Our requirements

  • 4+ years of professional software administration experience, including at least 2 years in a big data environment
  • Agile and SDLC experience: at least 2 years

Optional

  • Ab Initio and Pentaho ETL tool implementation and administration
  • Good knowledge of ANSI standard SQL and optimization
  • Working knowledge of ingestion batch and data lake management
  • Change data capture tools such as Attunity and IBM CDC: implementation and administration
  • Contributions to Apache open source, or a public GitHub repository with a sizeable big data operations and application code base

Benefits

  • sharing the costs of sports activities
  • private medical care
  • sharing the costs of foreign language classes
  • sharing the costs of professional training & courses
  • life insurance
  • remote work opportunities
  • flexible working time
  • integration events
  • corporate sports team
  • doctor’s duty hours in the office
  • retirement pension plan
  • corporate library
  • no dress code
  • video games at work
  • coffee / tea
  • parking space for employees
  • leisure zone
  • extra social benefits
  • employee referral program
  • opportunity to obtain permits and licenses
  • charity initiatives
  • family picnics
  • extra leave

Recruitment stages

1. Phone interview

2. Online assessment

3. Zoom interview

4. Welcome to HSBC!

HSBC Service Delivery (Polska) Sp. z o.o.

HSBC is one of the world’s largest banking and financial services organisations. Our global businesses serve more than 40 million customers worldwide through a network that covers 64 countries and territories.

HSBC Service Delivery (Polska) Sp. z o.o. is HSBC's global finance, operations, risk and technology centre. We use our unique expertise and capabilities to provide specialised services – our people range from technologists transforming the banking experience to operations professionals managing 1.7 trillion payments a year.

Our Purpose – Opening up a world of opportunity – explains why we exist. We are bringing together the people, ideas and capital that nurture progress and growth, helping to create a better world – for our customers, our people, our investors, our communities and the planet we all share.

The employer has stopped collecting applications for this offer

Employer's current offers
