Senior BigData Engineer

Ringier Axel Springer Polska Sp. z o.o.

Remote recruitment

This label means that the entire recruitment process is conducted remotely.


Alma Tower

Kraków

Ringier Axel Springer Tech is the technology hub of Ringier Axel Springer Media AG, a leading media and technology company operating in Central and Eastern Europe and publishing a total of over 100 press titles and 70 online services. In Poland, more than 300 specialists work in 50 teams across three locations, building diverse IT solutions that make up a modern digital publishing platform. Our products reach 40 million users across continents. We help people understand the world and make informed decisions. We are creating the future of digital media.

Senior BigData Engineer

We are looking for a Senior BigData Engineer eager to take on the challenge and unleash the power of data and technology. We are the Analytic Suite Team, responsible for developing enterprise analytics products for publishers. On a daily basis we deal with millions of users, tens of thousands of events per second, billions of records in data warehouses, petabytes of data, and dozens of clients from the media industry around the world. You can join us today! Together we will boost publishing into a new era.

What will you do?

  • Develop data transformation workflows within the AWS public cloud and private cloud ecosystems, using battle-proven open-source projects for large-scale data processing. Your responsibilities will include:
    o   Designing the architecture of new product features
    o   Improving existing products, including migration to the AWS public cloud
    o   Contributing to the product roadmap
  • Build cloud-based applications in a microservice architecture, along with near-real-time processors, which among other things:
    o   Provide a web interface that delivers practical knowledge about users to editorial, sales and analytical tools
    o   Expose APIs
    o   Integrate with external sources
    o   Aggregate large data streams with sub-minute latency
  • Work in a product-oriented, self-organised team with a DevOps mindset, using Scrum/LeSS methodology

We would love you to have:

  • Experience with data pipelines (e.g. Airflow) and stream processing
  • Familiarity with cloud computing on AWS (S3, Athena, Redshift, etc.)
  • Knowledge of AWS at the architect level (e.g. defining infrastructure and its access policies)
  • Strong SQL/NoSQL skills for transforming large data sets
  • Programming skills in Python and/or Java
  • TDD and Clean Code practices, SOLID principles

Extra skills and assets:

  • Knowledge of frontend development (e.g. vanilla JS)
  • Experience with online and offline data processing tools and technologies (e.g. Flink, Storm, Spark)
  • Familiarity with the Hadoop ecosystem (e.g. Hive, Impala, Oozie)

Our offer: 

  • Accelerate your career development and work with cutting-edge technologies in the field of BigData
  • Gain practical knowledge in applying architecture patterns for large-scale systems in the public cloud
  • Join an excellent team of strong team players with an outstanding can-do attitude

And other reasons to join:

  • Agile culture
  • private healthcare for you & your family
  • high autonomy
  • bicycle parking
  • training and conference budget
  • DevOps culture
  • flexible working time
  • tech guilds
  • life insurance
  • lightning talks & tech days
  • no-blame culture
  • cool offices
  • company parties
  • Multisport package & group sport activities
  • great developers to work with

Archived job posting

The employer has closed applications for this offer.