
Big Data System Engineer

  • Kraków, małopolskie
  • Specialist
  • 29.05.2019

    The employer reserves the right to close recruitment before the stated deadline.

    Brown Brothers Harriman (BBH) is a privately-held financial institution and has been a thought leader and solutions provider for almost 200 years. We serve the most sophisticated individuals and institutions with award-winning expertise in Investment Management, Private Banking, and Investor Services. Our 5,000 colleagues operate from 18 cities throughout North America, Europe and Asia.

    BBH is committed to diversity, innovation and globalization. Our culture is driven by our goal to provide the best solutions and services to our clients and each other.  Our Partnership structure creates a flat organization that promotes collaboration across all business lines.  We believe that diverse ideas and the ability to come together globally across groups and borders are a competitive advantage. In order for all our teams to excel, members must trust each other and feel comfortable providing honest input from all perspectives.

    This openness sparks innovation and agility, which adds to the entrepreneurial spirit and provides many more career opportunities for our staff. We are a group of high-performing, dedicated and caring professionals who believe that working together is the foundation of superior client service.

    As a BBH professional, your career path is yours to define. We take pride in our ability to retain our best employees. We help them manage their careers by moving top performers to new areas of BBH where their talents will make the greatest contribution.  As soon as you walk through the doors at BBH, we provide you with the tools to help you succeed and grow your career.

    Big Data System Engineer
    Location: Kraków
    Job ID: 42447

    Brown Brothers Harriman is seeking a full-time Big Data System Engineer to help with the configuration, architecture design and development of a brand-new Big Data platform. BBH’s Big Data platform, running Cloudera's distribution, serves as the foundation for a key set of offerings. This is an opportunity to gain both knowledge and experience with the most advanced data technology in a highly secure environment.

    If you are looking to push your career to the next level, take the next step by submitting your resume.

    Key Responsibilities Include:

    • Facilitate the establishment of a secure data lake on BBH’s Big Data infrastructure
    • Provide technical leadership on best practices for chosen architecture and technology (Cloudera)
    • Deliver solutions that fulfill business needs and align with the information vision and strategy of enterprise data lake
    • Design, document and develop ETL logic and data flows to facilitate the easy usage of data assets, both batch and real-time streaming
    • Help establish and maintain Data Governance processes and mechanisms for enterprise data lake
    • Leverage the following (but not limited to) components of Cloudera distribution to achieve project objectives: HDFS, Hadoop, MapReduce, Sqoop, Hive, Impala, Solr, Oozie and Spark
    • Practice consistent coding and unit-testing standards
    • Build, install, upgrade or migrate large-scale big data systems
    • Identify and propose automated approaches for system administration tasks
    • Work with distributed teams and act as a subject matter expert
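    To illustrate the batch-processing model behind several of the components listed above (MapReduce, Spark), here is a minimal sketch of a MapReduce-style word count in plain Python. This is purely illustrative and uses only the standard library; it does not describe BBH's platform or any Cloudera API.

    ```python
    from collections import defaultdict
    from itertools import chain

    # Illustrative sketch of the three MapReduce phases:
    # map each record to (key, value) pairs, shuffle by key, then reduce.

    def map_phase(record):
        """Emit a (word, 1) pair for every word in an input line."""
        return [(word.lower(), 1) for word in record.split()]

    def shuffle(pairs):
        """Group all values by key, as the framework's shuffle step would."""
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(groups):
        """Sum the counts collected for each word."""
        return {key: sum(values) for key, values in groups.items()}

    def word_count(records):
        mapped = chain.from_iterable(map_phase(r) for r in records)
        return reduce_phase(shuffle(mapped))
    ```

    For example, `word_count(["big data", "data lake"])` returns `{"big": 1, "data": 2, "lake": 1}`. In Spark the same pipeline would be expressed with `map` and `reduceByKey` transformations, with the shuffle handled by the engine.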

    Qualifications for your role would include:

    • Bachelor's degree in Computer Science or related technical field, or equivalent experience
    • 6+ years of experience in an IT developer role
    • 4+ years of hands-on project experience as a data wrangler/engineer across design, development, testing, and production implementation of Big Data projects processing large volumes of structured and unstructured data
    • 5+ years of experience in supporting Linux servers
    • Strong hands-on experience with Spark and/or Hadoop/MapReduce
    • Strong experience with programming languages Java, Scala and/or Python
    • Strong experience in designing efficient and robust ETL/ELT workflows and schedulers
    • Experience with Apache Pig scripting and Apache HiveQL
    • Experience with NoSQL databases (HBase/MongoDB)
    • Experience with all aspects of Linux systems including hardware, software and applications
    • Experience in Agile development environment
    • End-to-end development life-cycle support and SDLC processes
    • Ability to understand and adapt to changing business priorities and technology advancements
    • Communication skills – both written and verbal
    • Strong analytical and problem-solving skills
    • Self-driven, ability to work independently and as part of a team

    Nice To Have:

    • Knowledge of Machine Learning libraries and exposure to Data Mining
    • Cloudera Certified Professional (CCP) or Cloudera Certified Administrator (CCA)
    • Good understanding of OS concepts, networking, CPU, memory and storage, process management and resource scheduling

    What We Offer:

    • A collaborative environment that enables you to step outside your role to add value wherever you can
    • Direct access to clients, information and experts across all business areas around the world
    • Opportunities to grow your expertise, take on new challenges, and reinvent yourself—without leaving the firm
    • A culture of inclusion that values each employee’s unique perspective
    • High-quality benefits program emphasizing good health, financial security, and peace of mind
    • Rewarding work with the flexibility to enjoy personal and family experiences at every career stage
    • Volunteer opportunities to give back to your community and help transform the lives of others

    To apply for this role, please send your CV via the Aplikuj button.
