The skills you'll need
You’ll be an experienced programmer and data engineer with a BSc (or equivalent qualification) in Computer Science or Software Engineering. Along with this, you’ll have a proven track record of extracting value and features from large-scale data, and a well-developed understanding of how data is used by, and depends on, wider teams and the end customer.
Ideally, you’ll bring experience in programming languages such as Python, PySpark, Scala and Java, along with experience in StreamSets. Experience with AWS and other cloud technologies, as well as Snowflake, would also be desirable.
You’ll also demonstrate:
- Experience in ETL technical design, automated data quality testing, QA and documentation, along with data warehousing and data modelling capabilities
- Extensive experience with RDBMS, ETL pipelines, Python, Hadoop, SQL and data wrangling
- Good knowledge of modern code development practices, such as version control, code review and CI/CD
- Excellent written and verbal communication skills
- Strong critical thinking and proven problem-solving abilities