Data Engineer required to join the expanding BI & Process team of this exciting business, which is currently going through a major technology modernisation programme.
You will help lead ETL solutions for the delivery of BI and data warehousing projects as part of the technology and digital team. They are a data-focused organisation and are putting in new systems to enable them to improve services and continue growth.
- Strong SQL knowledge and experience with relational databases; ideally also familiarity with cloud-based databases such as Snowflake, Redshift or BigQuery
- Good understanding of ETL processes
- Some experience with NoSQL databases such as MongoDB or DocumentDB
- Experience working in a cloud services environment – GCP (Google Cloud Platform), Azure or AWS
- Source control, CI/CD and deployment through CI pipelines
- Python programming – able to perform complex transformations with commercial data
- Familiarity with modern data tools and technologies such as Snowflake, Airflow, Terraform, Amplitude, Azure DevOps, Kafka, Beam, Cloud Dataflow, Redshift, AWS, Google BigQuery, Hive, Fivetran or Stitch
- Experience of data warehouse and BI modelling implementations, such as Kimball's modelling techniques for data marts
- Experience in data modelling for BI implementations across a variety of traditional and cloud-based BI and reporting tools (QlikView, Qlik Sense, Tableau, Chartio, Looker, etc.)
- Knowledge of, and strong interest in working with, modern cloud-based data architectures and platforms built on Google Cloud (GCP and BigQuery) or equivalent cloud-based platforms
- Excellent communication, analytical and problem-solving skills.
Location: Remote – they are based in the South East, but this is a work-from-home role
Salary: Up to £65K plus bonus, plus benefits including training and development, pension and private medical
Interviews are taking place now, so please get your application in as soon as possible.