Data Engineer
500–520 per day
10-month contract
After a recent merger of two large business units, my client is embarking on a project to re-engineer and migrate their end-to-end reporting requirements (direct involvement with pipelines and systems) and operational systems (indirect involvement with data). The transition includes a shift from existing ETL processes to a modern data infrastructure, leveraging Data Vault 2.0 modeling, Snowflake for database management, and dbt for data transformation to establish robust data pipelines. In addition, it requires comprehensive cleansing and alignment of existing data sets and data structures according to the new design of the data models, specifically targeting Customer as a data element across Snowflake, IFS, and SAP ECC 6.0.
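By way of illustration only (this sketch is not part of the client's specification), a Data Vault 2.0 hub load built as a dbt model on Snowflake might look like the following; the model name, source definition, and column names here are all hypothetical placeholders:

```sql
-- models/raw_vault/hub_customer.sql -- hypothetical dbt model
-- Loads a Data Vault 2.0 hub: one row per unique business key, with a
-- hashed key, load timestamp, and record source, built incrementally.
{{ config(materialized='incremental', unique_key='customer_hk') }}

select
    md5(upper(trim(kunnr)))  as customer_hk,    -- hash of the business key
    upper(trim(kunnr))       as customer_id,    -- standardized business key
    current_timestamp()      as load_dts,
    'SAP_ECC'                as record_source
from {{ source('sap_ecc', 'kna1') }}  -- assumed source: SAP customer master

{% if is_incremental() %}
-- On incremental runs, only insert business keys not already in the hub
where md5(upper(trim(kunnr))) not in (select customer_hk from {{ this }})
{% endif %}
```

In practice the hashing and incremental logic would often come from a Data Vault package such as AutomateDV (formerly dbtvault), but the pattern above is the core idea.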
Key Responsibilities:
- Data Modeling and Architecture: Design and implement scalable and robust data pipelines and platforms using Data Vault 2.0 methodology to support high-level reporting and operational requirements.
- Data Integration and Pipeline Development: Design, build, test, and maintain architectures such as databases and large-scale processing systems, using Snowflake for warehousing and dbt for data transformations.
- ETL to ELT Transition: Migrate existing ETL processes to modern ELT patterns, ensuring seamless data flow and integration across platforms.
- Data Cleansing and Alignment: Conduct comprehensive data cleansing to unify, correct, and standardize large data sets, ensuring data integrity across Snowflake, IFS, and SAP ECC 6.0 systems according to designs set by Enterprise Architecture teams (an illustrative sketch follows this list).
- Data Governance and Compliance: Recommend data governance policies and procedures to manage the data lifecycle, ensuring compliance with data protection regulations and best practices.
- Performance Optimization: Optimize data retrieval and processing speeds to enhance user interactions with data-driven applications and reports.
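As a purely hypothetical sketch of the cleansing and alignment work above, the query below standardizes and deduplicates customer records landed in Snowflake from SAP ECC and IFS; every table name, column name, and survivorship rule shown is an assumption for illustration, not the client's design:

```sql
-- Unify customer records from two assumed landing tables, then keep one
-- survivor per duplicate group using Snowflake's QUALIFY clause.
with unioned as (
    select
        kunnr               as customer_id,
        upper(trim(name1))  as customer_name,   -- standardize casing/whitespace
        upper(trim(land1))  as country_code,
        'SAP_ECC'           as source_system
    from landing.sap_kna1                        -- assumed landing table
    union all
    select
        customer_id,
        upper(trim(customer_name)),
        upper(trim(country_code)),
        'IFS'
    from landing.ifs_customers                   -- assumed landing table
)
select *
from unioned
qualify row_number() over (
    partition by customer_name, country_code     -- illustrative match key
    order by source_system                       -- illustrative survivorship rule
) = 1;
```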
Required Skills & Experience:
- Data Vault 2.0: Knowledge of or hands-on experience with Data Vault 2.0 modeling techniques; certification in the Data Vault methodology is ideal.
- Proficiency with Snowflake: In-depth knowledge of Snowflake’s data warehousing solutions, including architecture, security, and data storage optimizations.
- Experience with dbt (data build tool): Demonstrated capability using dbt to perform complex data transformations within data pipelines.
- Strong Background in Data Engineering: Minimum of 5 years of experience in data engineering, with a focus on building scalable and high-performance data infrastructures.
- Programming Skills: Proficiency in SQL and experience with scripting languages such as Python for data processing.