Description: The Data Engineer Level III will be responsible for leading data engineers and for designing and developing data ingestion, storage, and distribution solutions leveraging Microsoft Azure cloud technologies.
Responsibilities:
- Actively participate in requirement elicitation sessions to identify and document business functions' data needs.
- Estimate efforts to develop and refine functional and non-functional requirements.
- Estimate tasks with granularity and accuracy commensurate with the information provided.
- Lead rapid prototyping and POC development efforts, and design enterprise-grade data services.
- Design and develop enterprise dataflow architecture and implementation best practices.
- Design, develop, and deliver cost-efficient data engineering solutions that meet business line and enterprise requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Prepare technical documentation containing data mapping, data profiling, data modeling, data dictionaries, data transformation rules, and data flow/process flow diagrams.
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Participate in iteration and release planning, and deliver work products according to established deadlines.
- Produce high-quality, properly functioning deliverables the first time.
- Take shared ownership of the product and work collaboratively in a small team.
- Excel in a rapid-iteration environment with short turnaround times, and deal positively with high levels of uncertainty, ambiguity, and shifting priorities.
- Configure the cloud infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Azure technologies.
- Ensure standardization of SQL coding practices and adherence to coding standards, change control, and SQL best practices.
- Design and implement ETL packages in accordance with documented templates and standards as defined by the data management group.
- Research, diagnose, and monitor performance bottlenecks.
- Build dimensional representations of data, either in the data repository or in OLAP tools.
- Act as a mentor to development teams to help with distributed use of ETL tools and database engineering best practices.
- Assist development teams by tuning and creating complex queries against very large databases.
- Develop, or assist in the development of, web services that allow database access to be protected by a service layer.
- Develop logical test plans to assist in the validation of data development processes.
- Document development that is to be moved to production, and develop the code deployment process following CI/CD standards.
Qualifications: Bachelor's or foreign equivalent degree in Computer Science, Information Technology, Computer Electronics, Communication Engineering, or a closely related field, plus the following data engineering experience:
- 10+ years of data warehousing experience.
- 10+ years with ETL tools (MS SSIS, Pentaho Data Integration, Talend Open Studio & Integration, Oracle Data Integrator).
- 10+ years of experience with advanced SQL queries, stored procedures, views, triggers, and query performance tuning.
- 5+ years of experience in data engineering using Azure services: ADLS, Azure Data Factory, Azure Functions, Synapse/DW, Azure SQL DB, Event Hub, Azure Stream Analytics, Azure Analysis Services, Azure Data Catalog, Cosmos DB, ML Studio, AI/ML.
- 3+ years of experience with C#, Python, and JSON.
- 3+ years of experience with one reporting tool (MS SSRS, Power BI, Tableau, Logi Analytics).
- 3+ years of experience with DevOps tools such as Azure DevOps, Visual Studio Team Services, GitLab, or Jenkins.
Contact: [email protected]
This job and many more are available through The Judge Group. Find us on the web at www.judge.com