The data engineer will extract, transform, load, and visualize critical data. They will build data pipelines and ensure their accuracy, enabling faster data-driven analytics. This entry-level professional will contribute in an agile environment, partnering with business teams, software application teams, and data scientists to understand their data requirements and ensure all teams have reliable data that drives effective business analytics. The successful candidate will be a self-starter who is comfortable with ambiguity, has strong attention to detail, and enjoys working with large-scale data.
- Build data solutions from design phase to completion and ensure they meet specific requirements
- Build data pipelines, engineer complex new data sets, assess data quality, perform data engineering or ETL for data marts, visualizations, or data science models
- Query large data sets for ad hoc exploration, analysis, or testing
- Build a deep understanding of Spark (Databricks) and Python (PySpark) to support your technical design solutions
- Build a deep understanding of the Azure Cloud Platform and stay updated on new capabilities, positioning yourself as a subject matter expert
- Build a deep understanding of Azure Data Factory, Databricks, ADLS, and Synapse (SQL Data Warehouse) so you can identify and recommend improvements to designs and strategies across the Azure technology stack
- Support Agile Scrum teams with planning and scoping technical analytic solutions, including time estimates for development and testing
- Participate in Agile scrum teams delivering data ingestion, validation, engineering, modelling, visualization, and analytics solutions
- Engage with Technical Architects and technical staff to determine the most appropriate technical strategy and designs to meet business needs
- Liaise with data architecture, data engineers, and other technical contracting resources to work through technical dependencies, issues, and risks
- Engage with business stakeholders to understand required capabilities, integrating business knowledge with technical solutions
- Communicate complex technical information to business customers and project teams in an effective and concise manner
- Adhere to application security procedures, change control guidelines, coding structures, and Sarbanes-Oxley IT and business requirements
- Perform other duties as assigned
- Comply with all policies and standards
Minimum Education and Experience Required:
- BA or BS degree required. Advanced degree in computer science or analytics preferred.
- Strong knowledge of programming languages, including SQL, Python, R, or Scala, preferably in a Spark environment
- Strong written and oral communication skills with proven ability to communicate with technical and non-technical partners
- Strong organizational skills and understanding of agile project management methodology
- Ability to produce high-quality work under pressure and within specific deadlines
Preferred but not required:
- Experience in delivering large scale Azure projects.
- Experience delivering data-centric solutions utilizing Azure Data Factory, Databricks (Python/PySpark), and Azure Data Lake.
- Experience writing complex T-SQL queries, procedures, and views.
- Exposure to Delta Lake, Azure Synapse (SQL Clusters) and Azure Analysis Services desired.
- Experience migrating large volumes of data from on-premises and cloud infrastructure to Azure using standard Azure automation tools.
- Experience with Azure DevOps and Git Repositories.
Only applicants requiring reasonable accommodation for any part of the application and hiring process should contact us directly: