Quantitative Data Engineer – London – Hybrid – Up to 90,000 per annum
DataBuzz are recruiting a Quantitative Data Engineer on behalf of a prestigious organisation.
As a Quantitative Data Engineer, you will support the Product team and build out a comprehensive information-scraping and analysis tool to measure business disruption, which will eventually develop into a distinguished product feature.
As a Quantitative Data Engineer, you will be working across a technology stack that includes Python, C++, PostgreSQL / PostGIS, and NoSQL databases, as well as Amazon Web Services (AWS) infrastructure technologies.
Responsibilities:
- Utilise Natural Language Processing, Regular Expressions, and similar techniques to extract physical asset addresses from company websites, and assign those locations/assets to the relevant type and industry categories.
- Build a pipeline that runs the solution at scale, in parallel, on an Amazon Web Services (AWS) backend.
- Undertake initial research for, and rapidly prototype, a Proof of Concept (PoC) for the financial impact of business disruption associated with physical asset addresses.
- Create a geocoding module using Mapbox, Google Places, or alternative APIs to source information, and conduct basic spatial analytics on the results.
- Work with the team's Front End Developers to deliver a visualised/User Interface version of the Business Disruption module, integrated into the Spectra analytics platform, including product requirements and UAT.
- Continuously document both the technical solution and the user guides for the Spectra Business Disruption module, and take ownership of the solution.
- Research ways to improve and optimise the existing approach.
- Explain the complexities of the end product to non-technical audiences, including internal and external stakeholders.
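By way of illustration of the first responsibility above, a regex-based pass over scraped page text might look like the following. This is a minimal sketch only; the pattern, function name, and sample text are illustrative assumptions, not the organisation's actual approach, and a production extractor would combine this with NLP techniques as the role describes.

```python
import re

# Illustrative pattern for a simple UK-style street address: a house
# number, a street name, then a UK postcode. Real-world extraction
# would need a far more robust pattern (or an NLP model) than this.
UK_ADDRESS_RE = re.compile(
    r"\b\d+\s+[A-Z][A-Za-z ]+?\s+(?:Street|Road|Lane|Avenue|Way)\b"
    r"[,\s]+[A-Z]{1,2}\d{1,2}[A-Z]?\s*\d[A-Z]{2}\b"
)

def extract_addresses(page_text: str) -> list[str]:
    """Return candidate physical-asset addresses found in scraped page text."""
    return [m.group(0) for m in UK_ADDRESS_RE.finditer(page_text)]

sample = "Visit our warehouse at 12 Example Road, EC1A 1BB or call us."
print(extract_addresses(sample))  # → ['12 Example Road, EC1A 1BB']
```

Extracted candidates would then be categorised by asset type and industry in a later step.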
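Similarly, the "basic spatial analytics" mentioned for the geocoding module could start with great-circle distances between geocoded points. The haversine formula below is a standard, self-contained sketch of that kind of calculation (in practice PostGIS or the geocoding API itself may provide this); the coordinates used are just example values.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in kilometres between two WGS84 lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))  # 6371 km ≈ mean Earth radius

# Example: London to Paris, roughly 340-345 km.
print(haversine_km(51.5074, -0.1278, 48.8566, 2.3522))
```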
Experience & Skills:
- Experience of software engineering, infrastructure, and automation in an Agile environment.
- Demonstrable capabilities in compiled languages (e.g. C/C++/Java/C#) and in scripting languages (e.g. Python, R).
- Expertise working in public cloud environments (AWS, GCP, Azure) from both a services and an infrastructure perspective.
- Excellent analytical and quantitative skills, with a strong knowledge of data modelling and data analysis.
- Capacity for independent and creative thinking / writing on research and statistical problems.
- Ability to work with data from multiple sources, and experience working with large data sets.
- Strong experience in process improvement and a proven track record in automation.
- Ability to understand complex systems quickly and to improve processes proactively.
- Ability to manage multiple stakeholders, multitask effectively, and prioritise competing workloads.
- A strong understanding of application profiling and observability tools (e.g. Prometheus/Grafana) and of large-scale distributed systems would be beneficial.
- Knowledge of networking and web protocols would be a plus.
- Comfortable with language, framework, and platform selection processes for data tools and software.
- Excellent project management and organisation skills.