HCLTech is a global technology company, home to 219,000+ people across 54 countries, delivering industry-leading capabilities centered on digital, engineering and cloud, powered by a broad portfolio of technology services and products. We work with clients across all major verticals, providing industry solutions for Financial Services, Manufacturing, Life Sciences and Healthcare, Technology and Services, Telecom and Media, Retail and CPG, and Public Services. Consolidated revenues stand at $13 billion.
Job description:
15+ years of experience in the IT industry (experience in the Banking & Financial Services sector preferred)
Key skills: Databases (RDBMS, NoSQL, open source, proprietary), SQL expertise, StreamSets, Kafka, Big Data Hadoop, Spark, Python, Data on Cloud, Test Management tools (Octane), JIRA
– Project experience in building data solutions, including utilizing and feeding data from a data lake.
– Engineering experience in projects building data patterns around data sourcing, data controls, data on cloud, data privacy, data accessibility, data lineage, and metadata management.
– Engineering expertise in implementing cloud-based solutions (public, private) that require defining new design patterns or utilizing existing ones.
– Experience in integrating application data with an enterprise data lake using StreamSets and Kafka, including batch-mode bulk loading and bulk data consumption.
– Experience in using data from multiple sources (dimensions) and designing and developing a risk-based scoring algorithm to assign a risk score to a customer or a case.
– Experience in identifying data that is key to operational process efficiency and regulatory reporting.
– Experience in data control frameworks and working with regulatory requirements on data retention and data accessibility controls, with a focus on data accuracy, completeness and de-duplication.
– Engineering experience in elastic data search capabilities, with good knowledge of data streaming patterns and tools (e.g. StreamSets, Kafka).
– Exposure to implementing AI and ML patterns using application data for automated, rules-based decisioning in business workflows.
– Experience in Agile methodology, using experimentation and simplification as building blocks.
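To give candidates a sense of the risk-based scoring work mentioned above, here is a minimal, illustrative Python sketch of combining signals from multiple data dimensions into a weighted risk score; the dimension names, weights, and function are hypothetical, not part of any actual HCLTech or client system.

```python
def risk_score(case, weights):
    """Combine per-dimension risk signals (each 0.0-1.0) into one weighted score.

    `case` maps dimension name -> observed risk signal; missing dimensions
    default to 0.0. The result is normalized by the total weight.
    """
    total_weight = sum(weights.values())
    weighted_sum = sum(w * case.get(dim, 0.0) for dim, w in weights.items())
    return round(weighted_sum / total_weight, 3)

# Hypothetical dimensions and weights for a customer/case risk profile.
weights = {"transaction_velocity": 0.5, "geography": 0.3, "account_age": 0.2}
case = {"transaction_velocity": 0.8, "geography": 0.4, "account_age": 0.1}

print(risk_score(case, weights))  # weighted average across the three dimensions
```

In practice the signals would be fed from the data lake (e.g. via StreamSets/Kafka pipelines) and the weights governed by business rules, but the core pattern is this weighted aggregation.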