Data Engineering Lead ( Engineering Manager - Data)

Job Summary
This position is responsible for the delivery and execution of projects and initiatives under the Future Finance Program (FFP). The FFP is a Group-driven program that aims to modernize Prudential’s Finance function by strengthening Finance and Actuarial capabilities through enhanced data and improved systems. The FFP will be the mechanism to transform the role of Finance within the organization.
 
The role will work closely with the FFP Technical Delivery Lead and the Workstream Product Owners for Data, Actuarial, Accounting, and MIS to successfully deliver the Data-related parts of the FFP project, including the Financial Data Repository (FDR) and application integration. The role involves coordinating day-to-day activities, supporting the prioritization of the requirements backlog, and supporting strategic planning.
 
The role requires experience implementing data platform solutions and integrations for applications within the Finance domain, including the FDR, Oracle ERP (accounting), Coupa (expenses), Workday (payroll), and actuarial systems. The FDR provides the logical “single source of truth” for all financial and actuarial data needs of the FFP. This data architecture must be secure, resilient, accurate, timely, scalable, and cost-efficient.
 
Job Description
Role and Responsibilities
·        Be responsible for the implementation of the FDR and other Data products as defined by business owner and tech product owner.
·        Lead the Data delivery team in planning and implementing the FDR and other Data products within defined timelines and in line with the Prudential Engineering framework and policies, and transition to the Run team post go-live
·        Work in close collaboration with project sponsors and stakeholders to ensure appropriate definition of project goals, deliverables, and acceptance criteria
·        Coordinate interactions with the business platforms/entities
·        Coordinate day-to-day progress, prioritization, and problem solving with the Data implementation team
·        Provide timely and concise updates to Finance GTP (Global Tech Product) Owner and inputs to the governance committees.
·        Support the process of documenting business requirements/user stories from stakeholders.
·        Support prioritization of the requirements catalogue/user story backlog
·        Ensure a good level of technical and functional documentation for knowledge management and communication within Engineering and Finance GTP
Qualifications
·        Experience Level: 10+ years of relevant experience in IT implementation for the Finance domain (MIS, Accounting, and Actuarial)
·        Education Level: Bachelor's degree in Computer Science, Data Engineering, or related field (Master's degree preferred).
·        Personal Attributes: Strong analytical skills, leadership capabilities, effective communication, and problem-solving ability
·        Specific Skills: Deep technical knowledge in data engineering, database management and data warehousing technologies. (e.g. SQL, Python, Spark, Azure stack, Databricks)
·        Strong skills in data modeling and managing distributed computing platforms for data processing, with an understanding of frameworks such as Hadoop, Spark, or Flink
·        Advanced knowledge of SQL, including writing resource-efficient queries
·        Deep expertise with data integration tools, data pipeline orchestration tools, and workflow management systems
Mandatory skills
·        Strong background in data engineering, with experience architecting complex data systems in large-scale environments for the Financial domain (Accounting, MIS, Actuarial)
·        Experience in successful delivery of complex finance data related projects end to end
·        Experience in Agile and DevSecOps delivery, quality practices, techniques, test automation, and tools required for all aspects of data engineering and data platforms
·        Experience with cloud-based technologies, including Microsoft Azure, Databricks, and Delta Lake
·        Experience with data ingestion tools, including CDC, ADF, and Qlik Replicate
·        Experience in adoption of data governance frameworks using Collibra and Unity Catalog
·        Experience with JIRA, Confluence, GitHub, Bitbucket, and other similar developer tools
·        Experience using Power BI for data visualization and analytics, supporting a data-enabled organization
Preferred skills
·        Excellent problem-solving and troubleshooting skills
·        Strong communication and collaboration skills