The Senior Developer (sometimes referred to as Senior Data Engineer) will be responsible for designing, implementing, and maintaining scalable data solutions for the Finance Data Repository (FDR) within the Finance for Future Programme (FFP).
The position will involve working closely with the IT Delivery Lead and Data Solution Architect on a day-to-day basis.
You will take a leading role in converting FSDs to TSDs, which demands a deep understanding of data architectures and cloud-based data platforms, along with experience leading complex data architecture work.
You may be assigned to one of several data platform domains: Accounting & Reporting, MIS & Analysis, Actuarial, regional UDP, or the FDR platform foundation.
Mandatory skills
Strong background in data engineering, with experience architecting complex data systems in large-scale environments.
Technical Skills: Proficiency in SQL, Azure Databricks (Spark, Delta Lake), Azure Data Factory, and programming languages such as Python or Scala.
Communication and Problem-Solving: Strong problem-solving skills and the ability to communicate complex technical concepts to non-technical stakeholders.
Role and Responsibilities
Design, develop, and deliver data pipelines across various integration points for the global Finance Data Repository (FDR).
Create and maintain scalable data pipelines using PySpark, and ETL workflows using Azure Databricks and Azure Data Factory.
Data Modelling and Architecture: Build and optimize data models and structures to support analytics and business requirements.
Performance Optimization: Monitor, tune, and troubleshoot pipeline performance to ensure efficiency and reliability.
Collaboration: Partner with business analysts and stakeholders to understand data needs and deliver actionable insights.
Governance: Implement data governance practices, ensuring data quality, security, and compliance with regulations.
Documentation: Develop and maintain comprehensive documentation for data pipelines and architecture.
Testing: Apply testing and test-automation practices to data pipelines and related components.
Collaborate with cross-functional teams to understand data requirements and provide technical advice.
Qualifications
Experience Level: Extensive experience in Azure Databricks, Unity Catalog, Azure Data Factory, Azure DevOps, Delta Lake, DevSecOps, and NFRs (Non-Functional Requirements).
Experience in the insurance domain is desirable.
Bachelor's or master's degree in Computer Science, Information Technology, Data Science, or a related field.
Certification in Azure Data Engineer Associate or similar, and experience with additional Azure services (e.g., Azure Synapse Analytics, Azure SQL Database).
Personal Attributes: Strong analytical skills, leadership capabilities, effective communication, and problem-solving ability.