To design, develop, and maintain scalable data pipelines, integrate diverse data sources, and optimise data systems, ensuring the availability and accuracy of data for business analytics and reporting to support data-driven decision-making within the organisation.
KEY RESPONSIBILITIES:

Design and Maintain Scalable Data Pipelines
Design scalable data pipelines by creating efficient ETL/ELT processes to ingest, process, and store data from various sources, ensuring clean and structured data is available for decision-making and data-driven strategies.
Maintain and optimise data pipelines by regularly reviewing and enhancing pipeline performance to minimise delays and data loss, ensuring up-to-date and accurate data is available for reports, analytics, and operational decision-making.

Develop and Optimise Data Architecture
Develop robust data architecture by designing adaptable data models and schemas that align with reporting and analytics needs, supporting scalability and ensuring data consistency as the organisation's needs grow.
Optimise existing data architecture by evaluating current systems and implementing best practices, improving data storage efficiency and data retrieval speed to enable faster decision-making.

Integrate and Validate Data from Multiple Sources
Integrate data from multiple sources, including databases, APIs, and third-party providers, using data standardisation techniques to ensure consistency across inputs and enable cohesive insights.
Validate and reconcile integrated data by conducting automated audits and addressing discrepancies promptly, ensuring data integrity for accurate reporting and reliable business decision-making.

Manage and Enhance Databases and Data Warehouses
Manage databases and data warehouses by configuring them for optimal performance, including indexing and query optimisation, improving data storage and retrieval efficiency for faster access to business data.
Enhance data system performance by applying maintenance and optimisation techniques such as query tuning and indexing, reducing latency, improving storage efficiency, and minimising downtime.

Collaborate with Business and Data Teams
Collaborate with data visualisation specialists and business analysts to understand specific data needs, aligning pipelines and models with business intelligence goals to enhance insights and improve decision-making across departments.
Work with data governance teams to ensure compliance with data management policies, so that data security, privacy, and governance standards are met and data processes remain efficient and compliant across the organisation.

QUALIFICATIONS:
Degree in Computer Science, Information Technology, Data Science, or a related field. Master's degree is a plus.
1 – 5 years of experience in data engineering or related roles, with hands-on experience in ETL processes.
Proficiency in SQL and experience with relational database management systems (e.g., MySQL, PostgreSQL, Oracle).
Basic programming skills, preferably in Python (experience with other languages, such as Java or Scala, is a plus).
Solid understanding of extract, transform, and load (ETL) processes, and familiarity with ETL tools such as Talend, Microsoft SSIS, and Apache NiFi.
Experience in designing, developing, and maintaining ETL pipelines.
Knowledge of data transformation techniques (e.g., filtering, aggregation, joining datasets).
Understanding of data mapping and data integration techniques.
Understanding of big data technologies (e.g., Hadoop, Spark) and data warehousing solutions (e.g., Amazon Redshift, Google BigQuery).
Awareness of data governance, data quality, and data security principles as they relate to ETL processes.
Familiarity with cloud platforms (e.g., Alibaba Cloud, AWS, Azure, Google Cloud) and cloud-based ETL services (e.g., Talend Cloud, AWS Glue, Azure Data Factory) is advantageous.
Ability to work independently with minimal supervision and meet deadlines.
Excellent problem-solving and analytical skills to troubleshoot ETL issues and optimise data workflows.
Excellent communication and collaboration skills, with the ability to understand business requirements and translate them into effective ETL solutions.
Relevant certifications (e.g., AWS Certified Data Analytics, Microsoft Certified: Azure Data Engineer) are a plus.
Willing to work in the HQ Office at Damansara Heights.
Kindly be advised that due to the high number of applications we receive, only candidates who have been shortlisted will receive notifications regarding the status of their applications. We sincerely appreciate your understanding and patience as we carefully review each application.