SUMMARY OF RESPONSIBILITIES
As a Principal DevOps Engineer supporting DataOps, you will play a crucial role in enabling efficient and reliable data pipeline building, data analysis, and data governance. Your primary responsibility will be to ensure smooth data ingestion into the data lake or data warehouse, enabling downstream data consumption by Data Scientists, Analysts, and other stakeholders. You will utilize various AWS services such as MSK/Kafka, S3, EMR or Glue, Lake Formation, Athena, and QuickSight, and manage ingestion from databases such as MS SQL, MySQL, Oracle, and PostgreSQL.
(Apply now at https://my.hiredly.com/jobs/jobs-malaysia-triiio-job-senior-principle-senior-principle-devops-engineer_fully-in-office)
Requirements:
Data Pipeline Build:
-Collaborate with Data Engineers to design, implement, and optimize data pipelines.
-Ensure efficient and reliable data ingestion processes from various sources into the data lake or data warehouse.
-Implement data validation and quality checks to maintain data integrity throughout the pipeline.
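For illustration, a minimal Python sketch of the kind of pre-load validation and quality check described above; the field names (order_id, amount, created_at) and the rules themselves are hypothetical placeholders, not a prescribed standard.

```python
# Illustrative only: a simple pre-load validation pass over ingested records.
# Field names and rules below are hypothetical examples.
from datetime import datetime

REQUIRED_FIELDS = {"order_id", "amount", "created_at"}

def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality issues found in a single record."""
    issues = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        issues.append(f"missing fields: {sorted(missing)}")
    if record.get("amount") is not None and record["amount"] < 0:
        issues.append("negative amount")
    if "created_at" in record:
        try:
            datetime.fromisoformat(record["created_at"])
        except (TypeError, ValueError):
            issues.append("created_at is not ISO-8601")
    return issues

def split_valid_invalid(records: list[dict]):
    """Partition records so only clean rows continue into the lake/warehouse."""
    valid, rejected = [], []
    for rec in records:
        problems = validate_record(rec)
        if problems:
            rejected.append((rec, problems))
        else:
            valid.append(rec)
    return valid, rejected
```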
Data Analysis Support:
-Assist Data Scientists and Analysts in their routine analytical work by providing data access and query support.
-Optimize data storage and retrieval mechanisms to enhance performance and efficiency.
-Collaborate with stakeholders to identify and implement solutions for data analysis and reporting requirements.
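As a sketch of the query support mentioned above, the snippet below runs an ad-hoc Athena query on an analyst's behalf via boto3; the region, database, table, and S3 result location are placeholders, and a production version would add pagination and error handling.

```python
# Illustrative only: submit an Athena query, wait for it, return result rows.
import time
import boto3

athena = boto3.client("athena", region_name="ap-southeast-1")  # placeholder region

def run_athena_query(sql: str, database: str, output_s3: str) -> list[dict]:
    """Submit a query, poll until it finishes, and return result rows as dicts."""
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )["QueryExecutionId"]

    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)
    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query {qid} ended in state {state}")

    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    header = [col["VarCharValue"] for col in rows[0]["Data"]]       # first row holds column names
    return [
        dict(zip(header, [c.get("VarCharValue") for c in row["Data"]]))
        for row in rows[1:]
    ]

# Example (placeholder names):
# run_athena_query("SELECT * FROM sales LIMIT 10", "analytics_db", "s3://example-athena-results/")
```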
Data Governance and Security:
-Utilize AWS Lake Formation and other relevant tools to establish and enforce data governance policies.
-Implement security measures and access controls to protect sensitive data.
-Collaborate with stakeholders to ensure compliance with data privacy regulations.
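A minimal sketch of the kind of Lake Formation access control referred to above: granting an analyst role SELECT on a single governed table through boto3. The role ARN, database, table, and region are placeholders.

```python
# Illustrative only: grant a principal read-only (SELECT) access to one table
# governed by Lake Formation. All identifiers below are placeholders.
import boto3

lakeformation = boto3.client("lakeformation", region_name="ap-southeast-1")

def grant_table_select(principal_arn: str, database: str, table: str) -> None:
    """Give a principal SELECT access to a single table in the data lake catalog."""
    lakeformation.grant_permissions(
        Principal={"DataLakePrincipalIdentifier": principal_arn},
        Resource={"Table": {"DatabaseName": database, "Name": table}},
        Permissions=["SELECT"],
        PermissionsWithGrantOption=[],
    )

# Example (placeholder names):
# grant_table_select("arn:aws:iam::123456789012:role/AnalystRole", "analytics_db", "sales")
```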
Technical Expertise:
-Strong expertise in AWS services such as MSK/Kafka, S3, EMR or Glue, Lake Formation, Athena, and QuickSight.
-Proficient in database management systems such as MS SQL, MySQL, Oracle, and PostgreSQL.
-Familiarity with Change Data Capture (CDC) mechanisms and practices.
-Solid understanding of data pipeline architecture, ETL processes, and data integration techniques.
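To make the CDC mechanisms mentioned above concrete, here is a small sketch that replays a stream of change events (insert/update/delete) onto an in-memory snapshot of a target table keyed by primary key. The event shape is loosely modelled on Debezium/DMS-style records and is hypothetical.

```python
# Illustrative only: apply ordered CDC change events to a {pk: row} snapshot.
def apply_cdc_events(table: dict, events: list[dict]) -> dict:
    """Replay change events onto an in-memory copy of the target table."""
    for event in events:
        op, key = event["op"], event["key"]
        if op in ("insert", "update"):
            table[key] = event["after"]   # upsert the new row image
        elif op == "delete":
            table.pop(key, None)          # drop the row if present
    return table

# Example (hypothetical events):
# snapshot = {1: {"id": 1, "status": "new"}}
# events = [
#     {"op": "update", "key": 1, "after": {"id": 1, "status": "shipped"}},
#     {"op": "insert", "key": 2, "after": {"id": 2, "status": "new"}},
#     {"op": "delete", "key": 1},
# ]
# apply_cdc_events(snapshot, events)  # -> {2: {"id": 2, "status": "new"}}
```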
Automation and Infrastructure as Code (IaC):
-Proficiency in infrastructure automation using tools like Terraform and CloudFormation (an illustrative sketch follows this list).
-Experience in managing and administering Kubernetes on-premises or EKS in AWS.
-Experience with version control systems (e.g., Git) and CI/CD pipelines for automated deployments.
-Understanding of Infrastructure as Code (IaC) principles to maintain reproducible and scalable environments.
-Proficient in programming languages such as Python, Golang, and Java.
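As a sketch of the IaC workflow referenced above, the snippet below drives a minimal CloudFormation deployment from code kept in version control, so the environment stays reproducible. The stack name, region, and template contents are placeholders.

```python
# Illustrative only: deploy a minimal CloudFormation template (a versioned S3
# bucket for raw data) and wait for the stack to finish creating.
import json
import boto3

TEMPLATE = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "RawDataBucket": {
            "Type": "AWS::S3::Bucket",
            "Properties": {"VersioningConfiguration": {"Status": "Enabled"}},
        }
    },
}

def deploy_stack(stack_name: str) -> None:
    """Create the stack from the in-code template and block until it completes."""
    cf = boto3.client("cloudformation", region_name="ap-southeast-1")  # placeholder region
    cf.create_stack(StackName=stack_name, TemplateBody=json.dumps(TEMPLATE))
    cf.get_waiter("stack_create_complete").wait(StackName=stack_name)

# Example (placeholder name): deploy_stack("dataops-raw-bucket-dev")
```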