Hi All, We are hiring for the role of Data Engineer Team Lead with one of our direct clients in Cyberjaya, MY.
As the Data Engineer Team Lead, you will oversee the design, development, and optimization of data pipelines, ensuring seamless integration and data availability across systems. You will lead a team of data engineers, driving the execution of data strategies to support critical business decisions and operations. A key responsibility will be helping to create a unified data pipeline that serves both internal systems and external applications, ensuring data flows smoothly, securely, and in real time across all systems.
Responsibilities
Lead a team of data engineers to design, build, and optimize scalable data pipelines for analytics, reporting, and operational use.
Create and manage a unified data pipeline that integrates data from various sources, ensuring that it can be accessed efficiently by both internal and external systems, such as third-party tools or client-facing platforms.
Leverage low-code/no-code tools such as Fivetran, RudderStack, or similar platforms to accelerate data integrations and improve the efficiency of data workflows.
Collaborate with cross-functional teams to identify data needs and ensure proper data flows between systems, enabling smooth integration and high availability.
Manage the implementation of data governance and quality standards, ensuring data consistency and reliability across the organization.
Architect and oversee real-time and batch data pipelines using data streaming tools like Kafka, Pub/Sub, or equivalent, ensuring low-latency and high-performance data processing.
Provide technical guidance to your team, helping them troubleshoot complex data issues, optimize cost and performance, and follow best practices.
Stay updated on the latest data engineering technologies and best practices, especially in the areas of real-time data processing and low-code/no-code integrations, to ensure continuous improvement of the data infrastructure.
Coordinate with stakeholders to ensure timely delivery of data solutions that meet business needs.
What you have
Bachelor's degree in computer science, data engineering, or a related field.
8+ years of experience in data engineering, with at least 2 years in a leadership role.
Expertise in designing, building, and maintaining data pipelines and data warehouse solutions.
Strong knowledge of cloud platforms such as Google Cloud or AWS, and tools like BigQuery, PostgreSQL, and Airflow.
Proficiency in programming languages such as Python or similar, with experience in ETL/ELT processes.
Hands-on experience with low-code/no-code data integration tools like Fivetran, RudderStack, or similar platforms.
Deep understanding of real-time data streaming and data processing tools such as Kafka, Pub/Sub, or similar.
Experience in building unified data pipelines that serve both internal and external systems, ensuring consistent, secure data flows.
Experience working with both real-time and batch data processing systems.
Strong understanding of data governance, security, and compliance practices.
Excellent communication skills, with the ability to convey technical concepts to non-technical stakeholders and mentor junior team members.
What's good to have
Experience with fintech, trading, or other data-heavy industries.
Knowledge of API-based integrations and working with third-party systems.
Knowledge of additional data streaming technologies and best practices.
Experience with automating data flows using low-code platforms.