Hi All, One of our fintech clients has an exciting opportunity for you to work as a Data Integration Engineer.
Please find below the details of the current open position:
Data Integration Engineer
Location - Cyberjaya, Malaysia
Experience - 5-8 yrs.

Data Integration Engineer - JD
As a Data Integration Engineer, you will play a key role in ensuring the seamless integration of data across various systems, both internal and external.
Leveraging low-code/no-code tools and traditional programming methods, you will ensure the smooth transfer of data for analytics, reporting, and operational use.
You will help unify the data layer, making sure the right data is accessible to all teams to drive key business decisions and innovations.
Responsibilities
Design, develop, and maintain data integration pipelines that connect internal systems, third-party tools, and customer applications, ensuring data consistency and quality.
Use low-code/no-code platforms to accelerate the integration process while balancing custom development for complex pipelines.
Collaborate with internal teams to gather requirements and implement data flows that support business needs, including compliance, finance, and customer-facing platforms.
Monitor and optimize data transfer performance, ensuring that integrations are efficient, scalable, and secure.
Assist with production pipeline issues and coordinate with the team to respond to and resolve incidents.
Handle real-time and batch data transfers, working closely with engineering and back-end teams to resolve complex integration challenges.
Develop documentation and governance standards for data integration processes, ensuring maintainability and ease of use.
What you have
Bachelor's degree in computer science, data engineering, or a related field.
5+ years of experience in data integration, ETL development, or data engineering.
Hands-on experience with data integration tools like Talend, Informatica, Fivetran, RudderStack, Zapier, or similar low-code/no-code platforms.
Strong knowledge of SQL, API development, and cloud-based data storage solutions (Google Cloud, AWS).
Familiarity with Python or similar scripting languages.
Experience in working with real-time data systems and streaming technologies like Kafka, Pub/Sub, or equivalent.
Understanding of data governance, privacy, and security principles, especially in regulated environments (e.g., GDPR compliance).
Excellent troubleshooting and debugging skills, with a focus on optimizing data flows for performance and reliability.
What's good to have
Experience with fintech, trading, or other data-heavy industries.
Knowledge of API-based integrations and working with third-party systems.
Familiarity with data modeling techniques and data warehouse architectures.
Ability to work in a fast-paced, cross-functional team environment.