Overview of the Position: This role will oversee the company's data ingestion and integration work, including developing a data model, maintaining a data warehouse and analytics environment, and writing scripts for data ingestion, integration, and analysis. This role will work closely and collaboratively with members of the Data & Analytics team to define requirements, mine and analyze data, integrate data from a variety of sources, and deploy high-quality data pipelines in support of the company's analytics needs.
This role is expected to execute the following tasks and initiatives:

- Data Collection and Integration: Design, build, and maintain data pipelines to collect, process, and store large datasets from various sources.
- Data Transformation: Clean, preprocess, and transform raw data into a usable format for analysis and reporting.
- Data Pipeline Development: Design and maintain data pipelines for efficient and automated data processing.
- Database Management: Administer and maintain our databases, ensuring data accuracy, security, and availability.
- Data Warehousing: Build and manage data warehouses for storage of and easy access to business-critical data.
- ETL Processes: Develop and optimize Extract, Transform, Load (ETL) processes for efficient data movement.
- Data Quality Assurance: Implement data quality checks and ensure data accuracy and consistency.
- Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver on data-related projects.

To be successful in this role, you should have:

- A minimum of 3-5 years of direct experience.
- Strong proficiency in SQL and experience with relational databases (e.g., MySQL).
- Proficiency in ETL (Extract, Transform, Load) processes and tools (e.g., Fivetran).
- Experience with big data technologies (e.g., Hadoop, Spark) and data pipeline tools.
- Experience with cloud-based data platforms (e.g., AWS, Google Cloud, Azure), which is a plus.
- A strong understanding of data warehousing concepts and experience with data warehouse solutions (e.g., Snowflake, Databricks).
- Experience with scripting languages (e.g., Python) for data processing and automation.
- Knowledge of data modelling, schema design, and data architecture principles.
- Familiarity with API integration and data collection from external sources.

Work location: Bukit Jalil (near Pavilion Bukit Jalil)