Job Description
1. Role: Big Data Engineer
2. Required Technical Skill Set: Big Data (Hadoop, Hive, Scala, SQL)
3. Number of Requirements: 2
4. Desired Experience Range: 5-12 years
5. Location of Requirement: Kuala Lumpur, Malaysia
Desired Competencies (Technical/Behavioral Competency)
Must-Have
Hands-on experience with Hadoop, Hive SQL, Spark, Scala, and Big Data ecosystem tools.
Ability to develop and tune queries and improve their performance (a minimal sketch follows this list).
Solid understanding of object-oriented programming and HDFS concepts.
Responsible for delivering code, setting up environments and connectivity, and deploying code to production after testing.
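For illustration only (not part of the original posting): a minimal sketch of the kind of Hive SQL query work described above, run through Spark with Hive support. The table and column names (transactions, txn_date) are hypothetical.

import org.apache.spark.sql.SparkSession

object HiveQuerySketch {
  def main(args: Array[String]): Unit = {
    // Assumes a reachable Hive metastore; table and column names are hypothetical.
    val spark = SparkSession.builder()
      .appName("hive-query-sketch")
      .enableHiveSupport()
      .getOrCreate()

    // A query a developer might write and then tune, e.g. via partition pruning on txn_date.
    val daily = spark.sql(
      """SELECT txn_date, COUNT(*) AS txn_count
        |FROM transactions
        |WHERE txn_date >= '2024-01-01'
        |GROUP BY txn_date""".stripMargin)

    daily.show()
    spark.stop()
  }
}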
Good-to-Have
Knowledge of DWH/Data Lake.
Conceptual and creative problem-solving skills; ability to work with ambiguity and learn new concepts quickly.
Experience working with teams in a complex organization with multiple reporting lines.
Good knowledge of DevOps and Agile development frameworks.
Responsibilities / Expectations from the Role
Work as a developer on the Cloudera Hadoop platform.
Work on Hadoop, Hive SQL, Scala, and Big Data ecosystem tools.
Strong functional and technical knowledge to meet requirements; familiarity with banking terminology.
Strong knowledge of DevOps and Agile development frameworks.
Create Scala/Spark jobs for data transformation and aggregation (see the sketch after this list).
Experience with stream-processing systems such as Spark Streaming.
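As an illustration only (not part of the original posting), a minimal sketch of a Scala/Spark batch job for the transformation and aggregation work described above; the input path, schema, and output location are hypothetical assumptions.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DailyAggregationJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-aggregation-job")
      .getOrCreate()

    // Hypothetical source data with columns: account_id, amount, status, txn_ts.
    val txns = spark.read.parquet("hdfs:///data/raw/transactions")

    // Transform: keep posted transactions, derive a date column,
    // then aggregate daily totals and counts per account.
    val dailyTotals = txns
      .filter(col("status") === "POSTED")
      .withColumn("txn_date", to_date(col("txn_ts")))
      .groupBy(col("account_id"), col("txn_date"))
      .agg(sum("amount").alias("total_amount"), count(lit(1)).alias("txn_count"))

    // Write partitioned output for downstream consumption.
    dailyTotals.write.mode("overwrite")
      .partitionBy("txn_date")
      .parquet("hdfs:///data/curated/daily_account_totals")

    spark.stop()
  }
}

A streaming variant of the same logic would typically read with spark.readStream and write to a streaming sink via Structured Streaming.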
Job Types: Full-time, Contract
Summary of Role Requirements
Looking for candidates available to work:
Monday: Afternoon, Evening, Morning
Tuesday: Afternoon, Evening, Morning
Wednesday: Afternoon, Evening, Morning
Thursday: Afternoon, Evening, Morning
Friday: Afternoon, Evening, Morning
Saturday: Afternoon, Evening, Morning
Sunday: Afternoon, Evening, Morning
More than 4 years of relevant work experience required for this role.
Working rights required for this role.
Expected salary: RM8,000 per month.