Date: 5 Jun 2025
Req ID: 561
Location: Kuala Lumpur, Federal Territory of Kuala Lumpur, MY, 50470
Entity: Boost Bank Berhad

Data Engineer

Boost Bank Berhad (formerly known as Boost Berhad) received regulatory approval from Bank Negara Malaysia (BNM) and the Ministry of Finance (MOF) to commence operations on 15 January 2024. As the first homegrown digital bank, we are anchored in the mission to pave the way for a banking revolution that serves all Malaysians and makes financial wellbeing a seamless part of life.

We are looking for an experienced Data Engineer to take a key role in the Boost Bank Data Engineering team. The ideal candidate is a highly skilled data pipeline builder and data operations specialist with a passion for designing, optimizing, and building robust data systems tailored for a digital banking environment.

You will be responsible for designing, developing, and maintaining efficient, scalable, and flexible data pipelines to process large volumes of data. You will create tools and solutions to empower analytics and data science teams, enabling them to build and optimize products that position us as an industry leader.

Your role will involve building robust infrastructure to support the seamless extraction, transformation, and loading of data from diverse sources, leveraging big data technologies. Additionally, you will contribute to the integration of technical and application components to align with business requirements. All development stages will adhere to defined methodologies and standards, including thorough documentation and maintenance.

Experience

  • A minimum of 3 years of hands-on experience operating and optimizing distributed, large-scale data storage and analytics solutions.
  • Proficiency in data modelling with the ability to design, implement, and maintain logical and physical data models for structured and unstructured data.
  • Proficiency in object-oriented and function-based scripting languages:
      • Preferred: Python and Bash Shell.
      • Advantage: Java, Scala.
  • Strong experience in ETL pipelining and data warehousing, with a proven track record of processing unstructured or semi-structured data streams and repositories:
      • File processing: CSV, JSON, Excel, XML.
      • AWS services: Glue, S3, RDS.
  • Advanced working knowledge of SQL and experience with relational database systems for query authoring and database management:
      • Preferred: MySQL, PostgreSQL, MS SQL.
      • Advantage: MongoDB, DynamoDB.
  • Familiarity with AWS Cloud Infrastructure, including IAM, S3, Glue, Athena, EC2, and Security Groups.
  • Expertise in big data tools and technologies such as Spark, HDFS, AWS Redshift, AWS S3, AWS Glue, and AWS Athena.
  • Experience in designing, building, and deploying data warehouses and real-time stream-processing systems, preferably using open-source solutions.
  • A solid understanding of computer science principles, including object-oriented design, data structures, and algorithms.
  • Familiarity with professional software engineering practices across the software development life cycle (e.g., coding standards, code reviews, source control management, build processes, testing, and operations).
  • Proficiency in UNIX/Linux environments and solid experience in system commands, package installations, and basic server monitoring.
  • Understanding and implementation of security best practices and data protection measures.
  • Familiarity with the FinTech industry is a plus.

Attributes

  • Proficient in data modelling to support complex analytics, reporting, and business needs.
  • Passionate about improving products to deliver exceptional user experiences.
  • Strong focus on software quality, consistency, maintainability, scalability, performance, and security.

Education

  • Bachelor’s degree or equivalent experience in Computer Science, Engineering, IT, or a related field.
  • Strong foundational knowledge of:
      • Data structures, data modelling, and data architecture best practices.
      • Algorithm design, problem-solving, and complexity analysis.

Apply now »