
Tabby
Senior Data Engineer
- Permanent
- Riyadh, Saudi Arabia
- Experience 2 - 5 yrs
Job expiry date: 29/05/2026
Job overview
Date posted
14/04/2026
Location
Riyadh, Saudi Arabia
Salary
SAR 20,000 - 30,000 per month
Compensation
Comprehensive package
Experience
2 - 5 yrs
Seniority
Senior & Lead
Qualification
Bachelor's degree
Expiration date
29/05/2026
Job description
Senior Data Engineer role at Tabby, a fast-growing fintech platform operating in Saudi Arabia and the GCC region. Tabby serves over 15 million users and 40,000+ global merchants, including Amazon, Noon, IKEA, and SHEIN, enabling buy-now-pay-later financial services with no interest or fees. The company processes over $10 billion in annual transaction volume, is valued at $4.5 billion, and is backed by over $1 billion in funding.

The role focuses on building and evolving a modern corporate data platform that integrates a Data Warehouse (DWH), streaming systems, and internal data services such as a Feature Store to support analytics, machine learning, and operational workloads. Responsibilities include designing scalable data architectures, implementing ELT/ETL pipelines, managing distributed data processing systems, ensuring data quality and governance, and supporting both batch and real-time streaming pipelines.

The role requires strong expertise in data architecture patterns including Kimball, Inmon, Medallion, and Data Mesh, as well as deep knowledge of data modeling techniques such as SCD types, normalization, star schema, and Data Vault. The engineer will work with cloud-native infrastructure on Google Cloud Platform and modern data stack tools such as Airflow, dbt, BigQuery, ClickHouse, PostgreSQL, Docker, and GitLab CI/CD. The role also involves integrating event-driven systems using tools like Debezium and message brokers, optimizing performance and cost in distributed systems, contributing to system design decisions, and collaborating with analytics, risk, and ML teams to deliver high-quality datasets and data services across the organization.
Key responsibilities
- Design, build, and maintain Tabby’s corporate data warehouse (DWH) to support business users, analysts, and machine learning engineers
- Develop and evolve internal data services such as Feature Store and other data products for analytics, ML, and operational use cases
- Build and operate scalable ELT/ETL pipelines for batch processing and streaming data workloads across distributed systems
- Design and implement high-load data processing and storage systems ensuring scalability, reliability, and performance optimization
- Integrate and maintain event-driven data pipelines using tools such as Debezium and message brokers/queues
- Optimize cloud infrastructure and data workflows in Google Cloud Platform with focus on cost efficiency and scalability
- Collaborate with analytics, risk, and machine learning teams to gather requirements and deliver reliable datasets and services
- Contribute to data platform architecture decisions including tooling, system design, governance, and best practices
Experience & skills
- 3–5+ years of experience as a Data Engineer, ML Engineer, or Backend Engineer working with data-intensive systems
- Strong proficiency in Python and advanced SQL with best practices for scalable data processing
- Hands-on experience with data warehousing systems and large-scale data modeling concepts (Kimball, Inmon, Medallion, Data Mesh, star schema, Data Vault)
- Experience with modern data stack tools including Airflow, dbt, BigQuery, ClickHouse, PostgreSQL, Docker, GitLab CI/CD, and Google Cloud Platform
- Experience building and maintaining ETL/ELT pipelines for both batch and streaming data architectures
- Strong understanding of distributed systems, system design principles, and scalable data architectures
- Experience with event-driven systems and messaging technologies such as Debezium and message brokers/queues
- Ability to work independently in a remote or distributed engineering environment with strong English communication skills