
Huspy
Data Analytics Engineer
- Permanent
- Dubai, United Arab Emirates
- Experience 2 - 5 yrs
Job expiry date: 12/11/2025
Job overview
Date posted
28/09/2025
Location
Dubai, United Arab Emirates
Salary
AED 15,000 - 20,000 per month
Compensation
Salary only
Experience
2 - 5 yrs
Seniority
Experienced
Qualification
Bachelor's degree
Expiration date
12/11/2025
Job description
As a Data Analytics Engineer at Huspy, you will design, build, and own the transformation layer of the data stack using dbt and Snowflake to deliver clean, reliable, and scalable data models that act as a Single Source of Truth. You will develop automated reconciliation processes to validate raw data flowing from Kafka against core application portals to ensure trust and integrity of every metric. You will enable self-service analytics by creating curated tables and integrating them with Looker, and you will implement robust CI/CD pipelines and GitHub automations (required tests, SQL linting) to validate pull requests prior to merging. You will drive modern data governance by defining strategies for handling PII and managing the full data lifecycle from ingestion to archival, while optimizing Snowflake usage and AWS costs through best practices. Work setup is hybrid in Dubai or Madrid, with remote options in the EU or relocation support to Dubai or Madrid.
Key responsibilities
- Design, build, and own dbt-based transformation layers in Snowflake to create scalable Single Source of Truth data models
- Implement automated reconciliation to verify that raw Kafka data matches core application portals and certify datasets
- Develop curated datasets and connect them to Looker to enable self-service analytics for business users and analysts
- Establish CI/CD pipelines and GitHub automations (tests, SQL linting) to enforce quality gates on pull requests
- Define and implement modern data governance for PII and manage the end-to-end data lifecycle from ingestion to archival
- Analyze and optimize Snowflake usage and AWS costs by applying cost-efficiency best practices
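To make the reconciliation responsibility above concrete, here is a minimal sketch of the idea in Python: comparing daily event counts derived from raw Kafka data against totals reported by a core application portal and flagging disagreements. All names (`reconcile`, `tolerance`, the sample data) are illustrative assumptions, not Huspy's actual implementation.

```python
# Hypothetical sketch of an automated reconciliation check: compare
# per-date counts derived from raw Kafka events against totals reported
# by an application portal, and flag any date whose difference exceeds
# a tolerance. Function and field names are illustrative only.
from collections import Counter

def reconcile(kafka_events, portal_totals, tolerance=0):
    """Return dates where Kafka-derived counts and portal totals disagree."""
    kafka_counts = Counter(event["date"] for event in kafka_events)
    mismatches = {}
    for date, expected in portal_totals.items():
        actual = kafka_counts.get(date, 0)
        if abs(actual - expected) > tolerance:
            mismatches[date] = {"kafka": actual, "portal": expected}
    return mismatches

# Example: one event is missing from the Kafka-derived side on 2025-09-29.
events = [{"date": "2025-09-28"}, {"date": "2025-09-28"}, {"date": "2025-09-29"}]
portal = {"2025-09-28": 2, "2025-09-29": 2}
print(reconcile(events, portal))  # {'2025-09-29': {'kafka': 1, 'portal': 2}}
```

In practice a check like this would run against warehouse tables (e.g. via dbt tests) rather than in-memory lists, and mismatches would block dataset certification.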
Experience & skills
- 4+ years in an Analytics Engineering or data modeling-heavy Data Engineering role
- Expert-level proficiency with dbt (Core or Cloud), including macros, packages, testing, and project structure
- Advanced SQL skills and deep experience with a cloud data warehouse (Snowflake preferred)
- Strong understanding of dimensional modeling concepts (e.g., Kimball methodology) and ability to design clear, comprehensive models
- Proficiency in Python for scripting, automation, and pipeline development
- Hands-on experience with cloud platforms (preferably AWS)
- Excellent communication skills to translate complex technical concepts and convert business requirements into robust data models
- Experience building and maintaining BI layers in Looker (LookML) (preferred)
- Experience working with event streaming data from Kafka (preferred)
- Familiarity with data governance tools (preferred)
- A keen eye for cost optimization in a cloud data warehouse environment
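The CI/CD quality gate mentioned in the responsibilities (required tests and SQL linting on pull requests) could be sketched as a small Python check. This is a toy illustration under assumed rules; real setups would typically use an off-the-shelf linter wired into CI rather than hand-rolled patterns.

```python
# Hypothetical sketch of a pull-request SQL quality gate: a naive lint
# pass that rejects patterns a team might forbid in dbt models. The
# rules here (no SELECT *, one statement per file) are illustrative.
import re

RULES = [
    (re.compile(r"select\s+\*", re.IGNORECASE), "avoid SELECT *"),
    (re.compile(r";\s*\S"), "one statement per file"),
]

def lint_sql(sql):
    """Return a list of rule violations found in the SQL text."""
    return [message for pattern, message in RULES if pattern.search(sql)]

print(lint_sql("SELECT * FROM users"))   # ['avoid SELECT *']
print(lint_sql("select id from users"))  # []
```

In CI, a non-empty violation list would fail the check and block the merge.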