
Tadawul
DevOps Engineer Expert
- Permanent
- Riyadh, Saudi Arabia
- Experience 5 - 10 yrs
- Urgent
Job expiry date: 05/01/2026
Job overview
- Date posted: 21/11/2025
- Location: Riyadh, Saudi Arabia
- Salary: SAR 20,000 - 30,000 per month
- Compensation: Comprehensive package
- Experience: 5 - 10 yrs
- Seniority: Experienced
- Qualification: Bachelor's degree
- Expiration date: 05/01/2026
Job description
The DevOps Engineer Expert is responsible for designing, building, optimizing, and automating large-scale cloud-based data platforms within a financial or capital markets environment. The role involves building ETL/ELT pipelines, automating continuous data ingestion workflows, managing orchestration platforms such as Airflow, Cloud Composer, and Dataflow, and integrating multiple data sources including APIs, databases, streaming platforms, and IoT systems. The position supports data lake and warehouse environments such as BigQuery; ensures data quality, lineage, and version control across datasets; and designs scalable architectures such as star schema, data vault, and lakehouse.
Additional responsibilities include developing automation scripts in Python or Java, troubleshooting cloud infrastructure, ensuring system reliability and performance across Dev, QA, UAT, and Production environments, implementing CI/CD pipelines, managing infrastructure as code with TypeScript, Terraform, CloudFormation, or Deployment Manager, and maintaining containerized environments using Docker and Kubernetes (EKS, AKS, GKE). The role enforces data governance, compliance with financial regulatory standards, security, access control, cost optimization, validation, logging, and monitoring. It also advises on divisional strategy, risk mitigation, governance tools, future-state process flows, and best-practice workflow changes, while ensuring stakeholder satisfaction and adherence to organizational policies and procedures.
Required skills
Key responsibilities
- Build ETL/ELT pipelines to extract, transform, and load data into warehouses or lakes
- Automate workflows to handle continuous data ingestion
- Collaborate with data engineering teams to automate ingestion, validation, and deployment workflows
- Manage and optimize data pipelines using Airflow, Cloud Composer, or Dataflow
- Ensure data quality, lineage, and version control across datasets
- Support data lake and warehouse environments such as BigQuery
- Design scalable data architectures including star schema, data vault, and lakehouse
- Optimize data storage for performance and cost
- Develop automation scripts using Python or Java
- Troubleshoot cloud infrastructure and data platform issues
- Ensure system reliability, scalability, and performance across all environments
- Integrate multiple data sources including APIs, databases, streaming, and IoT
- Use orchestration tools such as Airflow or Prefect to schedule and monitor workflows
- Design and maintain CI/CD pipelines for applications and data workflows
- Manage infrastructure as code using TypeScript, Terraform, CloudFormation, or Deployment Manager
- Automate build, test, and deployment processes
- Implement containerization and orchestration using Docker and Kubernetes
- Ensure data accuracy, consistency, and regulatory compliance
- Implement validation, logging, and monitoring systems
- Support data governance and compliance with financial regulatory standards
- Implement cloud security, access control, and cost optimization measures
- Advise on divisional strategy to ensure alignment and integration
- Provide expert recommendations and support divisional projects
- Advise on risk management and mitigation tactics
- Recommend governance tools, frameworks, and methodologies
- Advise on future-state process flows and workflow enhancements
- Ensure customer satisfaction through timely and courteous service
- Promote adherence to organizational policies and procedures
Experience & skills
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or Data Science; Master’s preferred
- Professional certifications preferred
- 6+ years of experience in software engineering, DevOps, DataOps, or development/support in financial or capital markets environments
- Experience with relational and non-relational databases
- Experience with data pipeline tools such as Apache Airflow, dbt, Luigi, or Kafka
- Knowledge of data modeling, data warehousing, and distributed systems
- Experience with version control tools such as Git and CI/CD practices
- Knowledge of data governance and data quality frameworks
- Proficiency in Python or Scala for data processing
- Familiarity with cloud data platforms