
Data Engineer II

20 days ago · 2026/08/08
Other Business Support Services

Job description

Company Description

Since launching in Kuwait in 2004, talabat, the leading on-demand food and Q-commerce app for everyday deliveries, has been offering convenience and reliability to its customers. talabat’s local roots run deep, offering a real understanding of the needs of the communities we serve in eight countries across the region.

We harness innovative technology and knowledge to simplify everyday life for our customers, optimize operations for our restaurants and local shops, and provide our riders with reliable earning opportunities daily.

Here at talabat, we are building a high-performance culture through an engaged workforce and growing talent density. We're all about keeping it real and making a difference. Our 6,000+ strong talabaty are on an awesome mission to spread positive vibes. We are proud to be a multiple Great Place to Work award winner.



Job Description

About the Role


We’re looking for a Data Engineer who’s passionate about building reliable, scalable, and cost-efficient data systems. You’ll work with a modern stack, including Kafka, Google Cloud Platform (GCP), and AWS, to design and maintain the pipelines that power analytics, machine learning, and product insights.


This is an ideal role for someone with solid foundational skills in data engineering who’s ready to deepen their expertise, take ownership of workflows, and collaborate across teams.


If you don’t know every tool in our stack yet, that’s okay. We value curiosity, problem-solving, and a willingness to learn just as much as existing technical skills.


What's On Your Plate? 


  • Design, build, and maintain data pipelines and workflows for batch and streaming use cases.


  • Work with Kafka to manage real-time data ingestion and event-driven architectures.


  • Leverage GCP and AWS services for storage, processing, and orchestration (e.g., BigQuery, Dataflow, S3, Lambda).


  • Orchestrate workflows using tools like Airflow or similar schedulers.


  • Ensure data quality and reliability through monitoring, alerting, and automated validation.


  • Collaborate with analysts, data scientists, and product teams to understand requirements and deliver data solutions that drive business impact.


  • Optimize for cost and performance across cloud environments.


  • Participate in code reviews, documentation, and knowledge sharing to raise the bar for the team.
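To make the "automated validation" responsibility above concrete, here is a minimal, library-free sketch of the kind of batch quality checks a pipeline might run before publishing data. The event fields (`order_id`, `amount`) and check names are illustrative assumptions, not talabat's actual schema.

```python
# Illustrative data-quality checks for a batch of order events.
# Field names and thresholds are hypothetical examples.
from dataclasses import dataclass


@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str


def validate_batch(rows: list[dict]) -> list[CheckResult]:
    """Run simple quality checks on a batch of records."""
    results = []

    # Check 1: the batch is not empty; an empty extract often
    # signals an upstream failure rather than a quiet day.
    results.append(CheckResult(
        name="non_empty",
        passed=len(rows) > 0,
        detail=f"{len(rows)} rows",
    ))

    # Check 2: required key fields are never null.
    missing = [r for r in rows if r.get("order_id") is None]
    results.append(CheckResult(
        name="order_id_not_null",
        passed=not missing,
        detail=f"{len(missing)} rows missing order_id",
    ))

    # Check 3: monetary amounts are non-negative.
    bad = [r for r in rows if r.get("amount", 0) < 0]
    results.append(CheckResult(
        name="amount_non_negative",
        passed=not bad,
        detail=f"{len(bad)} rows with negative amount",
    ))
    return results


batch = [
    {"order_id": "A1", "amount": 12.5},
    {"order_id": None, "amount": 3.0},
    {"order_id": "A3", "amount": -1.0},
]
report = validate_batch(batch)
failed = [c.name for c in report if not c.passed]
```

In a production pipeline these results would feed the monitoring and alerting mentioned above (e.g. failing the Airflow task or paging on a failed check) rather than just being returned.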



Our Tech Stack


  • Data Ingestion & Streaming: Apache Kafka, Kafka Connect


  • Cloud Platforms: Google Cloud Platform (BigQuery, Dataflow, Pub/Sub, Cloud Storage), AWS (S3, Lambda, Glue)


  • Workflow Orchestration: Apache Airflow


  • Programming Languages: Python, SQL (bonus: Java/Scala)


  • Infrastructure & DevOps: Terraform, CI/CD pipelines, Docker


  • Monitoring & Observability: Grafana, Prometheus, Cloud-native tools
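One concept worth knowing for the Kafka side of this stack: Kafka delivers messages at-least-once by default, so consumers often deduplicate redelivered events to get effectively-once processing. Below is a library-free sketch of that idea; the event shapes and in-memory `seen_keys` set are illustrative assumptions (a real consumer would keep processed keys in a durable store).

```python
# Consumer-side deduplication: a common pattern for building
# effectively-once processing on top of at-least-once delivery.
# Event keys and payloads here are hypothetical examples.

def process_events(events, seen_keys, sink):
    """Apply each event once, skipping redelivered duplicates.

    events:    iterable of (event_key, payload) pairs, possibly repeated
    seen_keys: set of keys already applied (durable state in production)
    sink:      list standing in for the downstream table
    """
    applied = 0
    for key, payload in events:
        if key in seen_keys:     # redelivery: already applied, skip it
            continue
        sink.append(payload)     # the side effect we want exactly once
        seen_keys.add(key)       # record the key after the effect
        applied += 1
    return applied


# Simulate at-least-once delivery: event "e2" arrives twice.
stream = [
    ("e1", {"order": 1}),
    ("e2", {"order": 2}),
    ("e2", {"order": 2}),
    ("e3", {"order": 3}),
]
sink: list = []
count = process_events(stream, seen_keys=set(), sink=sink)
```

The design choice to record the key only after the side effect means a crash mid-batch re-applies at most the in-flight event, never silently drops one.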



Qualifications

What Did We Order?


What We’re Looking For


  • Experience (1-3 years) in data engineering, software engineering, or a related field.


  • Proficiency in SQL and at least one programming language (Python preferred).


  • Understanding of data modeling, ETL/ELT concepts, and cloud-based data warehouses.


  • Familiarity with streaming platforms (Kafka, Kinesis, or similar).


  • Comfort working in cloud environments (GCP, AWS, or Azure).


  • Strong communication skills, able to explain technical concepts to non-technical audiences.


  • Growth mindset, eager to learn, adapt, and take on new challenges.
     


Nice-to-Have (Not Required; a Willingness to Learn Is Enough)


  • Experience with infrastructure-as-code (Terraform, CloudFormation).


  • Exposure to containerization (Docker, Kubernetes).


  • Knowledge of data governance, security, and compliance best practices.



Additional Information

Why You’ll Love Working Here


  • Impact: Your work will directly influence how data powers decisions across the company.


  • Learning culture: We invest in your growth — from mentorship to training budgets.


  • Modern stack: Work with cutting-edge tools and cloud platforms.


  • Collaboration: Partner with talented engineers, analysts, and product managers.


  • Flexibility: We care about outcomes, not where you work from.


Our Hiring Philosophy


We know that a great data engineer isn’t defined by checking every box. If you’re excited about data engineering, have a solid foundation, and are eager to grow, we want to hear from you.




This job post has been translated by AI and may contain minor differences or errors.
