
Data Engineer - SDE 1

20 days ago 2026/08/13
General Engineering Consultancy

Job description

About Fam (previously FamPay)

Fam is India’s first payments app for everyone above 11. FamApp helps make online and offline payments through UPI and FamCard. We are on a mission to raise a new, financially aware generation and drive 250 million+ young users in India to kickstart their financial journeys super early in life.
We’re reimagining how the next generation experiences fintech—going beyond payments to build a lifestyle brand that blends money, identity, and everyday experiences into one seamless, intuitive journey.
Founded in 2019 by IIT Roorkee alumni, Fam is backed by some of the most respected investors in the world, including Elevation Capital, Y Combinator, Peak XV Partners (formerly Sequoia Capital India), Venture Highway, and Global Founders Capital, along with angel investors such as Kunal Shah and Amrish Rao.

About the Role

We are looking for a Data Engineer (SDE-1) to join our data team. The ideal candidate will play a key role in developing a high-performance, scalable data lakehouse, moving us toward sub-minute data latency and unified batch/streaming compute. This is an engineering-heavy role in which you will manage complex CDC flows, optimize distributed query engines, and leverage AI to accelerate our development lifecycle.


Technical Priorities
  • Real-time CDC: Own high-throughput ingestion from RDBMS sources into the lakehouse using Debezium and PeerDB.
  • Lakehouse Architecture: Designing and optimizing table formats (Iceberg, Delta, Hudi) for both performance and storage efficiency.
  • Unified Compute: Developing robust ETL/ELT frameworks in PySpark and Flink (handling both batch and streaming workloads).
  • Infrastructure & Ops: Managing data workloads on AWS (EMR, EKS, MSK, S3) and automating everything via GitLab/GitHub Actions.
  • Query & BI: Tuning Trino or ClickHouse to power real-time dashboards in Metabase, Superset, and Power BI.
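To make the CDC priority above concrete, here is a deliberately simplified, illustrative Python sketch of how row-level change events (the kind Debezium emits) fold into table state. The event shape and the `apply_cdc` helper are hypothetical teaching aids, not part of any real connector or of Fam's pipeline.

```python
# Illustrative sketch only: fold simplified CDC events into an in-memory
# {primary_key: row} table. A real Debezium envelope carries full
# "before"/"after" payloads plus source metadata; this mirrors only the
# "op" + "after" essentials.

def apply_cdc(table: dict, events: list[dict]) -> dict:
    """Apply change events using Debezium-style op codes:
    "c" = create, "u" = update, "d" = delete."""
    for ev in events:
        op, key = ev["op"], ev["key"]
        if op in ("c", "u"):
            table[key] = ev["after"]   # upsert the latest row image
        elif op == "d":
            table.pop(key, None)       # delete: drop the row if present
    return table

# Usage: replay four change events against an empty table.
state = apply_cdc({}, [
    {"op": "c", "key": 1, "after": {"id": 1, "balance": 100}},
    {"op": "c", "key": 2, "after": {"id": 2, "balance": 50}},
    {"op": "u", "key": 1, "after": {"id": 1, "balance": 250}},
    {"op": "d", "key": 2},
])
# state is now {1: {"id": 1, "balance": 250}}
```

In a real pipeline the same fold happens inside the lakehouse engine (e.g. a merge into an Iceberg/Delta/Hudi table) rather than in application memory.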
Requirements
  • Experience: 1–3 years in Data Engineering, specifically with distributed systems and cloud-native architectures.
  • Coding: Expert-level Python/PySpark and SQL.
    • Familiarity with Go/Java/Scala is a plus.
  • Infrastructure: Hands-on experience with AWS (S3, EKS, MSK) and Infrastructure-as-Code.
  • Orchestration: Experience with Airflow or Temporal for complex workflow management.
  • AI-Native: Proficiency in using AI tools (Claude, Codex, Copilot) to write, test, and document code efficiently.
  • Systems Thinking: Ability to explain the trade-offs between different storage formats and processing frameworks.
  • Domain Modelling: Hands-on experience designing OLAP domain models, including fact and dimension tables, the slowly changing dimension (SCD) types, and one-big-table (OBT) pattern tables.
  • Customer First: Work with product and key stakeholders, adding value to their business workflows through data and analytics.
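The SCD requirement above can be illustrated with a toy example. The following is a hedged, plain-Python sketch of an SCD Type 2 upsert; the column names (`start_date`, `end_date`, `is_current`) and the `scd2_upsert` helper are illustrative assumptions, not a production dimension loader.

```python
from datetime import date

# Illustrative SCD Type 2 sketch: rather than overwriting a dimension row
# in place, close the current version and append a new one, so history
# stays queryable. Column names are illustrative conventions.

def scd2_upsert(dim: list[dict], key: str, attrs: dict, as_of: date) -> list[dict]:
    """Append a new row version for `key` if its attributes changed."""
    for row in dim:
        if row["key"] == key and row["is_current"]:
            if all(row.get(k) == v for k, v in attrs.items()):
                return dim                 # no change: keep current version
            row["is_current"] = False      # close out the old version
            row["end_date"] = as_of
    dim.append({"key": key, **attrs,
                "start_date": as_of, "end_date": None, "is_current": True})
    return dim

# Usage: a customer moves city; both versions remain in the dimension.
dim = scd2_upsert([], "cust-1", {"city": "Pune"}, date(2024, 1, 1))
dim = scd2_upsert(dim, "cust-1", {"city": "Bengaluru"}, date(2024, 6, 1))
# dim now holds two versions: the closed Pune row and the current Bengaluru row.
```

The same pattern appears at warehouse scale as a `MERGE` statement against an Iceberg/Delta table keyed on the business key plus `is_current`.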
Our Tech Stack
  • Ingestion: Debezium, PeerDB, Olake
  • Storage: Delta, Iceberg, Hudi (S3-based Lakehouse)
  • Compute: PySpark, Flink, EMR, EKS
  • Streaming: MSK (Kafka)
  • Query Engines: Trino, ClickHouse
  • Orchestration: Airflow, Temporal
  • DevOps: GitLab, GitHub Actions, Terraform
  • Visualization: Metabase, Superset, Tableau, Power BI

This job post has been translated by AI and may contain minor differences or errors.
