
Senior Lead Data Engineer (10+ Years in Data Engineering, Backend Development/PySpark, SQL, and Snowflake/Databricks/similar)

Posted: 2026/09/10
Industry: Other Business Support Services

Job description

Senior Lead Data Engineer


As the world works and lives faster, FIS is leading the way.  Our fintech solutions touch nearly every market, company and person on the planet. Our teams are inclusive and diverse. Our colleagues work together and celebrate together. If you want to advance the world of fintech, we’d like to ask you: Are you FIS?


About the role:


  • The successful candidate will be a key contributor to our data lake journey. You will play a key role in transitioning Payment Hub data from OLTP systems to the data lake, and then in using it for various analytics and AI/ML initiatives.
  • You will work in a dedicated product development SCRUM team in an iterative delivery model.

About the team: 


  • FIS offers award-winning Treasury and Risk (T&R) management solutions that support best-in-class, modernized digital treasury functions. Hosted in SaaS, private cloud, or on-premises environments, FIS T&R solutions cover Treasury & Cash Management, Payments & Bank Connectivity, Bank Account Management, Enterprise Risk for firms in the Capital Markets space, and an Accounting & Reporting solution for the insurance industry.
  • Our market-leading solutions include the Enterprise Risk Suite, Insurance Accounting Suite, Bank Account Manager, Payment Hub, Treasury & Risk Manager - Quantum edition, and Treasury & Risk Manager - Integrity.
  • Enterprise Risk Suite is ranked No. 1 for overall risk management by Chartis.
  • Treasury & Risk Manager - Integrity (SaaS) was awarded “Best Cash & Treasury Management Solution” by Treasury Management International (TMI).
  • Payment Hub (Trax) was awarded Best Cross-Border Payments Solution for Corporates by Global Finance.
  • Treasury & Risk Manager - Quantum was named the overall winner for Best Treasury Management Software by Global Finance.

What you will be doing:


  • Design and build the Apache Spark/PySpark ETL pipeline (Bronze → Silver → Gold medallion architecture)
  • Implement Apache Iceberg table operations (MERGE, UPSERT, SCD Type 2 logic, incremental loads)
  • Design and validate the analytical star schema (fact/dimension tables, conformed dimensions)
  • Define and execute three-tier data quality rules, dead-letter handling, and validation logic
  • Build business logic connectors, transformation helpers, and custom derivations
  • Collaborate with stakeholders to clarify KPIs, query patterns, and analytical use cases
  • Write comprehensive unit, integration, and end-to-end tests
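
The SCD Type 2 work mentioned above would normally run as a Spark/Iceberg MERGE, but the core semantics can be sketched in plain Python. This is a minimal illustration only: the column names (`account_id`, `bank`, `currency`) and the validity-window fields are hypothetical, and Python dicts stand in for Spark DataFrames and Iceberg tables.

```python
from datetime import date

def scd2_upsert(dim_rows, incoming, key="account_id", tracked=("bank", "currency")):
    """Apply SCD Type 2 logic: expire changed rows, insert new current versions.

    dim_rows -- current dimension table: list of dicts carrying
                'effective_from', 'effective_to', and 'is_current' columns
    incoming -- latest source snapshot: list of dicts keyed by `key`
    """
    today = date.today().isoformat()
    current = {r[key]: r for r in dim_rows if r["is_current"]}
    out = list(dim_rows)

    for row in incoming:
        existing = current.get(row[key])
        if existing and all(existing[c] == row[c] for c in tracked):
            continue  # no change in tracked attributes: keep the current version
        if existing:
            # a tracked attribute changed: close the old version's validity window
            existing["effective_to"] = today
            existing["is_current"] = False
        # insert the new (or changed) record as the open-ended current version
        out.append({**row,
                    "effective_from": today,
                    "effective_to": None,
                    "is_current": True})
    return out
```

In a real Iceberg pipeline, the same effect is typically achieved declaratively with a single `MERGE INTO` statement that matches on the business key, with `WHEN MATCHED ... UPDATE` clauses expiring the old version and `WHEN NOT MATCHED ... INSERT` adding the new one.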

What you will need:


  • 10+ years of data engineering, backend development, or full-stack analytics platform work
  • Expert-level PySpark, SQL, and cloud data warehousing (Snowflake/Databricks/similar)
  • Apache Iceberg experience, or a strong willingness to learn advanced table formats
  • Dimensional modelling fundamentals (Kimball star schema, slowly changing dimensions)
  • Python for data transformation and microservices
  • API design, error handling, and testing discipline
  • Proficiency in building and maintaining CI/CD pipelines using tools like Jenkins or Azure DevOps
  • Experience with cloud platforms (AWS or Azure) and their services (e.g., EC2, S3, Lambda, AKS)
  • Strong understanding of testing methodologies and experience with automated testing frameworks
  • Ability to work independently and take ownership of tasks
  • Excellent communication and collaboration skills

Added bonus if you have:


  • Advanced data architecture (data lakes, dimensional modeling)
  • Cloud-native pipelines (AWS/Azure, serverless, ETL orchestration)
  • DataOps (CI/CD, monitoring, logging – Splunk/Dynatrace)
  • Performance optimization (parallel processing, large datasets)
  • Data security basics (encryption, masking)
  • API integration & data services exposure

Privacy Statement


FIS is committed to protecting the privacy and security of all personal information that we process in order to provide services to our clients. For specific information on how FIS protects personal information online, please see the Online Privacy Notice.


Sourcing Model


Recruitment at FIS works primarily on a direct sourcing model; a relatively small portion of our hiring is through recruitment agencies. FIS does not accept resumes from recruitment agencies which are not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings, our employees, or any other part of our company.


#pridepass


