
Senior Business Analyst

30+ days ago 2026/07/29
Other Business Support Services

Job description

Job Purpose

Own delivery governance and solution architecture for the Enterprise Data Platform. Define and oversee end-to-end data solutions, including batch and real-time pipelines, using SQL, Python, and PySpark, with a strong focus on streaming, Change Data Capture (CDC), and database mirroring. Translate business needs (including Loan Management System use cases) into scalable, secure, and cost-effective data architectures and integration patterns.
Duties and Responsibilities

• Define target-state and incremental roadmaps for data platforms and domain solutions; produce HLD/LLD, logical/physical models, and interface contracts.
• Lead PMO activities: scope planning, backlog management, sprint planning, risk/issue/decision logs, dependencies, release planning, and stakeholder reporting.
• Architect real-time and batch data ingestion using SQL, Python, and PySpark; design streaming jobs (e.g., Structured Streaming) with CDC and database mirroring for near-real-time use cases.
• Create Data Flow Diagrams (DFDs) and end-to-end data architecture involving multiple publishers/consumers, defining SLAs/SLOs and data contracts.
• Drive integration patterns across systems (APIs, event streaming, file-based, message queues), including schema evolution and idempotency strategies.
• Establish data quality, observability, and lineage standards (validation, alerts, runbooks) across environments.
• Ensure security, privacy, and compliance (access controls, encryption, PII handling) aligned with enterprise policies.
• Coordinate with internal teams (BI/Analytics, App Dev, Infra/Security, DevOps) and external vendors/partners for delivery and support.
• Review estimates, monitor burn/cost, and track delivery to milestones; drive retrospectives and continuous improvement.
• Support UAT, cutover, and hypercare; perform RCA for incidents and implement preventive actions.
• Maintain high-quality documentation (architecture packs, runbooks, decision records) and conduct knowledge-sharing sessions.
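The idempotency strategies mentioned in the duties above can be illustrated with a minimal sketch. This is a generic pattern, not taken from the posting: a consumer remembers which event IDs it has already applied so that a redelivered message (common with at-least-once queues) produces no duplicate side effects. The event shape and in-memory store here are illustrative assumptions; a production consumer would persist the processed-key set durably.

```python
# Minimal idempotent-consumer sketch: deduplicate events by a unique ID so
# redelivered messages are not applied twice. Event shape ({"id", "amount"})
# and the in-memory set are illustrative assumptions.

class IdempotentConsumer:
    def __init__(self):
        self.processed_ids = set()  # in production: a durable store (DB, cache)
        self.balance = 0            # example downstream state being mutated

    def handle(self, event):
        event_id = event["id"]
        if event_id in self.processed_ids:
            return False            # duplicate delivery: skip side effects
        self.balance += event["amount"]
        self.processed_ids.add(event_id)
        return True

consumer = IdempotentConsumer()
consumer.handle({"id": "e1", "amount": 100})
consumer.handle({"id": "e1", "amount": 100})  # redelivery, ignored
consumer.handle({"id": "e2", "amount": 50})
print(consumer.balance)  # 150
```

The same idea generalizes to keyed upserts or transactional outbox tables; the essential property is that applying the same event twice leaves the state unchanged.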
Key Decisions / Dimensions

• Select architecture patterns (batch vs. streaming), CDC approach, and storage/format strategies to meet SLAs and cost goals.
• Approve interface designs, data contracts, and non-functional requirements (performance, reliability, security).
• Prioritize scope and sequence of increments/releases based on value, risk, and dependencies.
• Recommend tooling for orchestration, testing, observability, and CI/CD.
Major Challenges

• Balancing architecture quality with delivery timelines across parallel initiatives.
• Managing integration complexity with Loan Management Systems and upstream/downstream platforms.
• Maintaining reliability, data quality, and low latency for streaming/CDC workloads at scale.
• Handling evolving business requirements and schema drift across diverse sources.
Required Qualifications and Experience

Educational Qualifications:
• Graduate or Post-Graduate in Computer Science, Information Technology, or Data Science/Technologies.

Work Experience:
• 3–4 years of hands-on experience in data engineering, solution architecture, or PMO within data platforms.

Technical Expertise / Skills Keywords:
• SQL, Python, PySpark
• Data streaming, Change Data Capture (CDC), database mirroring
• Integration patterns (APIs, events/queues, file), schema evolution, and idempotency
• Version control and DevOps pipelines (e.g., Git/GitHub, Azure DevOps)
• Preferred: Azure Databricks, Azure Data Factory, Data Lake Storage; data modeling and performance optimization

Additional Expertise:
• Understanding of Loan Management Systems (LMS) in a banking/NBFC context
• Strong grasp of system-level integrations across enterprise platforms
• Ability to create DFDs and end-to-end data architecture involving multiple publishers and consumers
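As a hedged illustration of the CDC skill listed above: a change-event stream (inserts, updates, deletes keyed by primary key) can be folded into a mirrored table to keep a near-real-time replica. The event schema below is a generic assumption (loosely Debezium-like), not the output of any specific tool named in the posting.

```python
# Sketch of applying a CDC change stream to a mirrored table.
# Each event carries an op ("insert"/"update"/"delete"), a primary key,
# and the latest row image. The schema is an illustrative assumption.

def apply_cdc(table, events):
    """Fold a list of change events into a dict keyed by primary key."""
    for ev in events:
        key = ev["pk"]
        if ev["op"] in ("insert", "update"):
            table[key] = ev["row"]   # upsert: keep the latest row image
        elif ev["op"] == "delete":
            table.pop(key, None)     # tolerate deletes of already-absent keys
    return table

events = [
    {"op": "insert", "pk": 1, "row": {"name": "loan-a", "status": "open"}},
    {"op": "update", "pk": 1, "row": {"name": "loan-a", "status": "closed"}},
    {"op": "insert", "pk": 2, "row": {"name": "loan-b", "status": "open"}},
    {"op": "delete", "pk": 2, "row": None},
]
mirror = apply_cdc({}, events)
print(mirror)  # {1: {'name': 'loan-a', 'status': 'closed'}}
```

In a real pipeline the same upsert/delete fold is typically expressed as a streaming merge (e.g., a Structured Streaming job writing to an upsert-capable sink) rather than a Python dict, but the ordering and last-write-wins semantics are the same.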

This job post has been translated by AI and may contain minor differences or errors.
