Principal Data Architect

4 days ago · 2026/08/24
Other Business Support Services

Job description

Job Title: Data Architect — Analytical Warehouse (FNZ)



About FNZ:

FNZ is a global fintech firm transforming the way financial institutions serve their clients. By combining cutting-edge technology, infrastructure, and investment operations, FNZ enables wealth management firms to deliver personalized investment solutions at scale. Operating across multiple regions and supporting over $1.5 trillion in assets under administration, FNZ partners with leading banks, insurers, and asset managers to create seamless and innovative wealth platforms that empower millions of investors worldwide.

Job Summary:

We are seeking a senior Data Architect to design and own the architecture of the Analytical Warehouse built on Microsoft Fabric. This role is responsible for defining the data models, storage strategies, ingestion patterns, semantic layer, and governance framework that transform the NRT-ODS Gold-layer streaming data into a structured, performant, and governed analytical platform. You will architect the bridge between real-time streaming and historical analytics, serving both operational BI and client-facing reporting workloads.

Key Responsibilities:

• Analytical Warehouse Architecture: Design the end-to-end architecture for the Analytical Warehouse on Microsoft Fabric — ingestion from ODS Gold topics, Bronze/Silver/Gold layering within OneLake, transformation pipelines, semantic layer, and consumption endpoints.

• Data Modelling: Define dimensional models, star schemas, and wide denormalized tables optimized for analytical query patterns. Design fact and dimension tables for wealth management domains — accounts, portfolios, transactions, positions, fees, NAV, AUM.
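The fact-and-dimension pattern in this bullet can be sketched with an in-memory SQLite database. The table names, columns, and the AUM roll-up query below are illustrative assumptions for a wealth domain, not FNZ's actual model:

```python
import sqlite3

# Minimal star-schema sketch: one dimension (accounts) and one fact
# (position snapshots). All names and values are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_account (
        account_key  INTEGER PRIMARY KEY,
        client_id    TEXT,
        portfolio    TEXT
    );
    CREATE TABLE fact_position (
        account_key  INTEGER REFERENCES dim_account(account_key),
        as_of_date   TEXT,     -- snapshot date (a natural partition column)
        market_value REAL      -- position value rolled up into AUM
    );
""")
conn.executemany("INSERT INTO dim_account VALUES (?, ?, ?)",
                 [(1, "C001", "Growth"), (2, "C001", "Income"), (3, "C002", "Growth")])
conn.executemany("INSERT INTO fact_position VALUES (?, ?, ?)",
                 [(1, "2026-08-24", 100.0), (2, "2026-08-24", 50.0), (3, "2026-08-24", 75.0)])

# Typical analytical query shape: join fact to dimension, aggregate to AUM per client.
aum = dict(conn.execute("""
    SELECT a.client_id, SUM(f.market_value)
    FROM fact_position f JOIN dim_account a USING (account_key)
    GROUP BY a.client_id
"""))
# aum maps client_id -> AUM: C001 -> 150.0, C002 -> 75.0
```

The wide, denormalized variant mentioned in the bullet would simply pre-join these tables into one table keyed by account and date, trading storage for fewer joins at query time.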



• Ingestion Architecture: Architect the Kafka-to-Fabric ingestion pipeline — Kafka Connect sink configuration, Avro-to-Delta schema mapping, partitioning strategy (date, entity type, client), exactly-once delivery semantics, and error handling.
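The partitioning strategy named here (date, entity type, client) can be illustrated with a small path builder. The Hive-style layout and the ordering of the partition keys are assumptions for illustration, not the actual pipeline's layout:

```python
from datetime import date

def partition_path(entity_type: str, client: str, as_of: date) -> str:
    """Build a Hive-style partition path for a landed record.

    Illustrative ordering: entity type first for coarse pruning,
    then client for tenant-scoped scans, then date for time ranges.
    """
    return (f"{entity_type}/client={client}/"
            f"year={as_of.year:04d}/month={as_of.month:02d}/day={as_of.day:02d}")

p = partition_path("transactions", "C001", date(2026, 8, 24))
print(p)  # transactions/client=C001/year=2026/month=08/day=24
```

Which key comes first matters: queries that filter on the leading keys skip whole directory subtrees, so the order should match the dominant filter pattern of the workload.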



• Lakehouse Strategy: Define the OneLake storage architecture, including namespace design, table format strategy (Delta Lake near-term, Apache Iceberg long-term), partition evolution, file compaction policies, and retention management.

• Semantic Layer Design: Architect the semantic layer that provides business-friendly metrics (AUM, NAV, trade volumes, fee breakdowns) with consistent definitions across dashboards, reports, APIs, and client portals.
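The core promise of a semantic layer, one governed definition per metric that every consumer resolves through, can be sketched in miniature. The metric names and formulas below are placeholder assumptions:

```python
# Illustrative semantic-layer sketch: each metric is defined exactly once,
# so dashboards, reports, APIs, and portals cannot drift apart by
# re-deriving the formula independently. Names/formulas are hypothetical.
METRICS = {
    # AUM: sum of market values across all position rows.
    "aum": lambda rows: sum(r["market_value"] for r in rows),
    # Trade volume: count of rows flagged as trades.
    "trade_volume": lambda rows: sum(1 for r in rows if r.get("type") == "trade"),
}

def evaluate(metric: str, rows: list) -> float:
    """Single resolution point: every consumer asks the registry, never inlines math."""
    return METRICS[metric](rows)

positions = [{"market_value": 100.0}, {"market_value": 50.0}]
print(evaluate("aum", positions))  # 150.0
```

In practice this registry role is played by a governed semantic model or transformation tool rather than a Python dict, but the design principle is the same.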



• Data Sharing Architecture: Design the architecture for Fabric Data Sharing — OneLake shortcuts and Delta Sharing protocols that enable clients to consume analytics in their own Fabric tenants with governed, client-scoped access.

• Data Governance & Contracts: Extend the ODS data contracts framework into the Analytical Warehouse. Define governance policies for the analytical layer, including data classification, access controls (Purview), lineage tracking, and audit trails.

• Batch Extract Migration: Architect the migration of batch extracts from SQL-driven CSV to Kafka-sourced Parquet/Delta via Fabric pipelines. Design the metadata-driven configuration that preserves CentralHub flexibility.
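A metadata-driven extract configuration of the kind described can be sketched as plain data plus a small resolver, so that moving an extract from CSV to Parquet/Delta is a configuration change rather than a code change. Extract names, topic names, and config keys here are hypothetical:

```python
# Illustrative metadata-driven extract registry. Each extract is declared as
# data; the pipeline reads this config to decide source and output format.
EXTRACTS = {
    "daily_positions": {"source_topic": "ods.gold.positions",
                        "format": "delta", "schedule": "daily"},
    "legacy_fees":     {"source_topic": "ods.gold.fees",
                        "format": "csv", "schedule": "daily"},
}

def output_format(extract_name: str) -> str:
    """Resolve the target format for an extract; default to parquet."""
    return EXTRACTS.get(extract_name, {}).get("format", "parquet")

print(output_format("daily_positions"))  # delta
```

Migration then becomes flipping `"format"` per extract and re-running the pipeline, with legacy CSV extracts coexisting until consumers are moved over.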



• Performance Architecture: Design for query performance — Z-ordering strategies, partition pruning, materialized views, caching layers, and compute resource allocation across Fabric workspaces.
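Z-ordering, mentioned above, clusters rows by interleaving the bits of several columns, so that rows close in any of those columns land in the same files and file-level min/max statistics prune well. A simplified two-column Morton key shows the idea (illustrative; not how Fabric or Delta Lake implements it internally):

```python
def z_order_key(x: int, y: int, bits: int = 16) -> int:
    """Interleave the bits of two column values into one Morton (Z-order) key.

    Sorting files by this key keeps rows that are close in BOTH columns
    physically close, so a filter on either column skips more files than
    sorting by one column alone.
    """
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (2 * i)      # x bits go to even positions
        key |= ((y >> i) & 1) << (2 * i + 1)  # y bits go to odd positions
    return key

print(z_order_key(0b11, 0b00))  # 5 (binary 0101: x's bits in even slots)
```

Engines that support it (e.g. Delta Lake's `OPTIMIZE ... ZORDER BY`) apply this kind of multi-dimensional clustering during compaction rather than at write time.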



• Apache Iceberg Roadmap: Plan the long-term migration to Apache Iceberg on OneLake for time-travel queries, partition evolution, and multi-engine access (Fabric, Spark, Trino, Flink). Evaluate Confluent Tableflow or a custom sink for the Kafka-to-Iceberg pipeline.

• Standards & Governance: Establish naming conventions, modelling standards, documentation requirements, and code review processes for all Analytical Warehouse development. Conduct architecture reviews of Data Engineer deliverables.

Qualifications:

• Education: Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related technical field.

• Experience: 8+ years of experience in data architecture or data engineering, with at least 3 years in a data architect role on analytical/warehouse platforms.

• Microsoft Fabric / Azure: Deep experience with Microsoft Fabric, Azure Synapse Analytics, or equivalent cloud analytical platforms. Strong understanding of OneLake, Fabric lakehouse, and Fabric SQL endpoints.

• Data Modelling: Expert-level skills in dimensional modelling (Kimball), data vault, and denormalized modelling for analytical workloads. Experience modelling financial services data domains.

• Delta Lake / Iceberg: Strong understanding of modern table formats — Delta Lake (ACID transactions, time travel, schema evolution) and Apache Iceberg (partition evolution, multi-engine support).

• SQL Expertise: Advanced SQL skills for analytical queries, performance tuning, and query plan analysis.

• Streaming-to-Analytical Bridge: Experience architecting data pipelines that bridge real-time streaming platforms (Kafka) with analytical warehouses/lakehouses.

• Semantic Layers: Experience with semantic layer and data transformation tools for defining governed business metrics.

• Data Governance: Experience with data governance frameworks, data catalogs (Purview, Atlan), and access control policies in multi-tenant environments.

Preferred Qualifications:

• Experience working in the Wealth Management or Financial Services industry with a deep understanding of investment operations data models.

• Experience with Apache Kafka — consumer architecture, Kafka Connect, Avro schema evolution, and schema registries.

• Familiarity with SQL-based transformation frameworks for managing transformation layers (models, tests, documentation, CI/CD).

• Experience with data quality frameworks (Great Expectations, Soda) integrated into analytical pipelines.

• Experience architecting multi-tenant analytical platforms with client-scoped data isolation.

• Knowledge of privacy-preserving analytics — differential privacy, confidential compute, or federated analytics patterns.

• Microsoft Fabric certifications, Azure Data Engineer (DP-203), or Azure Solutions Architect certifications are a plus.

About FNZ

FNZ is committed to opening up wealth so that everyone, everywhere can invest in their future on their terms. We know the foundation to do that already exists in the wealth management industry, but complexity holds firms back.

We created wealth's growth platform to help. We provide a global, end-to-end wealth management platform that integrates modern technology with business and investment operations. All in a regulated financial institution.

We partner with the world's leading financial institutions, with over US$2.4 trillion in assets on platform (AoP). Together with our clients, we empower nearly 30 million people across all wealth segments to invest in their future.






This job post has been translated by AI and may contain minor differences or errors.
