
Job Description

Stantec is a global leader in sustainable engineering, architecture, and environmental consulting. The diverse perspectives of our partners and interested parties drive us to think beyond what’s previously been done on critical issues like climate change, digital transformation, and future-proofing our cities and infrastructure. We innovate at the intersection of community, creativity, and client relationships to advance communities everywhere, so that together we can redefine what’s possible. The Stantec community unites approximately 32,000 employees working in over 450 locations across 6 continents.
Senior Data Engineer
Digital Practice Group | Environmental Services | Stantec
About the Role
Stantec's Digital Practice is building a modern, cloud-native data platform to power environmental project delivery, reporting, and analytics across the organization. As a Senior Data Engineer, you will play a central role in architecting, building, and maintaining the data infrastructure and solutions that enable the Environmental Services team to derive insight from large and complex project datasets.
You will work in close partnership with other Digital Practice resources who manage the underlying cloud platform, networking, security, and infrastructure, allowing you to focus on what sits above it: data architecture, pipeline engineering, semantic modelling, and data product delivery. Together, these roles form part of the technical core of the Digital Practice's data and cloud capability.
You will help lead data engineering architecture decisions while remaining deeply hands-on in pipeline development, data modelling, and platform optimization, working in close partnership with data scientists, BI developers, business stakeholders, and the broader technology teams.
As Stantec's data platform evolves, this role will play a key part in evaluating and adopting Microsoft Fabric capabilities including Fabric Lakehouses, Data Pipelines, and Fabric-native semantic models, alongside the existing Databricks and Azure landing zone resources.
Key Responsibilities
Architecture & Design
- Help lead the design and evolution of Stantec's environmental data platform, including the medallion architecture (Bronze / Silver / Gold) in Databricks / ADLS Gen2, semantic layer design, and data product definitions, and evaluate equivalent patterns within Microsoft Fabric as the platform matures
- Define data engineering standards, coding conventions, pipeline design patterns, and data quality frameworks for the Digital Practice - covering both the current Databricks-led stack and emerging Fabric capabilities
- Architect end-to-end data flows across source systems (SharePoint, SQL, ESRI, EQuIS, OpenGround, cloud sources and other external feeds), the lakehouse, and consumption layers (Power BI, Fabric semantic models, APIs)
- Collaborate with the Azure Infrastructure Specialists to ensure data platform infrastructure is secure, scalable, and aligned with Stantec's landing zone architecture including ADLS Gen2 provisioning, private endpoint configuration, Databricks/Fabric setup, and cross-landing-zone data movement patterns.
- Align on shared governance concerns including RBAC, network segmentation, Unity Catalog permissions, Fabric workspace and item-level permissions, and Azure Policy compliance
- Contribute to Stantec's Fabric adoption strategy, assessing where Fabric capabilities (Lakehouse, Warehouse, Data Pipelines, Eventstream, etc.) can complement or progressively replace existing tooling
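As an illustrative sketch only (not part of the role description), the medallion layering referenced above is often expressed as a simple path convention over ADLS Gen2. The storage account, container, and dataset names below are hypothetical:

```python
# Hypothetical sketch of a Bronze/Silver/Gold path convention over ADLS Gen2.
# Account, container, and dataset names are illustrative only.
VALID_LAYERS = ("bronze", "silver", "gold")

def lake_path(layer: str, dataset: str,
              account: str = "stexampledatalake",
              container: str = "environmental") -> str:
    """Build an abfss:// path for a dataset in a given medallion layer."""
    if layer not in VALID_LAYERS:
        raise ValueError(f"unknown layer: {layer!r}; expected one of {VALID_LAYERS}")
    return f"abfss://{container}@{account}.dfs.core.windows.net/{layer}/{dataset}"
```

A pipeline would typically read from a bronze path and write curated output to silver and gold; actual layouts vary by landing-zone design.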
Implementation & Engineering
- Design, build, and maintain scalable ETL/ELT pipelines using Databricks (PySpark, Delta Live Tables), Fabric Data Pipelines, and Python
- Develop and maintain Delta Lake / Unity Catalog structures including managed tables, materialized views, and row-level security (RLS) using UDF row filters
- Implement and maintain data ingestion patterns using Databricks Autoloader and Fabric Eventstream / Dataflow Gen2 for structured (CSV, Parquet) and semi-structured (JSON, XML) data sources
- Build and maintain Power BI / Fabric semantic models including DirectQuery configurations, dimension tables, and time intelligence, ensuring performance and governance
- Implement data governance controls including Unity Catalog permissions, Fabric item-level security, column-level security, and audit logging
- Build and maintain CI/CD pipelines for data workloads via Azure DevOps including deployment automation for both Databricks and Fabric workspace items where supported
- Coordinate with the Azure Infrastructure Specialist on shared integration services including Azure Functions, Logic Apps, and potentially ADF to ensure consistent deployment, access control, and operational standards
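The Unity Catalog row-level security mentioned above is defined as a SQL UDF row filter in practice, but the predicate logic it encodes can be sketched in plain Python for illustration. The group and region names here are made up:

```python
# Illustrative only: the kind of predicate a Unity Catalog row filter
# (a SQL UDF) might encode, shown in plain Python. Group and region
# names are hypothetical.
def row_visible(row_region: str, user_groups: set[str]) -> bool:
    """A user sees a row if they are a data admin or belong to the
    group matching the row's region."""
    return "data_admins" in user_groups or f"region_{row_region}" in user_groups
```

In Unity Catalog this predicate would be attached to a table with `ALTER TABLE ... SET ROW FILTER`, evaluated per row at query time.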
Operations & Support
- Monitor and optimize Databricks pipeline performance including instance pool configuration, serverless vs. classic compute trade-offs, and job orchestration efficiency
- Monitor Fabric capacity utilization including CU consumption, throttling behaviors, and workspace sizing to ensure cost-effective and performant operations
- Troubleshoot data quality issues, pipeline failures, and schema drift across the data platform
- Escalate and collaborate with the Azure Infrastructure Specialist on platform-level issues including networking failures, identity/access errors, and infrastructure capacity constraints
- Provide technical mentorship to other members within the Digital Practice team
- Maintain thorough documentation of data models, pipeline logic, and platform configurations - consistent with infrastructure documentation standards across the team
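The schema-drift troubleshooting listed above can be illustrated, purely as a hypothetical sketch, by diffing an incoming batch's columns against an expected schema contract. The column names and types are invented for the example:

```python
# Illustrative only: compare an incoming batch's columns against an
# expected schema contract and report drift. Column names are made up.
def detect_schema_drift(expected: dict[str, str], incoming: dict[str, str]) -> dict:
    """Return added, missing, and type-changed columns between two schemas,
    each as a sorted list of column names."""
    added = sorted(set(incoming) - set(expected))
    missing = sorted(set(expected) - set(incoming))
    type_changed = sorted(
        col for col in set(expected) & set(incoming)
        if expected[col] != incoming[col]
    )
    return {"added": added, "missing": missing, "type_changed": type_changed}
```

A real pipeline would run a check like this before merging into curated layers, alerting or quarantining the batch when drift is detected.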
Innovation & Continuous Improvement
- Stay current with Databricks, Azure data platform, Microsoft Fabric and analytics engineering best practices, particularly as Fabric capabilities rapidly evolve
- Evaluate and pilot new Fabric features where possible for applicability to Stantec's environmental data platform
- Recommend and implement improvements to data pipeline reliability, performance, and platform scalability
- Contribute to cross-team knowledge sharing alongside the Azure Infrastructure Specialist, ensuring the Digital Practice maintains a unified, well-documented technical foundation
Qualifications & Experience
- Bachelor’s degree in Computer Science, IT, Data Science, or related field
- 10+ years of data engineering experience, with 5+ years working on cloud-native data platforms (Azure preferred)
- Expert-level proficiency in Python and SQL; Spark/PySpark experience essential
- Deep, hands-on experience with Databricks, Delta Live Tables, Unity Catalog, job orchestration, and performance optimization
- Proficiency in Power BI data modelling including DirectQuery, semantic model design, and M/DAX
- Familiarity with Azure SQL Database and SQL Server environments
- Experience with ADLS Gen2 and cross-landing-zone data movement patterns
- Exposure to data governance frameworks and enterprise RLS/security patterns
- Working understanding of Azure infrastructure concepts, particularly as they relate to data platform services (networking, RBAC, private endpoints, managed identities) to collaborate effectively with the Azure Infrastructure Specialist
- Hands-on exposure to Microsoft Fabric including Fabric Lakehouse, Data Pipelines, Dataflow Gen2, and Fabric-native semantic models (Direct Lake)
- Experience managing Fabric capacity sizing and workspace governance in a production or near-production environment
- Databricks Certified Data Engineer Professional or Azure Data Engineer Associate (DP-203) highly regarded
- Familiarity with OneLake, Fabric shortcuts, and cross-workspace data sharing patterns
- Experience working in regulated or governance-heavy environments (e.g., engineering, environmental consulting, or similar)
What You'll Bring
- Deep technical ownership - you care about data quality, pipeline reliability, and platform health
- The ability to balance strategic architecture thinking with day-to-day hands-on delivery
- A collaborative, team-first approach - particularly in working alongside the Azure Infrastructure Specialist to ensure the data and infrastructure layers are tightly aligned and mutually supportive
- An adaptive mindset that is comfortable navigating a platform that is actively evolving, with a clear path toward Microsoft Fabric adoption alongside an established Databricks foundation
- Experience working in large, cross-functional enterprise environments
- Strong analytical mindset and a passion for clean, well-documented, maintainable data systems
- Confidence raising platform concerns and engaging with infrastructure peers to resolve them quickly
Primary Location: India | Pune
Organization: Stantec IN Business Unit
Employee Status: Regular
Business Justification: New Position
Travel: No
Schedule: Full time
Job Posting: 12/05/2026 11:05:10
Req ID: 1005820
