The more applications you submit, the greater your chances of landing a job!
Here's a snapshot of job seekers' activity over the past month:
Number of opportunities browsed
Number of applications submitted
Keep browsing and applying to boost your chances of landing a job!
Are you looking for employers with a proven track record of supporting and empowering women?
Click here to discover the opportunities available now!
We invite you to take part in a survey designed to help researchers better understand how to connect women job seekers with the jobs they are looking for.
Would you like to take part?
If you are selected, we will contact you by email with the details and instructions for participating.
You will receive $7 for completing the survey.
We are seeking an experienced Data Platform Engineer to join our core platform team. In this role, you will be responsible for building, securing, and automating our enterprise Data Platform on AWS. You will go beyond basic pipeline creation by designing and maintaining the underlying infrastructure and CI/CD frameworks that enable our data teams to operate and scale efficiently.

Job Responsibilities

• Cloud & Platform Infrastructure (IaC): Deploy and maintain Databricks workspaces and AWS infrastructure (VPC, PrivateLink, IAM, S3, Lambda, EKS, and Fargate) using Terraform.
• Unity Catalog Implementation: Automate the governance layer, including metastore configuration, external locations, and access controls within Unity Catalog.
• Security & Compliance: Ensure the platform adheres to enterprise security standards by implementing and managing automated security controls for cloud infrastructure and data protection.
• Workspace Lifecycle Management: Use Terraform for end-to-end workspace provisioning, ensuring consistent setup across Dev, Acc, and Prod environments.
• Governance & Cost Control (Policies): Design and implement policies and guardrails to enforce standards.
• Identity & Access Automation: Automate the assignment of permissions using Terraform. Manage service principals for pipelines and map groups to specific workspace roles and Unity Catalog grants.

DevOps & Automation (CI/CD)

• Pipeline Architecture: Oversee GitLab CI/CD pipelines for data projects, transitioning the team from manual notebook deployments to automated workflows.
• Databricks Asset Bundles (DABs): Standardize deployment strategies using DABs. Develop templates and presets for Data Engineers to deploy jobs and workflows.
• Release Management: Implement branching strategies, code review policies, and environment promotion rules (Dev → Acc → Prod).
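As a rough illustration of the Terraform-driven workspace provisioning and Unity Catalog automation the responsibilities above describe, a minimal sketch might look like the following. All names, variables, and bucket paths are hypothetical, and the referenced credential, storage, and network resources are assumed to be defined elsewhere:

```hcl
# Sketch only: illustrative, not a complete or validated configuration.

# Environment-scoped workspace provisioning (Dev / Acc / Prod via var.env)
resource "databricks_mws_workspaces" "this" {
  provider                 = databricks.mws
  account_id               = var.databricks_account_id
  workspace_name           = "data-platform-${var.env}"
  aws_region               = var.aws_region
  credentials_id           = databricks_mws_credentials.this.credentials_id
  storage_configuration_id = databricks_mws_storage_configurations.this.storage_configuration_id
  network_id               = databricks_mws_networks.this.network_id # VPC + PrivateLink
}

# Unity Catalog governance: an S3-backed external location
resource "databricks_external_location" "raw" {
  name            = "raw-${var.env}"
  url             = "s3://example-raw-${var.env}" # hypothetical bucket
  credential_name = databricks_storage_credential.uc.name
}

# Group-to-grant mapping instead of per-user permissions
resource "databricks_grants" "raw" {
  external_location = databricks_external_location.raw.name
  grant {
    principal  = "data-engineers-${var.env}"
    privileges = ["READ_FILES", "WRITE_FILES"]
  }
}
```

Keeping grants tied to groups rather than individual users is what makes the "Identity & Access Automation" bullet tractable: environment promotion then only changes `var.env`, not the access model.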
Service Organization & Operations

• Observability: Configure monitoring, alerting, and logging (using system tables or integration with tools like CloudWatch) to ensure platform stability.
• Support & Incident Management: Serve as an escalation point for platform-related incidents.
• Knowledge Sharing: Document best practices and conduct workshops to upskill data engineers on effective platform usage.

Job Qualifications

• Bachelor's degree in Computer Science, Software Engineering, Mathematics, or a related field.
• 5+ years of industry experience in Data Engineering, Cloud Infrastructure, or DevOps; 3+ years with Databricks in enterprise settings.
• Advanced Terraform skills for managing cloud infrastructure and Databricks resources.
• Extensive knowledge of the AWS service portfolio.
• Expertise in CI/CD pipelines using GitLab CI and Databricks Asset Bundles.
• Deep understanding of Databricks Lakehouse architecture, Unity Catalog, Serverless Compute, Delta Lake, and workflow orchestration.
• Solid grasp of SDLC/DataOps practices, including unit testing, modular code, and Git strategies.
• Proficient in Python (e.g., automation, PySpark, pandas) and Bash/shell scripting for CI/CD.
• Excellent communication, documentation, mentoring, and collaboration skills.
• Preferred: Databricks Certified Data Engineer Professional or AWS Solutions Architect certification.
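The CloudWatch integration mentioned under Observability could, under the same Terraform-centric approach, be sketched roughly as below. The namespace, metric name, threshold, and SNS topic are all placeholders for whatever the platform actually emits:

```hcl
# Sketch only: alarm on a hypothetical custom metric for failed job runs.
resource "aws_cloudwatch_metric_alarm" "job_failures" {
  alarm_name          = "databricks-job-failures-${var.env}"
  namespace           = "DataPlatform"   # assumed custom namespace
  metric_name         = "FailedJobRuns"  # assumed custom metric
  statistic           = "Sum"
  period              = 300
  evaluation_periods  = 1
  threshold           = 1
  comparison_operator = "GreaterThanOrEqualToThreshold"
  alarm_actions       = [var.oncall_sns_topic_arn] # escalation path for incidents
}
```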
Your application for this job will not be considered, and it will be removed from the employer's inbox.