Job Description

The Data Engineer – Senior leads the design, development, and maintenance of scalable data and analytics platforms that support enterprise data, analytics, and AI initiatives. The role builds reliable data pipelines and enables efficient data processing, storage, and access for analysts, data scientists, and business stakeholders.


Working closely with cross-functional teams including business stakeholders, IT partners, and subject matter experts, the role delivers high-quality, governed, and scalable data solutions. The position contributes to one or multiple product teams and drives continuous improvement in data engineering practices, tools, and platforms.


Key Responsibilities:
Data Engineering & Platform Development
  • Design, develop, deploy, and automate distributed data systems for ingesting and transforming data from relational, event-based, unstructured, and IoT sources.
  • Build and maintain scalable ETL/ELT data pipelines with monitoring, alerting, and performance optimization.
  • Own end-to-end lifecycle of data pipelines including development, testing, deployment, and operational support.
  • Design and operate large-scale data storage and processing systems (e.g., data lakes, distributed databases, and cloud platforms).
Data Modeling & Performance Optimization
  • Design and implement logical and physical data models to support analytics and business requirements.
  • Optimize database and processing performance through indexing, partitioning, and efficient data relationships.
  • Ensure scalability, reliability, and efficiency of data architecture and storage solutions.
Data Governance, Quality & Compliance
  • Implement data governance frameworks including metadata management, data access controls, and retention policies.
  • Develop and maintain systems to monitor, validate, and resolve data quality and integrity issues.
  • Ensure compliance with enterprise data standards, regulatory requirements, and best practices.
Analytics, AI & Innovation Enablement
  • Enable advanced analytics, AI/ML, and data science use cases by delivering high-quality, well-governed datasets.
  • Apply modern tools, frameworks, and automation techniques to streamline data integration and preparation.
  • Stay current with emerging trends in Big Data, cloud technologies, and AI, and recommend improvements.
Collaboration & Delivery
  • Collaborate with stakeholders to translate business requirements into scalable technical solutions.
  • Deliver solutions using Agile and DevOps practices such as Scrum and Kanban.
  • Mentor and coach junior team members, fostering best practices and continuous improvement.
  • Develop and maintain comprehensive solution documentation for knowledge transfer and operational readiness.

Cummins is an equal opportunity employer. Our policy is to provide equal employment opportunities to all qualified persons without regard to race, sex, color, disability, national origin, age, religion, union affiliation, sexual orientation, veteran status, citizenship, gender identity, or other status protected by law.
Skills and Experience:
Required:
  • Strong experience in data engineering and large-scale data processing.
  • Proficiency in Python, SQL, and Spark (PySpark preferred).
  • Hands-on experience with ETL/ELT tools and data pipeline development.
  • Experience designing and implementing Big Data platforms using open-source and cloud technologies.
  • Knowledge of distributed systems and clustered cloud-based environments.
  • Experience with data storage technologies such as Hadoop, HBase, Cassandra, MongoDB, or similar.
  • Strong understanding of data governance, metadata management, and data quality frameworks.
  • Familiarity with data regulations and complex business systems.
  • Experience with Agile development methodologies (Scrum, DevOps, Kanban).
  • Strong problem-solving, system design, and programming skills.
  • Excellent verbal and written communication skills.
Preferred:
  • Experience with IoT data processing and integration.
  • Knowledge of AI/ML workflows and advanced analytics techniques (e.g., regression, clustering, time-series analysis).
  • Experience with cloud data platforms and tools (e.g., Snowflake, Palantir, Azure ecosystem).
  • Familiarity with data cataloging tools such as Azure Purview or Alation.
  • Experience in dimensional modeling, 3NF, and advanced data modeling techniques.
  • Exposure to graph databases (e.g., Neo4j, TigerGraph) and ontology-based data modeling.
  • Experience handling large-scale file movement and diverse data extraction methods.
Core Competencies:
  • System Requirements Engineering: Translates stakeholder needs into clear, testable system requirements and manages them through the lifecycle.
  • Collaborates: Builds strong partnerships to achieve shared goals effectively.
  • Communicates Effectively: Conveys information clearly across different audiences and formats.
  • Customer Focus: Delivers solutions that meet customer needs and enhance satisfaction.
  • Decision Quality: Makes timely, well-informed decisions that drive progress.
  • Data Extraction (ETL/ELT): Extracts, transforms, and loads data from multiple sources for downstream use.
  • Programming: Develops, tests, and maintains efficient, secure, and scalable code.
  • Quality Assurance Metrics: Uses metrics and standards to ensure solutions meet quality expectations.
  • Solution Documentation: Creates clear documentation to support knowledge sharing and usability.
  • Solution Validation Testing: Ensures solutions function correctly and meet defined requirements.
  • Data Quality: Identifies and resolves data issues to ensure accuracy and reliability.
  • Problem Solving: Applies structured methods to analyze issues and implement effective solutions.
  • Values Differences: Embraces diverse perspectives to drive better outcomes.

Qualifications:
  • Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related technical discipline, or equivalent practical experience.
  • May require compliance with export control or regulatory requirements where applicable.