Job Summary
Synechron is seeking an experienced IDMC (Informatica Intelligent Data Management Cloud) Developer to design, implement, and maintain enterprise data integration solutions. This role involves building scalable ETL workflows, managing data pipelines, and ensuring data quality, security, and compliance across cloud and on-premise environments. The ideal candidate will leverage their expertise in cloud data services, API management, and data governance to support digital transformation, enable data-driven insights, and deliver high-quality, reliable data solutions aligned with business objectives.
Software Requirements
Required Software Proficiency:
Informatica Cloud Services (IDMC / IICS) — hands-on experience in developing and managing data workflows (version 2022 and above preferred)
SQL (MySQL, Oracle, SQL Server, PostgreSQL) — strong skills in data querying, transformation, and optimization
Cloud platforms: AWS, Azure, or GCP — familiarity with data services and migration in cloud environments
API management: REST APIs, Web Services — ability to design, consume, and manage API integration workflows
Data integration tools and platforms supporting data pipelines and application integration
Data governance and security frameworks supporting enterprise compliance
Preferred Software Skills:
Automation tools: CI/CD pipelines (Jenkins, Azure DevOps, GitLab) — for automating deployments and workflows
Data modeling, data warehouse concepts, and metadata management tools
Cloud automation frameworks: Terraform, CloudFormation — for infrastructure as code and environment provisioning
Overall Responsibilities
Design, develop, and optimize scalable ETL/ELT workflows supporting enterprise data ingestion, transformation, and migration
Build and maintain resilient data pipelines supporting analytics, reporting, and compliance needs in cloud and hybrid environments
Collaborate with data architects, analysts, and developers to capture requirements and deliver robust data solutions
Conduct data validation, reconciliation, and security assessments to ensure data quality and regulatory compliance, including GDPR and HIPAA
Support data migration projects, system upgrades, and cloud adoption initiatives
Implement data governance principles, access controls, and encryption measures to protect sensitive data
Monitor, troubleshoot, and optimize data pipeline performance, resolving issues proactively
Automate deployment processes, infrastructure provisioning, and workflow orchestration using Terraform, CloudFormation, and CI/CD tools
Maintain detailed documentation of data architecture, workflows, security protocols, and operational procedures
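The data validation and reconciliation work listed above can be sketched as a minimal row-count and key comparison between a source and a target table. This is an illustrative sketch only, not a description of Synechron's tooling; the table and column names (`src`, `tgt`, `id`) are hypothetical, and Python's built-in sqlite3 stands in for the enterprise databases named in this posting:

```python
import sqlite3

def reconcile(conn, source_table, target_table, key_col):
    """Compare row counts between two tables and find keys missing
    from the target. Equal counts and an empty 'missing_in_target'
    list mean the load reconciles cleanly."""
    cur = conn.cursor()
    counts = {}
    for table in (source_table, target_table):
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        counts[table] = cur.fetchone()[0]
    # Keys present in the source but absent from the target
    cur.execute(
        f"SELECT s.{key_col} FROM {source_table} s "
        f"LEFT JOIN {target_table} t ON s.{key_col} = t.{key_col} "
        f"WHERE t.{key_col} IS NULL"
    )
    missing_in_target = [row[0] for row in cur.fetchall()]
    return {
        "source_count": counts[source_table],
        "target_count": counts[target_table],
        "missing_in_target": missing_in_target,
    }

# Demo with an in-memory database and one deliberately dropped row
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE tgt (id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.0);
""")
result = reconcile(conn, "src", "tgt", "id")
print(result)  # reports that key 3 was not loaded into the target
```

In a real pipeline the same pattern would typically be extended with per-column checksums and scheduled as a post-load step.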
Technical Skills (By Category)
Languages & Data Tools (Essential):
SQL (MySQL, Oracle, PostgreSQL, SQL Server): data validation, queries, and performance tuning
PySpark, Spark SQL: large-scale data processing and transformation workflows
IDMC / IICS: designing, deploying, and managing cloud-native data pipelines
Databases & Data Management:
Relational databases, data modeling, and data warehousing concepts supporting enterprise analytics
Metadata management and data lineage tools supporting compliance
Cloud Technologies:
AWS, Azure, or GCP data services, supporting deployment, integration, and migration
Cloud automation tools like Terraform or CloudFormation (preferred)
Frameworks & Libraries:
PySpark, Delta Lake, Dataflow for data processing and lakehouse architectures
Tools & Methodologies:
CI/CD pipelines, Git, Terraform, Agile/Scrum practices supporting DevOps and continuous delivery
Security & Compliance:
Knowledge of encryption, role-based access controls, and data privacy standards (GDPR, HIPAA)
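As a small illustration of the SQL performance-tuning skill listed above, the sketch below shows how adding an index changes a query plan from a full table scan to an index search. It uses Python's built-in sqlite3 as a stand-in for the engines named in this posting, and the schema (`orders`, `customer_id`) is hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(1000)],
)

query = "SELECT total FROM orders WHERE customer_id = ?"

# Without an index, the engine must scan every row
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[3]

# A covering index lets the engine seek directly to the matching rows
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id, total)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[3]

print(plan_before)  # e.g. a SCAN of the orders table
print(plan_after)   # e.g. a SEARCH using idx_orders_customer
```

The same scan-versus-seek reasoning applies, with different plan syntax, to MySQL, Oracle, SQL Server, and PostgreSQL.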
Experience Requirements
5+ years of experience supporting enterprise-scale data pipelines and cloud data solutions
Demonstrated expertise in designing, deploying, and managing data integration workflows in cloud environments
Proven ability to support data migration, data quality assurance, and compliance initiatives
Experience with analytics data platforms supporting business intelligence and reporting in regulated industries (preferred)
Background supporting or implementing data governance, security, and privacy policies
Day-to-Day Activities
Develop, test, and maintain scalable ETL/ELT workflows supporting analytics, migration, and compliance
Collaborate with business, data science, and application teams to capture and implement data requirements
Monitor data pipelines, troubleshoot failures, and implement fixes to improve system reliability and performance
Support data migration efforts, including schema validation, data transformation, and validation tasks
Ensure data security, encryption, and access control measures are enforced across systems
Automate environment setup, deployment, and management using Terraform, CI/CD, and cloud automation tools
Document data architecture, process workflows, and security requirements for operational governance
Stay current with emerging cloud, big data, and data governance trends, sharing insights to improve solutions
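Much of the pipeline monitoring and troubleshooting described above comes down to handling transient failures gracefully. The generic sketch below shows one common pattern, retrying a step with exponential backoff; the function and step names are hypothetical and not tied to any specific orchestration tool:

```python
import time

def run_with_retries(task, max_attempts=3, base_delay=0.01):
    """Run a pipeline step, retrying transient failures with
    exponential backoff. 'task' is any zero-argument callable;
    the last exception is re-raised once attempts are exhausted."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise
            # Exponential backoff: base_delay, 2x, 4x, ...
            time.sleep(base_delay * 2 ** (attempt - 1))

# Demo: a step that fails twice, then succeeds on the third attempt
attempts = {"n": 0}

def flaky_step():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("transient failure")
    return "loaded 1000 rows"

print(run_with_retries(flaky_step))
```

Production schedulers (Airflow, IDMC taskflows, etc.) provide this behavior as configuration; the sketch just makes the underlying logic explicit.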
Qualifications
Bachelor’s or Master’s degree in Data Engineering, Computer Science, or related technical discipline
6+ years supporting enterprise data platforms, cloud data migration, and big data solutions
Relevant certifications such as AWS Data Analytics, GCP Professional Data Engineer, or Azure Data Engineer are a plus
Proven experience designing compliant, secure, and high-performance data pipelines supporting regulated industries
Professional Competencies
Strong analytical and troubleshooting skills for complex data processing issues
Leadership qualities to guide junior engineers and promote best practices
Effective communication skills for stakeholder engagement and technical documentation
Adaptability to evolving cloud architectures, data standards, and regulatory requirements
Innovative mindset to leverage new tools, frameworks, and data management techniques
Time management skills to prioritize tasks effectively in a fast-paced environment
SYNECHRON’S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and is an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference’ is committed to fostering an inclusive culture – promoting equality, diversity and an environment that is respectful to all. As a global company, we strongly believe that a diverse workforce helps build stronger, more successful businesses. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant’s gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
Candidate Application Notice
Your application for this position will not be considered and will be removed from the employer’s inbox.