
Cloud Data Engineer | AWS, Snowflake, Airflow, Python, DataOps, CI/CD

Posted 30+ days ago · 2026/08/29
Other Business Support Services

Job description

Job Summary
Synechron is seeking an experienced Cloud Data Engineer to develop and optimize cloud-based data pipelines supporting enterprise analytics. In this role, you will leverage AWS services, orchestrate workflows with Airflow, execute data transformations with DBT, and manage Snowflake data warehouses. Your expertise will enable efficient and reliable data operations, supporting strategic decision-making and data-driven initiatives. You will collaborate with cross-functional teams to deliver scalable, high-quality data solutions aligned with organizational goals.


Software Requirements

  • Required: AWS (S3, Lambda, Glue, EC2, IAM), Apache Airflow, DBT, Snowflake, SQL (MySQL, PostgreSQL, or Snowflake), Git, CI/CD tools (Jenkins, GitLab CI)
  • Preferred: AWS CloudFormation/Terraform, Kubernetes, Prometheus, Grafana, DataOps tools, Python scripting
  • Experience level: 5+ years supporting enterprise data pipelines, cloud infrastructure, and automation in a large-scale environment


Overall Responsibilities

  • Design, build, and manage scalable data pipelines on AWS for analytics and reporting purposes
  • Develop and maintain workflows using Apache Airflow to orchestrate data ingestion, processing, and transformation tasks
  • Construct modular, testable data models and transformations using DBT, ensuring data quality and correctness
  • Manage and optimize Snowflake data warehouses to support high-performance analytics workloads
  • Collaborate across teams to gather requirements, validate data processes, and implement scalable solutions
  • Monitor data pipeline health, troubleshoot issues, and optimize for performance and reliability
  • Automate deployment, testing, and operational workflows to support continuous integration and delivery
  • Document architecture, workflows, and operational procedures for transparency and compliance
  • Conduct data validation, performance tuning, and capacity planning activities


Technical Skills (By Category)

  • Programming Languages:
    • Essential: SQL, Python (preferred), shell scripting for automation
    • Preferred: Java, JavaScript, or other scripting languages for data workflow automation
  • Data Management & Databases:
    • Snowflake, PostgreSQL, MySQL, and experience with data modeling, query optimization, and data validation
  • Cloud Technologies:
    • AWS (S3, Lambda, Glue, EC2, IAM), cloud deployment, security models, and multi-cloud concepts (preferred)
  • Frameworks & Libraries:
    • DBT, Apache Airflow, Prometheus, Grafana, data orchestration SDKs, security libraries (OAuth2, SAML)
  • Development & Automation Tools:
    • Git, Jenkins, Terraform, CloudFormation, CI/CD pipelines, containerization (Docker, Kubernetes)
  • Security & Compliance:
    • Data encryption, IAM roles, access controls, compliance standards (GDPR, HIPAA, SOC)


Experience Requirements

  • 5+ years supporting cloud data pipelines, data warehouse management, and automation in large environments
  • Proven experience designing, building, and operating scalable, high-availability data workflows in AWS
  • Expertise in Snowflake data warehousing and data transformation using DBT
  • Strong troubleshooting and performance tuning skills, with a focus on data throughput and reliability
  • Industry experience in finance, banking, fintech, or enterprise analytics is preferred; equivalent large-scale data support experience in other sectors is acceptable


Day-to-Day Activities

  • Develop and optimize cloud data pipelines supporting enterprise dashboards and analytics
  • Collaborate with data scientists, analysts, and engineering teams to define and implement data workflows
  • Monitor system health using Prometheus, Grafana, and cloud-native tools; troubleshoot operational issues
  • Automate deployment, configuration, and scaling activities with Infrastructure as Code (Terraform, CloudFormation)
  • Conduct root cause analysis and performance tuning for data pipelines and warehouse systems
  • Maintain operational documentation, runbooks, and process standards
  • Support cloud migration, system upgrades, and security compliance tasks
  • Implement automation to reduce manual intervention and improve reliability


Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field
  • 5+ years supporting enterprise-level cloud data infrastructure and pipelines
  • Certifications such as AWS Data Analytics Specialty, AWS Solutions Architect, or equivalent are advantageous
  • Hands-on experience with AWS, Snowflake, Airflow, DBT, and SQL optimization
  • Proven record of supporting data platforms in regulated or high-availability environments
  • Strong troubleshooting, analytical, and communication skills
  • Ability to work independently and coordinate with cross-functional teams


Professional Competencies

  • Analytical mindset with a focus on data quality, performance, and operational stability
  • Leadership and teamwork capabilities for cross-team collaboration and mentoring
  • Effective communication to translate technical outcomes for diverse audiences
  • Adaptability and continuous learning to stay current with emerging data technologies
  • Ownership of data reliability, security, and process automation
  • Excellent time management and prioritization skills to meet project deliverables



SYNECHRON’S DIVERSITY & INCLUSION STATEMENT

Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative, ‘Same Difference’, is committed to fostering an inclusive culture that promotes equality, diversity, and respect for all. As a global company, we strongly believe that a diverse workforce helps build stronger, more successful businesses. We encourage applicants of all backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and abilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.



All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to an applicant’s gender, gender identity, sexual orientation, race, ethnicity, disability or veteran status, or any other characteristic protected by law.


Candidate Application Notice


This job post has been translated by AI and may contain minor differences or errors.
