
Cloud Data Engineer | Snowflake, DBT, AWS/Azure/GCP, Large-scale Pipelines, Data Governance, Automation

30+ days ago 2026/08/28
Other Business Support Services

Job description

Job Summary
Synechron is seeking an experienced Cloud Data Engineer to design, develop, and maintain scalable, cloud-native data platforms that support enterprise analytics and data modernization initiatives. The role involves building robust ELT pipelines, optimizing data models, and implementing data governance standards across cloud environments such as AWS, GCP, or Azure. The successful candidate will collaborate with analytics, data science, and platform teams to deliver data-driven insights that support operations, strategic planning, and innovation.


Software Requirements


  • Required:


    • In-depth experience with Snowflake data platform for scalable data warehousing solutions


    • Hands-on expertise in DBT for data transformations, testing, and documentation


    • Strong knowledge of AWS cloud services such as S3, IAM, Glue, and supporting data workflows (GCP or Azure experience preferred)


    • Practical experience with orchestration tools such as Apache Airflow for pipeline management


    • Advanced SQL skills for data modeling, query tuning, and performance optimization


    • Python proficiency for scripting, automation, and data processing tasks


    • Knowledge of big data ecosystem tools such as Spark, Hadoop, or NiFi (preferred)


  • Preferred:


    • Experience with Infrastructure as Code tools such as Terraform or CloudFormation


    • Familiarity with BI tools like Power BI or Tableau for reporting integrations


    • Exposure to data governance, security standards, and compliance frameworks (GDPR, HIPAA, etc.)
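Orchestration tools like the Apache Airflow mentioned above express a pipeline as a directed acyclic graph of tasks, which the scheduler then runs in dependency order. A minimal stdlib sketch of that idea (the task names are hypothetical, not from any actual pipeline):

```python
from graphlib import TopologicalSorter

# Hypothetical ELT tasks mapped to the tasks they depend on,
# mirroring how an Airflow DAG declares dependencies.
dag = {
    "stage": {"extract"},       # stage runs after extract
    "transform": {"stage"},     # transform runs after stage
    "publish": {"transform"},   # publish runs last
}

# Resolve a valid execution order, as a scheduler would.
order = list(TopologicalSorter(dag).static_order())
print(order)  # "extract" first, "publish" last
```

Airflow adds scheduling, retries, and backfills on top of this core idea, but the dependency-resolution model is the same.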


Overall Responsibilities


  • Design, build, and optimize scalable data pipelines within cloud environments supporting enterprise analytics and reporting


  • Develop data transformation workflows using DBT, ensuring accuracy, quality, and documentation


  • Collaborate with data scientists, analytics teams, and platform engineers to support data ingestion, feature engineering, and ML workflows


  • Monitor pipeline performance, troubleshoot issues, and implement enhancements for efficiency and resilience


  • Support cloud migration efforts, including multi-region and hybrid architectures for data platforms


  • Enforce data security, privacy, and governance policies across pipelines and data stores


  • Automate data workflows and infrastructure deployment through CI/CD pipelines


  • Document system architecture, data schemas, transformation logic, and operational procedures
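The DBT workflows listed above pair each transformation with declarative data tests such as not_null and unique. What those tests actually verify can be sketched in plain Python (the table and columns here are invented for illustration):

```python
# Hypothetical transformed rows, as a DBT model might produce.
rows = [
    {"order_id": 1, "customer_id": 10, "total": 25.0},
    {"order_id": 2, "customer_id": 11, "total": 40.0},
    {"order_id": 3, "customer_id": 10, "total": 15.5},
]

def not_null(rows, column):
    """DBT-style not_null test: no missing values in the column."""
    return all(r[column] is not None for r in rows)

def unique(rows, column):
    """DBT-style unique test: no duplicate values in the column."""
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

checks = {
    "order_id_not_null": not_null(rows, "order_id"),
    "order_id_unique": unique(rows, "order_id"),
}
print(checks)  # both checks pass for this sample
```

In DBT itself these checks are declared in a model's YAML schema file rather than written by hand, and failures block the downstream run.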


Technical Skills (By Category)


  • Languages & Scripts:
    Required: Python, SQL (PostgreSQL, MySQL, or equivalent), Bash for automation
    Preferred: Scala, R, or Java for supporting big data integrations


  • Data Management & Storage:
    Snowflake, data modeling, query optimization, data security and governance best practices


  • Cloud Platforms:
    AWS (S3, Glue, Redshift), GCP (BigQuery, Dataflow), or Azure, supporting migration and scaling


  • Frameworks & Ecosystems:
    DBT, Spark (PySpark), Hadoop, NiFi (preferred)


  • Orchestration & Automation:
    Apache Airflow, Terraform, CloudFormation, Jenkins, Git, and CI/CD pipelines for data deployment


  • Security & Compliance:
    Data encryption, access control, GDPR/HIPAA compliance standards, audit logging
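The query tuning and optimization skills listed above can be demonstrated with a self-contained SQLite example (table and index names are invented; warehouse platforms like Snowflake or Redshift use different mechanisms, such as clustering keys, but the scan-versus-index-search trade-off is the same):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT * FROM orders WHERE customer_id = 7"

# Before indexing: the planner must scan the whole table.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()[0][3]

# Adding an index on the filter column lets the planner search it instead.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()[0][3]

print(before)  # a SCAN over orders
print(after)   # a SEARCH using idx_orders_customer
```

Reading the plan before and after a change, rather than guessing, is the core habit behind the "query tuning and performance optimization" requirement.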


Experience Requirements


  • 5+ years supporting or developing large-scale, enterprise data pipelines in cloud environments


  • Proven success in optimizing data workflows, data modeling, and pipeline automation


  • Hands-on experience with cloud data platforms like Snowflake and supporting data ecosystems (Spark, Hadoop, NiFi)


  • Experience supporting data governance, security, and compliance initiatives (GDPR, HIPAA, etc.)


  • Strong background supporting data science, analytics, or ML workflows (preferred)


Day-to-Day Activities


  • Develop, tune, and support scalable data pipelines for enterprise analytics and reporting


  • Collaborate with cross-functional teams to gather data requirements and translate them into technical solutions


  • Automate data ingestion, transformation, and deployment workflows using CI/CD practices


  • Troubleshoot data pipeline issues, perform performance tuning, and implement security controls


  • Support cloud migration efforts, data governance initiatives, and pipeline automation projects


  • Document architecture, schemas, and operational procedures


  • Monitor system health, data quality, and compliance status to ensure operational stability and security standards


Qualifications


  • Bachelor’s or Master’s degree in Data Science, Computer Science, or related disciplines


  • 5+ years of experience supporting enterprise data platforms in cloud environments


  • Certifications such as GCP Professional Data Engineer or AWS Data Analytics (or equivalent) are advantageous


  • Proven experience in building, managing, and optimizing large-scale data pipelines supporting analytics and ML workflows


Professional Competencies


  • Analytical problem-solving skillset, particularly for optimizing data workflows at scale


  • Effective communication with technical teams and business stakeholders for data requirements and governance


  • Mentoring abilities to guide junior data engineers and promote best practices


  • Strategic thinking for designing scalable, secure cloud data ecosystems supporting enterprise needs


  • Adaptability and continuous learning to leverage new data technologies and compliance standards


  • Organization and time management skills for handling multiple data projects efficiently


SYNECHRON'S DIVERSITY & INCLUSION STATEMENT


Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference’ is committed to fostering an inclusive culture – promoting equality, diversity, and an environment that is respectful to all. As a global company, we strongly believe that a diverse workforce helps build stronger, more successful businesses. We encourage applicants of all backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and abilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.



All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant’s gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.


Candidate Application Notice


This job post has been translated by AI and may contain minor differences or errors.
