Job Description

Business Title: Data Engineer
Years of Experience: 3 to 7 years
Job Description:
We are looking for a hands-on Senior Data Engineer (AWS) with experience developing, building, and maintaining scalable, secure, and high-performance data platforms on AWS. This is an individual contributor role focused on data pipeline development, cloud data engineering, and analytics enablement. The role requires strong hands-on skills in AWS data services, SQL, and Python, along with experience building reliable batch and streaming data pipelines in a global delivery environment.
Must-Have Skills

Cloud & Data Engineering (AWS)
- Strong hands-on experience with AWS data services, including:
  - Amazon S3
  - AWS Glue
  - Amazon Athena
  - Amazon Redshift
  - Amazon EMR
- Experience designing cloud-native data lakes and data warehouse architectures
- Solid understanding of batch data pipelines and basic exposure to streaming concepts

SQL & Python (Mandatory)
- Strong SQL skills: writing complex queries, joins, aggregations, and transformations
- Experience working with large datasets in Redshift / Athena
- Strong Python skills: Python for data engineering and ETL use cases
- Experience with PySpark / Spark is a strong plus
- Good understanding of data modeling, transformations, and performance tuning

Data Processing & Engineering
- Hands-on experience with distributed data processing frameworks (Spark / PySpark)
- Experience handling structured and semi-structured data
- Understanding of schema evolution, data quality checks, and validation logic

DevOps & Platform Basics
- Working knowledge of Infrastructure as Code (Terraform and/or CloudFormation)
- Basic experience with CI/CD pipelines for data workloads
- Understanding of logging and monitoring using CloudWatch

Collaboration
- Ability to work closely with architects, DevOps, QA, and business stakeholders
- Good communication skills to explain technical concepts clearly
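As a concrete illustration of the SQL skills listed above (joins, aggregations, transformations over warehouse tables), here is a minimal self-contained sketch using Python's built-in sqlite3 module as a stand-in for Redshift or Athena. The table names and sample data are invented for the example and are not part of this posting.

```python
import sqlite3

# In-memory database standing in for a warehouse (illustrative only).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    CREATE TABLE customers (customer_id INTEGER, region TEXT);
    INSERT INTO orders VALUES (1, 10, 120.0), (2, 10, 80.0), (3, 11, 50.0);
    INSERT INTO customers VALUES (10, 'EMEA'), (11, 'APAC');
""")

# A join plus aggregation: total order amount per customer region.
rows = conn.execute("""
    SELECT c.region, SUM(o.amount) AS total_amount
    FROM orders o
    JOIN customers c ON c.customer_id = o.customer_id
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()

print(rows)  # [('APAC', 50.0), ('EMEA', 200.0)]
```

In a real Redshift or Athena workload the same query shape applies; only the connection layer and data volumes change.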
Good-to-Have Skills
- Exposure to streaming technologies such as Amazon Kinesis or Kafka
- Familiarity with Lakehouse and modern data platform patterns
- Experience integrating AWS data platforms with BI / reporting tools
- Basic knowledge of data governance, data quality, and metadata concepts
- Awareness of AWS cost optimization best practices
- Experience working in Agile delivery models with global clients
- Exposure to AI / ML
Key Responsibilities

Data Engineering & Development
- Design and build scalable ETL / ELT pipelines on AWS
- Develop SQL-based data transformations and Python-based data pipelines
- Implement data ingestion pipelines using AWS services such as S3, Glue, and EMR
- Build data models optimized for analytics, performance, and cost efficiency

Platform & Operations
- Support deployment and execution of data pipelines across environments
- Monitor pipeline performance, reliability, and data quality
- Troubleshoot data pipeline issues and perform root-cause analysis
- Apply best practices for security, reliability, and scalability

Collaboration & Delivery
- Work closely with architects and product teams to understand requirements
- Translate business and analytics needs into working AWS data solutions
- Contribute to documentation, code reviews, and engineering standards
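The responsibilities above mention monitoring data quality and applying validation logic before loading. A minimal sketch of such a check in plain Python follows; the schema, field names, and rules are invented for illustration and are not from this posting.

```python
# Minimal data-quality check: validate incoming records against an
# expected schema and a simple business rule before loading them.
# Schema and rules here are illustrative only.
EXPECTED_SCHEMA = {"order_id": int, "customer_id": int, "amount": float}

def validate_record(record: dict) -> list:
    """Return a list of human-readable validation errors (empty if valid)."""
    errors = []
    for field, expected_type in EXPECTED_SCHEMA.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(f"{field}: expected {expected_type.__name__}")
    # Only apply value rules once the schema itself checks out.
    if not errors and record["amount"] < 0:
        errors.append("amount must be non-negative")
    return errors

good = {"order_id": 1, "customer_id": 10, "amount": 120.0}
bad = {"order_id": 2, "amount": -5.0}

print(validate_record(good))  # []
print(validate_record(bad))   # ['missing field: customer_id']
```

In a production pipeline the same pattern would typically run as a pre-load step, routing failing records to a quarantine location rather than raising.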
Education Qualification
1. Bachelor's or Master's degree, or equivalent
Certifications (If Any)
1. AWS Certified Solutions Architect / DevOps – Professional
2. Snowflake Core

Shift Timing: 12 PM to 9 PM and/or 2 PM to 11 PM (IST)
Location: DGS India - Pune - Kharadi EON Free Zone

Brand: Merkle

Time Type: Full time

Contract Type: Permanent
This job post has been translated by AI and may contain minor differences or errors.
