
GCP Data Engineer, AVP

30+ days ago 2026/09/10
Other Business Support Services

Job description


Job Title: GCP Data Engineer, AVP


Location: Pune, India


Role Description:


This is a GCP Data Engineer role covering end-to-end development of integrations, data preparation, synthesis, and transformation, data quality, and data storage workloads, along with use of the prepared data for advanced data analytics (which may include some AI use cases). The role also encompasses L3 support activities for the implemented workloads.


What we’ll offer you


As part of our flexible scheme, here are just some of the benefits that you’ll enjoy


  • Best-in-class leave policy
  • Gender-neutral parental leave
  • 100% reimbursement under the childcare assistance benefit (gender neutral)
  • Sponsorship for industry-relevant certifications and education
  • Employee Assistance Program for you and your family members
  • Comprehensive hospitalization insurance for you and your dependents
  • Accident and term life insurance
  • Complimentary health screening for employees aged 35 and above

Your key responsibilities:


  • Design, develop, and maintain data pipelines on GCP using Python and SQL.
  • Develop and integrate APIs.
  • Write and optimize SQL queries.
  • Apply strong Python programming skills.
  • Use GCP Cloud Composer or Apache Airflow for orchestration.
  • Build streaming data pipelines hands-on.
  • Work with near-real-time services such as Pub/Sub or Kafka.
  • Develop and host applications on cloud platforms using Docker (preferably GCP Cloud Run).
  • Show a willingness to accept failures, learn, and try again.
  • Apply Agile methodologies along with ETL, ELT, data movement, and data processing skills.
  • Work with Cloud Composer to manage and process batch data jobs efficiently.
  • Develop and optimize complex SQL queries for data analysis, extraction, and transformation.
  • Develop and deploy Google Cloud services using Terraform.
  • Consume and host REST APIs using Python.
  • Build and deploy AI and RAG pipelines.
  • Design, build, and deploy agentic workflows.
  • Operationalize AI applications.
  • Apply knowledge of RAG, vector databases, multi-agent systems, and agentic frameworks (ADK), along with the A2A protocol and agent cards.
  • Work with the Vertex AI platform.
  • Implement CI/CD pipelines using GitHub Actions.
  • Monitor and troubleshoot data pipelines, resolving any issues in a timely manner.
  • Collaborate with the team using Jira, Confluence, and other tools.
  • Quickly learn new and existing technologies; apply strong problem-solving skills.
  • Write advanced SQL and Python scripts.
  • Strategize novel approaches for designing, implementing, and deploying robust, scalable AI systems.
  • Certification as a Google Cloud Professional Data Engineer is an added advantage.
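As a rough illustration of the data preparation and data quality work described above (the table schema, field names, and quality rule here are hypothetical, and the Cloud Composer/Airflow task wiring that would invoke such a function is omitted), a pipeline step might look like:

```python
import csv
import io


def prepare_records(raw_csv: str) -> list[dict]:
    """Parse raw CSV input, drop rows that fail a basic data-quality check,
    and normalise field types for downstream loading (e.g. into BigQuery)."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    prepared = []
    for row in reader:
        # Data-quality rule (illustrative): skip rows whose amount is
        # missing or not a plain decimal number.
        amount = (row.get("amount") or "").strip()
        if not amount.replace(".", "", 1).isdigit():
            continue
        prepared.append({
            "trade_id": row["trade_id"].strip(),
            "amount": float(amount),
        })
    return prepared


raw = "trade_id,amount\nT1,100.5\nT2,\nT3,abc\nT4,7"
print(prepare_records(raw))
```

In an Airflow DAG, a function like this would typically run inside a task, with the cleansed output written to storage rather than returned in memory.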

Your skills and experience


  • 10+ years of IT experience as a hands-on technologist.
  • Proficient in Python.
  • Proficient in SQL.
  • Hands-on experience with GCP Cloud Composer, Dataflow, BigQuery, Cloud Functions, Cloud Run, and Google ADK; GKE is good to have.
  • Hands-on experience building and managing AI workloads, and hosting and consuming REST APIs.
  • Proficient in Terraform (HashiCorp).
  • Experienced with Git and GitHub Actions.
  • Experienced in CI/CD.
  • Experience automating ETL testing using Python and SQL.
  • Apigee is good to have.
  • Bitbucket is good to have.
  • Understanding of LLMs (e.g. Gemini) and embedding models.
  • Experienced in prompt engineering.
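One of the skills listed above is automating ETL testing with Python and SQL. A minimal sketch of that idea, reconciling a source table against its transformed target (SQLite stands in here for a real warehouse such as BigQuery, and the table names are hypothetical):

```python
import sqlite3


def run_etl_check(conn: sqlite3.Connection) -> dict:
    """Automated ETL reconciliation: compare row counts and summed amounts
    between a source table and its transformed target table."""
    src_count, src_sum = conn.execute(
        "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM source_trades"
    ).fetchone()
    tgt_count, tgt_sum = conn.execute(
        "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM target_trades"
    ).fetchone()
    return {
        "counts_match": src_count == tgt_count,
        "sums_match": abs(src_sum - tgt_sum) < 1e-9,
    }


# In-memory demo; in practice the connection would point at the warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_trades (id TEXT, amount REAL);
    CREATE TABLE target_trades (id TEXT, amount REAL);
    INSERT INTO source_trades VALUES ('T1', 100.5), ('T2', 7.0);
    INSERT INTO target_trades VALUES ('T1', 100.5), ('T2', 7.0);
""")
print(run_etl_check(conn))
```

Checks like this are typically wrapped in a test framework (e.g. pytest) and run as part of the CI/CD pipeline after each ETL deployment.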

How we’ll support you


  • Training and development to help you excel in your career
  • Coaching and support from experts in your team
  • A culture of continuous learning to aid progression
  • A range of flexible benefits that you can tailor to suit your needs

About us and our teams


Please visit our company website for further information:


https://www.db.com/company/company.html


We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively.


Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group.


We welcome applications from all people and promote a positive, fair and inclusive work environment.






This job post has been translated by AI and may contain minor differences or errors.
