Job description

The Enterprise Data and AI organization unites the data governance, strategy, engineering, and product teams with those responsible for AI engineering, generative AI enablement, and automation product and engineering. This group plays a pivotal role in leveraging data as a core driver of innovation and integrating AI capabilities to transform products, operations, and customer experiences.


The EDAI organization also incorporates technology Research & Development and experimentation with emerging capabilities, along with engineering support for Amex Digital Labs. This integration ensures that research breakthroughs seamlessly translate into business impact.


Purpose of the Role:


LUMI is the company's largest Big Data platform, well suited for computationally and data-intensive processing applications. Whether data needs to be processed in batch, online, or streaming fashion, LUMI provides robust capabilities to handle such workloads effectively and cost-efficiently.


The platform is a hub of hardworking Big Data engineers and some of the most exciting emerging technologies. The Cornerstone platform offers an environment where engineers are challenged every day to build world-class products.


As we embark on the journey to the public cloud (GCP), you will be part of a fast-paced Agile team that designs, develops, tests, troubleshoots, and optimizes solutions created to simplify access to Amex's Big Data platform.



At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. From delivering differentiated products to providing world-class customer service, we operate with a strong risk mindset, ensuring we continue to uphold our brand promise of trust, security, and service.


As part of Team Amex, you’ll experience our powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.



Responsibilities:
  • Engineer data solutions using massively parallel processing (MPP) systems to support large-scale data processing (terabytes to petabytes) across the enterprise.
  • Design enterprise-scale data warehouses or data lakes.
  • Design scalable data platforms on Google Cloud Platform.
  • Design, optimize, and manage large-scale datasets in BigQuery.
  • Manage BigQuery compute including slot management, reservations, and workload isolation using BigQuery Editions / Reservations API.
  • Work with structured and unstructured data, enabling analytics across diverse data types and storage systems.
  • Implement fine-grained access control (IAM, row-level security, column-level security), data masking, policy tags, and compliance frameworks.
  • Develop multi-tenant data architectures supporting cross-team and cross-organization data access.
  • Develop data access policies, data governance, and compliance controls on Google Cloud Platform.
  • Lead development, optimization, and maintenance of disaster recovery plans and business continuity strategies, ensuring systems can recover quickly and effectively from unexpected disruptions.
  • Communicate and collaborate with business and product teams to facilitate changes and implementation.
  • Coach and potentially lead junior engineers.
  • Consistently question assumptions, challenge the status quo, and strive for improvement.
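To make the BigQuery responsibilities above concrete, here is a minimal, hedged sketch of two of them: a date-partitioned, clustered table and a row-level security policy. All dataset, table, policy, and group names are hypothetical illustrations, and the schema is invented; in practice the generated DDL would be run through the BigQuery client libraries or the bq CLI rather than printed.

```python
# Sketch only: builds BigQuery DDL strings for a partitioned, clustered
# table and a row access policy. Names and schema are hypothetical.

def partitioned_table_ddl(dataset, table, partition_col, cluster_cols):
    """CREATE TABLE partitioned on a DATE column and clustered on
    high-cardinality filter columns, so queries prune scanned bytes."""
    clusters = ", ".join(cluster_cols)
    return (
        f"CREATE TABLE `{dataset}.{table}` (\n"
        f"  event_date DATE,\n"      # hypothetical schema
        f"  account_id STRING,\n"
        f"  region STRING,\n"
        f"  amount NUMERIC\n"
        f")\n"
        f"PARTITION BY {partition_col}\n"
        f"CLUSTER BY {clusters}"
    )

def row_access_policy_ddl(dataset, table, policy, grantee, predicate):
    """CREATE ROW ACCESS POLICY so the grantee sees only rows matching
    the filter predicate (row-level security)."""
    return (
        f"CREATE ROW ACCESS POLICY {policy}\n"
        f"ON `{dataset}.{table}`\n"
        f"GRANT TO ('{grantee}')\n"
        f"FILTER USING ({predicate})"
    )

ddl = partitioned_table_ddl("analytics", "transactions",
                            "event_date", ["account_id"])
policy = row_access_policy_ddl("analytics", "transactions", "emea_only",
                               "group:emea-analysts@example.com",
                               "region = 'EMEA'")
print(ddl)
print(policy)
```

Partitioning by date plus clustering on the most common filter column is a standard cost lever in BigQuery: queries that filter on `event_date` and `account_id` scan only the matching partitions and blocks.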

Qualifications:
  • 10+ years of experience in data engineering
  • Deep expertise in Google Cloud Platform, particularly the BigQuery warehouse, including:
  • Data sharing (Authorized Views, Analytics Hub / Data Exchange)
  • Query optimization, partitioning, clustering
  • Workload management (slots, reservations, concurrency control)
  • Expert-level SQL skills for large-scale analytical processing
  • Strong programming experience in Python for building data pipelines, plus shell scripting for automation
  • Hands-on experience with secure data governance and access control frameworks
  • Hands-on experience implementing secure data sharing mechanisms using BigQuery Authorized Views, Data Sharing, and Analytics Hub (Data Exchange)
  • Hands-on experience with GCP services (Dataflow, Pub/Sub, Cloud Storage, Composer/Airflow)
  • Proven experience designing scalable data models and shared data platforms
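The authorized-view sharing mechanism named in the qualifications above can be sketched as follows. The view exposes only selected columns and rows of a source table; the actual authorization step (adding the view to the source dataset's access list) is performed through the BigQuery API or console rather than SQL, so only the view DDL is generated here. All names are hypothetical.

```python
# Sketch only: builds the DDL for a BigQuery view that exposes a
# restricted slice of a source table, the basis of an authorized view.
# Dataset, table, and column names are hypothetical.

def restricted_view_ddl(share_dataset, view,
                        source_dataset, source_table,
                        columns, row_filter):
    """Build a view selecting only approved columns and rows.
    Authorizing the view against the source dataset is a separate
    API/console step not expressible in SQL."""
    cols = ", ".join(columns)
    return (
        f"CREATE OR REPLACE VIEW `{share_dataset}.{view}` AS\n"
        f"SELECT {cols}\n"
        f"FROM `{source_dataset}.{source_table}`\n"
        f"WHERE {row_filter}"
    )

view_sql = restricted_view_ddl(
    "partner_share", "txn_summary",
    "analytics", "transactions",
    ["event_date", "amount"], "region = 'EMEA'",
)
print(view_sql)
```

Because consumers query the view's dataset rather than the source dataset, column and row restrictions are enforced centrally, which is what makes authorized views a common building block for the multi-tenant sharing described in this role.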
This job post has been translated by AI and may contain minor differences or errors.
