
Databricks Solution Architect

Posted: 2026/09/03
Industry: Other Business Support Services

Job description

We’re seeking a hands-on Senior Consultant – Databricks with deep technical expertise in building and optimizing Lakehouse-based data and AI solutions.
This is a contract role based in Singapore, offering flexibility to work on-site with clients or remotely within the region.
In this role, you’ll design, develop, and operationalize Delta Lakehouse architectures using Databricks, driving real-world outcomes for enterprise customers.
You’ll take ownership of implementation tasks, lead technical delivery, and mentor engineering teams in best practices across data engineering, governance, and AI.
Key Responsibilities

- Design and implement scalable data pipelines using Delta Live Tables (DLT), Spark SQL, Python, or Scala.
- Optimize ETL, streaming, and ML workloads for performance, cost efficiency, and reliability.
- Administer and configure Databricks Workspaces, Unity Catalog, and cluster policies for secure, governed environments.
- Automate infrastructure and deployments using Terraform, Git, and CI/CD pipelines.
- Implement observability, cost optimization, and monitoring frameworks using tools like Splunk, Prometheus, or CloudWatch.
- Collaborate with customers to build AI and LLM solutions leveraging MLflow, DBRX, and Mosaic AI.
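As a rough illustration of the pipeline work described above, a minimal Delta Live Tables definition in Python might look like the sketch below. This is an assumption-laden sketch, not part of the posting: the source path and table names are hypothetical, and the code runs only inside a Databricks DLT pipeline (the runtime supplies `dlt` and `spark`), not as a standalone script.

```python
# Minimal Delta Live Tables sketch -- runs only inside a Databricks
# DLT pipeline, where the runtime provides `dlt` and `spark`.
# The source path and table names below are hypothetical.
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Raw orders ingested from cloud storage (hypothetical path).")
def orders_raw():
    # Auto Loader incrementally picks up new JSON files from the landing zone.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/orders/")
    )

@dlt.table(comment="Cleaned orders with a basic data-quality expectation.")
@dlt.expect_or_drop("valid_amount", "amount > 0")
def orders_clean():
    # Rows failing the expectation above are dropped and counted in metrics.
    return dlt.read_stream("orders_raw").where(col("order_id").isNotNull())
```

Expectations such as `expect_or_drop` are how DLT surfaces quality metrics per table, which ties into the reliability goals listed in the responsibilities.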
Required Skills & Experience

- Strong hands-on experience with Databricks, including workspace setup, notebooks, clusters, and job orchestration.
- Expertise in Delta Lake, DLT, Unity Catalog, and SQL Warehouses.
- Proficiency in Python or Scala for data engineering and ML workflows.
- Strong understanding of AWS, Azure, or GCP cloud ecosystems.
- Experience with Terraform automation, DevOps, and MLOps practices.
- Familiarity with monitoring and governance frameworks for large-scale data platforms.
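To make the Terraform automation item above concrete, a minimal configuration fragment using the databricks/databricks Terraform provider might define a governed cluster policy like the sketch below. The policy name and limits are assumptions for illustration, not requirements from the posting.

```hcl
# Hypothetical sketch using the databricks/databricks Terraform provider.
terraform {
  required_providers {
    databricks = {
      source = "databricks/databricks"
    }
  }
}

# A cluster policy capping auto-termination and worker count for cost control.
resource "databricks_cluster_policy" "governed" {
  name = "governed-small-clusters"
  definition = jsonencode({
    "autotermination_minutes" : { "type" : "range", "maxValue" : 60 },
    "num_workers"             : { "type" : "range", "maxValue" : 8 }
  })
}
```

Policies like this are typically applied through a CI/CD pipeline so that workspace governance changes are reviewed like any other code change.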
Nice to Have

- Experience developing AI/LLM pipelines and RAG architectures on Databricks.
- Exposure to Bedrock, OpenAI, or Hugging Face integrations.
- Databricks certifications (Data Engineer, Machine Learning, or Solutions Architect) preferred.
This job post has been translated by AI and may contain minor differences or errors.
