Job Purpose
To drive data-led decision-making by leveraging data science and automation tools, with a focus on Microsoft Fabric and Azure Databricks.
Duties and Responsibilities
• ETL Development: Design and implement ETL pipelines using SQL on Microsoft Fabric, and design and maintain data integration pipelines in Azure Data Factory and Microsoft Fabric.
• Databricks Development: Build and manage Databricks notebooks using SQL.
• Database Proficiency: Strong knowledge of SQL and experience with relational databases such as SQL Server, MySQL, etc.
• Cloud Platform Familiarity (Preferred): Exposure to Azure cloud services and architecture best practices.
• Create, implement and monitor home loan offers across multiple channels.
• Methodically analyse data and convert complex data into a simple, readable format to enable business decision-making.
• Build, deploy and maintain data pipelines for offer management.
• Ensure data accuracy, quality and integrity across automated systems.
• Ensure offers adhere to risk policies and audit requirements.
Key Decisions / Dimensions
• Make critical decisions during production issues, including root cause analysis, quick fixes, and long-term resolutions.
• Prioritize and escalate support tasks effectively to minimize downtime and business impact.
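The ETL and data-quality duties above (build pipelines, deduplicate, enforce accuracy rules) can be sketched in miniature. This is an illustrative example only: sqlite3 stands in for the Fabric/Databricks warehouse layer, and the table and column names (`raw_offers`, `clean_offers`, `customer_id`, `amount`) are hypothetical, not taken from the posting.

```python
import sqlite3

# Minimal local ETL sketch. sqlite3 is a stand-in for SQL running on
# Microsoft Fabric or in a Databricks notebook; all names are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: a raw staging table, as it might arrive from a source system.
cur.execute("CREATE TABLE raw_offers (customer_id INTEGER, channel TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO raw_offers VALUES (?, ?, ?)",
    [(1, "email", 250000.0), (1, "sms", 250000.0), (2, "email", None)],
)

# Transform + Load: drop rows failing a basic quality rule (amount must be
# present), deduplicate per customer/amount, and load the clean table.
cur.execute("""
    CREATE TABLE clean_offers AS
    SELECT customer_id, MIN(channel) AS channel, amount
    FROM raw_offers
    WHERE amount IS NOT NULL
    GROUP BY customer_id, amount
""")

rows = cur.execute(
    "SELECT customer_id, channel, amount FROM clean_offers ORDER BY customer_id"
).fetchall()
print(rows)  # → [(1, 'email', 250000.0)]: customer 2 filtered, customer 1 deduplicated
conn.close()
```

The same shape scales up directly: the `WHERE`/`GROUP BY` quality and deduplication rules are plain SQL, which is what makes them portable between SQL Server, Fabric and Databricks SQL.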
Major Challenges
• Translate business requirements into technical solutions.
• Own end-to-end offer management, ensuring on-time execution and adherence to quality standards.
• Develop and maintain robust ETL pipelines and data integration modules across systems.
• Monitor and resolve performance bottlenecks in data workflows and programs.
• Establish best practices, standard operating procedures, and drive their implementation across teams.
• Coordinate with internal teams to troubleshoot and resolve issues efficiently.
• Manage workload through effective planning, prioritization, and progress tracking.
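One way to approach the "monitor and resolve performance bottlenecks" challenge above is to time each pipeline step and flag outliers. A minimal sketch, assuming a simple step-runner pattern; the step names, the `slow_threshold_s` knob, and the use of `print()` are illustrative (a real workflow would feed timings into proper monitoring/alerting):

```python
import time
from typing import Callable

def run_step(name: str, step: Callable[[], object], slow_threshold_s: float = 1.0):
    """Run one pipeline step, timing it and flagging slow runs.

    `slow_threshold_s` is an illustrative knob, not a real system setting.
    """
    start = time.perf_counter()
    result = step()
    elapsed = time.perf_counter() - start
    status = "SLOW" if elapsed > slow_threshold_s else "ok"
    print(f"[{status}] {name}: {elapsed:.3f}s")
    return result, elapsed

# Usage: wrap hypothetical extract/transform steps in the runner.
rows, t1 = run_step("extract", lambda: list(range(1000)))
total, t2 = run_step("transform", lambda: sum(rows))
```

Capturing per-step timings like this is what turns "the pipeline is slow" into "the transform step regressed", which is the starting point for root cause analysis.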
Required Qualifications and Experience
Educational Qualifications:
• Graduate or Post-Graduate in Computer Science, Information Technology, or Data Science/Technologies.
Work Experience:
• 0.5–1 year of hands-on data engineering experience.
Technical Expertise / Skills Keywords:
• Azure Databricks – SQL (Must Have)
• Azure Data Factory – for ETL & data integrations (Must Have)