Submitting more applications increases your chances of landing a job.

Here’s how busy the average job seeker was last month:

- Opportunities viewed
- Applications submitted

Keep exploring and applying to maximize your chances!

Looking for employers with a proven track record of hiring women?

Click here to explore opportunities now!
We Value Your Feedback

You are invited to participate in a survey designed to help researchers understand how best to match workers to the types of jobs they are searching for.

Would You Be Likely to Participate?

If selected, we will contact you via email with further instructions and details about your participation.

You will receive a $7 payment for completing the survey.



Talend Architect

Posted 3 days ago (2026/09/03)
Other Business Support Services

Job description

We are looking for a skilled Talend Architect to design and implement scalable data integration solutions using Talend (Qlik).
The candidate will be responsible for building robust data pipelines, ensuring data quality, and defining enterprise data architecture.
🔷 Key Responsibilities
- Design and implement end-to-end ETL/ELT pipelines using Talend
- Define data architecture for data warehouse, data lake, and integrations
- Ensure data quality, governance, and data lineage
- Lead and mentor Talend developers and data engineers
- Optimize performance of data jobs and pipelines
- Collaborate with business and technical teams on data requirements
- Work on cloud-based data platforms (AWS/Azure/GCP)
- Establish best practices, standards, and reusable frameworks

🔷 Required Skills
- Strong experience in Talend Data Integration / Talend Cloud (Qlik)
- Expertise in ETL/ELT, data modeling, and data warehousing concepts
- Hands-on experience with SQL and databases (Oracle, SQL Server, etc.)
- Experience with cloud platforms (AWS / Azure / GCP)
- Knowledge of Big Data tools (Spark, Hadoop) is a plus
- Experience in data quality, governance, and metadata management
- Good understanding of APIs and integration patterns

🔷 Good to Have
- Experience with Snowflake / Databricks
- Exposure to CI/CD, Git, and DevOps practices
- Knowledge of real-time/streaming data (Kafka, etc.)
This job post has been translated by AI and may contain minor differences or errors.
