JOB-69ccee8860e9a
Vacancy title:
Data Pipeline Manager
[Type: FULL_TIME, Industry: Professional Services, Category: Computer & IT, Science & Engineering, Management]
Jobs at:
Yas Tanzania (https://yas.co.tz/)
Deadline of this Job:
Friday, April 10 2026
Duty Station:
Dar es Salaam | Dar es Salaam
Summary
Date Posted: Wednesday, April 1 2026, Base Salary: Not Disclosed
JOB DETAILS:
Let's grow together, become our DATA PIPELINE MANAGER
Bachelor's Degree in Computer Science, ICT, Engineering, or a related field; a Master's Degree is an added advantage.
A minimum of 5 years' experience in data engineering or software development, with at least 3 years focused on building production-grade data pipelines.
CORE RESPONSIBILITIES
- Design and maintain data pipelines in an agile environment, ensuring scalability while monitoring performance and preventing failures.
- Evaluate new technologies, infrastructure and tools to improve data processing and analysis capabilities.
- Modernise systems and workflows to enhance organisational data efficiency and resilience.
- Coordinate, schedule, and monitor complex workflows in data pipelines.
- Support data product delivery by ensuring reliable and reusable data pipelines.
- Implement rigorous automated testing, data quality checks, validation frameworks, and monitoring alerts to ensure high data integrity and availability.
- Oversee the selection and optimization of data processing tools and cloud infrastructure to balance performance and cost.
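As a sketch of what the "coordinate, schedule, and monitor complex workflows" responsibility can look like in practice (the runner, task names, and retry policy are illustrative only and not part of this posting; production teams typically use an orchestrator such as Apache Airflow):

```python
# Minimal pipeline-orchestration sketch: tasks declared with dependencies,
# executed in topological order, each with retries and failure tracking.
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps, max_retries=2):
    """tasks: name -> callable; deps: name -> set of upstream task names."""
    results, failures = {}, {}
    for name in TopologicalSorter(deps).static_order():
        # Skip tasks whose upstream dependencies already failed.
        if any(up in failures for up in deps.get(name, ())):
            failures[name] = "skipped: upstream failure"
            continue
        for attempt in range(1 + max_retries):
            try:
                results[name] = tasks[name]()
                break
            except Exception as exc:
                if attempt == max_retries:
                    failures[name] = str(exc)
    return results, failures

# Hypothetical extract -> validate -> load workflow.
tasks = {
    "extract": lambda: [1, 2, 3],
    "validate": lambda: True,
    "load": lambda: "loaded",
}
deps = {"validate": {"extract"}, "load": {"validate"}}
results, failures = run_pipeline(tasks, deps)
print(results)
print(failures)
```

The dependency graph drives the run order, and a failed task short-circuits everything downstream, which is the basic contract a workflow scheduler provides.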
COMPETENCES
- Programming Proficiency: Expert-level knowledge of Python or Java/Scala, alongside advanced SQL skills for complex querying and performance tuning.
- Data Handling: Experience handling large-scale, multi-source data in cloud or multi-platform environments.
- Cloud Expertise: Hands-on experience with major cloud platforms (AWS, Google Cloud Platform, or Azure) and their respective data services.
- Architectural Knowledge: Deep understanding of distributed computing principles, data modelling techniques and data warehouse architecture.
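The "automated testing, data quality checks" responsibility paired with the advanced SQL competence can be sketched as a simple duplicate-detection check (table and column names are hypothetical, using SQLite only so the example is self-contained):

```python
# Sketch of an SQL data-quality check: flag event IDs loaded more than once.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (event_id TEXT, loaded_at TEXT);
    INSERT INTO events VALUES
        ('a', '2026-04-01'), ('b', '2026-04-01'), ('a', '2026-04-02');
""")
# Any event_id appearing more than once is a data-integrity violation.
dupes = conn.execute("""
    SELECT event_id, COUNT(*) AS n
    FROM events
    GROUP BY event_id
    HAVING n > 1
""").fetchall()
print(dupes)
```

In a real pipeline a check like this would run after each load and raise a monitoring alert when it returns rows, rather than printing them.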
Work Hours: 8
Experience in Months: 36
Level of Education: Bachelor's Degree
Job application procedure
If this description matches you, grow with us by applying before April 10, 2026.