Data & MLOps Engineer

The Data & AI department brings together expertise in Data Science with Generative AI, Machine Learning & Predictive Analytics, On-/Offsite Analytics & Tracking, Data Engineering, and Enterprise Reporting.
We're on a mission to embed a data-driven mindset, champion customer-centricity in everything we do, and unlock sustainable business value through innovative AI-driven solutions.
This role will contribute to:
- Reinforcing our "Machine Learning & Predictive Analytics" team to strengthen our MLOps and data infrastructure capabilities
- Scaling the technical foundation for ML operations across strategic business areas
- Meeting growing internal demand for robust, production-ready ML systems and data pipelines
- Enabling data-driven transformation at Tchibo through improved technical infrastructure and MLOps maturity
- Supporting Data Science teams with enhanced deployment capabilities and operational excellence
Responsibilities:
- Development and enhancement of our MLOps framework and its rollout across ML projects
- Building and maintaining ETL/ML pipelines using GCP services, Airflow, and modern orchestration tools (see the pipeline sketch after this list)
- Implementation of CI/CD processes and GitLab pipeline development for automated ML deployments
- Infrastructure management using Terraform, containerization (Docker), and cloud-native GCP services
- Technical support for Data Science teams in MLOps/DevOps implementation and best practices
- Monitoring and optimization of production ML systems for stability, performance, and scalability
- API development and integration to enable seamless data flow and model serving
- Collaboration with cross-functional teams to improve ML system maturity and operational excellence
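To give a concrete flavor of the pipeline work above, here is a minimal sketch of an Airflow DAG that chains an extraction step into a training step. It assumes Airflow 2.4+ and the GCP stack named in the posting; every identifier in it (DAG id, task ids, callables) is a hypothetical placeholder, not one of Tchibo's actual pipelines.

```python
# Minimal Airflow DAG sketch: an ETL task feeding a model-training task.
# All names (dag_id, task_ids, callables) are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    # Placeholder: a real task might stage BigQuery data in GCS.
    print("extracting orders ...")


def train_model():
    # Placeholder: a real task might submit a Vertex AI training job.
    print("training model ...")


with DAG(
    dag_id="orders_ml_pipeline",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # the 'schedule' argument requires Airflow 2.4+
    catchup=False,
):
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    train = PythonOperator(task_id="train_model", python_callable=train_model)

    extract >> train  # run extraction before training
```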
Requirements:
- Bachelor's/Master's degree in Computer Science, Software Engineering, or an equivalent field
- 3-5+ years of proven experience in Data Engineering, MLOps, or DevOps environments
- Demonstrated experience in agile working methodologies and cross-functional teams
- Extensive experience in data processing, ETL & ML pipeline development and orchestration
- Strong software development proficiency in Python and SQL with clean code principles
- Hands-on experience with GitLab CI/CD, developing CI components and automated pipelines
- Deep expertise in the GCP ecosystem and its native services: GCS, BigQuery, Cloud Run, Vertex AI
- Proficient in containerization technologies (Docker, dev-containers) and microservices
- Skilled in Infrastructure as Code (Terraform), dbt, and Airflow orchestration
- Experienced in API development and modern DevOps practices (see the serving sketch after this list)
- Collaborative team player with a strong service mindset towards Data Science teams
- Pragmatic problem-solver who balances technical excellence with business needs
- Strategic thinker who maintains oversight in complex technical landscapes
- Self-driven and organized approach to managing multiple priorities
- Full professional proficiency in English
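As a hedged illustration of the API-development requirement above, the sketch below exposes a toy prediction endpoint with FastAPI. The framework, route, and request schema are assumptions made for illustration; the posting names no specific serving stack.

```python
# Toy model-serving endpoint; FastAPI is an assumption, not named in the posting.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class PredictRequest(BaseModel):
    features: list[float]  # hypothetical input schema


@app.post("/predict")
def predict(req: PredictRequest) -> dict:
    # Placeholder scoring: a real service would load and call a trained model.
    score = sum(req.features) / max(len(req.features), 1)
    return {"score": round(score, 4)}
```

Run locally with `uvicorn app:app --reload` and POST a JSON body such as `{"features": [0.2, 0.8]}` to `/predict`.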
Benefits:
- 26 days of vacation
- Meal vouchers
- Christmas vouchers
- Medical subscription + health insurance
- Transport allowance
Only candidates considered suitable for the role will be contacted for an interview.
All your data is treated as confidential, regardless of the status of the recruitment process.