Migrate Data, <Mesh> in mind
06-19, 10:40–11:00 (Europe/Berlin), Palais Atelier

For quite some time, Hadoop served as the data warehouse for Kleinanzeigen. Eventually, however, the central teams decided to say goodbye to this old friend because it had grown outdated and costly to run. The migration gave us a valuable opportunity to embrace a Data Mesh strategy and build a new data pipeline. In this presentation, we give an overview of our approach: a cloud-based data pipeline built with dbt and Airflow. We will also delve into the challenges we faced along the way, including debugging legacy data flows, the complexities of copying data to S3, and resolving domain ownership issues. By sharing these experiences, we aim to offer useful insights from our journey.
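As a flavour of the kind of setup the talk covers, below is a minimal, hypothetical sketch of an Airflow DAG that lands raw data in S3 and then runs and tests dbt models. The bucket name, paths, and schedule are placeholders for illustration, not the actual Kleinanzeigen pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Illustrative DAG: copy raw exports to S3, then build and test dbt models.
# All paths, the bucket, and the schedule are assumed placeholders.
with DAG(
    dag_id="dbt_daily_run",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Stage raw data in S3 (hypothetical source path and bucket).
    extract_to_s3 = BashOperator(
        task_id="extract_to_s3",
        bash_command="aws s3 cp /data/export/ s3://example-bucket/raw/ --recursive",
    )

    # Build the dbt models once the raw data has landed.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/project --profiles-dir /opt/dbt",
    )

    # Validate the freshly built models with dbt tests.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/project --profiles-dir /opt/dbt",
    )

    extract_to_s3 >> dbt_run >> dbt_test
```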

See also: Slides (2.5 MB)

Aydan Rende is a Senior Data Engineer on the platform team at Kleinanzeigen. In this role, she develops data pipelines and helps teams meet their data requirements. Aydan began her professional journey at Kleinanzeigen as a Software Engineer working on commercial products. Outside of work, she is a Formula 1 fan with an on-and-off relationship with Ferrari.