Building MLOps Infrastructure at Japan's Largest C2C E-Commerce Site
06-20, 10:00–10:40 (Europe/Berlin), Kesselhaus

The MLOps infrastructure we built to support ML in search at Mercari, Japan’s largest C2C e-commerce platform.


We describe the system we built to support ML in search at Mercari, Japan’s largest C2C e-commerce platform. We start with the journey of introducing ML into a “traditional” term-based search infrastructure with high throughput and strict latency requirements. We then discuss the mixed blessing of rushing a successful proof of concept into production and the technical challenges this created on the infrastructure side.

Next, we walk through the nuts and bolts of data engineering, ETLs, training pipelines, and serving and monitoring our ML model in production. We also examine some of the weaknesses of our initial homegrown system, particularly around A/B testing and model monitoring. Finally, we cover our efforts to evolve this homegrown system into a more modern MLOps infrastructure, using an A/B testing framework and Seldon for traffic routing and model serving.

I am a machine learning engineer at Mercari, living and working in Japan. My main professional interest these days is running machine learning in production at scale, and the particular challenges this poses.

Teo is a machine learning engineer in the AI & Search division of Mercari, Japan’s largest C2C marketplace. He currently works across various business-critical projects and helps establish foundational MLOps processes and best practices across the organization.