Cedric Renggli



I am a PhD candidate at ETH Zurich's DS3Lab, supervised by Ce Zhang.

My main research interest lies in the foundations of usable, scalable, and efficient data-centric systems that support all kinds of interactions in the machine learning model lifecycle, broadly known as MLOps. This notably includes the definition of new engineering principles, such as feasibility studies for ML application development or continuous integration (CI) for ML models. I am furthermore working on efficient methods that enable model search over collections of pre-trained models, which typically serve as a starting point for solving new machine learning tasks.

I hold a bachelor's degree from the Bern University of Applied Sciences and received my MSc in Computer Science from ETH Zurich in 2018. My work on Efficient Sparse AllReduce For Scalable Machine Learning was awarded the ETH Zurich silver medal for an outstanding master's thesis.

My office is located at STF G 222. Feel free to contact me by email at cedric.renggli@inf.ethz.ch.
You can also find me on Twitter and LinkedIn.

Publications


For an up-to-date list of my publications, check my profile on Google Scholar.

2021

A Data Quality-Driven View of MLOps
Cedric Renggli, Luka Rimanic, Nezihe Merve Gürel, Bojan Karlaš, Wentao Wu and Ce Zhang
IEEE Data Engineering Bulletin March 2021
Scalable Transfer Learning with Expert Models
Joan Puigcerver, Carlos Riquelme, Basil Mustafa, Cedric Renggli, Andre Susano Pinto, Sylvain Gelly, Daniel Keysers and Neil Houlsby
International Conference on Learning Representations (ICLR) 2021
Ease.ML: A Lifecycle Management System for MLDev and MLOps
Leonel Aguilar, ..., Cedric Renggli, ..., Wentao Wu and Ce Zhang
Conference on Innovative Data Systems Research (CIDR) 2021
Decoding EEG Brain Activity for Multi-Modal Natural Language Processing
Nora Hollenstein, Cedric Renggli, Benjamin Glaus, Maria Barrett, Marius Troendle, Nicolas Langer and Ce Zhang
arXiv preprint

2020

On Convergence of Nearest Neighbor Classifiers over Feature Transformations
Luka Rimanic*, Cedric Renggli*, Bo Li and Ce Zhang
Neural Information Processing Systems (NeurIPS) 2020
Which Model to Transfer? Finding the Needle in the Growing Haystack
Cedric Renggli, Andre Susano Pinto, Luka Rimanic, Joan Puigcerver, Carlos Riquelme, Ce Zhang and Mario Lucic
arXiv preprint
Ease.ml/snoopy: Towards Automatic Feasibility Study for Machine Learning Applications
Cedric Renggli*, Luka Rimanic*, Luka Kolar*, Nora Hollenstein, Wentao Wu and Ce Zhang
arXiv preprint
Observer Dependent Lossy Image Compression
Maurice Weber, Cedric Renggli, Helmut Grabner and Ce Zhang
German Conference on Pattern Recognition (DAGM-GCPR) 2020
Ease.ml/snoopy in Action: Towards Automatic Feasibility Analysis for Machine Learning Application Development
Cedric Renggli*, Luka Rimanic*, Luka Kolar, Wentao Wu and Ce Zhang
International Conference on Very Large Data Bases (VLDB) 2020, Demo
Building Continuous Integration Services for Machine Learning
Bojan Karlaš, Matteo Interlandi, Cedric Renggli, Wentao Wu, Ce Zhang, Deepak Mukunthu Iyappan Babu, Jordan Edwards, Chris Lauren, Andy Xu and Markus Weimer
Special Interest Group on Knowledge Discovery and Data Mining (SIGKDD) 2020, Applied Data Science, Oral Presentation 44/756

2019

SparCML: High-Performance Sparse Communication for Machine Learning
Cedric Renggli, Saleh Ashkboos, Mehdi Aghagolzadeh, Dan Alistarh and Torsten Hoefler
High Performance Computing, Networking, Storage and Analysis (SC) 2019
Ease.ml/ci and Ease.ml/meter in Action: Towards Data Management for Statistical Generalization
Cedric Renggli*, Frances Ann Hubis*, Bojan Karlaš, Kevin Schawinski, Wentao Wu and Ce Zhang
International Conference on Very Large Data Bases (VLDB) 2019, Demo
Distributed Learning over Unreliable Networks
Chen Yu, Hanlin Tang, Cedric Renggli, Simon Kassing, Ankit Singla, Dan Alistarh, Ji Liu and Ce Zhang
International Conference on Machine Learning (ICML) 2019
Continuous Integration of Machine Learning Models with ease.ml/ci: Towards a Rigorous Yet Practical Treatment
Cedric Renggli, Bojan Karlaš, Bolin Ding, Feng Liu, Kevin Schawinski, Wentao Wu and Ce Zhang
Conference on Systems and Machine Learning (SysML) 2019
Speeding up Percolator
John T. Halloran, Hantian Zhang, Kaan Kara, Cedric Renggli, Matthew The, Ce Zhang, David M. Rocke, Lukas Käll and William Stafford Noble
Journal of Proteome Research 2019

2018

The Convergence of Sparsified Gradient Methods
Dan Alistarh, Torsten Hoefler, Mikael Johansson, Nikola Konstantinov, Sarit Khirirat and Cedric Renggli (Authors ordered alphabetically)
Neural Information Processing Systems (NeurIPS) 2018

2017

MPIML: A High-Performance Sparse Communication Layer for Machine Learning (Poster)
Cedric Renggli, Dan Alistarh and Torsten Hoefler
Neural Information Processing Systems (NIPS) 2017, Workshop: Deep Learning At Supercomputer Scale

MSc Thesis

Efficient Sparse AllReduce For Scalable Machine Learning
Cedric Renggli
Outstanding thesis award: Silver medal of ETH Zurich

Current & Former Students


I am extremely lucky to work, or to have worked, with the following students:

  • Luka Kolar (Master Thesis ETH and Scientific Assistant at DS3Lab)
  • Dominic Steiner (Bachelor Thesis ETH)
  • Maurice Weber (Master Thesis ETH)
  • Spyridon Angelopoulos (Master Thesis ETH)

Contact


Dept. of Computer Science
STF G 222
Stampfenbachstrasse 114
8092 Zürich
Switzerland