Cedric Renggli



I am a PhD candidate at ETH Zurich's DS3Lab, supervised by Ce Zhang.

My main research interest lies in the foundations of usable, scalable, and efficient data-centric systems that support all interactions within the machine learning model lifecycle, an area broadly known as MLOps. This work has notably led to new engineering principles, such as how to run a feasibility study for ML application development and how to perform continuous integration (CI) of ML models with statistical guarantees. I am furthermore working on efficient methods that enable model search over collections of pre-trained models, which typically serve as a starting point for solving new machine learning tasks via transfer learning.

I hold a bachelor's degree from the Bern University of Applied Sciences and received my MSc in Computer Science from ETH Zurich in 2018. My work on Efficient Sparse AllReduce For Scalable Machine Learning was awarded the ETH Zurich silver medal for an outstanding master's thesis.
During my PhD, I have worked as a research intern and student research consultant at Google Brain.

My office is located at STF G 222. Feel free to contact me by email at cedric.renggli@inf.ethz.ch.
You can also find me on Twitter and LinkedIn.

Publications


For an up-to-date list of my publications, check my profile on Google Scholar.

2022

Which Model to Transfer? Finding the Needle in the Growing Haystack
Cedric Renggli, Andre Susano Pinto, Luka Rimanic, Joan Puigcerver, Carlos Riquelme, Ce Zhang and Mario Lucic
IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2022
Dynamic Human Evaluation for Relative Model Comparisons
Thórhildur Thorleiksdóttir, Cedric Renggli, Nora Hollenstein and Ce Zhang
Language Resources and Evaluation Conference (LREC) 2022
In-Database Machine Learning with CorgiPile: Stochastic Gradient Descent without Full Data Shuffle
Lijie Xu, ..., Cedric Renggli, ..., Wentao Wu and Ce Zhang
ACM Special Interest Group on Management of Data (SIGMOD) 2022
SHiFT: An Efficient, Flexible Search Engine for Transfer Learning
Cedric Renggli, Xiaozhe Yao, Luka Kolar, Luka Rimanic, Ana Klimovic and Ce Zhang
arXiv preprint
Learning to Merge Tokens in Vision Transformers
Cedric Renggli*, Andre Susano Pinto*, Neil Houlsby, Basil Mustafa, Joan Puigcerver and Carlos Riquelme*
arXiv preprint
DeepSE-WF: Unified Security Estimation for Website Fingerprinting Defenses
Alexander Veicht, Cedric Renggli and Diogo Barradas
arXiv preprint

2021

Evaluating Bayes Error Estimators on Real-World Datasets with FeeBee
Cedric Renggli*, Luka Rimanic*, Nora Hollenstein and Ce Zhang
Neural Information Processing Systems (NeurIPS) 2021, Datasets and Benchmarks
A Data Quality-Driven View of MLOps
Cedric Renggli, Luka Rimanic, Nezihe Merve Gürel, Bojan Karlaš, Wentao Wu and Ce Zhang
IEEE Data Engineering Bulletin March 2021
Scalable Transfer Learning with Expert Models
Joan Puigcerver, Carlos Riquelme, Basil Mustafa, Cedric Renggli, Andre Susano Pinto, Sylvain Gelly, Daniel Keysers and Neil Houlsby
International Conference on Learning Representations (ICLR) 2021
Ease.ML: A Lifecycle Management System for MLDev and MLOps
Leonel Aguilar, ..., Cedric Renggli, ..., Wentao Wu and Ce Zhang
Conference on Innovative Data Systems Research (CIDR) 2021
Decoding EEG Brain Activity for Multi-Modal Natural Language Processing
Nora Hollenstein, Cedric Renggli, Benjamin Glaus, Maria Barrett, Marius Troendle, Nicolas Langer and Ce Zhang
Frontiers in Human Neuroscience Vol. 15 2021

2020

On Convergence of Nearest Neighbor Classifiers over Feature Transformations
Luka Rimanic*, Cedric Renggli*, Bo Li and Ce Zhang
Neural Information Processing Systems (NeurIPS) 2020
Ease.ML/Snoopy: Towards Automatic Feasibility Studies for ML via Quantitative Understanding of "Data Quality for ML"
Cedric Renggli*, Luka Rimanic*, Luka Kolar*, Wentao Wu and Ce Zhang
arXiv preprint
Observer Dependent Lossy Image Compression
Maurice Weber, Cedric Renggli, Helmut Grabner and Ce Zhang
German Conference on Pattern Recognition (DAGM-GCPR) 2020
Ease.ml/snoopy in Action: Towards Automatic Feasibility Analysis for Machine Learning Application Development
Cedric Renggli*, Luka Rimanic*, Luka Kolar, Wentao Wu and Ce Zhang
International Conference on Very Large Data Bases (VLDB) 2020, Demo
Building Continuous Integration Services for Machine Learning
Bojan Karlaš, Matteo Interlandi, Cedric Renggli, Wentao Wu, Ce Zhang, Deepak Mukunthu Iyappan Babu, Jordan Edwards, Chris Lauren, Andy Xu and Markus Weimer
Special Interest Group on Knowledge Discovery and Data Mining (SIGKDD) 2020, Applied Data Science, Oral Presentation 44/756

2019

SparCML: High-Performance Sparse Communication for Machine Learning
Cedric Renggli, Saleh Ashkboos, Mehdi Aghagolzadeh, Dan Alistarh and Torsten Hoefler
High Performance Computing, Networking, Storage and Analysis (SC) 2019
Ease.ml/ci and Ease.ml/meter in Action: Towards Data Management for Statistical Generalization
Cedric Renggli*, Frances Ann Hubis*, Bojan Karlaš, Kevin Schawinski, Wentao Wu and Ce Zhang
International Conference on Very Large Data Bases (VLDB) 2019, Demo
Distributed Learning over Unreliable Networks
Chen Yu, Hanlin Tang, Cedric Renggli, Simon Kassing, Ankit Singla, Dan Alistarh, Ji Liu and Ce Zhang
International Conference on Machine Learning (ICML) 2019
Continuous Integration of Machine Learning Models with ease.ml/ci: Towards a Rigorous Yet Practical Treatment
Cedric Renggli, Bojan Karlaš, Bolin Ding, Feng Liu, Kevin Schawinski, Wentao Wu and Ce Zhang
Conference on Systems and Machine Learning (SysML) 2019
Speeding up Percolator
John T. Halloran, Hantian Zhang, Kaan Kara, Cedric Renggli, Matthew The, Ce Zhang, David M. Rocke, Lukas Käll and William Stafford Noble
Journal of Proteome Research 2019

2018

The Convergence of Sparsified Gradient Methods
Dan Alistarh, Torsten Hoefler, Mikael Johansson, Nikola Konstantinov, Sarit Khirirat and Cedric Renggli (Authors ordered alphabetically)
Neural Information Processing Systems (NeurIPS) 2018

2017

MPIML: A High-Performance Sparse Communication Layer for Machine Learning (Poster)
Cedric Renggli, Dan Alistarh and Torsten Hoefler
Neural Information Processing Systems (NIPS) 2017, Workshop: Deep Learning At Supercomputer Scale

PhD Thesis

Building Data-Centric Systems for Machine Learning Development and Operations
Cedric Renggli

MSc Thesis

Efficient Sparse AllReduce For Scalable Machine Learning
Cedric Renggli
Outstanding thesis award: Silver medal of ETH Zurich

Current & Former Students


I am extremely lucky to work, or to have worked, with the following students:

  • Juan Saez Solana (Bachelor Thesis ETH 2022)
  • Armando Schmid (Master Thesis ETH 2022)
  • Diogo Sampaio Ribeiro (Bachelor Thesis ETH 2022)
  • Alexander Veicht (Bachelor Thesis ETH 2021 and Semester Project ETH 2022)
  • Thórhildur Thorleiksdóttir (Master Thesis ETH 2021)
  • Wilke Grosche (Master Thesis ETH with D-PHYS 2021)
  • Magnus Wuttke (Bachelor Thesis ETH 2021)
  • David Graf (External Master Thesis ETH with IBM 2021)
  • Olga Grigorieva (Bachelor Thesis ETH 2021)
  • Constantin El Ghazi (Semester Project ETH 2021)
  • Cristian Blaga (Semester Project ETH 2021)
  • Dominic Steiner (Bachelor Thesis ETH 2021)
  • Luka Kolar (Master Thesis ETH 2020 and Scientific Assistant at DS3Lab 2021)
  • Maurice Weber (Master Thesis ETH 2019)
  • Spyridon Angelopoulos (Master Thesis ETH 2019)

Contact


Dept. of Computer Science
STF G 222
Stampfenbachstrasse 114
8092 Zürich
Switzerland