Cedric Renggli

I am a PhD candidate at ETH Zurich's DS3Lab, supervised by Ce Zhang.

My main research interest lies in comparison-based/preferential optimization. I believe this technique can be used to develop novel methods for efficient model selection, with a focus on machine learning. In a more general setting, it helps put humans into the loop in a more structured way, enabling people from various domains, especially those with little or no ML or computer science background, to analyze their large-scale datasets more efficiently. Additionally, I work on optimization techniques and systems for distributed machine learning algorithms to speed up the training process.

I hold a bachelor's degree from the Bern University of Applied Sciences and received my MSc in Computer Science from ETH Zurich in 2018. My master's thesis, Efficient Sparse AllReduce for Scalable Machine Learning, was awarded the silver medal of ETH Zurich for an outstanding master's thesis.

My office is located at CAB E 61.2. Feel free to contact me by email at cedric.renggli@inf.ethz.ch.

Publications


2019

Continuous Integration of Machine Learning Models: A Rigorous Yet Practical Treatment
Cedric Renggli, Bojan Karlas, Bolin Ding, Feng Liu, Kevin Schawinski, Wentao Wu and Ce Zhang
To appear at SysML 2019

2018

The Convergence of Sparsified Gradient Methods
Dan Alistarh, Torsten Hoefler, Mikael Johansson, Nikola Konstantinov, Sarit Khirirat and Cedric Renggli
(Authors ordered alphabetically)
Neural Information Processing Systems (NeurIPS), 2018

Distributed Learning over Unreliable Networks
Hanlin Tang, Chen Yu, Cedric Renggli, Simon Kassing, Ankit Singla, Dan Alistarh, Ji Liu and Ce Zhang
Manuscript, arXiv, 2018

SparCML: High-Performance Sparse Communication for Machine Learning
Cedric Renggli, Dan Alistarh, Torsten Hoefler and Mehdi Aghagolzadeh
Manuscript, arXiv, 2018

2017

MPIML: A High-Performance Sparse Communication Layer for Machine Learning (Poster)
Cedric Renggli, Dan Alistarh and Torsten Hoefler
NIPS 2017 Workshop: Deep Learning at Supercomputer Scale

MSc Thesis

Efficient Sparse AllReduce for Scalable Machine Learning
Cedric Renggli
Outstanding thesis award: Silver medal of ETH Zurich

Contact


Dept. of Computer Science
CAB E 61.2
Universitätsstrasse 6
8092 Zürich
Switzerland