Hongzhou Lin
Curriculum Vitæ
http://www.hongzhoulin.com
Education
2018–present Massachusetts Institute of Technology, Cambridge, USA.
Postdoctoral researcher in the Machine Learning Group, working with Stefanie Jegelka.
2014–2017 Univ. Grenoble Alpes, Grenoble, France.
PhD in Mathematics and Computer Science.
Thesis: Generic acceleration schemes for gradient-based optimization.
Advisors: Zaid Harchaoui and Julien Mairal.
2010–2014 École Normale Supérieure, Paris, France.
M.Sc. in Probability and Statistics.
Minor in Linguistics.
2008–2010 Lycée Louis-le-Grand, Paris, France.
Classes préparatoires aux grandes écoles (intensive undergraduate preparation for the grandes écoles).
Publications
According to Google Scholar, my papers have received about 168 citations.
[1] H. Lin, J. Mairal, and Z. Harchaoui. A universal catalyst for first-order optimization. In Advances in Neural Information Processing Systems (NIPS), 2015.
[2] H. Lin, J. Mairal, and Z. Harchaoui. A generic Quasi-Newton algorithm for faster gradient-based optimization. Submitted to SIAM Journal on Optimization, 2017.
[3] H. Lin, J. Mairal, and Z. Harchaoui. Catalyst acceleration for first-order convex optimization: from theory to practice. The Journal of Machine Learning Research (JMLR), volume 18, 2018.
[4] C. Paquette, H. Lin, D. Drusvyatskiy, J. Mairal, and Z. Harchaoui. Catalyst for gradient-based non-convex optimization. In International Conference on Artificial Intelligence and Statistics (AISTATS), 2018.
Selected Talks
2017.06 Trends in Optimization Seminar, University of Washington.
2017.05 SIAM Conference on Optimization, Vancouver, Canada.
2016.12 NIPS Optimization workshop, Barcelona, Spain.
2016.08 ICCOPT, Tokyo, Japan.
2016.05 Workshop ATLAS, Univ. Grenoble Alpes.
2016.04 Optimization Seminar, Princeton University.