EE Seminar: On Sample Compression
(The talk will be given in English)
Speaker: Prof. Aryeh Kontorovich
Department of Computer Science, BGU
Monday, April 15th, 2019
15:00 - 16:00
Room 011, Kitot Bldg., Faculty of Engineering
Abstract
Sample compression is a natural and elegant learning framework, which allows for storage and runtime savings as well as sharp generalization bounds. In this talk, I will survey a few recent collaborations that touch upon various aspects of sample compression. Central among these is the development of a new algorithm for learning in arbitrary metric spaces based on a margin-regularized 1-nearest-neighbor rule, which we call OptiNet. The latter is strongly universally Bayes-consistent in all essentially-separable metric probability spaces. OptiNet is the first learning algorithm to enjoy this property; by comparison, k-NN and its variants are not Bayes-consistent, except under additional structural assumptions, such as an inner product, a norm, finite doubling dimension, or a Besicovitch-type property. I will then discuss sample compression in the context of regression, extensions to non-uniform margins, and, time permitting, generalization lower bounds.
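To give a flavor of the compression idea behind margin-regularized nearest-neighbor learning, here is a minimal illustrative sketch in Python. It greedily extracts a gamma-net from the sample (a subset whose points are pairwise more than gamma apart yet cover every sample point to within gamma), labels each net point by majority vote of the points it covers, and then predicts with 1-NN over the compressed net. All function names, the greedy net construction, and the fixed scale gamma are assumptions made for illustration; this is not the OptiNet algorithm itself, which additionally optimizes the margin scale and is analyzed in general metric spaces.

```python
import math


def euclid(a, b):
    """Euclidean distance; any metric could be substituted here."""
    return math.dist(a, b)


def gamma_net(points, gamma, dist=euclid):
    """Greedily select a gamma-net: net points are pairwise more than
    gamma apart, and every sample point lies within gamma of the net."""
    net = []
    for p in points:
        if all(dist(p, q) > gamma for q in net):
            net.append(p)
    return net


def compress_and_label(X, y, gamma, dist=euclid):
    """Compress the labeled sample (X, y) to a labeled gamma-net:
    each net point takes the majority label of the sample points
    whose nearest net point it is."""
    net = gamma_net(X, gamma, dist)
    votes = [{} for _ in net]
    for p, label in zip(X, y):
        i = min(range(len(net)), key=lambda j: dist(p, net[j]))
        votes[i][label] = votes[i].get(label, 0) + 1
    labels = [max(v, key=v.get) for v in votes]
    return net, labels


def predict(net, labels, p, dist=euclid):
    """1-NN prediction over the compressed net only."""
    i = min(range(len(net)), key=lambda j: dist(p, net[j]))
    return labels[i]
```

Because prediction consults only the net rather than the full sample, both storage and query time shrink with the compression, which is the source of the savings mentioned in the abstract.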
Based on joint work with: Lee-Ad Gottlieb, Steve Hanneke, Sivan Sabato, Menachem Sadigurschi, Roi Weiss
Short Bio: Aryeh Kontorovich received his undergraduate degree in mathematics, with a certificate in applied mathematics, from Princeton University in 2001. He earned his M.Sc. and Ph.D. at Carnegie Mellon University, graduating in 2007. After a postdoctoral fellowship at the Weizmann Institute of Science, he joined the Computer Science department at Ben-Gurion University of the Negev in 2009, where he is currently an associate professor. His research interests are mainly in machine learning, with a focus on probability, statistics, Markov chains, and metric spaces.