Center for Nonlinear Studies
Tuesday, July 16, 2019
09:30 AM - 10:30 AM
CNLS Conference Room (TA-3, Bldg 1690)


Machine Learning through the Information Bottleneck

Artemy Kolchinsky
Santa Fe Institute

The information bottleneck (IB) has been proposed as a principled way to compress a random variable while preserving only the information relevant for predicting another random variable. Recently, the IB has been proposed, and challenged, as a theoretical framework for understanding why and how deep learning architectures achieve good performance. I will cover: (1) an introduction to the ideas behind the IB; (2) methods for implementing information-theoretic compression in neural networks, along with some possible applications of such methods; (3) the current status of the IB theory of deep learning; and (4) recently discovered caveats that arise for the IB in machine-learning settings.
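For readers unfamiliar with the IB objective mentioned above, here is a minimal sketch of how it can be evaluated for discrete variables. The IB seeks a compressed representation Z of X that minimizes the Lagrangian I(Z;X) - beta * I(Z;Y), where beta trades off compression against predictive information. The function and variable names below are illustrative, not from the talk; NumPy is assumed.

```python
import numpy as np

def mutual_information(p_joint):
    """I(A;B) in nats, computed from a joint distribution p(a,b)."""
    p_a = p_joint.sum(axis=1, keepdims=True)
    p_b = p_joint.sum(axis=0, keepdims=True)
    mask = p_joint > 0  # avoid log(0) on zero-probability cells
    return float(np.sum(p_joint[mask] * np.log(p_joint[mask] / (p_a * p_b)[mask])))

def ib_objective(p_xy, p_z_given_x, beta):
    """IB Lagrangian I(Z;X) - beta * I(Z;Y) for discrete X, Y, Z.

    p_xy:        joint distribution over (X, Y), shape (|X|, |Y|)
    p_z_given_x: stochastic encoder, shape (|X|, |Z|), rows sum to 1
    """
    p_x = p_xy.sum(axis=1)                 # marginal p(x)
    p_zx = p_z_given_x * p_x[:, None]      # joint p(x, z)
    p_zy = p_z_given_x.T @ p_xy            # p(z, y) = sum_x p(z|x) p(x, y)
    return mutual_information(p_zx) - beta * mutual_information(p_zy)
```

A constant encoder (every x mapped to one z) compresses away everything, so both information terms vanish; an identity encoder keeps I(Z;X) = H(X) and all of I(X;Y). The IB algorithm searches between these extremes for a given beta.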

NOTE: Future speaker nominations through the Information Science and Technology Institute (ISTI) are welcome and can be entered at:

Host: Juston Moore