All workshops at a glance

Tentative schedule

June 23rd

Gimli: Geometry in Machine Learning

Søren Hauberg (Technical University of Denmark), Oren Freifeld (MIT), Michael Schober (Max Planck Institute for Intelligent Systems)
Location: Ballroom Crowne Plaza – Times Square

Many machine learning (ML) problems are fundamentally geometric in nature, e.g., finding optimal subspaces can be recast as finding point estimates on the Grassmannian; multi-metric learning can be recast as the learning of a Riemannian tensor; and covariance estimation entails optimization over a nonlinear cone. In spite of this, most practitioners neglect the geometry, only to find suboptimal models. Furthermore, many difficult problems that involve both geometry and statistical learning are usually ignored by the ML community. This workshop will raise these discussion points through a series of invited talks from experts on both geometry and machine learning.
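As a minimal illustration of the first example (a standard textbook formulation, not taken from the workshop page): finding the best k-dimensional subspace for data with sample covariance \Sigma \in \mathbb{R}^{d \times d} can be posed as

\max_{U \in \mathbb{R}^{d \times k},\ U^\top U = I_k} \operatorname{tr}\!\left(U^\top \Sigma\, U\right),

where the objective depends on U only through \operatorname{span}(U), so it is really an optimization over the Grassmannian \mathrm{Gr}(k, d); its maximizer is spanned by the top-k eigenvectors of \Sigma.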
https://sites.google.com/site/gimliworkshop/

Machine Learning for Digital Education and Assessment Systems

Alina A. von Davier (Educational Testing Service), Mihaela van der Schaar (UCLA), Richard Baraniuk (Rice University)
Location: Marriott – Times Square

The focus of this workshop is on multidisciplinary research in the area of machine learning to enable new forms of digital education and assessment tools.
Recent developments indicate that society is interested in redesigning learning and assessment systems (LAS), not merely improving the systems we have. There is renewed interest in individualized, adaptive performance assessments developed in virtual settings. However, virtual LASs come with a number of psychometric and operational challenges. Advances in ML provide opportunities to address these challenges.
This workshop provides a platform for the sharing of knowledge and ideas across disciplines including ML, computational psychometrics, adaptive learning and testing, and natural language processing.
http://medianetlab.ee.ucla.edu/ICML-Education2016.html

Human Interpretability in Machine Learning

Been Kim (Allen Institute for Artificial Intelligence), Dmitry Malioutov (IBM T. J. Watson Research Center), Kush Varshney (IBM T. J. Watson Research Center)
Location: Microsoft, Central Park (6th floor)

The goal of this workshop is to bring together researchers who study interpretable machine learning. This is a very exciting time to study interpretable machine learning, as the advances in large-scale optimization and Bayesian inference that have enabled the rise of black-box machine learning (e.g., deep learning) are now also starting to be exploited to develop principled approaches to large-scale interpretable machine learning. Participants in the workshop will exchange ideas on these and allied topics, including, but not limited to, developing interpretability of predictive models, interpretable machine learning algorithms, methodology to interpret black-box machine learning models (e.g., post-hoc interpretations), and visual analytics.
https://sites.google.com/site/2016whi

Multi-View Representation Learning

Xiaodong He (Microsoft Research), Karen Livescu (TTI-Chicago), Weiran Wang (TTI-Chicago), Scott Wen-tau Yih (Microsoft Research)
Location: Marriott: Carnegie-Booth

The workshop will bring together researchers and practitioners in this area, and discuss both theoretical and practical aspects of representation/feature learning in the presence of multi-view data.
http://ttic.uchicago.edu/~wwang5/ICML2016_MVRL/

Theory and Practice of Differential Privacy (TPDP 2016)

Gilles Barthe (IMDEA Software), Christos Dimitrakakis (Chalmers University of Technology), Marco Gaboardi (University at Buffalo, SUNY), Andreas Haeberlen (University of Pennsylvania), Aaron Roth (University of Pennsylvania), Aleksandra Slavkovic (Penn State University)
Location: Marriott: O’Neil

Differential privacy is a promising approach to the privacy-preserving release of data: it offers a strong, guaranteed bound on the increase in harm that a user incurs as a result of participating in a differentially private data analysis. Several mechanisms and software tools have been developed to ensure differential privacy for a wide range of data analysis tasks.
Researchers in differential privacy come from several disciplines, such as computer science, data analysis, statistics, security, law and policy making, and social science. The workshop is an occasion for researchers to discuss recent developments in the theory and practice of differential privacy and its applications.
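For concreteness, the guarantee referred to above is the standard \varepsilon-differential privacy definition (quoted here for context, not from the workshop page): a randomized mechanism M is \varepsilon-differentially private if, for every pair of neighboring datasets D, D' differing in one individual's record and every set S of outputs,

\Pr[M(D) \in S] \;\le\; e^{\varepsilon}\, \Pr[M(D') \in S].

The classic Laplace mechanism achieves this for a numeric query f by releasing f(D) + \mathrm{Lap}(\Delta f / \varepsilon), where \Delta f is the sensitivity of f.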
http://tpdp16.cse.buffalo.edu/

Visualization for Deep Learning

Biye Jiang (UC Berkeley), John Canny (UC Berkeley), Polo Chau (Georgia Tech), Aditya Khosla (MIT)
Location: Marriott: Astor

Deep neural networks are complex to design and train. They are non-linear systems that have many local optima and are sensitive to hyper-parameters. Systematic optimization of structure and hyper-parameters is possible, but hampered by the expense of training each design on realistic datasets.
We argue that visualization can play an essential role in understanding DNNs and in developing new design principles. With rich tools for visual exploration of networks during training and inference, one should be able to form closer ties between theory and practice: validating expected behaviors, and exposing the unexpected, which can lead to new insights.
http://icmlviz.github.io/

Reliable Machine Learning in the Wild

Jacob Steinhardt (Stanford), Tom Dietterich (OSU), Percy Liang (Stanford), Andrew Critch (MIRI), Jessica Taylor (MIRI), Adrian Weller (Cambridge)
Location: Marriott: Empire

How can we be confident that a system that performed well in the past will do so in the future, in the presence of novel and potentially adversarial input distributions? Answering such questions is critical for high-stakes applications such as autonomous driving, as well as for building reliable large-scale machine learning systems. This workshop explores approaches that are principled or can provide performance guarantees, ensuring AI systems are robust and beneficial in the long run. We will focus on three aspects — robustness, adaptation, and monitoring — that can aid us in designing and deploying reliable machine learning systems.
https://sites.google.com/site/wildml2016/

Neural Networks Back To The Future

Léon Bottou (Facebook), David Grangier (Facebook), Tomas Mikolov (Facebook), John Platt (Google)
Location: Crowne Plaza – Broadway

With research in deep learning extremely active today, it is worth taking a step back and examining its foundations. We propose to take a critical look at previous work on neural networks and to better understand how it differs from today’s work. Previous work can point to promising directions to follow, pitfalls to avoid, and ideas and assumptions to revisit. Similarly, today’s progress allows a critical examination of what should still be investigated and what has already been answered.
https://sites.google.com/site/nnb2tf

Deep Learning Workshop

Antoine Bordes (Facebook AI Research), Kyunghyun Cho (New York University), Emily Denton (New York University), Nando de Freitas (Google DeepMind, University of Oxford), Rob Fergus (Facebook AI Research, New York University)
Location: Westside Ballroom 3 & 4

Deep learning is a fast-growing field of machine learning concerned with the study and design of computer algorithms for learning good representations of data, at multiple levels of abstraction. There has been rapid progress in this area in recent years, both in terms of methods and in terms of applications, which is attracting major IT companies as well as leading research labs. Many challenges remain, however, in areas such as the large sample complexity of deep learning approaches, generative modeling, learning representations for reinforcement learning and symbolic reasoning, modeling of temporal data with long-term dependencies, efficient Bayesian inference for deep learning, and multi-modal data and models. This workshop aims to tackle two major challenges in deep learning, namely unsupervised learning in the regime of small data, and simulation-based learning and its transferability to the real world, by bringing together researchers in the field.
https://sites.google.com/site/dlworkshop16/

Abstraction in Reinforcement Learning

Daniel Mankowitz, Shie Mannor (Technion Israel Institute of Technology), Timothy Mann (Google DeepMind)
Location: Marriott: Marquis

Many real-world domains can be modeled using some form of abstraction. An abstraction is an important tool that enables an agent to focus less on the lower level details of a task and more on solving the task at hand. Temporal abstraction (i.e., options or skills) as well as spatial abstraction (i.e., state space representation) are two important examples. The goal of this workshop is to provide a forum to discuss the current challenges in designing as well as learning abstractions in real-world Reinforcement Learning (RL).
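For reference, the standard formalization of a temporal abstraction is the options framework of Sutton, Precup and Singh (included here for context, not from the workshop page): an option is a triple

o = (\mathcal{I}, \pi, \beta),

where \mathcal{I} \subseteq \mathcal{S} is the set of states in which the option may be initiated, \pi is its internal policy, and \beta : \mathcal{S} \to [0,1] gives the probability of terminating in each state.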
http://rlabstraction2016.wix.com/icml

Advances in non-convex analysis and optimization

Animashree Anandkumar (UCI), Sivaraman Balakrishnan (CMU), Srinadh Bhojanapalli (TTI), Kamalika Chaudhuri (UCSD), Yudong Chen (Cornell), Anastasios Kyrillidis (UT Austin), Percy Liang (Stanford), Praneeth Netrapalli (Microsoft), Sewoong Oh (UIUC), Zhaoran Wang (Princeton)
Location: Westin – Majestic

This workshop will present some of the very recent developments in non-convex analysis and optimization, as reported in diverse research fields: from machine learning and mathematical programming to statistics and theoretical computer science. We believe that this workshop can bring researchers closer together, in order to facilitate a discussion regarding why tackling non-convexity is important, where it is found, why non-convex schemes work well in practice, and how we can progress further, with interesting research directions and open problems.
https://sites.google.com/site/noncvxicml16/

Machine Learning for Music Discovery

Erik Schmidt (Pandora), Fabien Gouyon (Pandora), Oriol Nieto (Pandora), Gert Lanckriet (Amazon/University of California San Diego)
Location: Marriott: Wilder

The ever-increasing size and accessibility of vast music libraries has created greater demand than ever for machine learning systems that are capable of understanding and organizing this complex data. Collaborative filtering provides excellent music recommendations when the necessary user data is available, but these approaches also suffer heavily from the cold-start problem. Furthermore, defining musical similarity directly is extremely challenging, as myriad features play some role (e.g., cultural, emotional, timbral, rhythmic). The topics discussed will span a variety of music recommender systems challenges, including cross-cultural recommendation, content-based audio processing and representation learning, automatic music tagging, and evaluation.
https://sites.google.com/site/ml4md2016/

June 24th

Data-Efficient Machine Learning

Marc Deisenroth (Imperial College London), Shakir Mohamed (Google DeepMind), Finale Doshi-Velez (Harvard University), Andreas Krause (ETH Zürich), Max Welling (University of Amsterdam)
Location: Marriott: Astor

Recent efforts in machine learning have addressed the problem of learning from massive amounts of data. We now have highly scalable solutions for problems in object detection and recognition, machine translation, text-to-speech, recommender systems, and information retrieval, all of which attain state-of-the-art performance when trained with large amounts of data. In these domains, the challenge we now face is how to learn efficiently, with the same performance, in less time and with less data. Other problem domains, such as personalized healthcare, robot reinforcement learning, sentiment analysis, and community detection, are characterized as either small-data problems, or big-data problems that are a collection of small-data problems. The ability to learn in a sample-efficient manner is a necessity in these data-limited domains. Collectively, these problems highlight the increasing need for data-efficient machine learning: the ability to learn in complex domains without requiring large quantities of data.
This workshop will discuss the diversity of approaches that exist for data-efficient machine learning, and the practical challenges that we face. Many approaches demonstrate that data-efficient machine learning is possible, including methods that: consider trade-offs between incorporating explicit domain knowledge and more general-purpose approaches; exploit structural knowledge of our data, such as symmetry and other invariance properties; apply bootstrapping and data augmentation techniques that make statistically efficient reuse of available data; use semi-supervised learning techniques, e.g., generative models that better guide the training of discriminative models; generalize knowledge across domains (transfer learning); use active learning and Bayesian optimization for experimental design and data-efficient black-box optimization; or apply non-parametric methods, one-shot learning, and Bayesian deep learning.
The objective of this interdisciplinary workshop is to provide a platform for researchers from a variety of areas, spanning transfer learning, Bayesian optimization, bandits, deep learning, approximate inference, robot learning, healthcare, computational neuroscience, active learning, reinforcement learning, and social network analysis, to share insights and perspectives on the problem of data-efficient machine learning, to discuss challenges, and to debate the roadmap towards more data-efficient machine learning.
https://sites.google.com/site/dataefficientml/

Computational Biology

Dana Pe’er (Columbia University), Elham Azizi (Columbia University), Sandhya Prabhakaran (Columbia University), Olga Troyanskaya (Princeton University), Edoardo Airoldi (Harvard University), Volker Roth (University of Basel)
Location: Marriott: Cantor/Jolson

The application of machine learning in computational biology has advanced significantly in recent years. There have been notable developments in high-throughput technologies such as next-generation sequencing, CyTOF and single-cell sequencing that enable data generation from many interesting biological systems. The gamut of novel algorithms in machine learning makes it very attractive to apply these methods to challenging biological questions. It therefore seems fitting to bring together researchers engaged in applying ML in computational biology to discuss recent advances and ongoing developments in this interdisciplinary field.
https://sites.google.com/site/compbioworkshopicml2016/

Anomaly Detection 2016

Nico Goernitz (Berlin Institute of Technology), Marius Kloft (Humboldt University of Berlin), Vitaly Kuznetsov (Google Research)
Location: Marriott – Soho

Anomaly, outlier and novelty detection methods are crucial tools in any data scientist’s inventory and are critical components of many real-world applications. Abnormal user activities can be used to detect credit card fraud, network intrusions or other security breaches. In computational biology, characterization of systematic anomalies in gene expression can be translated into clinically relevant information. With the rise of the Internet of Things, the task of monitoring and diagnosing numerous autonomous systems becomes intractable for a human and needs to be outsourced to a machine. Early detection of an upcoming earthquake or tsunami can potentially save human lives. These applications make anomaly detection methods increasingly relevant in the modern world.
However, with the advent of Big Data, new challenges and questions are introduced, which will need to be addressed by the next generation of anomaly and outlier detection algorithms. The goal of our workshop is to survey the existing techniques and discuss new research directions in this area.
https://sites.google.com/site/icmlworkshoponanomalydetection/

Automatic Machine Learning (AutoML)

Frank Hutter (University of Freiburg), Lars Kotthoff (University of British Columbia), Joaquin Vanschoren (Eindhoven University of Technology)
Location: Marriott: Empire

Machine learning has been very successful, but its successes rely on human machine learning experts to define the learning problem, select, collect and preprocess the training data, choose appropriate ML architectures (deep learning, random forests, SVMs, …) and their hyperparameters, and finally evaluate the suitability of the learned models for deployment. As the complexity of these tasks is often beyond non-experts, the rapid growth of machine learning applications has created a demand for off-the-shelf machine learning methods that are more robust and can be used easily without expert knowledge. We call the resulting research area, which targets the progressive automation of machine learning, AutoML.
See also ChaLearn’s AutoML challenge: http://automl.chalearn.org/
http://icml2016.automl.org/

Machine Learning Systems

Aparna Lakshmi Ratan (Facebook), Joaquin Quiñonero Candela (Facebook), Hussein Mehanna (Facebook), Joseph Gonzalez (UC Berkeley)
Location: Microsoft, Central Park (6th floor)

The diverse use of machine learning, the explosive growth in data, and the complexity of large-scale learning systems have fueled an interesting area at the intersection of machine learning and large-scale system design. The goal of this workshop is to bring together experts working at the intersection of machine learning, system design and software engineering to explore the challenges involved in addressing real-world, large-scale machine learning problems. In particular, we aim to elicit new connections among these diverse fields and to identify tools, best practices and design principles. The workshop will cover ML and AI platforms and algorithm toolkits (Caffe, Torch, MXNet and parameter server, Theano, etc.), as well as dive into machine-learning-focused developments in distributed learning platforms, programming languages, data structures and general-purpose GPU programming.
The workshop will have a mix of invited speakers and reviewed papers to facilitate the flow of new ideas as well as best practices which can benefit those looking to implement large ML systems in academia or industry.
https://sites.google.com/site/mlsys2016/

#data4good: Machine Learning in Social Good Applications

James Faghmous (Mount Sinai), Matt Gee (University of Chicago), Rayid Ghani (University of Chicago), Gideon Mann (Bloomberg), Aleksandra Mojsilović (IBM Research), Kush Varshney (IBM Research)
Location: Marriott: Wilder

This workshop will bring together experts from different fields to explore the opportunities for machine learning in applications with social impact. Our goal is to raise awareness among ML practitioners about the opportunities in the Data-for-Good movement and to push the boundaries of addressing tough humanitarian challenges. The workshop will consist of: 1) invited presentations from leading practitioners in the field and 2) a series of presentations on research that fits the theme of machine learning for social good; broadly construed, this could be machine learning applied to social good problems, or machine learning methods/theory of particular interest for social good applications.
https://sites.google.com/site/icml2016data4goodworkshop/

Theory of Deep Learning

Rene Vidal (Johns Hopkins University), Alex M. Bronstein (Technion – IIT), Raja Giryes (Tel Aviv University)
Location: Marriott: Westside Ballroom 3 & 4

Deep learning has led to significant breakthroughs in many applications in computer vision and machine learning. However, little is known about the theory behind this successful paradigm. This workshop will discuss recent achievements in the theoretical understanding of deep networks.
https://sites.google.com/site/deeplearningtheory

On-Device Intelligence

Vikas Sindhwani, Daniel Ramage, Keith Bonawitz (Google), Suyog Gupta (IBM), Sachin Talathi (Qualcomm)
Location: Marriott: Odets

Consumer adoption of mobile devices has created a new normal in computing: there are now more mobile devices on the planet than people, and exabytes of mobile data per month now dominate global internet traffic. As computing systems, these pocket-sized devices are more powerful in many ways than vintage supercomputers. They come packed with an ever-growing array of sensors. They are “always-on”, and becoming increasingly capable of rich contextual understanding and natural interaction with their users.
This workshop will focus on research themes emerging at the intersection of machine learning and mobile systems. Topics of interest include the design of new machine learning algorithms under storage and power constraints, new on-device learning mechanisms, the interaction between devices and cloud resources for privacy-aware distributed training, and opportunities for machine learning in the nascent area of the “Internet of Things.” The scope of the workshop also extends to real-time learning and optimization in the context of novel form factors: wearable computers, home intelligence devices, and consumer robotics systems. We are also interested in hardware-software co-design for mobile machine learning applications.
https://sites.google.com/site/ondeviceintelligence/icml2016

Online advertising systems

Sharat Chikkerur (Nanigans Inc), Hossein Azari (Google Research), Edoardo Airoldi (Harvard)
Location: Marriott: Carnegie/Booth

Online advertising is a multi-billion dollar industry driven by the confluence of machine learning, optimization, control systems, auction algorithms, econometrics and software engineering. The goal of this workshop is to discuss how machine learning systems operate within the context of an advertising system.
https://sites.google.com/site/admlsystemsworkshop/

Optimization Methods for the Next Generation of Machine Learning

Katya Scheinberg (Lehigh University), Frank E. Curtis (Lehigh University), Jorge Nocedal (Northwestern University), Yoshua Bengio (University of Montreal)
Location: Westin – Majestic

The future of optimization for machine learning lies in the design of methods for nonconvex optimization problems, such as those arising through the use of deep neural networks. Nonconvex formulations lead to more powerful predictive models, but at the cost of much more challenging optimization problems. This workshop will bring together experts from the machine learning and optimization communities whose research focuses on the design of optimization methodologies that combine recent trends in optimization for machine learning (stochasticity, parallel and distributed computing, and second-order information) and apply them in nonconvex settings.
http://optml.lehigh.edu/ICML2016

Computational Frameworks for Personalization

Suchi Saria (Johns Hopkins University), Yisong Yue (Caltech), Khalid El-Arini (Facebook), Ambuj Tewari (University of Michigan)
Location: Marriott: O’Neil

This workshop aims to bring together researchers from industry and academia in order to describe recent advances and discuss future research directions pertaining to computational frameworks for personalization, broadly construed. Personalization has already made a huge impact in online recommender systems. Furthermore, there are many emerging applications where personalization has begun to show great promise, such as education and medicine. We are particularly interested in understanding what are the common computational challenges that underlie all these applications, with the goal of accelerating the development of personalization frameworks across a broad range of domains.
https://sites.google.com/site/icml2016ersonalization/