Learning Hierarchies of Invariant Features
Courant Institute of Mathematical Sciences and Center for Neural Science, NYU
Intelligent perceptual tasks such as vision and audition require the construction of good internal representations. Machine Learning has been very successful at producing classifiers, but the next big challenge for ML, computer vision, and computational neuroscience is to devise learning algorithms that can learn features and internal representations automatically.
Theoretical and empirical evidence suggest that the perceptual world is best represented by a multi-stage hierarchy in which features in successive stages are increasingly global, invariant, and abstract. An important question is to devise “deep learning” methods for multi-stage architectures that can automatically learn invariant feature hierarchies from labeled and unlabeled data.
A number of unsupervised methods for learning invariant features will be described that are based on sparse coding and sparse auto-encoders: convolutional sparse auto-encoders, invariance through group sparsity, invariance through lateral inhibition, and invariance through temporal constancy. The methods are used to pre-train convolutional networks (ConvNets). ConvNets are biologically-inspired architectures consisting of multiple stages of filter banks, interspersed with non-linear operations, spatial pooling, and contrast normalization operations.
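The stage structure described above (filter bank, non-linearity, spatial pooling, contrast normalization) can be sketched in a few lines of NumPy. This is a minimal illustrative implementation, not the speaker's code: the function name `conv_stage`, the choice of rectification as the non-linearity, max pooling, and per-map normalization are all assumptions made for the sketch; real systems learn the filters rather than fixing them.

```python
import numpy as np

def conv_stage(image, filters, pool=2, eps=1e-6):
    """One hypothetical ConvNet stage:
    filter bank -> non-linearity -> spatial pooling -> contrast normalization."""
    n, k = filters.shape[0], filters.shape[1]
    h, w = image.shape
    oh, ow = h - k + 1, w - k + 1

    # Filter bank: convolve the image with each filter ("valid" region only).
    maps = np.empty((n, oh, ow))
    for f in range(n):
        for i in range(oh):
            for j in range(ow):
                maps[f, i, j] = np.sum(image[i:i + k, j:j + k] * filters[f])

    # Non-linearity: half-wave rectification.
    maps = np.maximum(maps, 0.0)

    # Spatial pooling: non-overlapping max pooling over pool x pool windows.
    ph, pw = oh // pool, ow // pool
    pooled = maps[:, :ph * pool, :pw * pool]
    pooled = pooled.reshape(n, ph, pool, pw, pool).max(axis=(2, 4))

    # Contrast normalization (simplified): scale each map to unit std. dev.
    pooled /= pooled.std(axis=(1, 2), keepdims=True) + eps
    return pooled
```

Stacking several such stages, with the output maps of one stage feeding the filter bank of the next, yields the multi-stage hierarchy the abstract describes.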
Several applications will be shown through videos and live demos, including a pedestrian detector, a category-level object recognition system that can be trained on the fly, and a system that can label every pixel in an image with the category of the object it belongs to (scene parsing). Specialized hardware architectures that run these systems in real time will also be described.
Yann LeCun is Silver Professor of Computer Science and Neural Science at the Courant Institute of Mathematical Sciences and the Center for Neural Science of New York University.
He received the Electrical Engineer Diploma from Ecole Supérieure d'Ingénieurs en Electrotechnique et Electronique (ESIEE), Paris in 1983, and a PhD in Computer Science from Université Pierre et Marie Curie (Paris) in 1987. After a postdoc at the University of Toronto, he joined AT&T Bell Laboratories in Holmdel, NJ, in 1988, and became head of the Image Processing Research Department at AT&T Labs-Research in 1996. He joined NYU as a professor in 2003, after a brief period as Fellow at the NEC Research Institute in Princeton. His current interests include machine learning, computer perception and vision, mobile robotics, and computational neuroscience.
He has published over 150 technical papers and book chapters on these topics as well as on neural networks, handwriting recognition, image processing and compression, and VLSI design. His handwriting recognition technology is used by several banks around the world to read checks. His image compression technology, called DjVu, is used by hundreds of web sites and publishers and millions of users to access scanned documents on the Web, and his image recognition methods are used in deployed systems by companies such as Google, Microsoft, NEC, France Telecom and several startup companies for document recognition, human-computer interaction, image indexing, and video analytics.
He has been on the editorial boards of IJCV, IEEE PAMI, and IEEE Trans. Neural Networks, was program chair of CVPR'06, and is chair of the annual Learning Workshop. He is on the science advisory board of the Institute for Pure and Applied Mathematics, and is the co-founder of MuseAmi, a music technology company.
Modern Algorithmic Tools for Analyzing Data Streams
Professor of Computer Science, Rutgers University
We now have a second generation of algorithmic tools for analyzing data streams that go beyond the initial tools for summarizing a single stream in small space. The new tools deal with distributed data, stochastic models, dynamic graph and matrix objects, and more; they optimize communication, the number of parallel rounds, and privacy, among other things. I will provide an overview of these tools and explore their application to Machine Learning problems.
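As background for "summarizing a single stream in small space", a classic first-generation tool is the Count-Min sketch (Cormode and Muthukrishnan), which approximates item frequencies using memory far smaller than the stream. The sketch below is an illustrative implementation, not code from the talk; the class name and parameter defaults are choices made for this example.

```python
import random

class CountMinSketch:
    """Small-space stream summary: approximate item counts using
    `depth` hashed counter rows of `width` cells each. Queries
    overestimate the true count; the error shrinks as width grows."""

    def __init__(self, width=256, depth=4, seed=0):
        rng = random.Random(seed)
        self.width, self.depth = width, depth
        self.tables = [[0] * width for _ in range(depth)]
        # One salt per row gives depth independent-looking hash functions.
        self.salts = [rng.getrandbits(32) for _ in range(depth)]

    def _index(self, item, row):
        return hash((self.salts[row], item)) % self.width

    def update(self, item, count=1):
        # Process one stream element: bump one counter in every row.
        for r in range(self.depth):
            self.tables[r][self._index(item, r)] += count

    def query(self, item):
        # Each row overestimates (collisions only add), so take the min.
        return min(self.tables[r][self._index(item, r)]
                   for r in range(self.depth))
```

The total memory is `width * depth` counters regardless of stream length, which is the "small space" property the abstract refers to; the second-generation tools in the talk extend this style of summarization to distributed, graph, and matrix settings.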
S. (Muthu) Muthukrishnan is a Professor of Computer Science at Rutgers University. His research interests are in Internet Auctions and Game Theory, as well as Data Stream Algorithms and their connections to Compressed Sensing, Databases, and Networking. He also maintains a blog: http://mysliceofpizza.blogspot.com/
The site http://algo.research.googlepages.com/ will have papers relevant to this talk.
Information Theory and Sustainable Energy
David MacKay FRS
Chief Scientific Advisor, DECC
Professor of Natural Philosophy, University of Cambridge
How easy is it to get off our fossil fuel habit? Can European countries live on their own renewables? What do the fundamental limits of physics say? How does our current energy consumption compare with our sustainable energy options? This talk will offer a straight-talking assessment of the numbers; will discuss how to make energy plans that add up; and will hunt for connections between machine learning, climate change, and sustainable energy.
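The "plans that add up" style of assessment reduces to simple per-person arithmetic: express both consumption and each renewable option in the same unit (kWh per day per person) and compare. The toy calculation below illustrates the method with hypothetical round figures chosen for the example, not numbers from the talk or the book.

```python
def solar_kwh_per_day(area_m2, mean_insolation_w_per_m2=100.0, efficiency=0.1):
    """Average solar yield in kWh/day for a given panel area per person.
    The default insolation and efficiency are illustrative round figures."""
    average_watts = area_m2 * mean_insolation_w_per_m2 * efficiency
    return average_watts * 24 / 1000  # W sustained over a day -> kWh/day

# E.g. 10 m^2 of panels per person, 100 W/m^2 average sun, 10% efficiency:
yield_per_person = solar_kwh_per_day(10)  # 2.4 kWh/day
```

Comparing such figures against a per-person consumption estimate, option by option, is what it means for an energy plan to "add up".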
David MacKay was appointed as Chief Scientific Advisor to the Department of Energy and Climate Change (DECC) on 1st October 2009. The Chief Scientific Advisor’s role is to ensure that the Department’s policies and operations, and its contributions to wider Government issues, are underpinned by the best science and engineering advice available.
David MacKay studied Natural Sciences at Trinity College, Cambridge, then went to Caltech to complete a PhD in Computation and Neural Systems. In 1992 he returned to Cambridge as a Royal Society research fellow at Darwin College. In 1995 he became a university lecturer in the Department of Physics, where he was promoted in 1999 to a Readership and in 2003 to a Professorship in Natural Philosophy. He was elected a fellow of the Royal Society in 2009.
David MacKay’s research interests include reliable computation with unreliable hardware, and communication systems for the disabled. He believes that what the climate-change discussion needs is clear, simple numbers, so that we can understand just how big our challenge is, and not be duped by wishful thinking. His book on the subject (Sustainable Energy - Without The Hot Air: David MacKay, UIT Cambridge, 2009) has received endorsements from all sectors and from all political parties; The Economist called it “a tour de force”, and The Guardian called it “this year's must-read book”.
For further information please see http://www.withouthotair.com/.