ICML 2010 - Invited Speakers


Robert (Yisrael) Aumann

Center for Rationality, Hebrew University

An exploration of the conceptual foundations of the backward induction algorithm -- the algorithm that lies at the basis of all chess-playing programs, including Deep Blue. These foundations are not nearly as clear as they may at first seem. One of the central issues is that of "counterfactual conditionals": sentences like "If I had pushed my pawn, he could have trapped my queen."
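The backward induction idea the abstract refers to can be sketched in a few lines: starting from terminal positions, each node takes the best value among its children from the perspective of the player to move. The tree shape and payoffs below are illustrative assumptions, not taken from the talk.

```python
def backward_induction(node, maximizing):
    """Value of `node` assuming both players play optimally.

    A node is either a terminal payoff (a number, from player 1's
    perspective) or a list of child nodes. Players alternate moves.
    """
    if isinstance(node, (int, float)):  # leaf: terminal payoff
        return node
    values = [backward_induction(child, not maximizing) for child in node]
    return max(values) if maximizing else min(values)

# A toy depth-2 tree: player 1 chooses a branch, player 2 replies.
tree = [[3, 5], [2, 9], [0, 7]]
print(backward_induction(tree, True))  # prints 3
```

Player 1 picks the branch whose worst-case reply is best (min values 3, 2, 0, so the first branch, value 3); chess programs apply the same recursion, cut off at a fixed depth with a heuristic evaluation at the frontier.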

Robert Aumann was born in Frankfurt am Main, Germany, in 1930, to a well-to-do orthodox Jewish family. Fleeing Nazi persecution, he emigrated to the United States with his family in 1938, settling in New York. In the process, his parents lost everything, but nevertheless gave their two children an excellent Jewish and general education. Aumann attended Yeshiva elementary and high schools, got a bachelor's degree from the City College of New York in 1950, and a Ph.D. in mathematics from MIT in 1955.

He joined the mathematics department at the Hebrew University of Jerusalem in 1956, and has been there ever since. In 1990, he was among the founders of the Center for Rationality at the Hebrew University, an interdisciplinary research center, centered on Game Theory, with members from over a dozen different departments, including Business, Economics, Psychology, Computer Science, Law, Mathematics, Ecology, Philosophy, and others.

Aumann is the author of over ninety scientific papers and six books, and has held visiting positions at Princeton, Yale, Berkeley, Louvain, Stanford, Stony Brook, and NYU. He is a member of the American Academy of Arts and Sciences, the National Academy of Sciences (USA), the British Academy, and the Israel Academy of Sciences; holds honorary doctorates from the Universities of Chicago, Bonn, Louvain, City University of New York, and Bar Ilan University; and has received numerous prizes, including the Nobel Memorial Prize in Economic Sciences for 2005.

Aumann is married and had five children (the oldest was killed in Lebanon in 1982). He also has twenty-one grandchildren and seven great-grandchildren. When not working, he likes to hike, ski, cook, and study the Talmud.

Never-Ending Learning

Tom Mitchell

School of Computer Science, Carnegie Mellon University

What would it take to develop machine learners that run forever, each day improving their performance and also the accuracy with which they learn? This talk will describe our attempt to build a never-ending language learner, NELL, that runs 24 hours per day, forever, and that each day has two goals: (1) extract more structured information from the web to populate its growing knowledge base, and (2) learn to read better than yesterday, by using previously acquired knowledge to better constrain its subsequent learning.

The approach implemented by NELL is based on two key ideas: coupling the semi-supervised training of hundreds of different functions that extract different types of information from different web sources, and automatically discovering new constraints that more tightly couple the training of these functions over time. NELL has been running nonstop since January 2010, and had extracted a knowledge base containing hundreds of thousands of beliefs as of May 2010. This talk will describe NELL, its successes and its failures, and use it as a case study to explore the question of how to design never-ending learners.
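The coupling idea above can be illustrated with a toy bootstrapping loop: two category extractors each propose new instances, but a mutual-exclusion constraint across categories filters candidates before they are promoted to the knowledge base. The seed sets, patterns, and corpus below are invented for illustration; they are not NELL's actual data, categories, or code.

```python
# Toy coupled semi-supervised bootstrapping: candidates proposed by one
# extractor are rejected if they already belong to a mutually exclusive
# category (e.g. a city is assumed not to be an athlete).

seeds = {"city": {"haifa", "paris"}, "athlete": {"jordan", "pele"}}

# Hypothetical extraction patterns and the entities each matched in "text".
patterns = {
    "mayor of X": {"haifa", "london", "jordan"},
    "X scored a goal": {"pele", "messi", "london"},
}
pattern_category = {"mayor of X": "city", "X scored a goal": "athlete"}

kb = {cat: set(insts) for cat, insts in seeds.items()}
for pattern, entities in patterns.items():
    cat = pattern_category[pattern]
    # Coupling constraint: entities known under any other category are blocked.
    others = set().union(*(kb[c] for c in kb if c != cat))
    for entity in entities:
        if entity not in others:
            kb[cat].add(entity)

print(sorted(kb["city"]))  # 'jordan' is blocked by the athlete category
```

In NELL itself this coupling operates over hundreds of extractors and many constraint types, and the constraints themselves are learned over time; the point of the sketch is only that each function's training is constrained by the others' current beliefs.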

Tom M. Mitchell is the E. Fredkin University Professor and head of the Machine Learning Department at Carnegie Mellon University. His research interests lie in machine learning, natural language processing, artificial intelligence, and cognitive neuroscience. Mitchell is a member of the U.S. National Academy of Engineering, a Fellow of the American Association for the Advancement of Science (AAAS), and a Fellow and Past President of the Association for the Advancement of Artificial Intelligence (AAAI). Mitchell believes the field of machine learning will be the fastest growing branch of computer science during the 21st century.

Genes, Chromatin, and Transcription

Nir Friedman

School of Computer Science and Engineering, Hebrew University

A central question in molecular biology is understanding how cells process information and react accordingly. A crucial level of response is the regulation of gene transcription, the first step in producing the protein that a gene encodes. Transcriptional regulation is crucial for defining the cell’s identity and its ability to function. The main dogma is that regulatory “instructions” are part of the genetic blueprint encoded in the genome, the sequence of DNA base pairs. In recent years, there has been growing evidence for additional layers of epigenetic information that are passed from a cell to its daughter cells not through the DNA sequence. One of these layers is chromatin, the protein-DNA complex that forms chromosomes. The basic unit of chromatin is a nucleosome, around which about 150 base pairs of DNA are wound. Each nucleosome can be modified by the addition of multiple discrete marks, which in turn can be recognized by specific regulatory proteins that modify nucleosomes or impact transcription. As such, nucleosomes serve as a substrate for recording information by some regulatory mechanisms and reading it by others, and for passing information to daughter cells following cell division.

These new discoveries raise basic questions: what does chromatin state encode, how is it maintained, updated, and passed on to the next generation, and how does it interact with transcription? The research addressing these questions relies on new methodologies that collect massive amounts of data about chromatin state at each location along the genome. In this talk I will provide an overview of the field and describe ongoing investigations that attempt to answer these questions.

Nir Friedman is a Professor at the Hebrew University of Jerusalem, holding a joint appointment at the School of Computer Science and Engineering and the Institute of Life Sciences. He received his doctorate from Stanford in 1997, and after a two-year post-doctoral fellowship at U.C. Berkeley, joined the Hebrew University. He started his research career in Artificial Intelligence, mainly specializing in learning Graphical Probabilistic Models. In the last decade he has become more involved in research in Molecular and Cellular Biology and Systems Biology. His current research interests are in understanding regulatory mechanisms involved in gene expression, and in particular how these mechanisms combine to achieve specific expression of different genes. He is a co-founder of the B.Sc. in Computational Biology program at the Hebrew University, one of the first of its kind. Together with Daphne Koller he is a co-author of “Probabilistic Graphical Models: Principles and Techniques”.

Using the Web to do Social Science

Duncan Watts

Yahoo! Research

Social science is often concerned with the emergence of collective behavior out of the interactions of large numbers of individuals, but in this regard it has long suffered from a severe measurement problem—namely that interactions between people are hard to observe, especially at scale, over time, and at the same time as observing behavior. In this talk, I will argue that the technological revolution of the Internet is beginning to lift this constraint. To illustrate, I will describe several examples of internet-based research that would have been impractical to perform until recently, and that shed light on some longstanding sociological questions. Although internet-based research still faces serious methodological and procedural obstacles, I propose that the ability to study truly “social” dynamics at individual-level resolution will have dramatic consequences for social science.

Duncan Watts is a principal research scientist at Yahoo! Research, where he directs the Human Social Dynamics group. He is also an adjunct senior research fellow at Columbia University, and an external faculty member of the Santa Fe Institute and Nuffield College, Oxford. His research on social networks and collective dynamics has appeared in a wide range of journals, from Nature, Science, and Physical Review Letters to the American Journal of Sociology. He is also the author of Six Degrees: The Science of a Connected Age (W.W. Norton, 2003) and Small Worlds: The Dynamics of Networks between Order and Randomness (Princeton University Press, 1999). He holds a B.Sc. in Physics from the University of New South Wales, and a Ph.D. in Theoretical and Applied Mechanics from Cornell University.