Tutorial
Understanding your Neighbors: Practical Perspectives From Modern Analysis
Sanjoy Dasgupta · Samory Kpotufe
Nearest-neighbor methods are among the oldest and most ubiquitous approaches in machine learning and other areas of data analysis. They are often used directly as predictive tools, or indirectly as integral parts of more sophisticated modern approaches (e.g. recent uses that exploit deep representations, uses in geometric graphs for clustering, integrations into time-series classification, or uses in ensemble methods for matrix completion). Furthermore, they have strong connections to other tools such as classification and regression trees, or even kernel machines, which are all (more sophisticated) forms of local prediction. Interestingly, our understanding of these methods is still evolving, with many recent results shedding new light on their performance in a variety of settings reflecting modern uses and application domains. Our aim is to cover such new perspectives on k-NN and, in particular, to translate new theoretical insights (with practical implications) to a broader audience.
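As a concrete reference point for the prediction setting the tutorial discusses, here is a minimal sketch of k-NN classification in plain NumPy; the toy data and the choice of k are illustrative assumptions, not material from the tutorial.

```python
# Minimal k-NN classification sketch (illustrative only; the dataset and
# the value of k are arbitrary assumptions, not from the tutorial).
import numpy as np

def knn_predict(X_train, y_train, x_query, k=3):
    # Euclidean distances from the query point to every training point.
    dists = np.linalg.norm(X_train - x_query, axis=1)
    # Indices of the k closest training points.
    nearest = np.argsort(dists)[:k]
    # Majority vote among the labels of the k nearest neighbors.
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy usage: two well-separated clusters in the plane.
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([0.2, 0.1]), k=3))  # expected: 0
```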
Website: http://www.princeton.edu/~samory/Documents/ICML-kNN-Tutorial.pdf