Multidimensional Scaling: Approximation and Complexity
Metric Multidimensional scaling (MDS) is a classical method for generating meaningful (non-linear) low-dimensional embeddings of high-dimensional data. MDS has a long history in the statistics, machine learning, and graph drawing communities. In particular, the Kamada-Kawai force-directed graph drawing method is equivalent to MDS and is one of the most popular ways in practice to embed graphs into low dimensions. Despite its ubiquity, our theoretical understanding of MDS remains limited as its objective function is highly non-convex. In this paper, we prove that minimizing the Kamada-Kawai objective is NP-hard and give a provable approximation algorithm for optimizing it, which in particular is a PTAS on low-diameter graphs. We supplement this result with experiments suggesting possible connections between our greedy approximation algorithm and gradient-based methods.
Author Information
Erik Demaine (MIT)
Adam C Hesterberg (Harvard John A. Paulson School of Engineering and Applied Sciences)
Frederic Koehler (MIT)
Jayson Lynch (University of Waterloo)
John C Urschel (Massachusetts Institute of Technology)
Related Events (a corresponding poster, oral, or spotlight)
- 2021 Poster: Multidimensional Scaling: Approximation and Complexity (Thu. Jul 22nd, 04:00 -- 06:00 AM)
More from the Same Authors
- 2021: Representational aspects of depth and conditioning in normalizing flows (Frederic Koehler)
- 2021 Poster: Representational aspects of depth and conditioning in normalizing flows (Frederic Koehler · Viraj Mehta · Andrej Risteski)
- 2021 Spotlight: Representational aspects of depth and conditioning in normalizing flows (Frederic Koehler · Viraj Mehta · Andrej Risteski)
- 2017 Poster: Learning Determinantal Point Processes with Moments and Cycles (John C Urschel · Ankur Moitra · Philippe Rigollet · Victor-Emmanuel Brunel)
- 2017 Talk: Learning Determinantal Point Processes with Moments and Cycles (John C Urschel · Ankur Moitra · Philippe Rigollet · Victor-Emmanuel Brunel)