Representing shapes as level-sets of neural networks has recently proven useful for a range of shape analysis and reconstruction tasks. So far, such representations were computed using either: (i) pre-computed implicit shape representations; or (ii) loss functions explicitly defined over the neural level-sets.
In this paper we offer a new paradigm for computing high-fidelity implicit neural representations directly from raw data (i.e., point clouds, with or without normal information). We observe that a rather simple loss function, encouraging the neural network to vanish on the input point cloud and to have a unit-norm gradient, possesses an implicit geometric regularization property that favors smooth and natural zero level-set surfaces, avoiding bad zero-loss solutions. We provide a theoretical analysis of this property for the linear case and show that, in practice, our method leads to state-of-the-art implicit neural representations with a higher level of detail and fidelity than previous methods.
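The loss described in the abstract can be sketched in a few lines: a data term pushing the network to vanish on the input points, plus an eikonal term pushing the gradient toward unit norm at random samples. The following is a minimal numpy toy, not the paper's actual architecture or training setup; the network size, the sampling scheme, the weight `lam`, and names such as `igr_loss` are illustrative assumptions (the paper uses autograd on a deep MLP, approximated here by finite differences).

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny fixed MLP f: R^2 -> R (hypothetical stand-in for the paper's network).
W1 = rng.normal(size=(16, 2)); b1 = rng.normal(size=16)
W2 = rng.normal(size=(1, 16)); b2 = rng.normal(size=1)

def f(x):
    """Scalar implicit function; its zero level-set is the represented surface."""
    h = np.tanh(W1 @ x + b1)
    return float(W2 @ h + b2)

def grad_f(x, eps=1e-5):
    """Central finite-difference gradient of f (autograd would be used in practice)."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        d = np.zeros_like(x); d[i] = eps
        g[i] = (f(x + d) - f(x - d)) / (2 * eps)
    return g

def igr_loss(points, lam=0.1, n_samples=32):
    """Vanishing term on the point cloud + eikonal (unit-gradient) term."""
    data_term = np.mean([f(p) ** 2 for p in points])
    samples = rng.uniform(-1.0, 1.0, size=(n_samples, 2))
    eikonal = np.mean([(np.linalg.norm(grad_f(s)) - 1.0) ** 2 for s in samples])
    return data_term + lam * eikonal

# A toy "raw point cloud": points sampled from the unit circle.
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
cloud = np.stack([np.cos(theta), np.sin(theta)], axis=1)
print(igr_loss(cloud))
```

Minimizing this loss over the network weights drives `f` toward a signed-distance-like function whose zero level-set passes through the cloud; the eikonal term is what supplies the implicit regularization the abstract refers to.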
Author Information
Amos Gropp (Weizmann Institute of Science)
Lior Yariv (Weizmann Institute of Science)
Niv Haim (Weizmann Institute of Science)
Matan Atzmon (Weizmann Institute of Science)
Yaron Lipman (Weizmann Institute of Science)
More from the Same Authors
- 2023 Oral: Equivariant Polynomials for Graph Neural Networks »
  Omri Puny · Derek Lim · Bobak T Kiani · Haggai Maron · Yaron Lipman
- 2023 Poster: Equivariant Polynomials for Graph Neural Networks »
  Omri Puny · Derek Lim · Bobak T Kiani · Haggai Maron · Yaron Lipman
- 2023 Poster: SinFusion: Training Diffusion Models on a Single Image or Video »
  Yaniv Nikankin · Niv Haim · Michal Irani
- 2023 Poster: Multisample Flow Matching: Straightening Flows with Minibatch Couplings »
  Aram-Alexandre Pooladian · Heli Ben-Hamu · Carles Domingo i Enrich · Brandon Amos · Yaron Lipman · Ricky T. Q. Chen
- 2023 Poster: On Kinetic Optimal Probability Paths for Generative Models »
  Neta Shaul · Ricky T. Q. Chen · Maximilian Nickel · Matthew Le · Yaron Lipman
- 2023 Poster: MultiDiffusion: Fusing Diffusion Paths for Controlled Image Generation »
  Omer Bar-Tal · Lior Yariv · Yaron Lipman · Tali Dekel
- 2022 Poster: Matching Normalizing Flows and Probability Paths on Manifolds »
  Heli Ben-Hamu · Samuel Cohen · Joey Bose · Brandon Amos · Maximilian Nickel · Aditya Grover · Ricky T. Q. Chen · Yaron Lipman
- 2022 Spotlight: Matching Normalizing Flows and Probability Paths on Manifolds »
  Heli Ben-Hamu · Samuel Cohen · Joey Bose · Brandon Amos · Maximilian Nickel · Aditya Grover · Ricky T. Q. Chen · Yaron Lipman
- 2021 Poster: Phase Transitions, Distance Functions, and Implicit Neural Representations »
  Yaron Lipman
- 2021 Spotlight: Phase Transitions, Distance Functions, and Implicit Neural Representations »
  Yaron Lipman
- 2021 Poster: Riemannian Convex Potential Maps »
  Samuel Cohen · Brandon Amos · Yaron Lipman
- 2021 Spotlight: Riemannian Convex Potential Maps »
  Samuel Cohen · Brandon Amos · Yaron Lipman
- 2019 Talk: Yaron Lipman, Weizmann Institute of Science »
  Yaron Lipman
- 2019 Poster: On the Universality of Invariant Networks »
  Haggai Maron · Ethan Fetaya · Nimrod Segol · Yaron Lipman
- 2019 Oral: On the Universality of Invariant Networks »
  Haggai Maron · Ethan Fetaya · Nimrod Segol · Yaron Lipman