Oral | Tue 8:00 | Provable Multi-Task Representation Learning by Two-Layer ReLU Neural Networks | Liam Collins · Hamed Hassani · Mahdi Soltanolkotabi · Aryan Mokhtari · Sanjay Shakkottai

Poster | Thu 4:30 | Benign Overfitting in Two-Layer ReLU Convolutional Neural Networks for XOR Data | Xuran Meng · Difan Zou · Yuan Cao

Poster | Thu 4:30 | Neural Collapse for Cross-entropy Class-Imbalanced Learning with Unconstrained ReLU Features Model | Hien Dang · Tho Tran Huu · Tan Nguyen · Nhat Ho

Poster | Wed 2:30 | ReLU to the Rescue: Improve Your On-Policy Actor-Critic with Positive Advantages | Andrew Jesson · Christopher Lu · Gunshi Gupta · Nicolas Beltran-Velez · Angelos Filos · Jakob Foerster · Yarin Gal

Poster | Thu 2:30 | Convex Relaxations of ReLU Neural Networks Approximate Global Optima in Polynomial Time | Sungyoon Kim · Mert Pilanci

Poster | Wed 4:30 | Universal Consistency of Wide and Deep ReLU Neural Networks and Minimax Optimal Convergence Rates for Kolmogorov-Donoho Optimal Function Classes | Hyunouk Ko · Xiaoming Huo

Poster | Wed 2:30 | Symmetric Matrix Completion with ReLU Sampling | Huikang Liu · Peng Wang · Longxiu Huang · Qing Qu · Laura Balzano

Poster | Thu 2:30 | Stochastic Bandits with ReLU Neural Networks | Kan Xu · Hamsa Bastani · Surbhi Goel · Osbert Bastani

Poster | Thu 4:30 | Activation-Descent Regularization for Input Optimization of ReLU Networks | Hongzhan Yu · Sicun Gao

Poster | Tue 4:30 | ReLU Network with Width d+O(1) Can Achieve Optimal Approximation Rate | Chenghao Liu · Minghua Chen

Poster | Tue 4:30 | Provable Multi-Task Representation Learning by Two-Layer ReLU Neural Networks | Liam Collins · Hamed Hassani · Mahdi Soltanolkotabi · Aryan Mokhtari · Sanjay Shakkottai

Poster | Tue 4:30 | The Effect of Weight Precision on the Neuron Count in Deep ReLU Networks | Songhua He · Periklis Papakonstantinou