Initialization, normalization, and skip connections are believed to be three indispensable techniques for training very deep convolutional neural networks and obtaining state-of-the-art performance. This paper shows that deep vanilla ConvNets without normalization or skip connections can also be trained to achieve surprisingly good performance on standard image recognition benchmarks. This is achieved by enforcing the convolution kernels to be near-isometric during initialization and training, and by using a variant of ReLU that is shifted towards being an isometry. Further experiments show that, when combined with skip connections, such near-isometric networks can achieve performance on par with (for ImageNet) or better than (for COCO) the standard ResNet, even without any normalization. Our code is available at https://github.com/HaozhiQi/ISONet.
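The two ingredients named in the abstract, near-isometric kernels and a shifted ReLU, can be sketched numerically. The snippet below is a minimal NumPy illustration, not the authors' implementation: it assumes a shifted ReLU of the form SReLU(x) = max(x, b) with a negative shift b, and a simple orthogonality penalty ||WᵀW − I||²_F that pushes a flattened kernel matrix toward an isometry; the actual paper formulates the regularizer directly on convolution kernels.

```python
import numpy as np

def shifted_relu(x, b=-0.5):
    # SReLU(x) = max(x, b): moving the kink below zero keeps the
    # activation closer to the identity map (an isometry) than plain ReLU,
    # which zeroes out the entire negative half-line.
    return np.maximum(x, b)

def isometry_penalty(W):
    # Penalize the deviation of W^T W from the identity. A zero penalty
    # means the columns of W are orthonormal, i.e. W acts as an isometry
    # on its input space. Adding this term to the training loss nudges
    # (flattened) kernels toward near-isometry.
    k = W.shape[1]
    gram = W.T @ W
    return float(np.sum((gram - np.eye(k)) ** 2))

rng = np.random.default_rng(0)

# An orthonormal matrix (columns from a QR factorization) incurs
# a near-zero penalty; a random Gaussian matrix does not.
Q, _ = np.linalg.qr(rng.standard_normal((8, 4)))
print(isometry_penalty(Q))                         # ~0
print(isometry_penalty(rng.standard_normal((8, 4))))
```

In training, the penalty would be scaled by a coefficient and added to the task loss, so the kernels are only softly constrained rather than projected onto the orthogonal group at every step.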
Author Information
Haozhi Qi (UC Berkeley)
Chong You (UC Berkeley)
Xiaolong Wang (UCSD)

Our group has broad interests across Computer Vision, Machine Learning, and Robotics. Our focus is on learning 3D and dynamics representations from videos and physical robot interaction data. We explore various supervision signals drawn from the data itself, from language, and from common-sense knowledge. We leverage these representations to facilitate the learning of robot skills, with the goal of enabling robots to interact effectively with a wide range of objects and environments in the real physical world. Please see our individual research topics: Self-Supervised Learning, Video Understanding, Common Sense Reasoning, RL and Robotics, 3D Interaction, and Dexterous Hand.
Yi Ma (UC Berkeley)
Jitendra Malik (UC Berkeley)
More from the Same Authors
- 2021: Stabilizing Deep Q-Learning with ConvNets and Vision Transformers under Data Augmentation »
  Nicklas Hansen · Hao Su · Xiaolong Wang
- 2021: Disentangled Attention as Intrinsic Regularization for Bimanual Multi-Object Manipulation »
  Minghao Zhang · Pingcheng Jian · Yi Wu · Harry (Huazhe) Xu · Xiaolong Wang
- 2021: Learning Vision-Guided Quadrupedal Locomotion with Cross-Modal Transformers »
  Ruihan Yang · Minghao Zhang · Nicklas Hansen · Harry (Huazhe) Xu · Xiaolong Wang
- 2022: Robust Calibration with Multi-domain Temperature Scaling »
  Yaodong Yu · Stephen Bates · Yi Ma · Michael Jordan
- 2023: SCAFF-PD: Communication Efficient Fair and Robust Federated Learning »
  Yaodong Yu · Sai Praneeth Karimireddy · Yi Ma · Michael Jordan
- 2023: Cal-QL: Calibrated Offline RL Pre-Training for Efficient Online Fine-Tuning »
  Mitsuhiko Nakamoto · Yuexiang Zhai · Anikait Singh · Max Sobol Mark · Yi Ma · Chelsea Finn · Aviral Kumar · Sergey Levine
- 2023 Poster: Learning Dense Correspondences between Photos and Sketches »
  Xuanchen Lu · Xiaolong Wang · Judith E. Fan
- 2023 Poster: Hiera: A Hierarchical Vision Transformer without the Bells-and-Whistles »
  Chaitanya Ryali · Yuan-Ting Hu · Daniel Bolya · Chen Wei · Haoqi Fan · Po-Yao Huang · Vaibhav Aggarwal · Arkabandhu Chowdhury · Omid Poursaeed · Judy Hoffman · Jitendra Malik · Yanghao Li · Christoph Feichtenhofer
- 2023 Poster: Understanding the Complexity Gains of Single-Task RL with a Curriculum »
  Qiyang Li · Yuexiang Zhai · Yi Ma · Sergey Levine
- 2023 Poster: MonoNeRF: Learning Generalizable NeRFs from Monocular Videos without Camera Poses »
  Yang Fu · Ishan Misra · Xiaolong Wang
- 2023 Oral: Hiera: A Hierarchical Vision Transformer without the Bells-and-Whistles »
  Chaitanya Ryali · Yuan-Ting Hu · Daniel Bolya · Chen Wei · Haoqi Fan · Po-Yao Huang · Vaibhav Aggarwal · Arkabandhu Chowdhury · Omid Poursaeed · Judy Hoffman · Jitendra Malik · Yanghao Li · Christoph Feichtenhofer
- 2023 Poster: On Pre-Training for Visuo-Motor Control: Revisiting a Learning-from-Scratch Baseline »
  Nicklas Hansen · Zhecheng Yuan · Yanjie Ze · Tongzhou Mu · Aravind Rajeswaran · Hao Su · Huazhe Xu · Xiaolong Wang
- 2022 Poster: Temporal Difference Learning for Model Predictive Control »
  Nicklas Hansen · Hao Su · Xiaolong Wang
- 2022 Spotlight: Temporal Difference Learning for Model Predictive Control »
  Nicklas Hansen · Hao Su · Xiaolong Wang
- 2022 Poster: Predicting Out-of-Distribution Error with the Projection Norm »
  Yaodong Yu · Zitong Yang · Alexander Wei · Yi Ma · Jacob Steinhardt
- 2022 Spotlight: Predicting Out-of-Distribution Error with the Projection Norm »
  Yaodong Yu · Zitong Yang · Alexander Wei · Yi Ma · Jacob Steinhardt
- 2022 Poster: Image-to-Image Regression with Distribution-Free Uncertainty Quantification and Applications in Imaging »
  Anastasios Angelopoulos · Amit Pal Kohli · Stephen Bates · Michael Jordan · Jitendra Malik · Thayer Alshaabi · Srigokul Upadhyayula · Yaniv Romano
- 2022 Spotlight: Image-to-Image Regression with Distribution-Free Uncertainty Quantification and Applications in Imaging »
  Anastasios Angelopoulos · Amit Pal Kohli · Stephen Bates · Michael Jordan · Jitendra Malik · Thayer Alshaabi · Srigokul Upadhyayula · Yaniv Romano
- 2021 Poster: Differentiable Spatial Planning using Transformers »
  Devendra Singh Chaplot · Deepak Pathak · Jitendra Malik
- 2021 Poster: Compositional Video Synthesis with Action Graphs »
  Amir Bar · Roi Herzig · Xiaolong Wang · Anna Rohrbach · Gal Chechik · Trevor Darrell · Amir Globerson
- 2021 Spotlight: Compositional Video Synthesis with Action Graphs »
  Amir Bar · Roi Herzig · Xiaolong Wang · Anna Rohrbach · Gal Chechik · Trevor Darrell · Amir Globerson
- 2021 Spotlight: Differentiable Spatial Planning using Transformers »
  Devendra Singh Chaplot · Deepak Pathak · Jitendra Malik
- 2020 Poster: Which Tasks Should Be Learned Together in Multi-task Learning? »
  Trevor Standley · Amir Zamir · Dawn Chen · Leonidas Guibas · Jitendra Malik · Silvio Savarese
- 2020 Poster: Rethinking Bias-Variance Trade-off for Generalization of Neural Networks »
  Zitong Yang · Yaodong Yu · Chong You · Jacob Steinhardt · Yi Ma
- 2017 Poster: Fast k-Nearest Neighbour Search via Prioritized DCI »
  Ke Li · Jitendra Malik
- 2017 Talk: Fast k-Nearest Neighbour Search via Prioritized DCI »
  Ke Li · Jitendra Malik