Poster
in
Workshop: Dynamic Neural Networks

HARNAS: Neural Architecture Search Jointly Optimizing for Hardware Efficiency and Adversarial Robustness of Convolutional and Capsule Networks

Alberto Marchisio · Vojtech Mrazek · Andrea Massa · Beatrice Bussolino · Maurizio Martina · Muhammad Shafique


Abstract:

Neural Architecture Search (NAS) methodologies aim to find efficient Deep Neural Network (DNN) models for a given application under given system constraints. DNNs are both compute-intensive and vulnerable to adversarial attacks. To address these multiple design objectives, we propose HARNAS, a novel NAS framework that jointly optimizes the hardware efficiency and adversarial robustness of DNNs executed on specialized hardware accelerators. Besides traditional convolutional DNNs, HARNAS extends the search to more complex DNN types such as Capsule Networks. To reduce exploration time, HARNAS selects appropriate values of the adversarial perturbations employed in the NAS algorithm. Our evaluations yield a set of Pareto-optimal solutions that expose the tradeoffs between the above design objectives.
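The notion of Pareto-optimal solutions over the two objectives can be illustrated with a minimal sketch. This is not the authors' implementation; the candidate names, energy values, and robust-accuracy values below are hypothetical, and the dominance check simply keeps every architecture that no other architecture beats on both objectives at once.

```python
# Illustrative sketch (not the HARNAS code): filtering candidate
# architectures to the Pareto front when jointly scoring hardware
# efficiency and adversarial robustness. Each candidate is
# (name, energy_mj, robust_accuracy); lower energy and higher
# robust accuracy are both preferred.

def pareto_front(candidates):
    """Return the candidates not dominated by any other candidate.

    A candidate dominates another if it is no worse on both
    objectives and strictly better on at least one.
    """
    front = []
    for name, energy, acc in candidates:
        dominated = any(
            (e2 <= energy and a2 >= acc) and (e2 < energy or a2 > acc)
            for _, e2, a2 in candidates
        )
        if not dominated:
            front.append((name, energy, acc))
    return front

# Hypothetical candidates: (name, energy in mJ, robust accuracy).
candidates = [
    ("arch_a", 12.0, 0.61),
    ("arch_b", 18.5, 0.74),
    ("arch_c", 25.0, 0.72),  # dominated by arch_b (more energy, less accuracy)
    ("arch_d", 30.0, 0.80),
]

print(pareto_front(candidates))
```

Here `arch_c` is dropped because `arch_b` uses less energy while being more robust; the remaining three candidates each represent a different efficiency/robustness tradeoff, which is the kind of solution set the evaluations above report.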