Optimizing Network Simulation: Enhancing Performance Prediction Accuracy via Neural Architecture Search
ShaoChen He ⋅ Zirui Zhuang ⋅ Haifeng Sun ⋅ Xiaoyuan Fu ⋅ Qi Qi ⋅ Lei Zhang ⋅ Jianxin Liao ⋅ Jingyu Wang
Abstract
Existing machine learning models for network simulation excel at predicting average performance but, due to their reliance on mean squared error, systematically fail to capture the critical tail-latency and jitter that define modern network stability. This 'tail-blindness' renders them unreliable for latency-sensitive systems. We bridge this gap by introducing Accurate Neural Architecture Search (ANAS), a paradigm that automates the discovery of architectures for high-precision, distribution-aware network simulation. ANAS corrects the evaluation inaccuracies of weight-sharing NAS via a similarity-constrained search, employs a hybrid search space to model complex traffic, and uses a Wasserstein loss to optimize for the entire delay distribution, not just its mean. Empirically, the ANAS-discovered architecture is holistically superior: it reduces overall validation loss by 25.8\% compared to DeepQueueNet, demonstrating strong average-case performance, while simultaneously excelling at tail-sensitive metrics by lowering the normalized Wasserstein distance ($W_n$) by up to 69.8\%. This confirms its ability to faithfully model a comprehensive performance spectrum, encompassing both average and critical tail behaviors. The ANAS framework provides a practical methodology for automatically creating high-fidelity models of network devices, enabling more reliable validation of next-generation network protocols and algorithms.
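To illustrate the intuition behind the Wasserstein objective (this is a minimal sketch, not the paper's implementation; the function name and the toy delay samples are invented for illustration), note that for one-dimensional empirical distributions with equal sample counts, the $W_1$ distance reduces to the mean absolute difference of the sorted samples. A prediction that matches the mean delay but misses the tail is penalized, which is exactly the behavior a mean-squared objective cannot enforce at the distribution level:

```python
def wasserstein_1d(pred_delays, true_delays):
    """1-D W1 distance between two equal-size empirical delay distributions.

    For sorted samples a_i, b_i, W1 = mean(|a_i - b_i|). Illustrative only.
    """
    if len(pred_delays) != len(true_delays):
        raise ValueError("sketch assumes equal sample counts")
    a = sorted(pred_delays)
    b = sorted(true_delays)
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)


# Toy example: both predictions have the same mean delay (3.0) as the
# ground truth, but only one reproduces the heavy tail.
true_d = [1.0, 1.0, 1.0, 9.0]           # heavy-tailed "ground truth" delays
pred_mean_only = [3.0, 3.0, 3.0, 3.0]   # correct mean, tail entirely missed
pred_with_tail = [1.0, 1.0, 1.2, 8.8]   # mean and tail both captured

print(wasserstein_1d(pred_mean_only, true_d))  # 3.0  (large: tail missed)
print(wasserstein_1d(pred_with_tail, true_d))  # 0.1  (small: tail captured)
```

In an actual training loop this quantity would be computed with a differentiable estimator over batches of predicted and observed delays; the sketch only shows why the metric distinguishes distributions that a mean comparison treats as identical.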