
Data Scaling Laws in NMT: The Effect of Noise and Architecture
Yamini Bansal · Behrooz Ghorbani · Ankush Garg · Biao Zhang · Colin Cherry · Behnam Neyshabur · Orhan Firat

Thu Jul 21 03:00 PM -- 05:00 PM (PDT) @ Hall E #230

In this work, we study the effect of varying the architecture and training data quality on the data scaling properties of Neural Machine Translation (NMT). First, we establish that the test loss of encoder-decoder transformer models scales as a power law in the number of training samples, with a dependence on the model size. Then, we systematically vary aspects of the training setup to understand how they impact the data scaling laws. In particular, we change the following: (1) Architecture and task setup: we compare to a transformer-LSTM hybrid and a decoder-only transformer with a language modeling loss. (2) Noise level in the training distribution: we experiment with filtering and with adding i.i.d. synthetic noise. In all the above cases, we find that the data scaling exponents are minimally impacted, suggesting that marginally worse architectures or training data can be compensated for by adding more data. Lastly, we find that using back-translated data instead of parallel data can significantly degrade the scaling exponent.
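The abstract's central quantity is the data scaling exponent: how fast test loss falls as training data grows. As a minimal sketch of how such an exponent can be estimated, the code below fits a power law L(N) ≈ C · N^(−α) by linear least squares in log-log space (the irreducible-loss term is omitted for simplicity, and all numbers are synthetic illustrations, not values from the paper):

```python
import math

def fit_scaling_exponent(sizes, losses):
    """Estimate (C, alpha) in L(N) = C * N**(-alpha) via a least-squares
    line fit of log(loss) against log(size)."""
    xs = [math.log(n) for n in sizes]
    ys = [math.log(l) for l in losses]
    k = len(xs)
    mx, my = sum(xs) / k, sum(ys) / k
    # Slope of the log-log line is -alpha; intercept is log(C).
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return math.exp(intercept), -slope

# Synthetic measurements generated from a known law (C=20, alpha=0.3).
sizes = [10**5, 10**6, 10**7, 10**8]
losses = [20.0 * n ** -0.3 for n in sizes]

C, alpha = fit_scaling_exponent(sizes, losses)
print(round(C, 3), round(alpha, 3))  # → 20.0 0.3
```

Under this framing, the paper's findings amount to statements about α: architecture changes and i.i.d. noise leave α roughly unchanged, while back-translated data lowers it.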

Author Information

Yamini Bansal (Google)
Behrooz Ghorbani (Google Research)
Ankush Garg (Google)
Biao Zhang (University of Edinburgh)

Biao Zhang is a final-year Ph.D. student at the ILCC at the University of Edinburgh under the supervision of Prof. Rico Sennrich and Prof. Ivan Titov. His research focuses on improving neural machine translation (NMT), particularly its efficiency and universality, including developing lightweight (fast and effective) architectures for NMT, low-resource NMT, massively multilingual NMT, speech-to-text translation, context-aware NMT, and their intersections.

Colin Cherry (Google)

Research Scientist at Google Translate working on data quality and speech translation.

Behnam Neyshabur (Google)
Orhan Firat (Google)
