Title
Language Modeling At Scale
Abstract
We show how Zipf's Law can be used to scale up language modeling (LM) to take advantage of more training data and more GPUs. LM plays a key role in many important natural language applications such as speech recognition and machine translation. Scaling up LM is important since it is widely accepted by the community that there is no data like more data. Eventually, we would like to train on terabytes (TBs) of text (trillions of words). Modern training methods are far from this goal, because of various bottlenecks, especially memory (within GPUs) and communication (across GPUs). This paper shows how Zipf's Law can address these bottlenecks by grouping parameters for common words and character sequences, because U ≪ N, where U is the number of unique words (types) and N is the size of the training set (tokens). For a local batch size K with G GPUs and a D-dimensional embedding matrix, we reduce the original per-GPU memory and communication asymptotic complexity from Θ(GKD) to Θ(GK + UD). Empirically, we find U ∝ (GK)^0.64 on four publicly available large datasets. When we scale up the number of GPUs to 64, a factor of 8, training time speeds up by factors of up to 6.7× (for character LMs) and 6.3× (for word LMs) with negligible loss of accuracy. Our weak-scaling run on 192 GPUs on the Tieba dataset shows a 35% improvement in LM prediction accuracy by training on 93 GB of data (2.5× larger than the publicly available SOTA dataset), with only a 1.25× increase in training time, compared to 3 GB of the same dataset running on 6 GPUs.
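To make the Θ(GKD) → Θ(GK + UD) reduction concrete, here is a minimal Python sketch of the deduplication idea described in the abstract. This is not the authors' implementation; the vocabulary size, Zipf exponent, and numpy-based simulation are illustrative assumptions. It draws a Zipf-distributed global batch, counts the unique tokens U, and compares the naive per-GPU communication volume against the deduplicated one.

```python
import numpy as np

# Sketch (assumed, not the paper's code): instead of exchanging a dense
# D-dimensional embedding gradient for every token in the global batch
# (Theta(G*K*D) values per GPU), exchange the integer token IDs
# (Theta(G*K)) plus one gradient row per *unique* token (Theta(U*D)).

G, K, D = 64, 128, 512      # GPUs, local batch size, embedding dimension
VOCAB = 100_000             # illustrative vocabulary size

rng = np.random.default_rng(0)
# Zipf-distributed token IDs: frequent words repeat often, so U << G*K.
global_batch = rng.zipf(1.2, size=G * K) % VOCAB

unique_ids = np.unique(global_batch)
U = unique_ids.size

dense_volume = G * K * D        # naive: one D-dim row per token
dedup_volume = G * K + U * D    # token IDs + one row per unique token

print(f"global batch = {G * K} tokens, unique tokens U = {U}")
print(f"dense Theta(GKD):     {dense_volume:,} values")
print(f"dedup Theta(GK + UD): {dedup_volume:,} values")
```

Under these assumed settings the deduplicated volume is far smaller, which matches the abstract's empirical observation that U grows only sublinearly (roughly as (GK)^0.64) with the global batch size.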
Year
2018
DOI
10.1109/IPDPS.2019.00068
Venue
2019 IEEE 33rd International Parallel and Distributed Processing Symposium (IPDPS 2019)
Keywords
Language Modeling, Deep Learning, High Performance Computing, Communication Optimization
Field
Zipf's law, Embedding, Matrix (mathematics), Computer science, Terabyte, Machine translation, Theoretical computer science, Natural language, Artificial intelligence, Scaling, Machine learning, Language model
DocType
Journal
Volume
abs/1810.10045
ISSN
1530-2075
Citations
0
PageRank
0.34
References
0
Authors
6
Name                        Order   Citations   PageRank
Md. Mostofa Ali Patwary     1       338         14.72
Milind Chabbi               2       0           0.34
Heewoo Jun                  3       0           0.34
Jiaji Huang                 4       28          5.27
Gregory Frederick Diamos    5       1117        51.07
Kenneth W. Church           6       4730        141.25