Title
Large-scale deep learning at Baidu
Abstract
In the past 30 years, tremendous progress has been made in building effective shallow classification models. Despite this success, we have come to realize that, for many applications, the key bottleneck is not the quality of the classifiers but that of the features. The inability to automatically obtain useful features has become the main limitation of shallow models. Since 2006, learning high-level features from raw data using deep architectures has emerged as a major new learning paradigm. In the past two years, deep learning has produced many performance breakthroughs, for example, in image understanding and speech recognition. In this talk, I will walk through some of the latest deep learning technology advances within Baidu and discuss the main challenges, e.g., developing effective models for various applications and scaling up model training using many GPUs. At the end of the talk, I will discuss what might be interesting future directions.
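The abstract mentions scaling up model training across many GPUs but does not describe the approach used at Baidu. As a minimal, illustrative sketch only (not the system described in the talk), the snippet below shows the common synchronous data-parallel pattern such scaling typically relies on: each worker (standing in for a GPU) computes gradients on its own shard of the mini-batch, the gradients are averaged, and the shared parameters are updated. The objective (logistic regression), the helper names such as `data_parallel_step`, and the shard sizes are all assumptions made for the example.

```python
# Illustrative sketch of synchronous data-parallel SGD (not from the talk).
# "GPUs" are simulated as workers that each hold a shard of the mini-batch;
# in a real multi-GPU setup the averaging step is an all-reduce over devices.
import numpy as np

rng = np.random.default_rng(0)

def logistic_grad(w, X, y):
    """Gradient of the logistic loss on one worker's data shard."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) / len(y)

def data_parallel_step(w, shards, lr=0.1):
    """One synchronous update: average per-worker gradients, then apply SGD."""
    grads = [logistic_grad(w, X, y) for X, y in shards]  # one gradient per "GPU"
    g = np.mean(grads, axis=0)                           # gradient averaging
    return w - lr * g

# Toy run: 4 simulated workers, each with its own shard of the batch.
d, n_workers, shard_size = 20, 4, 256
w_true = rng.normal(size=d)
shards = []
for _ in range(n_workers):
    X = rng.normal(size=(shard_size, d))
    y = (X @ w_true > 0).astype(float)
    shards.append((X, y))

w = np.zeros(d)
for step in range(200):
    w = data_parallel_step(w, shards)
print("cosine similarity to true weights:",
      float(w @ w_true / (np.linalg.norm(w) * np.linalg.norm(w_true) + 1e-12)))
```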
Year
2013
DOI
10.1145/2505515.2514699
Venue
CIKM
Keywords
large-scale deep learning, huge wave, shallow model, main limitation, effective shallow classification model, effective model, new learning paradigm, deep architecture, high-level feature, deep learning, main challenge
Field
Data science, Bottleneck, Computer science, Raw data, Artificial intelligence, Deep learning
DocType
Conference
Citations
3
PageRank
0.48
References
0
Authors
1
Name
Yu, Kai
Order
1
Citations
14799
PageRank
255.21