Title
The Utility of Knowledge Transfer for Noisy Data
Abstract
Knowledge transfer research has traditionally focused on features that are relevant for a class of problems. In contrast, our research focuses on features that are irrelevant. When attempting to acquire a new concept from sensory data, a learner is exposed to significant volumes of extraneous data. In order to use knowledge transfer for quickly acquiring new concepts within a given class (e.g. learning a new character from the set of characters, a new face from the set of faces, a new vehicle from the set of vehicles, etc.), a learner must know which features are ignorable or it will repeatedly be forced to relearn them. We have previously demonstrated knowledge transfer in deep convolutional neural nets (DCNNs) (Gutstein, Fuentes, & Freudenthal 2007). In this paper, we give experimental results that demonstrate the increased importance of knowledge transfer when learning new concepts from noisy data. Additionally, we exploit the layered nature of DCNNs to discover more efficient and targeted methods of transfer. We observe that most of the transfer occurs within the 3.2% of weights that are closest to the input.
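The input-side, targeted transfer described in the abstract can be illustrated with a short sketch. This is not the authors' 2008 implementation: it assumes a PyTorch LeNet-style network on 28x28 single-channel images, and the names SmallDCNN and transfer_input_side_weights are illustrative. The idea shown is to copy the convolutional weights closest to the input from a net trained on related classes, freeze them, and learn only the later layers for the new concept.

```python
# Illustrative sketch only, not the authors' method: a LeNet-style DCNN in
# PyTorch where the layers nearest the input are copied from a source net
# trained on related classes and frozen before learning a new concept.
import torch
import torch.nn as nn

class SmallDCNN(nn.Module):
    def __init__(self, num_classes):
        super().__init__()
        # Input-side feature layers: the small fraction of weights where,
        # per the abstract, most of the useful transfer occurs.
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5), nn.Tanh(), nn.MaxPool2d(2),
            nn.Conv2d(6, 16, kernel_size=5), nn.Tanh(), nn.MaxPool2d(2),
        )
        # Later layers, retrained for the new concept (assumes 28x28 inputs).
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 4 * 4, 64), nn.Tanh(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

def transfer_input_side_weights(source: SmallDCNN, target: SmallDCNN) -> None:
    """Copy the layers closest to the input from a net trained on related
    classes, then freeze them so only the target's later layers are learned."""
    target.features.load_state_dict(source.features.state_dict())
    for p in target.features.parameters():
        p.requires_grad = False

# Usage: train `source` on known classes (e.g. previously seen characters),
# then fit only the classifier of `target` on the new, possibly noisy class.
source = SmallDCNN(num_classes=10)
target = SmallDCNN(num_classes=1)
transfer_input_side_weights(source, target)
optimizer = torch.optim.SGD(
    (p for p in target.parameters() if p.requires_grad), lr=0.01
)
```

Freezing the copied input-side layers mirrors the abstract's observation that the transferable knowledge sits in the weights nearest the input; the new class then only needs to fit the layers on top.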
Year
2008
Venue
FLAIRS Conference
Keywords
neural net
Field
Noisy data, Computer science, Knowledge transfer, Exploit, Artificial intelligence, Artificial neural network, Machine learning
DocType
Conference
Citations
1
PageRank
0.35
References
20
Authors
3
Name               Order   Citations   PageRank
Steven Gutstein    1       16          2.68
Olac Fuentes       2       246         34.55
Eric Freudenthal   3       396         33.16