Title |
---|
Learning in large linear perceptrons and why the thermodynamic limit is relevant to the real world |
Abstract |
---|
We present a new method for obtaining the response function 𝒢 and its average G, from which most of the properties of learning and generalization in linear perceptrons can be derived. We first rederive the known results for the 'thermodynamic limit' of infinite perceptron size N and show explicitly that 𝒢 is self-averaging in this limit. We then discuss extensions of our method to more general learning scenarios with anisotropic teacher space priors, input distributions, and weight decay terms. Finally, we use our method to calculate the finite N corrections of order 1/N to G and discuss the corresponding finite size effects on generalization and learning dynamics. An important spin-off is the observation that results obtained in the thermodynamic limit are often directly relevant to systems of fairly modest, 'real-world' sizes. |
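The self-averaging property mentioned in the abstract can be illustrated numerically. The sketch below is not the paper's method; it assumes a standard setup in which the response function is taken as 𝒢 = (1/N) Tr (A + λI)⁻¹, where A is the empirical input correlation matrix of p = αN Gaussian inputs and λ plays the role of a weight decay. The names `response`, `alpha`, and `lam` are illustrative choices. Sample-to-sample fluctuations of 𝒢 should shrink as N grows, which is what self-averaging means in practice:

```python
import numpy as np

def response(N, alpha, lam, rng):
    """One sample of the response function G = (1/N) Tr (A + lam*I)^-1,
    where A = (1/N) X^T X for p = alpha*N random Gaussian inputs."""
    p = int(alpha * N)
    X = rng.standard_normal((p, N))
    A = X.T @ X / N
    return np.trace(np.linalg.inv(A + lam * np.eye(N))) / N

rng = np.random.default_rng(0)
# Fluctuations of G across training sets shrink as N grows (self-averaging):
for N in (20, 80, 320):
    samples = [response(N, alpha=2.0, lam=0.1, rng=rng) for _ in range(30)]
    print(f"N={N:4d}  mean G={np.mean(samples):.4f}  std={np.std(samples):.2e}")
```

The printed standard deviations should fall rapidly with N, while the mean settles toward its thermodynamic-limit value; already at modest N the mean is close to that limit, consistent with the abstract's observation about 'real-world' sizes.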
Year | Venue | Keywords |
---|---|---
1994 | NIPS | thermodynamic limit |
Field | DocType | Citations
---|---|---
Mathematical optimization, Anisotropy, Weight decay, Thermodynamic limit, Learning dynamics, Artificial intelligence, Prior probability, Perceptron, Machine learning, Mathematics | Conference | 1

PageRank | References | Authors
---|---|---
0.39 | 1 | 1
Name | Order | Citations | PageRank
---|---|---|---
Peter Sollich | 1 | 298 | 38.11