Abstract |
---|
In this work, a probabilistic model is established for recurrent networks. The expectation-maximization (EM) algorithm is then applied, via a mean-field approximation, to derive a new fast training algorithm for recurrent networks. This algorithm converts training a complicated recurrent network into training an array of individual feedforward neurons, each of which is then trained via a linear weighted regression algorithm. Training time is improved by a factor of five to fifteen on benchmark problems. |
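The abstract's core subroutine is fitting each feedforward neuron by linear weighted regression. A minimal sketch of that step is below; the function name `fit_neuron_weighted` and the interpretation of `sample_weights` as per-sample quantities produced by the E-step are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def fit_neuron_weighted(X, y, sample_weights):
    """Fit one linear neuron by weighted least squares.

    Hypothetical illustration: each neuron's weights solve the
    weighted normal equations (X^T W X) w = X^T W y, where the
    diagonal weight matrix W would come from the mean-field E-step.
    """
    W = np.diag(sample_weights)
    # Append a constant column so the neuron learns a bias term.
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    # Solve the weighted normal equations for the weight vector.
    return np.linalg.solve(Xb.T @ W @ Xb, Xb.T @ W @ y)
```

With uniform sample weights this reduces to ordinary least squares, so fitting on exactly linear data recovers the generating slope and bias.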
Year | DOI | Venue
---|---|---
1998 | 10.1109/72.655025 | IEEE Transactions on Neural Networks
Keywords | Field | DocType
---|---|---
complicated recurrent network, individual feedforward neuron, new fast training algorithm, recurrent network, benchmark problem, new algorithm, mean-field approximation, training time, EM algorithm, linear weighted regression algorithm, probabilistic model, approximation algorithms, indexing terms, probability, process control, adaptive systems, transfer functions, vectors, probability density function, mean field approximation, expectation maximization, learning artificial intelligence, recurrent neural networks, expectation maximization algorithm, maximum likelihood estimation | Recurrent neural nets, Pattern recognition, Computer science, Expectation–maximization algorithm, Unit-weighted regression, Maximum likelihood, Statistical model, Artificial intelligence, Artificial neural network, Machine learning, Feed forward | Journal
Volume | Issue | ISSN
---|---|---
9 | 1 | 1045-9227
Citations | PageRank | References
---|---|---
20 | 1.62 | 26
Authors |
---|
2 |
Name | Order | Citations | PageRank
---|---|---|---
Sheng Ma | 1 | 1139 | 76.32
Chuanyi Ji | 2 | 812 | 124.04