Abstract |
---|
We create an artificial neural network which is a version of echo state networks (ESNs). ESNs are recurrent neural networks but, unlike most recurrent networks, they come with an efficient training method. We have previously (Wang et al., 2011) adapted this method using ideas from Neuroscale (Tipping, 1996) so that the network can project multivariate time series data onto a low-dimensional manifold, allowing the structure in the time series to be identified by eye. In this paper, we review work on a minimal-architecture echo state machine (Wang et al., 2011) in the context of visualisation and show that it does not perform as well as the original. We then discuss three factors which may affect the capability of the network - its structure, size and sparsity - and show that, of these three, by far the most important is the size of the reservoir of neurons. |
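The abstract notes that ESNs, unlike most recurrent networks, "come with an efficient training method": the recurrent reservoir weights are fixed at random, and only a linear readout is trained. A minimal sketch of this idea in Python/NumPy (the toy task, sizes, and all parameter values here are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task (illustrative): one-step-ahead prediction of a sine wave.
T = 500
u = np.sin(0.1 * np.arange(T + 1))
inputs, targets = u[:-1], u[1:]

# Reservoir: a fixed random recurrent weight matrix, rescaled so its
# spectral radius is below 1 (the usual echo-state heuristic).
n_res = 100  # reservoir size - the factor the paper finds most important
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=n_res)

# Drive the reservoir and collect its states; W and W_in are never trained.
states = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(T):
    x = np.tanh(W @ x + W_in * inputs[t])
    states[t] = x

# The only training step: ridge regression for the linear readout.
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ targets)

pred = states @ W_out
mse = np.mean((pred[100:] - targets[100:]) ** 2)  # skip an initial washout
```

Because training reduces to a single linear least-squares solve, it is cheap compared with backpropagation through time, which is what makes reservoir size easy to scale as the paper's experiments require.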
Year | DOI | Venue |
---|---|---|
2016 | 10.1504/IJCSE.2016.074561 | Periodicals |
Keywords | Field | DocType
---|---|---|
echo state machines, visualisation, multidimensional scaling | Time series, Multidimensional scaling, Visualization, Computer science, Recurrent neural network, Finite-state machine, Artificial intelligence, Artificial neural network, Machine learning, Manifold | Journal

Volume | Issue | ISSN
---|---|---|
12 | 1 | 1742-7185

Citations | PageRank | References
---|---|---|
0 | 0.34 | 5

Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Tzai-der Wang | 1 | 119 | 15.65 |
Xiaochuan Wu | 2 | 2 | 1.11 |
Colin Fyfe | 3 | 324 | 35.74 |