Title |
---|
Variance Preserving Initialization For Training Deep Neuromorphic Photonic Networks With Sinusoidal Activations |
Abstract |
---|
Photonic neuromorphic hardware can provide significant performance benefits for Deep Learning (DL) applications by accelerating DL models and reducing their energy requirements. However, photonic neuromorphic architectures employ different activation elements than those traditionally used in DL, which slows the convergence of training. This paper proposes an initialization scheme for efficiently training deep photonic networks that employ quadratic sinusoidal activation functions, overcoming this limitation and leading to faster and more stable training of deep photonic neural networks. The ability of the proposed method to improve the convergence of training is experimentally demonstrated using two different DL architectures and two datasets. |
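The abstract names a variance-preserving initialization for sinusoidal activations but gives no formula. The sketch below is a rough illustration of the general idea, not the paper's scheme: it uses a plain `sin` activation and draws normally distributed weights with fan-in-scaled variance, where both the activation choice and the `gain=2.0` value are assumptions made for this sketch.

```python
import numpy as np

def variance_preserving_init(fan_in, fan_out, gain=2.0, rng=None):
    """Fan-in-scaled normal init (hypothetical sketch, not the paper's formula).

    Weights are drawn from N(0, gain / fan_in) so that the variance of the
    pre-activations z = x @ W stays roughly constant from layer to layer.
    """
    rng = np.random.default_rng() if rng is None else rng
    return rng.normal(0.0, np.sqrt(gain / fan_in), size=(fan_in, fan_out))

# Propagate a batch through a 10-layer stack with a plain sin() activation
# (an assumption for this sketch) and check that the activation variance
# neither explodes nor collapses toward zero.
rng = np.random.default_rng(0)
x = rng.normal(size=(1024, 256))
for _ in range(10):
    W = variance_preserving_init(256, 256, gain=2.0, rng=rng)
    x = np.sin(x @ W)
print(f"activation variance after 10 layers: {x.var():.3f}")
```

With gain 2 and pre-activations z ~ N(0, v), the layer-to-layer variance recursion is v ← 1 − e^(−2v), which has a stable nonzero fixed point near 0.8, so the activation variance settles instead of drifting; with a gain below 1 the same recursion behaves like v ← (gain)·v for small v, and activations vanish with depth.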
Year | DOI | Venue
---|---|---
2019 | 10.1109/icassp.2019.8682218 | 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

Keywords | Field | DocType
---|---|---
Neuromorphic Hardware, Photonic Neural Networks, Sinusoidal Activations | Neuromorphic hardware, Convergence (routing), Pattern recognition, Computer science, Neuromorphic engineering, Quadratic equation, Artificial intelligence, Deep learning, Initialization, Artificial neural network, Computer engineering, Photonics | Conference

ISSN | Citations | PageRank
---|---|---
1520-6149 | 0 | 0.34

References | Authors
---|---
0 | 5
Name | Order | Citations | PageRank
---|---|---|---
N. Passalis | 1 | 117 | 33.70 |
George Mourgias-Alexandris | 2 | 2 | 4.34 |
Apostolos Tsakyridis | 3 | 3 | 4.63 |
Nikos Pleros | 4 | 25 | 23.69 |
Anastasios Tefas | 5 | 2055 | 177.05 |