Title
Solving Regression Models With First Order Stochastic Based Optimizers
Abstract
Most recently proposed regression-based Deep Learning (DL) algorithms are based on Convolutional Neural Networks (CNNs) trained with an l2 loss function. To avoid its vulnerability to outliers, some authors propose combining regression methods with a classification component, such as a combination of the l2 loss function and softmax, or a strategy based on bounding boxes, which improves the performance of the DL neural network. In this research we propose using a combination of first-order Stochastic Gradient Descent (SGD) optimizers for Artificial Neural Networks (ANNs) with Deep Learning (DL) and a Deep Learning Multilayer Perceptron (DLMLP) with Back Propagation (BP). We focus on the optimization of stochastic objectives and parameters, and on the implementation of different activation functions, in solving medical regression problems in high-dimensional space.
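The abstract contrasts the standard l2-loss regression setup with first-order stochastic optimizers such as SGD with momentum. A minimal illustrative sketch (not the authors' code; network size, learning rate, and momentum coefficient are assumed) of a one-hidden-layer MLP trained by backpropagation with classical momentum on an l2 regression loss:

```python
import numpy as np

# Synthetic regression data (assumed for illustration): linear target + noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=200)

# One hidden layer with tanh activation, linear output.
W1 = rng.normal(scale=0.1, size=(3, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16,));   b2 = 0.0
vel = [np.zeros_like(p) for p in (W1, b1, W2)]
v_b2 = 0.0
lr, mu = 0.05, 0.9  # learning rate and momentum coefficient (assumed values)

for epoch in range(200):
    h = np.tanh(X @ W1 + b1)              # forward pass, hidden activations
    pred = h @ W2 + b2
    err = pred - y                        # gradient of 0.5*l2 loss w.r.t. pred
    n = len(y)
    gW2 = h.T @ err / n                   # backprop: output layer gradients
    gb2 = err.mean()
    dh = np.outer(err, W2) * (1 - h**2)   # backprop through tanh
    gW1 = X.T @ dh / n
    gb1 = dh.mean(axis=0)
    # Classical momentum update: v <- mu*v - lr*g ; p <- p + v
    for i, (p, g) in enumerate([(W1, gW1), (b1, gb1), (W2, gW2)]):
        vel[i] = mu * vel[i] - lr * g
        p += vel[i]
    v_b2 = mu * v_b2 - lr * gb2
    b2 += v_b2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

Swapping the momentum update for a Nesterov, Adam, or AdaMax rule changes only the last few lines of the loop, which is what makes such a comparison of first-order optimizers straightforward.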
Year
2019
DOI
10.1109/ner.2019.8716995
Venue
2019 9TH INTERNATIONAL IEEE/EMBS CONFERENCE ON NEURAL ENGINEERING (NER)
Keywords
neural networks, optimizers, backpropagation, stochastic gradient descent, momentum, Nesterov momentum, Adam, AdaMax, activation function, loss function
Field
Computer science, Regression analysis, First order, Artificial intelligence, Machine learning
DocType
Conference
ISSN
1948-3546
Citations
0
PageRank
0.34
References
0
Authors
3
Name                 Order  Citations  PageRank
Natacha Gueorguieva  1      63         12.46
Iren Valova          2      136        25.44
Brian Keegan         3      0          0.34