The sigmoid activation function produces an S-shaped curve whose output always lies between 0 and 1: it maps large negative inputs toward 0 and large positive inputs toward 1.
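A minimal sketch in plain Python (names here are illustrative) makes this squashing behaviour concrete; sigmoid is defined as σ(x) = 1 / (1 + e^(−x)):

```python
import math

def sigmoid(x):
    # Squashes any real input into the open interval (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

# Large negative inputs approach 0, large positive inputs approach 1
for x in (-10.0, -1.0, 0.0, 1.0, 10.0):
    print(f"sigmoid({x:+.0f}) = {sigmoid(x):.5f}")
# sigmoid(-10) = 0.00005 ... sigmoid(0) = 0.50000 ... sigmoid(+10) = 0.99995
```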
The major drawback of the sigmoid activation function is saturation: when its output is close to 0 or 1, the gradient is nearly zero, so during backpropagation the weight updates become vanishingly small and the network effectively stops learning. This is the vanishing gradient problem.
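The derivative of sigmoid is σ'(x) = σ(x)(1 − σ(x)), which peaks at only 0.25 at x = 0 and decays rapidly in both directions. The following sketch (reusing the sigmoid function above) shows how little gradient survives in the saturated regions:

```python
def sigmoid_grad(x):
    # Derivative of sigmoid: sigma(x) * (1 - sigma(x)); maximum is 0.25 at x = 0
    s = sigmoid(x)
    return s * (1.0 - s)

for x in (0.0, 2.0, 5.0, 10.0):
    print(f"sigmoid'({x:.0f}) = {sigmoid_grad(x):.6f}")
# sigmoid'(0) = 0.25, but sigmoid'(10) ~= 0.000045 -- almost no
# gradient flows back through a saturated unit
```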
[Figure: the sigmoid curve, an S-shaped function bounded between 0 and 1]