Setting up the activation function with sigmoid
An activation function is used in a neural network to help determine the output, whether it is a yes or no, true or false, or in our case 0 or 1 (male/female). At this point, the inputs have been normalized, multiplied by the weights w1 and w2, and summed with the bias b. However, the weights and bias are completely random at the moment and have not yet been optimized to produce a predicted output that matches the actual output. The missing link in building the predicted output is the activation function; here we use the sigmoid function, sigmoid(z) = 1 / (1 + e^(-z)), where z is the weighted sum of the inputs plus the bias.
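To make this concrete, here is a minimal NumPy sketch of a sigmoid activation applied to the weighted sum. The names x1, x2, w1, w2, and b mirror the chapter's notation, but the specific values below are illustrative assumptions, not the values used in the recipe:

```python
import numpy as np

def sigmoid(z):
    """Squash the weighted sum z into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative values; in the recipe, w1, w2, and b start out random
# and x1, x2 are the normalized inputs.
x1, x2 = 0.8, 0.2           # normalized inputs
w1, w2, b = 0.5, -0.3, 0.1  # weights and bias

z = w1 * x1 + w2 * x2 + b   # weighted sum plus bias
output = sigmoid(z)         # predicted probability of class 1
print(output)               # ~0.61
```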

If the weighted sum is a large negative number, the sigmoid produces an activation close to 0. Likewise, if the weighted sum is a large positive number, it produces an activation close to 1. This function is useful because it squashes the output into the range between 0 and 1, which can be interpreted as a probability and rounded to a binary outcome, making it well suited for classification. The consequences of these outputs will be discussed and clarified in the remainder of this chapter.
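A quick demonstration of this saturating behavior (a standalone check, not part of the recipe itself) shows the sigmoid pinning near 0 for large negative inputs and near 1 for large positive ones:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for z in (-10.0, -1.0, 0.0, 1.0, 10.0):
    print(f"sigmoid({z:6.1f}) = {sigmoid(z):.6f}")

# sigmoid(-10.0) ~ 0.000045  -> activation near 0
# sigmoid(  0.0) =  0.500000 -> the decision boundary
# sigmoid( 10.0) ~ 0.999955  -> activation near 1
```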