1.
a.
Suppose that a neural network has been trained to classify students as either likely to graduate or not likely to graduate based on various input parameters. There is a single output neuron with three inputs x1, x2, x3 and one output y; the output is used to make the classification. The weights of the three inputs of this neuron are w1, w2, and w3, and the bias of the neuron is b. Find the output y for the given input values if the activation function is each of the following (a computational sketch appears after the list):
- ReLU
- Leaky ReLU, defined by max(0.2x, x)
- Sigmoid
- Softplus
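
A minimal computational sketch in Python, assuming placeholder values for the weights, bias, and inputs (w = [0.5, -0.3, 0.8], b = 0.1, and x = [1.0, 2.0, 0.5] are illustrative stand-ins, not the values from the original problem):

```python
import math

# Hypothetical values for illustration only; substitute the weights,
# bias, and inputs given in the problem statement.
w = [0.5, -0.3, 0.8]   # weights w1, w2, w3 (assumed)
b = 0.1                # bias b (assumed)
x = [1.0, 2.0, 0.5]    # inputs x1, x2, x3 (assumed)

# Pre-activation (weighted sum): z = w1*x1 + w2*x2 + w3*x3 + b
z = sum(wi * xi for wi, xi in zip(w, x)) + b

# Apply each of the four candidate activation functions to z.
outputs = {
    "ReLU":       max(0.0, z),
    "Leaky ReLU": max(0.2 * z, z),
    "Sigmoid":    1.0 / (1.0 + math.exp(-z)),
    "Softplus":   math.log(1.0 + math.exp(z)),
}

for name, y in outputs.items():
    print(f"{name}: y = {y:.4f}")
```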
b.
Which of the four activation functions, applied at the output layer, would be best suited for this classification task? Using this activation function, how would the neural network classify the student?
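
As a hedged illustration for part (b), assuming the output is read as the probability that the student graduates and compared against a 0.5 decision threshold (the threshold is an assumption, not stated in the problem):

```python
import math

def classify(z, threshold=0.5):
    """Map the neuron's pre-activation z to a class label via the sigmoid."""
    p = 1.0 / (1.0 + math.exp(-z))   # interpreted as P(student graduates) -- an assumption
    return "likely to graduate" if p >= threshold else "not likely to graduate"

print(classify(0.4))   # e.g. z = 0.4 gives p ≈ 0.60 -> "likely to graduate"
```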