Ben-Gurion - Deep Learning Quiz


During forward propagation, the forward function for a layer l needs to know what the activation function of that layer is (sigmoid, tanh, ReLU, etc.). During backpropagation, the corresponding backward function also needs to know which activation function was used for layer l, since the gradient depends on it. True/False?

True. During backpropagation you need to know which activation was used in the forward pass in order to compute the correct derivative.
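This can be sketched with a minimal NumPy example (the function names `forward` and `backward` are illustrative, not from any particular framework): the backward function must apply the derivative g'(z) of the same activation g that the forward function used, and different activations give different gradients for the same pre-activation z.

```python
import numpy as np

def forward(z, activation):
    """Apply the layer's activation g to the pre-activation z."""
    if activation == "sigmoid":
        return 1.0 / (1.0 + np.exp(-z))
    if activation == "relu":
        return np.maximum(0.0, z)
    raise ValueError(f"unknown activation: {activation}")

def backward(dA, z, activation):
    """Compute dZ = dA * g'(z); g' depends on which activation was used."""
    if activation == "sigmoid":
        s = 1.0 / (1.0 + np.exp(-z))
        return dA * s * (1.0 - s)          # sigmoid'(z) = s(z) * (1 - s(z))
    if activation == "relu":
        return dA * (z > 0).astype(z.dtype)  # relu'(z) = 1 if z > 0 else 0
    raise ValueError(f"unknown activation: {activation}")

z = np.array([-1.0, 0.5, 2.0])
dA = np.ones_like(z)
# The same upstream gradient dA yields different dZ per activation,
# which is why backward must know the activation used in forward.
print(backward(dA, z, "sigmoid"))
print(backward(dA, z, "relu"))
```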

* Question added on: 22-06-2018