Logs of the training process.
From the left, the two panels show accuracy and cross-entropy loss.
Cases compared: ReLU, Sigmoid, and ReLU with He initialization.
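The three cases above differ in how well gradients survive backpropagation through many layers. A minimal NumPy sketch of that effect is below; it is an illustration of the idea, not code from this repository, and the helper `grad_norms` and its parameters are hypothetical names chosen here.

```python
import numpy as np

def grad_norms(depth=10, width=256, activation="relu", init="normal", seed=0):
    """Backpropagate a unit upstream gradient through a deep MLP and
    return the gradient norm at each layer (hypothetical helper)."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(width)
    # Weight init: small-variance normal vs He (std = sqrt(2 / fan_in)).
    scale = np.sqrt(2.0 / width) if init == "he" else 0.01
    Ws = [rng.standard_normal((width, width)) * scale for _ in range(depth)]

    # Forward pass, caching pre-activations for the backward pass.
    pre, h = [], x
    for W in Ws:
        z = W @ h
        pre.append(z)
        h = np.maximum(z, 0.0) if activation == "relu" else 1.0 / (1.0 + np.exp(-z))

    # Backward pass: multiply by the local activation derivative, then W^T.
    g, norms = np.ones(width), []
    for W, z in zip(reversed(Ws), reversed(pre)):
        if activation == "relu":
            local = (z > 0.0).astype(float)       # ReLU' is 0 or 1
        else:
            s = 1.0 / (1.0 + np.exp(-z))
            local = s * (1.0 - s)                 # sigmoid' is at most 0.25
        g = W.T @ (g * local)
        norms.append(np.linalg.norm(g))
    return norms  # norms[-1] is the gradient reaching the first layer

sig = grad_norms(activation="sigmoid")
relu_he = grad_norms(activation="relu", init="he")
```

With sigmoid and small weights, each layer shrinks the gradient (the derivative is capped at 0.25), so `sig[-1]` collapses toward zero, while He-initialized ReLU keeps the gradient norm roughly stable across depth.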
yeonghyeon / compare_activation_function
Compare the vanishing gradient problem case by case.