Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)
International Conference on Learning Representations (ICLR), 2016.
We introduce the "exponential linear unit" (ELU), which speeds up learning in deep neural networks and leads to higher classification accuracies. Like rectified linear units (ReLUs), leaky ReLUs (LReLUs), and parametrized ReLUs (PReLUs), ELUs alleviate the vanishing gradient problem via the identity for positive values. However, ELUs have improved learning characteristics compared to units with other activation functions.
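The activation sketched in the abstract (identity for positive inputs, a saturating exponential for negative ones) can be written in a few lines of NumPy; the function name and the default alpha = 1.0 here are illustrative choices, with alpha being the paper's hyperparameter controlling the negative saturation value:

```python
import numpy as np

def elu(x, alpha=1.0):
    """Exponential linear unit.

    Returns x for x > 0, and alpha * (exp(x) - 1) for x <= 0,
    so the output saturates at -alpha for large negative inputs.
    """
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x, alpha * (np.expm1(x)))
```

For example, `elu(2.0)` is `2.0`, while `elu(-1.0)` is about `-0.632`, approaching `-1.0` as the input becomes more negative.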