Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)

International Conference on Learning Representations (ICLR), 2016.

Abstract:

We introduce the "exponential linear unit" (ELU) which speeds up learning in deep neural networks and leads to higher classification accuracies. Like rectified linear units (ReLUs), leaky ReLUs (LReLUs) and parametrized ReLUs (PReLUs), ELUs alleviate the vanishing gradient problem via the identity for positive values. However, ELUs have improved learning characteristics compared to units with other activation functions.
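The activation the paper introduces is the identity for positive inputs and α(exp(x) − 1) for non-positive inputs, which is what gives the unit the negative values mentioned in the abstract. Below is a minimal NumPy sketch of the activation and its derivative; the function names and the default α = 1.0 are illustrative choices, not code from the paper.

```python
import numpy as np

def elu(x, alpha=1.0):
    # ELU(x) = x            for x > 0
    #        = alpha*(e^x-1) for x <= 0   (expm1 is numerically stable for small x)
    return np.where(x > 0, x, alpha * np.expm1(x))

def elu_grad(x, alpha=1.0):
    # Derivative: 1 for x > 0, and ELU(x) + alpha for x <= 0.
    return np.where(x > 0, 1.0, elu(x, alpha) + alpha)
```

For example, elu(np.array([-2.0, 0.0, 2.0])) returns roughly [-0.865, 0.0, 2.0]: negative inputs saturate toward -α instead of being clipped to zero as with ReLU, which lets mean activations move closer to zero.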
