A survey on recently proposed activation functions for Deep Learning
Paper link:
Brief review
The paper surveys a number of recently proposed activation functions but does not report any experimental results. The notes below record the ones I was less familiar with.
Swish
f(x) = x × sigmoid(βx)
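A minimal NumPy sketch of Swish as defined above. Treating β as a fixed hyperparameter is an assumption; the original Swish paper also considers a trainable β, and β = 1 recovers SiLU.

```python
import numpy as np

def swish(x, beta=1.0):
    # f(x) = x * sigmoid(beta * x) = x / (1 + exp(-beta * x))
    # beta = 1 recovers SiLU; large beta approaches ReLU
    return x / (1.0 + np.exp(-beta * x))
```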
Mish
f(x) = x × tanh(softplus(x))
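A sketch of Mish in the same style; `np.logaddexp(0, x)` computes softplus(x) = log(1 + eˣ) without overflow for large x.

```python
import numpy as np

def mish(x):
    # f(x) = x * tanh(softplus(x)); logaddexp(0, x) = log(1 + exp(x)), computed stably
    return x * np.tanh(np.logaddexp(0.0, x))
```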
Growing Cosine Unit (GCU)
C(z) = z × cos(z)
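A one-line sketch of GCU; unlike the units above, it oscillates and changes sign.

```python
import numpy as np

def gcu(z):
    # C(z) = z * cos(z): oscillatory, non-monotonic, amplitude grows with |z|
    return z * np.cos(z)
```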
Non-Monotonic Cubic Unit (NCU)
f(z) = z − z³
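The same style of sketch for NCU.

```python
import numpy as np

def ncu(z):
    # f(z) = z - z^3: non-monotonic, with extrema at z = ±1/sqrt(3)
    return z - z**3
```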
Shifted Quadratic Unit (SQU)
f(z) = z² + z
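A sketch of SQU for completeness.

```python
import numpy as np

def squ(z):
    # f(z) = z^2 + z: a parabola shifted so that f(0) = 0, minimum -1/4 at z = -1/2
    return z**2 + z
```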
Decaying Sine Unit (DSU)
f(z) = (π/2) × (sinc(z−π) − sinc(z+π))
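A sketch of DSU assuming the unnormalized convention sinc(z) = sin(z)/z with sinc(0) = 1; note that `np.sinc` computes the normalized sin(πx)/(πx), hence the rescaled argument.

```python
import numpy as np

def sinc_unnorm(z):
    # unnormalized sinc(z) = sin(z)/z with sinc(0) = 1
    # np.sinc(x) computes the normalized sin(pi*x)/(pi*x), so rescale the argument
    return np.sinc(z / np.pi)

def dsu(z):
    # f(z) = (pi/2) * (sinc(z - pi) - sinc(z + pi))
    # behaves like sin(z) near 0 and decays toward 0 as |z| grows
    return (np.pi / 2.0) * (sinc_unnorm(z - np.pi) - sinc_unnorm(z + np.pi))
```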
Shifted Sinc Unit (SSU)
f(z) = π × sinc(z−π)
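SSU under the same (assumed) unnormalized-sinc convention; f(0) = 0 since sin(−π) = 0.

```python
import numpy as np

def ssu(z):
    # f(z) = pi * sinc(z - pi), with unnormalized sinc(z) = sin(z)/z
    # np.sinc is normalized, so divide the argument by pi
    return np.pi * np.sinc((z - np.pi) / np.pi)
```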