A survey on recently proposed activation functions for Deep Learning

Paper link:

Brief review

The paper surveys a number of activation functions but does not report any experimental results. This note records the ones I was less familiar with; a small implementation sketch follows the list below.

  1. Swish

    f(x) = x \times \operatorname{sigmoid}(\beta x)
  2. Mish

    f(x) = x \times \tanh(\operatorname{softplus}(x))
  3. Growing Cosine Unit (GCU)

    C(z) = z \times \cos z
  4. Non-Monotonic Cubic Unit (NCU)

    f(z) = z - z^{3}
  5. Shifted Quadratic Unit (SQU)

    f(z) = z^{2} + z
  6. Decaying Sine Unit (DSU)

    f(z) = \frac{\pi}{2}\left(\operatorname{sinc}(z-\pi) - \operatorname{sinc}(z+\pi)\right)
  7. Shifted Sinc Unit (SSU)

    f(z) = \pi \operatorname{sinc}(z-\pi)
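
The survey does not provide code, so the following is only a minimal sketch of the seven functions above, assuming PyTorch. The function names, the `beta` default for Swish, and the `_sinc` helper are my own choices; in particular, the sketch assumes the unnormalized sinc convention sin(z)/z, since the survey's formulas do not say which variant is meant (note that `torch.sinc` is the normalized sin(πz)/(πz) version, so it is not used here).

```python
import torch
import torch.nn.functional as F

def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x); beta=1.0 is an assumed default
    return x * torch.sigmoid(beta * x)

def mish(x):
    # Mish: x * tanh(softplus(x))
    return x * torch.tanh(F.softplus(x))

def gcu(x):
    # Growing Cosine Unit: z * cos(z)
    return x * torch.cos(x)

def ncu(x):
    # Non-Monotonic Cubic Unit: z - z^3
    return x - x ** 3

def squ(x):
    # Shifted Quadratic Unit: z^2 + z
    return x ** 2 + x

def _sinc(x):
    # Unnormalized sinc, sin(z)/z, with the removable singularity at 0 filled in.
    # (Assumed convention; torch.sinc computes sin(pi*z)/(pi*z) instead.)
    return torch.where(x == 0, torch.ones_like(x), torch.sin(x) / x)

def dsu(x):
    # Decaying Sine Unit: (pi/2) * (sinc(z - pi) - sinc(z + pi))
    return 0.5 * torch.pi * (_sinc(x - torch.pi) - _sinc(x + torch.pi))

def ssu(x):
    # Shifted Sinc Unit: pi * sinc(z - pi)
    return torch.pi * _sinc(x - torch.pi)
```

For a quick sanity check, evaluating these on `torch.linspace(-3, 3, 7)` shows the expected shapes: Swish and Mish stay close to ReLU, while GCU, NCU, DSU, and SSU are non-monotonic (oscillatory or cubic) around the origin.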
