sailfish009/spocu-pytorch

Unofficial SPOCU activation function implementation for PyTorch

“SPOCU”: scaled polynomial constant unit activation function.

Unofficial PyTorch implementation of the SPOCU activation function [1], for the limiting case c = ∞.
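
For reference, [1] defines the activation as

SPOCU(x) = alpha * h(x / gamma + beta) - alpha * h(beta)

with alpha, gamma > 0 and beta in (0, 1), where

h(x) = 0       for x < 0
h(x) = r(x)    for 0 <= x < c
h(x) = r(c)    for x >= c,    with r(x) = x^3 (x^5 - 2x^4 + 2)

Taking c = ∞ removes the upper clipping, so h(x) = r(x) for all x >= 0; that is the case implemented here.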

It can be included in your network given values for alpha, beta, and gamma:

import torch

from spocu import SPOCU

alpha = 3.0937
beta = 0.6653
gamma = 4.437

spocu = SPOCU(alpha, beta, gamma)

x = torch.rand((10, 10))
print(spocu(x))
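
The module itself is not reproduced in this README. As a rough sketch, a PyTorch module implementing the c = ∞ case above could look like the following (the structure is illustrative and not necessarily that of the repository's spocu.py):

import torch
from torch import nn

class SPOCU(nn.Module):
    # Sketch of the SPOCU activation for c = infinity, following [1].
    # Constraints from [1]: alpha, gamma > 0 and beta in (0, 1).
    def __init__(self, alpha: float, beta: float, gamma: float):
        super().__init__()
        self.alpha = alpha
        self.beta = beta
        self.gamma = gamma

    @staticmethod
    def _h(x: torch.Tensor) -> torch.Tensor:
        # h(x) = r(x) for x >= 0 and 0 otherwise,
        # with r(x) = x^3 (x^5 - 2x^4 + 2).
        r = x.pow(3) * (x.pow(5) - 2 * x.pow(4) + 2)
        return torch.where(x >= 0, r, torch.zeros_like(x))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # SPOCU(x) = alpha * h(x / gamma + beta) - alpha * h(beta)
        beta = torch.as_tensor(self.beta, dtype=x.dtype, device=x.device)
        return self.alpha * (self._h(x / self.gamma + beta) - self._h(beta))

Note that since beta is in (0, 1), the two h terms cancel at x = 0, so SPOCU(0) = 0.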

Bibliography

[1] Kiseľák, J., Lu, Y., Švihra, J. et al. “SPOCU”: scaled polynomial constant unit activation function. Neural Comput & Applic (2020). https://doi.org/10.1007/s00521-020-05182-1
