
ActTensor: Activation Functions for TensorFlow

PyPI: https://pypi.org/project/ActTensor-tf/
Home Page: https://acttensor-dqmn2h3f355urhakwkprmn.streamlit.app/
Authors: Pouya Ardehkhani, Pegah Ardehkhani
License: MIT License

What is it?

ActTensor is a Python package that provides state-of-the-art activation functions and makes them easy and fast to use in deep-learning projects.

Why not use tf.keras.activations?

TensorFlow defines only a handful of activation functions and, most importantly, does not include newly introduced ones. Writing your own takes time and energy; this package ships most of the widely used, and even state-of-the-art, activation functions, ready to use in your models.

Requirements

numpy
tensorflow
setuptools
keras
wheel

Where to get it?

The source code is currently hosted on GitHub at: https://github.com/pouyaardehkhani/ActTensor

Binary installers for the latest released version are available at the Python Package Index (PyPI):

# PyPI
pip install ActTensor-tf
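
Note that the distribution is named ActTensor-tf on PyPI, while the module itself imports as ActTensor_tf. A quick check that the installation succeeded:

# verify the installation
python -c "import ActTensor_tf"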

License

MIT

How to use it?

import tensorflow as tf
import numpy as np
from ActTensor_tf import ReLU  # the activation layer class

Functional API

inputs = tf.keras.layers.Input(shape=(28, 28))
x = tf.keras.layers.Flatten()(inputs)
x = tf.keras.layers.Dense(128)(x)
# insert the desired activation class after the layer it activates
x = ReLU()(x)
output = tf.keras.layers.Dense(10, activation='softmax')(x)

model = tf.keras.models.Model(inputs=inputs, outputs=output)
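
The model above can be compiled and trained as usual. A minimal training sketch on MNIST, which matches the (28, 28) input shape; the optimizer and loss here are illustrative choices, not something ActTensor prescribes:

# compile and train the functional model defined above
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))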

Sequential API

model = tf.keras.models.Sequential([tf.keras.layers.Flatten(),
                                    tf.keras.layers.Dense(128),
                                    # insert the desired activation class
                                    ReLU(),
                                    tf.keras.layers.Dense(10, activation=tf.nn.softmax)])

NOTE:

The functional forms of the activation layers are also available, but they may be exposed under different names. Check the Activations table below for more information.

from ActTensor_tf import relu
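
The functional form applies element-wise to tensors or arrays, so it can also be used outside a layer. A minimal sketch, assuming relu follows the standard definition $\max(0, x)$:

import numpy as np
from ActTensor_tf import relu

x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
print(relu(x))  # expected: [0. 0. 0. 1. 2.]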

Activations

The classes and functions listed below are available in ActTensor_tf:

| Activation Name | Class Name | Function Name |
| --- | --- | --- |
| SoftShrink | SoftShrink | softSHRINK |
| HardShrink | HardShrink | hard_shrink |
| GLU | GLU | - |
| Bilinear | Bilinear | - |
| ReGLU | ReGLU | - |
| GeGLU | GeGLU | - |
| SwiGLU | SwiGLU | - |
| SeGLU | SeGLU | - |
| ReLU | ReLU | relu |
| Identity | Identity | identity |
| Step | Step | step |
| Sigmoid | Sigmoid | sigmoid |
| HardSigmoid | HardSigmoid | hard_sigmoid |
| LogSigmoid | LogSigmoid | log_sigmoid |
| SiLU | SiLU | silu |
| PLinear | ParametricLinear | parametric_linear |
| Piecewise-Linear | PiecewiseLinear | piecewise_linear |
| Complementary Log-Log | CLL | cll |
| Bipolar | Bipolar | bipolar |
| Bipolar-Sigmoid | BipolarSigmoid | bipolar_sigmoid |
| Tanh | Tanh | tanh |
| TanhShrink | TanhShrink | tanhshrink |
| LeCun's Tanh | LeCunTanh | leCun_tanh |
| HardTanh | HardTanh | hard_tanh |
| TanhExp | TanhExp | tanh_exp |
| Absolute | ABS | Abs |
| Squared-ReLU | SquaredReLU | squared_relu |
| P-ReLU | ParametricReLU | Parametric_ReLU |
| R-ReLU | RandomizedReLU | Randomized_ReLU |
| LeakyReLU | LeakyReLU | leaky_ReLU |
| ReLU6 | ReLU6 | relu6 |
| Mod-ReLU | ModReLU | Mod_ReLU |
| Cosine-ReLU | CosReLU | Cos_ReLU |
| Sin-ReLU | SinReLU | Sin_ReLU |
| Probit | Probit | probit |
| Cos | Cos | Cosine |
| Gaussian | Gaussian | gaussian |
| Multiquadratic | Multiquadratic | Multi_quadratic |
| Inverse-Multiquadratic | InvMultiquadratic | Inv_Multi_quadratic |
| SoftPlus | SoftPlus | softPlus |
| Mish | Mish | mish |
| SMish | Smish | smish |
| P-SMish | ParametricSmish | Parametric_Smish |
| Swish | Swish | swish |
| ESwish | ESwish | eswish |
| HardSwish | HardSwish | hardSwish |
| GCU | GCU | gcu |
| CoLU | CoLU | colu |
| PELU | PELU | pelu |
| SELU | SELU | selu |
| CELU | CELU | celu |
| ArcTan | ArcTan | arcTan |
| Shifted-SoftPlus | ShiftedSoftPlus | Shifted_SoftPlus |
| Softmax | Softmax | softmax |
| Logit | Logit | logit |
| GELU | GELU | gelu |
| Softsign | Softsign | softsign |
| ELiSH | ELiSH | elish |
| HardELiSH | HardELiSH | hardELiSH |
| Serf | Serf | serf |
| ELU | ELU | elu |
| Phish | Phish | phish |
| QReLU | QReLU | qrelu |
| MQReLU | MQReLU | mqrelu |
| FReLU | FReLU | frelu |
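
Any entry in the Class Name column can be imported directly from ActTensor_tf and used like the ReLU layer above. A sketch assuming, as with ReLU, that the constructor takes no required arguments:

import tensorflow as tf
from ActTensor_tf import Mish  # any class name from the table

inputs = tf.keras.layers.Input(shape=(32,))
x = tf.keras.layers.Dense(64)(inputs)
x = Mish()(x)
model = tf.keras.models.Model(inputs=inputs, outputs=x)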

Which activation functions does it support?

The definitions below are the standard forms from the literature; parameters such as $\lambda$, $\alpha$, and $\beta$ may be fixed or learnable depending on the implementation.

  1. Soft Shrink:

     $f(x) = x - \lambda$ if $x > \lambda$; $x + \lambda$ if $x < -\lambda$; $0$ otherwise.

  2. Hard Shrink:

     $f(x) = x$ if $|x| > \lambda$; $0$ otherwise.

  3. GLU:

     $f(x) = (xW + b) \otimes \sigma(xV + c)$, where the second half of the projection gates the first through a sigmoid.

  4. Bilinear:

  5. ReGLU:

     ReGLU is a variant of GLU in which the sigmoid gate is replaced by ReLU.

  6. GeGLU:

     GeGLU is a variant of GLU in which the sigmoid gate is replaced by GELU.

  7. SwiGLU:

     SwiGLU is a variant of GLU in which the sigmoid gate is replaced by Swish.

  8. SeGLU:

     SeGLU is a variant of GLU in which the sigmoid gate is replaced by SELU.

  9. ReLU:

     $f(x) = \max(0, x)$

  10. Identity:

     $f(x) = x$

  11. Step:

     $f(x) = 1$ if $x > 0$; $0$ otherwise.

  12. Sigmoid:

     $f(x) = \frac{1}{1 + e^{-x}}$

  13. Hard Sigmoid:

     A piecewise-linear approximation of the sigmoid.

  14. Log Sigmoid:

     $f(x) = \log\left(\frac{1}{1 + e^{-x}}\right)$

  15. SiLU:

     $f(x) = x \cdot \sigma(x)$

  16. ParametricLinear:

     $f(x) = a \cdot x$

  17. PiecewiseLinear:

     Choose some $x_{min}$ and $x_{max}$ as the active range. Everything below $x_{min}$ maps to 0, everything above $x_{max}$ maps to 1, and values in between are linearly interpolated: $f(x) = \frac{x - x_{min}}{x_{max} - x_{min}}$ (see the sketch after this list).

  18. Complementary Log-Log (CLL):

     $f(x) = 1 - e^{-e^{x}}$

  19. Bipolar:

     $f(x) = 1$ if $x > 0$; $-1$ otherwise.

  20. Bipolar Sigmoid:

     $f(x) = \frac{1 - e^{-x}}{1 + e^{-x}}$

  21. Tanh:

     $f(x) = \tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}$

  22. Tanh Shrink:

     $f(x) = x - \tanh(x)$

  23. LeCunTanh:

     $f(x) = 1.7159 \tanh\left(\frac{2}{3}x\right)$

  24. Hard Tanh:

     $f(x) = \max(-1, \min(1, x))$

  25. TanhExp:

     $f(x) = x \tanh(e^{x})$

  26. ABS:

     $f(x) = |x|$

  27. SquaredReLU:

     $f(x) = \max(0, x)^{2}$

  28. ParametricReLU (PReLU):

     $f(x) = x$ if $x > 0$; $\alpha x$ otherwise, with learnable $\alpha$.

  29. RandomizedReLU (RReLU):

     Like LeakyReLU, but the negative-side slope $\alpha$ is sampled randomly during training.

  30. LeakyReLU:

     $f(x) = x$ if $x > 0$; $\alpha x$ otherwise, with a small fixed $\alpha$ (e.g. 0.01).

  31. ReLU6:

     $f(x) = \min(\max(0, x), 6)$

  32. ModReLU:

  33. CosReLU:

  34. SinReLU:

  35. Probit:

  36. Cosine:

     $f(x) = \cos(x)$

  37. Gaussian:

     $f(x) = e^{-x^{2}}$

  38. Multiquadratic:

     Choose some point $(x, y)$; then $f(z) = \sqrt{(z - x)^{2} + y^{2}}$

  39. InvMultiquadratic:

     $f(z) = \frac{1}{\sqrt{(z - x)^{2} + y^{2}}}$

  40. SoftPlus:

     $f(x) = \ln(1 + e^{x})$

  41. Mish:

     $f(x) = x \tanh(\text{softplus}(x))$

  42. Smish:

     $f(x) = x \tanh(\ln(1 + \sigma(x)))$

  43. ParametricSmish (PSmish):

     A parametric variant of Smish with learnable scaling.

  44. Swish:

     $f(x) = x \cdot \sigma(\beta x)$

  45. ESwish:

     $f(x) = \beta x \cdot \sigma(x)$

  46. Hard Swish:

     $f(x) = x \cdot \frac{\text{ReLU6}(x + 3)}{6}$

  47. GCU:

     $f(x) = x \cos(x)$

  48. CoLU:

  49. PELU:

     A parametric variant of ELU with learnable parameters.

  50. SELU:

     $f(x) = \lambda x$ if $x > 0$; $\lambda \alpha (e^{x} - 1)$ otherwise, where $\alpha \approx 1.6733$ and $\lambda \approx 1.0507$.

  51. CELU:

     $f(x) = \max(0, x) + \min(0, \alpha(e^{x/\alpha} - 1))$

  52. ArcTan:

     $f(x) = \arctan(x)$

  53. ShiftedSoftPlus:

     $f(x) = \ln(1 + e^{x}) - \ln(2)$

  54. Softmax:

     $f(x_i) = \frac{e^{x_i}}{\sum_j e^{x_j}}$

  55. Logit:

     $f(x) = \ln\left(\frac{x}{1 - x}\right)$

  56. GELU:

     $f(x) = x \Phi(x)$, where $\Phi$ is the standard normal CDF.

  57. Softsign:

     $f(x) = \frac{x}{1 + |x|}$

  58. ELiSH:

     $f(x) = x \sigma(x)$ if $x \ge 0$; $(e^{x} - 1)\sigma(x)$ otherwise.

  59. Hard ELiSH:

     ELiSH with the sigmoid replaced by the hard sigmoid.

  60. Serf:

     $f(x) = x \, \text{erf}(\ln(1 + e^{x}))$

  61. ELU:

     $f(x) = x$ if $x > 0$; $\alpha(e^{x} - 1)$ otherwise.

  62. Phish:

     $f(x) = x \tanh(\text{GELU}(x))$

  63. QReLU:

  64. m-QReLU:

  65. FReLU:
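
As referenced in the PiecewiseLinear entry above, here is a minimal NumPy sketch of that interpolation rule; piecewise_linear below is an illustrative stand-in, not the package's implementation:

import numpy as np

def piecewise_linear(x, x_min=-1.0, x_max=1.0):
    # 0 below x_min, 1 above x_max, linear interpolation in between
    return np.clip((x - x_min) / (x_max - x_min), 0.0, 1.0)

print(piecewise_linear(np.array([-2.0, 0.0, 2.0])))  # [0.  0.5 1. ]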

Cite this repository

@software{Pouya_ActTensor_2022,
  author = {Ardehkhani, Pouya and Ardehkhani, Pegah},
  license = {MIT},
  month = {7},
  title = {{ActTensor}},
  url = {https://github.com/pouyaardehkhani/ActTensor},
  version = {1.0.0},
  year = {2022}
}

