activation function

An activation function in an artificial neural network (ANN) computes the output of an artificial neuron from its inputs, introducing the non-linearity that allows the network to solve non-linear tasks. The activation function essentially decides whether a neuron should be activated or not.
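A minimal sketch of this idea, assuming NumPy is available; the neuron_output function, the weights, and the step activation below are illustrative choices, not a specific library API:

    import numpy as np

    def neuron_output(x, w, b, activation):
        # weighted sum of the inputs plus a bias, passed through the activation
        return activation(np.dot(w, x) + b)

    # a simple step activation: the neuron "activates" only for a positive sum
    step = lambda z: 1.0 if z > 0 else 0.0

    x = np.array([0.5, -1.0])   # inputs
    w = np.array([0.8, 0.2])    # weights
    print(neuron_output(x, w, b=0.1, activation=step))  # 1.0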

ReLU function

The ReLU (rectified linear unit) function is an ANN activation function which calculates a linear function of the inputs: if the result is positive, it outputs that result, and if it is negative, it outputs 0. The mathematical formula for the ReLU function is f(x) = max(0, x).
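The formula translates directly to code; this sketch assumes NumPy, and the relu name is illustrative:

    import numpy as np

    def relu(x):
        # f(x) = max(0, x): positive values pass through, negatives become 0
        return np.maximum(0, x)

    print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
    # [0.  0.  0.  1.5 3. ]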

tanh function

The tanh function, also known as the hyperbolic tangent function, is an activation function in artificial neural networks whose output values are constrained between −1 and 1. It is defined as f(x) = tanh(x) = (e^x − e^(−x)) / (e^x + e^(−x)).
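A short sketch of that definition, again assuming NumPy; in practice one would use the built-in np.tanh, which the last line checks against:

    import numpy as np

    def tanh(x):
        # tanh(x) = (e^x - e^-x) / (e^x + e^-x); outputs lie in (-1, 1)
        return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

    xs = np.array([-3.0, 0.0, 3.0])
    print(tanh(xs))                            # approx [-0.995  0.  0.995]
    print(np.allclose(tanh(xs), np.tanh(xs)))  # True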

TLU

TLU stands for threshold logic unit. A TLU is an artificial neuron which calculates a weighted sum of its inputs and then applies a step function to that sum. It is used in the perceptron artificial neural network model.
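A minimal sketch of a TLU, assuming NumPy; the weights and threshold below are a hypothetical choice that makes the unit behave like a logical AND of two binary inputs:

    import numpy as np

    def tlu(x, w, threshold=0.0):
        # weighted sum of the inputs followed by a step function:
        # output 1 if the sum exceeds the threshold, else 0
        return 1 if np.dot(w, x) > threshold else 0

    # with these weights and this threshold, the TLU computes AND
    w = np.array([1.0, 1.0])
    for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        print(x, "->", tlu(np.array(x), w, threshold=1.5))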