recall

Recall (also known as sensitivity) is the ratio of true positives (taken from the confusion matrix) to all actual positives (true positives + false negatives). It is commonly used in conjunction with precision, and it is the metric to optimize when false negatives must be minimized. Recall can be considered the counterpart of specificity, which plays the same role for the negative class (the true negative rate). Recall is a measure ...
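As an illustration of the formula above, here is a minimal sketch that computes recall from hypothetical confusion-matrix counts (the numbers are made up for the example):

```python
def recall(tp: int, fn: int) -> float:
    """Recall (sensitivity): true positives divided by all actual positives."""
    return tp / (tp + fn)

# Hypothetical counts: 80 true positives, 20 false negatives.
print(recall(tp=80, fn=20))  # 0.8
```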

regularization

In machine learning, regularization is a technique in which the ML model's cost/error function is modified to include a penalty term whose strength is controlled by an extra value called the regularization hyperparameter. There are two basic types of regularization: L1-norm (lasso regression) and L2-norm (ridge regression). Lasso regularization penalizes the L1 norm of the model parameters. The lasso-regularized cost function is calculated as ...
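As a rough sketch of how the penalty term enters the cost function, the snippet below adds an L1 (lasso) or L2 (ridge) penalty to a mean squared error. The name `alpha` for the regularization hyperparameter, the exclusion of the bias term `theta[0]`, and the exact scaling constants are assumptions here, since formulations vary:

```python
import numpy as np

def lasso_cost(X, y, theta, alpha):
    """MSE plus an L1 penalty on the weights (bias theta[0] left unpenalized)."""
    mse = np.mean((X @ theta - y) ** 2)
    return mse + alpha * np.sum(np.abs(theta[1:]))

def ridge_cost(X, y, theta, alpha):
    """MSE plus an L2 penalty on the weights."""
    mse = np.mean((X @ theta - y) ** 2)
    return mse + alpha * np.sum(theta[1:] ** 2)
```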

ReLU function

The ReLU (rectified linear unit) function is an ANN activation function that computes a linear function of its inputs: if the result is positive, it outputs that result; if it is negative, it outputs 0. The mathematical formula for the ReLU function is f(x) = max(0, x). The graph of the ReLU function ...
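A minimal NumPy sketch of the formula f(x) = max(0, x), applied element-wise to an array of example values:

```python
import numpy as np

def relu(x):
    """Rectified linear unit: returns x where x > 0, otherwise 0."""
    return np.maximum(0.0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0.  0.  0.  1.5 3. ]
```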

Responsible AI

As AI technologies advance, various ethical, legal, security, and privacy considerations must be taken into account. These considerations are commonly referred to as "Responsible AI". Each AI software and hardware vendor classifies responsible AI and AI ethics areas differently. Some example areas that constitute Responsible AI are the following: explainability and interpretability ...