F1 score

The F1 score is the harmonic mean of precision and recall. It is calculated by the following formula:

F1 score = (2 x Precision x Recall) / (Precision + Recall)

The more precision and recall deviate from each other, the lower their harmonic mean (i.e. the F1 score), so F1 rewards models that keep the two metrics in balance.
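As a minimal sketch in plain Python (the precision and recall values below are made up for illustration), the formula can be computed directly:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Balanced metrics give a high F1...
print(f1_score(0.80, 0.80))  # 0.8
# ...while a large gap between them drags F1 toward the lower value.
print(f1_score(0.95, 0.30))  # 0.456
```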

PRC

PRC stands for precision-recall curve. It is a method of visualizing the tradeoff between precision and recall across classification thresholds. Precision and recall are both model performance metrics for classification systems.
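As an illustrative sketch (assuming scikit-learn and matplotlib are available; the labels and scores below are hypothetical), a precision-recall curve can be drawn from true labels and predicted scores:

```python
import matplotlib.pyplot as plt
from sklearn.metrics import precision_recall_curve

# Hypothetical ground-truth labels and model scores.
y_true = [0, 0, 1, 1, 0, 1, 1, 0]
y_scores = [0.1, 0.4, 0.35, 0.8, 0.2, 0.7, 0.6, 0.5]

# precision_recall_curve computes precision and recall
# at each score threshold.
precision, recall, thresholds = precision_recall_curve(y_true, y_scores)

plt.plot(recall, precision)
plt.xlabel("Recall")
plt.ylabel("Precision")
plt.title("Precision-recall curve")
plt.show()
```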

Precision

In statistics and machine learning, precision is a measure of how often the positives identified by a learning model are true positives. It is computed by dividing the number of true positives (taken from the confusion matrix) by all predicted positives (true positives + false positives). The precision metric is commonly used in conjunction with recall to evaluate the performance of a classification model.
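A minimal sketch in plain Python (the confusion-matrix counts are hypothetical) shows precision is just TP / (TP + FP):

```python
def precision(tp: int, fp: int) -> float:
    """True positives divided by all predicted positives."""
    return tp / (tp + fp)

# Hypothetical counts: 40 true positives, 10 false positives.
print(precision(tp=40, fp=10))  # 0.8
```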

Recall

Recall (also known as sensitivity) is the ratio of true positives (taken from the confusion matrix) to all actual positives (true positives + false negatives). It is commonly used in conjunction with precision and is the metric to prioritize when false negatives must be minimized. Recall can be seen as the counterpart of specificity, which measures the true negative rate. Recall is a measure of how many of the actual positives a model correctly identifies.
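A matching sketch in plain Python (again with hypothetical counts) shows recall is TP / (TP + FN):

```python
def recall(tp: int, fn: int) -> float:
    """True positives divided by all actual positives."""
    return tp / (tp + fn)

# Hypothetical counts: 40 true positives, 20 false negatives.
print(recall(tp=40, fn=20))  # 0.666...
```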