In machine learning (ML), variance refers to the portion of a model's prediction error that arises from the algorithm's over-sensitivity to the training data. A high-variance model fits the training set in great detail, capturing not only the underlying relationship between the independent and dependent variables but also the noise and random fluctuations in the data. This over-sensitivity also makes the model more complex and harder to explain (poor explainability). As a result, a high-variance model performs well on the training data but fails to generalize to new test data, placing it in the overfitting region, outside the zone of good fit.
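The gap between training and test error that characterizes high variance can be seen in a minimal sketch. The data below is hypothetical: a linear trend plus noise, fitted once with a low-capacity model (degree-1 polynomial) and once with a high-capacity one (degree-12 polynomial) that chases the noise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: a linear trend with additive noise.
x_train = np.linspace(0, 1, 15)
y_train = 2.0 * x_train + rng.normal(scale=0.3, size=x_train.size)
x_test = np.linspace(0.02, 0.98, 15)
y_test = 2.0 * x_test + rng.normal(scale=0.3, size=x_test.size)

def mse(coeffs, x, y):
    """Mean squared error of a fitted polynomial on (x, y)."""
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

# Low-variance model: a straight line cannot memorize the noise.
simple = np.polyfit(x_train, y_train, deg=1)

# High-variance model: with 13 coefficients for 15 points, the
# polynomial bends to fit the random fluctuations in the training set.
complex_ = np.polyfit(x_train, y_train, deg=12)

print("degree 1 : train=%.4f  test=%.4f"
      % (mse(simple, x_train, y_train), mse(simple, x_test, y_test)))
print("degree 12: train=%.4f  test=%.4f"
      % (mse(complex_, x_train, y_train), mse(complex_, x_test, y_test)))
```

Running this shows the overfitting pattern described above: the degree-12 fit drives its training error far below that of the straight line, yet its error on the held-out test points is much larger than its training error, because it has learned the noise rather than the trend.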
