Variance

Encord Computer Vision Glossary

Variance in machine learning is the degree to which a model's predictions change in response to changes in the training data. High variance means the model is too complex and overfits the training data, i.e., it learns the noise and idiosyncrasies of the training data rather than the underlying patterns. As a result, the model performs well on the training data but poorly on unseen data (the test set).

Low variance, on the other hand, means the model's predictions change little across different training sets. This is typical of simpler models, and when it is paired with high bias the model underfits: it fails to capture the relevant features and patterns in the data, leading to poor performance on both the training and test sets.
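One way to see both failure modes is to fit polynomials of different degrees to the same noisy data and compare training and test error. This is an illustrative sketch (the data, degrees, and split are arbitrary choices, not from the glossary entry): a degree-1 fit underfits, while a very high degree fits the training points almost exactly but generalizes worse.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a noisy sine wave, split into train and test sets.
x = np.sort(rng.uniform(0, 1, 40))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, x.size)
x_train, y_train = x[::2], y[::2]
x_test, y_test = x[1::2], y[1::2]

def fit_and_score(degree):
    """Fit a polynomial of the given degree; return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_mse, test_mse

for degree in (1, 3, 15):
    train_mse, test_mse = fit_and_score(degree)
    print(f"degree {degree:2d}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")
```

The degree-15 model drives training error toward zero while its test error stays higher, which is the signature of high variance; the degree-1 model has similar (and high) error on both sets, the signature of underfitting.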


Variance is one component of the bias-variance trade-off, a fundamental concept in machine learning describing the tension between a model's complexity and flexibility on the one hand and its ability to generalize to new, unseen data on the other. Balancing this trade-off is crucial for developing accurate and robust machine learning models.
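Variance can also be estimated directly, by training the same model class on many independently sampled training sets and measuring how much its prediction at a fixed point varies. The sketch below (a hypothetical setup, with an arbitrary true function, noise level, and query point) compares a simple and a complex model this way:

```python
import numpy as np

rng = np.random.default_rng(1)

def true_fn(x):
    return np.sin(2 * np.pi * x)

def predictions_at(x0, degree, n_datasets=200, n_points=30, noise=0.3):
    """Fit models on independently sampled training sets and
    return each model's prediction at the query point x0."""
    preds = []
    for _ in range(n_datasets):
        x = rng.uniform(0, 1, n_points)
        y = true_fn(x) + rng.normal(0, noise, n_points)
        coeffs = np.polyfit(x, y, degree)
        preds.append(np.polyval(coeffs, x0))
    return np.array(preds)

x0 = 0.25  # query point
for degree in (1, 9):
    preds = predictions_at(x0, degree)
    bias_sq = (preds.mean() - true_fn(x0)) ** 2
    variance = preds.var()
    print(f"degree {degree}: bias^2 {bias_sq:.4f}, variance {variance:.4f}")
```

The flexible degree-9 model shows a much larger spread of predictions across training sets (high variance), while the degree-1 model is stable but systematically off (high bias), illustrating the trade-off numerically.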
