Variance
In statistics and probability theory, variance is a way to measure how spread out a set of numbers is. It is the average of the squared differences from the set’s mean. The variance of a random variable X is written Var(X).
Variance is found by taking the difference between each data point and the set’s mean, squaring each difference, and then averaging these squared differences. The formula is:
Var(X) = 1/n * Σ (Xi – X̄)²
where Xi is the i-th data point, X̄ is the mean of all the data points, and n is the total number of data points.
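The formula above can be sketched in a few lines of Python; the function name `variance` is our own, not a standard library name.

```python
def variance(data):
    """Population variance: the average of squared differences from the mean."""
    n = len(data)
    mean = sum(data) / n  # X-bar, the mean of all the data points
    # 1/n * sum of (Xi - X-bar)^2
    return sum((x - mean) ** 2 for x in data) / n

print(variance([2, 4, 4, 4, 5, 5, 7, 9]))  # -> 4.0
```

Here the mean is 5, the squared differences sum to 32, and dividing by n = 8 gives a variance of 4.0.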
Variance is an important idea in statistics and probability theory because it shows how much the values in a data set differ from one another. A high variance means that the data are spread out over a wide range, while a low variance means that the data are tightly clustered around the mean.
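The difference between high and low variance can be seen with two made-up data sets that share the same mean; `statistics.pvariance` from Python’s standard library computes the population variance.

```python
from statistics import pvariance  # population variance, as in the formula above

tight = [4, 5, 5, 5, 6]    # clustered tightly around the mean of 5
spread = [1, 3, 5, 7, 9]   # same mean of 5, but spread over a wider range

print(pvariance(tight))   # low variance (0.4)
print(pvariance(spread))  # high variance (8)
```

Both sets have a mean of 5, yet the second set’s values sit much farther from it, which the larger variance reflects.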
Usage
Variance is used in fields such as cost management, time management, and quality management.
Reference
Refer to Standard Deviation