Variance Ratio - Definition & Meaning

Published in Statistics by MBA Skool Team

What is Variance Ratio?

Variance ratio, also called the coefficient of dispersion or variance-to-mean ratio, is defined as the ratio of the variance to the mean. It is defined only for distributions whose mean is non-zero. It is most often applied to count data, for example to check whether event counts follow a Poisson distribution (equivalently, whether the times between events are exponential).


In statistics, the variance ratio is used to measure how dispersed or clustered a set of events is within a given interval of time or space.
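As a minimal sketch of the definition (the count data below are hypothetical, purely for illustration), the variance ratio is just the sample variance divided by the sample mean:

```python
import statistics

# Hypothetical counts of events observed in equal-length intervals
counts = [2, 4, 3, 5, 4, 6, 3, 4]

mean = statistics.mean(counts)          # sample mean
variance = statistics.variance(counts)  # sample variance (n - 1 denominator)

# Variance ratio (coefficient of dispersion); defined only when the mean is non-zero
variance_ratio = variance / mean
print(f"variance ratio = {variance_ratio:.2f}")
```

Using the population variance (statistics.pvariance) instead is an equally common convention; the choice matters little for large samples.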


If the variance ratio = 0, the underlying distribution is a constant random variable (no dispersion).

If 0 < variance ratio < 1, the data are under-dispersed; a binomial distribution could be the underlying model.

If the variance ratio = 1, the data are consistent with a Poisson distribution.

If the variance ratio > 1, the data are over-dispersed; a negative binomial distribution could be the underlying model (these rules are sketched in code below).
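A minimal sketch of that classification, assuming a simple helper function (the name classify_dispersion and the tolerance are illustrative choices, not part of the original definition):

```python
def classify_dispersion(variance_ratio: float, tol: float = 1e-9) -> str:
    """Map a variance ratio to the dispersion categories listed above."""
    if abs(variance_ratio) <= tol:
        return "constant (no dispersion)"
    if variance_ratio < 1.0 - tol:
        return "under-dispersed (binomial is a candidate model)"
    if variance_ratio <= 1.0 + tol:
        return "consistent with a Poisson distribution"
    return "over-dispersed (negative binomial is a candidate model)"


print(classify_dispersion(0.0))  # constant
print(classify_dispersion(0.4))  # under-dispersed
print(classify_dispersion(1.0))  # Poisson
print(classify_dispersion(2.7))  # over-dispersed
```

In practice a sample variance ratio is never exactly 0 or 1, so real analyses use a tolerance or a formal dispersion test rather than strict equality.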


For example, a variance ratio of 1 indicates a Poisson distribution. So if we want to detect whether the occurrence of earthquakes in a given region is Poissonian in nature or whether other factors are involved, we collect the counts of earthquakes per interval and compute the variance ratio for that data set. A value close to 1 indicates a Poisson distribution, while a value well above 1 points to clustering of events.
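A sketch of that check on simulated data, assuming the counts come from a Poisson process with an arbitrary rate of 3 events per time window (both the rate and the number of windows are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical earthquake counts: events per time window for 500 equal windows,
# simulated here from a Poisson distribution with mean rate 3.
quake_counts = rng.poisson(lam=3.0, size=500)

# Variance ratio of the observed counts; a value close to 1 supports the Poisson
# hypothesis, while a value well above 1 indicates over-dispersion (clustering).
variance_ratio = quake_counts.var(ddof=1) / quake_counts.mean()
print(f"variance ratio = {variance_ratio:.2f}")
```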

This article has been researched & authored by the Business Concepts Team. It has been reviewed & published by the MBA Skool Team. The content on MBA Skool has been created for educational & academic purposes only.
