Posted in Statistics

Definition: Interval

In statistics, when estimating a parameter, we usually state a range of values within which the parameter is likely to lie. This range of values is referred to as an interval. Intervals are typically chosen so that the parameter falls within them with a 95-99% probability; such an interval is called a confidence interval, and its endpoints are the lower and upper confidence limits.

The interval is generally constructed from a statistic estimated on a random sample drawn from the population. According to probability theory, a properly drawn random sample tends to reflect the properties of the entire population.
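The construction described above can be sketched in a few lines of Python. This is a minimal illustration, assuming a large-sample normal approximation (z = 1.96 for 95% confidence); the sample values are hypothetical.

```python
import math
import statistics

def confidence_interval(sample, z=1.96):
    """Approximate 95% confidence interval for the population mean.

    Assumes the normal approximation applies (z = 1.96 corresponds
    to 95% confidence for a large enough sample).
    """
    n = len(sample)
    mean = statistics.fmean(sample)
    # standard error of the mean: sample standard deviation / sqrt(n)
    sem = statistics.stdev(sample) / math.sqrt(n)
    return mean - z * sem, mean + z * sem

# hypothetical sample of 10 measurements
data = [4.8, 5.1, 4.9, 5.3, 5.0, 4.7, 5.2, 5.0, 4.9, 5.1]
low, high = confidence_interval(data)
```

Here the interval is centered on the sample mean, and its half-width grows with the sample's spread and shrinks as the sample size increases.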

The width of the confidence interval indicates how uncertain we are about the unknown parameter. A very wide interval may signal that more data must be collected before anything can be said with surety about the population. The confidence level, in turn, describes the reliability of the procedure: for example, an 85% confidence interval is produced by a method that captures the true parameter in roughly 85% of repeated samples, so loosely speaking we can be only 85% sure that the unknown parameter lies within the given range of values.
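The repeated-sampling reading of the confidence level can be checked with a small simulation. This is a sketch under assumed population parameters (a normal population with mean 100 and standard deviation 15); it draws many samples, builds a 95% interval from each, and counts how often the interval covers the true mean.

```python
import math
import random
import statistics

random.seed(0)
TRUE_MEAN, TRUE_SD = 100.0, 15.0  # assumed population parameters

def covers(n=50, z=1.96):
    """Draw one sample of size n and report whether its 95% interval
    covers the true population mean."""
    sample = [random.gauss(TRUE_MEAN, TRUE_SD) for _ in range(n)]
    mean = statistics.fmean(sample)
    sem = statistics.stdev(sample) / math.sqrt(n)
    return mean - z * sem <= TRUE_MEAN <= mean + z * sem

# fraction of 2000 simulated intervals that contain the true mean;
# this lands close to 0.95, matching the 95% confidence level
coverage = sum(covers() for _ in range(2000)) / 2000
```

Lowering z (a lower confidence level such as 85%) gives narrower intervals that miss the true mean more often, which is the trade-off the paragraph above describes.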

Hence, this concludes the definition of Interval along with its overview.

Browse the definition and meaning of more terms similar to Interval. The Management Dictionary covers over 7000 business concepts from 6 categories.

