Sensitivity


Definition: Sensitivity

Sensitivity is used to determine the proportion of actual positives that a test correctly identifies as positive.


Sensitivity is also called the true positive rate. It answers the question: of all the cases that are actually positive, what proportion does the test correctly identify as positive? (This is distinct from sensitivity analysis, which studies how changes in an independent variable affect a dependent variable.)


Formula:

The sensitivity of a test can be written as:

Sensitivity = Number of True Positives / (Number of True Positives + Number of False Negatives)

The denominator is the total number of actual positives, i.e. everyone who truly has the condition, whether or not the test detects them.


Sensitivity is a prevalence-independent test characteristic; its value is intrinsic to the test and does not depend on how common the condition is in the population.
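
As a quick illustration, here is a minimal Python sketch of the formula above (the function name and the example counts are hypothetical, used only for illustration):

def sensitivity(true_positives, false_negatives):
    # Proportion of actual positives that the test correctly identifies.
    actual_positives = true_positives + false_negatives
    if actual_positives == 0:
        raise ValueError("No actual positives: sensitivity is undefined.")
    return true_positives / actual_positives

# Hypothetical counts: 90 diseased patients test positive, 10 are missed.
print(sensitivity(true_positives=90, false_negatives=10))  # 0.9, i.e. 90% sensitivity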


Application:

Sensitivity is widely used in medical testing, where it measures how reliably a test detects disease in patients who actually have it.

Understanding the following terms is essential for determining the sensitivity of a clinical test:

• True positive

o the test is positive and the patient has the disease.

• False positive

o the test is positive but the patient does not have the disease.

• True negative

o the test is negative and the patient does not have the disease.

• False negative

o the test is negative but the patient does have the disease.


The sensitivity of a clinical test refers to the ability of the test to correctly identify those patients with the disease.


If a test has 100% sensitivity, it correctly identifies every patient who actually has the disease. If a test has 80% sensitivity, it detects 80% of the diseased patients (true positives) but misses the other 20% (false negatives). When a disease is serious and treatable, high sensitivity is especially important (e.g. screening for cervical cancer).
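
To make the 80% case concrete, the short Python sketch below (the patient counts are hypothetical) tallies true positives and false negatives from paired test results and disease statuses, then computes the sensitivity:

# Hypothetical screening data: (test_result, has_disease) pairs.
results = ([("positive", True)] * 80      # true positives
           + [("negative", True)] * 20    # false negatives (missed cases)
           + [("negative", False)] * 880  # true negatives
           + [("positive", False)] * 20)  # false positives

true_positives = sum(1 for test, disease in results if test == "positive" and disease)
false_negatives = sum(1 for test, disease in results if test == "negative" and disease)

sensitivity = true_positives / (true_positives + false_negatives)
print(f"Sensitivity: {sensitivity:.0%}")  # 80% - 20 diseased patients are missed

Note that the 900 disease-free patients do not enter the calculation at all, which is why sensitivity is prevalence-independent.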


Example:

Screening the female population by cervical smear testing relies on a highly sensitive test, so that as few true cases of cervical cancer as possible are missed.

 
