 The standard deviation of a probability distribution, random variable, or population or multiset of values is a measure of the spread of its values.
 It is usually denoted by the Greek letter σ (lower-case sigma). It is defined as the square root of the variance.
 Variance is the average of the squared differences between data points and the mean, and is expressed in squared units. Standard deviation, being the square root of the variance, therefore measures the spread of data about the mean in the same units as the data.
Method to find the variance of a set of data:
 Find the mean.
 Find the difference between each value in the set of data and the mean.
 Square each difference.
 Find the mean of the squares.
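The four steps above can be sketched in Python; the function name `variance` is my own choice for illustration, and this computes the population variance described here.

```python
def variance(data):
    mean = sum(data) / len(data)                      # step 1: find the mean
    squared_diffs = [(x - mean) ** 2 for x in data]   # steps 2-3: difference from mean, squared
    return sum(squared_diffs) / len(data)             # step 4: mean of the squares

print(variance([4, 8]))  # 4.0
```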
Method to find the standard deviation of a set of data:
 Find the variance.
 Find the square root of the variance.
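A minimal sketch of these two steps, building on the variance procedure above (again, `std_dev` is an illustrative name, not from the text):

```python
import math

def std_dev(data):
    mean = sum(data) / len(data)
    var = sum((x - mean) ** 2 for x in data) / len(data)  # the variance
    return math.sqrt(var)                                 # its square root

print(std_dev([4, 8]))  # 2.0
```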
Example:
Find the standard deviation of the set of the numbers 4 and 8.
 Find the arithmetic mean (or average) of 4 and 8,
(4 + 8) / 2 = 6.
 Find the deviation of each number from the mean,
4 − 6 = − 2
8 − 6 = 2.
 Square each of the deviations (amplifying larger deviations and making negative values positive),
( − 2)^{2} = 4
2^{2} = 4.
 Sum the obtained squares (as a first step to obtaining an average),
4 + 4 = 8.
 Divide the sum by the number of values, which here is 2 (giving an average),
8 / 2 = 4.
 Take the nonnegative square root of the quotient (converting squared units back to regular units),
√4 = 2.
So, the standard deviation is 2.
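The worked example can be checked with Python's standard library, whose `statistics.pvariance` and `statistics.pstdev` compute exactly the population variance and standard deviation used here:

```python
from statistics import pvariance, pstdev

print(pvariance([4, 8]))  # 4, matching the example's variance
print(pstdev([4, 8]))     # 2.0, matching the example's standard deviation
```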
Therefore the standard deviation is the root mean square (RMS) deviation of values from their arithmetic mean.
If many data points are close to the mean, then the standard deviation is small;
if many data points are far from the mean, then the standard deviation is large.
If all the data values are equal, then the standard deviation is zero.
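These three claims can be illustrated with `statistics.pstdev` (the sample sets below are my own, chosen to have the same mean):

```python
from statistics import pstdev

print(pstdev([5, 6, 5, 6]))    # small: all values near the mean of 5.5
print(pstdev([0, 11, 1, 10]))  # large: values far from the mean of 5.5
print(pstdev([7, 7, 7]))       # 0.0: all values equal
```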
Exercises:
Directions: Compute the variance and standard deviation for sets of numbers of your own. Write at least 5 examples.
