Copyright © Cengage Learning. All rights reserved. 5 Joint Probability Distributions and Random Samples

Copyright © Cengage Learning. All rights reserved. 5.4 The Distribution of the Sample Mean

The Distribution of the Sample Mean
The importance of the sample mean X̄ springs from its use in drawing conclusions about the population mean μ. Some of the most frequently used inferential procedures are based on properties of the sampling distribution of X̄. A preview of these properties appeared in the calculations and simulation experiments of the previous section, where we noted relationships between E(X̄) and μ and also among V(X̄), σ², and n.

The Distribution of the Sample Mean
Proposition: Let X₁, X₂, . . . , Xₙ be a random sample from a distribution with mean value μ and standard deviation σ. Then
1. E(X̄) = μ
2. V(X̄) = σ²/n and σ_X̄ = σ/√n
In addition, with T_o = X₁ + . . . + Xₙ (the sample total), E(T_o) = nμ, V(T_o) = nσ², and σ_To = √n · σ.

The Distribution of the Sample Mean
According to Result 1, the sampling (i.e., probability) distribution of X̄ is centered precisely at the mean of the population from which the sample has been selected. Result 2 shows that the distribution of X̄ becomes more concentrated about μ as the sample size n increases. In marked contrast, the distribution of T_o becomes more spread out as n increases. Averaging moves probability in toward the middle, whereas totaling spreads probability out over a wider and wider range of values.

The Distribution of the Sample Mean
The standard deviation σ_X̄ = σ/√n is often called the standard error of the mean; it describes the magnitude of a typical or representative deviation of the sample mean from the population mean.
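These relationships are easy to check empirically. Below is a minimal simulation sketch (not part of the original slides; the exponential population, seed, and sample sizes are illustrative assumptions) comparing the simulated mean and standard deviation of X̄ with μ and σ/√n:

```python
import numpy as np

# Illustrative sketch: check E(xbar) = mu and sd(xbar) = sigma / sqrt(n)
# for an exponential population (mu = sigma = 2.0 here, an assumed example).
rng = np.random.default_rng(seed=1)
mu, sigma = 2.0, 2.0            # exponential with scale 2 has mean = sd = 2
reps = 100_000                  # number of simulated samples

for n in (5, 25, 100):
    samples = rng.exponential(scale=mu, size=(reps, n))
    xbar = samples.mean(axis=1)             # one sample mean per replication
    print(f"n={n:4d}  mean(xbar)={xbar.mean():.4f} (mu={mu})"
          f"  sd(xbar)={xbar.std():.4f} (sigma/sqrt(n)={sigma/np.sqrt(n):.4f})")
```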

Example 5.25
In a notched tensile fatigue test on a titanium specimen, the expected number of cycles to first acoustic emission (used to indicate crack initiation) is μ = 28,000, and the standard deviation of the number of cycles is σ = 5000. Let X₁, X₂, . . . , X₂₅ be a random sample of size 25, where each Xᵢ is the number of cycles on a different randomly selected specimen. Then the expected value of the sample mean number of cycles until first emission is E(X̄) = μ = 28,000, and the expected total number of cycles for the 25 specimens is E(T_o) = nμ = 25(28,000) = 700,000.

Example 5.25
The standard deviation of X̄ (the standard error of the mean) and of T_o are
σ_X̄ = σ/√n = 5000/√25 = 1000    σ_To = √n · σ = √25 (5000) = 25,000
If the sample size increases to n = 100, E(X̄) is unchanged, but σ_X̄ = 5000/√100 = 500, half of its previous value (the sample size must be quadrupled to halve the standard deviation of X̄). cont’d
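As a quick numerical check of these calculations (an illustrative sketch, not from the original example):

```python
import math

mu, sigma = 28_000, 5_000
for n in (25, 100):
    se_xbar = sigma / math.sqrt(n)        # standard error of the mean
    sd_total = math.sqrt(n) * sigma       # standard deviation of T_o
    print(f"n={n}: E(xbar)={mu}, se(xbar)={se_xbar:.0f}, "
          f"E(To)={n * mu}, sd(To)={sd_total:.0f}")
```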

The Case of a Normal Population Distribution

The Case of a Normal Population Distribution
Proposition: Let X₁, X₂, . . . , Xₙ be a random sample from a normal distribution with mean μ and standard deviation σ. Then for any n, X̄ is normally distributed (with mean μ and standard deviation σ/√n), as is T_o (with mean nμ and standard deviation √n · σ).
We know everything there is to know about the X̄ and T_o distributions when the population distribution is normal. In particular, probabilities such as P(a ≤ X̄ ≤ b) and P(c ≤ T_o ≤ d) can be obtained simply by standardizing.

The Case of a Normal Population Distribution
Figure 5.15 illustrates the proposition. [Figure 5.15: A normal population distribution and X̄ sampling distributions]

Example 5.26
The distribution of egg weights (g) of a certain type is normal with mean value 53 and standard deviation .3 (consistent with data in the article “Evaluation of Egg Quality Traits of Chickens Reared under Backyard System in Western Uttar Pradesh” (Indian J. of Poultry Sci., 2009: 261–262)). Let X₁, X₂, . . . , X₁₂ denote the weights of a dozen randomly selected eggs; these Xᵢ’s constitute a random sample of size 12 from the specified normal distribution.

Example 5.26
The total weight of the 12 eggs is T_o = X₁ + . . . + X₁₂; it is normally distributed with mean value E(T_o) = nμ = 12(53) = 636 and variance V(T_o) = nσ² = 12(.3)² = 1.08. The probability that the total weight is between 635 and 640 is now obtained by standardizing and referring to Appendix Table A.3:
P(635 ≤ T_o ≤ 640) = P((635 − 636)/√1.08 ≤ Z ≤ (640 − 636)/√1.08)
= P(−.96 ≤ Z ≤ 3.85) ≈ 1 − .1685 = .8315
cont’d
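The same probability can also be computed directly rather than from Table A.3; here is a sketch assuming SciPy is available (small differences from .8315 reflect table rounding):

```python
from math import sqrt
from scipy.stats import norm

mu, sigma, n = 53.0, 0.3, 12
mean_total, sd_total = n * mu, sqrt(n) * sigma        # 636 and sqrt(1.08)
p = norm.cdf(640, mean_total, sd_total) - norm.cdf(635, mean_total, sd_total)
print(f"P(635 <= To <= 640) = {p:.4f}")               # close to the .8315 above
```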

Example 5.26
If cartons containing a dozen eggs are repeatedly selected, in the long run slightly more than 83% of the cartons will have a total egg weight between 635 g and 640 g. Notice that 635 ≤ T_o ≤ 640 is equivalent to 52.9167 ≤ X̄ ≤ 53.3333 (divide each term in the original system of inequalities by 12). Thus P(52.9167 ≤ X̄ ≤ 53.3333) ≈ .8315. This latter probability can also be obtained by standardizing X̄ directly. cont’d

Example 5.26
Now consider randomly selecting just four of these eggs. The sample mean weight X̄ is then normally distributed with mean value μ_X̄ = 53 and standard deviation σ_X̄ = .3/√4 = .15. The probability that the sample mean weight exceeds 53.5 g is then
P(X̄ > 53.5) = P(Z > (53.5 − 53)/.15) = P(Z > 3.33) = 1 − Φ(3.33) = .0004
Because 53.5 is 3.33 standard deviations (of X̄) larger than the mean value 53, it is exceedingly unlikely that the sample mean will exceed 53.5.
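And the final probability, computed the same way (again a sketch assuming SciPy):

```python
from math import sqrt
from scipy.stats import norm

# P(xbar > 53.5) for a sample of n = 4 eggs
p = norm.sf(53.5, loc=53.0, scale=0.3 / sqrt(4))   # survival function = 1 - cdf
print(f"P(xbar > 53.5) = {p:.4f}")                 # about .0004
```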

The Central Limit Theorem

The Central Limit Theorem
When the Xᵢ’s are normally distributed, so is X̄ for every sample size n. The derivations in Example 5.21 and the simulation experiment of Example 5.24 suggest that even when the population distribution is highly nonnormal, averaging produces a distribution more bell-shaped than the one being sampled. A reasonable conjecture is that if n is large, a suitable normal curve will approximate the actual distribution of X̄. The formal statement of this result is the most important theorem of probability.

The Central Limit Theorem
Theorem (The Central Limit Theorem): Let X₁, X₂, . . . , Xₙ be a random sample from a distribution with mean μ and variance σ². Then if n is sufficiently large, X̄ has approximately a normal distribution with mean μ and variance σ²/n, and T_o also has approximately a normal distribution with mean nμ and variance nσ². The larger the value of n, the better the approximation.

The Central Limit Theorem
Figure 5.16 illustrates the Central Limit Theorem. [Figure 5.16: The Central Limit Theorem illustrated]

The Central Limit Theorem
According to the CLT, when n is large and we wish to calculate a probability such as P(a ≤ X̄ ≤ b), we need only “pretend” that X̄ is normal, standardize it, and use the normal table. The resulting answer will be approximately correct. The exact answer could be obtained only by first finding the distribution of X̄, so the CLT provides a truly impressive shortcut.
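To see this “pretend it is normal” step in action, here is a small simulation sketch (the exponential population, interval, and sample size are assumptions for illustration): it compares the simulated value of P(a ≤ X̄ ≤ b) with the CLT approximation.

```python
import numpy as np
from math import sqrt
from scipy.stats import norm

rng = np.random.default_rng(seed=2)
mu = sigma = 1.0            # exponential(scale=1): mean = sd = 1
n, a, b = 40, 0.8, 1.1      # illustrative sample size and interval

# Empirical probability from 200,000 simulated sample means
xbar = rng.exponential(scale=mu, size=(200_000, n)).mean(axis=1)
emp = np.mean((a <= xbar) & (xbar <= b))

# CLT approximation: xbar is approximately N(mu, sigma^2 / n)
approx = norm.cdf(b, mu, sigma / sqrt(n)) - norm.cdf(a, mu, sigma / sqrt(n))
print(f"simulated {emp:.4f}  vs  CLT approximation {approx:.4f}")
```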

Example 5.27
The amount of a particular impurity in a batch of a certain chemical product is a random variable with mean value 4.0 g and standard deviation 1.5 g. If 50 batches are independently prepared, what is the (approximate) probability that the sample average amount of impurity X̄ is between 3.5 and 3.8 g? According to the rule of thumb to be stated shortly, n = 50 is large enough for the CLT to be applicable.

Example 5.27
X̄ then has approximately a normal distribution with mean value μ_X̄ = 4.0 and σ_X̄ = 1.5/√50 = .2121, so
P(3.5 ≤ X̄ ≤ 3.8) ≈ P((3.5 − 4.0)/.2121 ≤ Z ≤ (3.8 − 4.0)/.2121)
= P(−2.36 ≤ Z ≤ −.94) = Φ(−.94) − Φ(−2.36) = .1736 − .0091 = .1645
cont’d

Example 5.27
Now consider randomly selecting 100 batches, and let T_o represent the total amount of impurity in these batches. Then the mean value and standard deviation of T_o are μ_To = nμ = 100(4) = 400 and σ_To = √100 (1.5) = 15, respectively, and the CLT implies that T_o has approximately a normal distribution. The probability that this total is at most 425 g is
P(T_o ≤ 425) ≈ P(Z ≤ (425 − 400)/15) = P(Z ≤ 1.67) = Φ(1.67) = .9525
cont’d
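Both approximations can be verified numerically (a sketch assuming SciPy; small differences from the table values reflect rounding):

```python
from math import sqrt
from scipy.stats import norm

mu, sigma = 4.0, 1.5

# P(3.5 <= xbar <= 3.8) for n = 50 batches
n = 50
se = sigma / sqrt(n)
p1 = norm.cdf(3.8, mu, se) - norm.cdf(3.5, mu, se)

# P(To <= 425) for n = 100 batches
n = 100
p2 = norm.cdf(425, n * mu, sqrt(n) * sigma)

print(f"P(3.5 <= xbar <= 3.8) = {p1:.4f}   P(To <= 425) = {p2:.4f}")
```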

The Central Limit Theorem
The CLT provides insight into why many random variables have probability distributions that are approximately normal. For example, the measurement error in a scientific experiment can be thought of as the sum of a number of underlying perturbations and errors of small magnitude. A practical difficulty in applying the CLT is in knowing when n is sufficiently large. The problem is that the accuracy of the approximation for a particular n depends on the shape of the original underlying distribution being sampled.

The Central Limit Theorem
If the underlying distribution is close to a normal density curve, then the approximation will be good even for a small n, whereas if it is far from being normal, then a large n will be required. A popular, though somewhat conservative, rule of thumb: if n > 30, the CLT can be used. There are population distributions for which even an n of 40 or 50 does not suffice, but such distributions are rarely encountered in practice.

The Central Limit Theorem
On the other hand, the rule of thumb is often conservative; for many population distributions, an n much less than 30 would suffice. For example, in the case of a uniform population distribution, the CLT gives a good approximation for n ≥ 12.
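A quick simulation sketch of that last claim (the population, cutoffs, and seed are assumed for illustration): with n = 12 uniform observations, simulated tail probabilities of X̄ already agree closely with the normal approximation.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(seed=3)
n = 12
mu, sigma = 0.5, np.sqrt(1 / 12)            # mean and sd of Uniform(0, 1)

xbar = rng.uniform(size=(200_000, n)).mean(axis=1)

# Compare simulated tail probabilities with the normal approximation
for c in (0.55, 0.60, 0.65):
    emp = np.mean(xbar > c)
    approx = norm.sf(c, mu, sigma / np.sqrt(n))
    print(f"P(xbar > {c:.2f}): simulated {emp:.4f}, normal approx {approx:.4f}")
```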

Other Applications of the Central Limit Theorem

Other Applications of the Central Limit Theorem
The CLT can be used to justify the normal approximation to the binomial distribution discussed in Chapter 4. Recall that a binomial variable X is the number of successes in a binomial experiment consisting of n independent success/failure trials with p = P(S) for any particular trial. Define a new rv X₁ by
X₁ = 1 if the first trial results in a success, X₁ = 0 if the first trial results in a failure
and define X₂, X₃, . . . , Xₙ analogously for the other n − 1 trials. Each Xᵢ indicates whether or not there is a success on the corresponding trial.

Other Applications of the Central Limit Theorem
Because the trials are independent and P(S) is constant from trial to trial, the Xᵢ’s are iid (a random sample from a Bernoulli distribution). The CLT then implies that if n is sufficiently large, both the sum and the average of the Xᵢ’s have approximately normal distributions.

Other Applications of the Central Limit Theorem
When the Xᵢ’s are summed, a 1 is added for every S that occurs and a 0 for every F, so X₁ + . . . + Xₙ = X. The sample mean of the Xᵢ’s is X/n, the sample proportion of successes. That is, both X and X/n are approximately normal when n is large.

Other Applications of the Central Limit Theorem
The necessary sample size for this approximation depends on the value of p: When p is close to .5, the distribution of each Xᵢ is reasonably symmetric (see Figure 5.20), whereas the distribution is quite skewed when p is near 0 or 1. Using the approximation only if both np ≥ 10 and n(1 − p) ≥ 10 ensures that n is large enough to overcome any skewness in the underlying Bernoulli distribution. [Figure 5.20: Two Bernoulli distributions: (a) p = .4 (reasonably symmetric); (b) p = .1 (very skewed)]
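The sketch below (n and p are assumed example values satisfying both conditions) compares exact binomial probabilities with the CLT-based normal approximation; the +.5 continuity correction follows the usual convention for this approximation.

```python
from math import sqrt
from scipy.stats import binom, norm

n, p = 100, 0.4                     # assumed example; np = 40 and n(1-p) = 60, both >= 10
mu, sd = n * p, sqrt(n * p * (1 - p))

for x in (35, 40, 45):
    exact = binom.cdf(x, n, p)
    approx = norm.cdf((x + 0.5 - mu) / sd)   # continuity-corrected normal approximation
    print(f"P(X <= {x}): exact {exact:.4f}, normal approx {approx:.4f}")
```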

Other Applications of the Central Limit Theorem
Consider n independent Poisson rv’s X₁, . . . , Xₙ, each having mean value μ. It can be shown that X = X₁ + . . . + Xₙ has a Poisson distribution with mean value nμ (because in general a sum of independent Poisson rv’s has a Poisson distribution). The CLT then implies that a Poisson rv with sufficiently large mean value μ has approximately a normal distribution. A common rule of thumb for this is μ > 20.
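A similar check for the Poisson case (the mean value is an assumed example above the μ > 20 threshold; the continuity correction is again the usual convention, not something stated on the slide):

```python
from math import sqrt
from scipy.stats import poisson, norm

mu = 25                                  # assumed mean value, above the mu > 20 rule of thumb
for x in (20, 25, 30):
    exact = poisson.cdf(x, mu)
    approx = norm.cdf((x + 0.5 - mu) / sqrt(mu))   # Poisson variance equals its mean
    print(f"P(X <= {x}): exact {exact:.4f}, normal approx {approx:.4f}")
```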

Other Applications of the Central Limit Theorem
Lastly, recall from Section 4.5 that X has a lognormal distribution if ln(X) has a normal distribution. Let X₁, X₂, . . . , Xₙ be a random sample from a distribution for which only positive values are possible [P(Xᵢ > 0) = 1]. Then if n is sufficiently large, the product Y = X₁X₂ · · · Xₙ has approximately a lognormal distribution. To verify this, note that
ln(Y) = ln(X₁) + ln(X₂) + . . . + ln(Xₙ)

Other Applications of the Central Limit Theorem
Since ln(Y) is a sum of independent and identically distributed rv’s [the ln(Xᵢ)’s], it is approximately normal when n is large, so Y itself has approximately a lognormal distribution. As an example of the applicability of this result, Bury (Statistical Models in Applied Science, Wiley, p. 590) argues that the damage process in plastic flow and crack propagation is a multiplicative process, so that variables such as percentage elongation and rupture strength have approximately lognormal distributions.
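Here is a simulation sketch of the multiplicative argument (the uniform population, number of factors, and seed are assumptions for illustration): ln(Y) is formed as a sum of iid terms and then checked for approximate normality.

```python
import numpy as np
from scipy.stats import normaltest

rng = np.random.default_rng(seed=4)
n = 50                                         # factors per product (assumed)

# Y is a product of n positive rv's, so ln(Y) is a sum of n iid terms
x = rng.uniform(0.5, 1.5, size=(100_000, n))   # any positive-valued population works
log_y = np.log(x).sum(axis=1)                  # ln(Y) = sum of ln(X_i)

stat, pvalue = normaltest(log_y[:5000])        # D'Agostino-Pearson normality test
print(f"normality test of ln(Y): p-value = {pvalue:.3f} "
      "(a large p-value is consistent with approximate normality)")
```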