
# Topic 7: Random Variables and Distribution Functions

## 7.1 Introduction

| statistics | probability |
| --- | --- |
| universe of information | sample space |
| ask a question and collect data | define a random variable |
| organize into the empirical cumulative distribution function | organize into the cumulative distribution function |
| compute sample means and variances | compute distributional means and variances |

Table I: Corresponding notions between statistics and probability.

Examining probability models and random variables will lead to strategies for the collection of data and inference from these data. From the universe of possible information, we ask a question. To address this question, we might collect quantitative data and organize it, for example, using the empirical cumulative distribution function. With this information, we are able to compute sample means, standard deviations, medians and so on.

Similarly, even a fairly simple probability model can have an enormous number of outcomes. For example, flip a coin 332 times. Then the number of outcomes is more than a googol ($10^{100}$), a number at least 100 quintillion times the number of elementary particles in the known universe. We may not be interested in an analysis that considers separately every possible outcome, but rather some simpler concept like the number of heads or the longest run of tails. To focus our attention on the issues of interest, we take a given outcome and compute a number. This function is called a *random variable*.

**Definition 7.1.** A *random variable* is a real-valued function on the probability space. Generally speaking, we shall use capital letters near the end of the alphabet, e.g., $X, Y, Z$, for random variables. The range $S$ of a random variable is sometimes called the *state space*.

**Exercise 7.2.** Roll a die twice and consider the sample space $\Omega = \{(i,j);\ i,j = 1,2,3,4,5,6\}$ and give some random variables on $\Omega$.

**Exercise 7.3.** Flip a coin 10 times and consider the sample space $\Omega$, the set of 10-tuples of heads and tails, and give some random variables on $\Omega$.
*Introduction to the Science of Statistics — Random Variables and Distribution Functions*

We often create new random variables via composition of functions:

$$\omega \mapsto X(\omega) \mapsto f(X(\omega)).$$

Thus, if $X$ is a random variable, then so are $X^2$, $\exp X$, $\sqrt{X^2 + 1}$, $\tan^2 X$, $\lfloor X \rfloor$, and so on. The last of these, rounding down to the nearest integer, is called the *floor function*.

**Exercise 7.4.** How would we use the floor function to round down a number $x$ to $n$ decimal places?

## 7.2 Distribution Functions

Having defined a random variable of interest, $X$, the question typically becomes, "What are the chances that $X$ lands in some subset of values $B$?" For example, $B$ might be the odd numbers, the values greater than 1, or the values between 2 and 7. We write

$$\{\omega \in \Omega;\ X(\omega) \in B\} \tag{7.1}$$

to indicate those outcomes $\omega$ which have $X(\omega)$, the value of the random variable, in the subset $B$. We shall often abbreviate (7.1) to the shorter statement $\{X \in B\}$. Thus, for the example above, we may write the events

$$\{X \text{ is an odd number}\}, \qquad \{X > 1\}, \qquad \{2 \le X \le 7\}$$

to correspond to the three choices above for the subset $B$.

Many of the properties of random variables are not concerned with the specific random variable $X$ given above, but rather depend on the way $X$ distributes its values. This leads to a definition in the context of random variables that we saw previously with quantitative data.

**Definition 7.5.** The *(cumulative) distribution function* of a random variable $X$ is defined by

$$F_X(x) = P\{\omega \in \Omega;\ X(\omega) \le x\}.$$

Recall that with quantitative observations, we called the analogous notion the empirical cumulative distribution function. Using the abbreviated notation above, we shall typically write the less explicit expression

$$F_X(x) = P\{X \le x\}$$

for the distribution function.

**Exercise 7.6.** Establish the following identities that relate a random variable, the complement of an event, and the union and intersection of events:

1. $\{X \in B\}^c = \{X \in B^c\}$.
2. For sets $B_1, B_2, \ldots$, $\bigcup_i \{X \in B_i\} = \{X \in \bigcup_i B_i\}$ and $\bigcap_i \{X \in B_i\} = \{X \in \bigcap_i B_i\}$.
3. If $B_1, \ldots, B_n$ form a partition of the sample space $S$, then $C_i = \{X \in B_i\}$, $i = 1, \ldots, n$, form a partition of the probability space $\Omega$.
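Definition 7.5 has a sample counterpart, the empirical cumulative distribution function mentioned above. The notes use R throughout; purely as an illustrative cross-check, here is a minimal Python sketch (the data values are made up for the example):

```python
# A sketch (not from the text): the empirical cumulative distribution
# function of a data set, the sample analogue of F_X(x) = P{X <= x}.
def ecdf(data):
    """Return a function x -> proportion of observations at most x."""
    values = sorted(data)
    n = len(values)

    def F(x):
        return sum(1 for v in values if v <= x) / n

    return F

# Hypothetical observations
obs = [2.1, 3.5, 3.5, 4.0, 5.2]
F = ecdf(obs)
print(F(3.5))  # → 0.6, since 3 of the 5 observations are at most 3.5
```

Like every distribution function, this one starts at 0, ends at 1, and is nondecreasing, properties taken up in Section 7.3.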
**Exercise 7.7.** For a random variable $X$ and subset $B$ of the sample space $S$, define

$$Q(B) = P\{X \in B\}.$$

Show that $Q$ is a probability.

For the complement of $\{X \le x\}$, we have the *survival function*

$$\bar{F}_X(x) = P\{X > x\} = 1 - P\{X \le x\} = 1 - F_X(x).$$

Choose $a < b$; then the event $\{X \le a\} \subset \{X \le b\}$. Their set-theoretic difference

$$\{X \le b\} \setminus \{X \le a\} = \{a < X \le b\}.$$

In words, the event that $X$ is less than or equal to $b$ but not less than or equal to $a$ is the event that $X$ is greater than $a$ and less than or equal to $b$. Consequently, by the difference rule for probabilities,

$$P\{a < X \le b\} = P\{X \le b\} - P\{X \le a\} = F_X(b) - F_X(a). \tag{7.2}$$

Thus, we can compute the probability that a random variable takes values in an interval by subtracting the distribution function evaluated at the endpoints of the interval. Care is needed on the issue of the inclusion or exclusion of the endpoints of the interval.

**Example 7.8.** To give the cumulative distribution function for $X$, the sum of the values for two rolls of a die, we start with the table

| $x$ | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| $P\{X = x\}$ | 1/36 | 2/36 | 3/36 | 4/36 | 5/36 | 6/36 | 5/36 | 4/36 | 3/36 | 2/36 | 1/36 |

and create the graph.

Figure 7.1: Graph of $F_X$, the cumulative distribution function for the sum of the values for two rolls of a die.
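The table in Example 7.8 can be cross-checked numerically. The following short sketch (in Python rather than the R used later in these notes) builds the mass function for the sum of two dice and accumulates it into the cumulative distribution function:

```python
from fractions import Fraction
from itertools import product

# Mass function for the sum of the values on two rolls of a fair die
f = {}
for i, j in product(range(1, 7), repeat=2):
    f[i + j] = f.get(i + j, Fraction(0)) + Fraction(1, 36)

# Cumulative distribution function: a running total of the mass function
F, total = {}, Fraction(0)
for x in range(2, 13):
    total += f[x]
    F[x] = total

print(f[7], F[7])  # → 1/6 7/12
```

Exact fractions are used so the values match the 1/36, ..., 6/36 entries of the table with no rounding.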
If we look at the graph of this cumulative distribution function, we see that it is constant in between the possible values for $X$ and that the jump size at $x$ is equal to $P\{X = x\}$. In this example, $P\{X = 5\} = 4/36$, the size of the jump at $x = 5$. In addition,

$$F_X(5) - F_X(2) = P\{2 < X \le 5\} = P\{X = 3\} + P\{X = 4\} + P\{X = 5\} = \frac{2}{36} + \frac{3}{36} + \frac{4}{36} = \frac{9}{36} = \frac{1}{4}.$$

We shall call a random variable *discrete* if it has a finite or countably infinite state space. Thus, we have in general that

$$P\{a < X \le b\} = \sum_{x;\, a < x \le b} P\{X = x\}.$$

**Exercise 7.9.** Let $X$ be the number of heads on three independent flips of a biased coin that turns up heads with probability $p$. Give the cumulative distribution function $F_X$ for $X$.

**Exercise 7.10.** Let $X$ be the number of spades in a collection of three cards. Give the cumulative distribution function for $X$. Use R to plot this function.

**Exercise 7.11.** Find the cumulative distribution function of $Y$ in terms of $F_X$, the distribution function for $X$.

## 7.3 Properties of the Distribution Function

A distribution function $F_X$ has the property that it starts at 0, ends at 1, and does not decrease with increasing values of $x$. This is the content of the next exercise.

**Exercise 7.12.**
1. $\lim_{x \to -\infty} F_X(x) = 0$.
2. $\lim_{x \to \infty} F_X(x) = 1$.
3. $F_X$ is nondecreasing.

Figure 7.2: (top) Dartboard. (bottom) Cumulative distribution function for the dartboard random variable.

The cumulative distribution function $F_X$ of a discrete random variable $X$ is constant except for jumps. At a jump, $F_X$ is right continuous:

$$\lim_{x \to a^+} F_X(x) = F_X(a).$$

The next exercise asks that this be shown more generally.

**Exercise 7.13.** Prove the statement concerning the right continuity of the distribution function from the continuity property of a probability.

**Definition 7.14.** A *continuous random variable* has a cumulative distribution function $F_X$ that is differentiable.

So, distribution functions for continuous random variables increase smoothly. To show how this can occur, we will develop an example of a continuous random variable.

**Example 7.15.** Consider a dartboard having unit radius. Assume that the dart lands randomly uniformly on the dartboard. Let $X$ be the distance from the center. For $x \in [0, 1]$,

$$F_X(x) = P\{X \le x\} = \frac{\text{area inside circle of radius } x}{\text{area of circle}} = \frac{\pi x^2}{\pi \cdot 1^2} = x^2.$$
Thus, we have the distribution function

$$F_X(x) = \begin{cases} 0 & \text{if } x \le 0, \\ x^2 & \text{if } 0 < x \le 1, \\ 1 & \text{if } x > 1. \end{cases}$$

The first line states that $X$ cannot be negative. The third states that $X$ is at most 1, and the middle line describes how $X$ distributes its values between 0 and 1. For example,

$$F_X\!\left(\frac{1}{2}\right) = \frac{1}{4}$$

indicates that with probability 1/4, the dart will land within 1/2 unit of the center of the dartboard.

**Exercise 7.16.** Find the probability that the dart lands between 1/3 unit and 2/3 unit from the center.

**Exercise 7.17.** Let the reward $Y$ for throwing the dart be the inverse $1/X$ of the distance from the center. Find the cumulative distribution function for $Y$.

**Exercise 7.18.** An exponential random variable $X$ has cumulative distribution function

$$F_X(x) = P\{X \le x\} = \begin{cases} 0 & \text{if } x \le 0, \\ 1 - \exp(-\lambda x) & \text{if } x > 0 \end{cases} \tag{7.3}$$

for some $\lambda > 0$. Show that $F_X$ has the properties of a distribution function.

Its value at $x$ can be computed in R using the command pexp(x,0.1) for $\lambda = 1/10$ and drawn using

```
> curve(pexp(x,0.1),0,80)
```

Figure 7.3: Cumulative distribution function for an exponential random variable with $\lambda = 1/10$.

**Exercise 7.19.** The time until the next bus arrives is an exponential random variable with $\lambda = 1/10$ minutes. A person waits for a bus at the bus stop until the bus arrives, giving up when the wait reaches 20 minutes. Give the cumulative distribution function for $T$, the time that the person remains at the bus station, and sketch a graph.

Even though the cumulative distribution function is defined for every random variable, we will often use other characterizations, namely, the mass function for discrete random variables and the density function for continuous random variables. Indeed, we typically will introduce a random variable via one of these two functions. In the next two sections we introduce these two concepts and develop some of their properties.
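The R call pexp(x, 0.1) above evaluates formula (7.3) with rate $\lambda = 1/10$. As a cross-check, here is a small Python version of the same formula (a sketch, not part of the text):

```python
import math

def pexp(x, rate):
    """Exponential cumulative distribution function F(x) = 1 - exp(-rate*x),
    mirroring the value computed by R's pexp; returns 0 for x <= 0."""
    return 1 - math.exp(-rate * x) if x > 0 else 0.0

# With rate 1/10, about 63.2% of the probability lies below x = 10.
print(round(pexp(10, 0.1), 4))  # → 0.6321
```

The printed value is $1 - e^{-1}$, the probability that such a waiting time is at most its mean of 10 minutes.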
## 7.4 Mass Functions

**Definition 7.20.** The *(probability) mass function* of a discrete random variable $X$ is

$$f_X(x) = P\{X = x\}.$$

The mass function has two basic properties:

- $f_X(x) \ge 0$ for all $x$ in the state space.
- $\sum_x f_X(x) = 1$.

The first property is based on the fact that probabilities are non-negative. The second follows from the observation that the collection of events $\{X = x\}$ for all $x \in S$, the state space for $X$, forms a partition of the probability space $\Omega$. In Example 7.8, we saw the mass function for the random variable $X$ that is the sum of the values on two independent rolls of a fair die.

**Example 7.21.** Let's make tosses of a biased coin whose outcomes are independent. We shall continue tossing until we obtain a toss of heads. Let $X$ denote the random variable that gives the number of tails before the first head and $p$ denote the probability of heads in any given toss. Then

$$f_X(0) = P\{X = 0\} = P\{H\} = p,$$
$$f_X(1) = P\{X = 1\} = P\{TH\} = (1-p)p,$$
$$f_X(2) = P\{X = 2\} = P\{TTH\} = (1-p)^2 p,$$
$$\vdots$$
$$f_X(x) = P\{X = x\} = P\{T \cdots TH\} = (1-p)^x p.$$

So, the probability mass function is $f_X(x) = (1-p)^x p$. Because the terms in this mass function form a geometric sequence, $X$ is called a *geometric random variable*.

Recall that a geometric sequence $c, cr, cr^2, \ldots, cr^n$ has sum

$$s_n = c + cr + cr^2 + \cdots + cr^n = \frac{c(1 - r^{n+1})}{1 - r}$$

for $r \ne 1$. If $|r| < 1$, then $\lim_{n \to \infty} r^{n+1} = 0$ and thus $s_n$ has a limit as $n \to \infty$. In this case, the infinite sum is the limit

$$c + cr + cr^2 + \cdots = \lim_{n \to \infty} s_n = \frac{c}{1 - r}.$$

**Exercise 7.22.** Establish the formula above for $s_n$.

The mass function above forms a geometric sequence with ratio $r = 1 - p$. Consequently, for positive integers $a$ and $b$,

$$P\{a \le X \le b\} = \sum_{x=a}^{b} (1-p)^x p = \frac{(1-p)^a p - (1-p)^{b+1} p}{1 - (1-p)} = (1-p)^a - (1-p)^{b+1}.$$

We can take $a = 0$ to find the distribution function for a geometric random variable:

$$F_X(b) = P\{X \le b\} = 1 - (1-p)^{b+1}.$$

**Exercise 7.23.** Give a second way to find the distribution function above by explaining why $P\{X > b\} = (1-p)^{b+1}$.
The mass function and the cumulative distribution function for the geometric random variable with parameter $p = 1/3$ can be found in R by writing

```
> x<-c(0:10)
> f<-dgeom(x,1/3)
> F<-pgeom(x,1/3)
```

The initial d indicates the density (here, mass function) and p indicates the probability from the distribution function.

```
> data.frame(x,f,F)
    x           f         F
1   0 0.333333333 0.3333333
2   1 0.222222222 0.5555556
3   2 0.148148148 0.7037037
4   3 0.098765432 0.8024691
5   4 0.065843621 0.8683128
6   5 0.043895748 0.9122085
7   6 0.029263832 0.9414723
8   7 0.019509221 0.9609816
9   8 0.013006147 0.9739877
10  9 0.008670765 0.9826585
11 10 0.005780510 0.9884390
```

Note that the difference in values in the distribution function, $F_X(x) - F_X(x-1)$, giving the height of the jump in $F_X$ at $x$, is equal to the value of the mass function at $x$. For example,

$$F_X(2) - F_X(1) = 0.7037037 - 0.5555556 = 0.148148148 = f_X(2).$$

**Exercise 7.24.** Check that the jumps in the cumulative distribution function for the geometric random variable above are equal to the values of the mass function.

**Exercise 7.25.** For the geometric random variable above, find $P\{X \le 3\}$, $P\{2 < X \le 5\}$, and $P\{X > 3\}$.

We can simulate 100 geometric random variables with parameter $p = 1/3$ using the R command rgeom(100,1/3). (See Figure 7.4.)

Figure 7.4: Histogram of 100 and 10,000 simulated geometric random variables with $p = 1/3$. Note that the histogram looks much more like a geometric series for 10,000 simulations. We shall see later how this relates to the law of large numbers.
## 7.5 Density Functions

**Definition 7.26.** For $X$ a random variable whose distribution function $F_X$ has a derivative, the function $f_X$ satisfying

$$F_X(x) = \int_{-\infty}^{x} f_X(t)\, dt$$

is called the *probability density function* and $X$ is called a *continuous random variable*.

By the fundamental theorem of calculus, the density function is the derivative of the distribution function:

$$f_X(x) = \lim_{\Delta x \to 0} \frac{F_X(x + \Delta x) - F_X(x)}{\Delta x} = F_X'(x).$$

In other words,

$$f_X(x)\,\Delta x \approx P\{x < X \le x + \Delta x\}.$$

We can compute probabilities by evaluating definite integrals:

$$P\{a < X \le b\} = F_X(b) - F_X(a) = \int_{a}^{b} f_X(t)\, dt.$$

Figure 7.5: The probability $P\{a < X \le b\}$ is the area under the density function, above the $x$-axis, between $x = a$ and $x = b$.

The density function has two basic properties that mirror the properties of the mass function:

- $f_X(x) \ge 0$ for all $x$ in the state space.
- $\int_{-\infty}^{\infty} f_X(x)\, dx = 1$.

Return to the dartboard example, letting $X$ be the distance from the center of a dartboard having unit radius. Then,

$$\frac{d}{dx} x^2 = 2x,$$

and $X$ has density

$$f_X(x) = \begin{cases} 0 & \text{if } x < 0, \\ 2x & \text{if } 0 \le x \le 1, \\ 0 & \text{if } x > 1. \end{cases}$$

**Exercise 7.27.** Let $f_X$ be the density for a random variable $X$ and pick a number $x_0$. Explain why $P\{X = x_0\} = 0$.

**Example 7.28.** For the exponential distribution function (7.3), we have the density function

$$f_X(x) = \begin{cases} 0 & \text{if } x \le 0, \\ \lambda e^{-\lambda x} & \text{if } x > 0. \end{cases}$$

**Example 7.29.** Density functions do not need to be bounded. For example, take

$$f_X(x) = \begin{cases} 0 & \text{if } x \le 0, \\ \dfrac{c}{\sqrt{x}} & \text{if } 0 < x < 1, \\ 0 & \text{if } 1 \le x. \end{cases}$$
Then, to find the value of the constant $c$, we compute the integral

$$1 = \int_0^1 \frac{c}{\sqrt{t}}\, dt = 2c\sqrt{t}\,\Big|_0^1 = 2c.$$

So $c = 1/2$. For $0 \le a < b \le 1$,

$$P\{a < X \le b\} = \int_a^b \frac{1}{2\sqrt{t}}\, dt = \sqrt{t}\,\Big|_a^b = \sqrt{b} - \sqrt{a}.$$

**Exercise 7.30.** Give the cumulative distribution function for the random variable in the previous example.

**Exercise 7.31.** Let $X$ be a continuous random variable with density $f_X$; then the random variable $Y = aX$ has density

$$f_Y(y) = \frac{1}{|a|} f_X\!\left(\frac{y}{a}\right).$$

(Hint: Begin with the definition of the cumulative distribution function $F_Y$ for $Y$. Consider the cases $a > 0$ and $a < 0$ separately.)

## 7.6 Joint Distributions

Because we will collect data on several observations, we must, as well, consider more than one random variable at a time in order to model our experimental procedures. Consequently, we will expand on the concepts above to the case of multiple random variables and their joint distribution. For the case of two random variables, $X_1$ and $X_2$, this means looking at the probability of events

$$\{X_1 \in B_1, X_2 \in B_2\}.$$

For discrete random variables, take $B_1 = \{x_1\}$ and $B_2 = \{x_2\}$ and define the *joint probability mass function*

$$f_{X_1,X_2}(x_1, x_2) = P\{X_1 = x_1, X_2 = x_2\}.$$

For continuous random variables, we consider $B_1 = (-\infty, x_1]$ and $B_2 = (-\infty, x_2]$ and ask that, for some function $f_{X_1,X_2}$, the *joint probability density function*,

$$P\{X_1 \le x_1, X_2 \le x_2\} = \int_{-\infty}^{x_1}\!\int_{-\infty}^{x_2} f_{X_1,X_2}(t_1, t_2)\, dt_2\, dt_1.$$

**Example 7.32.** Generalize the notion of mass and density functions to more than two random variables.

### 7.6.1 Independent Random Variables

Many of our experimental protocols will be designed so that observations are independent. More precisely, we will say that two random variables $X_1$ and $X_2$ are *independent* if any two events associated to them are independent, i.e.,

$$P\{X_1 \in B_1, X_2 \in B_2\} = P\{X_1 \in B_1\}\, P\{X_2 \in B_2\}.$$

In words, the probability that the two events $\{X_1 \in B_1\}$ and $\{X_2 \in B_2\}$ happen simultaneously is equal to the product of the probabilities that each of them happens individually. For independent discrete random variables, we have that

$$f_{X_1,X_2}(x_1, x_2) = P\{X_1 = x_1, X_2 = x_2\} = P\{X_1 = x_1\}\, P\{X_2 = x_2\} = f_{X_1}(x_1)\, f_{X_2}(x_2).$$

In this case, we say that the joint probability mass function is the product of the *marginal* mass functions.
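The product rule for independent discrete random variables can be illustrated with a small sketch (not from the text; the two marginal mass functions below, a fair coin and a fair die, are chosen only for illustration):

```python
# For independent discrete random variables, the joint mass function is
# the product of the marginals, and event probabilities factor.
f1 = {0: 0.5, 1: 0.5}                                   # heads on one toss
f2 = {k: 1 / 6 for k in range(1, 7)}                    # one roll of a die

# Joint mass function under independence
joint = {(x1, x2): f1[x1] * f2[x2] for x1 in f1 for x2 in f2}

# The joint mass function still sums to 1 ...
assert abs(sum(joint.values()) - 1) < 1e-12

# ... and P{X1 in B1, X2 in B2} = P{X1 in B1} P{X2 in B2} for any events.
B1, B2 = {1}, {2, 4, 6}
lhs = sum(p for (x1, x2), p in joint.items() if x1 in B1 and x2 in B2)
rhs = sum(f1[x] for x in B1) * sum(f2[x] for x in B2)
assert abs(lhs - rhs) < 1e-12
print(lhs)  # → 0.25, i.e. (1/2)(1/2)
```

The same factorization holds for any choice of the events $B_1$ and $B_2$, which is exactly the definition of independence above.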
For continuous random variables,

$$P\{X_1 \le x_1, X_2 \le x_2\} = P\{X_1 \le x_1\}\, P\{X_2 \le x_2\} = F_{X_1}(x_1)\, F_{X_2}(x_2).$$

Thus, for independent continuous random variables, the joint probability density function

$$f_{X_1,X_2}(x_1, x_2) = f_{X_1}(x_1)\, f_{X_2}(x_2)$$

is the product of the marginal density functions.

**Exercise 7.33.** Generalize the notion of independent mass and density functions to more than two random variables.

Soon, we will be looking at independent observations $x_1, x_2, \ldots, x_n$ arising from an unknown density or mass function $f$. Thus, the joint density is

$$f(x_1)\, f(x_2) \cdots f(x_n).$$

Generally speaking, the density function will depend on the choice of a parameter value $\theta$. (For example, the unknown parameter $\lambda$ in the density function for an exponential random variable that describes the waiting time for a bus.) Given the data arising from the $n$ observations, the *likelihood function* arises by considering this joint density not as a function of $x_1, \ldots, x_n$, but rather as a function of the parameter $\theta$. We shall learn how the study of the likelihood plays a major role in parameter estimation and in the testing of hypotheses.

## 7.7 Simulating Random Variables

One goal for these notes is to provide the tools needed to design inferential procedures based on sound principles of statistical science. Thus, one of the very important uses of statistical software is the ability to generate pseudo-data to simulate the actual data. This provides the opportunity to test and refine methods of analysis in advance of the need to use these methods on genuine data. This requires that we explore the properties of the data through simulation. For many of the frequently used families of random variables, R provides commands for their simulation. We shall examine these families and their properties in Topic 9, *Examples of Mass Functions and Densities*. For other circumstances, we will need to have methods for simulating sequences of independent random variables that possess a common distribution. We first consider the case of discrete random variables.

### 7.7.1 Discrete Random Variables and the sample Command

The sample command is used to create simple and stratified random samples. Thus, if we enter a sequence x,

```
sample(x,40)
```

chooses 40 entries from x in such a way that all choices of size 40 have the same probability. This uses the default of sampling *without replacement*. We can use this command to simulate discrete random variables. To do this, we need to give the state space in a vector x and a mass function f. The call replace=TRUE indicates that we are sampling *with replacement*. Then, to give a sample of n independent random variables having common mass function f, we use

```
sample(x,n,replace=TRUE,prob=f)
```

**Example 7.34.** Let $X$ be described by the mass function

| $x$ | 1 | 2 | 3 | 4 |
| --- | --- | --- | --- | --- |
| $f_X(x)$ | 0.1 | 0.2 | 0.3 | 0.4 |

Then to simulate 50 independent observations from this mass function:

```
> x<-c(1,2,3,4)
> f<-c(0.1,0.2,0.3,0.4)
> sum(f)
```
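R's sample(x, n, replace=TRUE, prob=f) has a close standard-library analogue in Python's random.choices. Purely as a cross-language sketch (not part of the text), the call in Example 7.34 could be mirrored as:

```python
import random
from collections import Counter

random.seed(0)  # seed chosen here only for reproducibility

x = [1, 2, 3, 4]
f = [0.1, 0.2, 0.3, 0.4]  # mass function from Example 7.34

# Sample 50 independent observations with replacement, weighted by f
data = random.choices(x, weights=f, k=50)

# Tabulate the simulated proportions, as the R code below does
counts = Counter(data)
simprob = {v: counts[v] / len(data) for v in x}
```

As with the R run, the simulated proportions will be near 0.1, 0.2, 0.3, 0.4 but will fluctuate from sample to sample.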
```
[1] 1
> data<-sample(x,50,replace=TRUE,prob=f)
> data
 [1] 1 4 4 4 4 4 3 3 4 3 3 2 3 3 3 4 4 3 3 2 4 1 3 3 4 2 3 3 3 1 2 4 3 2 3 4 4 4 4 2 4 1
[43] 2 3 4 4 1 4 3 4
```

Notice that 1 is the least represented value and 4 is the most represented. If the command prob=f is omitted, then sample will choose uniformly from the values in the vector x. Let's check our simulation against the mass function that generated the data. First, recount the observations that take on each possible value for x. We can make a table.

```
> table(data)
data
 1  2  3  4
 5  7 18 20
```

or use the counts to determine the simulated proportions. (Notice the double equal sign, ==, used to test equality.)

```
> counts<-rep(0,max(x)-min(x)+1)
> for (i in min(x):max(x)){counts[i]<-length(data[data==i])}
> simprob<-counts/(sum(counts))
> data.frame(x,f,simprob)
  x   f simprob
1 1 0.1    0.10
2 2 0.2    0.14
3 3 0.3    0.36
4 4 0.4    0.40
```

**Exercise 7.35.** Simulate the sums on each of 20 rolls of a pair of dice. Repeat this for 1000 rolls and compare the simulation with the appropriate mass function.

Figure 7.6: Illustrating the probability transform. First simulate uniform random variables $u_1, u_2, \ldots, u_{100}$ on the interval $[0, 1]$. About 10% of the random numbers should fall in each subinterval of length 0.1; these correspond, for a random variable with the distribution function $F_X$ shown, to simulations in the matching interval on the $x$-axis. The values on the $x$-axis are obtained from the inverse function, i.e., $x_i = F_X^{-1}(u_i)$.
### 7.7.2 Continuous Random Variables and the Probability Transform

If $X$ is a continuous random variable with a density $f_X$ that is positive everywhere in its domain, then the distribution function $F_X(x) = P\{X \le x\}$ is strictly increasing. In this case, $F_X$ has an inverse function $F_X^{-1}$, known as the *quantile function*.

**Exercise 7.36.** $F_X^{-1}(u) \le x$ if and only if $u \le F_X(x)$.

The probability transform follows from an analysis of the random variable

$$U = F_X(X).$$

Note that $F_X$ has range from 0 to 1. It cannot take values below 0 or above 1. Thus, $U$ takes on values between 0 and 1, and so

$$F_U(u) = 0 \ \text{ for } u < 0 \qquad \text{and} \qquad F_U(u) = 1 \ \text{ for } u \ge 1.$$

For values of $u$ between 0 and 1, note that

$$P\{F_X(X) \le u\} = P\{X \le F_X^{-1}(u)\} = F_X(F_X^{-1}(u)) = u.$$

Thus, the distribution function for the random variable $U$ is

$$F_U(u) = \begin{cases} 0 & u < 0, \\ u & 0 \le u < 1, \\ 1 & 1 \le u. \end{cases}$$

If we can simulate $U$, we can simulate a random variable $X$ with distribution function $F_X$ via the quantile function

$$X = F_X^{-1}(U). \tag{7.4}$$

Take a derivative to see that the density

$$f_U(u) = \begin{cases} 0 & u < 0, \\ 1 & 0 < u < 1, \\ 0 & 1 < u. \end{cases}$$

Figure 7.7: The distribution function (red) and the empirical cumulative distribution function (black) based on 100 simulations of the dartboard distribution, using the R commands given below.

Because the random variable $U$ has a constant density over the interval of its possible values, it is called *uniform* on the interval $[0, 1]$. It is simulated in R using the runif command. The identity (7.4) is called the *probability transform*. This transform is illustrated in Figure 7.6. We can see how the probability transform works in the following example.

**Example 7.37.** For the dartboard, for $x$ between 0 and 1, the distribution function is $u = F_X(x) = x^2$ and thus the quantile function is $x = F_X^{-1}(u) = \sqrt{u}$. We can simulate independent observations of the distance from the center $X_1, X_2, \ldots, X_{100}$ of the dartboard by simulating independent uniform random variables $U_1, U_2, \ldots, U_{100}$ and taking the quantile function

$$X_i = \sqrt{U_i}.$$
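As a cross-check of Example 7.37 (a sketch, not from the text, which uses R), the probability transform can be run with a larger sample and compared against $F_X(1/2) = 1/4$:

```python
# Simulate the dartboard distance X via the probability transform
# X = F^{-1}(U) = sqrt(U), then check the simulation against F(1/2) = 1/4.
import random

random.seed(0)  # seed chosen here only for reproducibility

n = 100_000
x = [random.random() ** 0.5 for _ in range(n)]  # X_i = sqrt(U_i)

# Empirically, the proportion of simulated distances at most 1/2
# should be close to F(1/2) = (1/2)^2 = 1/4.
prop = sum(1 for xi in x if xi <= 0.5) / n
```

With 100,000 draws, the empirical proportion lands within about 0.005 of 1/4 almost every run, foreshadowing the law of large numbers mentioned earlier.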
```
> u<-runif(100)
> x<-sqrt(u)
> xd<-seq(0,1,0.01)
> plot(sort(x),1:length(x)/length(x),
  type="s",xlim=c(0,1),ylim=c(0,1),
  xlab="x",ylab="probability")
> par(new=TRUE)
> plot(xd,xd^2,type="l",xlim=c(0,1),ylim=c(0,1),xlab="",ylab="",col="red")
```

**Exercise 7.38.** If $U$ is uniform on $[0, 1]$, then so is $V = 1 - U$.

Sometimes, it is easier to simulate $X$ using $V = 1 - U$.

**Example 7.39.** For an exponential random variable, set

$$u = F_X(x) = 1 - \exp(-\lambda x), \qquad \text{and thus} \qquad x = -\frac{1}{\lambda}\ln(1 - u).$$

Consequently, we can simulate independent exponential random variables $X_1, X_2, \ldots, X_n$ by simulating independent uniform random variables $V_1, V_2, \ldots, V_n$ and taking the transform

$$X_i = -\frac{1}{\lambda}\ln V_i.$$

R accomplishes this directly through the rexp command.

## 7.8 Answers to Selected Exercises

7.2. The sum, the maximum, the minimum, the difference, the value on the first die, the product.

7.3. The flip with the first head, the number of heads, the longest run of heads, the number of tails after the first head.

7.4. $\lfloor 10^n x \rfloor / 10^n$.

7.6. A common way to show that two events $A$ and $B$ are equal is to pick an element $\omega \in A$ and show that it is in $B$. This proves $A \subset B$. Then pick an element $\omega \in B$ and show that it is in $A$, proving that $B \subset A$. Taken together, we have that the events are equal, $A = B$. Sometimes the logic needed in showing consists not solely of implications, but rather of equivalent statements. (We can indicate this with the symbol $\Leftrightarrow$.) In this case we can combine the two parts of the argument. For this exercise, as the lines below show, this is a successful strategy. We follow an arbitrary outcome $\omega \in \Omega$.

1. $\omega \in \{X \in B\}^c \Leftrightarrow \omega \notin \{X \in B\} \Leftrightarrow X(\omega) \notin B \Leftrightarrow X(\omega) \in B^c \Leftrightarrow \omega \in \{X \in B^c\}$. Thus, $\{X \in B\}^c = \{X \in B^c\}$.
2. $\omega \in \bigcup_i \{X \in B_i\} \Leftrightarrow \omega \in \{X \in B_i\}$ for some $i$ $\Leftrightarrow X(\omega) \in B_i$ for some $i$ $\Leftrightarrow X(\omega) \in \bigcup_i B_i \Leftrightarrow \omega \in \{X \in \bigcup_i B_i\}$. Thus, $\bigcup_i \{X \in B_i\} = \{X \in \bigcup_i B_i\}$. The identity with intersection is similar, with "for all" instead of "for some".
3. We must show that the union of the $C_i = \{X \in B_i\}$ is equal to the probability space $\Omega$ and that each pair is mutually exclusive. For this,
   (a) Because the $B_i$ are a partition of $S$, $\bigcup_i B_i = S$, and $\bigcup_i C_i = \bigcup_i \{X \in B_i\} = \{X \in \bigcup_i B_i\} = \{X \in S\} = \Omega$, the probability space.
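The transform in Example 7.39 can be tried out directly. A sketch (not from the text, which uses R's rexp) with $\lambda = 1/10$, the bus-waiting rate from Exercise 7.19:

```python
# Simulate exponential waiting times via X = -(1/lambda) ln(1 - U),
# where U is uniform on [0, 1], and check two known properties.
import math
import random

random.seed(1)  # seed chosen here only for reproducibility

lam = 1 / 10
n = 100_000
x = [-math.log(1 - random.random()) / lam for _ in range(n)]

# The mean should be near 1/lambda = 10, and the proportion of draws
# below 10 should be near F(10) = 1 - exp(-1), about 0.632.
mean = sum(x) / n
prop = sum(1 for xi in x if xi <= 10) / n
```

Using $1 - U$ in the logarithm keeps the argument strictly positive; by Exercise 7.38 this changes nothing in distribution.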
   (b) For $i \ne j$, $B_i \cap B_j = \emptyset$, and

$$C_i \cap C_j = \{X \in B_i\} \cap \{X \in B_j\} = \{X \in B_i \cap B_j\} = \{X \in \emptyset\} = \emptyset.$$

7.7. Let's check the three axioms. Each verification is based on the corresponding axiom for the probability $P$.

1. For any subset $B$, $Q(B) = P\{X \in B\} \ge 0$.
2. For the sample space $S$, $Q(S) = P\{X \in S\} = P(\Omega) = 1$.
3. For mutually exclusive subsets $B_i$, $i = 1, 2, \ldots$, we have by the exercise above the mutually exclusive events $\{X \in B_i\}$, $i = 1, 2, \ldots$. Thus,

$$Q\left(\bigcup_{i=1}^{\infty} B_i\right) = P\left\{X \in \bigcup_{i=1}^{\infty} B_i\right\} = P\left(\bigcup_{i=1}^{\infty} \{X \in B_i\}\right) = \sum_{i=1}^{\infty} P\{X \in B_i\} = \sum_{i=1}^{\infty} Q(B_i).$$

7.9. For three tosses of a biased coin, we have

| $x$ | 0 | 1 | 2 | 3 |
| --- | --- | --- | --- | --- |
| $P\{X = x\}$ | $(1-p)^3$ | $3p(1-p)^2$ | $3p^2(1-p)$ | $p^3$ |

Thus, the cumulative distribution function is

$$F_X(x) = \begin{cases} 0 & \text{for } x < 0, \\ (1-p)^3 & \text{for } 0 \le x < 1, \\ (1-p)^3 + 3p(1-p)^2 = (1-p)^2(1 + 2p) & \text{for } 1 \le x < 2, \\ (1-p)^2(1 + 2p) + 3p^2(1-p) = 1 - p^3 & \text{for } 2 \le x < 3, \\ 1 & \text{for } 3 \le x. \end{cases}$$

7.10. From the example in the section *Basics of Probability*, we know that

| $x$ | 0 | 1 | 2 | 3 |
| --- | --- | --- | --- | --- |
| $P\{X = x\}$ | 0.41353 | 0.43588 | 0.13765 | 0.01294 |

To plot the distribution function, we use

```
> hearts<-c(0:3)
> f<-choose(13,hearts)*choose(39,3-hearts)/choose(52,3)
> (F<-cumsum(f))
[1] 0.4135294 0.8494118 0.9870588 1.0000000
> plot(hearts,F,ylim=c(0,1),type="s")
```

Thus, the cumulative distribution function is

$$F_X(x) = \begin{cases} 0 & \text{for } x < 0, \\ 0.41353 & \text{for } 0 \le x < 1, \\ 0.84941 & \text{for } 1 \le x < 2, \\ 0.98706 & \text{for } 2 \le x < 3, \\ 1 & \text{for } 3 \le x. \end{cases}$$

7.11. The cumulative distribution function for $Y$ is obtained by writing the event $\{Y \le y\}$ in terms of $X$ and evaluating $F_X$ at the corresponding value.

7.12. To verify the three properties for the distribution function:
1. Let $x_n \to -\infty$ be a decreasing sequence. Then

$$\{X \le x_1\} \supset \{X \le x_2\} \supset \cdots.$$

For each outcome $\omega$, eventually $X(\omega) > x_n$ for some $n$, and consequently no outcome $\omega$ is in all of the events $\{X \le x_n\}$, so

$$\bigcap_{n=1}^{\infty} \{X \le x_n\} = \emptyset.$$

Now, use the second continuity property of probabilities.

2. Let $x_n \to \infty$ be an increasing sequence. Then

$$\{X \le x_1\} \subset \{X \le x_2\} \subset \cdots.$$

For each outcome $\omega$, eventually $X(\omega) \le x_n$ for some $n$, and

$$\bigcup_{n=1}^{\infty} \{X \le x_n\} = \Omega.$$

Now, use the first continuity property of probabilities.

3. Let $x_1 < x_2$; then $\{X \le x_1\} \subset \{X \le x_2\}$, and by the monotonicity rule for probabilities,

$$P\{X \le x_1\} \le P\{X \le x_2\}, \quad \text{or, written in terms of the distribution function,} \quad F_X(x_1) \le F_X(x_2).$$

7.13. Let $x_n \to a$ be a strictly decreasing sequence. Then

$$\{X \le x_1\} \supset \{X \le x_2\} \supset \cdots, \qquad \bigcap_{n=1}^{\infty} \{X \le x_n\} = \{X \le a\}.$$

(Check this last equality.) Now, use the second continuity property of probabilities to obtain

$$\lim_{n \to \infty} F_X(x_n) = \lim_{n \to \infty} P\{X \le x_n\} = P\{X \le a\} = F_X(a).$$

Because this holds for every strictly decreasing sequence with limit $a$, we have that

$$\lim_{x \to a^+} F_X(x) = F_X(a).$$

7.15. Using the identity in (7.2), we have

$$P\left\{\frac{1}{3} < X \le \frac{2}{3}\right\} = F_X\!\left(\frac{2}{3}\right) - F_X\!\left(\frac{1}{3}\right) = \frac{4}{9} - \frac{1}{9} = \frac{3}{9} = \frac{1}{3}.$$

Check Exercise 7.22 to see that the answer does not depend on whether or not the endpoints of the interval are included.

7.17. Using the relation $Y = 1/X$, we find that the distribution function for $Y$ is

$$F_Y(y) = P\{Y \le y\} = P\{1/X \le y\} = P\{X \ge 1/y\} = 1 - P\{X < 1/y\} = 1 - F_X(1/y) = 1 - \frac{1}{y^2}$$

for $y \ge 1$. This uses the fact that $P\{X = 1/y\} = 0$.

7.18. We use the fact that the exponential function is increasing, and that $\lim_{x \to \infty} \exp(-\lambda x) = 0$. Using the numbering of the properties above:
1. Because $F_X(x) = 0$ for all $x < 0$, $\lim_{x \to -\infty} F_X(x) = 0$.
2. $\lim_{x \to \infty} \exp(-\lambda x) = 0$. Thus, $\lim_{x \to \infty} F_X(x) = \lim_{x \to \infty} (1 - \exp(-\lambda x)) = 1$.
3. For $x < 0$, $F_X$ is constant, $F_X(x) = 0$. For $x \ge 0$, note that $\exp(-\lambda x)$ is decreasing. Thus, $F_X(x) = 1 - \exp(-\lambda x)$ is increasing. Consequently, the distribution function $F_X$ is non-decreasing.

7.19. The distribution function has the graph shown in Figure 7.8.

Figure 7.8: Cumulative distribution function for an exponential random variable with $\lambda = 1/10$ and a jump at $t = 20$.

The formula:

$$F_T(t) = \begin{cases} 0 & \text{if } t < 0, \\ 1 - \exp(-t/10) & \text{if } 0 \le t < 20, \\ 1 & \text{if } 20 \le t. \end{cases}$$

7.22. For $r \ne 1$, write the expressions for $s_n$ and $r s_n$ and subtract. Notice that most of the terms cancel.

$$\begin{aligned} s_n &= c + cr + cr^2 + \cdots + cr^n \\ r s_n &= \phantom{c + {}} cr + cr^2 + \cdots + cr^n + cr^{n+1} \\ (1 - r) s_n &= c - cr^{n+1} = c(1 - r^{n+1}) \end{aligned}$$

Now divide by $1 - r$ to obtain the formula.

7.23. The event $\{X > b\}$ is the same as having the first $b + 1$ coin tosses turn up tails. Thus, the outcome is $b + 1$ independent events, each with probability $1 - p$. Consequently, $P\{X > b\} = (1-p)^{b+1}$.

7.25. $F(3) = 0.8024691$, $F(5) - F(2) = 0.8683128 - 0.5555556 = 0.3127572$, and $P\{X > 3\} = 1 - F(3) = 1 - 0.8024691 = 0.1975309$.

7.27. Let $f_X$ be the density. Then

$$P\{X = x_0\} \le P\{x_0 - \Delta x < X \le x_0\} = \int_{x_0 - \Delta x}^{x_0} f_X(x)\, dx.$$

Now the integral goes to 0 as $\Delta x \to 0$. So, we must have $P\{X = x_0\} = 0$.
7.28. Because the density is non-negative on the interval $[0, 1]$, $F_X(x) = 0$ if $x < 0$ and $F_X(x) = 1$ if $x \ge 1$. For $x$ between 0 and 1,

$$F_X(x) = \int_0^x \frac{1}{2\sqrt{t}}\, dt = \sqrt{t}\,\Big|_0^x = \sqrt{x}.$$

Thus,

$$F_X(x) = \begin{cases} 0 & \text{if } x < 0, \\ \sqrt{x} & \text{if } 0 \le x < 1, \\ 1 & \text{if } 1 \le x. \end{cases}$$

7.31. The random variable $Y = aX$ has distribution function

$$F_Y(y) = P\{Y \le y\} = P\{aX \le y\}.$$

For $a > 0$,

$$F_Y(y) = P\{X \le y/a\} = F_X\!\left(\frac{y}{a}\right).$$

Now take a derivative and use the chain rule to find the density

$$f_Y(y) = F_Y'(y) = \frac{1}{a} f_X\!\left(\frac{y}{a}\right) = \frac{1}{|a|} f_X\!\left(\frac{y}{a}\right).$$

For $a < 0$,

$$F_Y(y) = P\{X \ge y/a\} = 1 - F_X\!\left(\frac{y}{a}\right).$$

Now the derivative

$$f_Y(y) = F_Y'(y) = -\frac{1}{a} f_X\!\left(\frac{y}{a}\right) = \frac{1}{|a|} f_X\!\left(\frac{y}{a}\right).$$

7.32. The joint density (mass function) for independent random variables $X_1, X_2, \ldots, X_n$,

$$f_{X_1, X_2, \ldots, X_n}(x_1, x_2, \ldots, x_n) = f_{X_1}(x_1)\, f_{X_2}(x_2) \cdots f_{X_n}(x_n),$$

is the product of the marginal densities (mass functions).

7.35. Here is the R code.

```
> x<-c(2:12)
> x
 [1]  2  3  4  5  6  7  8  9 10 11 12
> f<-c(1,2,3,4,5,6,5,4,3,2,1)/36
> sum(f)
[1] 1
> (twodice<-sample(x,20,replace=TRUE,prob=f))
 [1]  9  7  3  9  3  6  9  5  5  5  5 10 10 12  9  8  6  8 11  8
> twodice<-sample(x,1000,replace=TRUE,prob=f)
> counts<-rep(0,max(x)-min(x)+1)
> for (i in min(x):max(x)){counts[i]<-length(twodice[twodice==i])}
> freq=counts/(sum(counts))
> data.frame(x,f,freq[min(x):max(x)])
    x          f freq.min.x..max.x..
1   2 0.02777778               0.031
2   3 0.05555556               0.054
```
```
3   4 0.08333333               0.065
4   5 0.11111111               0.096
5   6 0.13888889               0.120
6   7 0.16666667               0.167
7   8 0.13888889               0.157
8   9 0.11111111               0.121
9  10 0.08333333               0.098
10 11 0.05555556               0.058
11 12 0.02777778               0.033
```

We also have a plot to compare the empirical cumulative distribution function from the simulation with the cumulative distribution function.

```
> plot(sort(twodice),1:length(twodice)/length(twodice),type="s",xlim=c(2,12),
  ylim=c(0,1),xlab="",ylab="")
> par(new=TRUE)
> plot(x,F,type="s",xlim=c(2,12),ylim=c(0,1),col="red")
```

Figure 7.9: Sum on two fair dice. The empirical cumulative distribution function from the simulation (in black) and the cumulative distribution function (in red) are shown for Exercise 7.35.

7.39. $F_X$ is increasing and continuous, so the set $\{x;\ F_X(x) \le u\}$ is the interval $(-\infty, F_X^{-1}(u)]$. In addition, $x$ is in this interval precisely when $x \le F_X^{-1}(u)$.

7.40. Let's find $F_V(v)$. If $v < 0$, then $P\{V \le v\} = 0$ because $U$ is never greater than 1. Thus, $F_V(v) = 0$. Similarly, if $v \ge 1$, then $P\{V \le v\} = 1$ because $U$ is always greater than 0. Thus, $F_V(v) = 1$. For $0 \le v < 1$,

$$F_V(v) = P\{V \le v\} = P\{1 - U \le v\} = P\{U \ge 1 - v\} = 1 - P\{U < 1 - v\} = 1 - (1 - v) = v.$$

This matches the distribution function of a uniform random variable on $[0, 1]$.