
Lecture 10 Probability Distributions - PowerPoint Presentation

Uploaded On 2023-10-04



Presentation Transcript

1. Lecture 10: Probability Distributions
John Rundle, Econophysics, PHYS 255

2. Probability Distributions
Q: Why should we care about probability distributions? Why not just focus on the data?
A: Outliers. We want to know how probable the outliers of large market moves are, so we can control our exposure and risk.

3. Probability Distributions: Gaussian
https://en.wikipedia.org/wiki/Normal_distribution
[Figures: density function and cumulative distribution function]
In probability theory, the normal (or Gaussian) distribution is a very common continuous probability distribution. Normal distributions are important in statistics and are often used in the natural and social sciences to represent real-valued random variables whose distributions are not known.

4. Probability Distributions: Gaussian
https://en.wikipedia.org/wiki/Normal_distribution
The normal distribution is useful because of the central limit theorem. In its most general form, under some conditions (which include finite variance), the theorem states that averages of independently drawn random variables converge in distribution to the normal; that is, they become normally distributed when the number of random variables is sufficiently large. Physical quantities that are expected to be the sum of many independent processes (such as measurement errors) often have distributions that are nearly normal.
Moreover, many results and methods (such as propagation of uncertainty and least-squares parameter fitting) can be derived analytically in explicit form when the relevant variables are normally distributed.
Example: Over a large number of Binomial trials, sums of the random deviates converge to a normal distribution as the number of deviates in each sum increases.
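The Binomial example above can be checked numerically. A minimal, stdlib-only sketch (the values of n, p, and the number of samples are illustrative choices, not from the slides): sums of many independent Bernoulli trials have a mean and spread matching the Binomial values np and sqrt(np(1-p)), and their histogram is approximately normal.

```python
import math
import random
import statistics

# Sketch of the CLT claim: sums of many independent Bernoulli trials
# (i.e. Binomial deviates) are approximately normally distributed.
random.seed(0)  # seeded for reproducibility
n, p = 400, 0.3
sums = [sum(random.random() < p for _ in range(n)) for _ in range(2000)]

mean = statistics.fmean(sums)
std = statistics.stdev(sums)
# Compare with the Binomial mean np and standard deviation sqrt(np(1-p))
print(mean, n * p)
print(std, math.sqrt(n * p * (1 - p)))
```

With these parameters the sample mean lands close to 120 and the sample standard deviation close to sqrt(84) ≈ 9.2, as the CLT predicts.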

5. Probability Distributions: Gaussian
https://en.wikipedia.org/wiki/Normal_distribution

6. Standard Normal Distribution
https://en.wikipedia.org/wiki/Normal_distribution

7. Gaussian Distribution: CDF
https://en.wikipedia.org/wiki/Normal_distribution

8. Gaussian Distribution: CDF
https://en.wikipedia.org/wiki/Normal_distribution
The CDF can be written in terms of the complementary error function, erfc.
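The erfc relation for the Gaussian CDF is Phi(x) = (1/2) erfc(-x / sqrt(2)), which the Python standard library supports directly. A short sketch (the evaluation points are illustrative):

```python
import math

# Gaussian CDF via the complementary error function:
# Phi(x) = 0.5 * erfc(-(x - mu) / (sigma * sqrt(2)))
def normal_cdf(x, mu=0.0, sigma=1.0):
    return 0.5 * math.erfc(-(x - mu) / (sigma * math.sqrt(2)))

print(normal_cdf(0.0))   # 0.5 by symmetry
print(normal_cdf(1.96))  # ~0.975, the familiar two-sided 95% point
```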

9. Confidence Intervals
https://en.wikipedia.org/wiki/Normal_distribution

10. Central Limit Theorem
https://en.wikipedia.org/wiki/Normal_distribution

11. Quantile Definition
https://en.wikipedia.org/wiki/Quantile
In statistics and probability, quantiles are cut points dividing the range of a probability distribution into continuous intervals with equal probabilities, or dividing the observations in a sample in the same way. There is one fewer quantile than the number of groups created. Thus quartiles are the three cut points that divide a dataset into four equal-sized groups.
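The "three cut points" definition can be illustrated with the standard library's `statistics.quantiles`, which with n=4 returns exactly the three quartile cut points (the sample data below is illustrative):

```python
import statistics

# Quartiles: the three cut points dividing the data into four groups.
data = [2, 4, 4, 5, 7, 9, 11, 12, 15, 18, 20, 21]
q1, q2, q3 = statistics.quantiles(data, n=4)  # default 'exclusive' method
print(q1, q2, q3)
```

Note that `statistics.quantiles` supports both 'exclusive' and 'inclusive' interpolation conventions; different textbooks define sample quartiles slightly differently.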

12. Summary of Normal Distribution
https://en.wikipedia.org/wiki/Normal_distribution

13. Log-Normal Distribution
https://en.wikipedia.org/wiki/Log-normal_distribution
In probability theory, a log-normal (or lognormal) distribution is a continuous probability distribution of a random variable whose logarithm is normally distributed. Thus, if X is log-normally distributed, then Y = ln(X) has a normal distribution. Likewise, if Y has a normal distribution, then X = exp(Y) has a log-normal distribution. A log-normally distributed random variable takes only positive real values.
Whereas the normal distribution describes a sum of random variables, the log-normal distribution describes a product of random variables.
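The X = exp(Y) relationship can be demonstrated directly: exponentiating normal draws produces log-normal draws, which are strictly positive with median exp(mu). A stdlib-only sketch (mu, sigma, and the sample size are illustrative):

```python
import math
import random
import statistics

random.seed(1)  # seeded for reproducibility
mu, sigma = 0.0, 0.5
ys = [random.gauss(mu, sigma) for _ in range(5000)]  # Y ~ Normal(mu, sigma)
xs = [math.exp(y) for y in ys]                       # X = exp(Y) ~ LogNormal

print(min(xs) > 0)                         # log-normal values are all positive
print(statistics.median(xs), math.exp(mu)) # sample median near exp(mu) = 1
```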

14. Summary of Log-Normal Distribution
https://en.wikipedia.org/wiki/Log-normal_distribution

15. Log-Normal Distribution
https://en.wikipedia.org/wiki/Log-normal_distribution

16. Application of the Log-Normal Distribution to Stock Prices
Because of compounding, the change in price after N days (in %) will be a product of N factors.
However, in real markets there are non-random factors that lead to non-random (persistent) changes in stock prices.
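The compounding point can be sketched in a few lines: the N-day gross return is the product of the daily factors, which is equivalent to exponentiating the sum of the log returns — the sum-of-logs view is what connects stock prices to the log-normal distribution. The daily returns below are made-up illustrative values:

```python
import math

# Illustrative daily percentage moves (not real market data)
daily_returns = [0.01, -0.02, 0.015, 0.005, -0.01]
factors = [1 + r for r in daily_returns]

# N-day gross return two ways: product of factors, and exp of summed logs
product = math.prod(factors)
via_logs = math.exp(sum(math.log(f) for f in factors))
print(product, via_logs)  # the two agree to floating-point precision
```

If the daily log returns are (approximately) independent normal deviates, their sum is normal, so the gross return — its exponential — is log-normal.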

17. Student's t-Distribution
https://en.wikipedia.org/wiki/Student's_t-distribution
Student's t-distribution (or simply the t-distribution) is any member of a family of continuous probability distributions that arises when estimating the mean of a normally distributed population in situations where the sample size is small and the population standard deviation is unknown.
Whereas a normal distribution describes a full population of deviates, t-distributions describe samples drawn from the full population. Accordingly, the t-distribution is different for each sample size, and the larger the sample, the more the distribution resembles a normal distribution.
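The convergence to the normal as the sample size grows can be checked numerically from the standard t-density formula; at t = 0 the density is Gamma((nu+1)/2) / (sqrt(nu*pi) * Gamma(nu/2)), which approaches the normal peak 1/sqrt(2*pi) as the degrees of freedom nu increase. A stdlib-only sketch (the nu values are illustrative):

```python
import math

# Student's t density evaluated at 0, from the standard closed form.
def t_pdf_at_zero(nu):
    return math.gamma((nu + 1) / 2) / (math.sqrt(nu * math.pi) * math.gamma(nu / 2))

normal_at_zero = 1 / math.sqrt(2 * math.pi)  # ~0.3989
for nu in (1, 5, 30, 200):
    print(nu, t_pdf_at_zero(nu))  # climbs toward the normal peak as nu grows
```

For nu = 1 the t-distribution is the Cauchy distribution, so the value is 1/pi ≈ 0.3183; by nu = 200 it is within about 0.001 of the normal peak.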

18. Student's t-Distribution
https://en.wikipedia.org/wiki/Student's_t-distribution

19. Student's t-Distribution
https://en.wikipedia.org/wiki/Student's_t-distribution

20. Lorentz (Cauchy) Distribution
https://en.wikipedia.org/wiki/Cauchy_distribution
The Cauchy distribution, named after Augustin Cauchy, is a continuous probability distribution. It is also known, especially among physicists, as the Lorentz distribution (after Hendrik Lorentz), the Cauchy–Lorentz distribution, the Lorentz(ian) function, or the Breit–Wigner distribution.
The Cauchy distribution is often used in statistics as the canonical example of a "pathological" distribution, since both its mean and its variance are undefined.
Its importance in physics is the result of it being the solution to the differential equation describing forced resonance.
In spectroscopy, it describes the shape of spectral lines subject to homogeneous broadening, in which all atoms interact in the same way with the frequency range contained in the line shape.
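Although its mean and variance are undefined, the Cauchy distribution has a simple closed-form CDF and quantile function, F(x) = 1/2 + arctan((x - x0)/gamma)/pi and its inverse. A stdlib-only sketch (x0 and gamma defaults are the standard location and scale parameters; evaluation points are illustrative):

```python
import math

# Cauchy CDF: F(x) = 0.5 + arctan((x - x0) / gamma) / pi
def cauchy_cdf(x, x0=0.0, gamma=1.0):
    return 0.5 + math.atan((x - x0) / gamma) / math.pi

# Its inverse (quantile function): x0 + gamma * tan(pi * (p - 0.5))
def cauchy_quantile(p, x0=0.0, gamma=1.0):
    return x0 + gamma * math.tan(math.pi * (p - 0.5))

print(cauchy_cdf(0.0))        # 0.5: the median is x0 (well-defined)
print(cauchy_quantile(0.75))  # ~1.0: upper quartile at x0 + gamma
```

The median and quartiles exist even though the mean does not — which is why robust, quantile-based statistics are the natural summaries for Cauchy-like heavy-tailed data.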

21. Lorentz (Cauchy) Density Function
https://en.wikipedia.org/wiki/Cauchy_distribution

22. Lorentz (Cauchy) Distribution
https://en.wikipedia.org/wiki/Cauchy_distribution
[Figures: density function and cumulative distribution function]

23. Lorentz (Cauchy) Distribution: CDF
https://en.wikipedia.org/wiki/Cauchy_distribution

24. Summary of Lorentz Distribution
https://en.wikipedia.org/wiki/Cauchy_distribution

25. Poisson Distribution
https://en.wikipedia.org/wiki/Poisson_distribution
In probability theory and statistics, the Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time and/or space, if these events occur with a known average rate and independently of the time since the last event.
The Poisson distribution can also be used for the number of events in other specified intervals such as distance, area, or volume.
The Poisson distribution is a 1-parameter distribution.
The time intervals between arrivals in a Poisson process are exponentially distributed, with rate equal to the Poisson rate.
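The single parameter is the rate lambda, and the pmf is P(k) = lambda^k e^(-lambda) / k!. A stdlib-only sketch verifying that the pmf normalizes to 1 and has mean lambda (the rate value is illustrative):

```python
import math

# Poisson pmf: P(k) = lam^k * exp(-lam) / k!
def poisson_pmf(k, lam):
    return lam ** k * math.exp(-lam) / math.factorial(k)

lam = 3.0
probs = [poisson_pmf(k, lam) for k in range(100)]  # tail beyond 100 is negligible

print(sum(probs))                                # normalizes to ~1
print(sum(k * p for k, p in enumerate(probs)))   # mean equals lam
```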

26. Poisson Distribution
https://en.wikipedia.org/wiki/Poisson_distribution

27. Poisson Distribution
https://en.wikipedia.org/wiki/Poisson_distribution
A 1-parameter distribution

28. Poisson Distribution
https://en.wikipedia.org/wiki/Poisson_distribution

29. [figure-only slide; no transcript text]

30. Exponential Distribution
https://en.wikipedia.org/wiki/Exponential_distribution
The exponential distribution is the probability distribution that describes the time between events in a Poisson process, i.e., a process in which events occur continuously and independently at a constant average rate. It is a particular case of the gamma distribution. It is the continuous analogue of the geometric distribution, and it has the key property of having no memory of prior events. In addition to being used for the analysis of Poisson processes, it is found in various other contexts.
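The memoryless property can be verified from the survival function S(t) = exp(-lambda*t): the probability of surviving an additional time t, given survival to time s, equals the unconditional probability of surviving time t. A short sketch (lambda, s, and t are illustrative):

```python
import math

lam = 0.5                                # illustrative rate
def survival(t):
    """P(T > t) for an exponential with rate lam."""
    return math.exp(-lam * t)

s, t = 2.0, 3.0
conditional = survival(s + t) / survival(s)  # P(T > s+t | T > s)
print(conditional, survival(t))              # equal: no memory of prior events
```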

31. Exponential Distribution
https://en.wikipedia.org/wiki/Exponential_distribution

32. Exponential Distribution
https://en.wikipedia.org/wiki/Exponential_distribution

33. [figure-only slide; no transcript text]

34. Binomial Distribution
https://en.wikipedia.org/wiki/Binomial_distribution
The binomial distribution with parameters n and p is the discrete probability distribution of the number of successes in a sequence of n independent yes/no experiments, each of which yields success with probability p. Example: coin flips.
A success/failure experiment is also called a Bernoulli experiment or Bernoulli trial. When n = 1, the binomial distribution is a Bernoulli distribution.
The binomial distribution is the basis for the popular binomial test of statistical significance.
The binomial distribution is frequently used to model the number of successes in a sample of size n drawn with replacement from a population of size N.
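The pmf is P(X = k) = C(n, k) p^k (1-p)^(n-k), and setting n = 1 recovers the Bernoulli distribution, as noted above. A stdlib-only sketch (the n and p values are illustrative):

```python
import math

# Binomial pmf: P(X = k) = C(n, k) * p^k * (1-p)^(n-k)
def binomial_pmf(k, n, p):
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

# n = 1 reduces to a single Bernoulli trial with success probability p
print(binomial_pmf(1, 1, 0.3))   # 0.3
print(binomial_pmf(0, 1, 0.3))   # 0.7

# The pmf sums to 1 over k = 0..n
print(sum(binomial_pmf(k, 10, 0.5) for k in range(11)))
```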

35. Binomial Distribution
https://en.wikipedia.org/wiki/Binomial_distribution

36. Binomial Distribution
https://en.wikipedia.org/wiki/Binomial_distribution

37. Binomial Distribution
https://en.wikipedia.org/wiki/Binomial_distribution

38. Weibull Distribution
https://en.wikipedia.org/wiki/Weibull_distribution

39. Weibull Distribution
https://en.wikipedia.org/wiki/Weibull_distribution
If the quantity X is a "time to failure", the Weibull distribution gives a distribution for which the failure rate is proportional to a power of time.
A value of k < 1 indicates that the failure rate decreases over time. This happens if there is significant "infant mortality", or defective items failing early.
A value of k = 1 indicates that the failure rate is constant over time. This might suggest that random external events are causing mortality, or failure. In this case the Weibull distribution reduces to an exponential distribution.
A value of k > 1 indicates that the failure rate increases with time. This happens if there is an "aging" process, or parts that are more likely to fail as time goes on.
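The three regimes follow from the standard Weibull hazard (failure-rate) function h(t) = (k/lambda) * (t/lambda)^(k-1), where lambda is the scale parameter. A stdlib-only sketch (the k values and evaluation times are illustrative):

```python
# Weibull hazard rate: h(t) = (k / lam) * (t / lam)^(k - 1)
def hazard(t, k, lam=1.0):
    return (k / lam) * (t / lam) ** (k - 1)

# k < 1: decreasing hazard ("infant mortality")
# k = 1: constant hazard (reduces to the exponential distribution)
# k > 1: increasing hazard ("aging")
for k in (0.5, 1.0, 2.0):
    print(k, [round(hazard(t, k), 3) for t in (0.5, 1.0, 2.0)])
```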

40. Weibull Distribution
https://en.wikipedia.org/wiki/Weibull_distribution

41. Pareto Distribution
https://en.wikipedia.org/wiki/Pareto_distribution
The Pareto distribution, named after the Italian civil engineer, economist, and sociologist Vilfredo Pareto, is a power-law probability distribution used in the description of social, scientific, geophysical, actuarial, and many other types of observable phenomena.
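The power-law character shows up most clearly in the tail (survival) function, P(X > x) = (xm / x)^alpha for x >= xm: multiplying x by 10 divides the tail probability by a fixed factor 10^alpha, regardless of where you start. This is exactly the heavy-tail behavior relevant to the outlier question on slide 2. A short sketch (xm and alpha are illustrative):

```python
# Pareto tail (survival) function: P(X > x) = (xm / x)^alpha, for x >= xm
def pareto_tail(x, xm=1.0, alpha=2.0):
    return (xm / x) ** alpha

# Scale-free tail: each factor of 10 in x costs the same factor 10^alpha
print(pareto_tail(10.0) / pareto_tail(100.0))    # ~100 when alpha = 2
print(pareto_tail(100.0) / pareto_tail(1000.0))  # same ratio again
```

Contrast with the Gaussian, whose tail falls off like exp(-x^2/2): Pareto tails make large "outlier" moves orders of magnitude more probable.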

42. Pareto Distribution
https://en.wikipedia.org/wiki/Pareto_distribution

43. Pareto Distribution
https://en.wikipedia.org/wiki/Pareto_distribution

44. Pareto Distribution
https://en.wikipedia.org/wiki/Pareto_distribution