Relationships among probability distributions
In probability theory and statistics, there are several relationships among probability distributions. These relations can be categorized into the following groups:
- One distribution is a special case of another with a broader parameter space;
- Transforms (function of a random variable);
- Combinations (function of several variables);
- Approximation (limit) relationships;
- Compound relationships (useful for Bayesian inference);
- Duality;
- Conjugate priors.
Special case of distribution parametrization
- A binomial (n, p) random variable with n = 1 is a Bernoulli (p) random variable.
- A negative binomial distribution with r = 1 is a geometric distribution.
- A gamma distribution with shape parameter α = 1 and scale parameter β is an exponential (β) distribution.
- A gamma (α, β) random variable with α = ν/2 and β = 2 is a chi-squared random variable with ν degrees of freedom.
- A chi-squared distribution with 2 degrees of freedom is an exponential distribution with mean 2 and vice versa.
- A Weibull (1, β) random variable is an exponential random variable with mean β.
- A beta random variable with parameters α = β = 1 is a uniform random variable.
- A beta-binomial (n, 1, 1) random variable is a discrete uniform random variable over the values 0 ... n.
- A random variable with a t distribution with one degree of freedom is a Cauchy (0, 1) random variable (several of the identities in this list are verified numerically in the sketch below).
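Several of these identities can be verified numerically by comparing cumulative distribution functions. The following is a minimal sketch, assuming SciPy is available; the grid and parameter values are arbitrary choices for illustration.

```python
# Sketch: numerical check of three special cases listed above, by comparing CDFs
# on a grid of points.
import numpy as np
from scipy import stats

x = np.linspace(0.01, 10, 200)
beta = 2.0

# Gamma with shape 1 and scale beta equals the exponential with mean beta.
print(np.allclose(stats.gamma(a=1, scale=beta).cdf(x), stats.expon(scale=beta).cdf(x)))

# Chi-squared with 2 degrees of freedom equals the exponential with mean 2.
print(np.allclose(stats.chi2(df=2).cdf(x), stats.expon(scale=2).cdf(x)))

# Student's t with one degree of freedom equals the Cauchy(0, 1) distribution.
t_grid = np.linspace(-10, 10, 200)
print(np.allclose(stats.t(df=1).cdf(t_grid), stats.cauchy(loc=0, scale=1).cdf(t_grid)))
```

All three comparisons print True.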
Transform of a variable
Multiple of a random variable
Multiplying the variable by any positive real constant yields a scaling of the original distribution. Some distributions are self-replicating, meaning that the scaled variable belongs to the same family of distributions, albeit with a different parameter: Normal distribution, Gamma distribution, Cauchy distribution, Exponential distribution, Erlang distribution, Weibull distribution, Logistic distribution, Error distribution, Power distribution, Rayleigh distribution.
Example:
- If X is a gamma random variable with shape r and rate λ, then Y = aX is a gamma random variable with shape r and rate λ/a.
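A minimal Monte Carlo sketch of this scaling rule, assuming the rate parametrization used in the example (NumPy's gamma sampler takes a scale argument equal to 1/λ); the values of r, λ and a are arbitrary.

```python
# Sketch: scaling a gamma(shape=r, rate=lam) variable by a > 0 gives a
# gamma(shape=r, rate=lam/a) variable.
import numpy as np

rng = np.random.default_rng(seed=1)
r, lam, a = 3.0, 2.0, 5.0
n = 200_000

x = rng.gamma(shape=r, scale=1.0 / lam, size=n)        # X ~ gamma(r, rate lam)
y = a * x                                              # Y = aX
y_ref = rng.gamma(shape=r, scale=a / lam, size=n)      # gamma(r, rate lam/a), sampled directly

# The two samples should have matching moments, up to Monte Carlo error.
print(y.mean(), y_ref.mean())   # both close to r*a/lam = 7.5
print(y.var(), y_ref.var())     # both close to r*(a/lam)**2 = 18.75
```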
Linear function of a random variable
The affine transform ax + b yields a relocation and scaling of the original distribution. The following are self-replicating: Normal distribution, Cauchy distribution, Logistic distribution, Error distribution, Power distribution, Rayleigh distribution.
Example:
- If Z is a normal random variable with parameters (μ = m, σ² = s²), then X = aZ + b is a normal random variable with parameters (μ = am + b, σ² = a²s²).
Reciprocal of a random variable
The reciprocal 1/X of a random variable X is a member of the same family of distributions as X in the following cases: Cauchy distribution, F distribution, log-logistic distribution.
Examples:
- If X is a Cauchy (μ, σ) random variable, then 1/X is a Cauchy (μ/C, σ/C) random variable, where C = μ² + σ².
- If X is an F(ν1, ν2) random variable, then 1/X is an F(ν2, ν1) random variable (both reciprocal relations are checked numerically in the sketch below).
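Both reciprocal relations can be checked by simulation; the sketch below is illustrative only (SciPy assumed, arbitrary parameter values) and uses a Kolmogorov–Smirnov test to compare each transformed sample with the stated distribution.

```python
# Sketch: Kolmogorov-Smirnov checks of the reciprocal relations above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=2)
n = 100_000

# Reciprocal of a Cauchy(mu, sigma) variable is Cauchy(mu/C, sigma/C), C = mu^2 + sigma^2.
mu, sigma = 1.0, 2.0
c = mu**2 + sigma**2
x = stats.cauchy(loc=mu, scale=sigma).rvs(size=n, random_state=rng)
print(stats.kstest(1.0 / x, stats.cauchy(loc=mu / c, scale=sigma / c).cdf))

# Reciprocal of an F(nu1, nu2) variable is F(nu2, nu1).
nu1, nu2 = 5, 8
y = stats.f(dfn=nu1, dfd=nu2).rvs(size=n, random_state=rng)
print(stats.kstest(1.0 / y, stats.f(dfn=nu2, dfd=nu1).cdf))
```

Large p-values in both tests are consistent with the stated relations.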
Other cases
Some distributions are invariant under a specific transformation.
Example:
- If X is a beta (α, β) random variable then (1 - X) is a beta (β, α) random variable.
- If X is a binomial (n, p) random variable then (n - X) is a binomial (n, 1-p) random variable.
- If X has a continuous cumulative distribution function FX, then FX(X) is a standard uniform (0, 1) random variable (this probability integral transform is checked numerically in the sketch below).
- If X is a normal (μ, σ²) random variable, then e^X is a lognormal (μ, σ²) random variable.
- Conversely, if X is a lognormal (μ, σ²) random variable, then log X is a normal (μ, σ²) random variable.
- If X is an exponential random variable with mean β, then X^(1/γ) is a Weibull (γ, β) random variable.
- The square of a standard normal random variable has a chi-squared distribution with one degree of freedom.
- If X is a Student's t random variable with ν degrees of freedom, then X² is an F (1, ν) random variable.
- If X is a double exponential random variable with mean 0 and scale λ, then |X| is an exponential random variable with mean λ.
- A geometric random variable is the floor of an exponential random variable.
- A rectangular random variable is the floor of a uniform random variable.
- A reciprocal random variable is the exponential of a uniform random variable.
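Two of the transformations above lend themselves to a quick numerical check. The sketch below is illustrative only; it assumes SciPy and uses Kolmogorov–Smirnov tests with arbitrary parameter choices.

```python
# Sketch: checks of the probability integral transform and the normal-to-lognormal
# transformation listed above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=3)
n = 100_000

# If X has a continuous CDF F_X, then F_X(X) ~ Uniform(0, 1).
x = rng.exponential(scale=3.0, size=n)            # X ~ exponential with mean 3
u = stats.expon(scale=3.0).cdf(x)                 # F_X(X)
print(stats.kstest(u, "uniform"))                 # large p-value: consistent with U(0, 1)

# If X ~ normal(mu, sigma^2), then exp(X) is lognormal(mu, sigma^2).
mu, sigma = 0.5, 1.2
z = rng.normal(loc=mu, scale=sigma, size=n)
# SciPy parametrizes the lognormal by s = sigma and scale = exp(mu).
print(stats.kstest(np.exp(z), stats.lognorm(s=sigma, scale=np.exp(mu)).cdf))
```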
Functions of several variables
Sum of variables
The distribution of the sum of independent random variables is the convolution of their distributions. If the sum has a distribution from the same family as the original variables, that family is said to be closed under convolution.
Examples of such univariate distributions are: Normal distribution, Poisson distribution, Binomial distribution (with common success probability), Negative binomial distribution (with common success probability), Gamma distribution (with common rate parameter), Chi-squared distribution, Cauchy distribution, Hyper-exponential distribution.
Examples:[2]
- If X1 and X2 are Poisson random variables with means μ1 and μ2 respectively, then X1 + X2 is a Poisson random variable with mean μ1 + μ2.
- The sum of independent gamma (ni, β) random variables has a gamma (Σni, β) distribution.
- If X1 is a Cauchy (μ1, σ1) random variable and X2 is a Cauchy (μ2, σ2), then X1 + X2 is a Cauchy (μ1 + μ2, σ1 + σ2) random variable.
- If X1 and X2 are chi-squared random variables with ν1 and ν2 degrees of freedom respectively, then X1 + X2 is a chi-squared random variable with ν1 + ν2 degrees of freedom.
- If X1 is a normal (μ1, σ1²) random variable and X2 is a normal (μ2, σ2²) random variable, then X1 + X2 is a normal (μ1 + μ2, σ1² + σ2²) random variable.
- The sum of N chi-squared (1) random variables has a chi-squared distribution with N degrees of freedom.
Other distributions are not closed under convolution, but their sum has a known distribution:
- The sum of n Bernoulli (p) random variables is a binomial (n, p) random variable.
- The sum of n geometric random variables with probability of success p is a negative binomial random variable with parameters n and p.
- The sum of n exponential (β) random variables is a gamma (n, β) random variable.
- The sum of the squares of N standard normal random variables has a chi-squared distribution with N degrees of freedom (this relation and the exponential-to-gamma sum above are checked numerically in the sketch below).
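As an illustrative check of these last two sums (assuming SciPy; the sample sizes and parameter values are arbitrary):

```python
# Sketch: a sum of exponentials gives a gamma variable; a sum of squared standard
# normals gives a chi-squared variable.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=4)
m, n, beta = 100_000, 5, 2.0

# Sum of n exponential(mean beta) variables ~ gamma(n, beta), with beta the scale.
expo_sum = rng.exponential(scale=beta, size=(m, n)).sum(axis=1)
print(stats.kstest(expo_sum, stats.gamma(a=n, scale=beta).cdf))

# Sum of squares of N standard normals ~ chi-squared(N).
N = 4
sq_sum = (rng.normal(size=(m, N)) ** 2).sum(axis=1)
print(stats.kstest(sq_sum, stats.chi2(df=N).cdf))
```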
Product of variables
The product of independent random variables X and Y may belong to the same family of distributions as X and Y; this holds for the Bernoulli distribution and the log-normal distribution.
Example:
- If X1 and X2 are independent log-normal random variables with parameters (μ1, σ1²) and (μ2, σ2²) respectively, then X1 X2 is a log-normal random variable with parameters (μ1 + μ2, σ1² + σ2²).
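A brief simulation of this product rule (illustrative only; NumPy's lognormal sampler is parametrized by the mean and standard deviation of the underlying normal):

```python
# Sketch: the log of a product of independent log-normal variables is normal with
# the means and variances added.
import numpy as np

rng = np.random.default_rng(seed=5)
n = 200_000
mu1, s1, mu2, s2 = 0.3, 0.8, -0.5, 0.6

x1 = rng.lognormal(mean=mu1, sigma=s1, size=n)
x2 = rng.lognormal(mean=mu2, sigma=s2, size=n)
log_prod = np.log(x1 * x2)

print(log_prod.mean(), mu1 + mu2)        # both close to -0.2
print(log_prod.var(), s1**2 + s2**2)     # both close to 1.0
```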
Minimum and maximum of independent random variables
For some distributions, the minimum value of several independent random variables is a member of the same family, with different parameters: Bernoulli distribution, Geometric distribution, Exponential distribution, Extreme value distribution, Pareto distribution, Rayleigh distribution, Weibull distribution.
Examples:
- If X1 and X2 are independent geometric random variables with probability of success p1 and p2 respectively, then min(X1, X2) is a geometric random variable with probability of success p = p1 + p2 - p1 p2. The relationship is simpler if expressed in terms of the probability of failure: q = q1 q2 (see the sketch below).
- If X1 and X2 are independent exponential random variables with rate μ1 and μ2 respectively, then min(X1, X2) is an exponential random variable with rate μ = μ1 + μ2.
Similarly, distributions for which the maximum value of several independent random variables is a member of the same family of distribution include: Bernoulli distribution, Power law distribution.
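A minimal simulation sketch of the two minimum relations above (illustrative only; note that NumPy's geometric sampler counts the number of trials up to and including the first success, so a geometric(p) variable has mean 1/p under that convention):

```python
# Sketch: the minimum of independent exponentials has the rates added; the minimum
# of independent geometrics has failure probability q = q1*q2.
import numpy as np

rng = np.random.default_rng(seed=6)
n = 200_000

# Exponentials with rates mu1 and mu2: the minimum has rate mu1 + mu2.
mu1, mu2 = 0.5, 1.5
e_min = np.minimum(rng.exponential(scale=1 / mu1, size=n),
                   rng.exponential(scale=1 / mu2, size=n))
print(e_min.mean(), 1 / (mu1 + mu2))     # both close to 0.5

# Geometrics with success probabilities p1 and p2: the minimum is geometric with
# p = p1 + p2 - p1*p2, i.e. q = q1*q2.
p1, p2 = 0.2, 0.3
g_min = np.minimum(rng.geometric(p=p1, size=n), rng.geometric(p=p2, size=n))
p = p1 + p2 - p1 * p2
print(g_min.mean(), 1 / p)               # both close to 2.27
```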
Other
- If X and Y are independent standard normal random variables, X/Y is a Cauchy (0,1) random variable.
- If X1 and X2 are chi-squared random variables with ν1 and ν2 degrees of freedom respectively, then (X1/ν1)/(X2/ν2) is an F(ν1, ν2) random variable.
- If X is a standard normal random variable and U is an independent chi-squared random variable with ν degrees of freedom, then X/√(U/ν) is a Student's t (ν) random variable.
- If X1 is a gamma (α1, 1) random variable and X2 is a gamma (α2, 1) random variable, then X1/(X1 + X2) is a beta (α1, α2) random variable. More generally, if X1 is a gamma (α1, β1) random variable and X2 is a gamma (α2, β2) random variable, then β2 X1/(β2 X1 + β1 X2) is a beta (α1, α2) random variable (the t and beta constructions are checked numerically in the sketch below).
- If X and Y are independent exponential random variables with mean μ, then X - Y is a double exponential random variable with mean 0 and scale μ.
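The t and beta constructions above can be checked by simulation; the following sketch is illustrative only (SciPy assumed, arbitrary parameter values).

```python
# Sketch: Kolmogorov-Smirnov checks of the Student's t construction and of the
# beta variable built from two independent gammas.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=7)
n, nu = 200_000, 7

# X / sqrt(U/nu), with X standard normal and U chi-squared(nu), is Student's t(nu).
x = rng.normal(size=n)
u = rng.chisquare(df=nu, size=n)
print(stats.kstest(x / np.sqrt(u / nu), stats.t(df=nu).cdf))

# X1/(X1 + X2), with X1 ~ gamma(a1, 1) and X2 ~ gamma(a2, 1), is beta(a1, a2).
a1, a2 = 2.5, 4.0
x1 = rng.gamma(shape=a1, scale=1.0, size=n)
x2 = rng.gamma(shape=a2, scale=1.0, size=n)
print(stats.kstest(x1 / (x1 + x2), stats.beta(a=a1, b=a2).cdf))
```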
Approximate (limit) relationships
An approximate or limit relationship means
- either that the combination of an infinite number of iid random variables tends to some distribution,
- or that the distribution approaches a different distribution as a parameter tends to some limiting value.
Combination of iid random variables:
- Given certain conditions, the sum (hence the average) of a sufficiently large number of iid random variables, each with finite mean and variance, will be approximately normally distributed (this is the central limit theorem, CLT).
Limits of parametrized distributions:
- If X is a hypergeometric (m, N, n) random variable, where N and m are large compared to n and p = m/N is not close to 0 or 1, then X approximately has a binomial (n, p) distribution.
- If X is a beta-binomial random variable with parameters (n, α, β), let p = α/(α + β); if α + β is large, then X approximately has a binomial (n, p) distribution.
- If X is a binomial (n, p) random variable and if n is large and np is small then X approximately has a Poisson(np) distribution.
- If X is a negative binomial random variable with r large, p near 1, and r(1 - p) = λ, then X approximately has a Poisson distribution with mean λ.
Consequences of the CLT:
- If X is a Poisson random variable with large mean, then for integers j and k, P(j ≤ X ≤ k) approximately equals P(j - 1/2 ≤ Y ≤ k + 1/2), where Y is a normal random variable with the same mean and variance as X.
- If X is a binomial (n, p) random variable with large n and np, then for integers j and k, P(j ≤ X ≤ k) approximately equals P(j - 1/2 ≤ Y ≤ k + 1/2), where Y is a normal random variable with the same mean and variance as X, i.e. np and np(1 - p) (this continuity-corrected approximation is illustrated in the sketch after this list).
- If X is a beta random variable with parameters α and β equal and large, then X approximately has a normal distribution with the same mean and variance, i.e. mean α/(α + β) and variance αβ/((α + β)²(α + β + 1)).
- If X is a gamma (α, β) random variable with the shape parameter α large, then X approximately has a normal distribution with the same mean and variance.
- If X is a Student's t random variable with a large number of degrees of freedom ν then X approximately has a standard normal distribution.
- If X is an F(ν, ω) random variable with ω large, then νX is approximately distributed as a chi-squared random variable with ν degrees of freedom.
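The continuity-corrected normal approximation to the binomial can be illustrated as follows (SciPy assumed; the values of n, p, j and k are arbitrary):

```python
# Sketch: normal approximation with continuity correction for a binomial(n, p)
# probability P(j <= X <= k), with large n and np.
import numpy as np
from scipy import stats

n, p = 400, 0.3
j, k = 110, 130

exact = stats.binom(n, p).cdf(k) - stats.binom(n, p).cdf(j - 1)   # P(j <= X <= k)

mean, var = n * p, n * p * (1 - p)
normal = stats.norm(loc=mean, scale=np.sqrt(var))
approx = normal.cdf(k + 0.5) - normal.cdf(j - 0.5)                # continuity correction

print(exact, approx)   # the two values agree closely
```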
Compound (or Bayesian) relationships
When one or more parameter(s) of a distribution are random variables, the compound distribution is the marginal distribution of the variable.
Examples:
- If X|N is a binomial (N, p) random variable, where the parameter N is a random variable with a negative-binomial (m, r) distribution, then X is distributed as a negative-binomial (m, r/(p + qr)), where q = 1 - p.
- If X|N is a binomial (N,p) random variable, where parameter N is a random variable with Poisson (μ) distribution, then X is distributed as a Poisson (μp).
- If X|μ is a Poisson (μ) random variable and the parameter μ is a random variable with a gamma (m, θ) distribution (where θ is the scale parameter), then X is distributed as a negative-binomial (m, θ/(1+θ)), sometimes called the gamma-Poisson distribution (a simulation sketch follows this list).
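The gamma-Poisson relation can be illustrated by simulation. The sketch below assumes SciPy, whose nbinom(m, p) counts failures before the m-th success with success probability p; under that convention the statement above corresponds to p = 1/(1 + θ).

```python
# Sketch: simulate the gamma-Poisson compound and compare its empirical pmf with
# the negative binomial stated above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=8)
n, m, theta = 200_000, 3, 2.0

mu = rng.gamma(shape=m, scale=theta, size=n)   # mu ~ gamma(m, theta), theta = scale
x = rng.poisson(lam=mu)                        # X | mu ~ Poisson(mu)

ref = stats.nbinom(m, 1.0 / (1.0 + theta))     # negative binomial reference
ks = np.arange(5)
emp = np.array([(x == k).mean() for k in ks])
print(np.round(emp, 4))
print(np.round(ref.pmf(ks), 4))                # empirical and theoretical pmf agree
```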
Some distributions have been specially named as compounds: Beta-Binomial distribution, Beta-Pascal distribution, Gamma-Normal distribution.
Examples:
- If X is a binomial (n, p) random variable, and the parameter p is a random variable with a beta (α, β) distribution, then X is distributed as a beta-binomial (n, α, β) random variable.
- If X is a negative-binomial (m, p) random variable, and the parameter p is a random variable with a beta (α, β) distribution, then X is distributed as a beta-Pascal (α, β, m) random variable.
References
1. Leemis, Lawrence M.; McQueston, Jacquelyn T. (February 2008). "Univariate Distribution Relationships" (PDF). The American Statistician. 62 (1): 45–53. doi:10.1198/000313008x270448.
2. Cook, John D. "Diagram of distribution relationships".
External links
- Interactive graphic: Univariate Distribution Relationships
- ProbOnto: ontology and knowledge base of probability distributions