The Bernoulli distribution, which takes value 1 with probability p and value 0 with probability q = 1 − p.
The Rademacher distribution, which takes value 1 with probability 1/2 and value −1 with probability 1/2.
The binomial distribution, which describes the number of successes in a series of independent Yes/No experiments all with the same probability of success.
The beta-binomial distribution, which describes the number of successes in a series of independent Yes/No experiments with heterogeneity in the success probability.
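As a minimal sketch of how these first two entries relate (the helper name is illustrative, not a library API), the binomial pmf can be computed directly from the definition, and the Bernoulli distribution falls out as the single-trial case:

```python
from math import comb

def binom_pmf(k, n, p):
    # P(X = k) for Binomial(n, p): probability of exactly k successes
    # in n independent trials, each succeeding with probability p
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Bernoulli(p) is the n = 1 case: value 1 with probability p,
# value 0 with probability q = 1 - p
assert binom_pmf(1, 1, 0.3) == 0.3
assert binom_pmf(0, 1, 0.3) == 0.7

# The probabilities over k = 0..n sum to 1
total = sum(binom_pmf(k, 10, 0.25) for k in range(11))
```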
The degenerate distribution at x0, where X is certain to take the value x0. This does not look random, but it satisfies the definition of a random variable. It is useful because it puts deterministic variables and random variables in the same formalism.
The discrete uniform distribution, where all elements of a finite set are equally likely. This is the theoretical model for a balanced coin, an unbiased die, a casino roulette wheel, or the first card of a well-shuffled deck.
The hypergeometric distribution, which describes the number of successes in the first m of a series of n consecutive Yes/No experiments, if the total number of successes is known. This distribution arises when there is no replacement.
The negative hypergeometric distribution, which describes the number of attempts needed to get the nth success in a series of Yes/No experiments without replacement.
The Poisson binomial distribution, which describes the number of successes in a series of independent Yes/No experiments with different success probabilities.
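The Poisson binomial pmf has no simple closed form, but it can be built up one trial at a time by dynamic programming. A sketch (function name is my own):

```python
def poisson_binomial_pmf(ps):
    # Distribution of the number of successes in independent trials
    # with per-trial success probabilities ps, built one trial at a time
    pmf = [1.0]
    for p in ps:
        nxt = [0.0] * (len(pmf) + 1)
        for k, prob in enumerate(pmf):
            nxt[k] += prob * (1 - p)      # this trial fails
            nxt[k + 1] += prob * p        # this trial succeeds
        pmf = nxt
    return pmf

# With all probabilities equal it reduces to the ordinary binomial
pmf = poisson_binomial_pmf([0.5] * 4)
```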
Zipf's law or the Zipf distribution, a discrete power-law distribution whose most famous example is the description of the frequency of words in the English language.
The Zipf–Mandelbrot law, a discrete power-law distribution which generalizes the Zipf distribution.
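A Zipf distribution over N ranked elements just assigns probability proportional to 1/k^s and normalizes; the Zipf–Mandelbrot generalization shifts the rank to 1/(k + q)^s. A minimal sketch (helper name is illustrative):

```python
def zipf_pmf(N, s):
    # Zipf over ranks 1..N: probability proportional to 1 / k**s,
    # normalized by the generalized harmonic number H_{N,s}
    weights = [k ** -s for k in range(1, N + 1)]
    H = sum(weights)
    return [w / H for w in weights]

pmf = zipf_pmf(5, 1.0)
# With s = 1, rank 1 is twice as likely as rank 2, three times rank 3, ...
```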
The Boltzmann distribution, a discrete distribution important in statistical physics which describes the probabilities of the various discrete energy levels of a system in thermal equilibrium. It has a continuous analogue.
The geometric distribution, a discrete distribution which describes the number of attempts needed to get the first success in a series of independent Bernoulli trials, or alternatively only the number of losses before the first success (i.e. one less).
The Poisson distribution, which describes a very large number of individually unlikely events that happen in a certain time interval. Related distributions include the displaced Poisson, the hyper-Poisson, the general Poisson binomial, and the Poisson-type distributions.
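The "many individually unlikely events" description can be checked numerically: Poisson(λ) is the limit of Binomial(n, λ/n) as n grows. A sketch (helper names are my own):

```python
from math import exp, factorial, comb

def poisson_pmf(k, lam):
    # P(X = k) when events occur at average rate lam per interval
    return lam ** k * exp(-lam) / factorial(k)

# Binomial(n, lam/n) with large n is already close to Poisson(lam):
# many trials, each individually unlikely to succeed
n, lam = 10_000, 3.0
binom_approx = comb(n, 2) * (lam / n) ** 2 * (1 - lam / n) ** (n - 2)
```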
The zeta distribution has uses in applied statistics and statistical mechanics, and may be of interest to number theorists. It is the Zipf distribution for an infinite number of elements.
The Hardy distribution, which describes the probabilities of the hole scores for a given golf player.
The Beta distribution on [0,1], a family of two-parameter distributions with one mode, of which the uniform distribution is a special case, and which is useful in estimating success probabilities.
The four-parameter Beta distribution, a straightforward generalization of the Beta distribution to arbitrary bounded intervals.
The arcsine distribution on [a,b], a special case of the four-parameter Beta distribution; the standard case a = 0, b = 1 is the Beta distribution with α = β = 1/2.
The uniform distribution or rectangular distribution on [a,b], where all points in a finite interval are equally likely, is a special case of the four-parameter Beta distribution.
The Irwin–Hall distribution is the distribution of the sum of n independent random variables, each uniformly distributed on [0,1].
The Bates distribution is the distribution of the mean of n independent random variables, each uniformly distributed on [0,1].
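Both of these have closed-form densities; a minimal sketch of the standard Irwin–Hall formula, with the Bates density obtained by rescaling the sum to a mean (helper names are my own):

```python
from math import comb, factorial, floor

def irwin_hall_pdf(x, n):
    # Density of the sum of n independent Uniform(0, 1) variables
    if x < 0 or x > n:
        return 0.0
    return sum((-1) ** k * comb(n, k) * (x - k) ** (n - 1)
               for k in range(floor(x) + 1)) / factorial(n - 1)

def bates_pdf(x, n):
    # The Bates distribution is the mean, so rescale the sum's density
    return n * irwin_hall_pdf(n * x, n)

# For n = 2 the sum is triangular on [0, 2] with peak 1 at x = 1
assert irwin_hall_pdf(1.0, 2) == 1.0
```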
The Dirac delta function, although not strictly a probability distribution, is a limiting form of many continuous probability functions. It represents a discrete probability distribution concentrated at 0 (a degenerate distribution); it is a distribution in the generalized-function sense, but the notation treats it as if it were a continuous distribution.
The Kumaraswamy distribution is as versatile as the Beta distribution but has simple closed forms for both the cdf and the pdf.
The logit metalog distribution, which is highly shape-flexible, has simple closed forms, and can be parameterized with data using linear least squares.
The triangular distribution on [a, b], a special case of which is the distribution of the sum of two independent uniformly distributed random variables (the convolution of two uniform distributions).
The Dirac comb of period 2π, although not strictly a function, is a limiting form of many directional distributions. It is essentially a wrapped Dirac delta function. It represents a discrete probability distribution concentrated at the points 2πn for integer n (a degenerate distribution), but the notation treats it as if it were a continuous distribution.
Supported on semi-infinite intervals, usually [0,∞)
The Birnbaum–Saunders distribution, also known as the fatigue life distribution, is a probability distribution used extensively in reliability applications to model failure times.
The chi-squared distribution, which is the distribution of the sum of the squares of n independent standard normal random variables. It is a special case of the Gamma distribution, and it is used in goodness-of-fit tests in statistics.
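The "special case of the Gamma distribution" claim is easy to verify numerically: chi-squared with k degrees of freedom is Gamma with shape k/2 and scale 2. A sketch (helper names are my own):

```python
from math import gamma, exp

def chi2_pdf(x, k):
    # Chi-squared density with k degrees of freedom, x > 0
    return x ** (k / 2 - 1) * exp(-x / 2) / (2 ** (k / 2) * gamma(k / 2))

def gamma_pdf(x, shape, scale):
    # General Gamma density in the shape/scale parameterization
    return x ** (shape - 1) * exp(-x / scale) / (gamma(shape) * scale ** shape)

# Chi-squared(k) coincides with Gamma(shape = k/2, scale = 2)
assert abs(chi2_pdf(1.5, 4) - gamma_pdf(1.5, 2.0, 2.0)) < 1e-12
```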
The F-distribution, which is the distribution of the ratio of two (normalized) chi-squared-distributed random variables, used in the analysis of variance. It is referred to as the beta prime distribution when it is the ratio of two chi-squared variates which are not normalized by dividing them by their numbers of degrees of freedom.
The Gamma distribution, which describes the time until n rare random events occur in a process with no memory.
The Erlang distribution, a special case of the gamma distribution with integer shape parameter, developed to predict waiting times in queuing systems.
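The Erlang density has a simple closed form (the gamma density with integer shape, so the gamma function becomes a factorial), and the single-event case n = 1 recovers the exponential distribution. A sketch (function name is my own):

```python
from math import exp, factorial

def erlang_pdf(x, n, rate):
    # Density of the waiting time until the n-th event of a memoryless
    # (Poisson) arrival process with the given event rate
    return rate ** n * x ** (n - 1) * exp(-rate * x) / factorial(n - 1)

# n = 1 recovers the exponential distribution: rate * exp(-rate * x)
assert abs(erlang_pdf(2.0, 1, 0.5) - 0.5 * exp(-1.0)) < 1e-15
```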
The log-metalog distribution, which is highly shape-flexible, has simple closed forms, can be parameterized with data using linear least squares, and subsumes the log-logistic distribution as a special case.
The log-normal distribution, describing variables which can be modelled as the product of many small independent positive variables.
The centralized inverse-Fano distribution, which is the distribution representing the ratio of independent normal and gamma-difference random variables.
The metalog distribution, which is highly shape-flexible, has simple closed forms, and can be parameterized with data using linear least squares.
The normal distribution, also called the Gaussian or the bell curve. It is ubiquitous in nature and statistics due to the central limit theorem: every variable that can be modelled as a sum of many small independent, identically distributed variables with finite mean and variance is approximately normal.
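The central limit theorem can be sketched in a few lines: averages of uniform draws already land close to a normal with the predicted mean and standard deviation (helper name and the specific seed are my own choices):

```python
import random
from math import sqrt, pi, exp

def normal_pdf(x, mu=0.0, sigma=1.0):
    # The Gaussian bell curve with mean mu and standard deviation sigma
    z = (x - mu) / sigma
    return exp(-z * z / 2) / (sigma * sqrt(2 * pi))

# CLT sketch: the mean of 12 Uniform(0, 1) draws has mean 0.5 and
# standard deviation sqrt(1/12) / sqrt(12) = 1/12, and its histogram
# is already close to a normal bell curve
random.seed(42)
means = [sum(random.random() for _ in range(12)) / 12 for _ in range(20000)]
```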
The generalized extreme value distribution has a finite upper bound or a finite lower bound depending on what range the value of one of its parameters lies in (and is supported on the whole real line for one special value of that parameter).
The metalog distribution, which provides flexibility for unbounded, bounded, and semi-bounded support, is highly shape-flexible, has simple closed forms, and can be fit to data using linear least squares.
The Tukey lambda distribution is either supported on the whole real line or on a bounded interval, depending on what range the value of one of the parameters of the distribution is in.
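The Tukey lambda distribution is usually defined by its quantile function, which makes the support behavior easy to see directly (function name is my own):

```python
from math import log

def tukey_lambda_quantile(p, lam):
    # Quantile function Q(p) for 0 < p < 1.  Support depends on lam:
    #   lam > 0  -> bounded interval [-1/lam, 1/lam]
    #   lam = 0  -> the whole real line (the logistic distribution)
    if lam == 0:
        return log(p / (1 - p))
    return (p ** lam - (1 - p) ** lam) / lam

# lam = 1 gives the uniform distribution on [-1, 1] (bounded support)
assert tukey_lambda_quantile(0.25, 1.0) == -0.5
```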