Probability mass function (plot omitted)

Cumulative distribution function (plot omitted)

| Parameters | (none) |
| --- | --- |
| Support | k ∈ {1, 2, 3, ...} |
| PMF | −log₂(1 − 1/(k + 1)²) |
| CDF | 1 − log₂((k + 2)/(k + 1)) |
| Mean | +∞ |
| Median | 2 |
| Mode | 1 |
| Variance | +∞ |
| Skewness | (not defined) |
| Excess kurtosis | (not defined) |
| Entropy | 3.432527514776... [1] [2] [3] |
In mathematics, the Gauss–Kuzmin distribution is a discrete probability distribution that arises as the limit probability distribution of the coefficients in the continued fraction expansion of a random variable uniformly distributed in (0, 1). [4] The distribution is named after Carl Friedrich Gauss, who derived it around 1800, [5] and Rodion Kuzmin, who gave a bound on the rate of convergence in 1929. [6] [7] It is given by the probability mass function

$$ p(k) = -\log_2\left(1 - \frac{1}{(k+1)^2}\right), \qquad k = 1, 2, \ldots $$
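This limiting behaviour can be checked empirically. The sketch below (an illustration, not part of the source; the sample size and digit depth are arbitrary choices, and floating-point arithmetic only recovers the first several continued-fraction digits reliably) compares the observed frequency of the digit 1 against the Gauss–Kuzmin value −log₂(1 − 1/4) = log₂(4/3) ≈ 0.415:

```python
import math
import random

def gauss_kuzmin_pmf(k):
    """p(k) = -log2(1 - 1/(k+1)^2), the Gauss-Kuzmin probability mass function."""
    return -math.log2(1.0 - 1.0 / (k + 1) ** 2)

def cf_coefficients(x, n):
    """First n continued fraction coefficients of x in (0, 1)."""
    coeffs = []
    for _ in range(n):
        if x == 0.0:  # guard against floating-point termination
            break
        x = 1.0 / x
        a = int(x)
        coeffs.append(a)
        x -= a
    return coeffs

random.seed(0)
N = 100_000
depth = 10  # look at the 10th digit, deep enough to be near the limit

count = 0
for _ in range(N):
    cs = cf_coefficients(random.random(), depth)
    if len(cs) == depth and cs[-1] == 1:
        count += 1

# Empirical frequency of k_10 = 1 vs. the limiting probability p(1)
print(count / N, gauss_kuzmin_pmf(1))
```

The probabilities telescope to 1 over k = 1, 2, ..., consistent with a proper distribution.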
Let

$$ x = [0; k_1, k_2, \ldots] $$

be the continued fraction expansion of a random number x uniformly distributed in (0, 1). Then

$$ \lim_{n \to \infty} \Pr\{k_n = k\} = -\log_2\left(1 - \frac{1}{(k+1)^2}\right). $$

Equivalently, let

$$ x_n = [0; k_{n+1}, k_{n+2}, \ldots]; $$

then

$$ \Delta_n(s) = \Pr\{x_n \le s\} - \log_2(1+s) $$

tends to zero as n tends to infinity.
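A Monte Carlo sketch (an illustration under arbitrary sample-size choices, not the source's derivation) of this convergence: the tail x_n is obtained by iterating the Gauss map x → frac(1/x), and the deviation Δ_n(s) = Pr{x_n ≤ s} − log₂(1 + s) shrinks as n grows:

```python
import math
import random

random.seed(1)
N = 200_000

def tail(x, n):
    """Apply the Gauss map n times: returns x_n, the continued-fraction tail."""
    for _ in range(n):
        if x == 0.0:  # floating-point guard
            return 0.0
        x = 1.0 / x
        x -= int(x)
    return x

def delta(n, s):
    """Monte Carlo estimate of Delta_n(s) = Pr{x_n <= s} - log2(1 + s)."""
    hits = sum(1 for _ in range(N) if tail(random.random(), n) <= s)
    return hits / N - math.log2(1.0 + s)

# Delta_0(0.5) = 0.5 - log2(1.5) ~ -0.085; by n = 3 the deviation is
# already smaller than the Monte Carlo noise floor (~0.001 here).
for n in (0, 1, 3):
    print(n, delta(n, 0.5))
```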
In 1928, Kuzmin gave the bound

$$ |\Delta_n(s)| \le C e^{-\alpha \sqrt{n}}. $$

In 1929, Paul Lévy [8] improved it to

$$ |\Delta_n(s)| \le C \, (0.7)^n. $$

Later, Eduard Wirsing showed [9] that, for λ = 0.30366... (the Gauss–Kuzmin–Wirsing constant), the limit

$$ \Psi(s) = \lim_{n \to \infty} \frac{\Delta_n(s)}{(-\lambda)^n} $$

exists for every s in [0, 1], and the function Ψ(s) is analytic and satisfies Ψ(0) = Ψ(1) = 0. Further bounds were proved by K. I. Babenko. [10]