In mathematics, uniform integrability is an important concept in real analysis, functional analysis and measure theory, and plays a vital role in the theory of martingales.
Uniform integrability is an extension of the notion of a family of functions being dominated in $L^1$, which is central to dominated convergence. Several textbooks on real analysis and measure theory use the following definition: [1] [2]
Definition A: Let $(X, \mathfrak{M}, \mu)$ be a positive measure space. A set $\Phi \subset L^1(\mu)$ is called uniformly integrable if $\sup_{f \in \Phi} \|f\|_{L^1(\mu)} < \infty$, and to each $\varepsilon > 0$ there corresponds a $\delta > 0$ such that
$$\int_E |f| \, d\mu < \varepsilon$$
whenever $f \in \Phi$ and $\mu(E) < \delta$.
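A small numerical sketch can make Definition A concrete. The family below is our own illustration, not from the article: on $[0,1]$ with Lebesgue measure, the functions $f_n = n \cdot 1_{(0, 1/n]}$ are bounded in $L^1$ (each has norm $1$), yet the smallness condition of Definition A fails, because the sets $E_n = (0, 1/n]$ have measure $1/n \to 0$ while $\int_{E_n} |f_n| \, d\mu$ stays equal to $1$.

```python
# Hedged numerical sketch (illustrative family, not from the article):
# f_n = n * 1_{(0, 1/n]} on [0, 1] with Lebesgue measure is L^1-bounded
# but not uniformly integrable in the sense of Definition A, since for
# epsilon = 1/2 no delta > 0 can work: mu(E_n) = 1/n -> 0 yet the
# integral of |f_n| over E_n remains 1.

def integral_over(n: int, a: float, b: float) -> float:
    """Exact integral of f_n = n * 1_{(0, 1/n]} over the interval (a, b) in [0, 1]."""
    lo, hi = max(a, 0.0), min(b, 1.0 / n)  # overlap of (a, b) with (0, 1/n]
    return n * max(hi - lo, 0.0)

# Powers of 2 keep the floating-point arithmetic exact.
for n in (16, 1024, 2**20):
    mu_E = 1.0 / n                              # measure of E_n shrinks to 0
    assert integral_over(n, 0.0, 1.0) == 1.0    # ||f_n||_{L^1} stays 1
    assert integral_over(n, 0.0, mu_E) == 1.0   # all the mass sits on E_n
```

The point of the computation is that the mass of $f_n$ concentrates on sets of vanishing measure, which is exactly what Definition A forbids uniformly over the family.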
Definition A is rather restrictive for infinite measure spaces. A more general definition [3] of uniform integrability that works well in general measure spaces was introduced by G. A. Hunt.
Definition H: Let $(X, \mathfrak{M}, \mu)$ be a positive measure space. A set $\Phi \subset L^1(\mu)$ is called uniformly integrable if and only if
$$\inf_{g \in L^1_+(\mu)} \sup_{f \in \Phi} \int_{\{|f| > g\}} |f| \, d\mu = 0,$$
where $L^1_+(\mu) = \{ g \in L^1(\mu) : g \geq 0 \}$.
Since Hunt's definition is equivalent to Definition A when the underlying measure space is finite (see Theorem 2 below), Definition H is widely adopted in mathematics.
The following result [4] provides another notion equivalent to Hunt's. This equivalence is sometimes given as the definition of uniform integrability.
Theorem 1: If $(X, \mathfrak{M}, \mu)$ is a (positive) finite measure space, then a set $\Phi \subset L^1(\mu)$ is uniformly integrable if and only if
$$\inf_{a \geq 0} \sup_{f \in \Phi} \int_{\{|f| > a\}} |f| \, d\mu = 0.$$
If in addition $\sup_{f \in \Phi} \|f\|_{L^1(\mu)} < \infty$, then uniform integrability is equivalent to either of the following conditions:
1. $\inf\limits_{a \geq 0} \sup\limits_{f \in \Phi} \int_X \big( |f| - a \big)^+ \, d\mu = 0$.
2. $\inf\limits_{a \geq 0} \sup\limits_{f \in \Phi} \big\| |f| - |f| \wedge a \big\|_{L^1(\mu)} = 0$.
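The tail criterion of Theorem 1 can be checked by hand on two illustrative families of our own (both on $[0,1]$ with Lebesgue measure, with the tail integrals below computed in closed form): the non-UI family $f_n = n \cdot 1_{(0, 1/n]}$, whose tail integral over $\{|f_n| > a\}$ is $1$ whenever $n > a$, and the UI family $g_n(x) = (1 - 1/n)\, x^{-1/2}$, whose tail integral is $2(1 - 1/n)^2 / a \leq 2/a$ uniformly in $n$.

```python
# Hedged sketch of the tail criterion in Theorem 1; both families and
# their closed-form tail integrals are our own worked examples.

def tail_f(n: int, a: float) -> float:
    """int_{|f_n| > a} |f_n| dmu for f_n = n * 1_{(0, 1/n]}: equals 1 iff n > a."""
    return 1.0 if n > a else 0.0

def tail_g(n: int, a: float) -> float:
    """int_{|g_n| > a} |g_n| dmu for g_n(x) = (1 - 1/n) / sqrt(x), valid for a >= 1.

    g_n > a exactly on (0, c^2/a^2) with c = 1 - 1/n, and the integral there
    is 2 c^2 / a.
    """
    c = 1.0 - 1.0 / n
    return 2.0 * c * c / a

for a in (10.0, 100.0, 1000.0):
    sup_f = max(tail_f(n, a) for n in (2, 1000, 10**6))
    sup_g = max(tail_g(n, a) for n in (2, 1000, 10**6))
    assert sup_f == 1.0       # the sup never drops below 1: {f_n} is not UI
    assert sup_g <= 2.0 / a   # uniform bound vanishing as a -> infinity: {g_n} is UI
```

Growing the truncation level $a$ kills the tails of the dominated-like family uniformly, but never dents the concentrating family, which is precisely the dichotomy the theorem formalizes.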
When the underlying space is $\sigma$-finite, Hunt's definition is equivalent to the following:
Theorem 2: Let $(X, \mathfrak{M}, \mu)$ be a $\sigma$-finite measure space, and $h \in L^1(\mu)$ be such that $h > 0$ almost everywhere. A set $\Phi \subset L^1(\mu)$ is uniformly integrable if and only if $\sup_{f \in \Phi} \|f\|_{L^1(\mu)} < \infty$, and for any $\varepsilon > 0$, there exists $\delta > 0$ such that
$$\int_E |f| \, d\mu < \varepsilon$$
whenever $\int_E h \, d\mu < \delta$.
A consequence of Theorems 1 and 2 is that the equivalence of Definitions A and H for finite measures follows. Indeed, the statement in Definition A is obtained by taking $h \equiv 1$ in Theorem 2.
In the theory of probability, Definition A or the statement of Theorem 1 are often presented as definitions of uniform integrability, using the expectation notation of random variables: [5] [6] [7]
1. A class $\mathcal{C}$ of random variables is called uniformly integrable if:
- there exists a finite $M$ such that $\operatorname{E}(|X|) \leq M$ for every $X \in \mathcal{C}$, and
- for every $\varepsilon > 0$ there exists $\delta > 0$ such that $\operatorname{E}(|X| I_A) \leq \varepsilon$ for every measurable $A$ with $\operatorname{P}(A) \leq \delta$ and every $X \in \mathcal{C}$;
or alternatively
2. A class $\mathcal{C}$ of random variables is called uniformly integrable (UI) if for every $\varepsilon > 0$ there exists $K \in [0, \infty)$ such that $\operatorname{E}\big(|X| I_{\{|X| \geq K\}}\big) \leq \varepsilon$ for all $X \in \mathcal{C}$, where $I_{\{|X| \geq K\}}$ is the indicator function of the event $\{|X| \geq K\}$.
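The truncated-expectation form of the definition is easy to probe on finitely supported laws. The family below is our own illustration: $X_n$ takes the value $n$ with probability $1/n$ and $0$ otherwise, so $\operatorname{E}|X_n| = 1$ for all $n$, yet $\operatorname{E}(|X_n| I_{\{|X_n| \geq K\}}) = 1$ whenever $n \geq K$, so no $K$ works for $\varepsilon < 1$ and the family is not UI.

```python
# Hedged illustration of the probabilistic definition on discrete laws
# (the family {X_n} is our own example, not the article's).

def truncated_mean(pmf: dict, K: float) -> float:
    """E[|X| 1_{|X| >= K}] for a finitely supported law given as {value: prob}."""
    return sum(abs(x) * p for x, p in pmf.items() if abs(x) >= K)

def law_X(n: int) -> dict:
    """Law of X_n: value n with probability 1/n, value 0 otherwise."""
    return {float(n): 1.0 / n, 0.0: 1.0 - 1.0 / n}

# Powers of 2 keep the products n * (1/n) exact in floating point.
for K in (10.0, 1000.0):
    worst = max(truncated_mean(law_X(n), K) for n in (2, 16, 2**20))
    assert worst == 1.0  # sup_n E[|X_n| 1_{|X_n| >= K}] = 1 for every K: not UI

# A uniformly bounded family is UI: any K above the bound already gives 0.
assert truncated_mean({3.0: 0.5, -2.0: 0.5}, K=4.0) == 0.0
```

The same computation with any uniformly bounded family returns $0$ once $K$ exceeds the bound, matching the intuition that UI fails only through escaping mass in the tails.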
One consequence of uniform integrability of a class $\mathcal{C}$ of random variables is that the family of laws or distributions is tight. That is, for each $\varepsilon > 0$, there exists $a > 0$ such that
$$\operatorname{P}(|X| > a) \leq \varepsilon \quad \text{for all } X \in \mathcal{C}.$$
This, however, does not mean that the family of measures $\mathcal{V}_{\mathcal{C}} := \big\{ \mu_X : A \mapsto \operatorname{E}(|X| I_A), \ X \in \mathcal{C} \big\}$ is tight. (In any case, tightness would require a topology on $\Omega$ in order to be defined.)
There is another notion of uniformity, slightly different from uniform integrability, which also has many applications in probability and measure theory, and which does not require random variables to have a finite integral. [9]
Definition: Suppose $(\Omega, \mathcal{F}, \operatorname{P})$ is a probability space. A class $\mathcal{C}$ of random variables is uniformly absolutely continuous with respect to $\operatorname{P}$ if for any $\varepsilon > 0$, there is $\delta > 0$ such that $\operatorname{E}(|X| I_A) < \varepsilon$ for every $X \in \mathcal{C}$ whenever $\operatorname{P}(A) < \delta$.
It is equivalent to uniform integrability if the measure is finite and has no atoms.
The term "uniform absolute continuity" is not standard,[citation needed] but is used by some authors. [10] [11]
The following results apply to the probabilistic definition. [12]
In the following we use the probabilistic framework, but the results hold regardless of the finiteness of the measure, provided one adds the boundedness condition on the chosen subset of $L^1(\mu)$.
A sequence $(X_n)$ converges to $X$ in the $L_1$ norm if and only if it converges in measure to $X$ and it is uniformly integrable. In probability terms, a sequence of random variables converging in probability also converges in the mean if and only if it is uniformly integrable. [17] This is a generalization of Lebesgue's dominated convergence theorem; see the Vitali convergence theorem.
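The necessity of uniform integrability here can be seen on a worked example of our own: on $[0,1]$ with Lebesgue measure, $X_n = n \cdot 1_{(0, 1/n]}$ converges to $0$ in probability, since $\operatorname{P}(|X_n| > \varepsilon) = 1/n \to 0$, but not in $L^1$, since $\operatorname{E}|X_n| = 1$ for all $n$; the family is not UI. A dominated family such as $Y_n = 1_{(0, 1/n]}$ is UI, and its $L^1$ norms do vanish, as the Vitali convergence theorem predicts.

```python
# Hedged sketch (families are our own examples): convergence in
# probability alone does not give L^1 convergence; with UI it does.

def prob_exceeds(n: int, eps: float) -> float:
    """P(|X_n| > eps) for X_n = n * 1_{(0, 1/n]} under Lebesgue measure on [0, 1]."""
    return (1.0 / n) if n > eps else 0.0

def l1_norm_X(n: int) -> float:
    """E|X_n - 0| = n * (1/n) = 1 for every n (the family is not UI)."""
    return n * (1.0 / n)

def l1_norm_Y(n: int) -> float:
    """E|Y_n - 0| for the dominated, hence UI, family Y_n = 1_{(0, 1/n]}."""
    return 1.0 / n

for n in (4, 64, 2**20):
    assert prob_exceeds(n, 0.5) == 1.0 / n  # -> 0: X_n -> 0 in probability
    assert l1_norm_X(n) == 1.0              # but the L^1 distance to 0 stays 1
    assert l1_norm_Y(n) == 1.0 / n          # UI family: L^1 convergence holds
```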