This article is rated Start-class on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:

The contents of the Statistically close page were merged into Statistical distance on 11 July 2023. For the contribution history and old versions of the redirected page, please see its history; for the discussion at that location, see its talk page.
Would be nice to have a comparison table to see how far the distance or divergence measures are from being a metric. Please feel free to fill in missing (-) data, references being welcome too. I mostly copied data from the articles of these distances:
| Divergence between distributions | Symmetric | Nonnegative | Triangle inequality | Identity of indiscernibles | Metric |
|---|---|---|---|---|---|
| Kullback–Leibler divergence | no | yes | no | yes | no |
| Hellinger distance | yes | yes | yes | yes | yes |
| Total variation distance of probability measures | yes | yes | yes | yes | yes |
| Jensen–Shannon divergence | yes | yes | no | yes | no |
| Jensen–Shannon distance | yes | yes | yes | yes | yes |
| Lévy–Prokhorov metric | yes | yes | yes | yes | yes |
| Bhattacharyya distance | yes | yes | no | yes | no |
| Wasserstein metric | yes | yes | yes | yes | yes |

| Divergence between a point and a distribution | Symmetric | Nonnegative | Triangle inequality | Identity of indiscernibles | Metric |
|---|---|---|---|---|---|
| Mahalanobis distance | - | - | - | - | - |
Olli Niemitalo ( talk) 10:22, 3 December 2014 (UTC)
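A quick numeric check of one row of the table above (the code and example distributions are my own illustration, not from the article): the Kullback–Leibler divergence is not symmetric, so it cannot be a metric.

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence D(P||Q) = sum_i p_i * log(p_i / q_i),
    with the convention that terms with p_i = 0 contribute nothing."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two arbitrary distributions on a two-element space (my own choice).
p = [0.9, 0.1]
q = [0.5, 0.5]

print(kl(p, q))  # approx. 0.368
print(kl(q, p))  # approx. 0.511 -- differs, so KL is asymmetric
```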
There is no mention of the statistical distance used in 100% of the crypto papers I've encountered: SD(u,v) = 1/2 ∑ |v_i - u_i|. Is there a reason for that, or is it just missing? For example, "Three XOR-Lemmas -- An Exposition" by Oded Goldreich states it, and "Randomness Extraction and Key Derivation Using the CBC, Cascade and HMAC Modes" by Yevgeniy Dodis, Rosario Gennaro, Johan Hastad, Hugo Krawczyk and Tal Rabin gives the same definition. Those are the first two papers I checked.
David in oregon ( talk) 05:08, 1 October 2016 (UTC)
Never mind. It was a special case of the total variation distance. David in oregon ( talk) 23:27, 1 October 2016 (UTC)
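For the record, the identity is easy to verify numerically (a minimal sketch; the function names `sd` and `tv` and the example vectors are mine): for discrete distributions, the "crypto" statistical distance SD(u,v) = 1/2 ∑ |u_i - v_i| coincides with the total variation distance, the largest gap in probability over any event.

```python
def sd(u, v):
    """Half the L1 distance between two probability vectors
    (the definition used in the cryptography papers cited above)."""
    return 0.5 * sum(abs(a - b) for a, b in zip(u, v))

def tv(u, v):
    """Total variation distance max_A |u(A) - v(A)|. For finite
    distributions the maximum is attained by A = {i : u_i > v_i}."""
    return sum(a - b for a, b in zip(u, v) if a > b)

u = [0.5, 0.3, 0.2]
v = [0.2, 0.4, 0.4]
print(sd(u, v), tv(u, v))  # both approx. 0.3
```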
Statistically close is a type of measurement within the wider statistical distance topic. Emir of Wikipedia ( talk) 22:17, 6 May 2023 (UTC)
I agree. Thatsme314 ( talk) 09:10, 5 June 2023 (UTC)
The name "statistical distance" suggests to the reader that it is a mathematical distance. However, it is not always one, since it need not satisfy the axioms of a distance. It would be nice to make that clearer by stating explicitly that typical statistical distances are not distances. The article only says that statistical distances are not metrics, which leads the reader to believe that distances and metrics are different things. 65.254.109.31 ( talk) 15:05, 25 May 2023 (UTC)
The article is incorrect in classifying Kullback–Leibler as a divergence. The Kullback–Leibler divergence can be infinite, and hence fails to be a "divergence", which (per the article's definition) must be real-valued. (Cf. "Lawvere metric spaces", which are basically divergences, except that they allow infinite values and require the triangle inequality.) Maybe we should talk about "extended" divergences (/premetrics/prametrics/quasi-distances). Thatsme314 ( talk) 09:41, 5 June 2023 (UTC)
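The infinite case is easy to exhibit concretely (my own illustration; the helper `kl` below is not from the article): D(P||Q) is +∞ whenever P puts positive mass on an outcome to which Q assigns probability zero.

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence D(P||Q) for finite distributions."""
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0:
            continue            # the term 0 * log(0/q_i) is taken as 0
        if qi == 0:
            return math.inf     # p_i > 0 but q_i = 0: the divergence blows up
        total += pi * math.log(pi / qi)
    return total

print(kl([0.5, 0.5], [0.9, 0.1]))  # finite
print(kl([0.5, 0.5], [1.0, 0.0]))  # inf
```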
The section Distances as metrics contains this sentence:
"Note that condition 1 and 2 together produce positive definiteness)".
But the link to a definition of positive definiteness does not give a definition.
Apparently whoever wrote that sentence was not aware that the linked article contains not one but two distinct definitions of a positive definite function.
That is simply bad writing if you either don't know what you're linking to, or don't care.
I hope someone familiar with this subject can fix this.