This article is rated Start-class on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:
Section doesn't feel like it's quite there yet. Maybe consider what happens when you're using a Rényi entropy as a measure of ecological diversity, and then realise that you need to split one species into two... -- Jheald 22:19, 23 January 2006 (UTC).
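To make the species-splitting thought experiment above concrete, here is a minimal numerical sketch (plain Python; the helper name is mine, not from the article). Splitting one of two equally common species into two equal halves raises the Rényi entropy at every order α, but by an α-dependent amount, which is exactly the kind of behaviour a diversity measure needs to handle gracefully:

```python
import math

def renyi_entropy(p, alpha):
    # Rényi entropy of order alpha (alpha >= 0, alpha != 1), natural log
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

# Two equally common species, then the second is split into two equal halves
before = [0.5, 0.5]
after = [0.5, 0.25, 0.25]
for alpha in (0.5, 2, 5):
    print(alpha, renyi_entropy(before, alpha), renyi_entropy(after, alpha))
```

Note that at every order the "after" value exceeds the "before" value, but small-α orders (which weight rare species more heavily) move more than large-α orders.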
The first thing I saw was 1/(1 − 1) = ∞. Full Decent (talk) 16:07, 11 December 2009 (UTC)
Rényi entropy was defined axiomatically in Rényi's Berkeley entropy paper. In it, a weakening of one of the Shannon axioms yields the Rényi entropy; that's why α = 1 is special. Also, some of Rényi entropy's applications should be added with context: statistical physics, general statistics, machine learning, signal processing, cryptography (as a measure of randomness and robustness), Shannon theory (generalizing and proving theorems), and source coding. I don't have all this handy right now, but I'm sure each piece of it is familiar to at least one person reading this page.... Calbaer 05:59, 5 May 2006 (UTC)
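Regarding the 1/(1 − 1) worry above: α = 1 is a removable singularity, and the limit of the Rényi entropy as α → 1 is the Shannon entropy (e.g. by L'Hôpital's rule). A quick numerical check (plain Python; the helper names are mine):

```python
import math

def renyi_entropy(p, alpha):
    # Defined for alpha >= 0, alpha != 1; the alpha = 1 case is the limit below
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

def shannon_entropy(p):
    # The alpha -> 1 limit of the Rényi entropy
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.7, 0.2, 0.1]
for alpha in (0.9, 0.99, 0.999, 1.001, 1.01, 1.1):
    print(alpha, renyi_entropy(p, alpha))
print("Shannon:", shannon_entropy(p))
```

The printed values approach the Shannon entropy from both sides as α approaches 1.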
The statement that H_α is non-decreasing in α seems to contradict the statement that H_2 ≤ 2H_∞. Also, should that be a weak inequality? LachlanA 23:24, 21 November 2006 (UTC)
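On the monotonicity question above: numerically, H_α is non-increasing in α, and that is compatible with the bound H_2 ≤ 2H_∞, since monotonicity only gives the lower bound H_∞ ≤ H_2, not an upper one. A small sketch checking both claims (plain Python; the helper name is mine):

```python
import math

def renyi_entropy(p, alpha):
    if alpha == 1:  # Shannon limit
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

p = [0.6, 0.3, 0.1]
alphas = [0.1, 0.5, 1, 2, 5, 20]
values = [renyi_entropy(p, a) for a in alphas]
assert all(x >= y for x, y in zip(values, values[1:]))  # non-increasing in alpha
# The bound H_2 <= 2 * H_inf also holds, where H_inf = -log(max p):
assert renyi_entropy(p, 2) <= 2 * (-math.log(max(p)))
```

Equality in the monotonicity holds only for the uniform distribution, which is where the weak/strict inequality question comes in.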
I would suggest removing the sentence "In the Heisenberg XY spin chain model, the Rényi entropy as a function of α can be calculated explicitly by virtue of the fact that it is an automorphic function with respect to a particular subgroup of the modular group.[2][3]" from the beginning of the article. This is quite technical, and in no way an important fact about Rényi entropy, but rather one about the Heisenberg model. It could be moved elsewhere in the article if so desired, although I personally don't see this as necessary. — Preceding unsigned comment added by 193.190.84.1 (talk • contribs) 10:32, 26 January 2018 (UTC)