From Wikipedia, the free encyclopedia

In information theory, the principle of minimum Fisher information (MFI) is a variational principle which, when applied with the proper constraints needed to reproduce empirically known expectation values, determines the best probability distribution that characterizes the system. (See also Fisher information.)

Measures of information

Information measures (IMs) are among the most important tools of information theory. They quantify either the amount of positive information, or the amount of "missing" information, that an observer possesses about a system of interest. The best-known IM is the Shannon entropy (1948), which quantifies how much additional information the observer still requires in order to have complete knowledge of a given system S, when all they have is a probability density function (PDF) defined on appropriate elements of that system. It is therefore a measure of "missing" information, and it is a function of the PDF alone. If the observer does not have such a PDF, but only a finite set of empirically determined mean values of the system, then a fundamental scientific principle called the maximum entropy principle (MaxEnt) asserts that the "best" PDF is the one that reproduces the known expectation values while otherwise maximizing Shannon's IM.
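The MaxEnt prescription can be made concrete with a small numerical sketch. Under a single mean-value constraint on a discrete support, the entropy-maximizing distribution is exponential in the constrained quantity, with a Lagrange multiplier fixed by the observed mean. The example below (a hypothetical biased die with an observed average of 4.5, a classic illustration due to Jaynes) solves for that multiplier numerically:

```python
import numpy as np
from scipy.optimize import brentq

# Discrete support and an empirically known mean value (hypothetical numbers):
# a six-sided die whose observed average roll is 4.5.
x = np.arange(1, 7)
target_mean = 4.5

def mean_for(lam):
    """Mean of the MaxEnt distribution p_i ∝ exp(-lam * x_i)."""
    w = np.exp(-lam * x)
    p = w / w.sum()
    return p @ x

# Under one mean constraint the MaxEnt solution is exponential in x;
# solve for the Lagrange multiplier that reproduces the target mean.
lam = brentq(lambda l: mean_for(l) - target_mean, -5.0, 5.0)
p = np.exp(-lam * x)
p /= p.sum()

print(p)          # the "best" PDF consistent with the observed mean
print(p @ x)      # reproduces the constrained expectation, 4.5
```

Because the observed mean (4.5) exceeds the uniform mean (3.5), the multiplier comes out negative and the resulting probabilities increase with the face value.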

Fisher's information measure

Fisher information (FIM), named after Ronald Fisher (1925), is a different kind of measure in two respects:

1) it reflects the amount of (positive) information possessed by the observer;
2) it depends not only on the PDF but also on its first derivatives, a property that makes it a local quantity (Shannon's is instead a global one).

The counterpart of MaxEnt is now FIM minimization, since Fisher's measure grows when Shannon's diminishes, and vice versa. The minimization referred to here (MFI) is an important theoretical tool in a wide range of disciplines, beginning with physics. In a sense it is superior to MaxEnt, because the latter procedure always yields an exponential PDF as its solution, whereas the MFI solution is a differential equation for the PDF, which allows for greater flexibility and versatility.
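The reciprocal behaviour of the two measures can be checked numerically. For a Gaussian of width σ, the differential Shannon entropy is H = ½ ln(2πeσ²) while the shift-invariant Fisher information is I = 1/σ², so narrowing the distribution lowers H and raises I. A minimal sketch, estimating both measures on a grid:

```python
import numpy as np

def shannon_and_fisher(sigma, x):
    """Differential Shannon entropy H = -∫ p ln p dx and shift-invariant
    Fisher information I = ∫ (p')² / p dx for a zero-mean Gaussian of
    width sigma, both estimated on a uniform grid."""
    dx = x[1] - x[0]
    p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    dp = np.gradient(p, dx)               # central-difference derivative
    H = -(p * np.log(p)).sum() * dx
    I = (dp**2 / p).sum() * dx
    return H, I

x = np.linspace(-8.0, 8.0, 8001)
sigmas = (0.5, 1.0, 2.0)
results = [shannon_and_fisher(s, x) for s in sigmas]
for s, (H, I) in zip(sigmas, results):
    # I ≈ 1/sigma²: as sigma grows, H rises while I falls, and vice versa.
    print(f"sigma={s}: H={H:.3f}  I={I:.3f}")
```

The printed values track the closed forms H = ½ ln(2πeσ²) and I = 1/σ², illustrating that the two measures move in opposite directions as the PDF sharpens or spreads.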

Applications of the MFI

Thermodynamics

Much effort has been devoted to Fisher's information measure, shedding light on its many physical applications. [1] [2] [3] [4] [5] [6] [7] [8] [9] [10] [11] [12] [13] [14] [15] As a small sample, it can be shown that the whole field of thermodynamics (both equilibrium and non-equilibrium) can be derived from the MFI approach. [16] Here the FIM is specialized to the particular but important case of translation families, i.e., families of distribution functions whose form does not change under translations. In this case the Fisher measure becomes shift-invariant. Minimizing Fisher's measure then leads to a Schrödinger-like equation for the probability amplitude, in which the ground state describes equilibrium physics and the excited states account for non-equilibrium situations. [17]
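The route from MFI to the Schrödinger-like equation can be sketched explicitly. Writing the PDF as the square of a real probability amplitude, p(x) = ψ(x)², the shift-invariant Fisher measure becomes a kinetic-energy-like functional of ψ, and minimizing it subject to the empirical constraints ⟨A_k⟩ yields, via the Euler–Lagrange equation, an eigenvalue problem of Schrödinger type (numerical constants can be absorbed into the Lagrange multipliers):

```latex
% Shift-invariant Fisher measure, and the amplitude substitution p = \psi^2:
I[p] \;=\; \int \frac{1}{p(x)}\left(\frac{dp}{dx}\right)^{2} dx
      \;=\; 4\int \left(\frac{d\psi}{dx}\right)^{2} dx .

% Minimize I subject to normalization and the constraints \langle A_k \rangle,
% with Lagrange multipliers \lambda_0, \lambda_k:
\delta_{\psi}\!\left[\, 4\!\int \psi'^{\,2}\,dx
      \;+\; \sum_{k}\lambda_k\!\int A_k(x)\,\psi^{2}\,dx
      \;+\; \lambda_0\!\int \psi^{2}\,dx \,\right] \;=\; 0 .

% Euler--Lagrange equation: a Schrödinger-like eigenvalue problem, whose
% ground state describes equilibrium and whose excited states describe
% non-equilibrium situations:
-\,\psi''(x) \;+\; \tfrac{1}{4}\sum_{k}\lambda_k A_k(x)\,\psi(x)
      \;=\; \mu\,\psi(x), \qquad \mu = -\tfrac{\lambda_0}{4}.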

Scale-invariant phenomena

More recently, Zipf's law has been shown to arise as the variational solution of the MFI when scale invariance is introduced into the measure, providing for the first time an explanation of this regularity from first principles. [18] It has also been shown that MFI can be used to formulate a thermodynamics based on scale invariance rather than translational invariance, allowing the definition of the scale-free ideal gas, the scale-invariant analogue of the ideal gas. [19]

References

  1. ^ Frieden, B. R. (2004). Science from Fisher Information: A Unification. Cambridge, UK: Cambridge University Press. ISBN 978-0-521-00911-9. OCLC 53325064.
  2. ^ Frieden, B. Roy (1989). "Fisher information as the basis for the Schrödinger wave equation". American Journal of Physics. 57 (11). American Association of Physics Teachers (AAPT): 1004–1008. Bibcode: 1989AmJPh..57.1004F. doi: 10.1119/1.15810. ISSN  0002-9505.
  3. ^ Frieden, B. Roy (1992). "Fisher information and uncertainty complementarity". Physics Letters A. 169 (3). Elsevier BV: 123–130. Bibcode: 1992PhLA..169..123F. doi: 10.1016/0375-9601(92)90581-6. ISSN 0375-9601.
  4. ^ B. R. Frieden, in Advances in Imaging and Electron Physics, edited by P. W. Hawkes, Academic, New York, 1994, Vol. 90, pp. 123204.
  5. ^ Frieden, B. Roy (1993). "Estimation of distribution laws, and physical laws, by a principle of extremized physical information". Physica A: Statistical Mechanics and Its Applications. 198 (1–2). Elsevier BV: 262–338. Bibcode: 1993PhyA..198..262F. doi: 10.1016/0378-4371(93)90194-9. ISSN 0378-4371.
  6. ^ Frieden, B. Roy; Hughes, Roy J. (1994-04-01). "Spectral 1/f noise derived from extremized physical information". Physical Review E. 49 (4). American Physical Society (APS): 2644–2649. Bibcode: 1994PhRvE..49.2644F. doi: 10.1103/physreve.49.2644. ISSN  1063-651X. PMID  9961526.
  7. ^ Nikolov, B.; Frieden, B. Roy (1994-06-01). "Limitation on entropy increase imposed by Fisher information". Physical Review E. 49 (6). American Physical Society (APS): 4815–4820. Bibcode: 1994PhRvE..49.4815N. doi: 10.1103/physreve.49.4815. ISSN  1063-651X. PMID  9961798.
  8. ^ Frieden, B. Roy (1990-04-01). "Fisher information, disorder, and the equilibrium distributions of physics". Physical Review A. 41 (8). American Physical Society (APS): 4265–4276. Bibcode: 1990PhRvA..41.4265F. doi: 10.1103/physreva.41.4265. ISSN  1050-2947. PMID  9903619.
  9. ^ Frieden, B. Roy; Soffer, Bernard H. (1995-09-01). "Lagrangians of physics and the game of Fisher-information transfer". Physical Review E. 52 (3). American Physical Society (APS): 2274–2286. Bibcode: 1995PhRvE..52.2274F. doi: 10.1103/physreve.52.2274. ISSN  1063-651X. PMID  9963668.
  10. ^ Frieden, B. Roy (1991). "Fisher information and the complex nature of the Schrödinger wave equation". Foundations of Physics. 21 (7). Springer Nature: 757–771. Bibcode: 1991FoPh...21..757F. doi: 10.1007/bf00733343. ISSN  0015-9018.
  11. ^ R. N. Silver, in E. T. Jaynes: Physics and Probability, edited by W. T. Grandy, Jr. and P. W. Milonni, Cambridge University Press, Cambridge, England, 1992.
  12. ^ Plastino, A.; Plastino, A.R.; Miller, H.G.; Khanna, F.C. (1996). "A lower bound for Fisher's information measure". Physics Letters A. 221 (1–2). Elsevier BV: 29–33. Bibcode: 1996PhLA..221...29P. doi: 10.1016/0375-9601(96)00560-9. ISSN  0375-9601.
  13. ^ Plastino, A. R.; Plastino, A. (1996-10-01). "Symmetries of the Fokker-Planck equation and the Fisher-Frieden arrow of time". Physical Review E. 54 (4). American Physical Society (APS): 4423–4426. Bibcode: 1996PhRvE..54.4423P. doi: 10.1103/physreve.54.4423. ISSN  1063-651X.
  14. ^ R. Plastino, A.; Miller, H. G.; Plastino, A. (1997-10-01). "Minimum Kullback entropy approach to the Fokker-Planck equation". Physical Review E. 56 (4). American Physical Society (APS): 3927–3934. Bibcode: 1997PhRvE..56.3927R. doi: 10.1103/physreve.56.3927. ISSN  1063-651X.
  15. ^ Plastino, A.; Plastino, A.R.; Miller, H.G. (1997). "On the relationship between the Fisher-Frieden-Soffer arrow of time, and the behaviour of the Boltzmann and Kullback entropies". Physics Letters A. 235 (2). Elsevier BV: 129–134. Bibcode: 1997PhLA..235..129P. doi: 10.1016/s0375-9601(97)00634-8. ISSN  0375-9601.
  16. ^ Frieden, B. R.; Plastino, A.; Plastino, A. R.; Soffer, B. H. (1999-07-01). "Fisher-based thermodynamics: Its Legendre transform and concavity properties". Physical Review E. 60 (1). American Physical Society (APS): 48–53. Bibcode: 1999PhRvE..60...48F. doi: 10.1103/physreve.60.48. ISSN  1063-651X.
  17. ^ Frieden, B. R.; Plastino, A.; Plastino, A. R.; Soffer, B. H. (2002-10-22). "Schrödinger link between nonequilibrium thermodynamics and Fisher information". Physical Review E. 66 (4). American Physical Society (APS): 046128. arXiv: cond-mat/0206107. Bibcode: 2002PhRvE..66d6128F. doi: 10.1103/physreve.66.046128. ISSN  1063-651X. PMID  12443280.
  18. ^ Hernando, A.; Puigdomènech, D.; Villuendas, D.; Vesperinas, C.; Plastino, A. (2009). "Zipf's law from a Fisher variational-principle". Physics Letters A. 374 (1). Elsevier BV: 18–21. arXiv: 0908.0501. Bibcode: 2009PhLA..374...18H. doi: 10.1016/j.physleta.2009.10.027. ISSN  0375-9601.
  19. ^ Hernando, A.; Vesperinas, C.; Plastino, A. (2010). "Fisher information and the thermodynamics of scale-invariant systems". Physica A: Statistical Mechanics and Its Applications. 389 (3): 490. arXiv: 0908.0504. Bibcode: 2010PhyA..389..490H. doi: 10.1016/j.physa.2009.09.054.