This level-5 vital article is rated Start-class on Wikipedia's content assessment scale. It is of interest to several WikiProjects.
Daily pageviews of this article: graphs are temporarily disabled; until they are enabled again, visit the interactive graph at pageviews.wmcloud.org.
Spelling question: I've never (before now) seen the name spelled with a hyphen. Searches of Math Reviews (MathSciNet) and Current Index to Statistics show an overwhelming preference for no hyphen. Should the title, at least, be changed (move the article to "Moment generating function" with a redirect)? Zaslav 18:08, 8 December 2006 (UTC)
I would like _all_ the terms such as E to be defined explicitly. Otherwise these articles are unintelligible to the casual reader. I would have thought that all terms in any formula should be defined in every article, or else reference should be made to some common form of definition of terms for that context. How about a bit more help for the randomly browsing casual student? I would like to see a recommendation in the Wikipedia "guidelines for authors" defining some kind of standard for this; otherwise it is very arbitrary which terms are defined and which are expected to be known. —Preceding unsigned comment added by 220.253.60.249 ( talk • contribs)
Certainly one could put in links to those things, but this article is the wrong place to explain what "E" is, just as an article about Shakespeare's plays is the wrong place to explain how to spell "Denmark", saying that the "D" represents the initial sound in "dog", etc.
This is not written for people who already know this material.
It is written for people who already know what probability distributions are and the standard basic facts about them. Michael Hardy ( talk) 21:47, 10 July 2009 (UTC)
Thanks for the sarcasm, hope you feel a little better about yourself. misli h 23:32, 5 August 2009 (UTC)
of its probability distribution." —Preceding unsigned comment added by 71.199.181.122 ( talk) 21:46, 28 September 2010 (UTC)
There is a link to the expected value operator wiki. It would probably clutter articles to have detailed explanations of each preceding idea necessary to discuss the current, but it might be a good idea to include wikis that should be understood previous to reading the current wiki. —Preceding unsigned comment added by 141.225.193.194 ( talk) 01:32, 31 January 2011 (UTC)
I agree with a lot of the above, but I think it should be stated explicitly that t is just a dummy variable with no intrinsic meaning. Thomas Tvileren ( talk) 20:52, 15 November 2012 (UTC)
The definition of the n-th moment is wrong: the last equality is identically zero, since the nth derivative of 1 evaluated at t = 0 is always zero. The evaluation bar must be placed at the end, so that it is clear we differentiate M_X(t) n times and then evaluate at zero.
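The point about the placement of the evaluation bar can be illustrated with a small SymPy sketch (using the exponential distribution, whose MGF lam/(lam − t) is standard): differentiate first, then substitute t = 0.

```python
import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)
M = lam / (lam - t)  # MGF of Exponential(lambda), valid for t < lambda

# Differentiate n times FIRST, then evaluate at t = 0:
for n in range(1, 4):
    moment = sp.simplify(sp.diff(M, t, n).subs(t, 0))
    print(n, moment)  # n-th raw moment: factorial(n) / lambda**n
```

Substituting t = 0 before differentiating would give M(0) = 1, a constant, whose derivatives are all zero, which is exactly the error the comment describes.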
Please provide a few examples, e.g. for a Gaussian distribution.
I would also like to see some more in the article about some basic properties of the moment-generating function, such as convexity, non-negativity, the fact that M(0) always equals one, and also some other not-so-obvious properties (of which I lack knowledge) indicating what the mgf is used for. -- 130.94.162.64 00:55, 17 June 2006 (UTC)
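On the Gaussian example requested above: a minimal sketch (assuming SymPy's `sympy.stats` module behaves as expected here) derives the MGF of a normal distribution directly from the definition E[e^{tX}], and also confirms the property M(0) = 1 mentioned in the comment.

```python
import sympy as sp
from sympy.stats import Normal, E

t = sp.symbols('t', real=True)
mu = sp.symbols('mu', real=True)
sigma = sp.symbols('sigma', positive=True)

X = Normal('X', mu, sigma)
M = sp.simplify(E(sp.exp(t * X)))  # MGF computed from the definition
print(M)             # equivalent to exp(mu*t + sigma**2*t**2/2)
print(M.subs(t, 0))  # M(0) = 1, always
```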
Also, is it true that "Regardless of whether the probability distribution is continuous or not, the moment-generating function is given by the Riemann-Stieltjes integral"? When you calculate the MGF of a Poisson distribution, X ~ Poisson(lambda), the correct formula is M_X(t) = E[exp(tX)] = sum(exp(tx)*p(x), x = 0 to infinity). This is clearly not an integral. Does the Riemann-Stieltjes integral include summation as well? If not, the quoted statement is wrong and should be removed from the article. —Preceding unsigned comment added by Sjayzzang ( talk • contribs) 20:02, 15 April 2009 (UTC)
We should mention the case when X is a vector of random variables or a stochastic process. Jackzhp 22:29, 3 September 2006 (UTC)
There are a whole bunch of properties of MGFs that it would be nice to include -- e.g. the MGF of a linear transformation of a random variable, MGF of a sum of independent random variables, etc.
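The two properties mentioned above can be checked numerically. A Monte Carlo sketch with NumPy, using a standard normal X so the closed form M_X(t) = exp(t^2/2) serves as the reference (sample sizes and parameter values are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
t, a, b = 0.3, 2.0, 1.0  # arbitrary illustrative values
X = rng.standard_normal(1_000_000)  # N(0,1), whose MGF is exp(t**2 / 2)

def M_std(s):
    return np.exp(s ** 2 / 2)

# Linear transformation: M_{aX+b}(t) = e^{bt} * M_X(at)
print(np.mean(np.exp(t * (a * X + b))), np.exp(b * t) * M_std(a * t))

# Sum of independent variables: M_{X+Y}(t) = M_X(t) * M_Y(t)
Y = rng.standard_normal(1_000_000)
print(np.mean(np.exp(t * (X + Y))), M_std(t) ** 2)
```

In each pair, the empirical average matches the closed form up to Monte Carlo error.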
something should be added about the discrete form of the mgf, no? 24.136.121.150 08:37, 20 January 2007 (UTC)
It would seem a good and obvious thing to include a link to a Wikipedia page that tabulates common moment generating functions (i.e., the moment generating functions for common statistical distributions), placing them online. The information is already there on Wikipedia; it would just be a case of organising it a little better.
Also, there is probably some efficient way in which the set of functions that commonly occur when dealing with statistical distributions could be organised to highlight possible inter-relationships (perhaps some mgfs are nested mgfs, so that the fact that
could be highlighted in a list of mgf interdependencies...).
ConcernedScientist ( talk) 00:47, 18 February 2009 (UTC)
We have a theorem that if two mgfs coincide in some region around 0, then the corresponding random variables have the same distribution. There is, however, a concern that this statement, while true from the point of view of a mathematician, is not so reliable from the point of view of an applied statistician. McCullagh (1954) [1] gives the following example:
with cumulant generating functions
Although the densities are visibly different, their corresponding cgfs are virtually indistinguishable, with a maximum difference of less than 1.34·10⁻⁹ over the entire range. Thus, from a numerical standpoint, mgfs fail to uniquely determine a distribution.
On the other hand, Waller (1995) [2] shows that the characteristic function does a much better job of determining the distribution.
Even from a mathematician's point of view, don't you need the MGF to be infinitely differentiable at 0 for uniqueness? Paulginz ( talk) 14:45, 24 November 2009 (UTC)
This page doesn't seem to explain the purpose of the function. —Preceding unsigned comment added by 129.16.204.227 ( talk) 13:07, 10 September 2009 (UTC)
Thanks. Your edits very much improve the quality of the article. -- 129.16.24.201 ( talk) 08:59, 10 November 2009 (UTC)
Kimaaron ( talk) 21:34, 20 February 2010 (UTC)
The MGF calculates the raw moments, as opposed to the central moments. It already says "moments around the origin" on the page, but perhaps this should be noted explicitly at the end of the introduction ("The moment generating function is so named because of its intimate link to the raw moments") and again in the definition (just inserting "raw"), e.g.: ...where mn is the nth raw moment. Currently the first definition on the moment page is of central moments, so a bit of confusion could occur. /Jens — Preceding unsigned comment added by 203.110.235.1 ( talk) 23:55, 21 January 2013 (UTC)
Under "definitions" further up on this page, there is some concern that the article is not transparent to someone trying to learn about moment generating functions. Michael Hardy writes, "It is written for people who already know what probability distributions are and the standard basic facts about them." While this is true, I think it jumps somewhat between requiring only this basic amount of knowledge and requiring more substantial knowledge. I would imagine many of my students in my intro probability course would find this article impenetrable at enough places that they would give up trying to learn something from it. I would think it would be better to put anything that requires knowledge from courses at a comparable level of difficulty or more advanced than an intro probability course at the end.
Specifically, I think it would improve the article to:
Does anyone feel that these changes would not be an improvement? Barryriedsmith ( talk) 14:23, 6 October 2015 (UTC)
The comment(s) below were originally left at Talk:Moment-generating function/Comments, and are posted here for posterity. Following several discussions in past years, these subpages are now deprecated. The comments may be irrelevant or outdated; if so, please feel free to remove this section.
Is it true that "Regardless of whether the probability distribution is continuous or not, the moment-generating function is given by the Riemann-Stieltjes integral"
When you calculate the MGF of a Poisson distribution, X ~ Poisson(lambda), the correct formula is M_X(t) = E[exp(tX)] = sum(exp(tx)*p(x), x = 0 to infinity). This is clearly not an integral. Does the Riemann-Stieltjes integral include summation as well? I think the article qualifies for B rating now. Do others agree? Apoorv020 ( talk) 19:49, 11 October 2009 (UTC)
Last edited at 19:49, 11 October 2009 (UTC). Substituted at 20:07, 1 May 2016 (UTC)
Consider
under "Other Properties". The estimate
is simply false for many values of k and t (k fixed, t large). 178.38.132.48 ( talk) 18:00, 4 December 2017 (UTC)
The article claims that the expectation must exist in order for the MGF to exist. This is not true. The expectation can exist (e.g., for the Cauchy or log-normal distribution, where it is positive infinite) but not be finite, and then there is no MGF.
Also, the link in the Cauchy example to Indeterminate form is wrong, since for the Cauchy distribution the expectation of e^tX is positive infinite. There are no indeterminate issues here. — Preceding unsigned comment added by 109.186.33.244 ( talk) 14:44, 30 November 2021 (UTC)
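The claim that E[e^{tX}] is positive infinite for the Cauchy distribution can be illustrated numerically: the truncated integral of e^{tx} times the Cauchy density grows without bound as the truncation point increases. A rough sketch (midpoint rule; the step count and cutoffs are arbitrary illustrative choices):

```python
import math

t = 1.0

def truncated_mgf(R, n=100_000):
    """Midpoint-rule approximation of the integral of
    e^{t x} / (pi (1 + x^2)) over [-R, R]."""
    h = 2.0 * R / n
    total = 0.0
    for k in range(n):
        x = -R + (k + 0.5) * h
        total += math.exp(t * x) / (math.pi * (1.0 + x * x))
    return total * h

# The truncated expectation keeps growing as the cutoff R increases,
# consistent with E[e^{tX}] diverging to +infinity for the Cauchy law:
for R in (10, 20, 30):
    print(R, truncated_mgf(R))
```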