Mathematics desk
< July 4 | July 6 >
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
This problem is from Spivak's Calculus (4th ed.). Suppose f is a continuous function on [0,1] with the property that f(0) = f(1), and let n be any natural number. The first part requires the reader to prove that the equation f(x) = f(x + 1/n) has at least one root in [0,1]; I managed to do that easily enough, by deriving a contradiction. However, the second part has me stumped: suppose a is in (0,1) and is not equal to 1/n for any natural number n. Find an f with the properties above for which the equation f(x) = f(x + a) has no root in [0,1]. Some help with constructing f, particularly by using the first part of the question, would be great. Thanks in advance. — Anonymous Dissident Talk 06:20, 5 July 2011 (UTC)
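Not an answer from the original thread, but for reference: the function below is one well-known construction for this exercise (the specific choice of f is an assumption, not taken from the post). The key algebraic fact is that sin² has period π, so f(x + a) − f(x) collapses to the constant −a·sin²(π/a), which is nonzero precisely because 1/a is not an integer. The sketch checks this numerically:

```python
import math

def make_f(a):
    # One well-known construction for this exercise (an assumption here,
    # not from the thread): f(x) = sin^2(pi*x/a) - x*sin^2(pi/a).
    # Because sin^2 has period pi, f(x+a) - f(x) = -a*sin^2(pi/a),
    # a constant that is nonzero when 1/a is not an integer.
    s = math.sin(math.pi / a) ** 2
    return lambda x: math.sin(math.pi * x / a) ** 2 - x * s

a = 0.3          # lies in (0,1) and is not of the form 1/n
f = make_f(a)

print(f(0.0), f(1.0))        # both are 0, so f(0) = f(1)
# f(x+a) - f(x) should never vanish for x in [0, 1-a]:
diffs = [f(x + a) - f(x) for x in (i / 100 for i in range(71))]
print(min(abs(d) for d in diffs))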
Could someone explain in a succinct way the difference between these approaches? Is one a subset of the other?
In particular, I have two large sets of variables X and Y; I know Y to be linearly related to X, and I suppose the error to be distributed as a Gaussian. I know I can use simple linear regression here to determine the intercept and slope. MLE is quite new to me, so I apologise if I'm thinking about it stupidly. In this case it 'feels' like they're related, but it also feels like there must be some distinction that causes one to be preferred over the other. -- Iae ( talk) 11:31, 5 July 2011 (UTC)
@Iae: You seem to imagine that MLE and regression are two approaches to something. "Regression" is a vague and general term that means estimation of the population mean or population median or similar location parameter, of one variable, conditional on the value of another variable, and all based on a sample. Maximum likelihood estimation is estimation in problems involving parametrized families of probability distributions. In some situations, least-squares estimates correspond exactly with maximum likelihood estimates; in others maximum likelihood makes no sense because there is no parametrized family of probability distributions, but regression is still done. Michael Hardy ( talk) 05:23, 7 July 2011 (UTC)
I think it's more accurate to say that MLE is a generalizable approach to estimation, and regression is an estimation procedure for determining the impact of one or more variables on a dependent variable. The results of a linear regression using ordinary least squares are equivalent to those you would get using MLE. But this is not true of other regression procedures; and MLE can be used to derive estimators for many quantities whose relationship cannot be characterized through regression. 12.186.80.1 ( talk) 18:49, 7 July 2011 (UTC) David
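To illustrate the equivalence noted above: under the Gaussian error model, the log-likelihood of the linear model differs from the (negated, scaled) residual sum of squares only by terms not involving the coefficients, so the closed-form OLS estimates also maximize the likelihood for any fixed error variance. A quick numerical sketch, where the simulated data and the true coefficients 2 and 3 are purely illustrative assumptions:

```python
import math
import random

random.seed(0)

# Simulated data: y = 2 + 3x + Gaussian noise (illustrative values only)
xs = [i / 10 for i in range(50)]
ys = [2.0 + 3.0 * x + random.gauss(0, 0.5) for x in xs]

n = len(xs)
xbar = sum(xs) / n
ybar = sum(ys) / n

# Closed-form OLS estimates of intercept (b0) and slope (b1)
b1 = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
     sum((x - xbar) ** 2 for x in xs)
b0 = ybar - b1 * xbar
print(b0, b1)   # close to the true 2 and 3

def neg_log_lik(beta0, beta1, sigma):
    # Gaussian negative log-likelihood of the linear model
    rss = sum((y - beta0 - beta1 * x) ** 2 for x, y in zip(xs, ys))
    return n * math.log(sigma * math.sqrt(2 * math.pi)) + rss / (2 * sigma ** 2)

# For any fixed sigma, the OLS solution minimizes the negative
# log-likelihood: perturbing the coefficients only makes it worse.
sigma = 0.5
best = neg_log_lik(b0, b1, sigma)
for d0 in (-0.1, 0.1):
    for d1 in (-0.1, 0.1):
        assert neg_log_lik(b0 + d0, b1 + d1, sigma) > best
```

This is exactly why the two answers above agree: the equivalence holds for ordinary least squares with Gaussian errors, but a different error distribution (or a different regression procedure) breaks it.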
Does anyone know of a reference that presents a clear account of how to obtain a Brownian motion on a Riemannian manifold as a scaling limit of random walks? Sławomir Biały ( talk) 13:38, 5 July 2011 (UTC)