Mathematics desk
< November 16 | << Oct | November | Dec >> | Current desk >
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is a transcluded archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
If I have a bunch of points (x_i, y_i), with the x_i known exactly but the y_i subject to some error, and if I want to fit a straight line through the points, minimizing the square of each y error (either because I like the way Δy² takes the absolute value for me, or because I want outliers to have more weight), I can of course use the classic linear regression technique, which will actually spit out coefficients for me immediately, in closed form. Similarly, if I want something other than a straight-line fit, I can do things like taking log(y_i) before fitting, or do a polynomial regression.
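The closed-form fit described above can be sketched in a few lines; the data arrays here are made up for illustration, and only NumPy is assumed.

```python
# A minimal sketch of the closed-form least-squares line fit:
# minimize the sum of squared y-errors over slope and intercept.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])  # roughly y = 2x + 1, with noise

# Standard normal-equation solution for a straight line:
n = len(x)
slope = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
intercept = (np.sum(y) - slope * np.sum(x)) / n
```

The same closed-form idea extends to polynomial regression (e.g. `numpy.polyfit`), since the model is still linear in its coefficients.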
But what if I want something much more general? What if the y(x) function I'm trying to fit is arbitrary, perhaps with arbitrarily many coefficients? What if I want to define my own error function, perhaps taking Δx into account as well? Finally, what if I don't insist on closed-form output, but am willing to search, to iterate?
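The iterative "search" approach asked about here can be sketched by handing an arbitrary model plus an arbitrary error function to a general-purpose minimizer; the model, data, and starting guess below are all invented for illustration, and SciPy's `scipy.optimize.minimize` is assumed to be available to do the searching.

```python
# Sketch: fit an arbitrary model with a user-defined error function
# by minimizing the total error numerically (no closed form needed).
import numpy as np
from scipy.optimize import minimize

x = np.linspace(0.0, 4.0, 20)
y = 3.0 * np.exp(-0.7 * x) + 0.05 * np.cos(5 * x)  # exponential plus a small ripple

def model(params, x):
    a, b = params
    return a * np.exp(b * x)

def total_error(params):
    # Any error function works here: squared error, absolute error,
    # or one that weighs Δx too. This one is plain squared y-error.
    return np.sum((y - model(params, x)) ** 2)

# Nelder-Mead is a derivative-free search; start from a rough guess.
result = minimize(total_error, x0=[1.0, -1.0], method="Nelder-Mead")
a_fit, b_fit = result.x
```

Because the minimizer only ever calls `total_error`, swapping in a different model or a different error measure requires no new mathematics, just a new function; the trade-off is that the search may find a local rather than global minimum, so the starting guess matters.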
I'm sure there are well-studied ways of doing this, I just don't know what they're called. I could try to write my own program to do the searching, but there are probably extant ones out there that already work well.
I know about linear programming but I'm looking for something more general than that, too, because I'm not interested in limiting myself to linear functions. — Steve Summit ( talk) 03:08, 17 November 2020 (UTC)