This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Archive 1 |
A more general question that came up: this page is about global optimization. I propose to merge Optimization (mathematics) and global optimization and store the result at global optimization. The entry Optimization (mathematics) should be changed to give links to global optimization and local optimization and describe the difference (for example, NP-completeness for most GO problems, while local optimization is more or less a topic in numerics with standard algorithms like conjugate gradient, etc.).
I strongly argue in this direction, as optimization has to be distinguished according to the goals one pursues.
I disagree with many points of view expressed in this comment. First, a clear distinction must be made between the concept of an optimization problem and the capabilities and limitations of the numerical methods available to solve it. An optimization problem and its optimal solution are defined precisely as at the beginning of the article. Rigorously, the criteria that define a locally optimal solution do not necessarily meet the requirements for it to be regarded as an optimal solution, although many numerical methods have the strong limitation of not being able to guarantee rigorous (or global) optimality, and treat local solutions as de facto solutions to the problem. In other words, a locally optimal solution is not necessarily the actual solution to the problem, but it is the best we can get using common NLP and MINLP solvers. Global optimization, on the other hand, is a field of study that aims at the development of deterministic algorithms capable of guaranteeing convergence in finite time to the actual optimal solution of a non-linear (and non-convex) problem. Therefore, global optimization designates a set of methodologies and algorithms designed to obtain a proven optimal solution to an optimization problem; it is not, by any means, the definition of the problem at hand. -- Jonnat 20:30, 17 March 2006 (UTC)
This article seems to be about mathematical programming, which is not the same as optimization but one type of it. Am I right? Should it be moved to mathematical programming then? I mean: the Lagrange multiplier is not even mentioned. And the convex optimization section assumes that there exists a local minimum within the constraints. I am not an expert in this field, so... what do you think? -- Rade
I would suggest major modifications to the notations section of the article. Specifically, I suggest replacing the entire section containing the illustrative examples with a more rigorous exposition of general instances of mathematical programming problems. From the mathematical representation of the general problem, most "subfields" specified in the next section can be derived by specifying simple characteristics of the functions in the general model, such as linearity or convexity.
Moreover, the remaining sections also require major alterations. The information under the "techniques" section only concerns methods for solving non-linear programming problems. The criterion for a local minimum to also be a global one is not only the convexity of the objective function but also that of the feasible region. The other popular methods linked there are considered heuristic methods and are commonly regarded as separate from deterministic optimization procedures. -- Jonnat 21:38, 17 March 2006 (UTC)
Below are the discussed suggestions. The next section would replace the current sections of “notations” and “major subfields”. --Jonnat 16:32, 18 April 2006 (UTC)
I always thought that logic programming is a special case of mathematical programming (where the search is for a model that satisfies a given theory), but the page on logic programming is written from a different viewpoint, as if it were just another programming paradigm. I wonder if I am correct; maybe the logic programming page could be improved by people here. Samohyl Jan 16:58, 14 June 2006 (UTC)
I support the comments above about placing the more technical content later in the article, and therefore propose a short introduction quickly followed by an expanded section on Uses. The ones given at the moment seem relatively specialized. It's important to mention that optimization lies at the heart of traditional (neo-classical) economics, and works out the consequences of a major hypothesis about human behaviour (i.e. that people make optimizing choices). One should also mention its role in statistical estimation – least-squares methods, maximum likelihood.
After Uses, History would be relatively non-technical and a good rationale for the more technical content that follows. Indeed, the history would provide a good way to organize the technical content.
One also needs to mention, or link to, extreme-value functions and duality theory.
Rwb001 16:07, 16 October 2006 (UTC)
Hi. AFAIK mathematical programming is completely unrelated to computer programming, but the article is a bit ambiguous in this regard. I've not changed it since I'm not truly sure... Jabbba ( talk) 21:27, 3 February 2008 (UTC)
This article currently states: Constrained problems can often be transformed into unconstrained problems with the help of Lagrange multipliers. This statement is misleading in this article. Lagrange multipliers create saddle points, not stable optima, where the constraints are satisfied. Since (nearly?) all of the optimization techniques listed in this article seek stable optima, the implied procedure (use Lagrange multipliers to create an unconstrained problem, and then solve with your favorite optimization technique) will not work--the optimizer will just converge to values of +/- infinity. If there is a known way to use Lagrange multipliers to create an unconstrained optimization problem that is suitable for these optimization techniques, then more details are needed. —Preceding unsigned comment added by 128.187.80.2 ( talk) 18:40, 28 May 2009 (UTC)
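The saddle-point behaviour described above is easy to verify on a toy problem (my own made-up instance, not taken from the article): minimize f(x, y) = x² + y² subject to x + y = 1. The Lagrangian's stationary point satisfies the constraint, but it is a saddle of the Lagrangian, not a minimum, which is why naive unconstrained minimization of the Lagrangian fails:

```python
# Toy problem: minimize f(x, y) = x**2 + y**2 subject to x + y = 1.
# The Lagrangian is L(x, y, lam) = x**2 + y**2 - lam * (x + y - 1),
# with stationary point (x, y, lam) = (1/2, 1/2, 1).
# The Hessian of L with respect to (x, y, lam) is:
H = [[2, 0, -1],
     [0, 2, -1],
     [-1, -1, 0]]

def quad_form(v):
    """Evaluate v^T H v for the 3x3 Hessian above."""
    return sum(v[i] * H[i][j] * v[j] for i in range(3) for j in range(3))

# The quadratic form takes opposite signs on these two directions,
# so H is indefinite: the stationary point is a saddle, not an optimum.
print(quad_form([1, 0, 0]))   # positive along this direction
print(quad_form([1, 1, 2]))   # negative along this direction
```

So a plain descent method applied to the Lagrangian will not settle at the constrained optimum, which matches the objection raised in the comment.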
Hi. I miss there the link to the high-level modeling language 'lpl', which was designed and developed at the University of Fribourg over the last 20 years and is one of the leading (mathematical) modeling languages by all means. Crisgh ( talk) 15:12, 29 October 2009 (UTC)
User Bonnans would like his books added to this article's reference section. I'm not familiar with these books (nor whether they were used in editing this article). Jwesley 78 18:07, 21 March 2010 (UTC)
Some authors edit these important Wikipedia pages and insert their recent methods, which are not verified in the literature. For example, someone has put his algorithm, called cuckoo, in all the pages about optimization and evolutionary computation. I think Wikipedia is not a tool for advertising methods and doing science marketing. If we are going to list minor algorithms like these in the list of important optimization methods, this list should include more than 100 algorithms.
Let's use Wikipedia to share knowledge, not to highlight our own work. —Preceding unsigned comment added by 74.194.214.87 ( talk) 22:59, 23 November 2010 (UTC)
I think there should be a section in this article considering the case when at least one of the following components changes over time: (a) the objective functions; (b) the problem instance; or (c) the variable constraints. Obviously, many real-world optimization problems share this property of temporal change. If no one objects to this proposal, I'll create a section on dynamic optimization in the following weeks. —Preceding unsigned comment added by Crbazevedo ( talk • contribs) 13:20, 3 December 2010 (UTC)
- Stochastic programming studies the case in which some of the constraints or parameters depend on random variables.
- Infinite-dimensional optimization studies the case when the set of feasible solutions is a subset of an infinite-dimensional space, such as a space of functions.
- Trajectory optimization is the specialty of optimizing trajectories for air and space vehicles.
- In a number of subfields, the techniques are designed primarily for optimization in dynamic contexts (that is, decision making over time):
- Calculus of variations seeks to optimize an objective defined over many points in time, by considering how the objective function changes if there is a small change in the choice path.
- Optimal control theory is a generalization of the calculus of variations.
- Dynamic programming studies the case in which the optimization strategy is based on splitting the problem into smaller subproblems. The equation that describes the relationship between these subproblems is called the Bellman equation.
This article does a pretty good job of providing an advanced overview of optimization methods, but it seems to me it's missing an example section, perhaps between sections 3 and 4. Just work through a simple example of minimizing (or maximizing) a smooth convex (or concave) function by calculating the first-order condition associated with the optimum. And show a graph. And explain how the second-order condition verifies optimality. I will do this myself if I find time, but it would be great if someone beat me to it. Rinconsoleao ( talk) 10:51, 12 May 2011 (UTC)
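A sketch of the kind of worked example being proposed (the function f(x) = (x − 3)² + 1 is my own arbitrary choice; a real example section would also include the graph):

```python
# Minimize the smooth convex function f(x) = (x - 3)**2 + 1.
# First-order condition:  f'(x) = 2*(x - 3) = 0, so x* = 3.
# Second-order condition: f''(x) = 2 > 0, so x* is indeed a minimum.
# Below we verify both numerically with finite differences.

def f(x):
    return (x - 3) ** 2 + 1

def dfdx(x, h=1e-6):
    """Central-difference estimate of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

def d2fdx2(x, h=1e-4):
    """Central-difference estimate of f''(x)."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / h ** 2

# Solve the first-order condition f'(x) = 0 by bisection on [0, 10],
# where f' changes sign.
lo, hi = 0.0, 10.0
for _ in range(60):
    mid = (lo + hi) / 2
    if dfdx(lo) * dfdx(mid) <= 0:
        hi = mid
    else:
        lo = mid

x_star = (lo + hi) / 2
print(round(x_star, 6))      # close to 3.0
print(d2fdx2(x_star) > 0)    # second-order condition confirms a minimum
```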
I recently did some changes to section "Computational optimization techniques" that Kiefer Wolfowitz reverted. Please see the comparison between my version and the current one (my version is on the left). Here's my main edit summary:
"Every search method is an algorithm. And every search method is iterative or uses an iteration in some step (even the Simplex algorithm). Thus, this classification needs to be rewritten."
In the current version, optimization techniques are classified as:
That's sloppy terminology. Both the "iterative" and the heuristic methods are algorithms (indeed the main article for "Heuristics" is called Heuristic algorithms), and obviously all of them have a "finite number of steps" (even if some of these steps are iterated).
All the optimization techniques are search algorithms, and the link to the relevant article is not even provided in the current version. The very important distinction between techniques for global optimization and techniques which can only search for local extrema is not even mentioned in the current version.
Moreover, the phrase "algorithms that terminate in a finite number of steps", which is currently used to define one of the three types of optimization techniques is misleading. All algorithms, by definition, have a finite number of steps. If they are iterative algorithms, some of these steps are repeated again and again. And no search method, not even the most complex one, is supposed to have an infinite number of steps or an infinite number of iterations.
My edits may not be perfect (I was stopped before I could complete them), but I think they are a much better starting point than the current version.
Paolo.dL ( talk) 11:17, 4 August 2011 (UTC)
I confess to being a little bit confused by all this. The page (either version) doesn't actually give an explicit definition of "algorithm" (and the algorithm page leaves room for ambiguity), so I'm not sure which established definition is being disputed here. It strikes me that the distinction being made in the current version between "algorithms" and "iterative methods" is really a distinction between methods that give an exact answer in finite time (e.g. the simplex algorithm), and methods that give successively better approximations at each iteration (e.g. Newton's method). The distinction is real, but I agree that it could be described in a better way. Jowa fan ( talk) 13:50, 4 August 2011 (UTC)
Algorithm#By implementation explains that algorithms may be recursive or iterative, ... deterministic or non-deterministic (including heuristic algorithms) and even exact or approximate ("While many algorithms reach an exact solution, approximation algorithms seek an approximation that is close to the true solution. Approximation may use either a deterministic or a random strategy"). "Performance guarantee" is not listed as a criterion for classification. Moreover, according to Algorithm characterizations, both Knuth (1968, 1973) and Stone (1972) define an algorithm as a program that "must terminate in a finite number of steps... a reasonable number" (see also note 11 in Algorithm). This definition is repeated in this article, at the beginning of the section we are discussing, and even Kiefer supports it (see his comments on my talk page). Since no optimization technique is supposed to run forever (i.e. all of them are supposed to terminate according to precise rules or "stopping criteria", including a threshold on the number of iterations), I am convinced that in this context the terms "algorithm", "technique" and "method" are equivalent. In other words, the only optimization techniques that are not algorithms are those with bugs.
Based on these considerations, and according to what Jowa fan suggested, I propose using this classification:
Paolo.dL ( talk) 22:53, 4 August 2011 (UTC)
Thank you for the interesting discussion about the strict definition of the term "algorithm". This definition, however, is not explained in this article, and a reader (even an experienced one) is not supposed to know it. What a reader would be glad to know is that some techniques, as Jowa fan noted, give an exact answer, while others do not. Also, some can only find local extrema (and hence are not always appropriate for non-convex or non-concave goal functions), while others can find global extrema. Conversely, the current version explains that the techniques classified as "algorithms" terminate in a finite number of steps. It follows that either
Isn't that a perfect example of a sloppy classification? I already explained this, and I believe that my explanation was simply ignored. Paolo.dL ( talk) 19:59, 10 August 2011 (UTC)
This article is not the typology for Computing Reviews. Rather, it is supposed to inform the public. Therefore, the article should emphasize the most common problems, algorithms, and methods that are used in basic courses and scientific/computational practice. We should agree to follow a leading textbook's organization. Such an organization must be a compromise, of course.
Such an effort should ideally be led by a person who has moved a mathematics/computer science article to "good article" status on English Wikipedia (or by researchers of international standing, like several Wikipedians).
Kiefer.Wolfowitz 14:16, 6 August 2011 (UTC)
There is an ongoing debate about the scope of convex optimization (or convex programming) on Talk:Convex optimization. Additional views are welcome. Isheden ( talk) 21:19, 3 February 2012 (UTC)
Conjugate gradient is an iterative method to solve a system of linear equations where the system matrix is symmetric and positive-definite, so I don't think it's a good idea to list it among the iterative methods for optimization, although many optimization problems can be converted into solving linear equations. — Preceding unsigned comment added by Chaohuang ( talk • contribs) 02:10, 14 June 2012 (UTC)
There is a Wikipedia article called Nonlinear programming. It is much shorter than this article. I think it covers a subtopic of this article's subject, but a large one, probably covering the majority of cases in science and engineering. It seems more simplified and didactically oriented than this one, so there may be advantages to its being separate, but it is odd organization for an article to contain more information on a subtopic than the article on that subtopic does. (I myself am new to these articles but not to this subject.) David R. Ingham ( talk) 00:53, 21 February 2013 (UTC)
Is it true that combinatorial optimization is considered a subfield of mathematical optimization? In optimization (disambiguation), mathematical optimization is defined as "the theory and computation of extrema or stationary points of functions". This would be in line with the definition in note 1: "Mathematical programming is the study or use of the mathematical program" and "A mathematical program is an optimization problem of the form: Maximize f(x): x in X, g(x) <= 0, h(x) = 0, where..." Everything else on this page seems to be concerned with optimization of continuous functions, so why is combinatorial optimization suddenly mentioned as a subfield? Isheden ( talk) 11:56, 25 April 2012 (UTC)
The comment(s) below were originally left at Talk:Mathematical optimization/Comments, and are posted here for posterity. Following several discussions in past years, these subpages are now deprecated. The comments may be irrelevant or outdated; if so, please feel free to remove this section.
Lead needs to be an overview with more details in the body. History needs expansion and citation. Lists need more structure and explanation. Geometry guy 10:29, 28 May 2007 (UTC) |
Substituted at 18:30, 17 July 2016 (UTC)
Statistics is not quite like mathematics. The mathematical theory of extreme values cannot simply be applied in statistics. This is because an optimizer constructed from sample data is a random variable, and the extreme value of the optimizer (minimum or maximum) cannot be more significant than its other values. We should take the expectation of the optimizer when making statistical decisions, e.g. in model selection.
I think the value of an optimizer is a random variable. If the minimum or maximum of a random variable is no more significant than other sampling points, why can the minimum or maximum of an optimizer do more than its other values in a statistical analysis? People might say that an optimizer will converge to its minimum or maximum. This is wrong, since a random variable does not converge to its extreme value but only to its expectation. Yuanfangdelang ( talk) 21:02, 30 August 2016 (UTC)
Hello fellow Wikipedians,
I have just modified 3 external links on Mathematical optimization. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:
When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.
This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).
Cheers.— InternetArchiveBot ( Report bug) 14:47, 5 June 2017 (UTC)
Hello fellow Wikipedians,
I have just modified one external link on Mathematical optimization. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:
When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.
This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).
Cheers.— InternetArchiveBot ( Report bug) 01:33, 27 July 2017 (UTC)
The first paragraph seems to be rather badly written in how it formats the proper name of this discipline.
104.228.101.152 ( talk) 15:19, 30 August 2017 (UTC)
Hello fellow Wikipedians,
I have just modified one external link on Mathematical optimization. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:
When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.
This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).
Cheers.— InternetArchiveBot ( Report bug) 11:24, 21 January 2018 (UTC)
As far as I can see, the book of Boyd and Vandenberghe cannot be downloaded from the website, in contradiction to the recent edit. Did I perhaps miss the correct link? -- Jitse Niesen 12:22, 8 Nov 2004 (UTC)
The book can be downloaded from http://www.stanford.edu/~boyd/bv_cvxbook.pdf. Right?
The "Computational optimization techniques" section presents three categories of techniques: finite-termination "optimization algorithms", convergent "iterative methods", and heuristics. The iterative category is further broken down into "Hessian", "gradient" and "function values" categories.
Having this kind of nomenclature/categorization clearly defined is great and very important! There are a few points that I think could be improved to make this section really useful:
- There are no references given where a reader can see the categories explained in more detail, or to support this categorization scheme. Is this categorization standard in the field? Can references be added to support this?
- It is confusing how "Global Convergence" fits into the categories. Are all algorithmic and iterative methods local, and all heuristics global?
- Isn't the content of the Metaheuristic page more relevant to this topic than the Heuristic algorithm page? What is the difference in terminology here?
Maybe someone with a broad scope of knowledge in the field can weigh in on this and clarify the terminology/categorization of the techniques? ReadEuler ( talk) 23:45, 21 March 2021 (UTC)
module 1 120.29.69.185 ( talk) 07:22, 22 September 2021 (UTC)
This article doesn't mention that most (if not all) physical theories are expressed as optimization problems. My suspicion is that optimization problems are the reverse of dynamic problems (Dirichlet problems). 41.190.245.207 ( talk) 23:11, 18 December 2021 (UTC)
Plan 102.132.225.41 ( talk) 15:13, 30 May 2022 (UTC)
There are a fair number of optimization problems where the optimum occurs when two quantities are equal. For example, maximum power transfer occurs when the load impedance equals the source impedance. The actual reason I am writing this is that there is no article on Kelvin's law for conductor size, where the optimum is when the annual cost of power loss equals the annual cost of the power line. But there are so many problems that have a similar form, or at least close enough, that there should be a page for that. Gah4 ( talk) 07:40, 2 June 2022 (UTC)
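The maximum power transfer example can be checked numerically (resistive case only; the voltage and resistance values below are arbitrary):

```python
# With source voltage V and source resistance Rs, the power delivered
# to a resistive load RL is
#     P(RL) = V**2 * RL / (Rs + RL)**2,
# and it peaks exactly where the two quantities are equal, RL = Rs.

V = 10.0   # volts (arbitrary)
Rs = 50.0  # ohms  (arbitrary)

def power(RL):
    return V ** 2 * RL / (Rs + RL) ** 2

# Sweep the load over a grid and pick the best point.
grid = [k / 10 for k in range(1, 2001)]  # 0.1 .. 200.0 ohms
best = max(grid, key=power)
print(best)  # 50.0, i.e. the optimum is at RL = Rs
```

The same "optimum where two quantities balance" structure shows up in Kelvin's law and many other engineering trade-offs, which is the point of the comment above.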
One thing I hoped to find in the article is the answer to this question:
Was it clearly either Newton or Leibniz, before the other one knew of it?
Or did they both claim priority on that point in their reported quarrel about who did what first?
Or, perhaps history just doesn't know the answer.
But there must be an earliest known publication of that method of finding maxima and minima!
(This optimization technique, which every first-semester calculus student learns, has undoubtedly saved the world many billions of dollars, and I suspect I am underestimating.)
If anyone knows the answer, it would be good to include that information in the article. 2601:200:C000:1A0:D0B:17BA:E5BF:AD9F ( talk) 23:40, 20 June 2022 (UTC)
The page on Programming paradigm says that "mathematical" programming is a type of " declarative programming" "in which the desired result is declared as the solution of an optimization problem", and the link on the word "mathematical" links here because it is a link to " Mathematical programming", which redirects to here. As far as I can tell, this page doesn't give any indication or examples of how a fully fledged programming language could be made up of solving optimization problems, though. (Here, by "fully fledged" I'm thinking "Turing complete/equivalent", but I have to admit that it doesn't say that on the "programming paradigm" page, and the word "programming language" can refer to languages that are not Turing complete, even if it usually doesn't.) The closest thing I could find was the "Machine learning" section under "Applications", which consists of a link to Machine learning#Optimization. DubleH ( talk) 18:22, 15 July 2022 (UTC)
This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Archive 1 |
A more general question that came up: this page is about global optimization. I propose to merge Optimization (mathematics) and global optimization and store it in global optimization. The entry Optimization (mathematics) should be changed to give links to global optimization and local optimization and describe the difference (like NP-completness for most of the GO-problems, local optimization more or less a topic in numerics with standard algorithms like conjugate gradient etc...).
I strongly argue into this direction as optimization has to be distinguished according to the targets that one follows.
I disagree with many points of view expressed in this comment. Initially, a clear distinction between the concept of an optimization problem and the capabilities and limitations of available numerical methods to solve it must be made. An optimization problem and its optimal solution are defined precisely as in the beginning of the article. Rigorously, the criteria that defines a local optimal solution does not necessarily meet the requirements for it to be regarded as a optimal solution, although many numerical methods have the strong limitation of not being able to guarantee rigorous (or global) optimality and treat local solutions as de facto solutions to the problem. In other words, a local optimal solution is not necessarily the actual solution to the problem, but it’s the best we can get using common NLP and MINLP solvers. Global optimization, on the other hand, is a field of study that aims the development of deterministic algorithms that are capable of guaranteeing convergence in finite time to the actual optimal solution of a non-linear (and non-convex) problem. Therefore, Global Optimization defines a set of methodologies and algorithms designed to obtain a proven optimal solution to an optimization problem, and it is not, by any means, the definition of the problem at hand. -- Jonnat 20:30, 17 March 2006 (UTC)
This article seems to be about mathematical programming, which is not the same as but one type of optimization. Am I right? Should it be moved to mathematical programming then? I mean: The Lagrange multiplier is not even mentioned. And the convex optimization section assumes that there exists a local minimum within the constraints. I am not an expert in this field so ... what do you think? -- Rade
I would suggest major modifications on the notations section of the article. Specially, I suggest the replacement of the entire section containing all the illustrative examples by a more rigorous exposition of general instances of mathematical programming problems. From the mathematical representation of the general problem, most “subfields” specified in the next section can be derived from the specification of simple characteristics of the functions in the general model, such as linearity or convexity.
Moreover, the remaining sections also require major alterations. The information under the “techniques” section only regards methods for solving non-linear programming problems. The criteria for a local minimum to be also a global one is not only the convexity of the objective function, but also of the feasible region. The other popular methods linkified are considered heuristic methods and are commonly regarded as a separate from deterministic optimization procedures.-- Jonnat 21:38, 17 March 2006 (UTC)
Below are the discussed suggestions. The next section would replace the current sections of “notations” and “major subfields”. --Jonnat 16:32, 18 April 2006 (UTC)
I always thought that logical programming is a special case of mathematical programming (where the search is for a model that satisfies a given theory), but the page on logical programming is written from a different viewpoint, like it is just another programming paradigm. I wonder if I am correct, maybe the log. progr. page could be improved by people here. Samohyl Jan 16:58, 14 June 2006 (UTC)
I support the comments above about placing the more technical content later in the article, and therefore propose a short introduction quickly followed by an expanded section on Uses. The ones given at the moment seem relatively specialized. It's important to mention that optimization lies at the heart of traditional (neo-classical) economics, and works out the consequences of a major hypothesis about human behaviour (i.e. that people make optimizing choices). One should also mention its role in statistical estimation – least-squares methods, maximum likelihood.
After Uses, History would be relatively non-technical and a good rationale for the more technical content that follows. Indeed, the history would provide a good way to organize the technical content.
One also needs to mention, or link to, extreme-value functions and duality theory.
[[User: Rwb001 16:07, 16 October 2006 (UTC)]]
hi. AFAIK mathematical programming is completely unrelated to computer programming, however the article is a bit ambiguous with regard to this. I've not changed since i'm not truly sure... Jabbba ( talk) 21:27, 3 February 2008 (UTC)
This article currently states: Constrained problems can often be transformed into unconstrained problems with the help of Lagrange multipliers. This statement is misleading in this article. Lagrange multipliers create saddle points, not stable optima, where the constraints are satisfied. Since (nearly?) all of the optimization techniques listed in this article seek stable optima, the implied procedure (use Lagrange multipliers to create an unconstrained problem, and then solve with your favorite optimization technique) will not work--the optimizer will just converge to values of +/- infinity. If there is a known way to use Lagrange multipliers to create an unconstrained optimization problem that is suitable for these optimization techniques, then more details are needed. —Preceding unsigned comment added by 128.187.80.2 ( talk) 18:40, 28 May 2009 (UTC)
Hi. i miss there the link to high-level modeling language 'lpl', which was designed and developed at the University of Fribourg over the last 20 years and it is one of the leading (mathematical) modeling language by all means Crisgh ( talk) 15:12, 29 October 2009 (UTC)
User Bonnans would like his books added to this article's reference section. I'm not familiar with these books (nor whether they were used in editing this article). Jwesley 78 18:07, 21 March 2010 (UTC)
Some authors edit these important Wikipedia pages and insert their own recent methods, which are not verified in the literature, into these articles. For example, someone has put his algorithm, called cuckoo, on all the pages about optimization and evolutionary computation. I think Wikipedia is not a tool for advertising methods and doing science marketing. If we are going to list minor algorithms like these in the list of important optimization methods, the list would have to include more than 100 algorithms.
Let's use Wikipedia to share knowledge, not to highlight our own work. —Preceding unsigned comment added by 74.194.214.87 ( talk) 22:59, 23 November 2010 (UTC)
I think this article should have a section considering the case when at least one of the following components changes over time: (a) the objective functions; (b) the problem instance; or (c) the variable constraints. Obviously, many real-world optimization problems share this property of temporal change. If no one objects to this proposal, I'll create a section on dynamic optimization in the following weeks. —Preceding unsigned comment added by Crbazevedo ( talk • contribs) 13:20, 3 December 2010 (UTC)
- Stochastic programming studies the case in which some of the constraints or parameters depend on random variables.
- Infinite-dimensional optimization studies the case when the set of feasible solutions is a subset of an infinite-dimensional space, such as a space of functions.
- Trajectory optimization is the specialty of optimizing trajectories for air and space vehicles.
- In a number of subfields, the techniques are designed primarily for optimization in dynamic contexts (that is, decision making over time):
- Calculus of variations seeks to optimize an objective defined over many points in time, by considering how the objective function changes if there is a small change in the choice path.
- Optimal control theory is a generalization of the calculus of variations.
- Dynamic programming studies the case in which the optimization strategy is based on splitting the problem into smaller subproblems. The equation that describes the relationship between these subproblems is called the Bellman equation.
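The dynamic-programming idea in the last item above can be sketched with a classic small example (the coin set and target below are illustrative): the minimum-coin change problem, where V(n), the fewest coins summing to n, satisfies the Bellman-style recursion V(n) = 1 + min over coins c of V(n − c), with V(0) = 0, each subproblem being a smaller instance of the same problem:

```python
# Dynamic programming via the Bellman recursion for minimum-coin change.

def min_coins(target, coins):
    INF = float("inf")
    V = [0] + [INF] * target          # V[n] = fewest coins summing to n
    for n in range(1, target + 1):    # solve subproblems smallest-first
        for c in coins:
            if c <= n and V[n - c] + 1 < V[n]:
                V[n] = V[n - c] + 1   # Bellman update over subproblems
    return V[target]

print(min_coins(11, [1, 2, 5]))  # 5 + 5 + 1 -> 3 coins
```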
This article does a pretty good job of providing an advanced overview of optimization methods, but it seems to me it's missing an example section, perhaps between sections 3 and 4. Just work through a simple example of minimizing (or maximizing) a smooth, convex (or concave) function by calculating the first-order condition associated with the optimum. And show a graph. And explain how the second-order condition verifies optimality. I will do this myself if I find time, but it would be great if someone beat me to it. Rinconsoleao ( talk) 10:51, 12 May 2011 (UTC)
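The kind of worked example proposed above might look like this (the function f(x) = x² − 4x + 5 is a hypothetical choice): solve the first-order condition f′(x) = 2x − 4 = 0 to get x* = 2, then check the second-order condition f″(x*) = 2 > 0 to confirm a minimum:

```python
# Worked example: minimize the smooth convex function f(x) = x^2 - 4x + 5.

def f(x):
    return x**2 - 4*x + 5

def f_prime(x):
    return 2*x - 4       # first derivative

def f_double_prime(x):
    return 2.0           # second derivative (constant, positive)

# First-order condition: f'(x*) = 0  =>  x* = 2
x_star = 2.0
assert abs(f_prime(x_star)) < 1e-12

# Second-order condition: f''(x*) > 0 confirms a (here global) minimum
assert f_double_prime(x_star) > 0

print(x_star, f(x_star))  # minimizer 2.0, minimum value 1.0
```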
I recently did some changes to section "Computational optimization techniques" that Kiefer Wolfowitz reverted. Please see the comparison between my version and the current one (my version is on the left). Here's my main edit summary:
"Every search method is an algorithm. And every search method is iterative or uses an iteration in some step (even the Simplex algorithm). Thus, this classification needs to be rewritten."
In the current version, optimization techniques are classified as:
That's sloppy terminology. Both the "iterative" and the heuristic methods are algorithms (indeed the main article for "Heuristics" is called Heuristic algorithms), and obviously all of them have a "finite number of steps" (even if some of these steps are iterated).
All the optimization techniques are search algorithms, and the link to the relevant article is not even provided in the current version. The very important distinction between techniques for global optimization and techniques which can only search for local extrema is not even mentioned in the current version.
Moreover, the phrase "algorithms that terminate in a finite number of steps", which is currently used to define one of the three types of optimization techniques is misleading. All algorithms, by definition, have a finite number of steps. If they are iterative algorithms, some of these steps are repeated again and again. And no search method, not even the most complex one, is supposed to have an infinite number of steps or an infinite number of iterations.
My edits may not be perfect (I was stopped before I could complete them), but I think they are a much better starting point than the current version.
Paolo.dL ( talk) 11:17, 4 August 2011 (UTC)
I confess to being a little bit confused by all this. The page (either version) doesn't actually give an explicit definition of "algorithm" (and the algorithm page leaves room for ambiguity), so I'm not sure which established definition is being disputed here. It strikes me that the distinction being made in the current version between "algorithms" and "iterative methods" is really a distinction between methods that give an exact answer in finite time (e.g. the simplex algorithm), and methods that give successively better approximations at each iteration (e.g. Newton's method). The distinction is real, but I agree that it could be described in a better way. Jowa fan ( talk) 13:50, 4 August 2011 (UTC)
Algorithm#By implementation explains that algorithms may be recursive or iterative, ... deterministic or non-deterministic (including Heuristic algorithms) and even exact or approximate ("While many algorithms reach an exact solution, approximation algorithms seek an approximation that is close to the true solution. Approximation may use either a deterministic or a random strategy"). "Performance guarantee" is not listed as a criterion for classification. Moreover, according to Algorithm characterizations, both Knuth (1968, 1973) and Stone (1972) define an algorithm as a program that "must terminate in a finite number of steps... a reasonable number" (see also note 11 in Algorithm). This definition is repeated in this article, at the beginning of the section we are discussing, and even Kiefer supports it (see his comments on my talk page). Since no optimization technique is supposed to run forever (i.e. all of them are supposed to terminate according to precise rules or "stopping criteria", including a threshold on the number of iterations), I am convinced that in this context the terms "algorithm", "technique" and "method" are equivalent. In other words, the only optimization techniques that are not algorithms are those with bugs.
Based on these considerations, and according to what Jowa fan suggested, I propose using this classification:
Paolo.dL ( talk) 22:53, 4 August 2011 (UTC)
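The point about stopping criteria made above can be sketched concretely (the function minimized, f(x) = x² − 2x, is an illustrative choice): an iterative method becomes a terminating procedure once it is given explicit stopping rules, here a gradient tolerance and an iteration cap:

```python
# Newton's method for minimization, terminating via two stopping criteria.

def newton_minimize(fprime, fsecond, x0, tol=1e-10, max_iter=100):
    x = x0
    for i in range(max_iter):          # stopping criterion 1: iteration cap
        g = fprime(x)
        if abs(g) < tol:               # stopping criterion 2: gradient tolerance
            return x, i
        x -= g / fsecond(x)            # Newton step
    return x, max_iter

# Minimize f(x) = x^2 - 2x, i.e. solve f'(x) = 2x - 2 = 0.
x_star, iters = newton_minimize(lambda x: 2*x - 2, lambda x: 2.0, x0=10.0)
print(x_star, iters)  # one Newton step suffices on a quadratic
```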
Thank you for the interesting discussion about the strict definition of the term "algorithm". This definition, however, is not explained in this article, and a reader (even an experienced one) is not supposed to know it. What a reader would be glad to know is that some techniques, as Jowa fan noted, give an exact answer, while others do not. Also, some can only find local extrema (and hence are not always appropriate for operating on non-convex or non-concave goal functions), while others can find global extrema. Conversely, the current version explains that the techniques classified as "algorithms" terminate in a finite number of steps. It follows that either
Isn't that a perfect example of a sloppy classification? I already explained this, and I believe that my explanation was simply ignored. Paolo.dL ( talk) 19:59, 10 August 2011 (UTC)
This article is not the typology for Computing Reviews. Rather, it is supposed to inform the public. Therefore, the article should emphasize the most common problems, algorithms, and methods that are used in basic courses and scientific/computational practice. We should agree to follow a leading textbook's organization. Such an organization must be a compromise, of course.
Such an effort should ideally be led by a person who has moved a mathematics/computer science article to "good article" status on English Wikipedia (or by researchers of international standing, like several Wikipedians).
Kiefer.Wolfowitz 14:16, 6 August 2011 (UTC)
There is an ongoing debate about the scope of convex optimization (or convex programming) on Talk:Convex optimization. Additional views are welcome. Isheden ( talk) 21:19, 3 February 2012 (UTC)
Conjugate gradient is an iterative method for solving a system of linear equations whose system matrix is symmetric and positive-definite, so I don't think it's a good idea to list it among the iterative methods for optimization, although many optimization problems can be converted into solving linear equations. — Preceding unsigned comment added by Chaohuang ( talk • contribs) 02:10, 14 June 2012 (UTC)
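A minimal sketch of the connection the comment touches on (the 2×2 system below is illustrative): conjugate gradient solves Ax = b for symmetric positive-definite A, which is exactly equivalent to minimizing the quadratic (1/2)xᵀAx − bᵀx, which is why it appears in optimization contexts at all:

```python
# Conjugate gradient for A x = b with symmetric positive-definite A,
# written with plain Python lists to stay self-contained.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(A, x):
    return [dot(row, x) for row in A]

def conjugate_gradient(A, b, tol=1e-12, max_iter=50):
    x = [0.0] * len(b)
    r = [bi - ai for bi, ai in zip(b, matvec(A, x))]  # residual b - A x
    p = r[:]                                          # initial search direction
    rs = dot(r, r)
    for _ in range(max_iter):
        if rs < tol:
            break
        Ap = matvec(A, p)
        alpha = rs / dot(p, Ap)                       # exact line search
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]  # conjugate direction
        rs = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = conjugate_gradient(A, b)
print(x)  # exact solution is (1/11, 7/11)
```

For an n×n system, CG reaches the exact solution in at most n iterations in exact arithmetic, here two.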
There is a Wikipedia article called Nonlinear programming. It is much shorter than this article. I think that it is a subtopic of this article's subject, but a large one, probably covering the majority of cases in science and engineering. It seems to be more simplified and didactically oriented than this one, so there may be advantages to its being separate, but it is an odd organization for an article to contain more information on a subtopic than the article on that subtopic does. (I myself am new to these articles but not to this subject.) David R. Ingham ( talk) 00:53, 21 February 2013 (UTC)
Is it true that combinatorial optimization is considered a subfield of mathematical optimization? In optimization (disambiguation), mathematical optimization is defined as "the theory and computation of extrema or stationary points of functions". This would be in line with the definition in note 1: "Mathematical programming is the study or use of the mathematical program" and "A mathematical program is an optimization problem of the form: Maximize f(x): x in X, g(x) <= 0, h(x) = 0, where..." Everything else on this page seems to be concerned with optimization of continuous functions, so why is combinatorial optimization suddenly mentioned as a subfield? Isheden ( talk) 11:56, 25 April 2012 (UTC)
The comment(s) below were originally left at Talk:Mathematical optimization/Comments, and are posted here for posterity. Following several discussions in past years, these subpages are now deprecated. The comments may be irrelevant or outdated; if so, please feel free to remove this section.
Lead needs to be an overview with more details in the body. History needs expansion and citation. Lists need more structure and explanation. Geometry guy 10:29, 28 May 2007 (UTC)
Substituted at 18:30, 17 July 2016 (UTC)
Statistics is not quite like mathematics. The mathematical theory of extreme values cannot simply be applied in statistics. This is because an optimizer constructed from sample data is a random variable, and the extreme value of the optimizer (minimum or maximum) need not be more significant than its other values. We should take the expectation of the optimizer when making statistical decisions, e.g. in model selection.
I think that an optimizer has the properties of a random variable. If the minimum or maximum of a random variable is not more significant than other sampling points, why should the minimum or maximum of an optimizer do more than its other values in a statistical analysis? People might say that an optimizer will converge to its minimum or maximum. This is wrong, since a random variable does not converge to its extreme value but only to its expectation. Yuanfangdelang ( talk) 21:02, 30 August 2016 (UTC)
Hello fellow Wikipedians,
I have just modified 3 external links on Mathematical optimization. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:
When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.
This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).
Cheers.— InternetArchiveBot ( Report bug) 14:47, 5 June 2017 (UTC)
Hello fellow Wikipedians,
I have just modified one external link on Mathematical optimization. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:
Cheers.— InternetArchiveBot ( Report bug) 01:33, 27 July 2017 (UTC)
The first paragraph seems rather badly written in how it presents the proper name of this discipline.
104.228.101.152 ( talk) 15:19, 30 August 2017 (UTC)
Hello fellow Wikipedians,
I have just modified one external link on Mathematical optimization. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:
Cheers.— InternetArchiveBot ( Report bug) 11:24, 21 January 2018 (UTC)
As far as I can see, the book of Boyd and Vandenberghe cannot be downloaded from the website, in contradiction to the recent edit. Did I perhaps miss the correct link? -- Jitse Niesen 12:22, 8 Nov 2004 (UTC)
The book can be downloaded from http://www.stanford.edu/~boyd/bv_cvxbook.pdf. Right?
The "Computational optimization techniques" section presents three categories of techniques: finitely terminating "optimization algorithms", convergent "iterative methods", and heuristics. The iterative category is further broken down into "Hessian", "gradient" and "function values" subcategories.
Having this kind of nomenclature/categorization clearly defined is great and very important! There are a few points that I think could be improved to make this section really useful:
- There are no references given where a reader can see the categories explained in more detail, or to support this categorization scheme. Is this categorization standard in the field? Can references be added to support this?
- It is confusing how "global convergence" fits into the categories. Are all algorithmic and iterative methods local, and all heuristics global?
- Isn't the content of the Metaheuristic page more relevant to this topic than the Heuristic algorithm page? What is the difference in terminology here?
Maybe someone with a broad scope of knowledge in the field can weigh in on this and clarify the terminology/categorization of the techniques? ReadEuler ( talk) 23:45, 21 March 2021 (UTC)
module 1 120.29.69.185 ( talk) 07:22, 22 September 2021 (UTC)
This article doesn't mention that most (if not all) physical theories are expressed as optimization problems. My suspicion is that optimization problems are the reverse of dynamic problems (Dirichlet problems). 41.190.245.207 ( talk) 23:11, 18 December 2021 (UTC)
Plan 102.132.225.41 ( talk) 15:13, 30 May 2022 (UTC)
There are a fair number of optimization problems where the optimum is reached when two quantities are equal. For example, maximum power transfer occurs when the load impedance equals the source impedance. The actual reason I am writing this is that there is no article on Kelvin's law for conductor size, where the optimum is when the annual cost of power losses equals the annual cost of the power line. But there are so many problems that have a similar form, or at least close enough, that there should be a page for that. Gah4 ( talk) 07:40, 2 June 2022 (UTC)
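The common structure behind the examples above can be sketched numerically (the constants are illustrative, not from any real power line): a total cost C(x) = a/x + b·x, where a/x stands for a loss that falls with size x and b·x for a cost that grows with it. Setting C′(x) = −a/x² + b = 0 gives x* = √(a/b), and at that optimum the two terms are indeed equal:

```python
# Optimum of C(x) = a/x + b*x occurs exactly where the two terms are equal.
import math

a, b = 9.0, 4.0
x_star = math.sqrt(a / b)            # stationary point of C(x)
loss_term = a / x_star               # the falling term at the optimum
cost_term = b * x_star               # the growing term at the optimum
print(x_star, loss_term, cost_term)  # 1.5, 6.0, 6.0 -- terms are equal
```

Both terms equal √(a·b) at the optimum, which is exactly the "optimum where two quantities are equal" pattern the comment describes.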
One thing I hoped to find in the article is the answer to this question:
Was it clearly either Newton or Leibniz before the other one knew this?
Or did they both claim priority on that point in their reported quarrel about who did what first?
Or, perhaps history just doesn't know the answer.
But there must be an earliest known publication of that method of finding maxima and minima!
(This optimization technique, which every first-semester calculus student learns, has undoubtedly saved the world many billions of dollars, and I suspect I am underestimating.)
If anyone knows the answer, it would be good to include that information in the article. 2601:200:C000:1A0:D0B:17BA:E5BF:AD9F ( talk) 23:40, 20 June 2022 (UTC)
The page on Programming paradigm says that "mathematical" programming is a type of "declarative programming" "in which the desired result is declared as the solution of an optimization problem", and the link on the word "mathematical" leads here because "Mathematical programming" redirects to this page. As far as I can tell, though, this page doesn't give any indication or examples of how a fully fledged programming language could be built out of solving optimization problems. (Here, by "fully fledged" I'm thinking "Turing complete/equivalent", but I have to admit it doesn't say that on the "programming paradigm" page, and the word "programming language" can refer to languages that are not Turing complete, even if it usually doesn't.) The closest thing I could find was the "Machine learning" section under "Applications", which consists of a link to Machine learning#Optimization. DubleH ( talk) 18:22, 15 July 2022 (UTC)