This is the talk page for discussing improvements to the Big O notation article. This is not a forum for general discussion of the article's subject.
Article policies
Find sources: Google (books · news · scholar · free images · WP refs) · FENS · JSTOR · TWL
Archives: 1, 2, 3, 4 · Auto-archiving period: 90 days
This level-5 vital article is rated B-class on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:
What's "sup" in the "Limit Definitions" of "Big Oh" and "Big Omega (HLw)" at #Family of Bachmann–Landau notations? Yashpalgoyal1304 ( talk) 13:40, 6 June 2022 (UTC)
It's limsup, and searching for it leads to Limit inferior and limit superior, which, on researching this page itself, is also mentioned in the sections #Formal definition and #See also. Yashpalgoyal1304 ( talk) 13:55, 6 June 2022 (UTC)
It seems, from the opening of this section, that it is meant to discuss the fact that some computer scientists use O where Θ should be used instead (i.e., for a tight bound); I'm not sure it is desirable to discuss improper usages in the article. At any rate, the rest of the section does not match its beginning, as it illustrates the proper use. It should be edited, perhaps removed. AmirOnWiki ( talk) 14:07, 30 June 2023 (UTC)
"Determining if a binary number is even or odd" isn't a particularly useful example, because there will be only one number being tested. Finding the median value of a sorted list of measurements is a better example. SlowJog ( talk) 02:32, 29 October 2023 (UTC)
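For concreteness, here is a minimal Python sketch of the suggested example (the function name `median_sorted` is hypothetical); it shows that, once the list is sorted, the median is found with a constant amount of indexing and arithmetic, i.e. in O(1):

```python
def median_sorted(values):
    """Median of an already-sorted list: O(1) indexing and arithmetic."""
    n = len(values)
    mid = n // 2
    if n % 2 == 1:
        return values[mid]                     # odd length: middle element
    return (values[mid - 1] + values[mid]) / 2  # even length: mean of middle pair

print(median_sorted([1, 3, 5]))      # 3
print(median_sorted([1, 3, 5, 10]))  # 4.0
```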
My note on this topic disappeared, but I still think there is a common misinterpretation here. Suppose you have an algorithm which calls a method of a data structure \Theta(n) times. The asymptotic complexity of the method is described as O(\log s), where s is the size of the data structure. The actual use of the data structure is such that s never exceeds 1 (a weird case). If you simply multiply the number of method calls by their complexity, you obtain the bound 0.
There are several ways to fix this wrong calculation ... I recommend considering the minimal complexity of a call to be 1, which changes the result to \Theta(n). I do not think we would change all complexity descriptions to O(1+\log s) ..., but one should know about this problem of using asymptotics when the size does not go to infinity. Hippo.69 ( talk) 13:50, 12 December 2023 (UTC)
Hippo.69 ( talk) 21:04, 14 December 2023 (UTC)