This article is rated C-class on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:
An automated Wikipedia link suggester has some possible wiki link suggestions for the article on Andrey Markov, and these have been placed on this page for your convenience.
Tip: Some people find it helpful if these suggestions are shown on this talk page, rather than on another page. To do this, just add {{User:LinkBot/suggestions/Andrey_Markov}} to this page. — LinkBot 10:38, 17 Dec 2004 (UTC)
In the article there are both "Andreyevich" and "Andreevich", which presumably differ only in transliteration. Is there a standard transliteration for this? 67.101.45.65 ( talk) 20:07, 28 June 2008 (UTC)
I suggest rearranging the articles on A. A. Markov and his son in the following way.
The current situation leads to many confusions that are difficult to detect (see "What links here" and look for logic- or hockey-related articles). Of course, the links can be fixed, but such mistakes will almost surely emerge again, as any editor who makes a link to A. A. Markov (meaning Jr.) can even test the link, read the first sentence of the article, "A. A. Markov was a Russian mathematician...", and be satisfied.-- 92.39.161.221 ( talk) 21:51, 4 May 2008 (UTC)
Hi all,
Hierarchical hidden Markov models, of great importance in machine learning in recent years, were apparently pioneered by Markov himself. I have, however, not found primary sources on this.
According to the following, Markov himself and subsequently Norbert Wiener hypothesized models of hierarchical sequences of states:
"The Russian mathematician Andrei Andreyevich Markov (1856 - 1922) built a mathematical theory of hierarchical sequences of states. The model was based on the possibility of traversing the states in one chain, and if that was successful, triggering a state in the next higher level in the hierarchy. Sound familiar? Markov's model included probabilities of each state's successfully occurring. He went on to hypothesize a situation in which a system has such a hierarchy of linear sequences of states, but those are unable to be directly examined-- hence the name hidden Markov models. The lowest level of the hierarchy emits signals, which are all we are allowed to see. Markov provides a mathematical technique to compute what the probabilities of each transition must be based on the observed output. The method was subsequently refined by Norbert Wiener in 1923. Wiener's refinement also provided a way to determine the connections in the Markov model; essentially any connection with too low a probability was considered not to exist. This is essentially how the human neocortex trims connections-- if they are rarely or never used, they are considered unlikely and are pruned away. In our case, ..." Source: How to Create a Mind, Ray Kurzweil.
I have so far located Markov's first paper on chains, his 1907 "Extension of the limit theorems of probability theory to a sum of variables connected in a chain". I have not yet found his hierarchical models.
Does anyone know more about this? Kurzweil himself claims to have pioneered the use of HHMMs in the 80s and 90s. — Preceding unsigned comment added by Robolobster ( talk • contribs) 14:27, 7 February 2013 (UTC)
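For editors unfamiliar with the setup the Kurzweil quote describes, it can be sketched as a tiny hidden Markov model evaluated with the forward algorithm: the states are hidden, only the emissions are observed, and the model assigns a probability to any observed sequence. This is a generic illustration with made-up numbers, not anything from Markov's or Wiener's own papers:

```python
# Illustrative (made-up) two-state hidden Markov model.
A  = [[0.7, 0.3], [0.4, 0.6]]   # transition probabilities between hidden states
B  = [[0.9, 0.1], [0.2, 0.8]]   # emission probabilities P(observation | state)
pi = [0.5, 0.5]                 # initial state distribution

def forward(obs):
    """Total probability of an observation sequence (forward algorithm)."""
    alpha = [pi[s] * B[s][obs[0]] for s in range(2)]
    for o in obs[1:]:
        alpha = [sum(alpha[s] * A[s][t] for s in range(2)) * B[t][o]
                 for t in range(2)]
    return sum(alpha)

print(forward([0, 1, 0]))  # likelihood of observing emissions 0, 1, 0
```

Fitting the transition probabilities from observed output, as the quote attributes to Markov and Wiener, is the corresponding inference problem (solved today by algorithms such as Baum-Welch).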
A large portion of the current biography seems to be copied from a (non-free) book, which isn't even cited in the article: http://books.google.com/books?id=IjFJNoq638kC&pg=PA98. Ian ( talk) 01:15, 6 July 2015 (UTC)
I've changed continuous fractions -> continued fractions, per fr:Andreï Markov (mathématicien) and the MacTutor archive. It is possible that 'continu' in French and 'непрерывная' in Russian can mean either 'continuous' or 'continued' depending on context. In fact, the Russian article on continued fractions is at ru:Непрерывная дробь. Continued fractions are the things invented by John Wallis; continuous fractions aren't anything in particular.
In fact, ru:Andrei Markov gives us the title of his thesis as «Об интегрировании дифференциальных уравнений при помощи непрерывных дробей» ('Integration of differential equations by means of continued fractions'). So непрерывных дробей must be our usual continued fractions. EdJohnston ( talk) 20:25, 18 September 2018 (UTC)
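For anyone unsure of the distinction being drawn here: a continued fraction is a nested expression a0 + 1/(a1 + 1/(a2 + ...)), and for a rational number the terms fall out of the Euclidean algorithm. A minimal sketch (function names are mine, purely for illustration):

```python
from fractions import Fraction

def continued_fraction(p, q):
    """Continued-fraction terms of p/q via the Euclidean algorithm."""
    terms = []
    while q:
        terms.append(p // q)
        p, q = q, p % q
    return terms

def evaluate(terms):
    """Collapse [a0, a1, ...] back into a single fraction."""
    value = Fraction(terms[-1])
    for a in reversed(terms[:-1]):
        value = a + 1 / value
    return value

print(continued_fraction(415, 93))  # [4, 2, 6, 7]
print(evaluate([4, 2, 6, 7]))       # 415/93
```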