This article is rated C-class on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:
This is a quote from the article
The ability to infer types automatically makes many programming tasks easier, leaving the programmer free to omit type annotations while maintaining some level of type safety
Why does it say "SOME" and not "A COMPLETE LEVEL OF SAFETY"? Are there downsides that someone would like to discuss? —Preceding unsigned comment added by 194.73.67.254 ( talk) 18:16, 4 December 2007 (UTC)
Type inference is historically linked to functional languages, but I don't think it is limited to them.
There's a thread in the archive of the Haskell mailing list with contributions by Philip Wadler and others explaining the history of the Hindley-Milner algorithm, but I'm not sure if this is important enough to be incorporated into the article as long as the article doesn't explain the algorithm itself. The thread also contains detailed references to the original publications of Curry, Feys, Hindley, Milner and Damas. -- Tobias Bergemann 08:12, 2004 Nov 16 (UTC)
Since there are no ad-hoc polymorphic subfunctions in the function definition, we can declare the function to be parametric polymorphic.
I followed the links trying to understand this sentence but it didn't work. Could somebody explain it? -- euyyn 22:52, 2 July 2006 (UTC)
80.7.195.184 Yes but computer Reals are not Real Reals, Real Reals would have Length INFINITY, the Reals are rational and what does that make the complex but imagined?
Could we call this Run Time Binding or Real Time Binding?
Also, is it because atomics are not objects that we can do type inference?
i.e. an object must be made of something: another object, or something which is part of an object (e.g. an atomic)
Something must be mass or else it's energy? So 3 is energy; t=3 means t is an integer, but could become a simulated real. Either way we don't mean t as in the letter between s & u; for that we would say t="t" or t='t'
so it's syntax and how to chew it
Is it just me, or does this article dive right into the deep end with technical jargon and mark-up? Perhaps at least some pointers to articles that might explain some of the notation as well as the terminology would be in order...?
(P.S., isn't there a template to mark an article as too technical?) —The preceding unsigned comment was added by 71.70.210.188 ( talk) 04:49, 11 December 2006 (UTC).
This article states that "Type inference is a feature present in some strongly statically typed programming languages." However, Python implements type inference but is dynamically typed. The opening statement should be clarified that type inference is not only a feature of "strongly statically typed programming languages." mbecker 13:57, 23 August 2007 (UTC)
Python definitely does not implement any sort of type inferencing. Python is interpreted, not compiled. Type inferencing is something you do at compile time to assert all objects are correctly typed. Dynamically typed languages like Python only do runtime type checking, and thus, type errors can only be caught when executing the code. Tac-Tics 07:13, 9 September 2007 (UTC)
Is it possible to say that C++ implements static type inferencing for functions? I.e., given a templated method definition template< class T> void func( T &v), then depending on the use context (the caller's argument type) an appropriate version of func will be instantiated; what is it that distinguishes this sort of behavior from ML or OCaml type inferencing? Note I am not arguing for C++'s inclusion as a representative example in the article, but am interested to know the distinction. —Preceding unsigned comment added by 213.56.155.247 ( talk) 20:05, 5 July 2008 (UTC)
In 1978 Robin Milner, independently of Hindley's work, provided an equivalent algorithm
I think this tells the truth, but not the whole truth.
From Milner "A Theory of Type Polymorphism in Programming" (1978):
After doing this work we became aware of Hindley's method for deriving the "principal type scheme" (which is what we call a polymorphic type) for a term in combinatory logic. Hindley appears to have been the first to notice that the Unification Algorithm of Robinson is appropriate to this problem. Our work can be regarded as an extension of Hindley’s method to programming languages with local declarations, and as a semantic justification of the method.
Looks like it was independent, but immediately acknowledged and Milner prefers to see his work as a "semantic justification" of Hindley's method. — EatMyShortz ( talk) 16:44, 19 August 2008 (UTC)
I removed the mention of implicit typing, since afaik that is a more general term. ie, type inference is a kind of implicit typing, but we would say "dynamic" languages like python use implicit typing but not type inference. Does this sound right? If not, we should probably point the redirect at implicit typing somewhere else. And if so, latent typing has problems. ErikHaugen ( talk) 22:38, 23 August 2010 (UTC)
The opening paragraph contains a sentence starting with "Some languages that include type inference are [...]", followed by a list of languages with type inference. Please note the word "some" – this is not supposed to be an exhaustive list, and I think only the most important languages ("important" in relation to type inference) ought to actually be listed there. Someone keeps adding Vala to the list; I would like to know why Vala is important enough to be on this list? – Adrian Willenbücher ( talk) 10:19, 31 August 2010 (UTC)
The body of the section on Algorithm W (Hindley–Milner type inference algorithm), purporting to outline it, presents only an unrelated type checking (not type inference!) algorithm. This would be acceptable if the section were continued to present the real one. But since the section is unfinished, it harms more than it helps. Algorithm W uses unification in a very different manner. Right now, the section misleads completely.
If no expert is available to summarize Algorithm W, the quality of the article could be improved by removing everything from this section but the introduction. The section itself should be kept, since Hindley-Milner is the type inference method for languages dealing with functions and expressions.
In a way, removing the body of the section would help much to re-balance the article. IMO, it would be better to devote a separate article to Algorithm W and only mention it here.
As of today, the problem has not been dealt with. Clegoues, who promised to help above, has not yet made any modification besides adding a reference to a Greek Wikipedia page. To reiterate, the section titled "Hindley–Milner type inference algorithm" has nothing to do with Hindley–Milner; instead something unrelated, perhaps an introduction to System F (which does type checking, and for which full type inference is undecidable), is summarized. The author of the section got it all wrong, so it is misleading at best. Please see
for more technical details. In particular, HM is so outstanding because it does not need any type annotation at all, unlike the annotated form wrongly added in the claimed summary. There, an abstraction of the form λx:τ.e defines τ to be the type of the parameter x. Additionally, a key point of HM is let-polymorphism, which is also not present in the purported summary. Variations and extensions aside, the corresponding expression in HM is the unannotated λx.e.
A proper summary could be shaped along the following lines
The latter would have to deal with recursive expressions and type functions.
I agree it is neither simple to understand nor to summarize HM, but presenting something very different is not helpful in any way. Since the matter has not been dealt with since 2009, I suggest removing the section in question. Instead, a reference to a new HM and/or Algorithm W page could be added, leaving space to describe the particular method. If one wants to summarize HM briefly, only the syntax could be described, stating that HM finds the most general type and that it extends easily to recursion and type functions. -- 87.174.224.10 ( talk) 19:52, 31 July 2011 (UTC)
The new article on Hindley-Milner is now on-line. This should close the above mentioned issue.
Please let me know about related concerns or suggestions.
-- Cobalt pen ( talk) 17:04, 10 September 2011 (UTC)
I believe this article is almost entirely wrong.
Type inference is a system whereby the types of variables, expressions and functions are computed from their use. A restricted form of this, where the types of expressions are calculated from their subterms, is called type deduction.
Technically, deduction is a simple recursive-descent, S-attributed, or bottom-up calculation which ALL compilers with static type systems can do. For example, in C++98
int x = 1;
the compiler can deduce the type x should have from the initialising expression, and check it is compatible with the declared type. In C++11 you can write:
auto x = 1;
Now you don't need to state the type of the variable, but the type calculation is the same. There is no type inference in either case. Here is type inference in OCaml:
let f x = x + 1
The compiler infers that x is of type int, because the function + has been applied to it. This is inference, not deduction, because the type is calculated from the way the variable is used later. And here:
let f x = g x in f 1
the argument to f is taken to be int because f is applied to an int later, and thus the argument of g must also be int. Again, this is type inference.
As a result of the wrong definition in this article a lot of languages are claiming to have inferencing type systems, including Rust and Go, which cannot do type inference. As a general guide any language with overloading is unlikely to also have any kind of inference, since the two features are very hard to mix. [Actually C# has overloading and a limited kind of inference, and there are research papers which demonstrate systems with both features and discuss the restrictions which arise as a result]
I think I must add the interesting note that in fact C++ can do type inference, though not quite the kind you'd expect. C++ can infer function template type parameters from function arguments. This calculation requires unification and is not S-attributed, so technically it qualifies as type inference (despite the fact that the Standard actually describes it as deduction :)
Skaller ( talk) 15:25, 29 May 2013 (UTC)
In Type inference:
The opposite operation of type inference is called type erasure.
In Type erasure:
The opposite of type erasure is called reification.
And yet there is no mention of reification in Type inference, nor any mention of type inference in Reification (computer science). -- Voidvector ( talk) 05:05, 24 November 2014 (UTC)
Visual Basic 6.0 supported defining variables with the type "Any". However, I'm not sure if this means that Visual Basic 6.0 supports type inference. Does anyone have any ideas? Lcaa9 ( talk) 10:43, 31 January 2016 (UTC)