This article is rated Start-class on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:
The article describes the intrinsic dimension of a function. But in data mining it is possible to talk of the intrinsic dimension of a dataset, and this should also be described, either here or perhaps in a separate article.
The idea is that data points in a high-dimensional space may all lie close to a submanifold of that space, and the intrinsic dimension is then the dimension of that submanifold. This is relevant to understanding the curse of dimensionality for datasets. There may be a probability density function for the data points, but the intrinsic dimension of that function is not the same as that of the dataset.
Sources that could be used for the article include:
JonH (talk) 11:11, 10 April 2010 (UTC)
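The dataset notion described above can be illustrated with a small sketch (not from any article source; the data and threshold are made up for illustration). For data lying on a *linear* subspace, PCA already recovers the intrinsic dimension: here 400 points in R^5 that actually live on a 2-D plane, so only two principal directions carry variance.

```python
import numpy as np

# Hypothetical example: ambient dimension 5, intrinsic dimension 2.
rng = np.random.default_rng(0)
basis = rng.standard_normal((5, 2))     # spans a random 2-D subspace of R^5
coords = rng.standard_normal((400, 2))  # intrinsic 2-D coordinates
data = coords @ basis.T                 # ambient 5-D coordinates

# PCA via the eigenvalues of the centered scatter matrix.
centered = data - data.mean(axis=0)
eigvals = np.linalg.eigvalsh(centered.T @ centered)[::-1]  # descending

# Count directions with non-negligible variance (tolerance is a choice).
intrinsic_dim = int(np.sum(eigvals > 1e-8 * eigvals[0]))
print(intrinsic_dim)  # 2
```

For data near a *curved* submanifold, as in the helix or Swiss-roll examples common in manifold learning, global PCA overestimates the dimension, and one would instead use local PCA on neighborhoods or a dedicated estimator (e.g. correlation-dimension or nearest-neighbor based methods).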
The "Generalizations" subsection as it stands is strange if taken at face value:
This makes the intrinsic dimension of every non-constant function 1, because you can take a₁ = f and g(x) = x. Presumably the functions aᵢ are supposed to be restricted to some class of functions, such as functions expressible in closed form. It would be better to somehow characterize the classes of functions used in practice. -- Coffee2theorems (talk) 16:27, 5 August 2012 (UTC)
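For concreteness, the trivializing choice described in the comment above can be written out (assuming the subsection's decomposition has the form f(x) = g(a₁(x), …, aₙ(x))):

```latex
% With n = 1, take a_1 = f and g the identity map:
f(\mathbf{x}) = g\bigl(a_1(\mathbf{x})\bigr),
\qquad a_1 = f, \quad g(t) = t.
% This decomposition is available for any f, so without a
% restriction on the a_i the definition assigns every
% non-constant function intrinsic dimension 1.
```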
Can someone explain the transformation in the example? Is f(y₁,y₂) = g(y₁) really the correct answer? — Preceding unsigned comment added by 178.10.183.34 (talk) 06:57, 17 July 2015 (UTC)