This article is rated B-class on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:

This page has archives. Sections older than 365 days may be automatically archived by Lowercase sigmabot III when more than 10 sections are present.
It is my understanding that a vector, per se (i.e. not its representation), is neither covariant nor contravariant. Instead, covariance or contravariance refers to a given representation of the vector, depending on the basis being used. For example, one can write the same vector in either manner as v = v^i e_i = v_i e^i. I think this point (in addition to the variance topic as a whole) can be both subtle and confusing to people first learning these topics, and thus the terminology should be used very carefully, consistently, and rigorously. I tried to change some of the terminology in the article to say "vectors with covariant components" instead of "covariant vectors" (for example), but this has been reverted as inaccurate. So I wanted to open a discussion in case I am mistaken or others disagree. @JRSpriggs. Zhermes ( talk) 16:45, 2 January 2020 (UTC)
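Since inline formulas do not always survive on this page, here is a quick numerical sketch of the point above (the basis and numbers are made up purely for illustration): the same vector has contravariant components relative to a basis and covariant components (components relative to the dual basis), and both describe one and the same invariant object.

```python
import numpy as np

# A non-orthonormal basis for R^2, stored as the columns of E.
E = np.array([[1.0, 1.0],
              [0.0, 1.0]])

v = np.array([2.0, 3.0])

# Contravariant components: coefficients of v in the basis, v = E @ v_contra.
v_contra = np.linalg.solve(E, v)

# Covariant components: dot products of v with the basis vectors.
v_cov = E.T @ v

# The dual (reciprocal) basis satisfies e^i . e_j = delta^i_j
# and reconstructs v from its covariant components.
E_dual = np.linalg.inv(E).T

assert np.allclose(E @ v_contra, v)   # same vector either way
assert np.allclose(E_dual @ v_cov, v)
```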
I had precisely the same question, which I opened on math stackexchange. I haven't gotten any satisfactory answer yet. I hope somebody will pick up this thread and continue the discussion. https://math.stackexchange.com/questions/4297246/conceptual-difference-between-covariant-and-contravariant-tensors — Preceding unsigned comment added by 99.62.6.96 ( talk) 05:24, 29 December 2021 (UTC)
I am a bit confused: this article takes the gradient to be a prime example of a "covariant vector", but the Gradient article claims that it is a contravariant vector. Which is correct? (Sorry if this is the wrong place to ask) -- 93.25.93.82 ( talk) 19:56, 7 July 2020 (UTC)
Currently: "On the other hand, for instance, a triple consisting of the length, width, and height of a rectangular box could make up the three components of an abstract vector, but this vector would not be contravariant, since a change in coordinates on the space does not change the box's length, width, and height: instead these are scalars." I think this is not helpful since there is no obvious meaning to the vector space of box [length, width, height]. What do vector scaling and vector addition correspond to? I suggest this example be removed. Intellec7 ( talk) 05:19, 28 August 2020 (UTC)
If you have a basis in a three-dimensional Euclidean space, you can construct the coordinates of a given point by drawing lines through the point parallel to each basis vector. Those lines will intersect with each other, and the distance of the intersection point from the origin divided by the length of the corresponding basis vector gives you the covariant components. But how do you construct the coordinates with respect to the dual basis? 2003:E7:2F3B:B0D8:11C:4108:841E:BA24 ( talk) 07:00, 17 April 2021 (UTC)
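One way to answer the question numerically (with an arbitrarily chosen basis, purely for illustration): the coordinates with respect to the dual basis come from dot products with the original basis vectors, just as the dual vectors deliver the coordinates with respect to the original basis.

```python
import numpy as np

# Three linearly independent basis vectors in R^3, as columns of B.
B = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 3.0]])

# Dual basis vectors, as columns of D: defined by d_i . b_j = delta_ij.
D = np.linalg.inv(B).T

p = np.array([4.0, 2.0, 6.0])

# Coordinates of p in the dual basis are dot products with the
# *original* basis vectors (the covariant components of p) ...
coords_dual = B.T @ p
# ... just as coordinates in the original basis are dot products
# with the dual vectors (the contravariant components of p).
coords_orig = D.T @ p

assert np.allclose(D @ coords_dual, p)
assert np.allclose(B @ coords_orig, p)
```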
The word "scalar" is used in the page to mean something that multiplies a unit vector. But a scalar is supposed to be coordinate-free or gauge-invariant or invariant to changes of coordinates. The quantities that multiply unit vectors do not have this property, because the unit vectors change (and hence the components of the vectors change) as you change coordinates. -- David W. Hogg ( talk) 12:58, 16 May 2021 (UTC)
I have some "issues" with the fundamental jargon used in this article. I would like to help improve it; but I don't want to start an editing war. So, let me test the waters with a few comments:
Vectors and covectors are not the same thing. And they are both invariant under (proper, invertible) linear transformation. (Note: this is the passive/alias viewpoint of transformations). It is a semantic error to say that a vector is contravariant (or that a covector is covariant). The co/vectors, and tensors in general, are invariant under linear transformations. If the space has a metric, then one can "convert" a vector into a covector, and vice versa. But the existence of a metric is not obligatory and, in fact, confuses people into thinking that vectors and covectors are fungible. They are not. Vectors are linear approximations to the curves defined by the intersection of coordinate functions; covectors are linear functional approximations to the level (hyper)surfaces of a single coordinate function. The figure at the top of the article kind of hints at this, but then garbles the message by overlaying arrow quantities with level-surface quantities on the right-hand side. Too bad. Remove those blue arrows and you'd have a right proud representation of covectors in a cobasis.
Co/contra-variance is a property of the components and of the basis elements. For a vector, the components are contravariant and the basis vectors are covariant; for a covector, the components are covariant and the basis covectors are contravariant. In either case, when you contract the components with the basis—one of which is covariant and the other contravariant—then you get an invariant quantity, as required of a tensor.
I won't try to define/defend here (yet;-) what co/contra-variant mean. (Spoiler alert: contravariant quantities transform as the Jacobian of a coordinate transformation; covariant quantities transform as the inverse Jacobian; this seems backwards to what I, for one, would expect from the concepts of co- and contra-; but it is what it is!)
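For concreteness, a small sketch of the transformation behavior just described, with an arbitrary invertible matrix standing in for the Jacobian of a coordinate change: when the basis vectors transform with A, the contravariant components transform with the inverse of A, and their contraction (the vector itself) is unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)

# An old basis (columns of E_old) and an invertible change of basis A:
# each new basis vector is a combination of old ones, E_new = E_old @ A.
E_old = np.array([[1.0, 0.0, 1.0],
                  [0.0, 2.0, 0.0],
                  [0.0, 0.0, 1.0]])
A = rng.normal(size=(3, 3))   # generically invertible
E_new = E_old @ A

v = np.array([1.0, -2.0, 0.5])

# Contravariant components in each basis.
c_old = np.linalg.solve(E_old, v)
c_new = np.linalg.solve(E_new, v)

# Components transform with the inverse of A ("contra" to the basis) ...
assert np.allclose(c_new, np.linalg.solve(A, c_old))
# ... so the contraction of components with basis is invariant.
assert np.allclose(E_new @ c_new, E_old @ c_old)
```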
To whoever has purview over this article: if these comments make sense and seem worth the trouble of editing the article, please let me know and I'll try to collaborate. On the other hand, if you think these are distinctions without a difference—or worse, misguided—then I'll stand down. -- ScriboErgoSum ( talk) 08:25, 15 November 2021 (UTC)
The article describes covariance and contravariance in terms of coordinates and components, a perspective that is rather dated. The terms have a meaning independent of any choice of basis or coordinates, and the article should reflect that. There is a lot of variation in the literature, but essentially there are three styles:
Tensors can then be defined as either tensor products or as multilinear maps. It is common to just classify tensors by the covariant and contravariant ranks, but if there is a (pseudo)metric involved then order matters because of raising and lowering of indexes. -- Shmuel (Seymour J.) Metz Username:Chatul ( talk) 11:32, 1 March 2022 (UTC)
I am adding this (3/26/2022) without reading the below because I have a comment on the definition section. We have f = (X1, ..., Xn) with each of the Xi being a basis vector. I was going to add a remark that this makes f a matrix, and indeed it is used as a matrix in the unnumbered equation v = f [v]_f. I decided not to add that comment as it contradicts Note 1. In the note preceding Eq 1 it says "regarding f as a row vector whose entries are the elements of the basis". Maybe I am naive, but I find this pretty confusing, as I've always considered matrices and vectors to be different objects. If f is to be considered a row vector but with each element a basis vector instead of just a number (as one usually thinks of vectors), this needs to be explained. If the note is incorrect then it should be removed. — Preceding unsigned comment added by 76.113.29.12 ( talk • contribs) 14:45, 26 March 2022 (UTC)
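On the question above: the "row vector whose entries are basis vectors" can be read as a compact way of saying that stacking the Xi as columns gives a matrix, so that v = f [v]_f is ordinary matrix-vector multiplication. A sketch with a made-up basis (not the article's notation, just an illustration):

```python
import numpy as np

# "f" as a row of basis vectors: concretely, an n x n matrix whose
# columns X1, ..., Xn are the basis vectors.
f = np.array([[1.0, 2.0],
              [0.0, 1.0]])   # X1 = (1, 0), X2 = (2, 1)

v = np.array([3.0, 1.0])

# [v]_f : the column of scalar coordinates of v in this basis.
v_f = np.linalg.solve(f, v)

# v = f [v]_f is then matrix-vector multiplication: the "row of basis
# vectors" times the coordinate column.
assert np.allclose(f @ v_f, v)
```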
Should the article discuss relative vectors, whose transformation includes a power of the transformation determinant as a factor? In more modern language, these are tensor products of vectors with tensor densities, or for orientable manifolds, liftings of line bundles. -- Shmuel (Seymour J.) Metz Username:Chatul ( talk) 14:31, 13 November 2023 (UTC)
@JRSpriggs: In permalink/1233199815 I changed "in 3-d general curvilinear coordinates (q1, q2, q3), a tuple of numbers to define a point in a position space. Note the basis and cobasis coincide only when the basis is orthogonal" to "in 3-d general curvilinear coordinates (q1, q2, q3), a tuple of numbers to define a point in a position space. Note the basis and cobasis coincide only when the basis is orthonormal". In permalink/1233348907, JRSpriggs reverted to the last version by Derek farn, undoing edits by 76.116.252.35, me and Antoni Parellada. Orthogonality is not a strong enough condition for the basis and cobasis to coincide; they have to be both orthogonal and of norm 1, i.e., orthonormal. -- Shmuel (Seymour J.) Metz Username:Chatul ( talk) 14:33, 12 July 2024 (UTC)
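A numerical illustration of that point (bases chosen arbitrarily): for an orthogonal but non-unit-length basis the dual basis differs from the basis, while for an orthonormal basis (here a rotation matrix) the basis and cobasis coincide.

```python
import numpy as np

# Orthogonal but NOT orthonormal basis: columns have lengths 2, 1, 3.
B = np.diag([2.0, 1.0, 3.0])
B_dual = np.linalg.inv(B).T
# The dual basis differs: its first vector has length 1/2, not 2.
assert not np.allclose(B_dual, B)

# Orthonormal basis (a rotation matrix): basis and cobasis coincide.
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
assert np.allclose(np.linalg.inv(R).T, R)
```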