This is the talk page for discussing improvements to the Eigenvalues and eigenvectors article. This is not a forum for general discussion of the article's subject.
Article policies
Archives: 1, 2, 3 (auto-archiving period: 365 days)
Eigenvalues and eigenvectors is a former featured article. Please see the links under Article milestones below for its original nomination page (for older articles, check the nomination archive) and why it was removed.
This article appeared on Wikipedia's Main Page as Today's featured article on November 1, 2005.
This article is of interest to multiple WikiProjects.
I removed the Technical template. Yes, the subject matter is technical mathematics, but I do not see how anyone can understand the subject matter without first understanding concepts like field, scalar, vector space, vector, linear algebra, and linear transformation, all of which are linked in the lead. Explaining them in depth here would be tedious and redundant. Several knowledgeable editors have been painstakingly refining this article for months to make it more readable, and I believe its present state is very readable for readers who already understand the aforementioned concepts.— Anita5192 ( talk) 21:48, 8 February 2019 (UTC)
Eigenvalues and eigenvectors belong in a characteristic way to a linear transformation, as dealt with in linear algebra. The transformation of an eigenvector results in just scaling it by the factor given by the eigenvalue belonging to this eigenvector. More formally ...
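The scaling property described above can be checked numerically. A minimal sketch using NumPy (the matrix here is my own illustrative example, not one from the article):

```python
import numpy as np

# An example symmetric matrix; any diagonalizable matrix would do.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and the matching
# eigenvectors as the columns of the second result.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Applying the transformation to an eigenvector only scales it
# by the corresponding eigenvalue: A v = lambda v.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```

For this particular matrix the eigenvalues come out as 1 and 3, with eigenvectors along the diagonals (1, −1) and (1, 1).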
Please folks, Wikipedia is not a textbook. It is a reference resource, like an encyclopedia, dictionary, or technical manual: a place where people come to know the definitions and basic facts about some topic. It is fine to make the articles more accessible and understandable, as long as one keeps those readers in mind.
So, for example, the lead of an article about "polynomial" should assume that the reader only knows the concept of "function" and the very basics of algebra; but the lead of an article about "eigenvectors and eigenvalues" should assume that the reader at least knows what vectors and linear transformations are, or matrices and matrix products. Trying to make the lead (or the article) understandable to people without that basic knowledge is futile and a disservice to those readers to whom the article should be directed. --
Jorge Stolfi ( talk) 13:50, 13 June 2024 (UTC)
the purpose of Wikipedia is to summarize accepted knowledge, not to teach subject matter. The previous start was far more helpful in this regard. Per WP:BRD, I'll revert these changes, and the discussion can continue here. Chumpih t 20:19, 13 June 2024 (UTC)
Wikipedia is intended to be a general-purpose encyclopaedia. This marks a contrast with a technical manual for the already expert practitioner.
But in this article, the lead is deeply technical, followed by a section "formal definition" which likewise is deeply technical. The intended readership is left floundering in meaninglessness.
We need the article to arrive very quickly at a general overview. The section "overview", especially with its pictorial illustration, would be much better placed earlier. So I propose reversing the order of "formal definition" and "overview". (Perhaps other minor adjustments might become necessary, but they are second-order effects.) Unless there is serious objection, I propose doing this in about a week (21 June 2021). Feline Hymnic ( talk) 16:12, 14 June 2021 (UTC)
The discussion about the zero vector being an eigenvector was confusing. The source [1] cited said nothing of the kind, and there is a general consensus among mathematicians consistent with the rest of the article.
I've actually checked the reference provided, and there is nothing about eigenvectors at the referenced page (p. 77). The chapter about eigenvectors from that book is actually freely available [ link ] and it nicely explains the philosophy of eigenvectors as invariant subspaces. All the definitions in the book are consistent with excluding the zero vector as an eigenvector.
[1] Axler, Sheldon (18 July 2017), Linear Algebra Done Right (3rd ed.), Springer, p. 77, ISBN 978-3-319-30765-7 — Preceding unsigned comment added by Ormulogun ( talk • contribs) 15:56, 10 February 2022 (UTC)
I was reading the article, came across the line starting "Equation (2) has a nonzero solution...", and couldn't find the referenced equation.
The equation numbers are on the extreme far right side of the page, and as such are a) difficult to find, and b) difficult to associate with their equation when several equations are listed vertically. It's especially a problem with this article because all equations are fairly short.
Is there some way of moving the equation reference numbers closer to the actual equations, on pages where the equations are short?
(I'd venture to guess that nearly all equations in all math pages are short, relative to the width of modern monitors.) — Preceding unsigned comment added by 64.223.87.47 ( talk) 20:52, 22 August 2022 (UTC)
Some sections in the article use vertical bars for the determinant, even when applied to a symbolic expression.
This is, in my opinion, wrong; vertical bars for the determinant are a shorthand in which the bars replace the parentheses delimiting a matrix given in terms of its elements:
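For concreteness, the shorthand being described could be illustrated with a generic 2×2 example (my illustration, not taken from the article):

```latex
\begin{vmatrix} a & b \\ c & d \end{vmatrix}
= \det\begin{pmatrix} a & b \\ c & d \end{pmatrix}
= ad - bc
```

On this reading, the bars belong around an array of elements, while a symbolic expression like A − λI would take the operator form det(A − λI).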
130.243.94.123 ( talk) 12:39, 21 August 2023 (UTC)
The current Short description is excessive and my attempts to find a suitable replacement were rejected. Please see WP:SDSHORT and WP:SDNOTDEF and suggest possible alternative Short descriptions. I have removed the current SD to suppress the errors pending a viable alternative — GhostInTheMachine talk to me 19:10, 16 November 2023 (UTC)
Eigenvectors? — GhostInTheMachine talk to me 19:36, 16 November 2023 (UTC)
"Vectors only scaled by linear transformations" (45 characters) seems to be OK. It seems to make more sense than references to "Characteristic vectors". — GhostInTheMachine talk to me 20:06, 16 November 2023 (UTC)
"Vectors whose direction is fixed by a linear map, and the corresponding scalars" (79 characters) — about twice as long as it should be. Please suggest sensible alternatives — GhostInTheMachine talk to me 19:50, 16 November 2023 (UTC)
"Quantities that together describe a linear map". To me, this seems like a reasonable thing to have appear in the places mentioned above, and it doesn't slight one half in favor of the other. XOR'easter ( talk) 21:14, 16 November 2023 (UTC)
"Mathematical concept linking vectors and matrices" is more in the spirit, IMHO. Chumpih t 22:41, 16 November 2023 (UTC)
"Mathematical concept", then? The title of the article itself already has "vector" in it, after all. XOR'easter ( talk) 02:26, 17 November 2023 (UTC)
What does "linking vectors and matrices" mean? XOR'easter ( talk) 02:24, 17 November 2023 (UTC)
Change to "Mathematical concepts involving vectors and matrices" then? Chumpih t 10:11, 17 November 2023 (UTC)
I don't believe there is any way to describe eigenvectors and eigenvalues accurately in the small space of a short description. Per WP:SDCONTENT, "short descriptions are meant to distinguish an article from similarly-named articles in search results, and not to define the subject." Hence, I think we should be content with something short and general, instead of attempting to fit completeness into a small space. I propose "Concepts from linear algebra."— Anita5192 ( talk) 14:25, 18 November 2023 (UTC)
"Concepts from linear algebra" sounds OK to me. Chumpih t 22:47, 23 November 2023 (UTC)
I see a lot of people adding the notion of "unchanged direction". Not all (general vector space) vectors have direction. Think functions & linear differential operators, etc. Ponor ( talk) 13:46, 17 November 2023 (UTC)
"The notion of the 'angle' between a pair of subspaces in a Hilbert space is a fruitful one. It often allows one to give a geometric interpretation to what appears to be a purely analytical or topological result." I think it's important to mention a notion of "unchanged direction" at the top for accessibility to a broad audience. Not everyone knows what scalar multiplication means, and trying to unpack several semesters of undergraduate math courses into the lead paragraph of this article is untenable, so we need to quickly give readers some images to hold onto instead of only providing a fully general version in abstract mathematical notation. If you like I can try to add some sources for this definition. Searching Google Scholar turns up literally thousands of papers and book chapters (from a wide variety of fields, including some talking about more abstract vector spaces) where an eigenvector is described as a vector whose "direction is unchanged", "direction is fixed", "direction is preserved", "direction is not changed", or similar. I'm sure with some effort screening them we can find a few which would be generally useful surveys for a newcomer reader to take a look at. I want to expand and slightly reorganize the first few sections after the lead to first include a geometric/visual overview and more clearly relate it to finite-dimensional matrix arithmetic, but it would also be good to have some introductory description somewhere near the top about more general kinds of vector spaces, what their linear transformations look like, why we want to know their eigenvectors, etc. – jacobolus (t) 19:31, 17 November 2023 (UTC)
"magnitude, direction and orientation" – this would depend on whether you consider a "direction" to refer to lines or "oriented lines". The words "attitude", "direction", "orientation", "sense", "bearing", etc. are used imprecisely and often interchangeably in English. Making these concepts precise requires formally defining them, but that's not really the point in the context of a few informal sentences here intended to give readers the right basic idea without too much technical overhead. – jacobolus (t) 07:32, 18 November 2023 (UTC)
The clause in question is "The eigenvectors and eigenvalues of a linear transformation serve to characterize it" within the lead section. The problem is that a pronoun typically refers to the subject of the sentence, which in this case is 'the eigenvectors and eigenvalues'. But here the pronoun refers to the object (the linear transformation), differentiated solely by a match in plurality between the pronoun and the subject (the pronoun being singular, matching the object, and the subject being plural). This is jarring. A better form could be "The eigenvectors and eigenvalues of a linear transformation serve to characterize that transformation". Is there any advantage to keeping the troubling pronoun?
Chumpih t 23:18, 15 June 2024 (UTC)
Here the troublesome clause is 'fed as inputs to the same inputs', which is not easy to parse. Perhaps a better form would be "Sometimes a particular system is represented by a linear transformation whose outputs are fed back as inputs to the transformation". In addition, the phrasing 'In particular, it is often the case that' is perhaps better represented as 'sometimes'.
Chumpih t 23:45, 15 June 2024 (UTC)
"The eigenvectors and eigenvalues of a linear transformation serve to characterize it, and so they play important roles in [applications]. In particular, it is often the case that a system is represented by a linear transformation whose outputs are fed ....". This sentence is awkward and could certainly be improved, but the point of this phrasing is to especially emphasize the case of systems modeled as linear transformations + feedback, because in these cases the largest eigenvalue and its associated eigenvector play a special role.
"Sometimes a particular system is represented by a linear transformation whose outputs are fed ...". Changing from particular applications to particular systems is an unrelated use of the word "particular"; both removing the first particular and adding the second particular change the meaning and focus of the sentence. We're no longer calling out a specific special class of applications, or referring back to the previous sentence at all. "Sometimes" makes a clean break, not tightly tied to anything prior, and it's not at all clear what the new "particular" is even for. Readers can still infer from reading the rest of the paragraph that there's a relationship between these sentences, but that inference is not reinforced by the language. – jacobolus (t) 00:30, 16 June 2024 (UTC)
...[applications]. In such cases, models for the systems can include linear transformations that feed back their outputs to their inputs. In such representations, the largest eigenvalue is of use because it indicates the long-term behavior of the system, after many applications of the linear transformation, and the associated eigenvector is the steady state of the system. Note that the model doesn't typically "govern" a system, though it can represent or indicate what is happening. Chumpih t 00:52, 16 June 2024 (UTC)
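The point that the largest eigenvalue dominates long-term behavior under feedback can be sketched with power iteration. An illustrative NumPy example (the matrix is my own choice, not from the article; its largest eigenvalue is 1, with eigenvector along (2, 1)):

```python
import numpy as np

# Example feedback matrix: output is repeatedly fed back as input.
A = np.array([[0.9, 0.2],
              [0.1, 0.8]])

x = np.array([1.0, 0.0])       # arbitrary nonzero starting state
for _ in range(200):           # many applications of the transformation
    x = A @ x
    x = x / np.linalg.norm(x)  # renormalize to avoid over/underflow

# x has converged to the eigenvector of the largest eigenvalue
# (the steady state); the Rayleigh quotient recovers that eigenvalue.
lam = x @ A @ x
```

Components along the other eigenvectors shrink by a factor |λ_small/λ_large| per step, which is why the dominant eigenpair describes the long-run behavior.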
I don't agree with the full set of changes in special:diff/1217830133/1228837393 by @ Jorge Stolfi, but I think he has some valuable ideas for changes to the lead which should at least be discussed / taken into consideration instead of summarily rejected. Pinging @ XOR'easter, @ Tito Omburo, @ Chetvorno who chimed in a bit about the lead here previously (I'm probably missing other folks). Here's how Jorge Stolfi left the lead:
In linear algebra, an eigenvector ( /ˈaɪɡən-/ EYE-gən-) or characteristic vector of a linear transformation T is a non-zero vector v that is only multiplied by some scalar factor λ when transformed by T; that is, T(v) = λv. The scalar λ is the corresponding eigenvalue or characteristic value associated with v. [1]
Eigenvalues are often called characteristic roots because they are the roots of a characteristic polynomial associated with the linear transformation. Specifically, if A is the square matrix of the linear transformation on some specified basis, the eigenvalues of A are the values of λ that satisfy the equation det(A − λI) = 0, where det denotes the determinant of a matrix and I is the identity matrix with the same size as A.
In two- and three-dimensional geometry, a vector can be depicted as an arrow with a specific length and direction. A linear transformation acts on such vectors by some combination of scaling, rotation, or shearing. Its eigenvectors are those non-zero vectors that are only scaled. The corresponding eigenvalue is the factor by which an eigenvector is scaled. If the eigenvalue is positive, the eigenvector is stretched or shrunk, without changing its direction; if it is negative, the vector's direction is also reversed; and if it is zero, the vector is annihilated. [2]
Any linear transformation of a space of dimension n is completely defined by a set of n mutually orthogonal eigenvectors and their associated eigenvalues. For this and other reasons, eigenvalues and eigenvectors are extremely important in all areas of mathematics, science, and technology where linear algebra is used. In mechanics, for example, they arise in the analysis of the rotation of a rigid body, of the behavior of non-isotropic materials under stress, of the vibrations in an elastic structure, and many other phenomena. In statistics, the eigenvectors of the covariance matrix of a set of n-dimensional data points may reveal the main independent factors affecting those data. In the control of industrial systems, the eigenvalues of the matrix describing the feedback signals can reveal the stability of the system under momentary disturbances.
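As a side note on the "characteristic roots" sentence in the quoted lead, the equivalence between the roots of the characteristic polynomial and the eigenvalues can be verified numerically. A small illustrative sketch (matrix chosen by me, not from the proposed text):

```python
import numpy as np

# An illustrative 2x2 matrix.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Eigenvalues are the roots of det(A - lambda*I) = 0; for a 2x2
# matrix this is lambda^2 - tr(A)*lambda + det(A) = 0.
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
roots = np.roots(coeffs)

# The roots agree with the eigenvalues computed directly.
assert np.allclose(sorted(roots), sorted(np.linalg.eig(A)[0]))
```

Here the characteristic polynomial is λ² − 7λ + 10, with roots (and eigenvalues) 5 and 2.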
I think the first paragraph here is less accessible than the previous version; some technical detail can perhaps be deferred a bit, and I don't like specifying 2–3 dimensions when vectors can be thought of geometrically in any dimension (albeit mental images get more abstract). But in particular I like the expanded discussion of various applications, which is more concrete and balanced than what was there before. The change from "serve to characterize it" to "completely defined by" is more emphatic, though I'm not sure it's all that different. (Eigenvectors don't have to be mutually orthogonal, but that mistake can be written out.) – jacobolus (t) 01:12, 16 June 2024 (UTC)
References