Eigenthings (eigenvectors and eigenvalues) (Discussion)

Eigenvectors and eigenvalues (alternative Simple English Wikipedia page) are a topic that comes up a lot in linear algebra, data science, and machine learning.

However, these are very abstract terms, and it is difficult to understand why they are useful and what they really mean.

This forum post is meant to catalog helpful resources for uncovering the mysteries of these eigenthings and to discuss common points of confusion around them.

Here are some resources on the topic I have found useful:

Haven’t looked through these yet, but they look promising:


So, if I read the top answer from “What is the importance of eigenvalues/eigenvectors?” correctly…

a scalar is just a scalar, but if all it does to a vector is stretch/compress/flip it, then it becomes an eigenvalue, and the vector it stretches/compresses/flips becomes an eigenvector?

Although I think I understand where you’re coming from, I just want to be clear on when a scalar “becomes” an eigenvalue.

I don’t think you can just create eigenvalues; rather, they are a property of the matrix you’re multiplying by (which some describe as a linear transformation).

But to answer your question, the difference between a scalar and an eigenvalue is a bit more nuanced. Yes, you are correct that multiplying by an eigenvalue stretches/compresses/flips a vector, just like any scalar. But an eigenvalue is still just a scalar; what makes it special is its relationship to a particular matrix. It is the matrix (the linear transformation) that actually does the transforming, and the eigenvalue describes how much stretching/compressing/flipping happens along one particular direction.

Concretely, a scalar λ is an eigenvalue of a matrix A if there is some nonzero vector v (an eigenvector) for which multiplying by the matrix is the same as multiplying by the scalar: Av = λv. Along those special directions, the result of the whole matrix multiplication is a simple stretch/compression/flip.
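To make this concrete, here is a minimal NumPy sketch (the matrix A and the vector w are made-up values, purely for illustration) that checks Av = λv for each eigenpair:

```python
import numpy as np

# A made-up 2x2 matrix standing in for the linear transformation A.
A = np.array([[2.0, 0.0],
              [1.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the matching eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for lam, v in zip(eigenvalues, eigenvectors.T):
    # Along an eigenvector, multiplying by the matrix is the same
    # as multiplying by the scalar eigenvalue: A @ v == lam * v.
    assert np.allclose(A @ v, lam * v)

# A generic vector is not just scaled -- its direction changes too.
w = np.array([1.0, 1.0])
print(A @ w)  # [2. 4.], which is not a scalar multiple of w
```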

That’s how I understand it, and I hope that helps. If there’s anything inaccurate or confusing in my explanation, please point it out :slight_smile:

From the link here,

I really liked the last paragraph of the answer in the “Slightly Longer Answer” section (emphasis added):

That is the essence of what one hopes to do with the eigenvectors and eigenvalues: “decouple” the ways in which the linear transformation acts into a number of independent actions along separate “directions”, that can be dealt with independently. A lot of problems come down to figuring out these “lines of independent action”, and understanding them can really help you figure out what the matrix/linear transformation is “really” doing.

I really like this answer because it gave me new insight into these eigenpairs. Specifically, when looking at a linear transformation (i.e. the A in Av = λv), it is difficult to really understand what it is actually doing.

Is it moving vectors to the left? Is it compressing them? Is it rotating things around?

These questions are difficult to answer by looking at the linear transformation directly. However, if you “decompose” the linear transformation to discover its eigenpairs (eigenvalues and eigenvectors), you can get a better idea of what the linear transformation is doing to the vector it is applied to (i.e. the v in Av = λv).
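To see that “decoupling” in action, here is a minimal NumPy sketch (again with made-up values for A and x; this works because this particular A has a full set of independent eigenvectors): write x in eigenvector coordinates, scale each coordinate independently by its eigenvalue, and you get back exactly Ax.

```python
import numpy as np

# A made-up matrix; V's columns are its eigenvectors,
# lams holds the corresponding eigenvalues.
A = np.array([[2.0, 0.0],
              [1.0, 3.0]])
lams, V = np.linalg.eig(A)

x = np.array([4.0, -1.0])  # any vector we want to transform

# Step 1: "decouple" -- write x in eigenvector coordinates
# by solving V @ coords = x.
coords = np.linalg.solve(V, x)

# Step 2: each coordinate is handled independently, and each
# independent action is just scalar multiplication by an eigenvalue.
scaled = lams * coords

# Step 3: reassemble into ordinary coordinates.
result = V @ scaled

# The three simple steps reproduce the full matrix multiplication.
assert np.allclose(result, A @ x)
print(result)  # same as A @ x, i.e. [8. 1.]
```

So instead of one opaque operation, Ax becomes a few independent stretch/compress/flip actions along the eigenvector directions.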