Part of the linear algebra notes
Eigenstuff
(just taking notes during class on my computer)
- A matrix is a transformation that takes vectors to vectors
- If a matrix takes a vector to a scaled version of itself, this vector is an eigenvector
Examples:
- In 2d space, a uniform dilation around the origin scales every vector, so every vector is an eigenvector
- In 2d space, a nonuniform dilation (say, stretching x by 2 and y by 3) scales vectors along the axes but knocks every other vector off its own line, so only vectors along the axes are eigenvectors
- In 2d space, a rotation by 10 degrees moves every vector (apart from the zero vector) off its own line, so it has no real eigenvectors (the numpy sketch below checks all three of these)
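Quick numpy sanity check of those three examples (a minimal sketch; the specific matrices are my own picks, not from class):

```python
import numpy as np

theta = np.radians(10)
examples = {
    "uniform dilation":    2 * np.eye(2),        # scale everything by 2
    "nonuniform dilation": np.diag([2.0, 3.0]),  # stretch x by 2, y by 3
    "rotation by 10 deg":  np.array([[np.cos(theta), -np.sin(theta)],
                                     [np.sin(theta),  np.cos(theta)]]),
}

for name, M in examples.items():
    eigenvalues, _ = np.linalg.eig(M)
    print(f"{name}: {eigenvalues}")

# uniform dilation:    [2. 2.]  -> every direction is an eigenvector
# nonuniform dilation: [2. 3.]  -> only the x and y axes
# rotation by 10 deg:  two complex values (0.985 +/- 0.174j) -> no real eigenvectors
```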
The zero vector always satisfies the scaling condition (for every possible scale factor at once), but it’s boring, so by convention it doesn’t count as an eigenvector.
The amount that a particular eigenvector gets scaled during the transformation is its eigenvalue. So if a transformation A takes a vector v to λv, then v is an eigenvector of A and its eigenvalue is λ.
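As a concrete check (my own made-up matrix, not one from class): the matrix below scales the vector (1, 1) by 3, so (1, 1) is an eigenvector with eigenvalue 3.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v = np.array([1.0, 1.0])

Av = A @ v       # apply the transformation
print(Av)        # [3. 3.]  -> exactly 3 * v
print(Av / v)    # [3. 3.]  -> same scale factor in every component: eigenvalue 3
```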
trick to finding eigenvectors
It feels like it’d be easier to find the eigenvectors first and then work out their eigenvalues. But it’s actually easier to find the eigenvalues first and then look for the corresponding eigenvectors.
The usual trick is to subtract some number λ from each entry along the diagonal and tweak λ until the determinant equals 0, i.e. solve det(A − λI) = 0.
For a 2x2 matrix this is a quadratic in λ, and meh. There’s a good 3blue1brown video about what this exactly means. Anyway, you’ll get some solutions for λ, and you plug those lambdas into
(A − λ₁I)v = 0 for eigenvalue λ₁, (A − λ₂I)v = 0 for λ₂, etc., where v is the eigenvector you’re looking for, and λI is like the identity matrix but with λ in place of the 1’s on the diagonal.
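Worked out on the same made-up 2x2 matrix from the check above, so the whole trick is visible end to end:

```latex
A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}, \qquad
\det(A - \lambda I)
  = \begin{vmatrix} 2-\lambda & 1 \\ 1 & 2-\lambda \end{vmatrix}
  = (2-\lambda)^2 - 1
  = (\lambda - 1)(\lambda - 3)
```

Setting that to 0 gives λ = 1 or λ = 3. Plugging λ = 3 back in:

```latex
(A - 3I)\,v = \begin{pmatrix} -1 & 1 \\ 1 & -1 \end{pmatrix}
\begin{pmatrix} v_1 \\ v_2 \end{pmatrix} = \mathbf{0}
\quad\Longrightarrow\quad v_1 = v_2
\quad\Longrightarrow\quad v = \begin{pmatrix} 1 \\ 1 \end{pmatrix}
```

which matches the eigenvector checked earlier; λ = 1 gives v = (1, −1) the same way.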
algebraic multiplicity
you might get a repeated root, e.g. a characteristic polynomial like (2 − λ)²(5 − λ) = 0, where λ = 2 shows up twice, so its algebraic multiplicity is 2. A 3x3 matrix like that can still have 3 independent eigenvectors even though it only has 2 distinct eigenvalues (see the sketch below).
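A minimal numpy sketch of that (the diagonal matrix is my own example):

```python
import numpy as np

# characteristic polynomial here is (2 - λ)^2 (5 - λ):
# λ = 2 has algebraic multiplicity 2
A = np.diag([2.0, 2.0, 5.0])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)                          # [2. 2. 5.]
print(np.linalg.matrix_rank(eigenvectors))  # 3 -> three independent eigenvectors
```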
eigenspace
uhhhhh ok: all the eigenvectors that share a single eigenvalue (plus the zero vector) form a subspace, called the eigenspace of that eigenvalue.
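For example (continuing the made-up 3x3 matrix from the previous section), every vector in the xy-plane gets scaled by 2, so that whole plane is the eigenspace for λ = 2:

```latex
A = \begin{pmatrix} 2 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 5 \end{pmatrix}, \qquad
A \begin{pmatrix} a \\ b \\ 0 \end{pmatrix}
  = \begin{pmatrix} 2a \\ 2b \\ 0 \end{pmatrix}
  = 2 \begin{pmatrix} a \\ b \\ 0 \end{pmatrix}
  \quad \text{for any } a, b
```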