Part of the linear algebra notes
(just taking notes during class on my computer)
Examples:
The zero vector satisfies Av = λv trivially for every λ, so it's excluded from being an eigenvector by definition (it'd be a boring one anyway, so we don't talk about it).
The amount that a particular eigenvector gets scaled during the transformation is the eigenvalue. For example, if a transformation A takes a vector v to λv, then v is an eigenvector of A and its eigenvalue is λ.
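A quick sanity check of the scaling idea. The matrix and vector below are made-up example values, not ones from class:

```python
# Check Av = λv for a made-up 2x2 matrix and a vector along the x-axis.

def mat_vec(A, v):
    """Multiply a 2x2 matrix (list of rows) by a 2-vector."""
    return [A[0][0]*v[0] + A[0][1]*v[1],
            A[1][0]*v[0] + A[1][1]*v[1]]

A = [[3, 1],
     [0, 2]]
v = [1, 0]          # an eigenvector of this A
Av = mat_vec(A, v)  # comes out as [3, 0], i.e. 3*v, so the eigenvalue is 3
print(Av)
```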
It feels like it'd be easier to find the eigenvectors first and then look for their eigenvalues. But it's actually easier to find the eigenvalues first and then solve for the corresponding vectors.
The usual trick is to subtract a variable λ from everything along the diagonal and solve for the values of λ that make the determinant equal 0, i.e. det(A − λI) = 0.
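For a 2×2 matrix you can carry out that trick by hand, since the determinant condition expands to a quadratic in λ. A minimal sketch (the matrix here is a made-up example, and it assumes the eigenvalues are real):

```python
import math

# det(A - λI) = 0 for a 2x2 matrix [[a, b], [c, d]] expands to
#   (a - λ)(d - λ) - b*c = λ² - (a + d)λ + (a*d - b*c) = 0
# i.e. λ² - trace·λ + det = 0, solvable with the quadratic formula.

A = [[4, 1],
     [2, 3]]
trace = A[0][0] + A[1][1]
det = A[0][0]*A[1][1] - A[0][1]*A[1][0]

disc = trace**2 - 4*det            # assumes real eigenvalues (disc >= 0)
lam1 = (trace + math.sqrt(disc)) / 2
lam2 = (trace - math.sqrt(disc)) / 2
print(lam1, lam2)                  # the two eigenvalues
```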
For a 2×2 matrix this is a quadratic (the characteristic polynomial) and meh. There's a good 3Blue1Brown video about what this exactly means. Anyway, you'll get some solutions for λ, so you plug those lambdas into
(A − λᵢI)vᵢ = 0 for each eigenvalue λ₁, λ₂, etc., where vᵢ is the eigenvector you're looking for, and λI is like the identity matrix but with λ in place of the 1's on the diagonal.
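Plugging an eigenvalue back in looks like this for a 2×2 matrix. The rows of (A − λI) end up redundant, so one row a·x + b·y = 0 already pins down the direction: v = (b, −a) works whenever a or b is nonzero. The matrix and eigenvalue below are made-up example values:

```python
A = [[4, 1],
     [2, 3]]
lam = 5  # one root of det(A - λI) = 0 for this particular A

a = A[0][0] - lam   # first row of A - λI is [a, b]
b = A[0][1]
v = [b, -a]         # satisfies a*x + b*y = 0

# sanity check: A·v should equal λ·v
Av = [A[0][0]*v[0] + A[0][1]*v[1],
      A[1][0]*v[0] + A[1][1]*v[1]]
print(v, Av)
```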
You'll get a small system of linear equations in the components of v; the equations end up redundant, so any nonzero solution is an eigenvector.
It still has 3 eigenvectors even though it's …