Eigenvalues & Eigenvectors

Definition

Core Statement

For a square matrix A, an Eigenvector (v) is a non-zero vector whose direction is unchanged when multiplied by A; it is only scaled (or flipped, if the scale factor is negative). The Eigenvalue (λ) is the scalar factor by which it stretches or shrinks.

Av=λv
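
For a quick concrete check: with A = [[2, 0], [0, 3]], the vector v = [1, 0] gives Av = [2, 0] = 2·v, so v is an eigenvector with eigenvalue λ = 2 (and [0, 1] is an eigenvector with λ = 3).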

Purpose

  1. Principal Component Analysis (PCA): The eigenvectors of the covariance matrix are the "Principal Components" (directions of max variance). The eigenvalues represent the amount of variance explained.
  2. Google PageRank: The dominant eigenvector of the web graph matrix determines page importance (see the power-iteration sketch after this list).
  3. Stability Analysis: Determining if a system (physics, economics) will explode or settle down.
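
As a quick illustration of the PageRank bullet, here is a minimal power-iteration sketch on a made-up 3-page link matrix (the matrix and page count are hypothetical, chosen only for demonstration); repeated multiplication converges to the dominant eigenvector:

import numpy as np

# Hypothetical column-stochastic link matrix: entry [i, j] is the share of
# page j's importance that flows to page i
M = np.array([[0.0, 0.0, 1.0],
              [0.5, 0.0, 0.0],
              [0.5, 1.0, 0.0]])

rank = np.ones(3) / 3            # start with equal importance
for _ in range(100):             # power iteration
    rank = M @ rank
    rank = rank / rank.sum()     # keep it a probability vector

print("Page importance:", rank)  # approximates the dominant eigenvector of M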

Intuition

Imagine the transformation matrix A as stretching a piece of fabric: most arrows drawn on the fabric get rotated and stretched, but an arrow lying along an eigenvector direction is only stretched or shrunk by the factor λ; its direction stays the same.
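
A rough numerical illustration (using the covariance matrix and the approximate eigenvector from the worked example below): a generic vector changes direction under A, while the eigenvector only gets rescaled.

import numpy as np

A = np.array([[4, 2],
              [2, 3]])

generic = np.array([1.0, 0.0])     # an arbitrary vector
eig_dir = np.array([0.79, 0.61])   # approximate eigenvector of A

print(A @ generic)   # [4. 2.]  -> points in a new direction
print(A @ eig_dir)   # ~[4.38 3.41] ≈ 5.56 * [0.79 0.61] -> same direction, stretched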


Worked Example: PCA Context

Variance of Data

You have a dataset with Covariance Matrix Σ:

Σ = [[4, 2],
     [2, 3]]

You calculate eigenvalues and eigenvectors.

Result:

  1. λ1 = 5.56, v1 = [0.79, 0.61]
  2. λ2 = 1.44, v2 = [-0.61, 0.79]

Interpretation:

  • Direction: The data varies most along the direction [0.79,0.61]. This is PC1.
  • Magnitude: The variance along this axis is 5.56.
  • Proportion Explained: 5.56 / (5.56 + 1.44) = 5.56 / 7.00 ≈ 79.4%.
  • PC1 captures 79.4% of the information (reproduced in the short sketch below).
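
A minimal NumPy sketch to reproduce these numbers (the sorting step and variable names are my own; eigenvector signs may differ):

import numpy as np

Sigma = np.array([[4, 2],
                  [2, 3]])

vals, vecs = np.linalg.eig(Sigma)
order = np.argsort(vals)[::-1]            # sort by descending variance
vals, vecs = vals[order], vecs[:, order]

explained = vals / vals.sum()             # proportion of variance per component
print("PC1 direction:", vecs[:, 0])       # ~[0.79, 0.61] (up to sign)
print("Explained variance:", explained)   # ~[0.794, 0.206]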

Assumptions

  • A must be a square matrix; eigenvalues and eigenvectors are not defined for rectangular matrices.
  • A real symmetric matrix (e.g., a covariance matrix) always has real eigenvalues and orthogonal eigenvectors; a general square matrix may have complex eigenvalues.

Properties

  • Trace: the sum of the eigenvalues equals the sum of the diagonal elements (the trace of A).
  • Determinant: the product of the eigenvalues equals the determinant of A.
  • Symmetric matrix: eigenvalues are real; eigenvectors are orthogonal (perpendicular).
  • Covariance matrix: always symmetric and positive semi-definite, so every λ ≥ 0.
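
Each of these can be sanity-checked numerically; a small sketch using the covariance matrix from the worked example:

import numpy as np

A = np.array([[4, 2],
              [2, 3]])
vals, vecs = np.linalg.eig(A)

print(np.isclose(vals.sum(), np.trace(A)))         # trace = sum of eigenvalues
print(np.isclose(vals.prod(), np.linalg.det(A)))   # determinant = product of eigenvalues
print(np.isclose(vecs[:, 0] @ vecs[:, 1], 0))      # symmetric => orthogonal eigenvectors
print(bool(np.all(vals >= 0)))                     # PSD covariance matrix => all λ ≥ 0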

Python Implementation

import numpy as np

# 2x2 Covariance Matrix
A = np.array([[4, 2], 
              [2, 3]])

# Compute eigenvalues and eigenvectors (each column of `eigenvectors` is one eigenvector)
eigenvalues, eigenvectors = np.linalg.eig(A)

print("Eigenvalues:", eigenvalues)
print("Eigenvectors:\n", eigenvectors)

# Check Av = lambda v
v1 = eigenvectors[:, 0]
lam1 = eigenvalues[0]

print("Av:", A @ v1)
print("lambda*v:", lam1 * v1)