Eigenvalue Calculator

What Are Eigenvalues and Eigenvectors?

Eigenvalues and eigenvectors are fundamental concepts in linear algebra that reveal the essential behavior of linear transformations. When a matrix transforms a vector, most vectors change both direction and magnitude. However, eigenvectors are special vectors that are only stretched or compressed: they keep their original direction (or reverse it exactly, when the eigenvalue is negative).

The eigenvalue tells you by how much the eigenvector gets scaled. If Av = λv, then v is an eigenvector of matrix A, and λ (lambda) is the corresponding eigenvalue. The equation says: applying matrix A to vector v produces the same result as simply multiplying v by the scalar λ.

Fundamental Equation

Av = λv

Where:

  • A is the matrix
  • v is the eigenvector (non-zero)
  • λ (lambda) is the eigenvalue
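
As a quick numerical check of Av = λv, here is a minimal sketch using NumPy (the matrix and vector below are illustrative choices, not taken from any specific calculator input):

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # illustrative matrix
v = np.array([1.0, 1.0])     # [1, 1] is an eigenvector of this A
lam = 3.0                    # its eigenvalue, since A @ [1, 1] = [3, 3] = 3 * [1, 1]

print(A @ v)                 # [3. 3.]
print(lam * v)               # [3. 3.]
print(np.allclose(A @ v, lam * v))   # True: Av and λv agree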

Calculating Eigenvalues for 2×2 Matrices

For a 2×2 matrix, eigenvalues can be found using a direct formula derived from the characteristic equation.

Step 1: Write the Characteristic Equation

For matrix A = [a b; c d], we solve det(A - λI) = 0:

det([a-λ b; c d-λ]) = 0

(a-λ)(d-λ) - bc = 0

λ² - (a+d)λ + (ad-bc) = 0

Step 2: Apply the Quadratic Formula

Eigenvalue Formula for 2×2 Matrix

λ = [trace ± √(trace² - 4·det)] / 2

Where:

  • trace = a + d (sum of diagonal)
  • det = ad - bc (determinant)
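
The formula translates directly into code. Here is a small sketch in plain Python (the function name is my own); cmath.sqrt is used so that a negative discriminant automatically produces the complex-conjugate pair:

import cmath

def eigenvalues_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] via the trace/determinant formula."""
    trace = a + d
    det = a * d - b * c
    disc = cmath.sqrt(trace ** 2 - 4 * det)   # complex sqrt handles a negative discriminant
    return (trace + disc) / 2, (trace - disc) / 2

print(eigenvalues_2x2(4, 2, 1, 3))   # ((5+0j), (2+0j))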

Example Calculation

Find eigenvalues of A = [4 2; 1 3]:

  • trace = 4 + 3 = 7
  • det = 4×3 - 2×1 = 10
  • λ = [7 ± √(49 - 40)] / 2 = [7 ± 3] / 2
  • λ₁ = 5, λ₂ = 2
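
The result is easy to cross-check with NumPy's built-in routine (a verification sketch; the order of the returned eigenvalues is not guaranteed):

import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
print(np.linalg.eigvals(A))   # approximately [5. 2.]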

Finding Eigenvectors

Once you have an eigenvalue λ, find its eigenvector by solving (A - λI)v = 0. This gives a system of linear equations whose non-trivial solutions are the eigenvectors.

Eigenvector Example

For A = [4 2; 1 3] with λ₁ = 5:

Solve (A - 5I)v = 0:

[-1 2; 1 -2][x; y] = [0; 0]

-x + 2y = 0, so x = 2y

Eigenvector: v₁ = [2, 1] (or any scalar multiple)
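
np.linalg.eig returns eigenvalues together with unit-length eigenvectors, so the direction [2, 1] found above appears in normalized form, about [0.894, 0.447] (a verification sketch; the sign of the returned vector may be flipped):

import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)   # eigenvectors are the columns
i = int(np.argmax(eigenvalues))                # index of the eigenvalue 5
print(eigenvalues)                             # [5. 2.]
print(eigenvectors[:, i])                      # ~[0.894 0.447] = [2, 1] / sqrt(5), up to sign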

Types of Eigenvalues

Real Distinct Eigenvalues

When the discriminant (trace² - 4·det) is positive, you get two different real eigenvalues. Each has its own eigenvector direction. The matrix stretches space differently along these two independent directions.

Real Repeated Eigenvalues

When the discriminant equals zero, there's only one eigenvalue (with multiplicity 2). The matrix might have one or two linearly independent eigenvectors, affecting its diagonalizability.

Complex Eigenvalues

When the discriminant is negative, eigenvalues come as complex conjugate pairs (a ± bi). This occurs when the matrix involves rotation. Complex eigenvalues indicate oscillatory behavior in dynamical systems.
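
The three cases can be told apart from the sign of the discriminant alone, as in this small sketch (the helper name is my own):

def eigenvalue_type_2x2(a, b, c, d):
    """Classify the eigenvalues of [[a, b], [c, d]] by the sign of the discriminant."""
    trace = a + d
    det = a * d - b * c
    disc = trace ** 2 - 4 * det
    if disc > 0:
        return "real and distinct"
    if disc == 0:
        return "real and repeated"
    return "complex conjugate pair"

print(eigenvalue_type_2x2(4, 2, 1, 3))   # real and distinct
print(eigenvalue_type_2x2(1, 0, 0, 1))   # real and repeated (identity matrix)
print(eigenvalue_type_2x2(0, -1, 1, 0))  # complex conjugate pair (90° rotation)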

Geometric Interpretation

Eigenvectors as Invariant Directions

Eigenvectors define directions that are preserved by the linear transformation. When you apply the matrix, vectors along eigenvector directions simply get scaled (stretched, compressed, or flipped), but they don't rotate to point in a different direction.

Eigenvalues as Scale Factors

  • λ > 1: Stretches vectors along the eigenvector direction
  • 0 < λ < 1: Compresses vectors
  • λ = 1: Leaves vectors along that direction unchanged
  • λ < 0: Flips direction (and scales by |λ|)
  • λ = 0: Collapses to zero (singular matrix)

Complex Eigenvalues and Rotation

When eigenvalues are complex (a ± bi), the transformation combines rotation with scaling. The rotation angle is θ = atan2(b, a) (which reduces to arctan(b/a) when a > 0), and the scaling factor is |λ| = √(a² + b²). A pure rotation matrix has eigenvalues with magnitude 1.
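
For example, the rotation-scaling matrix [[a, -b], [b, a]] has eigenvalues a ± bi; the angle and scale can be read off with atan2 and hypot (a minimal sketch, with illustrative numbers):

import math
import numpy as np

a, b = 1.0, 1.0
M = np.array([[a, -b],
              [b,  a]])                    # rotation combined with scaling
print(np.linalg.eigvals(M))                # [1.+1.j 1.-1.j], i.e. a ± bi
print(math.degrees(math.atan2(b, a)))      # 45.0 degrees of rotation
print(math.hypot(a, b))                    # ~1.414 scaling per application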

Properties of Eigenvalues

Sum Equals Trace

The sum of all eigenvalues (counted with multiplicity) equals the trace (sum of diagonal elements) of the matrix: λ₁ + λ₂ + ... + λₙ = tr(A)

Product Equals Determinant

The product of all eigenvalues equals the determinant: λ₁ × λ₂ × ... × λₙ = det(A)

Eigenvalues of Powers

If λ is an eigenvalue of A, then λⁿ is an eigenvalue of Aⁿ with the same eigenvector.

Eigenvalues of Inverse

If λ is an eigenvalue of A, then 1/λ is an eigenvalue of A⁻¹ (assuming A is invertible and λ ≠ 0).
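
All four properties are easy to verify numerically. A sketch using the example matrix from earlier:

import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
lam = np.linalg.eigvals(A)                           # [5. 2.]

print(np.isclose(lam.sum(), np.trace(A)))            # True: 5 + 2 = 7 = trace
print(np.isclose(lam.prod(), np.linalg.det(A)))      # True: 5 * 2 = 10 = det
print(np.sort(np.linalg.eigvals(A @ A)))             # [ 4. 25.]: eigenvalues squared
print(np.sort(np.linalg.eigvals(np.linalg.inv(A))))  # [0.2 0.5]: reciprocals 1/5 and 1/2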

Applications of Eigenvalues

Principal Component Analysis (PCA)

PCA uses eigenvectors of the covariance matrix to find directions of maximum variance in data. The eigenvalues indicate how much variance each principal component captures. This is fundamental to dimensionality reduction and data visualization.
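
A minimal PCA sketch along these lines, using synthetic data (in practice a library such as scikit-learn is typically used):

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0],
                                          [1.0, 0.5]])   # correlated 2-D data

Xc = X - X.mean(axis=0)                           # center the data
cov = np.cov(Xc, rowvar=False)                    # covariance matrix
eigenvalues, eigenvectors = np.linalg.eigh(cov)   # eigh: the covariance matrix is symmetric

order = np.argsort(eigenvalues)[::-1]             # largest variance first
print(eigenvalues[order])                         # variance captured by each principal component
print(eigenvectors[:, order])                     # columns are the principal directions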

Google's PageRank Algorithm

The original PageRank algorithm finds the dominant eigenvector of a modified web link matrix. This eigenvector gives the "importance" score for each webpage. The eigenvalue 1 is guaranteed to exist for stochastic matrices.
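
A toy power-iteration sketch of this idea (the three-page link matrix and damping value are illustrative, not real web data):

import numpy as np

# Column-stochastic link matrix: column j spreads page j's vote over the pages it links to.
L = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])
d = 0.85                                    # damping factor, a common illustrative choice
n = L.shape[0]
G = d * L + (1 - d) / n * np.ones((n, n))   # damped matrix, still column-stochastic

r = np.ones(n) / n                          # start from a uniform rank vector
for _ in range(100):                        # power iteration converges toward the
    r = G @ r                               # dominant eigenvector (eigenvalue 1)
print(r)                                    # importance scores, summing to 1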

Quantum Mechanics

In quantum mechanics, observables are represented by operators (matrices). Eigenvalues represent possible measurement outcomes, and eigenvectors represent the states that yield those measurements with certainty.
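
A tiny illustration with the Pauli-X observable of a two-level system (purely illustrative; a full treatment uses complex Hermitian operators):

import numpy as np

X = np.array([[0.0, 1.0],
              [1.0, 0.0]])                  # Pauli-X observable
outcomes, states = np.linalg.eigh(X)
print(outcomes)                             # [-1.  1.]: the possible measurement results
print(states)                               # columns: the corresponding states, each ~[1, ±1]/sqrt(2) up to sign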

Vibration Analysis

Natural frequencies of vibrating systems are eigenvalues of the system matrix. Eigenvectors represent the mode shapes—patterns in which the system naturally oscillates.
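
For a mass matrix M and stiffness matrix K, the natural frequencies come from the generalized eigenvalue problem K·v = ω²·M·v. A toy two-mass sketch (the matrices are illustrative), using scipy.linalg.eigh for the generalized problem:

import numpy as np
from scipy.linalg import eigh

M = np.diag([1.0, 1.0])                     # unit masses
K = np.array([[ 2.0, -1.0],                 # two masses coupled by unit springs
              [-1.0,  2.0]])

omega_sq, modes = eigh(K, M)                # solves K v = (omega^2) M v
print(np.sqrt(omega_sq))                    # natural frequencies: [1.0, ~1.732]
print(modes)                                # columns are the mode shapes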

Stability Analysis

In dynamical systems described by differential equations, eigenvalues determine stability. If all eigenvalues have negative real parts, the system is stable. If any eigenvalue has a positive real part, the system is unstable.
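
For a linear system x' = Ax, the check reduces to inspecting the real parts (a small sketch with an illustrative system matrix):

import numpy as np

A = np.array([[-1.0,  2.0],                 # illustrative system matrix for x' = A x
              [-2.0, -1.0]])
eigenvalues = np.linalg.eigvals(A)          # -1 + 2j and -1 - 2j
stable = bool(np.all(eigenvalues.real < 0))
print(eigenvalues, "stable" if stable else "unstable")   # stable: a decaying oscillation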

Markov Chains

The long-term behavior of Markov chains is determined by the eigenvector corresponding to eigenvalue 1 of the transition matrix. This gives the steady-state probability distribution.
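
The steady state can be extracted as the eigenvector for eigenvalue 1, rescaled so its entries sum to 1 (a sketch with an illustrative two-state transition matrix):

import numpy as np

# Column-stochastic transition matrix: P[i, j] = probability of moving from state j to state i.
P = np.array([[0.9, 0.5],
              [0.1, 0.5]])
eigenvalues, eigenvectors = np.linalg.eig(P)

i = int(np.argmin(np.abs(eigenvalues - 1.0)))   # the eigenvalue closest to 1
steady = np.real(eigenvectors[:, i])
steady = steady / steady.sum()                  # normalize into a probability distribution
print(steady)                                   # ~[0.833 0.167]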

Special Matrices and Their Eigenvalues

Symmetric Matrices

Real symmetric matrices always have real eigenvalues, and their eigenvectors corresponding to different eigenvalues are orthogonal (this is the spectral theorem). The property is crucial in applications such as PCA and quantum mechanics.
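
NumPy's eigh routine is built around exactly this structure. A quick sketch with an illustrative symmetric matrix:

import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])                      # real symmetric matrix
eigenvalues, eigenvectors = np.linalg.eigh(S)   # eigh assumes (and exploits) symmetry
print(eigenvalues)                              # [1. 3.]: always real
print(eigenvectors.T @ eigenvectors)            # ~identity: the eigenvectors are orthonormal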

Orthogonal Matrices

Orthogonal matrices (rotations and reflections) have eigenvalues with magnitude 1, so their eigenvalues lie on the unit circle in the complex plane.

Diagonal Matrices

The eigenvalues of a diagonal matrix are simply its diagonal entries. The eigenvectors are the standard basis vectors.

Triangular Matrices

For triangular matrices, eigenvalues are the diagonal entries (same as diagonal matrices).

The Characteristic Polynomial

The characteristic polynomial p(λ) = det(λI - A), which has the same roots as the equation det(A - λI) = 0 used above, is a monic polynomial of degree n for an n×n matrix. Its roots are the eigenvalues. The coefficients encode important information:

  • Constant term: (-1)ⁿ det(A)
  • Coefficient of λⁿ⁻¹: -trace(A)
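
NumPy can produce these coefficients directly: np.poly(A) returns the monic characteristic polynomial's coefficients, and np.roots recovers the eigenvalues (a quick sketch using the earlier example matrix):

import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
coeffs = np.poly(A)        # [ 1. -7. 10.], i.e. λ² - 7λ + 10
print(coeffs)
print(np.roots(coeffs))    # [5. 2.]: the eigenvalues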

Frequently Asked Questions

Can eigenvalues be zero?

Yes. An eigenvalue of zero means the matrix is singular (non-invertible). It indicates that the transformation collapses some direction to zero.

Can a matrix have complex eigenvalues if its entries are real?

Yes. A real matrix can have complex eigenvalues, which always come in conjugate pairs (a+bi and a-bi). This happens when the matrix involves rotation.

What does it mean if eigenvalues are equal?

Repeated eigenvalues can occur. The matrix might still have two independent eigenvectors (like the identity matrix) or only one (making it non-diagonalizable).

Why are eigenvalues important for stability?

In differential equations and dynamical systems, eigenvalues determine whether solutions grow, decay, or oscillate. Eigenvalues with negative real parts mean decay (stability); any eigenvalue with a positive real part means growth (instability).