Matrix (Linear) Algebra


  • Introduction
  • Matrix multiplication
  • Transpose
  • Gaussian Elimination
  • Determinants
  • Inverse of a matrix
  • Cramer's rule
  • Eigenvalues and eigenvectors
  • Projections
  • Diagonalization


Introduction

  • What is a matrix?
  • Which operations can we do between two matrices? (SHS page 551)

If you have never seen matrices before, I strongly suggest completing Khan Academy's lectures on linear algebra.

Matrix multiplication

How can we represent a system of equations in matrix form?

See the example on SHS page 552 for a motivation for matrix multiplication.

Rules for matrix multiplication.

What happens when we multiply by the identity matrix?
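As a quick numerical check of these rules (a hypothetical example, not from SHS):

```python
import numpy as np

# Two hypothetical 2 x 2 matrices.
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Entry (i, j) of AB is the dot product of row i of A with column j of B.
AB = A @ B

# Multiplying by the identity matrix leaves A unchanged: AI = IA = A.
I = np.eye(2)
```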


Transpose

What is the transpose?

What is the transpose of a symmetric matrix?

Where does the transpose appear in econometrics?
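One standard appearance is the OLS estimator, where $X$ is the matrix of regressors and $y$ the vector of outcomes:

$$ \hat{\beta} = (X'X)^{-1} X'y $$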


Dowling 10.1

Gaussian Elimination

The three elementary row operations:

  1. Interchange any pair of rows
  2. Multiply any row by a scalar
  3. Add any multiple of one row to a different row
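The row operations above can be sketched in code. A minimal Python/NumPy sketch without pivoting safeguards (operation 1, row interchange, is omitted, so it assumes the pivots encountered are nonzero):

```python
import numpy as np

def gaussian_elimination(A, b):
    """Solve Ax = b by forward elimination and back substitution.

    A minimal sketch: no row interchanges, so it assumes every pivot
    encountered is nonzero.
    """
    M = np.hstack([A.astype(float), b.reshape(-1, 1).astype(float)])
    n = len(b)
    for i in range(n):
        M[i] = M[i] / M[i, i]                 # operation 2: scale row i
        for j in range(i + 1, n):
            M[j] = M[j] - M[j, i] * M[i]      # operation 3: add a multiple of row i
    # Back substitution on the resulting upper-triangular system.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = M[i, -1] - M[i, i + 1:n] @ x[i + 1:]
    return x
```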


Determinants

Start with determinants of 2 x 2 matrices

Calculate the solution to the following system of equations:

$$ a_{11}x_1 + a_{12}x_2 = b_1 \\ a_{21}x_1 + a_{22}x_2 = b_2 $$

What is the denominator?

Could we also express the numerator as a determinant?

What is the geometric interpretation?

Details on SHS Page 585.
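As a sketch of the answers: yes, both the numerator and the denominator can be written as determinants (Cramer's rule):

$$ x_1 = \frac{\begin{vmatrix} b_1 & a_{12} \\ b_2 & a_{22} \end{vmatrix}}{\begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{vmatrix}} = \frac{b_1 a_{22} - b_2 a_{12}}{a_{11}a_{22} - a_{12}a_{21}}, \qquad x_2 = \frac{\begin{vmatrix} a_{11} & b_1 \\ a_{21} & b_2 \end{vmatrix}}{\begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{vmatrix}} = \frac{a_{11}b_2 - a_{21}b_1}{a_{11}a_{22} - a_{12}a_{21}} $$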


Dowling 11.1

Determinants of 3 x 3 matrices

This is more difficult, so we use:

  • cofactor expansion
  • diagonals shortcut method (Sarrus's rule)
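A minimal Python sketch of cofactor expansion along the first row, checked against NumPy's built-in determinant (the example matrix is hypothetical):

```python
import numpy as np

def det3_cofactor(A):
    """Determinant of a 3 x 3 matrix by cofactor expansion along the first row."""
    def det2(a, b, c, d):
        # Determinant of the 2 x 2 matrix [[a, b], [c, d]].
        return a * d - b * c
    return (A[0, 0] * det2(A[1, 1], A[1, 2], A[2, 1], A[2, 2])
            - A[0, 1] * det2(A[1, 0], A[1, 2], A[2, 0], A[2, 2])
            + A[0, 2] * det2(A[1, 0], A[1, 1], A[2, 0], A[2, 1]))
```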

What is the geometric interpretation?

Matrix Inverse

Find a matrix $A^{-1}$ such that $A A^{-1} = I$.

$A$ has an inverse if $|A| \neq 0$.

Find a formula for the 2 x 2 matrix inverse using Cramer's rule.
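For reference, writing $A$ with generic entries, the formula works out to:

$$ A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}, \qquad A^{-1} = \frac{1}{ad - bc} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}, \qquad ad - bc \neq 0 $$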

We can use the matrix inverse to solve a system of equations.

Find the inverse using elementary row operations.
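A sketch of that procedure (Gauss-Jordan elimination on the augmented matrix $[A \mid I]$), again assuming nonzero pivots so no row interchanges are needed:

```python
import numpy as np

def inverse_by_row_ops(A):
    """Invert A by Gauss-Jordan elimination on the augmented matrix [A | I].

    A minimal sketch: assumes every pivot is nonzero, so no row
    interchanges are performed.
    """
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for i in range(n):
        M[i] = M[i] / M[i, i]                 # scale the pivot row
        for j in range(n):
            if j != i:
                M[j] = M[j] - M[j, i] * M[i]  # clear column i in every other row
    # The left block is now I, so the right block is A^{-1}.
    return M[:, n:]
```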

Eigenvalues and eigenvectors

Khan Academy has a great sequence of videos on eigenvalues and eigenvectors.

This video is also helpful for motivating why eigenvalues are important for differential equations.

There are exercises in Dowling on page 280 if you need some practice.
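A quick NumPy illustration with a hypothetical symmetric matrix (`np.linalg.eigh`, for symmetric matrices, returns eigenvalues in ascending order):

```python
import numpy as np

# Hypothetical symmetric 2 x 2 example.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigvals are in ascending order; columns of eigvecs are the eigenvectors.
eigvals, eigvecs = np.linalg.eigh(A)

# Each eigenpair satisfies A v = lambda v.
```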


Diagonalization

Watch this video.

Note that if a matrix $A$ is symmetric, its matrix of eigenvectors $\Lambda$ will be orthogonal. If a matrix is orthogonal, its inverse equals its transpose: $\Lambda^{-1} = \Lambda'$. (See here for a proof.)
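A numerical sketch of this claim, using a hypothetical symmetric matrix and writing the eigenvector matrix as `Q`:

```python
import numpy as np

# Hypothetical symmetric matrix.
A = np.array([[4.0, 1.0],
              [1.0, 4.0]])

eigvals, Q = np.linalg.eigh(A)   # columns of Q are orthonormal eigenvectors

# Orthogonality means Q's inverse is its transpose (Q^{-1} = Q'),
# and A diagonalizes as A = Q diag(eigvals) Q'.
```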

See Chiang and Wainwright page 310 example 6 for one application of diagonalization.


Watch the following videos from Ben Lambert: