2.7. Summary and further resources

Specific learning goals for this chapter

  • Be familiar with basic linear algebra concepts, including subspace, span, column space, linear independence, basis, dimension, and orthogonality.

  • State the Pythagorean theorem and the Cauchy-Schwarz inequality.

  • Compute the expansion of a vector in an orthonormal basis.

  • Define invertibility of a matrix.

  • State the orthogonal projection theorem. Compute the orthogonal projection of a vector given an orthonormal basis of a subspace. Compute the matrix representation of the orthogonal projection.

  • Define the concept of an orthogonal matrix.

  • State the linear least squares problem and its solution through the normal equations; implement in NumPy.

  • State the Gram-Schmidt theorem and describe its proof through Gram-Schmidt orthonormalization; compute the output of Gram-Schmidt on simple examples; implement in NumPy.

  • Define the QR decomposition and explain its connection to Gram-Schmidt.

  • Describe the back substitution and forward substitution procedures, and perform them on simple systems; implement in NumPy.

  • State all the steps involved in solving least squares via QR.

  • Define the linear and polynomial regression problems; describe how to solve them as least squares problems; implement in NumPy.
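To illustrate the orthogonal projection goal above, here is a minimal NumPy sketch (the basis and vector are illustrative, not from the chapter): given a matrix `Q` whose columns form an orthonormal basis of a subspace, the projection of `v` is `Q (Q^T v)` and the projection matrix is `P = Q Q^T`.

```python
import numpy as np

# Columns of Q: an orthonormal basis of the xy-plane in R^3 (illustrative example)
Q = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

v = np.array([3.0, 4.0, 5.0])

# Orthogonal projection of v onto col(Q)
proj = Q @ (Q.T @ v)          # array([3., 4., 0.])

# Matrix representation of the projection
P = Q @ Q.T                   # satisfies P @ P == P (idempotent)
```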
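For the least squares goal, a minimal NumPy sketch of solving via the normal equations `A^T A x = A^T b` (the random data and seed are placeholders for the chapter's examples):

```python
import numpy as np

rng = np.random.default_rng(0)          # illustrative data, not from the chapter
A = rng.standard_normal((10, 3))        # tall matrix, full column rank (generically)
b = rng.standard_normal(10)

# Normal equations: A^T A x = A^T b
x = np.linalg.solve(A.T @ A, A.T @ b)
```

In practice one avoids forming `A^T A` explicitly (it squares the condition number), which is what motivates the QR-based approach later in the list.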
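For the Gram-Schmidt goal, a sketch of classical Gram-Schmidt orthonormalization in NumPy (assuming linearly independent columns; the function name is mine, not the chapter's):

```python
import numpy as np

def gram_schmidt(A):
    """Orthonormalize the columns of A, assumed linearly independent."""
    n, m = A.shape
    Q = np.zeros((n, m))
    for j in range(m):
        v = A[:, j].copy()
        # Subtract the projection onto each previously computed basis vector
        for i in range(j):
            v -= (Q[:, i] @ A[:, j]) * Q[:, i]
        Q[:, j] = v / np.linalg.norm(v)
    return Q
```

The columns of `Q` are orthonormal and span the same subspace as the columns of `A`, which is exactly the content of the Gram-Schmidt theorem.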
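For the back substitution and least-squares-via-QR goals, a combined sketch: factor `A = QR`, then solve the upper-triangular system `R x = Q^T b` by back substitution (the data and helper name are illustrative):

```python
import numpy as np

def back_substitution(R, c):
    """Solve R x = c for an upper-triangular R with nonzero diagonal."""
    m = R.shape[0]
    x = np.zeros(m)
    for i in range(m - 1, -1, -1):
        # Peel off the already-solved entries, then divide by the pivot
        x[i] = (c[i] - R[i, i + 1:] @ x[i + 1:]) / R[i, i]
    return x

rng = np.random.default_rng(1)          # illustrative data
A = rng.standard_normal((8, 3))
b = rng.standard_normal(8)

# Least squares via QR: A = QR, then R x = Q^T b
Q, R = np.linalg.qr(A)
x = back_substitution(R, Q.T @ b)
```

Forward substitution for lower-triangular systems is the mirror image, looping over rows from first to last.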
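For the regression goal, a sketch of polynomial regression cast as a least squares problem: build a design matrix with columns `1, x, x^2` and fit the coefficients (the noiseless toy data below is mine, for illustration only):

```python
import numpy as np

# Toy data from y = 1 + 2x + 3x^2 (noiseless, for illustration)
x = np.linspace(-1.0, 1.0, 20)
y = 1.0 + 2.0 * x + 3.0 * x**2

# Vandermonde-style design matrix: columns 1, x, x^2
A = np.column_stack([np.ones_like(x), x, x**2])

# Solve the least squares problem min ||A c - y||
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)   # recovers [1, 2, 3]
```

Linear regression is the special case with columns `1, x`; the same normal-equations or QR machinery applies unchanged.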

Just the Code

An interactive Jupyter notebook featuring the code in this chapter can be accessed below (Google Colab recommended). It is also available as a slideshow.

Auto-quizzes

Automatically generated quizzes for this chapter can be accessed here (Google Colab recommended):