{"nbformat_minor": 0, "nbformat": 3, "worksheets": [{"cells": [{"cell_type": "markdown", "source": ["# Least Squares using the SVD"], "metadata": {}}, {"outputs": [], "cell_type": "code", "collapsed": false, "prompt_number": 2, "language": "python", "input": ["import numpy as np\n", "import numpy.linalg as la\n", "import scipy.linalg as spla"], "metadata": {}}, {"outputs": [], "cell_type": "code", "collapsed": false, "prompt_number": 19, "language": "python", "input": ["# tall and skinny w/nullspace\n", "np.random.seed(12)\n", "A = np.random.randn(6, 4)\n", "b = np.random.randn(6)\n", "A[3] = A[4] + A[5]\n", "A[1] = A[5] + A[1]\n", "A[2] = A[3] + A[1]\n", "A[0] = A[3] + A[1]"], "metadata": {}}, {"cell_type": "markdown", "source": ["### Part I: Singular least squares using QR\n", "\n", "Let's see how successfully we can solve the least squares problem **when the matrix has a nullspace** using QR:"], "metadata": {}}, {"outputs": [], "cell_type": "code", "collapsed": false, "prompt_number": 4, "language": "python", "input": ["Q, R = la.qr(A)"], "metadata": {}}, {"outputs": [{"prompt_number": 5, "output_type": "pyout", "text": ["array([[-4.526, 3.492, -0.204, -3.647],\n", " [ 0. , 0.796, 0.034, 0.603],\n", " [ 0. , 0. , -1.459, 0.674],\n", " [ 0. , -0. , -0. , 0. 
]])"], "metadata": {}}], "cell_type": "code", "collapsed": false, "prompt_number": 5, "language": "python", "input": ["R.round(3)"], "metadata": {}}, {"cell_type": "markdown", "source": ["We can choose `x_qr[3]` as we please:"], "metadata": {}}, {"outputs": [], "cell_type": "code", "collapsed": false, "prompt_number": 6, "language": "python", "input": ["x_qr = np.zeros(A.shape[1])"], "metadata": {}}, {"outputs": [], "cell_type": "code", "collapsed": false, "prompt_number": 20, "language": "python", "input": ["x_qr[3] = 0"], "metadata": {}}, {"outputs": [], "cell_type": "code", "collapsed": false, "prompt_number": 21, "language": "python", "input": ["QTbnew = Q.T.dot(b)[:3,] - R[:3, 3] * x_qr[3]\n", "x_qr[:3] = spla.solve_triangular(R[:3,:3], QTbnew, lower=False)"], "metadata": {}}, {"cell_type": "markdown", "source": ["Let's take a look at the residual norm and the norm of `x_qr`:"], "metadata": {}}, {"outputs": [{"prompt_number": 22, "output_type": "pyout", "text": ["array([ -4.44089210e-16, 0.00000000e+00, -1.11022302e-16,\n", " -7.35042876e-01])"], "metadata": {}}], "cell_type": "code", "collapsed": false, "prompt_number": 22, "language": "python", "input": ["R.dot(x_qr)-Q.T.dot(b)[:4]"], "metadata": {}}, {"outputs": [{"prompt_number": 23, "output_type": "pyout", "text": ["2.1267152888030982"], "metadata": {}}], "cell_type": "code", "collapsed": false, "prompt_number": 23, "language": "python", "input": ["la.norm(A.dot(x_qr)-b, 2)"], "metadata": {}}, {"outputs": [{"prompt_number": 24, "output_type": "pyout", "text": ["0.82393512974131577"], "metadata": {}}], "cell_type": "code", "collapsed": false, "prompt_number": 24, "language": "python", "input": ["la.norm(x_qr, 2)"], "metadata": {}}, {"cell_type": "markdown", "source": ["Choose a different `x_qr[3]` and compare residual and norm of `x_qr`."], "metadata": {}}, {"cell_type": "markdown", "source": ["--------------\n", "### Part II: Solving least squares using the SVD\n", "Now compute the SVD of $A$:"], 
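"metadata": {}}, {"cell_type": "markdown", "source": ["Before doing so, a quick sanity check (a sketch, assuming the setup above): we built three rows of $A$ as combinations of the others, so $A$ should have rank 3. `numpy.linalg.matrix_rank`, which itself thresholds the singular values of its argument, lets us confirm this:"], "metadata": {}}, {"outputs": [], "cell_type": "code", "collapsed": false, "language": "python", "input": ["# A was constructed with three dependent rows, so its rank should be 3,\n", "# i.e. one of its four singular values should be (numerically) zero.\n", "la.matrix_rank(A)"],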
"metadata": {}}, {"outputs": [], "cell_type": "code", "collapsed": false, "prompt_number": 25, "language": "python", "input": ["U, sigma, VT = la.svd(A)"], "metadata": {}}, {"cell_type": "markdown", "source": ["Make a matrix `Sigma` of the correct size:"], "metadata": {}}, {"outputs": [], "cell_type": "code", "collapsed": false, "prompt_number": 26, "language": "python", "input": ["Sigma = np.zeros(A.shape)\n", "Sigma[:4,:4] = np.diag(sigma)"], "metadata": {}}, {"cell_type": "markdown", "source": ["And check that we've actually factorized `A`:"], "metadata": {}}, {"outputs": [{"prompt_number": 27, "output_type": "pyout", "text": ["array([[ 0., -0., 0., 0.],\n", " [ 0., -0., 0., 0.],\n", " [ 0., -0., 0., 0.],\n", " [ 0., -0., -0., 0.],\n", " [ 0., -0., -0., 0.],\n", " [ 0., 0., 0., 0.]])"], "metadata": {}}], "cell_type": "code", "collapsed": false, "prompt_number": 27, "language": "python", "input": ["(U.dot(Sigma).dot(VT) - A).round(4)"], "metadata": {}}, {"cell_type": "markdown", "source": ["Now define `Sigma_pinv` as the \"pseudo-\"inverse of `Sigma`, where \"pseudo\" means \"don't divide by zero\":"], "metadata": {}}, {"outputs": [{"prompt_number": 28, "output_type": "pyout", "text": ["array([[ 0.147, 0. , 0. , 0. , 0. , 0. ],\n", " [ 0. , 0.624, 0. , 0. , 0. , 0. ],\n", " [ 0. , 0. , 1.055, 0. , 0. , 0. ],\n", " [ 0. , 0. , 0. , 0. , 0. , 0. 
]])"], "metadata": {}}], "cell_type": "code", "collapsed": false, "prompt_number": 28, "language": "python", "input": ["Sigma_pinv = np.zeros(A.shape).T\n", "Sigma_pinv[:3,:3] = np.diag(1/sigma[:3])\n", "Sigma_pinv.round(3)"], "metadata": {}}, {"cell_type": "markdown", "source": ["Now compute the SVD-based solution for the least-squares problem:"], "metadata": {}}, {"outputs": [], "cell_type": "code", "collapsed": false, "prompt_number": 29, "language": "python", "input": ["x_svd = VT.T.dot(Sigma_pinv).dot(U.T).dot(b)"], "metadata": {}}, {"outputs": [{"prompt_number": 30, "output_type": "pyout", "text": ["2.1267152888030982"], "metadata": {}}], "cell_type": "code", "collapsed": false, "prompt_number": 30, "language": "python", "input": ["la.norm(A.dot(x_svd)-b, 2)"], "metadata": {}}, {"outputs": [{"prompt_number": 31, "output_type": "pyout", "text": ["0.77354943014895838"], "metadata": {}}], "cell_type": "code", "collapsed": false, "prompt_number": 31, "language": "python", "input": ["la.norm(x_svd)"], "metadata": {}}, {"cell_type": "markdown", "source": ["* What do you observe about $\\|\\text{x_svd}\\|_2$ compared to $\\|\\text{x_qr}\\|_2$?\n", "* How do the residual norms $\\|A\\,\\text{x_svd} - b\\|_2$ and $\\|A\\,\\text{x_qr} - b\\|_2$ compare?"], "metadata": {}}], "metadata": {}}], "metadata": {"name": "", "signature": "sha256:1cf8582a25abfca95e2659f7c419bde3607a612ca37c144430f8cd0f89dc3246"}}