# Least Squares using the SVD

In :
import numpy as np
import numpy.linalg as la
import scipy.linalg as spla

In :
# tall and skinny w/nullspace
np.random.seed(12)
A = np.random.randn(6, 4)
b = np.random.randn(6)
# make the last column a combination of the others, so A has a nullspace
A[:, -1] = A[:, 0] + A[:, 1]


### Part I: Singular least squares using QR

Let's see how successfully we can solve the least squares problem when the matrix has a nullspace using QR:

In :
Q, R = la.qr(A)

In :
R.round(3)

Out:
array([[-4.526,  3.492, -0.204, -3.647],
[ 0.   ,  0.796,  0.034,  0.603],
[ 0.   ,  0.   , -1.459,  0.674],
[ 0.   , -0.   , -0.   ,  0.   ]])
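
That zero in the last diagonal entry of R is the signature of the nullspace. As a sketch (the column construction of A here is an assumption mirroring the setup above), the numerical rank can be read off the diagonal of R for this matrix — though keep in mind that unpivoted QR is not a reliable rank detector in general:

```python
import numpy as np
import numpy.linalg as la

np.random.seed(12)
A = np.random.randn(6, 4)
A[:, -1] = A[:, 0] + A[:, 1]  # dependent last column -> nullspace (assumed construction)

Q, R = la.qr(A)
# count the diagonal entries of R that are meaningfully nonzero
tol = 1e-10 * abs(R[0, 0])
rank = int(np.sum(np.abs(np.diag(R)) > tol))
assert rank == 3
assert rank == la.matrix_rank(A)
```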


Since the last diagonal entry of R is zero, the last component of the solution is a free parameter: we can choose x_qr[3] as we please:

In :
x_qr = np.zeros(A.shape[1])

In :
x_qr[3] = 0

In :
QTbnew = Q.T.dot(b)[:3] - R[:3, 3] * x_qr[3]
x_qr[:3] = spla.solve_triangular(R[:3,:3], QTbnew, lower=False)


Let's take a look at the residual norm and the norm of x_qr:

In :
R.dot(x_qr)-Q.T.dot(b)[:4]

Out:
array([ -4.44089210e-16,   0.00000000e+00,  -1.11022302e-16,
-7.35042876e-01])

In :
la.norm(A.dot(x_qr)-b, 2)

Out:
2.1267152888030982

In :
la.norm(x_qr, 2)

Out:
0.82393512974131577


Choose a different x_qr and compare residual and norm of x_qr.
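
To see why the choice doesn't matter for the residual, here is a sketch that tries several values of the free component (assuming a nullspace construction like the one above) and checks that the residual stays put while the norm of the solution changes:

```python
import numpy as np
import numpy.linalg as la
import scipy.linalg as spla

np.random.seed(12)
A = np.random.randn(6, 4)
b = np.random.randn(6)
A[:, -1] = A[:, 0] + A[:, 1]  # assumed nullspace construction

Q, R = la.qr(A)
QTb = Q.T.dot(b)

def solve_with_free_component(t):
    # fix x[3] = t, then back-substitute for the remaining components
    x = np.zeros(4)
    x[3] = t
    x[:3] = spla.solve_triangular(R[:3, :3], QTb[:3] - R[:3, 3] * t, lower=False)
    return x

residuals = [la.norm(A.dot(solve_with_free_component(t)) - b) for t in (0.0, 1.0, -2.0)]
# every choice of the free component achieves the same (minimal) residual
assert np.allclose(residuals, residuals[0])
```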

### Part II: Solving least squares using the SVD

Now compute the SVD of $A$:

In :
U, sigma, VT = la.svd(A)


Make a matrix Sigma of the correct size:

In :
Sigma = np.zeros(A.shape)
Sigma[:4,:4] = np.diag(sigma)


And check that we've actually factorized A:

In :
(U.dot(Sigma).dot(VT) - A).round(4)

Out:
array([[ 0., -0.,  0.,  0.],
[ 0., -0.,  0.,  0.],
[ 0., -0.,  0.,  0.],
[ 0., -0., -0.,  0.],
[ 0., -0., -0.,  0.],
[ 0.,  0.,  0.,  0.]])
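
The check above leans on U and V being orthogonal. A quick sketch verifying that, along with the shapes `la.svd` returns by default (same assumed construction of A):

```python
import numpy as np
import numpy.linalg as la

np.random.seed(12)
A = np.random.randn(6, 4)
A[:, -1] = A[:, 0] + A[:, 1]  # assumed nullspace construction

U, sigma, VT = la.svd(A)
# full_matrices=True (the default) gives square orthogonal factors
assert U.shape == (6, 6) and VT.shape == (4, 4)
assert np.allclose(U.T.dot(U), np.eye(6))
assert np.allclose(VT.dot(VT.T), np.eye(4))
# singular values come back sorted, largest first
assert np.all(sigma[:-1] >= sigma[1:])
```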


Now define Sigma_pinv as the "pseudo-"inverse of Sigma, where "pseudo" means "don't divide by zero":

In :
Sigma_pinv = np.zeros(A.shape).T
Sigma_pinv[:3,:3] = np.diag(1/sigma[:3])
Sigma_pinv.round(3)

Out:
array([[ 0.147,  0.   ,  0.   ,  0.   ,  0.   ,  0.   ],
[ 0.   ,  0.624,  0.   ,  0.   ,  0.   ,  0.   ],
[ 0.   ,  0.   ,  1.055,  0.   ,  0.   ,  0.   ],
[ 0.   ,  0.   ,  0.   ,  0.   ,  0.   ,  0.   ]])
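
Sandwiching this Sigma_pinv between $V$ and $U^T$ yields the Moore-Penrose pseudoinverse of $A$. A sketch checking it against `la.pinv`, which applies the same "don't divide by zero" cutoff internally (the construction of A is again an assumption):

```python
import numpy as np
import numpy.linalg as la

np.random.seed(12)
A = np.random.randn(6, 4)
A[:, -1] = A[:, 0] + A[:, 1]  # assumed rank-3 construction

U, sigma, VT = la.svd(A)
Sigma_pinv = np.zeros(A.shape).T            # 4 x 6
Sigma_pinv[:3, :3] = np.diag(1 / sigma[:3])  # invert only the nonzero singular values

A_pinv = VT.T.dot(Sigma_pinv).dot(U.T)
assert np.allclose(A_pinv, la.pinv(A))
```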


Now compute the SVD-based solution for the least-squares problem:

In :
x_svd = VT.T.dot(Sigma_pinv).dot(U.T).dot(b)

In :
la.norm(A.dot(x_svd)-b, 2)

Out:
2.1267152888030982

In :
la.norm(x_svd)

Out:
0.77354943014895838

• What do you observe about $\|\text{x_svd}\|_2$ compared to $\|\text{x_qr}\|_2$?
• Is the residual $\|A\text{x_svd}-b\|_2$ any different from $\|A\text{x_qr}-b\|_2$?
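
As a hint toward these questions, here is a sketch (same assumed construction of A) checking that the SVD solution is the minimum-norm least-squares solution — which is also what `la.lstsq` returns — and that adding any nullspace component leaves the residual alone but grows the norm:

```python
import numpy as np
import numpy.linalg as la

np.random.seed(12)
A = np.random.randn(6, 4)
b = np.random.randn(6)
A[:, -1] = A[:, 0] + A[:, 1]  # assumed nullspace construction

U, sigma, VT = la.svd(A)
Sigma_pinv = np.zeros(A.shape).T
Sigma_pinv[:3, :3] = np.diag(1 / sigma[:3])
x_svd = VT.T.dot(Sigma_pinv).dot(U.T).dot(b)

x_lstsq = la.lstsq(A, b, rcond=None)[0]  # minimum-norm least-squares solution
assert np.allclose(x_svd, x_lstsq)

null_vec = VT[3]  # right singular vector belonging to the ~zero singular value
assert np.allclose(A.dot(null_vec), 0)
# x_svd has no nullspace component, so adding one can only increase the norm
assert la.norm(x_svd + 0.5 * null_vec) > la.norm(x_svd)
```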