We will see how to compute the QR decomposition of a matrix A, and how to use Q and R to solve the linear system Ax=b using the form described here.

from numpy import *
A = floor(random.rand(4,4)*20-10) # random 4x4 matrix with integer entries in [-10, 10)
Q,R = linalg.qr(A) # QR decomposition of A
b = floor(random.rand(4,1)*20-10) # random right-hand side
# solve Ax = b using the standard numpy solver
x = linalg.solve(A,b)
# solve Ax = b using Q and R: since A = QR, we have Rx = Q^T b
y = dot(Q.T,b)
xQR = linalg.solve(R,y)
print("\nSolution compared")
print(x.T,'Ax=b')
print(xQR.T,'Rx=y')

And now we can compare the two solutions:

Solution compared
[[ 0.69207502 0.05565638 0.68965517 -0.2183908 ]] Ax=b
[[ 0.69207502 0.05565638 0.68965517 -0.2183908 ]] Rx=y
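Since R is upper triangular, the last step Rx = y does not need a general solver at all: it can be solved by back substitution. As a sketch (with a small hypothetical system, not the random matrix from the script above):

```python
import numpy as np

def back_substitution(R, y):
    """Solve the upper-triangular system R x = y by back substitution."""
    n = R.shape[0]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        # subtract the already-computed components, then divide by the pivot
        x[i] = (y[i] - R[i, i+1:] @ x[i+1:]) / R[i, i]
    return x

# hypothetical example system for illustration
A = np.array([[4.0, 1.0, 2.0],
              [1.0, 3.0, 0.0],
              [2.0, 0.0, 5.0]])
b = np.array([1.0, 2.0, 3.0])

Q, R = np.linalg.qr(A)
y = Q.T @ b          # A = QR, so Rx = Q^T b
x = back_substitution(R, y)
```

In practice one would use a library triangular solver (for example `scipy.linalg.solve_triangular`) rather than a hand-written loop, but the loop makes the O(n^2) cost of this step explicit.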

When using QR decomposition in NumPy, the first basis vector it chooses can sometimes affect the numerical accuracy of the solution. See this post for an example where the L1-norm of the difference between the QR decomposition solution and the "exact" solution was not zero:

http://bpchesney.org/?p=657

Thanks

In the example you give, the results are identical (within the limit of the displayed digits). This would benefit from examples comparing accuracy and efficiency (computational cost). Is it correct that QR decomposition is useful mainly if you wish to solve the same system Ax=b for many vectors b, because the cost of computing Q and R is comparable to solving Ax=b by Gaussian elimination?
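The many-right-hand-sides scenario described in this question can be sketched as follows: factor A once, then solve cheaply for each new b by reusing Q and R (the matrix sizes and seed here are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# factor once...
Q, R = np.linalg.qr(A)

# ...then solve for many right-hand sides, reusing Q and R
B = rng.standard_normal((4, 10))   # ten right-hand sides as columns
X = np.linalg.solve(R, Q.T @ B)    # each solve only touches the triangular R
```

Note that an LU factorization (e.g. `scipy.linalg.lu_factor` / `lu_solve`) can be reused across right-hand sides in the same way, so reuse alone does not single out QR.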

QR can be more expensive computationally, but it can help in cases where the system is moderately ill-conditioned.
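A small experiment one could run to explore this reply: the Hilbert matrix is a classic ill-conditioned test case (its use here is my choice, not from the post). The sketch below compares the direct solver against the QR route; for a single right-hand side the two often give comparable errors, so it is a starting point for experimentation rather than a demonstration that one route wins.

```python
import numpy as np

# mildly ill-conditioned test system: an 8x8 Hilbert matrix
n = 8
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
x_true = np.ones(n)
b = A @ x_true          # right-hand side with a known exact solution

# direct solve (LU with partial pivoting) vs the QR route
x_lu = np.linalg.solve(A, b)
Q, R = np.linalg.qr(A)
x_qr = np.linalg.solve(R, Q.T @ b)

cond = np.linalg.cond(A)                 # roughly 1e10 for n = 8
err_lu = np.linalg.norm(x_lu - x_true)   # forward error of each route
err_qr = np.linalg.norm(x_qr - x_true)
```

Because the condition number is about 1e10 and double precision carries about 16 digits, both computed solutions can lose up to ~10 digits relative to x_true, which is what the error norms make visible.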
