Basic Concepts
Definition 1: A QR factorization (or QR decomposition) of a square matrix A consists of an orthogonal matrix Q and an upper triangular matrix R such that A = QR.
Property 1 (QR Factorization): For any n × n invertible matrix A, we can construct a QR factorization.
Proof: Let A1, …, An represent the columns of A. Since A is invertible, we know that A1, …, An are independent and form a basis for the set of all n × 1 column vectors. We can therefore use the Gram-Schmidt process to construct an orthonormal basis Q1, …, Qn for the set of all n × 1 column vectors. Let Q be the n × n matrix whose columns are Q1, …, Qn. By Property 4 of Orthogonal Vectors and Matrices, Q is an orthogonal matrix, and so by Property 6 of Orthogonal Vectors and Matrices, QTQ = I.
We seek a matrix R such that A = QR. Since QTQ = I, it follows that QTA = QTQR = R, and so we conclude that R = QTA is the required matrix. Let R = [rij], and so rij = Qi · Aj. As we saw in the proof of Theorem 1 of Orthogonal Vectors and Matrices, each Qi can be expressed as a linear combination of A1, …, Ai; equivalently, each Aj can be expressed as a linear combination of Q1, …, Qj. Since Qi is orthogonal to Q1, …, Qj when i > j, it follows that rij = Qi · Aj = 0 for i > j, which shows that R is an upper triangular matrix. It is also easy to show by induction that rii > 0 for each i.
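To make the construction in this proof concrete, here is a minimal sketch in Python with NumPy (the function name gram_schmidt_qr is ours, for illustration only); the same code also handles an m × n matrix of full column rank, as in the non-square case discussed next.

```python
import numpy as np

def gram_schmidt_qr(A):
    """QR factorization via the Gram-Schmidt process: A = QR with
    orthonormal columns in Q and upper triangular R = Q^T A."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # r_ij = Q_i . A_j
            v -= R[i, j] * Q[:, i]        # remove components along Q_1..Q_{j-1}
        R[j, j] = np.linalg.norm(v)       # diagonal entries r_jj are positive
        Q[:, j] = v / R[j, j]
    return Q, R
```

Note that R is computed column by column rather than as QTA, but the result is the same matrix.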
Non-square matrices
We can extend Property 1 to certain non-square matrices. The proof is similar and Q and R are constructed in the same way.
Property 2 (QR Factorization): For any m × n matrix A with rank(A) = n ≤ m, we can construct an m × n orthonormal matrix Q and an n × n upper triangular matrix R such that A = QR.
Example
Example 1: Find a QR Factorization for the matrix A that is formed from the columns in Example 1 of Orthogonal Vectors and Matrices.
As can be seen in Figure 1 of Orthogonal Vectors and Matrices, A is a 4 × 3 matrix. The supplemental array formula =ELIM(A4:E7) produces a 4 × 3 matrix with only one zero row showing that rank(A) = 3 (see Appendices Determinants and Linear Equations and Linear Independent Vectors for details). Since rank(A) = 3 ≤ 4, we can apply Property 2.
The orthonormal matrix Q is calculated in Example 1 of Orthogonal Vectors and Matrices. The matrix R = QTA is shown in range M4:O6 of Figure 1 of Orthogonal Vectors and Matrices, as calculated using the array formula =MMULT(TRANSPOSE(I4:K7),A4:C7). Note that R is indeed an upper triangular matrix.
The results from Example 1 are replicated in Figure 1 below.
Observation: Shortly we will show how to use the QR Factorization to calculate eigenvalues and eigenvectors, and in fact, we will show a better way to produce such a QR Factorization, which is less prone to rounding errors.
Solving Linear Equations
For now, we show how to use QR factorization to find the solution to AX = C, the problem studied in System of Linear Equations. First we note that since A = QR and Q is orthogonal, if AX = C, then C = AX = QRX, and so QTC = QTQRX = RX. Thus RX = QTC. But since R is an upper triangular matrix all of whose diagonal elements are non-zero (actually they are positive), we can use back substitution to find the solution for X. In fact, if R = [rij], X = [xj] and QTC = [bi], then we will have a series of equations of the form (looking from the bottom up):

rnnxn = bn
rn-1,n-1xn-1 + rn-1,nxn = bn-1
⋮
r11x1 + r12x2 + ⋯ + r1nxn = b1
The first of these equations can be used to find the value of xn. Substituting this value for xn in the second equation enables you to find xn-1. Continuing in this manner you can find xn-2, …, x2. Substituting these values in the last equation enables you to find x1.
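The back substitution steps described above can be sketched as follows in Python with NumPy (the function name back_substitute is ours, for illustration only):

```python
import numpy as np

def back_substitute(R, b):
    """Solve R x = b for an upper triangular R with nonzero diagonal,
    working from the last equation up to the first."""
    n = len(b)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        # x_i = (b_i - sum of r_ij * x_j for j > i) / r_ii
        x[i] = (b[i] - R[i, i+1:] @ x[i+1:]) / R[i, i]
    return x
```

To solve AX = C given a QR factorization A = QR, first compute b = QTC and then X = back_substitute(R, b).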
Example 2: Solve the following system of linear equations using QR Factorization.
We will use the QR Factorization from Example 1, as shown in Figure 1.
Figure 1 – QR Factorization
We could solve this problem using Gaussian elimination, as can be seen from Figure 2 where the array formula =ELIM(U4:X7) in range U10:X13 produces the solution x = -0.34944, y = 0.176822 and z = 0.792727.
Figure 2 – Solving AX = C using QR Factorization
We now show how to find a solution using QR factorization. First we calculate QTC (range R4:R6) as =MMULT(TRANSPOSE(E4:G7),P4:P7) using the values for Q shown in Figure 1. We next calculate the values for X (range T4:T6) via backwards substitution using the formulas in Figure 3.
Figure 3 – Formulas from Figure 2
Householder Matrix
Definition 2: A Householder matrix A is a square matrix for which there exists a column vector V such that VTV = 1 (i.e. V is a unit vector) and A = I – 2VVT.
Property 3: A Householder matrix is orthogonal.
Proof: Let A = I – 2VVT be a Householder matrix. First we show that A is symmetric, as follows:

AT = (I – 2VVT)T = IT – 2(VVT)T = I – 2(VT)TVT = I – 2VVT = A
We next show that A2 = I:

A2 = (I – 2VVT)(I – 2VVT) = I – 4VVT + 4V(VTV)VT = I – 4VVT + 4VVT = I

since VTV = 1. Because A is symmetric, it follows that AAT = A2 = I, which by definition means that A is orthogonal.
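These two properties are easy to verify numerically. The following Python/NumPy sketch (our own illustration) builds a Householder matrix from a random unit vector and checks that it is symmetric and that its square is the identity:

```python
import numpy as np

# Build a Householder matrix A = I - 2 V V^T from a unit vector V
rng = np.random.default_rng(0)
V = rng.standard_normal(4)
V /= np.linalg.norm(V)              # ensure V^T V = 1
A = np.eye(4) - 2 * np.outer(V, V)

assert np.allclose(A, A.T)          # A is symmetric
assert np.allclose(A @ A, np.eye(4))  # A^2 = I, so A A^T = I (orthogonal)
```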
Using Gram-Schmidt
In Orthogonal Vectors and Matrices, we showed how to generate a QR factorization for any matrix using the Gram-Schmidt procedure. Unfortunately, for practical purposes, this procedure is unstable, in the sense that as it is applied on a computer, round-off errors accumulate, and so while theoretically a valid QR factorization is obtained, in practice the results can sometimes be wrong.
Using Householder matrices
We now present a procedure for constructing a QR factorization, using Householder matrices, which is more stable. As we did previously, we start with the case of a square matrix.
Property 4: Every invertible square matrix A has a QR factorization. Furthermore, there is an efficient algorithm for finding this QR factorization.
Set R1 = A and Q0 = I. For each k = 1, …, n − 1, define Vk = [vi] by

vi = 0 for i < k
vk = rkk + sgn(rkk)·s
vi = rik for i > k

where the rik are the entries of Rk and s is the square root of rkk² + rk+1,k² + ⋯ + rnk², and then divide Vk by its norm so that VkTVk = 1.

Define Pk = I – 2VkVkT, Qk = PkQk-1 and Rk+1 = PkRk. Each Pk is a Householder matrix that zeroes out the entries of column k of Rk below the diagonal.

Now define R = Rn and Q = Qn-1T. Since each Pk is symmetric and orthogonal, QR = (Pn-1 ⋯ P1)T(Pn-1 ⋯ P1)A = A.
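The procedure above can be sketched in Python with NumPy as follows (the function name householder_qr is ours; the sketch assumes A is square and invertible):

```python
import numpy as np

def householder_qr(A):
    """QR factorization via Householder reflections: R_1 = A, Q_0 = I,
    and at step k a reflector P_k = I - 2 V_k V_k^T zeroes out
    column k of R_k below the diagonal."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    R = A.copy()                        # R_k
    Q = np.eye(n)                       # accumulates Q_k = P_k Q_{k-1}
    for k in range(n - 1):
        V = np.zeros(n)
        s = np.linalg.norm(R[k:, k])
        V[k] = R[k, k] + np.copysign(s, R[k, k])  # v_k = r_kk + sgn(r_kk) s
        V[k+1:] = R[k+1:, k]                      # v_i = r_ik for i > k
        V /= np.linalg.norm(V)                    # make V a unit vector
        P = np.eye(n) - 2 * np.outer(V, V)
        Q = P @ Q
        R = P @ R
    return Q.T, R                       # A = Q R with Q = Q_{n-1}^T
```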
Observation: Property 4 holds even when the square matrix A is not invertible. The proof requires a little tweaking of the procedure described above.
Definition 3: The QR factorization procedure for finding the eigenvalues/vectors of a square matrix is as follows:
Let Q and R be defined as above. Let A0 = A, Q0 = Q and R0 = R. Define Ak+1 = RkQk. Since Rk = QkTAk, we have Ak+1 = QkTAkQk, and so Ak+1 is similar to Ak; in particular, Ak+1 is also invertible and so it has a QR decomposition Ak+1 = Qk+1Rk+1.
Property 5: If the eigenvalues of A are non-zero and distinct in absolute value, then Ak converges to an upper triangular matrix with the same eigenvalues as A. If A is symmetric, then Ak converges to a diagonal matrix with the same eigenvalues as A.
Observation: Since the eigenvalues of an upper triangular matrix are found on the diagonal, this means that for k sufficiently large the eigenvalues of A are found on the diagonal of Ak.
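A minimal sketch of this iteration in Python with NumPy (here NumPy's built-in np.linalg.qr supplies each QR factorization, and the 3 × 3 symmetric test matrix, which has eigenvalues 4, 1, 1, is our own illustration):

```python
import numpy as np

def qr_eigenvalues(A, iterations=100):
    """QR algorithm: A_{k+1} = R_k Q_k where A_k = Q_k R_k.
    For symmetric A the iterates approach a diagonal matrix whose
    diagonal holds the eigenvalues of A."""
    Ak = np.asarray(A, dtype=float)
    for _ in range(iterations):
        Qk, Rk = np.linalg.qr(Ak)   # any QR factorization will do here
        Ak = Rk @ Qk                # similar to A_k, so same eigenvalues
    return np.diag(Ak)

A = np.array([[2., 1., 1.],
              [1., 2., 1.],
              [1., 1., 2.]])        # symmetric, eigenvalues 4, 1, 1
vals = np.sort(qr_eigenvalues(A))   # approximately [1., 1., 4.]
```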
Example using Householder matrices
Example 3: Use the QR decomposition method to find the eigenvalues of
Creating QR Factorization
We begin by finding Q and R.
Figure 4 – QR Factorization using a Householder matrix (step 1)
Thus
where A = QR, R is an upper triangular matrix and QTQ = I.
Finding eigenvalues
Calling A0 = A, R0 = R and Q0 = Q, we now define a new A = RQ (i.e. A1 = R0Q0) and repeat the process.
Figure 5 – QR Factorization using a Householder matrix (step 2)
The result is a new R and Q, which we now call R1 and Q1 such that A1 = Q1R1, R1 is an upper triangular matrix and Q1TQ1 = I. As before we now define a new A, i.e. A2 = R1Q1 and repeat the process. We continue this process until we get an Ak which is sufficiently close to being a diagonal matrix.
We now skip to the 9th iteration where A has now converged to a diagonal matrix:
Figure 6 – QR Factorization using a Householder matrix (step 9)
Thus
Thus the eigenvalues of the original matrix A are 4, 1 and 1.
Finding Eigenvectors
Once you find an eigenvalue you can use the Gaussian Elimination Method to find the corresponding eigenvectors, since these are solutions to the homogeneous equation (A – λI)X = 0, which can be solved as described in Definition 4 of System of Linear Equations. The only problem with this approach is when the eigenvalues of a matrix are not all distinct (such as in Example 3). Gaussian elimination can find one such eigenvector, but not a set of orthogonal eigenvectors (one for each instance of a repeated eigenvalue, as is required in Example 3).
A better approach is to use the fact that the corresponding eigenvectors are the columns of the matrix B = Q0Q1⋯Qk (where k is the iteration at which Ak is sufficiently close to a diagonal matrix). For Example 3 we define B0 = Q0 and Bk+1 = BkQk+1 (shown as B′Q in K32:M34 of Figure 5 and K214:M216 of Figure 6). Thus we conclude that the eigenvectors corresponding to 4, 1, 1 are the columns in the matrix
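The accumulation of the Qk matrices can be sketched as follows in Python with NumPy (the function name qr_eigen is ours, and np.linalg.qr stands in for the Householder factorization; the symmetric test matrix with eigenvalues 4, 1, 1 is our own illustration):

```python
import numpy as np

def qr_eigen(A, iterations=100):
    """QR algorithm that also accumulates B = Q_0 Q_1 ... Q_k, whose
    columns approximate the eigenvectors of a symmetric matrix A."""
    Ak = np.asarray(A, dtype=float)
    B = np.eye(Ak.shape[0])
    for _ in range(iterations):
        Qk, Rk = np.linalg.qr(Ak)
        Ak = Rk @ Qk                # invariant: Ak = B^T A B
        B = B @ Qk                  # B_{k+1} = B_k Q_{k+1}
    return np.diag(Ak), B

A = np.array([[2., 1., 1.],
              [1., 2., 1.],
              [1., 1., 2.]])
vals, B = qr_eigen(A)               # eigenvalues and eigenvector matrix
```

Since Ak = BTAB is (nearly) diagonal, the columns of B form an orthonormal set of eigenvectors, even for the repeated eigenvalue.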
Real Statistics solution for eigenvectors
Since the process described above is complicated and tedious to carry out, it is simpler and faster to use the Real Statistics functions eigVAL, eigVECT and related functions, as described in Eigenvalues and Eigenvectors.
Example 4: Use the eigVECTSym function to find the eigenvalues and eigenvectors for Example 3.
The output from eigVECTSym(A6:C8) is shown in Figure 7.
Figure 7 – Eigenvalues/vectors using eigVECTSym
which is the same result we obtained previously.
QR Factorization Property
We can extend Property 4 to certain non-square matrices. The proof is similar and Q and R are constructed in the same way.
Property 6 (QR Factorization): For any m × n matrix A with rank(A) = n ≤ m, we can construct in an efficient way an m × n orthonormal matrix Q and an n × n upper triangular matrix R such that A = QR.
Since the process described above is complicated and tedious to carry out, it is simpler and faster to use the following worksheet functions to create a QR factorization of a matrix.
Worksheet Functions
Real Statistics Functions: The Real Statistics Resource Pack provides the following array functions, where R1 is an m × n array or cell range.
QRFactorR(R1, prec): outputs the n × n array R for which A = QR where A is the matrix in R1.
QRFactorQ(R1, prec): outputs the m × n array Q for which A = QR where A is the matrix in R1.
QRFactor(R1, prec): outputs an (m + n) × n array. The first m rows of the output consist of Q and the next n rows consist of R, where A = QR and A is the matrix in range R1.
prec can be used to specify what is considered to be zero in the algorithm used to calculate these functions. The default is 0, but some other small value (e.g. 0.00001) may be acceptable as sufficiently small to be considered zero.
Note that to obtain Q when Property 6 holds, you can also use the fact that Q = AR-1, and so Q can be obtained via =MMULT(R1, MINVERSE(R2)) where R2 contains the formula =QRFactorR(R1).
Examples using worksheet functions
Example 5: Find the QR factorization of the matrix in A4:D9 of Figure 8.
Figure 8 – QR Factorization
Range F4:I7 contains the array formula =QRFactorR(A4:D9) and range F10:I15 contains =QRFactorQ(A4:D9), although since R is invertible, Q = AR-1, and so Q can also be obtained using the worksheet formula =MMULT(A4:D9, MINVERSE(F4:I7)). Finally, range K4:N13 contains the formula =QRFactor(A4:D9).
We see that A = QR from the formula =MMULT(F10:I15,F4:I7) in range F18:I23. From the formula =MMULT(TRANSPOSE(F10:I15),F10:I15) in range K18:N21, we also see that QTQ = I, which demonstrates partial orthogonality, since QQT ≠ I.
Observation: Property 6 holds no matter what the rank of matrix A is.
For example, in Figure 9 we display the QR factorization of two 3 × 3 square matrices with rank less than 3.
Figure 9 – QR factorization where rank(A) < 3
Click here for more examples of QR factorization, including the distinction between full and reduced QR factorization and the case where the number of rows in the matrix is less than the number of columns.
More Worksheet Functions
Real Statistics Functions: As we saw in Example 2, QR Factorization can be used to solve a system of linear equations. The approach described above can be extended to invert a matrix. The Real Statistics Resource Pack provides the following two functions to automate these processes.
QRSolve(R1, R2) – assuming R1 is an m × n range describing matrix A and R2 is an m × 1 range describing the column vector C, QRSolve outputs an n × 1 column vector X containing the solution to AX = C.
QRInverse(R1) = inverse of the matrix described by range R1 (assumes that R1 is a square matrix)
Observation: Referring to Figure 2, QRSolve(M4:O7, P4:P7) would output the result described in T4:T6.
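Assuming the behavior described above, QRSolve and QRInverse can be mimicked in Python with NumPy roughly as follows (these sketches are ours, not the Real Statistics implementation):

```python
import numpy as np

def qr_solve(A, c):
    """Solve A x = c via QR: reduce to R x = Q^T c and back substitute.
    For m > n with rank(A) = n this yields the least-squares solution."""
    Q, R = np.linalg.qr(A)          # reduced QR: Q is m x n, R is n x n
    b = Q.T @ c
    n = R.shape[0]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - R[i, i+1:] @ x[i+1:]) / R[i, i]
    return x

def qr_inverse(A):
    """Invert a square matrix by solving A x = e_i for each unit vector."""
    n = A.shape[0]
    return np.column_stack([qr_solve(A, e) for e in np.eye(n)])
```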
Example 6: Find the inverse of the matrix in range W18:Y20 of Figure 9 using QRInverse.
Figure 10 – Matrix inversion using QR Factorization
Using QRInverse(W18:Y20), you obtain the results shown in range AA18:AC20. This is the same result as obtained using MINVERSE(W18:Y20).
Data Analysis Tool
Real Statistics Data Analysis Tool: The Matrix data analysis tool contains a QR Factorization option that computes the QR factorization of the matrix in the Input Range. See Figures 1 and 3 of Matrix Operations Tool for an example of the use of this tool.
Examples Workbook
Click here to download the Excel workbook with the examples described on this webpage.