Concept:
Diagonalization of a matrix:
If a square matrix Q of order n has n linearly independent Eigenvectors, then a matrix P can be found such that \({P^{ - 1}}QP\) is a diagonal matrix.
Let Q be a square matrix of order 3.
Let λ1, λ2, and λ3 be Eigenvalues of matrix Q and \({X_1} = \left[ {\begin{array}{*{20}{c}} {{x_1}}\\ {{y_1}}\\ {{z_1}} \end{array}} \right],\;{X_2} = \left[ {\begin{array}{*{20}{c}} {{x_2}}\\ {{y_2}}\\ {{z_2}} \end{array}} \right],\;{X_3} = \left[ {\begin{array}{*{20}{c}} {{x_3}}\\ {{y_3}}\\ {{z_3}} \end{array}} \right]\) be the corresponding Eigenvectors.
Let P denote the square matrix \(\left[ {\begin{array}{*{20}{c}} {{X_1}}&{{X_2}}&{{X_3}} \end{array}} \right] = \left[ {\begin{array}{*{20}{c}} {{x_1}}&{{x_2}}&{{x_3}}\\ {{y_1}}&{{y_2}}&{{y_3}}\\ {{z_1}}&{{z_2}}&{{z_3}} \end{array}} \right]\) whose columns are these Eigenvectors.
Now, the matrix Q can be diagonalized as \(D = {P^{ - 1}}QP\)
Or, equivalently, the matrix Q can be represented as \(Q = PD{P^{ - 1}}\)
where D is the diagonal matrix \(D = \left[ {\begin{array}{*{20}{c}} {{\lambda _1}}&0&0\\ 0&{{\lambda _2}}&0\\ 0&0&{{\lambda _3}} \end{array}} \right]\)
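As a quick numerical illustration of this relation (a minimal NumPy sketch; the 3 × 3 matrix Q below is an arbitrary example with distinct Eigenvalues, not taken from the question):

```python
import numpy as np

# Arbitrary 3x3 matrix with distinct eigenvalues (chosen only for illustration)
Q = np.array([[4.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 3.0]])

# Columns of P are the eigenvectors X1, X2, X3
eigvals, P = np.linalg.eig(Q)

# D = P^{-1} Q P is (numerically) diagonal, with the eigenvalues on the diagonal
D = np.linalg.inv(P) @ Q @ P
print(np.round(D, 10))                              # diag(λ1, λ2, λ3)

# Q can be recovered as P D P^{-1}
print(np.allclose(P @ D @ np.linalg.inv(P), Q))     # True
```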
Properties of Eigenvalues:
The sum of Eigenvalues of a matrix A is equal to the trace of that matrix A
The product of Eigenvalues of a matrix A is equal to the determinant of that matrix A
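Both properties can be verified quickly (a minimal NumPy sketch; the 2 × 2 matrix is an arbitrary example):

```python
import numpy as np

# Arbitrary square matrix used only to illustrate the two properties
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

eigvals = np.linalg.eigvals(A)

print(np.isclose(eigvals.sum(),  np.trace(A)))        # sum of eigenvalues = trace
print(np.isclose(eigvals.prod(), np.linalg.det(A)))   # product of eigenvalues = determinant
```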
Calculation:
Option 1: In a lower triangular matrix or an upper triangular matrix, the diagonal elements themselves are the Eigenvalues.
Therefore, the product of the diagonal elements is equal to the product of Eigenvalues.
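This can be checked numerically (a minimal NumPy sketch with an arbitrary upper triangular matrix):

```python
import numpy as np

# Arbitrary upper triangular example
T = np.array([[3.0, 5.0, -2.0],
              [0.0, 7.0,  4.0],
              [0.0, 0.0, -1.0]])

eigvals = np.linalg.eigvals(T)

# Eigenvalues of a triangular matrix are its diagonal entries
print(np.allclose(np.sort(eigvals), np.sort(np.diag(T))))    # True

# Hence product of diagonal entries = product of eigenvalues = det(T)
print(np.isclose(np.prod(np.diag(T)), np.linalg.det(T)))     # True
```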
Option 2:
For n × n matrices A and B, i.e., for square matrices:
\(AB = I\)
⇒ \(\det A \cdot \det B = \det I = 1\), so A is invertible and \(B = {A^{ - 1}}\)
Now, \(BA = {A^{ - 1}}A = I\)
So for square matrices, AB = I always implies BA = I.
Therefore, the given statement is false.
Note: For non-square matrices, however, AB = I does not imply BA = I.
Example: \(A = \left[ {\begin{array}{*{20}{c}}1&0&0\\0&1&0\end{array}} \right],B = \left[ {\begin{array}{*{20}{c}}1&0\\0&1\\0&0\end{array}} \right]\)
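A quick NumPy check of this example (a minimal sketch):

```python
import numpy as np

# The non-square example from above
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])        # 2 x 3
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])             # 3 x 2

print(A @ B)   # 2x2 identity: AB = I
print(B @ A)   # 3x3 matrix with a zero last row and column: BA != I
```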
Option 3:
If there exists an invertible matrix Q such that \(A = QB{Q^{ - 1}}\), then det A = det B.
This is precisely the similarity relation used in diagonalization; taking determinants, \(\det A = \det Q \cdot \det B \cdot \det {Q^{ - 1}} = \det B\), so the above statement is true.
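A numerical illustration (a minimal NumPy sketch; B and Q are random matrices, with Q invertible almost surely):

```python
import numpy as np

rng = np.random.default_rng(0)

B = rng.standard_normal((3, 3))          # arbitrary matrix
Q = rng.standard_normal((3, 3))          # invertible with probability 1
assert abs(np.linalg.det(Q)) > 1e-12     # sanity check

A = Q @ B @ np.linalg.inv(Q)             # A = Q B Q^{-1}

# Similar matrices have the same determinant
print(np.isclose(np.linalg.det(A), np.linalg.det(B)))   # True
```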
Option 4:
A is an invertible matrix. So, matrix A is a nonsingular matrix (det A ≠ 0) and hence the rank of A is n.
Therefore, the rows of A are linearly independent.
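A small NumPy check (a minimal sketch with an arbitrary invertible matrix):

```python
import numpy as np

# Arbitrary invertible 3x3 matrix
A = np.array([[2.0, 0.0, 1.0],
              [1.0, 3.0, 0.0],
              [0.0, 1.0, 1.0]])

print(np.linalg.det(A))             # nonzero, so A is nonsingular
print(np.linalg.matrix_rank(A))     # 3 = n, so the rows (and columns) are linearly independent
```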