Properties Of Determinants

In linear algebra, a determinant is a special number that can be determined from a square matrix. The determinant of a matrix, denoted as det(P), |P| or det P, has some useful properties which allow us to simplify its evaluation by transforming it into a determinant with simpler entries (elements). There are 10 main properties of determinants, namely: reflection property, all-zero property, proportionality or repetition property, switching property, scalar multiple property, sum property, invariance property, factor property, triangle property, and cofactor matrix property. All these properties are covered in detail below, along with solved examples.


Important Properties of Determinants

  1. The determinant of a matrix is equal to the product of its eigenvalues.
  2. The determinant of a matrix can be computed by a Laplace expansion: the sum of the elements of any one row (or column), each multiplied by its cofactor.
  3. The determinant of a triangular matrix is equal to the product of its diagonal entries.
  4. The determinant of a matrix is invariant under adding a scalar multiple of one row (or column) to another row (or column).
  5. The determinant of a matrix is invariant under transposition.
  6. The determinant of a matrix is zero if and only if the matrix is singular.
  7. The determinant of a product of matrices is equal to the product of their determinants.
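The statements above can be spot-checked numerically. The following sketch, assuming NumPy is available and using arbitrary random matrices, verifies properties 1, 3, 5, and 7:

```python
import numpy as np

# Arbitrary random test matrices (any square matrices would do).
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# Property 1: det(A) is the product of the eigenvalues of A.
# (For a real matrix the imaginary parts cancel in conjugate pairs.)
assert np.isclose(np.linalg.det(A), np.prod(np.linalg.eigvals(A)).real)

# Property 3: the determinant of a triangular matrix is the
# product of its diagonal entries.
T = np.triu(A)
assert np.isclose(np.linalg.det(T), np.prod(np.diag(T)))

# Property 5: det(A) is invariant under transposition.
assert np.isclose(np.linalg.det(A), np.linalg.det(A.T))

# Property 7: det(AB) = det(A) * det(B).
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))
```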

1. Reflection Property:

The determinant remains unaltered if its columns are changed into rows and the rows into columns. This is known as the property of reflection.

2. All-Zero Property:

If all the elements of a row (or column) are zero, then the determinant is zero.

3. Proportionality (Repetition) Property:

If all elements of a row (or column) are proportional (identical) to the elements of some other row (or column), then the determinant is zero.

4. Switching Property:

The sign of a determinant is changed when any two rows (or columns) are interchanged.

5. Scalar Multiple Property:

If each element of a row (or column) of a determinant is multiplied by a non-zero constant, then the determinant itself is multiplied by the same constant.
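The switching and scalar multiple properties can both be demonstrated on a concrete matrix. A minimal sketch, assuming NumPy is available (the matrix and the constant are arbitrary choices):

```python
import numpy as np

A = np.array([[2., 1., 3.],
              [0., 4., 1.],
              [5., 2., 6.]])
d = np.linalg.det(A)

# Switching property: interchanging two rows flips the sign.
swapped = A[[1, 0, 2]]          # rows 1 and 2 interchanged
assert np.isclose(np.linalg.det(swapped), -d)

# Scalar multiple property: scaling one row by k scales det by k.
k = 7.0
scaled = A.copy()
scaled[1] *= k
assert np.isclose(np.linalg.det(scaled), k * d)
```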

6. Sum Property:

$$\left| \begin{matrix} a_1 & c_1 & d_1 \\ a_2 & c_2 & d_2 \\ a_3 & c_3 & d_3 \end{matrix} \right| + \left| \begin{matrix} b_1 & c_1 & d_1 \\ b_2 & c_2 & d_2 \\ b_3 & c_3 & d_3 \end{matrix} \right| = \left| \begin{matrix} a_1+b_1 & c_1 & d_1 \\ a_2+b_2 & c_2 & d_2 \\ a_3+b_3 & c_3 & d_3 \end{matrix} \right|$$
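The sum property applies when two determinants agree in all but one column (or row). A numerical sanity check, assuming NumPy is available and using arbitrary random column vectors:

```python
import numpy as np

rng = np.random.default_rng(1)
a, b = rng.standard_normal(3), rng.standard_normal(3)
c, d = rng.standard_normal(3), rng.standard_normal(3)

# (a | c | d) and (b | c | d) differ only in the first column,
# so their determinants add up to det of (a + b | c | d).
M1 = np.column_stack([a, c, d])
M2 = np.column_stack([b, c, d])
M_sum = np.column_stack([a + b, c, d])
assert np.isclose(np.linalg.det(M1) + np.linalg.det(M2),
                  np.linalg.det(M_sum))
```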

7. Invariance Property:

$$\left| \begin{matrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{matrix} \right| = \left| \begin{matrix} a_1+\alpha b_1+\beta c_1 & b_1 & c_1 \\ a_2+\alpha b_2+\beta c_2 & b_2 & c_2 \\ a_3+\alpha b_3+\beta c_3 & b_3 & c_3 \end{matrix} \right|$$

That is, a determinant remains unaltered under an operation of the form

$C_i \rightarrow C_i + \alpha C_j + \beta C_k$, where $j, k \neq i$,

or an operation of the form

$R_i \rightarrow R_i + \alpha R_j + \beta R_k$, where $j, k \neq i$.
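Both forms of the invariance operation can be checked numerically. A minimal sketch, assuming NumPy is available (the matrix and the constants α, β are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
alpha, beta = 2.5, -1.5
d = np.linalg.det(A)

# C1 -> C1 + alpha*C2 + beta*C3 leaves the determinant unchanged.
B = A.copy()
B[:, 0] += alpha * A[:, 1] + beta * A[:, 2]
assert np.isclose(np.linalg.det(B), d)

# Likewise for the row operation R1 -> R1 + alpha*R2 + beta*R3.
C = A.copy()
C[0] += alpha * A[1] + beta * A[2]
assert np.isclose(np.linalg.det(C), d)
```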

8. Factor Property:

If a determinant $\Delta$ whose entries are polynomials in $x$ becomes zero when $x = \alpha$ is substituted, then $(x - \alpha)$ is a factor of $\Delta$.

9. Triangle Property:

The determinant is equal to the product of diagonal elements if all elements above or below the main diagonal are zeros. That is,

$$\left| \begin{matrix} a_1 & 0 & 0 \\ a_2 & b_2 & 0 \\ a_3 & b_3 & c_3 \end{matrix} \right| = a_1 b_2 c_3$$

10. Finding the Determinant of a Cofactor Matrix:

If

$$\Delta = \left| \begin{matrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{matrix} \right|, \quad \text{then} \quad \Delta_1 = \left| \begin{matrix} C_{11} & C_{12} & C_{13} \\ C_{21} & C_{22} & C_{23} \\ C_{31} & C_{32} & C_{33} \end{matrix} \right|$$

where $C_{ij}$ denotes the cofactor of the element $a_{ij}$ in $\Delta$. For a determinant of order $n$, $\Delta_1 = \Delta^{n-1}$; in particular, for a third-order determinant, $\Delta_1 = \Delta^2$.
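The relation $\Delta_1 = \Delta^{2}$ for a 3×3 determinant can be verified numerically. A sketch assuming NumPy is available; it builds the cofactor matrix from the adjugate identity adj(A) = det(A)·A⁻¹ (valid for invertible A), and the test matrix is an arbitrary choice:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [0., 4., 5.],
              [1., 0., 6.]])
d = np.linalg.det(A)          # nonzero, so A is invertible

# adj(A) = d * inv(A), and the cofactor matrix is its transpose.
cof = (d * np.linalg.inv(A)).T

# For an n x n matrix, det(cof(A)) = det(A)**(n-1); here n = 3.
assert np.isclose(np.linalg.det(cof), d ** 2)
```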

Properties of Determinants: Example Problems

Question 1: Using the properties of determinants, prove that

$$\left| \begin{matrix} a & b & c \\ b & c & a \\ c & a & b \end{matrix} \right| = (a+b+c)(ab+bc+ca - a^2 - b^2 - c^2)$$

Solution:

We can prove the given problem by utilizing the invariance and scalar multiple properties of determinants.

$$\Delta = \left| \begin{matrix} a & b & c \\ b & c & a \\ c & a & b \end{matrix} \right| = \left| \begin{matrix} a+b+c & b & c \\ b+c+a & c & a \\ c+a+b & a & b \end{matrix} \right| \quad [\text{operating } C_1 \to C_1 + C_2 + C_3]$$

$$= \left( a+b+c \right)\left| \begin{matrix} 1 & b & c \\ 0 & c-b & a-c \\ 0 & a-b & b-c \end{matrix} \right| \quad [\text{operating } R_2 \to R_2 - R_1 \text{ and } R_3 \to R_3 - R_1]$$

$$= \left( a+b+c \right)\left| \begin{matrix} 1 & 0 & 0 \\ 0 & c-b & a-c \\ 0 & a-b & b-c \end{matrix} \right| \quad [\text{operating } C_2 \to C_2 - bC_1 \text{ and } C_3 \to C_3 - cC_1]$$

$$= (a + b + c)\left[ (c - b)(b - c) - (a - b)(a - c) \right]$$

$$= \left( a+b+c \right)\left( ab+bc+ca-a^2-b^2-c^2 \right) = \text{R.H.S.}$$
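The identity proved above can also be confirmed symbolically. A quick sanity check (not part of the original solution), assuming SymPy is available:

```python
import sympy as sp

a, b, c = sp.symbols('a b c')
M = sp.Matrix([[a, b, c],
               [b, c, a],
               [c, a, b]])
lhs = M.det()
rhs = (a + b + c) * (a*b + b*c + c*a - a**2 - b**2 - c**2)

# The difference simplifies to zero, so the identity holds.
assert sp.simplify(lhs - rhs) == 0
```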

Question 2: Prove that

$$\left| \begin{matrix} -\alpha^2 & \beta\alpha & \gamma\alpha \\ \alpha\beta & -\beta^2 & \gamma\beta \\ \alpha\gamma & \beta\gamma & -\gamma^2 \end{matrix} \right| = 4\alpha^2\beta^2\gamma^2$$

Answer:

Using the Laplace expansion along the first row, we can expand the determinant as:

$$\left| \begin{matrix} -\alpha^2 & \beta\alpha & \gamma\alpha \\ \alpha\beta & -\beta^2 & \gamma\beta \\ \alpha\gamma & \beta\gamma & -\gamma^2 \end{matrix} \right| = \left( -\alpha^2 \right)\left| \begin{matrix} -\beta^2 & \gamma\beta \\ \beta\gamma & -\gamma^2 \end{matrix} \right| - \left( \beta\alpha \right)\left| \begin{matrix} \alpha\beta & \gamma\beta \\ \alpha\gamma & -\gamma^2 \end{matrix} \right| + \left( \gamma\alpha \right)\left| \begin{matrix} \alpha\beta & -\beta^2 \\ \alpha\gamma & \beta\gamma \end{matrix} \right|$$

Evaluating each 2×2 determinant, we get:

$$= \left( -\alpha^2 \right)\left( \beta^2\gamma^2 - \beta^2\gamma^2 \right) - \left( \beta\alpha \right)\left( -\alpha\beta\gamma^2 - \alpha\beta\gamma^2 \right) + \left( \gamma\alpha \right)\left( \alpha\beta^2\gamma + \alpha\beta^2\gamma \right)$$

Simplifying the above equation, we get

$$\left| \begin{matrix} -\alpha^2 & \beta\alpha & \gamma\alpha \\ \alpha\beta & -\beta^2 & \gamma\beta \\ \alpha\gamma & \beta\gamma & -\gamma^2 \end{matrix} \right| = 4\alpha^2\beta^2\gamma^2$$

Alternative Solution:

Take $\alpha, \beta, \gamma$ common from the columns of the left-hand side, and then use the scalar multiple and invariance properties of the determinant to prove the given problem.

$$\Delta = \left| \begin{matrix} -\alpha^2 & \beta\alpha & \gamma\alpha \\ \alpha\beta & -\beta^2 & \gamma\beta \\ \alpha\gamma & \beta\gamma & -\gamma^2 \end{matrix} \right|$$

$$\Delta = \alpha\beta\gamma \left| \begin{matrix} -\alpha & \alpha & \alpha \\ \beta & -\beta & \beta \\ \gamma & \gamma & -\gamma \end{matrix} \right| \quad [\text{taking } \alpha, \beta, \gamma \text{ common from } C_1, C_2, C_3 \text{ respectively}]$$

Now taking $\alpha, \beta, \gamma$ common from $R_1, R_2, R_3$ respectively,

$$\Delta = \alpha^2 \beta^2 \gamma^2 \left| \begin{matrix} -1 & 1 & 1 \\ 1 & -1 & 1 \\ 1 & 1 & -1 \end{matrix} \right|$$

Now applying $R_2 \to R_2 + R_1$ and $R_3 \to R_3 + R_1$, we have

$$\Delta = \alpha^2\beta^2\gamma^2 \left| \begin{matrix} -1 & 1 & 1 \\ 0 & 0 & 2 \\ 0 & 2 & 0 \end{matrix} \right|$$

Now expanding along $C_1$,

$$\Delta = \alpha^2\beta^2\gamma^2 \times \left( -1 \right)\left| \begin{matrix} 0 & 2 \\ 2 & 0 \end{matrix} \right| = \left( -1 \right)\left( 0 - 4 \right)\alpha^2\beta^2\gamma^2 = 4\alpha^2\beta^2\gamma^2$$

Therefore, it is proven.
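As with Question 1, the result can be confirmed symbolically. A quick sanity check (not part of the original solution), assuming SymPy is available:

```python
import sympy as sp

al, be, ga = sp.symbols('alpha beta gamma')
M = sp.Matrix([[-al**2, be*al, ga*al],
               [al*be, -be**2, ga*be],
               [al*ga, be*ga, -ga**2]])

# The determinant simplifies to exactly 4*alpha^2*beta^2*gamma^2.
assert sp.simplify(M.det() - 4*al**2*be**2*ga**2) == 0
```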

Question 3: Prove that

$$\left| \begin{matrix} \alpha & \beta & \gamma \\ \theta & \phi & \psi \\ \lambda & \mu & \nu \end{matrix} \right| = \left| \begin{matrix} \beta & \mu & \phi \\ \alpha & \lambda & \theta \\ \gamma & \nu & \psi \end{matrix} \right|$$

Answer 3:

Using the Laplace expansion along the first row:

\begin{align*} \left| \begin{matrix} \alpha & \beta & \gamma \\ \theta & \phi & \psi \\ \lambda & \mu & \nu \end{matrix} \right| &= \alpha\left| \begin{matrix} \phi & \psi \\ \mu & \nu \end{matrix} \right| - \beta\left| \begin{matrix} \theta & \psi \\ \lambda & \nu \end{matrix} \right| + \gamma\left| \begin{matrix} \theta & \phi \\ \lambda & \mu \end{matrix} \right| \\ &= \beta\left| \begin{matrix} \lambda & \theta \\ \nu & \psi \end{matrix} \right| - \alpha\left| \begin{matrix} \mu & \phi \\ \nu & \psi \end{matrix} \right| + \gamma\left| \begin{matrix} \mu & \phi \\ \lambda & \theta \end{matrix} \right| \\ &= \left| \begin{matrix} \beta & \mu & \phi \\ \alpha & \lambda & \theta \\ \gamma & \nu & \psi \end{matrix} \right| \end{align*}

where the last line is the Laplace expansion of the right-hand determinant along its first column.

Alternative Solution:

By reflecting across the diagonal (taking the transpose) and then interchanging rows and columns using the determinant's switching property, we can obtain the desired result.

L.H.S. $$= \begin{vmatrix} \alpha & \beta & \gamma \\ \theta & \phi & \psi \\ \lambda & \mu & \nu \end{vmatrix} = \begin{vmatrix} \alpha & \theta & \lambda \\ \beta & \phi & \mu \\ \gamma & \psi & \nu \end{vmatrix}$$

(interchanging rows and columns across the diagonal, i.e., taking the transpose)

$$= \begin{vmatrix} \beta & \mu & \phi \\ \alpha & \lambda & \theta \\ \gamma & \nu & \psi \end{vmatrix}$$

(interchanging $R_1 \leftrightarrow R_2$ and then $C_2 \leftrightarrow C_3$; each interchange changes the sign, so after two interchanges the value is unchanged)

= R.H.S.
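The claimed equality of the two determinants can be double-checked symbolically. A quick sanity check (not part of the original solution), assuming SymPy is available:

```python
import sympy as sp

al, be, ga, th, ph, ps, la, mu, nu = sp.symbols(
    'alpha beta gamma theta phi psi lambda mu nu')
L = sp.Matrix([[al, be, ga], [th, ph, ps], [la, mu, nu]])
R = sp.Matrix([[be, mu, ph], [al, la, th], [ga, nu, ps]])

# The two determinants are identical as polynomials.
assert sp.simplify(L.det() - R.det()) == 0
```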

Question 4: If $a$, $b$, and $c$ are all different and

$$\left| \begin{matrix} a & a^2 & 1+a^3 \\ b & b^2 & 1+b^3 \\ c & c^2 & 1+c^3 \end{matrix} \right| = 0,$$

then find the value of $abc$.

Solution:

Split the given determinant using the sum property. Then, by using scalar multiple, switching and invariance properties of determinants, we can prove the given equation.

$$D = \left| \begin{matrix} a & a^2 & 1 \\ b & b^2 & 1 \\ c & c^2 & 1 \end{matrix} \right| + \left| \begin{matrix} a & a^2 & a^3 \\ b & b^2 & b^3 \\ c & c^2 & c^3 \end{matrix} \right| = \left| \begin{matrix} a & a^2 & 1 \\ b & b^2 & 1 \\ c & c^2 & 1 \end{matrix} \right| + abc\left| \begin{matrix} 1 & a & a^2 \\ 1 & b & b^2 \\ 1 & c & c^2 \end{matrix} \right|$$

$$= \left( -1 \right)^{1}\left| \begin{matrix} 1 & a^2 & a \\ 1 & b^2 & b \\ 1 & c^2 & c \end{matrix} \right| + abc\left| \begin{matrix} 1 & a & a^2 \\ 1 & b & b^2 \\ 1 & c & c^2 \end{matrix} \right| \quad \left[ C_1 \leftrightarrow C_3 \text{ in 1st det.} \right]$$

$$= \left( -1 \right)^{2}\left| \begin{matrix} 1 & a & a^2 \\ 1 & b & b^2 \\ 1 & c & c^2 \end{matrix} \right| + abc\left| \begin{matrix} 1 & a & a^2 \\ 1 & b & b^2 \\ 1 & c & c^2 \end{matrix} \right| \quad \left[ C_2 \leftrightarrow C_3 \text{ in 1st det.} \right]$$

$$= \left( 1+abc \right)\left| \begin{matrix} 1 & a & a^2 \\ 1 & b & b^2 \\ 1 & c & c^2 \end{matrix} \right|$$

$$= \left( 1+abc \right)\left| \begin{matrix} 1 & a & a^2 \\ 0 & b-a & b^2-a^2 \\ 0 & c-a & c^2-a^2 \end{matrix} \right| \quad \left[ R_2 \to R_2 - R_1 \text{ and } R_3 \to R_3 - R_1 \right]$$

$$= \left( 1+abc \right)\left| \begin{matrix} b-a & b^2-a^2 \\ c-a & c^2-a^2 \end{matrix} \right| \quad \text{(expanding along the first column)}$$

$$= \left( 1+abc \right)\left( b-a \right)\left( c-a \right)\left| \begin{matrix} 1 & b+a \\ 1 & c+a \end{matrix} \right|$$

$$= \left( 1+abc \right)\left( b-a \right)\left( c-a \right)\left( c+a-b-a \right) = \left( 1+abc \right)\left( b-a \right)\left( c-a \right)\left( c-b \right)$$

Since $D = 0$, this gives

$$\left( 1+abc \right)\left( a-b \right)\left( b-c \right)\left( c-a \right) = 0$$

Since $a$, $b$, and $c$ are all different (i.e., $a \ne b$, $b \ne c$, and $c \ne a$), we must have $1 + abc = 0$, and hence $abc = -1$.
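The factorisation derived above can be confirmed symbolically. A quick sanity check (not part of the original solution), assuming SymPy is available:

```python
import sympy as sp

a, b, c = sp.symbols('a b c')
D = sp.Matrix([[a, a**2, 1 + a**3],
               [b, b**2, 1 + b**3],
               [c, c**2, 1 + c**3]]).det()
factored = (1 + a*b*c) * (a - b) * (b - c) * (c - a)

# The determinant factors exactly as derived in the solution.
assert sp.simplify(D - factored) == 0
```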

Question 5: Prove that

$$\left| \begin{matrix} a+b+2c & a & b \\ c & b+c+2a & b \\ c & a & c+a+2b \end{matrix} \right| = 2\left( a+b+c \right)^{3}$$

Solution:

We can expand the L.H.S. by using the invariance and scalar multiple properties.

$$\begin{vmatrix} a+b+2c & a & b \\ c & b+c+2a & b \\ c & a & c+a+2b \end{vmatrix}$$

Applying $C_1 \to C_1 + C_2 + C_3$, we obtain

$$\left| \begin{matrix} 2\left( a+b+c \right) & a & b \\ 2\left( a+b+c \right) & b+c+2a & b \\ 2\left( a+b+c \right) & a & c+a+2b \end{matrix} \right| = 2\left( a+b+c \right)\left| \begin{matrix} 1 & a & b \\ 1 & b+c+2a & b \\ 1 & a & c+a+2b \end{matrix} \right|$$

Applying $R_2 \to R_2 - R_1$ and $R_3 \to R_3 - R_1$, we get

$$2\left( a+b+c \right)\left| \begin{matrix} 1 & a & b \\ 0 & b+c+a & 0 \\ 0 & 0 & c+a+b \end{matrix} \right| = 2\left( a+b+c \right)\left\{ \left( b+c+a \right)\left( c+a+b \right) - \left( 0 \times 0 \right) \right\} = 2\left( a+b+c \right)^{3}$$

Therefore, it has been proven.
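This identity, too, can be confirmed symbolically. A quick sanity check (not part of the original solution), assuming SymPy is available:

```python
import sympy as sp

a, b, c = sp.symbols('a b c')
M = sp.Matrix([[a + b + 2*c, a, b],
               [c, b + c + 2*a, b],
               [c, a, c + a + 2*b]])

# The determinant simplifies to exactly 2*(a + b + c)**3.
assert sp.simplify(M.det() - 2*(a + b + c)**3) == 0
```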

Question 6: Prove that

$$\begin{vmatrix} a^2+1 & ab & ac \\ ab & b^2+1 & bc \\ ac & bc & c^2+1 \end{vmatrix} = 1+a^2+b^2+c^2$$

Answer 6:

Let $A = \begin{vmatrix} a^2+1 & ab & ac \\ ab & b^2+1 & bc \\ ac & bc & c^2+1 \end{vmatrix}$.

Expanding along the first row, we have

$$A = (a^2+1)\left[ (b^2+1)(c^2+1) - bc \cdot bc \right] - ab\left[ ab(c^2+1) - bc \cdot ac \right] + ac\left[ ab \cdot bc - (b^2+1)ac \right]$$

The three brackets simplify to $b^2+c^2+1$, $ab$ and $-ac$ respectively, so

$$A = (a^2+1)(b^2+c^2+1) - a^2b^2 - a^2c^2 = a^2b^2 + a^2c^2 + a^2 + b^2 + c^2 + 1 - a^2b^2 - a^2c^2$$

$$A = 1 + a^2 + b^2 + c^2$$

Therefore,

$$\begin{vmatrix} a^2+1 & ab & ac \\ ab & b^2+1 & bc \\ ac & bc & c^2+1 \end{vmatrix} = 1+a^2+b^2+c^2$$

Alternative Solution:

We have to show that

$$\left| \begin{matrix} a^2+1 & ab & ac \\ ab & b^2+1 & bc \\ ac & bc & c^2+1 \end{matrix} \right| = 1+a^2+b^2+c^2$$

We prove this by utilizing the scalar multiple and invariance properties.

L.H.S. $$= \frac{1}{abc}\left| \begin{matrix} a\left( a^2+1 \right) & ab^2 & ac^2 \\ a^2 b & b\left( b^2+1 \right) & bc^2 \\ a^2 c & b^2 c & c\left( c^2+1 \right) \end{matrix} \right| \quad [\text{multiplying } C_1, C_2, C_3 \text{ by } a, b, c \text{ respectively}]$$

Now taking $a, b, c$ common from $R_1, R_2, R_3$ respectively,

$$= \frac{abc}{abc}\left| \begin{matrix} a^2+1 & b^2 & c^2 \\ a^2 & b^2+1 & c^2 \\ a^2 & b^2 & c^2+1 \end{matrix} \right| = \left| \begin{matrix} 1+a^2+b^2+c^2 & b^2 & c^2 \\ 1+a^2+b^2+c^2 & b^2+1 & c^2 \\ 1+a^2+b^2+c^2 & b^2 & c^2+1 \end{matrix} \right| \quad \left[ C_1 \to C_1+C_2+C_3 \right]$$

$$= \left( 1+a^2+b^2+c^2 \right)\left| \begin{matrix} 1 & b^2 & c^2 \\ 1 & b^2+1 & c^2 \\ 1 & b^2 & c^2+1 \end{matrix} \right| = \left( 1+a^2+b^2+c^2 \right)\left| \begin{matrix} 1 & b^2 & c^2 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{matrix} \right| \quad \left[ R_2 \to R_2-R_1,\ R_3 \to R_3-R_1 \right]$$

By the triangle property, the last determinant equals 1, so

L.H.S. $= 1+a^2+b^2+c^2 = $ R.H.S.

Hence proved.
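As a final symbolic sanity check (not part of the original solution), assuming SymPy is available:

```python
import sympy as sp

a, b, c = sp.symbols('a b c')
M = sp.Matrix([[a**2 + 1, a*b, a*c],
               [a*b, b**2 + 1, b*c],
               [a*c, b*c, c**2 + 1]])

# The determinant simplifies to exactly 1 + a^2 + b^2 + c^2.
assert sp.simplify(M.det() - (1 + a**2 + b**2 + c**2)) == 0
```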


Frequently Asked Questions

What is the reflection property of determinants?

The determinant remains unchanged if the columns are changed into rows and the rows into columns; that is, $\det(A) = \det(A^T)$. This is known as the reflection property.

What happens if two rows or columns of a determinant are interchanged?

The sign of the determinant changes when two rows or columns are swapped.

What is the proportionality (repetition) property of determinants?

The determinant of a matrix is zero if any row or column is proportional or identical to another row or column. This is referred to as the proportionality (repetition) property.

What is the triangle property of determinants?

If all the elements of a determinant above or below the main diagonal are zeros, then the determinant is equal to the product of its diagonal elements.

Can a determinant be zero?

Yes. A determinant can be zero, negative or positive.

What is the determinant of an identity matrix?

The determinant of an identity matrix is always 1.

What is the all-zero property of determinants?

According to the all-zero property of determinants, if all the elements of a row (or column) are zero, then the determinant is equal to zero.

What is the determinant of the inverse of a matrix?

If $\mathbf{A^{-1}}$ is the inverse of an invertible matrix $\mathbf{A}$, then $\det(\mathbf{A^{-1}}) = \frac{1}{\det(\mathbf{A})}$.
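This inverse rule is easy to check numerically. A minimal sketch, assuming NumPy is available (the matrix is an arbitrary invertible example):

```python
import numpy as np

A = np.array([[4., 7.],
              [2., 6.]])

# det(inv(A)) equals 1 / det(A) for any invertible matrix.
assert np.isclose(np.linalg.det(np.linalg.inv(A)),
                  1 / np.linalg.det(A))
```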