Determinants are mathematical objects that are very useful in the analysis and solution of systems of linear equations. The determinant of a matrix tells us whether the matrix is invertible, and hence whether the corresponding system of linear equations has a unique solution.
We break the matrix into three smaller matrices. Each element of the first row is multiplied by the determinant of the submatrix formed by all elements outside that element's row and column. The signs in front of the elements follow an alternating pattern:
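As a sketch, for a generic 3×3 matrix with entries $a_{ij}$ (this notation is my own, not from the original text), the expansion along the first row looks like:

$$
\det\begin{pmatrix}
a_{11} & a_{12} & a_{13}\\
a_{21} & a_{22} & a_{23}\\
a_{31} & a_{32} & a_{33}
\end{pmatrix}
= a_{11}\det\begin{pmatrix} a_{22} & a_{23}\\ a_{32} & a_{33}\end{pmatrix}
- a_{12}\det\begin{pmatrix} a_{21} & a_{23}\\ a_{31} & a_{33}\end{pmatrix}
+ a_{13}\det\begin{pmatrix} a_{21} & a_{22}\\ a_{31} & a_{32}\end{pmatrix}
$$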
In fact, we can choose any row or column of the matrix and use its elements as coefficients. The benefit of this flexibility is that we can pick the row (or column) with the most zeros, which makes the calculation simpler.
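To make the procedure concrete, here is a minimal Python sketch (the function names and structure are my own, not from the original text) that expands along whichever row has the most zeros:

```python
def minor(matrix, row, col):
    """Return the submatrix with the given row and column removed."""
    return [
        [value for j, value in enumerate(r) if j != col]
        for i, r in enumerate(matrix)
        if i != row
    ]


def determinant(matrix):
    """Compute a determinant by cofactor expansion along the row with the most zeros."""
    n = len(matrix)
    if n == 1:
        return matrix[0][0]

    # Pick the row with the most zeros to minimize the number of recursive calls.
    row = max(range(n), key=lambda i: matrix[i].count(0))

    total = 0
    for col, value in enumerate(matrix[row]):
        if value == 0:
            continue  # zero coefficients contribute nothing
        sign = (-1) ** (row + col)  # alternating "checkerboard" sign pattern
        total += sign * value * determinant(minor(matrix, row, col))
    return total


print(determinant([[2, 0, 1], [3, 4, 0], [1, 0, 5]]))  # 36
```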
Let $A$, $B$, and $C$ be matrices (this works for square matrices of any size) that are identical except for one row, where the differing row of $C$ is the sum of the corresponding rows of $A$ and $B$. Then the sum of the determinants of $A$ and $B$ is equal to the determinant of $C$: $\det(A) + \det(B) = \det(C)$.
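As a small 2×2 illustration of this additivity in the second row (the concrete numbers are my own, not from the original text):

$$
\det\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}
+ \det\begin{pmatrix} 1 & 2 \\ 5 & 6 \end{pmatrix}
= \det\begin{pmatrix} 1 & 2 \\ 3+5 & 4+6 \end{pmatrix}
$$

Both sides equal $-6$: the left side is $(-2) + (-4)$, and the right side is $10 - 16$.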
Let $A$ be a matrix. If we swap any row with any other row in $A$, the determinant of the new "swapped-row" matrix $A'$ is equal to the negative of the determinant of $A$. In other words, $\det(A') = -\det(A)$.
This works for square matrices of any size.
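For example, swapping the two rows of a 2×2 matrix flips the sign of the determinant (the numbers here are my own illustration):

$$
\det\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = -2,
\qquad
\det\begin{pmatrix} 3 & 4 \\ 1 & 2 \end{pmatrix} = 2.
$$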
Based on that rule, any matrix with two identical rows or columns has a determinant of 0: swapping the two identical rows leaves the matrix unchanged, yet the swap must negate the determinant, so $\det(A) = -\det(A)$, which forces $\det(A) = 0$. And since a matrix with determinant 0 is not invertible, any matrix with two identical rows or columns is not invertible.
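A quick numerical check of this fact using NumPy (this snippet is my own illustration, not from the original text):

```python
import numpy as np

# A matrix whose first and third rows are identical.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [1.0, 2.0, 3.0]])

print(np.linalg.det(A))  # 0.0 (up to floating-point rounding), so A is not invertible
```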