Linear Algebra Examples
Step 1
Step 1.1
Two matrices can be multiplied if and only if the number of columns in the first matrix equals the number of rows in the second matrix. That condition holds for the two matrices in this problem, so the product is defined.
Step 1.2
Multiply each row in the first matrix by each column in the second matrix.
Step 1.3
Simplify each element of the matrix by multiplying out all the expressions.
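Since the original matrices did not survive extraction, here is a minimal sketch of the row-by-column rule from Steps 1.1–1.3, using hypothetical 3×3 matrices A and B:

```python
# Row-by-column matrix multiplication: entry (i, j) of the product is the
# dot product of row i of A with column j of B.
def matmul(A, B):
    assert len(A[0]) == len(B), "columns of A must equal rows of B"
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

# Hypothetical matrices (the originals are not shown in this page).
A = [[1, 2, 0],
     [0, 1, 3],
     [2, 0, 1]]
B = [[1, 0, 1],
     [2, 1, 0],
     [0, 1, 1]]

C = matmul(A, B)  # C == [[5, 2, 1], [2, 4, 3], [2, 1, 3]]
```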
Step 2
Step 2.1
Choose the row or column with the most 0 elements. If there are no 0 elements, choose any row or column. Multiply every element in the chosen row by its cofactor and add.
Step 2.1.1
Consider the corresponding sign chart, which for a 3×3 determinant alternates signs starting with + in the top-left:

| + - + |
| - + - |
| + - + |
Step 2.1.2
The cofactor is the minor with the sign changed if the indices match a - position on the sign chart.
Step 2.1.3
The minor for the first element in the chosen row is the determinant of the submatrix left after deleting that element's row and column.
Step 2.1.4
Multiply the first element by its cofactor.
Step 2.1.5
The minor for the second element in the chosen row is the determinant of the submatrix left after deleting that element's row and column.
Step 2.1.6
Multiply the second element by its cofactor.
Step 2.1.7
The minor for the third element in the chosen row is the determinant of the submatrix left after deleting that element's row and column.
Step 2.1.8
Multiply the third element by its cofactor.
Step 2.1.9
Add the terms together.
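The cofactor-expansion procedure in Steps 2.1.1–2.1.9 can be sketched in code. The 3×3 matrix M below is a hypothetical stand-in, since the actual product from Step 1 is not shown on this page:

```python
# Determinant by cofactor expansion along row 0.
# minor(M, i, j) deletes row i and column j; the factor (-1)**(i+j)
# implements the alternating +/- sign chart.
def minor(M, i, j):
    return [row[:j] + row[j+1:] for k, row in enumerate(M) if k != i]

def det(M):
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det(minor(M, 0, j))
               for j in range(len(M)))

# Hypothetical 3x3 matrix.
M = [[5, 2, 1],
     [2, 4, 3],
     [2, 1, 3]]

# det(M) == 5*(4*3 - 3*1) - 2*(2*3 - 3*2) + 1*(2*1 - 4*2) == 39
```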
Step 2.2
Multiply by .
Step 2.3
Multiply by .
Step 2.4
Evaluate the remaining 2×2 determinant.
Step 2.4.1
The determinant of a 2×2 matrix [[a, b], [c, d]] can be found using the formula ad - cb.
Step 2.4.2
Simplify the determinant.
Step 2.4.2.1
Simplify each term.
Step 2.4.2.1.1
Multiply by .
Step 2.4.2.1.2
Multiply by .
Step 2.4.2.2
Add and .
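As a quick check of the 2×2 formula from Step 2.4.1, with hypothetical entries a, b, c, d:

```python
# 2x2 determinant ad - cb, using hypothetical entries.
a, b, c, d = 4, 3, 1, 3
det2 = a * d - c * b  # det2 == 9
```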
Step 2.5
Simplify the determinant.
Step 2.5.1
Multiply by .
Step 2.5.2
Add and .
Step 2.5.3
Add and .
Step 3
Since the determinant is non-zero, the inverse exists.
Step 4
Set up an augmented matrix where the left half is the original matrix and the right half is the identity matrix of the same size.
Step 5
The right half of the reduced row echelon form is the inverse.
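Steps 4 and 5 can be sketched as a Gauss-Jordan elimination on the augmented matrix [M | I]; the matrix M below is hypothetical, since the original is not shown:

```python
from fractions import Fraction

# Invert M by row-reducing the augmented matrix [M | I] to [I | M^-1].
def inverse(M):
    n = len(M)
    # Augment with the identity matrix; Fractions keep the arithmetic exact.
    aug = [[Fraction(x) for x in row] +
           [Fraction(int(i == j)) for j in range(n)]
           for i, row in enumerate(M)]
    for col in range(n):
        # Find a row with a nonzero entry in this column and swap it up.
        pivot = next(r for r in range(col, n) if aug[r][col] != 0)
        aug[col], aug[pivot] = aug[pivot], aug[col]
        # Scale the pivot row so the pivot entry is 1.
        p = aug[col][col]
        aug[col] = [x / p for x in aug[col]]
        # Eliminate this column from every other row.
        for r in range(n):
            if r != col:
                f = aug[r][col]
                aug[r] = [x - f * y for x, y in zip(aug[r], aug[col])]
    # The right half of the reduced row echelon form is the inverse.
    return [row[n:] for row in aug]

# Hypothetical invertible 3x3 matrix.
M = [[5, 2, 1],
     [2, 4, 3],
     [2, 1, 3]]
Minv = inverse(M)
```

Using exact Fractions instead of floats mirrors the hand computation: the right half comes out as exact rational entries rather than rounded decimals.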