Linear Algebra Examples
Step 1
Find the matrix equation AX = B from the system of equations.
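Since the original system of equations was not captured above, here is a generic sketch of this step for a hypothetical system of three equations in the unknowns x, y, z; the coefficients a_ij and constants b_i are placeholders, not the values from the worked example.

```latex
% Writing a hypothetical 3x3 system in the matrix form A X = B
\begin{aligned}
a_{11}x + a_{12}y + a_{13}z &= b_1\\
a_{21}x + a_{22}y + a_{23}z &= b_2\\
a_{31}x + a_{32}y + a_{33}z &= b_3
\end{aligned}
\quad\Longrightarrow\quad
\underbrace{\begin{bmatrix} a_{11} & a_{12} & a_{13}\\ a_{21} & a_{22} & a_{23}\\ a_{31} & a_{32} & a_{33} \end{bmatrix}}_{A}
\underbrace{\begin{bmatrix} x\\ y\\ z \end{bmatrix}}_{X}
=
\underbrace{\begin{bmatrix} b_1\\ b_2\\ b_3 \end{bmatrix}}_{B}
```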
Step 2
Step 2.1
Find the determinant.
Step 2.1.1
Choose the row or column with the most 0 elements. If there are no 0 elements, choose any row or column. Multiply every element in the chosen row by its cofactor and add the results.
Step 2.1.1.1
Consider the corresponding sign chart; for a 3×3 matrix the signs alternate starting with + in the upper-left position (+ - + in row 1, - + - in row 2, + - + in row 3).
Step 2.1.1.2
The cofactor is the minor with the sign changed if the indices match a - position on the sign chart.
Step 2.1.1.3
The minor for the first element of the chosen row is the determinant with that element's row and column deleted.
Step 2.1.1.4
Multiply the first element by its cofactor.
Step 2.1.1.5
The minor for the second element of the chosen row is the determinant with that element's row and column deleted.
Step 2.1.1.6
Multiply the second element by its cofactor.
Step 2.1.1.7
The minor for the third element of the chosen row is the determinant with that element's row and column deleted.
Step 2.1.1.8
Multiply the third element by its cofactor.
Step 2.1.1.9
Add the terms together.
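As a general statement of the expansion carried out in Steps 2.1.1.1 through 2.1.1.9 (written here for an expansion along row 1; the worked example may expand along a different row or column):

```latex
% Cofactor expansion of a 3x3 determinant along row 1
\det A = a_{11}C_{11} + a_{12}C_{12} + a_{13}C_{13},
\qquad C_{ij} = (-1)^{i+j} M_{ij}
```

Here M_ij is the 2×2 minor obtained by deleting row i and column j of A, and the sign (-1)^(i+j) reproduces the sign chart from Step 2.1.1.1.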
Step 2.1.2
Evaluate the first 2×2 determinant.
Step 2.1.2.1
The determinant of a 2×2 matrix [[a, b], [c, d]] can be found using the formula ad - cb.
Step 2.1.2.2
Simplify the determinant.
Step 2.1.2.2.1
Simplify each term.
Step 2.1.2.2.1.1
Multiply the entries in the first term.
Step 2.1.2.2.1.2
Multiply the entries in the second term.
Step 2.1.2.2.2
Subtract the second product from the first.
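A minimal sketch of this 2×2 evaluation in code; the entries below are hypothetical, since the actual minor from the worked example is not shown above.

```python
def det2(a, b, c, d):
    """Determinant of the 2x2 matrix [[a, b], [c, d]], using the formula ad - cb."""
    return a * d - c * b

# Hypothetical entries, not the ones from the worked example.
print(det2(2, 3, 1, 4))  # 2*4 - 1*3 = 5
```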
Step 2.1.3
Evaluate the second 2×2 determinant.
Step 2.1.3.1
The determinant of a 2×2 matrix [[a, b], [c, d]] can be found using the formula ad - cb.
Step 2.1.3.2
Simplify the determinant.
Step 2.1.3.2.1
Simplify each term.
Step 2.1.3.2.1.1
Multiply the entries in the first term.
Step 2.1.3.2.1.2
Multiply the entries in the second term.
Step 2.1.3.2.2
Subtract the second product from the first.
Step 2.1.4
Evaluate the third 2×2 determinant.
Step 2.1.4.1
The determinant of a 2×2 matrix [[a, b], [c, d]] can be found using the formula ad - cb.
Step 2.1.4.2
Simplify the determinant.
Step 2.1.4.2.1
Simplify each term.
Step 2.1.4.2.1.1
Multiply the entries in the first term.
Step 2.1.4.2.1.2
Multiply the entries in the second term.
Step 2.1.4.2.2
Add the two products.
Step 2.1.5
Simplify the determinant.
Step 2.1.5.1
Simplify each term.
Step 2.1.5.1.1
Multiply the first element of the row by the value of its minor.
Step 2.1.5.1.2
Multiply the second element of the row by the value of its minor.
Step 2.1.5.1.3
Multiply the third element of the row by the value of its minor.
Step 2.1.5.2
Subtract the second term from the first.
Step 2.1.5.3
Subtract the third term from the result to finish evaluating the determinant.
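Putting Steps 2.1.1 through 2.1.5 together, here is a hedged sketch that expands a 3×3 determinant along row 1; the matrix A below is purely illustrative and is not the coefficient matrix from the worked example.

```python
def det3(m):
    """3x3 determinant by cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return (a * (e * i - h * f)    # +a11 * minor(1,1)
            - b * (d * i - g * f)  # -a12 * minor(1,2)
            + c * (d * h - g * e)) # +a13 * minor(1,3)

A = [[2, 1, 1],  # hypothetical coefficient matrix
     [1, 3, 2],
     [1, 0, 0]]
print(det3(A))  # 2*0 - 1*(-2) + 1*(-3) = -1
```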
Step 2.2
Since the determinant is non-zero, the inverse exists.
Step 2.3
Set up an augmented matrix where the left half is the original matrix and the right half is the identity matrix of the same size.
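In symbols, for a 3×3 coefficient matrix A the augmented matrix has the form shown below (the a_ij are placeholders for the entries of the original matrix):

```latex
% The augmented matrix [ A | I ]
\left[\begin{array}{ccc|ccc}
a_{11} & a_{12} & a_{13} & 1 & 0 & 0\\
a_{21} & a_{22} & a_{23} & 0 & 1 & 0\\
a_{31} & a_{32} & a_{33} & 0 & 0 & 1
\end{array}\right]
```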
Step 2.4
Find the reduced row echelon form.
Step 2.4.1
Perform a row operation to make the entry at position (2,1) a 0.
Step 2.4.1.1
Perform a row operation to make the entry at position (2,1) a 0.
Step 2.4.1.2
Simplify R2.
Step 2.4.2
Perform a row operation to make the entry at position (3,1) a 0.
Step 2.4.2.1
Perform a row operation to make the entry at position (3,1) a 0.
Step 2.4.2.2
Simplify R3.
Step 2.4.3
Multiply each element of R2 by the reciprocal of its pivot entry to make the entry at position (2,2) a 1.
Step 2.4.3.1
Multiply each element of R2 by the reciprocal of its pivot entry to make the entry at position (2,2) a 1.
Step 2.4.3.2
Simplify R2.
Step 2.4.4
Perform a row operation to make the entry at position (3,2) a 0.
Step 2.4.4.1
Perform a row operation to make the entry at position (3,2) a 0.
Step 2.4.4.2
Simplify R3.
Step 2.4.5
Multiply each element of R3 by the reciprocal of its pivot entry to make the entry at position (3,3) a 1.
Step 2.4.5.1
Multiply each element of R3 by the reciprocal of its pivot entry to make the entry at position (3,3) a 1.
Step 2.4.5.2
Simplify R3.
Step 2.4.6
Perform a row operation to make the entry at position (2,3) a 0.
Step 2.4.6.1
Perform a row operation to make the entry at position (2,3) a 0.
Step 2.4.6.2
Simplify R2.
Step 2.4.7
Perform a row operation to make the entry at position (1,3) a 0.
Step 2.4.7.1
Perform a row operation to make the entry at position (1,3) a 0.
Step 2.4.7.2
Simplify R1.
Step 2.4.8
Perform a row operation to make the entry at position (1,2) a 0.
Step 2.4.8.1
Perform a row operation to make the entry at position (1,2) a 0.
Step 2.4.8.2
Simplify R1.
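The row reduction in Step 2.4 follows the standard Gauss-Jordan pattern: clear each column below its pivot, scale each pivot to 1, then clear above the pivots. A minimal sketch of that procedure, applied to a hypothetical 3×3 matrix rather than the one from the worked example:

```python
def inverse_3x3(A):
    """Invert a 3x3 matrix by row-reducing the augmented matrix [A | I] to [I | A^-1]."""
    n = 3
    # Build the augmented matrix [A | I].
    M = [list(map(float, A[r])) + [1.0 if r == c else 0.0 for c in range(n)]
         for r in range(n)]
    for col in range(n):
        # Scale the pivot row so the pivot entry becomes 1.
        # (Assumes a nonzero pivot; a full implementation would swap rows if needed.)
        pivot = M[col][col]
        M[col] = [x / pivot for x in M[col]]
        # Row operations: subtract multiples of the pivot row to zero out the column.
        for r in range(n):
            if r != col:
                factor = M[r][col]
                M[r] = [x - factor * p for x, p in zip(M[r], M[col])]
    # The right half of the reduced matrix is the inverse.
    return [row[n:] for row in M]

A = [[2, 1, 1],  # hypothetical invertible matrix, not the one from the worked example
     [1, 3, 2],
     [1, 0, 0]]
print(inverse_3x3(A))  # [[0, 0, 1], [-2, 1, 3], [3, -1, -5]], up to floating-point rounding
```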
Step 2.5
The right half of the reduced row echelon form is the inverse.
Step 3
Left multiply both sides of the matrix equation by the inverse matrix.
Step 4
Any matrix multiplied by its inverse is always equal to the identity matrix: A^(-1) A = I.
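Written out symbolically, Steps 3 and 4 combine as follows (this is the standard identity, not something specific to the worked example):

```latex
% Left-multiplying A X = B by A^{-1}
A X = B
\;\Longrightarrow\;
A^{-1}A X = A^{-1}B
\;\Longrightarrow\;
I X = A^{-1}B
\;\Longrightarrow\;
X = A^{-1}B
```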
Step 5
Step 5.1
Two matrices can be multiplied if and only if the number of columns in the first matrix is equal to the number of rows in the second matrix. In this case, the first matrix is 3×3 and the second matrix is 3×1.
Step 5.2
Multiply each row in the first matrix by each column in the second matrix.
Step 5.3
Simplify each element of the matrix by multiplying out all the expressions.
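A minimal sketch of this row-by-column multiplication, reusing the hypothetical inverse from the earlier sketch together with a hypothetical right-hand side b; neither comes from the worked example.

```python
def matvec(M, v):
    """Multiply a 3x3 matrix by a 3x1 column vector, one row at a time."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

A_inv = [[0, 0, 1], [-2, 1, 3], [3, -1, -5]]  # hypothetical inverse from the sketch above
b = [1, 2, 3]                                 # hypothetical right-hand side
print(matvec(A_inv, b))  # [3, 9, -14]
```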
Step 6
Simplify the left and right sides of the equation.
Step 7
Find the solution.
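As a sanity check on the whole procedure, the same computation collapses to a single call in NumPy; the matrix and vector below are the same hypothetical values used in the earlier sketches, not the ones from the worked example.

```python
import numpy as np

A = np.array([[2, 1, 1], [1, 3, 2], [1, 0, 0]], dtype=float)  # hypothetical coefficients
b = np.array([1, 2, 3], dtype=float)                          # hypothetical right-hand side

x = np.linalg.solve(A, b)      # solves A x = b directly
print(x)                       # [  3.   9. -14.]
print(np.linalg.inv(A) @ b)    # same result via the explicit inverse, as in the steps above
```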