Advanced Math/Row reducing a matrix-linear equations solutions
Sorry for the long winded question
I know that when we are finding solutions for a system of linear equations there are usually three types of solutions: infinitely many solutions, exactly one solution, or no solution. I am wondering, if I were to use row reduction and matrices to solve linear equations, how would I know when there is no solution or many solutions?
Would one row become [0, 0, 0 | 2], where the identity matrix can't be formed by row reduction, and in the example above this would mean 0 = 2? Or is there a shortcut where I could find the determinant of a 3x3 matrix representing an x, y, z linear system, and if the determinant = 0 there would be no solution? But my problem is: can I find the determinant and use it to determine whether a linear system has infinitely many solutions?
Many Many Thanks,
You need to specify whether the equations are homogeneous or not, that is,
Ax = 0 homogeneous
Ax = b inhomogeneous (b ≠ 0).
For the homogeneous case, the only way to have a non-zero solution (where x is a vector, x = (x1, x2, ..., xn), with at least one of its components non-zero) is if det A = 0. This can be seen by supposing det A ≠ 0: then A has an inverse, so x = [A^-1]0 = 0, and x is all zeros.
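A quick numeric check of this (a minimal sketch in plain Python; the helper name det3 is my own, computing the 3x3 determinant by cofactor expansion):

```python
# Cofactor expansion of a 3x3 determinant (illustrative helper, not a library call).
def det3(m):
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e*i - f*h) - b * (d*i - f*g) + c * (d*h - e*g)

# Singular matrix (row 3 = row 1 + row 2): Ax = 0 has non-zero solutions.
A_singular = [[1, 2, 3], [4, 5, 6], [5, 7, 9]]
# Invertible matrix: Ax = 0 forces x = 0.
A_invertible = [[2, 0, 0], [0, 3, 0], [0, 0, 4]]

print(det3(A_singular))    # 0  -> non-trivial solutions exist
print(det3(A_invertible))  # 24 -> only the zero solution
```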
For the inhomogeneous case, if det A ≠ 0 then A has an inverse and there is exactly one solution, x = [A^-1]b. If det A = 0, the system has either no solution or infinitely many solutions; the determinant alone cannot tell you which, but row reduction can. If the reduced augmented matrix contains a row of the form [0 0 0 | c] with c ≠ 0 (that is, 0 = c, like the 0 = 2 in your example), there is no solution. If no such contradiction appears but there are fewer pivots than unknowns (a row of all zeros), there are infinitely many solutions.
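To see the three cases fall out of row reduction in practice, here is a minimal sketch (pure Python, using exact Fraction arithmetic to avoid rounding; the function name classify is my own) that reduces an augmented matrix [A | b] and reads off which case occurs:

```python
from fractions import Fraction

def classify(aug):
    """Row-reduce an augmented matrix [A | b] and report the solution type.
    `aug` is a list of rows; the last entry of each row is the b component."""
    rows = [[Fraction(x) for x in r] for r in aug]
    m, n = len(rows), len(rows[0]) - 1  # m equations, n unknowns
    pivots = 0
    for col in range(n):
        # find a pivot in this column at or below row `pivots`
        piv = next((r for r in range(pivots, m) if rows[r][col] != 0), None)
        if piv is None:
            continue
        rows[pivots], rows[piv] = rows[piv], rows[pivots]
        pv = rows[pivots][col]
        rows[pivots] = [x / pv for x in rows[pivots]]  # scale pivot row to 1
        for r in range(m):  # clear the column everywhere else
            if r != pivots and rows[r][col] != 0:
                f = rows[r][col]
                rows[r] = [x - f * y for x, y in zip(rows[r], rows[pivots])]
        pivots += 1
    # a row [0 ... 0 | c] with c != 0 encodes the contradiction 0 = c
    if any(all(x == 0 for x in r[:-1]) and r[-1] != 0 for r in rows):
        return "no solution"
    return "unique solution" if pivots == n else "infinitely many solutions"

print(classify([[1, 1, 1, 6], [0, 1, 2, 8], [0, 0, 1, 3]]))  # unique solution
print(classify([[1, 2, 3, 4], [2, 4, 6, 9]]))  # no solution (row 0 0 0 | 1)
print(classify([[1, 2, 3, 4], [2, 4, 6, 8]]))  # infinitely many solutions
```

The second system reduces to the contradiction row [0 0 0 | 1], exactly the 0 = 2 situation you describe, while the third loses a row to zeros and keeps a free variable.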