Perspectives on matrix multiplication
dot product (straightforward): entry (i, j) of MN is the dot product of row i of M with column j of N
column picture: each column of MN is a linear combination of the columns of M, weighted by the entries of the corresponding column of N

row picture: each row of MN is a linear combination of the rows of N, weighted by the entries of the corresponding row of M (see the numpy sketch below)
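A minimal numpy sketch (numpy assumed; M and N below are arbitrary 2×2 examples, not taken from these notes) checking that the three views agree with M @ N:

import numpy as np

M = np.array([[1., 2.],
              [3., 4.]])
N = np.array([[5., 6.],
              [7., 8.]])

# dot product view: entry (i, j) is row i of M dotted with column j of N
dot_view = np.array([[M[i, :] @ N[:, j] for j in range(2)] for i in range(2)])

# column view: column j of MN is a combination of the columns of M,
# weighted by the entries of column j of N
col_view = np.column_stack([sum(N[k, j] * M[:, k] for k in range(2)) for j in range(2)])

# row view: row i of MN is a combination of the rows of N,
# weighted by the entries of row i of M
row_view = np.vstack([sum(M[i, k] * N[k, :] for k in range(2)) for i in range(2)])

assert np.allclose(dot_view, M @ N)
assert np.allclose(col_view, M @ N)
assert np.allclose(row_view, M @ N)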


Inverses
For a square matrix M, if the rows or columns are linearly dependent, then the inverse does not exist.
“Let us assume that a matrix N exists for which MN = I or NM = I (M is a singular square matrix).
For MN: each column of the final matrix is a linear combination of the columns of M. Since at least two columns of M are linearly dependent, for at least one column of the final product matrix there cannot exist a linear combination of the columns of M that gives the corresponding column of the identity (1 on the diagonal and 0 everywhere else), hence N cannot exist.
For NM: each row of the final product matrix is a linear combination of the rows of M, and the same reasoning as above applies.
A square matrix A has no inverse if there is a non-zero vector x for which Ax = 0:
assume the inverse exists, so that AA⁻¹ = A⁻¹A = I;
then A⁻¹Ax = A⁻¹0 ⇒ Ix = 0 ⇒ x = 0, which contradicts x being non-zero, therefore the inverse doesn’t exist.” NOT CORRECT, VERIFICATION REQUIRED
(Write the dependent column of M as a linear combination of the other columns and multiply it by N: the corresponding column of the product then has the same dependence on the other columns of the product, which is not possible if the product is I, since the columns of I are independent.) CORRECT
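A small numpy check of the same idea (the particular M, N, and x below are illustrative choices, not from these notes): M’s third column equals the sum of its first two, so a non-zero x with Mx = 0 exists, numpy cannot invert M, and any product N @ M inherits the same column dependence and therefore cannot be I.

import numpy as np

# M is singular: its third column is the sum of the first two
M = np.array([[1., 0., 1.],
              [2., 1., 3.],
              [0., 4., 4.]])

# a non-zero vector in the null space: col1 + col2 - col3 = 0, so x = (1, 1, -1)
x = np.array([1., 1., -1.])
print(M @ x)                      # [0. 0. 0.]
print(np.linalg.matrix_rank(M))   # 2, not 3

try:
    np.linalg.inv(M)
except np.linalg.LinAlgError as err:
    print("no inverse:", err)     # singular matrix

# for any N, the columns of N @ M inherit the same dependence,
# so N @ M can never be the identity
N = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.]])
P = N @ M
print(np.allclose(P[:, 2], P[:, 0] + P[:, 1]))   # True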
Systems of equations and different ways of viewing them
