3.1 cheat sheet

  3.1.1 Summary of the content that will be on exam 2
  3.1.2 possible questions and how to answer them

3.1.1 Summary of the content that will be on exam 2

section 3.1

1.
Possible solutions of \(Ax=b\) are: no solution, unique solution, infinite number of solutions.
2.
Elementary row operations: multiply one row by a non-zero constant, interchange two rows, add a multiple of one row to another row. (These three operations are illustrated in the sketch after this list.)
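
The three operations above can be tried directly on an array. This is just a small numpy sketch with a made-up matrix (numpy is not part of the course material, only used here for illustration):

```python
import numpy as np

# Made-up 3x3 matrix, floats so that in-place scaling works.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])

A[0, :] *= 2.0                 # multiply one row by a non-zero constant
A[[0, 1], :] = A[[1, 0], :]    # interchange two rows
A[2, :] += -3.0 * A[0, :]      # add a multiple of one row to another row
print(A)
```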

section 3.2

1.
Matrices, Gaussian elimination.
2.
Setting up the augmented matrix \(\left ( A|b\right ) \)
3.
Two matrices are row equivalent if we can transform one into the other by elementary row operations.
4.
Echelon form. Back substitution to obtain the solution.

section 3.3

1.
Reduced echelon form: each leading entry in a row must be one, and all entries in the same column as a leading entry, above it or below it, must be zero. Gauss-Jordan elimination generates the reduced echelon form. We basically do Gaussian elimination, followed by backward elimination, then normalize all diagonal elements to 1. (A small sketch of this follows.)
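
A minimal sketch of Gauss-Jordan elimination using sympy's exact arithmetic (sympy is an assumption here, and the augmented matrix is invented just for illustration):

```python
from sympy import Matrix

# Invented augmented matrix (A|b) for a 3x3 system.
Ab = Matrix([[1, 2, 1,  4],
             [2, 5, 3, 10],
             [1, 1, 2,  5]])

R, pivots = Ab.rref()   # Gauss-Jordan: reduced row echelon form
print(R)                # each leading entry is 1, zeros above and below it
print(pivots)           # indices of the pivot columns
```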

section 3.4 This section is mainly on matrix operations: multiplication, how to multiply matrices, and how to write a system of linear equations as \(Ax=b\). All basic stuff. (A small sketch is below.)
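
For instance, writing a system as \(Ax=b\) and multiplying back. A numpy sketch with an invented \(2\times 2\) system:

```python
import numpy as np

# The invented system  2x1 + 3x2 = 8,  x1 - x2 = -1  written as Ax = b.
A = np.array([[2.0,  3.0],
              [1.0, -1.0]])
b = np.array([8.0, -1.0])

x = np.linalg.solve(A, b)   # solve Ax = b
print(x)                    # [1. 2.]
print(A @ x)                # multiplying back recovers b
```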

section 3.5 Inverses of matrices.

1.
To find \(A^{-1}\). Set up the \(\left ( A|I\right ) \) and generate reduced echelon form.
2.
Definition of matrix inverse. \(B\) is inverse of \(A\) if \(AB=I\ \) and \(BA=I\).
3.
Matrix inverse is unique. (theorem 1)
4.
Theorem 3: \(\left ( A^{-1}\right ) ^{-1}=A,\left ( AB\right ) ^{-1}=B^{-1}A^{-1}\)
5.
If \(A\) is square and \(Ax=b\) has unique solution then \(x=A^{-1}b\) (thm 4)
6.
A square matrix is invertible iff it is row equivalent to \(I_{n}\). An invertible matrix is also called non-singular. (Items 4 and 5 above are checked numerically in the sketch after this list.)
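
A quick numerical check of Theorems 3 and 4 from this list. This is a numpy sketch with invented invertible matrices, not part of the course text:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])   # det = 1, invertible
B = np.array([[1.0, 4.0],
              [2.0, 9.0]])   # det = 1, invertible

# Theorem 3: (AB)^{-1} = B^{-1} A^{-1}
print(np.allclose(np.linalg.inv(A @ B),
                  np.linalg.inv(B) @ np.linalg.inv(A)))   # True

# Theorem 4: if Ax = b has a unique solution then x = A^{-1} b
b = np.array([1.0, 2.0])
print(np.allclose(np.linalg.solve(A, b),
                  np.linalg.inv(A) @ b))                  # True
```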

section 3.6 Determinants.

1.
To find determinants, do cofactor expansion along a row or column. Pick the row or column with the most zeros in it, to save time.

(a)
Property 1: If we multiply one row (or column) of \(A\) by \(k\) then \(\left \vert A\right \vert \) becomes \(k\left \vert A\right \vert \)
(b)
property 2: interchanging two rows introduces a minus sign in \(\left \vert A\right \vert \)
(c)
property 3: If two rows or columns are the same then \(\left \vert A\right \vert =0\)
(d)
property 4: if \(A_{1},A_{2},B\) are identical, except that the \(i^{th}\) row of \(B\) is the sum of the \(i^{th}\) rows of \(A_{1}\) and \(A_{2}\), then \(\left \vert B\right \vert =\left \vert A_{1}\right \vert +\left \vert A_{2}\right \vert \)
(e)
property 5: adding a constant multiple of one row (or column) to another row (or column) does not change the determinant.
(f)
property 6: for an upper or lower triangular matrix, \(\left \vert A\right \vert \) is the product of all the diagonal elements.
2.
Matrix transpose (but we did not use this much in class).
3.
Thm 2. Matrix \(A\) is invertible iff \(\left \vert A\right \vert \neq 0\)
4.
Thm 3. \(\left \vert AB\right \vert =\left \vert A\right \vert \left \vert B\right \vert \), but in general \(\left \vert A+B\right \vert \neq \left \vert A\right \vert +\left \vert B\right \vert \)
5.
\(\left \vert A^{-1}\right \vert =\frac{1}{\left \vert A\right \vert }\)
6.
Cramer's rule (Thm 4), but we did not use it. We also did not do Thm 5 (adjoint matrices). (Items 3-5 above are spot-checked in the sketch after this list.)
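
A numerical spot-check of \(\left \vert AB\right \vert =\left \vert A\right \vert \left \vert B\right \vert \), \(\left \vert A^{-1}\right \vert =1/\left \vert A\right \vert \), and property 5. This is a numpy sketch with invented matrices:

```python
import numpy as np

A = np.array([[3.0, 1.0, 2.0],
              [0.0, 4.0, 1.0],
              [2.0, 2.0, 5.0]])   # det = 40, invertible
B = np.array([[1.0, 0.0, 2.0],
              [3.0, 1.0, 0.0],
              [0.0, 2.0, 4.0]])   # det = 16, invertible

print(np.isclose(np.linalg.det(A @ B),
                 np.linalg.det(A) * np.linalg.det(B)))    # |AB| = |A||B|
print(np.isclose(np.linalg.det(np.linalg.inv(A)),
                 1.0 / np.linalg.det(A)))                 # |A^{-1}| = 1/|A|

# Property 5: adding a multiple of one row to another does not change |A|.
A2 = A.copy()
A2[1, :] += 7.0 * A[0, :]
print(np.isclose(np.linalg.det(A2), np.linalg.det(A)))    # True
```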

Section 3.7 Linear equations, curve fitting. Did not cover.

section 4.1 Vector spaces.

1.
Define \(\mathbb{R} ^{3}\) as the set of all ordered triples \(\left ( a,b,c\right ) \) of real numbers (coordinates).
2.
Thm 1. If \(u,v,w\) are vectors in \(\mathbb{R} ^{3}\) then we have the properties of commutativity, associativity, additive inverse and zero element, and distributivity. See page 230 for the list.
3.
Thm 2. Two vectors \(\mathbf{u},\mathbf{v}\) are Linearly dependent iff there exist scalars \(a,b\) not both zero, such that \(a\mathbf{u}+b\mathbf{v}=\mathbf{0}\)
4.
3 vectors in \(\mathbb{R} ^{3}\) are L.D. if one vector is a linear combination of the other two vectors.
5.
THM 4. If we put the 3 vectors as the columns of \(A\) and find \(\left \vert A\right \vert =0\), then the 3 vectors are L.D. (See the sketch after this list.)
6.
For square matrix, if \(Ax=0\) has only trivial solution, then columns of \(A\) are L.I.
7.
THM 5. If 3 vectors in \(\mathbb{R} ^{3}\) are L.I., then they are basis vectors.
8.
Subspaces of \(\mathbb{R} ^{3}\). A non-empty subset \(W\) of vectors of \(\mathbb{R} ^{3}\) is a subspace iff it is closed under addition and closed under scalar multiplication. The basic problem here is to show whether a set of vectors forms a subspace or not, by checking whether the set is closed under addition and scalar multiplication.
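
A sympy sketch of the determinant test in items 5-6 (the three vectors are invented; the second is deliberately a multiple of the first):

```python
from sympy import Matrix

v1, v2, v3 = Matrix([1, 2, 3]), Matrix([2, 4, 6]), Matrix([0, 1, 1])

A = Matrix.hstack(v1, v2, v3)   # the vectors as columns of A
print(A.det())                  # 0  -> the three vectors are L.D.

# Equivalently, Ax = 0 has a nontrivial solution, so the columns are not L.I.
print(A.nullspace())            # [Matrix([[-2], [1], [0]])]
```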

section 4.2 Vector space \(\mathbb{R} ^{n}\) and subspaces. (page 238).

1.
Definition of \(\mathbb{R} ^{n}\) vector space. Page 240. 7 points listed.
2.
THM 1. Subspace. A subset of \(\mathbb{R} ^{n}\) which is itself a vector space is called a subspace. We only need to verify that it is closed under addition and closed under scalar multiplication.
3.
Solution space: the space in which the solutions of \(A_{m\times n}x_{n\times 1}=0_{m\times 1}\) live. This will always be a subspace of \(\mathbb{R} ^{n}\). To find it, do G.E. and find the free variables. The number of free variables tells us the dimension of the subspace. If there are 2 free variables, then there will be two basis vectors for the solution space, each of length \(n\). So the solution space is a subspace of \(\mathbb{R} ^{n}\). (See the sketch after this list.)
4.
The solution space of \(A_{m\times n}x=0\) is always a subspace of \(\mathbb{R} ^{n}\).
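
A sympy sketch of finding the solution space of \(Ax=0\); the \(2\times 4\) matrix is invented so that there are two free variables:

```python
from sympy import Matrix

# Invented 2x4 matrix; rank 2, so 4 - 2 = 2 free variables.
A = Matrix([[1, 2, 0, 1],
            [0, 0, 1, 3]])

basis = A.nullspace()   # basis of the solution space, a subspace of R^4
print(A.rank())         # 2
for v in basis:
    print(v.T)          # two basis vectors, each of length 4
```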

section 4.3 Linear combinations and independence of vectors

1.
Given a vector \(w\) and a set of L.I. vectors \(v_{i}\), find if \(w\) can be expressed as a linear combination of these vectors. Set up \(w=c_{1}v_{1}+c_{2}v_{2}+\cdots \), i.e. solve \(Ac=w\) where the \(v_{i}\) are the columns of \(A\), and see whether a solution \(c\) exists. If the system is consistent, then \(w\) is a linear combination of the \(v_{i}\). (See the sketch after this list.)
2.
Definition: L.I. of vectors. Solve \(Ac=0\). If \(c=0\) is the only solution, then the vectors are L.I.
3.
For square matrix, the columns are L.I. if \(\left \vert A\right \vert \neq 0\).
4.
For \(A_{m\times n}\) with \(m>n\): if the rank of \(A\) is \(n\), then the columns of \(A\) are L.I.
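
A sympy sketch of items 1-4 with invented vectors: the \(v_{i}\) are placed as the columns of \(A\), \(Ac=w\) is solved for the combination, and the rank test checks L.I.:

```python
from sympy import Matrix, linsolve, symbols

v1, v2 = Matrix([1, 0, 2]), Matrix([0, 1, 1])
w = Matrix([2, 3, 7])

A = Matrix.hstack(v1, v2)          # v_i as the columns of A
c1, c2 = symbols('c1 c2')
print(linsolve((A, w), [c1, c2]))  # {(2, 3)} -> w = 2*v1 + 3*v2, a linear combination

# Columns are L.I. iff rank(A) equals the number of columns.
print(A.rank() == A.shape[1])      # True
```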

section 4.4 Basis and dimensions of vector spaces. Did not cover for exam.

3.1.2 possible questions and how to answer them

Question Problem gives a set of linear equations in the form \(Ax=b\) and asks if the system is consistent or not.

Answer The system is consistent if it has a solution; the solution can either be unique or there can be infinitely many. To answer this, set up the augmented matrix \(\left ( A|b\right ) \) and generate the echelon form (using Gaussian elimination). Then look at the last row. Let's say \(A\) has \(m\) rows. If the last row reads \(0=0\), then there are infinitely many solutions, so the system is consistent, because this means \(0x_{m}=0\) and \(x_{m}\) can be anything.

If the last row reads \(0=r\) where \(r\) is a non-zero number, then there is no solution, hence the system is not consistent. If the last row reads (non-zero number) \(\times x_{m}=\) (some number), then there is a unique solution, so the system is consistent.

So we really need to check whether the last row reads \(0=r\) to decide. Be careful: do not check whether \(\left \vert A\right \vert \) is non-zero and then conclude the system is consistent. A system with \(\left \vert A\right \vert =0\) can still be consistent, since it can have an infinite number of solutions; \(\left \vert A\right \vert =0\) does not necessarily mean no solution.

For example, this system

\begin{align*} 3x_{1}+x_{2}-3x_{3} & =-4\\ x_{1}+x_{2}+x_{3} & =1\\ 5x_{1}+6x_{2}+8x_{3} & =0 \end{align*}

For the above, \(\left \vert A\right \vert =0\), and it happens that this system has no solution, hence it is not consistent. And the following system

\begin{align*} x_{1}+3x_{2}+3x_{3} & =13\\ 2x_{1}+5x_{2}+4x_{3} & =23\\ 2x_{1}+7x_{2}+8x_{3} & =29 \end{align*}

also has \(\left \vert A\right \vert =0\), but it has an infinite number of solutions, hence it is consistent. So, bottom line, do not use \(\left \vert A\right \vert \) to answer questions about consistency (also, \(\left \vert A\right \vert \) only works for square matrices anyway). So what does \(\left \vert A\right \vert \) give? If \(\left \vert A\right \vert \) is not zero, it says the solution is unique. So only when the question gives a square matrix and asks whether the solution is unique should we check whether \(\left \vert A\right \vert =0\) or not. Both example systems above are checked in the sketch below.
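
As a check of the two examples above, here is a sympy sketch: a system is consistent iff no row of the echelon form reads \(0=r\) with \(r\neq 0\), which is the same as rank\(\left ( A\right ) =\) rank\(\left ( A|b\right ) \). The helper function name is just for this sketch:

```python
from sympy import Matrix

def consistent(A, b):
    # Consistent iff no echelon row reads 0 = r with r nonzero,
    # i.e. iff rank(A) equals the rank of the augmented matrix (A|b).
    return A.rank() == Matrix.hstack(A, b).rank()

A1 = Matrix([[3, 1, -3], [1, 1, 1], [5, 6, 8]]); b1 = Matrix([-4, 1, 0])
A2 = Matrix([[1, 3, 3], [2, 5, 4], [2, 7, 8]]);  b2 = Matrix([13, 23, 29])

print(A1.det(), consistent(A1, b1))  # 0 False -> |A|=0 and no solution
print(A2.det(), consistent(A2, b2))  # 0 True  -> |A|=0 but infinitely many solutions
```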

Question Problem gives a set of linear equations in the form \(Ax=b\) and asks if the system has a unique solution, no solution, or infinitely many solutions.

Answer Same as above. Follow same steps.

Question Problem gives square matrix \(A\) and asks to find \(A^{-1}\).

Answer Set up the augmented matrix \(\left ( A|I\right ) \), where \(I\) is the identity matrix. Go through the forward elimination to reach echelon form, then go through the backward elimination to reach reduced echelon form, then make all the diagonal entries of \(A\) equal to \(1\). While doing these row operations, always apply them to the whole \(\left ( A|I\right ) \) system, not just to \(A\). At the end, \(A^{-1}\) will be where \(I\) was sitting. (See the sketch below.)
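
A sympy sketch of this procedure with an invented \(2\times 2\) matrix: row reduce \(\left ( A|I\right ) \) and read \(A^{-1}\) off the right half.

```python
from sympy import Matrix, eye

A = Matrix([[2, 1],
            [5, 3]])              # invented matrix, det = 1, invertible

aug = Matrix.hstack(A, eye(2))    # form (A | I)
R, _ = aug.rref()                 # Gauss-Jordan on the whole augmented matrix
Ainv = R[:, 2:]                   # A^{-1} sits where I was

print(Ainv)                       # Matrix([[3, -1], [-5, 2]])
print(A * Ainv == eye(2))         # True
```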

Question Problem gives a square matrix \(A\) and a square matrix \(B\) and asks if \(B\) is the inverse of \(A\)

Answer Start by multiplying \(AB\) and see if you can get \(I\) as the result. You also need to compute \(BA\) and see if you get \(I\) as well. If so, then \(B\) is the inverse of \(A\). Getting to \(I\) may need some matrix manipulation in the middle, but it is all algebra. This is all based on \(AA^{-1}=A^{-1}A=I\) (if \(A\) is invertible, of course). Remember also that \(A^{-1}\) is unique, i.e. a given matrix has only one matrix which is its inverse.

Question Problem asks to prove that the matrix inverse is unique.

Answer Let \(A\) be invertible and let \(B\) be its inverse. Assume now that \(C\) is also an inverse of \(A\) but \(C\neq B\). Then \(C=CI=C\left ( AB\right ) =\left ( CA\right ) B=IB=B\), hence \(C=B\), which contradicts the assumption. Proof by contradiction, so the inverse is unique.

Question