2.4 HW 4

  2.4.1 Problems listing
  2.4.2 Problem 9 section 4.3
  2.4.3 Problem 17 section 4.3
  2.4.4 Problem 18 section 4.3
  2.4.5 Problem 6 section 4.4
  2.4.6 Problem 16 section 4.4
  2.4.7 Problem 20 section 4.4
  2.4.8 Problem 5 section 4.5
  2.4.9 Problem 7 section 4.5
  2.4.10 Problem 15 section 4.5
  2.4.11 Additional problem 1
  2.4.12 Additional problem 2
  2.4.13 Additional problem 3
  2.4.14 key solution for HW 4

2.4.1 Problems listing

PDF

PDF (letter size)
PDF (legal size)

2.4.2 Problem 9 section 4.3

In Problems 9–16, express the indicated vector \(\boldsymbol {w}\) as a linear combination of the given vectors \(\boldsymbol {v}_{1},\boldsymbol {v}_{2},\dots ,\boldsymbol {v}_{k}\) if this is possible. If not, show that it is impossible.\[ \boldsymbol {w}=\begin {bmatrix} 1\\ 0\\ -7 \end {bmatrix} ,\boldsymbol {v}_{1}=\begin {bmatrix} 5\\ 3\\ 4 \end {bmatrix} ,\boldsymbol {v}_{2}=\begin {bmatrix} 3\\ 2\\ 5 \end {bmatrix} \] solution

Let \(\boldsymbol {w}=c_{1}\boldsymbol {v}_{1}+c_{2}\boldsymbol {v}_{2}\). In matrix form this becomes\begin {align*} c_{1}\begin {bmatrix} 5\\ 3\\ 4 \end {bmatrix} +c_{2}\begin {bmatrix} 3\\ 2\\ 5 \end {bmatrix} & =\begin {bmatrix} 1\\ 0\\ -7 \end {bmatrix} \\\begin {bmatrix} 5 & 3\\ 3 & 2\\ 4 & 5 \end {bmatrix}\begin {bmatrix} c_{1}\\ c_{2}\end {bmatrix} & =\begin {bmatrix} 1\\ 0\\ -7 \end {bmatrix} \end {align*}

The augmented matrix is\[\begin {bmatrix} 5 & 3 & 1\\ 3 & 2 & 0\\ 4 & 5 & -7 \end {bmatrix} \] \(R_{1}\rightarrow 3R_{1}\) and \(R_{2}\rightarrow 5R_{2}\) gives\[\begin {bmatrix} 15 & 9 & 3\\ 15 & 10 & 0\\ 4 & 5 & -7 \end {bmatrix} \] \(R_{2}\rightarrow -R_{1}+R_{2}\) gives\[\begin {bmatrix} 15 & 9 & 3\\ 0 & 1 & -3\\ 4 & 5 & -7 \end {bmatrix} \] \(R_{1}\rightarrow 4R_{1}\) and \(R_{3}\rightarrow 15R_{3}\) gives\[\begin {bmatrix} 60 & 36 & 12\\ 0 & 1 & -3\\ 60 & 75 & -105 \end {bmatrix} \] \(R_{3}\rightarrow -R_{1}+R_{3}\) gives\[\begin {bmatrix} 60 & 36 & 12\\ 0 & 1 & -3\\ 0 & 39 & -117 \end {bmatrix} \] \(R_{3}\rightarrow -39R_{2}+R_{3}\) gives\[\begin {bmatrix} 60 & 36 & 12\\ 0 & 1 & -3\\ 0 & 0 & 0 \end {bmatrix} \] Hence the system becomes\[\begin {bmatrix} 60 & 36\\ 0 & 1\\ 0 & 0 \end {bmatrix}\begin {bmatrix} c_{1}\\ c_{2}\end {bmatrix} =\begin {bmatrix} 12\\ -3\\ 0 \end {bmatrix} \] From the second row \(c_{2}=-3\) and from the first row \(60c_{1}+36c_{2} =12\), or \(c_{1}=\frac {12-36\left ( -3\right ) }{60}=2\). Hence\[ \boldsymbol {w}=2\boldsymbol {v}_{1}-3\boldsymbol {v}_{2}\] Therefore \(\boldsymbol {w}\) is a linear combination of the given vectors.
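As an optional check of the above result (not part of the required hand solution), the coefficients can be recomputed with sympy, assuming Python with the sympy package is available:

from sympy import Matrix, linsolve, symbols

c1, c2 = symbols('c1 c2')
A = Matrix([[5, 3], [3, 2], [4, 5]])   # columns are v1 and v2
w = Matrix([1, 0, -7])
# solve A*[c1, c2]^T = w; an empty solution set would mean w is not in span{v1, v2}
print(linsolve((A, w), c1, c2))        # expect {(2, -3)}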

2.4.3 Problem 17 section 4.3

In Problems 17–22, three vectors \(\boldsymbol {v}_{1},\boldsymbol {v}_{2}\), and \(\boldsymbol {v}_{3}\) are given. If they are linearly independent, show this; otherwise find a nontrivial linear combination of them that is equal to the zero vector.\[ \boldsymbol {v}_{1}=\begin {bmatrix} 1\\ 0\\ 1 \end {bmatrix} ,\boldsymbol {v}_{2}=\begin {bmatrix} 2\\ -3\\ 4 \end {bmatrix} ,\boldsymbol {v}_{3}=\begin {bmatrix} 3\\ 5\\ 2 \end {bmatrix} \] solution

The vectors are linearly independent if\[ c_{1}\boldsymbol {v}_{1}+c_{2}\boldsymbol {v}_{2}+c_{3}\boldsymbol {v}_{3}=\boldsymbol {0}\] only when \(c_{1}=c_{2}=c_{3}=0\). If the above holds with at least one \(c_{i}\) nonzero, then the vectors are linearly dependent.

Writing the above as \(A\boldsymbol {c}=\boldsymbol {0}\) gives\begin {equation} \begin {bmatrix} 1 & 2 & 3\\ 0 & -3 & 5\\ 1 & 4 & 2 \end {bmatrix}\begin {bmatrix} c_{1}\\ c_{2}\\ c_{3}\end {bmatrix} =\begin {bmatrix} 0\\ 0\\ 0 \end {bmatrix} \tag {1} \end {equation} The augmented matrix is\[\begin {bmatrix} 1 & 2 & 3 & 0\\ 0 & -3 & 5 & 0\\ 1 & 4 & 2 & 0 \end {bmatrix} \] \(R_{3}\rightarrow -R_{1}+R_{3}\) gives\[\begin {bmatrix} 1 & 2 & 3 & 0\\ 0 & -3 & 5 & 0\\ 0 & 2 & -1 & 0 \end {bmatrix} \] \(R_{2}\rightarrow 2R_{2},R_{3}\rightarrow 3R_{3}\) gives\[\begin {bmatrix} 1 & 2 & 3 & 0\\ 0 & -6 & 10 & 0\\ 0 & 6 & -3 & 0 \end {bmatrix} \] \(R_{3}\rightarrow R_{2}+R_{3}\) gives\[\begin {bmatrix} 1 & 2 & 3 & 0\\ 0 & -6 & 10 & 0\\ 0 & 0 & 7 & 0 \end {bmatrix} \] Hence the original system (1) in Echelon form becomes\[\begin {bmatrix} 1 & 2 & 3\\ 0 & -6 & 10\\ 0 & 0 & 7 \end {bmatrix}\begin {bmatrix} c_{1}\\ c_{2}\\ c_{3}\end {bmatrix} =\begin {bmatrix} 0\\ 0\\ 0 \end {bmatrix} \] The leading variables are \(c_{1},c_{2},c_{3}\). Since there are no free variables, only the trivial solution exists. We see this by back substitution. The last row gives \(c_{3}=0\), the second row gives \(c_{2}=0\) and the first row gives \(c_{1}=0\).

Since all \(c_{i}=0\), the vectors are linearly independent.
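A quick sympy check of this conclusion (a sketch, assuming the package is available): a nonzero determinant of the matrix whose columns are \(\boldsymbol {v}_{1},\boldsymbol {v}_{2},\boldsymbol {v}_{3}\) confirms linear independence.

from sympy import Matrix

V = Matrix([[1, 2, 3], [0, -3, 5], [1, 4, 2]])  # columns are v1, v2, v3
print(V.det())    # expect -7 (nonzero), so the vectors are linearly independent
print(V.rank())   # expect 3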

2.4.4 Problem 18 section 4.3

In Problems 17–22, three vectors \(\boldsymbol {v}_{1},\boldsymbol {v}_{2}\), and \(\boldsymbol {v}_{3}\) are given. If they are linearly independent, show this; otherwise find a nontrivial linear combination of them that is equal to the zero vector.\[ \boldsymbol {v}_{1}=\begin {bmatrix} 2\\ 0\\ -3 \end {bmatrix} ,\boldsymbol {v}_{2}=\begin {bmatrix} 4\\ -5\\ -6 \end {bmatrix} ,\boldsymbol {v}_{3}=\begin {bmatrix} -2\\ 1\\ 3 \end {bmatrix} \] solution

The vectors are linearly independent if\[ c_{1}\boldsymbol {v}_{1}+c_{2}\boldsymbol {v}_{2}+c_{3}\boldsymbol {v}_{3}=\boldsymbol {0}\] only when \(c_{1}=c_{2}=c_{3}=0\). If the above holds with at least one \(c_{i}\) nonzero, then the vectors are linearly dependent.

Writing the above as \(A\boldsymbol {c}=\boldsymbol {0}\) gives\begin {equation} \begin {bmatrix} 2 & 4 & -2\\ 0 & -5 & 1\\ -3 & -6 & 3 \end {bmatrix}\begin {bmatrix} c_{1}\\ c_{2}\\ c_{3}\end {bmatrix} =\begin {bmatrix} 0\\ 0\\ 0 \end {bmatrix} \tag {1} \end {equation} The augmented matrix is\[\begin {bmatrix} 2 & 4 & -2 & 0\\ 0 & -5 & 1 & 0\\ -3 & -6 & 3 & 0 \end {bmatrix} \] \(R_{1}\rightarrow 3R_{1},R_{3}\rightarrow 2R_{3}\) gives\[\begin {bmatrix} 6 & 12 & -6 & 0\\ 0 & -5 & 1 & 0\\ -6 & -12 & 6 & 0 \end {bmatrix} \] \(R_{3}\rightarrow R_{1}+R_{3}\) gives\[\begin {bmatrix} 6 & 12 & -6 & 0\\ 0 & -5 & 1 & 0\\ 0 & 0 & 0 & 0 \end {bmatrix} \] Hence the system (1) becomes\[\begin {bmatrix} 6 & 12 & -6\\ 0 & -5 & 1\\ 0 & 0 & 0 \end {bmatrix}\begin {bmatrix} c_{1}\\ c_{2}\\ c_{3}\end {bmatrix} =\begin {bmatrix} 0\\ 0\\ 0 \end {bmatrix} \] The leading variables are \(c_{1},c_{2}\) and the free variable is \(c_{3}\). Since there is a free variable, the vectors are linearly dependent. To see this, let \(c_{3}=t\). From the second row \(-5c_{2}+t=0\) or \(c_{2}=\frac {1}{5}t\). From the first row \(6c_{1}+12c_{2}-6t=0\), or \(c_{1}=\frac {6t-12\left ( \frac {1}{5}t\right ) }{6}=\frac {3}{5}t\). Hence\[\begin {bmatrix} c_{1}\\ c_{2}\\ c_{3}\end {bmatrix} =\begin {bmatrix} \frac {3}{5}t\\ \frac {1}{5}t\\ t \end {bmatrix} =t\begin {bmatrix} \frac {3}{5}\\ \frac {1}{5}\\ 1 \end {bmatrix} =\frac {1}{5}t\begin {bmatrix} 3\\ 1\\ 5 \end {bmatrix} \] Taking \(t=5\) the above becomes\[\begin {bmatrix} c_{1}\\ c_{2}\\ c_{3}\end {bmatrix} =\begin {bmatrix} 3\\ 1\\ 5 \end {bmatrix} \] Therefore we found one solution where\begin {align*} c_{1}\boldsymbol {v}_{1}+c_{2}\boldsymbol {v}_{2}+c_{3}\boldsymbol {v}_{3} & =\boldsymbol {0}\\ 3\boldsymbol {v}_{1}+\boldsymbol {v}_{2}+5\boldsymbol {v}_{3} & =\boldsymbol {0} \end {align*}

with not all \(c_{i}\) zero. Hence the vectors are linearly dependent.
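To double check the nontrivial combination found above, a short sympy sketch (assuming the package is available):

from sympy import Matrix

v1 = Matrix([2, 0, -3]); v2 = Matrix([4, -5, -6]); v3 = Matrix([-2, 1, 3])
print(3*v1 + v2 + 5*v3)                  # expect the zero vector
print(Matrix.hstack(v1, v2, v3).rank())  # expect 2 < 3, confirming dependence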

2.4.5 Problem 6 section 4.4

In Problems 1–8, determine whether or not the given vectors in \(\mathbb {R} ^{n}\) form a basis for \(\mathbb {R} ^{n}\)\[ \boldsymbol {v}_{1}=\begin {bmatrix} 0\\ 0\\ 1 \end {bmatrix} ,\boldsymbol {v}_{2}=\begin {bmatrix} 0\\ 1\\ 2 \end {bmatrix} ,\boldsymbol {v}_{3}=\begin {bmatrix} 1\\ 2\\ 3 \end {bmatrix} \] solution

If the vectors are linearly independent, then they form a basis. To check, we solve \(A\boldsymbol {c}=\boldsymbol {0}\) and see if the solution is the trivial solution or not. If the solution is the trivial solution, then the vectors are linearly independent and hence form a basis.\[ c_{1}\boldsymbol {v}_{1}+c_{2}\boldsymbol {v}_{2}+c_{3}\boldsymbol {v}_{3}=\boldsymbol {0}\] Writing the above as \(A\boldsymbol {c}=\boldsymbol {0}\) gives\begin {equation} \begin {bmatrix} 0 & 0 & 1\\ 0 & 1 & 2\\ 1 & 2 & 3 \end {bmatrix}\begin {bmatrix} c_{1}\\ c_{2}\\ c_{3}\end {bmatrix} =\begin {bmatrix} 0\\ 0\\ 0 \end {bmatrix} \tag {1} \end {equation} The augmented matrix is\[\begin {bmatrix} 0 & 0 & 1 & 0\\ 0 & 1 & 2 & 0\\ 1 & 2 & 3 & 0 \end {bmatrix} \] Since the entry in the pivot position \(\left ( 1,1\right ) \) is zero, we first swap \(R_{1}\) and \(R_{3}\).\[\begin {bmatrix} 1 & 2 & 3 & 0\\ 0 & 1 & 2 & 0\\ 0 & 0 & 1 & 0 \end {bmatrix} \] This is in Echelon form. No free variables. Therefore, the solution is the trivial solution. Eq (1) becomes \[\begin {bmatrix} 1 & 2 & 3\\ 0 & 1 & 2\\ 0 & 0 & 1 \end {bmatrix}\begin {bmatrix} c_{1}\\ c_{2}\\ c_{3}\end {bmatrix} =\begin {bmatrix} 0\\ 0\\ 0 \end {bmatrix} \] Which shows that \(c_{1}=0,c_{2}=0,c_{3}=0\). Hence the vectors form a basis for \(\mathbb {R} ^{3}\).
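As an optional check, the same conclusion follows from the determinant of the matrix whose columns are the given vectors (a sympy sketch, assuming the package is available):

from sympy import Matrix

A = Matrix([[0, 0, 1], [0, 1, 2], [1, 2, 3]])  # columns are v1, v2, v3
print(A.det())   # expect -1 (nonzero), so the three vectors form a basis of R^3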

2.4.6 Problem 16 section 4.4

In Problems 15–26, find a basis for the solution space of the given homogeneous linear system\begin {align*} x_{1}+3x_{2}+4x_{3} & =0\\ 3x_{1}+8x_{2}+7x_{3} & =0 \end {align*}

solution

\(A\boldsymbol {x}=\boldsymbol {0}\) gives\[\begin {bmatrix} 1 & 3 & 4\\ 3 & 8 & 7 \end {bmatrix}\begin {bmatrix} x_{1}\\ x_{2}\\ x_{3}\end {bmatrix} =\begin {bmatrix} 0\\ 0 \end {bmatrix} \] The augmented matrix is \[\begin {bmatrix} 1 & 3 & 4 & 0\\ 3 & 8 & 7 & 0 \end {bmatrix} \] \(R_{2}\rightarrow -3R_{1}+R_{2}\) gives\[\begin {bmatrix} 1 & 3 & 4 & 0\\ 0 & -1 & -5 & 0 \end {bmatrix} \] Hence the leading variables are \(x_{1},x_{2}\) and the free variable is \(x_{3}=t\). The system becomes\[\begin {bmatrix} 1 & 3 & 4\\ 0 & -1 & -5 \end {bmatrix}\begin {bmatrix} x_{1}\\ x_{2}\\ x_{3}\end {bmatrix} =\begin {bmatrix} 0\\ 0 \end {bmatrix} \] Last row gives \(-x_{2}-5x_{3}=0\) or \(-x_{2}=5t\). Hence \(x_{2}=-5t\). From first row, \(x_{1}+3x_{2}+4x_{3}=0\), or \(x_{1}=-3x_{2}-4x_{3}\) or \(x_{1}=-3\left ( -5t\right ) -4t=11t\). Therefore the solution is\[\begin {bmatrix} x_{1}\\ x_{2}\\ x_{3}\end {bmatrix} =\begin {bmatrix} 11t\\ -5t\\ t \end {bmatrix} =t\begin {bmatrix} 11\\ -5\\ 1 \end {bmatrix} \] Let \(t=1\). The basis is \[\begin {bmatrix} 11\\ -5\\ 1 \end {bmatrix} \] A one dimensional subspace.
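A quick sympy check of the solution space basis (a sketch, assuming the package is available):

from sympy import Matrix

A = Matrix([[1, 3, 4], [3, 8, 7]])
print(A.nullspace())   # expect a single basis vector proportional to (11, -5, 1)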

2.4.7 Problem 20 section 4.4

In Problems 15–26, find a basis for the solution space of the given homogeneous linear system\begin {align*} x_{1}-3x_{2}-10x_{3}+5x_{4} & =0\\ x_{1}+4x_{2}+11x_{3}-2x_{4} & =0\\ x_{1}+3x_{2}+8x_{3}-x_{4} & =0 \end {align*}

solution

\(A\boldsymbol {x}=\boldsymbol {0}\) gives\[\begin {bmatrix} 1 & -3 & -10 & 5\\ 1 & 4 & 11 & -2\\ 1 & 3 & 8 & -1 \end {bmatrix}\begin {bmatrix} x_{1}\\ x_{2}\\ x_{3}\\ x_{4}\end {bmatrix} =\begin {bmatrix} 0\\ 0\\ 0 \end {bmatrix} \] The augmented matrix is\[\begin {bmatrix} 1 & -3 & -10 & 5 & 0\\ 1 & 4 & 11 & -2 & 0\\ 1 & 3 & 8 & -1 & 0 \end {bmatrix} \] \(R_{2}\rightarrow -R_{1}+R_{2}\) gives\[\begin {bmatrix} 1 & -3 & -10 & 5 & 0\\ 0 & 7 & 21 & -7 & 0\\ 1 & 3 & 8 & -1 & 0 \end {bmatrix} \] \(R_{3}\rightarrow -R_{1}+R_{3}\) gives\[\begin {bmatrix} 1 & -3 & -10 & 5 & 0\\ 0 & 7 & 21 & -7 & 0\\ 0 & 6 & 18 & -6 & 0 \end {bmatrix} \] \(R_{3}\rightarrow 7R_{3}\) and \(R_{2}\rightarrow 6R_{2}\) gives\[\begin {bmatrix} 1 & -3 & -10 & 5 & 0\\ 0 & 42 & 126 & -42 & 0\\ 0 & 42 & 126 & -42 & 0 \end {bmatrix} \] \(R_{3}\rightarrow -R_{2}+R_{3}\) gives\[\begin {bmatrix} 1 & -3 & -10 & 5 & 0\\ 0 & 42 & 126 & -42 & 0\\ 0 & 0 & 0 & 0 & 0 \end {bmatrix} \] Leading variables are \(x_{1},x_{2}\) Free variables are \(x_{3}=t,x_{4}=s\). The system becomes\[\begin {bmatrix} 1 & -3 & -10 & 5\\ 0 & 42 & 126 & -42\\ 0 & 0 & 0 & 0 \end {bmatrix}\begin {bmatrix} x_{1}\\ x_{2}\\ x_{3}\\ x_{4}\end {bmatrix} =\begin {bmatrix} 0\\ 0\\ 0 \end {bmatrix} \] second row gives \(42x_{2}+126x_{3}-42x_{4}=0\) or \(42x_{2}=-126t+42s\) or \(x_{2}=-\frac {126}{42}t+\frac {42}{42}s=-3t+s\).

First row gives \(x_{1}-3x_{2}-10x_{3}+5x_{4}=0\) or \(x_{1}=3x_{2}+10x_{3}-5x_{4}\) or \(x_{1}=3\left ( -3t+s\right ) +10t-5s=t-2s\). Hence the solution is\[\begin {bmatrix} x_{1}\\ x_{2}\\ x_{3}\\ x_{4}\end {bmatrix} =\begin {bmatrix} t-2s\\ -3t+s\\ t\\ s \end {bmatrix} =t\begin {bmatrix} 1\\ -3\\ 1\\ 0 \end {bmatrix} +s\begin {bmatrix} -2\\ 1\\ 0\\ 1 \end {bmatrix} \] Setting \(t=1,s=0\) and then \(t=0,s=1\) gives the basis\[\begin {bmatrix} 1\\ -3\\ 1\\ 0 \end {bmatrix} ,\begin {bmatrix} -2\\ 1\\ 0\\ 1 \end {bmatrix} \] A two dimensional subspace.
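The two basis vectors can be confirmed with sympy (a sketch, assuming the package is available):

from sympy import Matrix

A = Matrix([[1, -3, -10, 5], [1, 4, 11, -2], [1, 3, 8, -1]])
for v in A.nullspace():   # expect (1, -3, 1, 0) and (-2, 1, 0, 1)
    print(v.T)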

2.4.8 Problem 5 section 4.5

In Problems 1–12, find both a basis for the row space and a basis for the column space of the given matrix \(A\).\[\begin {bmatrix} 1 & 1 & 1 & 1\\ 3 & 1 & -3 & 4\\ 2 & 5 & 11 & 12 \end {bmatrix} \] solution

We start by converting the matrix to reduced Echelon form.

\(R_{2}\rightarrow -3R_{1}+R_{2}\) gives\[\begin {bmatrix} 1 & 1 & 1 & 1\\ 0 & -2 & -6 & 1\\ 2 & 5 & 11 & 12 \end {bmatrix} \] \(R_{3}\rightarrow -2R_{1}+R_{3}\) gives\[\begin {bmatrix} 1 & 1 & 1 & 1\\ 0 & -2 & -6 & 1\\ 0 & 3 & 9 & 10 \end {bmatrix} \] \(R_{2}\rightarrow 3R_{2}\) and \(R_{3}\rightarrow 2R_{3}\) gives\[\begin {bmatrix} 1 & 1 & 1 & 1\\ 0 & -6 & -18 & 3\\ 0 & 6 & 18 & 20 \end {bmatrix} \] \(R_{3}\rightarrow R_{2}+R_{3}\) gives\[\begin {bmatrix} 1 & 1 & 1 & 1\\ 0 & -6 & -18 & 3\\ 0 & 0 & 0 & 23 \end {bmatrix} \] Now we start the reduced Echelon form phase. The pivots all need to be \(1\).

\(R_{2}\rightarrow \frac {-1}{6}R_{2}\) and \(R_{3}\rightarrow \frac {1}{23}R_{3}\) gives\[\begin {bmatrix} 1 & 1 & 1 & 1\\ 0 & 1 & 3 & -\frac {1}{2}\\ 0 & 0 & 0 & 1 \end {bmatrix} \] Now we need to zero all elements above each pivot.

\(R_{2}\rightarrow R_{2}+\frac {1}{2}R_{3}\) gives\[\begin {bmatrix} 1 & 1 & 1 & 1\\ 0 & 1 & 3 & 0\\ 0 & 0 & 0 & 1 \end {bmatrix} \] \(R_{1}\rightarrow R_{1}-R_{3}\) gives\[\begin {bmatrix} 1 & 1 & 1 & 0\\ 0 & 1 & 3 & 0\\ 0 & 0 & 0 & 1 \end {bmatrix} \] \(R_{1}\rightarrow R_{1}-R_{2}\) gives\[\begin {bmatrix} 1 & 0 & -2 & 0\\ 0 & 1 & 3 & 0\\ 0 & 0 & 0 & 1 \end {bmatrix} \] The above is now in reduced Echelon form. Now we can answer the question. The basis for the row space consists of the nonzero rows. Hence the row space basis is (I prefer to show all basis vectors as columns instead of rows; this just makes them easier to read)\[\begin {bmatrix} 1\\ 0\\ -2\\ 0 \end {bmatrix} ,\begin {bmatrix} 0\\ 1\\ 3\\ 0 \end {bmatrix} ,\begin {bmatrix} 0\\ 0\\ 0\\ 1 \end {bmatrix} \] The dimension is \(3\). The column space basis corresponds to the pivot columns of the original \(A\). These are columns \(1,2,4\). Hence the basis for the column space is\[\begin {bmatrix} 1\\ 3\\ 2 \end {bmatrix} ,\begin {bmatrix} 1\\ 1\\ 5 \end {bmatrix} ,\begin {bmatrix} 1\\ 4\\ 12 \end {bmatrix} \] The dimension is \(3\). We notice that the dimensions of the row space and the column space are equal, as expected. (This is called the rank of \(A\). Hence rank\(\left ( A\right ) =3\).)

The Null space of \(A\) has dimension \(1\), since there is only one free variable (\(x_{3}\)). We see that the number of columns of \(A\) (which is \(4\)) is therefore the sum of column space dimension (or the rank) and the null space dimension as expected.
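The reduced Echelon form and the pivot columns can be checked with sympy (a sketch, assuming the package is available):

from sympy import Matrix

A = Matrix([[1, 1, 1, 1], [3, 1, -3, 4], [2, 5, 11, 12]])
R, pivots = A.rref()
print(R)                    # expect rows (1,0,-2,0), (0,1,3,0), (0,0,0,1)
print(pivots)               # expect (0, 1, 3): columns 1, 2, 4 of A (0-based indexing)
print(len(A.nullspace()))   # expect 1, so rank 3 + nullity 1 = 4 columns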

2.4.9 Problem 7 section 4.5

In Problems 1–12, find both a basis for the row space and a basis for the column space of the given matrix \(A\).\[\begin {bmatrix} 1 & 1 & -1 & 7\\ 1 & 4 & 5 & 16\\ 1 & 3 & 3 & 13\\ 2 & 5 & 4 & 23 \end {bmatrix} \] solution

We start by converting the matrix to reduced Echelon form.

\(R_{2}\rightarrow -R_{1}+R_{2}\) gives\[\begin {bmatrix} 1 & 1 & -1 & 7\\ 0 & 3 & 6 & 9\\ 1 & 3 & 3 & 13\\ 2 & 5 & 4 & 23 \end {bmatrix} \] \(R_{3}\rightarrow -R_{1}+R_{3}\) gives\[\begin {bmatrix} 1 & 1 & -1 & 7\\ 0 & 3 & 6 & 9\\ 0 & 2 & 4 & 6\\ 2 & 5 & 4 & 23 \end {bmatrix} \] \(R_{4}\rightarrow -2R_{1}+R_{4}\) gives\[\begin {bmatrix} 1 & 1 & -1 & 7\\ 0 & 3 & 6 & 9\\ 0 & 2 & 4 & 6\\ 0 & 3 & 6 & 9 \end {bmatrix} \] \(R_{2}\rightarrow 2R_{2}\) and \(R_{3}\rightarrow 3R_{3}\) gives\[\begin {bmatrix} 1 & 1 & -1 & 7\\ 0 & 6 & 12 & 18\\ 0 & 6 & 12 & 18\\ 0 & 3 & 6 & 9 \end {bmatrix} \] \(R_{3}\rightarrow -R_{2}+R_{3}\) gives\[\begin {bmatrix} 1 & 1 & -1 & 7\\ 0 & 6 & 12 & 18\\ 0 & 0 & 0 & 0\\ 0 & 3 & 6 & 9 \end {bmatrix} \] \(R_{4}\rightarrow -\frac {1}{2}R_{2}+R_{4}\) gives\[\begin {bmatrix} 1 & 1 & -1 & 7\\ 0 & 6 & 12 & 18\\ 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 \end {bmatrix} \] The pivot (leading) columns are \(1,2\) and the free variables go with columns \(3,4\). The Null space of \(A\) therefore has dimension \(2\). We now convert to reduced Echelon form.

\(R_{2}\rightarrow \frac {1}{6}R_{2}\) gives\[\begin {bmatrix} 1 & 1 & -1 & 7\\ 0 & 1 & 2 & 3\\ 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 \end {bmatrix} \] \(R_{1}\rightarrow R_{1}-R_{2}\) gives\[\begin {bmatrix} 1 & 0 & -3 & 4\\ 0 & 1 & 2 & 3\\ 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 \end {bmatrix} \] The above is in reduced Echelon form. The basis for the row space consists of the nonzero rows. Hence the row space basis is (dimension 2)\[ \left \{ \begin {bmatrix} 1\\ 0\\ -3\\ 4 \end {bmatrix} ,\begin {bmatrix} 0\\ 1\\ 2\\ 3 \end {bmatrix} \right \} \] The column space basis corresponds to the pivot columns of the original \(A\). These are columns \(1,2\). Hence the basis for the column space is (dimension 2)\[ \left \{ \begin {bmatrix} 1\\ 1\\ 1\\ 2 \end {bmatrix} ,\begin {bmatrix} 1\\ 4\\ 3\\ 5 \end {bmatrix} \right \} \] We notice that the dimensions of the row space and the column space are equal, as expected.

The Null space of \(A\) has dimension \(2\), since there are two free variables. We see that the number of columns of \(A\) (which is \(4\)) is therefore the sum of the column space dimension and the null space dimension, as expected.
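As with the previous problem, the result can be checked with sympy (a sketch, assuming the package is available):

from sympy import Matrix

A = Matrix([[1, 1, -1, 7], [1, 4, 5, 16], [1, 3, 3, 13], [2, 5, 4, 23]])
R, pivots = A.rref()
print(R)                    # expect nonzero rows (1,0,-3,4) and (0,1,2,3)
print(pivots)               # expect (0, 1): the first two columns of A
print(len(A.nullspace()))   # expect 2, so rank 2 + nullity 2 = 4 columns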

2.4.10 Problem 15 section 4.5

In Problems 13–16, a set \(S\) of vectors in \(\mathbb {R} ^{4}\) is given. Find a subset of \(S\) that forms a basis for the subspace of \(\mathbb {R} ^{4}\) spanned by \(S\)\[ \boldsymbol {v}_{1}=\begin {bmatrix} 3\\ 2\\ 2\\ 2 \end {bmatrix} ,\boldsymbol {v}_{2}=\begin {bmatrix} 2\\ 1\\ 2\\ 1 \end {bmatrix} ,\boldsymbol {v}_{3}=\begin {bmatrix} 4\\ 3\\ 2\\ 3 \end {bmatrix} ,\boldsymbol {v}_{4}=\begin {bmatrix} 1\\ 2\\ 3\\ 4 \end {bmatrix} \] solution

We set up a matrix whose columns are the above vectors, then find its pivot columns. \[\begin {bmatrix} 3 & 2 & 4 & 1\\ 2 & 1 & 3 & 2\\ 2 & 2 & 2 & 3\\ 2 & 1 & 3 & 4 \end {bmatrix} \] \(R_{1}\rightarrow 2R_{1}\) and \(R_{2}\rightarrow 3R_{2}\) and \(R_{3}\rightarrow 3R_{3}\) and \(R_{4}\rightarrow 3R_{4}\). This gives\[\begin {bmatrix} 6 & 4 & 8 & 2\\ 6 & 3 & 9 & 6\\ 6 & 6 & 6 & 9\\ 6 & 3 & 9 & 12 \end {bmatrix} \] \(R_{2}\rightarrow -R_{1}+R_{2}\)\[\begin {bmatrix} 6 & 4 & 8 & 2\\ 0 & -1 & 1 & 4\\ 6 & 6 & 6 & 9\\ 6 & 3 & 9 & 12 \end {bmatrix} \] \(R_{3}\rightarrow -R_{1}+R_{3}\)\[\begin {bmatrix} 6 & 4 & 8 & 2\\ 0 & -1 & 1 & 4\\ 0 & 2 & -2 & 7\\ 6 & 3 & 9 & 12 \end {bmatrix} \] \(R_{4}\rightarrow -R_{1}+R_{4}\)\[\begin {bmatrix} 6 & 4 & 8 & 2\\ 0 & -1 & 1 & 4\\ 0 & 2 & -2 & 7\\ 0 & -1 & 1 & 10 \end {bmatrix} \] \(R_{3}\rightarrow 2R_{2}+R_{3}\)\[\begin {bmatrix} 6 & 4 & 8 & 2\\ 0 & -1 & 1 & 4\\ 0 & 0 & 0 & 15\\ 0 & -1 & 1 & 10 \end {bmatrix} \] \(R_{4}\rightarrow -R_{2}+R_{4}\)\[\begin {bmatrix} 6 & 4 & 8 & 2\\ 0 & -1 & 1 & 4\\ 0 & 0 & 0 & 15\\ 0 & 0 & 0 & 6 \end {bmatrix} \] \(R_{4}\rightarrow 15R_{4}\) and \(R_{3}\rightarrow 6R_{3}\)\[\begin {bmatrix} 6 & 4 & 8 & 2\\ 0 & -1 & 1 & 4\\ 0 & 0 & 0 & 90\\ 0 & 0 & 0 & 90 \end {bmatrix} \] \(R_{4}\rightarrow -R_{3}+R_{4}\)\[\begin {bmatrix} 6 & 4 & 8 & 2\\ 0 & -1 & 1 & 4\\ 0 & 0 & 0 & 90\\ 0 & 0 & 0 & 0 \end {bmatrix} \] Hence, the pivot columns are \(1,2,4\). Therefore the column space basis is \(\boldsymbol {v}_{1},\boldsymbol {v}_{2},\boldsymbol {v}_{4}\), given by

\[ \left \{ \begin {bmatrix} 3\\ 2\\ 2\\ 2 \end {bmatrix} ,\begin {bmatrix} 2\\ 1\\ 2\\ 1 \end {bmatrix} ,\begin {bmatrix} 1\\ 2\\ 3\\ 4 \end {bmatrix} \right \} \] The above is the subset required.
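The choice of pivot columns can be confirmed with sympy (a sketch, assuming the package is available):

from sympy import Matrix

S = Matrix([[3, 2, 4, 1], [2, 1, 3, 2], [2, 2, 2, 3], [2, 1, 3, 4]])  # columns are v1..v4
_, pivots = S.rref()
print(pivots)   # expect (0, 1, 3), i.e. v1, v2, v4 form a basis for span(S)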

2.4.11 Additional problem 1

Let \(\boldsymbol {v}_{1}\) and \(\boldsymbol {v}_{2}\) be any linearly independent vectors. Show that \(\boldsymbol {u}_{1}\) \(=2\boldsymbol {v}_{1}\) and \(\boldsymbol {u}_{2}\) \(=\boldsymbol {v}_{1}+\boldsymbol {v}_{2}\) are also linearly independent.

solution

We want to solve for \(c_{1},c_{2}\) from\begin {equation} c_{1}\boldsymbol {u}_{1}+c_{2}\boldsymbol {u}_{2}=\boldsymbol {0} \tag {1} \end {equation} And see if the solution is only the trivial solution or not. The above becomes\begin {align*} c_{1}\left ( 2\boldsymbol {v}_{1}\right ) +c_{2}\left ( \boldsymbol {v}_{1}+\boldsymbol {v}_{2}\right ) & =\boldsymbol {0}\\ 2c_{1}\boldsymbol {v}_{1}+c_{2}\boldsymbol {v}_{1}+c_{2}\boldsymbol {v}_{2} & =\boldsymbol {0}\\ \left ( 2c_{1}+c_{2}\right ) \boldsymbol {v}_{1}+c_{2}\boldsymbol {v}_{2} & =\boldsymbol {0} \end {align*}

Let \(2c_{1}+c_{2}=c_{3}\), a new constant. The above becomes\[ c_{3}\boldsymbol {v}_{1}+c_{2}\boldsymbol {v}_{2}=\boldsymbol {0}\] But we are told that \(\boldsymbol {v}_{1}\) and \(\boldsymbol {v}_{2}\) are linearly independent. Therefore the only choice is \(c_{2}=0,c_{3}=0\). But \(c_{3}=2c_{1}+c_{2}\), which means that \(c_{1}=0\). Therefore we just showed that \(c_{1}=c_{2}=0\) is the only solution to (1). This implies that \(\boldsymbol {u}_{1},\boldsymbol {u}_{2}\) are linearly independent vectors.
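The argument above is the general proof. As a concrete illustration only (the particular vectors below are my own choice, not part of the problem), the claim can be checked for one specific independent pair with sympy:

from sympy import Matrix

# example vectors chosen for illustration; any linearly independent pair works
v1 = Matrix([1, 0, 2]); v2 = Matrix([0, 1, 1])
u1 = 2*v1
u2 = v1 + v2
print(Matrix.hstack(u1, u2).rank())   # expect 2, so u1 and u2 are independent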

2.4.12 Additional problem 2

In section 4.2, we looked at the set \(W\) consisting of all vectors in \(\mathbb {R} ^{3}\) where \(x_{1}=5x_{2}\) and determined it was a subspace of \(\mathbb {R} ^{3}\). Find a basis for \(W\). What is the dimension of \(W\)?

solution

Let \(\boldsymbol {v}=\begin {bmatrix} x_{1}\\ x_{2}\\ x_{3}\end {bmatrix} \) be any vector in \(W\). Let \(x_{2}=t,x_{3}=s\). Then \(x_{1}=5x_{2}=5t\). Therefore \begin {align*} \boldsymbol {v} & =\begin {bmatrix} 5t\\ t\\ s \end {bmatrix} \\ & =t\begin {bmatrix} 5\\ 1\\ 0 \end {bmatrix} +s\begin {bmatrix} 0\\ 0\\ 1 \end {bmatrix} \end {align*}

Hence a basis for \(W\) is\[\begin {bmatrix} 5\\ 1\\ 0 \end {bmatrix} ,\begin {bmatrix} 0\\ 0\\ 1 \end {bmatrix} \] And the dimension of \(W\) is \(2.\)
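A small sympy check that these two vectors are independent and that every combination of them satisfies \(x_{1}=5x_{2}\) (a sketch, assuming the package is available):

from sympy import Matrix, symbols, simplify

t, s = symbols('t s')
b1 = Matrix([5, 1, 0]); b2 = Matrix([0, 0, 1])
v = t*b1 + s*b2
print(simplify(v[0] - 5*v[1]))        # expect 0, i.e. x1 = 5*x2 for every t, s
print(Matrix.hstack(b1, b2).rank())   # expect 2, so the basis vectors are independent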

2.4.13 Additional problem 3

Let \(S=\left \{ \boldsymbol {v}_{1},\boldsymbol {v}_{2},\boldsymbol {v}_{3}\right \} \) be a set of linearly independent vectors and suppose that \(\boldsymbol {v}\) is not an element of span \(S\). Show that \(S^{\prime }=\left \{ \boldsymbol {v},\boldsymbol {v}_{1},\boldsymbol {v}_{2},\boldsymbol {v}_{3}\right \} \) is linearly independent.

solution

Proof by contradiction. Assume the vectors \(\boldsymbol {v},\boldsymbol {v}_{1},\boldsymbol {v}_{2},\boldsymbol {v}_{3}\) are linearly dependent. Therefore we can find constants \(c_{1},c_{2},c_{3},c_{4}\), not all zero, such that\[ c_{1}\boldsymbol {v}_{1}+c_{2}\boldsymbol {v}_{2}+c_{3}\boldsymbol {v}_{3}+c_{4}\boldsymbol {v}=\boldsymbol {0}\] Note that \(c_{4}\neq 0\), since otherwise the above would be a nontrivial linear combination of \(\boldsymbol {v}_{1},\boldsymbol {v}_{2},\boldsymbol {v}_{3}\) equal to the zero vector, contradicting the linear independence of \(S\). Dividing by \(c_{4}\) and solving for \(\boldsymbol {v}\) gives\[ -\frac {c_{1}}{c_{4}}\boldsymbol {v}_{1}-\frac {c_{2}}{c_{4}}\boldsymbol {v}_{2}-\frac {c_{3}}{c_{4}}\boldsymbol {v}_{3}=\boldsymbol {v}\] Renaming the constants gives\begin {equation} C_{1}\boldsymbol {v}_{1}+C_{2}\boldsymbol {v}_{2}+C_{3}\boldsymbol {v}_{3}=\boldsymbol {v} \tag {1} \end {equation} The above says that we can represent \(\boldsymbol {v}\) as a linear combination of \(\boldsymbol {v}_{1},\boldsymbol {v}_{2},\boldsymbol {v}_{3}\). But \(\boldsymbol {v}\) is not in the span of \(S\), which means we can not reach \(\boldsymbol {v}\) using any linear combination of the vectors \(\left \{ \boldsymbol {v}_{1},\boldsymbol {v}_{2},\boldsymbol {v}_{3}\right \} \). Hence (1) is not possible.

Therefore our assumption that the vectors are linearly dependent is invalid. Hence they must be linearly independent.

2.4.14 key solution for HW 4

PDF