1.19 Lecture 17. Thursday October 23 2014

How to determine \(e^{At}\)

Summary: We solved \(x^{\prime }=A\left ( t\right ) x\left ( t\right ) +B\left ( t\right ) u\left ( t\right ) \) with \(x\left ( 0\right ) =x^{0}\). We assumed continuity of \(A,B\) (piecewise continuous is OK). The first step was to find a fundamental matrix \(\Psi \left ( t\right ) =\begin{pmatrix} \Psi ^{1}\left ( t\right ) & \Psi ^{2}\left ( t\right ) & \cdots & \Psi ^{n}\left ( t\right ) \end{pmatrix} \). This matrix is \(n\times n\) and is not unique. We then formed \(\Phi \left ( t,\tau \right ) =\Psi \left ( t\right ) \Psi ^{-1}\left ( \tau \right ) \), called the transition matrix, which is unique (Q: How can \(\Phi \left ( t,\tau \right ) \) be unique if \(\Psi \) is not?). Then we found \[ x\left ( t\right ) =\Phi \left ( t,0\right ) x\left ( 0\right ) +{\displaystyle \int \limits _{0}^{t}} \Phi \left ( t,\tau \right ) B\left ( \tau \right ) u\left ( \tau \right ) d\tau \] The output equation follows easily as well:\[ y\left ( t\right ) =C\left ( t\right ) \Phi \left ( t,0\right ) x\left ( 0\right ) +{\displaystyle \int \limits _{0}^{t}} C\left ( t\right ) \Phi \left ( t,\tau \right ) B\left ( \tau \right ) u\left ( \tau \right ) d\tau +D\left ( t\right ) u\left ( t\right ) \] Today we will talk about the LTI (linear time invariant) case, where the \(A,B,C,D\) matrices are constant and do not depend on time:\begin{align*} x^{\prime } & =Ax\left ( t\right ) +Bu\left ( t\right ) \\ y & =Cx\left ( t\right ) +Du\left ( t\right ) \end{align*}
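A short answer to the question above (filled in here, not from the lecture): any two fundamental matrices of the same system are related by \(\tilde{\Psi }\left ( t\right ) =\Psi \left ( t\right ) C\) for some constant nonsingular matrix \(C\), so the arbitrary factor cancels when the transition matrix is formed:\[ \tilde{\Psi }\left ( t\right ) \tilde{\Psi }^{-1}\left ( \tau \right ) =\Psi \left ( t\right ) CC^{-1}\Psi ^{-1}\left ( \tau \right ) =\Psi \left ( t\right ) \Psi ^{-1}\left ( \tau \right ) =\Phi \left ( t,\tau \right ) \]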

We want to find the solution to this as a special case of LTV. Is \(\Phi \left ( t,\tau \right ) \) easier to get now? Would we still need Picard iterations? It turns out \(\Phi \left ( t,\tau \right ) \) is much easier to obtain, and we do not need Picard iterations to solve the LTI case. We will introduce the matrix exponential \(e^{At}\), where \(A\) is a matrix, defined as\begin{align*} e^{At} & =I+At+\frac{A^{2}t^{2}}{2!}+\cdots \\ & =\sum _{k=0}^{\infty }\frac{A^{k}t^{k}}{k!} \end{align*}
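As a quick sanity check of this definition (an example added here for illustration): when \(A\) is nilpotent the series terminates after finitely many terms. For instance, with \(A=\begin{pmatrix} 0 & 1\\ 0 & 0 \end{pmatrix} \) we have \(A^{2}=0\), so\[ e^{At}=I+At=\begin{pmatrix} 1 & t\\ 0 & 1 \end{pmatrix} \] One can check directly that \(\frac{d}{dt}e^{At}=\begin{pmatrix} 0 & 1\\ 0 & 0 \end{pmatrix} =Ae^{At}\) and that \(e^{At}=I\) at \(t=0\).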

Is this even well defined? We ask: is the sum convergent? Let us view the partial sums \(S_{k}=\sum _{i=0}^{k}\frac{A^{i}t^{i}}{i!}\) as a sequence in a space of bounded functions and show that this sequence converges uniformly. Each \(S_{k}\) is an \(n\times n\) matrix and is continuous, since its entries are polynomials in \(t\). View it as an element of the space of bounded functions \(B\left ( \left [ 0,T\right ] ,M^{n\times n}\right ) \), equipped with the \(\sup \) norm. Now let us look at \(\left \Vert S_{k}\right \Vert _{I}\):\begin{align*} \left \Vert S_{k}\right \Vert _{I} & =\sup _{t\in \left [ 0,T\right ] }\left \Vert S_{k}\left ( t\right ) \right \Vert \\ & =\sup _{t\in \left [ 0,T\right ] }\left \Vert \sum _{i=0}^{k}\frac{A^{i}t^{i}}{i!}\right \Vert \\ & \leq \sum _{i=0}^{k}\frac{\left \Vert A^{i}\right \Vert T^{i}}{i!} \end{align*}

But \(\left \Vert A^{i}\right \Vert =\left \Vert AA\cdots A\right \Vert \leq \left \Vert A\right \Vert \left \Vert A\right \Vert \cdots \left \Vert A\right \Vert =\left \Vert A\right \Vert ^{i}\), so the above becomes\[ \left \Vert S_{k}\right \Vert _{I}\leq \sum _{i=0}^{k}\frac{\left \Vert A\right \Vert ^{i}T^{i}}{i!}\] Now we use the Weierstrass M-test. Let \(M_{i}=\frac{\left ( \left \Vert A\right \Vert T\right ) ^{i}}{i!}\); each term of the series is bounded by \(M_{i}\) on \(\left [ 0,T\right ] \), so we need \(\sum _{i=0}^{\infty }M_{i}\) to converge. But\[ \sum _{i=0}^{\infty }\frac{\left ( \left \Vert A\right \Vert T\right ) ^{i}}{i!}=e^{\left \Vert A\right \Vert T}<\infty \] By the M-test, the series for \(e^{At}\) therefore converges uniformly on \(\left [ 0,T\right ] \), and as a uniform limit of continuous functions, \(e^{At}\) is continuous. So the definition above is well posed. OK, now that we have introduced \(e^{At}\), we need to see how to use it to solve the LTI system.

Reader: \(e^{At}\) is a fundamental matrix \(\Psi \left ( t\right ) \) for the LTI system \(x^{\prime }=Ax\). One thing to check is that at \(t=0\) the matrix \(\Psi \left ( 0\right ) \) has \(n\) linearly independent columns. We also need each column to be a solution of the state equation, i.e. \(\Psi ^{\prime }=A\Psi \). Differentiating the series term by term (justified by uniform convergence),\begin{align*} \frac{d}{dt}e^{At} & =\frac{d}{dt}\left ( I+At+\frac{A^{2}t^{2}}{2!}+\cdots \right ) \\ & =0+A+A^{2}t+\frac{A^{3}t^{2}}{2!}+\cdots \\ & =A\left ( I+At+\frac{A^{2}t^{2}}{2!}+\cdots \right ) \\ & =Ae^{At} \end{align*}

Therefore \(e^{At}\) satisfies the state equation. Moreover, at \(t=0\) we get \(e^{A\cdot 0}=I\), whose columns are clearly linearly independent, so \(e^{At}\) is indeed a fundamental matrix. What about the transition matrix? Let

\begin{align*} \Phi \left ( t,\tau \right ) & =\Psi \left ( t\right ) \Psi ^{-1}\left ( \tau \right ) \\ & =e^{At}\left ( e^{A\tau }\right ) ^{-1} \end{align*}

Reader: Show that \(\left ( e^{A\tau }\right ) ^{-1}=e^{-A\tau }\).

Proof (for the case of distinct eigenvalues only): Using \(e^{At}=V\begin{pmatrix} e^{\lambda _{1}t} & & \\ & \ddots & \\ & & e^{\lambda _{n}t}\end{pmatrix} V^{-1}\), which we write as \(e^{At}=Ve^{\Lambda t}V^{-1}\), we have \(\left ( e^{At}\right ) ^{-1}=\left ( Ve^{\Lambda t}V^{-1}\right ) ^{-1}\). But for matrices, \(\left ( AB\right ) ^{-1}=B^{-1}A^{-1}\); applying this twice gives \(\left ( e^{At}\right ) ^{-1}=\left ( V^{-1}\right ) ^{-1}\left ( e^{\Lambda t}\right ) ^{-1}V^{-1}=V\left ( e^{\Lambda t}\right ) ^{-1}V^{-1}\). But \(\left ( e^{\Lambda t}\right ) ^{-1}=\begin{pmatrix} e^{-\lambda _{1}t} & & \\ & \ddots & \\ & & e^{-\lambda _{n}t}\end{pmatrix} =e^{-\Lambda t}\), hence \(\left ( e^{At}\right ) ^{-1}=Ve^{-\Lambda t}V^{-1}=e^{-At}\). QED. Therefore the above becomes \(\Phi \left ( t,\tau \right ) =e^{At}e^{-A\tau }.\) Question: I assumed distinct eigenvalues for \(A\) in the above proof for the reader. What about if \(A\) has repeated eigenvalues?
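One way to settle the repeated-eigenvalue question (a remark added here): the identity in the next reader exercise does not use eigenvalues at all. Taking \(\tau =t\) in \(e^{At}e^{-A\tau }=e^{A\left ( t-\tau \right ) }\) gives\[ e^{At}e^{-At}=e^{A\left ( t-t\right ) }=e^{0}=I \] for any square matrix \(A\), repeated eigenvalues or not, so \(\left ( e^{At}\right ) ^{-1}=e^{-At}\) holds in general.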

Reader: Show that \(e^{At}e^{-A\tau }=e^{A\left ( t-\tau \right ) }\). To show this, use the series definition above, multiply out, and simplify.
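A sketch of this computation (filled in here; it works because \(A\) commutes with itself): by the Cauchy product of the two absolutely convergent series and the binomial theorem,\begin{align*} e^{At}e^{-A\tau } & =\left ( \sum _{i=0}^{\infty }\frac{A^{i}t^{i}}{i!}\right ) \left ( \sum _{j=0}^{\infty }\frac{A^{j}\left ( -\tau \right ) ^{j}}{j!}\right ) \\ & =\sum _{k=0}^{\infty }\frac{A^{k}}{k!}\sum _{j=0}^{k}\binom{k}{j}t^{k-j}\left ( -\tau \right ) ^{j}\\ & =\sum _{k=0}^{\infty }\frac{A^{k}\left ( t-\tau \right ) ^{k}}{k!}=e^{A\left ( t-\tau \right ) } \end{align*} Note that collecting powers of \(A\) this way fails for two different matrices \(A_{1},A_{2}\) unless they commute, which is why property 3 below is false in general.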

So now that we have shown \(e^{At}\) is a fundamental matrix for \(x^{\prime }=Ax\), we can write the state solution using it as\[ x\left ( t\right ) =e^{A\left ( t-0\right ) }x\left ( 0\right ) +{\displaystyle \int _{0}^{t}} e^{A\left ( t-\tau \right ) }Bu\left ( \tau \right ) d\tau \]
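For example, in the scalar case \(n=1\) with constants \(a,b\), this reduces to the familiar first-order formula\[ x\left ( t\right ) =e^{at}x\left ( 0\right ) +{\displaystyle \int _{0}^{t}} e^{a\left ( t-\tau \right ) }b\,u\left ( \tau \right ) d\tau \]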

Note: in LTV, \(\Phi \left ( t,\tau \right ) \) was a function of two parameters \(t\) and \(\tau \). Here \(e^{A\left ( t-\tau \right ) }\) is a function of only one parameter, the difference \(t-\tau \).

Some properties of \(e^{At}\):

1. Reader: Show that \(e^{At}\) commutes with \(A\), i.e. \(Ae^{At}=e^{At}A\).
2. Reader: Is \(e^{At_{1}}e^{At_{2}}=e^{A\left ( t_{1}+t_{2}\right ) }\)?
3. Reader: Is \(e^{A_{1}}e^{A_{2}}=e^{A_{1}+A_{2}}\)? (No, in general; see the counterexample below.)
4. Reader: Is \(e^{A_{1}}e^{A_{2}}=e^{A_{2}}e^{A_{1}}\)? (No, in general; see below.)
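A standard counterexample for items 3 and 4 (added here for illustration): take the non-commuting nilpotent matrices\[ A_{1}=\begin{pmatrix} 0 & 1\\ 0 & 0 \end{pmatrix} ,\qquad A_{2}=\begin{pmatrix} 0 & 0\\ 1 & 0 \end{pmatrix} \] Since \(A_{1}^{2}=A_{2}^{2}=0\), the series terminate, giving\[ e^{A_{1}}=\begin{pmatrix} 1 & 1\\ 0 & 1 \end{pmatrix} ,\qquad e^{A_{2}}=\begin{pmatrix} 1 & 0\\ 1 & 1 \end{pmatrix} ,\qquad e^{A_{1}}e^{A_{2}}=\begin{pmatrix} 2 & 1\\ 1 & 1 \end{pmatrix} ,\qquad e^{A_{2}}e^{A_{1}}=\begin{pmatrix} 1 & 1\\ 1 & 2 \end{pmatrix} \] while \(A_{1}+A_{2}=\begin{pmatrix} 0 & 1\\ 1 & 0 \end{pmatrix} \) has \(e^{A_{1}+A_{2}}=\begin{pmatrix} \cosh 1 & \sinh 1\\ \sinh 1 & \cosh 1 \end{pmatrix} \). So in general \(e^{A_{1}}e^{A_{2}}\), \(e^{A_{2}}e^{A_{1}}\), and \(e^{A_{1}+A_{2}}\) are all different; equality holds when \(A_{1}A_{2}=A_{2}A_{1}\).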

How to determine \(e^{At}\):

There are many ways to determine \(e^{At}\) (18 or more). We will cover two of them: one uses the eigenvalue/eigenvector approach, and one is well suited for hand calculations.

First method: This method assumes \(A\) has \(n\) distinct eigenvalues, and hence \(n\) linearly independent eigenvectors. It will not work as-is if \(A\) does not have \(n\) distinct eigenvalues. In practice, most \(A\) matrices have distinct eigenvalues, unless some values were hard-coded into them. Now, let \(v^{1},v^{2},\cdots v^{n}\) be the \(n\) eigenvectors and let \(\lambda _{1},\lambda _{2},\cdots \lambda _{n}\) be the eigenvalues, where \(Av^{i}=\lambda _{i}v^{i}\). Form the modal matrix \(V=\begin{pmatrix} v^{1} & v^{2} & \cdots & v^{n}\end{pmatrix} \). This matrix diagonalizes \(A\). Hence we write \[ V^{-1}AV=\Lambda \] where \(\Lambda =\begin{pmatrix} \lambda _{1} & 0 & 0 & 0\\ 0 & \lambda _{2} & 0 & 0\\ 0 & 0 & \ddots & 0\\ 0 & 0 & 0 & \lambda _{n}\end{pmatrix} \), hence we have\begin{align*} A & =V\Lambda V^{-1}\\ e^{At} & =\sum _{k=0}^{\infty }\frac{\left ( V\Lambda V^{-1}\right ) ^{k}t^{k}}{k!} \end{align*}

In each power \(\left ( V\Lambda V^{-1}\right ) ^{k}=V\Lambda ^{k}V^{-1}\), the inner \(V^{-1}V\) factors cancel, leaving\begin{align*} e^{At} & =V\left ( \sum _{k=0}^{\infty }\frac{\Lambda ^{k}t^{k}}{k!}\right ) V^{-1}\\ & =V\left ( \sum _{k=0}^{\infty }\frac{\begin{pmatrix} \lambda _{1} & 0 & 0 & 0\\ 0 & \lambda _{2} & 0 & 0\\ 0 & 0 & \ddots & 0\\ 0 & 0 & 0 & \lambda _{n}\end{pmatrix} ^{k}t^{k}}{k!}\right ) V^{-1}\\ & =V\begin{pmatrix} \sum _{k=0}^{\infty }\frac{\lambda _{1}^{k}t^{k}}{k!} & 0 & 0 & 0\\ 0 & \sum _{k=0}^{\infty }\frac{\lambda _{2}^{k}t^{k}}{k!} & 0 & 0\\ 0 & 0 & \ddots & 0\\ 0 & 0 & 0 & \sum _{k=0}^{\infty }\frac{\lambda _{n}^{k}t^{k}}{k!}\end{pmatrix} V^{-1}\\ & =V\begin{pmatrix} e^{\lambda _{1}t} & 0 & 0 & 0\\ 0 & e^{\lambda _{2}t} & 0 & 0\\ 0 & 0 & \ddots & 0\\ 0 & 0 & 0 & e^{\lambda _{n}t}\end{pmatrix} V^{-1} \end{align*}
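As a worked illustration (an example added here, not from the lecture), take \(A=\begin{pmatrix} 0 & 1\\ -2 & -3 \end{pmatrix} \). Its characteristic polynomial is \(\lambda ^{2}+3\lambda +2=\left ( \lambda +1\right ) \left ( \lambda +2\right ) \), giving \(\lambda _{1}=-1\), \(\lambda _{2}=-2\) with eigenvectors \(v^{1}=\begin{pmatrix} 1\\ -1 \end{pmatrix} \), \(v^{2}=\begin{pmatrix} 1\\ -2 \end{pmatrix} \). Then\[ V=\begin{pmatrix} 1 & 1\\ -1 & -2 \end{pmatrix} ,\qquad V^{-1}=\begin{pmatrix} 2 & 1\\ -1 & -1 \end{pmatrix} \] and\[ e^{At}=V\begin{pmatrix} e^{-t} & 0\\ 0 & e^{-2t}\end{pmatrix} V^{-1}=\begin{pmatrix} 2e^{-t}-e^{-2t} & e^{-t}-e^{-2t}\\ -2e^{-t}+2e^{-2t} & -e^{-t}+2e^{-2t}\end{pmatrix} \] As a check, at \(t=0\) this gives \(I\), and its derivative at \(t=0\) gives \(A\).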

Next time we will look at the other method to find \(e^{At}\).

HW5 assigned.