5.10 Matrices and linear algebra

The commutator of two matrices \(M,N\) is defined as\[ \left [ M,N\right ] =MN-NM \]

The anti-commutator is\[ \left [ M,N\right ] _{+}=MN+NM \] Two matrices commute when \(MN-NM=0\). Commuting matrices (when each is diagonalizable, e.g. Hermitian) share a common eigenbasis.

Properties of commutators\begin {align*} \left [ A+B,C\right ] & =\left [ A,C\right ] +\left [ B,C\right ] \\ \left [ A,B+C\right ] & =\left [ A,B\right ] +\left [ A,C\right ] \\ \left [ A,A\right ] & =0\\ \left [ A^{2},B\right ] & =A\left [ A,B\right ] +\left [ A,B\right ] A\\ \left [ AB,C\right ] & =A\left [ B,C\right ] +\left [ A,C\right ] B\\ \left [ A,BC\right ] & =\left [ A,B\right ] C+B\left [ A,C\right ] \end {align*}
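These identities are easy to sanity-check numerically. A minimal sketch using NumPy with arbitrary random \(3\times 3\) matrices (the matrices themselves are just examples):

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))

def comm(M, N):
    """Commutator [M, N] = MN - NM."""
    return M @ N - N @ M

# [A+B, C] = [A, C] + [B, C]
lhs1 = comm(A + B, C)
rhs1 = comm(A, C) + comm(B, C)

# [AB, C] = A[B, C] + [A, C]B
lhs2 = comm(A @ B, C)
rhs2 = A @ comm(B, C) + comm(A, C) @ B

assert np.allclose(lhs1, rhs1)
assert np.allclose(lhs2, rhs2)
```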

Matrices are generally noncommutative, i.e.\[ MN\neq NM \] The matrix inverse is\[ A^{-1}=\frac {1}{\left \vert A\right \vert }A_{c}^{T}\] where \(A_{c}\) is the cofactor matrix and \(\left \vert A\right \vert \) is the determinant, which must be nonzero.

The matrix inverse satisfies\[ A^{-1}A=I=AA^{-1}\] The matrix adjoint (also called the dagger) is the same as the transpose for a real matrix. For a complex matrix, the adjoint conjugates in addition to transposing:\[ A_{ij}^{\dag }=A_{ji}^{\ast }\]
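The cofactor formula for the inverse can be checked directly. A sketch (the `cofactor_matrix` helper and the particular \(3\times 3\) matrix are illustrative, not from the notes): each cofactor is \((-1)^{i+j}\) times the determinant of the minor obtained by deleting row \(i\) and column \(j\).

```python
import numpy as np

def cofactor_matrix(A):
    """Cofactor matrix: C[i, j] = (-1)^(i+j) * det of the (i, j) minor."""
    n = A.shape[0]
    C = np.empty_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 2.0]])

# A^{-1} = (1/|A|) A_c^T, and it agrees with NumPy's built-in inverse.
A_inv = cofactor_matrix(A).T / np.linalg.det(A)

assert np.allclose(A_inv @ A, np.eye(3))
assert np.allclose(A_inv, np.linalg.inv(A))
```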

If \(A_{ij}=A_{ji}\) then the matrix is symmetric; if \(A_{ij}=-A_{ji}\) it is antisymmetric.

A Hermitian matrix is one for which \(A^{\dag }=A\). If \(A^{\dag }=-A\) then it is anti-Hermitian.

Any real symmetric matrix is always Hermitian. But a complex matrix can be Hermitian without being symmetric. An example is \(\begin {pmatrix} 1 & -i\\ i & 2 \end {pmatrix} \).
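A quick numerical check of this example (using NumPy), confirming it is Hermitian but not symmetric:

```python
import numpy as np

# The complex example above: not symmetric, yet Hermitian.
A = np.array([[1, -1j],
              [1j,  2]])

dagger = A.conj().T                 # transpose, then conjugate each entry
assert not np.array_equal(A, A.T)   # not symmetric
assert np.allclose(dagger, A)       # Hermitian: A† = A
```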

A unitary matrix is one whose dagger is the same as its inverse, i.e. \begin {align*} A^{\dag } & =A^{-1}\\ A^{\dag }A & =I \end {align*}

Remember, the dagger is just the transpose followed by complex conjugation. An example of a unitary matrix is \(\frac {1}{\sqrt {2}}\begin {pmatrix} 1 & i\\ i & 1 \end {pmatrix} \). The determinant of a unitary matrix must be a complex number whose magnitude is \(1\).

Also \(\left \vert Av\right \vert =\left \vert v\right \vert \) if \(A\) is unitary. This means a unitary \(A\) maps a vector to a vector of the same norm as the original.
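The three unitarity facts above (dagger equals inverse, unit-magnitude determinant, norm preservation) can be verified for this example; the test vector is arbitrary:

```python
import numpy as np

A = (1 / np.sqrt(2)) * np.array([[1, 1j],
                                 [1j, 1]])

dagger = A.conj().T
assert np.allclose(dagger @ A, np.eye(2))       # A† A = I, so A† = A⁻¹
assert np.isclose(abs(np.linalg.det(A)), 1.0)   # |det A| = 1

v = np.array([3.0 + 1j, -2.0])
assert np.isclose(np.linalg.norm(A @ v), np.linalg.norm(v))  # norm preserved
```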

An operator that is unitary in one orthonormal basis remains unitary in any other orthonormal basis: a unitary change of basis preserves unitarity.

An orthogonal matrix is one which satisfies \begin {align*} AA^{T} & =I\\ A^{T}A & =I\\ A^{-1} & =A^{T} \end {align*}

The \(\alpha _{i}\) matrices anticommute: for \(i\neq j\)\[ \left [ \alpha _{i},\alpha _{j}\right ] _{+}=\alpha _{i}\alpha _{j}+\alpha _{j}\alpha _{i}=\begin {pmatrix} 0 & 0\\ 0 & 0 \end {pmatrix} \] Another property is that \(\det \left ( \alpha _{i}\right ) =-1\). Since they are Hermitian and unitary, \(\alpha _{i}^{-1}=\alpha _{i}\).
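A concrete family with all of these properties (assuming the \(\alpha _{i}\) here are of the Pauli-matrix type; the Pauli matrices are the standard \(2\times 2\) example of mutually anticommuting matrices that are simultaneously Hermitian and unitary) can be checked numerically:

```python
import numpy as np

# Pauli matrices: mutually anticommuting, Hermitian, unitary, det = -1.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
paulis = [sx, sy, sz]

for s in paulis:
    assert np.allclose(s.conj().T, s)               # Hermitian
    assert np.allclose(s.conj().T @ s, np.eye(2))   # unitary
    assert np.allclose(s @ s, np.eye(2))            # hence self-inverse
    assert np.isclose(np.linalg.det(s), -1)         # determinant -1

# Distinct ones anticommute: [s_i, s_j]_+ = 0 for i != j.
for i in range(3):
    for j in range(3):
        if i != j:
            anti = paulis[i] @ paulis[j] + paulis[j] @ paulis[i]
            assert np.allclose(anti, np.zeros((2, 2)))
```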

If \(H\) is Hermitian, then \(U=e^{iH}\) is unitary, since \(U^{\dag }=e^{-iH^{\dag }}=e^{-iH}=U^{-1}\).
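A numerical sketch of this fact, using the spectral decomposition to build the matrix exponential (the random Hermitian \(H\) is an arbitrary example; since \(H=V\,\mathrm {diag}(w)\,V^{\dag }\) with real \(w\), we get \(e^{iH}=V\,\mathrm {diag}(e^{iw})\,V^{\dag }\)):

```python
import numpy as np

# Build a random Hermitian H, then U = e^{iH} via eigendecomposition.
rng = np.random.default_rng(1)
X = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
H = (X + X.conj().T) / 2                 # Hermitian by construction

w, V = np.linalg.eigh(H)                 # real eigenvalues, unitary eigenvectors
U = V @ np.diag(np.exp(1j * w)) @ V.conj().T

assert np.allclose(H.conj().T, H)              # H is Hermitian
assert np.allclose(U.conj().T @ U, np.eye(3))  # U is unitary
```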

When moving a number out of a bra, make sure to complex conjugate it. For example \(\langle 3v_{1}|v_{2}\rangle =3^{\ast }\langle v_{1}|v_{2}\rangle \). But for the ket, no conjugation is needed. For example \(\langle v_{1}|3v_{2}\rangle =3\langle v_{1}|v_{2}\rangle \).

\(\langle f|\Omega |g\rangle ^{\ast }=\langle g|\Omega ^{\dag }|f\rangle \)

When moving an operator from the ket side to the bra side, remember to dagger it: \(\langle u|Tv\rangle =\langle T^{\dag }u|v\rangle \).
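These bra-ket rules can be checked numerically; note that NumPy's `np.vdot(a, b)` conjugates its first argument, matching \(\langle a|b\rangle \). The vectors, operator, and the scalar \(3i\) are arbitrary examples:

```python
import numpy as np

rng = np.random.default_rng(2)
u = rng.standard_normal(3) + 1j * rng.standard_normal(3)
v = rng.standard_normal(3) + 1j * rng.standard_normal(3)
T = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

# A scalar pulled out of the bra picks up a conjugate; out of the ket it does not.
assert np.isclose(np.vdot(3j * u, v), np.conj(3j) * np.vdot(u, v))
assert np.isclose(np.vdot(u, 3j * v), 3j * np.vdot(u, v))

# Moving an operator from the ket to the bra daggers it: <u|Tv> = <T†u|v>.
assert np.isclose(np.vdot(u, T @ v), np.vdot(T.conj().T @ u, v))
```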

If given a set of \(n\) vectors in an \(n\)-dimensional space and asked to show linear independence, set up the \(Ax=0\) system with the vectors as the columns of \(A\), and check \(\left \vert A\right \vert \). If the determinant is zero, there exists a non-trivial solution, which means the vectors are linearly dependent. Otherwise they are linearly independent.
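This determinant test can be sketched as a short function (it assumes, as above, exactly \(n\) vectors in \(n\) dimensions, so \(A\) is square):

```python
import numpy as np

def linearly_independent(vectors):
    """Stack the vectors as columns of A; Ax = 0 has only the trivial
    solution (independence) exactly when det(A) != 0."""
    A = np.column_stack(vectors)
    return not np.isclose(np.linalg.det(A), 0.0)

# Independent pair vs. a pair where one vector is a multiple of the other.
assert linearly_independent([np.array([1.0, 0.0]), np.array([1.0, 1.0])])
assert not linearly_independent([np.array([1.0, 2.0]), np.array([2.0, 4.0])])
```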

If given \(A\), then to represent it in, say, a basis \(e_{i}\), we write \(A_{ki}^{\left ( e\right ) }=\langle e_{k},Ae_{i}\rangle =\langle e_{k}|A|e_{i}\rangle \). i.e. \(A_{1,1}=\langle e_{1},Ae_{1}\rangle \), \(A_{1,2}=\langle e_{1},Ae_{2}\rangle \), and so on.
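As a numerical sketch of this recipe (the operator and rotated orthonormal basis below are arbitrary examples), computing the entries \(\langle e_{k}|A|e_{i}\rangle \) one by one agrees with the change-of-basis formula \(E^{T}AE\) when the basis vectors are the columns of \(E\):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# An orthonormal (rotated) basis, stored as the columns of E.
theta = np.pi / 6
E = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Entry by entry: A_e[k, i] = <e_k | A | e_i> = e_k . (A e_i).
A_e = np.array([[E[:, k] @ (A @ E[:, i]) for i in range(2)]
                for k in range(2)])

assert np.allclose(A_e, E.T @ A @ E)
```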