1

 linear subspaces
 eigenvalues and eigenvectors
 matrix inversion formulas
 invariant subspaces
 vector norms and matrix norms
 singular value decomposition
 generalized inverses
 semidefinite matrices

2

 linear combination: a_{1}x_{1}+a_{2}x_{2}+…+a_{k}x_{k}, with x_{i} ∈ F^{n}, a_{i} ∈ F
 span{x_{1},…,x_{k}} := {x = a_{1}x_{1}+…+a_{k}x_{k} : a_{i} ∈ F}
 x_{1},…,x_{k} ∈ F^{n} are linearly dependent if there exist
a_{1},…,a_{k} ∈ F, not all zero, such that a_{1}x_{1}+…+a_{k}x_{k} = 0;
otherwise they are linearly independent.
 {x_{1},…,x_{k}} ⊂ S is a basis for S if x_{1},…,x_{k}
are linearly independent and S = span{x_{1},…,x_{k}}.
 {x_{1},…,x_{k}} in F^{n} are mutually orthogonal
if x_{i}*x_{j} = 0 for all i ≠ j, and orthonormal if x_{i}*x_{j} = δ_{ij}
 orthogonal complement of a subspace S ⊂ F^{n}:
 S^{⊥} := {y ∈ F^{n} : y*x = 0 for all x ∈ S}
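A minimal numerical sketch of these definitions (using numpy/scipy; the vectors are illustrative choices, not from the notes):

```python
import numpy as np
from scipy.linalg import null_space

# Checking orthonormality via the Gram matrix, and computing a basis of
# the orthogonal complement S^perp of S = span{x1, x2} in F^3.
x1 = np.array([1.0, 0.0, 0.0])
x2 = np.array([0.0, 1.0, 0.0])
G = np.column_stack([x1, x2])       # vectors as columns
gram = G.conj().T @ G               # entries are x_i* x_j; identity iff orthonormal
print(np.allclose(gram, np.eye(2)))

# y is in S^perp iff y*x = 0 for all x in S, i.e. G* y = 0
S_perp = null_space(G.conj().T)     # here: span{e3}, shape (3, 1)
```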

3

 linear transformation A: F^{n} → F^{m}
 kernel or null space: Ker A = N(A) := {x ∈ F^{n} : Ax = 0}
 image or range of A: Im A = R(A) := {y ∈ F^{m} : y = Ax, x ∈ F^{n}}
 Let a_{i}, i = 1, 2, …, n denote the columns of a matrix A ∈ F^{m×n}. Then
 Im A = span{a_{1}, a_{2}, …, a_{n}}.
 The rank of a matrix A is defined by rank(A) = dim(Im A).
 rank(A) = rank(A*).
 A ∈ F^{m×n} is full row rank if m ≤ n and rank(A) = m.
 A is full column rank if n ≤ m and rank(A) = n.
 unitary matrix: U*U = I = UU*
 Let D ∈ F^{n×k} (n > k) be such that D*D = I.
Then there exists a matrix D_{⊥} ∈ F^{n×(n−k)} such that [D D_{⊥}] is a unitary matrix.
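This completion can be computed numerically: the columns of D_{⊥} are an orthonormal basis of the orthogonal complement of Im D. A sketch using scipy's `null_space` (the random D is illustrative):

```python
import numpy as np
from scipy.linalg import null_space

# Given D in F^{n x k} (n > k) with D*D = I, build D_perp so that
# [D D_perp] is unitary.
rng = np.random.default_rng(0)
n, k = 5, 2
D, _ = np.linalg.qr(rng.standard_normal((n, k)))   # D*D = I by construction
D_perp = null_space(D.conj().T)                    # n x (n-k), orthonormal columns
Q = np.hstack([D, D_perp])
print(np.allclose(Q.conj().T @ Q, np.eye(n)))      # True: Q is unitary
```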

4

 Sylvester equation: AX + XB = C,
with A ∈ F^{n×n}, B ∈ F^{m×m}, and C ∈ F^{n×m},
has a unique solution X ∈ F^{n×m} if and only if λ_{i}(A) + λ_{j}(B) ≠ 0, ∀ i = 1, 2, …, n and j = 1, 2, …, m.
 “Lyapunov Equation”: the special case B = A*.
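A quick numerical check with scipy's Sylvester solver (the matrices here are illustrative; their eigenvalue sums are plainly nonzero, so the condition above holds):

```python
import numpy as np
from scipy.linalg import solve_sylvester

# Solve AX + XB = C; eigenvalues of A are {1, 3}, of B are {4, 5},
# so every sum lambda_i(A) + lambda_j(B) is nonzero and X is unique.
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
B = np.array([[4.0, 1.0],
              [0.0, 5.0]])
C = np.array([[1.0, 2.0],
              [3.0, 4.0]])
X = solve_sylvester(A, B, C)
print(np.allclose(A @ X + X @ B, C))   # True
```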
 Let A ∈ F^{m×n} and B ∈ F^{n×k}. Then
 rank(A) + rank(B) − n ≤ rank(AB) ≤ min{rank(A), rank(B)}.
 the trace of A = [a_{ij}] ∈ C^{n×n}: Trace(A) := Σ_{i=1}^{n} a_{ii}
 trace has the following properties:
 Trace(αA) = α Trace(A), ∀ α ∈ C, A ∈ C^{n×n}
 Trace(A+B) = Trace(A) + Trace(B), ∀ A, B ∈ C^{n×n}
 Trace(AB) = Trace(BA), ∀ A ∈ C^{m×n}, B ∈ C^{n×m}
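These identities are easy to spot-check numerically (random instances only, not a proof):

```python
import numpy as np

# Verify homogeneity, additivity, and the cyclic property of the trace.
rng = np.random.default_rng(2)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 3))
C = rng.standard_normal((3, 3))
a = 2.5

print(np.isclose(np.trace(a * C), a * np.trace(C)))      # Trace(aA) = a Trace(A)
print(np.isclose(np.trace(C + C.T), 2 * np.trace(C)))    # additivity
print(np.isclose(np.trace(A @ B), np.trace(B @ A)))      # Trace(AB) = Trace(BA)
```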

5

 The eigenvalues and eigenvectors of A ∈ C^{n×n}: λ ∈ C, x ∈ C^{n}
 Ax = λx: x is a right eigenvector
 y is a left eigenvector: y*A = λy*
 eigenvalues: the roots of det(λI − A).
 spectral radius: ρ(A) := max_{i} |λ_{i}|
 Jordan canonical form: for A ∈ C^{n×n}, ∃ T such that A = TJT^{−1}.
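A small sketch of these definitions with numpy (the matrix is an illustrative choice):

```python
import numpy as np

# Right eigenvectors and the spectral radius rho(A) = max_i |lambda_i|.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])           # triangular, so eigenvalues are 2 and 3
lam, V = np.linalg.eig(A)            # columns of V are right eigenvectors
x = V[:, 0]
print(np.allclose(A @ x, lam[0] * x))   # Ax = lambda x
rho = np.max(np.abs(lam))
print(rho)                              # 3.0
```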

6

 The transformation T has the following form:
 where t_{ij1} are the eigenvectors of A: A t_{ij1} = λ_{i} t_{ij1}, and t_{ijk} ≠ 0, defined by the following linear equations for k ≥ 2,
 (A − λ_{i}I) t_{ijk} = t_{ij(k−1)},
 are called the generalized eigenvectors of A.
 A ∈ R^{n×n} with distinct eigenvalues can be diagonalized:

7

 and has the following spectral decomposition:
 A = Σ_{i=1}^{n} λ_{i} x_{i} y_{i}*
 where y_{i} ∈ C^{n} is given by
 A ∈ R^{n×n} with real eigenvalue λ ∈ R ⇒ real eigenvector x ∈ R^{n}
 A is Hermitian, i.e., A = A* ⇒ ∃ unitary U such that A = UΛU*, where Λ = diag{λ_{1}, λ_{2}, …, λ_{n}} is real.
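The Hermitian case can be verified directly with numpy's `eigh`, which returns exactly this factorization (the matrix below is an illustrative example):

```python
import numpy as np

# A = A* implies A = U Lambda U* with U unitary and Lambda real.
A = np.array([[2.0, 1.0 + 1.0j],
              [1.0 - 1.0j, 3.0]])
lam, U = np.linalg.eigh(A)                            # lam real, U unitary
print(np.allclose(U @ np.diag(lam) @ U.conj().T, A))  # A = U Lambda U*
print(np.allclose(U.conj().T @ U, np.eye(2)))         # U is unitary
print(np.all(np.isreal(lam)))                         # spectrum is real
```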

10

 A subspace S ⊂ C^{n} is an A-invariant subspace if Ax ∈ S for every x ∈ S.
 For example, {0}, C^{n}, Ker A, and Im A are all A-invariant subspaces.
 Let λ and x be an eigenvalue and a corresponding eigenvector of A ∈ C^{n×n}. Then S := span{x} is an A-invariant subspace since
 Ax = λx ∈ S.
 In general, let {λ_{1}, λ_{2}, …, λ_{k}} (not necessarily distinct) and x_{i} be a set of eigenvalues and a set of corresponding eigenvectors and generalized eigenvectors. Then S = span{x_{1},…,x_{k}} is an invariant subspace provided that all the lower-rank generalized eigenvectors are included.
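A small numerical illustration of this point, using a Jordan block (the matrix and vectors are illustrative):

```python
import numpy as np

# A has a Jordan block at lambda = 1 coupling x1 (eigenvector) and
# x2 (generalized eigenvector: (A - I)x2 = x1), plus eigenvalue 2.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0]])
x1 = np.array([1.0, 0.0, 0.0])
x2 = np.array([0.0, 1.0, 0.0])

print(np.allclose(A @ x1, 1.0 * x1))        # span{x1} is A-invariant
# span{x2} alone is NOT invariant (A x2 = x2 + x1 leaves the span),
# but span{x1, x2}, with the lower-rank vector x1 included, is:
S = np.column_stack([x1, x2])
proj = S @ np.linalg.pinv(S)                # orthogonal projector onto span{x1, x2}
print(np.allclose(proj @ (A @ x2), A @ x2)) # True: A x2 stays in the span
```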

11

 An A-invariant subspace S ⊂ C^{n} is called a stable invariant subspace if all the eigenvalues of A constrained to S have negative real parts.
 Stable invariant subspaces are used to compute the stabilizing solutions of the algebraic Riccati equations.
 Example: Let A be such that
 with Re λ_{1} < 0, λ_{3} < 0, and λ_{4} > 0. Then it is easy to verify that
 S_{1} = span{x_{1}}, S_{12} = span{x_{1}, x_{2}}, S_{123} = span{x_{1}, x_{2}, x_{3}},
S_{3} = span{x_{3}}, S_{13} = span{x_{1}, x_{3}}, S_{124} = span{x_{1}, x_{2}, x_{4}},
S_{4} = span{x_{4}}, S_{14} = span{x_{1}, x_{4}}, S_{34} = span{x_{3}, x_{4}}
 are all A-invariant subspaces. Moreover, S_{1}, S_{3}, S_{12}, S_{13}, and S_{123} are stable A-invariant subspaces.

12

 However, the subspaces S_{2} = span{x_{2}}, S_{23} = span{x_{2}, x_{3}}, S_{24} = span{x_{2}, x_{4}}, and S_{234} = span{x_{2}, x_{3}, x_{4}} are not A-invariant subspaces, since the lower-rank generalized eigenvector x_{1} of x_{2} is not in these subspaces.
 To illustrate, consider the subspace S_{23}. It is an A-invariant subspace if Ax_{2} ∈ S_{23}. Since Ax_{2} = λ_{1}x_{2} + x_{1}, Ax_{2} ∈ S_{23} would require that x_{1} be a linear combination of x_{2} and x_{3}, but this is impossible since x_{1} is independent of x_{2} and x_{3}.

13

 Norm: Let X be a vector space. ‖·‖ is a norm if
 (i) ‖x‖ ≥ 0 (positivity);
 (ii) ‖x‖ = 0 if and only if x = 0 (positive definiteness);
 (iii) ‖αx‖ = |α| ‖x‖ for any scalar α (homogeneity);
 (iv) ‖x+y‖ ≤ ‖x‖ + ‖y‖ (triangle inequality)
 for any x ∈ X and y ∈ X.
 Let x ∈ C^{n}. Then we define the vector p-norm of x as
 ‖x‖_{p} := (Σ_{i=1}^{n} |x_{i}|^{p})^{1/p}, 1 ≤ p < ∞

14

 In particular, when p = 1, 2, ∞, we have
 ‖x‖_{1} = Σ_{i=1}^{n} |x_{i}|, ‖x‖_{2} = (Σ_{i=1}^{n} |x_{i}|^{2})^{1/2}, ‖x‖_{∞} = max_{i} |x_{i}|
 Induced Matrix Norm: the matrix norm induced by a vector p-norm is defined as
 ‖A‖_{p} := sup_{x≠0} ‖Ax‖_{p} / ‖x‖_{p}
 In particular, for p = 1, 2, ∞, the corresponding induced matrix norm can be computed as
 ‖A‖_{1} = max_{j} Σ_{i} |a_{ij}| (largest column sum), ‖A‖_{2} = σ_{max}(A), ‖A‖_{∞} = max_{i} Σ_{j} |a_{ij}| (largest row sum)
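These closed forms can be checked against numpy's built-in norms (the matrix is an illustrative example):

```python
import numpy as np

# Induced 1-, 2-, and infinity-norms: numpy vs. the closed forms.
A = np.array([[1.0, -2.0],
              [3.0,  4.0]])

n1 = np.linalg.norm(A, 1)          # induced 1-norm
n2 = np.linalg.norm(A, 2)          # induced 2-norm
ninf = np.linalg.norm(A, np.inf)   # induced infinity-norm

print(np.isclose(n1, np.abs(A).sum(axis=0).max()))              # max column sum = 6
print(np.isclose(n2, np.linalg.svd(A, compute_uv=False)[0]))    # sigma_max(A)
print(np.isclose(ninf, np.abs(A).sum(axis=1).max()))            # max row sum = 7
```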

15

 Properties of Euclidean Norm: The Euclidean 2-norm has some very nice properties:
 Let x ∈ F^{n} and y ∈ F^{m}.
 1. Suppose n ≥ m. Then ‖x‖ = ‖y‖ iff there is a matrix U ∈ F^{n×m} such that x = Uy and U*U = I.
 2. Suppose n = m. Then |x*y| ≤ ‖x‖ ‖y‖. Moreover, the equality holds iff x = αy for some α ∈ F or y = 0.
 3. ‖x‖ ≤ ‖y‖ iff there is a matrix Δ ∈ F^{n×m} with ‖Δ‖ ≤ 1 such that x = Δy. Furthermore, ‖x‖ < ‖y‖ iff ‖Δ‖ < 1.
 4. ‖Ux‖ = ‖x‖ for any appropriately dimensioned unitary matrix U.

16

 Properties of Matrix Norm:
 Frobenius norm: ‖A‖_{F} := (Trace(A*A))^{1/2}
 Let A and B be any matrices with appropriate dimensions. Then
 1. ρ(A) ≤ ‖A‖ (this is also true for the Frobenius norm and any induced matrix norm).
 2. ‖AB‖ ≤ ‖A‖ ‖B‖. In particular, this gives ‖A^{−1}‖ ≥ ‖A‖^{−1} if A is invertible. (This is also true for any induced matrix norm.)
 3. ‖UAV‖ = ‖A‖ and ‖UAV‖_{F} = ‖A‖_{F}, for any appropriately dimensioned unitary matrices U and V.
 4. ‖AB‖_{F} ≤ ‖A‖ ‖B‖_{F} and ‖AB‖_{F} ≤ ‖B‖ ‖A‖_{F}.
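The listed properties can be spot-checked on random instances (a sketch, not a proof; `2` denotes the induced 2-norm, `'fro'` the Frobenius norm in numpy):

```python
import numpy as np

# Spot-checks of properties 1-4 on one random instance.
rng = np.random.default_rng(3)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

rho = np.max(np.abs(np.linalg.eigvals(A)))
print(rho <= np.linalg.norm(A, 2) + 1e-12)      # 1. rho(A) <= ||A||

fAB = np.linalg.norm(A @ B, 'fro')
print(fAB <= np.linalg.norm(A, 2) * np.linalg.norm(B, 'fro') + 1e-12)  # 4.
print(fAB <= np.linalg.norm(B, 2) * np.linalg.norm(A, 'fro') + 1e-12)  # 4.

Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # a unitary matrix
print(np.isclose(np.linalg.norm(Q @ A, 'fro'),
                 np.linalg.norm(A, 'fro')))        # 3. unitary invariance
```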

17

 Let A ∈ F^{m×n}. There exist unitary matrices
 U = [u_{1}, u_{2}, …, u_{m}] ∈ F^{m×m}, V = [v_{1}, v_{2}, …, v_{n}] ∈ F^{n×n}
 such that A = UΣV*, where
 with Σ_{1} = diag{σ_{1}, σ_{2}, …, σ_{p}} and
 σ_{1} ≥ σ_{2} ≥ … ≥ σ_{p} ≥ 0, p = min{m, n}.
 σ̄(A) = σ_{max}(A) = σ_{1}(A) = the largest singular value of A;
 σ̲(A) = σ_{min}(A) = σ_{p}(A) = the smallest singular value of A.
 Note that
 Av_{i} = σ_{i}u_{i}, A*u_{i} = σ_{i}v_{i}
 A*Av_{i} = σ_{i}^{2}v_{i}, AA*u_{i} = σ_{i}^{2}u_{i}
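These relations hold exactly for the factors numpy returns (the matrix is an illustrative example; `np.linalg.svd` returns V*, i.e. `Vh`):

```python
import numpy as np

# SVD A = U Sigma V* and the relations A v_i = sigma_i u_i, A* u_i = sigma_i v_i.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 0.0]])
U, s, Vh = np.linalg.svd(A)        # s is sorted: s[0] >= s[1] >= 0
V = Vh.conj().T
p = min(A.shape)
for i in range(p):
    assert np.allclose(A @ V[:, i], s[i] * U[:, i])            # A v_i = sigma_i u_i
    assert np.allclose(A.conj().T @ U[:, i], s[i] * V[:, i])   # A* u_i = sigma_i v_i
print(s[0])                        # sigma_max(A) = 3.0 for this A
```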

18

 Singular values are good measures of the “size” of a matrix; singular vectors are good indications of strong/weak input or output directions.
 Geometrically, the singular values of a matrix A are precisely the lengths of the semi-axes of the hyperellipsoid E defined by
 E = {y : y = Ax, x ∈ C^{n}, ‖x‖ = 1}.
 Thus v_{1} is the direction in which ‖y‖ is largest over all ‖x‖ = 1, while v_{n} is the direction in which ‖y‖ is smallest over all ‖x‖ = 1.
 v_{1} (v_{n}) is the highest (lowest) gain input direction
 u_{1} (u_{m}) is the highest (lowest) gain observing direction
 e.g.,
 A maps a unit disk to an ellipsoid with semi-axes of σ_{1} and σ_{2}.

19

 Alternative definitions:
 σ̄(A) = max_{‖x‖=1} ‖Ax‖, and for the smallest singular value σ̲ of a tall matrix, σ̲(A) = min_{‖x‖=1} ‖Ax‖.
 Suppose A and Δ are square matrices. Then

20

 Some useful properties
 Let A ∈ F^{m×n} and σ_{1} ≥ σ_{2} ≥ … ≥ σ_{r} > σ_{r+1} = … = 0, r ≤ min{m, n}. Then
 1. rank(A) = r;
 2. Ker A = span{v_{r+1}, v_{r+2}, …, v_{n}} and (Ker A)^{⊥} = span{v_{1}, v_{2}, …, v_{r}};
 3. Im A = span{u_{1}, u_{2}, …, u_{r}} and (Im A)^{⊥} = span{u_{r+1}, u_{r+2}, …, u_{m}};
 4. A ∈ F^{m×n} has a dyadic expansion: A = Σ_{i=1}^{r} σ_{i}u_{i}v_{i}* = U_{r}Σ_{r}V_{r}*,
 where U_{r} = [u_{1}, u_{2}, …, u_{r}], V_{r} = [v_{1}, v_{2}, …, v_{r}], and Σ_{r} = diag{σ_{1}, σ_{2}, …, σ_{r}};
 5. ‖A‖_{F}^{2} = σ_{1}^{2} + σ_{2}^{2} + … + σ_{r}^{2};
 6. ‖A‖ = σ_{1};
 7. σ_{i}(U_{0}AV_{0}) = σ_{i}(A), i = 1, 2, …, p, for any appropriately dimensioned unitary matrices U_{0} and V_{0}.
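Properties 1, 2, and 5 can be read directly off a computed SVD (the rank-1 matrix below is an illustrative example):

```python
import numpy as np

# Rank, kernel, and Frobenius norm from the singular values.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [0.0, 0.0]])           # rank-1: outer product of (1,2,0) and (1,2)
U, s, Vh = np.linalg.svd(A)
tol = 1e-10
r = int(np.sum(s > tol))
print(r)                             # 1 = rank(A)

# Ker A = span{v_{r+1}, ..., v_n}: the last n - r rows of Vh
for v in Vh[r:]:
    assert np.allclose(A @ v, 0)

# ||A||_F^2 = sigma_1^2 + ... + sigma_r^2
print(np.isclose(np.linalg.norm(A, 'fro')**2, np.sum(s**2)))   # True
```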

21

 Let A ∈ F^{m×n}. X ∈ F^{n×m} is a right inverse if AX = I. If A has full row rank, one right inverse is given by X = A*(AA*)^{−1}. If YA = I, then Y is a left inverse of A.
 Pseudo-inverse or Moore–Penrose inverse A^{+}:
 (i) AA^{+}A = A; (ii) A^{+}AA^{+} = A^{+}; (iii) (AA^{+})* = AA^{+}; (iv) (A^{+}A)* = A^{+}A.
 The pseudo-inverse is unique.
 Let A = BC, where B has full column rank and C has full row rank. Then A^{+} = C*(CC*)^{−1}(B*B)^{−1}B*.
 Alternatively, let A = UΣV* with Σ_{r} = diag{σ_{1}, σ_{2}, …, σ_{r}} the nonzero singular values. Then A^{+} = VΣ^{+}U*, where Σ^{+} is obtained from Σ by inverting the nonzero σ_{i} and transposing.
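numpy's `pinv` computes exactly this Moore–Penrose inverse; the four defining conditions can be checked directly (the rank-deficient matrix is an illustrative example):

```python
import numpy as np

# Verify the four Moore-Penrose conditions for a rank-2, 3x2 matrix.
A = np.array([[1.0, 0.0],
              [0.0, 0.0],
              [0.0, 2.0]])
Ap = np.linalg.pinv(A)

print(np.allclose(A @ Ap @ A, A))                # (i)   A A+ A = A
print(np.allclose(Ap @ A @ Ap, Ap))              # (ii)  A+ A A+ = A+
print(np.allclose((A @ Ap).conj().T, A @ Ap))    # (iii) (A A+)* = A A+
print(np.allclose((Ap @ A).conj().T, Ap @ A))    # (iv)  (A+ A)* = A+ A
```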

22

 A = A* is positive definite (semi-definite), denoted by A > 0 (≥ 0), if x*Ax > 0 (≥ 0) for all x ≠ 0.
 If A ∈ F^{n×n} and A = A* ≥ 0, then ∃ B ∈ F^{n×r} with r ≥ rank(A) such that A = BB*.
 Let B ∈ F^{m×n} and C ∈ F^{k×n}. Suppose m ≥ k and B*B = C*C. Then ∃ U ∈ F^{m×k} such that U*U = I and B = UC.
 Square root of a positive semi-definite matrix A: A^{1/2} = (A^{1/2})* ≥ 0, such that A = A^{1/2}A^{1/2}.
 Clearly, A^{1/2} can be computed by using the spectral decomposition or SVD: let A = UΛU*; then A^{1/2} = UΛ^{1/2}U*, where Λ = diag{λ_{1}, λ_{2}, …, λ_{n}} and Λ^{1/2} = diag{λ_{1}^{1/2}, λ_{2}^{1/2}, …, λ_{n}^{1/2}}.
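The spectral-decomposition recipe translates directly into a few lines of numpy (the matrix is an illustrative positive definite example):

```python
import numpy as np

# A^{1/2} = U Lambda^{1/2} U* from the spectral decomposition of a PSD matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])                     # symmetric, eigenvalues 1 and 3
lam, U = np.linalg.eigh(A)
A_half = U @ np.diag(np.sqrt(lam)) @ U.conj().T

print(np.allclose(A_half @ A_half, A))         # A^{1/2} A^{1/2} = A
print(np.allclose(A_half, A_half.conj().T))    # A^{1/2} is Hermitian
print(np.all(np.linalg.eigvalsh(A_half) >= 0)) # and positive semi-definite
```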

23

 Let A = A* > 0 and B = B* ≥ 0. Then A > B iff ρ(BA^{−1}) < 1.
 Let X = X* ≥ 0 be partitioned as X = [X_{11} X_{12}; X_{12}* X_{22}].
 Then Ker X_{22} ⊂ Ker X_{12}. Consequently, if X_{22}^{+} is the pseudo-inverse of X_{22}, then Y = X_{12}X_{22}^{+} solves
 YX_{22} = X_{12}
 and

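A quick numerical check of the partitioned-matrix result, on an assumed PSD example built as X = BB* (the matrix and partition are illustrative):

```python
import numpy as np

# X = X* >= 0, partitioned with a 2x2 block X11, 2x1 block X12, 1x1 block X22.
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 0.0]])
X = B @ B.T                        # PSD by construction
X11, X12 = X[:2, :2], X[:2, 2:]
X22 = X[2:, 2:]

# Y = X12 X22^+ solves Y X22 = X12 (here X12 = [[1],[0]], X22 = [[1]]).
Y = X12 @ np.linalg.pinv(X22)
print(np.allclose(Y @ X22, X12))   # True
```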