1
|
- linear subspaces
- eigenvalues and eigenvectors
- matrix inversion formulas
- invariant subspaces
- vector norms and matrix norms
- singular value decomposition
- generalized inverses
- semidefinite matrices
|
2
|
- linear combination: α1x1 + α2x2 + … + αkxk, with xi ∈ F^n, αi ∈ F
- span{x1, …, xk} := {x = α1x1 + … + αkxk : αi ∈ F}
- x1, …, xk ∈ F^n are linearly dependent if there exist α1, …, αk ∈ F, not all zero, such that α1x1 + … + αkxk = 0; otherwise they are linearly independent.
- {x1, …, xk} ⊂ S is a basis for S if x1, …, xk are linearly independent and S = span{x1, …, xk}.
- {x1, …, xk} in F^n are mutually orthogonal if xi*xj = 0 for all i ≠ j, and orthonormal if xi*xj = δij.
- orthogonal complement of a subspace S ⊂ F^n:
- S⊥ := {y ∈ F^n : y*x = 0 for all x ∈ S}
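These definitions are easy to check numerically; a minimal numpy sketch (the vectors are made up for illustration):

```python
import numpy as np

# Three vectors in F^3 (here F = R); made-up data for illustration.
x1 = np.array([1.0, 0.0, 1.0])
x2 = np.array([0.0, 1.0, 0.0])
x3 = x1 + 2 * x2  # a linear combination, so {x1, x2, x3} is dependent

# x1,...,xk are linearly independent iff the matrix [x1 ... xk] has rank k.
assert np.linalg.matrix_rank(np.column_stack([x1, x2])) == 2      # independent
assert np.linalg.matrix_rank(np.column_stack([x1, x2, x3])) == 2  # dependent

# An orthonormal set {q1,...,qk} satisfies Q*Q = I (i.e., qi*qj = delta_ij).
Q, _ = np.linalg.qr(np.column_stack([x1, x2]))  # orthonormal basis of the span
assert np.allclose(Q.T @ Q, np.eye(2))
```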
|
3
|
- linear transformation A : F^n → F^m
- kernel or null space: Ker A = N(A) := {x ∈ F^n : Ax = 0}
- image or range of A: Im A = R(A) := {y ∈ F^m : y = Ax, x ∈ F^n}
- Let ai, i = 1, 2, …, n, denote the columns of a matrix A ∈ F^{m×n}. Then Im A = span{a1, a2, …, an}.
- The rank of a matrix A is defined by rank(A) = dim(Im A).
- rank(A) = rank(A*).
- A ∈ F^{m×n} has full row rank if m ≤ n and rank(A) = m.
- A has full column rank if n ≤ m and rank(A) = n.
- unitary matrix: U*U = I = UU*.
- Let D ∈ F^{n×k} (k < n) be such that D*D = I. Then there exists a matrix D⊥ ∈ F^{n×(n−k)} such that [D D⊥] is a unitary matrix.
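One way to construct such a D⊥ numerically is a full QR decomposition; a sketch assuming F = R (so * is plain transpose) and made-up dimensions:

```python
import numpy as np

# D in F^{n x k} (k < n) with orthonormal columns: D*D = I.
n, k = 4, 2
D, _ = np.linalg.qr(np.random.default_rng(0).standard_normal((n, k)))
assert np.allclose(D.T @ D, np.eye(k))

# A full QR of D gives an n x n orthogonal Q whose first k columns span Im D;
# the remaining n-k columns are orthogonal to Im D and serve as D_perp.
Q, _ = np.linalg.qr(D, mode="complete")
D_perp = Q[:, k:]

U = np.hstack([D, D_perp])
assert np.allclose(U.T @ U, np.eye(n))  # [D D_perp] is unitary (orthogonal)
```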
|
4
|
- Sylvester equation: AX + XB = C, with A ∈ F^{n×n}, B ∈ F^{m×m}, and C ∈ F^{n×m}, has a unique solution X ∈ F^{n×m} if and only if λi(A) + λj(B) ≠ 0, ∀ i = 1, 2, …, n and j = 1, 2, …, m.
- "Lyapunov equation": the special case B = A*.
- Let A ∈ F^{m×n} and B ∈ F^{n×k}. Then rank(A) + rank(B) − n ≤ rank(AB) ≤ min{rank(A), rank(B)}.
- the trace of A = [aij] ∈ C^{n×n}: Trace(A) := Σi aii
- trace has the following properties:
- Trace(αA) = α Trace(A), ∀ α ∈ C, A ∈ C^{n×n}
- Trace(A + B) = Trace(A) + Trace(B), ∀ A, B ∈ C^{n×n}
- Trace(AB) = Trace(BA), ∀ A ∈ C^{n×m}, B ∈ C^{m×n}
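The Sylvester equation can be solved by vectorization: with column-stacking vec, vec(AX + XB) = (I⊗A + Bᵀ⊗I) vec(X), and the eigenvalue condition above says exactly that this Kronecker matrix is nonsingular. A numpy sketch with made-up A, B, C:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 2
A = rng.standard_normal((n, n))
B = rng.standard_normal((m, m))
C = rng.standard_normal((n, m))

# Unique solvability: lambda_i(A) + lambda_j(B) != 0 for all i, j.
lam_A = np.linalg.eigvals(A)
lam_B = np.linalg.eigvals(B)
assert np.all(np.abs(lam_A[:, None] + lam_B[None, :]) > 1e-9)

# vec(AX + XB) = (I_m kron A + B^T kron I_n) vec(X), column-stacking vec.
K = np.kron(np.eye(m), A) + np.kron(B.T, np.eye(n))
x = np.linalg.solve(K, C.flatten(order="F"))
X = x.reshape((n, m), order="F")

assert np.allclose(A @ X + X @ B, C)  # X solves the Sylvester equation
```

(For production use, a Schur-based solver such as scipy's `solve_sylvester` scales much better than the n·m × n·m Kronecker system.)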
|
5
|
- The eigenvalues and eigenvectors of A ∈ C^{n×n}: λ ∈ C, x ∈ C^n
- Ax = λx: x is a right eigenvector
- y is a left eigenvector: y*A = λy*
- eigenvalues: the roots of det(λI − A)
- spectral radius: ρ(A) := max_i |λi|
- Jordan canonical form: for A ∈ C^{n×n}, ∃ T such that A = TJT^{−1}.
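These notions in numpy, on a small made-up matrix with known eigenvalues:

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])  # made-up example; char. poly (l+1)(l+2)

lam, T = np.linalg.eig(A)  # right eigenvectors: A x = lambda x
for i in range(2):
    assert np.allclose(A @ T[:, i], lam[i] * T[:, i])

# Left eigenvectors y*A = lambda y*: rows of T^{-1} (A is diagonalizable here).
Tinv = np.linalg.inv(T)
for i in range(2):
    assert np.allclose(Tinv[i, :] @ A, lam[i] * Tinv[i, :])

rho = np.max(np.abs(lam))  # spectral radius
assert np.isclose(rho, 2.0)  # eigenvalues are -1 and -2
```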
|
6
|
- The transformation T has the following form:
- where the t_{ij1} are the eigenvectors of A: A t_{ij1} = λi t_{ij1}, and the t_{ijk} ≠ 0 are defined by the following linear equations for k ≥ 2:
- (A − λiI) t_{ijk} = t_{ij(k−1)}
- the t_{ijk} with k ≥ 2 are called the generalized eigenvectors of A.
- A ∈ R^{n×n} with distinct eigenvalues can be diagonalized:
|
7
|
- and has the following spectral decomposition:
- A = Σi λi xi yi*
- where yi ∈ C^n is given by the i-th row of T^{−1}, i.e., yi*xj = δij.
- A ∈ R^{n×n} with a real eigenvalue λ ∈ R has a corresponding real eigenvector x ∈ R^n.
- A Hermitian, i.e., A = A* ⇒ ∃ unitary U such that A = UΛU*, where Λ = diag{λ1, λ2, …, λn} is real.
|
10
|
- A subspace S ⊂ C^n is an A-invariant subspace if Ax ∈ S for every x ∈ S.
- For example, {0}, C^n, Ker A, and Im A are all A-invariant subspaces.
- Let λ and x be an eigenvalue and a corresponding eigenvector of A ∈ C^{n×n}. Then S := span{x} is an A-invariant subspace, since Ax = λx ∈ S.
- In general, let λ1, λ2, …, λk (not necessarily distinct) be eigenvalues of A, and let xi be a set of corresponding eigenvectors and generalized eigenvectors. Then S = span{x1, …, xk} is an invariant subspace, provided that all the lower-rank generalized eigenvectors are included.
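A-invariance of S = span of given columns can be tested numerically: Ax must stay in the span, i.e., appending AS to S must not increase the rank. A sketch with a made-up A containing a Jordan block (the helper `is_invariant` is hypothetical, not from the original):

```python
import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0]])  # made-up: Jordan block for eigenvalue 1, plus eigenvalue 2

def is_invariant(A, S):
    """S: matrix whose columns span the candidate subspace."""
    return np.linalg.matrix_rank(np.hstack([S, A @ S])) == np.linalg.matrix_rank(S)

e1, e2, e3 = np.eye(3)
assert is_invariant(A, np.column_stack([e1]))       # eigenvector: invariant
assert is_invariant(A, np.column_stack([e1, e2]))   # chain including generalized ev
assert not is_invariant(A, np.column_stack([e2]))   # A e2 = e1 + e2 leaves span{e2}
```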
|
11
|
- An A-invariant subspace S ⊂ C^n is called a stable invariant subspace if all the eigenvalues of A constrained to S have negative real parts.
- Stable invariant subspaces are used to compute the stabilizing solutions of algebraic Riccati equations.
- Example: Let A be such that
- with Re λ1 < 0, λ3 < 0, and λ4 > 0. Then it is easy to verify that
- S1 = span{x1}, S12 = span{x1, x2}, S123 = span{x1, x2, x3}, S3 = span{x3}, S13 = span{x1, x3}, S124 = span{x1, x2, x4}, S4 = span{x4}, S14 = span{x1, x4}, S34 = span{x3, x4}
- are all A-invariant subspaces. Moreover, S1, S3, S12, S13, and S123 are stable A-invariant subspaces.
|
12
|
- However, the subspaces S2 = span{x2}, S23 = span{x2, x3}, S24 = span{x2, x4}, and S234 = span{x2, x3, x4} are not A-invariant subspaces, since the lower-rank generalized eigenvector x1 associated with x2 is not in these subspaces.
- To illustrate, consider the subspace S23. It is an A-invariant subspace only if Ax2 ∈ S23. Since Ax2 = λ1x2 + x1, Ax2 ∈ S23 would require that x1 be a linear combination of x2 and x3; but this is impossible, since x1 is independent of x2 and x3.
|
13
|
- Norm: Let X be a vector space. ||·|| is a norm if
- (i) ||x|| ≥ 0 (positivity);
- (ii) ||x|| = 0 if and only if x = 0 (positive definiteness);
- (iii) ||αx|| = |α| ||x|| for any scalar α (homogeneity);
- (iv) ||x + y|| ≤ ||x|| + ||y|| (triangle inequality)
- for any x ∈ X and y ∈ X.
- Let x ∈ C^n. Then we define the vector p-norm of x as ||x||p := (Σi |xi|^p)^{1/p}, 1 ≤ p < ∞.
|
14
|
- In particular, for p = 1, 2, ∞ we have ||x||1 = Σi |xi|, ||x||2 = (Σi |xi|²)^{1/2}, and ||x||∞ = max_i |xi|.
- Induced matrix norm: the matrix norm induced by a vector p-norm is defined as ||A||p := max_{x≠0} ||Ax||p / ||x||p.
- In particular, for p = 1, 2, ∞ the corresponding induced matrix norms can be computed as ||A||1 = max_j Σi |aij| (largest column sum), ||A||2 = σmax(A), and ||A||∞ = max_i Σj |aij| (largest row sum).
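A quick numpy check of these formulas on made-up data:

```python
import numpy as np

x = np.array([3.0, -4.0])
assert np.isclose(np.linalg.norm(x, 1), 7.0)       # sum of |x_i|
assert np.isclose(np.linalg.norm(x, 2), 5.0)       # Euclidean norm
assert np.isclose(np.linalg.norm(x, np.inf), 4.0)  # max |x_i|

A = np.array([[1.0, -2.0],
              [3.0, 4.0]])
# Induced norms: p=1 -> largest column abs sum, p=inf -> largest row abs sum,
# p=2 -> largest singular value.
assert np.isclose(np.linalg.norm(A, 1), 6.0)        # max(1+3, 2+4)
assert np.isclose(np.linalg.norm(A, np.inf), 7.0)   # max(1+2, 3+4)
assert np.isclose(np.linalg.norm(A, 2), np.linalg.svd(A, compute_uv=False)[0])
```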
|
15
|
- Properties of the Euclidean norm: the Euclidean 2-norm has some very nice properties.
- Let x ∈ F^n and y ∈ F^m.
- 1. Suppose n ≥ m. Then ||x|| = ||y|| iff there is a matrix U ∈ F^{n×m} such that x = Uy and U*U = I.
- 2. Suppose n = m. Then |x*y| ≤ ||x|| ||y||. Moreover, the equality holds iff x = αy for some α ∈ F or y = 0.
- 3. ||x|| ≤ ||y|| iff there is a matrix Δ ∈ F^{n×m} with ||Δ|| ≤ 1 such that x = Δy. Furthermore, ||x|| < ||y|| iff ||Δ|| < 1.
- 4. ||Ux|| = ||x|| for any appropriately dimensioned unitary matrix U.
|
16
|
- Properties of matrix norms:
- Frobenius norm: ||A||F := (Trace(A*A))^{1/2}
- Let A and B be any matrices with appropriate dimensions. Then
- 1. ρ(A) ≤ ||A|| (this is also true for the Frobenius norm and any induced matrix norm).
- 2. ||AB|| ≤ ||A|| ||B||. In particular, this gives ||A^{−1}|| ≥ ||A||^{−1} if A is invertible. (This is also true for any induced matrix norm.)
- 3. ||UAV|| = ||A|| and ||UAV||F = ||A||F, for any appropriately dimensioned unitary matrices U and V.
- 4. ||AB||F ≤ ||A|| ||B||F, and ||AB||F ≤ ||B|| ||A||F.
|
17
|
- Let A ∈ F^{m×n}. There exist unitary matrices
- U = [u1, u2, …, um] ∈ F^{m×m}, V = [v1, v2, …, vn] ∈ F^{n×n}
- such that A = UΣV*, where Σ has the block form [Σ1 0; 0 0]
- with Σ1 = diag{σ1, σ2, …, σp} and
- σ1 ≥ σ2 ≥ … ≥ σp ≥ 0, p = min{m, n}.
- σ̄(A) = σmax(A) = σ1(A): the largest singular value of A;
- σ(A) = σmin(A) = σp(A): the smallest singular value of A.
- Note that
- Avi = σiui, A*ui = σivi
- A*Avi = σi²vi, AA*ui = σi²ui
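These relations can be verified directly with numpy's SVD (a sketch on a made-up real matrix, so * is transpose):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 3))  # made-up A in F^{m x n}, m=4, n=3

U, s, Vh = np.linalg.svd(A)      # A = U Sigma V*; s holds sigma_1 >= ... >= sigma_p
V = Vh.T
p = min(A.shape)

for i in range(p):
    assert np.allclose(A @ V[:, i], s[i] * U[:, i])           # A v_i = sigma_i u_i
    assert np.allclose(A.T @ U[:, i], s[i] * V[:, i])         # A* u_i = sigma_i v_i
    assert np.allclose(A.T @ A @ V[:, i], s[i]**2 * V[:, i])  # A*A v_i = sigma_i^2 v_i

assert np.all(np.diff(s) <= 1e-12)  # singular values are nonincreasing
```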
|
18
|
- Singular values are good measures of the "size" of a matrix; singular vectors are good indications of strong/weak input or output directions.
- Geometrically, the singular values of a matrix A are precisely the lengths of the semi-axes of the hyperellipsoid E defined by
- E = {y : y = Ax, x ∈ C^n, ||x|| = 1}.
- Thus v1 is the direction in which ||y|| is largest over all ||x|| = 1, while vn is the direction in which ||y|| is smallest over all ||x|| = 1.
- v1 (vn) is the highest (lowest) gain input direction;
- u1 (um) is the highest (lowest) gain observing direction.
- e.g., A maps the unit disk to an ellipsoid with semi-axes of lengths σ1 and σ2.
|
19
|
- Alternative definitions: σ̄(A) = max_{||x||=1} ||Ax||, and for the smallest singular value σ of a tall matrix, σ(A) = min_{||x||=1} ||Ax||.
- Suppose A and Δ are square matrices. Then
- |σ(A + Δ) − σ(A)| ≤ σ̄(Δ);
- σ(AΔ) ≥ σ(A) σ(Δ);
- σ̄(A^{−1}) = 1/σ(A) if A is invertible.
|
20
|
- Some useful properties
- Let A ∈ F^{m×n} and σ1 ≥ σ2 ≥ … ≥ σr > σr+1 = … = 0, r ≤ min{m, n}. Then
- 1. rank(A) = r;
- 2. Ker A = span{vr+1, vr+2, …, vn} and (Ker A)⊥ = span{v1, v2, …, vr};
- 3. Im A = span{u1, u2, …, ur} and (Im A)⊥ = span{ur+1, ur+2, …, um};
- 4. A ∈ F^{m×n} has a dyadic expansion: A = Σ_{i=1}^{r} σi ui vi* = Ur Σr Vr*, where Ur = [u1, u2, …, ur], Vr = [v1, v2, …, vr], and Σr = diag{σ1, σ2, …, σr};
- 5. ||A||F² = σ1² + σ2² + … + σr²;
- 6. ||A|| = σ1;
- 7. σi(U0AV0) = σi(A), i = 1, 2, …, p, for any appropriately dimensioned unitary matrices U0 and V0.
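Several of these properties, checked on a made-up rank-2 matrix:

```python
import numpy as np

# Build a rank-2 matrix A in R^{4x3} as a product of full-rank factors.
rng = np.random.default_rng(3)
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 3))

U, s, Vh = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))
assert r == 2                               # property 1: rank(A) = r

# Property 2: the last n-r right singular vectors span Ker A.
assert np.allclose(A @ Vh[r:, :].T, 0)

# Property 4: dyadic expansion A = sum_i sigma_i u_i v_i*.
A_dyadic = sum(s[i] * np.outer(U[:, i], Vh[i, :]) for i in range(r))
assert np.allclose(A, A_dyadic)

# Properties 5 and 6: ||A||_F^2 = sum sigma_i^2 and ||A|| = sigma_1.
assert np.isclose(np.linalg.norm(A, "fro") ** 2, np.sum(s**2))
assert np.isclose(np.linalg.norm(A, 2), s[0])
```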
|
21
|
- Let A ∈ F^{m×n}. X ∈ F^{n×m} is a right inverse if AX = I; when A has full row rank, one right inverse is given by X = A*(AA*)^{−1}. Similarly, if YA = I, then Y is a left inverse of A.
- Pseudo-inverse or Moore–Penrose inverse A+:
- (i) AA+A = A; (ii) A+AA+ = A+; (iii) (AA+)* = AA+; (iv) (A+A)* = A+A.
- The pseudo-inverse is unique.
- Let A = BC, where B has full column rank and C has full row rank. Then A+ = C*(CC*)^{−1}(B*B)^{−1}B*.
- Alternatively, let A = UΣV* with Σ = [Σr 0; 0 0] and Σr = diag{σ1, σ2, …, σr}.
- Then A+ = VΣ+U* with Σ+ = [Σr^{−1} 0; 0 0].
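The full-rank-factorization formula can be checked against numpy's built-in `pinv` (a sketch on a made-up real factorization, so * is transpose):

```python
import numpy as np

rng = np.random.default_rng(4)
B = rng.standard_normal((4, 2))   # full column rank (generically)
C = rng.standard_normal((2, 3))   # full row rank (generically)
A = B @ C

# Pseudo-inverse via the full-rank factorization A = BC.
A_plus = C.T @ np.linalg.inv(C @ C.T) @ np.linalg.inv(B.T @ B) @ B.T
assert np.allclose(A_plus, np.linalg.pinv(A))

# The four Moore-Penrose conditions:
assert np.allclose(A @ A_plus @ A, A)
assert np.allclose(A_plus @ A @ A_plus, A_plus)
assert np.allclose((A @ A_plus).T, A @ A_plus)
assert np.allclose((A_plus @ A).T, A_plus @ A)
```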
|
22
|
- A = A* is positive definite (semidefinite), denoted by A > 0 (A ≥ 0), if x*Ax > 0 (≥ 0) for all x ≠ 0.
- If A ∈ F^{n×n} and A = A* ≥ 0, then ∃ B ∈ F^{n×r} with r ≥ rank(A) such that A = BB*.
- Let B ∈ F^{m×n} and C ∈ F^{k×n}. Suppose m ≥ k and B*B = C*C. Then ∃ U ∈ F^{m×k} such that U*U = I and B = UC.
- Square root of a positive semidefinite matrix A: the matrix A^{1/2} = (A^{1/2})* ≥ 0 such that A = A^{1/2}A^{1/2}.
- Clearly, A^{1/2} can be computed using the spectral decomposition or the SVD: let A = UΛU*; then A^{1/2} = UΛ^{1/2}U*, where Λ = diag{λ1, λ2, …, λn} and Λ^{1/2} = diag{λ1^{1/2}, λ2^{1/2}, …, λn^{1/2}}.
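The spectral-decomposition recipe for A^{1/2} in numpy, on a made-up positive semidefinite A:

```python
import numpy as np

# A made-up positive semidefinite matrix A = M M*.
rng = np.random.default_rng(5)
M = rng.standard_normal((3, 3))
A = M @ M.T

# Spectral decomposition A = U Lambda U* (eigh: Hermitian eigensolver),
# then A^{1/2} = U Lambda^{1/2} U*.
lam, U = np.linalg.eigh(A)
A_half = U @ np.diag(np.sqrt(np.clip(lam, 0, None))) @ U.T

assert np.allclose(A_half, A_half.T)                  # A^{1/2} is Hermitian
assert np.all(np.linalg.eigvalsh(A_half) >= -1e-10)   # and >= 0
assert np.allclose(A_half @ A_half, A)                # A^{1/2} A^{1/2} = A
```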
|
23
|
- Let A = A* > 0 and B = B* ≥ 0. Then A > B iff ρ(BA^{−1}) < 1.
- Let X = X* ≥ 0 be partitioned as X = [X11 X12; X12* X22].
- Then Ker X22 ⊂ Ker X12. Consequently, if X22+ is the pseudo-inverse of X22, then Y = X12X22+ solves
- YX22 = X12
- and the (generalized) Schur complement satisfies X11 − X12X22+X12* ≥ 0.