Singular Value Analysis of Linear-Quadratic Systems

Robert Stengel
Optimal Control and Estimation, MAE 546
Princeton University, 2018

Copyright 2018 by Robert Stengel. All rights reserved. For educational use only.
http://www.princeton.edu/~stengel/MAE546.html
http://www.princeton.edu/~stengel/OptConEst.html

• Multivariable Nyquist Stability Criterion
• Matrix Norms and Singular Value Analysis
• Frequency-domain measures of robustness
• Stability Margins of Multivariable Linear-Quadratic Regulators

Scalar Transfer Function and Return Difference Function

A(s): transfer function
[1 + A(s)]: return difference function

• Block diagram algebra, with a unit-feedback control law:

Δy(s) = A(s)[ΔyC(s) − Δy(s)]
[1 + A(s)]Δy(s) = A(s)ΔyC(s)
Δy(s)/ΔyC(s) = A(s)/[1 + A(s)] = A(s)[1 + A(s)]⁻¹
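These unit-feedback relations are easy to exercise numerically. A minimal sketch with NumPy, where the specific scalar transfer function A(s) is an arbitrary illustrative choice, not one from the lecture:

```python
import numpy as np

# Illustrative open-loop transfer function A(s) = 10 / (s^2 + s)
# (a hypothetical example; any scalar A(s) works the same way)
def A(s):
    return 10.0 / (s**2 + s)

# Unit-feedback closed-loop response: y / y_C = A(s) / [1 + A(s)]
def closed_loop(s):
    return A(s) / (1.0 + A(s))

# Evaluate along s = j*omega
print(abs(closed_loop(1j * 0.1)))    # low frequency, |A| >> 1: output tracks command
print(abs(closed_loop(1j * 100.0)))  # high frequency, |A| << 1: response rolls off
```

Where the loop gain |A(jω)| is large, the closed-loop gain approaches one; where it is small, the closed-loop gain approaches |A(jω)| itself.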
Multivariable Nyquist Stability Criterion

Counter-clockwise encirclements of the origin must equal the number of RHP (unstable) open-loop poles.

|Im + A(s)| = ΔCL(s)/ΔOL(s)

Scalar case: 1 + A(s) ≜ a(s) + jb(s)
Limits of Multivariable Nyquist Stability Criterion

• Multivariable Nyquist Stability Criterion
  – Indicates stability of the nominal system
  – In the |I + A(s)| plane, the Nyquist plot depicts the ratio of closed- to open-loop characteristic polynomials
• However, the determinant is not a good indicator of the "size" of a matrix
  – Little can be said about robustness
  – Therefore, analogies to gain and phase margins are not readily identified
Determinant is Not a Reliable Measure of Matrix "Size"

A1 = [1 0; 0 2];      |A1| = 2
A2 = [1 100; 0 2];    |A2| = 2
A3 = [1 100; 0.02 2]; |A3| = 0

• Qualitatively,
  – A1 and A2 have the same determinant
  – A2 and A3 are about the same size
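The determinants above are quick to verify with NumPy:

```python
import numpy as np

# The three example matrices from the slide
A1 = np.array([[1.0,   0.0], [0.0,  2.0]])
A2 = np.array([[1.0, 100.0], [0.0,  2.0]])
A3 = np.array([[1.0, 100.0], [0.02, 2.0]])

for name, A in [("A1", A1), ("A2", A2), ("A3", A3)]:
    print(name, np.linalg.det(A))
# |A1| = |A2| = 2 even though A2 has a huge off-diagonal entry,
# while |A3| = 0 although A3 differs from A2 only slightly
```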
Matrix Norms and Singular Value Analysis
Vector Norms

Euclidean norm:
||x|| = (xᵀx)^1/2

Weighted Euclidean norm:
||Dx|| = (xᵀDᵀDx)^1/2

For a fixed value of ||x||, ||Dx|| provides a measure of the "size" of D.
Spectral Norm (or Matrix Norm)

A matrix has more than one "size"; the spectral norm, also called the induced Euclidean norm, is one such measure. If D and x are complex:

||x|| = (xᴴx)^1/2
||Dx|| = (xᴴDᴴDx)^1/2

||D|| = max over ||x|| = 1 of ||Dx||, which is real-valued

dim(x) = dim(Dx) = n × 1; dim(D) = n × n

xᴴ ≜ complex-conjugate transpose of x (Hermitian transpose of x)
Spectral Norm (or Matrix Norm)

Spectral norm of D:
||D|| = max over ||x|| = 1 of ||Dx||

DᵀD (or DᴴD) has n eigenvalues. The eigenvalues are all real, as DᵀD is symmetric and DᴴD is Hermitian. The square roots of the eigenvalues are called singular values.
Singular Values of D

σi(D) = √λi(DᵀD), i = 1, …, n

Maximum singular value of D:
σmax(D) ≜ σ̄(D) = ||D|| = max over ||x|| = 1 of ||Dx||

Minimum singular value of D:
σmin(D) ≜ σ̲(D) = 1/||D⁻¹|| = min over ||x|| = 1 of ||Dx||
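A minimal NumPy sketch of these definitions, using a hypothetical 2×2 matrix:

```python
import numpy as np

D = np.array([[1.0, 100.0], [0.0, 2.0]])    # hypothetical example matrix

# Singular values as square roots of the eigenvalues of D^T D
lam = np.linalg.eigvalsh(D.T @ D)           # real, since D^T D is symmetric
sigma_from_eigs = np.sqrt(lam)[::-1]        # sort in descending order

# Direct computation
sigma = np.linalg.svd(D, compute_uv=False)
print(sigma_from_eigs, sigma)               # the two computations agree

# sigma_max(D) = ||D||, and sigma_min(D) = 1 / ||D^-1||
print(sigma[0], np.linalg.norm(D, 2))
print(sigma[-1], 1.0 / np.linalg.norm(np.linalg.inv(D), 2))
```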
Comparison of Determinants and Singular Values

A1 = [1 0; 0 2];      |A1| = 2
A2 = [1 100; 0 2];    |A2| = 2
A3 = [1 100; 0.02 2]; |A3| = 0

• Singular values provide a better portrayal of matrix "size", but ...
• "Size" is multi-dimensional
• Singular values describe magnitude along the axes of a multi-dimensional ellipsoid
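Computing the singular values of the three example matrices with NumPy makes the point concrete:

```python
import numpy as np

A1 = np.array([[1.0,   0.0], [0.0,  2.0]])
A2 = np.array([[1.0, 100.0], [0.0,  2.0]])
A3 = np.array([[1.0, 100.0], [0.02, 2.0]])

for name, A in [("A1", A1), ("A2", A2), ("A3", A3)]:
    print(name, np.linalg.svd(A, compute_uv=False))

# A1 has sigma = [2, 1], while A2 and A3 share sigma_max near 100:
# A2 and A3 really are "about the same size", despite |A2| = 2 and |A3| = 0
```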
At low frequency:
σ̲[Im + A(jω)] > σmin(ω) > 1

At high frequency:
σ̄{[Im + A⁻¹(jω)]⁻¹} = 1/σ̲[Im + A⁻¹(jω)] < σmax(ω)
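These frequency-domain conditions can be exercised numerically. A sketch, assuming a hypothetical integrator-like 2×2 loop transfer matrix A(jω) = K/(jω) (the gain matrix K is an arbitrary illustrative choice):

```python
import numpy as np

# Hypothetical integrator-like 2x2 loop: A(jw) = K / (jw)
K = np.array([[2.0, 0.5], [0.2, 1.0]])

def sigma_min_return_difference(w):
    """Minimum singular value of the return difference matrix I + A(jw)."""
    A = K / (1j * w)
    return np.linalg.svd(np.eye(2) + A, compute_uv=False)[-1]

# Large at low frequency (good command following and disturbance rejection),
# approaching 1 at high frequency as the loop gain rolls off
print(sigma_min_return_difference(0.01))
print(sigma_min_return_difference(100.0))
```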
Desirable "Bode Gain Criterion" Attributes

[Figure: σ̄[A(jω)] and σ̲[A(jω)] plotted against log ω, showing crossover frequencies ωC1 and ωC2, with undesirable regions at low and high frequency.]
Sensitivity and Complementary Sensitivity Matrices of A(s)

Sensitivity matrix:
S(s) ≜ [Im + A(s)]⁻¹

Inverse return difference matrix:
[Im + A⁻¹(s)]

Complementary sensitivity matrix:
T(s) ≜ A(s)[Im + A(s)]⁻¹
Sensitivity and Complementary Sensitivity Matrices of A(s)

S(jω) ≜ [Im + A(jω)]⁻¹
T(jω) ≜ A(jω)[Im + A(jω)]⁻¹

Small S(jω) implies low sensitivity to parameter variations as a function of input frequency.
Small T(jω) implies low response to noise as a function of input frequency.
Sensitivity and Complementary Sensitivity Matrices of A(s)

• But

S(jω) + T(jω) ≜ [Im + A(jω)]⁻¹ + A(jω)[Im + A(jω)]⁻¹
= [Im + A(jω)][Im + A(jω)]⁻¹
= Im

• Therefore, there is a tradeoff between robustness and noise suppression

[Figure: ||T(jω)|| and ||S(jω)|| plotted against log ω.]
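The identity S(jω) + T(jω) = Im is easy to confirm numerically; a minimal sketch at a single frequency, with an arbitrary hypothetical loop transfer matrix:

```python
import numpy as np

# Hypothetical 2x2 loop transfer matrix evaluated at one frequency s = jw
w = 2.0
A = np.array([[2.0, 0.5], [0.2, 1.0]]) / (1j * w)

I = np.eye(2)
S = np.linalg.inv(I + A)       # sensitivity matrix S(jw)
T = A @ np.linalg.inv(I + A)   # complementary sensitivity matrix T(jw)

print(np.allclose(S + T, I))   # True: the two matrices always sum to the identity
```

Because S + T = Im at every frequency, both matrices cannot be made small simultaneously, which is exactly the robustness/noise-suppression tradeoff noted above.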
Next Time: Probability and Statistics
Supplemental Material
Alternative Criteria for Multiplicative Variations in A(s)

• Definitions

ΔOL(s): open-loop characteristic polynomial of the original system
Δ̃OL(s): open-loop characteristic polynomial of the perturbed system
ΔCL(s): stable closed-loop characteristic polynomial of the original system

{Δ̃OL(jω) = 0} implies that ΔOL(jω) = 0 for any ω on ΩR (i.e., the vertical component of the "D contour")

α = σ̲[Im + Ao(jω)] for any ω on ΩR

Lehtomaki, Sandell, and Athans, 1981
Alternative Criteria for Multiplicative Variations in A(s)

The perturbed system remains stable if

σ̄[L⁻¹(jω) − Im] < α = σ̲[Im + Ao(jω)]

and at least one of the following is satisfied:
• α < 1
• Lᴴ(jω) + L(jω) ≥ 0
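A sketch of applying this test at a single frequency, assuming (as in the usual statement of the criterion) a multiplicative perturbation A → A·L; the nominal loop and the perturbation matrix below are hypothetical numbers chosen for illustration:

```python
import numpy as np

# Hypothetical nominal loop A_o(jw) and multiplicative gain perturbation L
w = 1.0
A = np.array([[2.0, 0.0], [0.0, 1.5]]) / (1j * w)
L = np.diag([1.2, 0.9])    # 20% gain increase in one channel, 10% decrease in the other

I = np.eye(2)
lhs = np.linalg.svd(np.linalg.inv(L) - I, compute_uv=False)[0]   # sigma_max[L^-1(jw) - I]
alpha = np.linalg.svd(I + A, compute_uv=False)[-1]               # sigma_min[I + A_o(jw)]

# L is positive diagonal, so the side condition L^H + L >= 0 holds
print(lhs < alpha)   # criterion satisfied at this frequency
```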