  • 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

    • A system of n linear equations in n variables,

      a_{11} x_1 + a_{12} x_2 + \cdots + a_{1n} x_n = b_1
      a_{21} x_1 + a_{22} x_2 + \cdots + a_{2n} x_n = b_2
      \vdots
      a_{n1} x_1 + a_{n2} x_2 + \cdots + a_{nn} x_n = b_n,

      can be expressed as a matrix equation Ax = b:

      \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{pmatrix}
      \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}
      = \begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{pmatrix}

    • If b = 0, then the system is homogeneous; otherwise it is nonhomogeneous.

  • Nonsingular Case

    • If the coefficient matrix A is nonsingular, then it is invertible and we can solve
      Ax = b as follows:

      Ax = b \;\Rightarrow\; A^{-1}Ax = A^{-1}b \;\Rightarrow\; Ix = A^{-1}b \;\Rightarrow\; x = A^{-1}b

    • This solution is therefore unique. Also, if b = 0, it follows that the unique
      solution of Ax = 0 is x = A^{-1}0 = 0.

    • Thus if A is nonsingular, then the only solution to Ax = 0 is the trivial solution x = 0.
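
    (Numerical note) A minimal NumPy sketch of this fact (not part of the original slides;
    it assumes NumPy is available). In practice one calls np.linalg.solve rather than
    forming A^{-1} explicitly:

      import numpy as np

      A = np.array([[2.0, 1.0],
                    [1.0, 3.0]])            # det(A) = 5 != 0, so A is nonsingular
      b = np.array([3.0, 5.0])

      x = np.linalg.solve(A, b)             # the unique solution of Ax = b
      print(x)                              # [0.8 1.4]
      print(np.linalg.solve(A, np.zeros(2)))  # all zeros: the only solution of Ax = 0 is x = 0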

  • Example 1: Nonsingular Case (1 of 3)

    • From a previous example, we know that the matrix A below is nonsingular, with
      inverse as given:

      A = \begin{pmatrix} 1 & -2 & 3 \\ -1 & 1 & -2 \\ 2 & -1 & -1 \end{pmatrix}, \qquad
      A^{-1} = \begin{pmatrix} -3/4 & -5/4 & 1/4 \\ -5/4 & -7/4 & -1/4 \\ -1/4 & -3/4 & -1/4 \end{pmatrix}

    • Using the definition of matrix multiplication, it follows that the only solution of
      Ax = 0 is x = 0:

      x = A^{-1}0 = \begin{pmatrix} -3/4 & -5/4 & 1/4 \\ -5/4 & -7/4 & -1/4 \\ -1/4 & -3/4 & -1/4 \end{pmatrix}
      \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}
      = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}

  • Example 1: Nonsingular Case (2 of 3)

    • Now let’s solve the nonhomogeneous linear system Ax = b below using A^{-1}:

      x_1 - 2x_2 + 3x_3 = 7
      -x_1 + x_2 - 2x_3 = -5
      2x_1 - x_2 - x_3 = 4

    • This system of equations can be written as Ax = b, where

      A = \begin{pmatrix} 1 & -2 & 3 \\ -1 & 1 & -2 \\ 2 & -1 & -1 \end{pmatrix}, \quad
      x = \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix}, \quad
      b = \begin{pmatrix} 7 \\ -5 \\ 4 \end{pmatrix}

    • Then

      x = A^{-1}b = \begin{pmatrix} -3/4 & -5/4 & 1/4 \\ -5/4 & -7/4 & -1/4 \\ -1/4 & -3/4 & -1/4 \end{pmatrix}
      \begin{pmatrix} 7 \\ -5 \\ 4 \end{pmatrix}
      = \begin{pmatrix} 2 \\ -1 \\ 1 \end{pmatrix}
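
    (Numerical note) The same computation as a NumPy sketch (assumes NumPy):

      import numpy as np

      A = np.array([[1, -2, 3],
                    [-1, 1, -2],
                    [2, -1, -1]], dtype=float)
      b = np.array([7.0, -5.0, 4.0])

      A_inv = np.linalg.inv(A)    # matches the matrix of quarters quoted above
      x = A_inv @ b
      print(x)                    # [ 2. -1.  1.]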

  • Example 1: Nonsingular Case (3 of 3)

    • Alternatively, we could solve the nonhomogeneous linear system Ax = b below using
      row reduction:

      x_1 - 2x_2 + 3x_3 = 7
      -x_1 + x_2 - 2x_3 = -5
      2x_1 - x_2 - x_3 = 4

    • To do so, form the augmented matrix (A|b) and reduce, using elementary row operations:

      \begin{pmatrix} 1 & -2 & 3 & 7 \\ -1 & 1 & -2 & -5 \\ 2 & -1 & -1 & 4 \end{pmatrix}
      \rightarrow \begin{pmatrix} 1 & -2 & 3 & 7 \\ 0 & -1 & 1 & 2 \\ 0 & 3 & -7 & -10 \end{pmatrix}
      \rightarrow \begin{pmatrix} 1 & -2 & 3 & 7 \\ 0 & -1 & 1 & 2 \\ 0 & 0 & -4 & -4 \end{pmatrix}
      \rightarrow \begin{pmatrix} 1 & -2 & 3 & 7 \\ 0 & -1 & 1 & 2 \\ 0 & 0 & 1 & 1 \end{pmatrix}

      Back substitution then gives

      x_1 - 2x_2 + 3x_3 = 7
      -x_2 + x_3 = 2
      x_3 = 1

      so x_3 = 1, x_2 = -1, x_1 = 2, i.e. x = (2, -1, 1)^T, as before.
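
    (Numerical note) A minimal sketch of the same elimination in NumPy (illustrative only:
    no pivoting or error checking, and it assumes NumPy):

      import numpy as np

      # augmented matrix (A|b) from the slide
      M = np.array([[1, -2, 3, 7],
                    [-1, 1, -2, -5],
                    [2, -1, -1, 4]], dtype=float)

      # forward elimination: zero out the entries below each pivot
      for k in range(2):
          for i in range(k + 1, 3):
              M[i] -= (M[i, k] / M[k, k]) * M[k]

      # back substitution on the resulting upper-triangular system
      x = np.zeros(3)
      for i in range(2, -1, -1):
          x[i] = (M[i, 3] - M[i, i+1:3] @ x[i+1:3]) / M[i, i]

      print(x)   # [ 2. -1.  1.]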

  • Singular Case

    • If the coefficient matrix A is singular, then A^{-1} does not exist, and either a
      solution to Ax = b does not exist, or there is more than one solution (not unique).

    • Further, the homogeneous system Ax = 0 has more than one solution. That is, in
      addition to the trivial solution x = 0, there are infinitely many nontrivial solutions.

    • The nonhomogeneous case Ax = b has no solution unless (b, y) = 0 for all vectors y
      satisfying A*y = 0, where A* is the adjoint (conjugate transpose) of A. This follows from

      (b, y) = (Ax, y) = (x, A*y)

    • In this case, Ax = b has solutions (infinitely many), each of the form x = x^{(0)} + ξ,
      where x^{(0)} is a particular solution of Ax = b, and ξ is any solution of Ax = 0.

      (Example) A = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}: for which vectors c does
      Ax = c have a solution?
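
    (Numerical note) The solvability condition (b, y) = 0 can be checked numerically by
    computing a basis for the null space of A* (here simply A^T, since A is real) from the
    SVD. A sketch, assuming NumPy, using the singular matrix of Example 2 later in the deck:

      import numpy as np

      A = np.array([[1, -2, 3],
                    [-1, 1, -2],
                    [2, -1, 3]], dtype=float)   # singular: det(A) = 0

      # rows of Vt belonging to (numerically) zero singular values span the null space of A^T
      U, s, Vt = np.linalg.svd(A.T)
      y = Vt[s < 1e-10]                          # one vector here, proportional to (1, 3, 1)

      b_good = np.array([2.0, 1.0, -5.0])        # satisfies b1 + 3*b2 + b3 = 0
      b_bad = np.array([1.0, 1.0, 1.0])          # does not

      print(y @ b_good)   # ~[0.]  -> Ax = b_good has (infinitely many) solutions
      print(y @ b_bad)    # nonzero -> Ax = b_bad has no solution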

  • Linear Dependence and Independence

    • A set of vectors x^{(1)}, x^{(2)}, ..., x^{(n)} is linearly dependent if there exist
      scalars c_1, c_2, ..., c_n, not all zero, such that

      c_1 x^{(1)} + c_2 x^{(2)} + \cdots + c_n x^{(n)} = 0

    • If the only solution of

      c_1 x^{(1)} + c_2 x^{(2)} + \cdots + c_n x^{(n)} = 0

      is c_1 = c_2 = \cdots = c_n = 0, then x^{(1)}, x^{(2)}, ..., x^{(n)} is linearly
      independent.

  • Example 3: Linear Dependence (1 of 2)

    • Determine whether the following vectors are linearly dependent or linearly independent:

      x^{(1)} = \begin{pmatrix} 1 \\ 2 \\ -1 \end{pmatrix}, \quad
      x^{(2)} = \begin{pmatrix} 2 \\ 1 \\ 3 \end{pmatrix}, \quad
      x^{(3)} = \begin{pmatrix} -4 \\ 1 \\ -11 \end{pmatrix}

    • We need to solve c_1 x^{(1)} + c_2 x^{(2)} + c_3 x^{(3)} = 0, or

      \begin{pmatrix} 1 & 2 & -4 \\ 2 & 1 & 1 \\ -1 & 3 & -11 \end{pmatrix}
      \begin{pmatrix} c_1 \\ c_2 \\ c_3 \end{pmatrix}
      = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}

  • Example 3: Linear Dependence (2 of 2)

    • We can reduce the augmented matrix (A|b), as before:

      \begin{pmatrix} 1 & 2 & -4 & 0 \\ 2 & 1 & 1 & 0 \\ -1 & 3 & -11 & 0 \end{pmatrix}
      \rightarrow \begin{pmatrix} 1 & 2 & -4 & 0 \\ 0 & -3 & 9 & 0 \\ 0 & 5 & -15 & 0 \end{pmatrix}
      \rightarrow \begin{pmatrix} 1 & 2 & -4 & 0 \\ 0 & 1 & -3 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}

      so c_1 + 2c_2 - 4c_3 = 0 and c_2 - 3c_3 = 0, where c_3 can be any number;
      that is, c_2 = 3c_3 and c_1 = -2c_3.

    • So, the vectors are linearly dependent: choosing c_3 = -1, for example, gives

      2x^{(1)} - 3x^{(2)} - x^{(3)} = 0

    • Alternatively, we could show that the following determinant is zero:

      det(x_{ij}) = \begin{vmatrix} 1 & 2 & -4 \\ 2 & 1 & 1 \\ -1 & 3 & -11 \end{vmatrix} = 0

    • (Question) The columns (or rows) of A are linearly independent if and only if A is
      nonsingular?
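
    (Numerical note) A quick NumPy check of this conclusion (a sketch, assuming NumPy):
    the determinant is zero, the rank is 2, and the combination found above vanishes.

      import numpy as np

      x1 = np.array([1, 2, -1], dtype=float)
      x2 = np.array([2, 1, 3], dtype=float)
      x3 = np.array([-4, 1, -11], dtype=float)
      X = np.column_stack([x1, x2, x3])

      print(np.linalg.det(X))           # 0.0 (up to rounding): columns are dependent
      print(np.linalg.matrix_rank(X))   # 2
      print(2*x1 - 3*x2 - x3)           # [0. 0. 0.]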

  • Linear Independence and Invertibility

    • Consider the previous two examples:

      – The first matrix was known to be nonsingular, and its column vectors were linearly
        independent.

      – The second matrix was known to be singular, and its column vectors were linearly
        dependent.

    • This is true in general: the columns (or rows) of A are linearly independent iff A is
      nonsingular iff A^{-1} exists.

    • Also, A is nonsingular iff det A ≠ 0; hence the columns (or rows) of A are linearly
      independent iff det A ≠ 0.

    • Further, if C = AB, then det(C) = det(A)det(B). Thus if the columns (or rows) of A
      and B are linearly independent, then the columns (or rows) of C are also.

  • Linear Dependence & Vector Functions

    • Now consider vector functions x^{(1)}(t), x^{(2)}(t), ..., x^{(n)}(t), where

      x^{(k)}(t) = \begin{pmatrix} x_{1k}(t) \\ x_{2k}(t) \\ \vdots \\ x_{mk}(t) \end{pmatrix}, \quad k = 1, 2, ..., n, \quad t \in I

    • As before, x^{(1)}(t), x^{(2)}(t), ..., x^{(n)}(t) are linearly dependent on I if there
      exist scalars c_1, c_2, ..., c_n, not all zero, such that

      c_1 x^{(1)}(t) + c_2 x^{(2)}(t) + \cdots + c_n x^{(n)}(t) = 0 for all t in I,

      or, in matrix form, X(t)C = 0, where X(t) = (x^{(1)}(t) \; x^{(2)}(t) \; \cdots \; x^{(n)}(t))
      and C = (c_1, c_2, ..., c_n)^T.

    • Otherwise x^{(1)}(t), x^{(2)}(t), ..., x^{(n)}(t) are linearly independent on I.
      See the text for more discussion of this.

  • Eigenvalues and Eigenvectors

    • The equation Ax = y can be viewed as a linear transformation that maps (or transforms)
      x into a new vector y.

    • Nonzero vectors x that transform into multiples of themselves are important in many
      applications.

    • Thus we solve Ax = λx or, equivalently, (A − λI)x = 0.

    • This equation has a nonzero solution if we choose λ such that det(A − λI) = 0.

    • Such values of λ are called eigenvalues of A, and the nonzero solutions x are called
      eigenvectors.

      (Example) Find the eigenvalues of A = \begin{pmatrix} 3 & -1 \\ 4 & -2 \end{pmatrix}.
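
    (Numerical note) For comparison, a NumPy sketch (assumes NumPy); np.linalg.eig returns
    the eigenvalues together with unit-length eigenvectors as columns:

      import numpy as np

      A = np.array([[3.0, -1.0],
                    [4.0, -2.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      print(eigvals)    # [ 2. -1.]  (the order is not guaranteed)
      print(eigvecs)    # columns proportional to (1, 1) and (1, 4)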

  • Example 4: Eigenvalues (1 of 3)

    • Find the eigenvalues and eigenvectors of the matrix

      A = \begin{pmatrix} 3 & -1 \\ 4 & -2 \end{pmatrix}

    • Solution: Choose λ such that det(A − λI) = 0, as follows:

      det(A - \lambda I)
      = det\left[ \begin{pmatrix} 3 & -1 \\ 4 & -2 \end{pmatrix} - \lambda \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} \right]
      = det \begin{pmatrix} 3-\lambda & -1 \\ 4 & -2-\lambda \end{pmatrix}
      = (3-\lambda)(-2-\lambda) + 4
      = \lambda^2 - \lambda - 2
      = (\lambda - 2)(\lambda + 1)

      \Rightarrow \lambda_1 = 2, \; \lambda_2 = -1

  • Example 4: First Eigenvector (2 of 3)

    • To find the eigenvectors of the matrix A, we need to solve (A − λI)x = 0 for λ = 2 and
      λ = −1.

    • Eigenvector for λ = 2: Solve (A − 2I)x = 0, i.e.

      \begin{pmatrix} 3-2 & -1 \\ 4 & -2-2 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}
      \;\Leftrightarrow\;
      \begin{pmatrix} 1 & -1 \\ 4 & -4 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}

      and this implies that x_1 = x_2. So

      x^{(1)} = c \begin{pmatrix} 1 \\ 1 \end{pmatrix}, c arbitrary; choose c = 1:
      x^{(1)} = \begin{pmatrix} 1 \\ 1 \end{pmatrix}

  • Example 4: Second Eigenvector (3 of 3)

    • Eigenvector for λ = −1: Solve (A + I)x = 0, i.e.

      \begin{pmatrix} 3+1 & -1 \\ 4 & -2+1 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}
      \;\Leftrightarrow\;
      \begin{pmatrix} 4 & -1 \\ 4 & -1 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}

      and this implies that x_2 = 4x_1. So

      x^{(2)} = c \begin{pmatrix} 1 \\ 4 \end{pmatrix}, c arbitrary; choose c = 1:
      x^{(2)} = \begin{pmatrix} 1 \\ 4 \end{pmatrix}
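
    (Numerical note) A one-line check of both eigenpairs (a sketch, assuming NumPy):

      import numpy as np

      A = np.array([[3.0, -1.0], [4.0, -2.0]])
      v1, v2 = np.array([1.0, 1.0]), np.array([1.0, 4.0])

      print(np.allclose(A @ v1, 2 * v1))    # True:  A v1 = 2 v1
      print(np.allclose(A @ v2, -1 * v2))   # True:  A v2 = -1 v2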

  • Normalized Eigenvectors

    • From the previous example, we see that eigenvectors are determined up to a nonzero
      multiplicative constant.

    • If this constant is specified in some particular way, then the eigenvector is said to
      be normalized.

    • For example, eigenvectors are sometimes normalized by choosing the constant so that
      ||x|| = (x, x)^{1/2} = 1.

  • Algebraic and Geometric Multiplicity

    • In finding the eigenvalues of an n x n matrix A, we solve det(A − λI) = 0.

    • Since this involves finding the determinant of an n x n matrix, the problem reduces to
      finding the roots of an nth degree polynomial.

    • Denote these roots, or eigenvalues, by λ_1, λ_2, …, λ_n.

    • If an eigenvalue is repeated m times, then its algebraic multiplicity is m.

    • Each eigenvalue has at least one eigenvector, and an eigenvalue of algebraic
      multiplicity m may have q linearly independent eigenvectors, 1 ≤ q ≤ m;
      q is called the geometric multiplicity of the eigenvalue.
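
    (Numerical note) Geometric multiplicity can be estimated numerically as the dimension of
    the null space of A − λI, i.e. n minus the rank of A − λI. A sketch (assumes NumPy),
    using the symmetric matrix of Example 5 below:

      import numpy as np

      A = np.array([[0, 1, 1],
                    [1, 0, 1],
                    [1, 1, 0]], dtype=float)
      n = A.shape[0]

      for lam in (2.0, -1.0):
          geo_mult = n - np.linalg.matrix_rank(A - lam * np.eye(n))
          print(lam, geo_mult)    # 2.0 -> 1,  -1.0 -> 2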

  • Eigenvectors and Linear Independence

    • If an eigenvalue has algebraic multiplicity 1, then it is said to be simple, and its
      geometric multiplicity is also 1.

    • If each eigenvalue of an n x n matrix A is simple, then A has n distinct eigenvalues.
      It can be shown that the n eigenvectors corresponding to these eigenvalues are linearly
      independent.

    • If A has one or more repeated eigenvalues, then there may be fewer than n linearly
      independent eigenvectors, since for each repeated eigenvalue we may have q < m.
      This may lead to complications in solving systems of differential equations.

  • Example 5: Eigenvalues (1 of 5)

    • Find the eigenvalues and eigenvectors of the matrix

      A = \begin{pmatrix} 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{pmatrix}

    • Solution: Choose λ such that det(A − λI) = 0, as follows:

      det(A - \lambda I)
      = det \begin{pmatrix} -\lambda & 1 & 1 \\ 1 & -\lambda & 1 \\ 1 & 1 & -\lambda \end{pmatrix}
      = -\lambda^3 + 3\lambda + 2
      = (2 - \lambda)(1 + \lambda)^2

      \Rightarrow \lambda_1 = 2, \; \lambda_2 = -1, \; \lambda_3 = -1

  • Example 5: First Eigenvector (2 of 5)

    • Eigenvector for λ = 2: Solve (A − λI)x = 0, as follows:

      \begin{pmatrix} -2 & 1 & 1 & 0 \\ 1 & -2 & 1 & 0 \\ 1 & 1 & -2 & 0 \end{pmatrix}
      \rightarrow \begin{pmatrix} 1 & 1 & -2 & 0 \\ 1 & -2 & 1 & 0 \\ -2 & 1 & 1 & 0 \end{pmatrix}
      \rightarrow \begin{pmatrix} 1 & 1 & -2 & 0 \\ 0 & -3 & 3 & 0 \\ 0 & 3 & -3 & 0 \end{pmatrix}
      \rightarrow \begin{pmatrix} 1 & 1 & -2 & 0 \\ 0 & 1 & -1 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}
      \rightarrow \begin{pmatrix} 1 & 0 & -1 & 0 \\ 0 & 1 & -1 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}

      so x_1 - x_3 = 0 and x_2 - x_3 = 0, and therefore

      x = x_3 \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}, x_3 arbitrary; choose
      x^{(1)} = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}

  • Example 5: 2nd and 3rd Eigenvectors (3 of 5)

    • Eigenvector for λ = −1: Solve (A − λI)x = 0, as follows:

      \begin{pmatrix} 1 & 1 & 1 & 0 \\ 1 & 1 & 1 & 0 \\ 1 & 1 & 1 & 0 \end{pmatrix}
      \rightarrow \begin{pmatrix} 1 & 1 & 1 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}

      so x_1 + x_2 + x_3 = 0, i.e. x_1 = -x_2 - x_3, and

      x = x_2 \begin{pmatrix} -1 \\ 1 \\ 0 \end{pmatrix} + x_3 \begin{pmatrix} -1 \\ 0 \\ 1 \end{pmatrix},
      where x_2, x_3 are arbitrary; choose

      x^{(2)} = \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix}, \quad
      x^{(3)} = \begin{pmatrix} 0 \\ 1 \\ -1 \end{pmatrix}

  • Example 5: Eigenvectors of A (4 of 5)

    • Thus three eigenvectors of

      A = \begin{pmatrix} 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{pmatrix}

      are

      x^{(1)} = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}, \quad
      x^{(2)} = \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix}, \quad
      x^{(3)} = \begin{pmatrix} 0 \\ 1 \\ -1 \end{pmatrix},

      where x^{(2)}, x^{(3)} correspond to the double eigenvalue λ = −1.

    • It can be shown that x^{(1)}, x^{(2)}, x^{(3)} are linearly independent.

    • Hence A is a 3 x 3 symmetric matrix (A = A^T) with 3 real eigenvalues and 3 linearly
      independent eigenvectors.
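
    (Numerical note) Since A is real and symmetric, np.linalg.eigh is the natural tool; it
    returns real eigenvalues in ascending order and orthonormal eigenvectors as columns
    (a sketch, assuming NumPy):

      import numpy as np

      A = np.array([[0, 1, 1],
                    [1, 0, 1],
                    [1, 1, 0]], dtype=float)

      eigvals, eigvecs = np.linalg.eigh(A)
      print(eigvals)                                       # [-1. -1.  2.]
      print(np.allclose(eigvecs.T @ eigvecs, np.eye(3)))   # True: columns are orthonormal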

  • Example 5: Eigenvectors of A (5 of 5)

    • Note that we could instead have chosen

      x^{(1)} = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}, \quad
      x^{(2)} = \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix}, \quad
      x^{(3)} = \begin{pmatrix} 1 \\ -2 \\ 1 \end{pmatrix}

    • Then the eigenvectors are orthogonal, since

      (x^{(1)}, x^{(2)}) = 0, \quad (x^{(1)}, x^{(3)}) = 0, \quad (x^{(2)}, x^{(3)}) = 0

    • Thus A is a 3 x 3 symmetric matrix with 3 real eigenvalues and 3 linearly independent
      orthogonal eigenvectors.

  • Hermitian Matrices

    • A self-adjoint, or Hermitian, matrix satisfies A = A*, where we recall that
      A* = \overline{A}^T, the transpose of the complex conjugate of A.

    • Thus, for a Hermitian matrix, a_{ij} = \overline{a_{ji}}.

    • Note that if A has real entries and is symmetric (see the last example), then A is
      Hermitian.

    • An n x n Hermitian matrix A has the following properties:

      – All eigenvalues of A are real.

      – There exists a full set of n linearly independent eigenvectors of A.

      – If x^{(1)} and x^{(2)} are eigenvectors that correspond to different eigenvalues of A,
        then x^{(1)} and x^{(2)} are orthogonal.

      – Corresponding to an eigenvalue of algebraic multiplicity m, it is possible to choose
        m mutually orthogonal eigenvectors, and hence A has a full set of n linearly
        independent orthogonal eigenvectors.
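
    (Numerical note) These properties are easy to observe numerically. A small sketch with a
    complex Hermitian matrix (the 2 x 2 matrix below is just an illustrative choice; assumes
    NumPy):

      import numpy as np

      # a 2 x 2 Hermitian matrix: it equals its own conjugate transpose
      H = np.array([[2.0, 1 - 1j],
                    [1 + 1j, 3.0]])
      assert np.allclose(H, H.conj().T)

      eigvals, eigvecs = np.linalg.eigh(H)    # eigh expects a Hermitian input
      print(eigvals)                          # real eigenvalues: [1. 4.]
      print(np.allclose(eigvecs.conj().T @ eigvecs, np.eye(2)))   # True: orthonormal eigenvectors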

  • Example 2: Singular Case (1 of 2)

    • Solve the nonhomogeneous linear system Ax = b below using row reduction. Observe that
      the coefficients are nearly the same as in Example 1:

      x_1 - 2x_2 + 3x_3 = b_1
      -x_1 + x_2 - 2x_3 = b_2
      2x_1 - x_2 + 3x_3 = b_3

    • We will form the augmented matrix (A|b) and use some of the steps in Example 1 to
      transform the matrix more quickly:

      (A|b) = \begin{pmatrix} 1 & -2 & 3 & b_1 \\ -1 & 1 & -2 & b_2 \\ 2 & -1 & 3 & b_3 \end{pmatrix}
      \rightarrow \begin{pmatrix} 1 & -2 & 3 & b_1 \\ 0 & -1 & 1 & b_1 + b_2 \\ 0 & 3 & -3 & b_3 - 2b_1 \end{pmatrix}
      \rightarrow \begin{pmatrix} 1 & -2 & 3 & b_1 \\ 0 & -1 & 1 & b_1 + b_2 \\ 0 & 0 & 0 & b_1 + 3b_2 + b_3 \end{pmatrix}

      which is equivalent to

      x_1 - 2x_2 + 3x_3 = b_1
      -x_2 + x_3 = b_1 + b_2
      0 = b_1 + 3b_2 + b_3

  • Example 2: Singular Case (2 of 2)

    • From the previous slide, if b_1 + 3b_2 + b_3 ≠ 0, there is no solution to the system
      of equations.

    • Requiring that b_1 + 3b_2 + b_3 = 0, assume, for example, that

      b_1 = 2, \quad b_2 = 1, \quad b_3 = -5

    • Then the reduced augmented matrix (A|b) becomes

      \begin{pmatrix} 1 & -2 & 3 & 2 \\ 0 & -1 & 1 & 3 \\ 0 & 0 & 0 & 0 \end{pmatrix},
      i.e. x_1 - 2x_2 + 3x_3 = 2 and -x_2 + x_3 = 3,

      and solving in terms of x_3 gives

      x = x_3 \begin{pmatrix} -1 \\ 1 \\ 1 \end{pmatrix} + \begin{pmatrix} -4 \\ -3 \\ 0 \end{pmatrix}

    • It can be shown that the second term in x is a solution of the nonhomogeneous equation
      and that the first term is the most general solution of the homogeneous equation
      Ax = 0, letting x_3 = α, where α is arbitrary.
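
    (Numerical note) A NumPy sketch of the same conclusion (assumes NumPy): a least-squares
    solve supplies one particular solution (the minimum-norm one, which differs from the
    slide's (-4, -3, 0) only by a multiple of (-1, 1, 1)), and the SVD supplies the
    homogeneous solution:

      import numpy as np

      A = np.array([[1, -2, 3],
                    [-1, 1, -2],
                    [2, -1, 3]], dtype=float)
      b = np.array([2.0, 1.0, -5.0])            # satisfies b1 + 3*b2 + b3 = 0

      # one particular solution of Ax = b
      x_part, *_ = np.linalg.lstsq(A, b, rcond=None)

      # null space of A: right singular vector(s) with zero singular value
      U, s, Vt = np.linalg.svd(A)
      x_hom = Vt[s < 1e-10][0]                  # proportional to (-1, 1, 1)

      print(np.allclose(A @ x_part, b))                   # True
      print(np.allclose(A @ x_hom, np.zeros(3)))          # True
      print(np.allclose(A @ (x_part + 2.5 * x_hom), b))   # True for any multiple of x_hom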