Fall 2014
Lecture 8
Content: Basic theory of systems of first order linear equations. Homogeneous
linear systems with constant coefficients.
Suggested Problems:
7.4: 1, 5, 8, 9
7.5: 11, 12, 13, 15, 17, 30, 32
Linear Independence
Example: Determine whether the vectors

v1 = (1, 3, 6)^T,   v2 = (3, 4, 5)^T,   v3 = (5, 0, -9)^T

are linearly independent. The equation c1 v1 + c2 v2 + c3 v3 = 0 gives the linear system

c1 + 3c2 + 5c3 = 0
3c1 + 4c2      = 0
6c1 + 5c2 - 9c3 = 0
( 1  3   5 | 0 )                                     ( 1    3    5 | 0 )
( 3  4   0 | 0 )  -3R1 + R2 -> R2, -6R1 + R3 -> R3   ( 0   -5  -15 | 0 )
( 6  5  -9 | 0 )                                     ( 0  -13  -39 | 0 )

                 ( 1    3    5 | 0 )
-R2/5 -> R2      ( 0    1    3 | 0 )
                 ( 0  -13  -39 | 0 )

                 ( 1  3  5 | 0 )
13R2 + R3 -> R3  ( 0  1  3 | 0 )
                 ( 0  0  0 | 0 )
Hence c3 is free. In particular there are solutions of the equation other than c1 =
c2 = c3 = 0. This means that {v1 , v2 , v3 } is linearly dependent.
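Back-substituting in the reduced system (taking c3 = 1 gives c2 = -3 and c1 = 4) produces an explicit dependence relation, which can be checked directly. A quick sketch in Python, using the vectors v1 = (1, 3, 6)^T, v2 = (3, 4, 5)^T, v3 = (5, 0, -9)^T from the example:

```python
# Check the dependence relation 4*v1 - 3*v2 + v3 = 0, obtained by
# back-substitution in the row-reduced system (c3 = 1, c2 = -3, c1 = 4).
v1 = [1, 3, 6]
v2 = [3, 4, 5]
v3 = [5, 0, -9]

combo = [4 * a - 3 * b + c for a, b, c in zip(v1, v2, v3)]
print(combo)  # [0, 0, 0], so {v1, v2, v3} is linearly dependent
```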
We can make a similar definition for a set of functions:
Definition 1.3 We say that a set of functions {f1 (t), . . . , fn (t)} is linearly independent if the equation c1 f1 (t) + . . . + cn fn (t) = 0 for all t implies c1 = c2 = . . . =
cn = 0.
Example: Consider the set {e^t, e^(-t), 1}. Suppose that c1 e^t + c2 e^(-t) + c3 = 0 for all t. Substituting particular values of t gives a linear system:

t = 0:   c1 + c2 + c3 = 0
t = 1:   c1 e + c2/e + c3 = 0
t = -1:  c1/e + c2 e + c3 = 0
Now let us turn this into matrix form and row reduce.
( 1    1    1 | 0 )                                        ( 1        1        1 | 0 )
( e   1/e   1 | 0 )  -eR1 + R2 -> R2, -(1/e)R1 + R3 -> R3  ( 0  1/e - e    1 - e | 0 )
( 1/e  e    1 | 0 )                                        ( 0  e - 1/e  1 - 1/e | 0 )

               ( 1        1            1 | 0 )
R2 + R3 -> R3  ( 0  1/e - e        1 - e | 0 )
               ( 0        0  2 - e - 1/e | 0 )
We can row reduce further but it is clear at this point that we will get a leading
1 in each column. Therefore there are no free variables, and the only solution is
c1 = c2 = c3 = 0. Hence the set is linearly independent.
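The same conclusion can be reached by computing the determinant of the coefficient matrix directly. A small numerical check (the functions being tested are e^t, e^(-t), and 1, which is what the substituted values above correspond to):

```python
import math

e = math.e
# Coefficient matrix from evaluating c1*e^t + c2*e^(-t) + c3 = 0
# at t = 0, t = 1, and t = -1.
M = [[1, 1, 1],
     [e, 1 / e, 1],
     [1 / e, e, 1]]

def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion along row 0."""
    (a, b, c), (d, f, g), (h, i, j) = m
    return a * (f * j - g * i) - b * (d * j - g * h) + c * (d * i - f * h)

# Nonzero determinant => only the trivial solution c1 = c2 = c3 = 0.
print(det3(M))  # about 2.55
```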
There is another elegant way to test a set of functions for independence if the
functions are differentiable enough times. Suppose that f1 , . . . , fn are functions of
t which are differentiable n - 1 times. In order to test them for independence we
start from the equation
c1 f1 (t) + c2 f2 (t) + . . . + cn fn (t) = 0
which holds for all t. Differentiate this equation repeatedly in order to get a linear
system:
c1 f1        + c2 f2        + . . . + cn fn        = 0
c1 f1'       + c2 f2'       + . . . + cn fn'       = 0
                        . . .
c1 f1^(n-1)  + c2 f2^(n-1)  + . . . + cn fn^(n-1)  = 0

In matrix form:

( f1        f2        . . .  fn       ) ( c1  )   ( 0 )
( f1'       f2'       . . .  fn'      ) ( c2  )   ( 0 )
( . . .     . . .            . . .    ) (. . .) = (. . .)
( f1^(n-1)  f2^(n-1)  . . .  fn^(n-1) ) ( cn  )   ( 0 )

The determinant of the coefficient matrix,

                      | f1        f2        . . .  fn       |
W(f1, . . . , fn) =   | f1'       f2'       . . .  fn'      |
                      | . . .                               |
                      | f1^(n-1)  f2^(n-1)  . . .  fn^(n-1) |

is called the Wronskian of f1, . . . , fn. If the Wronskian is nonzero at some t, the system has only the trivial solution c1 = . . . = cn = 0, and the set is linearly independent.
Example: Let r1, r2, r3 be distinct. Then

                                  | e^(r1 t)       e^(r2 t)       e^(r3 t)      |
W(e^(r1 t), e^(r2 t), e^(r3 t)) = | r1 e^(r1 t)    r2 e^(r2 t)    r3 e^(r3 t)   |
                                  | r1^2 e^(r1 t)  r2^2 e^(r2 t)  r3^2 e^(r3 t) |

                                 | 1     1     1    |
   = e^(r1 t) e^(r2 t) e^(r3 t)  | r1    r2    r3   |
                                 | r1^2  r2^2  r3^2 |

after factoring e^(ri t) out of the ith column. Subtracting column 1 from columns 2 and 3:

                       | 1     0            0           |
   = e^((r1+r2+r3) t)  | r1    r2 - r1      r3 - r1     |
                       | r1^2  r2^2 - r1^2  r3^2 - r1^2 |

Factoring (r2 - r1) out of column 2 and (r3 - r1) out of column 3:

                                           | 1     0        0       |
   = e^((r1+r2+r3) t) (r2 - r1)(r3 - r1)   | r1    1        1       |
                                           | r1^2  r2 + r1  r3 + r1 |

Subtracting column 2 from column 3 and factoring (r3 - r2) out of column 3:

                                                    | 1     0        0 |
   = e^((r1+r2+r3) t) (r2 - r1)(r3 - r1)(r3 - r2)   | r1    1        0 |
                                                    | r1^2  r2 + r1  1 |

   = e^((r1+r2+r3) t) (r2 - r1)(r3 - r1)(r3 - r2).
Since r1, r2, r3 are distinct, the Wronskian is never zero. Therefore
{er1 t , er2 t , er3 t } is linearly independent.
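The closed form just derived is easy to sanity-check numerically for a sample choice of distinct exponents (the values of r1, r2, r3, and t below are arbitrary test inputs):

```python
import math

def wronskian(r1, r2, r3, t):
    """3x3 Wronskian of e^(r1 t), e^(r2 t), e^(r3 t) at time t."""
    m = [[math.exp(r * t) for r in (r1, r2, r3)],
         [r * math.exp(r * t) for r in (r1, r2, r3)],
         [r ** 2 * math.exp(r * t) for r in (r1, r2, r3)]]
    (a, b, c), (d, f, g), (h, i, j) = m
    return a * (f * j - g * i) - b * (d * j - g * h) + c * (d * i - f * h)

r1, r2, r3, t = 0.5, 2.0, 3.5, 0.4
lhs = wronskian(r1, r2, r3, t)
# Closed form derived above: e^((r1+r2+r3)t) (r2-r1)(r3-r1)(r3-r2)
rhs = math.exp((r1 + r2 + r3) * t) * (r2 - r1) * (r3 - r1) * (r3 - r2)
print(abs(lhs - rhs) < 1e-9 * abs(rhs))  # True
```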
Exercise: Show that {er1 t , . . . , ern t } is linearly independent if r1 , r2 , . . . , rn are distinct.
Let us first consider the case of homogeneous systems of first order linear ODEs,
namely, systems of the form

dx/dt = A(t)x.    (1)
Theorem 2.1 (Principle of superposition) Suppose that x(1), x(2) are solutions of
(1). Then any linear combination c1 x(1) + c2 x(2) is also a solution, where c1, c2 are
constants.
Proof Put c1 x(1) + c2 x(2) in (1) and see if it works:

d/dt (c1 x(1) + c2 x(2)) = c1 dx(1)/dt + c2 dx(2)/dt
                         = c1 A(t)x(1) + c2 A(t)x(2)
                         = A(t)(c1 x(1) + c2 x(2)),

so c1 x(1) + c2 x(2) satisfies dx/dt = A(t)x.
For each i = 1, . . . , n, consider the initial value problem consisting of (1) together
with the initial condition

x(t0) = (0, . . . , 0, 1, 0, . . . , 0)^T,

where the only 1 in the vector above is at the ith position. By the existence-uniqueness
theorem, this initial value problem has a unique solution x(i).
Theorem 2.2 The set of vector functions {x(1) , x(2) , . . . , x(n) } is linearly independent.
Proof Suppose that c1 x(1) + c2 x(2) + . . . + cn x(n) = 0. This equation is equivalent
to the linear system

( x(1) | x(2) | . . . | x(n) ) (c1, . . . , cn)^T = 0.
Thus, in this situation the natural substitute for the Wronskian is
W(x(1), . . . , x(n)) = det( x(1) | x(2) | . . . | x(n) ). Evaluating the Wronskian at t0, we get

                            | 1  0  . . .  0 |
W(x(1), . . . , x(n))(t0) = | 0  1  . . .  0 |  = 1.
                            |      . . .     |
                            | 0  0  . . .  1 |
Therefore, {x(1) , x(2) , . . . , x(n) } is linearly independent.
Theorem 2.3 Every solution of (1) can be written as a linear combination of
x(1) , x(2) , . . . , x(n) (in a unique way).
Proof Suppose that x is an arbitrary solution of (1). Say x(t0) = (k1, . . . , kn)^T. Then
k1 x(1) + k2 x(2) + . . . + kn x(n) and x have the same value at t0 and they are both
solutions of (1). Therefore, by the existence-uniqueness theorem,
x = k1 x(1) + k2 x(2) + . . . + kn x(n).
The results above say that {x(1) , x(2) , . . . , x(n) } is a basis for the space of solutions.
This basis is not unique. We can use the following results from linear algebra to test
whether or not a given set is a basis.
Theorem 2.4 (1) Any two bases for the same solution space have the same number
of elements. In particular, if A is an n × n matrix, then any basis for the solution
space has n elements.
(2) Any linearly independent set containing n solutions is a basis. (Under these
conditions, every solution is a linear combination of these)
These results imply that for an n × n linear, homogeneous system, it suffices to find
n linearly independent solutions. Then every other solution is a linear combination
of these.
We look for solutions of the form x = v e^(λt), where v is a constant vector. Then

dx/dt = d(v e^(λt))/dt = λ v e^(λt),

and on the other hand dx/dt = Ax = A v e^(λt). So,

A v e^(λt) = λ v e^(λt)
Av = λv.
We start by finding the eigenvalues and eigenvectors of A = ( 2  1 ; 1  2 ):

det(A - λI) = | 2 - λ    1   |
              |   1    2 - λ |
            = (2 - λ)^2 - 1
            = (3 - λ)(1 - λ)
The eigenvalues are λ1 = 3 and λ2 = 1. For λ1 = 3 we row reduce A - 3I:

( -1   1 | 0 )  R1 <-> R2  (  1  -1 | 0 )
(  1  -1 | 0 )             ( -1   1 | 0 )

                ( 1  -1 | 0 )
R1 + R2 -> R2   ( 0   0 | 0 )

The eigenvectors for λ1 = 3 are then v = k (1, 1)^T with k ≠ 0. For this pair, we can
write the solution x(1)(t) = (1, 1)^T e^(3t).
Next, let us look at eigenvectors for λ2 = 1. Row reducing A - I:

( 1  1 | 0 )  -R1 + R2 -> R2  ( 1  1 | 0 )
( 1  1 | 0 )                  ( 0  0 | 0 )

Therefore the eigenvectors for λ2 = 1 are v = k (1, -1)^T. For this pair, we can write
the solution x(2)(t) = (1, -1)^T e^t.
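The two eigenpairs can also be verified directly, as a quick check of Av = λv for the matrix A = ( 2  1 ; 1  2 ) above:

```python
A = [[2, 1], [1, 2]]

def matvec(m, v):
    """Matrix-vector product for small integer matrices."""
    return [sum(m[i][k] * v[k] for k in range(len(v))) for i in range(len(m))]

# Eigenpair for lambda1 = 3: A(1,1)^T = 3(1,1)^T
assert matvec(A, [1, 1]) == [3, 3]
# Eigenpair for lambda2 = 1: A(1,-1)^T = 1(1,-1)^T
assert matvec(A, [1, -1]) == [1, -1]
print("both eigenpairs check out")
```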
1
Is the set {x(1) , x(2) } linearly independent? We can look at their Wronskian
(1)
W (x
(2)
,x
3t
e et
= 2e4t 6= 0
= 3t
e
et
Therefore the set is linearly independent. This implies that all solutions of the system
are

x = c1 ( e^(3t) )  + c2 (  e^t )
       ( e^(3t) )       ( -e^t )

with c1, c2 in R.
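As a final sanity check, one can verify that this general solution satisfies dx/dt = Ax for arbitrary constants. A sketch, where c1 = 1.5, c2 = -0.7, and t = 0.3 are arbitrary test inputs:

```python
import math

A = [[2, 1], [1, 2]]
c1, c2 = 1.5, -0.7  # arbitrary constants
t = 0.3             # arbitrary time

# x(t) = c1*(1,1)^T*e^(3t) + c2*(1,-1)^T*e^t and its derivative,
# differentiated term by term.
x = [c1 * math.exp(3 * t) + c2 * math.exp(t),
     c1 * math.exp(3 * t) - c2 * math.exp(t)]
dx = [3 * c1 * math.exp(3 * t) + c2 * math.exp(t),
      3 * c1 * math.exp(3 * t) - c2 * math.exp(t)]

Ax = [A[0][0] * x[0] + A[0][1] * x[1],
      A[1][0] * x[0] + A[1][1] * x[1]]

print(all(abs(d - a) < 1e-9 for d, a in zip(dx, Ax)))  # True
```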