JORDAN DECOMPOSITION VIA THE Z-TRANSFORM
ASIA AGE, THOMAS CREDEUR, TRAVIS DIRLE, AND LAFANIQUE REED
Abstract. Since every matrix has a Jordan canonical form, so does any power of a matrix. Using the Z-transform, this article demonstrates the unique form of such a matrix power and the properties that pertain to it.
1. Introduction
In 1986, Schmidt wrote a paper [S86] on canonical forms of complex-valued matrices. In his paper, he gave a proof of the Jordan decomposition of a matrix by using the Laplace transform. This method of proof involves only elementary calculus and an in-depth study of the matrix equation $e^{A(s+t)} = e^{As}e^{At}$, as opposed to the more advanced machinery required by other known proofs. In this paper, we prove a similar result, written below and proven in Theorem 5.3, using only the Z-transform and a careful study of the matrix equation $A^{k+\ell} = A^{\ell}A^{k}$.
Theorem 1.1. Let $A$ be an $n \times n$ matrix with characteristic polynomial $c_A(z) = (z-\lambda_1)^{m_1} \cdots (z-\lambda_r)^{m_r}$. Then, there exists a unique decomposition
$$A^k = \sum_{i=1}^{r}\left( \lambda_i^k P_i + \sum_{q=1}^{m_i-1} N_i^q\,\varphi_{q,\lambda_i}(k) \right)$$
with the following properties:
(1) $P_iN_i = N_iP_i = N_i$
(2) $P_iP_j = 0$ if $i \neq j$, and $P_iP_i = P_i$
(3) $N_i^{m_i} = 0$
(4) $P_iN_j = N_jP_i = 0$ if $i \neq j$
(5) $A = \sum_{i=1}^{r} (\lambda_i P_i + N_i)$
(6) $I = \sum_{i=1}^{r} P_i$.
Properties (2), (3), and (5) together state that any matrix A can be written in
a simple way as a combination of projections and nilpotents. This leads us to the
following definitions and example of the idea.
Definition 1.2. A projection is any square matrix $P$ such that $P^2 = P$.
Definition 1.3. A nilpotent matrix is a nonzero square matrix $N$ such that $N^k = 0$ for some positive integer $k$.
Example 1.4. Consider the matrix
$$A = \begin{bmatrix} 1 & 4 \\ -1 & -3 \end{bmatrix}.$$
Notice that we have the following decomposition:
$$A = (-1)\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} + \begin{bmatrix} 2 & 4 \\ -1 & -2 \end{bmatrix},$$
where the first summand is $-1$ times a projection and the second summand is nilpotent.
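This decomposition is easy to check numerically. The short NumPy sketch below (an illustration only; the matrix entries and signs are the ones restored in the example above) verifies that the second summand is nilpotent and that the two pieces add back to $A$:

```python
import numpy as np

# Sketch: verify the decomposition of Example 1.4 numerically.
A = np.array([[1, 4], [-1, -3]])
P = np.eye(2, dtype=int)       # the projection (here the identity)
N = A + np.eye(2, dtype=int)   # the nilpotent part, N = A - (-1)I

assert np.array_equal(A, -1 * P + N)             # A = (-1)P + N
assert np.array_equal(N @ N, np.zeros((2, 2)))   # N^2 = 0
print("Example 1.4 decomposition checks out")
```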
Let us build some of the needed prerequisites.
2. The Z-Transform
The Z-transform is a useful operator that often comes up as a method to solve difference equations, but we will only use some of its basic properties. Here is its definition.
Definition 2.1. Let $y(k)$ be a sequence of complex numbers. We define the Z-transform of $y$ to be the function $\mathcal{Z}\{y\}(z)$, where $z$ is a complex variable, by the following formula:
$$\mathcal{Z}\{y\}(z) = \sum_{k=0}^{\infty} \frac{y(k)}{z^k}.$$
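As a small numerical illustration of Definition 2.1 (a sketch; the sequence $y(k) = 2^k$ and the evaluation point $z = 5$ are arbitrary choices, not from the text), the series can be approximated by truncation:

```python
# A minimal sketch of Definition 2.1: approximate Z{y}(z) by truncating the series.
def ztransform(y, z, terms=200):
    """Truncated Z-transform: sum_{k=0}^{terms-1} y(k) / z**k."""
    return sum(y(k) / z**k for k in range(terms))

# For y(k) = 2^k and |z| > 2 the series converges; the closed form is z/(z - 2)
# (this is property (1) in Section 3 below).
z0 = 5.0
print(ztransform(lambda k: 2.0**k, z0))   # ~1.666666...
print(z0 / (z0 - 2.0))                    # 1.666666...
```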
There are certain functions to which we are especially interested in applying the Z-transform. These $\varphi$ functions will come up in all that follows. Before we define them, we need to understand the falling factorial.
Definition 2.2. Let $n$ be a nonnegative integer. The falling factorial is the sequence $k^{\underline{n}}$, with $k = 0, 1, 2, \dots$, given by the following formula:
$$k^{\underline{n}} = k(k-1)(k-2)\cdots(k-n+1).$$
If $k$ were allowed to be a real variable, then $k^{\underline{n}}$ could be characterized as the unique monic polynomial of degree $n$ that vanishes at $0, 1, \dots, n-1$.
Remark 2.3. We can recover the familiar factorial from the falling factorial. Notice that $k^{\underline{n}}\big|_{k=n} = n!$.
Let us take a look at an example to illustrate how falling factorials work.
Example 2.4. We will compute $6^{\underline{4}}$. Using the definition above, we see that
$$6^{\underline{4}} = 6(6-1)(6-2)(6-3) = 6 \cdot 5 \cdot 4 \cdot 3 = 360.$$
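Falling factorials are easy to compute directly. The following Python sketch (an illustration, not part of the paper) reproduces Example 2.4 and also checks Remark 2.3 and the vanishing property mentioned in Definition 2.2:

```python
# Sketch: the falling factorial k^(n) = k(k-1)...(k-n+1) of Definition 2.2.
def falling(k, n):
    """Return the falling factorial; the empty product (n = 0) is 1."""
    out = 1
    for i in range(n):
        out *= k - i
    return out

print(falling(6, 4))                       # 360, as in Example 2.4
print(falling(4, 4))                       # 24 = 4!, illustrating Remark 2.3
print([falling(k, 3) for k in range(3)])   # [0, 0, 0]: vanishes at k = 0, 1, 2
```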
Now that we have defined the falling factorial, we can give the definition of the $\varphi$ functions.
Definition 2.5. Let $a$ be a complex number and $n$ be a nonnegative integer. The $\varphi$ functions are the sequences defined by
$$\varphi_{n,a}(k) = \begin{cases} \dfrac{a^{k-n}\,k^{\underline{n}}}{n!} & a \neq 0, \\[4pt] \delta_n(k) & a = 0, \end{cases}$$
where $\delta_n(k)$ is the sequence which is $0$ for all $k \neq n$ and $\delta_n(n) = 1$.
Since these functions are so important to everything that follows, we will compute
a few examples of them below.
Example 2.6.
$$\varphi_{0,2}(k) = 2^k = (1, 2, 4, 8, 16, \dots)$$
$$\varphi_{1,2}(k) = 2^{k-1}k = (0, 1, 4, 12, 32, \dots)$$
$$\varphi_{2,0}(k) = \delta_2(k) = (0, 0, 1, 0, 0, 0, 0, \dots)$$
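These sequences are also easy to generate by machine. The short Python sketch below reproduces the three sequences of Example 2.6 (the falling-factorial helper from the previous sketch is repeated so the snippet is self-contained):

```python
from math import factorial

def falling(k, n):
    out = 1
    for i in range(n):
        out *= k - i
    return out

def phi(n, a, k):
    """phi_{n,a}(k) as in Definition 2.5 (float output when a != 0)."""
    if a == 0:
        return 1 if k == n else 0
    return a**(k - n) * falling(k, n) / factorial(n)

print([phi(0, 2, k) for k in range(5)])   # [1.0, 2.0, 4.0, 8.0, 16.0]
print([phi(1, 2, k) for k in range(5)])   # [0.0, 1.0, 4.0, 12.0, 32.0]
print([phi(2, 0, k) for k in range(7)])   # [0, 0, 1, 0, 0, 0, 0]
```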
It turns out that the Z-transform of the $\varphi$ functions is particularly useful for our purposes. This result is written below (see Proposition 5 of [T12] for the proof).
Proposition 2.7. Let $a \in \mathbb{C}$ and $n \in \mathbb{N}$. Then
$$\mathcal{Z}\{\varphi_{n,a}(k)\}(z) = \frac{z}{(z-a)^{n+1}}.$$
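Proposition 2.7 can likewise be spot-checked numerically with a truncated series. In the sketch below, the pairs $(n, a)$ and the evaluation points $z$ are arbitrary sample choices:

```python
from math import factorial

def falling(k, n):
    out = 1
    for i in range(n):
        out *= k - i
    return out

def phi(n, a, k):
    if a == 0:
        return 1 if k == n else 0
    return a**(k - n) * falling(k, n) / factorial(n)

def ztransform(y, z, terms=300):
    return sum(y(k) / z**k for k in range(terms))

# Check Z{phi_{n,a}}(z) = z / (z - a)^(n+1) at sample points with |z| > |a|.
for n, a, z in [(1, 2, 5.0), (2, -1, 3.0)]:
    lhs = ztransform(lambda k: phi(n, a, k), z)
    rhs = z / (z - a)**(n + 1)
    print(n, a, round(lhs - rhs, 10))      # ~0 in both cases
```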
3. Properties of the Z-Transform
Suppose $a$ is a non-zero complex number, $n \in \mathbb{N} = \{0, 1, 2, \dots\}$, and $y(k)$ is a sequence for which the Z-transform $Y(z) = \mathcal{Z}\{y\}(z)$ exists. Then
(1) $\mathcal{Z}\{a^k\}(z) = \dfrac{z}{z-a}$
(2) $\mathcal{Z}\{a^k y(k)\}(z) = Y\!\left(\dfrac{z}{a}\right)$
(3) $\mathcal{Z}\{y(k+n)\}(z) = z^n Y(z) - \displaystyle\sum_{m=0}^{n-1} y(m)\,z^{n-m}$ (translation principle)
(4) $\mathcal{Z}\{(k+n-1)^{\underline{n}}\, y(k)\}(z) = (-1)^n z^n D^n Y(z)$
(5) $\mathcal{Z}\{k^{\underline{n}}\}(z) = \dfrac{n!\,z}{(z-1)^{n+1}}$
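These properties are easy to spot-check numerically with truncated series. In the sketch below, the sequence $y(k) = 2^k$, the shift $n = 2$ in property (3), the degree $n = 3$ in property (5), and the sample point $z = 5$ are all arbitrary choices:

```python
from math import factorial

def falling(k, n):
    out = 1
    for i in range(n):
        out *= k - i
    return out

def ztransform(y, z, terms=300):
    return sum(y(k) / z**k for k in range(terms))

def y(k):
    return 2.0**k

z, n = 5.0, 2
Y = ztransform(y, z)

# Property (3), the translation principle.
lhs3 = ztransform(lambda k: y(k + n), z)
rhs3 = z**n * Y - sum(y(m) * z**(n - m) for m in range(n))
print(round(lhs3 - rhs3, 8))               # ~0

# Property (5): Z{k^(n)}(z) = n! z / (z - 1)^(n+1), here with n = 3.
n = 3
lhs5 = ztransform(lambda k: falling(k, n), z)
print(round(lhs5 - factorial(n) * z / (z - 1)**(n + 1), 8))   # ~0
```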
Proposition 3.1. Let $A$ be an $n \times n$ matrix with complex entries. Let $c_A(z) = \det(zI - A)$ be the characteristic polynomial. Assume $a_1, \dots, a_R$ are its distinct roots with corresponding multiplicities $M_1, \dots, M_R$. Then for each $r$, $1 \leq r \leq R$, and $m$, $0 \leq m \leq M_r - 1$, there are $n \times n$ matrices $B_{r,m}$ such that
$$A^k = \sum_{r=1}^{R}\sum_{m=0}^{M_r-1} B_{r,m}\,\varphi_{m,a_r}(k).$$
Proof. Our assumptions imply that we can factor $c_A$ in the following way:
$$c_A(z) = \prod_{r=1}^{R}(z - a_r)^{M_r}.$$
The $(i,j)$ entry of $(zI - A)^{-1}$ is of the form $\frac{p_{i,j}(z)}{c_A(z)}$, where $p_{i,j}(z)$ is some polynomial with degree less than $n$. Using partial fractions, we can write
$$\frac{p_{i,j}(z)}{c_A(z)} = \sum_{r=1}^{R}\sum_{m=0}^{M_r-1} \frac{b_{r,m}(i,j)}{(z - a_r)^{m+1}},$$
where $b_{r,m}(i,j) \in \mathbb{C}$. It follows then that
$$z(zI - A)^{-1} = \sum_{r=1}^{R}\sum_{m=0}^{M_r-1} \frac{z\,B_{r,m}}{(z - a_r)^{m+1}},$$
where $B_{r,m}$ is the $n \times n$ matrix whose $(i,j)$ entry is $b_{r,m}(i,j)$ for each pair $(r,m)$. By Corollary 8 and Proposition 5 of [T12], we get
$$A^k = \sum_{r=1}^{R}\sum_{m=0}^{M_r-1} B_{r,m}\,\varphi_{m,a_r}(k). \qquad \square$$
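The partial-fraction construction in this proof can be carried out symbolically. The SymPy sketch below uses the $2 \times 2$ matrix of Example 4.1 below as a test case (an arbitrary choice); since both roots are simple, each coefficient matrix $B_{r,0}$ is just the entrywise residue of the resolvent at the corresponding root:

```python
import sympy as sp

z = sp.symbols('z')
A = sp.Matrix([[3, -2], [1, 0]])
W = (z * sp.eye(2) - A).inv()      # entries are p_ij(z)/c_A(z), as in the proof

# For a simple root a_r of c_A, the matrix B_{r,0} consists of the residues of W at a_r.
B10 = W.applyfunc(lambda e: sp.residue(e, z, 2))   # coefficient of phi_{0,2}
B20 = W.applyfunc(lambda e: sp.residue(e, z, 1))   # coefficient of phi_{0,1}
print(B10, B20)

# Proposition 3.1 then gives A^k = B10 * 2^k + B20 * 1^k; check for small k.
for k in range(6):
    assert A**k == B10 * 2**k + B20
```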
4. Examples
Example 4.1. Let
$$A = \begin{bmatrix} 3 & -2 \\ 1 & 0 \end{bmatrix},$$
where
$$A^k = \sum_{r=1}^{R}\sum_{m=0}^{M_r-1} B_{r,m}\,\varphi_{m,a_r}(k)$$
and
$$\varphi_{m,a_r}(k) = \frac{a_r^{\,k-m}\,k^{\underline{m}}}{m!}.$$
Note: $k^{\underline{m}}$ is the falling factorial, $k^{\underline{m}} = k(k-1)(k-2)\cdots(k-m+1)$. We must first find the characteristic polynomial $c_A(z) = \det(zI - A)$:
$$c_A(z) = \det\begin{bmatrix} z-3 & 2 \\ -1 & z \end{bmatrix} = z^2 - 3z + 2 = (z-2)(z-1).$$
From the characteristic polynomial, we can find the eigenvalues $a_r$ and their multiplicities $M_r$. In this example there are two eigenvalues, $a_1 = 2$ and $a_2 = 1$, with multiplicities $M_1 = 1$ and $M_2 = 1$, so the $A^k$ equation becomes
$$A^k = \sum_{r=1}^{2}\sum_{m=0}^{0} B_{r,m}\,\varphi_{m,a_r}(k) = B_{1,0}\,\varphi_{0,2}(k) + B_{2,0}\,\varphi_{0,1}(k).$$
Let us define $M = B_{1,0}$ and $N = B_{2,0}$ for simplicity, so the equation now becomes
$$A^k = M\varphi_{0,2}(k) + N\varphi_{0,1}(k).$$
Similarly,
$$A^{\ell} = \sum_{r=1}^{2}\sum_{m=0}^{0} B_{r,m}\,\varphi_{m,a_r}(\ell) = B_{1,0}\,\varphi_{0,2}(\ell) + B_{2,0}\,\varphi_{0,1}(\ell) = M\varphi_{0,2}(\ell) + N\varphi_{0,1}(\ell)$$
and
$$A^{k+\ell} = M\varphi_{0,2}(k+\ell) + N\varphi_{0,1}(k+\ell).$$
Now we can substitute these into the equation $A^k A^{\ell} = A^{k+\ell}$ to get
$$M\varphi_{0,2}(k+\ell) + N\varphi_{0,1}(k+\ell) = \big[M\varphi_{0,2}(k) + N\varphi_{0,1}(k)\big]\big[M\varphi_{0,2}(\ell) + N\varphi_{0,1}(\ell)\big].$$
After expanding the right side and using $\varphi_{0,2}(k+\ell) = \varphi_{0,2}(k)\varphi_{0,2}(\ell)$ and $\varphi_{0,1}(k+\ell) = \varphi_{0,1}(k)\varphi_{0,1}(\ell)$, the equation becomes
$$\big(M\varphi_{0,2}(k)\big)\varphi_{0,2}(\ell) + \big(N\varphi_{0,1}(k)\big)\varphi_{0,1}(\ell) = \big(M^2\varphi_{0,2}(k) + NM\varphi_{0,1}(k)\big)\varphi_{0,2}(\ell) + \big(MN\varphi_{0,2}(k) + N^2\varphi_{0,1}(k)\big)\varphi_{0,1}(\ell).$$
Since the Z-transform is one-to-one and linear, the $\varphi$ functions are linearly independent, so matching the coefficients of $\varphi_{0,2}(\ell)$ and $\varphi_{0,1}(\ell)$ gives
$$M\varphi_{0,2}(k) = M^2\varphi_{0,2}(k) + NM\varphi_{0,1}(k),$$
$$N\varphi_{0,1}(k) = MN\varphi_{0,2}(k) + N^2\varphi_{0,1}(k).$$
Applying linear independence a second time, now in $k$, we conclude
$$M^2 = M, \qquad N^2 = N, \qquad NM = 0, \qquad MN = 0,$$
so $M$ and $N$ are projections.
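The matrices $M$ and $N$ of Example 4.1 can be written down explicitly and the relations above checked by machine. In the sketch below they are obtained by evaluating $A^k = M\,2^k + N$ at $k = 0$ and $k = 1$; this is a convenient shortcut for the check rather than the partial-fraction route of Proposition 3.1:

```python
import numpy as np

A = np.array([[3, -2], [1, 0]])
I = np.eye(2, dtype=int)

# From A^0 = M + N and A^1 = 2M + N (since phi_{0,2}(k) = 2^k, phi_{0,1}(k) = 1):
M = A - I
N = 2 * I - A

assert np.array_equal(M @ M, M) and np.array_equal(N @ N, N)     # projections
assert np.array_equal(M @ N, 0 * I) and np.array_equal(N @ M, 0 * I)
for k in range(8):                                               # A^k = M 2^k + N
    assert np.array_equal(np.linalg.matrix_power(A, k), M * 2**k + N)
print("Example 4.1 verified")
```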
Example 4.2. Let
$$A = \begin{bmatrix} 1 & 4 \\ -1 & -3 \end{bmatrix},$$
where
$$A^k = \sum_{r=1}^{R}\sum_{m=0}^{M_r-1} B_{r,m}\,\varphi_{m,a_r}(k)$$
and
$$\varphi_{m,a_r}(k) = \frac{a_r^{\,k-m}\,k^{\underline{m}}}{m!}.$$
Note: $k^{\underline{m}}$ is the falling factorial, $k^{\underline{m}} = k(k-1)(k-2)\cdots(k-m+1)$.
We must first find the characteristic polynomial $c_A(z) = \det(zI - A)$:
$$c_A(z) = \det\begin{bmatrix} z-1 & -4 \\ 1 & z+3 \end{bmatrix} = (z-1)(z+3) - (-4) = z^2 + 2z + 1 = (z+1)^2.$$
From the characteristic polynomial, we can find the eigenvalues $a_r$ and their multiplicities $M_r$. In this example there is only one eigenvalue, $a_1 = -1$, and its multiplicity is $M_1 = 2$, so the $A^k$ equation becomes
$$A^k = \sum_{r=1}^{1}\sum_{m=0}^{1} B_{r,m}\,\varphi_{m,a_r}(k) = B_{1,0}\,\varphi_{0,-1}(k) + B_{1,1}\,\varphi_{1,-1}(k).$$
Let us define $M = B_{1,0}$ and $N = B_{1,1}$ for simplicity, so the equation then becomes
$$A^k = M\varphi_{0,-1}(k) + N\varphi_{1,-1}(k).$$
Similarly,
$$A^{\ell} = \sum_{r=1}^{1}\sum_{m=0}^{1} B_{r,m}\,\varphi_{m,a_r}(\ell) = B_{1,0}\,\varphi_{0,-1}(\ell) + B_{1,1}\,\varphi_{1,-1}(\ell) = M\varphi_{0,-1}(\ell) + N\varphi_{1,-1}(\ell)$$
and
$$A^{k+\ell} = B_{1,0}\,\varphi_{0,-1}(k+\ell) + B_{1,1}\,\varphi_{1,-1}(k+\ell) = M\varphi_{0,-1}(k+\ell) + N\varphi_{1,-1}(k+\ell).$$
Now we can substitute these into the equation $A^k A^{\ell} = A^{k+\ell}$ to get
$$[M\varphi_{0,-1}(k) + N\varphi_{1,-1}(k)][M\varphi_{0,-1}(\ell) + N\varphi_{1,-1}(\ell)] = M\varphi_{0,-1}(k+\ell) + N\varphi_{1,-1}(k+\ell).$$
After expanding the left side of the equation we get
$$[M\varphi_{0,-1}(k) + N\varphi_{1,-1}(k)][M\varphi_{0,-1}(\ell) + N\varphi_{1,-1}(\ell)] = M^2\varphi_{0,-1}(k)\varphi_{0,-1}(\ell) + MN\varphi_{0,-1}(k)\varphi_{1,-1}(\ell) + NM\varphi_{1,-1}(k)\varphi_{0,-1}(\ell) + N^2\varphi_{1,-1}(k)\varphi_{1,-1}(\ell).$$
Since
$$\varphi_{m,a_r}(k) = \frac{a_r^{\,k-m}\,k^{\underline{m}}}{m!},$$
the left side becomes
$$\begin{aligned}
&M^2(-1)^k(-1)^{\ell} + MN\,\ell(-1)^k(-1)^{\ell-1} + NM\,k(-1)^{k-1}(-1)^{\ell} + N^2\,k\ell(-1)^{k-1}(-1)^{\ell-1} \\
&\quad = M^2(-1)^{k+\ell} + MN\,\ell(-1)^{k+\ell-1} + NM\,k(-1)^{k+\ell-1} + N^2\,k\ell(-1)^{k+\ell-2}.
\end{aligned}$$
Therefore $A^k A^{\ell} = A^{k+\ell}$ implies
$$M\varphi_{0,-1}(k+\ell) + N\varphi_{1,-1}(k+\ell) = M^2(-1)^{k+\ell} + MN\,\ell(-1)^{k+\ell-1} + NM\,k(-1)^{k+\ell-1} + N^2\,k\ell(-1)^{k+\ell-2}.$$
Since we know that the set $\{\varphi_{m,a_r}(k)\}$ is a linearly independent set of functions, we conclude that
$$M^2 = M, \qquad MN = NM = N, \qquad N^2 = 0.$$
Hence, M is a projection and N is nilpotent.
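This, too, can be confirmed concretely. Evaluating $A^k = M\varphi_{0,-1}(k) + N\varphi_{1,-1}(k)$ at $k = 0$ and $k = 1$ gives $M = I$ and $N = A + I$; the NumPy sketch below (an illustration, not part of the original argument) then checks the relations just derived and the power formula for several $k$:

```python
import numpy as np

A = np.array([[1, 4], [-1, -3]])
I = np.eye(2, dtype=int)

# phi_{0,-1}(0) = 1, phi_{1,-1}(0) = 0  =>  A^0 = M, so M = I
# phi_{0,-1}(1) = -1, phi_{1,-1}(1) = 1 =>  A^1 = -M + N, so N = A + I
M, N = I, A + I

assert np.array_equal(M @ M, M)                                 # M^2 = M
assert np.array_equal(N @ N, 0 * I)                             # N^2 = 0
assert np.array_equal(M @ N, N) and np.array_equal(N @ M, N)    # MN = NM = N
for k in range(8):                      # A^k = M(-1)^k + N k (-1)^(k-1)
    rhs = M * (-1)**k + N * k * (-1)**(k - 1)
    assert np.array_equal(np.linalg.matrix_power(A, k), rhs)
print("Example 4.2 verified")
```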
Example 4.3. Now let $A$ be the $3 \times 3$ matrix
$$A = \begin{bmatrix} 10 & -2 & -9 \\ 12 & -4 & -9 \\ 8 & -4 & -4 \end{bmatrix}.$$
Like the previous examples, we must first find the characteristic polynomial of $A$:
$$c_A(z) = \det\begin{bmatrix} z-10 & 2 & 9 \\ -12 & z+4 & 9 \\ -8 & 4 & z+4 \end{bmatrix} = z^3 - 2z^2 - 4z + 8 = (z-2)^2(z+2).$$
We know
$$A^k = \sum_{r=1}^{R}\sum_{m=0}^{M_r-1} B_{r,m}\,\varphi_{m,a_r}(k),$$
where the eigenvalues are $a_1 = 2$ and $a_2 = -2$ and their multiplicities are $M_1 = 2$ and $M_2 = 1$. So $A^k$ becomes
$$A^k = B_{1,0}\,\varphi_{0,2}(k) + B_{1,1}\,\varphi_{1,2}(k) + B_{2,0}\,\varphi_{0,-2}(k).$$
We will say $M = B_{1,0}$, $N = B_{1,1}$, and $P = B_{2,0}$ for simplicity, so the equation becomes
$$A^k = M\varphi_{0,2}(k) + N\varphi_{1,2}(k) + P\varphi_{0,-2}(k).$$
Similarly,
$$A^{\ell} = B_{1,0}\,\varphi_{0,2}(\ell) + B_{1,1}\,\varphi_{1,2}(\ell) + B_{2,0}\,\varphi_{0,-2}(\ell) = M\varphi_{0,2}(\ell) + N\varphi_{1,2}(\ell) + P\varphi_{0,-2}(\ell)$$
and
$$A^{k+\ell} = M\varphi_{0,2}(k+\ell) + N\varphi_{1,2}(k+\ell) + P\varphi_{0,-2}(k+\ell).$$
We can now substitute the three equations into the original equation $A^k A^{\ell} = A^{k+\ell}$ to get
$$[M\varphi_{0,2}(k) + N\varphi_{1,2}(k) + P\varphi_{0,-2}(k)][M\varphi_{0,2}(\ell) + N\varphi_{1,2}(\ell) + P\varphi_{0,-2}(\ell)] = M\varphi_{0,2}(k+\ell) + N\varphi_{1,2}(k+\ell) + P\varphi_{0,-2}(k+\ell).$$
After expanding the left side of the equation we get:
$$\begin{aligned}
M\varphi_{0,2}(k+\ell) + N\varphi_{1,2}(k+\ell) + P\varphi_{0,-2}(k+\ell) ={}& M^2\varphi_{0,2}(k)\varphi_{0,2}(\ell) + MN\varphi_{0,2}(k)\varphi_{1,2}(\ell) + MP\varphi_{0,2}(k)\varphi_{0,-2}(\ell) \\
&+ NM\varphi_{1,2}(k)\varphi_{0,2}(\ell) + N^2\varphi_{1,2}(k)\varphi_{1,2}(\ell) + NP\varphi_{1,2}(k)\varphi_{0,-2}(\ell) \\
&+ PM\varphi_{0,-2}(k)\varphi_{0,2}(\ell) + PN\varphi_{0,-2}(k)\varphi_{1,2}(\ell) + P^2\varphi_{0,-2}(k)\varphi_{0,-2}(\ell).
\end{aligned}$$
Since we know
$$\varphi_{m,a_r}(k) = \frac{a_r^{\,k-m}\,k^{\underline{m}}}{m!},$$
the equation becomes
$$\begin{aligned}
M\,2^{k+\ell} + N(k+\ell)2^{k+\ell-1} + P(-2)^{k+\ell} ={}& M^2\,2^{k+\ell} + MN\,\ell\,2^{k+\ell-1} + MP\,2^k(-2)^{\ell} \\
&+ NM\,k\,2^{k+\ell-1} + N^2\,k\ell\,2^{k+\ell-2} + NP\,k\,2^{k-1}(-2)^{\ell} \\
&+ PM(-2)^k 2^{\ell} + PN\,\ell(-2)^k 2^{\ell-1} + P^2(-2)^{k+\ell}.
\end{aligned}$$
By linear independence of the $\varphi_{m,a_r}$ we can conclude
$$M^2 = M, \quad P^2 = P, \quad N^2 = 0, \quad MP = PM = 0, \quad MN = NM = N, \quad PN = NP = 0.$$
Therefore, M and P are both projections and N is nilpotent.
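As with the previous examples, this can be checked by machine. In the sketch below the coefficient matrices $M$, $N$, $P$ are obtained by evaluating the $A^k$ formula at $k = 0, 1, 2$ and solving the resulting linear system; this solve-at-three-points shortcut is a convenience for the check, not the method of the text, and the matrix entries are the ones as reconstructed above:

```python
import numpy as np

A = np.array([[10, -2, -9], [12, -4, -9], [8, -4, -4]])

def phis(k):
    # phi_{0,2}(k) = 2^k,  phi_{1,2}(k) = k 2^(k-1),  phi_{0,-2}(k) = (-2)^k
    return [2.0**k, k * 2.0**(k - 1), (-2.0)**k]

V = np.array([phis(k) for k in range(3)])                       # rows: k = 0, 1, 2
powers = np.stack([np.linalg.matrix_power(A, k) for k in range(3)]).astype(float)

# Solve V @ [M, N, P] = [A^0, A^1, A^2] entrywise.
coeffs = np.linalg.solve(V, powers.reshape(3, 9)).reshape(3, 3, 3)
M, N, P = coeffs

Z = np.zeros((3, 3))
assert np.allclose(M @ M, M) and np.allclose(P @ P, P)          # projections
assert np.allclose(N @ N, Z)                                    # N^2 = 0
assert np.allclose(M @ P, Z) and np.allclose(P @ M, Z)
assert np.allclose(M @ N, N) and np.allclose(N @ M, N)
assert np.allclose(N @ P, Z) and np.allclose(P @ N, Z)
for k in range(8):                                              # the A^k formula
    f = phis(k)
    assert np.allclose(np.linalg.matrix_power(A, k).astype(float),
                       f[0] * M + f[1] * N + f[2] * P)
print("Example 4.3 verified")
```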
5. Linear Independence
Theorem 5.1. Let $A \in M_n(\mathbb{C})$ with characteristic polynomial
$$c_A(s) = (s-\lambda_1)^{m_1}(s-\lambda_2)^{m_2} \cdots (s-\lambda_n)^{m_n}.$$
Then the set
$$B = \{\varphi_{0,\lambda_1}, \varphi_{1,\lambda_1}, \dots, \varphi_{m_1-1,\lambda_1}, \dots, \varphi_{m_n-1,\lambda_n}\}$$
is linearly independent.
Proof. Because the Z-transform is an injective linear transformation, it preserves linear independence. Therefore, if $\mathcal{Z}(B)$ is linearly independent, so is $B$. By Proposition 2.7,
$$\mathcal{Z}(B) = \left\{\frac{z}{z-\lambda_1}, \frac{z}{(z-\lambda_1)^2}, \dots, \frac{z}{(z-\lambda_1)^{m_1}}, \dots, \frac{z}{(z-\lambda_n)^{m_n}}\right\}.$$
Let $c_1, c_2, \dots, c_l$ be constants, where $l = m_1 + \cdots + m_n$, such that
$$\frac{c_1 z}{z-\lambda_1} + \frac{c_2 z}{(z-\lambda_1)^2} + \cdots + \frac{c_{m_1} z}{(z-\lambda_1)^{m_1}} + \cdots + \frac{c_l z}{(z-\lambda_n)^{m_n}} = 0.$$
Rewrite this as
$$\frac{z \cdot p_1(z)}{(z-\lambda_1)^{m_1}} + \cdots + \frac{z \cdot p_n(z)}{(z-\lambda_n)^{m_n}}$$
by making common denominators. Then take the limit as $z$ approaches $\lambda_1$. Since the sum above is identically zero, its limit must be zero as well. This means, in particular, that
$$\lim_{z \to \lambda_1} \frac{z \cdot p_1(z)}{(z-\lambda_1)^{m_1}}$$
is finite, because the limit as $z$ approaches $\lambda_1$ is finite for every other term in the sum, and the terms must add up to zero. Since $\deg p_1 \leq m_1 - 1$, this is only possible if $p_1(z) = 0$ identically. Since
$$p_1(z) = c_1(z-\lambda_1)^{m_1-1} + c_2(z-\lambda_1)^{m_1-2} + \cdots + c_{m_1},$$
and $c_1$ is the coefficient of the only term of degree $m_1 - 1$, we get $c_1 = 0$. Reapplying this logic, we conclude that $c_i = 0$ for all $1 \leq i \leq l$. Therefore, $\mathcal{Z}(B)$ is linearly independent, which implies that $B$ is linearly independent. $\square$
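The same argument can be mirrored symbolically: put the linear combination over a common denominator and check that the numerator vanishes identically only when every coefficient is zero. The SymPy sketch below does this for one sample configuration (eigenvalues $2$ and $-1$ with multiplicities $2$ and $1$; these values are arbitrary choices, not part of the theorem):

```python
import sympy as sp

z, c1, c2, c3 = sp.symbols('z c1 c2 c3')
l1, l2 = 2, -1                     # sample eigenvalues with multiplicities 2 and 1

expr = c1*z/(z - l1) + c2*z/(z - l1)**2 + c3*z/(z - l2)

# Put everything over a common denominator; the combination is identically zero
# exactly when the numerator polynomial in z has all coefficients equal to zero.
numerator, _ = sp.fraction(sp.together(expr))
system = sp.Poly(numerator, z).coeffs()
print(sp.solve(system, [c1, c2, c3]))   # {c1: 0, c2: 0, c3: 0}
```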
Theorem 5.2. Let $x, y \in \mathbb{Z}$ and $n \in \mathbb{N}$. Then
$$(5.1)\qquad (x+y)^{\underline{n}} = \sum_{k=0}^{n} \binom{n}{k}\, x^{\underline{k}}\, y^{\underline{n-k}}.$$
Proof. This proof relies on the fact that $x^{\underline{k+1}} = x \cdot (x-1)^{\underline{k}}$. We will do an inductive argument on $n$. When $n = 0$, the equation reduces to $1 = 1$, which is true. Therefore, assume (5.1) holds for some $n \in \mathbb{N}$, and show that this implies (5.1) holds for $n+1$. First,
$$(5.2)\qquad \sum_{k=0}^{n+1}\binom{n+1}{k} x^{\underline{k}}\, y^{\underline{n+1-k}} = \sum_{k=0}^{n}\binom{n+1}{k} x^{\underline{k}}\, y^{\underline{n+1-k}} + x^{\underline{n+1}},$$
because we can simply take the last term out of the sum. Next, we will use Pascal's Rule (which can be found in [KP01] or most any treatise on combinatorics), which states that
$$\binom{n+1}{k} = \binom{n}{k-1} + \binom{n}{k}.$$
Apply this to the right-hand side of (5.2) to get
$$(5.3)\qquad \sum_{k=0}^{n+1}\binom{n+1}{k} x^{\underline{k}}\, y^{\underline{n+1-k}} = \sum_{k=0}^{n}\left(\binom{n}{k-1} + \binom{n}{k}\right) x^{\underline{k}}\, y^{\underline{n+1-k}} + x^{\underline{n+1}}.$$
This is the same as writing
$$(5.4)\qquad \sum_{k=0}^{n+1}\binom{n+1}{k} x^{\underline{k}}\, y^{\underline{n+1-k}} = \sum_{k=0}^{n}\binom{n}{k-1} x^{\underline{k}}\, y^{\underline{n+1-k}} + \sum_{k=0}^{n}\binom{n}{k} x^{\underline{k}}\, y^{\underline{n+1-k}} + x^{\underline{n+1}}.$$
But the $k = 0$ term of the first sum on the right-hand side of (5.4) is zero, so we can shift our index and keep the same answer. This gives us
$$(5.5)\qquad \sum_{k=0}^{n+1}\binom{n+1}{k} x^{\underline{k}}\, y^{\underline{n+1-k}} = \sum_{k=0}^{n-1}\binom{n}{k} x^{\underline{k+1}}\, y^{\underline{n-k}} + \sum_{k=0}^{n}\binom{n}{k} x^{\underline{k}}\, y^{\underline{n+1-k}} + x^{\underline{n+1}}.$$
Add the term that was split off back in (it is precisely the $k = n$ term of the first sum) to get
$$(5.6)\qquad \sum_{k=0}^{n+1}\binom{n+1}{k} x^{\underline{k}}\, y^{\underline{n+1-k}} = \sum_{k=0}^{n}\binom{n}{k} x^{\underline{k+1}}\, y^{\underline{n-k}} + \sum_{k=0}^{n}\binom{n}{k} x^{\underline{k}}\, y^{\underline{n+1-k}}.$$
Now we can use the fact mentioned at the start of the proof, namely that $x^{\underline{k+1}} = x \cdot (x-1)^{\underline{k}}$ and, likewise, $y^{\underline{n+1-k}} = y \cdot (y-1)^{\underline{n-k}}$. Apply this to the right-hand side of (5.6) to get
$$(5.7)\qquad \sum_{k=0}^{n+1}\binom{n+1}{k} x^{\underline{k}}\, y^{\underline{n+1-k}} = x \cdot \sum_{k=0}^{n}\binom{n}{k} (x-1)^{\underline{k}}\, y^{\underline{n-k}} + y \cdot \sum_{k=0}^{n}\binom{n}{k} x^{\underline{k}}\, (y-1)^{\underline{n-k}}.$$
Now we can invoke our inductive hypothesis to get
$$(5.8)\qquad \sum_{k=0}^{n+1}\binom{n+1}{k} x^{\underline{k}}\, y^{\underline{n+1-k}} = x \cdot (x+y-1)^{\underline{n}} + y \cdot (x+y-1)^{\underline{n}}.$$
The right-hand side of (5.8) is the same as $(x+y)(x+y-1)^{\underline{n}}$, so using the fact mentioned at the start of the proof once more, we get
$$(5.9)\qquad \sum_{k=0}^{n+1}\binom{n+1}{k} x^{\underline{k}}\, y^{\underline{n+1-k}} = (x+y)^{\underline{n+1}}.$$
This is what we were trying to show, so by the principle of mathematical induction, (5.1) is true. $\square$
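Identity (5.1) is a falling-factorial analogue of the binomial theorem, and it is easy to test exhaustively over a small range of integers. The following Python sketch does so; the test ranges are arbitrary choices:

```python
from math import comb

def falling(x, n):
    """Falling factorial x^(n) = x(x-1)...(x-n+1)."""
    out = 1
    for i in range(n):
        out *= x - i
    return out

# Check (x + y)^(n) = sum_k C(n,k) x^(k) y^(n-k) over a small grid of integers.
for x in range(-5, 6):
    for y in range(-5, 6):
        for n in range(0, 6):
            lhs = falling(x + y, n)
            rhs = sum(comb(n, k) * falling(x, k) * falling(y, n - k) for k in range(n + 1))
            assert lhs == rhs
print("Theorem 5.2 verified on the test grid")
```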
Theorem 5.3. Let $A$ be an $n \times n$ matrix with characteristic polynomial $c_A(z) = (z-\lambda_1)^{m_1} \cdots (z-\lambda_r)^{m_r}$. Then, there exists a decomposition
$$A^k = \sum_{i=1}^{r}\left( \lambda_i^k P_i + \sum_{q=1}^{m_i-1} N_i^q\,\varphi_{q,\lambda_i}(k) \right)$$
with the following properties:
(1) $P_iN_i = N_iP_i = N_i$
(2) $P_iP_j = 0$ if $i \neq j$, and $P_iP_i = P_i$
(3) $N_i^{m_i} = 0$
(4) $P_iN_j = N_jP_i = 0$ if $i \neq j$
(5) $A = \sum_{i=1}^{r} (\lambda_i P_i + N_i)$
(6) $I = \sum_{i=1}^{r} P_i$.
Proof. This proof is essentially a careful study of the equation $A^{k+l} = A^{l}A^{k}$. From Proposition 9 of [T12], we have a canonical decomposition of the matrix power
$$A^{k+l} = \sum_{i=1}^{r}\sum_{j=0}^{M_i-1} M_{i,j}\,\varphi_{j,\lambda_i}(k+l).$$
(Here and below we set $M_{i,j} = 0$ for $j \geq M_i$, so the inner sums may be written as infinite sums.) Substituting the expansion
$$\varphi_{j,\lambda_i}(k+l) = \sum_{a=0}^{j}\binom{j}{a}\frac{\lambda_i^{\,k-j+l}\,k^{\underline{j-a}}\,l^{\underline{a}}}{j!},$$
which follows from Theorem 5.2, the right-hand side becomes
$$\sum_{i=1}^{r}\sum_{j=0}^{\infty}\sum_{a=0}^{j} M_{i,j}\binom{j}{a}\frac{\lambda_i^{\,k-j+l}\,k^{\underline{j-a}}\,l^{\underline{a}}}{j!}.$$
Swapping the inner two sums and making the appropriate changes to their limits, we have
$$\sum_{i=1}^{r}\sum_{a=0}^{\infty}\sum_{j=a}^{\infty} M_{i,j}\binom{j}{a}\frac{\lambda_i^{\,k-j+l}\,k^{\underline{j-a}}\,l^{\underline{a}}}{j!}.$$
Reindexing the innermost sum by replacing $j$ with $j+a$, rearranging, and simplifying, we get
$$\sum_{i=1}^{r}\sum_{a=0}^{\infty}\sum_{j=0}^{\infty} M_{i,j+a}\,\frac{\lambda_i^{\,k-j}\,k^{\underline{j}}\,\lambda_i^{\,l-a}\,l^{\underline{a}}}{j!\,a!}.$$
Now, let $\lambda_i^{\,k-j} = \sum_{b=1}^{r}\lambda_b^{\,k-j}\,\delta_i(b)$, where $\delta_i(b)$ equals $1$ if $b = i$ and $0$ otherwise. Substituting into the above, our expression for $A^{k+l}$ becomes
$$\sum_{i=1}^{r}\sum_{a=0}^{\infty}\sum_{b=1}^{r}\sum_{j=0}^{\infty} M_{i,j+a}\,\delta_i(b)\,\frac{\lambda_b^{\,k-j}\,k^{\underline{j}}\,\lambda_i^{\,l-a}\,l^{\underline{a}}}{j!\,a!}.$$
Using the definition of the $\varphi$ functions, we can rewrite this as
$$\sum_{i=1}^{r}\sum_{a=0}^{\infty}\sum_{b=1}^{r}\sum_{j=0}^{\infty} M_{i,j+a}\,\delta_i(b)\,\varphi_{j,\lambda_b}(k)\,\varphi_{a,\lambda_i}(l).$$
Now we turn our attention to $A^{l}A^{k}$. Using Proposition 9 of [T12], we have
$$A^{l}A^{k} = \sum_{i=1}^{r}\sum_{a=0}^{\infty} M_{i,a}\,\varphi_{a,\lambda_i}(l) \cdot \sum_{b=1}^{r}\sum_{j=0}^{\infty} M_{b,j}\,\varphi_{j,\lambda_b}(k) = \sum_{i=1}^{r}\sum_{a=0}^{\infty}\sum_{b=1}^{r}\sum_{j=0}^{\infty} M_{i,a}M_{b,j}\,\varphi_{j,\lambda_b}(k)\,\varphi_{a,\lambda_i}(l).$$
Now, we set equal our expressions for $A^{k+l}$ and $A^{l}A^{k}$:
$$\sum_{i=1}^{r}\sum_{a=0}^{\infty}\sum_{b=1}^{r}\sum_{j=0}^{\infty} M_{i,j+a}\,\delta_i(b)\,\varphi_{j,\lambda_b}(k)\,\varphi_{a,\lambda_i}(l) = \sum_{i=1}^{r}\sum_{a=0}^{\infty}\sum_{b=1}^{r}\sum_{j=0}^{\infty} M_{i,a}M_{b,j}\,\varphi_{j,\lambda_b}(k)\,\varphi_{a,\lambda_i}(l).$$
Invoking the linear independence of the $\varphi$ functions given in Theorem 5.1, we have a collection of equations
$$\sum_{b=1}^{r}\sum_{j=0}^{\infty} M_{i,j+a}\,\delta_i(b)\,\varphi_{j,\lambda_b}(k) = \sum_{b=1}^{r}\sum_{j=0}^{\infty} M_{i,a}M_{b,j}\,\varphi_{j,\lambda_b}(k),$$
one for each $i$ and $a$. Using Theorem 5.1 once again on each of these equations, we conclude that
$$M_{i,j+a}\,\delta_i(b) = M_{i,a}M_{b,j}.$$
The six properties in the statement of the theorem come from a careful analysis of this equation. Set $P_i = M_{i,0}$ and $N_i = M_{i,1}$. Notice that
$$P_iN_i = M_{i,0}M_{i,1} = M_{i,1}\,\delta_i(i) = N_i \quad\text{and}\quad N_iP_i = M_{i,1}M_{i,0} = M_{i,1}\,\delta_i(i) = N_i,$$
proving (1). $\square$
References
[KP01] Walter G. Kelley and Allan C. Peterson, Difference Equations: An Introduction with Applications, Academic Press, San Diego, 2001.
[S86] E. J. P. Georg Schmidt, An Alternative Approach to Canonical Forms of Matrices, American
Mathematical Monthly 93 (1986), no. 3, 176-184.
[T12] Casey Tsai, The Cayley-Hamilton Theorem via the Z-Transform, Rose-Hulman Under-
graduate Mathematics Journal 13 (2012), no. 2, 44-52.
SMILE@LSU, Louisiana State University, Baton Rouge, Louisiana
E-mail address: aage@xula.edu
SMILE@LSU, Louisiana State University, Baton Rouge, Louisiana
E-mail address: thomas.credeur@gmail.com
SMILE@LSU, Louisiana State University, Baton Rouge, Louisiana
E-mail address: tdirle@go.olemiss.edu
SMILE@LSU, Louisiana State University, Baton Rouge, Louisiana
E-mail address: lar257@msstate.edu
