In mathematics, the spectral radius of a square matrix is the maximum of the absolute values of its eigenvalues.[1] More generally, the spectral radius of a bounded linear operator is the supremum of the absolute values of the elements of its spectrum. The spectral radius is often denoted by ρ(·).
Definition
Matrices
Let λ1, ..., λn be the eigenvalues of a matrix A ∈ Cn×n. The spectral radius of A is defined as

ρ(A) = max{ |λ1|, ..., |λn| }.
The spectral radius can be thought of as an infimum of all norms of a matrix. Indeed, on the one hand, ρ(A) ≤ ||A|| for every natural matrix norm ||⋅||; and on the other hand, Gelfand's formula states that ρ(A) = lim_{k→∞} ||A^k||^(1/k). Both of these results are shown below.
However, the spectral radius does not necessarily satisfy ||Av|| ≤ ρ(A)||v|| for arbitrary vectors v ∈ Cn. To see why, let r > 1 be arbitrary and consider the matrix

C_r = ( 0    r⁻¹ )
      ( r    0   ).

The characteristic polynomial of C_r is λ² − 1, so its eigenvalues are −1 and +1, and thus ρ(C_r) = 1. However, C_r e1 = r e2, where e1, e2 denote the standard basis vectors. As a result, taking ||⋅|| to be the Euclidean norm,

||C_r e1|| = r > 1 = ρ(C_r) ||e1||.
As an illustration of Gelfand's formula, note that ||C_r^k||^(1/k) → 1 as k → ∞, since C_r^k = I if k is even and C_r^k = C_r if k is odd.
A special case in which ||Av|| ≤ ρ(A)||v|| holds for all v is when A is a Hermitian matrix and ||⋅|| is the Euclidean norm. This is because any Hermitian matrix is diagonalizable by a unitary matrix, and unitary matrices preserve vector length. As a result, writing A = U* D U with U unitary and D diagonal (with the eigenvalues of A on the diagonal),

||Av|| = ||U* D U v|| = ||D U v|| ≤ ρ(A) ||U v|| = ρ(A) ||v||.
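As a quick numerical check of the example above, the following sketch (assuming NumPy is available; the value r = 3 is an arbitrary choice) computes ρ(C_r) from the eigenvalues, shows that ||C_r e1|| exceeds ρ(C_r)||e1|| in the Euclidean norm, and illustrates the convergence of ||C_r^k||^(1/k) to 1:

```python
import numpy as np

r = 3.0  # any r > 1 works for this illustration
C = np.array([[0.0, 1.0 / r],
              [r, 0.0]])

# Spectral radius: maximum modulus of the eigenvalues of C_r, which are +1 and -1.
rho = max(abs(np.linalg.eigvals(C)))
print(rho)  # 1.0

# The spectral radius is not compatible with vector norms:
e1 = np.array([1.0, 0.0])
print(np.linalg.norm(C @ e1))    # r = 3.0 ...
print(rho * np.linalg.norm(e1))  # ... which exceeds rho(C_r) * ||e1|| = 1.0

# Gelfand's formula: ||C_r^k||^(1/k) -> 1, since C_r^2 = I and odd powers equal C_r.
for k in (1, 2, 10, 11, 100, 101):
    print(k, np.linalg.norm(np.linalg.matrix_power(C, k), 2) ** (1.0 / k))
```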
Bounded linear operators
In the context of a bounded linear operator A on a Banach space, the eigenvalues need to be replaced with the elements of the spectrum of the operator, i.e. the values λ for which A − λI is not bijective. We denote the spectrum by

σ(A) = { λ ∈ C : A − λI is not bijective }.

The spectral radius is then defined as the supremum of the magnitudes of the elements of the spectrum:

ρ(A) = sup { |λ| : λ ∈ σ(A) }.
Gelfand's formula, also known as the spectral radius formula, also holds for bounded linear operators: letting ||⋅|| denote the operator norm, we have

ρ(A) = lim_{k→∞} ||A^k||^(1/k).
A bounded operator (on a complex Hilbert space) is called a spectraloid operator if its spectral radius coincides with its numerical radius. An example of such an operator is a normal operator.
Graphs
The spectral radius of a finite graph is defined to be the spectral radius of its adjacency matrix.
This definition extends to the case of infinite graphs with bounded degrees of vertices (i.e. there exists some real number C such that the degree of every vertex of the graph is smaller than C). In this case, for the graph G define

ℓ²(G) = { f : V(G) → R : Σ_{v ∈ V(G)} |f(v)|² < ∞ }.

Let γ be the adjacency operator of G:

γ : ℓ²(G) → ℓ²(G),   (γ f)(v) = Σ_{(u,v) ∈ E(G)} f(u).
The spectral radius of G is defined to be the spectral radius of the bounded linear operator γ.
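For a finite graph this definition amounts to an eigenvalue computation on the adjacency matrix. The following minimal sketch (assuming NumPy; the 4-cycle is just an illustrative choice) builds the adjacency matrix of the cycle C4 and computes its spectral radius, which equals the common degree 2:

```python
import numpy as np

def graph_spectral_radius(adjacency: np.ndarray) -> float:
    """Spectral radius of a finite graph = largest |eigenvalue| of its adjacency matrix."""
    return float(max(abs(np.linalg.eigvals(adjacency))))

# Adjacency matrix of the 4-cycle C4 (every vertex has degree 2).
n = 4
A = np.zeros((n, n))
for v in range(n):
    A[v, (v + 1) % n] = A[(v + 1) % n, v] = 1

print(graph_spectral_radius(A))  # 2.0, matching the common degree of C4
```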
Upper bounds
Upper bounds on the spectral radius of a matrix
The following proposition gives simple yet useful upper bounds on the spectral radius of a matrix.
Proposition. Let A ∈ Cn×n with spectral radius ρ(A) and a sub-multiplicative matrix norm ||⋅||. Then for each integer k ≥ 1:

ρ(A) ≤ ||A^k||^(1/k).
Proof
Let (v, λ) be an eigenvector-eigenvalue pair for a matrix A. Since A^k v = λ^k v, the sub-multiplicativity of the matrix norm gives

|λ|^k ||v|| = ||λ^k v|| = ||A^k v|| ≤ ||A^k|| ⋅ ||v||.

Since v ≠ 0, we have

|λ|^k ≤ ||A^k||,

and therefore

|λ| ≤ ||A^k||^(1/k).

Since this holds for every eigenvalue λ, it follows that ρ(A) ≤ ||A^k||^(1/k), concluding the proof.
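The proposition is easy to test numerically. In the sketch below (assuming NumPy; the random 5×5 matrix is purely illustrative), ||A^k||^(1/k) is computed for several sub-multiplicative norms and increasing k, and every value stays at or above ρ(A):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))       # an arbitrary test matrix (illustration only)
rho = max(abs(np.linalg.eigvals(A)))

# rho(A) <= ||A^k||^(1/k) for every k and every sub-multiplicative norm.
for k in (1, 2, 5, 10, 20):
    Ak = np.linalg.matrix_power(A, k)
    for ord_ in (1, 2, np.inf, "fro"):  # induced 1-, 2-, inf-norms and the Frobenius norm
        bound = np.linalg.norm(Ak, ord_) ** (1.0 / k)
        assert bound >= rho - 1e-9
    print(k, np.linalg.norm(Ak, 2) ** (1.0 / k), ">=", rho)
```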
Upper bounds for spectral radius of a graph
There are many upper bounds for the spectral radius of a graph in terms of its number n of vertices and its number m of edges. For instance, if

where
is an integer, then[2]

Symmetric matrices
For real-valued matrices A, the inequality ρ(A) ≤ ||A||₂ holds in particular, where ||⋅||₂ denotes the spectral norm. In the case where A is symmetric, this inequality is tight:

Theorem. Let A ∈ Rn×n be symmetric, i.e., A = Aᵀ. Then it holds that ρ(A) = ||A||₂.
Proof
Let (v_i, λ_i), for i = 1, ..., n, be the eigenpairs of A. Due to the symmetry of A, all λ_i and v_i are real-valued and the eigenvectors v_i are orthonormal. By the definition of the spectral norm, there exists an x ∈ Rn with ||x||₂ = 1 such that ||A||₂ = ||Ax||₂. Since the eigenvectors v_i form a basis of Rn, there exist factors α_1, ..., α_n such that x = Σ_i α_i v_i, which implies that

Ax = Σ_i α_i λ_i v_i.

From the orthonormality of the eigenvectors v_i it follows that

||Ax||₂² = Σ_i α_i² λ_i²

and

||x||₂² = Σ_i α_i² = 1.

Since x is chosen such that it maximizes ||Ax||₂ while satisfying ||x||₂ = 1, the values of α_i must be such that they maximize Σ_i α_i² λ_i² while satisfying Σ_i α_i² = 1. This is achieved by setting α_j = 1 for an index j with |λ_j| = max_i |λ_i| = ρ(A), and α_i = 0 otherwise, yielding a value of

||A||₂ = ||Ax||₂ = ρ(A).
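The theorem can be checked numerically: for a symmetric matrix, the largest eigenvalue modulus agrees with the spectral norm. A minimal sketch (assuming NumPy; the random symmetric matrix is only an illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((6, 6))
A = (B + B.T) / 2                      # symmetrize: A = A^T

rho = max(abs(np.linalg.eigvalsh(A)))  # eigvalsh: real eigenvalues of a symmetric matrix
spectral_norm = np.linalg.norm(A, 2)   # largest singular value

print(rho, spectral_norm)              # the two values agree up to rounding error
```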
Power sequence
The spectral radius is closely related to the convergence behavior of the power sequence of a matrix, as shown by the following theorem.
Theorem. Let A ∈ Cn×n with spectral radius ρ(A). Then ρ(A) < 1 if and only if

lim_{k→∞} A^k = 0.

On the other hand, if ρ(A) > 1, then lim_{k→∞} ||A^k|| = ∞. The statement holds for any choice of matrix norm on Cn×n.
Proof
Assume that A^k goes to zero as k goes to infinity. We will show that ρ(A) < 1. Let (v, λ) be an eigenvector-eigenvalue pair for A. Since A^k v = λ^k v, we have

0 = lim_{k→∞} A^k v = lim_{k→∞} λ^k v = v · lim_{k→∞} λ^k.

Since v ≠ 0 by hypothesis, we must have

lim_{k→∞} λ^k = 0,

which implies |λ| < 1. Since this must be true for any eigenvalue λ, we can conclude that ρ(A) < 1.
Now, assume the spectral radius of A is less than 1. From the Jordan normal form theorem, we know that for all A ∈ Cn×n, there exist V, J ∈ Cn×n with V non-singular and J block diagonal such that

A = V J V⁻¹,

with

J = diag( J_{m_1}(λ_1), J_{m_2}(λ_2), ..., J_{m_s}(λ_s) ),

where J_{m_i}(λ_i) ∈ C^{m_i×m_i} is the upper-triangular Jordan block with λ_i on the diagonal and 1 on the first superdiagonal, 1 ≤ i ≤ s, and m_1 + ... + m_s = n.

It is easy to see that

A^k = V J^k V⁻¹,

and, since J is block-diagonal,

J^k = diag( J_{m_1}(λ_1)^k, ..., J_{m_s}(λ_s)^k ).

Now, a standard result on the k-th power of an m_i × m_i Jordan block states that, for k ≥ m_i − 1, J_{m_i}(λ_i)^k is the upper-triangular matrix whose (p, q) entry, for q ≥ p, is

(k choose q − p) · λ_i^{k − (q − p)}.

Thus, if ρ(A) < 1 then |λ_i| < 1 for all i, and since the binomial coefficients grow only polynomially in k while |λ_i|^k decays geometrically, for all i we have

lim_{k→∞} J_{m_i}(λ_i)^k = 0,

which implies

lim_{k→∞} J^k = 0.

Therefore,

lim_{k→∞} A^k = V ( lim_{k→∞} J^k ) V⁻¹ = 0.

On the other side, if ρ(A) > 1, there is at least one entry of J^k that does not remain bounded as k increases, thereby proving the second part of the statement.
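The dichotomy stated in the theorem is easy to observe numerically. The sketch below (assuming NumPy; the 2×2 matrix is an illustrative choice) scales a fixed matrix so that its spectral radius lies just below or just above 1 and tracks ||A^k||:

```python
import numpy as np

B = np.array([[0.5, 1.0],
              [0.0, 0.5]])             # a Jordan-like block; rho(B) = 0.5

for scale, label in ((1.8, "rho < 1"), (2.2, "rho > 1")):
    A = scale * B                       # rho(A) = 0.5 * scale
    rho = max(abs(np.linalg.eigvals(A)))
    norms = [np.linalg.norm(np.linalg.matrix_power(A, k)) for k in (1, 10, 50, 100)]
    print(label, rho, norms)            # norms -> 0 when rho < 1, blow up when rho > 1
```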
Gelfand's formula
Gelfand's formula, named after Israel Gelfand, gives the spectral radius as a limit of matrix norms.
Theorem
For any matrix norm ||⋅||, we have[3]

ρ(A) = lim_{k→∞} ||A^k||^(1/k).

Moreover, in the case of a consistent matrix norm, ||A^k||^(1/k) approaches ρ(A) from above (indeed, in that case ρ(A) ≤ ||A^k||^(1/k) for all k).
Proof
For any ε > 0, let us define the two following matrices:

A± = (ρ(A) ± ε)⁻¹ A.

Thus,

ρ(A±) = ρ(A) / (ρ(A) ± ε),   with ρ(A+) < 1 and ρ(A−) > 1.

We start by applying the previous theorem on limits of power sequences to A+:

lim_{k→∞} A+^k = 0.

This shows the existence of N+ ∈ N such that, for all k ≥ N+,

||A+^k|| < 1,   i.e.   ||A^k|| < (ρ(A) + ε)^k.

Therefore, for all k ≥ N+,

||A^k||^(1/k) < ρ(A) + ε.

Similarly, the theorem on power sequences implies that ||A−^k|| is not bounded and that there exists N− ∈ N such that, for all k ≥ N−,

||A−^k|| > 1,   i.e.   ||A^k|| > (ρ(A) − ε)^k.

Therefore, for all k ≥ N−,

||A^k||^(1/k) > ρ(A) − ε.

Let N = max{N+, N−}. Then, for all k ≥ N,

ρ(A) − ε < ||A^k||^(1/k) < ρ(A) + ε,

that is,

lim_{k→∞} ||A^k||^(1/k) = ρ(A).
This concludes the proof.
Corollary
Gelfand's formula yields a bound on the spectral radius of a product of commuting matrices: if A_1, ..., A_n are matrices that all commute, then

ρ(A_1 A_2 ⋯ A_n) ≤ ρ(A_1) ρ(A_2) ⋯ ρ(A_n).
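As a numerical illustration of the corollary (a sketch assuming NumPy; the matrices are illustrative and are made to commute by taking one to be a polynomial in the other):

```python
import numpy as np

def rho(M):
    """Spectral radius: maximum modulus of the eigenvalues."""
    return max(abs(np.linalg.eigvals(M)))

rng = np.random.default_rng(2)
A1 = rng.standard_normal((4, 4))
# A2 is a polynomial in A1, so the two matrices commute.
A2 = 3 * np.linalg.matrix_power(A1, 2) - A1 + 2 * np.eye(4)

print(rho(A1 @ A2), "<=", rho(A1) * rho(A2))  # the corollary's bound for commuting matrices
```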
Numerical example
Consider the matrix

A = (  9   -1    2 )
    ( -2    8    4 )
    (  1    1    8 )
whose eigenvalues are 5, 10, 10; by definition, ρ(A) = 10. For the four most commonly used matrix norms, the values of ||A^k||^(1/k) all approach 10 as k increases (note that, due to the particular form of this matrix, ||A^k||₁ = ||A^k||_∞).
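These values are straightforward to reproduce; the following sketch (assuming NumPy) prints ||A^k||^(1/k) for the 1-norm, spectral norm, ∞-norm and Frobenius norm at increasing values of k, and all four sequences approach ρ(A) = 10:

```python
import numpy as np

A = np.array([[ 9.0, -1.0, 2.0],
              [-2.0,  8.0, 4.0],
              [ 1.0,  1.0, 8.0]])   # eigenvalues 5, 10, 10, so rho(A) = 10

print("  k     1-norm     2-norm   inf-norm  Frobenius")
for k in (1, 2, 5, 10, 20, 50):
    Ak = np.linalg.matrix_power(A, k)
    vals = [np.linalg.norm(Ak, ord_) ** (1.0 / k) for ord_ in (1, 2, np.inf, "fro")]
    print(f"{k:3d}  " + "  ".join(f"{v:9.4f}" for v in vals))
```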
Notes and references
Bibliography
- Dunford, Nelson; Schwartz, Jacob (1963), Linear operators II. Spectral Theory: Self Adjoint Operators in Hilbert Space, Interscience Publishers, Inc.
- Lax, Peter D. (2002), Functional Analysis, Wiley-Interscience, ISBN 0-471-55604-1