Figure 13.4  A point x and a subgradient g of φ at x are shown in (a), together with three level curves of φ. Note that −g is not a descent direction for φ at x: φ increases for any movement from x in the direction −g. However, −g is a descent direction for the distance to any minimizer. In (b) the level curves of the distance to a minimizer x⋆, ‖z − x⋆‖, are shown; −g points into the circle through x.
13.2  Supporting Hyperplanes
If C is a convex subset of R^n and x is a point on its boundary, then we say that the hyperplane through x with normal g,

    {z | g^T(z − x) = 0},

is a supporting hyperplane to C at x if C is contained in the half-space g^T(z − x) ≤ 0. Roughly speaking, if the set C is "smooth" at x, then the plane that is tangent to C at x is a supporting hyperplane, and g is its outward normal at x, as shown in figure 13.5. But the notion of a supporting hyperplane makes sense even when the set C is not "smooth" at x. A basic result of convex analysis is that there is at least one supporting hyperplane at every boundary point of a convex set.
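The half-space condition is easy to check numerically. The sketch below is our own illustration (not from the text): it takes C to be the unit disk in R^2, where the outward normal at a boundary point x is g = x, and verifies g^T(z − x) ≤ 0 on sampled points of C.

```python
import numpy as np

# Unit disk C = {z : ||z|| <= 1} in R^2; x is a boundary point (||x|| = 1).
# The outward normal at x is g = x, so the supporting hyperplane is
# {z : g^T (z - x) = 0}, and every z in C must satisfy g^T (z - x) <= 0.

rng = np.random.default_rng(0)

x = np.array([0.6, 0.8])   # boundary point, norm exactly 1
g = x                      # outward normal at x

# Sample random points and keep those inside the disk.
pts = rng.uniform(-1.0, 1.0, size=(10000, 2))
pts = pts[np.linalg.norm(pts, axis=1) <= 1.0]

# Half-space check: g^T z - g^T x = x . z - 1 <= ||z|| - 1 <= 0 on C.
assert np.all(g @ (pts - x).T <= 1e-12)
```

The choice g = x works here because for the disk, g^T(z − x) = x·z − 1 ≤ ‖z‖ − 1 ≤ 0 by Cauchy–Schwarz.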
If C has the form of a functional inequality,

    C = {z | φ(z) ≤ α},

where φ is convex (or quasiconvex), then a supporting hyperplane to C at a boundary point x is simply g^T(z − x) = 0, where g is any subgradient (or quasigradient) of φ at x.
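As a concrete sublevel-set instance (our own example, not from the text), take φ(z) = ‖z‖₁ with α = 1. At a boundary point x, the vector g = sign(x) is a subgradient of φ, and the hyperplane g^T(z − x) = 0 supports C:

```python
import numpy as np

# C = {z : ||z||_1 <= 1}; x is a boundary point (||x||_1 = 1).
# g = sign(x) is a subgradient of the l1 norm at x (x has no zero entries),
# and g^T (z - x) = 0 is a supporting hyperplane to C at x.

rng = np.random.default_rng(1)

x = np.array([0.25, -0.75])   # ||x||_1 = 1, so x is on the boundary of C
g = np.sign(x)                # a subgradient of ||.||_1 at x

pts = rng.uniform(-1.0, 1.0, size=(10000, 2))
pts = pts[np.abs(pts).sum(axis=1) <= 1.0]   # keep points of C

# g^T z <= ||z||_1 <= 1 = g^T x, hence g^T (z - x) <= 0 on all of C.
assert np.all(g @ (pts - x).T <= 1e-12)
```

Note that g^T x = ‖x‖₁ = 1 here, which is exactly why the inequality chain in the final comment closes.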
If C is a convex subset of the infinite-dimensional space V and x is a point of its boundary, then we say that the hyperplane

    h(z − x) = 0,

where h is a nonzero linear functional on V, is a supporting hyperplane for C at the point x if

    h(z − x) ≤ 0 for all z ∈ C.
Figure 13.5  A point x on the boundary of a convex set C. A supporting hyperplane g^T(z − x) = 0 for C at x is shown: C lies entirely in the half-space g^T(z − x) ≤ 0.
Again we note that this general definition agrees with the one above for V = R^n if we do not distinguish between vectors in R^n and their associated inner-product functionals.
13.3  Tools for Computing Subgradients
To use the algorithms that we will describe in the next two chapters we must be able
to evaluate convex functionals and find at least one subgradient at any point. In
this section, we list some useful tools for subgradient evaluation. Roughly speaking,
if one can evaluate a convex functional at a point, then it is usually not much more
trouble to determine a subgradient at that point.
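The following sketch illustrates this observation on a hypothetical example of our own choosing, φ(x) = ‖Ax − b‖∞: the work of evaluating φ (finding the row that attains the maximum residual) already produces a subgradient as a byproduct.

```python
import numpy as np

# Our own illustration: evaluating phi(x) = ||A x - b||_inf yields a
# subgradient for free.  If row k attains the maximum residual and
# s = sign(a_k^T x - b_k), then g = s * a_k is a subgradient, since
# phi(z) >= s (a_k^T z - b_k) = phi(x) + g^T (z - x) for all z.

def phi_and_subgradient(A, b, x):
    r = A @ x - b
    k = int(np.argmax(np.abs(r)))   # row attaining the maximum residual
    s = np.sign(r[k])
    return np.abs(r[k]), s * A[k]   # value of phi and one subgradient

rng = np.random.default_rng(2)
A, b = rng.standard_normal((5, 3)), rng.standard_normal(5)
x = rng.standard_normal(3)

val, g = phi_and_subgradient(A, b, x)

# Spot-check the subgradient inequality phi(z) >= phi(x) + g^T (z - x).
for _ in range(1000):
    z = rng.standard_normal(3)
    assert np.linalg.norm(A @ z - b, np.inf) >= val + g @ (z - x) - 1e-12
```

The names `phi_and_subgradient`, `A`, and `b` are ours; the point is only that the maximizing index, which evaluation must find anyway, determines a subgradient.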
These tools come from more general results that describe all subgradients of, for
example, the sum or maximum of convex functionals. These results can be found
in any of the references mentioned in the Notes and References at the end of this
chapter. The more general results, however, are much more than we need, since
our purpose is to show how to calculate one subgradient of a convex functional
at a point, not all subgradients at a point, a task which in many cases is very
difficult, and in any case not necessary for the algorithms we describe in the next
two chapters. The more general results have many more technical conditions.
Differentiable functional: If φ is convex and differentiable at x, then its deriva-