Applied Probability by Paul E Pfeiffer - HTML preview


Chapter 13 Transform Methods

13.1 Transform Methods

As pointed out in the units on Expectation and Variance, the mathematical expectation E[X] = μ_X of a random variable X locates the center of mass for the induced distribution, and the expectation

(13.1)
E[(X - μ_X)^2] = Var[X] = σ_X^2

measures the spread of the distribution about its center of mass. These quantities are also known, respectively, as the mean (first moment) of X and the second moment of X about the mean. Other moments give added information. For example, the third moment about the mean, E[(X - μ_X)^3], gives information about the skew, or asymmetry, of the distribution about the mean. We investigate further along these lines by examining the expectation of certain functions of X. Each of these functions involves a parameter, in a manner that completely determines the distribution. For reasons noted below, we refer to these as transforms. We consider three of the most useful of these.

Three basic transforms

We define each of three transforms, determine some key properties, and use them to study various probability distributions associated with random variables. In the section on integral transforms, we show their relationship to well known integral transforms. These have been studied extensively and used in many other applications, which makes it possible to utilize the considerable literature on these transforms.

Definition. The moment generating function M_X for random variable X (i.e., for its distribution) is the function

(13.2)
M_X(s) = E[e^{sX}] (s a real or complex parameter)

The characteristic function φ_X for random variable X is

(13.3)
φ_X(t) = E[e^{itX}] (t a real parameter, i^2 = -1)

The generating function g_X(s) for a nonnegative, integer-valued random variable X is

(13.4)
g_X(s) = E[s^X] = Σ_{k≥0} p_k s^k, where p_k = P(X = k)

The generating function E[s^X] has meaning for more general random variables, but its usefulness is greatest for nonnegative, integer-valued variables, and we limit our consideration to that case.
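As an illustrative aside (not part of the original text), the three transforms can be computed symbolically for a fair die, with X uniform on {1, ..., 6}, using Python's sympy:

```python
import sympy as sp

s, t = sp.symbols('s t')

# Fair die: P(X = k) = 1/6 for k = 1, ..., 6
pmf = {k: sp.Rational(1, 6) for k in range(1, 7)}

# Generating function g_X(s) = E[s^X] = sum_k p_k s^k
g = sum(p * s**k for k, p in pmf.items())

# Moment generating function M_X(s) = E[e^{sX}]
M = sum(p * sp.exp(s * k) for k, p in pmf.items())

# Characteristic function phi_X(t) = E[e^{itX}]
phi = sum(p * sp.exp(sp.I * t * k) for k, p in pmf.items())

# The relationships g_X(e^s) = M_X(s) and M_X(it) = phi_X(t) hold identically
assert sp.simplify(g.subs(s, sp.exp(s)) - M) == 0
assert sp.simplify(M.subs(s, sp.I * t) - phi) == 0
```

The two assertions verify, for this one distribution, the conversion relationships noted in (13.5) below.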

The defining expressions display similarities which show useful relationships. We note two which are particularly useful.

(13.5)
g_X(e^s) = E[e^{sX}] = M_X(s) and M_X(it) = E[e^{itX}] = φ_X(t)

Because of the latter relationship, we ordinarily use the moment generating function instead of the characteristic function to avoid writing the complex unit i. When desirable, we convert easily by the change of variable.

The integral transform character of these entities implies that there is essentially a one-to-one relationship between the transform and the distribution.

Moments

The name and some of the importance of the moment generating function arise from the fact that the derivatives of MX evaluated at s = 0 are the moments about the origin. Specifically

(13.6)
M_X^{(k)}(0) = E[X^k], provided the kth moment exists

Since expectation is an integral and because of the regularity of the integrand, we may differentiate inside the integral with respect to the parameter.

(13.7)
M_X'(s) = (d/ds) E[e^{sX}] = E[(d/ds) e^{sX}] = E[X e^{sX}]

Upon setting s = 0, we have M_X'(0) = E[X]. Repeated differentiation gives the general result. The corresponding result for the characteristic function is φ_X^{(k)}(0) = i^k E[X^k].
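As a quick check (a sketch, not from the text), the derivative relations can be verified symbolically for an indicator (Bernoulli) random variable, whose moment generating function q + p e^s appears in the list of discrete distributions below:

```python
import sympy as sp

s, p = sp.symbols('s p')
q = 1 - p

# Indicator/Bernoulli(p): X takes value 1 with probability p, 0 with q
M = q + p * sp.exp(s)

EX = sp.diff(M, s).subs(s, 0)       # M'(0)  = E[X]
EX2 = sp.diff(M, s, 2).subs(s, 0)   # M''(0) = E[X^2]

assert sp.simplify(EX - p) == 0              # E[X] = p
assert sp.simplify(EX2 - p) == 0             # E[X^2] = p, since X^2 = X
assert sp.simplify(EX2 - EX**2 - p*q) == 0   # Var[X] = pq
```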

Example 13.1The exponential distribution

The density function is fX(t) = λe^{-λt} for t ≥ 0.

(13.8)
M_X(s) = E[e^{sX}] = ∫_0^∞ λ e^{-λt} e^{st} dt = λ/(λ - s), for s < λ
(13.9)
M_X'(s) = λ/(λ - s)^2
(13.10)
M_X''(s) = 2λ/(λ - s)^3

From this we obtain E[X] = M_X'(0) = 1/λ and E[X^2] = M_X''(0) = 2/λ^2, so that Var[X] = 2/λ^2 - (1/λ)^2 = 1/λ^2.
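A symbolic confirmation of these computations (an aside, not in the text), starting from M_X(s) = λ/(λ - s):

```python
import sympy as sp

s = sp.symbols('s')
lam = sp.symbols('lambda', positive=True)

# Exponential(lambda) mgf, valid for s < lambda
M = lam / (lam - s)

EX = sp.diff(M, s).subs(s, 0)       # M'(0)  = 1/lambda
EX2 = sp.diff(M, s, 2).subs(s, 0)   # M''(0) = 2/lambda^2

assert sp.simplify(EX - 1/lam) == 0
assert sp.simplify(EX2 - 2/lam**2) == 0
assert sp.simplify(EX2 - EX**2 - 1/lam**2) == 0   # Var[X] = 1/lambda^2
```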

The generating function does not lend itself readily to computing moments, except that

(13.11)
g_X'(s) = Σ_{k≥1} k p_k s^{k-1}, so that g_X'(1) = Σ_{k≥1} k p_k = E[X]

For higher order moments, we may convert the generating function to the moment generating function by replacing s with e^s, then work with M_X and its derivatives.

Example 13.2The Poisson (μ) distribution

P(X = k) = e^{-μ} μ^k/k! for all k ≥ 0, so that

(13.12)
g_X(s) = e^{-μ} Σ_{k≥0} (μs)^k/k! = e^{-μ} e^{μs} = e^{μ(s-1)}

We convert to M_X by replacing s with e^s to get M_X(s) = e^{μ(e^s - 1)}. Then

(13.13)
M_X'(s) = μ e^s e^{μ(e^s - 1)} and M_X''(s) = μ e^s e^{μ(e^s - 1)}(1 + μ e^s)

so that

(13.14)
E[X] = M_X'(0) = μ and E[X^2] = M_X''(0) = μ^2 + μ, so that Var[X] = μ

These results agree, of course, with those found by direct computation with the distribution.
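The conversion from g_X to M_X and the resulting moments can be checked symbolically (a sketch outside the text):

```python
import sympy as sp

s = sp.symbols('s')
mu = sp.symbols('mu', positive=True)

# Poisson(mu) generating function; replace s with e^s to get the mgf
g = sp.exp(mu * (s - 1))
M = g.subs(s, sp.exp(s))            # M_X(s) = e^{mu(e^s - 1)}

EX = sp.diff(M, s).subs(s, 0)       # mu
EX2 = sp.diff(M, s, 2).subs(s, 0)   # mu^2 + mu

assert sp.simplify(EX - mu) == 0
assert sp.simplify(EX2 - (mu**2 + mu)) == 0
assert sp.simplify(EX2 - EX**2 - mu) == 0   # Var[X] = mu
```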

Operational properties

We refer to the following as operational properties.

(T1): If Z=aX+b, then
(13.15)
M_Z(s) = e^{sb} M_X(as), φ_Z(t) = e^{itb} φ_X(at), g_Z(s) = s^b g_X(s^a) (for g_Z, a and b must be nonnegative integers)
For the moment generating function, this pattern follows from
(13.16)
M_Z(s) = E[e^{s(aX+b)}] = e^{sb} E[e^{(as)X}] = e^{sb} M_X(as)
Similar arguments hold for the other two.
(T2): If the pair _autogen-svg2png-0032.png is independent, then
(13.17)
M_{X+Y}(s) = M_X(s) M_Y(s), φ_{X+Y}(t) = φ_X(t) φ_Y(t), g_{X+Y}(s) = g_X(s) g_Y(s)
For the moment generating function, e^{sX} and e^{sY} form an independent pair for each value of the parameter s. By the product rule for expectation
(13.18)
M_{X+Y}(s) = E[e^{s(X+Y)}] = E[e^{sX} e^{sY}] = E[e^{sX}] E[e^{sY}] = M_X(s) M_Y(s)
Similar arguments are used for the other two transforms.
A partial converse for (T2) is as follows:
(T3): If M_{X+Y}(s) = M_X(s) M_Y(s), then the pair {X, Y} is uncorrelated. To show this, we obtain two expressions for E[(X+Y)^2], one by direct expansion and use of linearity, and the other by taking the second derivative of the moment generating function.
(13.19)
E[(X+Y)^2] = E[X^2] + 2E[XY] + E[Y^2]
(13.20)
M_{X+Y}''(s) = [M_X(s) M_Y(s)]'' = M_X''(s) M_Y(s) + 2 M_X'(s) M_Y'(s) + M_X(s) M_Y''(s)
On setting s = 0 and using the fact that M_X(0) = M_Y(0) = 1, we have
(13.21)
E[(X+Y)^2] = E[X^2] + 2E[X]E[Y] + E[Y^2]
which implies the equality E[XY]=E[X]E[Y].

Note that we have not shown that being uncorrelated implies the product rule.
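The product rule (T2) can also be seen numerically for generating functions (a sketch with made-up pmfs, not from the text): for independent, nonnegative, integer-valued variables, the pmf of X + Y is the convolution of the individual pmfs, so g_{X+Y} is the polynomial product g_X g_Y.

```python
import numpy as np

# Hypothetical pmfs, for illustration only
pX = np.array([0.2, 0.5, 0.3])   # P(X = 0), P(X = 1), P(X = 2)
pY = np.array([0.6, 0.4])        # P(Y = 0), P(Y = 1)

# pmf of X + Y under independence: convolution of the pmfs
p_sum = np.convolve(pX, pY)

# Check g_{X+Y}(s) = g_X(s) g_Y(s) at several values of s
for s in (0.0, 0.3, 0.7, 1.0):
    gX = np.polyval(pX[::-1], s)     # sum_k p_k s^k
    gY = np.polyval(pY[::-1], s)
    gS = np.polyval(p_sum[::-1], s)
    assert abs(gX * gY - gS) < 1e-12
```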

We utilize these properties in determining the moment generating and generating functions for several of our common distributions.

Some discrete distributions

  1. Indicator function X = I_E, with P(E) = p, q = 1 - p

    (13.22)
    g_X(s) = q + ps, M_X(s) = q + p e^s

  2. Simple random variable X = Σ_{i=1}^n t_i I_{A_i} (primitive form), with P(A_i) = p_i

    (13.23)
    M_X(s) = Σ_{i=1}^n p_i e^{s t_i}
  3. Binomial (n, p). X = Σ_{i=1}^n I_{E_i}, with {I_{E_i} : 1 ≤ i ≤ n} iid and P(E_i) = p.
    We use the product rule for sums of independent random variables and the generating function for the indicator function.

    (13.24)
    g_X(s) = (q + ps)^n, M_X(s) = (q + p e^s)^n
  4. Geometric (p). P(X = k) = p q^k for k ≥ 0; E[X] = q/p. We use the formula for the geometric series to get

    (13.25)
    g_X(s) = Σ_{k≥0} p q^k s^k = p/(1 - qs), M_X(s) = p/(1 - q e^s)
  5. Negative binomial (m, p). If Y_m is the number of the trial in a Bernoulli sequence on which the mth success occurs, and X_m = Y_m - m is the number of failures before the mth success, then

    (13.26)
    P(X_m = k) = P(Y_m = m + k) = C(m + k - 1, k) p^m q^k, for k ≥ 0
    (13.27)
    M_{X_m}(s) = E[e^{sX_m}] = p^m Σ_{k≥0} C(m + k - 1, k) (q e^s)^k

    The power series expansion about t = 0 shows that

    (13.28)
    (1 + t)^{-m} = 1 + C(-m, 1)t + C(-m, 2)t^2 + ⋯, for -1 < t < 1, where C(-m, k) = (-1)^k C(m + k - 1, k)

    Hence

    (13.29)
    M_{X_m}(s) = p^m (1 - q e^s)^{-m} = [p/(1 - q e^s)]^m

    Comparison with the moment generating function for the geometric distribution shows that X_m = Y_m - m has the same distribution as the sum of m iid random variables, each geometric (p). This suggests that the sequence is characterized by independent, successive waiting times to success. This also shows that the expectation and variance of X_m are m times the expectation and variance for the geometric. Thus

    (13.30)
    E[X_m] = mq/p and Var[X_m] = mq/p^2
  6. Poisson (μ). P(X = k) = e^{-μ} μ^k/k! for all k ≥ 0. In Example 13.2, above, we establish g_X(s) = e^{μ(s-1)} and M_X(s) = e^{μ(e^s - 1)}. If {X, Y} is an independent pair, with X Poisson (λ) and Y Poisson (μ), then Z = X + Y is Poisson (λ + μ). This follows from (T2) and the product of exponentials.
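The iid-sum interpretation of the negative binomial in item 5 can be checked symbolically (a sketch, not part of the text), using the mth power of the geometric moment generating function with particular numeric values:

```python
import sympy as sp

s = sp.symbols('s')
p = sp.Rational(1, 3)   # chosen for illustration
q = 1 - p
m = 4

# Geometric(p) mgf; its mth power is the negative binomial (m, p) mgf
M_geo = p / (1 - q * sp.exp(s))
M_nb = M_geo**m

EX = sp.diff(M_nb, s).subs(s, 0)
EX2 = sp.diff(M_nb, s, 2).subs(s, 0)

assert sp.simplify(EX - m * q / p) == 0              # E[X_m]   = mq/p
assert sp.simplify(EX2 - EX**2 - m * q / p**2) == 0  # Var[X_m] = mq/p^2
```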

Some absolutely continuous distributions

  1. Uniform on (a, b). f_X(t) = 1/(b - a) for a < t < b

    (13.31)
    M_X(s) = E[e^{sX}] = (1/(b - a)) ∫_a^b e^{st} dt = (e^{sb} - e^{sa})/(s(b - a))
  2. Symmetric triangular on (-c, c). f_X(t) = (c + t)/c^2 for -c ≤ t < 0 and f_X(t) = (c - t)/c^2 for 0 ≤ t ≤ c

    (13.32)
    M_X(s) = (1/c^2) ∫_{-c}^0 (c + t) e^{st} dt + (1/c^2) ∫_0^c (c - t) e^{st} dt
    (13.33)
    = (e^{cs} + e^{-cs} - 2)/(c^2 s^2)
    (13.34)
    = [(e^{cs} - 1)/(cs)] · [(1 - e^{-cs})/(cs)] = M_Y(s) M_Z(s)

    where M_Y is the moment generating function for Y uniform on (0, c) and similarly M_Z is that for Z uniform on (-c, 0). Thus, X has the same distribution as the difference of two independent random variables, each uniform on (0, c).

  3. Exponential (λ). f_X(t) = λ e^{-λt} for t ≥ 0
    In Example 13.1, above, we show that M_X(s) = λ/(λ - s).

  4. Gamma (α, λ). f_X(t) = (1/Γ(α)) λ^α t^{α-1} e^{-λt} for t ≥ 0

    (13.35)
    M_X(s) = [λ/(λ - s)]^α

    For α = n, a positive integer,

    (13.36)
    M_X(s) = [λ/(λ - s)]^n

    which shows that in this case X has the distribution of the sum of n independent random variables, each exponential (λ).

  5. Normal (μ, σ^2).

    • The stan