This module introduces characteristic functions.

You have already encountered the Moment Generating Function of a pdf in the Part IB probability course. This function was closely related to the Laplace Transform of the pdf.

Now we introduce the Characteristic Function for a random variable, which is closely related to the Fourier Transform of the pdf.

In the same way that Fourier Transforms allow easy manipulation of signals when they are convolved with linear system impulse responses, Characteristic Functions allow easy manipulation of convolved pdfs when they represent sums of random processes.

The Characteristic Function of a pdf is defined as:

$$\Phi_X(u) = E\left[e^{jux}\right] = \int_{-\infty}^{\infty} e^{jux} f_X(x)\,dx$$
which is the Fourier Transform of the pdf, with the sign of the exponent reversed.

Note that, whenever $f_X$ is a valid pdf, $\Phi_X(0) = \int_{-\infty}^{\infty} f_X(x)\,dx = 1$.

Properties of Fourier Transforms apply with $u$ substituted for $\omega$. In particular:

  • Convolution (sums of independent rv's):
    If $Y = \sum_{i=1}^{N} X_i$, then
    $$f_Y = f_{X_1} * f_{X_2} * \cdots * f_{X_N}
    \quad\Longleftrightarrow\quad
    \Phi_Y(u) = \prod_{i=1}^{N} \Phi_{X_i}(u)$$
  • Inversion:
    $$f_X(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{-jux}\,\Phi_X(u)\,du$$
  • Moments (illustrated numerically after this list):
    $$\frac{\partial^n \Phi_X(u)}{\partial u^n} = \int_{-\infty}^{\infty} (jx)^n\,e^{jux} f_X(x)\,dx
    \quad\Longrightarrow\quad
    E[X^n] = \int_{-\infty}^{\infty} x^n f_X(x)\,dx = \frac{1}{j^n}\left[\frac{\partial^n \Phi_X(u)}{\partial u^n}\right]_{u=0}$$
  • Scaling: If $Y = aX$, then $f_Y(y) = f_X(x)/a$ with $x = y/a$ (from our earlier discussion of functions of random variables), so that
    $$\Phi_Y(u) = \int_{-\infty}^{\infty} e^{juy} f_Y(y)\,dy = \int_{-\infty}^{\infty} e^{juax} f_X(x)\,dx = \Phi_X(au)$$
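
As a quick numerical illustration of these properties (a sketch of ours, not part of the original module), the Python fragment below approximates $\Phi_X(u) = \int e^{jux} f_X(x)\,dx$ by direct numerical integration for an assumed exponential pdf $f_X(x) = e^{-x}$, $x \ge 0$, and checks that $\Phi_X(0) = 1$ and that a finite-difference derivative at $u = 0$ recovers the first moment $E[X] = 1$.

```python
# Hedged numerical sketch (not from the module): approximate the characteristic
# function of an assumed exponential pdf f_X(x) = exp(-x), x >= 0, and verify
# the "valid pdf" and "Moments" properties above.
import numpy as np

xs = np.linspace(0.0, 50.0, 200_001)   # grid covering essentially all of the pdf's mass
dx = xs[1] - xs[0]
pdf = np.exp(-xs)                      # f_X(x) = e^{-x}

def phi(u):
    """Phi_X(u) = integral of exp(j*u*x) * f_X(x) dx, by a simple Riemann sum."""
    return np.sum(np.exp(1j * u * xs) * pdf) * dx

print(abs(phi(0.0)))                   # ~1.0: a valid pdf integrates to unity

# Moments property: E[X] = (1/j) * dPhi_X/du at u = 0 (E[X] = 1 for this pdf).
du = 1e-4
dphi_du = (phi(du) - phi(-du)) / (2 * du)
print((dphi_du / 1j).real)             # ~1.0
```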

Characteristic function of a Gaussian pdf

The Gaussian or normal distribution is very important, largely because of the Central Limit Theorem which we shall prove below. Because of this (and as part of the proof of this theorem) we shall show here that a Gaussian pdf has a Gaussian characteristic function too.

A Gaussian distribution with mean $\mu$ and variance $\sigma^2$ has pdf:

$$f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\,\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$$
Its characteristic function is obtained as follows, using a trick known as completing the square of the exponent:
$$\begin{aligned}
\Phi_X(u) &= E\left[e^{jux}\right] = \int_{-\infty}^{\infty} e^{jux} f_X(x)\,dx \\
&= \frac{1}{\sqrt{2\pi\sigma^2}} \int_{-\infty}^{\infty} \exp\!\left(-\frac{x^2 - 2\mu x + \mu^2}{2\sigma^2} + jux\right) dx \\
&= \left[\frac{1}{\sqrt{2\pi\sigma^2}} \int_{-\infty}^{\infty} \exp\!\left(-\frac{\left(x - (\mu + ju\sigma^2)\right)^2}{2\sigma^2}\right) dx\right] \exp\!\left(\frac{2ju\sigma^2\mu - u^2\sigma^4}{2\sigma^2}\right) \\
&= \exp\!\left(ju\mu - \frac{u^2\sigma^2}{2}\right)
\end{aligned}$$
since the integral in brackets has the form of a Gaussian pdf (albeit with a complex shift of the mean) and integrates to unity.

Thus the characteristic function of a Gaussian pdf is also Gaussian in magnitude, $e^{-u^2\sigma^2/2}$, with standard deviation $1/\sigma$ in $u$, and has a linear phase rotation term, $e^{ju\mu}$, whose rate of rotation equals the mean $\mu$ of the pdf. This coincides with standard results from the Fourier analysis of Gaussian waveforms and their spectra (e.g. the Fourier transform of a Gaussian waveform with a time shift).
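
As a sanity check on this result (a sketch of ours with assumed parameter values $\mu = 1.5$, $\sigma = 2$; not part of the original module), the fragment below estimates the characteristic function of Gaussian samples empirically, as a sample average of $e^{jux}$, and compares it with the closed form $e^{ju\mu - u^2\sigma^2/2}$ just derived.

```python
# Hedged check (assumed mu and sigma): the empirical characteristic function of
# Gaussian samples should match exp(j*u*mu - u**2 * sigma**2 / 2).
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.5, 2.0
x = rng.normal(mu, sigma, size=200_000)             # samples from N(mu, sigma^2)

u = np.linspace(-2.0, 2.0, 9)
phi_emp = np.exp(1j * np.outer(u, x)).mean(axis=1)  # E[exp(j*u*X)] estimated from samples
phi_closed = np.exp(1j * u * mu - 0.5 * u**2 * sigma**2)

print(np.max(np.abs(phi_emp - phi_closed)))         # small, of order 1/sqrt(200_000)
```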

Summation of two or more Gaussian random variables

If two variables, $X_1$ and $X_2$, with Gaussian pdfs are summed to produce $X$, their characteristic functions will be multiplied together (equivalent to convolving their pdfs) to give

$$\Phi_X(u) = \Phi_{X_1}(u)\,\Phi_{X_2}(u) = \exp\!\left(ju(\mu_1 + \mu_2) - \frac{u^2(\sigma_1^2 + \sigma_2^2)}{2}\right)$$
This is the characteristic function of a Gaussian pdf with mean $(\mu_1 + \mu_2)$ and variance $(\sigma_1^2 + \sigma_2^2)$.

Further Gaussian variables can be added and the pdf will remain Gaussian, with further terms added to the above expressions for the combined mean and variance.
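
A brief numerical illustration of this (ours, with assumed means and variances): samples of $X_1 \sim N(1, 2^2)$ and $X_2 \sim N(-3, 1.5^2)$ are added, and the sample mean and variance of the sum are compared with $\mu_1 + \mu_2$ and $\sigma_1^2 + \sigma_2^2$.

```python
# Hedged illustration (assumed parameters): the sum of two independent Gaussian
# variables has mean mu1 + mu2 and variance sigma1^2 + sigma2^2.
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.normal(1.0, 2.0, size=500_000)    # X1 ~ N(1, 2^2)
x2 = rng.normal(-3.0, 1.5, size=500_000)   # X2 ~ N(-3, 1.5^2)
x = x1 + x2

print(x.mean())   # ~ -2.0  (mu1 + mu2)
print(x.var())    # ~  6.25 (sigma1^2 + sigma2^2 = 4 + 2.25)
```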

Central limit theorem

The central limit theorem states broadly that if a large number $N$ of independent random variables of arbitrary pdf, but with equal variance $\sigma^2$ and zero mean, are summed together and scaled by $1/\sqrt{N}$ to keep the total energy independent of $N$, then the pdf of the resulting variable will tend to a zero-mean Gaussian with variance $\sigma^2$ as $N$ tends to infinity.

This result is obvious from the previous result if the input pdfs are also Gaussian, but it is the fact that it applies for arbitrary input pdfs that is remarkable, and is the reason for the importance of the Gaussian (or normal) pdf. Noise generated in nature is nearly always the result of summing many tiny random processes (e.g. noise from electron energy transitions in a resistor or transistor, or from distant worldwide thunderstorms at a radio antenna) and hence tends to a Gaussian pdf.

Although for simplicity we shall prove the result only for the case when all the summed processes have the same variance and pdfs, the central limit result is more general than this and applies in many cases even when the variances and pdfs are not all the same.

Proof:

Let $X_i$ ($i = 1$ to $N$) be the $N$ independent random processes, each with zero mean and variance $\sigma^2$, which are combined to give

$$X = \frac{1}{\sqrt{N}} \sum_{i=1}^{N} X_i$$
Then, if the characteristic function of each input process before scaling is $\Phi(u)$, and we use the scaling property above to include the scaling by $1/\sqrt{N}$, the characteristic function of $X$ is
$$\Phi_X(u) = \prod_{i=1}^{N} \Phi_{X_i}\!\left(\frac{u}{\sqrt{N}}\right) = \left[\Phi\!\left(\frac{u}{\sqrt{N}}\right)\right]^{N}$$
Taking logs:
$$\ln \Phi_X(u) = N \ln \Phi\!\left(\frac{u}{\sqrt{N}}\right)$$
Using Taylor's theorem to expand $\Phi\!\left(\frac{u}{\sqrt{N}}\right)$ in terms of its derivatives at $u = 0$ (and hence its moments) gives
$$\Phi\!\left(\frac{u}{\sqrt{N}}\right) = \Phi(0) + \frac{u}{\sqrt{N}}\,\Phi'(0) + \frac{1}{2}\left(\frac{u}{\sqrt{N}}\right)^{2}\Phi''(0) + \frac{1}{6}\left(\frac{u}{\sqrt{N}}\right)^{3}\Phi'''(0) + \frac{1}{24}\left(\frac{u}{\sqrt{N}}\right)^{4}\Phi''''(0) + \cdots$$
From the Moments property of characteristic functions with zero mean:
  • valid pdf: $\Phi(0) = E\left[(jX_i)^0\right] = 1$
  • zero mean: $\Phi'(0) = E\left[jX_i\right] = 0$
  • variance: $\Phi''(0) = E\left[(jX_i)^2\right] = -\sigma^2$
  • scaled skewness: $\Phi'''(0) = E\left[(jX_i)^3\right] = -j\gamma\sigma^3$
  • scaled kurtosis: $\Phi''''(0) = E\left[(jX_i)^4\right] = (\kappa + 3)\,\sigma^4$
These are all constants, independent of N , and dependent only on the shape of the pdfs f X i .

Substituting these moments into the Taylor expansion, and then into the expression for $\ln \Phi_X(u)$ above, and using the series expansion $\ln(1 + x) = x\,+$ (terms of order $x^2$ or smaller), gives

$$\ln \Phi_X(u) = N \ln \Phi\!\left(\frac{u}{\sqrt{N}}\right)
= N \ln\!\left(1 - \frac{u^2\sigma^2}{2N} + **\right)
= N \left(-\frac{u^2\sigma^2}{2N} + **\right)
= -\frac{u^2\sigma^2}{2} + \#\#$$
where $**$ represents the terms of order $N^{-3/2}$ or smaller and $\#\#$ represents the terms of order $N^{-1/2}$ or smaller. As $N \to \infty$, $\ln \Phi_X(u) \to -\frac{u^2\sigma^2}{2}$, and therefore, as $N \to \infty$,
$$\Phi_X(u) \to e^{-u^2\sigma^2/2}$$
Note that, if the input pdfs are symmetric, the skewness will be zero and the error terms will decay as $N^{-1}$ rather than $N^{-1/2}$; and so convergence to a Gaussian characteristic function will be more rapid.
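
The algebra behind these error terms can be reproduced symbolically. The sketch below (ours, not part of the module) writes $N = 1/m^2$ so that the limit $N \to \infty$ becomes $m \to 0$, expands $\ln \Phi_X(u) = N \ln \Phi(u/\sqrt{N})$ as a series in $m$, and recovers the leading term $-u^2\sigma^2/2$ together with an error term of order $N^{-1/2}$ coming from the skewness.

```python
# Hedged symbolic sketch: expand ln Phi_X(u) = N ln Phi(u/sqrt(N)) in powers of
# m = 1/sqrt(N), for a zero-mean pdf with variance sigma^2 and generic
# higher-order coefficients c3, c4 in the Taylor expansion of Phi.
import sympy as sp

u, sigma, m = sp.symbols('u sigma m', positive=True)   # m stands for 1/sqrt(N)
c3, c4 = sp.symbols('c3 c4')                           # generic 3rd- and 4th-order coefficients

v = u * m                                              # v = u / sqrt(N)
Phi = 1 - v**2 * sigma**2 / 2 + c3 * v**3 + c4 * v**4  # Taylor expansion of Phi about 0

log_phi_X = sp.log(Phi) / m**2                         # N ln Phi(u/sqrt(N)), with N = 1/m**2
print(sp.series(log_phi_X, m, 0, 2))
# result: -sigma**2*u**2/2 + c3*u**3*m + O(m**2), i.e. the limit plus an O(N**-1/2) error
```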

Hence, recalling the characteristic function of a Gaussian pdf derived above and the inversion property, we may infer that the pdf of $X$ as $N \to \infty$ will be given by

$$f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\,\exp\!\left(-\frac{x^2}{2\sigma^2}\right)$$
Thus we have proved the required central limit result.

The figure below shows an example of convergence when the input pdfs are uniform and $N$ is gradually increased from 1 to 50. By $N = 12$, convergence is good, and this is how some 'Gaussian' random generator functions operate: by summing typically 12 uncorrelated random numbers with uniform pdfs.
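
The '12 uniform' trick mentioned above can be sketched in a few lines (our illustration; the function and variable names are our own). Each $U(0,1)$ variable has mean $1/2$ and variance $1/12$, so summing 12 of them and subtracting 6 gives an approximately zero-mean, unit-variance Gaussian variable.

```python
# Hedged sketch of the classic approximate Gaussian generator described above:
# sum 12 independent U(0,1) samples and subtract 6 to get roughly N(0, 1).
import numpy as np

def approx_gaussian(n_samples, rng=None):
    """Approximately N(0,1) samples built from sums of 12 uniform random numbers."""
    rng = rng or np.random.default_rng()
    u = rng.uniform(0.0, 1.0, size=(n_samples, 12))
    return u.sum(axis=1) - 6.0       # mean: 12*0.5 - 6 = 0, variance: 12*(1/12) = 1

z = approx_gaussian(200_000)
print(z.mean(), z.var())             # both close to 0 and 1 respectively
```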

For some less smooth or more skewed pdfs, convergence can be slower, as shown in the figure for a highly skewed triangular pdf; and pdfs of discrete processes are particularly problematic in this respect, as the figure also illustrates.

Figure: Convergence toward a Gaussian pdf (Central Limit Theorem) for 3 different input pdfs, for $N = 1$ to $50$. Note that the uniform pdf (a), with the smallest higher-order moments, converges fastest. Curves are shown for $N = 1, 2, 3, 4, 6, 8, 10, 12, 15, 20, 30, 50$.

Source: OpenStax, Random processes. OpenStax CNX, Jan 22, 2004. Download for free at http://cnx.org/content/col10204/1.3